
Query Elasticsearch Cluster in Filter Section when using Logstash

     This article details how to query an Elasticsearch cluster from the filter section of Logstash, when processing logs or events from any other input.

input {
     file {
          path => "/path/to/logfile.log"
     }
     #Any other inputs
}

filter {
     grok {
          match => [ "message", "%{NUMBER:attribute1} %{GREEDYDATA:attribute2}" ]
     }

     elasticsearch {
          hosts => ["HOSTNAME:9200/index_name/type_name"]
          query => 'field_name.sub_field:%{attribute1}'
          fields => ["attribute3_elasticsearch", "attribute3_new", "attribute4_elasticsearch", "attribute4_new"]
          #Fields are copied in pairs (Elasticsearch field, new event field); any number of pairs can be given
          sort => "attribute3_new:desc"
     }
}

output {
     stdout {
          codec => rubydebug
     }
}

     The elasticsearch filter can also take just a hostname and port. In that case, make sure that only one index and one type exist in the Elasticsearch cluster on that host; it is better to specify the index and type name along with the hostname and port, as shown above. Multiple hosts can also be given to query multiple clusters.

Custom Elasticsearch Index Name/Type Name based on Events in Logstash

     This article details how to use a custom index/type name when using Logstash to process logs with Elasticsearch as the output.

input {
     file {
          path => "/path/to/logfile.log"
     }
}

filter {
     grok {
          match => [ "message", "%{NUMBER:attribute1} %{GREEDYDATA:attribute2}" ]
     }

     ruby {
          code => '
               event["custom_index_name"] = event["attribute1"]
               event["custom_type_name"]  = event["attribute2"]
          '
          #You can also set a new field or modify an existing field to suit your needs
     }
}

output {
     elasticsearch {
          host => "localhost"
          index => "index_name_%{custom_index_name}"
          index_type => "type_name_%{custom_type_name}"
          #You can also use index => "index_name_%{attribute1}" if you do not want to create a new field
          #Other configuration
     }
     stdout {
          codec => rubydebug
     }
}

     This feature can also be used with custom filters that emit new events manually, as sketched below.
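     For instance, the filter method of a custom filter (the full plugin skeleton is shown in the next article) could stamp the routing fields on each event it emits. This is a minimal sketch against the Logstash 1.x event API; the field assignments here are illustrative assumptions:

public
def filter(event)
     #Create a new event and stamp the fields read by the elasticsearch output for index/type routing
     custom_event = LogStash::Event.new()
     custom_event["custom_index_name"] = event["attribute1"]
     custom_event["custom_type_name"]  = event["attribute2"]

     #Emit the manually created event
     yield custom_event
end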

Create Custom Filter/Plugin to Emit New Events Manually in Logstash

     This article details how to create new events using custom filters in Logstash. Logstash already provides a split filter that can split a single event into multiple events based on a delimiter, but that does not suit every case. Below is a sample Logstash configuration together with a custom filter that emits events manually.

Logstash Configuration File

input {
     file {
          path => "/path/to/logfile.log"
     }
}

filter {
     grok {
          match => [ "message", "%{NUMBER:attribute1} %{GREEDYDATA:attribute2}" ]
     }
     customfilter {}
}

output {
     stdout {
          codec => rubydebug
     }
}

     In the filter section of the above configuration, the grok filter parses the input event into two fields, attribute1 and attribute2. Control then passes to the customfilter.

Ruby Custom Filter

require "logstash/filters/base"
require "logstash/namespace"

class LogStash::Filters::CustomFilter < LogStash::Filters::Base

     config_name "customfilter"
     milestone 1

     public
     def register
     end

     public
     def filter(event)

          #Read the fields parsed by the grok filter
          attribute1_temp = event["attribute1"]
          attribute2_temp = event["attribute2"]

          #Your business logic

          #Create a new event
          custom_event = LogStash::Event.new()
          custom_event["attribute1_modified"] = attribute1_temp
          custom_event["attribute2_modified"] = attribute2_temp

          #Emit the new event
          yield custom_event

          #Cancel the main (original) event
          event.cancel

     end
end

     In custom filters, the parsed fields can be read using the syntax event["attribute_name"]. We then implement our own business logic, create a new event, assign custom fields as required, and emit it with yield. Note that we can emit either the existing event (the main event created by Logstash) or the newly created event (custom_event in our case); make sure to cancel the existing event if you emit only custom_event. We can also call yield in a loop to create any number of events, as the business requirement demands.
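     As a minimal sketch of yield in a loop, the filter method below splits attribute2 on whitespace and emits one new event per token; the splitting rule and the token field name are just assumed examples:

public
def filter(event)
     #Emit one new event per whitespace-separated token of attribute2 (example rule)
     event["attribute2"].to_s.split(" ").each do |token|
          custom_event = LogStash::Event.new()
          custom_event["token"] = token
          yield custom_event
     end

     #Drop the original event, keeping only the generated events
     event.cancel
end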

The customfilter is stored at LOGSTASH_HOME/lib/logstash/filters/customfilter.rb.


Milliarcsecond to Degree Conversion and Degree (Decimal Format) to DMS (Degree/Minute/Second) Conversion for Latitude/Longitude

Milliarcsecond to Degree (Latitude and Longitude)

Degree (for both Latitude and Longitude) = Milliarcsecond value / 3,600,000

Milliarcsecond Range

Latitude Range : -324,000,000 to +324,000,000

Longitude Range : -648,000,000 to +648,000,000

Degree Range (Decimal Format)

Latitude Range : -90 to +90

Longitude Range : -180 to +180
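
As a quick sketch in Ruby, the conversion with a range check based on the limits above (the method name mas_to_degree and the sample values are illustrative):

#Convert a milliarcsecond value to decimal degrees, validating against the ranges above
def mas_to_degree(mas, latitude)
     limit = latitude ? 324_000_000 : 648_000_000
     raise ArgumentError, "milliarcsecond value out of range" if mas.abs > limit
     mas / 3_600_000.0
end

puts mas_to_degree(46_512_000, true)     #Latitude  => 12.92
puts mas_to_degree(-295_200_000, false)  #Longitude => -82.0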


Degree (Decimal Format) to DMS (degree/minute/second) Format

D = int(Degree)
M = int(mod(abs(Degree) * 60, 60))
S = round(mod(abs(Degree) * 3600, 60))

Here Degree is the original decimal value (not the truncated D); for negative values the absolute value is used, and the sign selects the hemisphere below.

Latitude :

Positive Value – North (N)
Negative Value – South (S)

Longitude :

Positive Value – East (E)
Negative Value – West (W)
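
Putting the formulas and the hemisphere rule together, a minimal Ruby sketch (degree_to_dms is an illustrative name, not a library method):

#Convert a decimal degree value to a DMS string with hemisphere letter
def degree_to_dms(degree, latitude)
     hemisphere = latitude ? (degree >= 0 ? "N" : "S") : (degree >= 0 ? "E" : "W")
     abs = degree.abs
     d = abs.to_i                 #D = int(Degree)
     m = (abs * 60 % 60).to_i     #M = int(mod(abs(Degree) * 60, 60))
     s = (abs * 3600 % 60).round  #S = round(mod(abs(Degree) * 3600, 60))
     "#{d}° #{m}' #{s}\" #{hemisphere}"
end

puts degree_to_dms(12.92, true)    # => 12° 55' 12" N
puts degree_to_dms(-82.0, false)   # => 82° 0' 0" W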

Access Local Files or Static Content (Images, Videos, Media or any other files outside of the Web Application (WAR)) in the System using APACHE TOMCAT Server

This article details how to access local directory files (IMAGES, VIDEOS or any other files) from a web application when using APACHE TOMCAT SERVER. Note that the configuration we implement is independent of the web application we are using.

STEPS TO FOLLOW

  1. Go to the TOMCAT directory
  2. Open TOMCAT_DIRECTORY/conf/server.xml
  3. Add the following content in server.xml
<Context docBase="F:/myproj" path="/static" />

4. Full SERVER.XML file

<?xml version="1.0" encoding="utf-8"?>
<Server port="8005" shutdown="SHUTDOWN">
  <Listener className="org.apache.catalina.core.AprLifecycleListener" SSLEngine="on" />
  <Listener className="org.apache.catalina.core.JasperListener" />
  <Listener className="org.apache.catalina.core.JreMemoryLeakPreventionListener" />
  <Listener className="org.apache.catalina.mbeans.GlobalResourcesLifecycleListener" />
  <Listener className="org.apache.catalina.core.ThreadLocalLeakPreventionListener" />
  <GlobalNamingResources>
    <Resource name="UserDatabase" auth="Container"
              type="org.apache.catalina.UserDatabase"
              description="User database that can be updated and saved"
              factory="org.apache.catalina.users.MemoryUserDatabaseFactory"
              pathname="conf/tomcat-users.xml" />
  </GlobalNamingResources>
  <Service name="Catalina">
    <Connector port="8080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               redirectPort="8443" />
    <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" />
    <Engine name="Catalina" defaultHost="localhost">
      <Realm className="org.apache.catalina.realm.LockOutRealm">
        <Realm className="org.apache.catalina.realm.UserDatabaseRealm"
               resourceName="UserDatabase"/>
      </Realm>
      <Host name="localhost" appBase="webapps"
            unpackWARs="true" autoDeploy="true">
        <Context docBase="F:/myproj" path="/static" />
        <Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs"
               prefix="localhost_access_log." suffix=".txt"
               pattern="%h %l %u %t &quot;%r&quot; %s %b" />
      </Host>
    </Engine>
  </Service>
</Server>


5. With this configuration, you will be able to access the content of the F:/myproj folder from the web application, i.e. the browser.

6. Assume the TOMCAT SERVER is up and running (localhost:8080).

7. The contents of F:/myproj can be accessed using the URL localhost:8080/static.

8. Example – Consider we have the following files in F:/myproj

Files – F:/myproj/test.png and F:/myproj/test1.png

URL to Access – localhost:8080/static/test.png

The configuration works for TOMCAT version 7 and above (any operating system). It helps us access static content (even web pages) located in the local file system of the server.

USE CASE OF THIS TECHNIQUE

Consider a web application to which we keep adding new image files. In this case we can either add the required files to the web application, build the WAR and deploy it on the server, or configure the TOMCAT server as above so that files added to the configured path are accessible without rebuilding the WAR or restarting the TOMCAT server.