Splunk – Sort Command

The sort command sorts all the results by the specified fields. Missing fields are treated as having the smallest or largest possible value of that field, depending on whether the order is descending or ascending, respectively. If the first argument to the sort command is a number, then at most that many results are returned, in order. If no number is specified, the default limit of 10000 is used. If the number 0 is specified, all of the results are returned.

Sorting by Field Types

We can assign a specific data type to the fields being sorted. The existing data type in the Splunk dataset may differ from the data type we enforce in the search query. In the example below, we sort the status field as numeric in ascending order, while the field named url is sorted as a string, with the minus sign indicating descending order.

Sorting up to a Limit

We can also limit the number of results that are sorted instead of sorting the entire result set. The search below sorts only 50 events, with status ascending and url descending.

Using Reverse

We can invert the result of an entire search query by appending the reverse clause. This lets us reuse an existing query and reverse its sort order whenever needed, without altering the query itself.
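The clauses described above can be sketched as hedged SPL examples (the source name web_application and the status and url fields are assumptions drawn from the chapter's description):

```spl
source="web_application" | sort num(status), -str(url)

source="web_application" | sort 50 num(status), -str(url)

source="web_application" | sort num(status), -str(url) | reverse
```

The first query enforces a numeric ascending sort on status and a string descending sort on url; the second limits the sorted output to 50 results; the third reverses the order of the first without altering its sort clauses.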
Splunk – Basic Chart
Splunk has powerful visualization features which can render a variety of charts. These charts are created from the results of a search query in which appropriate functions are used to produce numerical outputs. For example, if we look for the average file size in bytes from the data set named web_applications, we can see the result in the Statistics tab as shown below.

Creating Charts

In order to create a basic chart, we first ensure that the data is visible in the Statistics tab as shown above. Then we click on the Visualization tab to get the corresponding chart. The above data produces a pie chart by default, as shown below.

Changing the Chart Type

We can change the chart type by selecting a different chart option from the chart name. Clicking on one of these options produces the chart for that type of graph.

Formatting a Chart

Charts can also be formatted by using the Format option. This option allows us to set the values for the axes, configure the legends, or show the data values in the chart. In the example below, we have chosen the horizontal bar chart and selected the option to show the data values as a Format option.
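As a hedged sketch of the kind of query that feeds such a chart (the source name web_applications and the bytes and file fields are assumptions based on the chapter's description):

```spl
source="web_applications" | stats avg(bytes) AS average_file_size BY file
```

The stats command produces the numeric table visible in the Statistics tab, which the Visualization tab then renders as a pie, bar, or other chart type.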
Splunk – Lookups
In the result of a search query, we sometimes get values which do not clearly convey the meaning of a field. For example, we may get a field which lists the value of product id as a numeric result. These numbers do not give us any idea of what kind of product it is. But if we list the product name along with the product id, we get a readable report in which the meaning of the search result is clear. Such linking of the values of one field to a field with the same name in another dataset, using equal values from both datasets, is called a lookup process. The advantage is that we retrieve related values from two different datasets.

Steps to Create and Use a Lookup File

In order to successfully create a lookup field in a dataset, we need to follow the steps below.

Create the Lookup File

We consider the dataset with host as web_application and look at the productid field. This field is just a number, but we want product names to be reflected in our query result set. We create a lookup file with the following details. Here, we have kept the name of the first field as productid, which is the same as the field we are going to use from the dataset.

productid,productdescription
WC-SH-G04,Tablets
DB-SG-G01,PCs
DC-SG-G02,MobilePhones
SC-MG-G10,Wearables
WSC-MG-G10,Usb Light
GT-SC-G01,Battery
SF-BVS-G01,Hard Drive

Add the Lookup File

Next, we add the lookup file to the Splunk environment by using the Settings screens as shown below. After selecting Lookups, we are presented with a screen to create and configure a lookup. We select Lookup table files as shown below. We browse to select the file productidvals.csv as the lookup file to be uploaded and select search as the destination app. We also keep the same destination file name. On clicking the Save button, the file gets saved to the Splunk repository as a lookup file.
Create a Lookup Definition

For a search query to be able to look up values from the lookup file we just uploaded, we need to create a lookup definition. We do this by again going to Settings → Lookups → Lookup Definition → Add New. Next, we check the availability of the lookup definition we added by going to Settings → Lookups → Lookup Definition.

Selecting the Lookup Field

Next, we need to select the lookup field for our search query. This is done by going to New Search → All Fields. Then we check the box for productid, which automatically adds the productdescription field from the lookup file as well.

Using the Lookup Field

Now we use the lookup field in the search query as shown below. The visualization shows the result with the productdescription field instead of productid.
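A hedged sketch of how such a lookup might be invoked explicitly in SPL, assuming the lookup definition was saved under the name productdescription_lookup (the definition name is an assumption; the host, file, and field names come from the chapter):

```spl
host="web_application"
| lookup productdescription_lookup productid OUTPUT productdescription
| stats count BY productdescription
```

The lookup command matches the productid value in each event against the CSV file and adds the corresponding productdescription field, which the stats command then groups on.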
Splunk – Knowledge Management

Splunk knowledge management is about the maintenance of knowledge objects for a Splunk Enterprise implementation. The main features of knowledge management are −

Ensure that knowledge objects are being shared and used by the right groups of people in the organization.
Normalize event data by implementing knowledge object naming conventions and retiring duplicate or obsolete objects.
Oversee strategies for improved search and pivot performance (report acceleration, data model acceleration, summary indexing, batch mode search).
Build data models for Pivot users.

Knowledge Objects

A knowledge object is a Splunk object used to get specific information about your data. When you create a knowledge object, you can keep it private or you can share it with other users. Examples of knowledge objects are saved searches, tags, field extractions, lookups, etc.

Uses of Knowledge Objects

As the Splunk software is used, knowledge objects are created and saved. But they may contain duplicate information, or they may not be used effectively by all of the intended audience. To address such issues, we need to manage these objects. This is done by classifying them properly and then using proper permission management to handle them. Below are the uses and classification of various knowledge objects.

Fields and field extractions

Fields and field extractions form the first layer of Splunk software knowledge. The fields automatically extracted by the Splunk software from the IT data help bring meaning to the raw data. The manually extracted fields expand and improve upon this layer of meaning.

Event types and transactions

Use event types and transactions to group together interesting sets of similar events. Event types group together sets of events discovered through searches. Transactions are collections of conceptually related events that span time.
Lookups and workflow actions

Lookups and workflow actions are categories of knowledge objects that extend the usefulness of your data in various ways. Field lookups enable you to add fields to your data from external data sources such as static tables (CSV files) or Python-based commands. Workflow actions enable interactions between fields in your data and other applications or web resources, such as a WHOIS lookup on a field containing an IP address.

Tags and aliases

Tags and aliases are used to manage and normalize sets of field information. You can use tags and aliases to group sets of related field values together, and to give extracted fields tags that reflect different aspects of their identity. For example, you can group events from a set of hosts in a particular location (such as a building or city) together by giving the same tag to each host. If two different sources use different field names to refer to the same data, you can normalize your data by using aliases (by aliasing clientip to ipaddress, for example).

Data models

Data models are representations of one or more datasets, and they drive the Pivot tool, enabling Pivot users to quickly generate useful tables, complex visualizations, and robust reports without needing to interact with the Splunk search language. Data models are designed by knowledge managers who fully understand the format and semantics of their indexed data. A typical data model makes use of other knowledge object types. We will discuss some examples of these knowledge objects in the subsequent chapters.
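As a hedged illustration of the clientip-to-ipaddress aliasing mentioned above, a field alias is typically declared in props.conf against a source type (the stanza name access_logs here is an assumption, not a name from this tutorial):

```
[access_logs]
FIELDALIAS-normalize_ip = clientip AS ipaddress
```

After this alias is in place, searches against events of that source type can refer to ipaddress and match the same values that were extracted as clientip.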
Splunk – Event Types
In Splunk search, we can define our own event types from a dataset based on certain criteria. For example, we may search for only the events which have an HTTP status code of 200. This search can then be saved as an event type with a user-defined name, such as status200, and the event type name can be used as part of future searches. In short, an event type represents a search that returns a specific type of event or a useful collection of events. Every event that can be returned by the search gets an association with that event type.

Creating an Event Type

There are two ways to create an event type after we have decided on the search criteria. One is to run a search and then save it as an event type. The other is to add a new event type from the Settings tab. We will see both ways of creating one in this section.

Using a Search

Consider a search for the events which meet the criteria of a successful HTTP status value of 200 and which occurred on a Wednesday. After running the search query, we can choose the Save As option to save the query as an Event Type. The next screen prompts us to give a name for the event type, choose a tag (which is optional), and then choose a colour with which the events will be highlighted. The priority option decides which event type is displayed first in case two or more event types match the same event. Finally, we can see that the event type has been created by going to the Settings → Event Types option.

Using a New Event Type

The other option for creating a new event type is to use the Settings → Event Types option, as shown below, where we can add a new event type. On clicking the button New Event Type, we get the following screen to add the same query as in the previous section.

Viewing the Event Type

To view the event type we just created, we can write the search query below in the search box and see the resulting events along with the colour we chose for the event type.
Using the Event Type

We can use the event type along with other queries. Here we specify some partial criteria from the event type, and the result is a mix of events showing both coloured and non-coloured events.
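A hedged SPL sketch of this workflow (status and date_wday are default Splunk fields; the host name web_application is an assumption carried over from earlier chapters): the search saved as the event type might be

```spl
host="web_application" status=200 date_wday=wednesday
```

and, once saved under the name status200, it can be referenced in any later search:

```spl
eventtype=status200
```

Combining eventtype=status200 with further criteria narrows the result set while still highlighting matching events in the event type's colour.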
Splunk – Dashboards
A dashboard is used to represent tables or charts which convey some business meaning. This is done through panels. The panels in a dashboard hold the charts or summarized data in a visually appealing manner. We can add multiple panels, and hence multiple reports and charts, to the same dashboard.

Creating a Dashboard

We will continue with the search query from the previous chapter which shows the count of files by weekday, choosing the Visualization tab to see the result as a pie chart. To put the chart on a dashboard, we choose the option Save As → Dashboard Panel as shown below. The next screen asks us to fill in the details of the dashboard and the panel in it. We fill in the screen with the details shown below. On clicking the Save button, the next screen gives an option to view the dashboard. On choosing to view the dashboard, we get the following output, where we can see the dashboard and options to edit, export, or delete it.

Adding a Panel to the Dashboard

We can add a second chart to the dashboard by adding a new panel containing the chart. Below is the bar chart and its query which we are going to add to the above dashboard. Next, we fill in the details for the second chart and click Save as shown in the image below. Finally, we get a dashboard which contains both charts in two different panels. As you can see in the image below, we can edit the dashboard to add more panels, and we can add more input elements such as Text, Radio, and Dropdown buttons to create more sophisticated dashboards.
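The chapter does not reproduce the underlying query itself; a hedged sketch of a search that counts files by weekday, assuming the same web_application data and a file field as in earlier chapters (date_wday is a default Splunk datetime field), might look like:

```spl
host="web_application" | stats count(file) BY date_wday
```

Any search whose Statistics tab shows a table like this can be saved as a dashboard panel via Save As → Dashboard Panel.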
Splunk – Tags
Tags are used to assign names to specific field and value combinations. These fields can be event type, host, source, source type, etc. You can also use a tag to group a set of field values together, so that you can search for them with one command. For example, you can tag all the different files generated on Monday with a tag named mon_files.

To find the field-value pair we are going to tag, we need to expand an event and locate the field to be considered. The image below shows how we can expand an event to see the fields.

Creating Tags

We create tags by adding the tag value to a field-value pair using the Edit Tags option, as shown below. We choose the field under the Actions column. The next screen prompts us to define the tag. For the status field, we choose the status values 503 and 505 and assign a tag named server_error as shown below. We have to do this one by one, choosing two events, one with status value 503 and one with 505. The image below shows the method for status value 503. We repeat the same steps for an event with status value 505.

Searching Using Tags

Once the tags are created, we can search for events containing the tag by simply writing the tag name in the search bar. In the image below, we see all the events which have status 503 or 505.
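A hedged sketch of tag-based searches using the tag name from this chapter (the tag::field form restricts the match to tags on a specific field):

```spl
tag=server_error

tag::status=server_error
```

The first query returns events carrying the server_error tag on any field; the second returns only events whose status field carries that tag, which in this chapter means status 503 or 505.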
Splunk – Search Macros
Search macros are reusable blocks of Search Processing Language (SPL) that you can insert into other searches. They are used when you want to apply the same search logic dynamically to different parts or values of a data set. They can take arguments dynamically, and the search result is updated as per the new values.

Macro Creation

To create a search macro, we go to Settings → Advanced Search → Search macros → Add new. This brings up the screen below, where we start creating the macro.

Macro Scenario

We want to show various statistics about the file size from the web_applications log, namely the max, min, and avg values of the file size, using the bytes field in the log. The result should display these statistics for each file listed in the log. So the type of statistic is dynamic in nature; the name of the stats function will be passed as an argument to the macro.

Defining the Macro

Next, we define the macro by setting various properties as shown in the screen below. The name of the macro contains (1), indicating that there is one argument to be passed to the macro when it is used in the search string. fun is the argument which is passed to the macro during execution in the search query.

Using the Macro

To use the macro, we make it a part of the search string. On passing different values for the argument, we see different results as expected. Consider finding the average size in bytes of the files. We pass avg as the argument and get the result shown below. The macro is enclosed in backtick (`) characters in the search query. Similarly, if we want the maximum file size for each of the files present in the log, we use max as the argument. The result is shown below.
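The chapter's macro can be sketched under stated assumptions: suppose the macro is named filestats(1) with a single argument fun, and its definition substitutes the function name through the $fun$ token (the macro name, definition, and field names are assumptions consistent with the scenario above). The Definition box of the macro would contain:

```spl
stats $fun$(bytes) AS filesize BY file
```

It is then invoked in a search by enclosing the macro call in backticks:

```spl
source="web_applications" | `filestats(avg)`
```

Passing max or min instead of avg swaps the statistics function without changing any other part of the search.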
Splunk – Data Ingestion
Data ingestion in Splunk happens through the Add Data feature, which is part of the Search & Reporting app. After logging in, the Splunk interface home screen shows the Add Data icon as shown below. On clicking this button, we are presented with a screen to select the source and format of the data we plan to push to Splunk for analysis.

Gathering the Data

We can get the data for analysis from the official website of Splunk. Save this file and unzip it on your local drive. On opening the folder, you can find three files in different formats; they are log data generated by some web apps. We can also gather another set of data provided by Splunk, available from the official Splunk webpage. We will use data from both these sets to understand the working of various features of Splunk.

Uploading Data

Next, we choose the file secure.log from the folder mailsv, which we have kept on our local system as mentioned in the previous paragraph. After selecting the file, we move to the next step using the green Next button in the top right corner.

Selecting the Source Type

Splunk has an in-built feature to detect the type of the data being ingested. It also gives the user an option to choose a data type different from the one chosen by Splunk. On clicking the source type drop-down, we can see the various data types that Splunk can ingest and enable for searching. In the current example given below, we choose the default source type.

Input Settings

In this step of data ingestion, we configure the host name from which the data is being ingested. Following are the options to choose from for the host name −

Constant value − The complete host name where the source data resides.

regex on path − Used when you want to extract the host name with a regular expression. Enter the regex for the host you want to extract in the Regular expression field.
segment in path − Used when you want to extract the host name from a segment in your data source's path. Enter the segment number in the Segment number field. For example, if the path to the source is /var/log/ and you want the third segment (the host server name) to be the host value, enter 3.

Next, we choose the index type to be created on the input data for searching. We choose the default index strategy. A summary index only creates a summary of the data through aggregation and creates an index on it, while the history index is for storing the search history. This is clearly depicted in the image below.

Review Settings

After clicking the Next button, we see a summary of the settings we have chosen. We review it and choose Next to finish uploading the data. When the load finishes, the screen below appears, showing the successful data ingestion and further possible actions we can take on the data.
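Once ingestion completes, a hedged sanity check in the search bar confirms that the events are searchable (the source name secure.log follows this chapter; the index name main is an assumption based on the default index):

```spl
index=main source="secure.log" | head 10
```

If the upload succeeded, this returns the ten most recent events from the file, with the host value set according to the input settings chosen above.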
Splunk – Field Searching
When Splunk reads the uploaded machine data, it interprets the data and divides it into many fields, each of which represents a single logical fact about the data record. For example, a single record of information may contain the server name, the timestamp of the event, and the type of the event being logged (a login attempt, an HTTP response, etc.). Even in the case of unstructured data, Splunk tries to divide the fields into key-value pairs or separate them based on the data types they have (numeric, string, etc.).

Continuing with the data uploaded in the previous chapter, we can see the fields from the secure.log file by clicking on the show fields link, which opens up the following screen. We can notice the fields Splunk has generated from this log file.

Choosing the Fields

We can choose which fields are displayed by selecting or deselecting fields from the list of all fields. Clicking on all fields opens a window showing the list of all the fields. Some of these fields have check marks against them, showing that they are already selected. We can use the check boxes to choose our fields for display. Besides the name of each field, the list displays the number of distinct values the field has, its data type, and the percentage of events in which the field is present.

Field Summary

Very detailed statistics for every selected field become available by clicking on the name of the field. This view shows all the distinct values for the field, their counts, and their percentages.

Using Fields in Search

Field names can also be entered into the search box along with specific values for the search. In the example below, we aim to find all the records for the date 15th Oct for the host named mailsecure_log. We get the results for this specific date.
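A hedged sketch of such a field-based search (the host name follows the chapter; date_mday and date_month are default Splunk datetime fields, and their use here is an assumption about how the date filter was expressed):

```spl
host="mailsecure_log" date_mday=15 date_month=october
```

Each field=value pair in the search bar restricts the result set to events where that field carries that exact value.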