How to display search results using the Splunk SDK for Java

After you run a search, you can retrieve different output from the search job:

  • Events: The untransformed events of the search.
  • Results: The transformed results of the search after processing has been completed. If the search does not have transforming commands, the results are the same as the events. The result count will be less than the event count if there are transforming commands.
  • Results preview: A preview of a search that is still in progress, or results from a real-time search. When the search is complete, the preview results are the same as the results. You must enable previews for non-real-time searches (previews are enabled automatically for real-time searches).
  • Summary: Summary information about the fields of a search from the results that have been read thus far. Set "status_buckets" on the search job to a positive value to access this data.
  • Timeline: The event distribution over time of the untransformed events that have been read thus far. Set "status_buckets" on the search job to a positive value to access this data.

This output is returned as a stream in XML, JSON, JSON_COLS, JSON_ROWS, CSV, ATOM, or RAW format. For examples, see Sample output in different formats below.

You can display the raw output using standard Java classes or write your own parser. But for convenience, the SDK includes results readers for XML, CSV, and JSON that properly parse and format the results for you, and handle the idiosyncrasies of each output type for each Splunk Enterprise version. For a comparison of XML output displayed using standard Java classes versus the SDK's XML results reader, see Results reader comparison below.

The search results APIs

Retrieve a search job using the Job class (or for streaming searches, retrieve the stream using the InputStream class). From the search job, you can retrieve events, results, preview results, the summary, and timeline information:

  • The Job.getEvents method retrieves events from a search job. Use the JobEventsArgs class to specify additional arguments to the method.
  • The Job.getResults method retrieves results from a search job. Use the JobResultsArgs class to specify additional arguments to the method.
  • The Job.getResultsPreview method retrieves result previews from a search job. Use the JobResultsPreviewArgs class to specify additional arguments to the method.
  • The Job.getSummary method retrieves summary data from a search job. Use the JobSummaryArgs class to specify additional arguments to the method.
  • The Job.getTimeline method retrieves timeline data from a search job.
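
For example, here's a minimal sketch (not taken from the original examples) that creates a job with "status_buckets" set and prints the raw XML streams returned by Job.getSummary and Job.getTimeline. The search string, the status_buckets value, and the variable names are illustrative, and the sketch assumes a connected "service" object as in the other examples:

// A sketch: create a job with status_buckets set so that summary and
// timeline data are available, then print the raw XML streams
JobArgs statusArgs = new JobArgs();
statusArgs.setStatusBuckets(300);   // any positive value enables summary/timeline data
Job summaryJob = service.getJobs().create("search index=_internal | head 100", statusArgs);

// Wait for the job to finish
while (!summaryJob.isDone()) {
    Thread.sleep(500);
}

// Both methods return a raw XML stream; print each one
for (InputStream stream : new InputStream[] { summaryJob.getSummary(), summaryJob.getTimeline() }) {
    BufferedReader reader = new BufferedReader(new InputStreamReader(stream, "UTF-8"));
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
    reader.close();
}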

To display results with the results readers, use the ResultsReaderXml, ResultsReaderJson, or ResultsReaderCsv class. For results from an export search, use a multi-results reader (MultiResultsReaderXml or MultiResultsReaderJson) to parse the multiple result sets that are returned.

Code examples

This section provides examples of how to use the job APIs, assuming you have already connected to a Splunk Enterprise instance. For the parameters these methods accept, see Event, results, and results preview parameters at the end of this topic.

To display results without a reader

You can use the basic built-in capabilities of Java to read results streams in different output formats, or roll your own reader.

This example shows how to display a simple XML results stream using standard Java classes. You don't need to set the output mode explicitly because XML is the default format.

// Create a simple search job
String mySearch = "search * | head 5";
Job job = service.getJobs().create(mySearch);

// Wait for the job to finish
while (!job.isDone()) {
    Thread.sleep(500);
}

// Display results
InputStream results = job.getResults();
String line = null;
System.out.println("Results from the search job as XML:\n");
BufferedReader br = new BufferedReader(new InputStreamReader(results, "UTF-8"));
while ((line = br.readLine()) != null) {
    System.out.println(line);
}
br.close();

To display results using a results reader

For convenience, this SDK includes results readers for XML, JSON, and CSV that parse and format results for you, and handle the idiosyncrasies of each output type for each Splunk Enterprise version:

  • Use the ResultsReaderXml class for XML, which is the default format. The XML results reader is included in the splunk-*.jar file.
  • Use the ResultsReaderJson class for JSON and add /splunk-sdk-java/dist/gson-2.1.jar to your build path.
  • Use the ResultsReaderCsv class for CSV and add /splunk-sdk-java/dist/opencsv-2.3.jar to your build path.

For more about building these jar files, see Installation.

This example shows how to set the output mode to JSON and display the results stream using the ResultsReaderJson class:

// Create a simple search job
String mySearch = "search * | head 5";
Job job = service.getJobs().create(mySearch);

// Wait for the job to finish
while (!job.isDone()) {
    Thread.sleep(500);
}

// Specify JSON as the output mode for results
JobResultsArgs resultsArgs = new JobResultsArgs();
resultsArgs.setOutputMode(JobResultsArgs.OutputMode.JSON);

// Display results in JSON using ResultsReaderJson
InputStream results = job.getResults(resultsArgs);
ResultsReaderJson resultsReader = new ResultsReaderJson(results);
HashMap<String, String> event;
System.out.println("\nFormatted results from the search job as JSON\n");
while ((event = resultsReader.getNextEvent()) != null) {
    for (String key: event.keySet())
        System.out.println("   " + key + ":  " + event.get(key));
}
resultsReader.close();
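
The CSV results reader follows the same pattern. Here's a hedged sketch of the CSV variant, assuming the same completed job ("job") as in the example above; only the output mode and the reader class change:

// A sketch of the CSV variant: request CSV output and parse it with ResultsReaderCsv
JobResultsArgs csvArgs = new JobResultsArgs();
csvArgs.setOutputMode(JobResultsArgs.OutputMode.CSV);

InputStream csvResults = job.getResults(csvArgs);
ResultsReaderCsv csvReader = new ResultsReaderCsv(csvResults);
HashMap<String, String> csvEvent;
while ((csvEvent = csvReader.getNextEvent()) != null) {
    for (String key: csvEvent.keySet())
        System.out.println("   " + key + ":  " + csvEvent.get(key));
}
csvReader.close();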

ResultsReader or MultiResultsReader?

If you're not sure whether to use a ResultsReader or a MultiResultsReader, this section is for you.

You should only use a MultiResultsReader if you're running an export search and you want to retrieve preview results. The rest of the time, you should use a ResultsReader. Read on for the benefits of each reader type.

Use a ResultsReader...

  • With normal, blocking, oneshot, and real-time search results.
  • When you don't need to access preview events. (This reader skips them, unless the search is run in normal mode with previews enabled. For more information, see To display preview results.)
  • When working with search jobs (versus searches that return a stream).
  • When you want to return results in comma-separated values (CSV) format. (Both readers support JSON and XML formats.)

Use a MultiResultsReader...

  • With export search results. (ResultsReader will also work with export search results, but will not allow you to access preview events.)
  • When you need to access preview events. (The exception is when running a normal mode search with previews enabled. In this situation, you can use the ResultsReader. For more information, see To display preview results.)
  • When working with searches that result in a stream, not a search job.
  • With searches with a very large data set.

To paginate through a large set of results

The maximum number of results you can retrieve at a time from your search results is determined by the maxresultrows setting, which is defined in the [restapi] stanza of the limits.conf configuration file. Here's a quick way to find out what your system setting is:

// Find out how many results your system is configured to return
Entity restApi = service.getConfs().get("limits").get("restapi");
int maxResults = Integer.parseInt((String)restApi.get("maxresultrows"));
System.out.println("Your system is configured to return a maximum of " + maxResults + " results");

However, we don't recommend changing the default value of 50,000. If your job has more results than this limit, you can retrieve your results in sets (0-49999, then 50000-99999, and so on), using the "count" and "offset" parameters to define how many results to retrieve at a time:

  1. Set the count to a value up to the size of maxresultrows to define the number of results in a set.
  2. Retrieve this set of results.
  3. Increment the offset by the count to retrieve the next set.
  4. Repeat until you've retrieved all of your results.

The example below shows how to retrieve 100 search results in sets of 10, and uses the ResultsReaderXml class to parse and format the results.

// Create a simple job that returns 100 results
String mySearch = "search * | head 100";
Job job = service.getJobs().create(mySearch);

// Wait for the job to finish
while (!job.isDone()) {
    Thread.sleep(500);
}

// Page through results by looping through sets of results
int resultCount = job.getResultCount(); // Number of results this job returned
int x = 0;          // Result counter
int offset = 0;     // Start at result 0
int count = 10;     // Get sets of 10 results at a time                    

// Loop through each set of results
while (offset < resultCount) {
    CollectionArgs outputArgs = new CollectionArgs();
    outputArgs.setCount(count);
    outputArgs.setOffset(offset);

    // Get the search results and display them
    InputStream results = job.getResults(outputArgs);
    ResultsReaderXml resultsReader = new ResultsReaderXml(results);
    HashMap<String, String> event;
   
    while ((event = resultsReader.getNextEvent()) != null) {
        System.out.println("\n***** RESULT " + x++ + " *****\n");
        for (String key: event.keySet())
            System.out.println("   " + key + ":  " + event.get(key));
    }
    resultsReader.close();
   
    // Increase the offset to get the next set of results
    offset = offset + count;
}

Another option when working with a very large data set is to use an export search, where data is streamed directly from a search rather than saved to the server as a search job. For more, see To run an export search and To work with results from an export search.

To display preview results

You can display a preview of the results of a search that is in progress, as long as two conditions are met:

  • The search must be run in normal execution mode ("exec_mode" is "normal"). Previews aren't available for blocking searches (the search job ID is not available until the search is done) or streaming searches (results are returned as they become available anyway).
  • Previews must be enabled ("preview" is "1"). By default, previews are only enabled for real-time searches, and searches with "status_buckets" set to a positive value. Use the Job.enablePreview method to enable previews for an existing search job.

To display the previews, run the search, enable previews for the search job, then retrieve the preview results from it. By default, the most recent 100 previews are retrieved. To change this number, set a value for "count". Use the "offset" value to page through large sets of previews.

The following example runs a normal search, enables preview for the search job, and then displays preview results while the search runs. For an example of a real-time search, see To run a real-time search.

// Set up the job properties
String mySearch = "search * | head 50000";

// Create an argument map for the job arguments:
JobArgs jobArgs = new JobArgs();
jobArgs.setExecutionMode(JobArgs.ExecutionMode.NORMAL);

// Create the job
Job job = service.search(mySearch, jobArgs);
job.enablePreview();
job.update();

// Wait for the job to be ready
while (!job.isReady()) {
    Thread.sleep(500);
}

// Display previews using the built-in XML parser 
int countPreview=0;  // count the number of previews displayed
int countBatch=0;    // count the number of times previews are retrieved
while (!job.isDone()) {
    JobResultsPreviewArgs previewargs = new JobResultsPreviewArgs();
    previewargs.setCount(500);  // Get 500 previews at a time
    previewargs.setOutputMode(JobResultsPreviewArgs.OutputMode.XML);

    InputStream results =  job.getResultsPreview(previewargs);
    ResultsReaderXml resultsReader = new ResultsReaderXml(results);
    HashMap<String, String> event;
    while ((event = resultsReader.getNextEvent()) != null) {
        System.out.println("BATCH " + countBatch + "\nPREVIEW " + countPreview++ + " ********");
        for (String key: event.keySet())
            System.out.println("   " + key + ":  " + event.get(key));
    }
    countBatch++;
    resultsReader.close();
}
System.out.println("Job is done with " + job.getResultCount() + " results");

To work with results from an export search

Working with search results from export searches is a little different from working with results from regular searches:

  • A reporting (transforming) search returns a set of previews followed by the final events, each as separate elements.
  • A non-reporting (non-transforming) search returns events as they are read from the index, each as separate elements.
  • A real-time search returns multiple sets of previews, each preview as a separate element.
  • For JSON output, each result set is not returned as a single JSON object, but rather each row is an individual object, where rows are separated by a new line and the last row of the set is indicated by "lastrow":true.

  • Here's sample JSON output that shows two result sets, each with five rows:
    {"preview":true,"offset":0,"result":{"sourcetype":"eventgen-2","count":"58509"}}
    {"preview":true,"offset":1,"result":{"sourcetype":"splunk_web_service","count":"119"}}
    {"preview":true,"offset":2,"result":{"sourcetype":"splunkd","count":"4153"}}
    {"preview":true,"offset":3,"result":{"sourcetype":"splunkd_access","count":"12"}}
    {"preview":true,"offset":4,"lastrow":true,"result":{"sourcetype":"splunkd_stderr","count":"2"}}
    {"preview":true,"offset":0,"result":{"sourcetype":"eventgen-2","count":"60886"}}
    {"preview":true,"offset":1,"result":{"sourcetype":"splunk_web_service","count":"119"}}
    {"preview":true,"offset":2,"result":{"sourcetype":"splunkd","count":"4280"}}
    {"preview":true,"offset":3,"result":{"sourcetype":"splunkd_access","count":"12"}}
    {"preview":true,"offset":4,"lastrow":true,"result":{"sourcetype":"splunkd_stderr","count":"2"}}
    

    This format allows results to be sent as a continuous stream of JSON data that is still easy to parse.

So, we recommend using the SDK's multi-results readers to parse the output—we've already done some of the heavy lifting here, and these results readers handle the output appropriately. Display the results stream in XML or JSON, and use one of the MultiResultsReader classes to parse and format the results.

The example below shows how to display results from a real-time export search with XML output, using a 30-second window:

// Set up the job properties
String mySearch = "search index=_internal | head 10";

// Set up a real-time export with a 30-second window
JobExportArgs jobArgs = new JobExportArgs();
jobArgs.setSearchMode(JobExportArgs.SearchMode.REALTIME);
jobArgs.setEarliestTime("rt-30s");
jobArgs.setLatestTime("rt");
jobArgs.setOutputMode(JobExportArgs.OutputMode.XML);

// Create the job
InputStream exportStream = service.export(mySearch, jobArgs);

//Display previews
MultiResultsReaderXml multiResultsReader =
        new MultiResultsReaderXml(exportStream);

int counterSet = 0;  // count the number of results sets
for (SearchResults searchResults : multiResultsReader)
{
    System.out.println("Result set " + counterSet++ + " ********");
    int counterEvent = 0;  // count the number of events in each set
    for (Event event : searchResults) {
        System.out.println("Event " + counterEvent++ + " --------");
        for (String key: event.keySet())
            System.out.println("   " + key + ":  " + event.get(key));
    }
}

multiResultsReader.close();
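
If you prefer JSON output for the export, the only changes from the example above are the output mode and the reader class. Here's a hedged sketch of that substitution, reusing the same "mySearch" and "jobArgs" from the example above:

// A sketch of the JSON variant of the same export: switch the output mode
// and use MultiResultsReaderJson instead of MultiResultsReaderXml
jobArgs.setOutputMode(JobExportArgs.OutputMode.JSON);
InputStream jsonStream = service.export(mySearch, jobArgs);

MultiResultsReaderJson jsonReader = new MultiResultsReaderJson(jsonStream);
for (SearchResults searchResults : jsonReader) {
    for (Event event : searchResults) {
        for (String key: event.keySet())
            System.out.println("   " + key + ":  " + event.get(key));
    }
}
jsonReader.close();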

This next example shows how to display previews from a normal export search. This search is useful when you are performing a reporting (transforming) search on a large data set and you want to view previews while the search runs.

// Set up the job properties
String mySearch = "search * | stats count by host";

JobExportArgs jobArgs = new JobExportArgs();
jobArgs.setOutputMode(JobExportArgs.OutputMode.XML);

// Create the job
InputStream stream = service.export(mySearch, jobArgs);

//Display previews
MultiResultsReaderXml multiResultsReader =
        new MultiResultsReaderXml(stream);

int counterSet = 0;
for (SearchResults searchResults : multiResultsReader)
{
    // Display whether the result set is a preview (search in progress) or
    // final (search is finished)
    String resultSetType = searchResults.isPreview() ? "Preview":"Final";
    System.out.println(resultSetType + " result set " + counterSet++ + " ********");
    int counterEvent = 0;
    for (Event event : searchResults) {
        System.out.println("Event " + counterEvent++ + " --------");
        for (String key: event.keySet())
            System.out.println("   " + key + ":  " + event.get(key));
    }
}

multiResultsReader.close();

For an example showing a simple export search, see To run an export search.

Sample output in different formats

The following is sample output in different formats for the search "search index=_internal | head 1":

***** ATOM *****

<?xml version="1.0" encoding="UTF-8"?>
<!--This is to override browser formatting; see server.conf[httpServer] to disable. . . .-->
<?xml-stylesheet type="text/xml" href="/static/atom.xsl"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:s="http://dev.splunk.com/ns/rest">
  <title>Search Results</title>
  <id>/services/search/jobs/1364517522.90</id>
  <updated>2013-03-28T17:38:43-07:00</updated>
  <generator build="143156" version="5.0.1"/>
  <author>
    <name>Splunk</name>
  </author>
  <entry>
    <title>Result Offset 0</title>
    <id>/services/search/jobs/1364517522.90/results?output_mode=atom&count=1&offset=0</id>
    <updated>2013-03-28T17:38:43-07:00</updated>
    <link href="/services/search/jobs/1364517522.90/results?output_mode=atom&count=1&offset=0" rel="alternate"/>
    <content type="text/xml">
      <s:dict>
        <s:key name="_bkt">_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF</s:key>
        <s:key name="_cd">43:3672966</s:key>
        <s:key name="_indextime">1364517495</s:key>
        <s:key name="_raw">03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0</s:key>
        <s:key name="_serial">0</s:key>
        <s:key name="_si">localhost_internal</s:key>
        <s:key name="_sourcetype">splunkd</s:key>
        <s:key name="_subsecond">.565</s:key>
        <s:key name="_time">2013-03-28T17:38:14.565-07:00</s:key>
        <s:key name="host">localhost</s:key>
        <s:key name="index">_internal</s:key>
        <s:key name="linecount">1</s:key>
        <s:key name="source">/Applications/splunk/var/log/splunk/metrics.log</s:key>
        <s:key name="sourcetype">splunkd</s:key>
        <s:key name="splunk_server">localhost</s:key>
      </s:dict>
    </content>
  </entry>
</feed>

***** CSV *****

"_bkt","_cd","_indextime","_raw","_serial","_si","_sourcetype","_subsecond","_time",host,index,linecount,source,sourcetype,"splunk_server","tag::host"
"_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF","43:3672966",1364517495,"03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0",0,"localhost
_internal",splunkd,".565","2013-03-28T17:38:14.565-07:00","localhost","_internal",1,"/Applications/splunk/var/log/splunk/metrics.log",splunkd,"localhost",

***** JSON *****

{"preview":false,"init_offset":0,"messages":[{"type":"DEBUG","text":"base lispy: [ AND index::_internal ]"},{"type":"DEBUG","text":"search context: user=\"admin\", app=\"search\", bs-pathname=\"/Applications/splunk/etc\""}],"results":[{"_bkt":"_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF","_cd":"43:3672966","_indextime":"1364517495","_raw":"03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0","_serial":"0","_si":"localhost\n_internal","_sourcetype":"splunkd","_subsecond":".565","_time":"2013-03-28T17:38:14.565-07:00","host":"localhost","index":"_internal","linecount":"1","source":"/Applications/splunk/var/log/splunk/metrics.log","sourcetype":"splunkd","splunk_server":"localhost"}]}

***** JSON_COLS *****

{"preview":false,"init_offset":0,"messages":[{"type":"DEBUG","text":"base lispy: [ AND index::_internal ]"},{"type":"DEBUG","text":"search context: user=\"admin\", app=\"search\", bs-pathname=\"/Applications/splunk/etc\""}],"fields":["_bkt","_cd","_indextime","_raw","_serial","_si","_sourcetype","_subsecond","_time","host","index","linecount","source","sourcetype","splunk_server","tag::host"],"columns":[["_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF"],["43:3672966"],["1364517495"],["03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0"],["0"],[["localhost","_internal"]],["splunkd"],[".565"],["2013-03-28T17:38:14.565-07:00"],["localhost"],["_internal"],["1"],["/Applications/splunk/var/log/splunk/metrics.log"],["splunkd"],["localhost"],[null]]}

***** JSON_ROWS *****

{"preview":false,"init_offset":0,"messages":[{"type":"DEBUG","text":"base lispy: [ AND index::_internal ]"},{"type":"DEBUG","text":"search context: user=\"admin\", app=\"search\", bs-pathname=\"/Applications/splunk/etc\""}],"fields":["_bkt","_cd","_indextime","_raw","_serial","_si","_sourcetype","_subsecond","_time","host","index","linecount","source","sourcetype","splunk_server","tag::host"],"rows":[["_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF","43:3672966","1364517495","03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0","0",["localhost","_internal"],"splunkd",".565","2013-03-28T17:38:14.565-07:00","localhost","_internal","1","/Applications/splunk/var/log/splunk/metrics.log","splunkd","localhost",null]]}

***** RAW *****

03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0

***** XML *****

<?xml version='1.0' encoding='UTF-8'?>
<results preview='0'>
<meta>
<fieldOrder>
<field>_bkt</field>
<field>_cd</field>
<field>_indextime</field>
<field>_raw</field>
<field>_serial</field>
<field>_si</field>
<field>_sourcetype</field>
<field>_subsecond</field>
<field>_time</field>
<field>host</field>
<field>index</field>
<field>linecount</field>
<field>source</field>
<field>sourcetype</field>
<field>splunk_server</field>
<field>tag::host</field>
</fieldOrder>
</meta>
    <result offset='0'>
        <field k='_bkt'>
            <value><text>_internal~43~6C9BAA0F-A852-421C-8399-630009F9FFFF</text></value>
        </field>
        <field k='_cd'>
            <value><text>43:3672966</text></value>
        </field>
        <field k='_indextime'>
            <value><text>1364517495</text></value>
        </field>
        <field k='_raw'><v xml:space='preserve' trunc='0'>03-28-2013 17:38:14.565 -0700 INFO  Metrics - group=tpool, name=indexertpool, qsize=0, workers=6, qwork_units=0</v></field>
        <field k='_serial'>
            <value><text>0</text></value>
        </field>
        <field k='_si'>
            <value><text>localhost</text></value>
            <value><text>_internal</text></value>
        </field>
        <field k='_sourcetype'>
            <value><text>splunkd</text></value>
        </field>
        <field k='_subsecond'>
            <value><text>.565</text></value>
        </field>
        <field k='_time'>
            <value><text>2013-03-28T17:38:14.565-07:00</text></value>
        </field>
        <field k='host'>
            <value><text>localhost</text></value>
        </field>
        <field k='index'>
            <value><text>_internal</text></value>
        </field>
        <field k='linecount'>
            <value><text>1</text></value>
        </field>
        <field k='source'>
            <value><text>/Applications/splunk/var/log/splunk/metrics.log</text></value>
        </field>
        <field k='sourcetype'>
            <value><text>splunkd</text></value>
        </field>
        <field k='splunk_server'>
            <value><text>localhost</text></value>
        </field>
    </result>
</results>

Results reader comparison

Here's a single search result. For comparison, the output is displayed using the standard Java classes and the SDK's XML results reader:

***** Output from standard Java classes *****

<?xml version='1.0' encoding='UTF-8'?>
<results preview='0'>
<meta>
<fieldOrder>
<field>_bkt</field>
<field>_cd</field>
<field>_indextime</field>
<field>_raw</field>
<field>_serial</field>
<field>_si</field>
<field>_sourcetype</field>
<field>_time</field>
<field>host</field>
<field>index</field>
<field>linecount</field>
<field>source</field>
<field>sourcetype</field>
<field>splunk_server</field>
<field>tag::host</field>
</fieldOrder>
</meta>
    <result offset='0'>
        <field k='_bkt'>
            <value><text>main~0~6C9BAA0F-A852-421C-8399-630009F9FFFF</text></value>
        </field>
        <field k='_cd'>
            <value><text>0:698116</text></value>
        </field>
        <field k='_indextime'>
            <value><text>1349210437</text></value>
        </field>
        <field k='_raw'><v xml:space='preserve' trunc='0'>178.19.3.39 - - [01/Oct/2012:23:59:34] "GET /flower_store/category.screen?category_id=CANDY HTTP/1.1" 200 10567 "http://mystore.splunk.com/flower_store/cart.do?action=purchase&itemId=EST-14&JSESSIONID=SD5SL10FF8ADFF3" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.10) Gecko/20070223 CentOS/1.5.0.10-0.1.el4.centos Firefox/1.5.0.10" 3187 1245</v></field>
        <field k='_serial'>
            <value><text>0</text></value>
        </field>
        <field k='_si'>
            <value><text>localhost</text></value>
            <value><text>main</text></value>
        </field>
        <field k='_sourcetype'>
            <value><text>access_combined_wcookie</text></value>
        </field>
        <field k='_time'>
            <value><text>2012-10-01T23:59:34.000-07:00</text></value>
        </field>
        <field k='host'>
            <value><text>localhost</text></value>
        </field>
        <field k='index'>
            <value><text>main</text></value>
        </field>
        <field k='linecount'>
            <value><text>1</text></value>
        </field>
        <field k='source'>
            <value><text>Sampledata.zip:./apache3.splunk.com/access_combined.log</text></value>
        </field>
        <field k='sourcetype'>
            <value><text>access_combined_wcookie</text></value>
        </field>
        <field k='splunk_server'>
            <value><text>localhost</text></value>
        </field>
    </result>
</results>


***** Output from the XML results reader *****

   _sourcetype:  access_combined_wcookie
   index:  main
   host:  localhost
   _cd:  0:698116
   _serial:  0
   _si:  localhost,main
   splunk_server:  localhost
   linecount:  1
   _indextime:  1349210437
   _raw:  178.19.3.39 - - [01/Oct/2012:23:59:34] "GET /flower_store/category.screen?category_id=CANDY HTTP/1.1" 200 10567 "http://mystore.splunk.com/flower_store/cart.do?action=purchase&itemId=EST-14&JSESSIONID=SD5SL10FF8ADFF3" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.10) Gecko/20070223 CentOS/1.5.0.10-0.1.el4.centos Firefox/1.5.0.10" 3187 1245
   source:  Sampledata.zip:./apache3.splunk.com/access_combined.log
   sourcetype:  access_combined_wcookie
   _bkt:  main~0~6C9BAA0F-A852-421C-8399-630009F9FFFF
   _time:  2012-10-01T23:59:34.000-07:00

Event, results, and results preview parameters

Set these parameters using the setters of the corresponding argument classes: JobEventsArgs, JobResultsArgs, and JobResultsPreviewArgs.

For more, see the GET search/jobs/{search_id}/events, GET search/jobs/{search_id}/results, and GET search/jobs/{search_id}/results_preview endpoints in the REST API Reference Manual.

Parameter | Description | Applies to
count | A number that indicates the maximum number of results to return. | events, results, results preview
earliest_time | A time string that specifies the earliest time in the time range to search. The time string can be a UTC time (with fractional seconds), a relative time specifier (to now), or a formatted time string. For a real-time search, specify "rt". | events
f | A string that contains a field to return for the event set. | events, results, results preview
field_list | A string that contains a comma-separated list of fields to return for the event set. | events, results, results preview
latest_time | A time string that specifies the latest time in the time range to search. The time string can be a UTC time (with fractional seconds), a relative time specifier (to now), or a formatted time string. For a real-time search, specify "rt". | events
max_lines | The maximum number of lines that any single event's "_raw" field should contain. | events
offset | A number that specifies the index of the first result (inclusive) from which to begin returning data. This value is 0-indexed. | events, results, results preview
output_mode | Specifies the output format of the results (XML, JSON, JSON_COLS, JSON_ROWS, CSV, ATOM, or RAW). | events, results, results preview
output_time_format | A string that contains a UTC time format. | events
search | A string that contains the post-processing search to apply to results. | events, results, results preview
segmentation | A string that contains the type of segmentation to perform on the data. | events
time_format | A string that contains the expression to convert a formatted time string from {start,end}_time into UTC seconds. | events
truncation_mode | A string that specifies how "max_lines" should be achieved ("abstract" or "truncate"). | events
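
As an illustration, here's a hedged sketch that sets a few of these parameters through JobResultsArgs setters, assuming an existing search job ("job") as in the earlier examples. The setter names are assumed to mirror the parameters above, and the values are illustrative:

// A sketch: request the first 20 results, return only two fields, as JSON
JobResultsArgs pageArgs = new JobResultsArgs();
pageArgs.setCount(20);                                    // count
pageArgs.setOffset(0);                                    // offset
pageArgs.setFieldList(new String[] {"host", "source"});   // field_list
pageArgs.setOutputMode(JobResultsArgs.OutputMode.JSON);   // output_mode

InputStream pagedResults = job.getResults(pageArgs);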