How to use manual batching with HTTP Event Collector stream for Bunyan

This topic demonstrates how to manually batch events before sending them to HTTP Event Collector (HEC) on Splunk Enterprise or Splunk Cloud using a Bunyan stream. The manual_batching.js example, included in the examples directory of the HTTP Event Collector stream for Bunyan package, is a sample implementation. It is also reproduced below.

Note: The examples are not installed when using the npm installation method. To obtain copies of the examples, download the Splunk HTTP Event Collector stream for Bunyan package.

Example walkthrough

This example includes logic to batch events before sending them to HTTP Event Collector on Splunk Enterprise or Splunk Cloud. To send events manually, use the HEC Bunyan stream's flush() function.

First, we add require statements for Bunyan and the HEC stream for Bunyan.
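In your own app, these statements will look like the following (the example file itself uses a relative path, require("../index"), because it runs from inside the package):

var splunkBunyan = require("splunk-bunyan-logger");
var bunyan = require("bunyan");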

Then, we declare a config variable to store the configuration information for the Splunk Enterprise instance or Splunk Cloud server. Only the token property is required, but in this example, we've set the url and maxBatchCount properties. We set the maxBatchCount property to zero so that the batch is never sent automatically.
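Here is the corresponding configuration from the example below:

var config = {
    token: "your-token-here",
    url: "https://localhost:8088",
    maxBatchCount: 0
};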

Next, we create a Bunyan stream (splunkStream), plus an error handler.
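In the example, the stream and error handler look like this:

var splunkStream = splunkBunyan.createStream(config);

splunkStream.on("error", function(err, context) {
    // Handle errors here
    console.log("Error", err, "Context", context);
});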

Then, we create a logger (Logger) using the bunyan.createLogger() function, including a streams array as one of its inputs. Inside the streams array, we include splunkStream.
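From the example:

var Logger = bunyan.createLogger({
    name: "my logger",
    streams: [
        splunkStream
    ]
});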

Next, we define the event payload in the payload variable. We've added fields for the event data itself (temperature and chickenCount, in this case). Then we added several special keys to specify metadata to be assigned to the event data when HTTP Event Collector receives it. If any of these values (source, sourcetype, and so on) differ from the default values on the server, the values specified here override the defaults. Of course, your JavaScript app determines what goes into the actual payload contents.
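The first payload in the example looks like this:

var payload = {
    // Our important fields
    temperature: "70F",
    chickenCount: 500,

    // Special keys to specify metadata for HTTP Event Collector
    source: "chicken coop",
    sourcetype: "httpevent",
    index: "main",
    host: "farm.local"
};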

The payload is then queued for sending by calling Logger.info().
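For example:

console.log("Queuing payload", payload);
Logger.info(payload, "Chicken coop looks stable.");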

In our example, we include two event payloads in order to simulate batched events.

Finally, we send the payload to HTTP Event Collector by calling splunkStream.flush(). We then log the response from Splunk Enterprise or Splunk Cloud.
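The flush() call from the example, including its optional callback:

splunkStream.flush(function(err, resp, body) {
    // If successful, body will be { text: 'Success', code: 0 }
    console.log("Response from Splunk", body);
});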

manual_batching.js

/*
 * Copyright 2015 Splunk, Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License"): you may
 * not use this file except in compliance with the License. You may obtain
 * a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations
 * under the License.
 */

/**
 * This example shows how to batch events with the
 * Splunk Bunyan logger by manually calling flush.
 *
 * By setting maxBatchCount=0, events will be queued
 * until flush() is called.
 */

// Change to require("splunk-bunyan-logger");
var splunkBunyan = require("../index");
var bunyan = require("bunyan");

/**
 * Only the token property is required.
 * 
 * Here, maxBatchCount is set to 0.
 */
var config = {
    token: "your-token-here",
    url: "https://localhost:8088",
    maxBatchCount: 0
};
var splunkStream = splunkBunyan.createStream(config);

splunkStream.on("error", function(err, context) {
    // Handle errors here
    console.log("Error", err, "Context", context);
});

// Set up Bunyan, adding splunkStream to the array of streams
var Logger = bunyan.createLogger({
    name: "my logger",
    streams: [
        splunkStream
    ]
});

// Define the payload to send to HTTP Event Collector
var payload = {
    // Our important fields
    temperature: "70F",
    chickenCount: 500,

    // Special keys to specify metadata for HTTP Event Collector
    source: "chicken coop",
    sourcetype: "httpevent",
    index: "main",
    host: "farm.local"
};

// Queue the payload
console.log("Queuing payload", payload);
Logger.info(payload, "Chicken coop looks stable.");

var payload2 = {
    // Our important fields
    temperature: "75F",
    chickenCount: 600,

    // Special keys to specify metadata for HTTP Event Collector
    source: "chicken coop",
    sourcetype: "httpevent",
    index: "main",
    host: "farm.local"
};

// Queue the second payload
console.log("Queuing second payload", payload2);
Logger.info(payload2, "New chickens have arrived");

/**
 * Call flush manually.
 * This will send both payloads in a single
 * HTTP request.
 *
 * The callback for flush is optional.
 */
splunkStream.flush(function(err, resp, body) {
    // If successful, body will be { text: 'Success', code: 0 }
    console.log("Response from Splunk", body);
});