Create a Node.js Lambda function and upload it in a deployment package

This topic describes how to create an AWS Lambda Node.js function that sends data to HTTP Event Collector (HEC) using Splunk logging for JavaScript. You write the function, integrating Splunk logging for JavaScript and any other libraries you need, and then package everything in a ZIP file and upload it to the AWS Lambda console. To instead use the built-in console editor (where you will not have access to Splunk logging for JavaScript), see Create a Node.js Lambda function in the Console editor.

This documentation guides you through our simple Node.js Lambda example function in detail.

Note: Splunk strongly suggests you familiarize yourself with Programming Model (Node.js) in the AWS Lambda Developer Guide before continuing.

Example contents

The custom code that implements our example is contained within two JavaScript files:

  • index.js contains the event handler based on the skeleton from the previous section. This is the code that actually sends events to HTTP Event Collector.
  • loggerFactory.js contains code that assembles the logger events to send to HEC.

In addition to these JavaScript files, we also customized the package.json file, which contains configuration and dependency information for your project in JSON format.

Our dependencies, which we must package up with our code in a ZIP file for upload to the AWS Lambda console, include the following two packages:

  • bunyan, which is a JSON logging library for Node.js.
  • splunk-bunyan-logger, which is Splunk's complement to Bunyan, and includes Splunk logging for JavaScript as its own dependency.

For information about how to package up these components, see "Create the deployment package," later in this topic.

Example code

This section contains the contents of each custom file, plus an overview and explanation of each file.

index.js

var loggerFactory = require('./loggerFactory.js');

/**
 * Only the token property is required.
 * Defaults are listed explicitly.
 *
 * Alternatively, specify config.url like so:
 *
 * "https://localhost:8088/services/collector/event/1.0"
 */
var config = {
    token: "7496B593-0852-4120-8843-3B45B761AB36",
    url: "https://mysplunkserver:8088",
    source: "testSplunk", // functionName
    level: "info",
    autoFlush: false // batch events until flush is called
};

var logger = loggerFactory(config);

logger.info("Loading function testSplunk");

exports.handler = function(event, context) {
    var data = {"awsRequestId":context.awsRequestId};
    logger.info(data,'value1 =', event.key1);
    logger.info(data,'value2 =', event.key2);
    logger.info(data,'value3 =', event.key3);
    logger.streams[0].flush(function() {
        context.succeed(event.key1); // Signal completion; echo back the first key value
    });
};

First, we create a config variable to store the configuration information for the Splunk server. The values are as follows:

  • token: The HTTP Event Collector token with which to authenticate. For more information, see "Create an Event Collector token" in the Getting Data In Manual.
  • url: The full URL, including protocol and port, of the HEC endpoint on your Splunk instance. Alternatively, you can specify the host, path, protocol, and port properties individually; the path defaults to /services/collector/event/1.0, the protocol to https, and the port to 8088.
  • source: The source value to assign to the events sent by this function (here, the function name).
  • level: The logging level to use. This value can be one of the following: trace, debug, info, warn, or error.
  • autoFlush: Determines whether to send events immediately. We've set this to false here so that events are batched until flush is called.
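As the comment in index.js notes, a single url property can stand in for the individual connection properties. A minimal sketch of the equivalent piecewise configuration follows; the host value is a placeholder, while the path, protocol, and port shown are the library defaults:

```javascript
// Equivalent configuration using individual connection properties
// instead of a single url. The host is hypothetical; path, protocol,
// and port are the defaults used by Splunk logging for JavaScript.
var config = {
    token: "7496B593-0852-4120-8843-3B45B761AB36", // your HEC token
    host: "mysplunkserver",                        // placeholder hostname
    path: "/services/collector/event/1.0",         // HEC endpoint (default)
    protocol: "https",                             // default protocol
    port: 8088,                                    // default HEC port
    source: "testSplunk",
    level: "info",
    autoFlush: false
};

// The pieces combine into the same endpoint a url property would name
console.log(config.protocol + "://" + config.host + ":" + config.port + config.path);
```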

Next, we create a logger variable, passing in config to the loggerFactory. This calls the create() function that is defined within loggerFactory.js.

After logging an informational message, we declare the event handler function. This is the function that actually sends the event data. The event and context parameters are passed to the function, three key values are logged to the logger variable, and then the flush function is called, which sends the batched event data to HEC in a single request.
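The batch-then-flush flow can be sketched without the Splunk libraries. Here a stub logger stands in for the Bunyan/Splunk stream; the stub and its names are illustrative only, not part of the real splunk-bunyan-logger API:

```javascript
// Illustrative stub: mimics batching (autoFlush: false) and flushing.
// NOT the real splunk-bunyan-logger implementation.
function makeStubLogger() {
    var buffer = [];
    return {
        // Mimics logger.info(data, ...): records are buffered, not sent
        info: function() {
            buffer.push(Array.prototype.slice.call(arguments));
        },
        streams: [{
            // In the real logger, flush() sends the whole batch to HEC
            // in a single HTTP request, then invokes the callback
            flush: function(callback) {
                var sent = buffer.splice(0, buffer.length);
                callback(null, sent);
            }
        }]
    };
}

var logger = makeStubLogger();

// Same shape as the handler in index.js
var handler = function(event, context) {
    var data = { awsRequestId: context.awsRequestId };
    logger.info(data, "value1 =", event.key1);
    logger.info(data, "value2 =", event.key2);
    logger.info(data, "value3 =", event.key3);
    logger.streams[0].flush(function(err, sent) {
        context.succeed(sent.length); // all three events went out together
    });
};

// Invoke locally with a mock event and context
handler(
    { key1: "value1", key2: "value2", key3: "value3" },
    {
        awsRequestId: "00000000-test-request",
        succeed: function(n) { console.log("flushed " + n + " events"); }
    }
);
// prints: flushed 3 events
```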

loggerFactory.js

var splunkBunyan = require("splunk-bunyan-logger");
var bunyan = require("bunyan");

function create(config) {

    var splunkStream = splunkBunyan.createStream(config);

    splunkStream.on("error", function(err, context) {
        // Handle errors here
        console.log("Error", err, "Context", context);
    });

    // Setup Bunyan, adding splunkStream to the array of streams
    var logger = bunyan.createLogger({
        name: "my logger",
        streams: [
            splunkStream
        ]
    });
    return logger;
}

module.exports = create;

The create function is where we create a logger and a data stream to HEC. Event data is sent over this stream using the information in the config variable that was declared in index.js. Recall that this information includes the hostname for the Splunk Enterprise server or Splunk Cloud, plus the HEC token and other settings. The create function returns a logger.

package.json

{
  "name": "bunyanLambda",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "dependencies": {
    "bunyan": "*",
    "splunk-bunyan-logger": "0.8.0"
  }
}

The package.json file contains configuration and dependency information for the project in JSON format. To find out the meaning of each key, and whether it is optional, see Specifics of npm's package.json handling on the npmjs website.

Create the deployment package

The deployment package is a ZIP file that includes both your custom JavaScript code and any dependencies that your code needs to run.

The root level of the ZIP file contains any custom JavaScript code you've written, plus a directory called node_modules. Inside the node_modules directory are all the dependencies your code needs. For example, our deployment ZIP looks like this:

index.js
loggerFactory.js
package.json
node_modules/bunyan
node_modules/splunk-bunyan-logger

When you compress your Lambda function into a deployment ZIP, if you've been storing the code and dependencies in a directory, be sure to compress the contents of the directory, and not the directory itself.

For more information, see Creating Deployment Package (Node.js) in the AWS Lambda Developer Guide.

Upload the deployment package

When you've got your deployment ZIP, you're ready to upload it to AWS Lambda.

  • Log into the AWS Lambda Console, and then click Create a Lambda function.
  • On the Select blueprint page, click Skip.
  • On the Configure function page, enter a name for the function. This should match the value of the name key in the package.json file. Enter a short description next to Description, and leave the Runtime pop-up menu set to Node.js.
  • Under Lambda function code, choose Upload a ZIP file, and then click the Upload button. Upload your deployment ZIP.
  • Under Lambda function handler and role, enter the handler in module-name.export format. If you've kept the handler declaration as exports.handler in index.js, leave the value as index.handler. In the Role pop-up menu, choose or create a role (likely lambda_basic_execution).
  • Finally, leave the Advanced settings section untouched, and then click Next.
  • On the Review page, review your settings. If you have to make any changes, click the Edit button. When you're done, click Create function.

Test the Lambda function

To test your new Lambda function, click the Test button.

  • You'll need to use sample event data for this function. Choose the "Hello World" sample event template from the Sample event template pop-up menu. You can change any of the values you want to in the edit window. Then click Submit. If the execution succeeded, you'll see a screen like the following:
    Screen shot showing successful test event transmission.
  • Now, log into Splunk Cloud and search for the data. For example, if you created a new token with default settings just for this walkthrough, search for source = <token_name>, where <token_name> is the name you gave the token when you created it. You'll see something like the following:
    Screen shot of Splunk Enterprise search app showing search results matching the events that were transmitted by AWS Lambda.
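For reference, the "Hello World" sample event template supplies the three keys that the handler reads (key1, key2, and key3):

```json
{
  "key1": "value1",
  "key2": "value2",
  "key3": "value3"
}
```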

Each event has been logged.