Tagged: logstash

Logstash: Failed to flush outgoing items UndefinedConversionError workaround

If you have ever seen an error similar to the one below in Logstash, it can be frustrating and can take your whole pipeline down (it blocks). There are some outstanding tickets on this, one of which is here. The error can occur when an upstream input declares its charset as US-ASCII (such as a file input reading an ASCII file), that file contains some extended characters, and the events are then sent to an output where they need to be converted (in my case it was elasticsearch_http). This was with Logstash 1.4.1.

:message=>"Failed to flush outgoing items", :outgoing_count=>1,
:exception=>#<Encoding::UndefinedConversionError: "\xC2" from ASCII-8BIT to UTF-8>
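To see what is going on outside of Logstash, the failure is reproducible in a few lines of plain Ruby: a byte like 0xC2 tagged as ASCII-8BIT has no defined mapping to UTF-8, so a plain encode raises the same exception.

```ruby
# A byte >= 0x80 tagged as ASCII-8BIT (binary) has no defined
# mapping to UTF-8, so encode raises the same error Logstash
# hits when flushing to the output.
bytes = "\xC2\xA0".force_encoding('ASCII-8BIT')

begin
  bytes.encode('UTF-8')
rescue Encoding::UndefinedConversionError => e
  puts e.message  # "\xC2" from ASCII-8BIT to UTF-8
end
```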

For anyone else out there, here is a simple workaround I put in my filters, which pre-processes messages from “known” upstream ASCII inputs using the ruby code filter. This fixed it for me:

ruby {
    # scrub the raw bytes into valid UTF-8, replacing anything
    # unconvertible with '?'; any error is swallowed so the
    # pipeline keeps flowing
    code => "begin; if !event['message'].nil?; event['message'] = event['message'].force_encoding('ASCII-8BIT').encode('UTF-8', :invalid => :replace, :undef => :replace, :replace => '?'); end; rescue; end;"
}

To create a little log file to reproduce this, try:

perl -e 'print "\xc2\xa0\n"' > test.log
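The same scrub the filter performs can be run standalone in Ruby; unconvertible bytes are simply replaced with '?' rather than raising (the string content below is just for illustration):

```ruby
# Same conversion the filter's ruby code performs: invalid or
# unmappable bytes are replaced with '?' instead of raising.
raw = "caf\xC2\xA0latte".force_encoding('ASCII-8BIT')

clean = raw.encode('UTF-8',
                   :invalid => :replace,
                   :undef   => :replace,
                   :replace => '?')

puts clean                  # "caf??latte"
puts clean.valid_encoding?  # true
```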

Encrypting Logstash data

Note, the patch described below is now merged into the official logstash-filter-cipher plugin as of January 2016, version 2.0.3

UPDATE: Note the pending patch to fix various issues and add random IV support for encrypting logstash event messages is located here: https://github.com/logstash-plugins/logstash-filter-cipher/pull/3

Logstash can consume a myriad of data from many different sources and likewise send it to all sorts of different outputs across your network. However, the question that sometimes comes up is: “how sensitive is this data?”

The answer to that question will vary as wildly as your inputs; however, if the data is sensitive in nature, you need to secure it at rest and in transit. There are many ways to do this, and they depend on the combination of your inputs/outputs, which protocols they support, and whether or not those protocols use TLS or some other form of crypto. Your concerns might also be mitigated if you are transferring this data over a secured/trusted network, a VPN, or a destination-specific ssh tunnel.

However if none of the above apply, or you simply don’t have control over the parts of the infrastructure in between your Logstash agent and your final destination, there is one cool little filter plugin available to you in the Logstash Contrib Plugins project called the cipher filter.

If the only thing you have control over is the ability to touch your config file on both ends, you can use this, without having to fiddle with other infrastructure, to provide a modest level of security for your data. This assumes a topology where Logstash agents send data to some queueing system, which is consumed by another Logstash system downstream that subsequently filters the data and sends it to its final resting place. The cipher filter comes into play here: your “event” field(s) are encrypted at the source agent, flow through the infrastructure, and are decrypted by your Logstash processor on the other end.

The cipher filter has been around for some time, but recently I had a use case similar to the above, so I attempted to use it and ran into some problems. In the absence of any answers, I ended up fixing the issues and adding some new features in this pull request. So if you are reading this and that pull request is not yet approved, you might need to use my fork instead.

IMPORTANT: The security of your “key” is only as secure as the readability of the logstash config file that contains it! If this config file is world readable, your encryption is worthless. When using this method, be sure to analyze how, and by whom, this configuration file could be accessed.

Here is a simple Logstash conf example of using the cipher filter for encryption

input {

    file {
        # to test, just start writing
        # line after line to the log;
        # each line will become the 'message'
        # in your logstash event
        path => "/path/to/test.log"
        sincedb_path => "./.sincedb"
        start_position => "end"
    }
}

filter {

    cipher {
        algorithm => "aes-256-cbc"
        cipher_padding => 1

        # Use a static "iv"
        #iv => "1234567890123456"

        # OR use a random IV per encryption
        iv_random_length => 16

        key => "12345678901234567890123456789012"
        key_size => 32

        mode => "encrypt"
        source => "message"
        target => "message_crypted"
        base64 => true

        # the maximum number of times the
        # internal cipher object instance
        # should be re-used before creating
        # a new one, defaults to 1
        #
        # On high volume systems bump this up
        max_cipher_reuse => 1
    }

    cipher {
        algorithm => "aes-256-cbc"
        cipher_padding => 1

        # Use a static "iv"
        #iv => "1234567890123456"

        # OR use a random IV per encryption
        iv_random_length => 16

        key => "12345678901234567890123456789012"
        key_size => 32

        mode => "decrypt"
        source => "message_crypted"
        target => "message_decrypted"
        base64 => true

        # the maximum number of times the
        # internal cipher object instance
        # should be re-used before creating
        # a new one, defaults to 1
        #
        # On high volume systems bump this up
        max_cipher_reuse => 1
    }

}

output {

    # Once output to the console you should see three fields in each event
    # The original "message"
    # The "message_crypted"
    # The "message_decrypted" verifying that the decryption worked.
    # In a real setup, you would only send along an encrypted version to an OUTPUT
    stdout {
        codec => json
    }

}
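For reference, the encrypt/decrypt round trip in the config above can be sketched in plain Ruby with OpenSSL. This is a sketch of the random-IV scheme, assuming (as in the patched plugin) that the fresh IV is prepended to the ciphertext before base64 encoding so the decrypting side can recover it; the key matches the example config and the helper names are my own:

```ruby
require 'openssl'
require 'base64'

# Illustrative 32-byte key, matching the example config above.
KEY = '12345678901234567890123456789012'

# Encrypt: generate a fresh random 16-byte IV per event and
# prepend it to the ciphertext before base64 encoding.
def encrypt_event(key, plaintext)
  cipher = OpenSSL::Cipher.new('aes-256-cbc')
  cipher.encrypt
  cipher.key = key
  iv = cipher.random_iv
  Base64.strict_encode64(iv + cipher.update(plaintext) + cipher.final)
end

# Decrypt: peel the 16-byte IV off the front, then decrypt the rest.
def decrypt_event(key, encoded)
  data = Base64.strict_decode64(encoded)
  iv, ciphertext = data[0, 16], data[16..-1]
  cipher = OpenSSL::Cipher.new('aes-256-cbc')
  cipher.decrypt
  cipher.key = key
  cipher.iv = iv
  cipher.update(ciphertext) + cipher.final
end

token = encrypt_event(KEY, 'hello logstash')
puts decrypt_event(KEY, token)  # "hello logstash"
```

Because the IV is random per event, encrypting the same plaintext twice yields different ciphertexts, which is exactly why the random-IV mode is preferable to a static "iv".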

 

Logstash for ModSecurity audit logs

Recently I had a need to take tons of raw ModSecurity audit logs and make use of them. I ended up using Logstash as a first attempt to get them from their raw format into something that could be stored somewhere more useful, like a database or search engine. Nicely enough, out of the box, Logstash has an embeddable ElasticSearch instance that Kibana can hook up to. After configuring your Logstash inputs, filters and outputs, you can be querying your log data in no time… that is, assuming writing the filters for your log data takes you close to “no time”, which is not the case with modsec’s more challenging log format.

After searching around for some ModSecurity/Logstash examples, and finding only this one (for modsec entries in the apache error log), I was facing the task of having to write my own to deal with the modsecurity audit log format… arrggg!

So after a fair amount of work, I ended up having a Logstash configuration that works for me… hopefully this will be of use to others out there as well. Note, this is certainly not perfect but is intended to serve as an example and starting point for anyone who is looking for that.

The Modsecurity Logstash configuration file (and tiny pattern file) is located here on Github: https://github.com/bitsofinfo/logstash-modsecurity

  1. Get some audit logs generated from modsecurity and throw them into a directory
  2. Edit the logstash modsecurity config file (https://github.com/bitsofinfo/logstash-modsecurity) and customize its file input path to point to your logs from step (1)
  3. Customize the output(s) and review the various filters
  4. On the command line:  java -jar logstash-[version]-flatjar.jar agent -v -f  logstash_modsecurity.conf

This was tested against Logstash v1.2.1 through 1.4.2 and relies heavily on Logstash’s “ruby” filter, which was a lifesaver for working around some bugs and missing capabilities in Logstash’s growing set of filters. I’m sure as Logstash grows, much of what the custom ruby filters do can be replaced over time.

The end result is that with this configuration, your raw Modsec audit log entries will end up looking something like the JSON example below. Again, this is just how I ended up structuring the fields via the filters; you can take the above configuration example and change the output to your needs.

Also note that ModSecurity audit logs can definitely contain some very sensitive data (like user passwords, etc.). So you might want to also take a look at using Logstash’s cipher filter to encrypt certain message fields in transit if you are sending these processed logs somewhere else: https://bitsofinfo.wordpress.com/2014/06/25/encrypting-logstash-data/

EXAMPLE JSON OUTPUT, using this Logstash configuration


{
  "@timestamp": "2013-09-17T09:46:16.088Z",
  "@version": "1",
  "host": "razzle2",
  "path": "\/Users\/bof\/who2\/zip4n\/logstash\/modseclogs\/proxy9\/modsec_audit.log.1",
  "tags": [
    "multiline"
  ],
  "rawSectionA": "[17\/Sep\/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 192.168.0.9 65183 192.168.0.136 80",
  "rawSectionB": "POST \/xml\/rpc\/soapservice-v2 HTTP\/1.1\nContent-Type: application\/xml\nspecialcookie: tb034=\nCache-Control: no-cache\nPragma: no-cache\nUser-Agent: Java\/1.5.0_15\nHost: xmlserver.intstage442.org\nAccept: text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2\nConnection: keep-alive\nContent-Length: 93\nIncoming-Protocol: HTTPS\nab0044: 0\nX-Forwarded-For: 192.168.1.232",
  "rawSectionC": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "rawSectionF": "HTTP\/1.1 200 OK\nX-SESSTID: 009nUn4493\nContent-Type: application\/xml;charset=UTF-8\nContent-Length: 76\nConnection: close",
  "rawSectionH": "Message: Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required. [file \"\/opt\/niner\/modsec2\/pp7.conf\"] [line \"69\"] [id \"960010\"] [msg \"Request content type is not allowed by policy\"] [severity \"WARNING\"] [tag \"POLICY\/ENCODING_NOT_ALLOWED\"]\nApache-Handler: party-server-time2\nStopwatch: 1379411176088695 48158 (1771* 3714 -)\nProducer: ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.\nServer: Whoisthat\/v1 (Osprey)",
  "modsec_timestamp": "17\/Sep\/2013:05:46:16 --0400",
  "uniqueId": "MSZkdwoB9ogAAHlNTXUAAAAD",
  "sourceIp": "192.168.0.9",
  "sourcePort": "65183",
  "destIp": "192.168.0.136",
  "destPort": "80",
  "httpMethod": "POST",
  "requestedUri": "\/xml\/rpc\/soapservice-v2",
  "incomingProtocol": "HTTP\/1.1",
  "requestBody": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "serverProtocol": "HTTP\/1.1",
  "responseStatus": "200 OK",
  "requestHeaders": {
    "Content-Type": "application\/xml",
    "specialcookie": "8jj220021kl==j2899IuU",
    "Cache-Control": "no-cache",
    "Pragma": "no-cache",
    "User-Agent": "Java\/1.5.1_15",
    "Host": "xmlserver.intstage442.org",
    "Accept": "text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2",
    "Connection": "keep-alive",
    "Content-Length": "93",
    "Incoming-Protocol": "HTTPS",
    "ab0044": "0",
    "X-Forwarded-For": "192.168.1.232"
  },
  "responseHeaders": {
    "X-SESSTID": "009nUn4493",
    "Content-Type": "application\/xml;charset=UTF-8",
    "Content-Length": "76",
    "Connection": "close"
  },
  "auditLogTrailer": {
    "Apache-Handler": "party-server-time2",
    "Stopwatch": "1379411176088695 48158 (1771* 3714 -)",
    "Producer": "ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.",
    "Server": "Whoisthat\/v1 (Osprey)",
    "messages": [
      {
        "info": "Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required.",
        "file": "\/opt\/niner\/modsec2\/pp7.conf",
        "line": "69",
        "id": "960010",
        "msg": "Request content type is not allowed by policy",
        "severity": "WARNING",
        "tag": "POLICY\/ENCODING_NOT_ALLOWED"
      }
    ]
  },
  "event_date_microseconds": 1.3794111760887e+15,
  "event_date_milliseconds": 1379411176088.7,
  "event_date_seconds": 1379411176.0887,
  "event_timestamp": "2013-09-17T09:46:16.088Z",
  "XForwardedFor-GEOIP": {
    "ip": "192.168.1.122",
    "country_code2": "XZ",
    "country_code3": "BRZ",
    "country_name": "Brazil",
    "continent_code": "SA",
    "region_name": "12",
    "city_name": "Vesper",
    "postal_code": "",
    "timezone": "Brazil\/Continental",
    "real_region_name": "Region Metropolitana"
  },
  "matchedRules": [
    "SecRule \"REQUEST_METHOD\" \"@rx ^POST$\" \"phase:2,status:400,t:lowercase,t:replaceNulls,t:compressWhitespace,chain,t:none,deny,log,auditlog,msg:'POST request must have a Content-Length header',id:960022,tag:PROTOCOL_VIOLATION\/EVASION,severity:4\"",
    "SecRule \"REQUEST_FILENAME|ARGS|ARGS_NAMES|REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:Referer\" \"@pm jscript onsubmit onchange onkeyup activexobject vbscript: <![cdata[ http: settimeout onabort shell: .innerhtml onmousedown onkeypress asfunction: onclick .fromcharcode background-image: .cookie onunload createtextrange onload <input\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:compressWhiteSpace,t:lowercase,nolog,skip:1\"",
    "SecAction \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,nolog,skipAfter:950003\"",
    "SecRule \"REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:'\/^(Cookie|Referer|X-OS-Prefs)$\/'|REQUEST_COOKIES|REQUEST_COOKIES_NAMES\" \"@pm gcc g++\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:lowercase,nolog,skip:1\""
  ],
  "secRuleIds": [
    "960022",
    "960050"
  ]
}