Logstash for ModSecurity audit logs

I recently had a need to take tons of raw ModSecurity audit logs and make use of them. I ended up using Logstash as a first stab at getting them from their raw format into something that could be stored somewhere more useful, like a database or search engine. Nicely enough, out of the box Logstash has an embeddable ElasticSearch instance that Kibana can hook up to. After configuring your Logstash inputs, filters and outputs, you can be querying your log data in no time… that is, assuming writing the filters for your log data takes you close to “no time,” which is not the case with ModSecurity’s more challenging audit log format.

After searching around for some ModSecurity/Logstash examples, and finding only this one (for ModSecurity entries in the Apache error log), I was facing the task of having to write my own to deal with the ModSecurity audit log format… arrggg!

So after a fair amount of work, I ended up with a Logstash configuration that works for me… hopefully it will be of use to others out there as well. Note that this is certainly not perfect, but it is intended to serve as an example and starting point for anyone who is looking for one.

The ModSecurity Logstash configuration file (and a tiny pattern file) is located here on GitHub: https://github.com/bitsofinfo/logstash-modsecurity

  1. Get some audit logs generated by ModSecurity and put them in a directory
  2. Edit the Logstash ModSecurity config file (https://github.com/bitsofinfo/logstash-modsecurity) and customize its file input path to point to your logs from step (1)
  3. Customize the output(s) and review the various filters
  4. On the command line:  java -jar logstash-[version]-flatjar.jar agent -v -f logstash_modsecurity.conf
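For orientation, the input/output skeleton you customize in steps (2) and (3) looks roughly like the sketch below. This is not the repo’s exact file (the path and outputs here are placeholders you must change); all the interesting section-parsing filters live in the config on GitHub.

```
input {
  file {
    # point this at the audit logs you collected in step (1)
    path => "/path/to/modseclogs/modsec_audit.log*"
    start_position => "beginning"
  }
}

# ... filters from logstash_modsecurity.conf go here ...

output {
  # swap in whatever destination you need
  elasticsearch { embedded => true }
  stdout { codec => rubydebug }
}
```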

This was tested against Logstash v1.2.1 through v1.4.2 and relies heavily on Logstash’s “ruby” filter, which was a real lifesaver for working around some bugs and missing capabilities in Logstash’s growing set of filters. I’m sure that as Logstash grows, much of what the custom Ruby filters do can be migrated to stock filters over time.
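To give a flavor of what those inline Ruby filters are doing: the audit log is a serial format where each entry is carved into lettered sections by boundary markers like `--4cf8bd3b-A--`. The snippet below is a rough standalone illustration of that section-splitting idea, not the repo’s actual filter code; the boundary id and the sample entry are made up for the example.

```ruby
# Illustrative sketch: split a ModSecurity serial audit-log entry into
# its lettered sections (A = header, B = request headers, etc.).
# Boundary lines look like: --4cf8bd3b-A--
raw = <<~'LOG'
  --4cf8bd3b-A--
  [17/Sep/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 65183 80
  --4cf8bd3b-B--
  POST /xml/rpc/soapservice-v2 HTTP/1.1
  Host: xmlserver.intstage442.org
  --4cf8bd3b-Z--
LOG

sections = {}
current = nil
raw.each_line do |line|
  if line =~ /^--[a-f0-9]+-([A-Z])--/   # a section boundary marker
    current = $1
    sections["rawSection#{current}"] = ""
  elsif current
    sections["rawSection#{current}"] << line
  end
end
sections.each_key { |k| sections[k] = sections[k].strip }

puts sections["rawSectionA"]
# => [17/Sep/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 65183 80
```

Inside a real Logstash `ruby` filter the same logic would read the raw entry from the event and write each `rawSectionX` back onto it as a field.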

The end result is that, with this configuration, your raw ModSecurity audit log entries will end up looking something like the JSON example below. Again, this is just how I ended up structuring the fields via the filters; you can take the configuration example above and change the output to suit your needs.

Also note that ModSecurity audit logs can definitely contain some very sensitive data (like user passwords, etc.). So you might also want to take a look at using Logstash’s cipher filter to encrypt certain message fields in transit if you are sending these processed logs somewhere else: https://bitsofinfo.wordpress.com/2014/06/25/encrypting-logstash-data/

EXAMPLE JSON OUTPUT, using this Logstash configuration

  {
    "@timestamp": "2013-09-17T09:46:16.088Z",
    "@version": "1",
    "host": "razzle2",
    "path": "\/Users\/bof\/who2\/zip4n\/logstash\/modseclogs\/proxy9\/modsec_audit.log.1",
    "tags": [],
    "rawSectionA": "[17\/Sep\/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 65183 80",
    "rawSectionB": "POST \/xml\/rpc\/soapservice-v2 HTTP\/1.1\nContent-Type: application\/xml\nspecialcookie: tb034=\nCache-Control: no-cache\nPragma: no-cache\nUser-Agent: Java\/1.5.0_15\nHost: xmlserver.intstage442.org\nAccept: text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2\nConnection: keep-alive\nContent-Length: 93\nIncoming-Protocol: HTTPS\nab0044: 0\nX-Forwarded-For:",
    "rawSectionC": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
    "rawSectionF": "HTTP\/1.1 200 OK\nX-SESSTID: 009nUn4493\nContent-Type: application\/xml;charset=UTF-8\nContent-Length: 76\nConnection: close",
    "rawSectionH": "Message: Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required. [file \"\/opt\/niner\/modsec2\/pp7.conf\"] [line \"69\"] [id \"960010\"] [msg \"Request content type is not allowed by policy\"] [severity \"WARNING\"] [tag \"POLICY\/ENCODING_NOT_ALLOWED\"]\nApache-Handler: party-server-time2\nStopwatch: 1379411176088695 48158 (1771* 3714 -)\nProducer: ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.\nServer: Whoisthat\/v1 (Osprey)",
    "modsec_timestamp": "17\/Sep\/2013:05:46:16 --0400",
    "uniqueId": "MSZkdwoB9ogAAHlNTXUAAAAD",
    "sourceIp": "",
    "sourcePort": "65183",
    "destIp": "",
    "destPort": "80",
    "httpMethod": "POST",
    "requestedUri": "\/xml\/rpc\/soapservice-v2",
    "incomingProtocol": "HTTP\/1.1",
    "requestBody": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
    "serverProtocol": "HTTP\/1.1",
    "responseStatus": "200 OK",
    "requestHeaders": {
      "Content-Type": "application\/xml",
      "specialcookie": "8jj220021kl==j2899IuU",
      "Cache-Control": "no-cache",
      "Pragma": "no-cache",
      "User-Agent": "Java\/1.5.1_15",
      "Host": "xmlserver.intstage442.org",
      "Accept": "text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2",
      "Connection": "keep-alive",
      "Content-Length": "93",
      "Incoming-Protocol": "HTTPS",
      "ab0044": "0",
      "X-Forwarded-For": ""
    },
    "responseHeaders": {
      "X-SESSTID": "009nUn4493",
      "Content-Type": "application\/xml;charset=UTF-8",
      "Content-Length": "76",
      "Connection": "close"
    },
    "auditLogTrailer": {
      "Apache-Handler": "party-server-time2",
      "Stopwatch": "1379411176088695 48158 (1771* 3714 -)",
      "Producer": "ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.",
      "Server": "Whoisthat\/v1 (Osprey)",
      "messages": [
        {
          "info": "Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required.",
          "file": "\/opt\/niner\/modsec2\/pp7.conf",
          "line": "69",
          "id": "960010",
          "msg": "Request content type is not allowed by policy",
          "severity": "WARNING",
          "tag": "POLICY\/ENCODING_NOT_ALLOWED"
        }
      ]
    },
    "event_date_microseconds": 1.3794111760887e+15,
    "event_date_milliseconds": 1379411176088.7,
    "event_date_seconds": 1379411176.0887,
    "event_timestamp": "2013-09-17T09:46:16.088Z",
    "XForwardedFor-GEOIP": {
      "ip": "",
      "country_code2": "XZ",
      "country_code3": "BRZ",
      "country_name": "Brazil",
      "continent_code": "SA",
      "region_name": "12",
      "city_name": "Vesper",
      "postal_code": "",
      "timezone": "Brazil\/Continental",
      "real_region_name": "Region Metropolitana"
    },
    "matchedRules": [
      "SecRule \"REQUEST_METHOD\" \"@rx ^POST$\" \"phase:2,status:400,t:lowercase,t:replaceNulls,t:compressWhitespace,chain,t:none,deny,log,auditlog,msg:'POST request must have a Content-Length header',id:960022,tag:PROTOCOL_VIOLATION\/EVASION,severity:4\"",
      "SecRule \"REQUEST_FILENAME|ARGS|ARGS_NAMES|REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:Referer\" \"@pm jscript onsubmit onchange onkeyup activexobject vbscript: <![cdata[ http: settimeout onabort shell: .innerhtml onmousedown onkeypress asfunction: onclick .fromcharcode background-image: .cookie onunload createtextrange onload <input\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:compressWhiteSpace,t:lowercase,nolog,skip:1\"",
      "SecAction \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,nolog,skipAfter:950003\"",
      "SecRule \"REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:'\/^(Cookie|Referer|X-OS-Prefs)$\/'|REQUEST_COOKIES|REQUEST_COOKIES_NAMES\" \"@pm gcc g++\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:lowercase,nolog,skip:1\""
    ],
    "secRuleIds": [
      "960022",
      "960050"
    ]
  }

22 thoughts on “Logstash for ModSecurity audit logs”

  1. Great work.
    Could you post a mod_security configuration?
    I tried your config, but in Logstash it just stops at “filter received”, without any error

      1. 1.2.1
        I made it work in the end. It was because of my architecture; I had to remove the tag field.

        Does your Logstash work with a lot of audit logs? Like several thousand?
        For me it works in debug mode, but in the background, when run through the service wrapper, it stops parsing the files.

  2. Yeah, it processes tens of thousands of entries. However, I have seen Logstash crash after running for a few days of consuming logs; I’ve yet to determine why. Have you tried adjusting the JVM heap settings when you start up Logstash? Try -Xms512m -Xmx512m, where “512” means a 512 megabyte heap. You can adjust that as you see fit.

  3. No luck 😦 I can get Logstash started, but then I get 100 lines of warnings about deprecated methods and then it just sits there. The path to my audit logs is OK and there are files there. I’m not even getting any messages about whether it is parsing anything or just hanging.

    1. What version of logstash are you using? What is the command you are using to start it up? What is the output/warnings/errors that logstash reports? What does your config file look like exactly?

  4. Logstash version is 1.3.2
    Nginx-version is 1.4.1
    ModSecurity is 2.7.1


    java -jar logstash-1.3.2-flatjar.jar agent -f /usr/local/nginx/conf/logstash-modsecurity.conf


    EDITED: original comment was the config file, waaaaayyy tooo biiigg!

    I’ll send any following files by mail, so I don’t spam the blog

    1. Add “-v” to the startup command and the output will be way more verbose as to what is going on. I.E. “java -jar logstash-1.3.2-flatjar.jar agent -v -f /usr/local/nginx/conf/logstash-modsecurity.conf”

      1. Hmm, looks like it’s just not picking up your log file at all, otherwise you would see more output. Are you sure the path/pattern to your logs is correct? Permissions OK? Maybe try renaming one of those log files to something simple like name.log and explicitly point only to that file to rule things out first.

  5. I got an error saying it’s deprecated, and Logstash crashed. I’m using Logstash 1.4.2. I copied your logstash-modsecurity.conf file to the /bin directory of my Logstash and called it with “/bin/logstash -f logstash-modsecurity.conf”. Am I missing something here? How can we fix this?

  6. 3 years on and this still works a treat. I just had to make a couple of very minor tweaks to get this working with my modsec logs. Excellent work. Saved me loads of time.

  7. Thanks for the doc. I think it is working with LS 5, but I am getting the following logs in the terminal after starting LS.

    root@rc:/usr/share/logstash/bin# ./logstash -f logstash.conf
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
    Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
    10:46:38.825 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[]}}
    10:46:38.837 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>, :path=>"/"}
    10:46:39.346 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#}
    10:46:39.349 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
    10:46:40.246 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    10:46:40.748 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#]}
    10:46:41.259 [[main]-pipeline-manager] INFO logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.2.1-java/vendor/GeoLite2-City.mmdb"}
    10:46:41.375 [[main]-pipeline-manager] INFO logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.2.1-java/vendor/GeoLite2-City.mmdb"}
    10:46:41.377 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
    10:46:42.671 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
    The stdin plugin is now waiting for input:
    10:46:43.193 [[main]>worker1] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for []:Array
    10:46:43.192 [[main]>worker0] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for []:Array
    10:46:43.255 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
    10:46:43.277 [[main]>worker1] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for [{}, {}, {}]:Array
    10:46:43.385 [[main]>worker0] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for [{}, {}, {}]:Array
    10:46:43.418 [[main]>worker1] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for [{}]:Array

    Whenever there is a new entry in my ModSecurity audit log file, I get this error:

    [[main]>worker1] ERROR logstash.filters.ruby - Ruby exception occurred: undefined method `to_hash' for [{}]:Array

    It seems to be some error related to Ruby, but I am unable to find a solution.

    I think it has something to do with the LS conf file, which I concatenated following the instructions from the GitHub repo readme.

    Please help me.
