Logstash for ModSecurity audit logs

Recently I had a need to take tons of raw ModSecurity audit logs and make use of them. I ended up using Logstash as a first stab at getting them from their raw format into something that could be stored somewhere more useful, like a database or search engine. Nicely enough, out of the box Logstash has an embeddable ElasticSearch instance that Kibana can hook up to. After configuring your Logstash inputs, filters and outputs, you can be querying your log data in no time… that is, assuming writing the filters for your log data takes you close to “no time”, which is not the case with ModSecurity’s more challenging log format.

After searching around for some ModSecurity/Logstash examples, and finding only this one (for ModSecurity entries in the Apache error log), I was facing the task of having to write my own to deal with the ModSecurity audit log format… arrggg!

So after a fair amount of work, I ended up with a Logstash configuration that works for me… hopefully this will be of use to others out there as well. Note, this is certainly not perfect, but it is intended to serve as an example and starting point for anyone who is looking for that.

The ModSecurity Logstash configuration file (and tiny pattern file) is located here on GitHub: https://github.com/bitsofinfo/logstash-modsecurity

  1. Get some audit logs generated from modsecurity and throw them into a directory
  2. Edit the logstash modsecurity config file (https://github.com/bitsofinfo/logstash-modsecurity) and customize its file input path to point to your logs from step (1)
  3. Customize the output(s) and review the various filters
  4. On the command line:  java -jar logstash-[version]-flatjar.jar agent -v -f  logstash_modsecurity.conf
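For step (2), the file input portion of the configuration might look something like the sketch below. The path is a placeholder for wherever you put your logs in step (1), and the config in the GitHub repo is the authoritative version (it also sets up the multiline handling that groups an audit entry together):

```
input {
  file {
    # placeholder path: point this at your audit logs from step (1)
    path => "/path/to/your/modsec_audit.log*"
    start_position => "beginning"
    type => "mod_security"
  }
}
```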

This was tested against Logstash v1.2.1 through 1.4.2 and relies heavily on Logstash’s “ruby” filter capability, which really was a lifesaver for working around some bugs and missing capabilities in Logstash’s growing set of filters. I’m sure that as Logstash grows, much of what the custom ruby filters do can be replaced over time.
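To illustrate the kind of work those ruby filters do, here is a standalone Ruby sketch (my own illustration, not the actual filter code from the config) of the basic section-splitting idea: a ModSecurity audit entry is a multiline blob whose parts are delimited by `--<id>-<LETTER>--` boundary lines, ModSecurity’s native serial audit log format.

```ruby
# Split one multiline ModSecurity audit entry into its lettered sections
# (A = timestamp/addresses, B = request headers, C = request body, etc.).
def split_audit_sections(raw_entry)
  sections = {}
  current = nil
  raw_entry.each_line do |line|
    if line =~ /^--[a-fA-F0-9@]+-([A-Z])--/
      current = Regexp.last_match(1)  # a boundary line starts a new section
      sections[current] = ""
    elsif current
      sections[current] << line
    end
  end
  sections.transform_values(&:strip)
end

# A trimmed, made-up audit entry in the serial format:
entry = <<~LOG
  --a1b2c3d4-A--
  [17/Sep/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 192.168.0.9 65183 192.168.0.136 80
  --a1b2c3d4-B--
  POST /xml/rpc/soapservice-v2 HTTP/1.1
  Content-Type: application/xml
  --a1b2c3d4-Z--
LOG

sections = split_audit_sections(entry)
# sections["A"] now holds the raw timestamp/uniqueId/address line and
# sections["B"] the request line plus headers.
```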

The end result is that with this configuration, your raw ModSecurity audit log entries will end up looking something like the JSON example below. Again, this is just how I ended up structuring the fields via the filters; you can take the configuration example above and change the output to suit your needs.
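As a rough illustration of how the flat fields get derived from a raw section, here is a hedged Ruby sketch for section A; the field names mirror the example output below, but the real config does this with grok/ruby filters and may differ in detail:

```ruby
# Section A of an audit entry is a single line:
#   [timestamp] uniqueId sourceIp sourcePort destIp destPort
SECTION_A = /\A\[([^\]]+)\]\s+(\S+)\s+(\S+)\s+(\S+)\s+(\S+)\s+(\S+)/

def parse_section_a(raw)
  m = SECTION_A.match(raw)
  return {} unless m
  {
    "modsec_timestamp" => m[1],
    "uniqueId"         => m[2],
    "sourceIp"         => m[3],
    "sourcePort"       => m[4],
    "destIp"           => m[5],
    "destPort"         => m[6],
  }
end

fields = parse_section_a(
  "[17/Sep/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 192.168.0.9 65183 192.168.0.136 80"
)
# fields["uniqueId"] => "MSZkdwoB9ogAAHlNTXUAAAAD"
# fields["sourceIp"] => "192.168.0.9"
```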

Also note that ModSecurity audit logs can definitely contain some very sensitive data (like user passwords etc.). So you might also want to take a look at using Logstash’s cipher filter to encrypt certain message fields in transit if you are sending these processed logs somewhere else: https://bitsofinfo.wordpress.com/2014/06/25/encrypting-logstash-data/
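For illustration only, this Ruby sketch shows the kind of symmetric encryption such a filter applies to a field value. The key derivation and IV handling here are simplified assumptions of my own, not the cipher filter’s actual implementation, so treat it as a sketch of the concept:

```ruby
require "openssl"
require "base64"

# Encrypt a single field value with AES-256-CBC, prepending the random IV
# so the receiving side can decrypt. Key handling is deliberately simplified.
def encrypt_field(plaintext, key)
  cipher = OpenSSL::Cipher.new("aes-256-cbc")
  cipher.encrypt
  cipher.key = key
  iv = cipher.random_iv
  Base64.strict_encode64(iv + cipher.update(plaintext) + cipher.final)
end

def decrypt_field(encoded, key)
  raw = Base64.strict_decode64(encoded)
  decipher = OpenSSL::Cipher.new("aes-256-cbc")
  decipher.decrypt
  decipher.key = key
  decipher.iv = raw[0, 16]          # first 16 bytes are the IV
  decipher.update(raw[16..-1]) + decipher.final
end

# Hypothetical key derivation for the sketch (a real setup would manage keys properly):
key = OpenSSL::Digest::SHA256.digest("example passphrase")
token = encrypt_field("kborg2@special292.org", key)
# decrypt_field(token, key) round-trips back to the original value
```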

EXAMPLE JSON OUTPUT, using this Logstash configuration


{
  "@timestamp": "2013-09-17T09:46:16.088Z",
  "@version": "1",
  "host": "razzle2",
  "path": "\/Users\/bof\/who2\/zip4n\/logstash\/modseclogs\/proxy9\/modsec_audit.log.1",
  "tags": [
    "multiline"
  ],
  "rawSectionA": "[17\/Sep\/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 192.168.0.9 65183 192.168.0.136 80",
  "rawSectionB": "POST \/xml\/rpc\/soapservice-v2 HTTP\/1.1\nContent-Type: application\/xml\nspecialcookie: tb034=\nCache-Control: no-cache\nPragma: no-cache\nUser-Agent: Java\/1.5.0_15\nHost: xmlserver.intstage442.org\nAccept: text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2\nConnection: keep-alive\nContent-Length: 93\nIncoming-Protocol: HTTPS\nab0044: 0\nX-Forwarded-For: 192.168.1.232",
  "rawSectionC": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "rawSectionF": "HTTP\/1.1 200 OK\nX-SESSTID: 009nUn4493\nContent-Type: application\/xml;charset=UTF-8\nContent-Length: 76\nConnection: close",
  "rawSectionH": "Message: Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required. [file \"\/opt\/niner\/modsec2\/pp7.conf\"] [line \"69\"] [id \"960010\"] [msg \"Request content type is not allowed by policy\"] [severity \"WARNING\"] [tag \"POLICY\/ENCODING_NOT_ALLOWED\"]\nApache-Handler: party-server-time2\nStopwatch: 1379411176088695 48158 (1771* 3714 -)\nProducer: ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.\nServer: Whoisthat\/v1 (Osprey)",
  "modsec_timestamp": "17\/Sep\/2013:05:46:16 --0400",
  "uniqueId": "MSZkdwoB9ogAAHlNTXUAAAAD",
  "sourceIp": "192.168.0.9",
  "sourcePort": "65183",
  "destIp": "192.168.0.136",
  "destPort": "80",
  "httpMethod": "POST",
  "requestedUri": "\/xml\/rpc\/soapservice-v2",
  "incomingProtocol": "HTTP\/1.1",
  "requestBody": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "serverProtocol": "HTTP\/1.1",
  "responseStatus": "200 OK",
  "requestHeaders": {
    "Content-Type": "application\/xml",
    "specialcookie": "8jj220021kl==j2899IuU",
    "Cache-Control": "no-cache",
    "Pragma": "no-cache",
    "User-Agent": "Java\/1.5.1_15",
    "Host": "xmlserver.intstage442.org",
    "Accept": "text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2",
    "Connection": "keep-alive",
    "Content-Length": "93",
    "Incoming-Protocol": "HTTPS",
    "ab0044": "0",
    "X-Forwarded-For": "192.168.1.232"
  },
  "responseHeaders": {
    "X-SESSTID": "009nUn4493",
    "Content-Type": "application\/xml;charset=UTF-8",
    "Content-Length": "76",
    "Connection": "close"
  },
  "auditLogTrailer": {
    "Apache-Handler": "party-server-time2",
    "Stopwatch": "1379411176088695 48158 (1771* 3714 -)",
    "Producer": "ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.",
    "Server": "Whoisthat\/v1 (Osprey)",
    "messages": [
      {
        "info": "Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required.",
        "file": "\/opt\/niner\/modsec2\/pp7.conf",
        "line": "69",
        "id": "960010",
        "msg": "Request content type is not allowed by policy",
        "severity": "WARNING",
        "tag": "POLICY\/ENCODING_NOT_ALLOWED"
      }
    ]
  },
  "event_date_microseconds": 1.3794111760887e+15,
  "event_date_milliseconds": 1379411176088.7,
  "event_date_seconds": 1379411176.0887,
  "event_timestamp": "2013-09-17T09:46:16.088Z",
  "XForwardedFor-GEOIP": {
    "ip": "192.168.1.122",
    "country_code2": "XZ",
    "country_code3": "BRZ",
    "country_name": "Brazil",
    "continent_code": "SA",
    "region_name": "12",
    "city_name": "Vesper",
    "postal_code": "",
    "timezone": "Brazil\/Continental",
    "real_region_name": "Region Metropolitana"
  },
  "matchedRules": [
    "SecRule \"REQUEST_METHOD\" \"@rx ^POST$\" \"phase:2,status:400,t:lowercase,t:replaceNulls,t:compressWhitespace,chain,t:none,deny,log,auditlog,msg:'POST request must have a Content-Length header',id:960022,tag:PROTOCOL_VIOLATION\/EVASION,severity:4\"",
    "SecRule \"REQUEST_FILENAME|ARGS|ARGS_NAMES|REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:Referer\" \"@pm jscript onsubmit onchange onkeyup activexobject vbscript: <![cdata[ http: settimeout onabort shell: .innerhtml onmousedown onkeypress asfunction: onclick .fromcharcode background-image: .cookie onunload createtextrange onload <input\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:compressWhiteSpace,t:lowercase,nolog,skip:1\"",
    "SecAction \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,nolog,skipAfter:950003\"",
    "SecRule \"REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:'\/^(Cookie|Referer|X-OS-Prefs)$\/'|REQUEST_COOKIES|REQUEST_COOKIES_NAMES\" \"@pm gcc g++\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:lowercase,nolog,skip:1\""
  ],
  "secRuleIds": [
    "960022",
    "960050"
  ]
}
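As one concrete example of how a derived field like secRuleIds above can be produced, here is a small Ruby sketch (my own illustration, not the exact filter from the config) that scans matched SecRule/SecAction strings for their id actions:

```ruby
# Collect the unique rule ids (the "id:NNNNNN" actions) from a list of
# matched SecRule/SecAction strings.
def extract_rule_ids(matched_rules)
  matched_rules.flat_map { |rule| rule.scan(/\bid:'?(\d+)/).flatten }.uniq
end

# Abbreviated sample rules for illustration:
rules = [
  %q{SecRule "REQUEST_METHOD" "@rx ^POST$" "phase:2,deny,log,msg:'sample',id:960022"},
  %q{SecAction "phase:2,nolog,skipAfter:950003"},
]
extract_rule_ids(rules)  # => ["960022"]
```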

38 thoughts on “Logstash for ModSecurity audit logs”

  1. Great work.
    Could you post a mod_security configuration?
    I tried your config, but in logstash it just stops at “filter received”, without any error

      1. 1.2.1
        I made it in the end. It was because of my architecture, I had to remove the tag field.

        Does your logstash work with a lot of audit logs? Like several thousand?
        For me it works in debug mode, but in the background, when run through the service wrapper, it stops parsing the files.

  2. Yeah, it processes tens of thousands of entries. However, I have seen logstash crash after running for a few days consuming logs; I’ve yet to determine why. Have you tried adjusting the JVM heap settings when you start up logstash? Try -Xms512m -Xmx512m, where “512” means a 512 megabyte heap. You can adjust that as you see fit.

  3. No luck 😦 I can get logstash started, but then I get 100 lines of warnings about deprecated methods and then it just sits there. The path to my audit_logs is OK and there are files there. I’m not even getting any messages about whether it is parsing anything or just hanging.

    1. What version of logstash are you using? What is the command you are using to start it up? What is the output/warnings/errors that logstash reports? What does your config file look like exactly?

  4. Logstash version is 1.3.2
    Nginx-version is 1.4.1
    ModSecurity is 2.7.1

    command:

    java -jar logstash-1.3.2-flatjar.jar agent -f /usr/local/nginx/conf/logstash-modsecurity.conf

    warnings:

    EDITED: original comment was the config file, waaaaayyy tooo biiigg!

    I’ll send any following files by mail, so I don’t spam the blog

    1. Add “-v” to the startup command and the output will be way more verbose as to what is going on. I.E. “java -jar logstash-1.3.2-flatjar.jar agent -v -f /usr/local/nginx/conf/logstash-modsecurity.conf”

      1. Hmm, looks like it’s just not picking up your log file at all, otherwise you would see more output. Are you sure your path/pattern to your logs is correct? Permissions OK? Maybe try renaming one of those log files to something simple like name.log and explicitly point only to that file, to rule things out first.

  5. I got an error saying it’s deprecated, and logstash crashed. I’m using logstash 1.4.2. I copied your logstash-modsecurity.conf file to the /bin directory of my logstash and called it with “/bin/logstash -f logstash-modsecurity.conf”. Am I missing something here? How can we fix this?

  6. 3 years on and this still works a treat. I just had to make a couple of very minor tweaks to get this working with my modsec logs. Excellent work. Saved me loads of time.

  7. Thanks for the doc. I think it is working with LS 5, but I am getting the following logs on the terminal after starting LS.

    root@rc:/usr/share/logstash/bin# ./logstash -f logstash.conf
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
    Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
    10:46:38.825 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch – Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
    10:46:38.837 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch – Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>”/”}
    10:46:39.346 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch – Restored connection to ES instance {:url=>#}
    10:46:39.349 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch – Using mapping template from {:path=>nil}
    10:46:40.246 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch – Attempting to install template {:manage_template=>{“template”=>”logstash-*”, “version”=>50001, “settings”=>{“index.refresh_interval”=>”5s”}, “mappings”=>{“_default_”=>{“_all”=>{“enabled”=>true, “norms”=>false}, “dynamic_templates”=>[{“message_field”=>{“path_match”=>”message”, “match_mapping_type”=>”string”, “mapping”=>{“type”=>”text”, “norms”=>false}}}, {“string_fields”=>{“match”=>”*”, “match_mapping_type”=>”string”, “mapping”=>{“type”=>”text”, “norms”=>false, “fields”=>{“keyword”=>{“type”=>”keyword”, “ignore_above”=>256}}}}}], “properties”=>{“@timestamp”=>{“type”=>”date”, “include_in_all”=>false}, “@version”=>{“type”=>”keyword”, “include_in_all”=>false}, “geoip”=>{“dynamic”=>true, “properties”=>{“ip”=>{“type”=>”ip”}, “location”=>{“type”=>”geo_point”}, “latitude”=>{“type”=>”half_float”}, “longitude”=>{“type”=>”half_float”}}}}}}}}
    10:46:40.748 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch – New Elasticsearch output {:class=>”LogStash::Outputs::ElasticSearch”, :hosts=>[#]}
    10:46:41.259 [[main]-pipeline-manager] INFO logstash.filters.geoip – Using geoip database {:path=>”/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.2.1-java/vendor/GeoLite2-City.mmdb”}
    10:46:41.375 [[main]-pipeline-manager] INFO logstash.filters.geoip – Using geoip database {:path=>”/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.2.1-java/vendor/GeoLite2-City.mmdb”}
    10:46:41.377 [[main]-pipeline-manager] INFO logstash.pipeline – Starting pipeline {“id”=>”main”, “pipeline.workers”=>2, “pipeline.batch.size”=>125, “pipeline.batch.delay”=>5, “pipeline.max_inflight”=>250}
    10:46:42.671 [[main]-pipeline-manager] INFO logstash.pipeline – Pipeline main started
    The stdin plugin is now waiting for input:
    10:46:43.193 [[main]>worker1] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for []:Array
    10:46:43.192 [[main]>worker0] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for []:Array
    10:46:43.255 [Api Webserver] INFO logstash.agent – Successfully started Logstash API endpoint {:port=>9600}
    10:46:43.277 [[main]>worker1] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for [{}, {}, {}]:Array
    10:46:43.385 [[main]>worker0] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for [{}, {}, {}]:Array
    10:46:43.418 [[main]>worker1] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for [{}]:Array

    ———————
    Whenever there is a new entry in my mod-sec audit log file, I get this error

    [[main]>worker1] ERROR logstash.filters.ruby – Ruby exception occurred: undefined method `to_hash’ for [{}]:Array

    It seems to be some error wrt ruby, but I am unable to find a solution.

    I think it has got something to do with the LS conf file which I concatenated following the instructions from the GitHub repo readme.

    Please help me.

  8. I have a Sophos XG firewall, and it’s running ModSecurity for our WAF. I wanted to ship the logs out, but because of service contract issues they don’t allow us to alter the OS or install any utilities. I have been able to netcat the reverseproxy.log file over to logstash, but of course, as you know, it’s hard to work with. I’m trying to get your filters to work with our data. What is the best way to pipe the data to logstash through this grok?
    Basically I’m just doing a tail -f /log/reverseproxy.log | nc {Logstash-IP} {PORT}.
    I get a “failed to find message” error in Kibana. I even tried to send the data to a file on the logstash server and then open it as if it was a local log with input{file…
    No love… any ideas?
    Thanks!

    1. To be clear, you want to be consuming the modsec AUDIT log, not the general modsec request log. Secondly, I’m not sure the best way given the firewall you are using. You might want to ask on the logstash github/forums.

      1. Thanks, but the problem is that I’m not sure how much the vendor modified the ModSecurity suite.
        They restrict access to the OS so much that we can’t really browse around, as root is not available.
        I was able to locate the reverseproxy.log, which is the running log that reports on the OWASP and other WAF events. The fields we are interested in are there, I think; they may have simply renamed the logfile itself.
        Like I had mentioned, I am able to stream it to logstash via TCP to a file, albeit with the logstash Zulu timestamp and sender IP… or I can send it to a file as a syslog file, but the fields don’t always start off in the same place in the log file. I was assuming that the ruby code that breaks the line up into its components can’t find the ‘message’ object, although it is there in the string…

  9. The trick for me is to tail the log and send it over to logstash via IP, but I can’t figure out how to get logstash to ingest it with your parser.
    I can send it through to ES without issue, but all of the valuable searchable info is compacted into the single message field without your tool.

  10. What is the best way to revamp this parser without using the fragmenter ruby?
    My entries are all on one line, timestamp to rule #… I guess there is no real need for me to section it up (A-K)…

    or is there?
    Thanks

  11. The error is “Failed to find message”. Am I right to assume that’s because there is no “message” object in the data record? My log entries start like this, each one on a single line:
    [Thu Aug 29 15:47:18.283829 2019] timestamp=”1567108038″ srcip=”xxx.xxx.xxx.xxx” localip=”xxx.xxx.xxx.xxx” user=”-” host=”xxx.xxx.xxx.xxx” method=”POST” statuscode=”200″ … So they look to be properly formatted to work with the grok you have, but I’m new to grok and not a coder.
    Thanks for any help..

  12. Sorry about the multiple posts… I can’t edit my previous ones to add to them.
    I guess ultimately the ruby that splits the message line into the rawSection components doesn’t understand that the lines I’m feeding it are the message that I need split and then filtered…
    Any help is greatly appreciated… really…

  13. if I use
    input {
      tcp {
        type => "mod_security"
        port => 5554
        codec => line {
          charset => "US-ASCII"
        }
      }
    }
    I get a “Failed to find message”, presumably because the ruby section splitter isn’t working.
    If I turn off the type => “mod_security”, the record flows right into logstash and then to ES,
    but again, I’m missing out on the broken-out object headings… i.e. srcip=, host=, set-cookie=, uagent=…
    This is what this filter is supposed to do, right? lol… just making sure I’m not going nuts over here.

  14. Hah! all of that and all I needed to do was filter my stream through kv{}
    Thanks for the tip to go to the forums,
    If I ever get a device with a native audit log I’ll try this again. But for now. with my data… I’m good!
    Thanks again.

  15. Good afternoon, sorry for my English; I am using a translator.

    First of all I would like to thank you for sharing this, it is very useful to me.

    Now I have a problem filtering on the severity, which appears within “auditLogTrailer.messages”:

    auditLogTrailer.messages {
    “info”: “Warning. Pattern match \”(?:;|\\\\{|\\\\||\\\\|\\\\||&|&&|\\\\n|\\\\r|\\\\$\\\\(|\\\\$\\\\(\\\\(|`|\\\\${|\\\\(|\\\\(\\\\s*\\\\))\\\\s*(?:{|\\\\s*\\\\(\\\\s*|\\\\w+=(?:[^\\\\s]*|\\\\$.*|\\\\$.*|.*|\\\\’.*\\\\’|\\\”.*\\\”)\\\\s+|!\\\\s*|\\\\$)*\\\\s*(?:’|\\\”)*(?:[\\\\?\\\\*\\\\[\\\\]\\\\(\\\\)\\\\-\\\\|+\\\\w’\\\”\\\\./\\\\\\\\]+/)?[\\\\\\\\’\\\”]*(?:s[\\\\\\\\’\\\”]* …\” at ARGS:query.”,
    “id”: “932105”,
    “data”: “Matched Data: &sleep found within ARGS:query: query&sleep 15&”,
    “tag”: “application-multi”,
    “line”: “122”,
    “file”: “/usr/share/modsecurity-crs/rules/REQUEST-932-APPLICATION-ATTACK-RCE.conf”,
    “severity”: “CRITICAL”
    },

    I see the severity field, but since it is inside “auditLogTrailer.messages”, Kibana doesn’t let me search on it.

    Could you help me with this?

    Thank you.

  16. Hey Bits, I’m back. So OK, we have given up on the crippled Sophos implementation of ModSecurity and installed it on our IIS servers. I have the modsecurity_audit.log being saved locally on the server. I will be trying to see if I can figure out how to ship it to logstash with Filebeat, but my question is that the TCP input plugin can’t seem to work with the current multiline input codec. I’m not sure what the best course of action would be to get the logs to stream into logstash… any ideas are welcome, I’m at a dead end.
    thanks

  17. The work done here is quite impressive, congratulations! ModSecurity also records a lot of messages in the error_log. Do you have any references for patterns or filters to parse those messages in the error_log?

  18. Hi, thanks a lot 🙂 Using logstash 6.8.6 I can’t get a working output; I always get a ruby failure with
    [2020-01-30T09:50:14,421][ERROR][logstash.filters.ruby ] Ruby exception occurred: undefined method `/’ for nil:NilClass
    [2020-01-30T09:50:14,423][ERROR][logstash.filters.ruby ] Ruby exception occurred: undefined method `/’ for nil:NilClass
    [2020-01-30T09:50:14,428][ERROR][logstash.filters.ruby ] Ruby exception occurred: can’t convert nil into an exact number

    Does anyone else get this?
