
Logstash for ModSecurity audit logs

Recently I had a need to take tons of raw ModSecurity audit logs and make use of them. I ended up using Logstash as a first stab at getting them out of their raw format and into something more useful, like a database or search engine. Nicely enough, out of the box Logstash ships with an embeddable ElasticSearch instance that Kibana can hook up to. After configuring your Logstash inputs, filters and outputs, you can be querying your log data in no time…. that is, assuming writing the filters for your log data takes you close to “no time”, which is not the case with modsec’s more challenging log format.

After searching around for ModSecurity/Logstash examples and finding only a single one (for modsec entries in the Apache error log), I was facing the task of having to write my own to deal with the ModSecurity audit log format…..arrggg!

So after a fair amount of work, I ended up having a Logstash configuration that works for me… hopefully this will be of use to others out there as well. Note, this is certainly not perfect but is intended to serve as an example and starting point for anyone who is looking for that.

The Modsecurity Logstash configuration file (and tiny pattern file) is located here on Github: https://github.com/bitsofinfo/logstash-modsecurity

  1. Get some audit logs generated from modsecurity and throw them into a directory
  2. Edit the logstash modsecurity config file (https://github.com/bitsofinfo/logstash-modsecurity) and customize its file input path to point to your logs from step (1)
  3. Customize the output(s) and review the various filters
  4. On the command line:  java -jar logstash-[version]-flatjar.jar agent -v -f  logstash_modsecurity.conf
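For orientation, the overall shape of such a configuration looks roughly like the skeleton below. This is a hedged sketch, not the actual file from the repo: the input path is a hypothetical placeholder, and the filter section is where all the real work (and the bulk of the repo's config) lives.

```
input {
  file {
    # hypothetical path - point this at your audit logs from step (1)
    path => "/path/to/your/modseclogs/modsec_audit.log*"
  }
}

filter {
  # multiline/grok/ruby filters that carve the raw audit log entries
  # into sections and fields (see logstash_modsecurity.conf in the repo)
}

output {
  # the embedded ElasticSearch instance that Kibana can point at
  elasticsearch { embedded => true }
  stdout { codec => rubydebug }
}
```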

This was tested against Logstash v1.2.1 through 1.4.2 and relies heavily on Logstash’s “ruby” filter capability, which really was a lifesaver for working around some bugs and missing capabilities in Logstash’s growing set of filters. I’m sure that as Logstash grows, much of what the custom ruby filters do can be migrated to stock filters over time.
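To illustrate the kind of escape hatch the ruby filter provides, here is a simplified, hypothetical snippet (not the actual filter from the repo); in Logstash 1.x the filter exposed the event to arbitrary Ruby code via event['fieldName']:

```
filter {
  ruby {
    # hypothetical example: derive a numeric epoch-seconds field
    # from the already-parsed @timestamp
    code => "event['event_date_seconds'] = event['@timestamp'].to_f"
  }
}
```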

The end result is that with this configuration, your raw modsec audit log entries will end up looking something like the JSON example below. Again, this is just how I ended up structuring the fields via the filters; you can take the above configuration example and change the output to suit your needs.

Also note that ModSecurity audit logs can definitely contain some very sensitive data (like user passwords etc.). So you might also want to take a look at using Logstash’s cipher filter to encrypt certain message fields in transit if you are sending these processed logs somewhere else: https://bitsofinfo.wordpress.com/2014/06/25/encrypting-logstash-data/
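Something along these lines, for example (a hedged sketch only; the field names and key are placeholders, and you should check the cipher filter options for your Logstash version):

```
filter {
  cipher {
    algorithm => "aes-256-cbc"
    key       => "your-secret-key-here"       # placeholder
    mode      => "encrypt"
    source    => "requestBody"                # field holding sensitive data
    target    => "requestBody_crypted"        # where the encrypted copy goes
  }
}
```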

EXAMPLE JSON OUTPUT, using this Logstash configuration

{
  "@timestamp": "2013-09-17T09:46:16.088Z",
  "@version": "1",
  "host": "razzle2",
  "path": "\/Users\/bof\/who2\/zip4n\/logstash\/modseclogs\/proxy9\/modsec_audit.log.1",
  "tags": [],
  "rawSectionA": "[17\/Sep\/2013:05:46:16 --0400] MSZkdwoB9ogAAHlNTXUAAAAD 65183 80",
  "rawSectionB": "POST \/xml\/rpc\/soapservice-v2 HTTP\/1.1\nContent-Type: application\/xml\nspecialcookie: tb034=\nCache-Control: no-cache\nPragma: no-cache\nUser-Agent: Java\/1.5.0_15\nHost: xmlserver.intstage442.org\nAccept: text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2\nConnection: keep-alive\nContent-Length: 93\nIncoming-Protocol: HTTPS\nab0044: 0\nX-Forwarded-For:",
  "rawSectionC": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "rawSectionF": "HTTP\/1.1 200 OK\nX-SESSTID: 009nUn4493\nContent-Type: application\/xml;charset=UTF-8\nContent-Length: 76\nConnection: close",
  "rawSectionH": "Message: Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required. [file \"\/opt\/niner\/modsec2\/pp7.conf\"] [line \"69\"] [id \"960010\"] [msg \"Request content type is not allowed by policy\"] [severity \"WARNING\"] [tag \"POLICY\/ENCODING_NOT_ALLOWED\"]\nApache-Handler: party-server-time2\nStopwatch: 1379411176088695 48158 (1771* 3714 -)\nProducer: ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.\nServer: Whoisthat\/v1 (Osprey)",
  "modsec_timestamp": "17\/Sep\/2013:05:46:16 --0400",
  "uniqueId": "MSZkdwoB9ogAAHlNTXUAAAAD",
  "sourceIp": "",
  "sourcePort": "65183",
  "destIp": "",
  "destPort": "80",
  "httpMethod": "POST",
  "requestedUri": "\/xml\/rpc\/soapservice-v2",
  "incomingProtocol": "HTTP\/1.1",
  "requestBody": "{\"id\":2,\"method\":\"report\",\"stuff\":[\"kborg2@special292.org\",\"X22322mkf3\"],\"xmlrpm\":\"0.1a\"}",
  "serverProtocol": "HTTP\/1.1",
  "responseStatus": "200 OK",
  "requestHeaders": {
    "Content-Type": "application\/xml",
    "specialcookie": "8jj220021kl==j2899IuU",
    "Cache-Control": "no-cache",
    "Pragma": "no-cache",
    "User-Agent": "Java\/1.5.1_15",
    "Host": "xmlserver.intstage442.org",
    "Accept": "text\/html, image\/gif, image\/jpeg, *; q=.2, *\/*; q=.2",
    "Connection": "keep-alive",
    "Content-Length": "93",
    "Incoming-Protocol": "HTTPS",
    "ab0044": "0",
    "X-Forwarded-For": ""
  },
  "responseHeaders": {
    "X-SESSTID": "009nUn4493",
    "Content-Type": "application\/xml;charset=UTF-8",
    "Content-Length": "76",
    "Connection": "close"
  },
  "auditLogTrailer": {
    "Apache-Handler": "party-server-time2",
    "Stopwatch": "1379411176088695 48158 (1771* 3714 -)",
    "Producer": "ModSecurity for Apache\/2.7 (http:\/\/www.modsecurity.org\/); core ruleset\/1.9.2.",
    "Server": "Whoisthat\/v1 (Osprey)",
    "messages": [
      {
        "info": "Warning. Match of \"rx (?:^(?:application\\\\\/x-www-form-urlencoded(?:;(?:\\\\s?charset\\\\s?=\\\\s?[\\\\w\\\\d\\\\-]{1,18})?)??$|multipart\/form-data;)|text\/xml)\" against \"REQUEST_HEADERS:Content-Type\" required.",
        "file": "\/opt\/niner\/modsec2\/pp7.conf",
        "line": "69",
        "id": "960010",
        "msg": "Request content type is not allowed by policy",
        "severity": "WARNING",
        "tag": "POLICY\/ENCODING_NOT_ALLOWED"
      }
    ]
  },
  "event_date_microseconds": 1.3794111760887e+15,
  "event_date_milliseconds": 1379411176088.7,
  "event_date_seconds": 1379411176.0887,
  "event_timestamp": "2013-09-17T09:46:16.088Z",
  "XForwardedFor-GEOIP": {
    "ip": "",
    "country_code2": "XZ",
    "country_code3": "BRZ",
    "country_name": "Brazil",
    "continent_code": "SA",
    "region_name": "12",
    "city_name": "Vesper",
    "postal_code": "",
    "timezone": "Brazil\/Continental",
    "real_region_name": "Region Metropolitana"
  },
  "matchedRules": [
    "SecRule \"REQUEST_METHOD\" \"@rx ^POST$\" \"phase:2,status:400,t:lowercase,t:replaceNulls,t:compressWhitespace,chain,t:none,deny,log,auditlog,msg:'POST request must have a Content-Length header',id:960022,tag:PROTOCOL_VIOLATION\/EVASION,severity:4\"",
    "SecRule \"REQUEST_FILENAME|ARGS|ARGS_NAMES|REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:Referer\" \"@pm jscript onsubmit onchange onkeyup activexobject vbscript: <![cdata[ http: settimeout onabort shell: .innerhtml onmousedown onkeypress asfunction: onclick .fromcharcode background-image: .cookie onunload createtextrange onload <input\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:compressWhiteSpace,t:lowercase,nolog,skip:1\"",
    "SecAction \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,nolog,skipAfter:950003\"",
    "SecRule \"REQUEST_HEADERS|XML:\/*|!REQUEST_HEADERS:'\/^(Cookie|Referer|X-OS-Prefs)$\/'|REQUEST_COOKIES|REQUEST_COOKIES_NAMES\" \"@pm gcc g++\" \"phase:2,status:406,t:lowercase,t:replaceNulls,t:compressWhitespace,t:none,t:urlDecodeUni,t:htmlEntityDecode,t:lowercase,nolog,skip:1\""
  ],
  "secRuleIds": [
    "960022",
    "960050"
  ]
}
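For context on what the filters have to contend with: a raw entry in the (serial) ModSecurity audit log is a multi-line blob whose lettered sections (the A, B, C, F, H seen above) are delimited by boundary lines like --MSZkdwoB9ogAAHlNTXUAAAAD-A--. A minimal sketch of that first section-splitting step in Python, illustrative only — the actual Logstash configuration does this with multiline handling and ruby filters:

```python
import re

# Boundary lines look like: --<uniqueEntryId>-<sectionLetter>--
BOUNDARY = re.compile(r'^--(\w+)-([A-Z])--$')

def parse_audit_entry(raw):
    """Split one raw ModSecurity serial audit log entry into a dict
    keyed by section letter, e.g. {'A': '...', 'B': '...', 'H': '...'}."""
    sections = {}
    current = None
    for line in raw.splitlines():
        m = BOUNDARY.match(line)
        if m:
            current = m.group(2)
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    # Section Z just terminates the entry and carries no payload
    return {k: '\n'.join(v).strip() for k, v in sections.items() if k != 'Z'}
```

From there, per-section parsing can carve section A into timestamp/IPs/ports, section B into method, URI and request headers, and so on — which is essentially what the filters in the repo's configuration produce as the fields shown above.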

How to access your OpenShift MongoDB database remotely on OS-X

I recently started playing around with Redhat’s OpenShift PaaS and installed the MongoDB and RockMongo cartridges on my application. My use case was just to leverage the OpenShift platform to run my MongoDB instance for me, and I really wasn’t ready (nor needing) to push an actual application out to the app running @ OpenShift; instead I just wanted my local Java program to leverage the remote MongoDB instance, pump some data into it and then view it in RockMongo (also running on the app at OpenShift).

Turns out you can do this by enabling port forwarding locally on the computer you want to connect from. This is on OS-X:

  • First, ensure that your SSH key is available for use by the command we will run: “ssh-add /path/to/your/openshift/ssh.key”. This should be the key that you created when you initially signed up for OpenShift.
  • Attempt to run the port forwarding command: “rhc port-forward -a [yourAppName] -l [yourOpenShiftLoginId]”. It will prompt you for your OpenShift credentials, then will likely output something like the below:

Checking available ports...

Binding httpd ->…
Binding mongod ->…

Use ctl + c to stop

bind: Can’t assign requested address
channel_setup_fwd_listener: cannot listen to port: 27017
bind: Can’t assign requested address
channel_setup_fwd_listener: cannot listen to port: 8080
Could not request local forwarding.

  • The above error occurs because we need to enable loopback aliases for each failed 127.x address listed above (those addresses will be different in your output). To do this, run the following for EACH loopback that failed: “sudo ifconfig lo0 alias 127.x.x.x”. Note that even after running this, make sure you don’t already have something else bound locally on those ports (i.e. a local dev MongoDB instance or something else on 8080)
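If several addresses failed, you can handle them in one pass. A hedged sketch — the two addresses below are placeholders, so substitute the actual 127.x addresses from your own rhc output; the leading echo prints each command instead of running it, so you can eyeball them first:

```shell
# Placeholder loopback addresses - replace with those from your rhc output
FAILED_ALIASES="127.1.2.3 127.4.5.6"

for ip in $FAILED_ALIASES; do
  # drop the leading 'echo' to actually create each alias (requires sudo)
  echo sudo ifconfig lo0 alias "$ip"
done
```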
  • After you add the loopback aliases, re-run the port forwarding command above; this time it should complete successfully. If you go to https://127.x.x.x:8080 your request will go directly to your OpenShift instance.
  • You can also now connect to your MongoDB instance programmatically, using the forwarded address:ports above plus the appropriate credentials
  • Interestingly enough, the port forwarding goes over an SSH tunnel to your app there, and it appears that OpenShift itself runs on AWS!
Some relevant links that helped me and may help you out as well: