
ModSecurity & Logging

Demo code for the talk Hands-On ModSecurity and Logging. This is insecure and generally bad code; use it only for demos and as education on what not to do.


  1. Show the app, then focus on the login form. After demonstrating a successful login, try ' or true -- with a random password to skip the login form.
  2. Let's look at this; it looks potentially interesting, right?
  3. Validate the suspicion with sqlmap --url "" --purge. This assumes you have installed sqlmap (for example with Homebrew); otherwise, download it and run it with python.
  4. So this has potential. Quickly show the code, focusing on the string concatenation and mysqli_multi_query.
  5. Exploit the bad code by attaching ;INSERT INTO employees (name) VALUES ('Bad Actor') to
  6. Also, we are not escaping the output, so ;INSERT INTO employees (name) VALUES ('<script>alert("Hello Friend")</script>') will add more fun to the demo.
  7. Dive into the logging by showing /var/log/app.log and then how Filebeat is collecting this information.
  8. In Kibana show the relevant parts either in Discover or the Log UI by filtering down to application : "app".
  9. Try to DELETE or DROP data for example with ;DROP TABLE employees, which doesn't work since our connection only allows SELECT or INSERT.
  10. Point to the ModSecurity-protected setup, which uses the same code but runs on Apache with ModSecurity instead of nginx.
  11. Run sqlmap --url "" --purge again, which now results in 403 (Forbidden) 134 times.
  12. Show the Apache Filebeat dashboard where you can see the blocked requests.
  13. Also show the raw ModSecurity logs by filtering to application : "mod_security" and point out that JSON logging is the important configuration here as well as the rename to the message field.
  14. Show the custom rule in action by trying to add someone called Shay Banon or just Shay and show the log message.
  15. But this isn't fully foolproof. For example, '(Or)1=1() still allows you to skip the login form.
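For steps 13 and 14, the JSON logging mentioned there is ModSecurity's native JSON audit-log format, and a custom blocking rule could look roughly like the following. This is a hypothetical sketch, not the demo's exact configuration; the rule id, log path, and pattern are made up:

```apacheconf
# Hypothetical Apache/ModSecurity configuration sketch
SecAuditLogFormat JSON
SecAuditLog /var/log/modsec_audit.json

# Custom rule: deny any request parameter containing "shay" (case-insensitive)
SecRule ARGS "@rx (?i)shay" \
    "id:100001,phase:2,deny,status:403,log,msg:'Blocked name Shay'"
```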
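Step 13 also mentions renaming into the message field; with Filebeat, that kind of rename is typically done with the rename processor. A hypothetical sketch, assuming the audit-log path and source field name (both are guesses, not the demo's real configuration):

```yaml
# Hypothetical Filebeat input for the ModSecurity JSON audit log
filebeat.inputs:
  - type: log
    paths:
      - /var/log/modsec_audit.json   # assumed path
    json.keys_under_root: true
    json.add_error_key: true
    processors:
      - rename:
          fields:
            - from: "transaction"    # assumed source field
              to: "message"
```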
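The root cause exploited in steps 4 to 6 is building the SQL statement by string concatenation. A minimal sketch of the same anti-pattern and its fix, using Python's sqlite3 as a stand-in for the demo's PHP/mysqli code (the table and column names here are assumptions, not necessarily the demo's actual schema):

```python
import sqlite3

def setup_db():
    # In-memory stand-in for the demo's employees table
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (name TEXT, password TEXT)")
    conn.execute("INSERT INTO employees VALUES ('alice', 'secret')")
    return conn

def login_vulnerable(conn, user, password):
    # BAD: string concatenation, the same anti-pattern as the demo code
    query = ("SELECT name FROM employees WHERE name = '" + user
             + "' AND password = '" + password + "'")
    return conn.execute(query).fetchone() is not None

def login_safe(conn, user, password):
    # GOOD: placeholders make the payload plain data, not SQL
    query = "SELECT name FROM employees WHERE name = ? AND password = ?"
    return conn.execute(query, (user, password)).fetchone() is not None

conn = setup_db()
payload = "' or true --"  # the login-bypass payload from step 1
print(login_vulnerable(conn, payload, "random"))  # True: login bypassed
print(login_safe(conn, payload, "random"))        # False: just a strange username
```

The vulnerable version ends up executing WHERE name = '' or true --'..., which is always true; the parameterized version never interprets the payload as SQL.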


Make sure you have run this before the demo.

  1. Have your AWS account set up, access key created, and added as environment variables in AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. Protip: Use to keep your environment variables safe.
  2. Create the Elastic Cloud instance with the same version as specified in variables.yml's elastic_version and set the environment variables with the values for ELASTICSEARCH_HOST, ELASTICSEARCH_USER, ELASTICSEARCH_PASSWORD, as well as KIBANA_HOST, KIBANA_ID.
  3. Change the settings to a domain you have registered under Route53 in inventory,, and variables.yml. Set the Hosted Zone for that domain and export the Zone ID under the environment variable TF_VAR_zone_id. If you haven't created the Hosted Zone yet, you should set it up in the AWS Console first and then set the environment variable.
  4. If you haven't installed the AWS plugin for Terraform, get it with terraform init first. Then create the keypair, DNS settings, and instances with terraform apply.
  5. Apply the configuration to the instance with ansible-playbook configure.yml and then deploy with ansible-playbook deploy.yml.

When you are done, remove the instances, DNS settings, and key with terraform destroy.


  • Add file inclusion (local or remote) trickery?
  • Add cookie trickery?
