cariddi

Take a list of domains, crawl URLs and scan for endpoints, secrets, API keys, file extensions, tokens and more.

Coded with ❤ by edoardottt

Preview • Install • Get Started • Examples • Changelog • Contributing • License

Preview 📊



Install

Using Docker

docker build -t cariddi .
docker run cariddi -h

Building from source

You need Go.

  • Linux

    • git clone
    • cd cariddi
    • go get
    • make linux (to install)
    • make unlinux (to uninstall)

    Or in one line: git clone; cd cariddi; go get; make linux

  • Windows (the executable works only inside the cariddi folder)

    • git clone
    • cd cariddi
    • go get
    • .\make.bat windows (to install)
    • .\make.bat unwindows (to uninstall)

Get Started

cariddi -h prints the help in the command line.

Note: don't rely on the CLI output; always use -ot/-oh to save the output.

Usage of cariddi:
  -c int
    	Concurrency level. (default 20)
  -cache
    	Use the .cariddi_cache folder as cache.
  -d int
    	Delay between a page crawled and another.
  -debug
    	Print debug information while crawling.
  -e	Hunt for juicy endpoints.
  -ef string
    	Use an external file (txt, one per line) with custom parameters for endpoints hunting.
  -err
    	Hunt for errors in websites.
  -examples
    	Print the examples.
  -ext int
    	Hunt for juicy file extensions. Integer from 1(juicy) to 7(not juicy).
  -h	Print the help.
  -headers string
    	Use custom headers for each request. E.g. -headers "Cookie: auth=yes;;Client: type=2".
  -headersfile string
    	Read custom headers from an external file (same format as the headers flag).
  -i string
    	Ignore URLs containing at least one of the elements of this list.
  -info
    	Hunt for useful information in websites.
  -intensive
    	Crawl searching for resources matching the 2nd level domain.
  -it string
    	Ignore URLs containing at least one of the lines of this file.
  -oh string
    	Write the output into an HTML file.
  -ot string
    	Write the output into a TXT file.
  -plain
    	Print only the results.
  -proxy string
    	Set a proxy to be used (http and socks5 supported).
  -rua
    	Use a random browser user agent on every request.
  -s	Hunt for secrets.
  -sf string
    	Use an external file (txt, one per line) with custom regexes for secrets hunting.
  -t int
    	Set timeout for the requests. (default 10)
  -version
    	Print the version.


  • cariddi -version (Print the version)

  • cariddi -h (Print the help)

  • cariddi -examples (Print the examples)

  • cat urls | cariddi -s (Hunt for secrets)

  • cat urls | cariddi -d 2 (2 seconds between a page crawled and another)

  • cat urls | cariddi -c 200 (Set the concurrency level to 200)

  • cat urls | cariddi -e (Hunt for juicy endpoints)

  • cat urls | cariddi -plain (Print only useful things)

  • cat urls | cariddi -ot target_name (Results in txt file)

  • cat urls | cariddi -oh target_name (Results in html file)

  • cat urls | cariddi -ext 2 (Hunt for juicy (level 2 out of 7) files)

  • cat urls | cariddi -e -ef endpoints_file (Hunt for custom endpoints)
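The -ef flag expects a plain-text file with one custom parameter per line. A minimal sketch of creating such a file (the parameter names below are illustrative, not from the tool):

```shell
# Hypothetical endpoints_file: one parameter name per line,
# as the -ef flag expects (txt, one per line).
printf '%s\n' admin token debug redirect_url > endpoints_file
cat endpoints_file
```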

  • cat urls | cariddi -s -sf secrets_file (Hunt for custom secrets)
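Similarly, the -sf flag takes one regular expression per line. A sketch with two example patterns (AWS-style and Slack-style token regexes, chosen here only for illustration):

```shell
# Hypothetical secrets_file: one regex per line, as -sf expects.
printf '%s\n' 'AKIA[0-9A-Z]{16}' 'xox[baprs]-[0-9A-Za-z-]+' > secrets_file

# Sanity-check the first pattern against a fake key:
echo 'aws_key=AKIAIOSFODNN7EXAMPLE' | grep -Eo 'AKIA[0-9A-Z]{16}'
```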

  • cat urls | cariddi -i forum,blog,community,open (Ignore URLs containing these words)

  • cat urls | cariddi -it ignore_file (Ignore URLs containing at least one line of the input file)
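The -it flag reads the same kind of ignore terms from a file, one per line; any URL containing one of those substrings is skipped. A sketch mirroring the -i example above:

```shell
# Hypothetical ignore_file: one substring per line.
printf '%s\n' forum blog community open > ignore_file

# A URL containing one of the terms would be ignored:
echo 'https://example.com/blog/post' | grep -qf ignore_file && echo ignored
```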

  • cat urls | cariddi -cache (Use the .cariddi_cache folder as cache)

  • cat urls | cariddi -t 5 (Set the timeout for the requests)

  • cat urls | cariddi -intensive (Crawl searching also subdomains, matching the 2nd level domain)

  • cat urls | cariddi -rua (Use a random browser user agent on every request)

  • cat urls | cariddi -proxy (Set a Proxy (http and socks5 supported))

  • cat urls | cariddi -headers "Cookie: auth=admin;type=2;; X-Custom: customHeader"

  • cat urls | cariddi -headersfile headers.txt (Read from an external file custom headers)
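The headers file uses the same Name: value format as the -headers flag; assuming one header per line (an assumption — check the project docs for the exact separator), a sketch:

```shell
# Hypothetical headers.txt: one "Name: value" header per line.
printf '%s\n' 'Cookie: auth=admin;type=2' 'X-Custom: customHeader' > headers.txt
cat headers.txt
```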

  • cat urls | cariddi -err (Hunt for errors in websites)

  • cat urls | cariddi -info (Hunt for useful information in websites)

  • cat urls | cariddi -debug (Print debug information while crawling)

  • For Windows:

    • use powershell.exe -Command "cat urls | .\cariddi.exe" inside the Command prompt
    • or just cat urls | cariddi.exe using PowerShell
  • To integrate cariddi with Burp Suite, make sure to follow these steps.


Changelog

Detailed changes for each release are documented in the release notes.


Contributing

Just open an issue or a pull request.

Help me build this!

Special thanks to: go-colly, zricethezav, projectdiscovery, tomnomnom and RegexPassive.

To do:

  • [ ] Tests

  • [ ] Tor support

  • [x] Custom headers support

  • [x] Proxy support

  • [x] Ignore specific types of urls

  • [x] Plain output (print only results)

  • [x] HTML output

  • [x] Output color

  • [x] Endpoints (parameters) scan

  • [x] Secrets scan

  • [x] Extensions scan

  • [x] TXT output


License

This repository is under the GNU General Public License v3.0.
