Project Name | Description | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|---|
Wooyun_public | This repo is archived. Thanks to Wooyun! Crawler and search for wooyun.org public bugs (vulnerabilities) and knowledge-base drops. | 3,701 | | | | 4 years ago | | | 29 | | PHP |
Ambar | :mag: Ambar: Document Search Engine | 1,797 | | | | 2 years ago | | | 2 | mit | JavaScript |
Open Source Search Engine | Nov 20 2017 -- A distributed open-source search engine and spider/crawler written in C/C++ for Linux on Intel/AMD. From gigablast dot com, which has binaries for download. See the README.md file at the very bottom of this page for instructions. | 1,376 | | | | 7 months ago | | | 89 | apache-2.0 | C++ |
Spider | Python website crawler. | 919 | | | | 5 months ago | | | 88 | | Python |
Angrysearch | Linux file search, instant results as you type | 866 | | | | a year ago | | | 37 | gpl-2.0 | Python |
Ipfs Search | Search engine for the Interplanetary Filesystem. | 758 | | | 2 | 2 months ago | 19 | April 20, 2021 | 41 | agpl-3.0 | Go |
Fess | Fess is a very powerful and easily deployable Enterprise Search Server. | 714 | | 12 | 19 | 20 hours ago | 135 | June 13, 2022 | 27 | apache-2.0 | Java |
Tweetscraper | TweetScraper is a simple crawler/spider for Twitter Search without using the API. | 698 | | | | 2 years ago | 8 | April 29, 2018 | 1 | gpl-2.0 | Python |
Go Dork | The fastest dork scanner written in Go. | 677 | | | | 6 months ago | 4 | April 03, 2021 | 4 | mit | Go |
Filemasta | A search application to explore, discover and share online files | 625 | | | | a year ago | | | 4 | gpl-3.0 | C# |
This is an open-source, multi-threaded website crawler written in Python. There is still a lot of work to do, so feel free to help out with development.
Note: This is part of an open source search engine. The purpose of this tool is to gather links only. The analytics, data harvesting, and search algorithms are being created as separate programs.
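The core idea of such a link-gathering crawler can be sketched as follows. This is a minimal, illustrative example only, not this project's actual code: it assumes `requests` and `BeautifulSoup` for fetching and parsing, and the function names (`fetch_links`, `crawl`) and parameters are made up for the sketch.

```python
# Minimal sketch of a multi-threaded, link-gathering crawler.
# Illustrative only; not the project's actual implementation.
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def fetch_links(url):
    """Download one page and return the absolute links found on it."""
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
    except requests.RequestException:
        return set()
    soup = BeautifulSoup(response.text, "html.parser")
    return {urljoin(url, a["href"]) for a in soup.find_all("a", href=True)}


def crawl(seed_url, max_pages=100, workers=8):
    """Breadth-first crawl from seed_url, collecting links only."""
    domain = urlparse(seed_url).netloc
    seen, frontier, collected = {seed_url}, [seed_url], set()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while frontier and len(seen) < max_pages:
            batch, frontier = frontier, []
            # Fetch the current frontier concurrently across the thread pool.
            for links in pool.map(fetch_links, batch):
                collected |= links
                for link in links:
                    # Stay on the seed domain and avoid revisiting pages.
                    if urlparse(link).netloc == domain and link not in seen:
                        seen.add(link)
                        frontier.append(link)
    return collected


if __name__ == "__main__":
    for link in sorted(crawl("https://example.com", max_pages=25)):
        print(link)
```

In this layout the crawler only collects links, matching the note above; the analytics and search components would consume the resulting link set as separate programs.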