This script performs reconnaissance on a list of domains. It creates directories for storing output, extracts headers and response bodies from each domain, extracts script endpoints and downloads scripts, extracts relative URLs from scripts, and runs nmap scans on each domain.

🕵️‍♂️ JSpy

JSpy is a reconnaissance tool that automates the process of gathering information about a list of domains. It creates output directories for each domain, extracts headers and response bodies, extracts script endpoints and relative URLs, and runs nmap scans. Whether you're a bug bounty hunter, penetration tester, or security researcher, JSpy can save you time and increase your chances of finding vulnerabilities in your target domains.

🚀 Features:

  • Extracts headers and response bodies
  • Extracts script endpoints and downloads scripts
  • Extracts relative URLs from scripts
  • Runs nmap scans on each domain
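The features above can be sketched as a handful of shell helpers. This is a hypothetical outline of the steps JSpy automates per target URL, not the script's actual internals; the function names, directory layout, and `extract.rb` path are assumptions based on this README.

```shell
#!/usr/bin/env bash
# Illustrative sketch of JSpy's per-URL recon steps (names are assumptions).

# Save response headers and body for one URL into an output directory.
fetch_target() {
  local url=$1 out=$2
  mkdir -p "$out"
  curl -sk -D "$out/headers.txt" -o "$out/body.html" "$url"
}

# Pull <script src="..."> endpoints out of a saved HTML body.
extract_script_endpoints() {
  grep -oP '<script[^>]*src="\K[^"]+' "$1"
}

# Run relative-url-extractor over a downloaded script.
extract_relative_urls() {
  ruby /root/Tools/relative-url-extractor/extract.rb "$1"
}

# Service-detection nmap scan, logged with tee.
scan_target() {
  nmap -sV "$1" | tee "$2/nmap.txt"
}
```

Each helper maps to one bullet above; chaining them per line of `alive.txt` reproduces the overall flow.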

📖 Requirements

  • curl
  • awk
  • bash
  • grep
  • find
  • rmdir
  • tee
  • nmap
  • Ruby (and the extract.rb script located in /root/Tools/relative-url-extractor/)
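Before running JSpy you can verify these dependencies are present. This is an optional pre-flight check assembled from the list above, not part of JSpy itself; the `extract.rb` path is the one assumed in the setup steps.

```shell
#!/usr/bin/env bash
# Optional pre-flight check for the dependencies listed in this README.
check_deps() {
  local tool
  for tool in curl awk bash grep find rmdir tee nmap ruby; do
    command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
  done
  [ -f /root/Tools/relative-url-extractor/extract.rb ] ||
    echo "missing: /root/Tools/relative-url-extractor/extract.rb"
}

check_deps
```

An empty output means everything JSpy needs is on the PATH.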

⚙️ How to Set It Up

  1. Create a file named scope.txt that contains the target domain:
┌──(root㉿kali)-[~/path/to/directory]
└─# cat scope.txt
example.com
  2. Create a directory named Tools in the root user's home directory and clone relative-url-extractor into it:
┌──(root㉿kali)-[~/path/to/directory]
└─# mkdir /root/Tools/ && cd /root/Tools/ && git clone https://github.com/jobertabma/relative-url-extractor.git
  3. Create a file named alive.txt that contains the subdomains of your target in URL form:
┌──(root㉿kali)-[~/path/to/directory]
└─# cat alive.txt 
https://subdomain1.example.com
https://subdomain2.example.com
http://subdomain3.example.com
  4. Create a folder named after the target domain in the JSpy directory:
┌──(root㉿kali)-[~/path/to/directory]
└─# mkdir example.com
  5. Move alive.txt into the folder you just created:
┌──(root㉿kali)-[~/path/to/directory]
└─# mv alive.txt example.com/
  6. You are now ready to run the script (full example for the target kiwi.com):
┌──(root㉿kali)-[~]
└─# git clone https://github.com/SecuritySphinx/JSpy.git
┌──(root㉿kali)-[~]
└─# cd JSpy
┌──(root㉿kali)-[~/JSpy]
└─# echo kiwi.com > scope.txt
┌──(root㉿kali)-[~/JSpy]
└─# mkdir kiwi.com
┌──(root㉿kali)-[~/JSpy]
└─# mv /root/Target/alive.txt /root/JSpy/kiwi.com/
┌──(root㉿kali)-[~/JSpy]
└─# bash jspy.sh scope.txt
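With that layout in place, jspy.sh can walk scope.txt and each target's alive.txt. The following is a hypothetical outline of that outer loop based on the file layout the setup steps create, not jspy.sh's actual code; `run_jspy` is an illustrative name.

```shell
#!/usr/bin/env bash
# Hypothetical outline of how jspy.sh might iterate over its inputs
# (scope.txt listing domains, <domain>/alive.txt listing live URLs).
run_jspy() {
  local scope=$1 domain url
  while read -r domain; do
    while read -r url; do
      echo "[*] recon on $url (target: $domain)"
      # ...fetch headers/body, pull scripts, run nmap here...
    done < "$domain/alive.txt"
  done < "$scope"
}
```

Usage mirrors the walkthrough above: `run_jspy scope.txt` from the JSpy directory.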

🐕 DataHound

The DataHound script uses command-line options to search for specific input within HTML, JavaScript, Nmap scans, and header files. The script traverses through all the collected data and uses grep to find matching keywords.

root@ubuntu:~/example.com$ ./search.sh -h
[+]USAGE: ./search.sh (OPTIONS)
-j (string) - search in JavaScript files
-x (string) - search in header files
-e (string) - search in HTML files
-n (string) - search in nmap scans
-h - help

💡 Move the DataHound.sh script to the JS folder of your target.

Examples of using the DataHound script

$ ./search.sh -j "admin"          # search for "admin" in JavaScript files
$ ./search.sh -x "nginx"          # search for "nginx" in header files
$ ./search.sh -e "s3.amazonaws"   # search for "s3.amazonaws" in HTML files
$ ./search.sh -n "ssh"            # search Nmap scans for the string "ssh"
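A search script like this is typically a thin getopts wrapper around grep. The sketch below shows one way such option handling could be wired up; the directory globs (`./js/`, `./headers/`, `./html/`, `./nmap/`) are assumptions about where JSpy stores each data type, not DataHound's actual layout.

```shell
#!/usr/bin/env bash
# Sketch of a DataHound-style keyword search (paths are assumptions).

# grep a keyword across a set of directories: recursive, case-insensitive,
# with filename and line number in the output.
search_files() {
  local pattern=$1; shift
  grep -rin --color=never "$pattern" "$@" 2>/dev/null
}

while getopts "j:x:e:n:h" opt; do
  case $opt in
    j) search_files "$OPTARG" ./js/ ;;       # JavaScript files
    x) search_files "$OPTARG" ./headers/ ;;  # header files
    e) search_files "$OPTARG" ./html/ ;;     # HTML files
    n) search_files "$OPTARG" ./nmap/ ;;     # nmap scan output
    h) echo "USAGE: $0 -j|-x|-e|-n <string>" ;;
  esac
done
```

Each option simply points the same grep at a different slice of the collected data, which matches the behaviour the usage examples above demonstrate.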

Support

Thanks for visiting my repository! If you find my work useful, please consider buying me a coffee to support my future projects.

Buy Me A Coffee
