
linkMap

linkMap is an advanced CLI tool for recursively discovering endpoints and JavaScript files from web pages. It is designed for security researchers, bug bounty hunters, and developers who want to map out all endpoints and JS dependencies of a website.

Features

  • CLI-based, easy to use
  • Supports single and recursive (multi) crawling modes
  • Extracts endpoints and JS files from HTML and JS sources
  • Avoids duplicate crawling
  • Regex filtering for endpoints
  • Cookie support for authenticated requests
  • Optionally saves results as a structured JSON file or prints to console
  • Detailed logging of all actions
  • Unique output filenames with crawl context
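The endpoint and JS extraction described above can be sketched with the tool's two documented dependencies, `requests` and `beautifulsoup4`. This is an illustrative sketch of the technique, not linkMap's actual code; the function name `extract_links` is hypothetical, and the HTML here is inlined so the example runs offline.

```python
# Hypothetical sketch of extracting endpoints and JS file URLs from HTML,
# the technique the feature list describes. Not the tool's actual code.
from urllib.parse import urljoin

from bs4 import BeautifulSoup

def extract_links(html: str, base_url: str):
    """Return (endpoints, js_files) found in href/src attributes."""
    soup = BeautifulSoup(html, "html.parser")
    endpoints, js_files = set(), set()
    for tag in soup.find_all(["a", "script", "link"]):
        url = tag.get("href") or tag.get("src")
        if not url:
            continue
        absolute = urljoin(base_url, url)  # resolve relative paths
        if absolute.endswith(".js"):
            js_files.add(absolute)
        else:
            endpoints.add(absolute)
    return sorted(endpoints), sorted(js_files)

html = '<a href="/api/users"></a><script src="/static/app.js"></script>'
print(extract_links(html, "https://example.com"))
# → (['https://example.com/api/users'], ['https://example.com/static/app.js'])
```

In multi mode, each discovered `.js` URL would itself be fetched and scanned, with already-visited URLs tracked in a set to avoid duplicate crawling.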

Installation

  1. Clone the repository:
    git clone https://github.com/EnesKeremAYDIN/linkMap.git
    cd linkMap
  2. Install dependencies:
    pip install -r requirements.txt

Usage

Run the tool from the command line:

python linkMap.py -i <url> [<url2> ...] [options]

Arguments

  • -i, --input : Target URL(s) to crawl (required, one or more)
  • -m, --mode : Crawl mode: s (single, only the given URL) or m (multi, recursively crawl discovered JS files). Default: s
  • -r, --ragex : Regular expression used to filter discovered endpoints (optional)
  • -c, --cookies : Add cookies to requests (e.g. sessionid=abc123;token=xyz) (optional)
  • -s, --save : Save results as a JSON file (default: print to console)
  • -h, --help : Show help message
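For readers extending the tool, the documented interface maps onto a standard `argparse` setup along these lines. This is a hedged reconstruction from the argument list above, not linkMap's actual source.

```python
# Minimal argparse sketch mirroring the documented flags; illustrative only.
import argparse

parser = argparse.ArgumentParser(prog="linkMap.py")
parser.add_argument("-i", "--input", nargs="+", required=True,
                    help="Target URL(s) to crawl")
parser.add_argument("-m", "--mode", choices=["s", "m"], default="s",
                    help="Crawl mode: s (single) or m (multi/recursive)")
parser.add_argument("-r", "--ragex", default=None,
                    help="Regex filter for endpoints")
parser.add_argument("-c", "--cookies", default=None,
                    help="Cookies, e.g. sessionid=abc123;token=xyz")
parser.add_argument("-s", "--save", action="store_true",
                    help="Save results as JSON instead of printing")

args = parser.parse_args(["-i", "https://example.com", "-m", "m", "-s"])
print(args.input, args.mode, args.save)
# → ['https://example.com'] m True
```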

Example Commands

  • Single page crawl, print to console:
    python linkMap.py -i https://example.com
  • Recursive crawl, save to file:
    python linkMap.py -i https://example.com -m m -s
  • Crawl with cookies and regex filter:
    python linkMap.py -i https://example.com -c "sessionid=abc123" -r "api"
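The cookie string passed with `-c` uses the `name=value;name2=value2` form shown above. A sketch of how such a string could be turned into a dict for the `requests` library follows; the helper name is hypothetical and the tool's real parsing may differ.

```python
# Hypothetical helper: split "name=value;name2=value2" into a cookie dict.
def parse_cookie_string(raw: str) -> dict:
    cookies = {}
    for pair in raw.split(";"):
        if "=" in pair:
            name, _, value = pair.strip().partition("=")
            cookies[name] = value
    return cookies

print(parse_cookie_string("sessionid=abc123;token=xyz"))
# → {'sessionid': 'abc123', 'token': 'xyz'}
```

A dict like this can be passed directly as `requests.get(url, cookies=cookies)`, which is how `requests` attaches cookies to authenticated crawls.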

Output

  • If --save is used, results are saved as a uniquely named JSON file (e.g. linkmap_multi_example_com_20240607T153000.json).
  • If not, results are printed as JSON to the console.
  • The JSON contains:
    • endpoints: List of discovered endpoints
    • details: Mapping of each crawled URL to its found endpoints and JS files
    • usage: The crawl parameters
    • timestamp: The crawl time (UTC)
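The filename pattern and result fields described above can be sketched as follows. The field names come from the README; the `output_filename` helper and the exact sanitization rules are assumptions inferred from the example filename, not the tool's actual code.

```python
# Sketch of the documented output: unique filename plus result structure.
# Helper name and sanitization details are hypothetical.
import json
import re
from datetime import datetime, timezone

def output_filename(mode: str, url: str) -> str:
    """Build a name like linkmap_multi_example_com_20240607T153000.json."""
    mode_word = "multi" if mode == "m" else "single"
    host = re.sub(r"[^A-Za-z0-9]+", "_", url.split("//")[-1].split("/")[0])
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    return f"linkmap_{mode_word}_{host}_{stamp}.json"

result = {
    "endpoints": ["https://example.com/api/users"],
    "details": {
        "https://example.com": {
            "endpoints": ["https://example.com/api/users"],
            "js_files": ["https://example.com/static/app.js"],
        }
    },
    "usage": {"mode": "m", "ragex": None},
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
print(output_filename("m", "https://example.com"))
print(json.dumps(result)[:40])
```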

Requirements

  • Python 3+
  • requests
  • beautifulsoup4

Install dependencies with:

pip install -r requirements.txt

License

This project is for educational and research purposes.
