Scanning JS Files for Endpoints and Secrets

It’s been 7 months since I started my journey in infosec, and our community has taught me a lot. From getting my first bounty to landing a job in infosec, I have only grown stronger. So it is only fair to try to give back to the community. So here it is, my first blog post. I hope it helps you in some way. Enjoy!!

Everyone knows recon is all about increasing the attack surface during penetration testing or bug bounty hunting. One way to do this is by looking at JavaScript files. JavaScript files are the backbone of web applications: they define how a website interacts with different web services. So I wanted to analyze JavaScript files to discover more endpoints within the application, and also to look for hardcoded secrets in those files, which developers sometimes forget about.

At the very beginning, I tried doing this manually: loading the JavaScript file in the browser and then searching for endpoints and naughty strings. Totally manual. It’s a decent approach and can give you quite interesting results, but it’s time-consuming and becomes troublesome when dealing with hundreds or thousands of domains. So I decided to automate this workflow to some extent. I am not much of a programmer, but I believe in automation; it saves a ton of time that we can spend on other things, like manually understanding the website’s functionality. So I decided to create a bash script to automate JavaScript static analysis.
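To give a feel for the manual approach, here is a rough sketch using curl and grep. The URL and the endpoint regex are illustrative assumptions, not the exact method from this post:

```shell
# Download one JS file locally (URL is a placeholder)
curl -s https://example.com/static/app.js -o sample.js
# Pull out quoted strings that look like relative endpoints (rough heuristic)
grep -oE '"/[A-Za-z0-9_/.-]+"' sample.js | sort -u
```

This already surfaces path-like strings, but you have to repeat it per file, which is exactly what made the manual route painful at scale.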

I had only three goals:

  • Gather the JavaScript file links present on a domain.
  • Discover the endpoints present in those JavaScript files.
  • Save those JavaScript files for further static analysis, where we can look for hardcoded credentials and the like.

In this script, I used the tool LinkFinder by Gerben Javado, which I modified a bit to suppress some error messages. Here is the link to his article about the LinkFinder tool. I also considered a couple of other options, such as JSParser by nahamsec and relative-url-extractor by Jobert Abma. LinkFinder has a CLI output feature that prints the results directly to the terminal, so I went with it and created this bash wrapper around it to get the job done.

When you run the tool, it will crawl the domains present in alive.txt and create two folders, js and db.

It will then crawl the JavaScript files from those domains and save the JavaScript links and the LinkFinder results to the js folder.

It will save the JavaScript files themselves in the db folder for further manual analysis.
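As a rough sketch (not the actual script from the repository), the workflow above could look something like this. alive.txt, the js and db folders, and LinkFinder’s CLI output mode come from the post; the linkfinder.py location and the link-extraction heuristic are my assumptions:

```shell
#!/usr/bin/env bash
# Minimal sketch of the described workflow; assumes linkfinder.py is on hand.

mkdir -p js db

# Very rough heuristic: pull <script src="...">-style .js URLs out of HTML
extract_js_links() {
  grep -oE 'src="[^"]+\.js[^"]*"' | sed -E 's/^src="//; s/"$//'
}

if [ -f alive.txt ]; then
  while read -r domain; do
    # Build a filesystem-safe name for this domain's output files
    safe=$(printf '%s' "$domain" | tr -c 'A-Za-z0-9' '_')
    # 1. Gather the JavaScript links present on the domain
    curl -s "$domain" | extract_js_links | sort -u > "js/${safe}_links.txt"
    while read -r jsurl; do
      # 2. Discover endpoints with LinkFinder's CLI output mode
      python3 linkfinder.py -i "$jsurl" -o cli >> "js/${safe}_endpoints.txt"
      # 3. Save the JS itself into db/ for manual grepping later
      curl -s "$jsurl" >> "db/${safe}.js"
    done < "js/${safe}_links.txt"
  done < alive.txt
fi
```

The real script handles more edge cases (relative script URLs, error output from LinkFinder), but the three numbered steps map directly onto the three goals listed earlier.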

Now comes the part where we look for secrets or hardcoded strings in those JS files. We can go into the db folder and grep for anything we like.

At this stage, you can use your own creativity and look for certain keywords like api_key, api_secret, and token, or for var declarations to identify potential GET or POST parameters.
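For instance, a couple of greps along those lines; the keyword list is just a starting point, not exhaustive:

```shell
# Look for likely secret-bearing keywords across everything saved in db/
grep -rniE 'api_key|api_secret|apikey|secret|token|password' db/ 2>/dev/null
# Look at var declarations to spot names that may double as GET/POST parameters
grep -rniE 'var [a-zA-Z_]+ *=' db/ 2>/dev/null
```

False positives are common (the word "token" shows up in plenty of harmless code), so treat hits as leads to verify manually rather than confirmed findings.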

Here is the link to my GitHub repository. You can easily access the tool and download it from there. I also added an install script that you can use to install a few things before you run the tool. After that, just modify alive.txt with your list of testing domains and run the script.

I hope it helps you in some way. Feel free to connect with me on Twitter @dark_warlord14 if you get stuck somewhere. And stay tuned for my next blog post, where I will introduce an S3 bucket penetration testing tool that covers all the test cases around AWS S3 buckets.

Till then Happy Hacking!!

12 Replies to “Scanning JS Files for Endpoints and Secrets”

  1. Hello,
    Before installing your tool, is it mandatory to install the 3 tools mentioned in your article?

    1. The bash script depends on those tools, so I have also added an install script in the GitHub repository. Use that install script to set things up, then just run the script to get the results.

      1. There is an install script included along with the tool. The main requirement is the modified version of LinkFinder. If you just run the install script, everything will get set up and you are good to go.

  2. Clone the tool first, then run the install script with bash. Modify alive.txt with the URLs you want to test, then run the main script with bash.

Comments are closed.