

Getting Started with Burpsuite & Running a basic Web-Spider


Burpsuite is a collection of tools bundled into a single suite made for web application security and penetration testing. It is a Java executable and hence cross-platform. Kali Linux comes with the Burpsuite free edition installed; a professional version is also available. The main feature of Burpsuite is that it can function as an intercepting proxy: it intercepts the traffic between a web browser and the web server.


Other features include:
  • Application-Aware Spider : used for spidering/crawling a given scope of pages.
  • Scanner : automatically scans for vulnerabilities, like other automated scanners.
  • Intruder : used to perform attacks and brute-forcing on pages in a highly customizable manner.
  • Repeater : used for manipulating and resending individual requests.
  • Sequencer : used mainly for testing/fuzzing session tokens.
  • Extensibility : allows you to write your own plugins to perform complex and highly customized tasks within Burp.
  • Comparer & Decoder : used for miscellaneous tasks that come up during a web security test.

Spidering a Website

A web crawler is a bot that systematically browses the pages of a website for the purpose of indexing. More precisely, a web crawler maps the structure of a website by browsing all of its inner pages; it is also referred to as a spider or automatic indexer.
Burpsuite has its own spider, called the Burp Spider, a program which crawls all the pages of a target specified in the scope. Before starting the Burp Spider, Burpsuite has to be configured to intercept the HTTP traffic.
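The crawl loop a spider performs can be sketched in a few lines of Python. This is only a rough illustration of the idea, not how Burp Spider is actually implemented, and the toy pages and URLs below are invented for the example:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def spider(start_url, fetch, scope):
    """Breadth-first crawl: visit each in-scope page once,
    extracting links to build a site map."""
    seen, queue, sitemap = {start_url}, [start_url], {}
    while queue:
        url = queue.pop(0)
        parser = LinkExtractor()
        parser.feed(fetch(url))  # fetch() returns the page HTML
        sitemap[url] = [urljoin(url, link) for link in parser.links]
        for link in sitemap[url]:
            # Only follow links inside the configured scope.
            if link.startswith(scope) and link not in seen:
                seen.add(link)
                queue.append(link)
    return sitemap


# Toy site standing in for real HTTP responses (hypothetical pages):
pages = {
    "http://target/": '<a href="/login">Login</a><a href="/about">About</a>',
    "http://target/login": '<a href="/">Home</a>',
    "http://target/about": '<a href="http://other.site/">External</a>',
}
sitemap = spider("http://target/", pages.get, scope="http://target")
print(sorted(sitemap))
```

Note how the out-of-scope link to `http://other.site/` is recorded in the site map but never crawled; scoping is what keeps a spider from wandering across the whole internet.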

Interface & Options

Like any other GUI tool, Burpsuite has a standard menu bar, two rows of tabs, and several sets of panels, as seen below.


The above figure shows the options and details about the target. There are four main sections, described against the corresponding numbers as follows:
  1. Tool & Options selector tabs – select between the various tools and settings of Burpsuite.
  2. Sitemap view – displays the sitemap once the spider has started.
  3. Requests queue – displays the requests being made.
  4. Request/Response details – the HTTP requests made and the responses from the server.

Lab 1 : Spidering a website

Spidering is a major part of reconnaissance when performing web security tests. It helps the pentester identify the scope and architecture of the web application. As described earlier, Burpsuite has its own spider, the Burp Spider, which can crawl a website.

Scenario: Attacker – Kali Linux VM, IP =

Target – OWASP Broken Web Application VM, IP =


Step 1 : Setup Proxy.
First start Burpsuite and check the details under the Options sub-tab of the Proxy tab. Ensure the listener IP is the localhost IP (127.0.0.1) and the port is 8080.

Proxy Options & Information
Also ensure that Intercept is ON in the Intercept sub-tab.

Turning ON intercept
Then, in Iceweasel/Firefox, go to Options > Preferences > Network > Connection Settings.
Choose Manual Proxy Configuration.

Setting Proxy in IceWeasel
If you want, you can instead install a proxy add-on. Here is one such add-on.
Install the proxy selector from the add-ons page and go to its preferences.

Setting Up Addons

Go to Manage Proxies and add a new proxy, filling out the relevant information. It's simple.

Configuring Addon Proxy
Click the Proxy Selector button at the top right and select the proxy you just created.

Setting Up Addons
Step 2 : Getting Content into Burp
After you have set up the proxy, browse to the target normally by entering the URL in the address bar. You will notice that the page does not load; this is because Burpsuite is intercepting the connection.
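The browser is not special here: anything that can speak HTTP through a proxy can be routed through Burp the same way. As a hedged sketch, here is how a Python client could be pointed at Burp's listener, assuming the default 127.0.0.1:8080 address; the target URL in the comment is hypothetical:

```python
import urllib.request

# Point the client at Burp's intercepting listener (default 127.0.0.1:8080).
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy)

# Every request made through this opener now passes through Burp, where it
# sits in the Intercept tab until you forward or drop it. For example:
# opener.open("http://target-vm/mutillidae/")   # hypothetical target URL

# Confirm the proxy settings took effect:
print(proxy.proxies["http"])
```

This is also a handy way to feed scripted traffic into Burp's sitemap alongside what the browser generates.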

Page Loading
Meanwhile, in Burpsuite you can see the request details. Click Forward to forward the connection; you will then see the page load in the browser.

burp intercepting
Page Loaded
Coming back to Burpsuite, you can see that all sections are now populated.

Sitemap, Requests & Request/Response Details
Step 3 : Scope Selection & Starting Spider
Now narrow down the target as you want. Here target/mutillidae is selected. Right-click mutillidae in the sitemap and select the Spider from Here option.

Selecting the target
After the spider starts, you get a prompt as shown in the following figure: a login form. If you know the credentials, fill them in so that the spider can also crawl from the inside. You can skip this step by pressing the Ignore Form button.

Submitting a Login form
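The prompt appears because the spider has parsed a `<form>` element in the page and needs values for its inputs before it can submit. The detection itself is simple to sketch in Python; the login-form HTML below is invented for illustration and is not taken from Mutillidae:

```python
from html.parser import HTMLParser


class FormFieldFinder(HTMLParser):
    """Records the name of every <input> inside a page, so a crawler
    can prompt for (or auto-fill) values before submitting a form."""

    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        if tag == "input":
            attrs = dict(attrs)
            if attrs.get("name"):
                self.fields.append(attrs["name"])


# Hypothetical login page fragment:
page = """
<form action="/login.php" method="post">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="submit" value="Login">
</form>
"""
finder = FormFieldFinder()
finder.feed(page)
print(finder.fields)  # the fields the spider would ask you to fill in
```

The nameless submit button is skipped because it carries no user-supplied value; only named inputs need answers before the form can be submitted.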

Step 4 : Manipulating Details
Now, as the spider runs, you can see the tree inside the mutillidae branch being populated. The requests made are shown in the queue, and their details are shown in the Request tab.

More details get Populated
Move through the different tabs and see all the underlying information.

Interesting Cookie information
Response Details from the target
The page source
Finally, check whether the spider has finished by viewing the Spider tab.

Spider Status

These are the very basics and the starting point of a web security test. Spidering is an important part of the reconnaissance during the test, and by executing it carefully we can understand the architecture of the target site. In upcoming tutorials, we will extend this to the other tools in the Burpsuite collection.

NOTE: This is for educational purposes only; we are not responsible for any inconvenience caused to the reader.

