I recently blogged about one of my favourite tools, POSTMAN, and how to set up its proxy server. POSTMAN is a great tool, but there are a few limitations I find with it:
- It's a Chrome app, so you can't use it with other browsers or other tools (I often use proxies to debug HTTP calls from IDEs, to compare and contrast for debugging purposes).
- Whilst POSTMAN saves requests, it doesn't save responses, which can be a problem when you are trying to capture context-specific requests, such as deleting data that has been specifically set up before that request.
So while I use POSTMAN regularly, there are times I require a different proxy server that offers solutions to the problems above, and that proxy is BurpSuite. So in the spirit of balance, and because I think others may benefit from using BurpSuite, I thought I would blog about setting up the BurpSuite proxy and demonstrate some of the features that I use regularly.
Unless a client has a pro licence, I tend to use the free version of BurpSuite since it contains all the features I need. You can download the free version from the PortSwigger site; it is a standalone JAR file that you can double-click to load up.
Setting up BurpSuite Proxy
The BurpSuite proxy comes pre-configured out of the box, and I find those settings work for me, so simply loading up BurpSuite means its proxy server is on. However, by default BurpSuite enables its interceptor, which can be used for things like man-in-the-middle attacks or data-driven testing, but I tend to use proxies in a passive manner so I prefer to turn it off.
To turn it off, click on the ‘Proxy’ tab and click ‘Intercept is on’ to toggle it off.
It’s worth adding that if you do need to change the port or IP for your proxy, you can update it in the ‘Options’ tab in the ‘Proxy Listeners’ section. Be warned, though, that the free version doesn’t save your settings, so you will need to do this each time.
Pointing a browser to the BurpSuite proxy
Since the last blog post was focused on Chrome, let’s look at configuring Firefox to use BurpSuite. All you need to do is update the browser’s settings to include the details of the BurpSuite proxy, which you can do in a couple of ways:
Through the Firefox settings
Navigate to Preferences > Advanced > Network and click on the ‘Settings’ button under the ‘Connection’ banner. You should see options for setting a proxy. We are going to set ours manually, so click on ‘Manual proxy configuration’, enter your BurpSuite IP and port (which defaults to 127.0.0.1:8080), opt to ‘Use this proxy server for all protocols’ and save those settings.
Once saved, open a tab, navigate to a page and observe in the ‘HTTP history’ tab the HTTP requests appearing as you request a page. This is the simplest way of setting up a browser, but it can be a pain to turn your proxy connection on and off each time (you will get a connection error if you close BurpSuite down and don’t turn off the proxy settings). There is another option, though.
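You can also sanity-check the listener outside a browser by pointing any HTTP client at the same address. Here is a minimal sketch using Python’s standard library; it assumes BurpSuite is running on the default 127.0.0.1:8080, and the helper function name is my own invention:

```python
import urllib.request

# Build an opener that routes all traffic through the BurpSuite listener
# (default 127.0.0.1:8080 - adjust if you changed it in Proxy Listeners).
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy)

def fetch_via_burp(url):
    # Each call made this way appears in Burp's 'HTTP history' tab.
    # This will raise URLError if Burp isn't actually running.
    return opener.open(url, timeout=10).read()
```

Anything fetched through `fetch_via_burp` shows up in ‘HTTP history’ just as browser traffic does, which is handy for comparing a scripted request against a browser-generated one.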
Using an extension like FoxyProxy
FoxyProxy is a great tool because it allows you to easily switch proxy connections on and off with the click of a button from your browser toolbar. It can manage multiple proxies and set up filters to determine which sites should be forwarded to the proxy and which shouldn’t. I’m not going to go into the setup in this blog, but you can find details on how to use it here.
So with the BurpSuite proxy set up and receiving requests, let’s take a look at some of the features.
Reviewing requests and responses
One of the key features I use a lot is the request and response tabs that can be found in ‘HTTP history’. They are basic tools, but having all the details laid out in plain text to search can be very helpful for test design, debugging automation or learning about an application. Additionally, both views have find functionality that takes strings and regexes, which is great for finding content in large HTML bodies or testing out regexes for automation.
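As a sketch of that regex workflow, here is how you might prototype a find expression against a response body copied out of Burp before relying on it in automation. The HTML and patterns are invented for illustration:

```python
import re

# A sample response body, as you might copy it out of Burp's response view.
html = """
<html><body>
  <div id="user-1024" class="account">alice@example.com</div>
  <div id="user-2048" class="account">bob@example.com</div>
</body></html>
"""

# Prototype the regexes here before wiring them into automation.
user_ids = re.findall(r'id="user-(\d+)"', html)
emails = re.findall(r"[\w.]+@[\w.]+", html)

print(user_ids)  # ['1024', '2048']
print(emails)    # ['alice@example.com', 'bob@example.com']
```

Once a pattern behaves as expected against the captured body, you can paste the same regex into Burp’s find box or into your test code with some confidence.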
The other main feature I use is the repeater, which allows me to store an HTTP request, manipulate it and send it again as many times as I like (context dependent, of course). I find this feature useful for a few reasons:
- Great for exploratory testing. If you are testing a web form and are focused on how test data is consumed, I find capturing the initial request from the site and then using the repeater to try different data payloads is a rapid and focused way of running an exploratory test session.
- Debugging bad requests in automation. There are times when I automate HTTP requests in a tool but have built them incorrectly: missing headers, bad cookies, etc. So I find capturing the original request from the application and my own request into the repeater, then slowly copying content from one request to the other, re-sending my request each time, can help determine what I have done wrong or what is missing.