Category:OWASP Webslayer Project


Overview
WebSlayer is a tool designed for brute forcing web applications. It can be used to find unlinked resources (directories, servlets, scripts, etc.), brute force GET and POST parameters, brute force form parameters (user/password), perform fuzzing, and more. The tool includes a payload generator and an easy, powerful results analyzer.

It's possible to perform attacks like:


 * Predictable resource location (file and directory discovery)
 * Login form brute force
 * Session brute force
 * Parameter brute force
 * Parameter fuzzing and injection (XSS, SQL, etc.)
 * Basic and NTLM authentication brute forcing
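
A login form brute force boils down to trying every user/password pair from a dictionary until one works. The sketch below is not WebSlayer's actual code; it is a minimal illustration where the hypothetical `attempt` callback stands in for whatever sends the real login request (e.g. an HTTP POST) and reports success.

```python
from itertools import product

def brute_force_login(users, passwords, attempt):
    """Try every user/password combination until `attempt` succeeds.

    `attempt(user, password)` is a stand-in for the code that submits
    the login form and returns True when the credentials are valid.
    """
    for user, password in product(users, passwords):
        if attempt(user, password):
            return user, password  # first valid credential pair found
    return None  # dictionary exhausted without a hit

# Usage with a stand-in check instead of a real HTTP request:
found = brute_force_login(
    ["admin", "guest"],
    ["letmein", "s3cret"],
    lambda u, p: (u, p) == ("admin", "s3cret"),
)
```

In a real attack the callback would issue the POST and decide success from the response (status code, redirect, page content), which is exactly where the result filters described below come in.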

Features
Key features include:


 * Encodings: 15 encodings supported
 * All-parameters attack: the tool injects the payload into every parameter (headers, GET, POST)
 * Authentication: WebSlayer supports NTLM and Basic authentication, and the credentials themselves can be brute forced
 * Multiple payloads: you can use two payloads in different parts of the request
 * Proxy support (with authentication)
 * Live filters: filters can be changed while the attack is running
 * Multiple threads: you can set how many threads the attack uses
 * Session import/export: save a session and continue working with its results later
 * Integrated web browser: a full-fledged WebKit browser is included for analyzing results
 * Predefined dictionaries for predictable resource location, based on known servers (thanks to Dark Raver, www.open-labs.org)
 * Payload generator (custom payload generation)
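
The "all parameters" feature means one payload gets substituted into each parameter of the request in turn, one request per parameter. A minimal sketch of that idea (the function name and the dict-based parameter representation are assumptions for illustration, not WebSlayer's API):

```python
def all_parameter_variants(params, payload):
    """Yield one copy of `params` per parameter, with the payload
    substituted into that single parameter (the others untouched).
    """
    for name in params:
        variant = dict(params)
        variant[name] = payload
        yield name, variant

# Example: fuzzing each GET parameter of a request in turn.
base = {"id": "42", "page": "home"}
variants = list(all_parameter_variants(base, "<script>alert(1)</script>"))
```

For `base` above this produces two requests: one with the payload in `id` and one with it in `page`, which is how a single dictionary run can probe every injection point.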

For Resource Location prediction, it supports:

 * Recursion: when discovering directories, you can set how deep to recurse
 * Non-standard error code detection: WebSlayer detects non-standard error responses to avoid presenting junk results
 * Extensions: you can add a list of extensions to try together with a dictionary
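
Combining a dictionary with an extension list means the scan requests every word/extension pair under the base URL. A minimal sketch of that candidate generation, assuming a hypothetical base URL and wordlist:

```python
from itertools import product

def candidate_paths(base, words, extensions):
    """Build the candidate URLs a resource-location scan would request:
    every dictionary word combined with every extension.
    """
    for word, ext in product(words, extensions):
        yield f"{base.rstrip('/')}/{word}{ext}"

# Two words x three extensions (including "no extension") = six candidates.
paths = list(candidate_paths("http://target.example/",
                             ["admin", "backup"],
                             ["", ".php", ".bak"]))
```

Recursion then repeats the same generation under each directory that was found, down to the configured depth.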

Results analysis
The power of WebSlayer lies in how you can work with the results: every attack keeps all of its responses, and for each request you get:


 * HTML results
 * Source code
 * Headers
 * Web browser view (it will replay the request via the browser)

Multiple filters are available to improve performance and produce better results for the analyst:


 * Return Code
 * Character count
 * Word count
 * Line count
 * MD5
 * Regular expression
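
Each of these filters narrows the flood of responses down to the interesting ones. A minimal sketch of how such filters could be combined, where the `resp` dict shape (`"code"`, `"body"`) and the function name are assumptions for illustration, not WebSlayer's internals:

```python
import hashlib
import re

def keep_response(resp, codes=None, chars=None, words=None,
                  lines=None, md5=None, regex=None):
    """Return True if a response passes every filter that is set.

    `resp` is a dict with "code" (int) and "body" (str) keys.
    The MD5 filter drops responses whose body matches a known hash
    (e.g. a common error page seen during the attack).
    """
    body = resp["body"]
    if codes is not None and resp["code"] not in codes:
        return False
    if chars is not None and len(body) != chars:
        return False
    if words is not None and len(body.split()) != words:
        return False
    if lines is not None and body.count("\n") + 1 != lines:
        return False
    if md5 is not None and hashlib.md5(body.encode()).hexdigest() == md5:
        return False
    if regex is not None and not re.search(regex, body):
        return False
    return True

# Usage: keep only 200 responses whose body mentions "admin".
responses = [
    {"code": 200, "body": "index of /admin"},
    {"code": 404, "body": "not found"},
]
hits = [r for r in responses if keep_response(r, codes={200}, regex=r"admin")]
```

Because each filter is just a predicate over the stored responses, they can be re-applied with different values at any time, which is what makes "live filters" during a running attack possible.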

WebSlayer keeps every attack in the session, so you can return to them later to compare and re-examine results.