Summer Code Sprint 2015

Goal
The OWASP Summer Code Sprint 2015 is a program that aims to provide incentives for students to contribute to OWASP projects. By participating in the OWASP Summer Code Sprint, a student can gain real-life experience while contributing to an open source project. A student who successfully completes the program will receive a total of $1500.

Program details
Projects that are eligible: All code/tools projects. Documentation projects are excluded.

Duration: 2 months of full-time engagement.

How it works
Any code/tool project can participate in the OWASP Summer Code Sprint. Each project will be guided by an OWASP mentor. Students are evaluated in the middle and at the end of the coding period, based on success criteria identified at the beginning of the project. Successful students will receive $750 after each evaluation, a total of $1500 per student.

Projects are focused on developing security tools. It is required that the code any student produces for those projects will be released as Open Source.

Note on language: English is required for code comments and documentation, but not for interactions between students and advisers. Advisers who speak the same language as their students are encouraged to interact in that language.

As a student:
1. Review the list of OWASP Projects currently participating in the OWASP Summer Code Sprint 2015.

2. Get in touch with the OWASP Project mentor of your choice.

3. Agree on deliverables with your OWASP mentor.

4. Work away during Summer 2015.

5. Rise to Open Source Development Glory :-)

Students, apply now: Google form application link

As an OWASP Project Leader:
1. Edit this page, adding your project and some proposed tasks, as per the examples below

2. Promote the initiative to your academic contacts

Timeplan
Phase 1: Proposals

Project leaders who want to include their project in the program should submit some initial proposal ideas on this page. These ideas serve as guidance to the students; they are things that project leaders would like to get done, such as new features, improvements, etc.

Subsequently students are invited to submit detailed proposals that can (but do not necessarily have to) be based on these ideas. Students are strongly encouraged to engage with project leaders and each project's community (e.g. through the project's mailing list) in order to discuss the details of their proposal. Proposals should provide details about the implementation, time plan, milestones, etc.

Phase 2: Scoring of proposals

After the submission of proposals, project leaders and contributors/mentors are required to review the submitted proposals and score them (on a 1 to 5 scale). Each proposal should receive at least 3 assessments/scores from different mentors. Each mentor, contributor or leader can score only proposals for their OWN project. All assessments should provide justification. Reviewers are strongly encouraged to provide constructive comments for students so that they can improve in the future.

Project leaders are responsible for attracting a sufficient number of volunteer mentors to score proposals and subsequently supervise those that are selected.

Phase 3: Slot allocation

When proposal scoring has been completed, each project leader requests a specific number of slots. This number should be based on:
 * The number of truly outstanding proposals according to submitted scores.
 * The importance of the proposal to the project's roadmap.
 * The number of available mentors for the project. At least 2 mentors are needed for each proposal that gets accepted.

If the total number of requested slots is less than or equal to the available number of slots, then all projects get the requested slots. If not, the following rules apply:
 * All projects that have requested a slot get at least 1 slot, provided they have a high quality proposal and a sufficient number of mentors. Two mentors are required per slot allocated to the project.
 * The program's administrators get in touch with project leaders, especially those that have requested a large number of slots, to receive additional feedback on the requested slots and explore any available possibilities for reducing the requested number of slots.
 * A project leader might choose to donate one or more requested slots back to the pool so that other projects can get more slots.
 * The program administrators can choose to initiate a public discussion between projects in need of more slots and projects that have requested a lot of slots, in order to determine the best possible outcome for everyone.
 * If all else fails, slots are equally allocated: all projects get 1 slot; projects that have requested 2 or more slots get an extra slot if available; projects that have requested 3 or more slots get another extra slot if available, etc. When there are no more slots available for all projects that have requested them, a draw is used to allocate the remaining slots.

In any case, the program's administrators should perform a final review of the selected proposals to ensure that they are of high quality. If concerns arise they should request additional information from project leaders.

Phase 4: Coding

This is the main phase of the program. Students implement their proposal according to the submitted timeplan and under the supervision of their mentors.

Evaluations
In the middle of the coding period, mentors should submit an evaluation of their students to ensure that they are on track and provide some feedback both to OWASP and the students.

If little or no progress has been made up to this point, the mentors may decide to fail the student, in which case the student does not receive money. If successful, OWASP will pay half the amount ($750). The final evaluations are submitted at the end of the coding period, and the second installment ($750) is paid to the student if all agreed deliverables are met. If the student has failed to demonstrate progress during the second period, the second installment is not paid and the student keeps only the first $750.

Deadlines
Program announcement: June 1st, 2015

Student Applications: June 21st, 2015

Proposal Evaluations: from June 22nd until June 28th.

Successful proposals announcement: July 1st

Coding Period Starts: July 10th

Mid-term evaluations: Submitted from August 10th until August 15th.

Coding period ends: September 10th.

Final evaluations: Until September 18th.

Mailing List
Please subscribe to the following mailing list to receive updates or ask any particular questions:

OWASP Hackademic
Docker Sandbox for Challenges

Brief explanation:

Background problem to solve:

We are trying to enable users to freely upload vulnerable applications to the platform. After some research, we concluded that writing a Python application that uses Docker to deploy challenges would be the best way to go about it. We also need to provide a frontend to manage the deployed containers and integrate our existing analytics-gathering system into the dockerized challenges. The installer of the application should take care of initializing both the CMS and the containers without introducing too much complexity.

Proposed solution:

There is an easy-to-use Python library for Docker, and there should be a frontend for managing containers that we can use off the shelf with minimal modifications. The feature already has a proof of concept using Linux containers; you can find it in the open merge requests on the project's GitHub page.
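As a rough sketch of the proposed approach, deployment could look something like the following, using the Docker SDK for Python (the `docker` package). The container names, labels and port mapping here are illustrative assumptions, not part of the existing Hackademic codebase:

```python
# Hypothetical sketch: deploying one challenge container via the Docker SDK
# for Python ("docker" package). Names, labels and ports are illustrative only.

def challenge_run_config(name, image, guest_port=80, host_port=None):
    """Build keyword arguments for docker's client.containers.run()."""
    return {
        "image": image,
        "name": "hackademic-%s" % name,
        "detach": True,                               # keep running in the background
        "ports": {"%d/tcp" % guest_port: host_port},  # None => random free host port
        "labels": {"managed-by": "hackademic"},       # lets a frontend find our containers
    }

def deploy_challenge(name, image):
    import docker  # third-party Docker SDK, imported lazily
    client = docker.from_env()  # talks to the local Docker daemon
    return client.containers.run(**challenge_run_config(name, image))
```

A management frontend could then list and stop containers by filtering on the `managed-by` label.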

 Expected results 
 * Integrate dockerized challenges into the platform
 * PEP 8-compliant code in all provided Python code
 * PSR-compliant code in all provided PHP code
 * Sphinx/phpDocumentor-friendly comments
 * Excellent reliability
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

 Prerequisites 

Some knowledge of test-driven development, PHP, Python, and Docker.

 OWASP Mentors 

Spyros Gasteratos (spyros.gasteratos@owasp.org). Anyone else who already knows Puppet (we will probably end up writing Puppet manifests for installation), Docker, Python, or PHP is welcome.

Javascript Based Development Challenges
Brief explanation:

Background problem to solve:

We are looking for challenges aimed at secure coding. The user is served a piece of vulnerable JavaScript which fails the provided unit tests. The user then has to fix the vulnerability in a way that makes the unit tests pass.

Example solution for one of the challenges: provide the user with the following code:

function sayHi(userInput) {
    var hiField = document.getElementById("hiField");
    hiField.innerHTML = userInput; // vulnerable: unescaped user input
}

Use any JavaScript unit-testing framework to design a set of unit tests which call the function with all sorts of payloads and check whether the user has escaped userInput correctly. We are not looking for completeness; a solid proof-of-concept implementation for future reference is acceptable.

Expected Results

A complete implementation of challenges covering the OWASP Top 10, according to our coding standards.

Migrate old code to the new coding standards
In the last project summit we decided to introduce code style and standards compliance checking for new commits and slowly migrate the old ones to the new setting. We also decided to prefer contributions with unit tests.

We're looking for someone to assist in this migration. So far, none of the classes follow the new standards/coding style, though the frontend tests already cover a significant part of the platform. You could help us reach at least 70% line coverage in tests and similar coverage for standards/code style.

New Theme
Problem to solve: our current theme is from 2012 and looks even older; moreover, it could do with some usability improvements. Your job will be to design and implement a shiny new theme for the platform.

OWASP OWTF - Vulnerability Management System (VMS)
Brief explanation:

Background problem to solve:

We are trying to reduce the human workload in large assessments, where there may be hundreds of issues reporting that Apache or PHP is out of date.

Proposed solution:

We can aggregate these duplicate issues into one meta-issue, such as "outdated software (Apache/PHP) detected", with the underlying list of issues inside it.

A separate set of scripts that allows for grouping and management of vulnerabilities (i.e. think huge assessments), to be usable *both* from inside + outside of OWTF in a separate sub-repo here: https://github.com/owtf

VMS will have the following features:
 * Vulnerability correlation engine, which will allow quick identification of unique vulnerabilities and deduplication.
 * Vulnerability table optimization: combining redundant vulnerabilities. For example, PHP < 5.1, PHP < 5.2 and PHP < 5.3 all suggest upgrading PHP, so if multiple such issues are reported they should be combined.
 * Integration with existing bug-tracking systems such as Bugzilla or JIRA: this should not be too hard, as all such systems expose a REST API or similar.
 * Fix validation: since we integrate with bug tracking, once the developer has fixed the bug and the code is deployed, we can run specific checks via OWTF or another tool (perhaps a specific Nessus or Nexpose plugin or similar).
 * Management dashboard: could be exposed to pentesters and higher management, showing statistics with fewer details and more of a high-level overview.
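The aggregation idea above can be sketched like this; the normalisation rules below are illustrative examples, not the real VMS rule set:

```python
# Sketch of the deduplication/aggregation idea: collapse findings that call
# for the same remediation into one meta-issue. Rules here are examples only.
import re
from collections import defaultdict

NORMALISATION_RULES = [
    (re.compile(r"PHP\s*<\s*[\d.]+", re.I), "Outdated software detected: PHP"),
    (re.compile(r"Apache\s*<\s*[\d.]+", re.I), "Outdated software detected: Apache"),
]

def normalise(title):
    """Map a raw finding title to its meta-issue title (or itself)."""
    for pattern, meta_title in NORMALISATION_RULES:
        if pattern.search(title):
            return meta_title
    return title

def aggregate(findings):
    """Group raw finding titles under their normalised meta-issue."""
    groups = defaultdict(list)
    for title in findings:
        groups[normalise(title)].append(title)
    return dict(groups)
```

A "PHP < 5.1" and a "PHP < 5.3" finding then land in the same "upgrade PHP" bucket, while unrelated findings stay as-is.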

Similar previous work for Nessus

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * CRITICAL: Excellent reliability -i.e. the tool cannot crash! :)-
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - HTTP Request Translator Improvements
Brief explanation:

Problem to solve:

There are many situations in web app pentests where no tool will quite do the job and you need to script something, or mess around with the command line (classic example: a sequence of steps where each step requires input from the previous step). In these situations, translating an HTTP request, or a sequence of HTTP requests, into a working command or script takes valuable time which the pentester might just not have.

Proposed solution:

An HTTP request translator, a *standalone* *tool* that can:

1) Be used from inside OR outside of OWTF.

2) Translate raw HTTP requests into curl commands or bash/python/php/ruby/PowerShell scripts

3) Provide essential quick and dirty transforms: base64 (encode/decode), urlencode (encode/decode)
 * Transforms with boundary strings? (TBD)
 * Individually or in bulk? (TBD)

Essential Function: "--output" argument

CRITICAL: The command/script should be generated so that the request is sent as literally as possible.

Example: NO client specific headers are sent. IF the original request had "User-Agent: X", the generated command/script should have EXACTLY that (i.e. NOT a curl user agent, etc.). Obviously, the same applies to ALL other headers.

NOTE: Ideally the following should be implemented using an extensible plugin architecture (i.e. NEW plugins are EASY to add)
 * http request in => curl command out
 * http request in => bash script out
 * http request in => python script out
 * http request in => php script out
 * http request in => ruby script out
 * http request in => PowerShell script out
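As a rough sketch of one such plugin ("http request in => curl command out"), assuming a simplified raw request with no body and single-line headers; all function names here are ours, not an existing API:

```python
# Minimal sketch of an "http request in => curl command out" plugin.
# Parsing is deliberately simplified: no request body, no folded headers.
import shlex

def parse_raw_request(raw):
    """Split a raw HTTP request into (method, path, headers)."""
    lines = raw.strip().splitlines()
    method, path, _version = lines[0].split()
    headers = {}
    for line in lines[1:]:
        if not line.strip():
            break  # blank line ends the header section
        name, _, value = line.partition(":")
        headers[name.strip()] = value.strip()
    return method, path, headers

def to_curl(raw):
    method, path, headers = parse_raw_request(raw)
    url = "http://%s%s" % (headers.get("Host", ""), path)
    parts = ["curl", "-X", method, shlex.quote(url)]
    for name, value in headers.items():
        # Send EVERY header exactly as given, so e.g. "User-Agent: X" is
        # preserved literally instead of curl's own user agent.
        parts += ["-H", shlex.quote("%s: %s" % (name, value))]
    return " ".join(parts)
```

Each output language (bash, python, php, ruby, PowerShell) would then be one more function with the same input, which is what makes the plugin architecture easy to extend.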

Basic additional arguments:

- "--proxy" argument: generates the command/script with the relevant proxy option

NOTE: With this the command/script may send requests through a MiTM proxy (i.e. OWTF, ZAP, Burp, etc.)

- "--string-search" argument: generates the command/script so that it:

1) performs the request

2) then searches for something in the response (i.e. literal match)

- "--regex-search" argument: generates the command/script so that it: 1) performs the request

2) then searches for something in the response (i.e. regex match)

OWTF integration

The idea here, is to invoke this tool from:

1) Single HTTP transactions:

For example, have a button to "export http request" + then show options equivalent to the flags

2) Multiple HTTP transactions:

Same as with Single transactions, but letting the user "select a number of transactions" first (maybe a checkbox?).

Desired input formats:


 * Read raw HTTP request from stdin -Suggested default behaviour! :)-

Example: cat path/to/http_request.txt | http-request-translator.py --output


 * Interactive mode: read raw HTTP request from keyboard + "hit enter when ready"

Suggestion: This could be a "-i" (for "interactive") flag and/or the fallback option when "stdin is empty"

Example:

1) User runs tool with desired flags (i.e. "--output ruby --proxy 127.0.0.1:1234 ...", etc.)

2) Tool prints: "Please paste a raw HTTP request and hit enter when ready"

3) User pastes a raw HTTP request + hits enter

4) Tool outputs whatever is relevant for the flags + http request given


 * For bulk processing: Maybe a directory of raw http request files?

Nice to have: Transforms

In the context of translating raw HTTP requests into commands/scripts, what we want here is to provide some handy "macros" so that the relevant command/script is generated accordingly.

Example:

NOTE: Assume something like the following arguments: "--transform-boundary=@@@@@@@ --transform-language=php"

Step 1) The user provides a raw HTTP request like this:

GET /path/to/urlencode@@@@@@@abc d@@@@@@@/test
Host: target.com
...

Step 2) The tool generates a bash script like the following:

#!/bin/bash
PARAM1=$(echo 'abc d' | php -r "echo urlencode(fgets(STDIN));")
curl ...... "http://target.com/path/to/$PARAM1/test"

OR a "curl command" like the following:

PARAM1=$(echo 'abc d' | php -r "echo urlencode(fgets(STDIN));"); curl ...... "http://target.com/path/to/$PARAM1/test"

This feature can be valuable for shaving a bit more time off script writing.
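To illustrate the directive/boundary syntax, here is a sketch that applies the transform directly in Python rather than emitting a script (the real tool would generate the equivalent command); the transform table is an assumption:

```python
# Sketch of the "transform" macro: spans of the form
#   directive<boundary>payload<boundary>
# are replaced by the transformed payload. Applied in-place for illustration;
# the tool itself would emit a command/script that does the same at runtime.
import re
from base64 import b64encode
from urllib.parse import quote

TRANSFORMS = {
    "urlencode": lambda s: quote(s, safe=""),
    "base64": lambda s: b64encode(s.encode()).decode(),
}

def apply_transforms(text, boundary):
    b = re.escape(boundary)
    pattern = re.compile(r"(\w+)%s(.*?)%s" % (b, b))
    def repl(match):
        directive, payload = match.group(1), match.group(2)
        func = TRANSFORMS.get(directive)
        return func(payload) if func else match.group(0)  # unknown: leave as-is
    return pattern.sub(repl, text)
```

With boundary "@@@", the path "/path/to/urlencode@@@abc d@@@/test" becomes "/path/to/abc%20d/test".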

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * CRITICAL: Excellent reliability -i.e. the tool cannot crash! :)-
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - JavaScript Library Sniper Improvements
Brief explanation: This is a project that tries to resolve a very common problem during penetration tests:

The customer is running a number of outdated JavaScript Libraries, but there is just not enough time to determine if something useful -i.e. something *really* bad! :)- can be done with that or not.

To solve this problem, we propose a *standalone* *tool* that can:

1) Be run BOTH from inside AND outside of OWTF

2) Build and *update* a fingerprint JavaScript library database of:
 * Library File hashes => JavaScript Library version
 * Library File lengths => JavaScript Library version
 * (Nice to have:) As above, but for each individual github commit (possible drawback: too big?)

3) Build and *update* a vulnerability database of:
 * JavaScript Library version => CVE - CVSS score - Vulnerability info

4) Given a [ JavaScript file OR hash OR length ], found in the database, provides:
 * JavaScript Library version
 * List of vulnerabilities sorted in descending CVSS score order

5) (Very cool to have) Given a list of JavaScript files (maybe a directory), provides:
 * ALL library/vulnerability matches described in 4)

Once the standalone tool is built and verified to be working, OWTF should be able to:

Feature 1) GREP plugin improvement (Web Application Fingerprint):

Step 1) Lookup file lengths and hashes in the "JavaScript library database"

Step 2) If a match is found: provide the list of known vulnerabilities against "JavaScript library X" to the user

Feature 2) SEMI-PASSIVE plugin improvement (Web Application Fingerprint):

1) Requests all referenced BUT missing JavaScript files -i.e. scanners won't load JavaScript files! :)-

2) re-runs the GREP plugin on the new files (i.e. to avoid missing vulns due to unrequested JavaScript files)

Potential projects worth having a look for potential overlap/inspiration:
 * OWASP Dependency Check?

How many JavaScript libraries should be included?
 * As many as possible, but especially the major ones: jQuery, knockout, etc.
 * "Nirvana" Nice to have: ALL Individual versions of ALL JavaScript files from ALL opensource projects, (ideally) even if the project is not a JavaScript library -i.e. JavaScript files from Joomla, Wordpress, etc.-

Common JavaScript library fingerprinting techniques include:
 * Parse the JavaScript file and grab the version from there
 * Determine the JavaScript version based on a hash of the file
 * Determine the JavaScript version based on the length of the file
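The hash-based technique can be sketched as follows; in a real database the entries would cover every released file of every library, whereas the entries below are fabricated placeholders (including the CVE ids):

```python
# Toy sketch of hash-based fingerprinting and vulnerability lookup.
# All database entries below are fabricated placeholders.
import hashlib

FINGERPRINTS = {
    # sha256 hex digest of the file -> (library, version)
    hashlib.sha256(b"example jquery build").hexdigest(): ("jquery", "1.8.3"),
}

VULNERABILITIES = {
    # (library, version) -> list of (CVE id, CVSS score)
    ("jquery", "1.8.3"): [("CVE-XXXX-0001", 6.1), ("CVE-XXXX-0002", 4.3)],
}

def identify(js_bytes):
    """Return (library, version) for a JavaScript file's bytes, or None."""
    return FINGERPRINTS.get(hashlib.sha256(js_bytes).hexdigest())

def vulns_for(js_bytes):
    """List known vulnerabilities, sorted in descending CVSS score order."""
    match = identify(js_bytes)
    if match is None:
        return []
    return sorted(VULNERABILITIES.get(match, []), key=lambda v: -v[1])
```

The length-based technique would be the same lookup keyed on `len(js_bytes)` instead of the digest, useful when only a Content-Length is known.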

Other challenges:
 * "The file" could be "the minimised file", "the expanded file" or even "a specific JavaScript file from library X"
 * When the JavaScript file does not match a specific version:

1) The commit that matches the closest should (ideally) be found

2) The NEXT library version after that commit (if present) should be found

3) From there, it is about reusing the knowledge to figure out public vulnerabilities, CVSS scores, etc. again

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * CRITICAL: Excellent reliability -i.e. the tool cannot crash! :)-
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - Off-line HTTP traffic uploader
Brief explanation:

Although it is awesome that OWTF runs a lot of tools on behalf of the user, there are situations where uploading the HTTP traffic of another tool off-line can be very interesting for OWTF, for example:


 * Tools that OWTF has trouble proxying right now: skipfish, hoppy
 * Tools that the user may have run manually OR even from a tool aggregator -very common! :)-
 * Tools that we just don't run from OWTF: ZAP, Burp, Fiddler

This project is about implementing an off-line utility able to parse HTTP traffic:

1) Figure out how to read output files from various tools like: skipfish, hoppy, w3af, arachni, etc. Nice to have: ZAP database, Burp database

2) Translate that into the following clearly defined fields:


 * HTTP request
 * HTTP response status code
 * HTTP response headers
 * HTTP response body

3) IMPORTANT: Implement a plugin-based uploader system

4) IMPORTANT: Implement ONE plugin, that uploads that into the OWTF database

5) IMPORTANT: OWTF should ideally be able to invoke the uploader right after running a tool. Example: OWTF runs skipfish, skipfish finishes, OWTF runs the HTTP traffic uploader, all skipfish data is pushed to the OWTF DB.

6) CRITICAL: The off-line HTTP traffic uploader should be smart enough to read + push 1-by-1 instead of *stupidly* trying to load everything into memory first, you have been warned! :)

Why? Because in a huge assessment the output of "tool X" can be 10 GB, which is *stupid* to load into memory. This is OWTF; we *really* try to foresee the crash before it happens! ;)

CRITICAL: It is important to implement a plugin-based uploader system, so that other projects can benefit from this work (i.e. to be able to import third-party tool data to ZAP, Burp, and other tools in a similar fashion), and hence hopefully join us in maintaining this project moving forward.
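The "read + push 1-by-1" requirement can be sketched with a generator; the record format here (transactions separated by a `---` line) is an assumption for illustration only, since each tool parser would define its own:

```python
# Sketch of streaming upload: yield one transaction at a time instead of
# loading the whole output file into memory. The "---" separator format is
# an illustrative assumption; real parsers vary per tool.
def iter_transactions(path):
    record = []
    with open(path) as handle:
        for line in handle:          # streams line by line: O(1) memory per record
            if line.strip() == "---":
                if record:
                    yield "".join(record)
                record = []
            else:
                record.append(line)
        if record:
            yield "".join(record)    # last record has no trailing separator

def upload_all(path, upload_one):
    """Push each transaction via an uploader plugin callback, one at a time."""
    count = 0
    for transaction in iter_transactions(path):
        upload_one(transaction)  # e.g. the OWTF DB uploader plugin
        count += 1
    return count
```

Because `iter_transactions` is a generator, a 10 GB output file never occupies more than one record's worth of memory at a time.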

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * CRITICAL: Excellent reliability -i.e. the tool cannot crash! :)-
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - Health Monitor
Brief explanation:

In some cases, especially on large assessments (think: > 30 URLs), a number of things often go wrong, and OWTF needs to recover from all of them, which is difficult.

For this reason, OWTF needs an independent module, completely detached from OWTF (a different process), to ensure the health of the assessment is in check at all times. This includes the following:

Feature 1) Alerting mechanisms

When any of the monitor alerts (see below) is triggered, the OWTF user will be notified immediately through ALL of the following means:
 * Playing an mp3 song (both local and possibly remote locations)
 * Scan status overview on the CLI
 * Scan status overview on the GUI

NOTE: A configuration file from where the user can enable/disable/configure all these mechanisms is desired.

Feature 2) Corrective mechanisms

Corrective mechanisms are also expected in this project; these will be accomplished by sending OWTF API messages such as:
 * Stop this tool
 * Freeze this process (to continue later)
 * Freeze the whole scan (to continue later)

Additional mechanisms:
 * Show a ranking of files that take the most space

Feature 3) Target monitor

Brief overview:

All target URLs are checked for availability periodically (e.g. once every 5 minutes?); if a URL in scope goes down, the pentester is alerted (see above).

Potential approach: Check if length of 1st page changes every 60 seconds.

NOTE: It might be needed to change this on the fly.
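The alerting rule could be kept separate from the HTTP check itself, which makes it easy to test and to change on the fly. A minimal sketch, where the check is injected as a callable and the failure threshold is an illustrative assumption:

```python
# Sketch of the target monitor's decision logic. The actual availability
# check (HTTP request, page-length comparison, ...) is passed in as a
# callable; the threshold value is illustrative.
class TargetMonitor:
    def __init__(self, check, failure_threshold=3):
        self.check = check                      # callable: True if target is up
        self.failures = 0
        self.failure_threshold = failure_threshold

    def poll(self):
        """Run one availability check; return True when an alert should fire."""
        if self.check():
            self.failures = 0                   # target recovered: reset counter
            return False
        self.failures += 1
        # Only alert after N consecutive failures, to avoid flapping alerts
        return self.failures >= self.failure_threshold
```

When `poll()` returns True, the monitor would trigger the alerting mechanisms above (mp3, CLI/GUI status) and the API call to freeze tests against that target.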

More background

Consider the following scenario:

Current Situation aka "problem to solve":

1) Website X goes down during a scan

2) the customer notices

3) the customer tells the boss

4) the boss tells the pentester

5) the pentester stops the tool which was *still* trying to scan THAT target (!!!!)

Desired situation aka "solution":

It would be much more professional AND efficient that:

1) The pentester notices

2) The pentester tells the boss

3) The boss tells the customer

4) OWTF stops the tool because it knows that website is DEAD anyway

A target monitor could easily do this with heartbeat requests + playing mp3s

The target monitor will use the API to tell OWTF: "this target is dead: freeze (stop?) current tests, skip target in future tests".

Feature 4) Disk space monitor

Another problem that is relatively common in large assessments is that all disk space is used up and the scanning box becomes unresponsive or crashes. When this happens it is too late. The pentester may see this coming but wonder "which are the biggest files in the filesystem that I can delete?"; it is not ideal to have to look for these files at a moment when the scanning box is about to crash :).

Proposed solution:

Regularly monitor how much disk space is left, especially on the partition where OWTF is writing the review (but also tool directories such as /home/username/.w3af/tmp, etc.). Keep track of files created by OWTF and all called tools and sort them by size in descending order. Then, when disk space is running low (i.e. a predefined threshold is crossed), an mp3 or similar is played and this list is displayed to the user, so that they know what to delete to survive :).
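The "biggest files" ranking part of this could be as simple as the following sketch (which directories to walk is up to the monitor's configuration):

```python
# Sketch of the "biggest files" ranking: walk a directory tree and return
# the largest files first, so the pentester knows what to delete.
import os

def biggest_files(root, top=10):
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                pass  # file vanished mid-walk; a scanning box is a busy place
    sizes.sort(reverse=True)          # largest first
    return sizes[:top]
```

The monitor would run this over the review partition and tool directories whenever the free-space threshold is crossed, and display the result alongside the alert.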

Feature 5) Network/Internet Connectivity monitor

Sometimes ISP or other connectivity may go down in the middle of a scan. This is often a very unfortunate situation, since most tools scan in parallel and won't be able to produce a report OR even resume (i.e. A LOT is lost). The goal here is that OWTF does all of the following automatically:

1) Detects the lack of connectivity

2) Freezes all the tools (read: processes) in progress

3) Resumes the scan when the connectivity is back.
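Steps 1)-3) amount to a small state machine. A sketch, with the actual connectivity check and process control (e.g. SIGSTOP/SIGCONT on tool processes) injected as callables; all names here are ours:

```python
# Sketch of the connectivity monitor as a state machine. How connectivity is
# detected and how tools are frozen/resumed (e.g. SIGSTOP/SIGCONT) is
# delegated to the injected callables.
class ConnectivityMonitor:
    def __init__(self, is_online, freeze_all, resume_all):
        self.is_online = is_online
        self.freeze_all = freeze_all
        self.resume_all = resume_all
        self.frozen = False

    def poll(self):
        """Check connectivity once; freeze or resume tools on transitions."""
        online = self.is_online()
        if not online and not self.frozen:
            self.freeze_all()     # 2) freeze all tools in progress
            self.frozen = True
        elif online and self.frozen:
            self.resume_all()     # 3) resume the scan when connectivity is back
            self.frozen = False
        return self.frozen
```

Freezing and resuming only on *transitions* (rather than every poll) avoids hammering the tool processes with repeated signals.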

Feature 6) Tool crash detection

Sometimes certain tools (most notably, ahem, w3af) do NOT exit when they crash. This leaves OWTF in a difficult position where 1+ processes are waiting for nothing, forever (i.e. because "tool X" will never finish).

Feature 7) Tool (Plugin?) CPU/RAM/Bandwidth abuse detection and correction

OWTF needs to notice when tools crash and/or "go berserk" with RAM/CPU/bandwidth consumption. This is different from the existing built-in checks in OWTF like "do not launch a new tool if there is less than XYZ RAM free"; it is more like "if tool X is using > XYZ of the available RAM/CPU/bandwidth and this is (potentially) negatively affecting other tools/tests, then throttle it".

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * CRITICAL: Excellent reliability -i.e. the Health Monitor cannot crash! :)-
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - Installation Improvements and Package manager
Brief explanation:

This project is to implement what was suggested in the following github issue: https://github.com/owtf/owtf/issues/192

From the GitHub issue: "Recently I tried to make a fresh installation of OWTF. The installation process takes too much time. Is there any way to make the installation faster? Also, the installation crashed because I ran out of space in the VM."

Proposed improvements include having a private server with:
 * pre-installed files for VMs
 * pre-configured and patched tools
 * merged lists
 * pre-configured certificates

-- These could be hosted on Dropbox or a private VPS :)

Additionally, a minimal installation that installs only the core of OWTF, with an update option, could increase installation speed. The update procedure would fetch the latest file versions from the server and copy them to the right path. Additional ideas are welcome.

2 installation modes:
 * For high-speed connections (downloading the files uncompressed)
 * For low-speed connections (downloading the files compressed)

IMPORTANT NOTE: OWTF should check the available disk space BEFORE installation starts and warn the user if problems are likely.
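The pre-install disk space check mentioned above could be as simple as this sketch; the 10 GB figure is only a placeholder, since the real requirement would depend on the chosen installation mode:

```python
# Sketch of the pre-install disk space check. The required-space figure
# would come from the chosen installation mode; 10 GB is a placeholder.
import shutil

def enough_disk_space(path, required_bytes, margin=0.1):
    """True if path's filesystem has required_bytes plus a safety margin free."""
    free = shutil.disk_usage(path).free
    return free >= required_bytes * (1 + margin)

def preflight(path, required_bytes=10 * 1024 ** 3):
    """Warn the user BEFORE installation starts if space problems are likely."""
    if not enough_disk_space(path, required_bytes):
        print("WARNING: not enough free space at %s; installation may fail." % path)
        return False
    return True
```

Running this first would have caught the out-of-space VM crash described in the issue before any downloads began.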

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified code and surrounding areas.
 * IMPORTANT: OWTF contributor README compliant code
 * IMPORTANT: Sphinx-friendly Python comments (see example here)
 * Excellent reliability (i.e. proper exception handling, etc.)
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python and Bash experience would be beneficial. Some previous exposure to security concepts and penetration testing is welcome, but not strictly necessary as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - Testing Framework Improvements
Brief explanation:

As OWASP OWTF grows, it makes sense to build custom unit tests to automatically verify that functionality has not been broken. The goal of this project is to improve the existing unit testing framework so that creating OWASP OWTF unit tests is as simple as possible, to create all missing tests for new functionality, and to improve the existing ones so that OWASP OWTF functionality is verified in an automated fashion.

Top features

In this improvement phase, the Testing Framework should:
 * (Top priority) Focus more on functional tests. For example: improve coverage of the OWASP Testing Guide, PTES, etc. (lots of room for improvement there!)
 * (Top priority) Put together a great wiki documentation section for contributors. The goal here is to help contributors write tests for the functionality they implement; this should be as easy as possible, and the wiki should be heavily updated so that contributors can create their own unit tests easily moving forward.
 * (Top priority) Fix the current Travis issues :)
 * (Nice to have) Bring the unit tests up to speed with the codebase. This will be challenging but very worth trying after the top priorities.

General background

The Unit Test Framework should be able to:
 * Define test categories: For example, "all plugins", "web plugins", "aux plugins", "test framework core", etc. (please see this presentation for more background)
 * Allow regression testing of isolated plugins (i.e. "only test _this_ plugin")
 * Allow regression testing by test category (i.e. "test only web plugins")
 * Allow regression testing of everything (i.e. plugins + framework core: "test all")
 * Produce meaningful statistics and easy-to-navigate logs that identify which tests failed and, ideally, hint at how to fix the problem where possible
 * Allow for easy creation of _new_ unit tests specific to OWASP OWTF
 * Allow for easy modification and maintenance of _existing_ unit tests specific to OWASP OWTF
 * Perform well so that we can run as many tests as possible in a given period of time
 * Potentially leverage the python unittest library: http://docs.python.org/2/library/unittest.html
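As a sketch of how test categories could be expressed on top of the standard unittest library mentioned above (the category names and the tests themselves are illustrative placeholders, not OWTF's actual suite):

```python
import unittest

class WebPluginTests(unittest.TestCase):
    """Tests in the 'web plugins' category."""
    def test_plugin_output_not_empty(self):
        output = ["banner line"]  # stand-in for a real plugin run
        self.assertTrue(output)

class CoreTests(unittest.TestCase):
    """Tests in the 'test framework core' category."""
    def test_config_defaults(self):
        config = {"timeout": 300}  # stand-in for real framework config
        self.assertEqual(config["timeout"], 300)

def suite(categories):
    """Build a suite for the requested categories, e.g. ['web'] or ['all']."""
    mapping = {"web": WebPluginTests, "core": CoreTests}
    loader = unittest.TestLoader()
    cases = mapping.values() if "all" in categories else [mapping[c] for c in categories]
    return unittest.TestSuite(loader.loadTestsFromTestCase(c) for c in cases)

# Run only the 'web' category:
# unittest.TextTestRunner().run(suite(["web"]))
```

The category-to-TestCase mapping gives the "test all" / "test only web plugins" selection described above with no framework code beyond the standard library.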

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified areas and their surroundings.
 * IMPORTANT: code compliant with the OWTF contributor README
 * IMPORTANT: Sphinx-friendly Python comments (example here)
 * Performant and automated regression testing
 * Unit tests for a wide coverage of OWASP OWTF, ideally leveraging the Unit Test Framework where possible
 * Good documentation

Knowledge Prerequisite:

Python experience would be beneficial, as would experience with unit tests and automated regression testing; some previous exposure to security concepts and penetration testing is welcome but not strictly necessary, as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP OWTF - Tool utilities module
WARNING: This idea is taken from the 1st round of OWCS selections (Sept. 15th), please do NOT apply

Brief explanation:

The spirit of this feature is a set of utilities that may or may not be used from OWTF: they may be chained together by OWTF OR by a penetration tester using the command line. The idea is to automate mundane tasks that take time, giving a lever to a penetration tester short on time.

Feature 1) Vulnerable software version database:

Implement a searchable vulnerable-software-version database so that a penetration tester can enter a version and get vulnerabilities sorted by criticality, with maximum-impact vulnerabilities at the top (possibly: CVSS score in descending order).

Example: Vulnerabilities against specific software version
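A minimal sketch of the lookup-and-sort behaviour described above, using a hypothetical in-memory record list in place of a real CVE feed:

```python
# Hypothetical vulnerability records; a real tool would load a CVE feed.
VULN_DB = [
    {"software": "ExampleServer", "version": "2.4", "cve": "CVE-0000-0001", "cvss": 7.5},
    {"software": "ExampleServer", "version": "2.4", "cve": "CVE-0000-0002", "cvss": 9.8},
    {"software": "ExampleServer", "version": "2.6", "cve": "CVE-0000-0003", "cvss": 5.0},
]

def lookup(software, version):
    """Return vulnerabilities for a software version, highest CVSS first."""
    matches = [v for v in VULN_DB
               if v["software"] == software and v["version"] == version]
    return sorted(matches, key=lambda v: v["cvss"], reverse=True)

# lookup("ExampleServer", "2.4") puts CVE-0000-0002 (CVSS 9.8) first.
```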

Feature 2) Nmap output file merger:

Unify nmap output files *without* losing data, across XML, text and greppable formats. For example, sometimes two scans pass through the same port: one returns the server version banner and the other does not; we obviously do not want to lose the banner information :).
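The "never lose a banner" rule might be sketched as follows over simplified per-port records; parsing the actual nmap XML/text/greppable formats is assumed to have happened already:

```python
def merge_ports(scan_a, scan_b):
    """Merge two {(host, port): banner} dicts, never replacing a
    known banner with an empty one."""
    merged = dict(scan_a)
    for key, banner in scan_b.items():
        # Only take the new value if it carries information or the port is new.
        if banner or key not in merged:
            merged[key] = banner or merged.get(key, "")
    return merged

scan1 = {("10.0.0.1", 80): "Apache/2.2.3"}                   # first scan saw the banner
scan2 = {("10.0.0.1", 80): "", ("10.0.0.1", 22): "OpenSSH"}  # second scan did not
# merge_ports(scan1, scan2) keeps "Apache/2.2.3" and adds the SSH port.
```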

Feature 3) Nmap output file vulnerability mapper

From an nmap output file, extract the unique software version banners and provide, for each banner, a list of (maybe in tabs?):

1) CVEs in descending order of CVSS score, with links.

2) Metasploit modules available for each CVE / issue

NOTE: Can supply an *old* shell script for reference

3) Servers/ports affected (i.e. all server / port combinations using that software version)
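Point 3 above could be sketched as a grouping step, assuming (host, port, banner) tuples have already been parsed out of the nmap file:

```python
from collections import defaultdict

def group_by_banner(ports):
    """Map each unique software banner to the (host, port) pairs using it."""
    grouped = defaultdict(list)
    for host, port, banner in ports:
        if banner:  # ports with no banner cannot be mapped to a version
            grouped[banner].append((host, port))
    return dict(grouped)

# group_by_banner([("10.0.0.1", 80, "Apache/2.2.3"),
#                  ("10.0.0.2", 8080, "Apache/2.2.3")])
# -> {"Apache/2.2.3": [("10.0.0.1", 80), ("10.0.0.2", 8080)]}
```

Each banner key can then be fed to the vulnerability lookup of Feature 1 to list all affected server/port combinations per issue.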

Feature 4) URL target list creator:

Turn all ports that "speak HTTP" (from any nmap output format) into a list of URL targets for OWTF
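This transformation might be sketched as below, taking (host, port, service) tuples that a parser would extract from any nmap output format (the parsing itself is assumed):

```python
def url_targets(ports):
    """Turn ports whose service name suggests HTTP into URL targets."""
    urls = []
    for host, port, service in ports:
        if "http" not in service:
            continue
        scheme = "https" if "https" in service or "ssl" in service else "http"
        default = 443 if scheme == "https" else 80
        suffix = "" if port == default else ":%d" % port
        urls.append("%s://%s%s" % (scheme, host, suffix))
    return urls

# url_targets([("10.0.0.1", 80, "http"), ("10.0.0.1", 8443, "https")])
# -> ["http://10.0.0.1", "https://10.0.0.1:8443"]
```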

Feature 5) Hydra command creator:

nmap file in => Hydra command list out

Grep for HTTP auth / login pages in output files to identify login interfaces => Hydra command list out
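One possible shape for the first variant (nmap file in => Hydra command list out), assuming (host, port, service) tuples parsed from the nmap file; the wordlist file names are placeholders:

```python
def hydra_commands(services, users="users.txt", passwords="passwords.txt"):
    """Suggest one Hydra command per brute-forceable service found in a scan."""
    # Illustrative subset of the protocols Hydra supports.
    supported = {"ssh", "ftp", "telnet", "http-get"}
    return ["hydra -L %s -P %s %s://%s:%d" % (users, passwords, svc, host, port)
            for host, port, svc in services if svc in supported]

# hydra_commands([("10.0.0.1", 22, "ssh"), ("10.0.0.1", 80, "http")])
# -> ["hydra -L users.txt -P passwords.txt ssh://10.0.0.1:22"]
```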

Feature 6) WP-scan command creator:

Look at all URLs (i.e. from an nmap file), check whether they might be running WordPress, and generate a list of suggested wp-scan commands for all targets that might be running WordPress.
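A naive sketch of this idea; a real implementation would fetch each URL and look for WordPress fingerprints rather than just inspect the URL string:

```python
def wpscan_commands(urls, hints=("wp-content", "wp-login", "wordpress")):
    """Suggest a wpscan command for every URL that might be running WordPress.

    `hints` are naive URL-based indicators, used here as placeholders.
    """
    suspects = [u for u in urls if any(h in u.lower() for h in hints)]
    return ["wpscan --url %s" % u for u in suspects]

# wpscan_commands(["http://blog.example/wp-login.php", "http://example/app"])
# -> ["wpscan --url http://blog.example/wp-login.php"]
```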

For background on OWASP OWTF please see: https://www.owasp.org/index.php/OWASP_OWTF

Expected results:


 * IMPORTANT: PEP-8 compliant code in all modified areas and their surroundings.
 * IMPORTANT: code compliant with the OWTF contributor README
 * IMPORTANT: Sphinx-friendly Python comments (example here)
 * Excellent reliability (i.e. proper exception handling, etc.)
 * Good performance
 * Unit tests / Functional tests
 * Good documentation

Knowledge Prerequisite:

Python experience would be beneficial, as would experience with unit tests and automated regression testing; some previous exposure to security concepts and penetration testing is welcome but not strictly necessary, as long as there is a will to learn.

OWASP OWTF Mentor:

Abraham Aranguren, Bharadwaj Machiraju - OWASP OWTF Project Leaders - Contact: Abraham.Aranguren@owasp.org, bharadwaj.machiraju@gmail.com

OWASP Mentors

OWASP ZAP
There are a number of ZAP related projects students can work on, including:
 * Bug tracker support
 * Convert active and passive scan rules to scripts
 * Field enumeration
 * Form handling
 * Gauntlet integration
 * Script console code completion
 * Support Java as a scripting language
 * Testing guide integration
 * Zest text representation and parser

For more details see the ZAP Open Projects wiki page.