Testing: Conduct search engine discovery/reconnaissance for information leakage (OTG-INFO-001)

Revision as of 10:17, 11 December 2013

This article is part of the new OWASP Testing Guide v4. 
At the moment the project is in the REVIEW phase.

Back to the OWASP Testing Guide v4 ToC: https://www.owasp.org/index.php/OWASP_Testing_Guide_v4_Table_of_Contents
Back to the OWASP Testing Guide Project: http://www.owasp.org/index.php/OWASP_Testing_Project


Summary

There are direct and indirect elements to search engine discovery and reconnaissance. Direct methods relate to searching the indexes and the associated content from caches. Indirect methods relate to gleaning sensitive design and configuration information by searching forums, newsgroups and tendering websites.

Once a search engine robot has completed crawling, it commences indexing the web page based on tags and associated attributes, such as <TITLE>, in order to return the relevant search results. [1]

If the robots.txt file is not updated during the lifetime of the web site, and inline HTML meta tags instructing robots not to index content have not been used, then it is possible for indexes to contain web content that the owners did not intend to be included. Website owners may use the previously mentioned robots.txt, HTML meta tags, authentication, and tools provided by search engines to remove such content.
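As a sketch, the exclusion mechanisms mentioned above look like the following (the path is illustrative):

```text
# robots.txt — asks well-behaved crawlers not to retrieve a given path
User-agent: *
Disallow: /internal/

<!-- inline HTML meta tag — asks robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that robots.txt is advisory only; it does not prevent a hostile crawler from retrieving the listed content, so sensitive material should be protected by authentication rather than by exclusion rules alone.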

Test Objectives

To understand what sensitive design and configuration information about the application, system, or organisation is exposed either directly (on the organisation's website) or indirectly (on a third-party website).

How to Test

Using a search engine, search for:

  • Network diagrams and configurations
  • Archived posts and emails by administrators and other key staff
  • Logon procedures and username formats
  • Usernames and passwords
  • Error message content
  • Development, test, UAT and staging versions of the website
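As a sketch, candidate queries for the items above can be generated by combining keywords with the "site:" operator covered in the next section; the target domain and keyword list here are illustrative only:

```shell
# Combine each keyword of interest with the "site:" operator to build
# candidate search queries. Domain and keywords are illustrative.
DOMAIN="owasp.org"
for KEYWORD in "network diagram" "logon procedure" "password" "error" "staging"; do
    QUERY="site:${DOMAIN} \"${KEYWORD}\""
    echo "$QUERY"
done
```

Each emitted line can then be submitted to the search engines listed below.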

Black Box Testing

Using the advanced "site:" search operator, it is possible to restrict search results to a specific domain [2]. Do not limit testing to just one search engine provider, as different providers may generate different results depending on when they crawled the content and on their own algorithms. Consider the following search engines:

  • Baidu
  • binsearch.info
  • Bing
  • Duck Duck Go
  • ixquick/Startpage
  • Google
  • Shodan
  • PunkSpider

Duck Duck Go and ixquick/Startpage provide reduced information leakage about the tester.

Google provides the advanced "cache:" search operator [2], but this is equivalent to clicking the "Cached" link next to each Google search result. Hence, using the "site:" operator and then clicking the "Cached" link is preferred.

The Google SOAP Search API supports the doGetCachedPage and the associated doGetCachedPageResponse SOAP Messages [3] to assist with retrieving cached pages. An implementation of this is under development by the OWASP "Google Hacking" Project.

PunkSpider is a web application vulnerability search engine. It is of little use to a penetration tester doing manual work, but it can be useful as a demonstration of how easily script kiddies can find vulnerabilities.


Example

To find the web content of owasp.org indexed by a typical search engine, the syntax required is:

site:owasp.org

Figure: Google "site:" operator search results example

To display the index.html of owasp.org as cached, the syntax is:

cache:owasp.org

Figure: Google "cache:" operator search results example

Google Hacking Database

The Google Hacking Database is a list of useful search queries for Google. The queries are grouped into several categories:

  • Footholds
  • Files containing usernames
  • Sensitive Directories
  • Web Server Detection
  • Vulnerable Files
  • Vulnerable Servers
  • Error Messages
  • Files containing juicy info
  • Files containing passwords
  • Sensitive Online Shopping Info
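A few representative queries of the kind catalogued in the database (illustrative examples only; consult the database itself for current entries):

```text
intitle:"index of" "parent directory"   (exposed directory listings)
filetype:log inurl:password             (log files that may contain credentials)
filetype:sql "INSERT INTO"              (exported database dumps)
```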


Gray Box testing and example

Gray Box testing is the same as Black Box testing above.


Vulnerability References

Web
[1] "Google Basics: Learn how Google Discovers, Crawls, and Serves Web Pages" - http://www.google.com/support/webmasters/bin/answer.py?answer=70897
[2] "Operators and More Search Help" - http://support.google.com/websearch/bin/answer.py?hl=en&answer=136861
[3] "Google Hacking Database" - http://www.exploit-db.com/google-dorks/

Tools

[4] FoundStone SiteDigger - http://www.mcafee.com/uk/downloads/free-tools/sitedigger.aspx
[5] Google Hacker - http://yehg.net/lab/pr0js/files.php/googlehacker.zip
[6] Stach & Liu's Google Hacking Diggity Project - http://www.stachliu.com/resources/tools/google-hacking-diggity-project/
[7] PunkSPIDER - http://punkspider.hyperiongray.com/

Remediation

Carefully consider the sensitivity of design and configuration information before it is posted online.

Periodically review the sensitivity of existing design and configuration information that is posted online.