
Thursday, 12 March 2015

Web Application Security Analysis: Comprehensively Evaluate Before Starting



The Need for Dynamic Web Application Security Analysis

The process generally begins with questions like "Why is it needed?", "Can we defer it to the next quarter?", and "Can't we simply ask the developers to write 'good' code?". Once these questions are answered by effectively communicating the business impact, organizations generally move to the next question: "How do we do it?". The answer usually involves web application security scanning tools. Web application security scanners crawl a web application and locate application-layer vulnerabilities and weaknesses, either by manipulating HTTP messages or by looking for suspicious attributes. Once the need is clearly established, we descend to the final question: which tool do we pick?
 

What exactly is it?

Dynamic web application security validation is performed using web application scanners to check web applications for security problems commonly exploited by attackers. It helps in detecting vulnerabilities such as Cross-Site Scripting, SQL Injection, Directory Traversal, insecure configurations, and remote command execution. While it is good practice for measuring security preparedness, at times this validation is also mandatory, as in the case of compliance standards such as PCI-DSS (Payment Card Industry Data Security Standard).
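To make this concrete, here is a minimal sketch (in Python, using the requests library) of the kind of probe such scanners automate: inject a classic SQL metacharacter into a parameter and look for database error strings in the response. The target URL and error signatures are illustrative assumptions, not taken from any particular tool.

    import requests

    TARGET = "http://testapp.example/item"   # hypothetical test application
    ERROR_SIGNATURES = ["SQL syntax", "ODBC", "ORA-", "SQLite3::"]

    # Send a single quote in the parameter and check whether a database
    # error leaks back in the response body.
    resp = requests.get(TARGET, params={"id": "1'"})
    if any(sig in resp.text for sig in ERROR_SIGNATURES):
        print("Possible SQL injection: database error visible in response")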

What do we look for in the Scanners?

There are many tools available in the market; some are commercial and some are open source. How do we check the effectiveness of these tools? It is an important factor in the success of securing applications against threats. We need a competent model and a set of evaluation criteria against which each tool can be benchmarked. So on what criteria should a tool be evaluated? The primary areas in which the competency of the tool/scanner has to be examined are as follows:

Protocol Support
Check the protocol support list of the scanner and verify that SSL/TLS, the required HTTP versions, and HTTP compression are supported.

Check the ability of the scanner to keep connections open across multiple requests (HTTP keep-alive).

Check the ability of the tool to change user agents to emulate different browser types.

Check the ability to automatically distinguish between 'direct' and 'via proxy' requests through specified parameters. A quick check of several of these capabilities is sketched below.
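The sketch below uses Python's requests library; testapp.example and the proxy address are placeholders.

    import requests

    session = requests.Session()   # reuses the TCP connection (keep-alive)

    # TLS support: plain https:// request with certificate verification on
    r1 = session.get("https://testapp.example/", verify=True)

    # HTTP compression: requests sends Accept-Encoding: gzip, deflate by
    # default; the response header shows what the server actually used
    print(r1.headers.get("Content-Encoding"))

    # User-agent switching to emulate a different browser
    ua = {"User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:36.0) Firefox/36.0"}
    r2 = session.get("https://testapp.example/", headers=ua)

    # The same request routed via a proxy, to compare direct vs. proxied
    proxies = {"https": "http://127.0.0.1:8080"}   # hypothetical local proxy
    r3 = session.get("https://testapp.example/", proxies=proxies)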

Authentication
Check for the presence of all the authentication schemes used in your organization. If it is difficult to get the complete list, then make sure the scanner's supported authentication schemes at least include the following (a login sketch follows the list):

-- Basic
-- Digest
-- Single Sign-on  
-- HTML Form-Based
-- HTTP Negotiate (negotiates between NTLM and Kerberos)
-- Client SSL Certificates
-- Custom implementation of extensible HTTP frameworks
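As a reference point, the sketch below shows what exercising a few of these schemes looks like with Python's requests library; the host, paths, credentials, and certificate files are placeholders.

    import requests
    from requests.auth import HTTPBasicAuth, HTTPDigestAuth

    BASE = "http://testapp.example"   # hypothetical test application

    # Basic and Digest authentication
    requests.get(BASE + "/basic", auth=HTTPBasicAuth("user", "pass"))
    requests.get(BASE + "/digest", auth=HTTPDigestAuth("user", "pass"))

    # HTML form-based login: post credentials, keep the session cookie
    s = requests.Session()
    s.post(BASE + "/login", data={"username": "user", "password": "pass"})

    # Client SSL certificate (PEM certificate and key files)
    requests.get("https://testapp.example/", cert=("client.crt", "client.key"))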

Session Handling
Proper session handling capabilities to maintain valid sessions are required for effectively crawling web applications. Essentially, all tools should have the ability to (see the sketch after this list):

-- Start a new session
-- Refresh session token
-- Detect expired session
-- Reacquire session token in case of expiry
-- Handle HTTP cookies
-- Handle HTTP Parameters used for session tracking
-- Handle session token embedded in the URL
-- Enable automatic detection of session tokens or manually configure session tokens
-- Define a session token refresh policy, such as using fixed tokens, tokens from the login process, or dynamic tokens (for apps using "nonces", i.e. an arbitrary number used only once in a cryptographic communication)
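The core of this requirement is a login/refresh loop. Below is a minimal sketch in Python, assuming a hypothetical form-based application at testapp.example where an expired session redirects to the login page.

    import requests

    BASE = "http://testapp.example"   # hypothetical test application
    CREDS = {"username": "user", "password": "pass"}

    s = requests.Session()
    s.post(BASE + "/login", data=CREDS)   # start a new session

    def get(path):
        """Fetch a page, reacquiring the session token if it has expired."""
        r = s.get(BASE + path)
        # Expiry detection here is a redirect to the login page; real
        # scanners let you configure the pattern that marks a dead session.
        if "/login" in r.url:
            s.post(BASE + "/login", data=CREDS)   # reacquire the token
            r = s.get(BASE + path)
        return r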

Crawling

Effective crawling of web applications ensures that all the links/pages of the web application are covered. Crawling begins from the provided start page and proceeds to the pages reachable through the links present on each page. The process should end at user-specified criteria or when there are no more pages/links to crawl.

Crawling configuration options of the scanner should allow inclusion/exclusion of hostnames, IPs, URLs/URL patterns, and file extensions. Additional options such as support for concurrent sessions, request delays, maximum crawl depth, and crawler training should also be checked.

While crawling, the scanner should also be able to follow redirects (HTTP, JavaScript, meta refresh), identify and accept cookies, support AJAX, and support automatic submission of forms. A minimal crawler sketch follows.
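For intuition about what the crawler's core loop does, here is a minimal same-host crawler sketch with a depth limit and an exclusion pattern (Python, using requests and BeautifulSoup; testapp.example is a placeholder). Real scanners add JavaScript execution, AJAX handling, and form submission on top of this.

    import re
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    START, MAX_DEPTH = "http://testapp.example/", 3
    EXCLUDE = re.compile(r"/logout|\.(pdf|zip|jpg)$")   # skip these URLs
    seen = set()

    def crawl(url, depth=0):
        if depth > MAX_DEPTH or url in seen or EXCLUDE.search(url):
            return
        seen.add(url)
        r = requests.get(url)   # follows HTTP redirects by default
        for a in BeautifulSoup(r.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == urlparse(START).netloc:   # same host
                crawl(link, depth + 1)

    crawl(START)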


Parsing
To effectively understand a web application, a scanner needs to map its pages and functionality. Some of this can be done through manual training of the scanner, but the intelligence of the tool depends on its ability to parse the application's web content. The essential content types a scanner should be able to parse include HTML, JavaScript, VBScript, XML, plain text, ActiveX objects, Java applets, Flash, and CSS.
The parser should support different encoding types, such as ISO-8859-1 (the default for HTTP content), UTF-7, UTF-8, and UTF-16. The parser should be configurable/customizable, should have robust error handling, and should emulate client-side logic with minimal training. A small parsing sketch follows.
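As a small illustration of the encoding concern, the sketch below decodes a response using a detected charset (falling back to the HTTP default) before extracting links and forms; testapp.example is a placeholder.

    import requests
    from bs4 import BeautifulSoup

    r = requests.get("http://testapp.example/")
    # Honour the detected charset; fall back to the HTTP default
    r.encoding = r.apparent_encoding or "ISO-8859-1"
    soup = BeautifulSoup(r.text, "html.parser")

    links = [a["href"] for a in soup.find_all("a", href=True)]
    forms = [(f.get("action"), f.get("method", "get"))
             for f in soup.find_all("form")]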

Attacking/Testing

Attacking/testing capabilities should be the primary measure of the core functionality of a web application scanner.
A scanner should allow configurations to include/exclude/set URLs, file extensions, parameters, host names/IPs, cookies, and HTTP headers. It should allow users to define a policy for attacking/testing each web application according to its design and characteristics.
It should be able to identify vulnerabilities, architectural weaknesses, and authorization- and authentication-related problems. Scanners should cover/detect the following types of attacks/vulnerabilities (a sample probe is sketched after the list):

-- Authentication
                --  Insufficient Authentication
                --  Weak Password
                --  Lack of SSL on Login page
                --  Brute Force
                --  Auto-complete not disabled on password fields

-- Authorization
                --  Credential/session prediction (sequential/non random session token)
                --  Insufficient session expiration
                --  Session Fixation (e.g. failure to generate new session IDs)
                --  Session Weakness (e.g. session token in URL, session cookie without the Secure attribute)
                --  Site does not force SSL connection

-- Client-Side Attacks
                --  Content Spoofing
                --  Cross-Site Scripting
                --  Cross-Frame Scripting
                --  HTML Injection
                --  Cross-Site Request Forgery
                --  Cross-Site Flashing
                --  Open Cross-Domain Policy

-- Command Execution
                --  OS Command Injection
                --  SQL Injection
                --  LDAP Injection
                --  Format String Attack
                --  SSI Injection
                --  XPath Injection
                --  HTTP Header Injection/ Response Splitting
                --  Remote File Inclusion
                --  Local File Inclusion

-- Information Disclosure
                --  Directory Indexing
                --  Information leakage
                --  Path Traversal
                --  Predictable resource location
                --  Insecure HTTP methods enabled
                --  Internal IP address disclosure
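As an example of what covering an attack class means in practice, here is a minimal reflected Cross-Site Scripting probe of the kind scanners run against each input: inject a unique marker and check whether it comes back unencoded. The URL and parameter name are hypothetical.

    import requests

    marker = "<zz9xss>"   # unique marker; reflected unencoded implies XSS risk
    r = requests.get("http://testapp.example/search", params={"q": marker})
    if marker in r.text:
        print("Parameter 'q' reflects input unencoded: likely XSS")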

Command, Control, and Reporting
These factors contribute to the usability of the scanner and, for all practical purposes, are often major factors in decision making. The primary capabilities to consider are as follows (a scan API sketch follows the list):

-- Ability to schedule scans
-- Ability to pause and resume scans
-- Ability to view real time scan status
-- Ability to define re-usable configurable templates
-- Multi user and multi scan support
-- API for Scans
-- Integration with bug tracking/ALM tools

Reporting of scan results is an important consideration, as it helps plan remediation efforts and contingencies. Reports should be intuitive for laymen and people from non-security backgrounds. The scanner should support different report types to cater to different audiences in the organization, such as Executive Summary, Technical Details, Delta (difference between scans), and Compliance reports. It should also allow customization of reports: marking false positives, changing the risk level of vulnerabilities, identifying vulnerability locations, and exporting reports to XML, PDF, and HTML formats. A delta-report sketch follows.
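As a sketch of how a delta report can be derived, the snippet below diffs the findings of two scans keyed by (vulnerability, URL, parameter); the finding format is an assumption.

    def delta(old_findings, new_findings):
        """Return findings introduced since the old scan and findings fixed."""
        old = {(f["vuln"], f["url"], f["param"]) for f in old_findings}
        new = {(f["vuln"], f["url"], f["param"]) for f in new_findings}
        return {"introduced": new - old, "fixed": old - new}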

Summary
Ideally, these areas should have specific vendor-neutral benchmarks established before the evaluation exercise begins. Evaluating scanners can still be a daunting exercise, as they are complex by design. As important as the above criteria are, awareness of the specific needs, design, and architecture of your own applications matters just as much.
Proper preparation includes setting up the environment (hardware and software), keeping the victim web applications ready, and being clear about the evaluation timeline, cost, and budget. A few non-technical factors, such as purchase cost, support cost, training cost, and quality of documentation, should also be properly weighed in.