3 DIGIT TestCentre
- Organizational location: DIGIT.A.3
- Physical location: DRB D3 (LUX)
- Service manager: Gabriel BABIANO
- Performance testing service since 2002 (currently 6 testers)
- Vulnerability assessment service since 2011 (currently 3 testers)
4 DIGIT TestCentre – clients and figures
Clients:
- European Commission (including Executive Agencies and Services)
- Other European institutions under agreement (e.g. Court of Justice of the European Union)
Figures:
- Around 50 VTs per year
- Breakdown per DG
5 Grounds for vulnerability assessment
Motivation:
- Legal constraints
- Reputation
- Data theft
- Continuity of the service
- 75% of cyber-attacks are directed at the web application layer (Gartner)
Network security alone does not protect web applications!
7 Cost versus life-cycle stage
"Finding and fixing a software problem after delivery is often 100 times more expensive than finding and fixing it during the design and requirements phase" (Barry Boehm)
- VT
- Secure coding guidelines
8 DIGIT TC Vulnerability service deliverables
Vulnerability assessment reports (per test/iteration):
- Filtered potential vulnerabilities (no false positives)
- Classification by criticality and prioritization
- Potential remediation
- Evolution from previous iterations
Secure coding guidelines:
- Best practices in secure coding
- Recommended languages (HTML, Java, ColdFusion)
- Aligned to the evolution of threats
- For both developers and operational managers
- 1st draft release due 01/2013
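A staple of any secure coding guideline is replacing string-built SQL with parameterized queries. The guidelines above target Java and ColdFusion; the following is only an illustrative sketch in Python (the table, data, and payload are made up, not taken from the DIGIT material):

```python
import sqlite3

# In-memory database standing in for an application's data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # Vulnerable: user input is concatenated into the SQL statement,
    # so a crafted name can change the query's meaning (SQL injection).
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Remediation: a parameterized query keeps the input as data only.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # returns every row: injection succeeded
print(find_user_safe(payload))    # returns no rows: input treated as data
```

The same remediation maps directly to `PreparedStatement` in Java or `cfqueryparam` in ColdFusion.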
9 DIGIT VT service tests
Black Box Vulnerability Test (dynamic analysis):
- Needs a working target application (as close to PROD as possible)
- No access to source code required
- Not specific to any coding language
- Automated tools, plus manual testing to supplement the tools
- Complements penetration testing and WBVT
White Box Vulnerability Test (static analysis):
- Requires access to buildable source code
- Automated tools, plus manual review to avoid false positives
- All recommended languages are supported (Java, CF, …)
- No absolute need for a running target application, but it helps a lot
- Detects more vulnerabilities than black box testing
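The combination "automated tools plus manual review" can be illustrated with a toy static-analysis pass. Real SAST tools use data-flow analysis rather than the hypothetical regex rules below; the point of the sketch is that pattern hits still need a human to discard false positives:

```python
import re

# Hypothetical rule set: pattern -> finding name (illustrative only).
RULES = {
    r"executeQuery\(.*\+.*\)": "Possible SQL injection (string-built query)",
    r"Math\.random\(\)": "Weak randomness for security decisions",
}

def scan(source_lines):
    """Return (line_number, finding) pairs for every rule match."""
    findings = []
    for lineno, line in enumerate(source_lines, start=1):
        for pattern, finding in RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, finding))
    return findings

java_snippet = [
    'stmt.executeQuery("SELECT * FROM t WHERE id=" + userInput);',
    'double d = Math.random();  // drives a progress bar, not security',
]
for lineno, finding in scan(java_snippet):
    print(lineno, finding)
# The second hit is a false positive a manual reviewer would discard:
# the random value is never used for anything security-relevant.
```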
10 DIGIT TestCentre service procedure workflow
Several iterations are normally required.
11 DIGIT TC Vulnerability service tools
Static code analysis (SAST):
- Automated tools
- Manual code review: Eclipse
Dynamic program analysis (DAST):
- Automated tools
- Manual tools: Firefox and plugins (e.g. Tamper Data)
- Database tools
14 Tools evaluation – critical metrics
Correctness of the results:
- Accuracy
- Minimum false positives
- Minimum inconclusive results
- Minimum duplicates
Completeness of the results:
- % detected / % missed (false negatives)
- Misnamed findings
Performance:
- Scan duration
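These correctness and completeness criteria can be computed mechanically once a scanner's output is compared against a known ground truth. A minimal sketch with made-up finding identifiers:

```python
def tool_metrics(reported, true_vulns):
    """Score a scanner's findings against a known ground truth.
    Mirrors the evaluation criteria: completeness (% detected / % missed,
    i.e. false negatives) and correctness (false positives)."""
    reported, true_vulns = set(reported), set(true_vulns)
    true_positives = reported & true_vulns
    false_positives = reported - true_vulns   # noise the tool invented
    false_negatives = true_vulns - reported   # real flaws the tool missed
    return {
        "detected_pct": 100 * len(true_positives) / len(true_vulns),
        "missed_pct": 100 * len(false_negatives) / len(true_vulns),
        "false_positives": len(false_positives),
    }

ground_truth = {"xss-1", "sqli-1", "csrf-1", "redirect-1"}
scanner_output = {"xss-1", "sqli-1", "banner-leak"}  # one finding is noise
print(tool_metrics(scanner_output, ground_truth))
# → {'detected_pct': 50.0, 'missed_pct': 50.0, 'false_positives': 1}
```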
15 Tools lists
Static code analysis (SAST):
- http://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis
- https://www.owasp.org/index.php/Source_Code_Analysis_Tools
Dynamic program analysis (DAST):
- http://en.wikipedia.org/wiki/Dynamic_program_analysis
Open source DAST tools:
- WebScarab (Open Web Application Security Project, OWASP)
- Nikto / Wikto
- Google ratproxy and skipfish
- w3af
- Websecurify
16 Costs per test
In-house service:
- Assumption: a complete VT (WB & BB) takes 10 working days on average (15 tests per tester per year)
- Strong investment in licences in the first year
- Costs are similar after the 4th year
- Requires security-skilled testers and an "industrialized" procedure
Outsourced service:
- No upfront investment required
- Less flexible for the development? Quality? Iterations?
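The slide's capacity assumption can be turned into a back-of-the-envelope cost model. Only the 10 working days per test and 15 tests per tester per year come from the slide; the licence costs and daily rate below are placeholder figures, not DIGIT numbers:

```python
def in_house_cost_per_test(tests_per_year, licence_cost, daily_rate,
                           days_per_test=10):
    """Rough yearly cost per VT for an in-house service.
    days_per_test and tests_per_year follow the slide's assumption;
    licence_cost and daily_rate are made-up placeholders."""
    labour = tests_per_year * days_per_test * daily_rate
    return (labour + licence_cost) / tests_per_year

# Year 1 carries the strong licence investment; later years only renewals.
year1 = in_house_cost_per_test(15, licence_cost=60000, daily_rate=500)
later = in_house_cost_per_test(15, licence_cost=10000, daily_rate=500)
print(year1, later)  # per-test cost drops once licences are amortized
```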
22 2011 CWE Top 25 Most Dangerous Software Errors
http://cwe.mitre.org/top25/
23 Comparison: OWASP Top Ten 2010 vs. CWE Top 25 2011
http://cwe.mitre.org/top25/
24 DIGIT TestCentre
Score = Risk * Impact
Priorities are adapted for every application.
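The formula Score = Risk * Impact can be sketched with numeric levels. The 1–3 scales and the example findings below are illustrative assumptions; per the slide, real weights are adapted for every application:

```python
# Numeric scales for the slide's formula Score = Risk * Impact
# (the 1-3 mapping is an illustrative assumption).
LEVELS = {"low": 1, "medium": 2, "high": 3}

def score(risk, impact):
    return LEVELS[risk] * LEVELS[impact]

findings = [
    ("SQL injection in login form", "high", "high"),
    ("Verbose server banner", "low", "low"),
    ("Session cookie without Secure flag", "medium", "high"),
]
# Rank findings so the highest-scoring ones surface first.
ranked = sorted(findings, key=lambda f: score(f[1], f[2]), reverse=True)
for name, risk, impact in ranked:
    print(score(risk, impact), name)
```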
25 Vulnerability assessment
- Assess and secure all parts individually
- The idea is to force an attacker to penetrate several defence layers
- As a general rule, data stored in databases is considered "untrusted"
"In God we trust; for the rest, we test."
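Treating database content as untrusted means encoding it on output even though it was "already stored": a payload persisted in an earlier request (stored XSS) must be displayed, not executed. A minimal sketch, with a hypothetical comment value:

```python
import html

def render_comment(comment_from_db):
    # Defence layer at the output stage: HTML-encode stored data so a
    # persisted script payload is rendered as text, not executed.
    return "<p>" + html.escape(comment_from_db) + "</p>"

stored = "<script>steal(document.cookie)</script>"
print(render_comment(stored))
# → <p>&lt;script&gt;steal(document.cookie)&lt;/script&gt;</p>
```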
26 Vulnerability remediation priorities
- Recommendations for remediation are found in the report
- Cover high-priority vulnerabilities first, then others when "affordable"
- Begin with risky vulnerabilities that are easy to remediate
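This prioritization rule amounts to a two-key sort: highest priority first and, within equal priority, the cheapest fixes first. The finding names, priority levels and person-day effort estimates below are hypothetical:

```python
def remediation_order(findings):
    """Order findings as the slide suggests: descending priority, then
    ascending remediation effort (quick wins first within a priority)."""
    return sorted(findings, key=lambda f: (-f["priority"], f["effort"]))

findings = [
    {"name": "Stored XSS", "priority": 3, "effort": 5},
    {"name": "Missing HttpOnly flag", "priority": 3, "effort": 1},
    {"name": "Directory listing", "priority": 1, "effort": 1},
    {"name": "Weak TLS ciphers", "priority": 2, "effort": 2},
]
for f in remediation_order(findings):
    print(f["name"])
# The quick high-priority win (HttpOnly) comes before the costlier XSS fix.
```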
27 Vulnerability type occurrence in the 1st iteration (%)
http://cwe.mitre.org/top25/
http://projects.webappsec.org/w/page/13246989/Web%20Application%20Security%20Statistics
28 Improvements in design and coding stages
Findings per vulnerability group and iteration:

Vulnerability group                            Iteration:  1   2   3   4   5   6
Cross-Site Scripting                                      43  14   2   2   1   1
Injection                                                 23   6   1   1
Insecure transmission of credentials/tokens               10   3
Password management                                       13   6   2
Cookie security                                            9   7
Path manipulation                                          3   2   1   1
Weak authentication                                        4   2
Open redirect                                              5
Logging of credentials                                     2   1
Cross-Site Request Forgery                                16   4   1   1
Header manipulation                                       15   3   1
Weak cryptography                                         14   2   1
File upload                                                8   3   1   1
Forced browsing                                            7   2   1
Log forging                                                6   1   1   1
Information disclosure                                     4   3   2

Security increases in every iteration. Flaws can appear in future iterations.
29 Threats to the VT success
- Tested source code is not the same as PROD
- Testing environment differs from PROD
- Vulnerability testing tools cannot cover everything automatically
- Hacking techniques evolve faster than service coverage
- If necessary, penetration tests are conducted by a 3rd party
100% risk- and vulnerability-free cannot be guaranteed, and security is not only a matter of secure source code.
30 Some references
- Open Web Application Security Project (OWASP): www.owasp.org
- Web Application Security Consortium (WASC): www.webappsec.org
- Common Vulnerability Scoring System (CVSS): http://www.first.org/cvss/
- Common Weakness Enumeration (CWE): http://cwe.mitre.org
- Common Attack Pattern Enumeration and Classification (CAPEC): http://capec.mitre.org/
- SANS Institute: www.sans.org