

1 I.T. DIGIT TestCentre – Vulnerability assessment service
Gabriel BABIANO, DIGIT.A.3
29/11/2012

2 Agenda
Service presentation
Lessons learned

3 DIGIT TestCentre
Organizational location: DIGIT.A.3
Physical location: DRB D3 (LUX)
Service manager: Gabriel BABIANO
Performance testing service since 2002 (currently 6 testers)
Vulnerability assessment service since 2011 (currently 3 testers)

4 DIGIT TestCentre – clients and figures
Clients: European Commission (including Executive Agencies and Services) and other European institutions under agreement (e.g. Court of Justice of the European Union)
Around 50 VTs per year
Breakdown per DG

5 Grounds for vulnerability assessment
Motivation: legal constraints, reputation, data theft, continuity of the service
75% of cyber-attacks are directed at the web application layer (Gartner)
Network security alone does not protect web applications!

6 Tests in the Information Systems life-cycle

7 Cost versus life-cycle stage
"Finding and fixing a software problem after delivery is often 100 times more expensive than finding and fixing it during the design and requirements phase" (Barry Boehm)
VT
Secure coding guidelines

8 DIGIT TC Vulnerability service deliverables
Vulnerability assessment reports (per test/iteration)
  Filtered potential vulnerabilities (no false positives)
  Classification by criticality and prioritization
  Potential remediation
  Evolution from previous iterations
Secure coding guidelines
  Best practices in secure coding
  Recommended languages (HTML, Java, ColdFusion)
  Aligned to the evolution of threats
  For both developers and operational managers
  1st draft release due for 01/2013

9 DIGIT VT service tests
Black Box Vulnerability Tests (dynamic analysis)
  Need a working target application (closest to PROD)
  No access to source code required
  Not specific to coding language(s)
  Automatic tools plus manual testing to supplement the tools
  Complement to penetration testing and WBVT
White Box Vulnerability Tests (static analysis)
  Access to buildable source code
  Automatic tools plus manual review to avoid false positives
  All recommended languages are supported (Java, CF…)
  No absolute need for a running application target, but it helps a lot
  Detect more vulnerabilities than black box
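The white-box/black-box distinction above can be made concrete with a small, illustrative Java sketch (not from the slides; class and method names are hypothetical): static analysis flags untrusted input concatenated into SQL, and the usual remediation is a parameterized query, which with JDBC would be a PreparedStatement.

```java
// Illustrative sketch of the kind of finding a white-box (static analysis)
// pass reports, and the remediated form. Names are hypothetical.
public class InjectionExample {

    // Vulnerable pattern: tainted user input flows straight into the query
    // string, so attacker-supplied text becomes part of the SQL grammar.
    static String vulnerableQuery(String userInput) {
        return "SELECT * FROM users WHERE name = '" + userInput + "'";
    }

    // Remediated pattern: a placeholder keeps data out of the SQL grammar;
    // with JDBC this query would be bound via PreparedStatement.setString().
    static String parameterizedQuery() {
        return "SELECT * FROM users WHERE name = ?";
    }

    public static void main(String[] args) {
        String attack = "x' OR '1'='1";
        // The payload rewrites the query's logic:
        System.out.println(vulnerableQuery(attack));
        // -> SELECT * FROM users WHERE name = 'x' OR '1'='1'
        System.out.println(parameterizedQuery());
    }
}
```

A dynamic (black-box) test would find the same flaw from the outside by sending the payload to the running application; the static pass finds it in the source without a running target.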

10 DIGIT TestCentre service procedure workflow
Several iterations are normally required

11 DIGIT TC Vulnerability service tools
Static code analysis (SAST)
  Automatic tools
  Manual code review: Eclipse
Dynamic program analysis (DAST)
  Automatic tools
  Manual tools: Firefox and plugins (Tamper Data), database tools

12 Tools evaluation – methodology

13 Tools evaluation – criteria

14 Tools evaluation – critical metrics
Correctness of the results
  Accurate
  Minimum false positives
  Minimum inconclusive results
  Minimum duplicates
Completeness of the results
  % detected
  % missed (false negatives)
  Misnamed
Performance
  Scan duration
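The correctness and completeness metrics above reduce to simple ratios over a benchmark application with known, seeded vulnerabilities. A minimal sketch, with hypothetical figures (the counts are illustrative, not from the slides):

```java
// Sketch of the tool-evaluation metrics: completeness ("% detected") and
// correctness (share of reported findings that are real). Figures are
// hypothetical benchmark numbers, not measured results.
public class ToolMetrics {

    // Completeness: share of the benchmark's known vulnerabilities found.
    static double detectedPct(int truePositives, int totalReal) {
        return 100.0 * truePositives / totalReal;
    }

    // Correctness: share of reported findings that are real vulnerabilities.
    static double precisionPct(int truePositives, int reported) {
        return 100.0 * truePositives / reported;
    }

    public static void main(String[] args) {
        int totalReal = 50;      // known vulnerabilities seeded in the benchmark app
        int reported = 55;       // everything the scanner reported
        int truePositives = 40;  // reported findings that are real

        int falsePositives = reported - truePositives;   // correctness penalty
        int falseNegatives = totalReal - truePositives;  // "% missed"

        System.out.printf("detected: %.1f%%  precision: %.1f%%  FP: %d  FN: %d%n",
                detectedPct(truePositives, totalReal),
                precisionPct(truePositives, reported),
                falsePositives, falseNegatives);
    }
}
```

Comparing tools on the same benchmark then means comparing these ratios plus the scan duration.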

15 Tools lists
Static code analysis (SAST)
Dynamic program analysis (DAST)
Open-source DAST tools:
  WebScarab
  Nikto / Wikto
  Open Web Application Security Project (OWASP)
  Google ratproxy and skipfish
  W3af
  Websecurify

16 Costs per test
In-house service:
  Assumption: a complete VT (WB & BB) takes 10 working days on average (15 tests per tester per year)
  Strong investment in licenses the first year
  Costs are similar after the 4th year
  A security-skilled tester with an "industrialized" procedure is required
Outsourced service:
  No investment required
  Less flexible for the development?
  Quality? Iterations?

17 Engineering for attacks

18 Vulnerability risk areas
Security controls
Security functions

19 OWASP Top Ten (2010 Edition)


22 2011 CWE Top 25 Most Dangerous Software Errors

23 Comparison: OWASP Top Ten 2010 – CWE Top 25 2011

24 DIGIT TestCentre
Score = Risk * Impact
Priorities are adapted for every application
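The Score = Risk * Impact rule above can be sketched in a few lines of Java: each finding gets a score, and findings are prioritized by descending score. The finding names and ratings are hypothetical examples, and in practice the ratings are adapted per application as the slide says.

```java
import java.util.*;

// Sketch of score-based prioritization: Score = Risk * Impact, findings
// sorted by descending score. Names and ratings are hypothetical.
public class FindingPriority {

    record Finding(String name, int risk, int impact) {
        int score() { return risk * impact; }   // the slide's Score = Risk * Impact
    }

    public static void main(String[] args) {
        List<Finding> findings = new ArrayList<>(List.of(
                new Finding("SQL Injection", 4, 5),
                new Finding("Cross-Site Scripting", 3, 4),
                new Finding("Log Forging", 2, 1)));

        // Highest score first = remediation priority order.
        findings.sort(Comparator.comparingInt(Finding::score).reversed());

        for (Finding f : findings)
            System.out.println(f.name() + " score=" + f.score());
        // -> SQL Injection score=20
        // -> Cross-Site Scripting score=12
        // -> Log Forging score=2
    }
}
```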

25 Vulnerability assessment
Assess and secure all parts individually
The idea is to force an attacker to penetrate several defence layers
As a general rule, data stored in databases is considered "untrusted"
"In God we trust; for the rest, we test"

26 Vulnerability remediation priorities
Recommendations for remediation are provided in the report
Cover high-priority vulnerabilities first, then the others when "affordable"
Begin with risky vulnerabilities that are easy to remediate

27 Vulnerability type occurrence in the 1st iteration (%)

28 Improvements in Design and Coding stages
Vulnerability group (findings per iteration 1-6):
  Cross-Site Scripting: 43, 14, 2, 2, 1, 1
  Injection: 23, 6, 1, 1
  Insecure transmission of credentials/tokens: 10, 3
  Password management: 13, 6, 2
  Cookie security: 9, 7
  Path manipulation: 3, 2, 1, 1
  Weak authentication: 4, 2
  Open redirect: 5
  Logging of credentials: 2, 1
  Cross-Site Request Forgery: 16, 4, 1, 1
  Header manipulation: 15, 3, 1
  Weak cryptography: 14, 2, 1
  File upload: 8, 3, 1, 1
  Forced browsing: 7, 2, 1
  Log forging: 6, 1, 1, 1
  Information disclosure: 4, 3, 2
Security increases in every iteration
Flaws can appear in future iterations
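Cross-site scripting tops the per-iteration findings above, and its standard remediation is output encoding of untrusted data. A minimal, self-contained sketch of that fix (in practice a library such as the OWASP Java Encoder would be used rather than a hand-rolled method like this one):

```java
// Minimal HTML entity encoding for untrusted output, illustrating the
// standard XSS remediation. A hand-rolled sketch; real code should use a
// vetted library (e.g. the OWASP Java Encoder project).
public class HtmlEncode {

    static String encode(String s) {
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '<'  -> out.append("&lt;");
                case '>'  -> out.append("&gt;");
                case '&'  -> out.append("&amp;");
                case '"'  -> out.append("&quot;");
                case '\'' -> out.append("&#39;");
                default   -> out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String payload = "<script>alert(1)</script>";
        // The payload is rendered as inert text instead of executing:
        System.out.println(encode(payload));
        // -> &lt;script&gt;alert(1)&lt;/script&gt;
    }
}
```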

29 Threats to the VT success
Tested source code not the same as in PROD
Testing environment differs from PROD
Vulnerability testing tools cannot cover everything automatically
Hacking techniques evolve faster than service coverage
If necessary, penetration tests are conducted by a 3rd party
100% risk- and vulnerability-free cannot be guaranteed, and security is not only a matter of secure source code…

30 Some references
Open Web Application Security Project (OWASP)
Web Application Security Consortium (WASC)
Common Vulnerability Scoring System (CVSS)
Common Weakness Enumeration (CWE): http://cwe.mitre.org
Common Attack Pattern Enumeration and Classification (CAPEC)
SANS Institute

31 Questions?

32 Thank you!
