Fuzzing Cows: The No Bull Talk on Fuzzing Security


This presentation from B-Sides Ottawa 2010 discusses fuzzing in security: its history, objectives, and limitations. It aims to raise awareness of fuzzing as an option in assessments and product evaluations, shares challenges and real examples from the field, and encourages the audience to start fuzzing. The talk also covers the limits of the "Scan Monkey" approach (running nmap and Nessus without discrimination) and when fuzzing does and does not make sense.






Presentation Transcript


  1. Fuzzing Cows: The No Bull Talk on Fuzzing Security
  B-Sides Ottawa, November 13, 2010
  Mike Sues (Rigel Kent) | Karim Nathoo (Inverse Labs)

  2. Objectives
  - We can't cover fuzzing in-depth in 50 minutes
  - Raise awareness of fuzzing as an option in higher assurance/product evaluations/more focused assessments
  - Go over challenges/experiences from the field
  - Provide real examples
  - Get you thinking about how you can start fuzzing
  - Expose the scan monkey
  - Collect free chicken wings honorarium

  3. What's With The Title
  - An inside joke that went wrong
  - It is Mike's fault

  4. WTF is Fuzzing
  - Pass malicious input to interfaces
  - Interfaces to target are attacker-accessible ones (either direct or indirect)
  - Detect anomalous conditions that might be exploitable
  - Usually there is some form of automation
  - All the kewl people are doing it

  5. Fuzzing History
  - Manual & custom scripts
  - Unintelligent, e.g. cat /dev/random | service to 0wn
  - It worked!
  - A bit more intelligent
  - Modeling protocols
  - Block-based modeling
  - Frameworks

  6. Fuzzing History
  - Tool integration
  - Inline fuzzing
  - Fuzzing and root cause analysis
  - Process stalking
  - Fuzzing and code coverage
  - Commercialization
  - Fuzzing support
  - Reverse engineering of protocols and code

  7. Limits of the Scan Monkey
  The Scan Monkey uses nmap and Nessus without discrimination in a failed attempt at world domination.
  Good Stuff:
  - Tools determine presence of known vulnerabilities
  - Audit configurations
  - Verify patches
  - Highly automatable
  - You can get co-op students to do this
  - For some situations this is perfectly fine (low assurance environments, operational audits, time constrained, etc.)
  - Co-op students will work for Twizzlers
  Bad Stuff:
  - For new technologies, Scan Monkey tools don't have signatures
  - Aside from getting lucky on occasion, effectiveness is limited for product or new technology evaluation
  - It is boring, and contributions to the human condition are limited

  8. When to Fuzz
  - New product/technology
  - Old product, but a high level of assurance is required
  - Internal QA as part of the SDLC if you are a product vendor
  - If you are a bug hunter
  - If you don't really have a lot going on in your life

  9. When Not to Fuzz
  - If you actually have a life
  - When you're testing systems/products in production
  - THIS IS NOT A VULNERABILITY ASSESSMENT!!

  10. Different Types of Fuzzing
  Network:
  - Server perspective (example: fuzz web server)
  - Client perspective (example: fuzz web browser)
  - Protocol (example: fuzz IPv6 stack)
  Local:
  - File format
  - API
  - Driver

  11. Different Types of Fuzzing
  Wireless:
  - 802.11x
  - Bluetooth
  - IR
  - Zigbee
  - RFID

  12. Generating Payloads/Tests: Generation Based
  - Reverse engineer: protocol, API, field encoding (MIME/BER)
  - Manually: your brain and many test communications, Wireshark, strace
  - Time-intensive

  13. Generating Payloads/Tests: Generation Based
  - Semi-automatic protocol analysis
  - Proprietary and open protocols (open protocols still have grey areas)
  - Analyze or proxy network communications: Wireshark
  - Research & tools: Discoverer, PI (Protocol Informatics), PDB (Protocol Debugger)

  14. Generating Payloads/Tests: Generation Based
  - Modeling input to generate test cases in their entirety
  - Block-based modeling:
      s_string("USER ");
      s_string_variable("bob");
      s_string("\r\n");
      s_string("PASS ");
      s_string_variable("bob");
      s_string("\r\n");
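
  As a framework-free sketch of the same idea in Python: fixed blocks (like the SPIKE-style s_string() calls above) stay constant while each variable block is swapped in turn for a fuzz string. The FUZZ_STRINGS list and the ftp_login_cases() helper are illustrative, not part of the original talk or any particular framework.

      # Block-based generation sketch: constant blocks stay fixed, each
      # variable block is replaced in turn with a fuzz string.
      FUZZ_STRINGS = [b"A" * n for n in (16, 64, 256, 1024, 4096)] + [b"%n%n%n%n", b"\x00" * 32]

      def ftp_login_cases(user=b"bob", password=b"bob"):
          template = [
              (b"USER ", False), (user, True), (b"\r\n", False),
              (b"PASS ", False), (password, True), (b"\r\n", False),
          ]
          for i, (_, variable) in enumerate(template):
              if not variable:
                  continue
              for fuzz in FUZZ_STRINGS:
                  yield b"".join(fuzz if j == i else block
                                 for j, (block, _) in enumerate(template))

      for case in ftp_login_cases():
          pass  # hand each generated test case to the network send/monitor loop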

  15. Generating Payloads/Tests: Mutation Based
  - Use an existing valid payload and perturb it
  - Re-writing proxy: PDB (Protocol Debugger), Taof (The Art of Fuzzing)
  - Modify a stock client if you have source code (ex: OpenSSL)
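
  A minimal sketch of the mutation-based approach (plain Python; the sample payload is illustrative): take a payload captured from a real session and flip a handful of random bytes per test case.

      import random

      def mutate(payload, max_flips=8):
          # Perturb a captured, known-good payload by overwriting a few random bytes.
          data = bytearray(payload)
          for _ in range(random.randint(1, max_flips)):
              data[random.randrange(len(data))] = random.randrange(256)
          return bytes(data)

      valid = b"USER bob\r\n"                 # e.g. replayed from a Wireshark capture
      cases = [mutate(valid) for _ in range(1000)]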

  16. Target Observability and Traceability
  - Need to be able to observe anomalies as the target is being stressed
  - Not only detect an anomalous condition/state, but CORRELATE it to a test case
  - Absolutely key to effective fuzzing
  - If you do it wrong you will waste lots of time and FAIL

  17. Methods for Target Observability
  - Process monitoring (debugger) - usually the best way
  - Network heartbeats
  - Log files
  - Test case timing
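
  A minimal network-heartbeat sketch in Python, probing the target after every case so a dead service can be correlated to the case that killed it; send_case() is a hypothetical helper standing in for whatever delivers the test case.

      import socket

      def target_alive(host, port, timeout=3.0):
          # Heartbeat: can we still complete a TCP handshake with the target?
          try:
              with socket.create_connection((host, port), timeout=timeout):
                  return True
          except OSError:
              return False

      def run_cases(cases, host, port):
          for index, case in enumerate(cases):
              send_case(host, port, case)        # hypothetical delivery helper
              if not target_alive(host, port):
                  print("target down after case", index)  # correlate fault to test case
                  break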

  18. Beware the State Machine
  - If you don't set up protocols properly, all you do is fuzz the crap out of the error state
  - perl -e 'print "A" x 41' is not always enough
  - You may also just fuzz decoder code (MIME/BER encoded fields)
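
  A sketch of what setting up the protocol properly can look like for the FTP example used later in the talk (host, credentials, and the fuzzed CWD argument are placeholders): walk the handshake and authentication states first so the fuzz data reaches post-authentication parsing code rather than the error state.

      import socket

      def fuzz_authenticated_command(host, port, user, password, fuzz_value):
          # Walk the state machine (banner, USER, PASS) before delivering fuzz data.
          s = socket.create_connection((host, port), timeout=5)
          try:
              s.recv(1024)                                   # banner
              s.sendall(b"USER " + user + b"\r\n"); s.recv(1024)
              s.sendall(b"PASS " + password + b"\r\n"); s.recv(1024)
              s.sendall(b"CWD " + fuzz_value + b"\r\n")      # fuzz an authenticated command
              return s.recv(1024)
          finally:
              s.close()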

  19. Beware the State Machine
  (Diagram: protocol state machine - Handshake, Transport Crypto, Authentication, Payload Processing, Decoder, Error State - and the Fuzzing Ninja.)

  20. Fuzzing Work Flow (rough methodology)
  - Reverse/research target
  - Prioritize areas/inputs to stress (code coverage)
  - Model inputs
  - Create test cases
  - Automate
  - Analyze results
  - Root cause analysis
  - Determine exploitability
  - Develop proof of concept/full exploit
  - Iterate!

  21. Prioritizing
  - Fuzzing takes a long time; you might not be able to cover everything within engagement scope
  - Lots of ways to approach, lots of tradeoffs
  - Obscure versus common functionality (commercial development experience teaches that not everything is QA'd)
  - Level of access (ex: kernel mode versus user mode)
  - May be a trade-off in terms of level of access or probability of finding a bug versus affected user base (ex: bug in IE versus Safari)

  22. Prioritizing Cont'd
  Embedded RTOS as an example:
  - Servers - probably best vendor coverage
  - Setuid programs - privilege escalation
  - Regular user programs - limited privileges
  - Drivers - very target specific
  - System call API - might find a bug that is not attacker accessible

  23. Root Cause Analysis Challenges
  Difficulties:
  - Black box: all you have is raw crash data and assembly code
  - The bug could be triggered before it becomes apparent to the fault detection technique, for example:
    - a simple stack-based overflow triggered early in a function may not raise an exception until the function returns
    - heap overflow: the corrupted memory location might not be used until well after the function returns, making it even harder
  - The analyst needs knowledge of different vulnerability classes (stack overflows, heap overflows, integer overflows, format strings, etc.) to do a thorough RCA

  24. Network Fuzzing Challenges
  - Binary protocols
  - Checksums/verifiers, state machine challenges
  - Closed systems (appliances): limited debug support, target-side instrumentation difficult or impossible
  - Multi-threaded/multi-process servers
  - Test case throughput limited by the network
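
  One common workaround for the checksum problem, sketched under an assumed (hypothetical) packet layout of a length field, a payload, and a trailing CRC32: mutate the body, then recompute the checksum so the target does not reject the test case before the interesting parsing code runs.

      import struct, zlib

      def fix_checksum(packet):
          # Recompute the trailing CRC32 after mutation so mutated packets
          # survive the target's integrity check (packet layout is hypothetical).
          body = packet[:-4]
          return body + struct.pack(">I", zlib.crc32(body) & 0xFFFFFFFF)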

  25. Network Fuzzing Demo 1: The traditional FTP server example

  26. Network Fuzzing Demo 1 Summary
  - A state machine was needed to properly set up an authenticated session to find the vulnerability
  - Fault detection based on a network heartbeat works in this example
  - Correlating the test case to the exception avoids a search space nightmare
  - Needed to switch to the target-side debugger view to determine the exact target state and exploitability
  - The exception is an access violation and fits the standard pattern for stack-based buffer overflows
  - Demonstrated how some analysis is required to get to root cause and formulate an exploit (quick)
  - It's not always this easy :)

  27. Network Fuzzing Demo 2
  - Physical security system
  - Found in the field in a real assessment

  28. Network Fuzzing Demo 2 Summary
  - Target observability relying on a network heartbeat in this case would have resulted in missing the bug
  - Multiple threads: the server doesn't crash when one thread generates an exception
  - We need a debugger/deployed agent in this case
  - Root cause analysis: does not appear exploitable for remote code exec; unhandled C++ exception with no opportunity to overwrite the exception handler
  - We can DoS the crap out of the alarm system console and web server though :)
  - The amount of root cause analysis depends on the target; in this case an alarm DoS is as interesting as remote code execution

  29. File Format Fuzzing
  - Headers and internal structure
  - PE
  - Microsoft Office
  - PDF
  - Media files
  - Images
  - Anti-virus
  - File parsing

  30. File Format Fuzzing
  - Software reads and interprets these formats: the client or a supporting library (e.g. an image library)
  - Model the input structure and fields
  - Launch the client on the fuzzed input file
  - Look for a crash: process monitoring
  - Integration of launch and detection in one tool
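
  A bare-bones launch-and-detect loop for command-line targets, assuming the viewer exits when parsing fails (GUI clients usually need real process monitoring under a debugger instead); the paths are placeholders.

      import subprocess

      def crashed(viewer_path, fuzzed_file, timeout=10):
          # Launch the target on one fuzzed file; on POSIX a negative return code
          # means the process was killed by a signal (e.g. SIGSEGV), which we
          # treat as a crash worth keeping for root cause analysis.
          try:
              proc = subprocess.run([viewer_path, fuzzed_file], timeout=timeout,
                                    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
              return proc.returncode < 0
          except subprocess.TimeoutExpired:
              return False   # a hang, not a crash; may still be interesting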

  31. File Format Fuzzing Issues
  - File formats are complex and many interesting ones are closed source
  - Formats can be embedded (down the rabbit hole)
  - Many test cases: fuzz till the cows come home
  - File formats can change radically between software versions

  32. File Format Fuzzing Tools
  - FileFuzzer
  - FuzzyWuzzy
  - SPIKEfile
  - notSPIKEfile
  - Distributed fuzzing

  33. Client-side Fuzzing
  Why do we like clients?
  - They pay my bills
  - They are fun to work with
  - They have interesting work
  - Exploiting them gets me right onto an internal workstation
  - Mike is happy

  34. Client-side Fuzzing
  Coordinated approach: fuzzing server and test client
  - Fuzzing model resides on the server
  - Client connects
  - Server delivers fuzzed input
  - Client goes boom
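
  A toy version of that coordinated setup (plain Python; the port, the trivial payload model, and the itertools counter are illustrative): a TCP server hands each connecting test client the next fuzzed response and keeps the test-case state on the server side.

      import itertools, socketserver

      CASES = itertools.count()                     # stand-in for a real test-case generator

      class FuzzHandler(socketserver.BaseRequestHandler):
          def handle(self):
              # Each client connection gets the next fuzzed response; the server
              # owns the test-case state, so runs can be correlated and resumed.
              case = next(CASES)
              body = b"A" * (case * 100)            # trivially oversized payload model
              self.request.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: " +
                                   str(len(body)).encode() + b"\r\n\r\n" + body)
              print("served case", case)

      if __name__ == "__main__":
          socketserver.TCPServer(("0.0.0.0", 8080), FuzzHandler).serve_forever()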

  35. Client-side Fuzzing Issues
  - Server maintains state of fuzzing cases
  - Distributed fuzzing considerations: maintaining state across clients
  - Client must be activated and pointed to the fuzzing server
  - Detection of client crash: process monitoring on the client machine
  - Client or support library?

  36. Client-side Fuzzing Issues
  - Complex client inputs: client inputs, support library inputs
  - Many test cases: distributed fuzzing!

  37. Client-side Fuzzing Tools
  - Peach
  - Sulley
  - Codenomicon
  - COM and ActiveX fuzzers

  38. Driver Fuzzing
  - Diving into Ring0
  - Different approaches: remote protocol fuzzing (e.g. stack fuzzing), local API fuzzing

  39. Driver Fuzzing: Local API Fuzzing
  - User mode -> kernel mode
  - Privilege escalation: important for multi-stage attacks
  - Application specific: user-land components, driver components

  40. Driver Fuzzing Issues
  - Identify the interface and inputs: device name/link, IOCTLs, header files, reversing user-land components
  - Identifying a crash: blue screens in Windows slow down testing
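
  For the local-API case, a rough Windows-only ctypes sketch of throwing random buffers at a single IOCTL; the device link and IOCTL code below are hypothetical placeholders that would come from reversing or driverlib in practice, and real runs belong on a disposable VM with a kernel debugger attached.

      import ctypes, random

      OPEN_EXISTING = 3
      ACCESS = ctypes.c_uint32(0xC0000000)          # GENERIC_READ | GENERIC_WRITE

      def fuzz_ioctl(device=r"\\.\HypotheticalDevice", ioctl_code=0x222003, rounds=10000):
          # Open the driver's symbolic link, then send randomly sized/valued
          # input buffers to one IOCTL code via DeviceIoControl.
          k32 = ctypes.windll.kernel32
          handle = k32.CreateFileW(device, ACCESS, 0, None, OPEN_EXISTING, 0, None)
          if handle == -1:
              raise OSError("could not open device")
          out_buf = ctypes.create_string_buffer(0x1000)
          returned = ctypes.c_ulong(0)
          for _ in range(rounds):
              data = bytes(random.randrange(256) for _ in range(random.randrange(1, 0x400)))
              k32.DeviceIoControl(handle, ioctl_code, data, len(data),
                                  out_buf, len(out_buf), ctypes.byref(returned), None)
          k32.CloseHandle(handle)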

  41. Driver Fuzzing Tools
  - Immunity Debugger
  - driverlib: discover driver names/links
  - pyCommand script: proxy IOCTL calls, mutation-based fuzzer
  - Direct fuzzing: generation-based fuzzer
  - Kartoffel

  42. Developing Exploits
  - You don't go from crash -> 0day in a few minutes
  - Generating crashes is easy; analysis is the hard part
  Difficulties:
  - It's not 2001 anymore
  - Memory corruption mitigations in modern OSes: DEP, ASLR, EMET
  - 3rd-party support libraries
  - Specific setup conditions
  - The analyst often needs expert knowledge

  43. Developing Exploits
  - Goal of engagement: exploit development might not be in scope
  - Working with developers/vendor
  - Clients might not want to fund you to develop an exploit
  - Customers paying for gaps in vendor development practices? Smells like a buck is being passed

  44. The Evolution of Cows
  - Driver fuzzing tools/techniques continuing to improve and becoming more accessible
  - Continued integration of fuzzers and RCA tools
  - File format fuzzing continuing to increase, and a blurring of file-format and client-side fuzzing
  - More device fuzzing (e.g. smart device stuff)
  - Better automated tools for developing our models
  - Distributed fuzzing frameworks and tools

  45. Fuzzing Cows Questions?

  46. Moo
  Mike Sues: msues@rigelksecurity.com | www.rigelksecurity.com
  Karim Nathoo: knathoo@inverselabs.com | www.inverselabs.com
