2010 Data Breach Investigations Report

From Cybersecurity Wiki
==Synopsis==


In partnership with the U.S. Secret Service (USSS), Verizon analyzed first-hand evidence collected during breach investigations. 
 
Highlights of the study include:
 
* Driven largely by organized groups, the majority of breaches and almost all of the data stolen (98%) in 2009 were still the work of criminals outside the victim organization. Insiders, however, were more common in cases worked by the USSS, which boosted this figure in the joint dataset considerably. Breaches linked to business partners continued the decline observed in our last report and reached the lowest level since 2004.
 
* Related to the larger proportion of insiders, Misuse sits atop the list of threat actions leading to breaches in 2009. That’s not to say that Hacking and Malware have gone the way of the dinosaurs; they ranked #2 and #3 and were responsible for over 95% of all data compromised. Weak or stolen credentials, SQL injection, and data-capturing, customized malware continue to plague organizations trying to protect information assets. Cases involving the use of social tactics more than doubled, and physical attacks like theft, tampering, and surveillance ticked up several notches.
 
* As in previous years, nearly all data were breached from servers and applications. This remains a defining difference between data-at-risk incidents and those involving actual compromise. The proportion of breaches stemming from highly sophisticated attacks remained rather low, yet these attacks once again accounted for roughly nine out of ten records lost. In keeping with this finding, we assessed that most breaches could have been avoided without difficult or expensive controls. Yes, hindsight is 20/20, but the lesson holds true: the criminals are not hopelessly ahead in this game. The more we know, the better we can prepare. Speaking of being prepared, organizations remain sluggish in detecting and responding to incidents. Most breaches are discovered by external parties, and only then after a considerable amount of time.
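The SQL injection problem named above has a standard, inexpensive defense: parameterized queries, which keep untrusted input from being interpreted as SQL. A minimal sketch in Python's built-in sqlite3 module (the table, values, and attacker string are hypothetical illustrations, not data from the report):

```python
import sqlite3

# Hypothetical users table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query.
unsafe = "SELECT name FROM users WHERE password = '%s'" % attacker_input
print(conn.execute(unsafe).fetchall())  # returns every row

# Safe: a parameterized query treats the input as data, not as SQL.
safe = "SELECT name FROM users WHERE password = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())  # returns nothing
```

The same placeholder pattern applies in essentially every database driver, which is why the report can class this mitigation as neither difficult nor expensive.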
 
Suggestions to Mitigate the Risk of Attacks:
 
* Eliminate unnecessary data; keep tabs on what’s left
* Ensure essential controls are met
* Check the above again
* Test and review web applications
* Audit user accounts and monitor privileged activity
* Filter outbound traffic
* Monitor and mine event logs
 
While we’ve added some new suggestions to the Conclusions and Recommendations section of this report, what you see above is similar to the message we’ve been preaching from the beginning. That’s not because we don’t feel like writing another sermon; it’s simply that, based on the data before us, all the points in this one still apply.
 
This study always reminds us that our profession has the necessary tools to get the job done. The challenge lies in selecting the right tools for the job at hand and then not letting them get dull and rusty over time. Evidence shows that when that happens, our adversaries are quick to take advantage of it.
 
The number of breaches that exploit authentication in some manner is a problem. In our last report it was default credentials; this year it’s stolen and/or weak credentials. Perhaps this is because attackers know most users are over-privileged. Perhaps it’s because they know we don’t monitor user activity very well. Perhaps it’s simply the easiest way in the door. Whatever the reason, we have some work to do here. It doesn’t matter how hardened our defenses are if we can’t distinguish the good guys from the bad guys.
 
Malware gets increasingly difficult to detect and prevent (especially once the attacker owns the system). Therefore, protect against the damage malware does after infection, much of which can be mitigated if outbound traffic is restricted. Finally, the value of monitoring (perhaps we should say “mining”) logs cannot be overstated. The signs are there; we just need to get better at recognizing them.
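The log-mining point can be made concrete with a small sketch: count failed logins per source address and flag bursts. The log excerpt, field layout, and threshold below are illustrative assumptions, not data from the report; a real deployment would read something like /var/log/auth.log continuously.

```python
import re
from collections import Counter

# Hypothetical auth-log excerpt; the field layout is an assumption.
log_lines = [
    "Jan 10 03:12:01 host sshd[411]: Failed password for root from 203.0.113.9",
    "Jan 10 03:12:03 host sshd[411]: Failed password for admin from 203.0.113.9",
    "Jan 10 03:12:05 host sshd[411]: Failed password for root from 203.0.113.9",
    "Jan 10 09:30:44 host sshd[902]: Accepted password for alice from 198.51.100.7",
]

FAILED = re.compile(r"Failed password for \S+ from (\S+)")

def suspicious_sources(lines, threshold=3):
    """Return source addresses with at least `threshold` failed logins."""
    counts = Counter(m.group(1) for line in lines if (m := FAILED.search(line)))
    return {ip for ip, n in counts.items() if n >= threshold}

print(suspicious_sources(log_lines))  # {'203.0.113.9'}
```

Even a crude counter like this surfaces the repeated-failure pattern behind stolen-credential attacks; the report's complaint is not that such signals are absent from logs, but that nobody is looking at them.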
 
The report includes two appendices provided by the Secret Service. One delves into online criminal communities and the other focuses on prosecuting cybercrime, using Albert Gonzalez as a case study.


==Additional Notes and Highlights==

Revision as of 10:00, 19 August 2010

Full Title of Reference

2010 Data Breach Investigations Report

Full Citation

Verizon, 2010 Data Breach Investigations Report (2010). Online paper. Web.

BibTeX

Categorization

Key Words

Computer Network Attack, COTS Software, Cyber Warfare, Department of Homeland Security, Denial of Service Attacks, Hackers, Intelligence Community, Malicious Code (Malware), National Security, Risk Modeling, Sponsored Attacks, State Affiliation, Worm, Zero-Day Exploit

Expertise required: None

Link to Verizon's Security Blog