In computer science, we're often taught that security is only as strong as the weakest link in the chain. This weakest link principle is true, but looking for that weakest link is not always the best way to harden a system.
Microsoft's analysis of how a China-based actor (Storm-0558) breached the email accounts of senior U.S. officials earlier this year is an interesting case of cumulative mistakes, where a series of limited issues adds up to catastrophic damage. Even though the analysis is not fully confirmed and some details are missing, it's valuable to have a high-quality post-mortem of a real-world attack exploiting multiple weaknesses:
- an unstable software component crashing
- a race condition causing sensitive data to be included in the crash dump
- that crash dump being moved to the wider corporate network (the debugging environment) after a failure to identify its sensitivity
- the compromise of the corporate account of an engineer with access to the debugging environment
- an authorization bug allowing a consumer key to be used to access "enterprise" email, apparently as a result of unclear APIs
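The last item, the authorization bug, is worth dwelling on. The general bug class is easy to reproduce: a validator that checks a token's signature against its key store but never checks that the signing key's *scope* matches the resource being requested. The sketch below is a deliberately simplified illustration of that class, not Microsoft's actual validation code; the key store, key IDs, and function names are all hypothetical.

```python
import hmac
import hashlib

# Hypothetical key store mixing consumer and enterprise signing keys.
KEYS = {
    "consumer-key-1": {"secret": b"consumer-secret", "scope": "consumer"},
    "enterprise-key-1": {"secret": b"enterprise-secret", "scope": "enterprise"},
}


def sign(key_id: str, payload: bytes) -> str:
    """Sign a token payload with the named key."""
    secret = KEYS[key_id]["secret"]
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()


def validate_buggy(key_id: str, payload: bytes, signature: str, requested_scope: str) -> bool:
    # BUG: only the signature is verified; the key's scope is ignored,
    # so a token signed with a consumer key is accepted for enterprise email.
    expected = sign(key_id, payload)
    return hmac.compare_digest(expected, signature)


def validate_fixed(key_id: str, payload: bytes, signature: str, requested_scope: str) -> bool:
    # FIX: also require the signing key's scope to match the request.
    if KEYS[key_id]["scope"] != requested_scope:
        return False
    return validate_buggy(key_id, payload, signature, requested_scope)


token = b"user=someone@example.gov"
sig = sign("consumer-key-1", token)
print(validate_buggy("consumer-key-1", token, sig, "enterprise"))  # True: wrongly accepted
print(validate_fixed("consumer-key-1", token, sig, "enterprise"))  # False: rejected
```

Note how the buggy and fixed versions differ by two lines. If the key store's API doesn't clearly expose (or document) the scope attached to each key, it's easy to see how a caller could write the first version in good faith.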
As a senior developer who has served numerous organizations on various projects, I find it easy to relate to most of these weaknesses. And yet it's also easy to imagine how reports of most of these issues could have been brushed off by management as unlikely or alarmist, failing to see the risk of cumulative negligence.
Security is about strengthening each link, but it's also about keeping security in mind at all times.