About 10 years ago I completed an analysis of how IT security budgets were being spent. What was interesting was that over 80% of IT security spend was operational cost. This helps explain the attractiveness of an integrated strategy over a best-of-breed one.
Can Big Data Help?
Today, many recognize that operational efficiency is one of the keys to success, and they are leveraging big data solutions to help achieve it. In doing so, however, is there a danger that we overlook the basics that drive operational efficiency?
The remedy du jour is big data solutions that crunch more data faster. Already we typically see 37% of organizations getting 10,000+ events per month. But this approach has an inherent problem: as the number of IPs managed continues to multiply, it's unclear that this approach can scale. The more data we feed in, the more time and human skill it requires to decipher the output.
Historically, attacks were simple, singular binary objects and so were easy to detect; the information shared about them was direct, actionable data. Today's attacks are typically multi-flow and multi-vector, meaning we have to correlate disparate incident event data to discover the big picture of an advanced attack. Simple binary information, such as a signature, is insufficient. Organizations try to piece the modern threat puzzle together by looking for any suspicious behavior, which often means many false positives for every one real incident alert. As companies try to patch together incident data to find advanced attacks, you can see why this is a complex and time-consuming approach that does not scale. It's not surprising that 52% of alerts are false positives, and for many (40% of companies) a manual judgment call is the only way to have confidence.
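To illustrate why correlation matters for multi-stage attacks, here is a minimal sketch of grouping disparate event records into candidate incidents by a shared indicator. The field names, event types, and the "more than one stage" threshold are all hypothetical, not a description of any real product's logic:

```python
from collections import defaultdict

# Hypothetical event records; real SIEM data carries far more fields.
events = [
    {"id": 1, "src_ip": "10.0.0.5", "type": "exploit"},
    {"id": 2, "src_ip": "10.0.0.5", "type": "callback"},
    {"id": 3, "src_ip": "10.0.0.9", "type": "scan"},
    {"id": 4, "src_ip": "10.0.0.5", "type": "exfiltration"},
]

def correlate(events, key="src_ip"):
    """Group events sharing an indicator into candidate incidents."""
    incidents = defaultdict(list)
    for event in events:
        incidents[event[key]].append(event)
    # Only multi-event groups suggest a multi-stage attack flow;
    # singletons are left for ordinary alert handling.
    return {k: v for k, v in incidents.items() if len(v) > 1}

print(correlate(events))
```

Even this toy version shows the scaling problem: every new data source adds more keys and more events to join, which is why a purely signature-driven view cannot reconstruct the attack anatomy.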
FireEye's MVX engine, which drives our NX product line, takes a different approach. It gathers and connects all of the components of the attack anatomy, from the initial breach and exploit through to the callback and exfiltration. This ability, akin to how humans use multiple senses to improve cognition, gives MVX a much richer analysis and allows for far greater accuracy of detection.
So what can you do?
Looking forward, companies want to reduce their operational overhead in order to cut costs and free up resources to focus more on response. Most companies have limited experience executing this shift. One key operational hangup will be forensic analysis, which carries a large cost driven by petabytes of data. That is time and expense we can ill afford.
How do we reduce the time and cost of forensics? We need to provide as much fidelity around the incident as possible, reducing the scope of data an incident responder has to forensically examine. Consider two main activities:
- Triage - Qualifying the intentions of the attack
If you can add context around the "who" and the "why" of an incident, you can typically qualify quickly whether you actually need to do further forensic analysis. Traditional intelligence only provides technical details of how the attack works and how to remove it. Lacking this contextual fidelity, businesses are either blind or have to leverage expensive dedicated services that provide ad hoc human analysis.
Key questions to consider: how many incidents did you investigate that ended up having no impact, and, worse, were there any you should have investigated earlier? Security is increasingly going to be about our ability to triage through the volume of emergencies, much as hospitals triage at their accident and emergency departments.
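The triage idea above can be sketched as a simple prioritization queue: score each alert by the "who" and "why" context attached to it, then work the highest-scoring incidents first. The field names and weights below are entirely hypothetical, chosen only to make the principle concrete:

```python
def triage_score(alert):
    """Score an alert by attached context; higher means investigate sooner."""
    score = 0
    if alert.get("actor_known"):            # "who": attributed threat actor
        score += 50
    if alert.get("targets_crown_jewels"):   # "why": aimed at critical assets
        score += 40
    if alert.get("confirmed_callback"):     # corroborating post-breach behavior
        score += 30
    return score

alerts = [
    {"id": "A", "actor_known": True, "targets_crown_jewels": True},
    {"id": "B", "confirmed_callback": True},
    {"id": "C"},  # no context: likely one of the many false positives
]

queue = sorted(alerts, key=triage_score, reverse=True)
print([a["id"] for a in queue])  # ['A', 'B', 'C']
```

An alert with no qualifying context, like "C" here, is exactly the kind that today consumes a manual judgment call; context lets it fall to the bottom of the queue automatically.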
- Reducing the forensics costs
Forensics is time consuming and typically very expensive. Most commonly it is used to ascertain the human element: the attacker leverages their access through compromised systems for a purpose. The scope of their actions is broad; just one example would be adding genuine accounts so they can retain access long after the malware has been discovered and removed.
Most security solutions are designed to focus on detecting and blocking; they look at the code that is injected into your systems. Very few try to capture and analyze how this injected code interacted with your systems post breach. That is typically the domain of forensics tools, and to date these have not been part of the same security ecosystem. If we can link these processes, we can dramatically reduce the scale and scope of data that forensics experts need to investigate, because we give them clear timelines and targeted data streams to analyze.
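As a rough illustration of that linkage, a detection alert can anchor a targeted timeline: rather than trawling petabytes, the responder examines only host activity within a window around the alert. This is a hypothetical sketch, with invented data shapes and an arbitrary 30-minute window, not any product's actual mechanism:

```python
from datetime import datetime, timedelta

def targeted_timeline(alert_time, host_events, window_minutes=30):
    """Return host activity within a window around a detection alert,
    sorted chronologically, to bound the forensic scope."""
    lo = alert_time - timedelta(minutes=window_minutes)
    hi = alert_time + timedelta(minutes=window_minutes)
    return sorted(
        (e for e in host_events if lo <= e["ts"] <= hi),
        key=lambda e: e["ts"],
    )

alert_time = datetime(2014, 6, 1, 12, 0)
host_events = [
    {"ts": datetime(2014, 6, 1, 11, 50), "action": "process_start"},
    {"ts": datetime(2014, 6, 1, 12, 5), "action": "account_created"},
    {"ts": datetime(2014, 6, 1, 18, 0), "action": "logon"},  # outside window
]

timeline = targeted_timeline(alert_time, host_events)
print([e["action"] for e in timeline])  # ['process_start', 'account_created']
```

Note how the suspicious account creation, the human element discussed above, surfaces immediately because detection and post-breach activity share one timeline instead of living in separate tools.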
For many years the debate of best-of-breed versus consolidated suite has raged on, as the cost of implementation has been just as important as the capability itself. Typically a suite is a collection of products that cohabit effectively and share a reporting platform. The discussion should, however, be about how we streamline and link operational workflows and how we gain the most effective usage from our skilled staff. Automation of processes is about reducing the time and cost to resolution and leveraging technology to free up human skills for where they are most needed.
Even as we focus on streamlining our detection processes to reduce both time and human input while the volume of events continues to grow, we must start to consider what's next:
- How do I link together the discovery and response processes?
- What's key is that as we evaluate technology we must look not only at its capability but also at its usability, if we are to be effective both in securing the business and in controlling the cost of doing so.