In an article at SYSCON Media, Gorka Sadowski writes about SIEM technologies and specifically about the complexity of event correlation.
Why Rule-Based Log Correlation Is Almost a Good Idea: The Future of SIEM
He points out that there are challenges with static rule-based correlation, but he calls it "the engine for the first generation of [SIEM]". That sounds about right. What scares me is that the future solutions Sadowski alludes to look even more complicated, so there may be a trade-off to get the perceived increase in value.
I have an alternative solution that simplifies things for the SIEM. Over the past few years at NetVision, we've had a number of organizations interested in the NVMonitor solution (now called StealthINTERCEPT) because of its advanced filtering and from-the-source event collection. It doesn't rely on logs, and it can filter events as they happen, eliminating the need for after-the-fact correlation.
For example, when looking at Active Directory security group events, you can return only changes to high-risk groups, or changes to business-line groups that are not made by a specified subset of users (even if the person making the change is a domain administrator). These events are pre-filtered and sent to the SIEM only when appropriate. It can also block events, by the way, and send the event to the SIEM as an "attempt" rather than an actual event. And of course, it has its own alerting and response mechanisms built in for real-time, contextual response.
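To make the idea concrete, here's a minimal sketch of that kind of at-the-source filtering rule. The group names, user names, and event structure are all hypothetical placeholders, not the product's actual configuration or API:

```python
# Hypothetical pre-filtering rule for AD group-change events.
# Group names, actors, and the event shape are illustrative only.

HIGH_RISK_GROUPS = {"Domain Admins", "Enterprise Admins"}
BUSINESS_LINE_GROUPS = {"Finance-Approvers", "HR-Payroll"}
# Users permitted to modify business-line groups (hypothetical).
AUTHORIZED_CHANGERS = {"svc_idm", "jsmith"}

def should_forward(event):
    """Return True if a group-change event should be sent to the SIEM."""
    group = event["group"]
    actor = event["actor"]
    # Always forward changes to high-risk groups.
    if group in HIGH_RISK_GROUPS:
        return True
    # Forward business-line group changes only when made by someone
    # outside the authorized subset (even if that person is an admin).
    if group in BUSINESS_LINE_GROUPS and actor not in AUTHORIZED_CHANGERS:
        return True
    return False

events = [
    {"group": "Domain Admins", "actor": "jsmith"},
    {"group": "Finance-Approvers", "actor": "attacker01"},
    {"group": "Finance-Approvers", "actor": "svc_idm"},
]
forwarded = [e for e in events if should_forward(e)]
# Only the first two events reach the SIEM; the authorized change is dropped.
```

The point isn't the code itself; it's that the rule is evaluated at the source, per event, so the SIEM never has to reconstruct this logic from raw log archives.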
Improved data collection on key source systems may be a better alternative to mathematical modeling against the event archive. Perhaps not in every case, but on core security infrastructure like Active Directory, where the rules are definable and today's challenge lies in implementing them, it's not only better; it's here today and already proven in production environments.