Recap: Geeking Out II with Marcus
Ron and I spent most of the webcast circling the theme of detection algorithms: how do you determine what is normal and what is not? We started off with one of my favorite questions: "Are there really only two algorithms — some sort of statistics, or matching?"
I think that, by the time we were done, the two-algorithm framing had held up. We also dug into some of the issues in designing large-scale log analysis systems: how to tier the architecture, where to do filtering at the edges of the network, and where to maintain copies of the actual logs themselves.
On the algorithms side, we discussed some of the techniques for filtering knowns from unknowns, including the advantages of building fully specified parse trees as a way of identifying variations from the norm. We kept coming back to what I think is Ron's and my shared basic model: "Know what is good, and subtract that from what you've got. Then everything else is suspicious." Of course, that's not exactly easy, but nobody ever promised it would be. Log analysis remains a hard problem.
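The "subtract the known-good" model can be sketched in a few lines. This is only an illustration, not anything we showed on the webcast: the whitelist patterns and log lines below are invented, and a real deployment would need a far more complete specification of "good."

```python
import re

# Hypothetical "known good" patterns -- invented for illustration.
# In practice these would be fully specified, not loose regexes.
KNOWN_GOOD = [
    re.compile(r"^\w{3} +\d+ [\d:]+ \S+ sshd\[\d+\]: Accepted publickey for \w+"),
    re.compile(r"^\w{3} +\d+ [\d:]+ \S+ cron\[\d+\]: \(\w+\) CMD \(.*\)$"),
]

def suspicious(lines):
    """Subtract the known-good; everything left over deserves a look."""
    return [ln for ln in lines if not any(p.match(ln) for p in KNOWN_GOOD)]

logs = [
    "Apr  3 10:15:01 host1 cron[412]: (root) CMD (run-parts /etc/cron.hourly)",
    "Apr  3 10:15:07 host1 sshd[9921]: Accepted publickey for alice",
    "Apr  3 10:15:09 host1 sshd[9922]: Failed password for invalid user admin",
]
print(suspicious(logs))  # only the failed-password line survives the subtraction
```

The hard part, as we said, is the whitelist itself: enumerating "good" completely enough that the leftovers are genuinely interesting rather than just noisy.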
Up next is a discussion on malware response programs on June 4th. More details will be available soon.
mjr.