

Recap: Geeking Out II with Marcus

Ron and I spent most of the webcast circling the theme of detection algorithms: how do you determine what is normal and what is not? We started off with one of my favorite questions: "Are there only two algorithms, statistics (of some sort) or matching?"
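To make that distinction concrete, here is a rough sketch of what each family looks like in practice. This is my own illustration, not something we showed on the webcast; the pattern and the threshold below are invented for the example:

```python
import re
import statistics

# "Matching": flag any log line that hits a known pattern.
# The failed-login regex is a made-up example signature.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+)")

def matching_detector(line: str) -> bool:
    """Signature-style detection: a simple pattern match."""
    return bool(FAILED_LOGIN.search(line))

# "Statistics": flag an hourly event count that deviates sharply
# from its historical baseline (a crude z-score threshold).
def statistical_detector(history: list[int], current: int, z: float = 3.0) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid dividing by zero
    return (current - mean) / stdev > z
```

Real systems layer these together, but the split between "match a pattern" and "measure a deviation" is the one we kept returning to.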

I think that, by the time we were done, both approaches had withstood the argument. We also dug into some of the issues in designing large-scale log analysis systems: how to tier the architecture, where to do your filtering (at the edges of the network), and where to maintain copies of the actual logs themselves.

On the algorithms side, we discussed some of the techniques for filtering knowns from unknowns, including the advantages of building fully specified parse trees as a way of identifying variations from the norm. We kept coming back to what I think of as Ron's and my basic model: "Know what is good, and subtract that from what you've got. Then everything else is suspicious." Of course, we all know that's not exactly easy, but nobody ever promised it would be. Log analysis remains a hard problem.
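Here's a rough sketch of that subtract-the-known-good model, in the same spirit. The allowlist patterns are hypothetical placeholders; in practice they would come from a fully specified parse of your own logs:

```python
import re
from typing import Iterable, Iterator

# Known-good patterns: fully specified expressions for the log lines
# we expect to see. These three are invented examples.
KNOWN_GOOD = [
    re.compile(r"^\S+ sshd\[\d+\]: Accepted publickey for \w+ from [\d.]+ port \d+"),
    re.compile(r"^\S+ cron\[\d+\]: \(\w+\) CMD \(.+\)$"),
    re.compile(r"^\S+ systemd\[\d+\]: (Started|Stopped) .+$"),
]

def subtract_known_good(lines: Iterable[str]) -> Iterator[str]:
    """Yield only the lines that match no known-good pattern."""
    for line in lines:
        if not any(pattern.search(line) for pattern in KNOWN_GOOD):
            yield line

# Usage: run a log stream through the filter and review whatever is left.
# with open("/var/log/syslog") as logfile:
#     suspicious = list(subtract_known_good(logfile))
```

The hard part, of course, is the "know what is good" step: building and maintaining a pattern set complete enough that the leftovers are worth a human's attention.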

Up next, on June 4th, is a discussion of malware response programs. More details will be available soon.

mjr.
