CHICAGO -- Network security practitioners need to base their technology and policy decisions less on what attacks are possible and more on which are probable, according to the chief scientist for Resonance Networks.
"Most decisions are based on folklore, anecdotes and inappropriate theoretical models," said Eric Rescorla during the keynote address Wednesday at the Information Security Decisions conference in Chicago. The way we size up threats and defenses needs to change, he argued, to what he calls "evidence-based network security."
Traditionally, IT security departments base safeguards on all potential risks to a network, but Rescorla says that approach is impractical. Instead, he believes businesses should devote the bulk of their resources to combating the most likely attacks, based on evidence of which attacks actually occur.
Rescorla cited the example of an SSL certificate flaw in which Internet Explorer's verification didn't work properly, creating a potentially serious situation. But since its publication in 2002, no attacks using this mechanism have been reported.
Rather than give in to fear and doubt, Rescorla advises a more rational approach to patch management and to technology and services decisions. "What it means is an approach that depends on measurement and experiment to determine which attacks are actually threats," he said.
Based on the latest CSI/FBI survey, viruses and denial-of-service attacks remain the most costly threats to a company. Viruses cost surveyed companies more than $50 million last year, while DoS attacks accounted for almost $30 million. Conversely, Web site defacements, system penetrations and industrial sabotage barely made a financial dent. "You want to worry about the attacks that are expensive and common and not the ones that are inexpensive and uncommon," he said.
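The prioritization Rescorla describes amounts to ranking threats by observed loss rather than by theoretical severity. A minimal sketch of that idea follows; only the virus and DoS totals come from the survey figures quoted above, and the figures for the smaller categories are hypothetical placeholders:

```python
# Illustrative sketch of evidence-based threat prioritization:
# rank attack categories by reported annual loss, not by what is
# theoretically possible. Virus and DoS totals are the survey figures
# quoted in the article; the remaining values are made-up placeholders.

threats = {
    "viruses": 50.0,             # reported loss, $M (from survey)
    "denial of service": 30.0,   # reported loss, $M (from survey)
    "system penetration": 0.8,   # placeholder value
    "web site defacement": 0.5,  # placeholder value
    "industrial sabotage": 0.3,  # placeholder value
}

# Spend defensive effort in proportion to observed loss: the most
# expensive, most common attacks come first.
ranked = sorted(threats.items(), key=lambda kv: kv[1], reverse=True)
for name, loss in ranked:
    print(f"{name:22s} ${loss:>5.1f}M")
```

Under this ranking, viruses and DoS dominate the list, mirroring the article's point that cheap, uncommon attacks like defacements should sit at the bottom of the priority queue.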
The real threat environment, he contends, has not changed much over the years despite advances in secure software development. That's partially because most companies still use older, flawed applications and hardware on their systems. And, he said, "Patch uptake is still slow, so lots of vulnerable machines remain."
He also cautioned against placing a lot of faith in automated patching, noting that by Microsoft's April 15 deadline, only 40% of users had installed Windows XP Service Pack 2, an update primarily designed to better protect the operating system.
He concluded: "In principle, auto patching is great, but in practice we can't get people to do it."
This story originally appeared on SearchSecurity.com.