
Closing code won't save you

Security expert James Turnbull explains why equating safety with hiding your source code is a fallacy.

How can exposing source code be good for security? Doesn't it follow that if there's less information available for attackers, then apps and systems should be safer?

What you are describing is known in the security world as 'security through obscurity'. The principle is one that many government and intelligence organizations operate by, the NSA being a prime example of that mindset: if no one knows about your potential vulnerabilities or weaknesses, then you are safer.

What concerns me most about 'security through obscurity' is that very assumption -- that no one knows about your vulnerabilities or weaknesses. I think this assumption is a fallacy. First, reverse engineering code has become easier and easier, with automated tools available that do the work for you. Second, a large percentage of attacks and compromises are launched internally by employees, who are often privy to large amounts of information about an application or system. Closing the source code does not protect you from these internal perpetrators.
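As a toy illustration of how little closed code actually hides, Python's standard `dis` module recovers a readable instruction listing from compiled bytecode alone -- no source required. Native binaries yield comparable results under disassemblers such as objdump or IDA. The `check_password` function below is purely hypothetical, invented here for demonstration:

```python
import dis

def check_password(attempt):
    # A naive "secret" check the attacker was never meant to see.
    return attempt == "hunter2"

# Disassembling the compiled bytecode exposes the hard-coded
# secret as a constant, even without access to the source file.
for instr in dis.Bytecode(check_password):
    print(instr.opname, instr.argrepr)
```

The listing includes a `LOAD_CONST` instruction carrying the 'hunter2' string, which is exactly the kind of detail obscurity is supposed to protect.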

Finally, crackers have proven more than adequate at compromising closed operating systems and applications (Microsoft Windows, for example). Therefore, I don't think obfuscating source code provides any greater security.

This was first published in August 2006
