Security's Everyman

Monday, April 30, 2007

Bruce Almighty

The talk of the blogosphere and IT security news sites lately has been the comments Bruce Schneier made at Infosecurity Europe 2007.

Most of the talk has been people expressing their disbelief that he would make such a comment. They are saying things like "computers and the code that runs them are designed and developed by humans, and therefore they will contain errors, flaws, and mistakes. So how could he expect them not to be insecure?" Some are upset, and they actually seem afraid that his comments will signal the demise of the security profession, much as an Alan Greenspan remark can send the stock market up or down.

My first take is that of course it's an absurd comment. There is no way that systems and the code that runs them can be completely secure. If we had started with a security mindset in the early days of computing, we would have a much more secure environment now, but there would still be a need for security professionals because of the human factor. People make mistakes. Designers, developers, testers, implementers, and users all make mistakes that make security professionals necessary.

My second take on his comment is that he is partly right. There is no real excuse for systems and code being released that are insecure out of the box. We have known about these issues for years, but vendors have chosen to ignore them so they can get products to market faster. They would rather ship faulty products and fix them later because it gets money in their pockets faster. Then they look like heroes when they patch something quickly. This is what is absurd. How would we feel if other industries did this? Imagine buying a car and having it patched regularly because the manufacturer didn't check things like whether the hood latch keeps your hood in place. How about buying a house built with half the nails because the builder wasn't sure it needed all the recommended nails? Or a gas stove with a newly designed gas regulator that was rushed to market? Finding that bug could be deadly.

I've commented before on the fact that vendors need to take more time to ensure that their code is secure and to do away with insecure practices: default passwords in hardware that don't have to be changed, Java upgrades that leave the old, insecure code in place, and on and on.

I make a pretty good living as a security professional, and many of my friends and colleagues in the industry make far more than I do. This is what is unnecessary. If vendors did their jobs, there would be no need to pay security professionals the salaries they often command. There would be no need for bloated, often inexperienced tech support staffs. There would be no need for security conferences such as Infosecurity Europe 2007 and others.

There is no incentive for vendors to take extra time to ensure that their code is safe and secure. They know that when it hits the shelf it will be bought quickly. They know that once they release a service pack, sales will pick up again. They know that there are hordes of security professionals out there working to ensure that vendor mistakes won't affect users. They also know that there are conferences that draw lots of people where they can try to sell more and more and more.

It's all about the money. Bruce is right. We shouldn't need many security professionals and we shouldn't have to go to security conferences. Software and systems should be secure, or close to it, out of the box. But we all know that it won't happen until there is no financial incentive for them to ship insecure products.

2 comments:

Security Retentive said...

Andy,

Bruce and others have been writing about this quite a bit: the externality of the costs of bad security in software. I've written a little about this lately, and while I believe strongly that we need more emphasis on liability, Bruce and a few others haven't done much to answer the question of what a proper regulatory regime would look like.

On a job-related note, however: we can either spend the time designing security into products or try to achieve it after the fact. In either case we're still going to need two groups of people:

- People working on securing the products in the first place.

- People working on integration and secure solutions.

Systems integration is still necessary even if each standalone product is "secure" per its specification. There is still a lot of work to be done to build a complete solution out of smaller parts, so I think security people will still have jobs - hopefully less brain-numbing ones than today :)

Dr Anton Chuvakin said...

>But we all know that it won't happen
>until there is no financial
>incentive for them to ship insecure
>products.

Which simply means THEY WON'T! :-)

Creative Commons License
This work is licensed under a Creative Commons Attribution-NC-SA 3.0.