OWASP Podcast/Transcripts/041

Interview with David Rice (OWASP Podcast 41)
Jim Manico
 * We have with us today David Rice. David is an internationally recognized information security professional and is an accomplished educator and visionary.  He is also the author of Geekonomics: The Real Cost of Insecure Software.  Thank you so much for taking the time to come on the show with us.

David Rice
 * Thank you for the invitation. I love it.

Jim Manico
 * So can you start by telling us your IT background and how you got into infosec in the first place?

David Rice
 * Sure thing. I think I fell into infosec the way many people did, and that is largely by accident.  I was actually in my Master’s degree program at the Naval Postgraduate School.  The degree program was systems engineering and information warfare.  To the military, the information warfare aspect of my curriculum was the more important one.  I was only in the second cohort at the Naval Postgraduate School to actually go through the program.  It was really the military’s, or at least the Navy’s, first prototype effort for training a cyber corps.  I do not think that they realized that at the time.  I know they knew that network-centric warfare was a big idea coming up in the revolution in military affairs, so they were really trying to get their heads around it.  The cool thing about information warfare was that it was not just technology focused.  It was really more like psychological warfare, a type of curriculum that was based in systems engineering.  That sounds completely contradictory, but it actually wasn’t.  If you think about it, most human interactions are really just a system of systems.  You have one cultural system interacting with another cultural system, interacting with a religious system, so it really took a full-spectrum look at how systems operate and function together, whether it is a technology system, a human system, or whatever it happens to be.  The focus of information warfare was how do you affect the mind.  It could be through weaponeering.  It could be through psychological operations.  Of course, at the time we studied the Gulf War and how we used psychological operations to get Iraqi soldiers to surrender without having to blow them up.  Those were pretty important aspects, but the other aspect was, of course, information sources.  
Information sources were recognized to be largely technology sources, or at least to be converting to technology sources, so instead of just having repositories of static information like we had in libraries or institutions of learning, we now have these dynamic sources of information.  It is much more ephemeral and of course much easier to tamper with.  Because the computer systems themselves were highly vulnerable, the presumption was, what if you get in and start messing with somebody's data sources?  Those data sources, of course, are sometimes real-time battlefield items.  They can also be psychological operations, the news sources that people went to.  What happens if you start changing web pages or changing the stories, or if you just change one or two words?  So you are using technology, but you are also getting at the words people are reading on what they think is an authoritative page.  Really, it is just a web page.  It is highly ephemeral, but they treated it, or many times treated it, with the authority of a written page.  So you had this interesting intermix of technology, human interpretation, psychology, and all these different things.  That is actually what led me into infosec, because we did not have a network attack capability at the school.  Of course, it would just be purely research, but we were still just kind of fumbling our way through.  We knew attacks were possible.  That is actually where I got my start in infosec: breaking in.  Literally, we would break into computer systems.  We built this lab and we got funding through SPAWAR, the Space and Naval Warfare Center down in San Diego.  I got a fellowship through that, which gave me some starter funds.  Then the National Security Agency, once they saw some of our first round of work, said hey, this is an interesting thing.  
We would like to keep funding it, so I wrote my research papers, and it turned out that they happened to be highly classified.  I no longer have the clearances, so I can't even read my own research papers anymore.  They are still classified, which is nice, I guess.  That is actually what led me into working for the National Security Agency.  I went there to work at the System and Network Attack Center.  That is where I really cut my teeth on some pretty hairy technical issues.  That is all I can say about that.  The public-facing side of the work was actually the early NSA security guides.  The Windows 2000 Guide, the Cisco IOS Guide, all of those came out of the defense side of the NSA.  That was a public contribution to say, this is what we think is a good configuration to protect your systems.  Those guides eventually fed into the CIS benchmarks.  Of course, everybody else's input, DISA, the commercial sector, all of those things are built in, so that is the public-facing side of our work.  That is kind of the long and the short of how I got into infosec.  It was a fun ride.

Jim Manico
 * So David, why did you leave high profile consulting to go work for the Cyber Consequences Unit?

David Rice
 * So I am still working with the Monterey Group, and I work part time with the U.S. Cyber Consequences Unit, so my full title technically is Consulting Director for Policy Reform at the U.S. Cyber Consequences Unit.  My role there is to look into cyber security from a policy perspective.  There are some who may argue, myself being one of them, that maybe our current approach to cyber security is not the best approach we could be taking.  If we implement policy changes, we run the danger of creating vast unintended consequences.  PCI is an example of an unintended consequence right now.  What you see with the PCI security standard, from the payment card industry, is a race to the bottom.  That is, what is the quickest, cheapest way we can achieve compliance?  PCI was really meant to be the floor of cyber security, but it turned out to be the ceiling.  There is really no incentive to go beyond becoming PCI compliant, and the way to become PCI compliant is to find the cheapest, least expensive route, so the standard is actually driving a race to the bottom in cyber security as opposed to a race to the top.  In the policy reform role, part of my responsibility is to ask the questions: what are the unintended consequences?  What is likely to occur because of this behavior that we are trying to instigate in the marketplace?  I get to balance my public-facing work in policy reform with real-world private sector experience through my consulting gigs, so that is the story on that.

Jim Manico
 * So David, why do you think software security matters so much in comparison to traditional security approaches like firewalls, intrusion detection, and so on?

David Rice
 * Sure, so software in my eyes creates the fabric of the Internet.  It creates the fabric of all of our interactions, so without software, you do not have the Internet.  It is just a bunch of pipes, so to speak.  It is just a bunch of routers.  It is just a bunch of boxes and cables.  Software is what brings the whole thing to life.  That software creates the rules by which the environment lives, acts, breathes, everything.  If you do not get the software right, then the very fabric of the Internet is not right either, and that allows all sorts of behaviors.  One thing that I love to focus on, this is from Lawrence Lessig, is his line that code is law, not only from a regulatory perspective, but also in a literal sense.  Code determines what is and what is not possible on the Internet, so it helps to change our mindset a little bit and realize that software developers are not just developers, they are legislators.  They literally create the law that allows the universe of the Internet to behave, to exist, to interact, etc., so we need to get the law right.  We bash our politicians for creating laws that have loopholes in them.  When we write software that has vulnerabilities or defects in it, that is a loophole that allows an attacker to get in, change the law, and do the sort of stuff we bash our congressmen and senators for allowing.  How can you let these lobbyists in and do this?  That is what cyber attackers are really doing.  They are looking for the loopholes in the law that we created.  Therefore, you see this immensely disheveled environment that is supposed to be a technological marvel.  It is in many respects, but it is also a marvel of how bad it actually is in terms of the software that runs it, so we have to get the software right.  Once we get the software right, I think that will change a lot of our dependency on security products as they are now.  
I am not saying that with perfect software, even if it were possible, we would not need firewalls, IDSs, or any of the other bevy of technologies or contraptions that we use.  Certainly, though, we would not have to rely on them as much or as deeply as we do now.  The systems are getting overwhelmed.  We think that if we throw more money or more processes at security, it will improve.  I do not think so.  I think that we really need to take a deep, serious look at how we approach software, both technologically and in the marketplace.  We need to start creating the environment that we want, rather than putting bolt-on patches over problems that keep popping up, so we really need to focus on security in software.
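
To make the "loophole in the law" idea concrete, here is a minimal, hypothetical sketch; the table, data, and function names are invented for illustration and are not from the interview.  A login check built by string concatenation is a rule with a loophole an attacker can rewrite, while a parameterized query closes it:

```python
import sqlite3

# Illustrative in-memory database with one user (hypothetical data).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, pw TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def vulnerable_login(name: str, pw: str) -> bool:
    # The developer "legislated" a rule: name and password must match.
    # String concatenation lets attacker input become part of the law.
    query = f"SELECT 1 FROM users WHERE name = '{name}' AND pw = '{pw}'"
    return db.execute(query).fetchone() is not None

def safe_login(name: str, pw: str) -> bool:
    # Parameterized queries keep user input as data, never as law.
    query = "SELECT 1 FROM users WHERE name = ? AND pw = ?"
    return db.execute(query, (name, pw)).fetchone() is not None

# The classic loophole: ' OR '1'='1 rewrites the rule entirely.
print(vulnerable_login("alice", "' OR '1'='1"))  # True  (law rewritten)
print(safe_login("alice", "' OR '1'='1"))        # False (law holds)
```

The point is not the specific bug class; it is that the defect silently changes what is permitted, exactly like a loophole in legislation.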

Jim Manico
 * So David, you have often mentioned that software security bugs are the broken windows of cyberspace.  You were not referring to Microsoft Windows; you were referring to a different broken windows paradigm.  Would you be so kind as to elaborate on this topic for us?

David Rice
 * Certainly, so this kind of goes back to why in the world I think software is so important.  Not only do I think it creates the fabric of the Internet, it also creates the environment.  The environment is critical when it comes to human behaviors.  Now, you are correct that when I refer to broken windows here, I am not referring to Microsoft Windows, even though a lot of people like to make that joke.  I am referring to what two social scientists, James Q. Wilson and George Kelling, identified: broken windows in a literal sense.  What they said is that if there is a window that is broken in a neighborhood and it goes unrepaired, that broken window sends a message out to would-be instigators that hey, no one is really taking care of the house.  What they noted in their research was that one broken window tends to lead to another broken window, and another.  When all the windows in the house are broken, or at least a good number of them, that also sends a message out into the environment and into the neighborhood: well gosh, if no one is taking care of the house, and the neighbors are not causing a ruckus about this disheveled house in the neighborhood, then maybe the neighbors do not care about this neighborhood either.  What you see is that this disorder tends to propagate.  Little elements of disorder send a message out into the environment that creates more disorder, which invites vandalism, which invites more disorder, like petty theft.  Then petty theft invites more disorder, like more serious forms of crime.  The criminologists recognized, well gee, the way to combat crime is not through draconian police mechanisms; it is actually through these simple fixes like painting over graffiti or fixing broken windows.  We tend to think that poor neighborhoods have higher crime.  
That is true, but only to a degree, because typically poor neighborhoods do not have the funds or resources to repair things.  When a broken window is going to cost me 70 dollars to fix, well, that is the food budget for the month, so the window might not get fixed.  What they recognized also was that communities that came together and pooled their resources to clean up their neighborhood saw a direct effect in reduced crime.  In some instances, no law enforcement was necessary to clean up a neighborhood.  All it took was the neighborhood projecting a message of order into the environment.  That reduced crime to a great extent.  Of course, when you start talking about murders and things like that, there are going to be those violent crimes, but they are really anomalies in the grand scheme of things.  It is when murder gets out of control that we realize there is a larger environmental message being sent out.  That is what we understand about human beings.  An environment communicates, or dictates, behavior to a large extent, more so than we would like to think.  We would like to think that character really drives people internally, but we know from the research that it really does not.  There is one book by Philip Zimbardo called The Lucifer Effect.  In The Lucifer Effect, he documents how evil starts with these very small things.  Good people end up doing bad things.  Well, you have to question why that happens.  
He also recognized that an environment has a very large influence on people’s behavior.  You can take wonderfully ethical people, like the prison guards in his Stanford experiment, and because the environment communicated certain things that were or were not permissible, these wonderfully ethical people from Midwest America all of a sudden appear to be monsters.  So environment matters, in real space just as much as in cyberspace.  When we look at the environment in cyberspace, we ask ourselves, where does this disorder come from?  My argument is that software defects are the broken windows of cyberspace.  What they do is communicate a message of disorder out into the environment that in turn invites more disorder.  The recent vulnerability that was identified in Adobe is a great example.  People were posting bugs to the Adobe forum, and attackers were, of course, reading that.  They said, well, anything that might crash Adobe could potentially be a vulnerability too, so that invited cyber criminals into the mix.  Well, what else can we find, what else can we find?  Those small defects tend to invite more elements of disorder.  I believe the blog post that I pulled that from, from The Last Watchdog, was saying gee, now that cybercrime is out of control, these small bugs are real issues.  My argument is, no, they have always been issues.  The reason cybercrime is out of control is because of the broken windows.  They have invited disorder into the environment and told people, hey, no one is in control here.  These broken windows are all over the place.  Now, the irony is that hackers do not break windows.  They are not breaking software.  They are simply finding the defects that software manufacturers failed to detect themselves.  In effect, you are buying a new house, but it comes with all sorts of broken windows.  
The crazy part is that you do not know how many broken windows it actually came with, so you have no real idea of the amount of disorder, only that it is high for any given piece of software put out into the environment.  When we say focus on software security, I believe we need to incentivize software manufacturers not to introduce vulnerabilities into the environment, or at least to highly restrict what I call unrestrained vulnerability dumping, because that is what they do.  They write software, stick it out into the environment, and guess what?  Any vulnerabilities are really your problem.  They might be my problem as a software manufacturer, but nowhere near to the extent that they are yours.  Now, you have data breach legislation that will hit you over the head if you do not apply a patch in time, or worse yet, if I don't give you a patch in time.  In order to get a handle on software security, and in order to make the Internet safe, we really have to create an environment that promotes safety, that promotes order.  Right now, it does anything but.  We have a constant supply of vulnerabilities pushed into the marketplace on a daily basis, and then worse, we have all these security contraptions that we have put into place that really increase the entropy of the entire system.  That is, now we have all of these different variables that we need to keep track of, not only to protect ourselves, but to make sure that the efficacy of the security products is actually high, or at least maintained at a high level.  Of course, very few people can get it right, as we have seen from FISMA scores, as we have seen from PCI flubs.  We see that it is very hard to keep an environment secure, not so much because people are not good at it, but because the system is inherently flawed.  
They cannot succeed in a system that constantly produces new vulnerabilities, and where you have a brand new security technology almost every year that you need to introduce into the environment in order to offer yourself some protection.  It all goes towards disorder and higher entropy.  Then we wonder why things are getting out of control.  So my argument is that we really need to incentivize software manufacturers to create better, more robust, more resilient software.  Then I think we will start getting a leg up.  Now, will that solve everything?  Absolutely not.  But it is a really good move in the right direction.
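
The point above about crashes inviting attackers can be sketched in a few lines.  This is a hypothetical illustration; the record format and function names are invented, not taken from Adobe or the interview.  An unchecked, attacker-controlled index crashes the program, which is visible disorder, while a validated version fails in a controlled way:

```python
# Hypothetical record format: the first byte is an index into the record.

def fragile_field(record: bytes) -> int:
    # No bounds check: hostile input crashes the program (IndexError).
    # A reproducible crash on attacker-controlled input is exactly the
    # "broken window" that invites attackers to probe for exploitability.
    return record[record[0]]

def robust_field(record: bytes) -> int:
    # Validate before use: reject bad input deliberately instead of
    # letting it steer the program into unintended behavior.
    if len(record) < 2:
        raise ValueError("record too short")
    index = record[0]
    if index < 1 or index >= len(record):
        raise ValueError("field index out of range")
    return record[index]

print(fragile_field(b"\x01\x2a"))  # 42: well-formed input works fine
# fragile_field(b"\xff\x00") raises IndexError -- the crash an attacker
# notices; robust_field rejects the same input with a clean ValueError.
```

The defect was always there; the crash merely broadcasts it into the environment, which is the broken-windows message.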

Jim Manico
 * David, in one of your blog posts, you stated that cyber security suffers from a lack of etiology. What did you mean by that?

David Rice
 * Sure.  An etiology is what we use in the diagnosis of disease.  An etiology helps us identify the origin of the disease.  Now, it goes to the point that if you misidentify the origin, your treatment probably is not going to work nearly as effectively, so my argument is that cyber security suffers from a mistaken etiology.  What that means is that we have mistaken the symptom for the cause.  We have a high notion of vulnerability research.  We have a high notion of finding hackers and hunting them down.  We think that these guys, the hackers, are the problem.  To a degree they are, but they are not central to the problem.  This goes back to broken windows and why I think software security is important.  I think that is the correct etiology.  That is, the correct diagnosis is better software, not going after software attackers, although that can be a piece of it, and not deploying lots of security products, although that is a piece of it.  We have focused all of our energies in one direction for a very long time.  Only recently in the history of cyber security have we focused on software security to any great extent.  Even at that, it is still not a lot.  Look at how much effort is put into cyber security products themselves; even if you look at the Consensus Audit Guidelines and their section on software security, it is really light compared to all the other stuff they focus on.  You see that even the CAG, as good as it is, is still biased toward a network response to cyber security.  We really need to be more intrinsic.  That is, we need to focus on the software much more and give people much greater insight.  Now, to give an example of how a mistaken etiology can really cripple a nation: for the longest time in auto manufacturing, we believed the driver, that is, the nut behind the wheel, was the cause of all the deaths, fatalities, and injuries on the highway system.  
Of course, epidemiologists, that is, the people who look at epidemics in the environment and ask what the cause is, were looking at the statistics.  It seemed to all point to drivers misbehaving.  I mean, they were misbehaving on the roadway.  Of course, we started the three Es: engineering, education, and enforcement.  This started an at least decade-long attempt on the part of the United States to try to teach people how to drive safely.  You know what?  It did not work.  People were still dying.  People were still getting injured on the road.  That caused the epidemiologists to step back and say, what did we misread here?  Where were we wrong?  They realized that they suffered from a mistaken etiology.  That is, they misdiagnosed the problem.  They thought, well gee, drivers are the people in the car.  They are human actors.  They are the ones probably doing something wrong.  What is it that is killing them?  They figured it was the humans.  What they actually found out was that it was not the humans killing themselves; it was the way the cars were designed that was increasing the likelihood of death and injury.  You could only teach people so much about safety before the system in which they operated needed to be changed.  That meant the highways needed to be redone and the vehicles themselves redesigned, when the driver had really been the primary focus for the longest time.  We see the same fallacy in cyber security.  All of the data seems to point to those stupid users who keep habitually clicking on e-mail links or keep going to web sites they should not go to.  I think that is a mistaken etiology.  As it is, we spend lots and lots of money on user awareness.  The argument can be made that we need to spend a lot more.  I get that argument, but it is the same mistake we made in driver education.  
Now, we focus on the nut behind the keyboard.  We think that it is the user's problem.  They cannot seem to patch their systems.  They cannot seem to stop habitually clicking on links.  I am sorry, but the Internet is made up of links.  That is just what it is.  They are not going to stop clicking on them.  It is almost impossible for the normal user to distinguish between a safe and an unsafe link.  Yes, experienced users can, but they should not have to become experts in order to run their computer, any more than a driver needs to become an auto safety engineer in order to drive their car.  The system in which they operate in cyberspace is fundamentally flawed, so we need to create a system that rewards the drivers, that protects the drivers even when they do something stupid.  Now, is this going to solve all the problems?  Absolutely not.  You do need some driver education.  You still need it now.  You are going to need something of a similar sort on the Internet.  We think that through education and enforcement, the cyber security problem will wither and die.  To a certain extent, that is the idea.  We just need to train people.  We just need to educate them.  That is true to a degree, but my argument is that it is not going to be nearly enough to solve the problem.  Now, when auto safety was the issue of the day, most people did not think it was an issue.  In fact, six months before the 1966 motor vehicle safety act was passed, only 18 percent of the population thought it was a big deal, thought it was a national issue.  When we look at the actual data, we know that deaths and injuries on the US highways were costing the United States anywhere between three and five percent of GDP.  At the time, in the 1960s, that was an enormous, I mean an absolutely phenomenal, amount of money.  Three to five percent of GDP was just huge.  That was significant because what you see is a disconnect.  
Individuals did not think this was a big deal.  When we started looking at the data and scratching our heads to figure out why our training and education programs were not working out, we realized the huge disconnect.  Individuals did not see the problem, but in aggregate we could see it was costing the nation literally hundreds of billions of dollars in lost productivity.  The same argument can be made in cyber security.  That is, oh my gosh, cyber security costs us a tremendous amount of money, but if you ask a normal person on the street, is the Internet dangerous, do you think the Internet is bad, they say, oh no, I don't think so.  I don't know anybody who got hacked.  I would say that most people do not think the Internet is unsafe.  We see that even among our policy people in government nowadays: yeah, they kind of understand that cyber security maybe is sort of important, but they have other things to worry about.  That is the same way auto safety was back in the 60s, so I think we have done ourselves a disservice.  We focus on the symptoms.  Maybe that is just the way it has to be.  Maybe we have to make these huge, glaring mistakes before we self-correct.  It is unfortunate if we do, but our etiology really determines our approach.  If we have a mistaken etiology, it means that our approach is mistaken.  I think that is where we are not right.  I think we need to change our approach by changing what we identify as the source of the problem.  It's not hackers.  It is not uneducated users, although they are a piece of it.  It is, at root, insecure software.  You focus on that, you change the game.

Jim Manico
 * So David, during the ‘08 campaign, President Obama publicly stated that hackers could compromise U.S. networks and do great harm.  Now, this is old news and no big deal to the security community, but why was it huge news to hear something like this from such a senior government official?

David Rice
 * I think it was a recognition, finally, that this is a national security issue.  For the longest time, cyber security practitioners have been second-class citizens.  We are second class to just about every other issue.  Maybe that is just the way it is always going to be.  We are not going to be a high priority.  But by raising it to our most senior executive in the nation and having him say this is a big deal, we need to pay attention to this, it gives us necessary leverage.  Now, I did not say funding.  I did not say support.  I just said that it gives us necessary leverage.  That recognition is important for us.  Now, that does not mean that we have carte blanche to argue for all the little trinkets, toys, and gadgets that we want in security.  It simply means that we have recognition at the top level that this is serious.  Now we have to put our game faces on and really face this issue in a professional manner.  I think the response so far by the cyber security community has been mixed.  I think we have a lot of different avenues by which we can approach cyber security.  I think we also run the risk of overstaying our welcome, so to speak.  Cyber security, though the President has voiced concern, still does not compete with the financial crisis.  It still does not compete with health care, even though luminaries like Jim Lewis at CSIS have stated that this is the most fundamental economic challenge the United States faces in the new century.  I still think that we have so many other crises that are going to take top priority in the administration.  We have to balance our approach a little bit.  We have to recognize that we have a lever and we need to pull it in the right ways, but we cannot overextend ourselves.  We can easily outstay our welcome.

Jim Manico
 * Has the U.S. government backed up this executive understanding with actual money?  Have the billions in funding promised for cyber security really been made available, and has it changed the game in D.C. IT?

David Rice
 * When we look at the funding, I think something like 300-some-odd million dollars has gone out.  That is really a drop in the bucket when you look at federal expenditures.  Now, with that said, we have a lot of money going to a lot of different places, so maybe that is all they can afford at this time.  I do not know what the level of responsiveness by the federal government is going to be in terms of budget, or whether it is going to be sufficient.  It allows us to keep some projects and programs limping along.  You still have the danger of the prime contractors sucking an enormous amount of money out of that for side projects that may never go to cyber security, or only to projects ancillary to cyber security.  That is really just supposition on my part.  What has changed, though, and what is important, is that there is more money out in the marketplace assigned to cyber security.  That is different from when Richard Clarke was the cyber czar back in the Clinton administration.  Back then, Mr. Clarke was just a lone voice in the wilderness trying to wake people up.  There really was not a lot of support out in the marketplace.  Now there are budgets for cyber security out there.  I would say that cyber security is in even more need of leadership than ever before, but it is just not there.  Whoever ends up stepping into the role will have a marketplace that is more supportive, but they will face huge challenges downstream that we can only imagine, so it is a mixed bag on that one.

Jim Manico
 * So, very recently, Melissa Hathaway, the acting senior director for cyberspace for the National Security and Homeland Security Councils, resigned.  What does this mean for national cyber security?

David Rice
 * It means that we are still without a candidate.  I think that Melissa was right in putting in her resignation.  She was really between a rock and a hard place, and not just Melissa; so is anyone else who has been offered the position and has decided not to take it.  I think that speaks volumes about the position itself, either how it has been articulated or how it has been scoped.  Whatever it is, people are shying away from that role.  I think that is significant.  It really means that we are leaderless in cyber security when we really should not be, at least from the national architecture aspect.  There are plenty of leaders in cyber security, but we are looking for that one central coordinator.  I think we need to take this time to recognize something that is tremendously important: no one in the cyber security community right now really wants that job.  The way the job has been articulated, it really does not have enough budget and authority to do anything.  That is also partly by design.  If you look at General Jim Jones, who is on the National Security Council, and Larry Summers, who is at the National Economic Council, these guys are big boys.  They know what they are doing.  If they are pushing back against the cyber security coordinator, there is a reason for it.  I think we need to take away a lesson from this, both from the fact that no one wants the job and from the way the job was actually formulated with input from the NSC and NEC.  That is, our approach is wrong.  The job has not been given enough power, and the reason it has not been given enough power is that it could really do a lot of harm if it were given power.  I think that is a very important recognition on our part.  Maybe our approach to cyber security is so flawed that not even the National Security Council or the National Economic Council can in good conscience say, go do what you say we should do.  Now, this may be heresy, but I think that it is critical to realize this. 
 Again, these guys are big boys.  No matter what you say about appointees or politicians, these guys know what they are doing, so we need to take their pushback and take that moment to reflect on exactly what they are saying.  Now, Larry Summers’ position is that, well, if we put cyber security in place now, it could do real economic damage to the recovery of the United States.  The irony is, though, that the same argument was made during the Clinton administration in the late 1990s.   That is, you could cripple innovation.  You could cripple this boom that we are having, so whether it is boom or bust, security really has no place in it.  Why?  Because it is just too expensive.  Well, the reason why it is too expensive is because we require all sorts of different practices and all sorts of different technologies, and they all have to somehow work together properly in order to defend ourselves, or at least to have the hope of defending ourselves.  It is immensely expensive, so what Mr. Summers is looking at is, well, we are trying to recover here, and if we put this mandate out, or if we start doing what cyber security actually demands, we could do something that could actually cripple us.  When I wear my economics hat, I have to agree with him.  What we ask for is immensely difficult and expensive.  Our problem in cyber security is that we do not make it easy to do security.  We expect people to be geeks like us.  We expect people to become cyber security experts, even mom and pop.  Oh, what is this firewall that suddenly pops up?  Oh, is my antivirus updated?  Oh, is this setting configured?  We expect a lot from our users.  Maybe that is right to a degree, but not to the extent that we require it now, so we really have to acknowledge that there are certain market realities, certain economic fundamentals, that we as professionals need to start paying attention to.  
If our message is not resonating with some of the best leaders we have in the nation, that should be a sign to us that maybe our approach is not right.  We need to adjust.  We need to meet these guys at least halfway.  We cannot come in saying the right thing to do is A, B, C, D, E, F and G.  It may be the right thing to do, but the crazy thing is, it is not the effective thing to do.  In cyber security, we spend a lot of time being right.  That is the right thing to do, to have firewalls.  That is the right thing to do, to have antivirus and IDS and DLP, and take your pick of security technologies, as well as all the processes that go around them.  Maybe, just maybe, those are not the effective things that we need to be doing.  Of course, my argument is that the more effective thing to do is to focus on software security, to focus on the source of disorder, and to use the security technologies as complements to that spearhead, as opposed to relying on technologies that have obviously and continuously failed year to year.  They simply are not counteracting the flood that is ahead of us, or that we are in right now.  So I think Melissa Hathaway’s resignation is a good chance to sit back and reflect and say, what is the message we are receiving, and what do we need to do to make things different?

Jim Manico
 * Alright David, suppose you were the U.S. government federal cyber space czar and you actually had authority and budget. What would be your priority for running the government’s application security efforts?

David Rice
 * Sure thing. I have at least three priorities.  One is to recognize that whatever I do has consequences; that is, there needs to be recognition of market realities and economic fundamentals.  That is the first recognition.  The second is that I would treat cyber security less like a law and order issue and more like a public safety issue.  Even though you can say that hackers are attacking us, they are attacking us only because of defects and vulnerabilities within the software we have already deployed.  It is the same thing as when a manufacturer puts a defective product into the marketplace.  That is a public safety issue.  Software manufacturers are known to put out some of the most defective products in the global marketplace.  This is recognized by multiple reports, including a national infrastructure security study for the financial services sector, so there is a recognition that software is some of the most defective product out there in the marketplace, yet we do nothing really from a national perspective to constrict the source of that disorder or to reduce the emissions of vulnerabilities into the environment.  That, I think, is a critical thing that I would look at.  Now, I have been a big proponent of software labeling, as difficult and technically challenging as I know that is.  Until you get consumers involved in cyber security at a very basic level, one that requires nothing more of them than looking at a spectrum of risk that they are purchasing into, which is exactly what they do when they buy a vehicle with a five star rating, you do not have influence in the market.  As big as the government spending budget may be, in IT security in particular, this is something that Melissa Hathaway’s report was very good to focus on.  
Her report said, well, let us use government spending power to influence the market to create better software, and that will help to a degree.  I cannot argue with that.  What I can argue with is that government spending simply will not be enough to effect the type of change that we want.  How do I know this?  Because it was not enough for pharmaceutical safety, it was not enough for automobile safety, and it was not enough for food safety.  You have to engage the attention of the consumer.  The consumer, or at least private consumption, represents over 70 percent of GDP spending in the United States, 60 percent in the EU, and something like 55 percent in Australia.  Consumers are immensely powerful, and they are far more powerful than their governments as far as spending power goes.  You need to engage the Internet user.  You need to engage the consumer so that manufacturers have an incentive to meet the demand for more secure software.  Right now, you can say that there is a demand, but it is really in niche areas.  Software manufacturers are free to make any claim about software security that they want.  Microsoft has Trustworthy Computing.  Apple simply says, well, I am not a PC.  They all have assertions that are really pretty much unfounded.  Only if you really drink the Kool-Aid can you justify accepting their arguments.  What we need is more objectivity in the marketplace that allows consumers an easy way of exercising their muscles en masse.  Now, that is very important too, because you cannot coordinate users very well by expecting them all to be security experts.  It has not worked.  It is not going to work.  It has not worked in any other industry.  What you can do is allow them to coordinate through FDA labels, allow them to coordinate ad hoc through automobile safety labels.  You can allow them to coordinate through fuel efficiency labels or Energy Star labels.  
These are all mechanisms that leverage the free market in order to get what we need as a nation, more so than just what we want.  I may want a car, and I do not need a fuel efficient one, but it is better for the planet and everyone around me if I buy one, so how do we incentivize it?  Well, we put up fuel efficiency ratings.  This at least allows the market to say, well, gee, this car gets 22 mpg and this one gets 40.  I am going to go buy the 40 one, just because it is better for the consumer.  They feel better off for doing that.  That is exactly the reaction we need in the software space.  Now, I know that there are an enormous number of hurdles to pulling this off, but that would be a priority to me, and one of the top priorities is getting that in motion.  Even if I could convince one of the states, or convince one sector…well, not even sectors, because they do not have enough spending power.  Financial services is a prime example.  Financial services spend billions, or at least hundreds of millions, on this stuff, and they still suffer from all sorts of breaches, so even the financial services sector would not have enough power.  So at least getting the states engaged, or at least putting some type of federal aspect into improving software quality, would be tremendously helpful to the national security of the nation.  So, to recap real quick: the first point is that I would be wary of unintended consequences, so I really have to look downstream in order to understand what the implications of my actions would be.  I do not think implementing some type of national PCI standard would be good.  In fact, I would fight that to my dying breath, I think, because I have seen how bad PCI is right now.  I certainly could not justify doing it at a national or even at a state level, even though some states are trying to go down that road.  I think that PCI has been a disaster.  Has it done some good?  Sure, but at a national level, I think it would be crushing.  
The second thing I said was that I would make cyber security a public safety issue.  That is critical, simply because when you have an enormous flow of defective products going into the marketplace, it is a public safety issue.  You can argue that it is a law enforcement issue.  I get that, but really, law enforcement coordinating on a public scale to catch a couple thousand hackers who we have no idea who they are?  I think that wastes a lot of resources that could be better spent focusing attention on putting more resilient, high quality products into the marketplace.  That really disincentivizes hackers.  Make it harder for the hackers to find the vulnerabilities than it is for us to fight against them.  Then, finally, that plays off the whole idea of focusing on market incentives.  Market incentives would be the primary lever that I would use as cyber security coordinator to help the market improve on what it delivers.  That means that there are not only market incentives for manufacturers to make better software; there have to be incentives for consumers to buy the software.  They have to feel that they are better off for buying more secure software.  Right now, they have no incentive to do that.  When you look at the highest quality, most secure software, it is a niche product.  You have to spend millions of dollars in order to get it, and really the thing that protects it is that it is so exclusive that no hacker can get their hands on it.  We cannot have security be an exclusive, discriminatory mechanism in the environment.  Security has to be distributed as evenly as possible across the environment, just like auto safety is across the auto market, just like pharmaceutical safety is across the spectrum of pharmaceutical drugs, just like it is across the food industry.  We need an even distribution of security, and it cannot just be a regressive tax against the poor of our population.  
The poor is anyone who does not have a couple million dollars to spend on high quality, high security software.  The focus would be on the levers of market incentives, which ones we can pull and which ones we can put into place, recognizing that first focus that I would have, which was to be aware of unintended consequences.  Incentives have both good and bad effects.  That would be a primary focus for me, so those are the three things that I think the cyber security coordinator, if I were that person, would really focus on.

Jim Manico
 * So I have a question here from the OWASP Swedish Chapter Lead. He asks the meanest question you can ask a security expert.  Do you write code since very few security experts are active in software engineering?  Still, expertise in both security and software engineering is required to build more secure software.  The book Geekonomics takes pride in not containing code, but what is your opinion on the actual tech issues?

David Rice
 * Happy to go there. So actually, yes, I do write code. I was a chief software architect doing reservation software.  I write code.  I live what I preach.  You know, I focus on writing high quality, secure software right off the bat, so I really do practice what I preach, and I understand the psychology and challenges that face developers.  Now, I understand that I get hot under the collar because I also have a security take on things.  Like some software developers, they have their thing, whether it is performance, whether it is thread management, whether it is memory management.  Whatever it is, they have their hot button, and security just happens to be my hot button, so I know that I get hot under the collar with security sometimes.  I practice what I preach, but I am also looking to improve everyone's experience and make it easier for software developers to write secure software, so that there is credibility in writing it.  It does help your career.  You do get more money for writing better software.  Those are all things that have to be important in changing the market incentives, absolutely.  I write software, I deal with software, I have network experience, security product experience, development experience, you name it.  Across the board, I have lived and breathed in many respects what people experience out in the market.  I try not to forget it, because obviously I cannot play in all those fields all the time.  There is a huge focus on understanding the impact, or the potential impact on people, of what I am stating from a policy perspective.

Jim Manico
 * David, what do you think we can do to make software engineering resources and security people join forces?

David Rice
 * So how can we get security people and software developers to join forces? Well, maybe what I am about to state is heresy, but you have got to keep the network security folks away from the software developers.  You have to keep the majority of security people away from software developers in general.  I say this both lovingly and also with a little bit of oomph behind it.  As I stated earlier in the interview, security practitioners often come from the position that this is the right thing to do.  It may be the right thing to do, and I may even agree with you that it is the right thing to do, but maybe it is not the effective thing to do.  What security people, in my experience, continually fail to do is recognize the demographics and the psychology of the people that they are working with.  They come from a technologist's perspective that says put this in here, do this, do this, do this, and you will be fine.  It does not recognize the impact on work processes.  It does not understand the impact on promotion, any of that, so I would actually argue that security should be less involved.  Network security folks should be less involved with software security, and the converts, that is, the people within software development who see the importance of software security, should be the evangelists within the group.  That is the first, human aspect.  The second aspect, though, I think is the technology answer.  This is where I am a strong proponent of technology in the software development process.  What I like about the current set of software security tools that are coming out, although they are not perfect, is that they give a feedback loop and an education mechanism to software developers so that they can learn security while they are coding.  Now, we would like security to be right out of the chute, but we have to give those folks a ramp, a way of becoming security aware.  Now, I say that users should not have to become security experts.  
 I agree with that.  Of course, that is what I espouse.  Software developers, to a certain extent, should become security experts to the degree that they can.  They are the ones writing the code.  They are the ones generating the vulnerabilities, and they are the prime ones to avoid generating those vulnerabilities, so I believe a tool set allows the developer to learn, to educate themselves, to have that private little feedback loop.  They are sitting there, and it is teaching them that, hey, this is not a good thing to do, or what you want to do is this.  Here is an example, or here is where the problem is.  That is a very grass roots, organic way of growing security among software developers, without having that overlord security guy evangelizing security, saying hey, come to my meeting; hey, come over here; hey, come do this; you have got to come to the user awareness training.  You know that is just going to turn them off, whereas these intimate micro learning sessions are something that the tools are getting better and better at doing.  I think that is a great opportunity for developers to build their professional expertise, become aware of the issues, and do so in private without having to go to these huge sessions.  They do it incrementally; that is, it is not a two hour session on security.  Fine, I am done with that.   Forget that, I am going back to writing code.  It is a continuous, what we call, field and forum mechanism.  Field and forum is where you get educated: you go out into the field and you do it, then you come back into the forum, which is the tool educating you on what you need to do, and then you go right back to writing code.  That is an immensely more effective way of writing more secure software and training large sections of the developer population, without mandating these very Orwellian, tight, draconian mechanisms of sit down and go through user awareness training.  It is just brutal.  That is not going to work with developer psychology. 
 These are rugged individualists.  They love what they do.  They are focused on writing good, solid code in their eyes, so let us reward them, but let us recognize the psychology of software developers and use that to our advantage, instead of treating it as a difficulty or a challenge that they have to change.  They are not going to change their behaviors.  We need to meet them halfway, just as I argued we should meet General Jim Jones halfway, or Larry Summers halfway.  We need to meet our developers at least halfway, if not more so, in order to get this moving along.

Jim Manico
 * So David, how about OWASP? How are we doing?  What OWASP projects do you like, and what can we do at OWASP to be better in terms of serving the security user and programmer communities?

David Rice
 * I think that OWASP is doing a fantastic job. One, I like the community of OWASP and how they communicate back and forth, but I love the projects that are coming out of it.  My big push is doing some type of software labeling.  I think that is tremendously important in terms of what we are trying to accomplish, in terms of creating that mechanism out in the marketplace, so when OWASP is doing the same type of project with the ASVS requirements and the coding requirements they are putting out, I think these are really important steps in getting the community to converse and engage in dialogue.  One, I think it is necessary, but it is also making it easier for people to consume these things.  I know Jeff Williams.  I love his approach to how he thinks about these things.  I think that OWASP really represents his viewpoint, but it also represents the community very well in terms of how we are trying to approach software security, and cyber security in general.  I think that it is a leading edge aspect of how we are going to address the issues confronting us in the next twenty, thirty, or even forty years.  I still think OWASP is in its early stages.  I know you have a lot of mature projects running, but I still think OWASP is in its early stages.  I think the best is yet to come, and that is good, because I think we have a long way to go.