CRN Interview: Scott Charney, Microsoft

Scott Charney, Microsoft's new chief security strategist, recently spoke with West Coast Bureau Chief Marcia Savage about the company's security efforts. Charney formerly led PricewaterhouseCoopers' Cybercrime Prevention and Response practice. Before PwC, Charney was chief of the Computer Crime and Intellectual Property Section at the U.S. Department of Justice, where he prosecuted major computer-crime cases. He joined Microsoft in April.

CRN: What are your top goals?

Scott Charney, security strategist, Microsoft

Charney: The job has two components. There's an internal component where the top goal is to figure out how better to secure products and services [and come up with strategic thinking on how to get that done], as well as handling issues on a day-to-day basis. The second part is externally focused, where I'm working with other industry members and the government to figure out how to do critical infrastructure protection and trustworthy computing, and how to make the environment more secure.

CRN: Can you talk about what progress Microsoft has made so far with its Trustworthy Computing initiative?

Charney: Trustworthy Computing is a long-term goal. It involves security elements and privacy elements, safety, reliability, availability. But already we're doing things we could describe in a different way by calling it SBD: secure by design, secure by default and secure by deployment. The goal is to build more secure stuff--a lot of what you've seen in the security push, sending the Windows programmers back to school, building tools to look for buffer overruns, stuff like that. That's all to get it secure by design and, of course, that's a long-term process.

Secure by default--we used to ship products with all the bells and whistles enabled. When you loaded it, you said, 'Wow, I can do everything,' but of course most people don't do everything. By having everything turned on, you had a lot of things running that you might not be paying attention to and that might represent a security risk. IIS 6 is deployed with everything turned off, and you have to turn things on. It's a little less user-friendly in the sense that you have to configure it, but it means you have to think about what you want to turn on. If you turn on only the things that you're using and turn the other stuff off, it's more secure.

Secure by deployment--we look at how we can help people stay secure. If you look at Windows XP, you get a little balloon that pops up and says updates are available; you can click on the button, and it will find the updates for you and ask you if you want to install them. That makes it much easier to stay secure. One of the challenges when you make mass-market products is that your user base is not a monolithic thing. My mom may love to just click on the balloon and have it all done. But a large enterprise might say, 'We're running a lot of custom configurations and proprietary software, so you might have tested this, but it still might crash my system, so I need to do some regression testing in my environment before this is deployed.' You can't just push out everything to everybody all the time. For that, you see things like Windows Update server, where we push the stuff to a server. IT staff can do their testing, and if they say, 'Yes, this is something we need to deploy in our environment,' they can push it out from that server. We're making strides in all of these areas, but this is a long-term process.
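
To make the "secure by default" idea concrete, here is a minimal sketch in Python--an illustration, not Microsoft code--of a service whose optional features all ship turned off, so an administrator must explicitly enable only what is actually needed. The feature names are invented for the example.

```python
# Illustrative sketch of the "secure by default" principle Charney
# describes with IIS 6: every optional feature ships disabled, and the
# administrator must turn on only what is actually needed.

# Hypothetical feature set for a web server; names are invented.
DEFAULT_FEATURES = {
    "static_content": False,
    "cgi_execution": False,
    "webdav": False,
    "server_side_includes": False,
    "remote_administration": False,
}

def build_config(enabled_features):
    """Start from the all-off defaults and enable only what was requested."""
    config = dict(DEFAULT_FEATURES)
    for feature in enabled_features:
        if feature not in config:
            raise ValueError(f"Unknown feature: {feature}")
        config[feature] = True
    return config

if __name__ == "__main__":
    # An admin serving only static pages enables exactly one feature;
    # everything else stays off and presents no attack surface.
    config = build_config(["static_content"])
    for feature, enabled in sorted(config.items()):
        print(f"{feature}: {'on' if enabled else 'off'}")
```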

CRN: Can you provide more detail about how the initiative has changed the development process? Is it going to create longer development cycles for your software?

Charney: That's a hard question to answer. There are certain things we're doing now that will slow up release dates. If you do a more extensive review of code, it can slow things up, and that's fine; we said we're going to do that. But I wonder whether, over time, it might not slow things up at all. Once you teach developers that certain commands are subject to buffer overruns and should be designed differently, and as you create more tools to look for problems, maybe you end up increasing speed over time because you automate some of the processes. In the short term, taking the 7,000 Windows developers and sending them to training slows things down, but over the long term, as you design better, train better and test better, maybe you get that back.
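The automated tooling Charney mentions--programs that look for overrun-prone calls--can be approximated very simply. Below is a hedged Python sketch of such a scanner; the list of risky C functions and the pattern matching are illustrative only, and the real analysis tools Microsoft built are far more sophisticated.

```python
import re
import sys

# A minimal sketch of the kind of automated check Charney describes:
# flag calls to C functions that are prone to buffer overruns. The
# function list and regex here are illustrative, not exhaustive.
RISKY_CALLS = {"strcpy", "strcat", "sprintf", "gets", "scanf"}
CALL_PATTERN = re.compile(r"\b(" + "|".join(RISKY_CALLS) + r")\s*\(")

def scan_file(path):
    """Print a warning for each line that calls a risky function."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            for match in CALL_PATTERN.finditer(line):
                print(f"{path}:{lineno}: risky call to {match.group(1)}()")

if __name__ == "__main__":
    # Usage: python scan.py file1.c file2.c ...
    for source_file in sys.argv[1:]:
        scan_file(source_file)
```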

CRN: Can you talk about how Microsoft handles its patches? People say they get overwhelmed by the number of patches and updates. Do you have any plans to streamline that process and make it easier for people?

Charney: Absolutely. It's one of the issues I've been focusing on since I started.

We need to do patch management better; that's part of security usability--security that's easy to do. Obviously, if you have a critical fix that needs to get out quickly, you want to release it quickly, even if it's a single thing. But then you can think about things that maybe you don't have to release individually, and you can do a multi-patch. We're thinking about security roll-ups. If you're patching your system over a period of time, you might reach a point where you're not sure you're up to date. Maybe you can run something--a security roll-up that has everything you need to be current--and you just deploy that and you're set.

Longer term, there are service packs.

As part of this process, we also need to provide better tools so people can figure out where they stand. We have some tools, like HFNetChk, that you can run on your system to see where you are. In the long term, it's better if we can come up with a single harmonized tool that produces consistent results and follows a carefully thought-out policy for how patches should work and be implemented. What I have said to staff as we work on this is that I'd rather have it good than fast. The traditional model was, 'OK, let's get something out there quickly,' but now, with the new shift on security, everyone understands that doing it right is more important than getting it out fast.
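
The two ideas in this answer--a status check that reports where you stand and a roll-up that brings a machine fully current in one step--can be pictured with a small Python sketch. The patch identifiers and the locally recorded "installed" list are invented for illustration; real tools such as HFNetChk inspect the system itself.

```python
# Illustrative sketch of a patch status check ("where do I stand?") and
# a security roll-up that bundles every fix needed to be current.
# Patch identifiers are invented for the example.
CURRENT_PATCH_SET = {"MS-001", "MS-014", "MS-023", "MS-039"}

def missing_patches(installed):
    """Report which patches a system still needs."""
    return sorted(CURRENT_PATCH_SET - set(installed))

def apply_rollup(installed):
    """A roll-up brings the system to the full current set in one step."""
    return sorted(set(installed) | CURRENT_PATCH_SET)

if __name__ == "__main__":
    installed = ["MS-001", "MS-023"]  # a partially patched system
    print("Missing:", missing_patches(installed))
    print("After roll-up:", apply_rollup(installed))
```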

CRN: How about .Net--what is your security strategy around that?

Charney: I'm not there yet. It's too soon.

CRN: What are your thoughts on the issue of vulnerability reporting?

Charney: It's an issue I had a lot of contact with when I was on the government side. There was a case of a hacker who shut down an airport by hacking a telephone switch, and this switch was in place all over the country and could be disabled with a limited number of commands. I had to start thinking: Do you tell people this vulnerability exists? As I worked through it on the government side, and as I think about it on the industry side, my thinking hasn't changed a lot. If you know of a vulnerability but there is no patch or workaround, giving notice of that vulnerability to the public at large invites bad guys to exploit it when the good guys don't have anything for it, and that's dangerous. So the first thing is that it's better to talk about a vulnerability when there is a patch.

That, of course, creates a bit of a race: The good guys have to race for the patch while the bad guys are racing to exploit. There's always a risk that the bad guys will win the race, not only because they get there faster but also because not everyone deploys every patch promptly. Having said that, I think you have to notify people about the vulnerability. The key to making that work is that vendors do patches with all deliberate speed. There are issues there; depending on the complication, it may not be a quick fix.

The other difficult issue is if someone sees a vulnerability and tells a vendor: What do you do if the vendor doesn't do anything about it? The common practice I see is that people find vulnerabilities and report them to the vendor, which I think is responsible. If they don't feel the vendor is acting on it, they threaten to go public, and if that doesn't work, they go public. Having a set of rules so everyone knows what the game plan is and follows it is better. Best practices are a good thing. It's incumbent on vendors, when there's a serious problem that needs a patch, to act judiciously to get the patch out. But they have to be given some amount of time, because there's engineering to be done and regression testing that has to be done; you don't want to put out a patch that breaks a lot of stuff. We just have to manage expectations. Things do have to happen; depending on the case, it may happen quickly or it may take a little time, but everyone has to work with due diligence.
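
The disclosure practice he describes--report privately, give the vendor time, escalate only if nothing happens--can be modeled as a simple timeline. The Python sketch below is illustrative only; the 30- and 60-day windows are invented for the example and reflect no official policy.

```python
from datetime import date, timedelta

# Illustrative model (not any official policy) of the disclosure
# practice Charney describes: report privately, give the vendor a
# window to patch, and escalate only if nothing happens.
PRIVATE_WINDOW = timedelta(days=30)   # vendor works on a fix quietly
WARNING_WINDOW = timedelta(days=60)   # researcher warns before going public

def disclosure_stage(reported_on, today, patch_released):
    """Return the appropriate disclosure step for a given date."""
    if patch_released:
        return "publish advisory alongside the patch"
    elapsed = today - reported_on
    if elapsed <= PRIVATE_WINDOW:
        return "vendor notified; keep details private"
    if elapsed <= WARNING_WINDOW:
        return "warn vendor that public disclosure is coming"
    return "disclose publicly"

if __name__ == "__main__":
    reported = date(2002, 6, 1)
    for day, patched in [(date(2002, 6, 15), False),
                         (date(2002, 7, 20), False),
                         (date(2002, 9, 1), False),
                         (date(2002, 7, 1), True)]:
        print(day, "->", disclosure_stage(reported, day, patched))
```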

CRN: Overall, what do you see as the key challenges ahead in Internet security?

Charney: As we put more sensitive personal data on the Internet, breaches of confidentiality and integrity take on ever more importance. As the criminal population has gotten more sophisticated, the risk that these technologies will be misused goes up dramatically. We have to be really vigilant about this stuff. On denial-of-service attacks, I think the threat model has changed somewhat over the last few years. We're now worried about terrorist groups, that they might want to disrupt networks in ways they didn't focus on before. The problem is exacerbated by our increasing reliance and dependence on these networks. When the Internet went down in 1988 because of the Morris worm, who noticed? Some people noticed, but it [the Internet] wasn't what it is today. The threat model has changed, and you see increased concern about identity theft and terrorist attacks on the network. At the same time, because of that, we see a push toward more secure technologies and better authentication. Biometrics would be one example of that; smart cards are another. We'll probably see more combination hardware/software solutions, in part because hardware tends to be more secure and harder to tamper with. You'll see society struggle to balance this need for more authentication with concerns about privacy and the right of anonymous speech.

CRN: Anything else?

Charney: Obviously, some of these issues will be very challenging. One is on the external side--how industry and government are going to work together to protect these infrastructures. The role of industry and the role of government are not yet clearly defined; no one has carefully parceled out who is responsible for doing what. As a precursor to that, we really have to get the national plan. Dick Clarke's [President Bush's cybersecurity chief] group is working on a national plan for securing the nation's critical infrastructure. ... First you need to get the plan right, because first you plan, then you implement through action. If the plan and vision are wrong, then everything else fails. Industry and government have a lot of hard work to do.