Jeremy wrote:

] The moral is obvious. You can't trust code that you
] did not totally create yourself.

If any discussion about secure computing platforms goes on long enough, this paper will come up. It's a flavor of Godwin's Law. Godwindows Law? Heh. Good topic to spin into a pre-Interz0ne rant.

I agree with what you bolded, in that having the code is not the end-all, be-all of trustworthy computing bases or anything like that. Some interesting things have been done with polluted compilers as well (there's a rough sketch of the trick below). A long time ago, this was my motivation for learning how to bootstrap a compiler and a base OS. I guess that places a fair amount of trust in GNU, OpenBSD, and others, and in many eyes making the difference. I tried to make sure I understood the chain of dependencies necessary to make basic server-type functions happen. On a personal computing level, I suck: I own an Apple. For the record, that's giving in, not selling out. I don't care to take the time rolling my own anything most of the time; I just want it to work. However, if I ever found proof that my personal privacy had been compromised by deliberate holes in my hardware or software, I would find myself very, very, very angry about it.

Matters of physical security in relation to software and hardware are another matter. It's just as easy, and more likely, to have your hardware physically compromised by any power strong enough to pull the strings on deliberately pre-placed software holes from vendors like Apple, Red Hat, Sun, or Microsoft. The "mad rogue coder with a silver bullet" risk is less likely in the public (read: open source) sector by virtue of more review. I'd think so, anyway.

On the other end of things, you know the feds look over what they use. If an intelligence agency such as the NSA found a serious deliberate hole in a piece of open code, I'd like to think that efforts would be made to see it removed from the mainline branches, and that an investigation would be started by other agencies to figure out how it got there. The source of such a thing would be a major concern, in the most general of ways; in that context I'd think national security would be the larger focus. Keeping a hole secret would only make sense if you were the only one holding the secret, or knew definitively who the other holders were. I'd like to think that our government would not be behind placing deliberate holes either, but I'm also naive for breakfast. They were fans of key escrow schemes in the past.

Someone _is_ keeping all the major distribution folks on their toes. I think I've seen a news story about "attempted break-ins" at just about every major open OS's source distribution site, as well as at other key pieces of code. People are aware of, and do think about, these types of code compromises.

From the perspective of Joe Hacker: when it comes to tradecraft, knowing is half the battle. The other half is really hard and time-consuming, so you do whatever parts of it you feel you have a reason to. You also really hope someone else is holding up their end of the bargain, and isn't trying to screw you over in the process. It's most likely that the software security holes used against you will be the ones that happened by accident. Shrug.
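For anyone who hasn't re-read the paper lately, the polluted-compiler trick mentioned above goes roughly like this: the compiler binary recognizes two particular pieces of source, the login program and the compiler itself, and miscompiles both on purpose, so the backdoor survives even when every line of source you can actually read is clean. What follows is only a toy C sketch of that pattern; the function names and the strings being matched are invented for illustration, and it is nowhere near a real compiler.

    /*
     * Toy illustration of the "trusting trust" pattern. Nothing here is
     * real compiler code; the matching strings are made up.
     */
    #include <stdio.h>
    #include <string.h>

    /* Stand-in for a compiler pass: decide what to emit for some source. */
    static void compile(const char *source)
    {
        if (strstr(source, "int login(")) {
            /* Trojan 1: when compiling login, quietly add a backdoor. */
            puts("emit: login() that accepts the real password OR a magic one");
        } else if (strstr(source, "static void compile(")) {
            /* Trojan 2: when compiling the compiler itself, re-insert both
             * of these checks, so clean source still yields a dirty binary. */
            puts("emit: a compiler that contains both of these checks again");
        } else {
            puts("emit: a faithful translation of the source");
        }
    }

    int main(void)
    {
        compile("int login(const char *user, const char *pass) { ... }");
        compile("static void compile(const char *source) { ... }");
        compile("int add(int a, int b) { return a + b; }");
        return 0;
    }

The point of the sketch is that once the second check exists, you can delete the trojan from the compiler's source forever and still keep getting it back in the binary, which is why bootstrapping from something you trust matters at all.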
The commentary at the end of the paper is great:

] I have watched kids testifying before Congress. It is clear
] that they are completely unaware of the seriousness of their
] acts. There is obviously a cultural gap. The act of breaking
] into a computer system has to have the same social stigma
] as breaking into a neighbor's house. It should not matter
] that the neighbor's door is unlocked. The press must learn
] that misguided use of a computer is no more amazing than
] drunk driving of an automobile.

You could write that today, and it would still be right. However, there are a few caveats. More people get killed by cars, but from a national security perspective a virus is a much sexier thing to place undue concern upon. Knowledge and application remain separate things, and breaking into other people's systems should be seen as a crime, as Thompson suggests, whether the perpetrator is an individual, a company, or a government. Leaving backdoors is certainly not the type of behavior that should be seen as acceptable from any of those parties either. A culture that sees it this way is what will make the difference, even more than laws that define it. I do feel that such a culture largely exists within the hacking community. Let's hope that the undue concern present in other spaces does not backfire, as Thompson worried it might, and instead gets a good result. Cyber attack is a risk, but I have a gut feeling that years will pass in which it shows up only as a plot device on really bad TV shows, as opposed to as actual attacks. The real thing will not wind up being so scary after the show, or so goes the theory.

The virus people do still worry me, though. Evolving threats isn't the greatest of ideas. There are many parties paying attention, and it's all being done in a way where it's quite visible. I wish I knew who the parties involved were so I had a sense of the stakes. If it's the intelligence agencies of various nation states and large corporations, fine then. OK. It's the beginning of that Gibson-esque universe I have the occasional nightmare (wet dream?) about. If it's really, genuinely kids aged 16 to 24 in school, and most of the thinking is going into which Cowboy Bebop reference to use, then I'm just really scared. Someone is going to make a mistake, and either way, I think I know which segment of the hacking community is going to get the blame.

[U: And it's people like me, who are associated with such communities, who fit the "disgruntled acquired geek meat" profile, who have the skills and knowledge necessary, and who are loudmouths to boot, that will wind up with the FBI wrongly busting in their door someday on a hunch driven by a profile. Or worse, because of people making up shit about you that was inspired by those bounties rather than truth. Fuck you very much for the added stress. I get a massive wave of paranoia every time I see a virus strike MS.]

Whoever it is, please stick to things that attack desktops when playing with anything that self-propagates. Don't do stuff that propagates and goes after databases, routers, firewalls, etc. A virus that went after WiFi APs or SOHO devices would be borderline. Anything that targets stuff that could even remotely be used for serious infrastructure is... well... seriously scary. In the same way that breaking into a computer system should be seen as breaking into someone's house, making something that targets core infrastructure, and then giving it a self-propagating profile, should be seen as construction of a WMD. Releasing it should be seen as a very serious crime. Those crimes, due to the global nature of the internet, would be global crimes.

Also, in the same way that breaking into a computer system should be seen as breaking into a house, our property and privacy rights should be respected. The government should not have any backdoors into private systems.
I also hope they play their role in protecting us in the event that holes are discovered. I'd hope that any big players in such games see the stakes for what they are, and do not do something stupid. And I hope some random college kid doesn't drop the Cisco self-propagating nuke someday. As I've said in the past, I tend to fear random entities doing stupid things more than governmental entities doing something intentional. Shrug, again.

RE: ACM Classic: Reflections on Trusting Trust