Study: Antivirus Software Catches About Half Of Malware, Misses 15 Percent Altogether
Topic: Technology
10:22 am EST, Mar 3, 2009
Antivirus software immediately discovered only 53 percent of malware samples, according to data gathered by Damballa in a six-month study that used McAfee Scan Engine v5.3.00 to scan more than 200,000 malware samples. Another 32 percent were found later on, and 15 percent were not detected at all. The average delay in detection and remediation was 54 days. ... Failsafe 3.0 includes a management console and will ship this month, with pricing starting at $100,000 for 10,000 nodes. It's aimed at organizations that prefer to keep their botnet-detection "locally in the cloud," Guerry says. "This is sensitive information to these clients," he says.
They couldn't say "in their LAN or intranet"? They had to work "cloud" in there.
The Multi-Principal OS Construction of the Gazelle Web Browser - Microsoft Research
Topic: Technology
7:28 pm EST, Feb 22, 2009
Web browsers originated as applications that people used to view static web sites sequentially. As web sites evolved into dynamic web applications composing content from various web sites, browsers have become multi-principal operating environments with resources shared among mutually distrusting web site principals. Nevertheless, no existing browsers, including new architectures like IE 8, Google Chrome, and OP, have a multi-principal operating system construction that gives a browser-based OS the exclusive control to manage the protection of all system resources among web site principals.
In this paper, we introduce Gazelle, a secure web browser constructed as a multi-principal OS. Gazelle's Browser Kernel is an operating system that exclusively manages resource protection and sharing across web site principals. This construction exposes intricate design issues that no previous work has identified, such as legacy protection of cross-origin script source, and cross-principal, cross-process display and events protection. We elaborate on these issues and provide comprehensive solutions.
Our prototype implementation and evaluation experience indicates that it is realistic to turn an existing browser into a multi-principal OS that yields significantly stronger security and robustness with acceptable performance and backward compatibility.
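The unit of isolation here is the web-site principal, which the Gazelle paper identifies with the origin: the (protocol, host, port) triple. A minimal sketch of that identity check — the function names are mine, not Gazelle's, and in Gazelle this decision is made centrally by the Browser Kernel rather than by page-side script:

```javascript
// Toy sketch: a "principal" as an origin triple (protocol, host, port),
// as in Gazelle's model. samePrincipal() is a hypothetical name.
function principalOf(url) {
  const u = new URL(url);
  return {
    protocol: u.protocol,                                    // e.g. "https:"
    host: u.hostname,                                        // e.g. "login.example.com"
    port: u.port || (u.protocol === "https:" ? "443" : "80"), // default ports
  };
}

function samePrincipal(a, b) {
  const p = principalOf(a), q = principalOf(b);
  return p.protocol === q.protocol && p.host === q.host && p.port === q.port;
}
```

Two pages on the same host but different schemes are different principals, so the kernel would mediate any resource sharing between them rather than letting either side decide.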
I mainly want to write this down somewhere so I can refer back to it. HTTP never ceases to surprise me and I never know when I'll need this info again.
Yesterday I started looking into a bug with our Web Macro Recorder (WMR) tool. In the world of web scanners, being able to record login macros and knowing when to replay them is essential to quality coverage during a scan.
The WMR would hang when attempting to go through the login process on some sites, e.g. Yahoo.com. After receiving a 301 redirect to the HTTPS login page it would just quit, give up, throw in the proverbial towel. I watched this happen in Wireshark; there was no RST, there was nothing.
I decided to try to recreate this on a server I control so that I could mess around. After I noticed it was using HTTP/1.0, I tried using the header() function in PHP to send responses as HTTP/1.0. No luck... There is a bug with the header() function in PHP that requires you to add some additional voodoo.
After implementing this I wasn't able to reproduce the bug in WMR that was occurring on Yahoo.com. The Apache server I was using was also configured to send additional headers that Yahoo wasn't including. The one that caught my suspicion was the Connection: Keep-Alive header.
Using interactive mode, I tried removing the Connection header sent by my server and, success, I was able to reproduce the same behavior that occurs with Yahoo.com. So one issue is that the WMR is not properly handling HTTP/1.0 redirects.
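The root cause fits a one-line rule from the HTTP specs: an HTTP/1.0 response is non-persistent unless it explicitly says Connection: keep-alive, while an HTTP/1.1 response is persistent unless it says Connection: close. A client that assumes keep-alive on a 1.0 response will sit waiting on a socket the server considers finished. A sketch of the decision a client should make (a hypothetical helper, not WMR's actual code):

```javascript
// Decide whether the server will close the connection after this response.
// HTTP/1.0 defaults to close; HTTP/1.1 defaults to keep-alive.
function serverWillClose(httpVersion, connectionHeader) {
  const conn = (connectionHeader || "").trim().toLowerCase();
  if (httpVersion === "1.0") return conn !== "keep-alive"; // close unless opted in
  return conn === "close";                                 // 1.1: persistent unless opted out
}
```

With no Connection header at all — exactly what Yahoo was sending on its HTTP/1.0 redirect — the correct answer is "the server is closing," and the client should follow the redirect on a fresh connection instead of waiting.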
In other testing it was determined that WMR would work sometimes but not every time; e.g., we saw it work about 66% of the time on Citigroup.com. After the revelation with Yahoo.com, I decided to go back and look at Citigroup's HTTP responses.
I noticed this:
HTTP/1.1 302 Moved Temporarily
Date: Thu, 12 Feb 2009 16:11:20 GMT
Server: Hitbox Gateway 9.3.6-rc1
P3P: policyref="/w3c/p3p.xml", CP="NOI DSP LAW NID PSA ADM OUR IND NAV COM"
Set-Cookie: CTG=1234455080; path=/; domain=vendorweb.citibank.com; expires=Thu, 19-Feb-2009 16:11:20 GMT; max-age=604800
nnCoection: close
Pragma: no-c...
Recently, a popular website "phpbb.com" was hacked. The hacker published approximately 20,000 user passwords from the site. This is like candy to us security professionals, because it's hard data we can use to figure out how users choose passwords. I wrote a program to analyze these passwords looking for patterns, and came up with some interesting results.
This incident is similar to one two years ago when MySpace was hacked, revealing about 30,000 passwords. Both Wired and InfoWorld published articles analyzing the passwords.
The striking difference between the two incidents is that the phpbb passwords are simpler. MySpace requires that passwords "must be between 6 and 10 characters, and contain at least 1 number or punctuation character". Most people satisfied this requirement by simply appending '1' to the end of their passwords. The phpbb site has no such restrictions; the passwords are shorter and rarely contain anything more than a dictionary word.
It's hard to judge exactly how many passwords are dictionary words. A lot of things like "xbox" or "pokemon" are clearly words, but not in an English dictionary. I ran the phpbb passwords through various dictionary files, and came up with a 65% match (for a simple English dictionary) and 94% (for "hacker" dictionaries). The dictionary words were overwhelmingly simple things, like "apple" or "orange", rather than complex words like "pomegranate".
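The measurement itself is easy to reproduce; a sketch of the kind of counting involved (the word lists here are illustrative stand-ins, not the actual phpbb dump or the dictionary files used):

```javascript
// Fraction of passwords that appear verbatim (case-insensitively)
// in a dictionary word list.
function dictionaryMatchRate(passwords, dictionaryWords) {
  const dict = new Set(dictionaryWords.map((w) => w.toLowerCase()));
  const hits = passwords.filter((p) => dict.has(p.toLowerCase())).length;
  return hits / passwords.length;
}
```

The gap between the 65% and 94% figures is then just a function of how aggressive the dictionary is — "hacker" word lists include things like game titles and leetspeak variants that a plain English dictionary misses.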
I’m working on a book for Addison-Wesley entitled "Protocols And Performance: A Web Server In Three Acts (plus supporting cast)". The book will lead the reader through the history of the HTTP protocol by building three separate web servers: HTTP 0.9-1.0, HTTP 1.1, and HTTP “2.0”. During the process of putting these different servers together the reader will continually evaluate their performance and stability using statistical analysis methods.
As the story unfolds there will also be tales from other HTTP alternatives, internet bodies, and other protocols in development at the time. These will be told from the point of view of HTTP as a player in the story.
A big part of the book is teaching modern protocol design using scientific analysis, reusable libraries, modern techniques, and confirming that these new approaches are valid with evidence. This means taking on existing myths and dogma pushed by many proponents and also looking at other projects' bad code.
This seems like it will be a great book that I'll want to get when it's finally published. Addison-Wesley seems to try to publish technical books that involve storytelling rather than just technical reference books. Good stuff.
Now, the captchas provided by the site aren't very "hard" to solve (in fact, they're downright bad):
But there are many interesting parts here:
1. The HTML5 Canvas getImageData API is used to get at the pixel data from the Captcha image. Canvas gives you the ability to embed an image into a canvas (from which you can later extract the pixel data back out again).
2. The script includes an implementation of a neural network, written in pure JavaScript.
3. The pixel data, extracted from the image using Canvas, is fed into the neural network in an attempt to divine the exact characters being used - in a sort of crude form of Optical Character Recognition (OCR).
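Item 2 is the striking part: a feed-forward network in plain JavaScript, with Canvas-extracted pixel intensities as its inputs. A toy forward pass in the same spirit — the layer sizes and weights below are made-up stand-ins, not the real script's trained network:

```javascript
// Minimal feed-forward pass: inputs are normalized pixel intensities (0..1),
// one hidden layer, sigmoid activations. Weights are arbitrary stand-ins.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function forward(pixels, hiddenWeights, outputWeights) {
  // hiddenWeights: one weight row per hidden neuron; outputWeights: one per hidden neuron.
  const hidden = hiddenWeights.map((row) =>
    sigmoid(row.reduce((sum, w, i) => sum + w * pixels[i], 0))
  );
  return sigmoid(outputWeights.reduce((sum, w, i) => sum + w * hidden[i], 0));
}
```

In the actual attack this output would be one of several per-character scores, and the highest-scoring character wins — which is why weak, low-noise captchas fall to even this crude OCR.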
Nozzle: detecting heap spraying attacks - Microsoft Research
Topic: Technology
5:29 pm EST, Jan 22, 2009
Heap spraying is a new security attack that significantly increases the exploitability of existing memory corruption errors in type unsafe applications. With heap spraying, attackers leverage their ability to allocate arbitrary objects in the heap of a type-safe language, such as JavaScript, literally filling the heap with objects that contain dangerous exploit code. In recent years, spraying has been used in many real security exploits, especially in Web browsers.
We propose Nozzle, a runtime monitoring infrastructure that detects attempts by attackers to spray the heap. Nozzle uses lightweight emulation techniques to detect the presence of objects that contain executable code. To reduce false positives, we developed a notion of global “heap health”.
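As a crude illustration of what "objects that contain executable code" means at the byte level: one classic spray tell is a long run of x86 NOP bytes (0x90) used as a sled in front of the shellcode. A toy check — this is emphatically not Nozzle's algorithm, which interprets object contents as code and computes a global heap-health metric:

```javascript
// Toy heuristic: flag a buffer containing a run of 0x90 (x86 NOP) bytes
// longer than `threshold`. Real detectors like Nozzle go much further
// (disassembly, control-flow reachability, whole-heap statistics).
function hasNopSled(bytes, threshold = 32) {
  let run = 0;
  for (const b of bytes) {
    run = b === 0x90 ? run + 1 : 0;
    if (run >= threshold) return true;
  }
  return false;
}
```

The reason single-object checks like this produce false positives (and why Nozzle needed the global "heap health" notion) is that plenty of benign data — bitmaps, compressed streams — can locally look like a sled.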
Ben Livshits vs. Mark Dowd. The ultimate showdown. The ultimate destiny.
A series of Paul Graham's articles has led me to something I'm calling the Innovation Problem. Essentially it started when I read his article After Credentials. I enjoyed the article, but found this part odd:
Do they let energetic young people get paid market rate for the work they do? The young are the test, because when people aren't rewarded according to performance, they're invariably rewarded according to seniority instead. ... If people who are young but smart and driven can make more by starting their own companies than by working for existing ones, the existing companies are forced to pay more to keep them.
This statement about motives seemed out of sync with his essay Great Hackers:
Great programmers are sometimes said to be indifferent to money. This isn't quite true. It is true that all they really care about is doing interesting work. But if you make enough money, you get to work on whatever you want, and for that reason hackers are attracted by the idea of making really large amounts of money. But as long as they still have to show up for work every day, they care more about what they do there than how much they get paid for it.
Perhaps this is because Graham is talking about the general case of a person in the first essay and a subset of people (specifically, great programmers) in the second.
Now, I don't consider myself a super hacker, nor would I ever compare myself to someone like RTM or others Graham has mentioned. Quite the contrary, I've gone out of my way to deny unwarranted comparisons. I do, however, consider myself a hacker, and I understand exactly what Graham means in his second essay.
I think that performance metrics are one side of a two-sided coin, depending on what drives you as a person: pay or project.
Let me explain. I work for a Fortune 15 technology corporation. They pay me very, very, very well. However, in return I'm subjected to (along with a fair bit of good things) unbelievably stupid bullshit. They don't seem to realize that I couldn't give 2 shits about their money; otherwise I'd have a lot less bullshit in my life.
Jay Chaudhry met with me twice in the spring of 2008 and asked me to join his new startup, Zscaler. I turned him down for a couple of reasons, the biggest being that he kept appealing to the wrong side of me. He kept talking dollars; he never talked projects. How are you doing "in the cloud" security? Are you buying or building? ...
The Idea for Digitally Assisted Billiards was to help people understand the physics of pool by using a webcam and projector to directly give an in-game visualization of the shots being taken.
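The core of such a visualization is simple geometry: a projected shot line reflects off a cushion by negating the velocity component normal to the rail. A sketch of that idealized bounce (ignoring spin, friction, and cushion compression — the helper name is mine, not the project's):

```javascript
// Reflect a ball's velocity off an axis-aligned cushion.
// axis "x": a vertical rail flips the x component;
// axis "y": a horizontal rail flips the y component.
function bounce(velocity, axis) {
  return axis === "x"
    ? { vx: -velocity.vx, vy: velocity.vy }
    : { vx: velocity.vx, vy: -velocity.vy };
}
```

Chaining segments of this reflection gives the multi-cushion path a projector can draw on the table ahead of the shot.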