Common Use-Cases for Merging

Topic: Technology
4:31 pm EST, Jan 6, 2008
There are many different uses for svn merge, and this section describes the most common ones you're likely to run into.
Branches, oh my. Common Use-Cases for Merging

Topic: Technology
2:40 am EST, Jan 5, 2008
And of course, there was John Glenn, monitored inside and out, blood tested, urine sampled, entire organism analyzed for signs of accelerated aging. Close observation of the Senator suggested that there might not be any medical obstacles to launching the entire legislative branch into space, possibly the most encouraging scientific result of the mission.
This site is GREAT. A Rocket To Nowhere

Ajax Security Book Out! Awesome buzz!

Topic: Technology
12:07 pm EST, Dec 21, 2007
Acidus writes:
Ajax Security is out and the feedback I'm getting is incredible.

Andrew van der Stock, the Executive Director of OWASP, reviewed a draft of Ajax Security, and here is what he had to say about it:

If you are writing or reviewing Ajax code, you need this book. Billy and Bryan have done a stellar job in a nascent area of our field, and they deserve success. Go buy this book. Is it just a re-hash of old presentations? No. The book breaks some new ground and fills in a lot of the blanks in all of our presentations and demos. I hadn't heard of some of these attacks in book form before. The examples improved my knowledge of DOM and other injections considerably, so there's something there for the advanced folks as well as the newbies. I really liked the easy, laid-back writing style. Billy and Bryan's text is straightforward and easy to understand. They get across the concepts in a relatively new area of our field. The structure flows pretty well, building upon what you've already learnt; there is advanced stuff, but the authors have to bring the newbie audience along for the ride. Billy and Bryan spend a bit of time repeating the old hoary "no new attacks in Ajax" meme, which is big with the popular kids (mainly because their products can't detect or scan Ajax code yet and still want money from you), and then spend the rest of the book debunking their own propaganda with a wonderful panache that beats the meme into a bloody pulp and buries it for all time.
Web security guru dre offers up this review of Ajax Security:

It's quite possible that many Star Wars Ajax security fans will be calling Billy Hoffman, the great "Obi-Wan", and pdp "Lord Vader" to represent the "light" and "dark" sides that are The Force behind the power wielded by Ajax. The book, Ajax Security, covered a lot of new material that hadn't been seen or talked about in the press or the security industry. The authors introduced Ajax security topics with ease and provided greater understanding of how to view Javascript malware, tri...

Ajax Security Book Out! Awesome buzz!
Slashdot | Official 700MHz Bidder List

Topic: Technology
11:22 pm EST, Dec 20, 2007
For many years, the idea of a truly software-based, frequency-hopping radio was the stuff of dreams and science fiction. We have them today. They work well, but are still limited in the frequencies they can utilize. Power sources have been the biggest limiting factor for opening up spectrum for unregulated use, but that too is quickly being overcome by technological discoveries (see the nano-wire battery article from yesterday).

Regulated spectrum may have been important when radio transmissions were inefficient, dirty, and even dangerous. We've overcome those issues, and now have the technology to utilize wireless transmissions that could be best navigated and selected based on distance to the other transceiving device, available power for transceiving, speed and latency requirements, and other traffic detected. Because power is not limitless, the idea that one massive power source would likely overpower everything in the area rests on the assumption that someone would or even could transmit garbage over every frequency at high power levels. Yes, I know there are technological marvels that COULD do this, and that's why I will allow for the idea that the FCC may exist only to penalize users of such dirty-transmission devices. Personally, I feel that the market would correct for these power-wasting freaks, but I'll at least accept a small role for the FCC to prevent dirty transmissions.

With frequency hopping and software-based radios, we'd reach a new era of wireless. We're WASTING gigahertz of spectrum on old media -- TV, radio, even cell phone and cordless phone frequencies that could be better used to combine everything into a WiFi-like system. The days of forced media schedules are slowly ending, with more and more people grabbing TV shows a la carte, via BitTorrent or PVR systems.

Instead of flooding the airwaves with the gigahertz of garbage no one is watching, deregulate that bandwidth and allow more wireless providers to send people what they want, when they want it. Those who demand faster bandwidth and lower latency may spend the money for the extra power they'll need to acquire the spectrum they need in their area, for their purposes. Yet power is the BIGGEST cost of wireless transmissions, and I can guarantee that anyone who wants to hog a wide swath of spectrum will find themselves with an unbelievable electric bill after one month. Even with someone locally occupying a certain range of frequencies, there is still a huge amount of bandwidth available across the rest of the radio spectrum. A move to digital, on-demand, IP-based transceiving makes more sense.

We've moved beyond the need for fixed frequencies, except for the old media, who need to control, and regulate, competition out of existence. They know their time has come. The need to keep cell phones on the same basic frequency, TV on the same basic frequency, and radio on the same basic frequency has been replaced, and proven so, by the newer technologies out there (satellite, XM, WiFi, even 700MHz cordless phones). Those days are over, but we're too engaged with the old system to realize it.

The best thing the FCC could do is to just deregulate the 700MHz-900MHz frequencies entirely, and let the market provide services. Let's see what would happen. I bet amazing things would come into the market quickly. Then start deregulating more frequencies, until the FCC shrinks to a minor enforcer of clean transceiving.
Slashdot | Official 700MHz Bidder List

PhreakNIC 0x0b Day 1 - 04 - CypherGhost - Postal Experiments

Topic: Technology
11:06 pm EST, Dec 20, 2007
PhreakNIC 0x0b presentation from CypherGhost on the funny rules surrounding what you can and can't ship via the USPS, as well as why some things take longer than others.
Totally amazing 'mail phreaking.' PhreakNIC 0x0b Day 1 - 04 - CypherGhost - Postal Experiments

Mechanical Construction and Production - Bird Strike Tester

Topic: Technology
11:03 am EST, Dec 15, 2007
The experimental setup is shown at the right. It can launch an impactor with a mass of up to 2 kilograms at a velocity in the range of 50 to 400 m/s. The maximum speed depends on the impactor mass (maximum kinetic energy: 40 kJ).
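The quoted figures are self-consistent: with kinetic energy E = mv²/2 capped at 40 kJ, the top speed is set by the impactor mass. A quick sanity check in plain Python, using only the numbers quoted above:

```python
import math

MAX_ENERGY_J = 40_000  # the quoted 40 kJ cap

def max_velocity(mass_kg):
    """Top speed (m/s) for a given impactor mass under the energy cap: v = sqrt(2E/m)."""
    return math.sqrt(2 * MAX_ENERGY_J / mass_kg)

# The full 2 kg impactor tops out at 200 m/s;
# the quoted 400 m/s maximum implies an impactor of 0.5 kg or less.
print(max_velocity(2.0))   # 200.0
print(max_velocity(0.5))   # 400.0
```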
Propels roast chicken at 400 m/s. Give this to the boys in Iraq. Mechanical Construction and Production - Bird Strike Tester

Database Developments - Startupping Forums

Topic: Technology
4:50 am EST, Dec 15, 2007
The single biggest challenge in scaling an Internet service is the database. In my experience, the way things generally work is like this: you put up an Internet service using a single instance of a basic database, like MySQL or BerkeleyDB. Everything is run off that single database. As you grow, the database becomes overloaded, and then you begin an endless cycle of scaling work. First, you split the single database into multiple, unrelated databases. You also need to worry about reliability, and so start looking at things like hot backups. Then you look at adding read-only copies of your databases, and with those you have to worry about keeping things in sync and other related hassles. And of course you start implementing in-memory caches using something like memcached, which are great, but have their own issues as well (cold caches on restarts being the biggest I can think of offhand). All of this takes a lot of work, a lot of expertise, and a lot of maintenance.

Two items relating to databases caught my eye this week. The first was the benchmarking of a RAID consisting of Solid State Drives, or SSDs. An SSD is basically a chunk of non-volatile RAM in a package with a disk-drive interface. They're designed to replace standard hard drives with something with the performance of RAM. While SSDs have been in development for a few years, I think they're really starting to become interesting for use in databases. They provide orders-of-magnitude speedups over hard drives in some areas, like seek performance. They're not perfect, but if I were in a situation where I needed to scale a database immediately, I'd definitely look at using an SSD.

The other item was the announcement of Amazon's SimpleDB. This is a new web service that goes along with the existing S3 and EC2 services. SimpleDB provides a robust, simple, scalable database system. If it offers high performance and high reliability, as they claim, then this could be a very big deal.
In combination with the other services, it would eliminate the vast majority of scaling issues facing an Internet service. It's only in limited beta right now, but once it goes live (and works as advertised), I think I'd be hard pressed to not recommend that a new startup host their entire site using Amazon. This further reduces the cost structure associated with an Internet startup (both in terms of money and in terms of talent needed) and lowers the barrier to entry for new services. Like I said, a very big deal.
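The memcached pattern described above (with its cold-cache caveat) can be sketched in a few lines. This is a minimal illustration, not a real memcached integration: a plain dict stands in for both the cache client and the database, and every name here is made up:

```python
# Read-through cache sketch: serve reads from the cache, fall back to the
# database on a miss, and warm the cache so the next read is fast.

cache = {}                       # stands in for a memcached client
database = {"user:1": "alice"}   # stands in for the primary database

def get_value(key):
    """Serve from cache; on a miss (e.g. a cold cache after a restart),
    hit the database and populate the cache."""
    value = cache.get(key)
    if value is None:
        value = database.get(key)    # the slow path the cache exists to avoid
        if value is not None:
            cache[key] = value       # warm the cache for subsequent reads
    return value

def set_value(key, value):
    """Write through: update the database, then keep the cache in sync."""
    database[key] = value
    cache[key] = value
```

A restart empties `cache` while `database` persists, which is exactly the cold-cache problem the post mentions: every read after a restart takes the slow path until the cache warms back up.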
Database Developments - Startupping Forums

Catalyst - Accelerating Perl Web Application Development

Topic: Technology
11:02 am EST, Dec 14, 2007
Many web applications are implemented in a way that makes developing them painful and repetitive. Catalyst is an open-source Perl-based Model-View-Controller framework that aims to solve this problem by reorganizing your web application so you can design and implement it in a natural, maintainable, and testable manner, making web development fun, fast, and rewarding. Everything that your web application needs to do is written only once: you connect to the database in one place, have configuration in one place, etc. Then you just write actions for each URL that your application needs, without worrying about the database connections or the HTML to produce. Catalyst handles the details so you can worry about writing your application.

Catalyst is designed to be reliable. There are hundreds of production applications and thousands of users. The code is well-tested, and new releases almost always maintain compatibility with applications written for older versions. You don't have to worry about Catalyst breaking your application and slowing down your development. It just works.

Most importantly, Catalyst has a thriving community. You can ask a question on the IRC channel and get a response at almost any time of the day. This book embodies Catalyst's philosophies of Do It Yourself and Don't Repeat Yourself.
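The "write an action for each URL" idea the excerpt describes can be sketched in a few lines. This is not Catalyst's real API (Catalyst is Perl); it is a hypothetical Python illustration of the dispatch pattern, with all routes and handler names invented:

```python
# One action per URL: a registry maps request paths to handler functions,
# and a dispatcher looks up and runs the right one.

actions = {}  # URL path -> handler function

def action(path):
    """Register a handler for a URL path (the role Catalyst's action
    attributes play in Perl)."""
    def register(func):
        actions[path] = func
        return func
    return register

@action("/hello")
def hello():
    return "Hello, world"

@action("/user/list")
def user_list():
    return "alice, bob"

def dispatch(path):
    """Look up and run the action for a request path."""
    handler = actions.get(path)
    return handler() if handler else "404 Not Found"
```

The framework owns `dispatch` and the plumbing around it; the application author only writes the small handler functions, which is the repetition-saving point the excerpt is making.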
If you use Perl, or know Perl, and you're not using Catalyst for your web development... you should be. Probably. It's great. Catalyst - Accelerating Perl Web Application Development

Topic: Technology
4:42 pm EST, Dec 12, 2007
What are you guys using to edit Javascript? I am tired of not having a decent IDE for this. Javascript Editor