Lisdexamfetamine - Wikipedia, the free encyclopedia |
|
|
Topic: Technology |
7:23 pm EDT, Jul 23, 2008 |
Lisdexamfetamine (L-lysine-d-amphetamine), sold under the brand name Vyvanse, is an inactive prodrug consisting of the psychostimulant d-amphetamine coupled with the essential amino acid L-lysine. Lisdexamfetamine was developed so that the psychostimulant is released and activated more slowly: the prodrug molecule is hydrolyzed, cleaving off the amino acid, during the first pass through the intestines and/or the liver. Essentially, this makes lisdexamfetamine an extended-release formulation of d-amphetamine; however, the release characteristics are integral to the molecule itself rather than to the construction of the pill.
Lisdexamfetamine - Wikipedia, the free encyclopedia |
|
Canned Platypus » SSD Power Consumption |
|
|
Topic: Technology |
8:35 pm EDT, Jul 22, 2008 |
And people accuse me of offensively characterizing people who reach different conclusions than I have. It is to laugh. In reply, and attempting to remain more civil than Robin had been, I said this:

You misread the article, Robin. If you look at the power-consumption results on page 14, you'll see that the Hitachi drive drew more power *at idle* than the Sandisk SSD did *under load*, and that doesn't even count the difference in cooling needs. The MemoRight SSD also used less power at idle than the Hitachi did, and idle is where most notebook drives are most of the time. Those results are starkly at odds with the traffic-generating headline, and until the inconsistency is resolved I wouldn't jump to any conclusions.

What problems do exist with SSD power consumption are also more easily solved than you let on. Some functions can be moved back to the host, others to dedicated silicon which can do them very efficiently. It's not like hard drives don't have processors in them drawing power too, y'know. When somebody does a head-to-head comparison where the drives are idle 90% of the time and only reading 90% of the remainder, and properly accounts for the fact that whole-system I/O performance might not scale perfectly with drive performance, then it'll be worth paying attention to.

Of course, Robin tried to poison the well by preemptively dismissing any methodological criticism as "denial and obfuscation," but I'd like to expand on that last point a bit. At the low end of the scale, a slightly improved I/O rate might prevent a processor from entering its power-saving sleep modes. At the high end of the scale, a slightly improved I/O rate could push a multi-threaded benchmark past the point where context switches or queuing problems degrade performance. In these cases and many others, the result can be more instructions executed and more power drawn on the host side per I/O, yielding a worse-than-deserved result for a faster device on benchmarks such as Tom's Hardware used. Since the power consumed by CPUs and chipsets and other host-side components can be up to two orders of magnitude more than the devices under test, it doesn't take long at all before these effects make such test results meaningless or misleading.

I'm sure Robin knows a thing or two about storage benchmarketing, which is not to say that he has engaged in it himself but that he must be aware of it. Workloads matter, and any semi-competent benchmarker can mis-tune or mis-apply a benchmark so that it shows something other than the useful truth. Starting from an assumption that Tom's Hardware ran the right benchmark and demanding that anyone else explain its flaws is demanding that people reason backwards. Instead we should reason forwards, starting with what we know about I/O loads on the class of systems we're studying, going from there to benchmarks, results, and conclusions in that order. That's where the ... [ Read More (0.1k in body) ]

Canned Platypus » SSD Power Consumption
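The duty-cycle argument above is easy to make concrete. Here is a minimal sketch of a time-weighted power model in Python; the wattage figures are invented placeholders, not measurements from the Tom's Hardware review:

    def effective_power(idle_w, active_w, idle_fraction):
        # Time-weighted average draw for a drive with the given duty cycle.
        return idle_fraction * idle_w + (1 - idle_fraction) * active_w

    # Placeholder figures: hard drives tend to idle hot, SSDs near zero,
    # even when the SSD draws more at full load.
    hdd = effective_power(idle_w=0.9, active_w=2.0, idle_fraction=0.9)
    ssd = effective_power(idle_w=0.1, active_w=2.5, idle_fraction=0.9)

    print(f"HDD effective draw: {hdd:.2f} W")  # 1.01 W
    print(f"SSD effective draw: {ssd:.2f} W")  # 0.34 W

With a 90%-idle workload, the drive with the worse load-power number still comes out well ahead overall, which is exactly why the benchmark's duty cycle matters more than its headline.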
|
|
GT VentureLab: Requiem for a (patent) dream |
|
|
Topic: Technology |
6:03 pm EDT, Jul 22, 2008 |
If your software business is based more upon your patent portfolio than your ability to execute, your reality is about to change. The US Patent and Trademark Office (USPTO) has just added two important new tests for patentable subject matter that will invalidate many if not most software patents. In a recent (June 2008) decision, the USPTO has taken the position that process inventions generally are unpatentable unless they "result in a physical transformation of an article" or are "tied to a particular machine." Most software patents are process inventions but can meet neither of these requirements: they do not generally physically transform anything, and they are usually defined broadly so that they can protect the invention while running on any general-purpose computer. Very specifically, the USPTO has categorically stated that a general-purpose computer is not a "particular machine." The Patent Law Blog has a lot more information on this important ruling.
GT VentureLab: Requiem for a (patent) dream |
|
Patent Law Blog (Patently-O): The Death of Google's Patents |
|
|
Topic: Technology |
6:01 pm EDT, Jul 22, 2008 |
The Patent and Trademark Office has now made clear that its newly developed position on patentable subject matter will invalidate many and perhaps most software patents, including pioneering patent claims of such innovators as Google, Inc. In a series of cases including In re Nuijten, In re Comiskey and In re Bilski, the Patent and Trademark Office has argued in favor of imposing new restrictions on the scope of patentable subject matter set forth by Congress in § 101 of the Patent Act. In the most recent of these three, the currently pending en banc Bilski appeal, the Office takes the position that process inventions generally are unpatentable unless they "result in a physical transformation of an article" or are "tied to a particular machine."[1] Perhaps, the agency has conceded, some "new, unforeseen technology" might warrant an "exception" to this formalistic test, but in the agency's view, no such technology has yet emerged, so there is no reason currently to use a more inclusive standard.[2]
WOO WOO Patent Law Blog (Patently-O): The Death of Google's Patents |
|
.:: Phrack Magazine ::. |
|
|
Topic: Technology |
12:22 am EDT, Jul 18, 2008 |
------------- Hacker's Myth -------------

This is a statement on the fate of the modern underground. There will be none of the nostalgia, melodrama, black hat rhetoric or white hat over-analysis that normally accompanies such writing.

Since the early sixties there has been just one continuous hacking scene. From phreaking to hacking, people came and went, explosions of activity, various geographical shifts of influence. But although the scene seemed to constantly redefine itself in the ebb and flow of technology, it always had a direct lineage to the past, with similar traditions, culture and spirit. In the past few years this connection has been completely severed.

And so there's very little point in writing about what the underground used to be; leave that to the historians. Very little point writing about what should be done to make everything good again; leave that to the dreamers and idealists. Instead I'm going to lay down some cold hard facts about the way things are now, and more importantly, how they came to be this way. This is the story of how the underground died.
.:: Phrack Magazine ::. |
|
Perl Catalyst and Cloud Computing - Vox |
|
|
Topic: Technology |
1:06 am EDT, Jul 17, 2008 |
Recently I had the chance to speak with Frank Speiser, who is leading several Cloud Computing projects that should be of interest to the Perl Community and to Catalyst developers in particular.
Perl Catalyst and Cloud Computing - Vox |
|
Metaphor Crash: Why creating software really is like building cars |
|
|
Topic: Technology |
5:05 pm EDT, Jul 16, 2008 |
I am going down this path to make a point. The common argument I see is that NASA produces reliable software and it costs millions of dollars. Who could afford that level of reliability other than an agency with deep tax pockets? The common computer user certainly couldn't. But what this argument misses is that NASA is pretty much the software's only customer.

By the same logic, if a car can only be made reliable by investing millions of dollars in research and development, then a company that builds just a single car (as in one physical car, not one model) would have to charge millions for it just to break even. By this argument, which is commonly given to justify sloppy software, nobody but the most wealthy could afford a reliable car. But that is not the case, because this is not what car manufacturers do. They invest millions in making a reliable design, that is true. But then they reproduce that design at a minute fraction of the original cost. That design, reproduced tens of thousands of times, is then shipped to dealers who can afford to charge $25K for the same car that took millions to design and produce, and still make enough commission to take a week-long vacation to the Bahamas. In other words, the cost of producing a reliable car CAN be compared to the cost of producing reliable software if both have a large customer base. It is a valid comparison.

...

The problems I have seen seem to come in with custom development projects. They usually fall into the category of "one customer who must bear all of the cost". These projects lack the economy of scale that mass-produced products enjoy, and are like NASA and FAA projects in this respect. One client must bear the full cost, and the argument can be made that, faced with the enormous cost of producing extremely high-quality software, most companies settle for "good enough" software at a much lower cost. There is a lot of truth in this argument and I believe it is valid. But the flaw here is the assumption that the only way to produce quality software is by dramatically increasing cost. The implication is that clients put up with sloppy work because they can't afford high-quality work. What this ignores is that there are many ways of controlling both cost and quality. In my next article, I will talk about some of the main causes of high cost and poor quality on software projects. See you then.
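The economy-of-scale arithmetic here is simple enough to sketch in a few lines of Python. The dollar figures below are invented for illustration, not taken from the article:

    def unit_cost(fixed_cost, marginal_cost, units):
        # Spread the one-time design/QA cost across every unit built.
        return fixed_cost / units + marginal_cost

    design_cost = 50_000_000  # hypothetical up-front cost of a reliable design
    build_cost = 15_000       # hypothetical marginal cost per car

    print(f"One car built: ${unit_cost(design_cost, build_cost, 1):,.0f}")       # $50,015,000
    print(f"10,000 cars:   ${unit_cost(design_cost, build_cost, 10_000):,.0f}")  # $20,000

The same fixed investment that prices a single custom build out of reach becomes a rounding error per unit at volume, which is the article's point about NASA-style one-customer projects versus mass-market software.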
Metaphor Crash: Why creating software really is like building cars |
|
Ask E.T.: Visualizing Social Networks |
|
|
Topic: Technology |
1:01 am EDT, Jul 16, 2008 |
Thank you Kindly Contributor Pierre Scalise. These network diagrams provoke the question: What is the space that network diagrams reside in? I suppose the nodes are positioned relative to one another, which means there must be many diagrammatic arrangements consistent with a data set, and that the inferences about social networks perhaps depend on the arrangement chosen. Are social network diagrams somewhat like constellation (star-myth) maps? That is, self-referential and somewhat indeterminate. More generally, in thinking about diagrams that show links, it is helpful early on to identify the space in which the linked nouns are placed. See also the thread on analytical issues in causal diagrams: http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=0000yO&topic_id=1&topic=
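One way to see the indeterminacy: run the same force-directed layout twice with different random seeds. A minimal sketch, assuming the Python networkx library and using a stock example graph in place of real social-network data:

    import networkx as nx

    G = nx.karate_club_graph()  # stock example social network

    # Two runs of the same force-directed layout, differing only in seed.
    layout_a = nx.spring_layout(G, seed=1)
    layout_b = nx.spring_layout(G, seed=2)

    # Node 0 lands somewhere different each time even though the data is
    # identical, so positions and distances in the drawing carry no
    # intrinsic meaning; only the link structure does.
    print("seed 1:", layout_a[0])
    print("seed 2:", layout_b[0])

Both drawings are equally "correct" renderings of the same data set, which is exactly the constellation-map problem raised above.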
Nick, I'm thinking you might want to get into this, as it's a chance to show off the social network visualizations you did for MemeStreams to Edward Tufte. Ask E.T.: Visualizing Social Networks |
|
Your life will be flashed before your eyes | Technology | The Guardian |
|
|
Topic: Technology |
12:58 am EDT, Jul 16, 2008 |
Babak Parviz wears contact lenses. But he's not yet using the new contact lenses he's made in his Seattle laboratory. Containing electronic circuits, they look like something from a science fiction movie. He's now going to add some extremely small light-emitting diodes (LEDs), helping turn his prototype contact lenses into a sophisticated personal display: the tiniest one possible.

As an assistant professor of electrical engineering at the University of Washington, Parviz works on bio-nanotechnology, self-assembly, nanofabrication and micro-electro-mechanical systems. He makes tiny but functional electronic devices and, using nanotechnology and microfabrication techniques, integrates them onto polymers or glass using a process known as self-assembly. So how did he think of making a "bionic" contact lens? "Imagine a person with that kind of research expertise and background," says Parviz. "Imagine also the same person waking up every morning and putting a contact lens in his eye."
Your life will be flashed before your eyes | Technology | The Guardian |
|
Computer Terminal : My[confined]Space |
|
|
Topic: Technology |
4:17 pm EDT, Jul 15, 2008 |
http://www.myconfinedspace.com/wp-content/uploads/2008/07/computer-terminal.jpg
Computer Terminal : My[confined]Space |
|