possibly noteworthy's MemeStreams blog
Current Topic: Technology

DRM: Desirable, inevitable, and almost irrelevant
Topic: Technology 4:46 pm EDT, Sep 15, 2007

Andrew Odlyzko's latest is a short rant on DRM. Consider it alongside "The Music Man," the recent Rick Rubin profile in the NYT Magazine.

The fundamental issue that limits current use and future prospects of DRM is that, in the words of [10],

The important thing is to maximize the value of your intellectual property, not to protect it for the sake of protection.

DRM all too often gets in the way of maximizing the value of intellectual property.

People are very frequently willing to pay more for flat rate plans than they are for metered ones, even if their usage does not change. The trend towards flat rate plans is not universal, and there is likely to be a spectrum of charging schemes. Flat rate plans are likely to dominate for inexpensive and frequently purchased goods and services, and extreme examples of differential pricing are likely to prevail for expensive and seldom-purchased things; see [4] for a discussion and evidence.

But overall, we should expect to see growth in flat rate pricing and bundling (as in subscriptions to magazines, or in a collection of cable channels for a single price). In addition to a willingness to pay more for flat rate plans, people tend to use more of a good or service that does not involve fine-scale charging or decision making. Typical increases in usage are from 50% to 200% when users are switched from metered to flat rates. Depending on whether one wishes to increase or decrease usage, this may or may not be desirable, but in the case of information goods, the overwhelming incentive is to increase usage. This provides yet another incentive to avoid fine-grained pricing and control that DRM is often designed for.

What we are likely to end up with is a huge universe of free material, much of it of little interest to all but a handful of people.



How to Avoid Being Disturbed While Debugging
Topic: Technology 9:40 pm EDT, Sep 6, 2007

This is really about good record-keeping habits.

Q: It seems that ... I can't get through more than 15 minutes of work without someone interrupting me, and then I lose my train of thought.

...

A: What you really want is a way to remember what you were doing when you were doing it, and debugging is one of the best examples of this. Nothing is more annoying than to come back to a problem you were working on and not remember what you had already tried.

Real scientists, as opposed to lame hacks who claim to be scientists, know how to formulate ideas - called hypotheses - and test them. They write down each hypothesis, then describe the experiment and the results. They keep all of this data in logbooks or notebooks.

Just because your test passes or your code doesn't crash doesn't mean that you have completed your debugging.
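The advice amounts to keeping a lab notebook for debugging sessions. A minimal sketch of such a logbook, assuming a hypothesis/experiment/result record format of my own devising (the article prescribes the practice, not this structure):

```python
# A minimal debugging logbook: one record per hypothesis, kept in order,
# so that after an interruption you can see exactly what was already tried.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class LogEntry:
    hypothesis: str    # what you believe is wrong
    experiment: str    # how you tested it
    result: str        # what actually happened
    when: datetime = field(default_factory=datetime.now)

logbook: list[LogEntry] = []

logbook.append(LogEntry(
    hypothesis="Crash is caused by a null pointer in the parser",
    experiment="Ran under the debugger with a malformed input file",
    result="No crash; parser rejects the input cleanly -- hypothesis wrong",
))

# After an interruption, replay the notebook instead of your memory:
for entry in logbook:
    print(f"[{entry.when:%H:%M}] {entry.hypothesis} -> {entry.result}")
```

Even a flat text file works; what matters is that every hypothesis gets a written result, including the failed ones.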



VoIP hacker talks: Service provider nets easy pickings
Topic: Technology 8:00 pm EDT, Sep 3, 2007

Simple dictionary and brute-force attacks, combined with Google hacking, enabled a criminal pair to break into VoIP-provider networks and steal $1 million worth of voice minutes, says one of the duo, who has pleaded guilty to his crimes.

He designed software to generate 400 prefixes per second against the carrier gear, scanning all the combinations between 000 and 999 randomly to throw off intrusion-detection systems (IDS) that might pick up a sequential attack.

"Most of the telecom administrators were using the most basic password. They weren’t hardening their boxes at all."

He also wrote search strings that he fed into Google seeking exposed Web interfaces on devices, and that proved fruitful as well.
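The randomized ordering described above is trivial to reproduce, and it shows why a detector that looks only for sequential probes is easy to evade. A sketch; the naive run-length detector here is my own illustration, not the IDS any carrier actually ran:

```python
# Why random probe ordering evades a naive sequential-scan detector.
import random

prefixes = [f"{n:03d}" for n in range(1000)]   # "000" .. "999"

def looks_sequential(seq, run=5):
    """Flag a scan if any `run` consecutive probes increment by exactly 1."""
    vals = [int(p) for p in seq]
    count = 1
    for a, b in zip(vals, vals[1:]):
        count = count + 1 if b == a + 1 else 1
        if count >= run:
            return True
    return False

sequential_scan = prefixes[:]        # 000, 001, 002, ... in order
random_scan = prefixes[:]
random.shuffle(random_scan)          # identical coverage, no obvious pattern

print(looks_sequential(sequential_scan))   # True -- the detector fires
print(looks_sequential(random_scan))       # almost certainly False
```

Both scans touch every prefix exactly once; only the ordering differs, which is exactly the property a pattern-based IDS keys on. Robust detection has to count distinct targets per source instead of looking for ordered runs.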



Bet on It!
Topic: Technology 11:25 am EDT, Sep  1, 2007

It’s not surprising that Internet-based mechanisms can be found to tap into the wisdom of the masses. What’s surprising is that there’s a way to draw it out so quickly and efficiently. And that it works so well.

IEEE covers prediction markets. Toward the end of the article, they discuss FutureMAP:

FutureMAP quickly ran afoul of public opinion. Members of Congress found it offensive and ghoulish, and they quickly terminated the program.

This statement is immediately followed by a pull-quote:

“Maybe it won’t work this time; maybe the crowd will be stupid ...”
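Prediction markets like the ones the IEEE article describes usually aggregate bets through an automated market maker. A common mechanism is Hanson's logarithmic market scoring rule (LMSR); the article does not say what FutureMAP used, so treat this as a generic sketch of the technique, not a description of that program:

```python
# Minimal logarithmic market scoring rule (LMSR) for a yes/no question.
import math

b = 100.0                 # liquidity parameter: larger = prices move slower

def cost(q_yes, q_no):
    """LMSR cost function C(q) = b * ln(e^(q_yes/b) + e^(q_no/b))."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))

def price_yes(q_yes, q_no):
    """Instantaneous price of YES, read as the market's implied probability."""
    e_yes, e_no = math.exp(q_yes / b), math.exp(q_no / b)
    return e_yes / (e_yes + e_no)

q_yes = q_no = 0.0
print(f"start: P(yes) = {price_yes(q_yes, q_no):.2f}")   # 0.50

# A trader who believes YES buys 50 shares; the cost is C(after) - C(before).
trade = 50.0
paid = cost(q_yes + trade, q_no) - cost(q_yes, q_no)
q_yes += trade
print(f"after buying 50 YES: P(yes) = {price_yes(q_yes, q_no):.2f}, "
      f"trader paid {paid:.2f}")
```

The price moves toward the trader's belief, and the size of the move encodes how much money is behind it; that is the sense in which such markets "draw out" the crowd's information quickly.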



Software developer perceptions about software project failure
Topic: Technology 11:24 am EDT, Sep  1, 2007

The last sentence caught my eye:

Software development project failures have become commonplace. With almost daily frequency these failures are reported in newspapers, journal articles, or popular books. These failures are defined in terms of cost and schedule over-runs, project cancellations, and lost opportunities for the organizations that embark on the difficult journey of software development. Rarely do these accounts include perspectives from the software developers that worked on these projects.

This case study provides an in-depth look at software development project failure through the eyes of the software developers. The researcher used structured interviews, project documentation reviews, and survey instruments to gather a rich description of a software development project failure.

The results of the study identify a large gap between how a team of software developers defined project success and the popular definition of project success. This study also revealed that a team of software developers maintained a high-level of job satisfaction despite their failure to meet schedule and cost goals of the organization.

Subscription required for access to full text, but there are at least 4 versions of the paper available.

And here are two versions of a video you're sure to enjoy (again, and again, and again):



Graph Annotations in Modeling Complex Network Topologies
Topic: Technology 5:28 pm EDT, Aug 30, 2007

Dmitri Krioukov and George Riley have collaborated with Xenofontas Dimitropoulos and Amin Vahdat on a new paper.

The coarsest approximation of the structure of a complex network, such as the Internet, is a simple undirected unweighted graph. This approximation, however, loses too much detail.

In reality, objects represented by vertices and edges in such a graph possess some non-trivial internal structure that varies across and differentiates among distinct types of links or nodes.

In this work, we abstract such additional information as network annotations. We introduce a network topology modeling framework that treats annotations as an extended correlation profile of a network.

Assuming we have this profile measured for a given network, we present an algorithm to rescale it in order to construct networks of varying size that still reproduce the original measured annotation profile.
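The annotation idea can be illustrated with a toy representation: a plain graph keeps only adjacency, while an annotated graph attaches a type to each link, from which an "annotation profile" (the frequency of each link type) can be measured. A minimal sketch with made-up AS-relationship labels; the paper's actual framework and rescaling algorithm are considerably richer than this:

```python
# Plain vs. annotated representation of a tiny Internet-like topology.
from collections import Counter

# Coarsest approximation: undirected, unweighted -- just who connects to whom.
plain_edges = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")}

# Annotated graph: each edge also carries a link type (here, AS relationships).
annotated_edges = {
    ("A", "B"): "customer-provider",
    ("B", "C"): "peer-peer",
    ("A", "C"): "customer-provider",
    ("C", "D"): "customer-provider",
}

# The measured "annotation profile": how often each link type occurs.
# A topology generator would aim to reproduce this distribution when
# constructing synthetic networks of other sizes.
profile = Counter(annotated_edges.values())
total = sum(profile.values())
for link_type, n in profile.items():
    print(f"{link_type}: {n / total:.2f}")
```

The plain set loses exactly the information the Counter recovers, which is the paper's point: the unannotated graph is too coarse an approximation of real networks.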



Trust | Geoff Huston's ISP Column | September 2007
Topic: Technology 7:08 am EDT, Aug 30, 2007

Geoff Huston discusses the call for papers I recently recommended.

Trust and networking go hand in hand, and I'm pleased to see that the topic of trust has been raised by the Internet Society recently.

In this column I'd like to informally respond to this call for participation and revise some earlier thoughts I had on trust to see if we've made any significant progress on the issue of trust in the Internet in the past four years.

Jumping to the end:

So is trust the universal answer?

The problem for me is that "trust" is not that much different from blind faith, and, in that light, "trust" is not a very satisfying answer. The difference between "fortuitous trust" and "misplaced trust" is often just a matter of pure luck, and that's just not good enough for a useful and valuable network infrastructure. "Trust" needs to be replaced with the capability for deterministic validation of actions and outcomes of network-based service transactions. In other words, what is needed is less trust and better security.



Trust and the Future of the Internet
Topic: Technology 11:51 am EDT, Aug 25, 2007

Interested in a free trip to Toronto?

The Internet Society (ISOC) Board of Trustees is currently engaged in a discovery process to define a long-term Major Strategic Initiative to ensure that the Internet of the future remains accessible to everyone. The Board believes that Trust is an essential component of all successful relationships and that an erosion of Trust (in individuals, networks, or computing platforms) will undermine the continued health and success of the Internet.

The Board will meet in special session the first week of October 2007 for intensive study focused on the subject of trust within the context of network-enabled relationships. As part of this process, the Board is hereby issuing a call for subject experts who can participate in the two-day discussion. Topics of interest include: the changing nature of trust, security, privacy, control and protection of personal data, methods for establishing authenticity and providing assurance, management of threats, and dealing with unwanted traffic.

I'll give you two topics:

If you trust us, you're stupid.

The Internet has neither trust nor a future.

Discuss.



Inheritance and loss? A brief survey of Google Books
Topic: Technology 11:50 am EDT, Aug 25, 2007

Paul Duguid, co-author of The Social Life of Information, in First Monday.

The Google Books Project has drawn a great deal of attention, offering the prospect of the library of the future and rendering many other library and digitizing projects apparently superfluous. To grasp the value of Google’s endeavor, we need among other things, to assess its quality. On such a vast and undocumented project, the task is challenging.

In this essay, I attempt an initial assessment in two steps.

First, I argue that most quality assurance on the Web is provided either through innovation or through “inheritance.” In the latter case, Web sites rely heavily on institutional authority and quality assurance techniques that antedate the Web, assuming that they will carry across unproblematically into the digital world. I suggest that quality assurance in Google's Book Search and Google Books Library Project primarily comes through inheritance, drawing on the reputation of the libraries, and before them the publishers, involved.

Second, I chose one book, Laurence Sterne's Tristram Shandy, to sample Google's project. This book proved a difficult challenge for Project Gutenberg, but more surprisingly, it evidently challenged Google's approach, suggesting that quality is not automatically inherited.

In conclusion, I suggest that a strain of romanticism may limit Google’s ability to deal with that very awkward object, the book.

Browse through the stacks here at MemeStreams:

The Social Life of Legal Information

We need less cheerleading and more constructive critique.

Academic freedom and the hacker ethic

Hackers advocate the free pursuit and sharing of knowledge without restriction, even as they acknowledge that applying it is something else.

Relearning Learning: Applying the Long Tail to Learning

In a digitally connected, rapidly evolving world, we must transcend the traditional Cartesian models of learning that prescribe “pouring knowledge into somebody’s head." We learn through our interactions with others and the world ...

The Only Sustainable Edge: Why Business Strategy Depends on Productive Friction and Dynamic Specialization

As opportunities for innovation and growth migrate to the peripheries of companies, industries, and the global economy, efficiency will no longer be enough to sustain competitive advantage. The only sustainable advantage in the future will come from an institutional capacity to work closely with other highly specialized firms to get better faster.

Old Search Engine, the Library, Tries to Fit Into a Google World

"The nature of discovery is changing." "It has huge ramifications."

"We can show people things they don't ask for."



A Brief History of Scanning
Topic: Technology 11:50 am EDT, Aug 25, 2007

A new paper by Mark Allman, for an upcoming conference.

Incessant scanning of hosts by attackers looking for vulnerable servers has become a fact of Internet life.

In this paper we present an initial study of the scanning activity observed at one site over the past 12.5 years.

We study the onset of scanning in the late 1990s and its evolution in terms of characteristics such as the number of scanners, targets and probing patterns.

While our study is preliminary in many ways, it provides the first longitudinal examination of a now ubiquitous Internet phenomenon.


