The core Internet protocols have not changed significantly in more than a decade, in spite of exponential growth in the number of Internet users and the speed of the fastest links. The requirements placed on the net are also changing, as digital convergence finally occurs.
Will the Internet cope gracefully with all this change, or are the cracks already beginning to show?
In this paper I examine how the Internet has coped with past challenges that led to attempts to change its architecture and core protocols.
Unfortunately, the recent history of failed architectural changes does not bode well. With this history in mind, I explore some of the challenges currently facing the Internet.
Search engines can record which documents were clicked for which query, and use these query-document pairs as ‘soft’ relevance judgments. However, compared to the true judgments, click logs give noisy and sparse relevance information.
We apply a Markov random walk model to a large click log, producing a probabilistic ranking of documents for a given query. A key advantage of the model is its ability to retrieve relevant documents that have not yet been clicked for that query and to rank them effectively.
We conduct experiments on click logs from image search, comparing our (‘backward’) random walk model to a different (‘forward’) random walk, varying parameters such as walk length and self-transition probability.
The most effective combination is a long backward walk with high self-transition probability.
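The bipartite walk sketched in this abstract can be illustrated on a toy click log. This is a minimal sketch under stated assumptions, not the paper's implementation: the click data, the uniform self-transition scheme, and the alternation between query→document and document→query steps are all illustrative choices.

```python
from collections import defaultdict

# Toy click log: (query, document, click_count). Illustrative data only.
clicks = [
    ("cat", "d1", 10), ("cat", "d2", 2),
    ("kitten", "d2", 5), ("kitten", "d3", 4),
    ("dog", "d4", 8),
]

def transition_probs(edges, s):
    """Row-normalize click counts into transition probabilities,
    reserving self-transition mass s at every node."""
    out = defaultdict(dict)
    totals = defaultdict(float)
    for a, b, c in edges:
        totals[a] += c
    for a, b, c in edges:
        out[a][b] = (1.0 - s) * c / totals[a]
    for a in totals:
        out[a][a] = out[a].get(a, 0.0) + s
    return dict(out)

def walk(start, steps, s=0.9):
    """Distribution over nodes after `steps` steps of a random walk on
    the bipartite click graph, with self-transition probability s."""
    q2d = transition_probs(clicks, s)
    d2q = transition_probs([(d, q, c) for q, d, c in clicks], s)
    dist = {start: 1.0}
    for _ in range(steps):
        nxt = defaultdict(float)
        for node, p in dist.items():
            trans = q2d.get(node) or d2q.get(node) or {node: 1.0}
            for nbr, tp in trans.items():
                nxt[nbr] += p * tp
        dist = dict(nxt)
    return dist
```

Starting the walk at "cat", document d3 — which was never clicked for that query — acquires nonzero probability by way of the shared click on d2 via "kitten", which is exactly the kind of smoothing the abstract credits the model with.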
What is the meaning of technology in our lives? What place does technology have in the universe? What place does it have in the human condition? And what role should it play in my own personal life?
Technology as a whole system, or what I call the technium, seems to be a dominant force in the culture. Indeed at times it seems to be the only force — the only lasting force — in culture. If that's so, then what can we expect from this force, what governs it?
Sadly, we don't even have a good theory of technology.
Social interactions and personal tastes shape our consumption of cultural products. In this study, we present a computational model of a cultural market and analyze the behavior of the consumer population as an emergent phenomenon. Our results suggest that the final market shares of the cultural products depend dramatically on consumer heterogeneity and social interaction pressure. Furthermore, the relation between the resulting market shares and the social interaction is robust across a wide range of parameter values and network topologies.
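The kind of model this abstract describes can be sketched as a simple agent simulation. Everything here is an assumption for illustration: agents trade off an idiosyncratic taste (dispersion controlled by `heterogeneity`) against conformity with current market shares (weighted by `social`); the paper's actual model, parameters, and topology are not reproduced.

```python
import random

def simulate_market(n_agents=200, n_products=5, social=0.6,
                    heterogeneity=1.0, steps=50, seed=0):
    """Each step, every agent picks the product maximizing
    personal taste + social pressure proportional to market share."""
    rng = random.Random(seed)
    tastes = [[rng.gauss(0.0, heterogeneity) for _ in range(n_products)]
              for _ in range(n_agents)]
    choices = [rng.randrange(n_products) for _ in range(n_agents)]
    for _ in range(steps):
        shares = [choices.count(p) / n_agents for p in range(n_products)]
        choices = [
            max(range(n_products),
                key=lambda p: tastes[a][p] + social * shares[p])
            for a in range(n_agents)
        ]
    return [choices.count(p) / n_agents for p in range(n_products)]
```

Even in this crude sketch, strong social pressure with weak heterogeneity tends to concentrate the market on one product, while taste-driven agents spread out — the same qualitative dependence the abstract reports.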
Towards Trustworthy Recommender Systems: An Analysis of Attack Models and Algorithm Robustness
Topic: Technology
12:05 am EDT, Jul 15, 2007
Publicly-accessible adaptive systems such as collaborative recommender systems present a security problem.
Attackers, who cannot be readily distinguished from ordinary users, may inject biased profiles in an attempt to force a system to “adapt” in a manner advantageous to them. Such attacks may lead to a degradation of user trust in the objectivity and accuracy of the system.
Recent research has begun to examine the vulnerabilities and robustness of different collaborative recommendation techniques in the face of “profile injection” attacks.
In this paper, we outline some of the major issues in building secure recommender systems, concentrating in particular on the modeling of attacks and their impact on various recommendation algorithms.
We introduce several new attack models and perform extensive simulation-based evaluation to show which attack models are most successful against common recommendation techniques.
We consider both the overall impact on the ability of the system to make accurate predictions, as well as the degree of knowledge about the system required by the attacker to mount a realistic attack.
Our study shows that both user-based and item-based algorithms are highly vulnerable to specific attack models, but that hybrid algorithms may provide a higher degree of robustness.
Finally, we develop a novel classification-based framework for detecting attack profiles and show that it can be effective in neutralizing some attack types.
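The "profile injection" idea in this abstract is easy to demonstrate in miniature. A minimal sketch, assuming a tiny rating set, cosine similarity, and a user-based k-NN predictor; the ratings, attack size, and filler choice are illustrative, not the paper's attack models.

```python
import math

def cosine(u, v):
    """Cosine similarity over co-rated items of two {item: rating} dicts."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(u[i] ** 2 for i in common))
           * math.sqrt(sum(v[i] ** 2 for i in common)))
    return num / den if den else 0.0

def predict(ratings, user, item, k=3):
    """User-based k-NN: similarity-weighted average of the k most
    similar users' ratings for `item`."""
    sims = sorted(
        ((cosine(ratings[user], r), r) for u, r in ratings.items()
         if u != user and item in r),
        key=lambda t: -t[0])[:k]
    num = sum(s * r[item] for s, r in sims if s > 0)
    den = sum(s for s, r in sims if s > 0)
    return num / den if den else None

# Toy ratings on a 1-5 scale; "t" is the item the attacker wants pushed.
ratings = {
    "u1": {"a": 4, "b": 5, "t": 2},
    "u2": {"a": 5, "b": 4, "t": 1},
    "u3": {"a": 4, "b": 4},
}

def inject_push_attack(ratings, target, fillers, n_profiles):
    """Add biased profiles: top rating on the target, plausible ratings
    on popular filler items so the profiles resemble ordinary users."""
    for i in range(n_profiles):
        profile = {f: 4 for f in fillers}
        profile[target] = 5
        ratings[f"attacker{i}"] = profile

before = predict(ratings, "u3", "t")
inject_push_attack(ratings, "t", ["a", "b"], 4)
after = predict(ratings, "u3", "t")
```

Before the attack, u3's predicted rating for "t" reflects the genuine users' low opinions; after four injected profiles, the attackers dominate the neighborhood and the prediction jumps to the top of the scale — the adaptation-to-the-attacker's-advantage the abstract warns about.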
It's nice to know that MemeStreams has been using a robust approach for years now.
See also the May/June issue of IEEE Intelligent Systems, which includes a special feature on recommender systems. Subscription required for access to full text.
Google's Security team has discovered vulnerabilities in the Sun Java Runtime Environment that threaten the security of all platforms, browsers, and even mobile devices.
Inventec [is] one of five companies based in Taiwan that together produce the vast majority of laptop and notebook computers sold under any brand anywhere in the world.
Everyone in America has heard of Dell, Sony, Compaq, HP, Lenovo-IBM ThinkPad, Apple, NEC, Gateway, Toshiba.
Almost no one has heard of Quanta, Compal, Inventec, Wistron, Asustek.
Yet nearly 90 percent of laptops and notebooks sold under the famous brand names are actually made by one of these five companies in their factories in mainland China.
I have seen a factory with three "competing" brand names coming off the same line.
See also a slideshow that accompanies the article.
(See also this thread, which links directly to the slide show.)
Everything old is new again, when you reinvent the wheel. It's merely coincidental that your new wheel looks a lot like some wheels from five years ago.
Initial Task: Rethink and reinvent online social networking
Refined Focus: Discover the user needs related to social networking and explore how a unified social network service can enhance their experience.
Prototype Goal: Create a system for users to seamlessly share, view, and respond to many types of social content across multiple networks.
Gee, sound familiar?
Socialstream is a system where users can seamlessly share, view, and respond to many types of social content across multiple networks. It is the result of a rigorous user-centered design process that involved formal research and evaluation with over 35 participants and weekly critiques from clients and colleagues.
When Network Coding and Dirty Paper Coding meet in a Cooperative Ad Hoc Network
Topic: Technology
8:12 pm EDT, Jul 9, 2007
We develop and analyze new cooperative strategies for ad hoc networks that are more spectrally efficient than classical DF cooperative protocols. Using analog network coding, our strategies preserve the practical half-duplex assumption but relax the orthogonality constraint. The interference introduced by non-orthogonality is mitigated through precoding, in particular Dirty Paper coding. Combined with smart power allocation, our cooperation strategies save time, use bandwidth more efficiently, and improve network throughput with respect to classical RDF/PDF.