Toffler: "When the individual is plunged into a fast and irregularly changing situation, or a novelty-loaded context ... his predictive accuracy plummets. He can no longer make the reasonably correct assessments on which rational behavior is dependent."
The article describes a situation in which many decisions, each individually small in size and time horizon, cumulatively produce an outcome that is neither optimal nor desired: a series of small, individually rational decisions can degrade the context of subsequent choices, even to the point where desirable alternatives are irreversibly destroyed.
Overchoice, also referred to as "choice overload",[1] is a term describing a problem facing consumers in postindustrial society: too many choices. The term was introduced by Alvin Toffler in his 1970 book, Future Shock.
We are reaching a point where the number of inputs we receive as individuals is beginning to exceed what we as humans are capable of managing. The demands on our attention are becoming so great, and the problem so widespread, that people will be forced to crash and curtail these drains. Human attention does not obey Moore's Law.
Technologist Clay Shirky argues that information overload isn't the problem tech journalism makes it out to be: it's really a failure of information filters. At the Web 2.0 Expo last week, Shirky said that the internet has made it easier and cheaper for publishers to broadcast information—so now the onus is on the consumer to filter out the noise (much like client-side spam filters).
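The client-side filtering Shirky points to can be sketched very simply: the consumer, not the publisher, decides what gets through. The blocked terms, threshold, and sample messages below are purely illustrative assumptions, not part of any real spam-filter implementation.

```python
def make_filter(blocked_terms, threshold=1):
    """Return a predicate that rejects items mentioning too many blocked terms.

    The term list and threshold are hypothetical: a real filter would use
    statistical scoring (e.g. Bayesian), not a bare keyword count.
    """
    def allow(text):
        hits = sum(term in text.lower() for term in blocked_terms)
        return hits < threshold
    return allow

# Illustrative inbox: the filter runs on the consumer's side,
# after the (cheap, unfiltered) broadcast has already happened.
inbox = [
    "Quarterly report attached for review",
    "WIN a FREE prize now!!! click here",
    "Meeting moved to 3pm",
]

allow = make_filter(["free", "prize", "click here"])
filtered = [msg for msg in inbox if allow(msg)]
# filtered keeps only the two legitimate messages
```

The design point matches Shirky's argument: publishing no longer gates quality, so the filter has to move downstream to the reader.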