Web Bot
The Web Bot cannot make accurate predictions
What is the Web Bot?
The Web Bot project is said to have been developed in the late 1990s to predict stock market movements.[1] It supposedly works by using small programs called 'spiders' or 'web crawlers'[2] to look for trends in the relationships between words. The raw data is then analyzed by a set of linguistic tools to determine whether those relationships carry any meaning.
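To make the claimed mechanics concrete, here is a minimal sketch of what 'crawl pages and tally words' amounts to. This is not the project's actual code (which has never been published); the function name and seed URL are invented for illustration:

```python
import re
from collections import Counter
from urllib.request import urlopen

def crawl_and_tally(urls):
    """Fetch each page and tally word frequencies: a toy stand-in
    for the 'spider' stage described above."""
    counts = Counter()
    for url in urls:
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
        counts.update(w.lower() for w in re.findall(r"[A-Za-z']+", text))
    return counts

# Hypothetical seed list; the real project never disclosed its sources.
print(crawl_and_tally(["http://example.com/"]).most_common(10))
```

Nothing in this stage is exotic; it is ordinary word counting.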
History of the Web Bot project
The Web Bot is built, run and maintained by Clif High and George Ure, who sometimes refer to themselves as "The Time Monks". According to an interview conducted in 2008, Clif 'came across this idea', which he calls "The Language Model for Storing Data", in 1994, and began developing it into a system he intended to predict stock performance from the language people were using about those stocks.[1][2][3]
In 1997, however, he started down a different path. Clif says:
From 1997 to 2001 I deduced some of the following principles: All people are psychic. Most don’t know it. Even if you do know it, it does not impact the next statement I’m going to make, which is: That all humans leak out these psychic impressions in the language that they choose to use in ordinary conversation. And that was my basic premise to begin with.
My working theory from that point was that if one could sample enough of the conversations going on around the planet and sift for the nuance between why one word might be chosen in an ordinary conversation as opposed to another word for the same conversation that basically you’d had a week ago, then one could determine what is moving us, if you will, at an unconscious level and be able to make some forecasts from that in a very interesting way. Sort of an extension, if you will, of my work, of the focus of it in 1997, which was commercial. [3]
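Stripped of the psychic framing, what Clif describes is an ordinary comparison of word-frequency drift between two samples of text. A minimal sketch, with invented sample sentences:

```python
from collections import Counter

def word_shift(last_week, this_week):
    """Rank words by how much their share of the conversation
    changed between two text samples."""
    a = Counter(last_week.lower().split())
    b = Counter(this_week.lower().split())
    total_a, total_b = sum(a.values()), sum(b.values())
    shifts = {w: b[w] / total_b - a[w] / total_a for w in set(a) | set(b)}
    return sorted(shifts.items(), key=lambda kv: abs(kv[1]), reverse=True)

# Invented samples: the same topic, different word choices a week apart.
print(word_shift("the market looks fine fine fine",
                 "the market looks shaky shaky fine")[:3])
```

Detecting such shifts is easy; the unsupported leap is reading them as leaked psychic impressions of the future.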
How it works
The Web Bot project produces 'reports' that are sold to its customers. These reports contain vague references to the kinds of language used around certain dates; they make no specific predictions. Instead, the authors and their customers comb through the reports after an event and point to certain phrases as 'hits' for that event. Both parties have a financial stake in the predictions: the authors for their continued income, and the customers for the reassurance that they have not wasted their money (which they have).
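It is easy to see why this makes a safe business model: sufficiently vague language will 'match' almost any event after the fact. A toy demonstration, with an invented report vocabulary:

```python
# Hypothetical 'report' vocabulary: vague, emotionally loaded words.
REPORT_WORDS = {"tension", "release", "sudden", "loss", "water", "change"}

def count_hits(event_description):
    """Return the report words found in an event write-up. Almost
    any news story will score at least a few 'hits' after the fact."""
    return REPORT_WORDS & set(event_description.lower().split())

print(count_hits("sudden flooding brings loss and change to the coast"))
print(count_hits("tension eases after sudden change in water policy"))
```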
This retroactive matching is much the same method used by proponents of Nostradamus. After the 9/11 attacks, Nostradamus proponents were quick to point to certain quatrains, even though they didn't fit; some went so far as to rewrite the quatrains to make them fit better.
GIGO
GIGO is an acronym for "Garbage In, Garbage Out". You can have the best computer in the world, but if the data you feed it is flawed, or the program doing the analysis is flawed, you will get flawed results. If the basic premise of your product is an unsupported claim of psychic ability, and your evidence is the cherry-picked output of your own analysis, you have not merely made a basic design error; you have committed a massive case of circular reasoning and observational selection. You cannot claim that the cherry-picked results of your research support the results of your research.
The 2012 connection
Type in "2012" into a search engine. Count the results. To say that there are a lot of websites talking about 2012 would be a massive understatement. What the web.bot project has done is to run their analysis on the flawed misinformation that is published on the web about 2012, and therefore produce a flawed analysis, stating that '2012 is a significant year'.
Conclusion
The Web Bot project takes flawed data (the mass of misinformation on the web about the 2012 doomsday hoax), applies a flawed and unsupported premise ("people are psychic"), and produces a flawed analysis of the significance of December 21st, 2012. It cannot do what its creators claim.
Bibliography
1. DailyCommonSense.com, "Web Bot: what is it? Can it predict stuff?" http://www.dailycommonsense.com/web-bot-what-is-it-can-it-predict-stuff/
2. Wikipedia, "Web crawler". http://en.wikipedia.org/wiki/Web_crawler
3. Project Camelot, interview with Clif High, 26 September 2008. http://projectcamelot.org/lang/en/clif_high_half_past_human_26_sept_2008_en.html
4. HalfPastHuman.com, radio special. http://www.halfpasthuman.com/RadioSpecial.html