Wednesday, 01 April 2009

Google's April Fools' Spoof

Here it is (CADIE):

Cognitive Autoheuristic Distributed-Intelligence Entity

When you walk into a dark field in the middle of the night...

and look up into a black sky and wonder how many stars there are in the universe, let's be honest: in all likelihood you don't have the faintest clue, and even if you're one of the few who do, you lack any real capacity to comprehend the figure save for the same vague sense of stunned wonder that our earliest human ancestors felt when they looked up from the African savannah at the same starry sky.

Our species' journey toward tonight's epochal announcement had much less to do with that awestruck moment than it did with the moment those same ancestors woke up hungry the next morning and started studying animal tracks in the savannah mud, thereby inadvertently developing concepts like time and causality which, by abstracting both location and temporal context into a unique reckoning tool within the brain, sparked the set of responses that, ages later, we now call reason.


René Descartes, noted philosopher

From there, mankind's journey toward artificial intelligence took place over so many centuries and in the hands of so many thinkers that it is possible here only to pause to mark a few of the moments when one of our genius forebears expanded the edge of our species' technological envelope: Aristotle's system of reasoning based on means, not ends; al-Khwarizmi's algorithms; Descartes, Locke and Hume's monumental insights into the nature of knowledge; Church and Turing's theory of a machine capable of computing all functions which are computable; the Allied code-breakers who, struggling to crack the fiendish Enigma machine amid the horrific irrationality of World War II, inadvertently facilitated the birth of the modern computer.

The decades that followed saw an acceleration of innovation not seen since the Industrial Revolution. Computing pioneers from the game theorist von Neumann to the economist Morgenstern engaged in a tumultuous Hegelian rondolet in which probability theory mated with utility theory to spawn decision theory. Operations research and Markov decision processes tackled actions taking place in a sequence. Neuroscience shed light on the parallels and differences between electronic and human brains. Cognitive psychology delivered sound specifications for knowledge-based agents. The now-legendary summer workshop at Dartmouth in 1956 birthed automata, the first neural networks and the invention of a program capable of thinking non-numerically.

But close though we may have come to a theory of the brain, the body - computer hardware - wasn't capable of handling the extraordinary processing demands that any reasonably "intelligent" brain would place on its circuitry until Moore's Law really kicked in a few years back, when the modern ultra-dense machinery of atomic-scale gates and their light-based interconnections finally reached the scale of brain neurons - and then surpassed it. In early 2007, a tight-knit, vaguely feared quantum computing group here at Google extended our computers with quantum bits of Bose-Einstein condensate, polynomially speeding up our machines' data-processing ability.

From the Tech Specs... and you know there's more.

Update: Been to CADIE's blog yet? Didja follow any of the links? I just had to include this screenshot from CADIE's favorite places in Google Maps (I'm in ur mapz), which, strangely enough, also includes Redmond, WA. As ever, click for larger...
