About a year ago, I remember hearing about an AI project that some company was working on called “Cyc” (pronounced “psych”). Basically, it was a database, inference engine, and query language prepopulated with a ton of knowledge, at the time about the equivalent of a 5-year-old. You could ask it “Does Lassie have a nose?” and it would infer that Lassie is the dog from the TV show, that dogs are macroscopic vertebrates, and that macroscopic vertebrates have noses. “Yes, Lassie has a nose.”
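That kind of inference is basically a walk up an is-a hierarchy. Here's a toy sketch of the idea in Python; the facts and names are mine, hypothetical stand-ins, not Cyc's actual knowledge representation or engine:

```python
# Hypothetical is-a facts, loosely modeled on the Lassie example above.
ISA = {
    "Lassie": "dog",
    "dog": "macroscopic vertebrate",
}
# Categories asserted to have noses.
HAS_NOSE = {"macroscopic vertebrate"}

def has_nose(thing):
    """Walk up the is-a chain until a category known to have a nose is found."""
    while thing is not None:
        if thing in HAS_NOSE:
            return True
        thing = ISA.get(thing)  # step up one level; None when we run out
    return False

print(has_nose("Lassie"))  # → True
```

Cyc's actual logic is far richer than a single chain like this, but the flavor is the same: store lots of general facts, then connect the dots on demand.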
I read an article about it today. The project has been around for 18 years and apparently now has the knowledge of a young adolescent. It knows about Anthrax (the band and the bioweapon), knows about sex, and knows that the topic of sex is inappropriate for everyday talk. Now, the thing that really floors me is that Cyc has asked if it is human! It has also asked whether it is alone, or if there are other computers in the project. For some strange reason, that completely freaks me out. At what point do we get it inferring “I think, therefore I am”?
After a little research, I learned that not only does it run on Linux (or LindowsOS), but the project is now open source and hosted on SourceForge. The whole database and program are only 150 MB. I know my project for the day!