In 2000 (Y2K), I designed a rudimentary test to compare my intelligence with my computer’s.
My computer was a Pentium MMX/200 MHz with 64 MB of RAM and a 200 MB hard disk, running the MS Windows 98 SE operating system.
I used Sun Microsystems’ Java 1.2 for programming.
The adjective “rudimentary” is crucial: I am a trained statistician, and I knew my test was far from perfect.
My experiment consisted of adding a random set of 7-digit numbers. I reasoned that if the computer proved no more than 100 times faster than me, I would need a more reliable, if not also more comprehensive, test to judge whether there was any need for AI (Artificial Intelligence).
Roughly speaking, IQ is the number of problems one can solve in an hour. Had the computer proved less than 100 times faster than me at simple addition, I would have considered it dumb, because my brain can do much more than merely add numbers.
Besides, my brain has no ALU (Arithmetic Logic Unit); it uses an algorithm to add numbers of any length. The computer therefore had a technological advantage over me.
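The benchmark can be sketched in modern Java as follows. This is a hypothetical re-creation, not my original 1.2-era program: the batch size, the random seed, and the assumed human rate of one 7-digit addition per ten seconds are all illustrative assumptions.

```java
import java.util.Random;

// A sketch of the experiment: time how fast the machine can add
// a batch of random 7-digit numbers, then compare that rate with
// an assumed human rate. All figures here are illustrative.
public class AdditionBenchmark {
    public static void main(String[] args) {
        final int COUNT = 1_000_000; // assumed batch size
        Random rng = new Random(42); // fixed seed for repeatability

        int[] numbers = new int[COUNT];
        for (int i = 0; i < COUNT; i++) {
            // A random 7-digit number lies in [1_000_000, 9_999_999].
            numbers[i] = 1_000_000 + rng.nextInt(9_000_000);
        }

        long start = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < COUNT; i++) {
            sum += numbers[i];
        }
        long elapsedNs = System.nanoTime() - start;

        double machineRate = COUNT / (elapsedNs / 1e9);
        System.out.println("sum = " + sum);
        System.out.printf("machine rate: %.0f additions/second%n", machineRate);

        // Assume a person manages one 7-digit addition every 10 seconds.
        double humanRate = 0.1;
        System.out.printf("speedup vs. human: %.0fx%n", machineRate / humanRate);
    }
}
```

On any modern machine the reported speedup dwarfs the 100x threshold; the exact number depends on hardware and JIT warm-up, which is why the figure is only indicative.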
Here are my findings.
My computer turned out to be 600,000 times faster than me.
Had I used C++ or assembly language, the results would have been even more impressive. And of course, I didn’t even mention that computers do not err, did I?
Why, then, are we spending $3 billion per year to create AI applications?
Why are we no closer to building Skynet than when The Terminator was released in 1984?
Because AI is a misnomer.
Computers do not lack intelligence, and we have been on a wild-goose chase.