Thursday, July 24, 2014

What does it mean 'to be'?



Chatbots are fun little gimmicks.[1] A little while back one chatbot was reported to have passed the Turing test.[2] That has been debunked.[3] Very much so, in fact. Time to debunk the debunkery, from the last source:
"3. It "beat" the Turing test here by "gaming" the rules -- by telling people the computer was a 13-year-old boy from Ukraine in order to mentally explain away odd responses."
Against Turing protocol. Also, the Turing test is a test of consciousness, not perception. The chatbot doesn't 'believe' it is a 13-year-old boy from Ukraine; it is told to say that. An AI does not need a backstory, and the test does not include memories. It tests a 'normal' conversation, not a flashback episode of a TV show. The point also repeats the first one: such a gimmick is not worthy of a Turing test. The results mean nothing, regardless of how much the Turing test has been warped to make it passable by a simple chatbot. True again.

It continues along the same track: the Turing test is not a single shot with handpicked judges. The Turing test was designed to be difficult to pass but easy to understand. Quite simply, it means putting a person (the judge) in front of an instant-messaging conversation on a computer. After a while, the judge says whether they were talking to a computer or a person. If people's ability to differentiate between man and machine consistently remains statistically insignificant (you can no longer confidently say that a random person can probably tell the difference), the test is passed.

To make the test more reliable, it should be double-blind (the people administering and scoring the test don't know whether the judge is paired with a human or a computer) and randomized. In fact, it would be best if the judges didn't even know there was a chance they were talking to a computer, so they'd go for 'normal' conversations instead of trying to trick the machine.
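To make that "statistically insignificant" pass criterion concrete, here is a minimal sketch in Python. The trial counts and the significance level are made-up numbers for illustration only; the sketch simply tallies how often judges correctly identified their conversation partner and asks, via an exact binomial test, whether that beats coin-flipping.

```python
# A toy scorer for the pass criterion described above. All numbers are
# hypothetical; the point is only the statistical reasoning.
from math import comb

def p_value_at_least(correct: int, trials: int, p: float = 0.5) -> float:
    """Exact one-sided binomial test: the probability of judges getting at
    least `correct` identifications right out of `trials` purely by guessing."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

def turing_test_passed(correct: int, trials: int, alpha: float = 0.05) -> bool:
    """The machine 'passes' only if we cannot confidently say the judges
    did better than random guessing at telling human from machine."""
    return p_value_at_least(correct, trials) >= alpha

# Hypothetical run: 300 judged conversations.
print(turing_test_passed(165, 300))  # False: 55% accuracy still beats chance
print(turing_test_passed(155, 300))  # True: ~52% is indistinguishable from guessing
```

Under this reading, a single afternoon with a handful of handpicked judges tells you nothing; you need enough independent trials that "the judges can't tell" becomes a statistical statement rather than an anecdote.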
"6. The whole concept of the Turing Test itself is kind of a joke. While it's fun to think about, creating a chatbot that can fool humans is not really the same thing as creating artificial intelligence. Many in the AI world look on the Turing Test as a needless distraction."
No, it is not. If you can create a chatbot that can discuss religion and philosophy, art and morality as convincingly as an average human, you have indeed created something spectacular. To do that you need more than a chatbot; the 'chat' at that point is only an interface, and what goes on in the background falls just short of creating novels, symphonies, paintings. To fake such discussions, you need to simulate emotions: reactions that in us are hormonal, recreated by bytes on a board. Manage that and you have revolutionary technology. And the only way to test whether you've succeeded in creating the first 'human' computer is that 'needless distraction' commonly known as the Turing test.

Just think of it: if you can create a proper AI with a personality prototype, you will be a leap closer to having plastic pals who are fun to be with.[4]

I would not be surprised if, in a few months, someone once again declared that the Turing test has been passed by a chatbot. And it will be big news once again, because news agencies love making huge news out of nothing, regardless of whether what they report is true or false. Let's be honest, even some reputable news sources manage to fan these flames. But for any readers: take such news with a grain of salt.


Wednesday, July 23, 2014

Ab uno disce omnes

From one, learn all.

But can you really learn all from a single person? Well, no. As long as you cannot know everything about a person, you cannot learn everything from him/her. But how about learning everything you can? Well, that brings up an interesting point.

People are not snowflakes[1,2]; there are distinct differences. Some people are better at lying and concealing information, some are better at acquiring it. Some handle numbers better, some have amazing memory capacities. So say you are great at amassing knowledge. Great for you: your persistence keeps you soaring above others when it comes to knowing and navigating details. It is one of the most impressive abilities there are. And then you meet someone who, for the life of them, cannot remember things they tried to remember only a few days ago. Even if something is remembered, it is but a piece of what it should be. However, that person more than makes up for it with quick wits and icy logic. How can one learn how the other thinks? How can the other learn to remember like the first one?

Lately I've picked up the habit of reading other people's code. Solving simple tasks doesn't really show much, even in Java, but complex problems require creative solutions. Some people tackle them head-on, some take a more scenic route. Some hammer it in the most brutal manner imaginable. But as one reads what another has created, and comes to comprehend it, one's mind starts to change. It is not just about learning different solutions to problems; subconsciously, it is learning new approaches. You learn new ways of thinking, and that carries over to other, less techy stuff.

But it gets worse when you bring in emotions. You cannot 'learn' to have emotions. It's like explaining a colour, or trying to understand pain while suffering from congenital analgesia. Emotions and perceptions are subjective and intrinsic. They are inherent to being human: the capacity to experience love and lust, hate and madness, to value beauty, practicality, companionship. Trust. These are hormonal responses that cannot be taught or learned. Senses are even worse.

What makes all of it so sad is that while you cannot learn to have emotions, you can forget about them. Bottle them up, shove them to the back of your head, ignore or avoid them. It is easy to forget to enjoy life, and no amount of manic pixie dream girl tropes [3] will fix that. "My shoes are too tight. But it does not matter, I have forgotten how to dance." [4]

