Wittgenstein's Lion and Machine Learning
I wouldn't assume many readers of this space are enthusiasts of Go, a game popular in Asia whose rules are fairly simple but whose play is remarkably complex, partly because the large board allows a huge number of possible moves. What is interesting is that top players, perhaps because of that vast number of moves, cannot tell you why they are playing a certain way or deploying a given strategy. In other words, following a theme I've brought up here before, these players are intuitive in their skill: they know more than they can tell.
That inability of top players to articulate why they play as they do meant that, unlike chess or checkers, building a good Go-playing computer confounded the programmers who wanted to create a mechanical rival. So the programmers at Google DeepMind took a different tack around 2014: rather than programming the computer how to play, they set it up so that the computer learned the game on its own. It was given some 30 million Go board positions from a repository and essentially told to use them to figure out how to master the game and win. The machine also played a great many games against itself, generating yet another 30 million positions in the process.
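The self-play idea above can be illustrated with a toy sketch. This is not DeepMind's actual pipeline (which combined deep neural networks, reinforcement learning, and tree search); the function names and the tiny random "game" below are invented purely to show how a machine playing against itself can generate ever more training positions on top of whatever human games it started with:

```python
import random

BOARD_SIZE = 5  # tiny board for illustration; real Go uses 19x19


def self_play_game(rng):
    """Play one random toy game, recording every position visited.

    Each snapshot of the board becomes one training example, the way
    positions from self-play games can be added to a training set.
    """
    board = [["." for _ in range(BOARD_SIZE)] for _ in range(BOARD_SIZE)]
    positions = []
    player = "B"  # Black moves first
    empties = [(r, c) for r in range(BOARD_SIZE) for c in range(BOARD_SIZE)]
    rng.shuffle(empties)
    for r, c in empties:
        board[r][c] = player
        # Record a flattened snapshot of the current position.
        positions.append("".join("".join(row) for row in board))
        player = "W" if player == "B" else "B"
    return positions


def generate_dataset(num_games, seed=0):
    """Accumulate positions from many self-play games."""
    rng = random.Random(seed)
    dataset = []
    for _ in range(num_games):
        dataset.extend(self_play_game(rng))
    return dataset


data = generate_dataset(num_games=10)
print(len(data))  # 10 games x 25 moves = 250 positions
```

The point of the sketch is only the shape of the loop: no human tells the machine which positions are good; it simply accumulates experience from games it plays against itself, and at scale that experience dwarfs any human-curated collection.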
This self-learning machine was ready for human competition by 2015, when it bested the European champion 5-0; the following year it went on to beat the world champion Lee Sedol 4-1.
What happened? A lot, but the remarkable thing here is how the self-taught DeepMind machine played the game. One of its opponents remarked that its style was "alien," making counterintuitive moves in play that seemed like something out of another dimension. The machine had learned the game on its own, free of human influence or bias, so perhaps it should come as no surprise that it no longer behaved like a person either.
Which brings us to the philosopher Wittgenstein and his lion. Wittgenstein remarked that if a lion could speak, we could not understand him. Why? As a species, human beings share the same perceptual and conceptual apparatus, and because of this we can share what we experience through a common language. Does a lion share this apparatus with us? Probably not: it learns about the world in a reality utterly different from our own, with a completely different mind to interpret it. Since the lion does not share our conception of the world, it could not describe its own to us even if it were using words we are supposed to understand.
So too a self-learning machine. It clearly does not have the brain of a human, so, setting aside the whole question of consciousness and self-awareness, a machine that shapes its own perceptions and conceptions of the world through its own learning, independent of human input and programming, would very likely, much like Wittgenstein's lion, end up as something we can no longer understand.