This article was published in Telecom Asia, 21 March 2014 – The brave new frontiers of computing
The von Neumann reference architecture and the sequential processing of Turing machines have been the basis of ‘classical’ computers for the last six decades. The juggernaut of technology has resulted in faster and denser processors being churned out inexorably by the semiconductor industry, substantiating Gordon Moore’s observation that the density of transistors in chips doubles roughly every 18 months, now famously known as Moore’s Law. These days we have processors with well in excess of a billion transistors, and we are approaching the physical limit of the number of transistors on a chip. There is now an imminent need to look at alternative paradigms to crack the problems of the internet age confronting humanity which cannot be solved by classical computing.
In the last decade or so, three new, radical and lateral paradigms have surfaced which hold tremendous promise. They are:
i) Deep learning ii) Quantum computing and iii) Genetic programming.
These techniques hold enormous potential and may offer solutions to problems which would take classical computers anywhere from a few years to a few decades to solve.
Deep Learning: Deep Learning is a new area of Machine Learning research. The objective of deep learning is to bring Machine Learning closer to one of its original goals, namely Artificial Intelligence. Deep Learning is based on multi-level neural networks called deep neural networks. Deep Learning works on large sets of unlabeled data and is able to learn lower-level patterns on which it builds higher-level representations, much the same way the human brain works.
Deep learning tries to mimic the human brain. For example, the visual cortex shows a sequence of areas where signals flow from one level to the next. In the visual cortex, the feature hierarchy represents the input at different levels of abstraction, with more abstract features further up the hierarchy, defined in terms of the lower-level ones. Deep Learning is based on the premise that humans organize ideas hierarchically and compose more abstract concepts from simpler ones.
Deep Learning algorithms generally require powerful processors and work on enormous amounts of data to learn key features. The characteristic of Deep Learning algorithms is that the input is passed through several non-linearities before generating the output.
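To make this concrete, here is a minimal sketch (plain numpy, not any particular deep learning framework, and with made-up layer sizes) of how an input is pushed through several stacked non-linearities, each layer building on the representation produced by the one below it:

```python
# A minimal sketch of a deep network's forward pass: the input is passed
# through several non-linearities, each layer building a more abstract
# representation of the one below it.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # A common non-linearity applied after each linear step.
    return np.maximum(0.0, x)

# Hypothetical layer sizes: raw input -> low-level features ->
# higher-level features -> output representation.
layer_sizes = [784, 256, 64, 10]          # e.g. a 28x28 image, flattened
weights = [rng.normal(scale=0.01, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    for w in weights:
        x = relu(x @ w)                   # linear step followed by a non-linearity
    return x

x = rng.normal(size=(1, 784))             # one fake "image"
print(forward(x).shape)                   # (1, 10): the high-level representation
```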
About 3 years ago, researchers at Google Brain ran a deep learning algorithm on 10 million still images extracted from YouTube videos, on thousands of extremely powerful processors called GPUs. Google Brain was able to independently infer that these images contained a preponderance of cats. A seemingly trivial result, but of great significance, as the algorithm inferred this without any other input!
An interesting article in Nature, “The learning machines”, discusses how deep learning has proved useful for several scientific tasks including handwriting recognition, speech recognition, natural language processing, and the analysis of 3-dimensional images of brain slices.
The importance of Deep Learning has not been lost on tech titans like Google, Facebook, Microsoft and IBM, which have all taken steps to stay ahead in this race.
Deep Learning is in its infancy and is still esoteric knowledge. It is truly a fascinating area of research and may be the harbinger of the real breakthrough that Artificial Intelligence has been looking for over the decades.
Genetic Programming (GP) is another radical approach to computing. It had its origins in the early 1950s and has been gaining traction in the last decade. Genetic Programming is a branch of AI based on the Darwinian evolutionary principles of ‘natural selection’ and ‘survival of the fittest’. Essentially, GP consists of a set of instructions and a fitness function to measure how well a computer program has performed a task. It is a specialization of genetic algorithms (GA) in which each individual is a computer program.
Genetic Programming is a machine learning technique in which a population of computer programs is optimized according to ‘fitness criteria’ determined by a program’s ability to perform a given computational task. Fit programs survive and are carried forward in the evolutionary process. Fitness usually denotes the optimum value of a given objective function; in other words, fitness represents the ‘quality’ of a given solution relative to others. Individuals in a new population are created by the methods of ‘reproduction’ and ‘crossover’.
In other words, the ‘most fit’ programs are crossbred and also possibly randomly mutated, creating a new generation of child programs. The unfit programs are discarded and the best are bred again.
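To illustrate the evolutionary loop of fitness, selection, crossover and mutation, here is a toy sketch in Python. For brevity the individuals are bit strings and the goal is an invented all-ones target (i.e. a plain genetic algorithm); in Genetic Programming proper each individual would be a program tree, but the loop is the same:

```python
# A toy sketch of the evolutionary loop: evaluate fitness, keep the fittest,
# crossbreed them and occasionally mutate, then repeat.
import random

random.seed(42)
TARGET = [1] * 20                           # hypothetical goal: all ones

def fitness(ind):
    # How well an individual performs the task: here, how many bits match.
    return sum(a == b for a, b in zip(ind, TARGET))

def crossover(a, b):
    # Combine two fit parents at a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.05):
    # Occasionally flip a bit to keep exploring new variants.
    return [1 - g if random.random() < rate else g for g in ind]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(50)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                               # a perfectly fit program evolved
    survivors = population[:10]             # unfit individuals are discarded
    population = survivors + [
        mutate(crossover(random.choice(survivors), random.choice(survivors)))
        for _ in range(40)
    ]

print(generation, fitness(population[0]))   # generations taken, best fitness found
```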
Once set up, the genetic program runs and evolves by itself and needs no further human input. Genetic Programming was pioneered by Stanford’s John Koza, who used it to design an antenna for NASA, identify proteins and invent electrical controllers.
The eerie part of GP is that the code is inscrutable. The program evolves and mutates into variations that cannot be easily reproduced. Clearly this is fodder for science-fiction-like scenarios of self-aware, paranoid and psychopathic programs. Here is an interesting article that discusses this: This is What Happens When You Teach Machines the Power of Natural Selection.
Quantum computing
Computers of today, from hardy mainframes to smartphones, operate on binary logic. The entire edifice of today’s computing is based on the binary states of the semiconductor, which can be either in the state ‘0’ or ‘1’. All computation can be reduced to arithmetic and logical operations on binary digits, or more simply, binary arithmetic. Quantum computers deviate significantly from the binary arithmetic of classical computers. The unit of a quantum computer is the ‘qubit’, which can be in state ‘0’, state ‘1’, or in both ‘0’ and ‘1’ simultaneously through the principle of superposition.
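As an illustration, here is a small numpy sketch of a qubit as a length-2 complex vector placed in an equal superposition of ‘0’ and ‘1’ (the names and numbers are only for illustration, not any quantum computing library):

```python
# A minimal sketch of a single qubit. A classical bit is either 0 or 1;
# a qubit is a length-2 complex vector whose squared amplitudes give the
# probability of measuring 0 or 1.
import numpy as np

zero = np.array([1, 0], dtype=complex)      # the state |0>
one  = np.array([0, 1], dtype=complex)      # the state |1>

# An equal superposition of |0> and |1>.
plus = (zero + one) / np.sqrt(2)

probabilities = np.abs(plus) ** 2
print(probabilities)                        # [0.5 0.5]: either outcome is equally likely
```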
To understand the power of quantum computing, here is an excerpt from the Ars Technica article “A tale of two qubits: How quantum computers work”:
“Bits, either classical or quantum, are the simplest possible units of information…. Measuring a bit, either classical or quantum, will result in one of two possible outcomes. At first glance, this makes it sound like there is no difference between bits and qubits. In fact, the difference is not in the possible answers, but in the possible questions. For normal bits, only a single measurement is permitted, meaning that only a single question can be asked: Is this bit a zero or a one? In contrast, a qubit is a system which can be asked many, many different questions, but to each question, only one of two answers can be given”
The article further goes on to state that “Classical computer memories are constrained to exist at any given time as a simple list of zeros and ones. In contrast, in a single quantum memory many such combinations can all exist simultaneously. During a quantum algorithm, this symphony of possibilities is split and merged, eventually coalescing around a single solution. “
Having more than one qubit gives rise to an additional property called ‘quantum entanglement’. An entangled pair of qubits cannot be described by the states of the individual qubits alone; states which exhibit such extra correlations are described as ‘entangled’ states. Hence, in the case of two qubits, ‘the whole is greater than the sum of its parts’. Entanglement and superposition are the cornerstones which give quantum computing its power. Here is a short and interesting animation of quantum computing.
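Extending the single-qubit sketch above to two qubits shows what entanglement looks like numerically; the Bell state below puts all its probability on the outcomes 00 and 11, so measuring one qubit fixes the other:

```python
# A small sketch of a two-qubit 'entangled' Bell state. Its four amplitudes
# cannot be factored into two single-qubit states: the "whole greater than
# the sum of its parts" property described above.
import numpy as np

zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)

# |00> and |11> as tensor (Kronecker) products of single-qubit states.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

probabilities = np.abs(bell) ** 2
print(probabilities)    # [0.5 0. 0. 0.5]: only 00 and 11 ever occur, never 01 or 10,
                        # so measuring one qubit instantly fixes the other
```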
With classical computing techniques, searching an unsorted phonebook of 10,000 entries would require us to look up 5,000 entries on average, while a quantum search algorithm needs only about 100 queries. In other words, it would take a quantum computer only 5,000 guesses to search through a phonebook with 25 million names. That is the power of quantum computers!
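As a back-of-the-envelope check on these numbers (roughly N/2 classical lookups on average versus about √N quantum queries for Grover’s search):

```python
# Compare the average classical lookup count (N/2) with the order-of-magnitude
# query count of Grover's quantum search (about sqrt(N)).
import math

for n in (10_000, 25_000_000):
    classical = n // 2
    quantum = round(math.sqrt(n))
    print(f"N={n:>11,}: ~{classical:,} classical lookups vs ~{quantum:,} quantum queries")
```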
Applications of quantum computers range from weather modeling and cryptography to solving problems that have been considered ‘intractable’ with classical computing methods. NASA is planning to use quantum computers in its search for exoplanets.
Deep Learning, Genetic Programming and Quantum Computing represent paradigmatic, lateral shifts in computing. They herald a new era in computing and will enable us to crack extremely complex problems in this Age of the Internet.
Classical computing will continue to play a role in our daily lives, but for the real-world problems of the next decade and beyond, it will be these three computing approaches that hold the key to our future!