It’s the end of an era: a time when the growing power of the processors in our computers and smartphones fueled an unprecedented technological and economic boom. 2016 will indeed mark the end of Moore’s law. What is that? A law stating that the number of transistors, the basic building block of processors, doubles every two years, and with it the power of our devices. But the end of this blessed era does not mean technology will stagnate, far from it. This halt could even be good news for you.
Strictly speaking, it is not a law but a prediction made more than 40 years ago by Gordon Moore, a co-founder of Intel. It has been surprisingly accurate ever since, but it is coming to an end. Intel announced on March 23 that the power of its future processors would now double every two and a half years, not every two. This month, the consortium of leading chipmakers also acknowledged the end of Moore’s law, even though its roadmap had been modeled on it for over 20 years.
Before looking at why this trend is ending and what it means for the future of your smartphones, it helps to grasp just how explosive this progression has been, and how critical the explosion of processor capacity has been in recent decades. A transistor is a kind of electronic switch, often made of silicon, that looks like this:
Since its invention in 1947, the transistor has shrunk dramatically, which means more of them fit in the same space (a small circuit board, for example). Intel’s first microprocessor, released in 1971, contained 2,300 transistors and allowed the computer it powered to perform 90,000 operations per second. Today, the processor of a high-end desktop computer has more than 700 million transistors.
This exponential progression has increased computing power enormously; the longer it goes on, the more staggering a doubling every two years becomes. This gif, made by Mother Jones, helps visualize the exponential curve: if each calculation per second performed by processors were a liter of water, here is how long it would take to fill Lake Michigan.
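To get a feel for the curve, here is a minimal sketch of the doubling rule, seeded with the article’s own figure of 2,300 transistors in Intel’s 1971 microprocessor. The function name and the assumption of a perfect doubling every two years are illustrative, not a claim about real chip generations.

```python
def transistors(year, base_year=1971, base_count=2300):
    """Hypothetical transistor count if a doubling every two years held perfectly."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Twenty years of doubling turns thousands into millions;
# forty years turns them into billions.
for year in (1971, 1991, 2011):
    print(year, round(transistors(year)))
```

Under this idealized rule, twenty years means ten doublings, i.e. a factor of about 1,000, which is roughly why the Lake Michigan animation looks flat for decades and then fills almost at once.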
As you can see, the lake seems to stay desperately empty from 1940 to 2012, then fills up in under 13 years. But all that is over. Or rather, growth will continue, just more slowly. In fact, it had already slowed: during the 2000s, processor power hit a physical ceiling. Chipmakers then split the processor into several “cores” (the famous “quad-core” you often see in computer and smartphone spec sheets) to keep pace with Moore’s law, which is ultimately more a self-fulfilling prophecy than a physical law.
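The multi-core workaround can be sketched in a few lines: instead of one ever-faster worker, a task is cut into chunks handled side by side. The function names and chunk count below are my own illustration, not any real chip’s design.

```python
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    # Each "core" handles its own slice of the work.
    return sum(chunk)

def parallel_sum(numbers, cores=4):
    """Split `numbers` into `cores` chunks and sum them concurrently."""
    size = -(-len(numbers) // cores)  # ceiling division
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor(max_workers=cores) as pool:
        return sum(pool.map(sum_chunk, chunks))

print(parallel_sum(list(range(1, 101))))  # same result as sum(range(1, 101))
```

The catch, and the reason multi-core was only a partial fix, is that software must be written to split its work this way; a program that cannot be divided gains nothing from extra cores. (In Python specifically, true CPU-bound speedup would also require processes rather than threads.)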
The End Of A Headlong Rush
But in recent years, even this trick has no longer been enough. Transistors are now so small (14 nanometers, almost 10 times smaller than a flu virus) that they are increasingly expensive to produce and therefore less profitable. Above all, at this scale the electrons that carry information between the various electronic components start to misbehave, jumping from one transistor to another without being asked (this is the tunnel effect).
Should we worry about the end of a growth rate that any head of state would envy? Not really. In fact, users may even come out ahead, after years of feeling forced to replace their machines ever more quickly.
It is a problem we all know. When your computer or smartphone starts to get old (which in the computing world means five years at most), it struggles even with tasks it could once perform easily. Websites evolve, for example, and new features are added (video playback, interactive images, etc.), so the same page becomes harder for an old computer to display.
“Before, with this exponential growth in power, there was really no need to optimize software; designers paid little attention to it and simply waited for a new chip twice as powerful to become available,” Pierre Boulet, professor of computer science at the University of Lille, told HuffPost.
Battery And Energy, The New Battleground
“Now it’s a bit like the car industry: the power of cars doesn’t change much, but each new model brings many improvements,” says the researcher. “In recent years, some have started to realize there is an energy-consumption problem,” Pierre Boulet adds. A problem that concerns both the gigantic data centers of Google and Facebook (which consume enormous amounts of power) and the millions of users of smartphones whose screens are ever more impressive but whose battery life is ever more fragile.
Thus, to save energy, and therefore battery life in smartphones, chips now contain many circuits dedicated to specific tasks (GPS, antenna, 3D rendering, etc.) so that the full power of the processor is not drawn on at every moment. A trend that should continue as increases in raw power become harder to achieve, as the end of Moore’s law dictates.
Beyond power consumption, companies are also trying to improve the other components surrounding the processor, which sometimes limit its effective computing power. Memory, for instance: in July, Intel unveiled a new type of memory 1,000 times faster than the flash memory in our smartphones. “The processor and the software waste a lot of time waiting for memory. Such a change would be huge, but it is not constant: once the new technology is in place, it will not progress much in the years ahead,” says Pierre Boulet.
The silicon industry is not simply accepting the end of Moore’s law and settling for optimizing what already exists. Even if the number of transistors grows more slowly, it will keep growing for years. New techniques are being developed and tested to shrink transistors further, or even to build 3D processors with multiple layers of components.
Many researchers and companies are already looking for the technology that will replace the microprocessor behind the success of Microsoft, Google, Apple, and Facebook: in short, the aptly named Silicon Valley. We tend to forget it, but while Moore’s law has revolutionized our world and countless fields (computing is almost everywhere, from our homes to our pockets to our cars), silicon is only one technology among others. Many avenues have thus been explored for years to find what will replace the silicon transistor.
Many teams, including at IBM, are working on carbon nanotubes, which could be up to 10 times smaller than the best current transistors while remaining just as effective. Others are turning to biology, using proteins instead of electrons to transmit information.
The (long) Road To The Quantum Computer
But whatever happens, no amount of shrinking can make a transistor smaller than an atom. One of the most serious avenues, explored since the 1990s, is the quantum computer. “It is the only solution if we want to go beyond the limit of the atom,” says Pierre Boulet.
To put it (very) simply, the idea is to transport the principles of computing down to the infinitely small, the realm of quantum physics. In a conventional processor, each transistor is either open or closed: these are the famous bits, the “1s and 0s” of computers. In a quantum computer, each “quantum bit” (qubit) can take the value 1, 0, or... both at once. The why and how are complicated, but such a change could greatly increase computing power.
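The bit/qubit contrast above can be made concrete with a toy model: a qubit can be described as a pair of amplitudes (a, b), and the squares of their magnitudes give the probability of reading 0 or 1. The function name and numbers are illustrative; this is a simulation of the idea, not real quantum hardware.

```python
import math

def measure_probabilities(amplitudes):
    """Probability of observing 0 or 1 when measuring a qubit with amplitudes (a, b)."""
    a, b = amplitudes
    return abs(a) ** 2, abs(b) ** 2

# A classical bit is one value or the other:
bit_zero = (1, 0)

# A qubit can sit in an equal superposition, the "both at once" of the text:
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))
p0, p1 = measure_probabilities(plus)
print(round(p0, 2), round(p1, 2))  # each outcome equally likely
```

The power comes from scale: where n classical bits hold one of 2^n values at a time, n qubits can hold amplitudes over all 2^n at once, which is what certain quantum algorithms exploit.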
Except there is a catch. “Some algorithms and programs would benefit, others would not, but in any case all software would have to be rebuilt from scratch,” says Pierre Boulet. A revolution that is hard to imagine, given how much has already been built on the foundations of the sacrosanct transistor and the late Moore’s law. Even if their number no longer grows as fast, we will keep counting on silicon for many years to come.