Thoughts in 2021 about Hardware in Artificial Intelligence

Three game-changing feats

Warith Harchaoui
4 min read · Nov 6, 2021
All that in your pocket

In 2012, I attended a talk entitled Divide-and-Conquer and Statistical Inference in Big Data given by Michael I. Jordan at École Normale Supérieure in Paris. That insightful talk was about building the software artillery needed for artificial intelligence (AI) to improve further than anybody could have expected, except perhaps a few inspiring doers, as we all astonishingly see today.

One of the most interesting concepts (that I was able to understand) was the Big Data paradox of that time:

  • Big Data is paradise for the mathematical side because the “N goes to infinity” regime is nice for convergence, efficiency, prediction and generalization quality. Algorithms work better and more easily (with less human expertise); a tiny numerical illustration follows this list.
  • Big Data is hell for the computer side because it requires hardware and software infrastructures that barely existed at the time, in terms of memory, programming and computing issues: space, access, speed, energy, etc.
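The statistical blessing in the first point can be seen with a toy experiment. Below is a minimal sketch (my own illustration, not from the talk), assuming a standard normal population with true mean 0: the error of a plain sample-mean estimate shrinks roughly as 1/√N as the dataset grows.

```python
# Minimal sketch of the "N goes to infinity" blessing on the statistical side:
# the error of a simple sample-mean estimator shrinks roughly as 1/sqrt(N),
# so more data mechanically improves estimation quality, with no extra expertise.
# Assumption (for illustration only): a standard normal population, true mean 0.
import numpy as np

rng = np.random.default_rng(0)
for n in [10**2, 10**4, 10**6]:
    sample = rng.standard_normal(n)       # pretend this is our "Big Data"
    error = abs(sample.mean() - 0.0)      # distance to the true mean
    print(f"N = {n:>9,d}   |estimate - truth| ≈ {error:.5f}")
```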

Silicon Valley pioneer Gordon Moore observed in the 1960s that computing power doubles roughly every 18 months (the popular phrasing of his law, which was originally about transistor counts). Although its validity has been questioned since the beginning of the 2000s, the trend has largely held, historically thanks to silicon-based electronic feats and more recently thanks to radical changes in how hardware is used. Indeed, once silicon-based barriers were hit, parallel and distributed approaches to memory and computation across several machines (Hadoop, Spark, Amazon Web Services among others) and across several units of the same electronic component (Graphics Processing Units diverted to non-graphics AI computations) became the prominent game-changers of the 2010s. In fact, from the point of view of an AI practitioner, one can argue that Moore’s law is even exceeded, depending on how much money you can actually spend.
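As a rough order of magnitude, here is a minimal sketch (my own back-of-the-envelope arithmetic, taking the popular “doubling every 18 months” phrasing literally) of the compute growth such a law implies:

```python
# Back-of-the-envelope sketch: growth implied by "compute doubles every 18 months".
# Illustrative only; the doubling period and date ranges are assumptions, not measurements.

def moores_law_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Multiplicative growth in compute implied by periodic doubling."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    print(f"One decade (e.g. the 2010s): x{moores_law_factor(10):,.0f}")        # roughly x100
    print(f"From the mid-1960s to 2021:  x{moores_law_factor(2021 - 1965):,.0f}")
```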

Thanks to these achievements of the 2010s, the Big Data hell softened: abandoned and even ridiculed algorithms such as Neural Networks (convolutional, recurrent and all the sophisticated zoo of contemporary models) were rebranded as a new field called Deep Learning (thus with a higher number of layers) and finally proved plausible and excellent with up-to-date capabilities. New orders of magnitude in dataset collection let outstanding algorithms prove that data quality/quantity is sometimes more important than field-related human expertise (prediction-wise only, not for doing an actual job or making a decision) in an increasing number of fields such as image, sound, medical analysis and even physics.

In 2020/2021, we witnessed three feats in the real world (not predictions in a scientific magazine about some distant future… I mean available today!):

  • electronic computations were redefined by Apple’s M1 technology: a single chip contains the equivalent of an improved CPU, GPU, RAM and an extra Neural Engine, without communicating through a motherboard. The heat savings and computation power gained give Apple the freedom to give up respected hardware makers such as Intel and Nvidia (the latter for old commercial reasons, if my memory is correct) in order to run further AI features on mobile devices.
  • quantum computations were demonstrated by Google’s Sycamore processor achieving quantum supremacy, meaning it solved a task that no classical computer could solve in any feasible amount of time. It is very difficult to imagine how extreme the unlocked possibilities will be. Many scenarios in which scientists had to say “this is not tractable, not feasible, this is impossible” will surely disappear.
  • photonic computations are available right now thanks to the LightOn company. This technology shift is more than a hundred times faster (1,500 TOPS = 1,500 × 10¹² operations per second) than a good electronic CPU (10 TOPS), with less energy consumption than a human brain (20 Watts) or an office light bulb (100 Watts); a back-of-the-envelope comparison follows this list. To be honest, I cannot help daydreaming about the AI implications.
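To make the last point concrete, here is a minimal sketch using only the figures quoted above (the exact power draw of the photonic unit is not stated, so the energy lines simply take the brain and light-bulb budgets as upper bounds, as the claim suggests):

```python
# Back-of-the-envelope comparison using only the figures quoted above:
# 1,500 TOPS for the photonic unit, 10 TOPS for a good electronic CPU,
# ~20 W for a human brain, ~100 W for an office light bulb.

photonic_tops = 1_500   # 1 TOPS = 10**12 operations per second
cpu_tops = 10
brain_watts = 20
bulb_watts = 100

print(f"Photonic vs CPU throughput: x{photonic_tops / cpu_tops:.0f}")  # x150, "more than a hundred times"

# Assumption: the photonic unit stays within these power budgets, as claimed above.
for watts in (brain_watts, bulb_watts):
    ops_per_joule = photonic_tops * 1e12 / watts
    print(f"At {watts} W, that is {ops_per_joule:.1e} operations per joule")
```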

This year is the sad 10th anniversary of Steve Jobs’ death. Beyond the controversy about his management style, and beyond how profoundly personal computers changed the world, he is for me the human being who made the notion of “good taste” possible when it comes to otherwise industrial machines, a major actor in turning cartoons into animated movies, and the reason average Joes around the world now have access to all the music and all available knowledge in their pockets. As an AI enthusiast, although many things have been said about Steve Jobs, I cannot ignore the fact that, outside a lab and well past the thrill of building a prototype, a working AI-based piece of software can run on a device that I can put back in my pocket or wear on my wrist. This is probably why I still get goosebumps when I think about the Think Different ad campaign. I guess one of the most formidable contributions of this inspiring technology leader, following the early Silicon Valley founders, is enabling foolish dreamers to become hungry pioneers.

Warith Harchaoui, 6th November 2021
