Computers are built around logic: performing mathematical operations using circuits. That logic is built from components like adders (the basic circuit that adds two numbers, not the snake). This is as true of today’s microprocessors as it is of every machine dating back to the very beginning of computing history. You could go back to an abacus and find that, on a fundamental level, it does the same thing as your brilliant gaming PC. It’s just a lot less capable.
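To make that concrete, here’s a minimal sketch of a full adder in Python (purely illustrative, since real adders are built from transistors, not code), chaining the same XOR, AND, and OR logic a hardware adder uses:

```python
# A toy 1-bit full adder, built from the same logic gates a CPU uses.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add two bits plus a carry; return (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in                  # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR gates
    return sum_bit, carry_out

def add_bits(x: list[int], y: list[int]) -> list[int]:
    """Ripple-carry addition of two little-endian bit lists."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 3 (binary 011) + 5 (binary 101) = 8
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. 8 little-endian
```

A modern CPU is doing essentially this, just across much wider numbers and billions of times a second.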
Nowadays, processors can perform many mathematical calculations in a single clock cycle, using any number of complex circuits, and they do far more than just add two numbers together. But to arrive at your shiny new gaming CPU, there was a process of iteration on the classical computers that came before, stretching back centuries.
As you can imagine, building something completely different from that is a little tricky, but that’s what some are striving to do with technologies like quantum and neuromorphic computing: two distinct concepts that could change computing forever.
“Quantum computing is a technology that, at least by name, we are used to hearing about and that is always cited as ‘the future of computing’,” says Carlos Andrés Trasviña Moreno, software engineering coordinator at CETYS Ensenada.
Quantum computers use qubits, or quantum bits. Unlike a classical bit, which can only exist in one of two states, a qubit can exist in either of those states or in a superposition of the two. It is zero, one, or both zero and one at the same time. And if that sounds terribly confusing, that’s because it is, but it also holds immense potential.
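If it helps to see the idea in familiar terms, here’s a toy simulation of a single qubit in Python. It’s my own illustrative sketch, not a real quantum library, and a classical simulation can only mimic the bookkeeping of superposition, not the physics:

```python
import math
import random

# A toy qubit: a pair of amplitudes over the |0> and |1> states.
class Qubit:
    def __init__(self):
        self.amp0, self.amp1 = 1.0, 0.0  # start in the |0> state

    def hadamard(self):
        """Put the qubit into an equal superposition of |0> and |1>."""
        a0, a1 = self.amp0, self.amp1
        s = 1 / math.sqrt(2)
        self.amp0, self.amp1 = s * (a0 + a1), s * (a0 - a1)

    def measure(self) -> int:
        """Collapse to 0 or 1 with probability |amplitude|^2."""
        p0 = abs(self.amp0) ** 2
        outcome = 0 if random.random() < p0 else 1
        self.amp0, self.amp1 = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
        return outcome

q = Qubit()
q.hadamard()      # now "both zero and one at once"
print(q.measure())  # 0 or 1, each with 50% probability
```

The catch is that simulating qubits classically gets exponentially harder as you add more of them, which is exactly why real quantum hardware is interesting.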
Quantum computers are expected to be powerful enough to crack modern “unbreakable” cryptography, accelerate drug discovery, reshape the way the global economy transports goods, help explore the stars, and revolutionize practically everything that involves crunching enormous amounts of numbers.
The problem is that quantum computers are immensely difficult to make and perhaps even more difficult to manage.
“One of the major disadvantages of quantum computing is its high power consumption, as it works with algorithms of much greater complexity than that of any current CPU,” continues Moreno. “Furthermore, it requires an environment with temperatures close to absolute zero, which worsens the energy requirements of the system. Finally, they are extremely sensitive to environmental disturbances such as heat, light and vibrations.
“Each of these can alter current quantum states and produce unexpected results.”
And while you can somewhat replicate the function of classical logic with qubits (we’re not starting completely from scratch in developing these machines), harnessing the power of a quantum computer requires new and complex quantum algorithms that we’re only just getting to grips with.
IBM is one company investing heavily in quantum computing, with a goal of creating a quantum computer with 4,158 or more qubits by 2025. Google has its fingers in quantum, too.
Of course, we are still a long way from the much-touted “quantum supremacy”: the moment a quantum computer outperforms today’s best classical supercomputers. Google said it achieved just that in 2019, though it may have turned out to be a niche accomplishment, albeit an impressive one nonetheless. In practical terms, however, we are not there yet.
They are a real pain to figure out, to put it scientifically. But that never stopped a good engineer.
“I think we’re scratching the surface there with quantum computing. And again, just like we broke the laws of physics with silicon over and over again, I think we break the laws of physics here too,” Marcus Kennedy, general manager of gaming at Intel, tells me.
There’s more immediate potential for the future of computing in artificial intelligence, your favorite buzzword of 2023. It’s genuinely a huge and life-changing development for many, and I’m not just talking about that smart-sounding and slightly too argumentative chatbot in your browser. Today we’re only scratching the surface of what AI can do, and to unlock its deeper, more impactful uses, there’s a new type of chip in the works.
“Neuromorphic computing is, in my opinion, the most viable alternative [to classical computing]”, says Moreno.
“In a sense, we could say that neuromorphic computers are biological neural networks implemented in hardware. One might think that it is simply translating a perceptron into voltages and gates, but it is actually a closer imitation of how brains work, of how the actual neurons communicate with each other across the synapse.”
What is neuromorphic computing? The answer’s in the name: neuro, meaning related to the nervous system. A neuromorphic computer aims to mimic the most complex computer ever known to man: the brain.
“I think we will reach a point where the processing capacity of those neuromorphic chips far exceeds the processing capacity of a monolithic die based on an x86 architecture, a traditional type of architecture. Because of the way the brain operates, we know it has capacity and capability that far surpasses anything else,” says Kennedy.
“The most effective kind of system tends to look a lot like the things you see in nature.”
Neuromorphic chips have yet to reach their defining moment, but they are coming. Intel has a couple of neuromorphic chips in development today: Loihi and Loihi 2.
And what is a neuromorphic chip, really? Well, it’s a brain of sorts, with neurons and synapses. But since it’s still made of silicon, think of it as a kind of hybrid between a classic computer chip and brain biology.
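For a rough feel of what one of those silicon “neurons” does, here’s a toy leaky integrate-and-fire neuron in Python. This is a common simplified model in neuromorphic research, and my own sketch of it, not how Loihi is actually implemented. Unlike a classic perceptron, it accumulates charge over time and only “fires” across the synapse when a threshold is crossed:

```python
# A toy leaky integrate-and-fire neuron (illustrative, not Intel's design).
class SpikingNeuron:
    def __init__(self, threshold: float = 1.0, leak: float = 0.9):
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # firing threshold
        self.leak = leak            # fraction of charge retained each step

    def step(self, weighted_input: float) -> bool:
        """Accumulate input; fire a spike when the threshold is crossed."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # spike sent across the "synapse"
        return False

neuron = SpikingNeuron()
inputs = [0.3, 0.4, 0.5, 0.0, 0.2]
print([neuron.step(i) for i in inputs])
# [False, False, True, False, False]: it fires only on the third input
```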
And not necessarily a big brain: Loihi 2 has 1 million neurons and 120 million synapses, which is many orders of magnitude smaller than a human brain, with its roughly 86 billion neurons and trillions of synapses. It’s hard to count them all, as you can imagine, so we don’t know the figure precisely, but we do have big brains. You can brag about that all you want to your smaller-brained animal companions.
For a better sense of the scale of grey matter we’re talking about here, a cockroach is estimated to have about as many synapses as Loihi 2.
“We argue that you don’t need all the complexity that the brain has for its function; if you’re going to do computing, you just need some of the basic functions of a neuron and synapse to really make it work,” Dr. Mark Dean told me in 2021.
Neuromorphic computing has plenty of room to grow, and with rapidly growing interest in AI, this nascent technology could prove to be the key to powering those increasingly impressive AI models you keep reading about.
You might think that AI models work just fine today, thanks mostly to the Nvidia graphics cards running the show. But the reason neuromorphic computing is so appealing to some is “that it can dramatically reduce a processor’s power consumption, while still handling the same computational capabilities as modern chips,” says Moreno.
“By comparison, the human brain is capable of hundreds of teraflops of processing power with just 20 watts of power consumption, while a modest graphics card can produce 40-50 teraflops of processing power with a power consumption of 450 watts.”
Basically, “If a neuromorphic processor were to be developed and implemented in a GPU, the amount of processing power would surpass any existing product with only a fraction of the power.”
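Running the numbers from Moreno’s comparison (taking an assumed midpoint of 200 teraflops for “hundreds”, so treat this as back-of-the-envelope arithmetic), the efficiency gap comes out at roughly two orders of magnitude:

```python
# Rough efficiency math behind the quoted comparison. The figures come
# from the quote above; the 200 TFLOPS brain estimate is an assumption.
brain_tflops, brain_watts = 200, 20  # assumed midpoint of "hundreds"
gpu_tflops, gpu_watts = 45, 450      # midpoint of the 40-50 TFLOPS figure

brain_eff = brain_tflops / brain_watts  # ~10 TFLOPS per watt
gpu_eff = gpu_tflops / gpu_watts        # ~0.1 TFLOPS per watt

print(f"Brain: {brain_eff:.1f} TFLOPS/W, GPU: {gpu_eff:.1f} TFLOPS/W")
print(f"Efficiency gap: ~{brain_eff / gpu_eff:.0f}x")
```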
Sound attractive? Yes, of course it does. The lower power consumption isn’t just huge for the potential computing power it could bring; using less energy also has knock-on effects for cooling.
“Changing the architecture of computing would also require implementing a different programming paradigm, which in itself will be an impressive feat as well,” Moreno continues.
Building a neuromorphic chip is one thing; programming it is another. This is one reason why Intel’s neuromorphic computing framework is open source: you need a lot of hands on deck to get this type of project off the ground.
“The thing we haven’t figured out yet is the software behind how to leverage the structure,” Kennedy says. “And so you can make a chip that looks a lot like a brain. Software is really what makes it work like a brain. And to this day, we haven’t cracked that nut.”
It will be some time before we completely replace AI accelerators with something resembling a brain, or swap out adders and binary logic, which are as old as computing itself, for quantum computers. Yet the experimental endeavors to replace classical computing as we know it have already begun.
A recent breakthrough claimed by Microsoft has the company very optimistic about the future of quantum, and IBM recently predicted that quantum computers will surpass classical ones at important tasks within two years.
In the words of Intel’s Kennedy, “I think we’re getting there.”