
We will see a whole new type of computer, says AI pioneer Geoff Hinton


Geoffrey Hinton with headphones in front of a bookshelf background

Turing Award winner Geoffrey Hinton said conventional digital computers, by prioritizing reliability, miss out on “all sorts of variable, random, unstable, analog, unreliable properties of the hardware itself, which could be very useful to us.”

NeurIPS 2022

According to AI pioneer Geoffrey Hinton, machine learning forms of artificial intelligence will revolutionize computer systems, creating a new kind of union of hardware and software that can put AI into the machine that toasts your bread.

Hinton, giving the closing keynote on Thursday at this year’s Conference on Neural Information Processing Systems (NeurIPS) in New Orleans, said the machine learning community “has been slow to realize what deep learning means for the way computers are made.”

He continued, “What I think is we’re going to see a completely different kind of computer, not in the next few years, but there’s every reason to investigate this completely different kind of computer.”

All digital computers to date have been built to be “immortal,” with the hardware engineered to be reliable so that the same software runs everywhere. “We can run the same program on different physical hardware… the knowledge is immortal.”

Also: Philosopher David Chalmers says AI may have a 20% chance of sentience in the next 10 years

Slide: A new type of computer

Geoffrey Hinton

That requirement, Hinton said, meant digital computers missed out on “all sorts of variable, random, unstable, analog, unreliable properties of the hardware, which could be very useful to us.” Those properties would be too unreliable to let “two different pieces of hardware behave identically at the level of the instructions.”

Future computer systems, Hinton said, will take a different approach: they will be “neuromorphic,” and they will be “mortal,” meaning that every computer will be a close bond between the software that represents a neural network and hardware that is messy, in the sense of having analog rather than digital elements, that can incorporate elements of uncertainty, and that can evolve over time.

“Now, the alternative to that, which computer scientists really don’t like because it attacks one of their foundational principles, is to say we’re going to give up on the separation of hardware and software,” Hinton explained.

Also: LeCun, Hinton, Bengio: AI Conspirators Awarded Prestigious Turing Award

“We’re going to do what I call mortal computation, where the knowledge the system has learned and the hardware are inseparable.”

These mortal computers could be “grown,” he said, doing away with expensive chip fabrication plants.

“If we do that, we can use very low-power analog computation, you can have trillion-way parallelism using things like memristors for the weights,” he said, referring to a decades-old kind of experimental chip based on non-linear circuit elements.

“And you can also evolve the hardware without knowing the precise behavior of different bits of the hardware.”
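As a purely illustrative aside, and not something Hinton presented, a short NumPy sketch shows why this kind of analog hardware breaks the “immortal” model: if weights are stored as physical conductances that deviate a few percent from their programmed values, two chips programmed with the same network no longer compute the same function. The numbers and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model of an analog crossbar doing a matrix-vector multiply,
# with each device's conductance deviating ~5% from the weight it was
# programmed to hold. Two "identical" chips now disagree.
W_programmed = rng.normal(0.0, 1.0, (64, 32))                  # intended weights
chip_a = W_programmed * rng.normal(1.0, 0.05, W_programmed.shape)
chip_b = W_programmed * rng.normal(1.0, 0.05, W_programmed.shape)

x = rng.normal(0.0, 1.0, 64)
y_a, y_b = x @ chip_a, x @ chip_b                              # same input, two chips
print("chip-to-chip disagreement:",
      np.linalg.norm(y_a - y_b) / np.linalg.norm(x @ W_programmed))
```

A learning procedure that runs on the device itself could fold that per-chip variation into the learned weights, which is the role Hinton later assigns to the forward-forward algorithm.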

Also: Deep learning godfathers Bengio, Hinton and LeCun say the field can fix its flaws

Hinton told the NeurIPS crowd that the new mortal computers will not replace traditional digital computers. “It won’t be the computer that takes care of your bank account and knows exactly how much money you have,” said Hinton.

“It’s going to be used for putting something else: it’s going to be used for putting something like GPT-3 in your toaster for a dollar, so that for a few watts you can have a conversation with your toaster.”

Hinton with a slide on mortal computation

NeurIPS 2022

Hinton was asked to speak at the conference in recognition of his paper from a decade ago, “ImageNet Classification with Deep Convolutional Neural Networks,” written with his graduate students Alex Krizhevsky and Ilya Sutskever. The paper received the conference’s Test of Time award for its “huge impact” on the field. Published in 2012, it marked the first time a convolutional neural network competed at a human level on the ImageNet image recognition competition, and it is widely regarded as the event that set off the current era of AI.

Hinton, who received the ACM Turing Award, computer science’s equivalent of a Nobel Prize, for his achievements, formed the “deep learning conspiracy” that revived the dying field of machine learning, together with his fellow Turing recipients Yann LeCun of Meta and Yoshua Bengio of Montreal’s MILA artificial intelligence institute.

Also: AI on steroids: Much larger neural networks come with new hardware, Bengio, Hinton and LeCun say

In that sense, Hinton is AI royalty in the field.

In his invited talk, Hinton spent most of his time discussing a new approach to neural networks called a forward-forward network, which does away with the backpropagation technique used in most neural networks. He proposed that, by removing the backward pass, forward-forward networks might more plausibly approximate what happens in the brain in real life.

A draft paper on the forward-forward network is posted on Hinton’s homepage (PDF) at the University of Toronto, where he is professor emeritus.
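Hinton’s draft describes the idea at a level that is easy to sketch in code. What follows is a minimal, illustrative NumPy implementation, not Hinton’s own code, assuming the draft’s definition of a layer’s “goodness” as the sum of its squared activations: each layer is trained locally, with no backward pass through the rest of the network, to push goodness above a threshold for real (“positive”) data and below it for corrupted (“negative”) data, and hidden vectors are length-normalized before being passed on so the next layer cannot simply read off the previous layer’s goodness. The class and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(h, eps=1e-8):
    # Normalize each hidden vector to unit length so a layer cannot judge
    # goodness just from the magnitude handed up by the layer below.
    return h / (np.linalg.norm(h, axis=1, keepdims=True) + eps)

class FFLayer:
    """One fully connected layer trained with a purely local objective."""

    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.b = np.zeros(n_out)
        self.lr, self.theta = lr, theta

    def forward(self, x):
        return np.maximum(0.0, x @ self.W + self.b)  # ReLU activations

    def train_step(self, x_pos, x_neg):
        # "Goodness" = sum of squared activations. Push it above theta for
        # positive data and below theta for negative data, using only this
        # layer's own gradient -- no backpropagation between layers.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            h = self.forward(x)
            goodness = (h ** 2).sum(axis=1)
            p = 1.0 / (1.0 + np.exp(-sign * (goodness - self.theta)))
            # Gradient of -log p through goodness; the factor 2*h also
            # zeroes out gradients for inactive ReLU units.
            dh = (-(1.0 - p) * sign)[:, None] * 2.0 * h
            self.W -= self.lr * x.T @ dh / len(x)
            self.b -= self.lr * dh.mean(axis=0)
        # Hand length-normalized activations to the next layer.
        return normalize(self.forward(x_pos)), normalize(self.forward(x_neg))

# Toy usage: positives are samples with correlated features; negatives are
# the same samples with features shuffled, destroying that structure.
X_pos = rng.normal(0.0, 1.0, (256, 20)).cumsum(axis=1)
X_neg = rng.permuted(X_pos, axis=1)
layers = [FFLayer(20, 64), FFLayer(64, 64)]
for epoch in range(100):
    h_pos, h_neg = normalize(X_pos), normalize(X_neg)
    for layer in layers:
        h_pos, h_neg = layer.train_step(h_pos, h_neg)
```

Because each layer learns from quantities it can measure locally, nothing in this scheme requires the exact derivative of the hardware’s behavior, which is what makes it attractive for the messy analog devices Hinton described.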

The forward-forward approach could be well suited to mortal computing hardware, Hinton said.

“Now, if all of that happens, we’re going to need a learning procedure that will run in a particular piece of hardware and learn to make use of the specific properties of that particular piece of hardware without knowing what all those properties are,” Hinton explained. “But I think the forward-forward algorithm is a promising candidate for what that small procedure might be.”

Also: New Turing Test: Are You Human?

One obstacle to building new analog computers, he said, is that people are attached to the reliability of running one piece of software on millions of devices.

“You’d replace that with each of those cell phones having to start out as a baby cell phone, and having to learn how to be a cell phone,” he suggested. “And that’s very painful.”

Even the most adept engineers of the technology involved will be slow to abandon the model of the perfect, immortal computer that performs identically everywhere, for fear of uncertainty.

“Among the people interested in analog computation, there are still very few who are willing to give up on immortality,” he said. That reluctance, he said, comes from the attachment to consistency and predictability. “But if you want your analog hardware to do the same thing every time… you’ve got a real problem with all these stray electrical things and stuff.”
