AI Will Not Kill Quantum Computing

Written by maken8 | Published 2024/01/12
Tech Story Tags: quantum-computing | ai-model-training | ai-technology | funding | research | classical-computing | nanotechnology | hackernoon-top-story

TL;DR: Dr. Sabine Hossenfelder's latest YouTube video is titled 'It looks like AI will kill Quantum Computing', and I would like to share why this shall not happen before this jab at quantum computers turns cold. Quantum computers are of great interest to human beings because they are a real way to uncover computational truths about the universe.

Dr. Sabine Hossenfelder’s latest YouTube video is titled ‘It looks like AI will kill Quantum Computing’.

I would like to share why this shall not happen, before this jab at quantum computers turns cold.

Quantum Computers Are of Great Research Interest

If you came away from the video with an earnest belief that AI shall indeed kill quantum computers by drying up their research funding, then you might as well believe that AI will kill particle physics, cosmology, and quantum gravity.

And where the hell is the Higgs Boson hiding if it holds more mass-energy than a hydrogen atom?

Just saying.

Continuous progress in the theoretical and experimental realization of better quantum computers is of great interest to human beings because it is a real way to uncover computational truths about the universe. We are curious about building big, powerful quantum computers like we are curious about what happens inside a black hole.

Shall a big, powerful quantum computer hack RSA and uncover all those encrypted secrets (about aliens)? I want to find out. Does a black hole teleport us into another universe? Dr. Sabine is interested in this one.

As AI makes big tech founders billions, and soon trillions, of dollars once we start putting servers on the moon, a lot of that newfound funding will go towards, take a guess, research in quantum computing.

Computing Is Not Always About Speed

The Graph

We are always comparing the computational speed of quantum computers with that of classical computers, in a graph that typically looks like this:

The graph starts with the line for quantum computing above that of classical computing; at first, classical computers are more efficient and take less time to do the operations. Later on comes a cross-over point, after which classical computers take far more time to do the operations than a quantum computer.

That cross-over point is when a quantum computer of modest size and complexity cracks classical encryption algorithms and handles simulation and traveling-salesman-style problems that no classical computer can touch.
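To make the cross-over idea concrete, here is a minimal sketch in Python. It assumes a GNFS-like sub-exponential cost model for classical factoring and a Shor-like polynomial cost model for a quantum computer; the constants (the 1.9 in the exponent and the 1e9 error-correction overhead) are made-up illustrative numbers, not hardware measurements. The only point is that a polynomial curve eventually dips below a sub-exponential one.

```python
# Minimal sketch of the cross-over point: a sub-exponential classical
# cost model (GNFS-like) versus a polynomial quantum cost model
# (Shor-like) for factoring an n-bit RSA key. The constants below are
# illustrative assumptions, not measured hardware numbers.
import math

def classical_cost(bits: int) -> float:
    """Rough GNFS-style cost: exp(c * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = bits * math.log(2)
    return math.exp(1.9 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def quantum_cost(bits: int) -> float:
    """Rough Shor-style cost: ~ (log N)^3, times an assumed overhead."""
    ln_n = bits * math.log(2)
    return 1e9 * ln_n ** 3

for bits in (256, 512, 1024, 2048, 4096):
    c, q = classical_cost(bits), quantum_cost(bits)
    cheaper = "quantum" if q < c else "classical"
    print(f"{bits:5d}-bit key: classical ~{c:.2e}, quantum ~{q:.2e} -> {cheaper} cheaper")
```

Move the constants and the cross-over moves with them, which is exactly the knob Dr. Sabine argues AI-assisted classical computing keeps turning.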

Dr. Sabine argues that AI is shifting the lines and pushing the cross-over point into the very far future, which makes Meta senior fellow and former tech chief Mike Schroepfer sound quite prescient when he calls quantum computing irrelevant to Meta right now:

“…(quantum machines) may come at some point, but it’s got such a long time horizon that it’s irrelevant to what we’re doing.”

But computing is not always about solving problems quickly.

Modeling

Computing is also about modeling problems.

While the latest QC - the 433-qubit IBM Osprey - is not gargantuan enough for the computational problems that most people are paying for, the fact that you would need a supercomputing complex to model what it can model should be impressive enough.
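Here is a back-of-the-envelope sketch in Python of why. A naive exact simulation stores one complex amplitude per basis state, so 2^n amplitudes for n qubits, at roughly 16 bytes each; this is the naive bound, not a claim about any particular simulator.

```python
# Back-of-the-envelope sketch: memory needed to hold the full state
# vector of an n-qubit register in a naive exact classical simulation,
# at one complex amplitude (16 bytes) per basis state.
def state_vector_bytes(n_qubits: int) -> float:
    return float(2 ** n_qubits) * 16.0

for n in (30, 50, 100, 433):
    print(f"{n:3d} qubits -> ~{state_vector_bytes(n):.3e} bytes")

# ~30 qubits already needs ~17 GB, ~50 qubits ~18 PB, and 433 qubits
# needs more bytes than there are atoms in the observable universe.
```

Real simulators exploit circuit structure to do far better than this naive bound, but for generic circuits the exponential wall is the reason a supercomputing complex enters the conversation.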

Know what, we should do it.

Let us model something impressive on Osprey and also try to model it on Frontier, then see which one uses less electricity. This modeling contest can be appreciated as high art, designed to entertain. And where there is entertainment, people are willing to pay for it.

On a more practical note, quantum sensors built into the measurement side of quantum computers are a peek into the universe that classical sensors have no hope of seeing. Of note are LIGO and atomic clocks.

The latter can see and count vanishingly small slices of time, down toward atomic zeptoseconds, using minimal energy cycled periodically by continuously exciting an atom.

Telling time very accurately helps us keep an accurate, albeit clunky calendar.

The former can see gravitational waves using a laser interferometer - a Michelson layout, close kin to the Mach-Zehnder interferometer. In the quantum computing world, that interferometer amounts to little more than a couple of single-qubit gates around a phase shift.

So, if a circuit that small can see gravitational waves, wonder what our elaborate algorithms can see.

We need to find out.
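As a starting point, here is a minimal sketch of the interferometer-as-circuit picture: a one-qubit simulation in Python with numpy (assumed available). Beam splitters behave like Hadamard gates, and the tiny arm-length difference a passing gravitational wave introduces appears as a relative phase that shifts the detection probability. The phase values at the end are arbitrary illustrative inputs, not LIGO numbers.

```python
# Minimal sketch: a Mach-Zehnder-style interferometer as a one-qubit
# circuit. Beam splitters act like Hadamard gates; the relative phase
# between the two arms (which a passing gravitational wave would nudge)
# sets the probability of light exiting the bright port.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # beam splitter ~ Hadamard gate

def bright_port_probability(phase: float) -> float:
    """P(detect at the bright port) for a given arm phase difference."""
    P = np.array([[1, 0], [0, np.exp(1j * phase)]])  # phase shifter on one arm
    state = H @ P @ H @ np.array([1, 0])             # send |0> through the circuit
    return float(abs(state[0]) ** 2)                 # equals cos^2(phase / 2)

for phi in (0.0, 1e-3, np.pi / 2, np.pi):
    print(f"phase {phi:.4f} rad -> P(bright) = {bright_port_probability(phi):.8f}")
```

The whole detector boils down to reading that cos^2(phase/2) fringe to absurd precision, which is why measurement-grade quantum hardware is worth watching.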


The measurement-computing use cases for quantum computing shall only get more impressive. I hope to one day measure the Hamiltonian of the smallest Bitcoin miner possible.

AI Might Love Quantum Computing

We need to realize that AI might love quantum computing, because AI loves learning.

AI might be pushing the boundaries of classical computers, making them develop heuristic powers that even the people who train AI models marvel at. But no matter how hard you push a car, it will never be a boat.

While AI is still poor at explaining quantum mechanics compared to most human physicists, that might change as we keep pumping AI with quantum computing data.

Machine learning of quantum technology (MLQ, not QML, which is quantum machine learning) should be of interest to Meta, because the road to developing better quantum computers will also yield better nanomaterials relevant to the classical computing world.

The quantum computing world, like the classical computing world, is now parked at the nanoscale. Being neighbors, they could benefit from each other. They will.

P.S. >> Quantum computers also have their finger on improving battery technology, because an entanglement of atoms can pack more energy than the same atoms without entanglement.

Admit it; these quantum computers are interesting.

