Berke's Lab

Understanding Quantum Computing and QPU's


Introduction

There is a myth in the tech world that quantum computers are just "super-fast" versions of the computers we use today; at least, that is what I used to think. In reality, quantum and classical computing each have their own strengths and use cases. First, let's see how they differ.

The Dimensional Rift: Bits vs. Qubits

In classical computing we live in a deterministic binary reality. A 3-bit system has $2^3 = 8$ possible states (000 to 111), but it can only exist in one of those states at any given moment.

$$\text{State}_{classical} \in \{000, 001, \dots, 111\}$$

Quantum computing operates in a Hilbert Space. A 3-qubit system doesn’t just “store” a state; it exists as a linear combination (superposition) of all basis vectors simultaneously.

$$|\psi\rangle = \alpha_0|000\rangle + \alpha_1|001\rangle + \dots + \alpha_7|111\rangle$$

Here, each $\alpha_i$ is a complex probability amplitude, and the probabilities must sum to 1 ($\sum_i |\alpha_i|^2 = 1$).
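The normalization constraint above can be checked directly with a classical simulation. This is a minimal NumPy sketch (not real quantum hardware): a 3-qubit state is just a vector of 8 complex amplitudes, normalized so the squared magnitudes sum to 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 3-qubit state lives in a 2^3 = 8 dimensional Hilbert space:
# one complex amplitude alpha_i per basis state |000> ... |111>.
amplitudes = rng.normal(size=8) + 1j * rng.normal(size=8)

# Normalize so that sum(|alpha_i|^2) == 1, as the Born rule requires.
amplitudes /= np.linalg.norm(amplitudes)

probabilities = np.abs(amplitudes) ** 2
print(np.isclose(probabilities.sum(), 1.0))  # True
```

Any vector of complex numbers becomes a valid quantum state once divided by its norm; the physics only cares about the squared magnitudes.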

The $2^n$ Misconception

This is where the hype machine fails. A 50-qubit system holds $2^{50}$ states (about a quadrillion values) in superposition. A question I had: doesn't this mean the system can store a quadrillion values simultaneously? Technically, yes. Practically, no. This is due to the Holevo bound and the measurement axiom. You can perform massive parallel computations inside that Hilbert space, but the moment you ask the computer for an answer (measurement), the wavefunction collapses. You don't get $2^{50}$ answers; you get $n$ classical bits. The quantum computer is not a magic box that gives you all the answers at once; it is a probabilistic machine that gives you one answer based on the interference patterns of those amplitudes.

“A Quantum Computer is like a library where you can read every book simultaneously, but the laws of physics only allow you to leave the building with a single page of notes.”
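The collapse described above can be sketched in a few lines of NumPy (again, a classical simulation of the math, not real hardware): the state holds $2^n$ amplitudes, but a measurement returns just $n$ classical bits, sampled by the Born rule.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 3
dim = 2 ** n

# Uniform superposition: all 2^n basis states held at once...
amplitudes = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# ...but measurement collapses it to ONE basis state, sampled
# according to the Born rule: p_i = |alpha_i|^2.
probs = np.abs(amplitudes) ** 2
outcome = rng.choice(dim, p=probs)

# All you leave the building with is n classical bits.
print(format(outcome, f"0{n}b"))
```

Run it repeatedly and you get a different bitstring each time; extracting more information requires re-preparing the state and measuring again.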

The Architecture of “Useful” Work

Why can't a QPU replace your entire computer? Because computing isn't a single sport; it's a triathlon. Different architectures have evolved to solve fundamentally different mathematical problems.

The CPU: The General (Serial Processing)

The CPU is the Master of Logic. It is designed for complex branching, operating-system management, and serial tasks. It doesn't process massive amounts of data at once; it processes complex decisions one by one, incredibly fast.

$$\text{Result} = \begin{cases} A & \text{if } x > 0 \\ B & \text{if } x \leq 0 \end{cases}$$

The CPU is the General. It may not dig the trenches itself, but it tells the GPU where to dig and the QPU what to ask.
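The kind of work a CPU excels at looks like this: each step depends on a branch decided by the previous step, so nothing can be done in parallel. A small illustrative sketch (using the Collatz sequence as a stand-in for serial, branch-heavy code):

```python
# Serial, data-dependent branching: the CPU's home turf.
# Each iteration's branch depends on the previous result,
# so no amount of parallel hardware speeds this loop up.
def collatz_steps(x: int) -> int:
    steps = 0
    while x != 1:
        x = x // 2 if x % 2 == 0 else 3 * x + 1  # branch every step
        steps += 1
    return steps

print(collatz_steps(27))  # 111
```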

The GPU: The Army of Grunts (Parallel Processing)

The GPU is a master of linear, deterministic tasks. Decoding a video or rendering a video game requires changing the color values of millions of pixels simultaneously. This is simple matrix multiplication, done billions of times per second.

$$\text{Output} = \text{Input} \times \text{Transformation Matrix}$$

The GPU wins here because of Bandwidth. It is an army of disciplined soldiers. They aren’t smart, but there are thousands of them, and they all follow orders perfectly in sync.
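The data-parallel shape of that work can be sketched in NumPy (a CPU stand-in for what a GPU does natively): one linear transform applied to a million pixels in a single operation.

```python
import numpy as np

rng = np.random.default_rng(7)

# A million "pixels", each an RGB triple -- a stand-in for a frame buffer.
pixels = rng.random((1_000_000, 3))

# One linear transform for every pixel at once: here, the standard
# Rec. 601 luma weights that convert RGB to grayscale.
to_luma = np.array([0.299, 0.587, 0.114])

# A single matrix product updates all million pixels "in sync" --
# exactly the kind of lockstep work a GPU's thousands of cores are built for.
gray = pixels @ to_luma
```

Every output element is independent of the others, which is why this maps perfectly onto thousands of simple cores.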

The NPU: The Intuition (Tensor Operations)

The Neural Processing Unit (found in modern SoCs) is a specialized evolution of the GPU. It sacrifices precision (using 8-bit integers instead of 64-bit floats) for extreme efficiency in Matrix-Vector Multiplication. It doesn’t calculate “correct” math; it calculates “approximate” weights for AI inference.

$$y = \sigma(Wx + b)$$

The NPU is the Subconscious. It powers your FaceID and LLM auto-complete. It doesn’t know why the answer is correct, but it recognizes the pattern instantly.
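The precision-for-efficiency trade can be sketched in NumPy: quantize the layer's weights to 8-bit integers plus a scale factor, and the output of $y = \sigma(Wx + b)$ stays close to the full-precision answer. (The sizes and the per-tensor quantization scheme here are illustrative assumptions, not a specific NPU's design.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Full-precision weights and input for one dense layer: y = sigma(Wx + b).
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
b = rng.normal(size=4)

# NPU-style trick: store weights as int8 plus one float scale factor.
scale = np.abs(W).max() / 127.0
W_int8 = np.round(W / scale).astype(np.int8)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

y_exact = sigmoid(W @ x + b)
y_approx = sigmoid((W_int8.astype(np.float64) * scale) @ x + b)

# Close, not identical: precision traded for memory and energy efficiency.
print(np.max(np.abs(y_exact - y_approx)))
```

The int8 weights take a quarter of the memory of float32 (an eighth of float64), and integer multiply-accumulate units are far cheaper in silicon, which is the whole point of an NPU.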

The QPU: Probabilistic Interference

Quantum computers struggle with simple arithmetic ($2+2$ is surprisingly difficult for a QPU). However, they excel at finding the global minimum in a high-dimensional noise floor.

Take RSA encryption breaking or molecular simulation. These aren't "data stream" problems; they are "needle in a haystack" problems. Using algorithms like Shor's or Grover's, we manipulate the probability amplitudes ($\alpha$) so that the wrong answers interfere destructively (cancel out) and the right answer interferes constructively (amplifies).

$$|\psi\rangle_{final} \approx |\text{Solution}\rangle$$
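The amplitude amplification at the heart of Grover's algorithm can be simulated classically in a few lines of NumPy (this simulation takes exponential memory, which is exactly why you'd want real qubits): flip the sign of the "needle" amplitude, then reflect all amplitudes about their mean, and repeat roughly $\frac{\pi}{4}\sqrt{N}$ times.

```python
import numpy as np

n = 10                 # qubits
dim = 2 ** n           # N = 1024 haystack entries
marked = 123           # the "needle" we want to find

# Start in the uniform superposition: every answer equally likely.
state = np.full(dim, 1 / np.sqrt(dim))

# Optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = int(np.pi / 4 * np.sqrt(dim))
for _ in range(iterations):
    state[marked] *= -1            # oracle: phase-flip the needle
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

# Constructive interference has piled probability onto the needle.
probs = state ** 2
print(probs[marked])   # close to 1
```

After ~25 iterations the marked state carries nearly all the probability, so a single measurement returns the needle with high confidence; a classical search would need ~512 checks on average.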

The Future is Hybrid: CPU + GPU + QPU

We are not moving toward a world where you buy a “Quantum Laptop.” We are moving toward Heterogeneous High-Performance Computing.

In the year 2035, a "computer" will likely look like a triad of specialized architectures working in concert.

The “Quantum Accelerator” Paradigm

Just as we offload graphics to the GPU today, we will offload optimization and probabilistic problems to the QPU tomorrow.

