06/16/2025 | News release | Distributed by Public on 06/16/2025 08:45
On June 11, 2025, the Institute of Electrical and Electronics Engineers (IEEE) presented IBM with a Milestone award, commemorating the first practical demonstration of the Fast Fourier Transform (FFT).
This algorithm transformed the landscape of modern computing. It underlies the JPEG and MPEG standards, is used to reconstruct images for MRI and CT scans, and much more. Read more in our IBM Research blog.
But this isn't the end of the road for FFT. Rather, the lessons of FFT will further inspire the future of computing - especially as we think about a future with quantum computing. FFT has taught us that choosing the right representation can make the impossible possible, a lesson that will be critical as we search for new quantum algorithms.
The FFT: A Masterpiece of Mathematical Representation
When James Cooley and John Tukey introduced the FFT in 1965 [1], they didn't discover a new physical phenomenon - they unlocked a better way to represent information. The Fourier transform takes a time-domain signal - a wave - and breaks it into a sum of simpler waves of varying frequencies, a new and more ordered way to look at complicated information. However, before the 1960s, calculating the Fourier transform was too slow to be useful for real-time applications - decoding seismic waves, for example.
That's where the FFT comes in. Starting in 1963, Tukey and Cooley devised an algorithm that could vastly accelerate this process on a scientific computer, using a mathematical trick that reduces the number of operations required to perform the transformation. The result: they cut the computational complexity of the Discrete Fourier Transform from O(N²) to O(N log N) - a dramatic improvement.
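The divide-and-conquer idea behind that trick can be sketched in a few lines of Python. The sketch below is illustrative, not the authors' original implementation: it assumes input lengths that are powers of two, and production code would call an optimized library such as numpy.fft instead.

```python
import cmath

def dft_naive(x):
    # Direct evaluation of the Discrete Fourier Transform: O(N^2) operations.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):
    # Radix-2 Cooley-Tukey FFT: O(N log N). Assumes len(x) is a power of two.
    N = len(x)
    if N == 1:
        return list(x)
    even = fft(x[0::2])   # transform of the even-indexed samples
    odd = fft(x[1::2])    # transform of the odd-indexed samples
    twiddle = [cmath.exp(-2j * cmath.pi * k / N) for k in range(N // 2)]
    # Combine the two half-size transforms into the full-size one.
    return ([even[k] + twiddle[k] * odd[k] for k in range(N // 2)] +
            [even[k] - twiddle[k] * odd[k] for k in range(N // 2)])
```

Both functions compute the same transform; the recursion simply reuses the two half-size results instead of recomputing every term, which is where the N²-to-N log N savings comes from.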
This shift unlocked real-time signal processing. Today, Cooley and Tukey's FFT algorithm appears in telecommunications (e.g., 4G/5G, WiFi), audio and video compression (e.g., MP3, JPEG), medical imaging (e.g., MRI, CT), and scientific computing (e.g., spectral methods for solving PDEs). As the mathematician Gilbert Strang put it, the FFT is "the most important numerical algorithm of our lifetime."
What made their FFT algorithm so impactful? A simple insight: change the mathematical representation, and the computational problem transforms.
Abstractions and Representations: How We Understand and Compute
Every step forward in science has involved abstraction - the act of modeling reality - and representation, the encoding of these models in a computable form.
A bit is an abstraction of a physical state into a binary value. "Charge" or "no charge" becomes 0 and 1.
A number is an abstraction of quantity. Five objects can be represented as "5," "V," or "101."
The FFT is a case where a mathematical transform enabled simpler and faster computation through a new representation of the same information. A messy time domain signal becomes a sum of simpler frequency-domain terms.
But as we think about the future of computing, we can no longer think only about different ways to represent data - "5," "V," and "101" are functionally the same input for different tools. We need to think about new ways to abstract the data so that we can study it with entirely new computational paradigms.
Representation is the lens through which we see, and compute, the world. Change the lens, and entire vistas come into focus.
Quantum Computing: A New Abstraction of Reality
Quantum computing goes beyond merely improving classical methods - it redefines information representation and processing itself. Where classical operations are deterministic, quantum computation is a dance of superposition, interference, and entanglement, wrangling the inherently probabilistic nature of subatomic particles in order to arrive at a deterministic answer.
In classical computing, information is encoded in bits as binary code. Operations are Boolean and deterministic - logical operations between bits like "AND," "OR," and "NOT" return "TRUE" and "FALSE," determining the output of the next bits.
However, in quantum computing, information is encoded in qubits as probability amplitudes living in complex vector spaces. Rather than "0" or "1," each qubit stores α|0⟩ + β|1⟩, where the probability amplitudes α and β are complex numbers of the form a + bi. Rather than classical logic, computation becomes a unitary evolution of qubit states through matrix operations, and results are probabilistic projections in full agreement with the axioms of quantum mechanics. At the end of the calculation, |α|² and |β|² give the probabilities that the qubit will be measured as 0 or 1.
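To make the amplitude picture concrete, here is a minimal single-qubit sketch in plain Python (the function names are ours for illustration, not a real SDK's API): amplitudes as complex numbers, the Hadamard gate as a 2x2 unitary, and measurement probabilities as squared magnitudes.

```python
import math

def probabilities(alpha, beta):
    # Measurement probabilities are the squared magnitudes of the amplitudes;
    # for a valid state they sum to 1.
    return abs(alpha) ** 2, abs(beta) ** 2

def apply_hadamard(alpha, beta):
    # The Hadamard gate is the 2x2 unitary H = (1/sqrt(2)) [[1, 1], [1, -1]].
    # It maps |0> to an equal superposition of |0> and |1>.
    s = 1 / math.sqrt(2)
    return s * (alpha + beta), s * (alpha - beta)

# Start in |0>: alpha = 1, beta = 0. After H, the qubit is in superposition
# and measuring it yields 0 or 1 with probability 1/2 each.
a, b = apply_hadamard(1 + 0j, 0 + 0j)
p0, p1 = probabilities(a, b)
```

Note that applying H a second time returns the qubit exactly to |0⟩ - the amplitudes interfere, which is something classical probabilities cannot do.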
This shift enables entirely new computational capabilities. Shor's algorithm uses the Quantum Fourier Transform to factor integers exponentially faster than known classical algorithms [2]. Grover's algorithm offers a quadratic speedup for unstructured search - reducing search from O(N) to O(√N) queries [3]. And quantum simulation allows us to model quantum systems that are intractable on classical machines - Feynman's original vision [4].
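The Quantum Fourier Transform at the heart of Shor's algorithm is a direct descendant of the DFT. As a toy illustration of that connection, the sketch below (function names are ours) builds the QFT as a dense matrix - the same matrix as the DFT, up to normalization and a sign convention - and checks that it is unitary, i.e., a valid quantum operation. On a real device the QFT on n qubits is implemented with O(n²) gates rather than as an explicit 2ⁿ x 2ⁿ matrix, which is what makes it so cheap quantumly.

```python
import cmath

def qft_matrix(N):
    # QFT on an N-dimensional state space: entries omega^(j*k) / sqrt(N)
    # with omega = exp(2*pi*i / N) - the DFT matrix, normalized to be unitary.
    omega = cmath.exp(2j * cmath.pi / N)
    return [[omega ** (j * k) / (N ** 0.5) for k in range(N)] for j in range(N)]

def is_unitary(M, tol=1e-9):
    # Verify M * M_dagger == identity, entry by entry.
    N = len(M)
    for i in range(N):
        for j in range(N):
            entry = sum(M[i][k] * M[j][k].conjugate() for k in range(N))
            expected = 1.0 if i == j else 0.0
            if abs(entry - expected) > tol:
                return False
    return True
```

Unitarity is the quantum analogue of "information-preserving": every gate, the QFT included, must be reversible in this precise sense.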
Quantum computing is an example of what can happen when we change both the abstraction and its representation.
Quantum-Classical Synergies: Why They Will Transform Computing
While quantum offers a significantly different lens to view the world, the most transformative future will be neither quantum nor classical. Rather, it will be a synergy of the two.
First, these two paradigms have different strengths and complementary capabilities. Classical computers excel at control logic, data storage, deterministic calculations, and massive-scale digital infrastructure. After decades of optimization, they perform these tasks with incredible speed: the El Capitan supercomputer at Lawrence Livermore National Laboratory can run 2.75 quintillion - that's a million times a million times a million - mathematical operations per second.
Quantum systems do not promise to run these tasks faster. Rather, they excel at tasks where representing a problem with classical information falls short, where even El Capitan would struggle to compute an answer. This includes high-dimensional linear algebra operations, probabilistic sampling, studying optimization landscapes, and simulating quantum phenomena.
Combining these compute paradigms means solving problems that are intractable for quantum alone (e.g., big-data processing for training massive parametric models) and too complex for classical alone.
Second, as we push the limits of classical computing and simultaneously expand our knowledge of quantum's possibilities, new practical hybrid quantum-classical algorithms are emerging.
Among the first to be implemented on today's systems was VQE (Variational Quantum Eigensolver), which uses a classical optimizer to steer quantum circuits toward chemical ground states. QAOA (Quantum Approximate Optimization Algorithm), meanwhile, combines quantum superposition with classical feedback loops.
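A stripped-down sketch of that variational loop is shown below, with a classical statevector calculation standing in for the quantum circuit and a grid search standing in for the optimizer. This is a pedagogical toy under those stated assumptions, not a real VQE run, which would use Qiskit with a proper optimizer such as COBYLA; all names here are ours.

```python
import math

def ry_state(theta):
    # One-parameter ansatz: |psi(theta)> = Ry(theta)|0>
    #                                    = cos(theta/2)|0> + sin(theta/2)|1>
    return math.cos(theta / 2), math.sin(theta / 2)

def energy(theta):
    # Expectation value of a toy one-qubit Hamiltonian H = Z:
    # <psi|Z|psi> = |alpha|^2 - |beta|^2 = cos(theta)
    a, b = ry_state(theta)
    return a * a - b * b

# Classical outer loop: a grid search stands in for the optimizer. On real
# hardware, the quantum processor's job is to estimate energy(theta) for each
# candidate theta the classical optimizer proposes.
thetas = [k * 2 * math.pi / 1000 for k in range(1000)]
best_theta = min(thetas, key=energy)
# The minimum lands near theta = pi, where |psi> = |1>, the ground state of Z.
```

The division of labor is the point: the quantum device evaluates a quantity that is hard to compute classically at scale, and the classical machine decides where to look next.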
Newer algorithms are drawing us tantalizingly close to the milestone of quantum advantage, where quantum plus classical can outperform classical alone. SQD (sample-based quantum diagonalization) and SKQD (its Krylov-subspace variant) allow us to build classical surrogates of quantum Hamiltonians to estimate ground-state energies, while dynamic multi-product formulas combine expectation values of classically hard-to-compute, time-evolved observables using classical computations (e.g., tensor networks).
These approaches already show promise in drug discovery, materials science, finance, and supply chain optimization, especially in combination with the advanced circuit optimization techniques available in Qiskit - the field's preferred open-source quantum SDK and the most performant quantum software stack.
Finally, we can look to other cutting-edge compute hardware to predict how quantum might amplify, rather than replace, classical processing. GPUs didn't replace CPUs - they amplified their reach into AI and simulation.
Quantum systems will amplify classical computing in a similar way. Think of them as coprocessors with radically different capabilities.
With that, our computing bottleneck shifts from hardware limits to algorithmic creativity. So, how do we fully harness quantum-classical synergies to advance computing as a whole?
We need new abstractions to describe problems across both domains. We need new representations to embed problems into quantum circuits and classical code. And we need new algorithms that dynamically balance workload across these complementary architectures.
In short: this is the dawn of a new algorithmic era, more profound, perhaps, than the era sparked by the FFT.
Lessons from the FFT
The dedication of the IEEE Milestone is a perfect time to reflect on a timeless insight:
A breakthrough often begins not with more power, but with a better question and a smarter representation. It begins with a new way of thinking.
Just as the FFT reshaped the landscape of classical computing by transforming how we represent signals, quantum computing is reshaping the very abstraction of information itself. As we step into the quantum era, the challenge isn't just faster computation - but redefining the very canvas we compute on, and learning how to paint on that canvas.
Bringing these worlds together will unlock a new era of computational power. But it will require bold abstractions and innovative representations. It will require breakthrough algorithms that bridge worlds. And it will require new talent ready to take on the challenges - the Fouriers, Cooleys, and Tukeys of tomorrow.
This is not the end of the FFT story - it's a new beginning!