What is Quantum Computing?

Quantum computing makes use of specialised computer hardware and algorithms that take advantage of quantum mechanics. Quantum mechanics is the fundamental theory in physics that describes the behaviour of nature at and below the scale of atoms. The theory was developed in the early 1900s, when experiments produced strange and counterintuitive results. In classical mechanics, objects exist in a specific place at a specific time. In quantum mechanics, objects exist in a world of probability: they have one probability of being at one point, another of being at a second point, and so on. By way of analogy, a tennis ball thrown at a wall in the everyday world bounces back from the same wall each and every time. In the quantum world, that same tennis ball would have some probability of bouncing back from the opposite side of the wall, from the edge of the wall or from the front of the wall.
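To make the "world of probability" concrete, here is a minimal sketch in plain Python (no quantum library, and the variable names are my own): a single qubit is described by two amplitudes, and measuring it gives each outcome with probability equal to the squared amplitude.

```python
import random

# A qubit state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Equal superposition of 0 and 1:
alpha = beta = 1 / 2 ** 0.5

def measure(alpha, beta):
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Measure 10,000 freshly prepared qubits: roughly half 0s, half 1s.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)
```

Unlike the classical tennis ball, no amount of extra information tells you which outcome a single measurement will give; only the probabilities are fixed by the state.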


Quantum computing has its roots in the early 20th century with the development of quantum mechanics, beginning with Max Planck’s quantum hypothesis and contributions from Einstein, Bohr, Heisenberg, and Schrödinger. In 1981, Richard Feynman proposed that quantum systems could only be efficiently simulated by other quantum systems, laying the foundation for quantum computation. David Deutsch (Deutsch, 1985) expanded on this by introducing the concept of a universal quantum computer. Significant progress in quantum algorithms followed, with Peter Shor’s 1994 algorithm demonstrating that quantum computers could efficiently factor large numbers, posing a potential threat to RSA encryption. In 1996, Lov Grover introduced an algorithm providing a quadratic speed-up for unstructured database searches. The first experimental demonstration of a two-qubit quantum computer occurred in 1998 using nuclear magnetic resonance (NMR), and by 2001, IBM and Stanford had successfully implemented a 7-qubit quantum system to run Shor’s algorithm and factorise 15.
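Shor's algorithm factorises N by finding the period r of f(x) = a^x mod N; only that period-finding step needs a quantum computer, and the rest is classical number theory. The sketch below (my own illustration, not the quantum algorithm itself) runs the classical parts by brute force for the famous N = 15 case:

```python
from math import gcd

N, a = 15, 7  # the number to factorise, and a base coprime to N

# Period finding: the step a quantum computer does exponentially faster.
# Classically, we simply search for the smallest r > 0 with a^r = 1 (mod N).
r = 1
while pow(a, r, N) != 1:
    r += 1

# Classical post-processing: for even r, gcd(a^(r/2) +/- 1, N) gives factors.
assert r % 2 == 0
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # 4 3 5
```

Here 7^4 = 2401 = 1 (mod 15), so r = 4, and the gcd step recovers the factors 3 and 5, exactly what the 2001 IBM/Stanford experiment demonstrated with 7 qubits.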


Quantum Physics

Quantum physics, at its core, explores how energy and matter behave at the smallest possible scales, typically the atomic and subatomic levels. One of the fundamental discoveries of quantum mechanics is that energy is quantised: it does not exist as a continuous spectrum, but as discrete packets called quanta. Quantisation means that certain properties, such as energy, cannot vary continuously but only in specific, fixed amounts. Quanta behave as both particles and waves. This principle, central to quantum mechanics, explains phenomena like the photoelectric effect, in which light behaves as discrete photons rather than a continuous wave. In atoms, electrons occupy discrete energy levels; when an electron moves between levels, it absorbs or emits energy in fixed quanta, which explains the spectral lines observed in atomic physics. The concept of quanta led to the development of quantum mechanics, and it underpins modern technologies like lasers, semiconductors and quantum computing.


In an atom, electrons do not move freely around the nucleus but instead occupy specific energy levels, or orbitals. Each energy level corresponds to a certain amount of energy, and an electron can only exist at these levels and nowhere in between. When an electron absorbs energy, it moves to a higher level (excitation). When it loses energy, it drops to a lower level (relaxation), often emitting a photon whose energy (and wavelength) corresponds precisely to the difference between the levels. If you want a more in-depth look at quantum computing, feel free to download my PhD research paper below.
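The relaxation step above can be put into numbers. Using hydrogen's well-known level formula E_n = -13.6/n² eV and λ = hc/ΔE, a short sketch (function names are my own) computes the wavelength of an emitted photon:

```python
# Hydrogen's energy levels are quantised: E_n = -13.6 / n^2 electronvolts.
# A transition between two levels emits a photon whose wavelength is fixed
# by the energy difference, via lambda = h * c / delta_E.

H_C_EV_NM = 1239.84  # h*c expressed in eV*nm, so wavelengths come out in nm

def level_energy_ev(n: int) -> float:
    """Energy of the n-th hydrogen level, in electronvolts."""
    return -13.6 / n ** 2

def emitted_wavelength_nm(n_high: int, n_low: int) -> float:
    """Wavelength of the photon emitted as an electron relaxes."""
    delta_e = level_energy_ev(n_high) - level_energy_ev(n_low)
    return H_C_EV_NM / delta_e

# The n=3 -> n=2 transition gives hydrogen's red H-alpha line (~656 nm).
print(round(emitted_wavelength_nm(3, 2), 1))  # 656.4
```

Because only these fixed transitions are allowed, each element emits a unique set of wavelengths, which is exactly why spectral lines identify elements.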


Quantum computing in numbers

1 million

physical qubits is a commonly cited estimate of what may be required to build a fault-tolerant quantum computer capable of outperforming classical machines.

Quadratic

speed-up is theoretically possible for certain problems using quantum algorithms like Grover’s over classical search methods.

200

seconds was all it took for Google’s Sycamore quantum processor to solve a problem that Google estimated would take a supercomputer 10,000 years.

50

qubits is often cited as the threshold at which quantum computers can outperform classical supercomputers in certain tasks (quantum supremacy).
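Grover's speed-up for unstructured search is quadratic: finding one marked item among N takes on the order of N classical probes, but only about (π/4)√N Grover iterations. A quick back-of-the-envelope comparison (a sketch, using the standard textbook formulas):

```python
from math import pi, sqrt

def classical_probes(n: int) -> int:
    """Expected probes for classical unstructured search: about n / 2."""
    return n // 2

def grover_iterations(n: int) -> int:
    """Optimal number of Grover iterations: roughly (pi / 4) * sqrt(n)."""
    return round(pi / 4 * sqrt(n))

# Searching a trillion items: ~500 billion classical probes,
# but under a million Grover iterations.
for n in (1_000, 1_000_000, 10 ** 12):
    print(n, classical_probes(n), grover_iterations(n))
```

A quadratic speed-up is far weaker than Shor's exponential one, which is why Grover's algorithm "halves" symmetric key strength rather than breaking it outright.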

ABOUT

Jeremy Green is the developer of Q-SLICE and QUANTA, created as part of his PhD in computer science. He is also a skilled and experienced security professional with more than 20 certifications across platform, security and DevSecOps, including CISSP, CISM, CEH, ECDE and CHFI. He is an official instructor for ISACA and EC Council, and the author of Information Security Management Principles, fourth edition, and Security Architecture: A practical guide to designing proactive and resilient cyber protection, both published by BCS.

Author

Jeremy is also the author of BCS Information Security Management Principles Fourth Edition and Security Architecture: A practical guide to designing proactive and resilient cyber protection.

Instructor

Jeremy is an instructor for CompTIA, ISC2, ISACA and EC Council, with twenty certifications. He also teaches Ethical Hacking and Digital Forensics on a Foundation Degree, and holds a Cert Ed and QTLS.

Security Architect

Jeremy is a security architect supporting the security design and implementation of a large project for Leidos, undertaking threat modelling, design assessment and stakeholder engagement.

Get ahead with quantum security

Many organisations will be slow to recognise or respond to the threat posed by quantum computing, particularly its potential to break classical cryptographic systems. This is partly because quantum computing is still widely perceived as an abstract, long-term concern rather than an immediate operational risk.