Björn Rust (he/him) is a post-industrial designer, researcher and educator, developing context-sensitive solutions in service of people and the planet.

Practical quantum computers

Long before Intel gave us the Pentium, released with a 60 MHz clock speed in 1993, researchers had been imagining processors unconstrained by the binary nature of transistors. In fact, as early as 1980, Yuri Manin described the mathematical foundations of quantum computing in his paper entitled Computable and Uncomputable. Until now, however, it has been notoriously difficult to fabricate stable qubits capable of achieving both superposition and entanglement.
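
To make those two terms a little more concrete, here is a minimal sketch in plain Python and NumPy, purely illustrative and not tied to any of the hardware discussed here: a Hadamard gate places one qubit in superposition, and a CNOT gate then entangles it with a second, producing a Bell state in which the qubits can only ever be measured together as 00 or 11.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start with both qubits in |0>, i.e. the joint state |00>.
state = np.kron(ket0, ket0)

# Apply H to the first qubit (identity on the second), then CNOT.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# The result is the Bell state (|00> + |11>) / sqrt(2): a superposition in
# which the qubits are entangled, so measuring one fixes the other.
print(np.round(state, 3))   # amplitudes of about 0.707 for |00> and |11>, 0 elsewhere
print(np.abs(state) ** 2)   # outcome probabilities: 50% |00>, 50% |11>
```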

Thanks to the work of Leo Kouwenhoven and his team at Delft, the MIT Technology Review has listed quantum computers among its '10 Breakthrough Technologies' for 2017, noting that »... a raft of previously theoretical designs are actually being built«. The article goes on to suggest that growing corporate interest is playing a significant role in making the technology viable, as those corporations are financing both the underlying research and the development of assorted adjacent technologies.

It's worth noting that IBM first demonstrated a working quantum computer in 2000 and has been making an iteration of it available to the public via a cloud-based service since last year. Google, too, has been working on its own quantum computer, while NASA, Lockheed Martin and the Los Alamos National Laboratory are all working with the Canadian company D-Wave Systems. However, Wired warned that:

... today’s quantum computers still aren’t practical for most real-world applications. Qubits are fragile and can be easily knocked out of the superposition state. Meanwhile, quantum computers are extremely difficult to program today because they require highly specialized knowledge.

Russ Juskalian of the MIT Technology Review offers a further warning, aimed at D-Wave's quantum annealing technology, claiming that:

The approach, skeptics say, is at best applicable to a very constrained set of computations and might offer no speed advantage over classical systems.

But by Juskalian's own reporting, that speed advantage may arrive soon if Hartmut Neven, the head of Google’s quantum computing effort, delivers the 49-qubit system he has promised for the coming year. This is an important milestone, as Juskalian describes:

The target of around 50 qubits isn’t an arbitrary one. It’s a threshold, known as quantum supremacy, beyond which no classical supercomputer would be capable of handling the exponential growth in memory and communications bandwidth needed to simulate its quantum counterpart. In other words, the top supercomputer systems can currently do all the same things that five- to 20-qubit quantum computers can, but at around 50 qubits this becomes physically impossible.
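
As a rough back-of-the-envelope check on that claim, the short script below estimates how much memory a classical machine would need merely to store the full state vector of an n-qubit system: 2^n complex amplitudes, assumed here to take 16 bytes each. The exact crossover point depends on the simulation technique, so the figures are indicative only.

```python
def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to store 2**n complex amplitudes at 16 bytes each."""
    return (2 ** n_qubits) * 16

def human(n_bytes: float) -> str:
    """Format a byte count in binary units for readability."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"):
        if n_bytes < 1024:
            return f"{n_bytes:.0f} {unit}"
        n_bytes /= 1024
    return f"{n_bytes:.0f} ZiB"

for n in (5, 20, 30, 40, 50):
    print(f"{n:2d} qubits: {human(state_vector_bytes(n))}")
# 5 qubits fit in half a kilobyte, 20 qubits in 16 MiB,
# but 50 qubits would demand roughly 16 PiB just for the state vector.
```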

Despite all this, quantum computing is still largely confined to the lab. Even so, we should be mindful of how it might impact our lives. For instance, the technology could render state-of-the-art asymmetric encryption based on the RSA model (the history of which was covered in a recent episode of 50 Things That Made the Modern Economy) obsolete, since one of the greatest strengths of a quantum computer is factoring large numbers. While RSA will be replaced by quantum cryptography in time, it's fair to expect a period in which only universities, wealthy corporations and governments will wield this power.
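
To illustrate why factoring is the weak point, the toy example below builds an RSA key pair from two comically small primes and then shows that anyone who can factor the public modulus can reconstruct the private exponent. Real keys use primes hundreds of digits long and padding schemes omitted here; this is a sketch, not a usable implementation.

```python
# Toy RSA with tiny primes, for illustration only.
p, q = 61, 53
n = p * q                      # public modulus
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)

# An attacker who can factor n (trivial here, hard classically for large n,
# but within reach of a quantum computer running Shor's algorithm)
# can derive the private exponent just as the key's owner did.
recovered_p = next(f for f in range(2, n) if n % f == 0)
recovered_q = n // recovered_p
cracked_d = pow(e, -1, (recovered_p - 1) * (recovered_q - 1))

print(pow(ciphertext, d, n))          # 42, decrypted with the legitimate key
print(pow(ciphertext, cracked_d, n))  # 42, decrypted with the cracked key
```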

Inspired by: MIT Technology Review & Wired