In 2024, some of the emerging technologies in the computing field read more like science fiction than real technology. One of those technologies is quantum computing, and it’s very real. Conventional computers (the computers you and I use every day) are tremendously powerful, but they have some fundamental limitations. Those limitations are rooted not only in the processing power of the machines, but in how they solve problems and process data.
Conventional computers work on 1s and 0s. A 1 indicates that something is “true,” and a 0 indicates that it is “false.” These 1s and 0s are called bits.
All conventional computer logic is fundamentally rooted in the concept of something being true or false. A program basically says, for example: “if X is true, perform an operation; then, at the end of that operation, if Y is false, do something else.” That kind of logic can get you pretty far for most purposes.
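That kind of bit-based branching can be sketched in a few lines of Python (the variable names and scenario here are purely illustrative, not from any real program):

```python
# A bit holds one of two values: 1 ("true") or 0 ("false").
x = 1  # hypothetical condition: "the first check passed"
y = 0  # hypothetical condition: "the second check passed"

# Conventional program logic examines these values one at a time.
if x == 1:
    result = "perform the operation"
    if y == 0:
        result = "do something else"

print(result)
```

A classical machine walks through branches like these sequentially, which is exactly the one-case-at-a-time behavior the column describes.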
There are issues with this approach, however. Let’s look at medicine. One of the ways new drugs are developed (and this is an oversimplification) is that scientists use computers to simulate the effects of specific molecules on things like cancer cells, neurotransmitters, or enzymes, whatever the case may be. A conventional computer can do this one molecule at a time, albeit at a very high rate of speed.
Where a quantum computer is different (and again, this is a massive oversimplification) is that it’s able to evaluate the “true” and “false” possibilities simultaneously. If you’ve heard the phrase Schrödinger’s Cat, this is the idea it refers to: a system existing in multiple states at once until it’s observed. So using our example of developing new drugs, a quantum computer could evaluate vast numbers of candidate molecules in the same space of time a conventional computer might be able to do just a few.
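That “both at once” idea can be illustrated with a toy simulation on a classical computer (this is a sketch of the standard textbook math, not real quantum hardware). A qubit’s state is a pair of amplitudes, and the squares of those amplitudes give the odds of measuring a 0 or a 1. A gate called the Hadamard gate turns a definite 0 into an equal mix of both outcomes:

```python
import math

# A qubit's state is a pair of amplitudes (a0, a1);
# a0**2 and a1**2 are the probabilities of measuring 0 or 1.
state = (1.0, 0.0)  # definitely 0, like a classical bit

def hadamard(state):
    """Apply the Hadamard gate: put the qubit into superposition."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

state = hadamard(state)
p0, p1 = state[0] ** 2, state[1] ** 2
print(p0, p1)  # each outcome now carries probability 0.5
```

Note what the simulation hides: classically we had to compute both amplitudes one after the other, while a real qubit simply holds both at once. That gap is the whole point.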
This ability has widespread implications not just for medicine, as mentioned, but also for cryptography (including generating truly secure encryption), materials science, and expanding artificial intelligence. It represents a fundamental leap forward in computer science, bypassing speed limitations that conventional machines struggle with.
There are some drawbacks, and this isn’t magic: conventional computers will almost certainly always be better at the types of true-or-false operations we use them for every day. Quantum computers, for their part, aren’t as good at correcting errors in their output, and that output can be ambiguous if their parameters aren’t very clearly defined. In essence, quantum computers are good at very specific tasks that normal computers aren’t good at. But it’s still a Big Deal.
What I think will happen is that, in the near term, quantum computers will remain relegated to experimental and highly specialized applications. Longer term, if quantum computing breaks into the consumer market, it will be as individual specialized processors that allow classical computers to leverage their capabilities. Just as computers have CPUs for general-purpose processing and GPUs for graphics and certain other types of calculations, they’ll have QPUs for operations that require quantum processing.
As for what it means for the average person: there will definitely be an impact, and in areas that have typically been slower to advance: faster development of new medicines, advances in materials science, better weather and climate predictions, smarter traffic management, and more secure communications. It might not be on your desk anytime soon, but it will be making a difference.
Nick DeLorenzo is the CTO of the Times Leader Media Group and CIO of MIDTC, LLC. He is from Mountain Top, Pennsylvania and has covered technology for the Times Leader since 2010.