
November 13, 2018

Why and How: Quantum Computing


Randy Johnston

From the November 2018 Issue

You have likely seen all the information you need on the new iPhone, Google Pixel 3, Surface Pro 6, and Office 2019, as well as the outcomes of the CCH, Thomson and Intuit conferences. When those reviews are not factual enough, you’ll see me pause this Emerging Technology series to identify the facts that matter for the practice of accounting. While Quantum Computing is not ready for prime time, there is plenty of excitement among developers and the major technology providers, including but not limited to IBM, Google, Microsoft, Intel, and some significant players you probably won’t recognize, like D-Wave and Rigetti Forest.

Let me be clear without being hyperbolic: if the developers are successful, Quantum Computing will change the future and direction of computing for many generations, including long after I’m dead and gone. As a trained computer scientist, my expectation, and that of its protagonists, is that Quantum Computing will be able to break much of the encryption in use today, tamper with most blockchain implementations, outperform any current computing hardware (typically known as Von Neumann computing, based on the 1940s design used by all computers and smartphones today), and deliver many operational attributes that improve centralized, cloud computing. Antagonists claim that Quantum Computing capabilities are overstated, that the environment is too difficult to use, and that the technical difficulties will not be overcome. Let me back up my harebrained statements with some observations.

  • Quantum Computing is in its early stages. Consider Intel’s CES 2018 announcements:
  • Intel Labs has developed a neuromorphic research chip, code-named “Loihi,” a self-learning artificial intelligence (AI) chip inspired by how the brain works; alongside it, Intel showed “Tangle Lake,” a 49-qubit superconducting quantum test chip. Work like this could unlock exponential gains in performance and power efficiency for the future of AI.
  • Intel’s Mike Mayberry, corporate vice president and managing director of Intel Labs, said, “We expect it will be five to seven years before the industry gets to tackling engineering-scale problems, and it will likely require 1 million or more qubits to achieve commercial relevance.”
  • As Quantum Computing matures over the next 10 years, it could threaten blockchain and most of the cryptography that secures data and digital signatures.

Let’s consider encryption. Quantum Computing approaches can break much of the encryption in use today.

  • Shor’s algorithm can efficiently find the integer factors behind public keys. This ability would allow a quantum computer to break many of the public-key cryptographic systems in use today, such as RSA.
  • Consider a problem that has these four properties:
    1. The only way to solve it is to guess answers repeatedly and check them,
    2. The number of possible answers to check is the same as the number of inputs,
    3. Every possible answer takes the same amount of time to check, and
    4. There are no clues about which answers might be better: trying possibilities at random is just as good as checking them in some special order.
  • For problems with all four properties, Grover’s algorithm lets a quantum computer find the answer in time proportional to the square root of the number of inputs, rather than to the number of inputs itself. That effectively halves symmetric key lengths, which puts Triple DES and AES, the most popular symmetric encryption algorithms today, under real pressure. (A plain-Python sketch of both ideas follows this list.)
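
Neither speedup can actually be demonstrated on a classical machine, but the classical scaffolding around both algorithms can. The sketch below, in plain Python, shows the reduction at the heart of Shor’s algorithm (factoring N reduces to finding the period of a^x mod N, brute-forced here where a quantum computer would be exponentially faster) and compares Grover’s query count against classical guessing. The toy modulus 3233 = 53 × 61 is my own illustrative choice, not anything from a real cryptosystem.

```python
import math
import random

# Classical core of Shor's algorithm: factoring N reduces to finding the
# period r of f(x) = a^x mod N. A quantum computer finds r exponentially
# faster; here we brute-force it just to show the reduction works.
def factor_via_period(N):
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g, N // g              # lucky guess: a already shares a factor
        r, y = 1, a % N                   # find smallest r with a^r = 1 (mod N)
        while y != 1:
            y = (y * a) % N
            r += 1
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p          # nontrivial factors recovered

print(factor_via_period(3233))            # (53, 61) -- a toy RSA-style modulus

# Grover's algorithm: an unstructured search over n candidates (the four
# properties above) takes ~(pi/4)*sqrt(n) quantum queries versus ~n/2
# expected classical guesses.
n = 2 ** 56                               # single-DES-sized keyspace, for scale
print(f"classical ~ {n / 2:.2e} guesses, Grover ~ {math.pi / 4 * math.sqrt(n):.2e} queries")
```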

This stuff gets a little heady, and I certainly wouldn’t want you to think I have mastered all the principles. Like every topic we have covered in this emerging technology series, Quantum Computing has pros and cons.

On the positive side:

  • It computes using quantum-mechanical phenomena, such as superposition and entanglement, which let a single unit of storage, called a qubit, represent far more data than a classical bit
  • Google says 2018, and IBM says within five years, for achieving quantum supremacy, that is, running a computation faster than any classical computer can
  • Just 50 qubits can represent 2^50, roughly 1.1 quadrillion, states at once; a classical computer would require petabyte-scale memory just to store that state (worked out below)
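
The arithmetic behind that last bullet is easy to check. A couple of lines of Python, my own back-of-the-envelope assuming 16 bytes per complex amplitude:

```python
# n qubits require 2**n complex amplitudes to describe classically.
n = 50
amplitudes = 2 ** n                    # ~1.13e15 basis states at once
bytes_needed = amplitudes * 16         # complex128 = 16 bytes per amplitude
print(f"{amplitudes:.3e} amplitudes -> {bytes_needed / 2**50:.0f} PiB of memory")
```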

On the down side:

  • It is currently hard to scale the hardware physically to increase the number of qubits
  • Qubits cannot be initialized to arbitrary values or read out easily
  • Quantum gates need longer coherence times (slower decoherence), today achieved only at cryogenic temperatures of about 20 millikelvin, roughly −459 degrees Fahrenheit

While the current prototype hardware works, the number of qubits in use is quite small, and current designs require elaborate cooling: IBM, for example, suspends its roughly 50-qubit designs inside a large dilution refrigerator chilled to near absolute zero.

Why

Limits are being approached in the current Von Neumann architectures produced by Intel, AMD and others around the world. Silicon features have been miniaturized to 7 nanometers (nm) by TSMC, and Intel is currently attempting the transition from 14nm to 10nm. It is becoming ever more difficult to build processing chips at these small distances, or geometries.

Further, the tricks engineers have used, including hardware virtualization and multiple cores and threads, have made chips faster, too. However, there is no significant breakthrough in engineering thinking on the horizon that will improve performance notably. While I’m hopeful that some smart engineer will come up with a way to make chips faster and smaller, nothing being reported in the engineering trade journals shows real promise. And while I’m sure significant breakthroughs would be kept secret, most of the methods of the last few years have produced evolutionary, not revolutionary, change. A breakthrough is needed, and Quantum Computing may be that breakthrough.

What?

For more background on how Quantum Computing is built, consider the following:

  • A classical computer has a memory made up of bits, where each bit is represented by either a one or a zero. A quantum computer maintains a sequence of qubits, whose two basis states are written |0⟩ and |1⟩, read “ket zero” and “ket one”
  • A single qubit can represent a one, a zero, or any quantum superposition of those two states. A pair of qubits can be in any quantum superposition of 4 states, three qubits in any superposition of 8 states, and n qubits in any superposition of 2^n states

Mathematically, a single qubit’s state is written |ψ⟩ = α|0⟩ + β|1⟩, where α and β are complex amplitudes satisfying |α|² + |β|² = 1; graphically, this is usually pictured as a point on the Bloch sphere.
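
A few lines of NumPy make these representations concrete. A minimal sketch (the gate and state names are standard; the code itself is just illustrative):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

plus = H @ ket0                # alpha = beta = 1/sqrt(2): equal superposition
print(plus)                    # [0.707+0j, 0.707+0j]

# Multi-qubit states live in the tensor product: n qubits -> 2**n amplitudes.
state = plus
for _ in range(2):
    state = np.kron(state, plus)
print(state.size)              # 8 -- three qubits span 2**3 basis states
print(np.abs(state) ** 2)      # squared magnitudes = measurement probabilities (all 1/8)
```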

Additional introductory material is listed under Other resources in the Key Information table below.

How?

So how does the Quantum Computing approach work? Physicists and computer scientists exploit the quantum properties of atoms and subatomic particles, detecting or predicting their states mathematically. Candidate physical implementations include photons, coherent states of light, electrons, nuclear spins, optical lattices, Josephson junctions (superconductors of charge, flux, or phase), singly charged quantum dot pairs, and quantum dots.

What does this mean to the practice of accounting and to accountants? Our centralized applications will eventually run on these platforms, which are already in use at IBM, Google, NASA, and other large organizations. As mentioned above, several working examples and early production machines are available today, and some can be tried over the cloud right now.
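
If you want hands-on experience, IBM’s open-source Qiskit toolkit runs small circuits on a local simulator or, with an account, on IBM Q hardware over the cloud. A minimal sketch, using the Qiskit API as it stands at this writing (Rigetti’s Forest and Microsoft’s Q# offer similar starting points):

```python
# pip install qiskit, then run locally against the bundled simulator
from qiskit import QuantumCircuit, execute, Aer

qc = QuantumCircuit(2, 2)          # two qubits, two classical readout bits
qc.h(0)                            # Hadamard: put qubit 0 into superposition
qc.cx(0, 1)                        # CNOT: entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])         # read both qubits out into classical bits

backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                      # roughly {'00': 512, '11': 512} -- entangled
```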

Here’s a summary of what you need to know about Quantum Computing:

Key Information

TECHNOLOGY: Quantum Computing

Why is the new technology better? It is a new method of running programs, potentially at much higher speeds.

How can you do this today? D-Wave, IBM Q, Google, Intel, Microsoft, Rigetti Forest.

Risks: Computation may not be faster.

Where/when to use: Problems that must handle vast amounts of data.

How much? Commercial units are $15M+ for 2,000 qubits.

When expected in mainstream: Ten+ years.

Displaced technology or service: Traditional computers, mainframes.

Other resources: Waterloo Quantum 101, How QC Work.

Implementing Quantum Computing can take extremely large calculations and complete them far more rapidly. Quantum Computing is already being applied to very complex models, such as the weather models IBM runs behind Weather.com. Developers are working on the compilers and algorithms needed to make these machines work properly. In my view, Quantum Computers are at about the same stage mainframe computers were in 1963-1965, when mainframes (which are Von Neumann machines) were just getting their early compilers and applications were simple but starting to work. The difference between mainframes and Quantum Computers is that once building the hardware is practical and the compilers work properly, the additional speed available will revolutionize Cloud computing, providing the horsepower to run applications radically faster. Conversion to Quantum Computing should predominantly mean recompiling applications to take advantage of the new platform.

Recommended Next Steps

Quantum Computing is in its very early stages. The technology could still prove to be a dead end, but all indications are that we are beyond that risk. The biggest issue is that hardware gates need to be designed with longer coherence times, ideally operating at room temperature rather than in cryogenic conditions. We’ve tried to give you enough background to understand some of the fundamentals of Quantum Computing. While there is no practical choice for most of you today, because of the high cost and the special environment Quantum Computers require, you should watch for breakthroughs over the next 5-10 years that could enable your applications to run faster than is possible today.


Randy Johnston

MCS, MCP

Randy Johnston has been an entrepreneur, technologist, and teacher for most of his career. He has helped start and run many businesses, and founded Network Management Group, Inc. and owns half of K2 Enterprises. He has written for accounting and technology publications for four decades, and for CPA Practice Advisor since 2000.