So, we are finally here: after a long wait, we are entering the era of quantum computing. TFQ brings together the beauty of TensorFlow and the beast-like nature of quantum computing.
Quantum computing is a technology to watch more closely in 2020. We have seen recent announcements from Honeywell, Google, and others, and new pieces of hardware are worth looking forward to this year. Now Google has presented a new machine learning framework for experimenting with quantum computing for research purposes.

Defining the Quantum Computer
In 1936, Alan Turing described the Turing machine. It consists of a tape of unlimited length, divided into small squares. Each square can either hold a symbol (1 or 0) or be blank. A read/write head scans these symbols and blanks and gives the machine its instructions accordingly. This is, conceptually, how our traditional computers work.
In the quantum Turing machine there is one small difference that creates a large gap in performance: the tape. In a quantum machine, the tape exists in a quantum state. This means a symbol on the tape is not restricted to being either 0 or 1; it can be 0, 1, or a superposition of both at the same time. While a normal Turing machine can only perform one calculation at a time, a quantum Turing machine can, in effect, perform many calculations at once.
Quantum computers aren’t limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition.
So the most important thing here is qubits.
Qubits are realized with atoms, electrons, and their respective control devices, which work together to act as computer memory and a processor. Because a quantum computer can hold multiple states simultaneously, it has the potential to be millions of times more powerful than today’s most powerful supercomputers.
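For reference, the standard way to write a single-qubit state (general quantum-mechanics background, not something spelled out in the announcement) is as a superposition of the two basis states:

```latex
% A single qubit is a superposition of the basis states |0> and |1>;
% alpha and beta are complex amplitudes whose squared magnitudes sum to 1.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|², which is why a register of n qubits can represent an exponentially large combination of states at once.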
Quantum Computing: State of the Art
A few days ago, Google AI announced the release of TensorFlow Quantum, an open-source library for rapid prototyping of quantum machine learning models. Google was not alone in this development effort; it had help from the University of Waterloo and Volkswagen.
The core idea of TensorFlow Quantum is to interleave quantum algorithms and machine learning programs within the TensorFlow programming model. Google refers to this approach as quantum machine learning (QML) and implements it by leveraging some of its recent quantum computing frameworks, such as Cirq.
So the main question here is:
What is Cirq?
Cirq is an open-source framework for writing quantum circuits and running them against quantum devices and simulators. The main idea behind Cirq is to provide a simple programming model that abstracts the fundamental building blocks of quantum computation. The basic structures in Cirq are qubits, gates, measurement operators and, of course, circuits; these are the pieces required for quantum computation.
According to Google AI: “Cirq enables researchers to write quantum algorithms for specific quantum processors. Cirq gives users fine-tuned control over quantum circuits, specifying gate behavior using native gates, placing these gates appropriately on the device, and scheduling the timing of these gates within the constraints of the quantum hardware.”
Cirq provides the following key building blocks (a minimal code sketch follows the list):
- Circuits: a quantum computation in its basic form is represented by a Circuit. A Cirq Circuit is a collection of Moments, each of which contains operations that can be executed on qubits during some abstract slice of time.
- Gates: Gates abstract operations on collections of qubits.
- Schedules and Devices: in simple terms, a Schedule is made up of a set of Operations together with a description of the Device on which the schedule is intended to be run. Compared with a bare Circuit, it includes more detailed information about the timing and duration of the gates.
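To make these building blocks concrete, here is a minimal Cirq sketch (my own illustration, not code from the Google announcement; it assumes the cirq package is installed): it creates two qubits, applies gates, attaches a measurement, and runs the circuit on the bundled simulator rather than real hardware.

```python
import cirq

# Qubits: two qubits laid out on a grid, as on Google-style hardware.
q0, q1 = cirq.GridQubit(0, 0), cirq.GridQubit(0, 1)

# Circuit: a collection of Moments; here a simple Bell-pair preparation.
bell = cirq.Circuit(
    cirq.H(q0),                      # gate: put q0 into superposition
    cirq.CNOT(q0, q1),               # gate: entangle q0 and q1
    cirq.measure(q0, q1, key='m'),   # measurement operator
)

# Run on the built-in simulator instead of a real device.
result = cirq.Simulator().run(bell, repetitions=100)
print(result.histogram(key='m'))     # counts concentrate on 00 and 11 (keys 0 and 3)
```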
TensorFlow Quantum
TensorFlow Quantum is a library for hybrid quantum-classical machine learning, aimed at developing and building quantum machine learning applications. It allows us to construct quantum datasets and quantum models.
TFQ provides a model that abstracts the interactions with TensorFlow, Cirq, and computational hardware. At the top of the stack is the data to be processed. Classical data is natively processed by TensorFlow; TFQ adds the ability to process quantum data, consisting of both quantum circuits and quantum operators. The next level down the stack is the Keras API in TensorFlow, since a core principle of TFQ is native integration with core TensorFlow, in particular with Keras models and optimizers. Underneath the Keras model abstractions are the quantum layers and differentiators, which enable hybrid quantum-classical automatic differentiation when connected with classical TensorFlow layers. Underneath the layers and differentiators, TFQ relies on TensorFlow ops, which instantiate the dataflow graph.
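As a rough illustration of how those layers fit together (my own sketch, assuming the tensorflow-quantum and cirq packages are installed; the circuit, observable, and layer sizes are invented for the example), quantum circuits are serialized into TensorFlow tensors, a TFQ Keras layer evaluates a parameterized circuit, and an ordinary classical layer sits on top:

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)

# Quantum data: circuits serialized into a tf.string tensor that TFQ ops understand.
quantum_data = tfq.convert_to_tensor([cirq.Circuit(cirq.X(qubit) ** 0.5)])

# Quantum model: a parameterized circuit whose symbol Keras will treat as a weight.
theta = sympy.Symbol('theta')
model_circuit = cirq.Circuit(cirq.rx(theta)(qubit))

# Hybrid model: a TFQ quantum layer feeding a classical TensorFlow layer;
# gradients flow through TFQ's differentiators into the usual Keras machinery.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),  # circuits arrive as strings
    tfq.layers.PQC(model_circuit, cirq.Z(qubit)),       # expectation value of Z
    tf.keras.layers.Dense(1),                           # classical post-processing
])
print(model(quantum_data))
```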



From an execution standpoint, TFQ follows these steps to build and train QML models (a minimal end-to-end sketch follows the list):
- Prepare a quantum dataset: Quantum data is loaded as tensors, specified as a quantum circuit written in Cirq. The tensor is executed by TensorFlow on the quantum computer to generate a quantum dataset.
- Evaluate a quantum neural network model: In this step, the researcher can prototype a quantum neural network using Cirq that they will later embed inside of a TensorFlow compute graph.
- Sample or Average: This step leverages methods for averaging over several runs involving steps (1) and (2).
- Evaluate a classical neural network model: This step uses classical deep neural networks to distill correlations between the measurements extracted in the previous steps.
- Evaluate Cost Function: Similar to traditional machine learning models, TFQ uses this step to evaluate a cost function. This could be based on how accurately the model performs the classification task if the quantum data was labeled, or other criteria if the task is unsupervised.
- Evaluate Gradients & Update Parameters: After evaluating the cost function, the free parameters in the pipeline are updated in a direction expected to decrease the cost.
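Putting the steps together, here is a minimal end-to-end sketch (again my own toy example under the same package assumptions; the two-circuit dataset, labels, and hyperparameters are invented): a tiny quantum dataset, a one-parameter quantum model, a mean-squared-error cost, and gradient updates handled by a standard Keras optimizer.

```python
import cirq
import numpy as np
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)

# (1) Quantum dataset: two labeled single-qubit states, encoded as circuits.
data = tfq.convert_to_tensor([
    cirq.Circuit(),                # |0>, labeled +1
    cirq.Circuit(cirq.X(qubit)),   # |1>, labeled -1
])
labels = np.array([[1.0], [-1.0]])

# (2) Quantum model: a single trainable rotation, prototyped in Cirq.
theta = sympy.Symbol('theta')
model_circuit = cirq.Circuit(cirq.rx(theta)(qubit))

# (3)-(4) Averaging over runs and classical post-processing are handled inside
# the PQC layer here; a deeper model would add classical Keras layers on top.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),
    tfq.layers.PQC(model_circuit, cirq.Z(qubit)),
])

# (5)-(6) Cost function, gradients, and parameter updates via standard Keras.
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss='mse')
model.fit(data, labels, epochs=20, verbose=0)
print(model(data))  # expectations should move toward the labels +1 and -1
```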
TFQ represents one of the most important milestones in this area and one that leverages some of the best IP in both quantum and machine learning. More details about TFQ can be found on the project’s website.
What do you think the future of quantum computing will be? Tell us in the comments.
Happy learning 🙂
Thanks for this incredible article, but I have a few questions.
1) Is Cirq for simulation or for a real quantum computer?
2) Can TFQ be used for real data like .csv files or even images?
3) Can we compute on a GPU?
4) Is TFQ more efficient than TF? What is the interest?
Thanks a lot,
Lily
Hi Lily, thanks for the read. In response to your questions:
1) Cirq is used for writing, manipulating, and optimizing quantum circuits and for running them against quantum computers and simulators.
2) Yes, we can. There is a quantum convolutional neural network (QCNN) tutorial here: https://www.tensorflow.org/quantum/tutorials/qcnn
3) Actually, this is the difference: it is designed to run on quantum processors.
4) It was introduced to explore computing workflows that leverage Google’s quantum computing offerings, all from within TensorFlow. The real difference is in the tensor computation.
Hope this answers all your questions.