sf.apps.train

Tools for training variational GBS devices.

A GBS device can be treated as a variational quantum device with trainable squeezing, beamsplitter, and rotation gates. This module provides the functionality for training a GBS device using the approach outlined in [22].

Embedding trainable parameters in GBS

Training algorithms for GBS distributions rely on the \(WAW\) parametrization, where \(W\) is a diagonal matrix of weights and \(A\) is a symmetric matrix. Trainable parameters are embedded into the GBS distribution by expressing the weights as functions of the parameters.

This module contains methods to implement such embeddings, along with derivatives of the weights with respect to the trainable parameters. There are two main classes, each corresponding to a different embedding. The Exp class is a simple embedding where the weights are exponentials of the trainable parameters. The ExpFeatures class is a more general embedding that makes use of user-defined feature vectors, which can provide more flexibility in training strategies. A short sketch of both embeddings follows the listing below.

Exp(dim)

Simple exponential embedding.

ExpFeatures(features)

Exponential embedding with feature vectors.
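Before turning to the library classes, it can help to see the embeddings written out directly. Below is a minimal NumPy sketch, independent of the classes above; the sign convention \(w_i = e^{-\theta_i}\), and \(w_i = e^{-f_i \cdot \theta}\) for feature vectors \(f_i\), follows [22] and is an assumption here:

>>> import numpy as np
>>> params = 0.1 * np.ones(4)
>>> weights = np.exp(-params)  # Exp: one weight per parameter, w_i = exp(-theta_i)
>>> features = np.eye(4)
>>> np.allclose(np.exp(-features @ params), weights)  # ExpFeatures with identity features recovers Exp
True
>>> jac = -features * weights[:, np.newaxis]  # closed-form Jacobian: dw_i/dtheta_j = -f_ij * w_i
>>> A = np.ones((4, 4))  # illustrative symmetric matrix
>>> WAW = np.diag(weights) @ A @ np.diag(weights)  # how the weights enter the WAW parametrization

The closed-form Jacobian in the last step is the quantity the module's derivative methods provide, and it is the key ingredient for gradient-based training.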

As discussed above, these embeddings output the weights used within the \(W\) matrix of the \(WAW\) parametrization of GBS. This model of variational GBS is made available as the VGBS class.

VGBS(A, n_mean, embedding, threshold[, samples])

Create a variational GBS model for optimization and machine learning.

For example, a variational model based on the Mutag0 dataset can be constructed using:

>>> import numpy as np
>>> from strawberryfields.apps import data
>>> from strawberryfields.apps.train import Exp, VGBS
>>> dataset = data.Mutag0()
>>> embedding = Exp(dataset.modes)
>>> n_mean = 5
>>> vgbs = VGBS(dataset.adj, n_mean, embedding, threshold=False, samples=np.array(dataset[:1000]))

We can then evaluate properties of the variational GBS distribution for different choices of trainable parameters:

>>> params = 0.1 * np.ones(dataset.modes)
>>> vgbs.n_mean(params)
3.6776094165797364
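Other properties follow the same pattern. As a sketch, assuming the VGBS class also exposes the trainable matrix through an A method returning \(W(\theta) A W(\theta)\) (an assumption here):

>>> A_trained = vgbs.A(params)  # WAW-parametrized matrix; method name assumed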

Additional functionality related to the variational GBS model is available; a short usage sketch follows the listing below.

A_to_cov(A)

Convert an adjacency matrix to a covariance matrix of a GBS device.

prob_click(A, sample)

Calculate the probability of a click pattern.

prob_photon_sample(A, sample)

Calculate the probability of a sample of photon counts.

rescale_adjacency(A, n_mean, threshold)

Rescale an adjacency matrix so that it can be mapped to GBS.
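As a brief sketch of these utilities, assuming they are importable directly from strawberryfields.apps.train and using an illustrative matrix:

>>> import numpy as np
>>> from strawberryfields.apps.train import A_to_cov, rescale_adjacency
>>> A = np.ones((4, 4))
>>> A_scaled = rescale_adjacency(A, 2, False)  # rescale A so the mapped GBS device has mean photon number 2
>>> cov = A_to_cov(A_scaled)  # covariance matrix of the corresponding GBS device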

Choosing a cost function

In the context of stochastic optimization, cost functions are expressed as expectation values over the GBS distribution. Within the \(WAW\) parametrization, gradients of cost functions can also be expressed as expectation values over the GBS distribution. This module provides methods for calculating these gradients and for optimizing GBS circuits with gradient-based techniques. In the case of a Kullback-Leibler divergence or log-likelihood cost function, gradients can be computed efficiently, leading to fast training. A training sketch follows the listing below.

Stochastic(h, vgbs)

Stochastic cost function given by averaging over samples from a trainable GBS distribution.

KL(data, vgbs)

Kullback-Leibler divergence cost function.
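As a sketch of gradient-based training with the KL cost (assuming KL exposes evaluate(params) and grad(params) methods, an assumption here; the learning rate and step count are illustrative, and dataset and vgbs are the objects constructed above):

>>> from strawberryfields.apps.train import KL
>>> cost = KL(np.array(dataset[:1000]), vgbs)  # empirical data samples and trainable model
>>> params = np.zeros(dataset.modes)
>>> lr = 0.01  # illustrative learning rate
>>> for _ in range(100):
...     params = params - lr * cost.grad(params)  # gradient descent on the KL divergence
>>> cost.evaluate(params)  # cost after training

This mirrors the fast training noted above: for the KL cost, the gradient takes an efficient form under the \(WAW\) parametrization, so each descent step is cheap to compute.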