Quantum state discrimination

This tutorial was generated using Literate.jl.

This tutorial solves the problem of quantum state discrimination.

The purpose of this tutorial is to demonstrate how to solve problems involving complex-valued decision variables and the HermitianPSDCone. See Complex number support for more details.

Required packages

This tutorial makes use of the following packages:

using JuMP
import LinearAlgebra
import SCS

Formulation

A d-dimensional quantum state, $\rho$, can be defined by a $d \times d$ complex-valued Hermitian positive semidefinite matrix with a trace of 1. Assume we have N d-dimensional quantum states, $\{\rho_i\}_{i=1}^N$, each of which is equally likely.
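
For example, here is a quick check (our addition, not part of the original tutorial) that the maximally mixed state $\mathbf{I}/d$ satisfies all three properties when $d = 2$:

ρ_mixed = Matrix(LinearAlgebra.I, 2, 2) / 2            # the maximally mixed state
LinearAlgebra.ishermitian(ρ_mixed)                     # true: Hermitian
LinearAlgebra.tr(ρ_mixed) ≈ 1                          # true: unit trace
all(LinearAlgebra.eigvals(LinearAlgebra.Hermitian(ρ_mixed)) .>= 0)  # true: positive semidefinite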

The goal of the quantum state discrimination problem is to choose a positive operator-valued measure (POVM), $\{ E_i \}_{i=1}^N$, such that if we observe $E_i$ then the most probable state that we are in is $\rho_i$.

Each POVM element, $E_i$, is a complex-valued Hermitian matrix, and there is a requirement that $\sum\limits_{i=1}^N E_i = \mathbf{I}$.
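
For example (an illustrative check of our own), when $d = 2$ the projectors onto the computational basis satisfy this requirement:

E_1 = [1.0 0.0; 0.0 0.0]         # projector onto the first basis state
E_2 = [0.0 0.0; 0.0 1.0]         # projector onto the second basis state
E_1 + E_2 == LinearAlgebra.I(2)  # true: the POVM elements sum to the identity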

To choose a POVM, we want to maximize the probability that we guess the quantum state correctly. Because $\operatorname{tr}(\rho_i E_i)$ is the probability of observing outcome $i$ when the system is in state $\rho_i$, and each state occurs with probability $\frac{1}{N}$, this can be formulated as the following optimization problem:

\[\begin{aligned} \max\limits_{E} \;\; & \frac{1}{N} \sum\limits_{i=1}^N \operatorname{tr}(\rho_i E_i) \\ \text{s.t.} \;\; & \sum\limits_{i=1}^N E_i = \mathbf{I} \\ & E_i \succeq 0 \; \forall i = 1,\ldots,N. \end{aligned}\]

Data

To set up our problem, we need N d-dimensional quantum states. To keep the problem simple, we use N = 2 and d = 2.

N, d = 2, 2
(2, 2)

We then generate N random d-dimensional quantum states:

function random_state(d)
    # x * x' is Hermitian and positive semidefinite by construction.
    x = randn(ComplexF64, (d, d))
    y = x * x'
    # Dividing by the trace normalizes the result to have a trace of 1.
    return LinearAlgebra.Hermitian(y / LinearAlgebra.tr(y))
end

ρ = [random_state(d) for i in 1:N]
2-element Vector{LinearAlgebra.Hermitian{ComplexF64, Matrix{ComplexF64}}}:
 [0.9049983143615443 + 0.0im -0.11984167294191471 + 0.2224268161763913im; -0.11984167294191471 - 0.2224268161763913im 0.09500168563845571 + 0.0im]
 [0.2503060704439256 + 0.0im -0.16997258971668044 - 0.09226624832975384im; -0.16997258971668044 + 0.09226624832975384im 0.7496939295560743 + 0.0im]
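
Although not part of the original tutorial, a quick sanity check confirms that each sample is a valid quantum state (the 1e-12 tolerance is our choice):

all(LinearAlgebra.ishermitian(ρ_i) for ρ_i in ρ)                  # Hermitian
all(LinearAlgebra.tr(ρ_i) ≈ 1 for ρ_i in ρ)                      # unit trace
all(minimum(LinearAlgebra.eigvals(ρ_i)) >= -1e-12 for ρ_i in ρ)  # positive semidefinite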

JuMP formulation

To model the problem in JuMP, we need a solver that supports positive semidefinite constraints:

model = Model(SCS.Optimizer)
set_silent(model)

Then, we construct our set of E variables:

E = [@variable(model, [1:d, 1:d] in HermitianPSDCone()) for i in 1:N]
2-element Vector{LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}}:
 [_[1] _[2] + _[4] im; _[2] - _[4] im _[3]]
 [_[5] _[6] + _[8] im; _[6] - _[8] im _[7]]

Here we have created a vector of matrices. This is different from other modeling languages such as YALMIP, which allow you to create a multi-dimensional array in which 2-dimensional slices of the array are Hermitian matrices.
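
For comparison, a single named Hermitian matrix variable can be created directly. This sketch uses a throwaway model so that it does not add unused variables to the tutorial's model:

demo = Model(SCS.Optimizer)
@variable(demo, F[1:d, 1:d] in HermitianPSDCone())  # one d-by-d Hermitian PSD matrix

Because we need N such matrices, the tutorial builds them in a comprehension and stores them in a vector.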

We also need to enforce the constraint that $\sum\limits_{i=1}^N E_i = \mathbf{I}$:

@constraint(model, sum(E) == LinearAlgebra.I)

\[ \begin{bmatrix} {\_}_{1} + {\_}_{5} - 1 & {\_}_{2} + {\_}_{6} + {\_}_{4} im + {\_}_{8} im\\ {\_}_{2} + {\_}_{6} - {\_}_{4} im - {\_}_{8} im & {\_}_{3} + {\_}_{7} - 1\\ \end{bmatrix} \in \text{Zeros()} \]

This constraint is a complex-valued equality constraint. In the solver, it will be decomposed into two types of equality constraints: one to enforce equality of the real components, and one to enforce equality of the imaginary components.
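
Concretely, the $2 \times 2$ constraint displayed above is enforced as four real equations (the lower-triangular entry is redundant by Hermitian symmetry):

\[\begin{aligned} {\_}_{1} + {\_}_{5} - 1 &= 0 && \text{(diagonal, real)} \\ {\_}_{3} + {\_}_{7} - 1 &= 0 && \text{(diagonal, real)} \\ {\_}_{2} + {\_}_{6} &= 0 && \text{(off-diagonal, real part)} \\ {\_}_{4} + {\_}_{8} &= 0 && \text{(off-diagonal, imaginary part)} \end{aligned}\]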

Our objective is to maximize the expected probability of guessing correctly:

@objective(
    model,
    Max,
    sum(real(LinearAlgebra.tr(ρ[i] * E[i])) for i in 1:N) / N,
)

\[ 0.45249915718077216 {\_}_{1} - 0.11984167294191471 {\_}_{2} + 0.2224268161763913 {\_}_{4} + 0.047500842819227854 {\_}_{3} + 0.1251530352219628 {\_}_{5} - 0.16997258971668044 {\_}_{6} - 0.09226624832975384 {\_}_{8} + 0.37484696477803714 {\_}_{7} \]

Now we optimize:

optimize!(model)
@assert is_solved_and_feasible(model)
solution_summary(model)
* Solver : SCS

* Status
  Result count       : 1
  Termination status : OPTIMAL
  Message from the solver:
  "solved"

* Candidate solution (result #1)
  Primal status      : FEASIBLE_POINT
  Dual status        : FEASIBLE_POINT
  Objective value    : 8.64061e-01
  Dual objective value : 8.64062e-01

* Work counters
  Solve time (sec)   : 2.90361e-04

The probability of guessing correctly is:

objective_value(model)
0.8640614507314219

When N = 2, there is a known analytical solution (the Helstrom bound) of $\frac{1}{2} + \frac{1}{4} \|\rho_1 - \rho_2\|_1$, where $\|\cdot\|_1$ is the trace norm, that is, the sum of the singular values:

0.5 + 0.25 * sum(LinearAlgebra.svdvals(ρ[1] - ρ[2]))
0.8640627582954737

which agrees with the solver's objective value to within its tolerance, confirming that we found the optimal solution.
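
As an illustrative check (our addition; the 1e-5 tolerance is an assumption about SCS's default accuracy):

analytic = 0.5 + 0.25 * sum(LinearAlgebra.svdvals(ρ[1] - ρ[2]))
isapprox(objective_value(model), analytic; atol = 1e-5)  # true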

Finally, the optimal POVM is:

solution = [value.(e) for e in E]
2-element Vector{Matrix{ComplexF64}}:
 [0.9495721399750024 + 0.0im 0.03442451603977098 + 0.21609731371190505im; 0.03442451603977098 - 0.21609731371190505im 0.05042785512985496 + 0.0im]
 [0.05042785517602001 + 0.0im -0.03442451605312517 - 0.21609731370614843im; -0.03442451605312517 + 0.21609731370614843im 0.9495721400119357 + 0.0im]
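
Before finishing, it is worth verifying that these matrices form a valid POVM. This check is our addition, with a 1e-6 tolerance chosen to exceed SCS's accuracy:

isapprox(sum(solution), LinearAlgebra.I(d); atol = 1e-6)  # true: the elements sum to the identity
all(minimum(LinearAlgebra.eigvals(LinearAlgebra.Hermitian(s))) >= -1e-6 for s in solution)  # true: each element is PSD
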
Tip

Duality plays a large role in solving conic optimization models. Depending on the solver, it can be more efficient to solve the dual of this problem instead of the primal. If performance is an issue, see the Dualization tutorial for more details.
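
A minimal sketch of that approach, assuming the Dualization package is installed:

import Dualization
dual_model = Model(Dualization.dual_optimizer(SCS.Optimizer))  # SCS receives the dual problem
set_silent(dual_model)

The model can then be built exactly as before, and SCS will solve its dual.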

Alternative formulation

The formulation above includes N Hermitian matrices and a set of linear equality constraints. We can simplify the problem by substituting $E_N = \mathbf{I} - \sum\limits_{i=1}^{N-1} E_i$, which eliminates the linear equality constraint (the positive semidefiniteness of $E_N$ must still be imposed). This results in:

model = Model(SCS.Optimizer)
set_silent(model)
E = [@variable(model, [1:d, 1:d] in HermitianPSDCone()) for i in 1:N-1]
E_N = LinearAlgebra.Hermitian(LinearAlgebra.I - sum(E))
@constraint(model, E_N in HermitianPSDCone())
push!(E, E_N)
2-element Vector{LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}}:
 [_[1] _[2] + _[4] im; _[2] - _[4] im _[3]]
 [-_[1] + 1 -_[2] - _[4] im; -_[2] + _[4] im -_[3] + 1]

The objective can also be simplified by observing that $\operatorname{tr}(\rho_i E_i) = \operatorname{dot}(\rho_i, E_i)$ because each $\rho_i$ is Hermitian, and that LinearAlgebra.dot of two vectors of matrices sums these inner products. Thus, it is equivalent to:

@objective(model, Max, real(LinearAlgebra.dot(ρ, E)) / N)

\[ 0.32734612195880936 {\_}_{1} + 0.050130916774765735 {\_}_{2} + 0.31469306450614515 {\_}_{4} - 0.3273461219588093 {\_}_{3} + 0.49999999999999994 \]

Then we can check that we get the same solution:

optimize!(model)
@assert is_solved_and_feasible(model)
solution_summary(model)
* Solver : SCS

* Status
  Result count       : 1
  Termination status : OPTIMAL
  Message from the solver:
  "solved"

* Candidate solution (result #1)
  Primal status      : FEASIBLE_POINT
  Dual status        : FEASIBLE_POINT
  Objective value    : 8.64060e-01
  Dual objective value : 8.64062e-01

* Work counters
  Solve time (sec)   : 2.60546e-04

objective_value(model)
0.8640596603179975