# Accepted papers

The following is a list of accepted papers. Some conditionally-accepted papers are still being considered, and are not shown here.

. A complete characterisation of All-versus-Nothing arguments on stabilisers
Abstract: An important class of contextuality arguments in quantum foundations are the All-versus-Nothing proofs, generalising a construction originally due to Mermin. We present a general formulation of All-versus-Nothing arguments, and a complete characterisation of all such arguments which arise from stabiliser subgroups of the Pauli n-group. We show that every AvN argument for an $n$-qubit stabiliser state can be reduced to an AvN proof for a three-qubit state which is local Clifford-equivalent to the tripartite GHZ state. This result is achieved through a combinatorial characterisation of AvN arguments, the AvN triple Theorem. The proof of this theorem makes use of the theory of graph states. This result enables the development of a computational method to identify all the AvN arguments in $\mathbb{Z}_2$ on general $n$-qubit stabiliser states. We also present new insights into the stabiliser formalism and its connections with logic.
. Minimum quantum resources for strong non-locality
Abstract: We analyse the minimum quantum resources needed to realise strong non-locality, as exemplified e.g. by the classical GHZ construction. It was already known that no two-qubit system, with any finite number of local measurements, can realise strong non-locality. For three-qubit systems, we show that strong non-locality can only be realised in the GHZ SLOCC class, and with equatorial measurements. However, we show that in this class there is an infinite family of states which are pairwise non-LU-equivalent that realise strong non-locality with finitely many measurements. These states have decreasing entanglement between one qubit and the other two, necessitating an increasing number of local measurements on the latter.
Matthew Amy, Jianxin Chen and Neil J. Ross. A finite presentation of CNOT-dihedral operators
Abstract: We give a finite presentation by generators and relations of unitary operators expressible over the {CNOT, T, X} gate set, also known as CNOT-dihedral operators. To this end, we introduce a notion of normal form for CNOT-dihedral circuits and prove that every CNOT-dihedral operator admits a unique normal form. Moreover, we show that in the presence of certain structural rules only finitely many circuit identities are required to reduce an arbitrary CNOT-dihedral circuit to its normal form. By appropriately restricting our relations, we obtain a finite presentation of unitary operators expressible over the {CNOT, T} gate set as a corollary.
Howard Barnum, Ciaran Lee, Carlo Maria Scandolo and John Selby. Ruling out higher-order interference from purity principles
Abstract: As first noted by Rafael Sorkin, there is a limit to quantum interference. The interference pattern formed in a multi-slit experiment is a function of the interference patterns formed between pairs of slits; there are no genuinely new features resulting from considering three slits instead of two. Sorkin has introduced a hierarchy of mathematically conceivable higher-order interference behaviours, where classical theory lies at the first level of this hierarchy and quantum theory at the second. Informally, the order in this hierarchy corresponds to the number of slits on which the interference pattern has an irreducible dependence. Many authors have wondered why quantum interference is limited to the second level of this hierarchy. Does the existence of higher-order interference violate some natural physical principle that we believe should be fundamental? In the current work we show that natural physical principles can be found which limit interference behaviour to second-order, or "quantum-like", interference, but that do not restrict us to the entire quantum formalism. We work within the operational framework of generalised probabilistic theories, and prove that any theory satisfying Causality, Purity Preservation, Pure Sharpness, and Purification---four principles that formalise the fundamental character of purity in nature---exhibits at most second-order interference. Hence these theories are, at least conceptually, very "close" to quantum theory. Along the way we show that systems in such theories correspond to Euclidean Jordan Algebras. Hence, they are self-dual and, moreover, multi-slit experiments in such theories are described by pure projectors.
. Common denominator for value and expectation no-go theorems
Abstract: Hidden-variable (HV) theories allege that a quantum state describes an ensemble of systems distinguished by the values of hidden variables. No-go theorems assert that HV theories cannot match the predictions of quantum theory. The present work started with repairing flaws in the literature on no-go theorems asserting that HV theories cannot predict the expectation values of measurements. That literature gives one an impression that expectation no-go theorems subsume the time-honored no-go theorems asserting that HV theories cannot predict the possible values of measurements. But the two approaches speak about different kinds of measurement. This hinders comparing them to each other. Only projection measurements are common to both. Here, we sharpen the results of both approaches so that only projection measurements are used. This allows us to clarify the similarities and differences between the two approaches. Neither one dominates the other.
Spencer Breiner, Carl A. Miller and Neil J. Ross. Graphical Methods in Quantum Cryptography
Abstract: We introduce a framework for providing graphical security proofs for quantum cryptography using the methods of categorical quantum mechanics. We are optimistic that this approach will make some of the highly complex proofs in quantum cryptography more accessible, facilitate the discovery of new proofs, and enable automated proof verification. As an example of our framework, we reprove a recent result from device-independent quantum cryptography: any linear randomness expansion protocol can be converted into an unbounded randomness expansion protocol. We give a graphical exposition of a proof of this result and implement parts of it in the Globular proof assistant.
Daniel Cicala. Categorifying the zx-calculus
Abstract: This paper presents a symmetric monoidal and compact closed bicategory that categorifies the zx-calculus developed by Coecke and Duncan. The 1-cells in this bicategory are certain graph morphisms that correspond to the string diagrams of the zx-calculus, while the 2-cells are rewrite rules.
Robin Cockett, Cole Comfort and Priyaa V. Srinivasan. The Category CNOT
Abstract: The paper studies the category CNOT which is generated by the quantum gates controlled-not, swap and the computational ancillae. We exhibit a complete set of identities satisfied by the gates generating the category CNOT. We prove that CNOT is a discrete inverse category and, moreover, that CNOT is equivalent to the category of partial isomorphisms of finitely-generated non-empty commutative torsors of characteristic 2.
Bob Coecke, John Selby and Sean Tull. Two Roads to Classicality
Abstract: Mixing and decoherence are both manifestations of classicality within quantum theory, each of which admit a very general category-theoretic construction. We show under which conditions these two 'roads to classicality' coincide. This is indeed the case for (finite-dimensional) quantum theory, where each construction yields the category of C*-algebras and completely positive maps. We present counterexamples where the property fails, including relational and modal theories. Finally, we provide a new interpretation for our category-theoretic generalisation of decoherence in terms of 'leaking information'.
. Uniqueness of composition in quantum theory and linguistics
Abstract: We derive a uniqueness result for non-Cartesian composition of systems in a large class of process theories, with important implications for quantum theory and linguistics. Specifically, we consider theories of wavefunctions valued in commutative involutive semirings---as modelled by categories of free finite-dimensional semimodules---and we prove that the only bilinear compact-closed symmetric monoidal structure is the canonical one (up to monoidal equivalence). Our results apply to conventional quantum theory and other toy theories of interest in the literature, such as real quantum theory, relational quantum theory, hyperbolic quantum theory and modal quantum theory. In computational linguistics, they imply that linear models for categorical compositional distributional semantics (DisCoCat)---such as vector spaces, sets and relations, and multisets/histograms---admit an (essentially) unique compatible pregroup grammar.
. Purity through factorisation
Abstract: We give a construction that identifies the collection of pure processes (i.e. those which are deterministic, or without randomness) within a theory containing both pure and mixed processes. Working in the framework of symmetric monoidal categories, we define a pure subcategory. This definition arises elegantly from the categorical notion of a weak factorisation system. Our construction gives the expected result in several examples, both quantum and classical.
Michele Dall'Arno, Sarah Brandsen, Alessandro Tosini, Francesco Buscemi and Vlatko Vedral. No-Hypersignaling as a Physical Principle
Abstract: A paramount topic in quantum foundations, rooted in the study of the EPR paradox and Bell inequalities, is that of characterizing quantum theory in terms of the space-like correlations it allows. Here we show that to focus only on space-like correlations is not enough: we explicitly construct a toy model theory that, though being perfectly compatible with classical and quantum theories at the level of space-like correlations, displays an anomalous behavior in its time-like correlations. We call this anomaly, quantified in terms of a specific communication game, the "hypersignaling" phenomenon. We hence conclude that the "principle of quantumness," if it exists, cannot be found in space-like correlations alone: nontrivial constraints need to be imposed also on time-like correlations, in order to exclude hypersignaling theories.
Kevin Dunne. On the Structure of H*-Algebras
Abstract: Previously we have shown that the topos approach to quantum theory of Doering and Isham can be generalised to a class of categories typically studied within the monoidal approach to quantum theory of Abramsky and Coecke. In the monoidal approach to quantum theory H*-algebras provide an axiomatisation of observables and states. Here we show that H*-algebras naturally correspond with the notions of observables and states in the generalised topos approach to quantum theory. We then combine these results with the dagger-kernel approach to quantum logic of Heunen and Jacobs, which we use to prove a structure theorem for H*-algebras. This structure theorem is a generalisation of the structure theorem of Ambrose for H*-algebras in the category of Hilbert spaces.
Kevin Dunne. Spectral Presheaves, Kochen-Specker Contextuality, and Quantale-Valued Relations
Abstract: In the topos approach to quantum theory of Doering and Isham the Kochen-Specker Theorem, which asserts the contextual nature of quantum theory, can be reformulated in terms of the global sections of a presheaf characterised by the Gelfand spectrum of a commutative C*-algebra. In previous work we showed how this topos perspective can be generalised to a class of categories typically studied within the monoidal approach to quantum theory of Abramsky and Coecke, and in particular how one can generalise the Gelfand spectrum. Here we study the Gelfand spectrum presheaf for categories of quantale-valued relations, and by considering its global sections we give a non-contextuality result for these categories. We also show that the Gelfand spectrum comes equipped with a topology which has a natural interpretation when thinking of these structures as representing physical theories.
Pau Enrique Moliner, Chris Heunen and Sean Tull. Space in monoidal categories
Abstract: The category of Hilbert modules may be interpreted as a naive quantum field theory over a base space. Open subsets of the base space are recovered as idempotent subunits, which form a meet-semilattice in any firm braided monoidal category. There is an operation of restriction to an idempotent subunit: it is a graded monad on the category, and has the universal property of algebraic localisation. Space-time structure on the base space induces a closure operator on the idempotent subunits. Restriction is then interpreted as spacetime propagation. This lets us study relativistic quantum information theory using methods entirely internal to monoidal categories. As a proof of concept, we show that quantum teleportation is only successfully supported on the intersection of Alice and Bob's causal future.
Alessandro Facchini, Alessio Benavoli and Marco Zaffalon. Bayes + Hilbert = Quantum Mechanics
Abstract: We consider the problem of gambling on a quantum experiment and enforce rational behavior by a few rules. These rules yield, in the classical case, the Bayesian theory of probability via duality theorems. In our quantum setting, they yield the Bayesian theory generalized to the space of Hermitian matrices. This very theory is quantum mechanics: in fact, we derive all four of its postulates from the generalized Bayesian theory. This implies that quantum mechanics is self-consistent. It also leads us to reinterpret the main operations in quantum mechanics as probability rules: Bayes' rule (measurement), marginalization (partial tracing), independence (tensor product). To say it with a slogan, we obtain that quantum mechanics is the Bayesian theory in the complex numbers. As an additional consequence, we derive a Gleason-type theorem that holds for any dimension n of a quantum system, and in particular for n = 2. The theorem states that the only logically consistent probability assignments are exactly the ones that are definable as the trace of the product of a projector and a density matrix operator.
Thomas Galley and Lluis Masanes. Classification of all alternatives to the Born rule in terms of informational properties
Abstract: The extent to which the structure of measurements and probabilities is already encoded in the structure and dynamics of pure states of quantum theory has been the subject of much debate. There have been attempts to derive the Born rule (which assigns probabilities to measurement outcomes); however, these are often deemed controversial [3, 6, 7]. In this work we suggest a more neutral approach where we consider all possible alternatives to the Born rule and explore the consequences of this change. We consider theories with the same pure states and dynamics as quantum theory but different measurement rules. We classify all these alternative theories using representation-theoretic tools and describe informational properties of these alternatives. We show that no restriction of effects and bit symmetry single out the Born rule. We also conjecture that the Born rule is the only probabilistic assignment consistent with local tomography.
Liam Garvie and Ross Duncan. Verifying the Smallest Interesting Colour Code with Quantomatic
Abstract: The smallest interesting colour code is an [[8,3,2]] code for which a fault tolerant CCZ-gate can be implemented using 8 single-qubit T gates. In this paper we formalise the code in the ZX-calculus and verify its basic properties using the interactive theorem prover Quantomatic.
. Towards Quantum Field Theory in Categorical Quantum Mechanics
Abstract: In this work, we use tools from non-standard analysis to introduce infinite-dimensional quantum systems and quantum fields within the framework of Categorical Quantum Mechanics. We define a dagger compact category Star Hilb suitable for the algebraic manipulation of unbounded operators, Dirac deltas and plane-waves. We cover in detail the construction of quantum systems for particles in boxes with periodic boundary conditions, particles on cubic lattices, and particles in real space. Not quite satisfied with this, we show how certain non-separable Hilbert spaces can also be modelled in our non-standard framework, and we explicitly treat the cases of quantum fields on cubic lattices and quantum fields in real space.
. Categorical Probabilistic Theories
Abstract: We present a simple categorical framework for the treatment of probabilistic theories, with the aim of reconciling the fields of Categorical Quantum Mechanics (CQM) and Operational Probabilistic Theories (OPTs). In recent years, both CQM and OPTs have found successful application to a number of areas in quantum foundations and information theory: they present many similarities, both in spirit and in formalism, but they remain separated by a number of subtle yet important differences. We attempt to bridge this gap, by adopting a minimal number of operationally motivated axioms which provide clean categorical foundations, in the style of CQM, for the treatment of the problems that OPTs are concerned with.
. Frobenius structures over Hilbert C*-modules
Abstract: We study the monoidal dagger category of Hilbert C*-modules over a commutative C*-algebra from the perspective of categorical quantum mechanics. The dual objects are the finitely presented projective Hilbert C*-modules. Special dagger Frobenius structures correspond to bundles of uniformly finite-dimensional C*-algebras. A monoid is dagger Frobenius over the base if and only if it is dagger Frobenius over its centre and the centre is dagger Frobenius over the base. We characterise the commutative dagger Frobenius structures as branched coverings with finite fibres, and give nontrivial examples of both commutative and central dagger Frobenius structures. Subobjects of the tensor unit correspond to clopen subsets of the Gelfand spectrum of the C*-algebra, and we discuss dagger kernels.
Dominic Horsman and Niel de Beaudrap. The ZX calculus is a language for surface code lattice surgery
Abstract: Quantum computing is moving rapidly to the point of deployment of technology. Functional quantum devices will require the ability to correct errors in order to be scalable and effective. A leading choice of error correction, in particular for modular or distributed architectures, is the surface code with logical two-qubit operations realised via "lattice surgery". These operations consist of "merges" and "splits" acting non-unitarily on the logical states and are not easily captured by standard circuit notation. This raises the question of how best to reason about lattice surgery in order to use quantum states and operations efficiently in architectures with complex resource management issues. In this paper we demonstrate that the operations of the ZX calculus, a form of quantum diagrammatic reasoning designed using category theory, match exactly the operations of lattice surgery. Red and green "spider" nodes match rough and smooth merges and splits, and follow the axioms of a dagger special associative Frobenius algebra. Some lattice surgery operations can require non-trivial correction operations, which are captured natively in the use of the ZX calculus in the form of ensembles of diagrams. We give a first taste of the power of the calculus as a language for surgery by considering two operations (magic state use and producing a CNOT) and show how ZX diagram re-write rules give lattice surgery procedures for these operations that are novel, efficient, and highly configurable.
. Y-Calculus: A language for real Matrices derived from the ZX-Calculus
Abstract: We introduce a ZX-like diagrammatic language devoted to manipulating real matrices -- and rebits --, with its own set of axioms. We prove the necessity of some of its non-trivial axioms. We show that a restriction of the language is complete. We exhibit two interpretations to and from the ZX-Calculus, thus showing the consistency between the two languages. Finally, we derive from our work a way to extract the real or imaginary part of a ZX-diagram, and prove that a restriction of our language is complete if and only if the equivalent restriction of the ZX-calculus is complete.
Abstract: We propose a Matrix Theory approach to tensor-based models of meaning, based on permutation symmetry along with Gaussian weights and their perturbations. A simple Gaussian model is tested against word matrices created from a large corpus of text. We characterize the cubic and quartic departures from the model, which we propose, alongside the Gaussian parameters, as signatures for comparison of linguistic corpora. We propose that perturbed Gaussian models with permutation symmetry provide a promising framework for characterizing the nature of universality in the statistical properties of word matrices. The matrix theory framework developed here exploits the view of statistics as zero dimensional perturbative quantum field theory. It perceives language as a physical system realizing a universality class of matrix statistics characterized by permutation symmetry.
. Picture-perfect QKD
Abstract: We provide a new way to bound the security of quantum key distribution using only the diagrammatic behaviour of complementary observables and essential uniqueness of purification for quantum channels. We begin by demonstrating a proof in the simplest case, where the eavesdropper doesn't noticeably disturb the channel at all and has no quantum memory. We then show how this case extends with almost no effort to account for quantum memory and noise.
Irreducible noncontextuality inequalities from the Kochen-Specker theorem
Abstract: Recent work (Kunjwal and Spekkens, PRL 115, 110403 (2015)) has shown how operational noncontextuality inequalities robust to noise can be obtained from Kochen-Specker uncolourable (or KS-uncolourable) hypergraphs. In contrast to traditional approaches, this operational approach does not assume that measurement outcomes are fixed deterministically by the ontic state of the system in an underlying ontological model, nor does it employ factorizability a la Bell's theorem (where factorizability is justified by the assumption of local causality). The result of PRL 115, 110403 (2015) relied on an explicit numerical enumeration of all the extremal points of the polytope of (measurement) noncontextual assignments of probabilities to the KS-uncolourable hypergraph. Here we focus on an analytical approach to deriving such noncontextuality inequalities that relies on constraints arising directly from the structure of the hypergraph without necessarily enumerating all the extremal probabilistic models on it. This cleanly identifies the operational quantities that one can expect to be constrained (and why) by the assumption of noncontextuality, instead of having to guess these quantities or obtain them from brute-force numerical methods without any guiding principles to identify them. Indeed, we show how to identify a minimal set of independent noncontextuality inequalities for any KS-uncolourable hypergraph given the operational equivalences of the type assumed in Ref. [1]. Along the way, we define a parameterization of contextuality scenarios to obtain conditions for their KS-uncolourability. We outline a particularly simple way to generate a family of KS-uncolourable hypergraphs by defining a map, 2Reg(.), that takes any graph to a corresponding (2-regular) hypergraph. Some known examples of Kochen-Specker sets exhibit orthogonality relations from this family of KS-uncolourable hypergraphs. We analytically obtain operational noncontextuality inequalities for (2-regular) KS-uncolourable hypergraphs using properties of the (simpler) underlying graphs, namely, those from which they can be obtained via 2Reg(.).
Ciaran Lee and John Selby. A no-go theorem for theories that decohere to quantum mechanics
Abstract: Quantum theory is the most experimentally verified physical theory in the history of science. Yet it may be the case that quantum theory is only an effective description of the world, in the same way that classical physics is an effective description of the quantum world. In this work we ask whether there can exist an operationally-defined theory superseding quantum theory, but which reduces to quantum theory via a decoherence-like mechanism. We prove that no such post-quantum theory can exist if it is demanded that it satisfies two natural physical principles, causality and purification. Here, causality formalises the statement that information propagates from present to future, and purification that each state of incomplete information arises in an essentially unique way due to lack of information about an environment system. Hence, our result can either be viewed as a justification of why the fundamental theory of Nature is quantum, or as showing in a rigorous manner that any post-quantum theory must abandon the principle of causality, the principle of purification, or both.
Shane Mansfield. Reality of the quantum state: Towards a stronger ψ-ontology theorem
Abstract: The Pusey-Barrett-Rudolph (PBR) no-go theorem provides an argument for the reality of the quantum state by ruling out ψ-epistemic ontological theories, in which the quantum state is of a statistical nature. It applies under an assumption of preparation independence, the validity of which has been subject to debate. We propose two plausible and less restrictive alternatives: a weaker notion allowing for classical correlations, and an even weaker, physically motivated notion of independence, which merely prohibits the possibility of superluminal causal influences in the preparation process. The latter is a minimal requirement for enabling a reasonable treatment of subsystems in any theory. It is demonstrated by means of an explicit ψ-epistemic ontological model that the argument of PBR becomes invalid under the alternative notions of independence. As an intermediate step, we recover a result which is valid in the presence of classical correlations. Finally, we obtain a theorem which holds under the minimal requirement, approximating the result of PBR. For this, we consider experiments involving randomly sampled preparations and derive bounds on the degree of ψ-epistemicity that is consistent with the quantum-mechanical predictions. The approximation is exact in the limit as the sample space of preparations becomes infinite.
. Quantum combinatorial games
Abstract: In this paper, we propose a quantum variation of combinatorial games, generalizing the Quantum Tic-Tac-Toe proposed by Allan Goff [2006]. A combinatorial game is a two-player game with no chance and no hidden information, such as Go or Chess. We consider the possibility of playing superpositions of moves in such games. We propose different rulesets depending on when superposed moves should be played, and prove that these different rulesets may lead the same game to different outcomes. We then consider quantum variations of the game of Nim. We conclude with some discussion on the relative interest of the different rulesets.
Daniel Mills, Anna Pappa, Theodoros Kapourniotis and Elham Kashefi. Information Theoretically Secure Hypothesis Test for Temporally Unstructured Quantum Computation
Abstract: The efficient certification of classically intractable quantum devices has been a central research question for some time. However, to observe a "quantum advantage", it is believed that one does not need to build a large scale universal quantum computer; a task which has proven extremely challenging. Intermediate quantum models that are easier to implement, but which also exhibit this quantum advantage over classical computers, have been proposed. In this work, we present a certification technique for such a sub-universal quantum Server which only performs commuting gates and requires very limited quantum memory. By allowing a verifying Client to manipulate single qubits, we exploit properties of measurement based blind quantum computing to prove, in a composable and secure way, the "quantum superiority" of the Server.
. QWIRE Practice: Formal Verification of Quantum Circuits in Coq
Abstract: We describe an embedding of the QWIRE quantum circuit language in the Coq proof assistant. This allows programmers to write quantum circuits using high-level abstractions and to prove properties of those circuits using Coq's theorem proving features. The implementation uses higher-order abstract syntax to represent variable binding and provides a type-checking algorithm for linear wire types, ensuring that quantum circuits are well-formed. We formalize a denotational semantics that interprets QWIRE circuits as superoperators on density matrices, and prove the correctness of some simple quantum programs.
. Shaded tangles for the design and verification of quantum programs
Abstract: We give a scheme for interpreting shaded tangles as quantum programs, with the property that isotopic tangles yield equivalent programs. We analyze many known quantum programs in this way---including entanglement manipulation and error correction---and in each case present a fully-topological formal verification, yielding in several cases substantial new insight into how the program works. We also use our methods to identify several new or generalized procedures.
. Biunitary constructions in quantum information
Abstract: We present an infinite number of constructions involving unitary error bases, Hadamard matrices, quantum Latin squares and controlled families, many of which have not previously been described. Our results rely on the type structure of biunitary connections, 2-categorical structures which play a central role in the theory of planar algebras. They have an attractive graphical calculus which allows simple correctness proofs for the constructions we present. We apply these techniques to construct a unitary error basis that cannot be built using any previously known method.
Lídia Del Rio and Lea Kraemer. Operational locality in global theories
Abstract: Within a global physical theory, a notion of locality allows us to find and justify information-processing primitives, like non-signalling between distant agents. Here we propose exploring the opposite direction: to take agents as the basic building blocks through which we test a physical theory, and recover operational notions of locality from signalling conditions. First we introduce an operational model for the effective state spaces of individual agents, as well as the range of their actions. We then formulate natural secrecy conditions between agents and identify the aspects of locality relevant for signalling. We discuss the possibility of taking commutation of transformations as a primitive of physical theories, as well as applications to quantum theory and generalized probability frameworks. This "it from bit" approach establishes an operational connection between local action and local observations, and gives a global interpretation to concepts like discarding a subsystem or composing local functions.
. Operational thermodynamics from purity
Abstract: This is an extended abstract based on the preprint arXiv:1608.04459. We propose four information-theoretic axioms for the foundations of statistical mechanics in general physical theories. The axioms "Causality, Purity Preservation, Pure Sharpness, and Purification" identify purity as a fundamental ingredient for every sensible theory of thermodynamics. Indeed, in physical theories satisfying these axioms, called sharp theories with purification, every mixed state can be modelled as the marginal of a pure entangled state, and every unsharp measurement can be modelled as a sharp measurement on a composite system. We show that these theories support a well-behaved notion of entropy and of Gibbs states, by which one can derive Landauer's principle. We show that in sharp theories with purification some bipartite states can have negative conditional entropy, and we construct an operational protocol exploiting this feature to overcome Landauer's principle.
Michal Sedlak, Alessandro Bisio and Mario Ziman. Perfect probabilistic storing and retrieving of unitary channels
Abstract: Any sequence of quantum gates on a set of qubits defines a multipartite unitary transformation. Such sequences may correspond to parts of a quantum computation, or they may be used to encode classical/quantum information (e.g. in private quantum channels). If we have only limited access to such a unitary transformation, we may want to store it in a quantum memory and later perfectly retrieve it. Thus, once we can no longer use the unitary transformation directly, we could still apply it to any state with the help of the footprint kept in the quantum memory. This can be useful for speeding up some calculations, or as an attack on a process-based quantum key distribution protocol or communication scheme. We require the storing and retrieving protocol to perfectly reconstruct the unitary transformation, which implies a non-unit probability of success. We derive the optimal probability of success for a qubit unitary transformation (d = 2) used N times. The optimal probability of success has the very simple form P = N/(N + 3). We also solve the problem for one up to five uses of a d-dimensional unitary transformation, and in all these cases we find that the probability of success goes to one as N/(N − 1 + d^2).
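As a quick numerical illustration of the two success probabilities quoted in this abstract, the sketch below evaluates N/(N − 1 + d²) and checks that for qubits (d = 2) it reduces to the stated N/(N + 3); the function name is our own, not the authors' notation.

```python
from fractions import Fraction


def success_probability(N: int, d: int = 2) -> Fraction:
    """Optimal success probability N / (N - 1 + d^2) quoted in the
    abstract (derived there for one up to five uses of a d-dimensional
    unitary transformation)."""
    return Fraction(N, N - 1 + d * d)


# For a qubit unitary (d = 2) this is exactly the stated N / (N + 3),
# and the probability approaches 1 as N grows.
for N in range(1, 6):
    assert success_probability(N, d=2) == Fraction(N, N + 3)
    print(N, success_probability(N, d=2))
```

For example, a single use of a qubit unitary (N = 1, d = 2) already gives a success probability of 1/4.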
John Selby and Bob Coecke. Leaks: quantum, classical, intermediate, and more
Abstract: We introduce the notion of a leak for general process theories, and identify quantum theory as a theory with minimal leakage, while classical theory has maximal leakage. We provide a construction that adjoins leaks to theories, an instance of which describes the emergence of classical theory by adjoining decoherence-leaks to quantum theory. Finally, we show that defining a notion of purity for processes in general process theories has to make reference to the leaks of that theory ---a feature missing in standard definitions--- hence, we propose a refined definition and study the resulting notion of purity for quantum, classical and intermediate theories.
Peter Selinger and Francisco Rios. A categorical model for a quantum circuit description language
Abstract: Quipper is a practical programming language for describing families of quantum circuits. In this paper, we formalize a small, but useful fragment of Quipper called Proto-Quipper-M. Unlike its parent Quipper, this language is type-safe and has a formal denotational and operational semantics. Proto-Quipper-M is also more general than Quipper, in that it can describe families of morphisms in any symmetric monoidal category, of which quantum circuits are but one example. We design Proto-Quipper-M from the ground up, by first giving a general categorical model of parameters and state. The distinction between parameters and state is also known from hardware description languages. A parameter is a value that is known at circuit generation time, whereas a state is a value that is known at circuit execution time. After finding some interesting categorical structures in the model, we then define the programming language to fit the model. We cement the connection between the language and the model by proving type safety, soundness, and adequacy properties.
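The parameter/state distinction described in this abstract can be illustrated with a small, purely hypothetical Python sketch (this is not Quipper or Proto-Quipper-M syntax): the integer `n` is a parameter, fixed when the circuit is generated, while the qubits the circuit acts on carry state that is only determined when the circuit is executed.

```python
def make_ghz_circuit(n: int) -> list[tuple]:
    """Generate a GHZ-style circuit on n qubits.

    `n` is a *parameter*: it is known at circuit-generation time, so the
    gate list below is fully determined before any qubit exists. The
    qubits' *state* is only known at circuit-execution time.
    """
    gates = [("H", 0)]  # Hadamard on the first wire
    gates += [("CNOT", 0, q) for q in range(1, n)]  # entangle remaining wires
    return gates


# Generation time: the parameter n = 3 picks out one circuit in the family.
circuit = make_ghz_circuit(3)
print(circuit)
```

Varying the parameter yields the whole family of circuits, which is exactly the kind of family-of-morphisms description the abstract says Proto-Quipper-M generalizes to arbitrary symmetric monoidal categories.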
Sander Uijlen and Aleks Kissinger. A categorical semantics for causal structures
Abstract: We present a categorical construction for modelling both definite and indefinite causal structures within a general class of process theories that include classical probability theory and quantum theory. Unlike prior constructions within categorical quantum mechanics, the objects of this theory encode fine-grained causal relationships between subsystems and give a new method for expressing and deriving consequences for a broad class of causal structures. To illustrate this point, we show that this framework admits processes with definite causal structures, namely one-way signalling processes, non-signalling processes, and quantum n-combs, as well as processes with indefinite causal structure, such as the quantum switch and the process matrices of Oreshkov, Costa, and Brukner. We furthermore give derivations of their operational behaviour using simple, diagrammatic axioms.
Quanlong Wang. Qutrit ZX-calculus is complete for Stabilizer Quantum Mechanics
Abstract: In this paper, we show that a qutrit version of ZX-calculus, with rules significantly different from those of the qubit version, is complete for pure qutrit stabilizer quantum mechanics, where state preparations and measurements have to be in the three-dimensional computational basis, and unitary operations are required to be in the generalized Clifford group. This means that any equality that can be derived using matrices can also be derived diagrammatically. In contrast to the qubit case, the proof here is more complicated due to the richer structure of this qutrit ZX-calculus.
Mirjam Weilenmann and Roger Colbeck. Analysis of the entropy vector approach to distinguish classical and quantum causal structures
Abstract: Bell's theorem shows that our intuitive understanding of causation must be revisited in light of quantum correlations. Nevertheless, quantum mechanics does not permit signalling and hence a notion of cause remains. Understanding this notion is not only important at a fundamental level, but also for technological applications such as key distribution and randomness expansion. It has recently been shown that a useful way to determine which classical causal structures give rise to a given set of correlations is to use entropy vectors. We consider the question of whether such vectors can lead to useful certificates of non-classicality. We find that for a family of causal structures that includes the usual bipartite Bell structure and the bilocality scenario, they do not, in spite of the existence of non-classical correlations. Furthermore, we find that for many causal structures non-Shannon entropic inequalities give additional constraints on the sets of possible entropy vectors in the classical case. They hence lead to tighter outer approximations of the set of realisable entropy vectors, which we are able to supplement with inner approximations. This enables a sharper distinction of different causal structures. Whether our improved characterisations are also valid for the quantum case remains an open problem, whose resolution would have implications for the discrimination of classical and quantum causes and would give novel insights into the question of whether there exist additional inequalities for the von Neumann entropy.
Abstract: This paper charts a very direct path between the categorical approach to quantum mechanics, due to Abramsky and Coecke, and the older convex-operational approach based on ordered vector spaces (recently reincarnated as "generalized probabilistic theories"). In the former, the objects of a symmetric monoidal category $\mathcal C$ are understood to represent physical systems, and morphisms, physical processes. Elements of the monoid ${\mathcal C}(I,I)$ are interpreted somewhat metaphorically as probabilities. Any monoid homomorphism from ${\mathcal C}(I,I)$ into $\mathbb R_{+}$, enabling us to interpret some scalars as actual probabilities, gives rise to a natural covariant functor $V_o$ from $\mathcal C$ into the category of ordered real vector spaces and positive linear maps. Moreover, the image category $V_o({\mathcal C})$ is closed under a well-defined and reasonably well-behaved bilinear product $V_o(A), V_o(B) \mapsto V_o(A \otimes B)$, satisfying an un-normalized no-signaling condition. Under an additional local-tomography assumption (satisfied by most of the standard examples, but which one would like to weaken), this makes $V_o({\mathcal C})$ a symmetric monoidal category, and $V_o$, a monoidal functor. A choice of order units $u_A \in V_o(A)^{\ast}$ with $u_{A \otimes B} = u_{A} \otimes u_B$, picking out a convex set $\Omega_o(A)$ of normalized states, allows us to interpret $V_o({\mathcal C})$ as a category of convex operational models. By considering different closures of $\Omega_o(A)$, we obtain two additional monoidal functors $V_1$ and $W$, from a subcategory ${\mathcal C}_u$ of ${\mathcal C}$, to a monoidal category of complete base-normed spaces.
If ${\mathcal C}$ is the category of complex Hilbert spaces and linear mappings, with $p : {\mathcal C}(I,I) = \mathbb{C} \rightarrow \mathbb R$ given by $p(z) = |z|^2$, $V_o(A)$ can be identified with the space of finite-rank self-adjoint operators on $A$, $u_A$ is the trace, ${\mathcal C}_u$ is the category of trace-preserving positive linear mappings, $V_1(A)$ is the space of self-adjoint trace-class operators, and $W(A)$ is the space spanned by the (not necessarily normal) states on the algebra of bounded operators on $A$.