qtml2024.pptx - QTML 2024 presentation (BMW Group / Airbus)




Slide Content

Classification of the Fashion-MNIST Dataset on a Quantum Computer - with a focus on data encoding
Kevin Shen, Bernhard Jobst, Elvira Shishenina, Frank Pollmann | QTML 2024, Melbourne

Data Encoding

What?
- Loading classical data (stored on a classical computer) into quantum states.

When yes?
- When classical data are to be processed by quantum algorithms (quantum algorithm = encoding + processing).

When no?
- When data are already given as quantum states
- For some generative modelling algorithms
- …

Data Encoding

Consider the classical data. How do we encode it?

Decision to be made: do we look for quantum advantage from the encoding or from the processing?
- If processing: the encoding should be resource-efficient and keep sufficient information.
- If encoding: "quantum feature maps" - kernel methods [1, 2], variational circuits [3].

1. Vojtěch Havlíček et al., "Supervised learning with quantum-enhanced feature spaces"
2. Schuld et al., "Quantum machine learning in feature Hilbert spaces"
3. Sofiene Jerbi et al., "Quantum machine learning beyond kernel methods"

Data Encoding

Suppose we look for quantum advantage from the processing. Consider the classical data. How do we encode it?
- Amplitude encoding [1]: |ψ(x)⟩ = (1/‖x‖) Σ_j x_j |j⟩

Efficiency?
- The encoding gate count is exponential in the number of qubits [2, 3].

Consequences?
- Challenges provable quantum advantage from processing
- Challenges empirical studies now / in the near future

1. Alessandro Luongo, https://quantumalgorithms.org/
2. Ville Bergholm et al., "Quantum circuits with uniformly controlled one-qubit gates"
3. Xiaoming Sun et al., "Asymptotically Optimal Circuit Depth for Quantum State Preparation and General Unitary Synthesis"
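
The exponential cost is easy to make concrete in code. A minimal NumPy sketch (made-up pixel data, my own illustration, not from the talk) of what amplitude encoding does to a flattened image:

```python
import numpy as np

# Hypothetical data: a flattened 4x4 grayscale "image" (16 pixel values).
rng = np.random.default_rng(0)
pixels = rng.random(16)

# Amplitude encoding: normalize the vector so it is a valid quantum state
# |psi(x)> = (1/||x||) * sum_j x_j |j>.
state = pixels / np.linalg.norm(pixels)
n_qubits = int(np.log2(state.size))

print(n_qubits)                                     # 4 qubits hold 16 amplitudes
print(np.isclose(np.sum(np.abs(state) ** 2), 1.0))  # True: properly normalized

# Conversely, n qubits hold 2**n amplitudes, but preparing a generic such
# state exactly needs a gate count exponential in n - the problem above.
```

The memory side is favorable (n qubits for 2^n values); it is the state-preparation circuit that pays the exponential price.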

Data Encoding / Data Compression

Our motivation: to make amplitude encoding more practical for quantum machine learning experiments
- Use data compression (efficient but approximate encoding)
- Sensible if the processing is robust against (or benefits from) noise in the data

How to evaluate data compression?
- Fidelity
- Performance on the downstream processing task

Data Compression by Matrix Product States (MPS)

What is an MPS?
- One way to classically store a 1D quantum state (amplitude-encoded data fits here)
- Can compress data by lowering the bond dimension
- If the state has low entanglement -> low bond dimension (low memory) & low approximation errors

Image source: https://tensornetwork.org/
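
The compression step can be sketched classically with plain NumPy: truncate the bond dimension during successive SVDs and watch the fidelity. The function name, toy signal, and parameters below are my own illustration, not the authors' code:

```python
import numpy as np

def mps_compress(vec, chi):
    """Compress a length-2^n state vector into an MPS with bond dimension
    at most chi via successive truncated SVDs, then contract it back to a
    dense (re-normalized) vector so the approximation can be inspected."""
    n = int(np.log2(vec.size))
    tensors = []
    rest = vec.reshape(1, -1)                  # (left bond, remaining sites)
    for _ in range(n - 1):
        bond = rest.shape[0]
        u, s, vh = np.linalg.svd(rest.reshape(bond * 2, -1),
                                 full_matrices=False)
        k = min(chi, s.size)                   # truncate the bond dimension
        tensors.append(u[:, :k].reshape(bond, 2, k))
        rest = s[:k, None] * vh[:k]            # absorb singular values rightwards
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    out = np.ones((1, 1))                      # contract the MPS back
    for t in tensors:
        out = np.einsum('pb,bqk->pqk', out, t).reshape(-1, t.shape[2])
    approx = out.ravel()
    return approx / np.linalg.norm(approx)

n = 8
grid = np.arange(2 ** n)
x = np.exp(-0.5 * ((grid - 2 ** (n - 1)) / 30.0) ** 2)  # smooth toy signal
x /= np.linalg.norm(x)
for chi in (1, 2, 16):
    print(chi, abs(x @ mps_compress(x, chi)) ** 2)      # fidelity grows with chi
```

Larger bond dimension keeps more entanglement and more fidelity, at the cost of more memory (and, later, a deeper circuit).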

Data Compression by Matrix Product States (MPS)

How to use MPS-compressed amplitude encoding for QML?
1. Classically optimize the MPS approximation.
2. Bring the MPS into right-canonical form.
3. Map the MPS to a sequential quantum circuit.
(Rohit Dilip et al., "Data Compression for Quantum Machine Learning")

Circuit gate count:
- Polynomial in the number of qubits n and the bond dimension χ, compared to O(2^n) for exact encoding.

Key question:
- How big does χ have to be? Better not be exponential in n.
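
Step 2 can be made concrete: right-canonical form means every MPS tensor satisfies an isometry condition, which is what allows each tensor to be completed into a unitary gate of the sequential circuit. A NumPy sketch with my own helper names, not the authors' implementation:

```python
import numpy as np

def right_canonical_mps(vec):
    """Decompose a normalized length-2^n vector into a right-canonical MPS
    by sweeping SVDs from the rightmost site to the left."""
    n = int(np.log2(vec.size))
    tensors = []
    rest = vec.reshape(-1, 1)                  # (remaining sites, right bond)
    for _ in range(n - 1):
        bond = rest.shape[1]
        u, s, vh = np.linalg.svd(rest.reshape(-1, 2 * bond),
                                 full_matrices=False)
        tensors.insert(0, vh.reshape(-1, 2, bond))
        rest = u * s                           # absorb singular values leftwards
    tensors.insert(0, rest.reshape(1, 2, -1))
    return tensors

rng = np.random.default_rng(1)
psi = rng.normal(size=16)
psi /= np.linalg.norm(psi)

mps = right_canonical_mps(psi)
# Right-canonical condition: summing over the physical index,
# B B^dagger = identity on the left bond - i.e. each tensor is an isometry
# and can be embedded into a unitary gate of the sequential circuit.
is_isometry = [
    np.allclose(B.reshape(B.shape[0], -1) @ B.reshape(B.shape[0], -1).conj().T,
                np.eye(B.shape[0]))
    for B in mps
]
print(all(is_isometry))  # True
```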

Question 1: Why can we assume low bond dimension?

Amplitude encoding and Flexible Representation of Quantum Images (FRQI) [1] of real-life images have low entanglement -> low bond dimension.

FRQI preserves global color scaling; amplitude encoding does not.

1. Phuc Q. Le et al., "A flexible representation of quantum images for polynomial preparation, image compression, and processing operations"
(Our work #1) Bernhard Jobst et al., "Efficient MPS representations and quantum circuits from Fourier modes of classical image data"
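
To make the color-scaling remark concrete, here is a small NumPy sketch of the FRQI construction; the qubit ordering, helper names, and toy image are my own choices, not from the talk:

```python
import numpy as np

def frqi_state(image):
    """FRQI state of a 2^n x 2^n grayscale image: pixel values in [0, 1]
    become angles theta_i in [0, pi/2], and the state is
    (1/2^n) * sum_i (cos(theta_i)|0> + sin(theta_i)|1>) |i>."""
    theta = image.ravel() * np.pi / 2
    # Color qubit stored as the least significant index (a convention choice).
    amps = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    return amps.ravel() / np.sqrt(theta.size)

def amplitude_state(image):
    """Plain amplitude encoding: normalize the pixel vector."""
    x = image.ravel().astype(float)
    return x / np.linalg.norm(x)

img = np.array([[0.2, 0.8], [0.5, 1.0]])
print(np.isclose(np.linalg.norm(frqi_state(img)), 1.0))   # True: valid state

# Global color scaling: amplitude encoding normalizes it away, FRQI keeps it.
print(np.allclose(amplitude_state(img), amplitude_state(0.5 * img)))  # True
print(np.allclose(frqi_state(img), frqi_state(0.5 * img)))            # False
```

Darkening every pixel by the same factor leaves the amplitude-encoded state unchanged, while the FRQI angles (and hence the state) shift, so overall brightness survives the encoding.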

[Figure: Fashion-MNIST images, exact encoding vs. MPS-compressed versions at reduced bond dimension]
(Our work #1) Bernhard Jobst et al., "Efficient MPS representations and quantum circuits from Fourier modes of classical image data"

We derived an error bound: regardless of the image resolution (which determines how many qubits are needed), an MPS of fixed bond dimension achieves a bounded approximation error (infidelity), provided the data have algebraically decaying Fourier coefficients - a condition satisfied by most natural images (see figure).
(Our work #1) Bernhard Jobst et al., "Efficient MPS representations and quantum circuits from Fourier modes of classical image data"

Remaining question: how to compute the MPS?
- SVD, stochastic methods, interpolation methods, …
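
The flavor of the bound can be reproduced in a toy 1D setting: for a signal with algebraically decaying Fourier coefficients, the error from keeping a fixed number of modes is essentially independent of the sampling resolution. The toy signal and numbers below are my own illustration, not from the paper:

```python
import numpy as np

def truncation_error(N, modes):
    """Relative L2 error from keeping only the lowest `modes` Fourier
    coefficients of a periodic signal sampled at N points."""
    t = np.arange(N) / N
    x = np.abs(np.sin(np.pi * t))    # kinks -> algebraically decaying modes
    c = np.fft.rfft(x)
    c_trunc = np.zeros_like(c)
    c_trunc[:modes] = c[:modes]      # keep only the lowest modes
    x_hat = np.fft.irfft(c_trunc, n=N)
    return np.linalg.norm(x - x_hat) / np.linalg.norm(x)

# The error depends on how many modes are kept, not on the resolution N:
for N in (256, 1024, 4096):
    print(N, truncation_error(N, modes=8))
```

Quadrupling the resolution adds qubits but leaves the truncation error essentially unchanged, mirroring the resolution-independent infidelity of a fixed-bond-dimension MPS.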

Question 2: Can we make it more experiment-friendly?

We consider MPS-inspired sequential circuits with:
1. Repeating layers of general two-qubit gates
2. Repeating layers of hardware-native gates

(Our work #2) Kevin Shen et al., "Classification of the Fashion-MNIST dataset on a Quantum Computer"
(Rohit Dilip et al., "Data Compression for Quantum Machine Learning")

We want to observe the tradeoff between efficiency and accuracy in experiments:
- More expressive -> more accurate
- Simpler circuit -> fewer hardware errors

(Our work #2) Kevin Shen et al., "Classification of the Fashion-MNIST dataset on a Quantum Computer"

Experiment setup
- We compressed the full Fashion-MNIST dataset (10.5281/zenodo.10680772)
- We classically simulated the training of a 10-class variational quantum classifier
- We deployed the variational classifier on an IBMQ 27-qubit chip (no longer available)

(Our work #2) Kevin Shen et al., "Classification of the Fashion-MNIST dataset on a Quantum Computer"

[Figure: classification accuracy results]
- Red: simpler ansatz; Blue: more complex ansatz
- Solid: on simulator; Dotted: on IBMQ; Dashed: exact FRQI encoding

(Our work #2) Kevin Shen et al., "Classification of the Fashion-MNIST dataset on a Quantum Computer"

What can we do together? Think more about data encoding:
- Encodings for different data structures and different applications
- Resource efficiency
- Experimental implementation
- …

Thank you for listening