QFedFisher

QFedFisher in Quantum Federated Learning: Fisher Information Unlocks Improved Model Performance and Privacy

Researchers Amandeep Singh Bhatia and Sabre Kais of North Carolina State University, along with their colleagues, have introduced a novel Quantum Federated Learning (QFL) algorithm that significantly enhances model performance and robustness while protecting sensitive data, marking a notable advance for decentralized machine learning. The method, called QFedFisher, overcomes long-standing difficulties in federated learning by using Fisher information to identify and preserve the most important parameters in quantum models.


Federated Learning (FL), which allows multiple clients to jointly train a global model without revealing their raw, sensitive data, has quickly gained popularity across a variety of industries, including healthcare and finance. Through iterative communication between a central server and participating clients, this decentralized training improves data security and privacy by exchanging only model parameters.

Significant obstacles remain to the practical application of FL: high communication costs, lengthy processing times, heightened susceptibility to privacy risks, and the substantial difficulties posed by heterogeneous client data, where data distributions are not independent and identically distributed (non-IID). Although traditional federated averaging (FedAvg) works well with IID data, it struggles to converge and sustain high accuracy in many real-world non-IID settings.

Quantum Federated Learning (QFL) has rapidly evolved as a result of the intriguing opportunities created by the convergence of FL with parameterized quantum circuits. Beyond the constraints of individual quantum nodes, QFL seeks to leverage the collective strength of distributed quantum resources. It has demonstrated promise in a number of industries, including manufacturing, healthcare, and finance. Notable examples in federated environments are variational quantum circuits (VQCs) and quantum neural networks.


The core innovation of the QFedFisher method is its use of Fisher information, which measures how much information a quantum state carries about parameter changes and offers vital insight into its geometric and statistical features. By computing Fisher information on local client models, the technique efficiently identifies the critical parameters that most strongly affect the quantum model's performance, guaranteeing their preservation during the critical aggregation phase. This facilitates the successful integration of diverse client datasets into a single global model, overcoming the difficulties presented by data heterogeneity.

In the first step of designing a variational quantum classifier (VQC) in QFedFisher, classical data is typically encoded into a quantum state via amplitude encoding, which maps N-dimensional input data onto the amplitudes of an n-qubit quantum state (with n on the order of log₂ N). A VQC composed of entangling CNOT gates and single-qubit rotations (RY and RX) is then applied, and during training a classical optimiser adjusts these parameterised rotations to reduce a predetermined loss function.
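As an illustration, the encoding and circuit just described can be sketched with a small NumPy state-vector simulation. This is a minimal sketch assuming RY rotations followed by a chain of CNOTs and a Pauli-Z readout on the first qubit; the exact circuit layout and measurement in the paper may differ.

```python
import numpy as np

def amplitude_encode(x):
    """Map a length-2**n vector onto the amplitudes of an n-qubit state."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    ops = [np.eye(2)] * n
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def apply_cnot(state, control, target, n):
    """Flip the target bit of every basis state whose control bit is 1."""
    new = state.copy()
    cmask, tmask = 1 << (n - 1 - control), 1 << (n - 1 - target)
    for i in range(len(state)):
        if i & cmask:
            new[i] = state[i ^ tmask]
    return new

def vqc_layer(state, thetas, n):
    """One variational layer: an RY rotation on each qubit, then a CNOT chain."""
    for q in range(n):
        state = apply_single(state, ry(thetas[q]), q, n)
    for q in range(n - 1):
        state = apply_cnot(state, q, q + 1, n)
    return state

def expect_z0(state, n):
    """Expectation of Pauli-Z on qubit 0, a simple readout for classification."""
    probs = np.abs(state) ** 2
    signs = np.array([1.0 if not (i >> (n - 1)) & 1 else -1.0
                      for i in range(len(state))])
    return float(probs @ signs)
```

In practice the rotation angles would be the trainable parameters updated by the classical optimiser, with the ⟨Z⟩ readout feeding the loss function.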


During local training, each client i updates its local quantum circuit parameters using the ADAM optimiser and a cross-entropy loss function. Importantly, computing the Fisher information vector for each parameter provides a parameter-specific measure of sensitivity. The Fisher information matrix undergoes layer-wise min-max normalisation before the model parameters and Fisher information are sent to the global server.
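These two client-side quantities can be sketched compactly. The snippet below assumes the diagonal empirical Fisher, estimated as the mean squared per-sample gradient of the log-likelihood; the paper's exact estimator may differ.

```python
import numpy as np

def empirical_fisher_diag(per_sample_grads):
    """Diagonal empirical Fisher: mean squared per-sample gradient of the
    log-likelihood with respect to each circuit parameter."""
    g = np.asarray(per_sample_grads, dtype=float)  # shape (num_samples, num_params)
    return np.mean(g ** 2, axis=0)

def layerwise_minmax(fisher, layer_slices):
    """Min-max normalise the Fisher vector independently within each layer,
    so every layer's scores land in [0, 1] before being sent to the server."""
    out = np.zeros_like(fisher)
    for sl in layer_slices:
        f = fisher[sl]
        lo, hi = f.min(), f.max()
        out[sl] = (f - lo) / (hi - lo) if hi > lo else 0.0
    return out
```

Normalising per layer rather than globally keeps layers with naturally smaller gradients from being uniformly flagged as unimportant.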

The global server coordinates client updates with a three-step procedure:

  • Weighted Average: Every client’s contribution is weighted by the size of its dataset, and the server first calculates a weighted average (θ_avg) of all clients’ model parameters.
  • Fisher-Average Gradients and Update: Next, using the clients’ Fisher information, it computes the element-wise sum of the Fisher information matrices (F_s), the Fisher-average gradients (G_s), and a Fisher-weighted sum of model parameters. From these, a revised global model parameter vector (θ_s^r) is computed.
  • Parameter Substitution: Lastly, QFedFisher uses a predetermined Fisher threshold (δ) to identify less significant parameters. A parameter is replaced with its value from the weighted average (θ_avg,j) if its total Fisher information (F_s,j) is below this cutoff; otherwise the Fisher-updated value is kept. This step prevents vital local client contributions from being overwritten by noisier, less informative global values.
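The three server-side steps above can be sketched as follows. This is one plausible reading of the aggregation rule, with a simple Fisher-weighted parameter combination standing in for the paper's exact θ_s^r update; the function name and formula are assumptions for illustration.

```python
import numpy as np

def qfedfisher_aggregate(thetas, fishers, sizes, delta=0.01):
    """Sketch of the three-step server aggregation described above."""
    thetas = np.asarray(thetas, dtype=float)   # (num_clients, num_params)
    fishers = np.asarray(fishers, dtype=float)
    sizes = np.asarray(sizes, dtype=float)

    # Step 1: dataset-size weighted average of all clients' parameters.
    w = sizes / sizes.sum()
    theta_avg = w @ thetas

    # Step 2: sum the Fisher matrices and form a Fisher-weighted parameter
    # combination (assumed stand-in for the paper's revised global update).
    F_s = fishers.sum(axis=0)
    theta_fisher = (fishers * thetas).sum(axis=0) / np.maximum(F_s, 1e-12)

    # Step 3: substitute low-information parameters with the plain average;
    # keep the Fisher-updated value where F_s exceeds the threshold.
    return np.where(F_s < delta, theta_avg, theta_fisher)
```

Note how the threshold δ acts element-wise: each parameter independently falls back to the weighted average only when the clients collectively report little Fisher information for it.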


Extensive experiments were conducted on two real-world non-IID datasets to evaluate the efficacy and practicality of QFedFisher: MNIST for multi-class digit recognition and ADNI for binary classification (Alzheimer’s disease vs. normal cognition). The proposed method was compared against state-of-the-art baselines, QFedAvg and QFedAdam.

Over 100 communication rounds, QFedFisher consistently outperformed QFedAvg and QFedAdam in terms of accuracy and convergence speed for the ADNI dataset, which was unevenly distributed among 10 clients. Despite the difficult data distribution, it successfully distinguished between normal MRI scans and those linked to Alzheimer’s disease, achieving a testing accuracy of 89.9% (Table 1).

For the ADNI dataset, QFedFisher’s additional client-side computational cost was less than 12.1% of QFedAvg’s. Similarly, for the MNIST dataset, which was distributed among 100 clients (with 5% participating per round over 300 rounds), the global QFedFisher model outperformed the baselines, converging faster and reaching higher accuracy.


By prioritising the preservation of parameters whose Fisher information exceeded the threshold (δ = 0.01), QFedFisher achieved a testing accuracy of 91.2%. The additional computational expense of determining Fisher information was found to be manageable for most realistic QFL applications, typically accounting for less than 15% of the overall time required by the QFedAvg approach.

In summary, by exploiting the inherent geometry of the parameter space in quantum systems, incorporating Fisher information into quantum federated learning offers a principled approach to client model optimization. Despite data disparity, QFedFisher effectively manages the difficulties posed by client heterogeneity, ensuring that the aggregated model benefits from balanced contributions.

The experimental results clearly show that QFL using layer-wise Fisher information of quantum circuits is more robust and achieves better testing accuracy than current approaches within a fixed number of communication rounds. Building on Fisher information’s capacity to identify and safeguard sensitive parameters during model aggregation, the researchers intend to extend this work by integrating privacy-preserving strategies.

