Using Quantum Computers

Goal

Set up and run a hybrid quantum/classical workflow where selected parts of the computation (e.g., sampling) run on actual quantum hardware, while the rest of the computation runs on classical infrastructure.

When to Use Real Hardware

Running on real quantum hardware is the main goal of using Kvantify Qrunch. However, real devices are noisy and incur usage costs. We recommend:

  • Using a simulator for development and debugging;

  • Switching to real hardware only for final benchmarking and validation;

  • Keeping estimators on a simulator unless you specifically need to measure them on hardware.

As hardware improves and costs come down, more parts of the workflow can be moved onto real quantum devices. See Running FAST-VQE on Quantum Computers for details on how to perform a cost analysis before running on hardware.

Semi-Remote Execution

Kvantify Qrunch supports semi-remote execution simply by selecting a Braket backend device. See Choose a Backend for details on how to select a quantum backend device.

The backend can be used with the sampler and/or the estimator. The classical part of the calculation (namely, the parameter optimization) then runs locally, while sampling and/or estimation runs on the selected quantum device.

import qrunch as qc
from braket.devices import Devices


def my_script() -> None:
    # Read the xyz file and construct a molecule object.
    molecular_configuration = qc.build_molecular_configuration(
        molecule="some_xyz_file.xyz",
        basis_set="sto3g",
    )
    ground_state_problem_builder = (
        qc.problem_builder_creator()
        .ground_state()
        .standard()
        .create()
    )
    problem = ground_state_problem_builder.build_restricted(molecular_configuration)

    # Real quantum hardware for sampling
    quantum_computer_sampler = (
        qc.sampler_creator()
        .backend()
        .choose_backend()
        .amazon_braket(device=Devices.IQM.Garnet)  # Use IQM Garnet for sampling. This line alone is enough to run on real hardware.
        .create()
    )

    # Simulator for expectation values (keeps cost/time down)
    estimator = (
        qc.estimator_creator()
        .excitation_gate()  # Use the excitation gate estimator
        .create()
    )

    gate_selector = (
        qc.gate_selector_creator()
        .fast()
        .with_sampler(quantum_computer_sampler)  # Use IQM Garnet for sampling
        .with_shots(shots=1000)                  # Finite-shot sampling keeps cost down.
        .create()
    )

    fast_vqe_calculator = (
        qc.calculator_creator()
        .vqe()
        .iterative()
        .standard()
        .with_gate_selector(gate_selector)
        .with_estimator(estimator)
        .with_estimator_shots(shots=None)  # Unlimited shots on the simulator
        .create()
    )

    result = fast_vqe_calculator.calculate(problem)

    expectation_values = result.total_energy_per_macro_iteration_with_initial_energy_and_final_energy
    energies = [exp.value for exp in expectation_values]
    errors = [exp.error for exp in expectation_values]

    print("   Energies: ", energies)
    print("   Errors:   ", errors)


if __name__ == "__main__":
    my_script()

The most important line in the code above is `.amazon_braket(device=Devices.IQM.Garnet)`, which tells Kvantify Qrunch to use the IQM Garnet quantum computer for sampling. This single line is enough to run sampling on real hardware.

Remote Execution

Alternatively, you can run the entire workflow as a hybrid job on AWS Braket, thus performing fully remote execution. Below is a minimal hybrid job setup that uses IQM Garnet for sampling and a simulator for expectation value estimation.

Running this example requires that you have uploaded the input files (here, the xyz geometry) to S3. Log on to https://console.aws.amazon.com, go to S3, and upload the files.
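
If you prefer to script the upload rather than use the console, a minimal sketch using `boto3` could look like the following; the bucket name, object key, and local file name are placeholders you must replace with your own:

```python
from pathlib import Path

# Placeholder bucket name, object key, and local file; substitute your own.
s3_data_bucket = "amazon-braket-us-east-1-ID"
s3_key = "data/acrolein_water.xyz"
local_xyz_file = Path("acrolein_water.xyz")

# Guarded so the sketch stays inert until AWS credentials are configured.
PERFORM_UPLOAD = False

if PERFORM_UPLOAD:
    from boto3 import client  # type: ignore
    from boto3.s3.transfer import S3Transfer

    transfer = S3Transfer(client("s3", "us-east-1"))
    transfer.upload_file(str(local_xyz_file), s3_data_bucket, s3_key)
```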

You also need to insert your KVANTIFY_INDEX_URL in the code below (as the `cloudsmith_url`). It is the same URL you used to download Qrunch; see Getting Started.

Hybrid Job example: VQE with IQM Garnet for Sampling
from pathlib import Path

from boto3 import client  # type: ignore
from boto3.s3.transfer import S3Transfer
from braket.devices import Devices

import qrunch as qc

# Setup S3 connection for retrieving data.
# S3 stands for Amazon Simple Storage Service, a cloud storage service provided by AWS.
s3_data_bucket = "amazon-braket-us-east-1-ID"
s3_client = client("s3", "us-east-1")
transfer = S3Transfer(s3_client)

# Create a local file for xyz.
local_tmp_xyz_file = Path("/tmp/tmpfile.xyz")
local_tmp_xyz_file.touch()

quantum_computer = Devices.IQM.Garnet


def my_script() -> dict[str, list[float]]:
    """Script to be run as a hybrid job on AWS Braket."""
    # Download data from S3 to a temporary local file
    transfer.download_file(s3_data_bucket, "data/acrolein_water.xyz", str(local_tmp_xyz_file))

    # Read the xyz file, and construct a molecule object.
    molecular_configuration = qc.build_molecular_configuration(
        molecule=local_tmp_xyz_file,
        basis_set="sto3g",
    )
    ground_state_problem_builder = qc.problem_builder_creator().ground_state().standard().create()
    problem = ground_state_problem_builder.build_restricted(molecular_configuration)

    # Real Quantum Hardware for sampling
    quantum_computer_sampler = (
        qc
        .sampler_creator()
        .backend()
        .choose_backend()
        .amazon_braket(device=quantum_computer)  # Use IQM Garnet for sampling.
        .create()
    )

    # Simulator for expectation values (keeps cost/time down)
    estimator = (
        qc
        .estimator_creator()
        .excitation_gate()  # Use excitation gate estimator
        .create()
    )

    gate_selector = (
        qc
        .gate_selector_creator()
        .fast()
        .with_sampler(quantum_computer_sampler)  # use IQM Garnet for sampling
        .with_shots(shots=1000)  # finite-shots sampling keeps cost down.
        .create()
    )

    fast_vqe_calculator = (
        qc
        .calculator_creator()
        .vqe()
        .iterative()
        .standard()
        .with_gate_selector(gate_selector)
        .with_estimator(estimator)
        .with_estimator_shots(shots=None)  # Unlimited shots on simulator
        .create()
    )

    result = fast_vqe_calculator.calculate(problem)

    expectation_values = result.total_energy_per_macro_iteration_with_initial_energy_and_final_energy
    energies = [exp.value for exp in expectation_values]
    errors = [exp.error for exp in expectation_values]

    output: dict[str, list[float]] = {
        "energies": energies,
        "errors": errors,
    }
    return output


if __name__ == "__main__":
    # ===== Start ===== "Remote simulation" =====
    AWS_REGION = "us-east-1"
    docker_image = qc.docker_image_creator(
        region_name=AWS_REGION,
        image_name="hybrid_job_runner",
        cloudsmith_url="{example_url}",  # The index url used when installing qrunch from the cloudsmith repository.
    ).create()

    hybrid_runner = qc.HybridJobRunner(
        result_folder=Path.home() / "result/folder",
        device=quantum_computer,
        repository_name="qrunch",
        docker_image=docker_image,
        region=AWS_REGION,
    )
    output_dict = hybrid_runner.run_hybrid_job(my_script)
    # ===== End ===== "Remote simulation" =====

  • The function `my_script` contains your calculation logic

  • The HybridJobRunner packages this logic and submits it to AWS Braket as a hybrid job

  • The Kvantify Qrunch backend detects which components should run on the quantum computer

  • In the above example:

    • Sampling runs on IQM Garnet

    • Estimation runs locally on a simulator

  • Results are stored in the specified `result_folder`
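
The `output_dict` returned by `run_hybrid_job` mirrors the dictionary built at the end of `my_script`. For illustration, with hypothetical numbers:

```python
# Hypothetical values standing in for the result of
# hybrid_runner.run_hybrid_job(my_script).
output_dict = {
    "energies": [-151.9, -152.1, -152.2],
    "errors": [0.02, 0.01, 0.01],
}

# The last entry is the final energy with its statistical error.
final_energy = output_dict["energies"][-1]
final_error = output_dict["errors"][-1]
print(f"Final energy: {final_energy} +/- {final_error}")
```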

Best Practices

  1. Separate hardware and simulation tasks: keep only the necessary quantum tasks on hardware to minimize queue time and cost

  2. Limit the number of shots: each shot on hardware takes time and increases cost; use just enough shots for the desired statistical precision

  3. Test locally first: use the same backend but with a simulator device to verify that your script works before running on hardware

  4. Plan for queue delays: some devices may have long queues; monitor job status in the AWS Braket console
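
Best practices 1 and 3 can often be reduced to a single flag: keep the builder chain unchanged and swap only the device argument. A minimal sketch, assuming Braket's SV1 on-demand state-vector simulator (`Devices.Amazon.SV1`) as the stand-in device; the helper name `make_sampler` is illustrative, not part of the Qrunch API:

```python
def make_sampler(use_hardware: bool):
    """Build the sampler, targeting real hardware only when explicitly asked."""
    import qrunch as qc
    from braket.devices import Devices

    # Everything except the device argument is identical to the hardware
    # version shown earlier on this page.
    device = Devices.IQM.Garnet if use_hardware else Devices.Amazon.SV1
    return (
        qc.sampler_creator()
        .backend()
        .choose_backend()
        .amazon_braket(device=device)
        .create()
    )
```

Develop and debug with `make_sampler(use_hardware=False)`, then flip the flag for the final hardware run.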

See Also