Transcript
Hi everyone. I'm Sai K. It's a pleasure to be here at Conf42 to share what I've learned about quantum computing today.
So when I first began exploring this field, I vividly recall reading a paper that described quantum speedups as almost magical, science fiction made real. I felt excitement mingled with skepticism, of course; who wouldn't, right? Could these fragile devices truly deliver what they're promising over time?
Small experiments convinced me there is substance beneath the hype.
Today we live in the NISQ era, where hardware remains imperfect but already offers tangible benefits on scoped problems.
Over the next 20 to 30 minutes, I'll guide you through where we stand, why quantum matters, concrete examples across industries, and how to prepare for what lies ahead.
Let's begin by setting the scene, revisiting how we arrived here and why even early experiments matter.
So let's actually move on to where we stand today.
What is the current quantum computing landscape?
So classical computing has driven incredible advances in smartphones, cloud services, and machine learning, but certain challenges grow exponentially harder as scale increases.
For example, the traveling salesman problem. Of course, who doesn't remember the traveling salesman problem? When you're reading about computer science, this is one of the most head-scratching problems you come across, right? With a few cities, of course, it's trivial. Take two or three cities, it's not that hard. But as you add more and more cities, the possibilities literally explode. Or take molecular simulation: modeling electron interactions in even moderately sized molecules can quickly overwhelm classical machines because of the sheer number of electrons and, frankly speaking, how they interact with each other.
Quantum computing introduces superposition and entanglement, letting us explore many possibilities in parallel and capture correlations beyond classical reach, which is exactly what we want.
And today's NISQ hardware gives us tens to low hundreds of qubits, noisy operations, and limited coherence times. Despite these constraints, creative algorithms and hybrid workflows often yield advantages on scoped problems. Again, as I said, the problems need to be scoped.
I remember a pilot where a small team simulated a simple molecule. Raw hardware results were very noisy, yet after combining them with classical approximations and error mitigation, they guided a chemist away from a costly lab test.
Of course, we cannot afford to make expensive mistakes these days
with so much technology available, and as such, these things matter.
That success, even though it felt modest, just a fraction of a larger pipeline, actually built confidence and skills.
Understanding this context prevents overhype. We pursue quantum now because even small wins build expertise and position us for larger breakthroughs.
Of course, this is exactly how the Industrial Revolution started as well, right? So think of it like learning to sail: you start with a small boat in calm waters before you tackle the open ocean.
Now let's actually dive deep into the core quantum principles in practice.
So at the heart of the quantum advantage lie superposition and entanglement.
So again, as you can see, there's a lot of jargon here, and frankly speaking, it gets overwhelming when you're trying to learn new stuff. Of course, I can understand.
So let's take a simple example. Imagine spinning a coin until it lands: you don't know whether it is heads or tails. Similarly, a qubit can hold both possibilities simultaneously. When qubits entangle, their states link so that measuring one affects the whole, allowing representation of complex correlations that classical bits cannot efficiently encode.
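To make the coin analogy concrete, here is a minimal sketch of the idea in code, assuming Qiskit is installed; the two-qubit circuit below prepares a Bell state, the textbook example of entanglement.

```python
# Minimal sketch of superposition and entanglement, assuming Qiskit is available.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into an equal superposition (the spinning coin)
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

# The joint state is (|00> + |11>)/sqrt(2): measuring one qubit fixes the other.
print(Statevector(qc).probabilities_dict())  # ~{'00': 0.5, '11': 0.5}
```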
So in practice, algorithms like the quantum approximate optimization algorithm (QAOA) encode an optimization objective into a circuit that samples candidate solutions. Though current devices are noisy, running shallow circuits and combining outputs with classical post-processing can reveal better solutions than classical heuristics.
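As a rough illustration of that sample-and-post-process pattern, here is a purely classical sketch; the couplings, fields, and the random sampler below are all stand-ins (in a real QAOA run, the samples would come from the variational circuit).

```python
# Sketch of the sample-and-post-process pattern, with random bits standing in
# for the quantum sampler; J and h define a toy Ising objective (made up).
import random

J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.8}   # pairwise couplings (assumed)
h = {0: 0.1, 1: -0.2, 2: 0.3}                  # per-spin fields (assumed)

def cost(spins):
    # Ising energy: sum_ij J_ij s_i s_j + sum_i h_i s_i, with s_i in {-1, +1}
    e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e + sum(hi * spins[i] for i, hi in h.items())

# In a real run, these samples would come from a shallow variational circuit.
samples = [[random.choice([-1, 1]) for _ in range(3)] for _ in range(200)]
best = min(samples, key=cost)   # classical post-processing keeps the best
print(best, cost(best))
```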
I recall sitting in a meeting where engineers framed a logistics routing problem as a quantum test. They took a region with about a dozen stops, encoded cost functions into a variational circuit, and ran hardware experiments.
Raw outputs were super noisy and inconsistent.
But after collecting enough samples and applying mitigation techniques to handle the noise, they observed candidate routes that the classical solvers had not found. So they tested those routes in a simulation and saw slight improvements in total travel time.
Again, I don't want to exaggerate the advantage here, because these are small wins that we are accounting for right now. It wasn't earth-shattering, as I was just saying, but it demonstrated that quantum could contribute something new.
So again, it's a well-defined slice of a problem. As you can see, the variables are defined, the scope is defined, and so on and so forth.
That practical doorway, identifying a manageable piece that matters, is where NISQ-era value emerges, because we definitely want to keep working with classical computing, while making sure we do slightly better where possible.
Now let's actually see how we work within NISQ constraints. What challenges are we finding, and how do we navigate around them?
So real hardware today isn't flawless: noise corrupts calculations, and qubit counts limit problem size. Instead of seeing this purely as a barrier, let's actually choose problems tailored to NISQ machines.
For example, small-scale portfolio optimization, simplified molecular models, or routing sub-problems in logistics fit within tens of qubits and shallow circuits. Again, you don't want to go into hundreds of qubits, because devices are limited for now. Error mitigation techniques such as zero-noise extrapolation or randomized compiling help extract meaningful results despite the noise.
I remember a team running the same circuit at different gate durations to amplify noise and then extrapolating the data back toward a zero-noise estimate. The results aligned far more closely with the classical benchmarks than the raw readings ever did.
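A minimal sketch of that zero-noise extrapolation idea, with made-up numbers standing in for the measured expectation values:

```python
# Sketch of zero-noise extrapolation: measure an expectation value at stretched
# noise levels, then extrapolate back to zero. The values below are illustrative.
import numpy as np

noise_scale = np.array([1.0, 2.0, 3.0])      # e.g., 1x, 2x, 3x gate durations
expectation = np.array([0.71, 0.55, 0.42])   # hypothetical noisy measurements

# Fit a line (Richardson-style) and read off the value at zero noise.
slope, intercept = np.polyfit(noise_scale, expectation, deg=1)
print("zero-noise estimate:", intercept)
```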
These steps require extra circuit runs, of course; you want to make sure there are enough samples, plus extra classical computation, but they transform noisy outputs into actionable insights.
Hybrid workflows prove essential, because this is where, I think, we don't want to say it's either/or; we want a hybrid approach. Classical pre-processing narrows the candidate set. Once you've narrowed the candidates, quantum circuits refine or evaluate them.
Classical post-processing interprets the outputs. In chemistry, say, you shortlist a few dozen candidates and then use quantum subroutines to estimate energies more precisely for that specific shortlist, guiding experimental decisions. In logistics, a global routing problem might be split into several regional slices: classical solvers handle the broad layout, while quantum pilots refine the specific legs under each slice's constraints.
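Structurally, that hybrid pipeline looks something like the sketch below; every function here is a hypothetical placeholder with toy logic, not a real library call.

```python
# Skeleton of the hybrid pattern: classical narrowing, quantum scoring,
# classical interpretation. All functions are illustrative placeholders.
def classical_prefilter(candidates):
    # classical heuristics narrow the search to a short list
    return sorted(candidates)[:20]

def quantum_score(candidate):
    # stand-in for a quantum subroutine that refines or evaluates a candidate
    return sum(candidate)

def classical_postprocess(scored):
    # interpret outputs and pick the best candidate
    return min(scored, key=lambda pair: pair[1])

candidates = [(i, i % 7) for i in range(100)]        # toy candidate set
shortlist = classical_prefilter(candidates)
scored = [(c, quantum_score(c)) for c in shortlist]
best, best_score = classical_postprocess(scored)
print(best, best_score)
```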
So embracing constraints creatively is exactly what we are looking for.
So now let's take a couple of examples, right? How do we handle quantum computing in the financial services landscape?
In financial services, optimization and risk assessment drew very early interest, because that is exactly what we look for whenever anybody speaks of financial services. Consider portfolio optimization.
Classical solvers may take hours or days for large portfolios.
A quantum-inspired solver ran mid-sized instances faster and suggested allocations that improved risk-adjusted returns modestly in pilot tests. Again, these are purely pilot tests.
In one case, a wealth management firm defined a pilot with 30 to 40 assets, integrating expected returns and covariance data. They ran variational circuits on simulators and small hardware to sample candidate allocations, and then compared them to classical solutions.
The quantum-assisted results sometimes suggested slight shifts that, when back-tested, showed a few basis points of improved Sharpe ratio. Those gains translate to meaningful dollar value when assets under management scale into the hundreds of millions.
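A toy version of that kind of formulation, with illustrative numbers rather than real market data: expected returns and a covariance matrix combine into a single cost over binary include/exclude decisions, which a quantum sampler would then target.

```python
# Toy portfolio cost: risk minus return over binary asset choices.
# All numbers are made up for illustration.
import itertools
import numpy as np

mu = np.array([0.08, 0.12, 0.10, 0.07])           # expected returns (assumed)
sigma = np.array([[0.10, 0.02, 0.01, 0.00],
                  [0.02, 0.12, 0.03, 0.01],
                  [0.01, 0.03, 0.09, 0.02],
                  [0.00, 0.01, 0.02, 0.08]])      # covariance matrix (assumed)
risk_aversion = 0.5

def cost(x):
    x = np.array(x)
    return risk_aversion * x @ sigma @ x - mu @ x  # risk term minus return term

# Small enough to brute-force here; a quantum sampler would target the same cost.
best = min(itertools.product([0, 1], repeat=4), key=cost)
print(best, cost(best))
```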
So another area is Monte Carlo risk analysis. Classical Monte Carlo may require millions of samples to estimate tail risks. Quantum amplitude estimation promises reduced sampling requirements in ideal conditions. So while full speedups await more mature hardware, NISQ-era hybrid sampling methods have cut simulation time modestly by integrating short-depth quantum circuits to estimate probabilities more efficiently.
A risk team ran a pilot feeding random paths into quantum subroutines to assess extreme loss probabilities. While not a full speedup yet, it sharpened tail-risk estimates in a fraction of the classical-only pipeline's time.
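The promise, under idealized assumptions, is roughly quadratic: classical Monte Carlo error shrinks like one over the square root of the sample count, while amplitude estimation error shrinks roughly like one over the number of oracle queries. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope scaling comparison, under idealized assumptions and
# ignoring constants: Monte Carlo ~ 1/sqrt(N), amplitude estimation ~ 1/M.
import math

target_error = 1e-3
mc_samples = math.ceil(1 / target_error**2)   # ~1,000,000 classical samples
qae_queries = math.ceil(1 / target_error)     # ~1,000 oracle queries
print(mc_samples, qae_queries)
```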
Fraud detection also benefits. Encoding transaction data into quantum kernel methods can highlight subtle correlations that classical methods usually miss. In one proof of concept, analysts encoded features of transaction graphs onto quantum circuits and used the outputs in a classifier. The results caught patterns that improved detection precision slightly.
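To give a flavor of the quantum kernel idea, here is a one-qubit feature map simulated classically; the feature values are illustrative, and a real pilot would use a richer, hardware-executed feature map.

```python
# Sketch of a quantum kernel, simulated classically for one qubit:
# K(x, y) = |<psi(x)|psi(y)>|^2, with psi(x) = RY(x)|0>.
import numpy as np

def feature_state(x):
    # RY(x) applied to |0> gives real amplitudes [cos(x/2), sin(x/2)]
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def kernel(x, y):
    return abs(feature_state(x) @ feature_state(y)) ** 2

features = [0.1, 0.5, 2.9, 3.1]   # e.g., scaled transaction features (assumed)
K = np.array([[kernel(a, b) for b in features] for a in features])
print(K.round(3))   # the Gram matrix a classical classifier (e.g., SVM) can use
```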
Again, as I said, small wins matter.
These remain proofs of concept, absolutely, but they teach teams how to integrate quantum subroutines, validate outputs, and align experiments with compliance and security needs. Organizations that build this know-how now will be well prepared for when the hardware can scale, and they'll know how to use hybrid approaches.
Now, let's take another example. What do we do in a pharmaceutical R&D case?
So drug discovery and molecular simulation naturally align with quantum computing, because that's where it's supposed to shine, right? Frankly speaking, chemistry is based on quantum mechanics, and classical approximations struggle with electron interactions in complex molecules. Early experiments on small molecules, like hydrogen chains and small organic compounds, demonstrate quantum circuits modeling electronic structure more faithfully.
In one collaboration, researchers ran hardware experiments to estimate the ground-state energies of a small molecule. Raw results were very noisy, yet after mitigation and combining with classical methods, they matched expectations closely enough to filter out a false lead, which is very important and, frankly speaking, saved weeks of lab synthesis and testing.
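In the spirit of that pilot, here is a toy variational loop: a classical optimizer tunes a one-parameter ansatz to minimize a measured energy. The 2x2 Hamiltonian is made up, and the "measurement" here is exact linear algebra rather than noisy hardware.

```python
# Toy variational energy minimization (VQE-style) on a made-up 2x2 Hamiltonian.
import numpy as np
from scipy.optimize import minimize_scalar

H = np.array([[-1.0, 0.4],
              [ 0.4, 0.6]])

def energy(theta):
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # ansatz state
    return psi @ H @ psi                                    # expectation value

result = minimize_scalar(energy, bounds=(0, 2 * np.pi), method="bounded")
print("variational:", result.fun, "exact:", np.linalg.eigvalsh(H)[0])
```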
In another example, a completely different one, a virtual screening pipeline used quantum subroutines to refine binding energy estimates for a shortlist of candidate ligands targeting a known enzyme. Classical docking produced 50 candidates; quantum-enhanced evaluation trimmed it down to 15; experimental assays then prioritized the top five, one of which showed promising activity.
As you can see, the filtering is done very efficiently. Beyond small molecules, teams explore protein folding or binding-site computations. While classical AI tools like AlphaFold excel at predicting structures, quantum methods may offer complementary insights into energy landscapes or dynamic conformations.
A startup ran hybrid loops: classical simulations suggested several conformations, and quantum subroutines estimated their relative energies under different conditions. The results informed lab experiments in lead optimization.
Quantum-enhanced machine learning refines predictions of toxicity or solubility by encoding molecular descriptors into quantum kernel models. This sometimes improved classification accuracy on historical data sets, guiding selection of safer scaffolds.
Pharma R&D also considers optimization of clinical trial designs, defining patient cohorts based on multiple biomarkers. Quantum-inspired algorithms help explore trial parameter spaces to maximize statistical power under budget constraints.
A biotech firm ran a pilot to optimize trial-arm allocations given limited patient populations and multiple endpoints. Of course, while the full-scale benefit awaits larger machines, the exercise built institutional understanding of mapping such problems to quantum-friendly formulations.
These pilots show that investing in quantum chemistry and optimization skills today speeds readiness for larger simulations when error-corrected machines arrive.
Now let's actually look at logistics and supply chain.
Again, we are trying to establish how quantum computing can actually help in various industries, which is why it's worth going over how the POCs were done in each industry, to better understand how quantum computing can help. Logistical problems like vehicle routing, inventory allocation, and scheduling often become combinatorial mountains. In pilot studies, quantum-inspired solvers explore routing variants beyond classical heuristics, uncovering alternate routes that reduce total distance or delivery time by a noticeable margin on scoped instances.
For example, a delivery company modeled a region with 15 vehicles and a hundred stops during peak season. Classical solvers found good solutions quickly, but a hybrid quantum test suggested a slight rerouting for one vehicle that avoided a traffic bottleneck predicted by recent data, shaving off 5% of the total time. Over hundreds of deliveries, that quickly adds up: lower labor costs, happier customers. Essentially speaking, inventory management benefits too.
So now, optimizing stock across warehouses under capacity and demand uncertainties can use quantum-enhanced heuristics. A retailer piloted a hybrid workflow where classical forecasts predicted demand distributions and quantum subroutines then optimized inventory allocations. The result: overstock reduced by 10% and out-of-stock incidents by 8% compared to classical-only benchmarks.
So scheduling maintenance crews across sites, with various skills and time constraints, is another combinatorial challenge. A utility company tested quantum-inspired solvers to assign crews under multiple constraints. Early results showed 15% better utilization in simulated scenarios, prompting deeper investigation.
Even demand forecasting sees quantum exploration. While classical machine learning dominates, some teams test quantum machine learning prototypes to capture complex feature correlations.
In one pilot, a supply chain group encoded time series and categorical features into quantum feature maps and used variational classifiers to predict demand spikes. Accuracy improvements were modest but consistent across several data sets.
The key is basically isolating manageable sub-problems: focus initially on a critical product line or a regional hub rather than the entire global network. It's important for us to slice the problem down into multiple smaller sub-slices.
Now let's actually move on to how we assess quantum hardware and simulators.
So multiple platforms make quantum experiments accessible. Cloud services provide entry points to superconducting, trapped-ion, and annealing devices: AWS Braket, Azure Quantum, IBM Quantum, and Google Quantum AI all offer simulators with noise models and real hardware queues. When starting, prototype on simulators to refine circuits, check correctness, and anticipate noise impact.
I recall juggling different SDK versions: one day testing on one SDK, then on Cirq, then on an ion-trap system via AWS. Each had its own quirks, frankly speaking. Documenting dependencies, gate sets, connectivity maps, and noise parameters became part of these workflows. So that upfront effort pays off: you already know how to adapt your circuits to newer devices. Queue times and costs vary, of course; running many shots on real hardware can incur significant expenses. To manage that, plan experiments carefully: simulate first, and only run final circuits on hardware.
Use noisy simulators to predict outcomes and catch major issues early. Keep experiments small: if a problem maps to 60 qubits but your device only offers 30 with good fidelity, identify a reduced instance or partition the problem into multiple sub-problems.
Maintain scripts to log run metadata, like date, device, calibration metrics, circuit version, and so on and so forth, so you can actually compare results over time as hardware evolves.
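One possible shape for such a logging script; the field names below are just a suggested convention, not any platform's schema.

```python
# One way to log run metadata so results stay comparable as hardware evolves;
# the record fields here are a suggested convention, not a real platform schema.
import json, datetime

record = {
    "date": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "device": "example-backend",                            # machine or simulator
    "calibration": {"t1_us": 110, "readout_error": 0.02},   # assumed metrics
    "circuit_version": "routing-v0.3",
    "shots": 4000,
}
with open("runs.jsonl", "a") as f:
    f.write(json.dumps(record) + "\n")   # append-only log, one run per line
```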
Engaging early with platform communities, like forums and user groups, also helps you learn better practices and avoid pitfalls. I myself have been engaging in user groups for quite some time, and I've learned quite a lot more than from just reading the regular documentation. Over time, this actually helps build a reliable experimentation pipeline.
So now, how do we do software and algorithmic development? Developing quantum applications involves more than writing circuits. First, translate a business problem into a quantum-friendly formulation: for optimization, define cost functions and constraints that fit variational algorithms or Ising-model mappings. In a finance pilot, a team had to express expected returns and risk covariance as a cost Hamiltonian, translating domain concepts into qubit interactions.
Next, optimize circuits for hardware. Choose qubit mappings respecting connectivity, decompose gates into native operations, and limit depth to stay within coherence times.
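As a concrete example, Qiskit's transpile does this kind of hardware-aware compilation; the basis gates and coupling map below are placeholders for a real device's values.

```python
# Sketch of hardware-aware compilation with Qiskit's transpile; basis gates
# and coupling map are assumed placeholders, not a specific device's specs.
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 2)   # qubits 0 and 2 are not directly connected in the map below
qc.measure_all()

compiled = transpile(
    qc,
    basis_gates=["rz", "sx", "x", "cx"],   # native operations (assumed)
    coupling_map=[[0, 1], [1, 2]],         # linear connectivity (assumed)
    optimization_level=3,
)
print("depth before:", qc.depth(), "after:", compiled.depth())
```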
It's like customizing a recipe for the available kitchen tools, right? You might want to swap an ingredient here and there, and adjust the cooking time based on the stove. During execution, apply error mitigation: run circuits at varied noise levels, correct measurement statistics, and extrapolate toward ideal results.
I remember running the same circuit with doubled gate durations to amplify noise, then using extrapolation to estimate the zero-noise output. Though imperfect, it gave a clearer signal.
Post-processing integrates quantum outputs into classical decision pipelines, feeding refined solutions into larger workflows. Iteration is key: basically, simulate with realistic noise models; as circuits mature, run hardware tests, analyze deviations, and tweak again. Over time, teams develop intuition.
You get to know this over time. And version control matters: keep circuits in a repository and tag versions; track transpilation settings. Automate simulator tests to catch regressions, and log results systematically so that you actually know what is really going on and can compare tests over time, and so on and so forth.
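A tiny regression check of that kind might look like this, assuming Qiskit is installed: assert that a known circuit still produces the expected distribution after any refactor (here a three-qubit GHZ state).

```python
# Minimal simulator regression test: a known circuit's output distribution.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def test_ghz_regression():
    qc = QuantumCircuit(3)
    qc.h(0)
    qc.cx(0, 1)
    qc.cx(1, 2)
    probs = Statevector(qc).probabilities_dict()
    # Ideal GHZ state: only |000> and |111>, each with probability 0.5.
    assert abs(probs.get("000", 0.0) - 0.5) < 1e-9
    assert abs(probs.get("111", 0.0) - 0.5) < 1e-9

test_ghz_regression()
print("regression check passed")
```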
Next, what about organizational strategy? A structured approach maximizes impact.
First, identify use cases with clear metrics and manageable scope.
Host workshops where domain experts outline pain points, then assess
which might suit quantum pilots.
For example, a retail team might point to seasonal inventory imbalances.
A logistics group might flag dynamic routing challenges.
Second, build capability: train staff through online courses, internal hackathons, or partnerships with research labs, and so on and so forth. Form small quantum core teams empowered to prototype ideas quickly.
Third, execute proof-of-concept experiments: run on simulators, then hardware. Define success criteria, such as solution quality or time savings relative to classical baselines.
In one organization, they set a threshold: if a quantum pilot could match classical results within 5% error, they considered it a worthwhile stepping stone.
Even matching performance builds familiarity.
You get to understand the quantum landscape. So allocate budget for cloud compute, and over time maintain a roadmap that revisits use cases as hardware matures.
Avoid ad hoc experiments with no clear business alignment.
You would definitely want to focus on measurable outcomes
and learning objectives.
I have seen companies spend months running random circuits without tying them to specific problems. They end up with reports full of technical jargon but little business insight.
In contrast, teams aligning pilots to concrete use cases actually do better.
So what is the near-term, and what is the future, of quantum business value? Even in the NISQ era, teams report modest but meaningful improvements: faster optimization on small to medium instances might save a few percent of delivery times, a chemistry experiment might rule out a false lead early, and finance tests might tighten risk estimates.
What starts as a pilot on a reduced, small-scale model becomes a full-scale application later. The experience gained, understanding noise mitigation, hybrid design, and platform quirks, becomes priceless when fault-tolerant machines arrive.
So by balancing realistic expectations with curiosity and experimentation, you actually learn how to harness quantum breakthroughs as soon as they become practical.
In my experience, teams that document every pilot, success or failure, build a knowledge base that later accelerates their work when improving devices reach sufficient scale. What once was a dozen-qubit experiment becomes a stepping stone to hundreds of qubits with minimal friction.
What are the key takeaways, right? Quantum computing in today's NISQ era is not a magic bullet, but it offers genuine opportunities when we choose suitable problems, adopt hybrid methods, and build internal capability that pays dividends as hardware improves. I encourage you to identify one well-scoped use case in your domain, perhaps a small optimization or a simulation that fits tens of qubits and shallow circuits.
Run a pilot on simulators first: refine your formulation and test circuit designs, then move to hardware runs, collecting data.
Document literally every step.
Share all your lessons across teams.
Engage with the wider quantum community.
Over time, you'll build a repository of patterns, common ways to map problems to variational circuits, and these patterns reduce friction for future projects. Allocate a modest budget for this exploration.
Encourage a culture of exploration balanced with pragmatism. Celebrate small wins, which will definitely help you lay the groundwork. Embrace this era of exploration with both enthusiasm and pragmatism.
Thank you for listening. I look forward to your questions and your continued conversations about how quantum can actually create value in your organizations. Thank you again for joining Conf42. Hope you all have a good time.