Quantum Machine Learning. An interview with Santanu Ganguly.

Dan: Hello and welcome
to the quantum divide.

This is the podcast that talks about the literal divide between classical IT and quantum technology, and the fact that these two domains will, and need to, become closer together.

Quantum networking actually is
more futuristic than perhaps

the computing element of it.

But we're going to try
and focus on that domain.

But we're bound to experience many
different tangents both in podcast

topics and conversation as we go on.

Enjoy.

Good day and welcome
to the Quantum Divide.

This is episode six.

We're really flying through them, Steve.

Well done.

Good effort.

Steve: Yeah, the plan of one a month has quickly changed, I think.

But

Dan: Maybe we'll fall back on that in due course. Listen, this week's podcast is going to be really interesting for me, because it's a bit of a dark art. Quantum machine learning, for me, is a combination of two fascinating topics of which I know fairly little. So it feels quite sci-fi to me.

I'm sure you get that all the time,
Santanu, but let me just introduce

Santanu, who's joining us today.

I feel like there's a mega brain in the room. That's all I can say, when you've got a list of master's degrees, you're studying for a PhD, and you're working as a researcher as well.

Yeah, I'm just looking forward
to hearing you talk, Santanu, and

have this conversation with you.

So thanks again for joining.

An MSc in observational astrophysics, an MSc in mathematics, and you're studying machine learning, quantum machine learning in fact. Yeah, why don't I let you just give an intro, and then perhaps you could start with a bit of a high-level view of what quantum machine learning is, and we'll take it from there.

Santanu: So thank you.

Thanks a lot for the kind words
and thanks for your time and for

inviting me to this amazing podcast.

As you're well aware, we're ex-colleagues, so I'm really flattered by your kind words, but you did not speak the truth. And as for the mega brain, you are absolutely right, there is one right there: I can see Stephen.

So thanks a lot for being here.

Yeah, to go right into it: what is quantum machine learning? That is a tough question to answer right now. As the industry looks at quantum machine learning today, there are actually four broad classifications, or types, of quantum machine learning.

So, for example, I'm going to quote a figure. Obviously this is a podcast, so I can't show it. I call it the Schuld figure, because it came out in a book by Maria Schuld. She basically defined a matrix of four different types.

So the first one was CC. This is basically classical: classical data, classical computing. And this is where quantum-inspired methods come into play. What this subcategory addresses is purely classical algorithms which process purely classical data.
Now, the question is: what does that have to do with quantum? If it's classical and classical, there's nothing to do with quantum. And that is a good question. But before I get into too much detail: there are some ideas from quantum computing and quantum information science where people have figured out how to strip out a lot of the complexities offered by quantum mechanics.

So what we are left with is
a classical idea or algorithm

that has its origins in quantum.

Hence quantum inspired.

And this is very important.

The reason I jumped into it: the terminology is something we'll probably flesh out as we talk, but quantum-inspired is very important as we speak right now.

One of the reasons quantum machine
learning started to grow out of

quantum computing is because of
the power of superposition and

entanglement that is offered.

So if you look at classical machine
learning, a lot of stuff, if not most of

the stuff, is not really deterministic.

There's a lot of probability at play.

And at the end of the day, any
classical machine learning problem

is an optimization problem.

And hence that is where quantum
machine learning comes into play.

If you look at the properties of superposition, for example, that offers a multi-dimensional way of addressing certain things that, in classical binary machine learning, would take up a lot of computational cost, so to speak.

And this is also very important.

So yes, those are very high-level thoughts on why quantum machine learning: because machine learning has a lot of probabilistic stuff in there. You do optimization of cost functions, for example, you do gradient descent, and all of this is mathematical. And all of this, it turns out, can sometimes, not always, be done better by pulling in quantum machine learning.

Now, going back to where I was with what quantum machine learning is. I spoke about CC, classical data with classical algorithms. The next one in the matrix is classical data but quantum algorithms, which is where quantum machine learning is today.

So this is quantum enhanced
machine learning, so to speak.

This is where we are, because most of the realistic data we have today, the data being tested in research papers and everywhere else, is largely classical data. So the main idea is to take classical data and embed it into quantum computers. I should not say quantum circuits, because this philosophy also works on, for example, annealing platforms, D-Wave systems for example. So basically you encode it into a quantum state and then process that classical data in some way inside a quantum computer. That's the broad part of it.
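To make that embedding step concrete, here is a minimal sketch of amplitude encoding in plain Python; the function name and return shape are my own illustration, not any particular library's API:

```python
import math

def amplitude_encode(data):
    """Map a classical vector onto the amplitudes of a quantum state.

    The vector is zero-padded to the next power of two (each extra qubit
    doubles the state space) and normalized so the squared amplitudes
    sum to 1, which any physical state preparation would require.
    """
    n_qubits = max(1, math.ceil(math.log2(len(data))))
    padded = list(data) + [0.0] * (2 ** n_qubits - len(data))
    norm = math.sqrt(sum(x * x for x in padded))
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    amplitudes = [x / norm for x in padded]
    return n_qubits, amplitudes

n, amps = amplitude_encode([3.0, 1.0, 2.0])
print(n)                                   # 2 qubits hold 4 amplitudes
print(round(sum(a * a for a in amps), 6))  # 1.0 -> a valid quantum state
```

A real platform would then prepare a register whose measurement statistics follow these squared amplitudes; the point is that n qubits give room for 2^n values.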

The third one down the matrix is quantum data and classical algorithms. In other words, machine learning for physics, let's say, as an example, or machine learning for chemistry, right? And again, this is a very important area, where the next one that I'll talk about actually offers a lot of advantage.

So this is about applying classical ML algorithms to process data from quantum computers. Whatever data we can actually extract out of a quantum computer by measurement, we run through a classical ML algorithm, and that makes our life a lot easier, because we understand the classical ML algorithm, and as soon as we measure data in a quantum computer, the quantum states collapse and the data becomes classical anyway, so to speak.

The fourth one, and the last one, is quantum data for quantum algorithms. And that is quantum learning. This is the most important category from the perspective of research in QML.

It focuses on quantum algorithms which
process data encoded as quantum states.

So here we typically assume
that the data is given to us,

that is actually quantum data.

In other words, data out of chemistry,
let's say, or molecule model that is

already quantum in a quantum state.

So the data comes to us in a quantum state, arising from a quantum environment. It could be molecular modeling, like I said, or it could be a quantum sensor, and then it is given to us over a quantum network, which is where Stephen is the guru. Hence the importance of QQ, the last part of the matrix.

Now, in this area there is a lot of research going on, and some seminal papers have been written, especially relevant to quantum sensors. A few of them are by Zhang and Zhuang at the University of Arizona; I think Stephen might be familiar with them.

So basically they have been doing research on taking sensor data in a quantum state and running it through quantum algorithms, something that you call variational quantum algorithms, VQAs, which are not a hundred percent quantum. There is a classical part to it. So they basically do iterations: the output is basically classical, it then gets applied to a classical machine learning algorithm, something like an SVM, a support vector machine, and then it's fed back into the VQA process again, and then you can basically classify the quantum data. That's one way of doing it.

And why am I rambling about this? I'm rambling about this because, in my humble opinion, this is very important work. Cisco is looking at quantum networking. Now, yes, they're looking at it at a physical level, in terms of quantum routers or switches or whatever else you have. Sooner or later, people have to start thinking about protocols and handling, which is where Stephen is, I think.

Sooner or later, people have to start thinking: okay, I've got a quantum network of 500 people. How do I know that Dan's data is going to Stephen, and not to this guy Santanu who's on the call? This entire protocol management and handling at large scale is going to become important as soon as the first kind of stable communication exchange happens on a quantum network at a productized level. It's a very important area.

I personally don't think there is
enough work that is being done here.

And this is where all these machine learning and quantum-inspired methods that I mentioned first come into play.

And why did I mention quantum-inspired? Please stop me if I'm rambling on, and you will ask some questions, but why I mentioned quantum-inspired is that this has become extremely important during the last six months to one year.

Because companies like NVIDIA have come up with libraries to support HPC, high-performance computing, to simulate quantum-inspired computing and machine learning. NVIDIA has CUDA Quantum as a library, I think, and I think last month, or maybe six weeks or two months back, they also came out with hardware to support such simulations.

Now, one of the biggest challenges today, be that in quantum computing, machine learning, or communication, is the quality of qubits. The physical quality of qubits, be they superconducting or photonic, is not very good. There are loads of errors; in the photonic area we have other problems, not so much physical errors, but some other challenges.

So one of the things people always come up with, for example, is: okay, we understand all these benefits, but they're not available today, so why are we really interested? The answers coming out of this are: hey, we can do this today. There are massive benefits in drug discovery and in the financial industry; from a business perspective, no other sector is more lucrative than finance. And they are actually looking at it actively.

Dan: Actually, another industry that drives new technology is defense as well. So defense and finance, you're absolutely right, those are going to be key initial drivers in terms of investment.

But hey, let me take a step back for a second. You mentioned the two-by-two matrix, which I'm familiar with. I just want to quickly walk through it from my point of view. So classical and classical, really, is about inspired algorithms and simulation, right?

CQ is where we are today, where we've got quantum computers running algorithms, or circuits, or something in an annealing system, which is mapped and doing some optimization in the classical world.

Santanu: Just to clarify, CQ is classical data. So we take classical data and we embed it into quantum computers. There are various ways of doing this: feature mapping, then various ways of encoding, PCA, and so on. So that's CQ, basically.

Dan: And the way I see those two, and definitely CQ, is that it's basically the classical machine learning process leveraging a quantum algorithm to optimize part of it. QC is the other way around, where it's using the classical machine learning world to improve something happening in the quantum. Now, where it gets really confusing is where you were talking about quantum machine learning, or I guess this is the QQ world, where you're using quantum computing to optimize quantum data. At the moment, from what I understand, almost all quantum machine learning is born out of the classical world, in the sense that the machine learning algorithms run on a classical computer with elements done in the quantum world. Is that right? Or are there pure...

Santanu: I think that depends; that's my personal opinion, and you're right. Most of the algorithms in quantum machine learning or optimization have been inspired by classical machine learning, simply because that's how science or technology works: you start to build on something that you already know and diverge from that. So yes, to some extent that's true.

The last two that you mentioned, QC and QQ, are quite interesting, because you get quantum data: machine learning for physics or machine learning for chemistry. And if you take quantum data and apply classical machine learning to it, things can get computationally very expensive. For example, molecular modeling: to model a penicillin molecule with classical computing costs you something like 10 to the power 48 bits. In a quantum world, that would be 286 qubits or something.

And you're right in saying that.

A lot of these are inspired by classical machine learning algorithms that exist today. A lot of these algorithms, variational quantum algorithms and variational quantum eigensolvers, for example, are all hybrids, so to speak, a classical-quantum mix. They came out in the early days of quantum computing because of the lack of depth in quantum circuits, right?

So: lack of depth in quantum circuits due to the limited availability of qubits, number one; number two, physical quantum circuits are very noisy right now. Given all these factors, and there are several others, what they did is say: okay, gate decomposition is a challenge, so let's reduce the quantum circuit to a small size, get a classical output, and then feed this back to the next circuit, and the next circuit, and the next circuit. And then that kind of becomes a hybrid model.
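That loop, shallow circuit, classical read-out, classical update, fed back in, can be sketched with a stand-in for the measured circuit output. The cosine landscape and all names here are invented for illustration, not a real device interface:

```python
import math

def circuit_expectation(theta):
    """Stand-in for a shallow parameterized quantum circuit: in a real
    VQA this number would come from running the circuit and measuring.
    Here a simple cosine landscape plays that role (an assumption)."""
    return math.cos(theta[0]) + 0.5 * math.cos(2 * theta[1])

def hybrid_minimize(theta, lr=0.1, steps=200, eps=1e-4):
    """Classical outer loop: estimate gradients from circuit outputs
    (finite differences) and feed updated parameters back in."""
    for _ in range(steps):
        grad = []
        for i in range(len(theta)):
            shifted = theta[:]
            shifted[i] += eps
            grad.append((circuit_expectation(shifted)
                         - circuit_expectation(theta)) / eps)
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return theta, circuit_expectation(theta)

theta, energy = hybrid_minimize([0.3, 0.2])
print(round(energy, 3))  # approaches the landscape minimum of -1.5
```

The quantum part only ever has to run a short circuit; all the memory of the optimization lives in the classical parameters, which is exactly why this pattern suits today's noisy, shallow hardware.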

Steve: I have a bit of a softball question. It's not such a complicated question, but it's something that occurred to me very quickly when working in this topic, quantum machine learning: where would you draw the line between quantum machine learning and classical machine learning? For example, we have this two-by-two matrix, and we have C's at the top and Q's at the bottom. So there are a lot of ways to draw this line of separation, and for me it was not so clear how to draw that line.
And so what we concluded, actually, and let's see if you agree with this, is: anything that does something quantum, that uses a quantum model anywhere, we call quantum machine learning. No matter if it's using neural networks classically and has a piece of quantum somewhere, that's quantum machine learning. Like, one qubit's worth of information in the machine learning model, and now that's QML. And I don't know if that's generally agreed upon...

Santanu: No, you are absolutely right, I do agree with you. Right now, that's where the industry is. And that's why, you know, quantum-inspired methods are coming into play, because basically look at how most quantum neural networks and adversarial networks are implemented. I actually just sent out a paper somewhere on a QGAN using real-life financial data. For the generator part I used a quantum circuit, but for the discriminator part you have the option of using quantum circuits or classical machine learning.

I used classical, because frankly I don't get paid for the research; I'm doing it on my own time and I don't want to spend too much money. And I didn't want to run this on physical hardware, because physical hardware right now will give you a lot of errors. You can see this if you run something on a simulator, without naming any vendors; I have tested this out on at least two or three different vendors. On Amazon Braket, for example, if you run it on a simulator and then run the same thing on the physical qubits, you see the difference in results, right?

So that's where, going back to what I was rambling about, quantum-inspired computing has become so important all of a sudden: because people have realized the stakes in finance, defense, drug discovery, very high-powered and critical industries, and climate change. Why climate change? It brings me back to machine learning for physics again. For example, the Mars Rover 2 that's right now on Mars: they actually have quantum code there, from D-Wave Systems, to optimize and model the weather patterns on Mars.

I went for a D-Wave training back in 2019, and the gentleman from NASA who was actually doing it hands-on was sitting beside me, so I'm 100% sure there's code on it. But why would people doing weather patterns prefer a quantum computer to optimize weather patterns, and why not classical? It's been done on classical computers for years and years, decades even. That's because it's computationally very expensive, and to put that kind of resource on a spacecraft may or may not have been productive.

So this is a very important thing, because industries are looking at what's more sustainable. Look at generative AI: huge hype right now; it's broken the Gartner hype cycle completely.

Dan: Yeah, this was going to be one of my questions, right? And actually it leads on from... first of all, Steve, I just wanted to say your question made me laugh, because the whole point of a two-by-two matrix is that it's supposed to simplify things. But I love the fact that in this case it still didn't make it clear enough; there are still too many question marks in all of the boxes. I love that. But yeah, coming on to large language models.

Is there an applicability of some form of quantum machine learning for large language models which is different to other machine learning? Or is it just seen as another form of galvanizing the machine learning that's running on a classical system, by doing the optimization part? I'm wondering whether, because large language models use tokenization and so on, maybe there are different benefits there.
there's different benefits there.

Santanu: That is a good question. So first of all, quantum NLP is an active area of research, and there are some published papers on that; you can cite some very well-reputed published papers.

I'm not saying there is a benefit.

So I'm very careful.

You have noticed that so far I've been
very careful not to mention the phrases

quantum speed up or quantum advantage.

I'm not doing that.

So yeah, because I'm very conscious that
this is an active area of research and

I'll struggle to back these words up.

So one thing is quantum NLP
is an active area of research.

Number one.

Number two, as we speak today, quantum-inspired methods boosted by HPC, high-performance computing, be that supercomputers or GPU-driven platforms as NVIDIA is doing, are very much a realistic possibility for actually cutting down the computational costs of generative-AI-type models.

If we look at how much ChatGPT in its current form costs: it costs over $100,000 a day, every day, even in its current kind of limited format. And Google Bard, if rumors are to be believed: a search done by Google Bard costs 8 to 10 times more than a keyword search on the Google search engine, right? And this cost comes out of GPU usage, out of GPU server space. And this is exploding, basically.

So this is where quantum-inspired machine learning should become powerful, because it can cut down on this expansion: use the simulation to cut down on the GPU usage, cut down on the server usage, and actually give a benefit, if not from quantum speed-up or advantage, then from the point of view of sustainability and maybe scalability. I think that's what the near-term benefit is, and that's what these companies are actually looking at.

Because, again, waiting for error-minimized physical quantum platforms is at least five years. And every year, people say five years; for the last five years, every year, people have said five years. That's something I know from personal experience. So I still say it's another five years, or maybe three to five years; I don't know.
I don't know.

But what do we do till then? Do we just sit around and see if it happens or not? No, there are options where you can actually leverage some form of quantum machine learning, as you said, and apply it to industry, to climate change, to finance, to defense, whatever you have.

Steve: Yep.

I'm also curious about this other thing; it's not directly related, but you mentioned the hype cycle of quantum machine learning. I've been following it loosely for the last couple of years, I don't work on it directly, but when I first saw quantum machine learning, you'd see a lot of excitement, a lot of research papers, and now it's generally calming down. I think one reason for that is that CQ problem: getting the classical data into the quantum computer is very difficult to overcome, maybe impossible. But also I think people are just starting to get more realistic. And I wonder, what's your perspective? Do you also see the trend starting to flatten out? Maybe the hype is becoming more realistic.

Santanu: I agree with you. The whole point about getting classical data into a quantum algorithm is that you have to go through these steps, basically feature mapping, state transformation, encoding, amplitude encoding or state encoding, whatever form you take, and that is an additional step, right?

And if you look at any realistic study where they actually publish the code and publish the results, not just claims (there are a lot of papers that claim quantum advantage), you look at the data and you ask: okay, so I get about the same performance as I get out of classical machine learning using a bunch of classical data. So why use quantum machine learning? What's my advantage there?

A, you are introducing an additional step in the data processing, right? And B, if your end result doesn't give you anything substantial, why do it? It doesn't make any sense. I completely agree with that.

There are some very marginal cases, specifically perhaps in aerodynamics, or in the weather patterns I mentioned. That's where aerodynamics comes in, where maybe the computational cost can be cut down to some extent using quantum optimization. But again, that is not something that, in my opinion, should be viewed as a quantum advantage or quantum speed-up; it's purely from the point of view of cutting down your computational complexity. And to that end, there are some problems, which are mostly academic right now, for example some NP-hard problems, that can actually be solved, I'd say, with far less computational cost via quantum optimization than in classical computing.

Traveling salesman, for example; that is where supply chain comes in, MaxCut problems, etc. For certain problems there is some merit to using this, even though the data is classical.

For example, I did this demo of a traveling salesman problem on a D-Wave simulator on my laptop, actually in 2021; I was doing a presentation. I did a traveling salesman for all the states in the United States, basically: some guy goes around from state to state, and what is the shortest possible route he can take? On my laptop, running the D-Wave simulator, it takes less than two minutes, but if you try to do it classically on the same laptop, it will not be done in under two minutes. That's clear.
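For a sense of why exhaustive search is hopeless at that scale: a closed tour of 50 cities has 49!/2 distinct routes. A small sketch contrasting exact search with a cheap classical heuristic (all names here are illustrative):

```python
import itertools
import math
import random

def tour_length(points, order):
    """Total length of a closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def brute_force(points):
    """Exact but O(n!): already impractical far below 50 'states'."""
    best = min(itertools.permutations(range(len(points))),
               key=lambda o: tour_length(points, o))
    return list(best)

def nearest_neighbour(points):
    """Greedy heuristic: fast, but with no optimality guarantee."""
    todo, tour = set(range(1, len(points))), [0]
    while todo:
        nxt = min(todo, key=lambda j: math.dist(points[tour[-1]], points[j]))
        tour.append(nxt)
        todo.remove(nxt)
    return tour

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(8)]
exact = tour_length(cities, brute_force(cities))
greedy = tour_length(cities, nearest_neighbour(cities))
print(exact <= greedy + 1e-9)  # True: the heuristic is never shorter than exact
```

This is also the nub of Steve's follow-up point: a quantum or annealing solver should be benchmarked against heuristics like the greedy one, not against brute force.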

Steve: One point there, though. I know what you mean, and the optimal solution to the traveling salesman problem for 50 states, let's say, I agree is very difficult to find. But would you say that the simulator is also finding the absolute optimal state? I'm not sure how it works on the annealer. But there are classical approximations as well, that don't use any quantum, that could also improve the runtime of traveling salesman, potentially. So maybe, just to be completely fair: are they solving the same problem?

Santanu: That's absolutely true. This is a very simplistic example, but the whole point I'm trying to make is that there are very specific areas where there could be some advantage, because certain optimization problems do well. And mind you, I'm saying optimization because I'm generalizing: every machine learning problem is, at the end of the day, an optimization problem, so you have to have an optimizer at the end of your code. Certain platforms, annealers specifically, do one thing and one thing very well, and that's optimization. And yes, there are some specific problems that they do very well.

So there are some areas of promise right now, but you are absolutely right: this is a major issue, the processing of classical data into the quantum domain. That entire encoding side of things is an additional step, and it takes away a lot of the benefit, especially if you're doing image analysis. You take image data and, A, you have to reduce it right now, because there aren't enough qubits around to simulate 28-by-28 images, whatever they are. So you do image reduction, then you do state transformation, and then state or amplitude encoding, and by the time you've done that, a classical ML would be done, right?

So

Steve: But I guess it comes down to this: when we're looking for speed-ups or improvements in QML, we should always compare to the state-of-the-art classical too, right? So for traveling salesman, for example, you can't compare brute-force search to, potentially, an optimized or approximation version. So the issue of putting classical data into the annealer, that's already going to cause... I don't know, I'm not that familiar with annealing. But if I instead think about, let's say, the superconducting gate model, and encode the information using logic gates, like you said, it's going to take probably the majority of the qubit lifetime just to put the data into the system.

But then, when publishing the results, I'm always scared about: okay, this has an exponential advantage over brute-force search, but brute-force search is nowhere near the state of the art. So I always have to be careful to look deeply at the classical side, give it fair criticism, and not say: okay, we take the worst classical algorithm, compare it to the best quantum, and here is an exponential gap.

And I think we have seen some results like that. I think this is where quantum-inspired stemmed from: trying to find the best classical algorithm to compare against the quantum. And what actually happened was that, in the classical case, you can do things based on the quantum that make the runtime much better, and then there's no more gap. That happened, I think, two or three times, and big controversies came from it; well, not controversies, but lots of news and lots of excitement.

Santanu: And that's absolutely true, Stephen, because it depends on the data, very important. It depends on the problem that we are trying to solve, and it depends on what the end goal is, what it is that you're looking at. So, for example, in an annealer there are no circuits; you're basically looking at the energy. Going back to the TSP problem, I would just define 50 or 49 addresses for the US states in a text file and encode that into a Hamiltonian or a Lagrangian. That in turn gets mapped onto the qubit energy levels, and then it's just energy-level optimization, and that's where the answer comes out.

Hence certain annealers, for example, just do one thing, optimization, but they do it very well, because there is no problem with gate errors and gate decomposition, et cetera.
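Classically, the "energy" an annealer minimizes is just an Ising cost function over spins. A minimal sketch, with coefficients invented for the demo and exhaustive search standing in for the anneal:

```python
import itertools

def ising_energy(h, J, spins):
    """Energy of a classical Ising configuration:
    E = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j, with s_i in {-1, +1}.
    An annealer's job is to drive the hardware toward the spin
    assignment that minimizes this E."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# A tiny 3-spin problem (fields and couplings chosen arbitrarily).
h = [0.5, -0.2, 0.1]
J = {(0, 1): 1.0, (1, 2): -0.8}

# Exhaustive search over the 2**3 configurations stands in for annealing.
best = min(itertools.product([-1, 1], repeat=3),
           key=lambda s: ising_energy(h, J, s))
print(best, round(ising_energy(h, J, best), 6))
```

Encoding a problem like TSP "into a Hamiltonian" means choosing h and J so that low-energy spin assignments correspond to short tours; the hardware then searches that landscape physically.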

There are other kinds of challenges, obviously. But from the point of view of quantum-inspired, or HPC-boosted, quantum computing: if you look at the Summit supercomputer at, let's say, Oak Ridge National Laboratory, that is, I think, theoretically capable of performing up to 10 to the power 17 single-precision floating-point operations per second.

Okay, now, to compute the equivalent of 2 to the power 63 two-by-two matrix multiplications, as an example, would be a complexity of something like 2 to the power 56, if the math in my head is still correct. So at a hundred percent utilization, it would take just a few seconds to compute. But to run a full simulation, to store the full state of 53 qubits, we would need, by this calculation, 72 petabytes of storage.
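The 72-petabyte figure is easy to sanity-check: a full n-qubit state vector holds 2^n complex amplitudes, at 8 bytes each in single precision. A quick sketch:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=8):
    """Bytes needed to hold a full n-qubit state vector: 2**n complex
    amplitudes at 8 bytes each (single-precision complex, matching the
    estimate quoted in the conversation)."""
    return (2 ** n_qubits) * bytes_per_amplitude

# Doubling with every qubit is what makes brute-force simulation explode;
# 53 qubits come out at roughly 72 petabytes.
for n in (30, 42, 53):
    print(n, "qubits:", statevector_bytes(n) / 1e15, "petabytes")
```

This doubling per qubit is also why simulators hit a wall in the low 50s of qubits, and why the Summit-class numbers above matter at all.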

So when we map this onto the RAM that Summit has, and the sockets, et cetera, that it has, because obviously it's a supercomputer, we should expect the simulation to encounter very high communication overhead, moving data from permanent storage into RAM and vice versa.

So yes, these are areas that people are looking at. IBM, I think, has found a way to minimize data transfers; I think there's a paper by Pednault from 2019 where this is described. Yes, this is an exciting area: actually looking at things and trying to get some benefit out of all these studies of quantum algorithms that have come out through all these years. Sorry if I'm rambling, my apologies.

Dan: No, that's why we're here.

Brilliant.

Thank you.

Yeah, so a slightly higher-level question, and no doubt it will end up in some deeper conversations. To me, machine learning tends to fall into a few different buckets, right? You've got reinforcement learning, supervised learning, and unsupervised learning. How does quantum benefit them differently? Or is it the fact that they are all algorithms of some form, which means they'll all be able to benefit from quantum? Is there a kind of clear divide there?

Santanu: Yeah.

So in quantum space there are actually equivalent algorithms for almost everything you mentioned. For example, for support vector machines there is a QSVM, a quantum support vector machine. As Stephen pointed out, what is basically being done right now is: you take classical data, you encode it in some form of quantum state or amplitude or whatever else, and then you basically just run a quantum optimizer, if you want, depending on the platform; not all platforms can do that. But basically that's where your quantum perspective comes in. Other than that, it's basically a classical thing going on, with quantum-encoded data in it.
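That "classical pipeline with quantum-encoded data" picture is exactly how quantum kernel methods such as QSVM are used: the classical SVM only ever sees overlaps between encoded states. A one-qubit toy version (the angle encoding here is my own illustration, not taken from the episode):

```python
import math

def encoded_state(x):
    """Single-qubit angle encoding: x -> cos(x/2)|0> + sin(x/2)|1>."""
    return (math.cos(x / 2), math.sin(x / 2))

def quantum_kernel(x, y):
    """Kernel value = squared overlap |<phi(x)|phi(y)>|^2, which a real
    device would estimate from measurement statistics."""
    ax, bx = encoded_state(x)
    ay, by = encoded_state(y)
    return (ax * ay + bx * by) ** 2

# The classical learner only sees data through these overlaps:
# identical points give 1.0, orthogonally encoded points give 0.0.
print(round(quantum_kernel(0.7, 0.7), 6))      # 1.0
print(round(quantum_kernel(0.0, math.pi), 6))  # 0.0
```

Everything downstream of the kernel matrix (the SVM training, the prediction) stays entirely classical, which is what makes this one of the more practical near-term hybrids.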

Beyond that, there's also something called quantum neural networks. I don't think it has been clearly defined what a quantum neural network is, but there are quantum analogues proposed in the literature for quantum neurons, CNNs (convolutional neural networks), recurrent neural networks, etc.

There's one interesting area which I think the financial world has actually taken up: quantum generative adversarial networks, QGANs. These are separate from QNNs, because generative modeling is emerging as one of the most interesting ways that some quantum algorithms can be applied to ML, and that is purely because of the intuition that quantum computers can generate probability distributions for which classical computers struggle to generate samples.

Okay, so this is intuitive. Depending on the paper, the implementation, the research, the data, and the use case (there are a lot of factors there), a QGAN may use a classical or quantum neural network, or both, for the generator and the discriminator. For example, in the recent paper of mine I just mentioned, I used Qiskit circuits for the generator part, but for the discriminator part I just used classical algorithms.

Why this could be exciting is that it feels like a more straightforward way to connect quantum computing to classical ML without having to wait for fault-tolerant quantum platforms.

Again, this is a very new thought process, maybe three months or at maximum six months old, so this can change three months from now. But there are formal proofs that quantum kernels may be able to yield an advantage.

Again, this is not experimentally proven, but these are theoretical proofs.
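The quantum-kernel idea mentioned here can be sketched with a toy: a common choice of quantum kernel is the squared overlap (fidelity) between the encoded states of two data points. Below it is computed directly on amplitude-encoded vectors; on hardware this overlap would be estimated by a circuit such as a swap test, and the names here are illustrative:

```python
import math

def encode(x):
    # Amplitude-encode: unit-normalize the classical vector.
    n = math.sqrt(sum(v * v for v in x))
    return [v / n for v in x]

def fidelity_kernel(x, y):
    """Toy 'quantum kernel': squared overlap |<phi(x)|phi(y)>|^2 of the
    amplitude-encoded states of two classical data points."""
    a, b = encode(x), encode(y)
    overlap = sum(u * v for u, v in zip(a, b))
    return overlap ** 2

print(fidelity_kernel([1.0, 0.0], [1.0, 0.0]))  # 1.0 (identical states)
print(fidelity_kernel([1.0, 0.0], [0.0, 1.0]))  # 0.0 (orthogonal states)
```

A Gram matrix of these values can then be handed to an entirely classical SVM, which is roughly the QSVM pattern discussed earlier.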

I think where quantum machine learning and algorithms are right now is that people are basically thrashing things out. As Stephen said, there was a hype curve, and now it's coming down, which is normal.

It happens with every possible
technology that comes into play.

For quantum machine learning, one of the most prominent drawbacks is what's known as hitting the barren plateaus. This is when, as the number of qubits in a circuit increases, the gradients with respect to the parameters of that circuit tend to decay exponentially.

So the training of quantum neural networks becomes a challenge, because these are gradient-based algorithms, and if the gradient starts to decay exponentially, training basically gets stuck. You don't get any benefits out of that, right?

Whereas in the classical world, if you have a gradient descent algorithm and you hit bumps, you can do parametrized optimization there, take the thing out of a local minimum and try to push it down towards the global minimum. That kind of thing, we know how to do. In the quantum world that is still a bit of a challenge to achieve, as simplistic as that may sound.
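That classical trick, kicking gradient descent out of a local minimum, can be sketched in a few lines. This is a toy on an ordinary 1-D cost function (the function and constants are illustrative, not from the episode); the contrast with a barren plateau is that here the gradient is informative, whereas on a plateau there is essentially nothing for the optimizer to follow:

```python
import random

def grad_descent(grad, x0, lr=0.01, steps=2000):
    """Plain gradient descent: follow the negative gradient from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# A bumpy 1-D landscape: local minimum near x = 1.13, global near x = -1.30.
f = lambda x: x**4 - 3 * x**2 + x
grad = lambda x: 4 * x**3 - 6 * x + 1

stuck = grad_descent(grad, x0=2.0)   # slides into the nearby local minimum
random.seed(0)
restarts = [grad_descent(grad, random.uniform(-3, 3)) for _ in range(20)]
best = min(restarts, key=f)          # random restarts recover the global one
print(round(stuck, 2), round(best, 2))  # 1.13 -1.3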
Steve: I tried to think about how this whole topic comes back to networks and communication.

Santanu: Fantastic.

My personal opinion, and this is strictly my opinion from working in machine learning and related areas, is, and I'll again go back and mention annealing platforms here, that what annealers and optimizers basically do is use combinatorial algorithms. At the end of the day, that's where the optimization comes in.

Look at a classical network from 20 years back running OSPF: that's the Dijkstra algorithm, based on combinatorial math.
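Since OSPF's shortest-path computation comes up here, a compact sketch of Dijkstra's algorithm on a toy topology (the node names and link costs are made up for illustration):

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src, the computation OSPF runs.
    graph: {node: [(neighbor, cost), ...]}"""
    dist = {src: 0}
    pq = [(0, src)]  # min-heap of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already found a shorter path
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

net = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)],
       "C": [("D", 1)], "D": []}
print(dijkstra(net, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

The point being made in the conversation is that this kind of combinatorial optimization is exactly the problem shape annealers target, just at much larger scale.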

That's exactly where the benefit should come from, in my opinion.

And that's what got me interested in annealers and combinatorial algorithms about three or four years back, because in my head, again going back to the analogy of quantum machine learning coming out of classical machine learning, the optimization of quantum networks should be inspired by how networks are optimized in the classical world as well.

Yes, that's why I mentioned those papers on sensor network classification with SVM by Zhang and Zhuang; that's exactly what I think they're trying to do.

They're using VQS.

And that's fine.

But personally, I think quantum data, whether it comes from quantum sensors or from your quantum network node, let's say there's a quantum computer sitting at the end, it's quantum data.

So step one, day one, is to look at how we can optimize that, either using annealers as optimizers or taking a page out of Zhang and Zhuang and using hybrid algorithms. So that's my take on it. That should be step one, to actually try that out.
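To make the annealer-as-optimizer idea concrete: a combinatorial problem such as max-cut can be written as a QUBO, the binary quadratic form that annealers minimize. A toy sketch with a brute-force search standing in for the annealer (the graph is illustrative):

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy x^T Q x of a binary assignment (what an annealer minimizes)."""
    return sum(Q.get((i, j), 0) * x[i] * x[j]
               for i in range(len(x)) for j in range(len(x)))

def brute_force_min(Q, n):
    """Exhaustive stand-in for an annealer: try all 2^n bitstrings."""
    return min(product([0, 1], repeat=n), key=lambda x: qubo_energy(Q, x))

# Max-cut on a triangle plus a pendant edge, written as a QUBO:
# Q[i][i] = -(weighted degree of i), Q[i][j] = 2*w for each edge i < j,
# so that the energy equals minus the cut size.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
Q = {}
for i, j in edges:
    Q[(i, j)] = 2
    Q[(i, i)] = Q.get((i, i), 0) - 1
    Q[(j, j)] = Q.get((j, j), 0) - 1

best = brute_force_min(Q, 4)
print(best, -qubo_energy(Q, best))  # the best cut has size 3
```

The brute-force loop is exponential, which is exactly why one would hand the same Q matrix to an annealer for networks of realistic size.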

And that's something I've been very interested in doing, but it's very hard for me to get classical networking data big enough to actually try anything like that.

That's how you could actually try it. SVM does binary classification; that's what Zhang and Zhuang's paper has done. But it doesn't have to be binary. It can be some other algorithm, it can be something else, and you just run the quantum version. That's fine.

Steve: And I guess the whole topic of machine learning in communication networks is something I don't know much about. What topics in classical communication networks can machine learning help with? And then, on top of that, could we use quantum to do something as well?

And then on top of that, could we
use quantum to do something as well?

Santanu: This is a very good question.

Ultimately, quantum machine learning, or hybrid quantum-classical machine learning, may become very important because, as you correctly said, machine learning is resource-intensive.

Run it on any classical network: machine learning, AI, etc. are extremely resource-intensive, especially if you're running unsupervised learning, for example in a large-scale network where you don't know all the data, everybody who is working there, thousands of users. You'd likely not run supervised learning there, because you may not have data you can train the network on.

So you'll probably end up running unsupervised or semi-supervised learning. You know some data, and there is some unknown. But with unsupervised or semi-supervised learning, your data will explode compared to supervised learning, and that in turn will take up resources.

And in a classical
world, this costs money.

This is not cheap.

And again, going back to our discussions before, this is where you can actually get some serious benefits using quantum, be that quantum-inspired, be that quantum machine learning whenever the hardware becomes available. That's where you can get some benefits.

Dan: I would think, in answer to your question, Steve, there are two different types of areas where machine learning could be and is used in networks at the moment, right?

There are the not-so-real-time use cases, where perhaps there are occasional calculations, optimizations, maybe even preparatory sets of changes to be made in the network over time.

And then there's the real-time stuff, which is where things get... a lot more interesting.

And as you said, Santanu, the
amount of data just ramps up almost

exponentially because you're receiving
so much from all the sources.

I think the way you described it
sounded like you were referring to an

enterprise network with lots of end users.

But what about a network using
satellites and endpoints in the air

and things moving on the ground?

In terms of real time
optimizations, something like that.

Then I would think there has to
be a benefit because that's the

kind of thing where we're really
pushing the capabilities of

classical methods at the moment.

The machine learning techniques in networks at the moment, I don't want to do them any injustice, because there's some really amazing stuff out there.

But ultimately they're
looking at historical data.

And they're looking at a snapshot
in time of current data and they're

making some intelligent predictions.

So it's about extending the amount of time that the prediction is made from, and also being able to do it in real time on a device which isn't connected to an HPC cluster; it might be in an aircraft or it might be in a satellite or something.

That's the state of the art, I think, pushing the boundaries of the technology.

Steve: And one problem I remember
reading about when it comes to large

infrastructure is, for example,
probably predictive maintenance.

When you need to go and maintain your
infrastructure again, is it broken?

When will it break?

When should you replace the components?

And I think machine learning
is involved in this problem.

This is something I've only read about. But yeah, networks also require this predictive maintenance approach.

I wonder, yeah, maybe it's not something people explore, but it could be.
Santanu: No, you're right.

You're right.

For example, there is this
concept of self healing network.

Dan: Self-optimizing network as well, that's another term.

Santanu: Exactly.

And there's a massive use case in security, especially for semi-supervised learning. So many malware, so many threats, so many fingerprints, so many threat signatures, and so many more out there that we don't know about. So there is an obvious use case.

And this is where, in mainstream security, XDR, the X as in anything, is coming into play. You use machine learning in security.

So the tool or the model is trained on everything that you know, all the threats that you know, so it can stop the known threats. But there are unknown threats, and they might go through. Once they go through, it learns about the unknown threats, gets trained, and then protects against them next time.

And then there are certain other solutions that use a kind of reinforcement learning. If any signature appears to be suspicious, they score it. If the score is six, kind of suspicious, they might raise an alert. If it's seven, they may sandbox it to take a deeper look at it. Eight, nine, ten, it doesn't go through at all. That kind of stuff.
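The scoring scheme described here can be sketched as a simple threshold table; the cut-offs below are illustrative, not from any specific product:

```python
def triage(score):
    """Map a suspicion score (0-10) to an action, as described above.
    Thresholds are illustrative, not from any real security product."""
    if score >= 8:
        return "block"    # too suspicious, never let it through
    if score == 7:
        return "sandbox"  # detonate it in isolation for a deeper look
    if score == 6:
        return "alert"    # kind of suspicious, raise an alert
    return "allow"

print([triage(s) for s in (3, 6, 7, 9)])
# ['allow', 'alert', 'sandbox', 'block']
```

In the reinforcement-learning setup being described, what the sandbox learns would then feed back into how future scores are assigned.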

And again, they learn from this experience. When they sandbox it, they look at it, learn the signatures, train themselves, and in future they know: okay, this is Log4j, which is what happened to my lab here, and we don't want to host that in Santanu's lab.

Dan: So the machine learning algorithm in that situation would have learned Santanu's patching skills and adapted to that.

Santanu: I had to come back
from vacation actually.

Dan: It caught a lot of people though.

That was bad.

Yeah.

Santanu: I had to drive back four hours to put the patches on my lab, at ten at night or something. Vacation gone. Left my family there.

Dan: Yeah.

Yeah.

We all suffered for that one.

That's for sure.

Steve any tangents you'd like to bring?

Steve: I want to see how the things I work on, for example quantum communication networks, are influenced by quantum machine learning, and potentially what the direction is for that.

So the things I'm thinking about are noise mitigation in quantum infrastructure and quantum hardware, and the question I have always comes down to: do we need the quantum for that or not?

Can we do this mitigation, error detection, error modeling using purely classical infrastructure? But then, when it comes to the quantum hardware, maybe it is better to use the QC approach.
Santanu: Yes, I agree with you, because eventually, I know it's easier for those of us who are familiar with classical machine learning and AI to say, okay, we'll just use the classical stuff and be done with it.

But eventually you have to think of cost. We're talking about industrial solutions here, so these costs will ultimately have to be paid by the customers. So we need to think about the business viability of that as well.

From that point of view, yes, I agree with you. That is where quantum algorithms will also be a potential use case, be that quantum-inspired, be that quantum-platform-driven, five or six years from now, whatever it is. That's definitely one of the areas where you can leverage it as well.

For error correction, I think there are some papers around that use reinforcement learning too; I'm sure you've seen them. So I'm interested in that myself.

Steve: I think I'm out of
questions at the moment.

I could probably come up with some more.

Dan: Yeah.

I'm just asking ChatGPT if it's got any questions for you. Just to see what it says. Hang on.

Santanu: Okay.

Dan: Yeah, it's talking about data complexity. Real-time data streams, dynamic learning scenarios. Yeah, that's one that I touched on.

Santanu: One of the things that actually is very important, and I don't often see it in the books or literature that I read, is actually related to implementation, especially in gate-based computing. How do you implement? When you implement, consider how things are being resolved at that level.

I'll try to give a quick example. Implementing gates as potentially very large matrices and constructing operators via matrix products has high complexity, O(N) to the power eight or something, I can't remember. This is a worst-case scenario, I admit.

But this entire situation becomes intractable, actually, and performance starts to suffer when you have more than about eight qubits. That's it.
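The blow-up being described is easy to demonstrate: lifting even one single-qubit gate to an n-qubit register via Kronecker products produces a dense 2^n-by-2^n matrix. A toy sketch in plain Python, with no quantum SDK; real simulators avoid materializing these matrices for exactly this reason:

```python
def kron(A, B):
    """Kronecker product of two square matrices (plain lists of lists)."""
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

# A single-qubit Hadamard lifted to an n-qubit register: the operator is
# H (x) I (x) ... (x) I, and its dimension doubles with every qubit.
H = [[2 ** -0.5, 2 ** -0.5], [2 ** -0.5, -(2 ** -0.5)]]
I = [[1, 0], [0, 1]]

op = H
for _ in range(7):          # extend to an 8-qubit register
    op = kron(op, I)
print(len(op))              # 256 rows: 2^8, and it keeps doubling
```

At 8 qubits the dense operator already has 65,536 entries, and multiplying two such matrices naively only compounds the cost, which matches the "more than about eight qubits" pain point mentioned above.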

There are a lot of papers I read and literature I go through where they don't really explore the implementation challenges of quantum algorithms. And this varies from platform to platform: an algorithm that's very efficient on annealing, for example a QUBO, may not be that efficient in a gate-based environment, and implementation can be a nightmare as well, depending on what kind of platform it is.

So I read a lot of literature, but I don't really see these aspects specifically addressed. This is also something very important, because it directly affects the behavior of the hardware that you are actually using to do the computation.

Dan: It's ultimately up to the vendor of each of the computing environments out there, based on the different technologies, to go and test these kinds of things, or at least for their customers to bring use cases that leverage some of the new theory and so on. But I guess there isn't a body out there that would be doing that kind of thing, because there are no standards at the moment, pretty much; physics is all you've got.

Santanu: Yeah, that's it.

Steve: My question was actually, when you think about the physical implementation, especially at the gate level, one thing I'm wondering is: do we have that continuous degree of freedom along the Bloch sphere to actually manipulate these systems?

It seems to me that you only have finite precision to encode the gate pulse, right? So do you even get gradient descent in that case? What happens if you don't have continuous...

Santanu: Right now, I agree, at the gate level there are limitations. There simply aren't enough qubits around, and there's also how the qubits are connected. They're connected one way in IBM, which is where the concept of transpilers came from, and the qubit connectivity is different in Rigetti, for example, even though both are superconducting.

All these factors come into play when you start to implement, and people who are implementing need some knowledge of what kind of hardware is in there.

Compare that to a classical computer: I write Python code, I don't even worry about where the code is going, which bit it's affecting, how it's affecting it. I just write it, I'm gone, and it works. The quantum computing hardware domain is far from that space.

Steve: Because especially when I think about quantum machine learning, you have your quantum circuit, you put in some parameterized unitary gates, and those parameter values are just real numbers. They can go in any direction.

But when you put it on the quantum computer, you need the microwave pulse, which has a heat-up and a cool-down process. You can't get such a narrow pulse, and therefore you don't get continuous degrees of freedom. Eventually you have intervals you can only pick from.

Yeah, it's definitely hardware-dependent, because you need some sort of precision in your operations.

But then the question is, can you do machine learning at all without that ability to have...

Santanu: Hence, very good question, very nice, and this is where I was going with hybrid quantum-classical algorithms. This is why you have VQEs and VQAs, variational quantum algorithms, which are all basically hybrid; none of them are fully quantum.

And all of this factors into that decision, not just circuit depth, but also what you mentioned, right? These are limitations that exist today.
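One hedged sketch of what optimizing over non-continuous parameters can look like: if the realizable rotation angles form a finite grid, a hybrid loop can fall back to gradient-free search over that grid instead of gradient descent. The cost function and step size below are illustrative, not from any real device:

```python
import math

# Toy stand-in for a measured expectation value as a function of one
# rotation angle; its minimum sits at theta = pi.
cost = lambda theta: math.cos(theta)

def grid_search(cost, step):
    """Gradient-free search over the finite set of realizable angles,
    a crude stand-in for when pulse hardware quantizes the parameter."""
    angles = [k * step for k in range(int(2 * math.pi / step) + 1)]
    return min(angles, key=cost)

theta = grid_search(cost, step=math.pi / 64)
print(round(theta, 4), round(cost(theta), 4))  # close to pi, cost near -1
```

In practice, variational loops often use gradient-free or noise-tolerant optimizers (COBYLA and SPSA are common choices) partly for reasons like this.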

And I think there are some companies, for example, working on basically separating the coding platform from the circuit, so you don't have to worry about whether it's IBM or Rigetti or something; you can just do something on a classical platform and it gets compiled down.

Otherwise, you have to be very aware of the platform you're using, what you're doing on it, and what your code is going to do to your qubits. You really need to know the physics. Otherwise, you may never get an advantage out of it.

I don't see enough studies or work on this. In most literature you read, they just give you their results and an explanation. For example, there is this very well-known paper that came out in the financial industry, from J.P. Morgan and IonQ, I think, and they treated some live financial data.

That's what got me interested in actually trying QCBMs, Quantum Circuit Born Machines, on financial data. And they claim quantum advantage, but the claim is just there; you need to look at the code.

Steve: Again, it has to come down to what's fair. What's the advantage over what? That's always my question.

Santanu: Exactly.

Steve: The best classical algorithm or the worst one?

Santanu: So, exactly. I agree with that.

Dan: Was it a comparison issue in that one? Were they comparing something against a manual process, or...?

Santanu: Nah, it's just that they said they used some classical optimizers. They use a QCBM, which is basically gate-based computing, and then they said they optimized that output with annealing, but the details are very sketchy.

Yeah, those are the papers that interest me, because I try to drill down to see where and how they got that. As enthusiastic as I am about quantum machine learning myself, you need some validation to convince yourself.

Dan: Okay, slightly different topic.

We have touched on the hype cycle.

Thanks for bringing it up.

I just wanted thoughts from both of you. The hype around machine learning at the moment is bringing a bit of fear into some people; some of that's irrational, some of it isn't.

When you add the word quantum on the front, it makes it sound... like the hyper-futuristic, powerful version of machine learning, I think.

So just some thoughts about the societal impact, and possibly the kind of negative connotations that could be drawn from the name on its own.

Steve: I'm just thinking about it. Lately, what I'm thinking is that as soon as you put quantum in front of anything, you might even be safe to assume it's going to be worse. It doesn't necessarily mean it's going to be any better.

As soon as you put the Q in front of it, it's 50 years away and noisy, so maybe that makes it safer and less threatening when you put quantum in front.

Dan: That's good to hear you think that as the quantum physicist, but what about the general public?

Maybe we just need to
keep banging the drum.

Steve: True, it's the general public.

Dan: Quantum is slow and may never come.

Steve: Especially since the quantum leap is not a big jump, right? It's the smallest possible jump you can make, I guess, if you want to quantize leaps. But generally I agree with you. Quantum sounds more scientific, it's a more loaded term, and it makes people scared, even more scared.

So we have quantum computers that
will break security of the internet.

We need to change everything.

Things like that.

We have algorithms that will change field XYZ in the next two years. People promise these things and I don't know if it's true. But there are two sides to the story.

The hype is in some sense necessary
because it brings attention.

It brings funding, it brings jobs.

Without the hype, it's too dull and
people don't get interested in it at all.

But then they have to keep the
expectations real, at least to

the people who are working on it.

Or putting money into it, let's say.

You don't want to trick your investors. Not too much, at least; at least give them some hope. Maybe the promised timeline is half as long as it should be, but something should be possible. You shouldn't promise impossibilities.

Dan: Yeah.

Some great points.

I had somebody else talking about a similar topic recently, saying that the industry at the moment is in an education and preparation phase, right? It's more about spreading the word, trying to bring up the number of people, increase the skills in the market, improve awareness and so on.

We're not at the point where you've got more of a scaling issue and it's about commoditization and all that kind of stuff. It's still about education and planning.

Santanu: That's true.

Where we are, I think, is: when I was very little, God's gift to computing was a Commodore 64. I think that's where quantum computing is right now. But,

Dan: not even there.

You Commodore 64, come on.

You could go to the shop and buy one.

Santanu: Yeah.

Back then you could. But now you can get a two-qubit quantum computer from SpinQ. You can order it over Amazon and it'll get delivered. It's two qubits, education only, but it's real hardware and it'll run on your tabletop. I think that's where it's going.

But then again, consider that people had the Commodore 64 as God's gift to computing back in 1986 or 87, if I remember, and look at how much computation developed in the next 20 years, how fast classical computing became. Once you get the first step in, once you know the starting point, it doesn't take long. But you have to break into that starting point.

Anyway, going back to your question, I agree with you. For most people, given the fear around normal AI already, the fear of quantum AI is massive.

But I think there's another side to the coin, and that is: with all these AI platforms, classical AI, generative AI or whatever else is going on, the more complex the models get, the wider the attack surface becomes, from a purely security perspective.

And I'm not just talking about DDoS kinds of attacks that corrupt the system; I'm talking about poisonous data in there.

If someone finds a way to feed your classical AI model, or the data lake or whatever you have, with corrupted data, and does it quietly without being detected, you're done. You basically end up training on noise or poor data or whatever. It could be a competitor. And at the end of the...

Dan: Biased outcomes.

Whereby the AI is making decisions in a particular way that is...

Santanu: Exactly, they can manipulate the bias. From that perspective, the security perspective, a quantum AI, a quantum environment, should offer far better security, if not by quantum encryption, then just by default.
So that's my take on it.

Okay.

If it gets into the hands of a mad scientist, it might cause some problems, but this is all the gift of technology. It's hard to stop. That's human nature, unfortunately.

Dan: Yeah.

Nice thought to end it on.

Yeah.

Brilliant.

Listen, I've had a fantastic time talking to you, Santanu. Thank you very much for...
Santanu: Thank you.

Thanks for having me.

Dan: Steve, do you have
any closing comments?

Steve: No, that's it for me.

Thanks a lot for this.

It was very educational for me.

I've dabbled in QML topics, but never really heard about it from the source. So...

Santanu: I'm not the biggest source, but thank you. Thank you for the kind...

Dan: And I'm sure we're only
scratching the surface as well.

Yeah, lots more to think about.

Thank you Santanu.

Really appreciate it.

Take care of yourself.

Bye bye.

Santanu: Thank you.

Steve: thank you.

Dan: I'd like to take this moment to thank you for listening to the podcast. Quantum networking is such a broad domain, especially considering the breadth of quantum physics and quantum computing as an undercurrent that's easy to get sucked into. So much is still in the research realm, which can make it really tough for a curious IT guy to know where to start.

So hit subscribe or follow on your podcast platform, and I'll do my best to bring you more relevant topics in the world of quantum networking. Spread the word. It would really help us out.

Creators and Guests

Dan Holme
Host
Quantum curious technologist and student. Industry and Consulting Partnerships at Cisco.

Stephen DiAdamo
Host
Research scientist at Cisco, with a background in quantum networks and communication.