Post Quantum Cryptography Monologue

Dan: Hello and welcome to The Quantum Divide.

This is the podcast that talks about the literal divide between classical IT and quantum technology, and the fact that these two domains will, and need to, become closer together.

We're going to try and focus on networking topics.

Quantum networking actually is
more futuristic than perhaps

the computing element of it.

But we're going to try
and focus on that domain.

But we're bound to experience many
different tangents both in podcast

topics and conversation as we go on.

Enjoy.

Good day to you.

This episode of The Quantum Divide
is a monologue from me, Dan Holme,

on post quantum cryptography.

And I'm doing this because of the importance of this technology in the coming few years. The Quantum Divide is a networking podcast and we haven't really started on security yet, and this is one of the main use cases of quantum technology in terms of the rate at which it's going to hit the market, especially in the networking domain.

I'm going to start with PQC, post quantum cryptography, which is almost adjacent to quantum rather than quantum itself, but it warrants a session to talk through, because the whole concept behind PQC has been built because of quantum computers.

So let me start with first of all
describing what I mean by that.

So the quantum computing world is accelerating the capacity of quantum computers to run algorithms that can solve very niche types of problems, including highly parallelized algorithms, using the quantum physics of the qubit. Basically, this capability is giving those with a quantum computer the ability to essentially break the current cryptography used in a large part of applications and end-to-end communication across the internet and private networks today.

However, the technology is not there yet; it's on the horizon.

So I'm just going to dig into
that a little bit for you.

And I'm bound to make some mistakes.

Because this is not my natural
field, but I'm fascinated by it.

So feel free to send me an email or
something and tell me where I was wrong.

It might open up doors for
later episodes of this podcast.

So essentially, the reason PQC exists is because RSA-2048, ECC and the Diffie-Hellman protocols could all be broken using a thing called Shor's algorithm or the Grover search algorithm.

So these are two different algorithms.

The Grover algorithm is a search algorithm which allows searching through unstructured data with a quadratic speedup over classical methods, while Shor's algorithm utilizes the exponential advantage a quantum computer has on certain problems to solve what is called the integer factorization problem. That's where two large prime numbers are taken and multiplied together to create a very large integer. That is the premise of RSA, and Shor's algorithm can do the reverse, recovering the primes from the product.
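
To make that asymmetry concrete, here's a minimal Python sketch of my own; the primes are tiny, well-known toy values, nothing close to real RSA sizes.

```python
# Minimal sketch: multiplying two primes is instant, while recovering them from the
# product by naive trial division already takes noticeable work at toy sizes, and is
# utterly infeasible classically at RSA-2048 sizes.
p, q = 999_983, 1_000_003          # two well-known toy primes
n = p * q                          # easy direction: one multiplication
print("public modulus:", n)

def factor(modulus: int):
    """Naive trial division: fine for this ~40-bit toy modulus, hopeless for 2048 bits."""
    d = 2
    while d * d <= modulus:
        if modulus % d == 0:
            return d, modulus // d
        d += 1
    return None

print("recovered factors:", factor(n))   # hard direction: search for a divisor
```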

So why is that mathematical
calculation, which sounds very simple,

albeit with very large numbers,
why is that such an issue for RSA?

Essentially, it's because quantum computers can perform calculations in parallel and efficiently factor these large numbers, which form the basis for the security of these classical algorithms. That necessitates the development of new encryption methods that can withstand attacks from quantum adversaries.

This is, whilst not broken yet, a huge risk for the industry, and not only a risk, but something that is going to impact the industry soon enough.

Looking at the rate of development of quantum computers: IBM's latest processor is 433 qubits, and they have quantum computers on the one, two, three year horizon going above a thousand and up to around 2,000 qubits. The quote that I saw in terms of numbers was that you need something like 2 million qubits to break RSA-2048 if those qubits are error prone, not error corrected; if you have the ability to do error-corrected qubits, then you only need about 5,000 or so.

But ultimately the point is, the industry is going in a direction where, eventually, these algorithms that we use in all of our communications today are going to be susceptible to being broken if somebody can access the packets flying between the users.

Now, to add to this, there's the paranoia that there is HNDL going on. HNDL stands for harvest now, decrypt later, which is a behavior you might think is crazy. Why would anyone do that? How much storage are you going to need to store encrypted data if you don't even know what it is and you just capture everything?

Well, that's alright.

But there are two types of secrets.

There are secrets which are perhaps transient in terms of their timeframe, their lifetime, and once they've gone, then it's not so much of an issue; they're more like a real-time secret. But things which are longer term carry much higher risk.

And storage is cheap, and it may be said that there are many different vulnerabilities in various nation state networks, so perhaps there is all this kind of harvest now, decrypt later going on.

So this is what's
accelerating the move to PQC.

And let's have a quick
look at the PQC algorithms.

So we'll explore several post quantum encryption algorithms that have shown promise in withstanding quantum attacks.

It's worth saying that these algorithms have been approved by NIST, or rather there's an approval process by NIST where they will make an eventual recommendation for which protocols to use for both key encapsulation mechanisms and signatures.

But it is worth stating at this point that some of these protocols haven't been around that long, so there's still a risk that they could be broken fairly quickly if new vulnerabilities are found through cryptanalysis or through brute force, somehow using quantum computers.

It's one of those things where the industry has to do its best to take steps in the right direction.

But it may be that there are other
steps that are required later on.

And that brings about the concept of cryptographic agility. I'll come on to some recommendations at the end of the podcast, but ultimately organizations need to prepare their infrastructure to test and implement change protocols for rolling out PQC algorithms where they deem necessary. But they also need to consider cryptographic agility, which is the ability to change between different PQC schemes as time progresses. If one of the schemes chosen by NIST is implemented in your network and needs to be replaced because of a future vulnerability or some other type of issue, then cryptographic agility means you have the capabilities in your infrastructure to make those changes without doing some kind of huge rip-and-replace activity.

I'll come back to that later.

So for now, let's just talk briefly
about the different algorithms.

So first of all, lattice
based cryptography.

So essentially what I'm doing here is describing the mathematical foundation of the algorithms or protocols which will replace the protocols based on integer factorization and discrete logarithms, such as RSA, Diffie-Hellman and ECC.

So the first one is lattice based algorithms.

So instead of using prime numbers, lattice based cryptography utilizes the mathematical properties of lattices, which are grids or networks of points in multidimensional space.

It's quite easy to imagine a two dimensional space with a big map of dots, and the algorithm basically provides security based on the hardness of certain lattice problems, such as the shortest vector problem and the learning with errors problem.

And these problems ask, given particular basis vectors, what calculations you need to do to get to other places on the grid.

Now, you can imagine that in 2D, no problem.

But the thing that makes lattice problems particularly hard for classical computing is when you start scaling the dimensions.

So from 2D to 3D, you can imagine
that quite easily, I'm sure.

We live in a 3D world.

But once you get up to 5, 6, 100, 500 dimensions, it's impossible for you or anybody really to imagine what that would look like, but mathematics provides us the tools to describe it.

And therefore, calculations and problems can be formed in that domain which are very tough for classical computers to solve.

So in lattice based cryptography, I
mentioned shortest vector problem.

This involves finding the shortest
non zero vector within a lattice.

The lattice is a mathematical structure.

It's formed by a grid of points
in multi dimensional space.

Solving this SVP is computationally
difficult, and the security of lattice

based cryptography schemes relies on
the assumption that finding the shortest

vector in the lattice is challenging.
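
Here's a minimal sketch of the idea, assuming a made-up two-dimensional basis of my own choosing; real schemes work in hundreds of dimensions, where this brute-force enumeration is hopeless.

```python
# Tiny brute-force illustration of the shortest vector problem in two dimensions:
# enumerate small integer combinations of two basis vectors and keep the shortest
# non-zero result.
import itertools
import math

b1 = (201, 37)       # toy basis vector 1 (arbitrary example values)
b2 = (1648, 297)     # toy basis vector 2

best, best_len = None, math.inf
for x, y in itertools.product(range(-50, 51), repeat=2):
    if (x, y) == (0, 0):
        continue                                   # skip the zero vector
    v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
    length = math.hypot(*v)
    if length < best_len:
        best, best_len = v, length

print("shortest lattice vector found:", best, "length", round(best_len, 2))
```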

The other problem I mentioned
was learning with errors.

This problem, also known as
LWE, is another key component

of lattice based cryptography.

It's based on the difficulty of solving
systems of linear equations with errors.

In the LWE problem, random errors are
introduced into linear equations, making

it hard to recover the original values.

The security of this lattice
based scheme relies on the

hardness of solving this problem.
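
Here's a toy LWE instance, just to show the shape of the problem; the parameters are far too small to be secure and are purely illustrative.

```python
# Toy learning-with-errors instance: publish (A, b) where b = A*s + e (mod q).
# Without the small error vector e, the secret s could be recovered by ordinary
# linear algebra; the noise is what makes recovery hard.
import random

q, n, m = 97, 4, 8                                               # modulus, secret length, equations
s = [random.randrange(q) for _ in range(n)]                      # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public random matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]                # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

print("public matrix row A[0]:", A[0])
print("public values b:       ", b)
print("secret s (never sent): ", s)
```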

Just like RSA-2048 relies on a problem classical computers can't solve, these PQC algorithms are about creating problems that even quantum computers can't solve.

Another prominent class of these PQC
algorithms is code based cryptography.

These algorithms rely on error correcting codes, which are mathematical constructs used to detect and correct errors in data transmission.

So code based cryptography utilizes the difficulty of decoding certain types of error correcting codes to provide security.

Examples of this include, and I'm probably going to say these wrong, the McEliece cryptosystem, the Niederreiter cryptosystem, and the bit flipping key encapsulation scheme, BIKE.

I'm not a crypto nerd and there's so many different algorithms and systems to know. I'm just scratching the surface to help me and you out.

But obviously you can go very
deep on any of these individual

algorithms if you felt inclined.

So moving on, the next set
of cryptography problems are

multivariate polynomial equations.

These are your classical polynomial equations, obviously with many different powers to make them complicated enough, but they employ polynomial functions with multiple variables, so that means, again, it's a very hard problem to try and solve these equations.

And examples of the schemes used here are hidden field equations and one called the Rainbow scheme. One I've seen previously is called SPHINCS+, which is actually a stateless hash based signature scheme rather than a multivariate one, and I'll come on to that.

A bit more on multivariate
polynomial equations.

Basically, a multivariate polynomial equation is where you have multiple variables raised to various powers and combined through addition and multiplication operations.

They're good fun in maths, fun being
an operative word, obviously in

air quotes, but solving systems of
multivariate polynomial equations is

computationally challenging when the
number of variables and the degree of

the polynomial increase, as I mentioned.

So the security of these schemes is based on the difficulty of solving the problem.

Next, we look at hash based cryptography,
also known as one time signature schemes.

It's another category of
post quantum algorithms.

These schemes rely on hash functions, which are mathematical functions that transform data into a fixed size output called a hash value, and on things called Merkle trees, which for me sounds a bit like Merkin, but let's not go there, which are data structures.

They're used for efficiently checking
the integrity of large data sets.

Hash based cryptography utilizes this
one time signature where a single key

is used to sign a message only once.

Yeah, a hash function is a mathematical function again: it takes an input of any size and produces a fixed size output.

The output is typically a string of
bits which is unique to the input data.

Hash functions should have properties
like pre image resistance, where

it's computationally infeasible
or impossible to find the original

input from the hash value.

So it's extremely
difficult to go backwards.

And it's very difficult to
find two different inputs

that produce the same hash.
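
A quick illustration of those properties using Python's standard hashlib:

```python
# Every input, short or long, maps to a fixed-size 32-byte digest, and a tiny change
# to the input produces a completely different value.
import hashlib

for message in (b"hello", b"hello!", b"a" * 10_000):
    digest = hashlib.sha256(message).hexdigest()
    print(f"{len(message):>6}-byte input -> {len(digest) // 2}-byte digest {digest[:16]}...")
```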

Next, the Merkle trees. A Merkle tree is basically a tree of hashes.

It's a hierarchical structure built from these different hash values.

It allows checking the consistency of large datasets. Each leaf in the tree represents the hash value of a small portion of the data.

And each internal node represents the hash value of its child nodes.

So you've got this whole,
that's why they call it a tree,

all of these interrelations
between these different hashes.

And overall it provides a
verification of the data.

By using a tree, obviously an attacker has to do more computations on the hashes to try and work out the inputs, which makes it much more difficult to tackle.
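
Here's a minimal Merkle root computation, using SHA-256 and duplicating the last node when a level has an odd count (one common convention, not the only one):

```python
# Tiny Merkle tree: hash each data block to make the leaves, then repeatedly hash
# pairs of child values until a single root remains. Verifying that root verifies
# the whole data set; change any block and the root changes.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    level = [h(b) for b in blocks]                  # leaf hashes
    while len(level) > 1:
        if len(level) % 2:                          # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])         # parent = hash of its two children
                 for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
print("Merkle root:", merkle_root(blocks).hex())
```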

So I'm just going to rewind a little bit. I've mentioned all these different protocols, but let's just talk briefly about why it is that RSA-2048, ECC and the Diffie-Hellman protocols are insecure.

Really, it's because of the way these protocols are used in the process of authentication and encryption. Those two functions are basically the key exchange mechanism and the creation of signatures to prove identity. The combination of those two things is the core of asymmetric encryption, the public and private key encryption that's used in many of our applications, from sending your bank details to buy something online to connections to your healthcare provider storing your personal data.

So if these can be broken,
then the keys can be revealed

which is why PQC is necessary.

What's a KEM? Well, a key encapsulation mechanism.

It's a cryptographic technique used
to exchange symmetric keys between

two parties over an insecure channel.

It's commonly used in hybrid encryption
where there's a combination of

asymmetric and symmetric encryption.

The KEM encapsulation results in a fixed length symmetric key that can be used in one of two ways: it can be used to create a data encryption key, or a key encryption key. It's fairly simple to understand once you've had a good look at it, but there's a whole process for the way the keys are sent between the client and the server.

I think of these as the endpoints; in the quantum world, that would be Alice and Bob.

But of course, it's a combination of creating a key pair, sharing the public key, and then feeding back a shared secret based on some encapsulation at the other end; with Diffie-Hellman, it's a little bit different.

But ultimately it is the
exchange of passwords.

You can think of it that way.
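
To picture that flow, here's a minimal sketch of the keygen / encapsulate / decapsulate pattern. The internals are a textbook RSA-KEM with tiny, insecure parameters, purely as a stand-in; swapping in a PQC KEM such as Kyber changes the maths inside the three functions, but the message flow between Alice and Bob keeps this shape.

```python
# Toy KEM call pattern: keygen / encapsulate / decapsulate. The internals here are a
# textbook RSA-KEM with tiny, insecure parameters -- a stand-in to show the flow only.
import hashlib
import secrets

def keygen():
    p, q, e = 999_983, 1_000_003, 65_537            # toy primes, NOT secure
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))               # private exponent
    return (n, e), (n, d)                           # (public key, private key)

def encapsulate(public_key):
    n, e = public_key
    r = secrets.randbelow(n - 2) + 2                # fresh random value
    ciphertext = pow(r, e, n)                       # encapsulate it under the public key
    shared_secret = hashlib.sha256(str(r).encode()).digest()
    return ciphertext, shared_secret

def decapsulate(private_key, ciphertext):
    n, d = private_key
    r = pow(ciphertext, d, n)                       # recover the random value
    return hashlib.sha256(str(r).encode()).digest()

pk, sk = keygen()                   # Bob publishes pk
ct, alice_secret = encapsulate(pk)  # Alice sends ct, keeps the shared secret
bob_secret = decapsulate(sk, ct)    # Bob recovers the same shared secret
assert alice_secret == bob_secret
print("shared secret:", alice_secret.hex())
```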

So I mentioned signatures as well.

Digital signature schemes are used to
authenticate the identity of a sender.

They detect unauthorized
modifications also, and they

underpin trust in the system.

Signatures, similar to key encapsulation mechanisms, also depend on this public private key pair, and hence a break in the public key cryptography will also affect these digital signatures.

So the algorithms for
both have to be changed.

Now, for PQC there are algorithms for key encapsulation mechanisms.

The primary one is called CRYSTALS-Kyber. It's a module learning with errors based key encapsulation mechanism, that being the name of the underlying lattice problem, and it's the one that's been selected by NIST at this point in time.

And for signatures, there's CRYSTALS-Dilithium, and that's a lattice signature scheme. It's different to Kyber, but it's in the same portfolio of algorithms.

And there's one called Falcon,
which is also lattice based.

And SPHINCS+, I mentioned, this is a
stateless hash based signature scheme.

And what I love about this is that CRYSTALS-Kyber, CRYSTALS-Dilithium and Falcon all sound like they come from sci-fi, kyber crystals from Star Wars and dilithium from Star Trek, and I haven't yet looked into why that is, but there's obviously a nerd somewhere in the teams that developed Kyber, Dilithium and Falcon. Maybe Falcon is not related to the Millennium Falcon, but it is for me, I think. It really feels like somebody important somewhere had the power to agree the naming of these, or maybe there's a committee with a bunch of sci-fi nerds on it. They did a brilliant thing to pick those names. I'm very chuffed.

So my understanding with PQC and these key encapsulation mechanisms is that really what we're doing here is a bit of a rip and replace. The protocols behave in different ways, but ultimately they're used as an encryption method, and the actual sequence flow of messages and so on, I don't think that changes. I'll check on that one, but that's my understanding for KEMs.

And then when we look at signatures, there are additional constructions defining the security of post quantum signatures.

So let's look into that a little bit more.

Just like the previous way, or the current way I should say, of signing messages and keys in the sequence of messages used to negotiate encryption, what's important here is that the scheme guarantees that an adversary, even with access to the signing CA or a signing oracle, cannot forge a valid signature.

This is the whole point of it, because signatures were really developed for this: with just the public and private key set up, somebody could essentially adopt the identity of somebody else, share their own public key instead of the original individual's, pretend to be them, and then continue with data transmission over a negotiated encrypted path.

But with the signature, there's a series of things which are done in the signing which prove the identity, and it's done through PKI, the public key infrastructure. It's done through a CA, a certificate authority, which is a third party organization holding the certificate.

And it basically brings in a third
party to approve the use of the

signature for that particular traffic.

And by doing that, by checking the signature, you verify the identity of the end user.

And there's very strong security
governance around the CAs.

You can get private and public CAs.

Your architecture may vary, but
that's basically what it looks like.

So looking a bit more at the protocols for signatures: CRYSTALS-Dilithium is based on a lattice problem, again the module learning with errors problem, and the design of the algorithm is based on the hardness of these module lattice problems.

Additionally, it offers
deterministic and randomized signing.

It's essentially ticking all the boxes
for the certificate requirements.

Falcon, on the other hand, is based on a hash-and-sign lattice construction. So it's still using lattices, but it hashes first and then signs. You get both hash-then-sign and sign-then-hash, which I'll talk about in a second.

But the main principle
for Falcon is compactness.

It was designed in a way that achieves a minimal total memory requirement: the sum of the signature size plus the public key size is all that's needed in memory.

And this is possible due to the
compactness of the lattices.

So it's very efficient.

But the downside is that the algorithms need floating point arithmetic support, so there's a different requirement on the operating system and whatever mathematical libraries are available at that end node.

Really the performance characteristics
of these two signatures may differ

based on different implementations
and the hardware platform.

Generally, Dilithium is known for its relatively fast signature generation, while Falcon can provide more efficient signature verification.

In most cases, this probably doesn't matter, but when you're designing low power or power optimized end devices, then of course every optimization you can make for power consumption is useful.

So those can come into account. Plus, with signing and key encapsulation mechanisms, it used to be that the keys and the signing for negotiating the connection, the encrypted tunnel if you like, would have been done every now and again. But in order to increase security, quite often that is now happening at the beginning of every type of operation in a device, and that could be your web browser, it could be your connected car, it could be your phone, it could be your toaster.

And essentially, the more and more
IoT devices there are, then the more

and more key management is required.

And for a network that's managing
a very large IoT estate, there may

be millions of these per second at
the very pointy end of the spectrum.

The performance of the CA is important, and the performance of any kind of centralized services also needs to be built into the architecture and design, and therefore the performance characteristics of the different certificate and signature protocols may be different.

Just briefly on hash-then-sign versus sign-then-hash.

Essentially, it's just the order of operations, but if you hash the message before signing it, it gives you an additional layer of security by ensuring that only a fixed size digest of the message is signed rather than the entire message. So there's an optimization in terms of the message size.
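
A minimal sketch of hash-then-sign; the signing maths is a toy textbook RSA stand-in with insecure parameters, just to show the order of operations rather than any particular PQC scheme.

```python
# Minimal sketch of hash-then-sign: the signer signs a fixed-size digest of the
# message rather than the message itself. Toy RSA parameters, illustration only.
import hashlib

p, q, e = 999_983, 1_000_003, 65_537
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def sign(message: bytes) -> int:
    digest = hashlib.sha256(message).digest()       # hash first ...
    m = int.from_bytes(digest, "big") % n           # fold the digest into the signing domain
    return pow(m, d, n)                             # ... then sign the digest

def verify(message: bytes, signature: int) -> bool:
    digest = hashlib.sha256(message).digest()
    m = int.from_bytes(digest, "big") % n
    return pow(signature, e, n) == m

msg = b"firmware-update-1.2.3"
sig = sign(msg)
print(verify(msg, sig))          # True
print(verify(b"tampered", sig))  # almost certainly False (different digest)
```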

Yeah, there's pros and cons
here and that's often the case.

One thing I wanted to talk about
was what it is that they recommend.

So there will be performance trade-offs because of the size and the processing needed to perform these new PQC algorithms.

Let me give you an example.

In terms of size, if you look at the signature schemes, traditional RSA-2048 is 256 bytes for the key and the signature, whereas Dilithium starts higher: Dilithium2, for example, has a 1,312 byte public key. And so on.

So the key sizes are much larger. And just like traditional algorithms such as RSA and AES come in different key lengths, it's the same for Kyber, Dilithium, Falcon, SPHINCS+ and so on.
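
For a rough feel for the numbers, here's a small comparison; the PQC byte counts are quoted from memory of the NIST round-3 parameter sets, so treat them as indicative rather than authoritative.

```python
# Rough size comparison in bytes between RSA-2048 and a few PQC parameter sets.
sizes = {
    # name:        (public key, signature or ciphertext)
    "RSA-2048":    (256, 256),
    "Dilithium2":  (1312, 2420),    # signature scheme
    "Falcon-512":  (897, 666),      # signature scheme
    "Kyber-768":   (1184, 1088),    # KEM: second figure is the ciphertext
}
for name, (pk_bytes, other_bytes) in sizes.items():
    print(f"{name:<12} public key {pk_bytes:>5} B   signature/ciphertext {other_bytes:>5} B")
```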

NIST have a series of levels for the required strength of the system, and these are really based on the complexity needed to break an AES or SHA algorithm of equivalent strength.

So they have these five levels.

The first one starts around the hardness of AES-128, which corresponds to Kyber-512 and Falcon-512, with SPHINCS+-128 for signatures. Then halfway up at level three, you've got the equivalent of AES-192, and the recommended PQC algorithms there are Kyber-768 and Dilithium3. And finally at the top end, level five, you've got AES-256, and the equivalents there are Kyber-1024, Falcon-1024, Dilithium5, and SPHINCS+ with SHA-256.

The more secure you want the system, the longer the keys and the more complexity you need in the endpoint, and the more processing power and electrical power are required to do that.

One thing I wanted to say about the migration to PQC, moving on to migration a little bit here, is that
neither the traditional algorithms

nor the post quantum algorithms
are really fully trusted to protect

data for the required lifetimes.

So the traditional algorithms like RSA and elliptic curve will fall to quantum cryptanalysis eventually and will be broken by quantum computers, but the post quantum algorithms face major uncertainty about the mathematics, the vulnerabilities, the compliance issues, and the hardware and software that's going to be built to run these algorithms.

So there hasn't been
much time for maturing.

The state of play is that NIST is going to make formal recommendations this year, although I think they're only working out the final algorithm verification now.

So the ones that they have selected already, mostly the ones I've mentioned on this pod, can be tested now: you can download these, you can implement them into your system.

Many hardware vendors are
already supporting these

and have programs for them.

Remember that when it comes to implementation, it's not just about rip and replace, it's about thinking about key management. It's a multi layered approach when it comes to how the keys are used, and in some appliances, some systems, it may be quite simple to update them, but others perhaps not so easy, and there may be multiple changes needed up and down the security, authorization and authentication stack.

So in terms of migration, essentially different strategies are available, as you'd expect in this complex world. It's important to consider backward compatibility, the impact on performance, the risk of potential attacks during a transition phase, and the support of the infrastructure that you have, software and hardware.

One approach is called hybrid encryption, where a combination of both classical and post quantum algorithms is used. This allows for a gradual transition, ensuring compatibility and also protecting against the harvest now, decrypt later attacks and concerns, or paranoia, by encrypting the data in such a way that it's using the PQC, but you can still use your classic TLS supported group extensions and connectivity for the connection.
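
Here's a minimal sketch of that hybrid combination; the two shared secrets below are just random placeholder bytes standing in for, say, an X25519 output and a Kyber-768 output.

```python
# Hybrid idea: derive the session key from BOTH a classical shared secret and a PQC
# shared secret, so an attacker has to break both schemes to recover the key.
import hashlib
import hmac
import secrets

classical_ss = secrets.token_bytes(32)   # stand-in for an ECDH/X25519 shared secret
pqc_ss       = secrets.token_bytes(32)   # stand-in for a Kyber shared secret

# Simple HKDF-extract-style combiner: the session key depends on both inputs.
session_key = hmac.new(b"hybrid-kdf-label", classical_ss + pqc_ss, hashlib.sha256).digest()
print("hybrid session key:", session_key.hex())
```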

Another way to use a hybrid approach is to just use PQC for sensitive data. Where you're using tunneling for traffic whose data only has value for a short period of time, more risk can be taken on that than on something which is carrying sensitive data that is long-lived, or perhaps owned by a third party, or carries IP of some kind. By that I mean intellectual property.

Another strategy is
focusing on key exchange.

So as compromised keys can compromise the
security of the entire system, focusing

there, first of all, is a good approach.

So use PQC key exchange mechanisms, as I mentioned, replacing the Diffie-Hellman algorithms with lattice based cryptography or the code based key exchange that I mentioned.

These mechanisms provide secure ways
of establishing shared secret keys

between parties, even in the presence of
quantum adversaries or quantum computers

that are able to decrypt traffic.

You could think of that as a
first kind of line of defense.

So a quick comment on security considerations. Classical cryptanalysis exploits weaknesses in algorithm design, mathematical vulnerabilities or implementation flaws, whereas quantum cryptanalysis is different because it harnesses the power of quantum computers to solve the underlying mathematical problems rather than exploiting implementation vulnerabilities.

So both pose threats to all algorithms, including PQC, and developing and adopting cryptographic algorithms resilient against these threats is crucial.

Recent side channel attacks on implementations, using deep learning based power analysis, have also shown that one needs to be cautious while implementing the required PQC algorithms in hardware. Two of these side channel attacks were on Kyber implementations and one was on Saber. Essentially, these side channel attacks are emerging, and more may emerge.

I want to return to cryptographic agility. This is relevant for both classical and quantum cryptanalysis. Really, it's the capability of your organization to adapt to emerging threats: to adopt stronger algorithms, comply with new standards and so on.

And this really needs to come
out in the planning and risk

management of the network.

The risk management comes
first, but then the strategy for

transformation comes afterwards.

And the cryptographic agility
should be built into the

mindset of the strategy, right?

Because if there are algorithms
which are broken in the future,

then the systems need to be able
to take on new ones as necessary.
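
One way to picture that in code is a small registry of signing algorithms keyed by name, so the active scheme is a configuration value rather than hard-wired calls; the algorithm names and handlers below are placeholders, not real library calls.

```python
# Sketch of one crypto-agility pattern: route all signing through a registry keyed
# by algorithm name, so changing schemes is a configuration change, not a rewrite.
from typing import Callable, Dict

SIGNERS: Dict[str, Callable[[bytes], bytes]] = {}

def register(name: str):
    def wrap(fn: Callable[[bytes], bytes]):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("rsa-2048")            # legacy scheme (placeholder implementation)
def sign_rsa(message: bytes) -> bytes:
    return b"rsa-signature-over:" + message

@register("dilithium3")          # PQC scheme (placeholder implementation)
def sign_dilithium(message: bytes) -> bytes:
    return b"dilithium-signature-over:" + message

ACTIVE_ALGORITHM = "dilithium3"  # flip this one setting during (or after) migration
print(SIGNERS[ACTIVE_ALGORITHM](b"hello"))
```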

In summary, what I'm trying to say through all of this is that it's time to just have a look at this in your organization, even if it's just one or two individuals given a side project to do an assessment of all of the algorithms that are in place and the data that each is protecting, to give you a bit of a snapshot of your overall exposure.

Quite often organizations are going
through their security audits.

And they're really taking a tick box
approach to ensure that they prove

that they're doing the right thing,
maybe in one particular place, rather

than everywhere, or they're just
using the bare minimum necessary.

And by doing that you're totally
reactive to this problem.

And it's highly likely that's going to mean pain in the future, when there's time pressure.

So in the meantime, you can do the very basics, which are really to just invest in some research and allocate some resources to PQC. Connect with your vendors and academia if necessary, but there's a lot of information out there for you to do these first two steps, and that really prepares you. It gives you a feeling of confidence that there isn't some quantum Armageddon coming; you might pick that idea up in some of the hype material that's out there, so ignore it.

But take it seriously, and it's time
to have a look at your infrastructure.

So I'm going to conclude now. It's important to remember that this threat of quantum computing to our current cryptographic systems is real.

But by exploring the algorithms,
understanding the migration

strategies and so on, you can ensure
a secure and resilient future in

the face of quantum adversaries.

And I would say, as a follow on, I plan to have some similar sessions like this, looking at QKD and other cryptography orientated systems, solutions and infrastructures that are evolving around the use of quantum technology in the cryptographic world.

So thanks for your attention
if you've made it this far.

Thank you very much, and I'm happy to take any requests, follow-ups to this, or any additional changes or comments where I have perhaps not hit the mark.

Brilliant.

Thank you.

Bye bye.

I'd like to take this moment to thank
you for listening to the podcast.

Quantum networking is such a broad domain, especially considering the breadth of quantum physics and quantum computing as an undercurrent that it's easy to get sucked into. So much is still in the research realm, which can make it really tough for a curious IT guy to know where to start.

So hit subscribe or follow me on your podcast platform and I'll do my best to bring you more relevant topics in the world of quantum networking.

Spread the word.

It would really help us out.

Creators and Guests

Dan Holme, Host
Quantum curious technologist and student. Industry and Consulting Partnerships at Cisco.