Another common approach to storing key information is to encrypt the keys and store them in encrypted form. The same approach is often used to distribute key information in cryptographic networks.

The need to store and transmit key information encrypted under other keys led to the concept of key hierarchies.

The hierarchy of key information can include many levels; the most commonly distinguished are:

* master keys;

* key encryption keys;

* working (session) keys.

Session keys are at the lowest level and are used to encrypt data. When these keys need to be securely transferred between network nodes or stored securely, they are encrypted with keys of the next level: key encryption keys.

At the top level of the key hierarchy is the master key. This key is used to encrypt key encryption keys when they must be stored securely on disk. Usually only one master key is used on each computer; it is kept on external media, as a rule protected from unauthorized access.

The value of the master key is fixed for a long time (up to several weeks or months). Session keys change much more often; for example, when building crypto-protected tunnels they can be changed every 10-15 minutes, or after a given volume of traffic has been encrypted (for example, 1 MB).
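The three-level hierarchy can be sketched in Python. The XOR keystream "cipher" below is only a toy stand-in for a real key-wrapping algorithm such as AES, and all key values are illustrative:

```python
import hashlib
import secrets

def wrap(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream 'cipher' used for key wrapping (a stand-in for
    a real algorithm such as AES key wrap)."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

unwrap = wrap  # XOR keystream: unwrapping is the same operation

# Three levels of the hierarchy
master_key = secrets.token_bytes(32)   # kept on protected external media
kek = secrets.token_bytes(32)          # key encryption key
session_key = secrets.token_bytes(32)  # encrypts the actual data

# A session key travels or rests on disk only wrapped under a KEK...
wrapped_session = wrap(kek, session_key)
# ...and the KEK itself is stored on disk only wrapped under the master key.
wrapped_kek = wrap(master_key, kek)

assert unwrap(master_key, wrapped_kek) == kek
assert unwrap(kek, wrapped_session) == session_key
```

Only the wrapped forms ever leave protected storage; compromising a single session key does not expose the keys above it in the hierarchy.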

Key distribution is one of the most critical processes in key management. One of the main requirements for this process is concealment of the distributed key information.

The problem of key distribution is reduced to the construction of a key distribution protocol that provides:

1) mutual authentication of session participants;

2) confirmation of session validity to protect against replay attacks;

3) the use of the minimum number of messages in the exchange of keys.

Generally speaking, there are two approaches to the distribution of key information in computer networks:

1. Distribution of key information using one or multiple key distribution centers.

2. Direct exchange of session keys between users.

Distribution of key information using key distribution centers

This approach assumes that the key distribution center knows the keys to be distributed; therefore, all recipients of key information must trust the key distribution center.

The advantage of this approach is the ability to centrally manage the distribution of key information and even the access-control policy of remote entities toward each other.


This approach is implemented in the Needham-Schroeder protocol and in the Kerberos authentication protocol based on it. In these protocols, the distribution of key information and access control are based on credentials issued by the key distribution center. These protocols make it possible to securely distribute session keys even when the two interacting parties distrust each other.

Direct exchange of session keys between users

To use a secret-key cryptosystem for secure information exchange, the interacting parties need to establish a shared secret, on the basis of which they can securely encrypt information or securely generate and exchange session keys. In the first case, the shared secret is the session key; in the second, the master key. In either case, an attacker must not be able to obtain this secret by eavesdropping on the communication channel.

There are two main ways to solve the problem of generating a shared secret without revealing it to an attacker:

· use of a public key cryptosystem for encryption;

· use of the Diffie-Hellman public key distribution protocol.

The implementation of the first method is straightforward. Let's consider the second method in more detail.

Diffie-Hellman protocol

The Diffie-Hellman protocol was the first public key algorithm (1976). The security of this protocol is based on the difficulty of computing discrete logarithms.

Let users A and B want to work out a shared secret. To do this, they perform the following steps.

1. Parties A and B agree on the modulus N to be used, as well as a primitive element g, whose powers generate the numbers from 1 to N-1. The numbers N and g are public elements of the protocol.

2. Users A and B independently choose their own secret keys CK_A and CK_B (large random integers less than N, kept secret).

3. Users A and B calculate the public keys OK_A and OK_B from the corresponding secret keys using the following formulas:

OK_A = g^(CK_A) mod N,

OK_B = g^(CK_B) mod N.

4. Parties A and B exchange public key values over an insecure channel.

5. Users A and B form a shared secret K according to the formulas:

User A:

K = (OK_B)^(CK_A) mod N = (g^(CK_B))^(CK_A) mod N = g^(CK_B·CK_A) mod N.

User B:

K = (OK_A)^(CK_B) mod N = (g^(CK_A))^(CK_B) mod N = g^(CK_A·CK_B) mod N.

Key K can be used as a shared secret key (master key) in a symmetric cryptosystem.

Example 6.2.

Let's take the modulus N = 47 and the primitive element g = 23. Let users A and B choose their secret keys CK_A = 12, CK_B = 33. Then

OK_A = 23^12 mod 47 = 27, OK_B = 23^33 mod 47 = 33.

In this case, the shared secret is

K = 33^12 mod 47 = 27^33 mod 47 = 25.
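The numbers in Example 6.2 can be checked directly with Python's built-in fast modular exponentiation:

```python
# Check Example 6.2 using three-argument pow (square-and-multiply)
N, g = 47, 23            # public modulus and primitive element
CK_A, CK_B = 12, 33      # secret keys of users A and B

OK_A = pow(g, CK_A, N)   # public key of user A
OK_B = pow(g, CK_B, N)   # public key of user B

K_from_A = pow(OK_B, CK_A, N)   # shared secret as computed by user A
K_from_B = pow(OK_A, CK_B, N)   # shared secret as computed by user B

assert (OK_A, OK_B) == (27, 33)
assert K_from_A == K_from_B == 25
```

Both parties arrive at the same value K = 25 without ever sending their secret keys over the channel.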

The Diffie-Hellman public key distribution algorithm eliminates the need for a secure channel for key transmission. However, a guarantee is required that the recipient has received the public key from the sender from whom he expects it. This problem is solved with the help of digital certificates and digital signature technology.

The Diffie-Hellman protocol has found effective application in the SKIP key management protocol, which is used when building crypto-protected tunnels in the ZASTAVA product family.

Key management

In addition to choosing a cryptographic system suitable for a particular IS, an important issue is key management. No matter how complex and secure the cryptosystem itself is, it is based on the use of keys. While for confidential exchange between two users the key exchange process is trivial, in an IS where the number of users is in the tens or hundreds, key management is a serious problem.

Key information is understood as the totality of all active keys in the IS. If sufficiently reliable control of key information is not provided, then having taken possession of it, an attacker gains unlimited access to all information.

Key management is an information process that includes three elements:

* key generation;

* accumulation of keys;

* distribution of keys.

Let's consider how they should be implemented in order to ensure the security of key information in the IS.

Key generation

At the very beginning of the discussion of cryptographic methods, it was said that keys should not be chosen non-randomly for the sake of memorability. Serious ISs use special hardware and software methods for generating random keys. As a rule, pseudo-random number generators are used; however, the degree of randomness of their output must be quite high. Ideal generators are devices based on "natural" random processes, for example, commercially produced key generators based on white radio noise. Another random mathematical object is the decimal digits of irrational numbers, for example e, which are computed by standard mathematical methods.

In ISs with medium security requirements, software key generators are quite acceptable; they compute a pseudo-random number as a complex function of the current time and/or a number entered by the user.
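Such a software generator might be sketched as follows. The seeding scheme (current time mixed with the user's input through a hash) and the function name are illustrative assumptions; serious ISs should prefer hardware noise sources or the OS generator (os.urandom):

```python
import hashlib
import time

def software_key(user_number: str, length: int = 32) -> bytes:
    """Toy software key generator: a complex function of the current time
    and a number entered by the user. Illustrative only; real systems
    should rely on a hardware source or the OS CSPRNG instead."""
    seed = f"{time.time_ns()}|{user_number}".encode()
    out = b""
    counter = 0
    while len(out) < length:
        # Expand the seed into as many pseudo-random bytes as requested
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

key = software_key("271828")
assert len(key) == 32
```

The hash expansion hides the low-entropy structure of the seed, but the key is only as unpredictable as the time value plus the user's input, which is exactly why this approach is limited to medium security requirements.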

Key accumulation

Key accumulation refers to the organization of key storage, accounting, and deletion.

Since the key is the most attractive target for an attacker, opening the way to confidential information, special attention should be paid to the issues of key accumulation.

Secret keys should never be written explicitly on a medium that can be read or copied.

In a rather complex IS, one user may work with a large amount of key information, and sometimes it even becomes necessary to organize mini-databases for key information. Such databases handle the acceptance, storage, accounting, and deletion of the keys in use.

Thus, all information about the keys in use must be stored in encrypted form. Keys that encrypt other key information are called master keys. It is desirable that each user know the master keys by heart and not store them on any physical media at all.

A very important condition for information security is the periodic updating of key information in the IS. Both ordinary keys and master keys should be reassigned. In especially critical ISs, it is desirable to update key information daily.

The issue of updating key information is also related to the third element of key management - key distribution.

Key distribution

Key distribution is the most critical process in key management. Two requirements are imposed on it:
  1. Efficiency and accuracy of distribution.
  2. Secrecy of the distributed keys.
Recently, there has been a noticeable shift towards the use of public-key cryptosystems, in which the problem of key distribution disappears. Nevertheless, the distribution of key information in IS requires new effective solutions.

Distribution of keys between users is implemented by two different approaches:

  1. By creating one or more key distribution centers. The disadvantage of this approach is that the distribution center knows to whom which keys were assigned, which allows it to read all messages circulating in the IS. Possible abuses significantly affect security.
  2. Direct key exchange between users of the information system.
In this case, the problem is to reliably authenticate the subjects.

In both cases, the authenticity of the communication session must be guaranteed. This can be provided in two ways:

  1. The request-response mechanism, which consists of the following. If user A wants to be sure that the messages he receives from B are not false, he includes an unpredictable element (a request) in the message sent to B. When answering, user B must perform some operation on this element (for example, add 1). This cannot be done in advance, since it is not known what random number will arrive in the request. After receiving the response with the result of this operation, user A can be sure that the session is authentic. The disadvantage of this method is the possibility of establishing a pattern, albeit a complex one, between request and response.
  2. Timestamp mechanism ("timestamp"). It means fixing the time for each message. In this case, each IS user can know how "old" the incoming message is.
In both cases, encryption should be used to ensure that the response was not sent by an attacker and the timestamp has not been altered.
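A minimal sketch of the request-response mechanism, assuming a keyed MAC (HMAC) in place of the textbook "add 1" operation, so that the reply both depends on the unpredictable request and proves knowledge of the shared key:

```python
import hashlib
import hmac
import secrets

shared_key = secrets.token_bytes(32)   # key that A and B already share

# A -> B: an unpredictable element (the request)
challenge = secrets.token_bytes(16)

# B -> A: an operation on the request that cannot be precomputed;
# keying the operation also proves possession of the shared key
response = hmac.new(shared_key, challenge, hashlib.sha256).digest()

# A recomputes the expected value and compares in constant time
expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)

# A reply produced without the key (an attacker's forgery) fails the check
forged = hmac.new(b"wrong key", challenge, hashlib.sha256).digest()
assert not hmac.compare_digest(forged, expected)
```

Because each session uses a fresh random challenge, a recorded response from an earlier session is useless to a replaying attacker.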

When using timestamps, there is the problem of the allowable delay interval for session authentication. After all, a message with a "temporal stamp" cannot, in principle, be transmitted instantly. Moreover, the computer clocks of the recipient and the sender cannot be perfectly synchronized. What timestamp delay should be considered suspicious?

Therefore, in real ISs, for example in credit card payment systems, it is the second mechanism of authentication and forgery protection that is used. The interval used is from one to several minutes. A large number of known methods of stealing electronic money are based on "wedging" into this gap with false withdrawal requests.

Public key cryptosystems can also be used for key exchange, for example using the RSA algorithm.

But the Diffie-Hellman algorithm turned out to be very effective, allowing two users to exchange a key without intermediaries, which can then be used for symmetric encryption.

Diffie-Hellman algorithm

Diffie and Hellman proposed creating public-key cryptographic systems based on the discrete exponentiation function.

The irreversibility of the transformation in this case is ensured by the fact that it is quite easy to compute the exponential function in a finite Galois field consisting of p elements (p is either a prime or a power of a prime). Computing logarithms in such fields is a much more laborious operation.

If y = α^x, 1 < x < p-1, where α is a fixed element of the field GF(p), then x = log_α y over GF(p). Given x, it is easy to compute y: this requires at most about 2·log2(x) multiplication operations (square-and-multiply).

The inverse problem of computing x from y is much more difficult. If p is chosen properly, extracting the logarithm requires computation proportional to

L(p) = exp((ln p · ln ln p)^0.5)
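The asymmetry between the two directions is easy to feel even at toy sizes: modular exponentiation is fast (square-and-multiply), while the naive discrete logarithm must try exponents one by one. The modulus, base, and exponent below are illustrative choices, not values from the text:

```python
# Direct problem: fast modular exponentiation via three-argument pow.
# Inverse problem: naive discrete logarithm by exhaustive search.

p = 2_147_483_647   # the Mersenne prime 2^31 - 1 (illustrative)
alpha = 5           # illustrative base
x = 1_234_567       # the "secret" exponent

y = pow(alpha, x, p)   # takes ~2*log2(x) multiplications

def brute_force_dlog(alpha, y, p, limit):
    """Try exponents 0, 1, 2, ... until alpha^k = y (mod p)."""
    acc = 1
    for k in range(limit):
        if acc == y:
            return k
        acc = acc * alpha % p
    return None

k = brute_force_dlog(alpha, y, p, x + 1)  # over a million multiplications
assert k is not None and pow(alpha, k, p) == y
```

Roughly 40 multiplications in one direction versus more than a million in the other, and the gap grows superpolynomially with the size of p, as the L(p) estimate above indicates.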

To exchange information, the first user chooses a random number x_1, uniformly distributed over the integers 1, ..., p-1. He keeps this number secret and sends the other user the number

y_1 = α^(x_1) mod p

The second user does the same, generating x_2 and computing y_2, which he sends to the first user. As a result, both of them can compute k_12 = α^(x_1·x_2) mod p.

To compute k_12, the first user raises y_2 to the power x_1; the second user raises y_1 to the power x_2. Thus, both users obtain a common key k_12, which can be used to encrypt information with conventional algorithms. Unlike the RSA algorithm, this algorithm does not allow encrypting the information itself.

Without knowing x_1 and x_2, an attacker might try to compute k_12 knowing only the intercepted y_1 and y_2. The equivalence of this problem to the problem of computing the discrete logarithm is a major open question in public key systems; no simple solution has yet been found. Thus, if the direct transformation of 1000-bit prime numbers requires 2000 operations, the inverse transformation (computing the logarithm in a Galois field) requires about 10^30 operations.

As you can see, for all the simplicity of the Diffie-Hellman algorithm, its second drawback in comparison with the RSA system is the absence of a guaranteed lower bound on the complexity of key recovery.

In addition, although the described algorithm circumvents the problem of secret key transmission, the need for authentication remains. Without additional means, one of the users cannot be sure that he has exchanged keys with exactly the user he needs. The danger of impersonation remains.

To summarize what has been said about key distribution: the task of key management comes down to finding a key distribution protocol that provides:

* the possibility of refusal from the key distribution center;

* Mutual authentication of session participants;

* Confirmation of session authenticity by a request-response mechanism, using software or hardware for this;

* using the minimum number of messages in the key exchange.

Every cryptographic system works with cryptographic keys. If a key management mechanism is not implemented in the system, it will not be difficult for an attacker to take possession of the keys. Key management includes procedures such as generating, storing, and distributing keys; the last procedure is the most critical.

In a symmetric cryptosystem, the two parties must first agree on a secret session key, which encrypts all packets. The key must be kept secret and periodically updated by the subscribers. An asymmetric cryptosystem involves two keys: a private (secret) key and a public one. The public key is disclosed; when it is transmitted, its authenticity must be ensured.

The key distribution system must comply with the following requirements:

  • integrity and confidentiality of distributed keys
  • efficiency and accuracy of distribution

When distributing keys, there are 2 approaches:

  • use of key distribution centers
  • direct key exchange between subscribers

In the first approach, the key distribution center knows which keys were sent and to whom. In the second approach, the identity of the network entities must be verified. The task of key distribution comes down to creating a protocol that implements:

  • authentication of session participants
  • session validation
  • implementation of the minimum number of message passing in the key exchange

A clear example of the implementation of key distribution centers is the Kerberos system. Here we will consider the second approach. For this, the following are used:

  • an asymmetric public key cryptosystem to protect the secret key of a symmetric cryptosystem
  • Diffie-Hellman public key distribution systems

Implementation of a combined cryptosystem for key distribution

The main advantage of asymmetric public key cryptosystems is their potentially high security: there is no need to transmit secret keys or otherwise keep them secret in transit. However, such a system loses in speed to symmetric secret-key cryptosystems.

The combined implementation of asymmetric and symmetric encryption allows you to eliminate the main disadvantages that are characteristic of systems separately. The idea is this:

  • a symmetric cryptosystem is implemented to encrypt the plaintext, and an asymmetric public key cryptosystem is implemented to encrypt the secret key of the symmetric cryptosystem.

This approach is also called the digital envelope scheme. Let's look at an example. User A wants to use hybrid encryption to securely transmit packet M to user B. The algorithm is as follows:

  • User A's actions:
    • Generates (by any means) a session secret key K s , which is needed in symmetric encryption to encrypt packets
    • Encrypts the packet M with a symmetric algorithm on the session secret key K s
    • Encrypts the secret session key K s with the public key K B of user B using an asymmetric algorithm
    • Sends over an open channel to user B an encrypted packet M along with an encrypted session key K s
  • User B's actions (when receiving an electronic digital envelope):
    • Decrypts the session key K s with the asymmetric algorithm using his private key
    • Decrypt the packet M using a symmetric algorithm using the decrypted key K s
The users' actions are shown in Fig. 1.

Fig. 1
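The envelope scheme can be sketched end to end. The tiny textbook RSA primes and the XOR keystream "cipher" below are deliberately toy-sized stand-ins for real asymmetric and symmetric algorithms; all names and values are illustrative:

```python
import hashlib
import secrets

def sym_crypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream cipher (a stand-in for a real symmetric algorithm)."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Toy RSA key pair of user B (textbook primes, illustrative only)
p, q, e = 61, 53, 17
n = p * q                          # public modulus
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def rsa_encrypt(data: bytes):
    return [pow(b, e, n) for b in data]   # byte-by-byte, toy-sized

def rsa_decrypt(blocks):
    return bytes(pow(c, d, n) for c in blocks)

# User A: generate session key, encrypt packet, wrap the key under B's public key
M = b"secret packet M"
K_s = secrets.token_bytes(16)
ciphertext = sym_crypt(K_s, M)
wrapped_key = rsa_encrypt(K_s)

# User B: unwrap the session key with the private key, then decrypt the packet
K_s_recovered = rsa_decrypt(wrapped_key)
assert K_s_recovered == K_s
assert sym_crypt(K_s_recovered, ciphertext) == M
```

Only the short session key passes through the slow asymmetric algorithm; the bulk of the packet is handled by the fast symmetric cipher, which is exactly the division of labor described above.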

When implementing a digital envelope, the disadvantages of asymmetric and symmetric algorithms are compensated due to:

  • the problem of key distribution of the symmetric crypto algorithm is eliminated by the fact that the session key K s is transmitted over an open channel in encrypted form using an asymmetric crypto algorithm
  • the problem of the slow speed of the asymmetric algorithm is not relevant, since only the key is encrypted, and the text is encrypted with a symmetric crypto algorithm

If the session key is cryptographically weaker (shorter) than the asymmetric key protecting it, the attacker will attack the session key. Table 1 shows the corresponding ratio of key lengths.

Table 1 - Key lengths for asymmetric and symmetric systems with the same cryptographic strength

Diffie-Hellman key distribution method

W. Diffie and M. Hellman created the public key distribution method in 1976. The method allows users to exchange keys over an insecure communication channel. Its security is based on the difficulty of calculating discrete logarithms in a finite field, in contrast to the ease of solving the direct problem of discrete exponentiation in the same field. The scheme of the method is shown in Fig.2.

Fig. 2

Users A and B, when exchanging data, generate their random secret keys K_A and K_B (large random integers). Users A and B then calculate the public keys:

  • J_A = g^(K_A) mod N, J_B = g^(K_B) mod N

Here N is a large prime and g is a primitive element modulo N. These numbers are not secret and are known to all users of the system. Users A and B then exchange the values J_A and J_B over an insecure channel and use them to compute a shared session key J:

  • user A: J = (J_B)^(K_A) mod N = (g^(K_B))^(K_A) mod N
  • user B: J` = (J_A)^(K_B) mod N = (g^(K_A))^(K_B) mod N
  • J = J`, since (g^(K_B))^(K_A) = (g^(K_A))^(K_B)

Due to the implementation of the one-way function, the operation of calculating the public key is irreversible. The Diffie-Hellman algorithm allows you to encrypt information with each communication session using new keys. This improves security by not having to store secrets on media. Also, this algorithm allows you to implement a method of comprehensive protection of the confidentiality and authenticity of the transmitted data.

Method of comprehensive protection of confidentiality and authenticity of transmitted data

To simultaneously protect the confidentiality and integrity of information, it is advisable to implement encryption in the complex. The algorithm works like this:

  • user A signs packet M with his secret key K A , using a standard digital signature algorithm
  • user A computes a shared secret key K by the Diffie-Hellman principle from his own secret key and user B's public key
  • user A encrypts packet M with the shared secret key K, using symmetric encryption
  • user B receives packet M, computes the same shared secret key K, and decrypts packet M
  • user B verifies the signature of the decrypted packet M using user A's public key
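The combined scheme can be sketched under toy parameters. The small RSA modulus standing in for the signature algorithm, the hash-based derivation of the symmetric key from the Diffie-Hellman value, and the XOR keystream cipher are all illustrative assumptions, not part of the original description:

```python
import hashlib
import secrets

def sym_crypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream cipher (stand-in for a real symmetric algorithm)."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Diffie-Hellman parameters (toy sizes; real systems use 2048-bit groups)
N, g = 2_147_483_647, 5
K_A = secrets.randbelow(N - 2) + 1          # A's secret key
K_B = secrets.randbelow(N - 2) + 1          # B's secret key
J_A, J_B = pow(g, K_A, N), pow(g, K_B, N)   # public keys, exchanged openly

# Toy RSA signature key pair of user A (textbook primes, illustrative only)
p, q, e = 61, 53, 17
n, d = p * q, pow(e, -1, (p - 1) * (q - 1))

M = b"packet M"

# User A: sign M, derive the shared key, encrypt
h = int.from_bytes(hashlib.sha256(M).digest(), "big") % n
signature = pow(h, d, n)        # signed with A's secret signature key
K = hashlib.sha256(str(pow(J_B, K_A, N)).encode()).digest()
ciphertext = sym_crypt(K, M)

# User B: derive the same shared key, decrypt, verify A's signature
K2 = hashlib.sha256(str(pow(J_A, K_B, N)).encode()).digest()
M2 = sym_crypt(K2, ciphertext)
assert M2 == M
assert pow(signature, e, n) == int.from_bytes(hashlib.sha256(M2).digest(), "big") % n
```

The Diffie-Hellman key protects confidentiality, while the signature protects authenticity: B learns both that only the two of them can read M and that it was A who produced it.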

The SKIP and IKE key management protocols are based on the Diffie-Hellman algorithm.

A key distribution protocol (key establishment protocol) is a cryptographic protocol during whose execution a shared secret is made available to two or more parties for subsequent use for cryptographic purposes.

Key distribution protocols are divided into two classes:

    Key transport protocols;

    Key exchange protocols.

Key transport protocols (key transport) are key distribution protocols in which one participant creates or otherwise acquires a secret and securely transfers it to the other participants.

Key exchange protocols (key agreement, key exchange) are key distribution protocols in which a shared secret is generated by two or more participants as a function of information contributed by each of them (or associated with them), in such a way that (ideally) no other party can predetermine the shared secret.

There are two additional forms of key distribution protocols. A protocol is said to perform key update if a completely new key is generated, independent of the keys generated in previous sessions of the protocol. A protocol performs key derivation if a new key is derived from keys already held by the cryptosystem participants.

Key properties of key distribution protocols include (implicit) key authentication, key confirmation, and explicit key authentication.

(Implicit) key authentication is a property whereby one protocol participant is assured that no party other than a specifically identified second participant (and possibly a trusted center) can gain access to the secret keys obtained in the protocol. There is no guarantee that the second party has actually obtained the key, but no one else could have obtained it. Implicit key authentication does not depend on the other party's actual possession of the key and requires no action from the other party.

Key confirmation is a property whereby one protocol participant is convinced that another participant (possibly unidentified) really possesses the secret keys obtained in the protocol.

Explicit key authentication is a property that holds when both (implicit) key authentication and key confirmation take place.

    1. Needham-Schroeder protocol on symmetric keys

This protocol underlies a large number of key distribution protocols that use trusted centers. There are two types of this protocol:

    Needham-Schroeder protocol on symmetric keys;

    Needham-Schroeder protocol on asymmetric keys.

The symmetric key protocol works like this:

Preliminary stage:
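In the preliminary stage, each user shares a long-term key with the trusted center S. The widely published five-message flow of the symmetric-key version can then be sketched as follows; the XOR keystream "cipher" and the JSON message encoding are toy stand-ins for a real symmetric algorithm and wire format:

```python
import hashlib
import json
import secrets

def crypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream cipher standing in for a real symmetric algorithm."""
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def enc(key, obj):   # encrypt a small structured message
    return crypt(key, json.dumps(obj).encode())

def dec(key, ct):    # decrypt it back into a dict
    return json.loads(crypt(key, ct))

# Preliminary stage: A and B each share a long-term key with the center S
K_AS = secrets.token_bytes(32)
K_BS = secrets.token_bytes(32)

# 1. A -> S: A, B, N_A (a fresh nonce, sent in the clear)
N_A = secrets.randbits(64)

# 2. S -> A: {N_A, B, K_AB, {K_AB, A}_K_BS}_K_AS
K_AB = secrets.token_bytes(32).hex()                 # new session key
ticket = enc(K_BS, {"K_AB": K_AB, "from": "A"})      # only B can read this
msg2 = enc(K_AS, {"N_A": N_A, "peer": "B", "K_AB": K_AB, "ticket": ticket.hex()})

# A decrypts, checks its nonce and the peer name, and forwards the ticket
m = dec(K_AS, msg2)
assert m["N_A"] == N_A and m["peer"] == "B"

# 3. A -> B: {K_AB, A}_K_BS ; B recovers the session key from the ticket
t = dec(K_BS, bytes.fromhex(m["ticket"]))
session = bytes.fromhex(t["K_AB"])

# 4. B -> A: {N_B}_K_AB  and  5. A -> B: {N_B - 1}_K_AB (key handshake)
N_B = secrets.randbits(64)
msg4 = enc(session, {"N_B": N_B})
msg5 = enc(session, {"N_B": dec(session, msg4)["N_B"] - 1})
assert dec(session, msg5)["N_B"] == N_B - 1
```

The nonce N_A assures A that message 2 is fresh, the ticket lets B learn the session key without talking to S directly, and the final handshake on N_B confirms to both sides that the other actually holds the session key.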

This approach creates a kind of vicious circle: in order to share a secret (the message being transmitted), the sender and recipient must already have a shared secret (the encryption key). Previously, this problem was solved by a non-cryptographic method: transferring the key through communication channels physically protected from eavesdropping (Fig. 1). However, creating such a channel and maintaining it in readiness for an emergency key transfer is quite laborious and costly.

Fig. 1.

The problem was successfully solved within modern cryptography, which arose a little more than a quarter of a century ago and is so called in contrast to the "traditional cryptography" already known by that time. The solution consists in the use of asymmetric (two-key) ciphers or key distribution schemes over open communication channels.

In the first case, the encryption and decryption procedures are performed on different keys, so there is no need to keep the encryption key secret. However, due to extremely low performance and susceptibility to some special types of attacks, such ciphers turned out to be of little use for hiding user information directly. Instead, asymmetric ciphers are used as part of combined schemes, where a data array is encrypted with a symmetric cipher under a one-time key, which in turn is encrypted with a two-key cipher and transmitted together with the data.

Key distribution schemes over open communication channels solve the same problem in a slightly different way: during an interaction session, two correspondents develop a shared secret key, which is then used to encrypt the transmitted data with a symmetric cipher. Interception of the traffic during the key-generation session does not give the adversary the key itself: K = K(X, Y) cannot be computed from the intercepted data (Fig. 2).


Fig. 2.

Problems of asymmetric cryptography

To date, asymmetric cryptography quite successfully solves the problem of distributing keys over open communication channels. Nevertheless, there are several problems that cause some concern for its future. The security of all asymmetric cryptography schemes rests on the assumed impossibility of efficiently solving a number of mathematical problems (so-called NP problems), such as factoring large numbers and taking logarithms in large discrete fields. But this impossibility is only an assumption, which could be refuted at any moment if the opposite hypothesis, NP = P, were proved. This would lead to the collapse of all modern cryptography, since the hard problems it relies on are closely related, and breaking even one cryptosystem would mean breaking most of the others. Intensive research is under way in this direction, but the problem remains open.

Another threat to modern cryptosystems comes from so-called quantum computers: information processing devices built on the principles of quantum mechanics, the idea of which was first proposed by the famous American physicist R. Feynman. In 1994, P. Shor proposed a factorization algorithm for a quantum computer that factors a number in time polynomial in the size of the number. In 2001, this algorithm was successfully implemented on the first working prototype of a quantum computer, created by specialists from IBM and Stanford University.

According to experts, a quantum computer capable of breaking the RSA cryptosystem can be created in about 15-25 years.

Another unpleasant fact about asymmetric cryptosystems is that the minimum "safe size" of keys is constantly growing due to progress in the field. Over the quarter-century history of such systems, it has already grown by about a factor of 10, while over the same period the key size of traditional symmetric ciphers has grown by less than a factor of two.

All of the above makes the long-term prospects of asymmetric cryptography systems not entirely reliable and forces us to look for alternative ways to solve the same problems. Some of them can be solved within the framework of the so-called quantum cryptography, or quantum communication.