From: daw@mozart.cs.berkeley.edu (David Wagner)
Newsgroups: sci.crypt
Subject: Re: interesting - NSA Dual Counter Mode (DCTR)
Date: Tue, 21 Aug 2001 16:33:14 +0000 (UTC)
Message-ID: <9lu2ga$15k2$1@agate.berkeley.edu>
Trevor L. Jackson, III wrote:
>This perspective suggests that ciphers with a variable number of rounds
>would be desirable in order to scale the processing load to the
>available computational power.
I'm not so fond of a variable number of rounds, because it introduces
a new failure mode.
Consider a key-exchange protocol that also negotiates the number
of rounds to use, and suppose that this parameter is sent in the
clear (unprotected) from A to B. Let's say A sends "use 64 rounds".
Now imagine the attacker intercepts this and replaces it with "use
63 rounds". The two endpoints become desynchronized: A uses E_k,
the full 64-round cipher, and B uses E'_k, the 63-round reduced cipher
(but with the same key for both ciphers). Note that these two ciphers
differ only by a single round, so if we obtain the encryption of a known
plaintext x by both sides, we obtain a pair (E'_k(x),E_k(x)) that forms
an input-output pair for a single round of the cipher. Moreover, in most
ciphers, given the ability to isolate a single round like this, we can
often recover the key. Thus, an active attacker can try to desynchronize
the endpoints, learn a known plaintext or two, recover the key in this
way, and then modify all subsequently transmitted packets to make sure
that neither endpoint is able to detect the attack.
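To make the attack concrete, here is a minimal sketch using an entirely
made-up toy cipher (each round rotates a 32-bit state and XORs in a round
key; the cipher, round function, and key values are all my invention, not
anything from a real protocol). It shows how the one-round gap between the
two endpoints' ciphertexts hands the attacker the last round key directly:

```python
def rotl32(x, r):
    # Rotate a 32-bit word left by r bits.
    return ((x << r) | (x >> (32 - r))) & 0xFFFFFFFF

def encrypt(x, round_keys):
    # Toy iterated cipher: one rotate-and-XOR per round.
    for rk in round_keys:
        x = rotl32(x, 3) ^ rk
    return x

# Hypothetical round keys, all derived from the shared key k.
round_keys = [0xDEADBEEF, 0x12345678, 0x0BADF00D, 0xCAFEBABE,
              0x31415926, 0x27182818, 0x0000FFFF, 0xA5A5A5A5]

x = 0x01234567                          # known plaintext
full    = encrypt(x, round_keys)        # A encrypts with all 8 rounds
reduced = encrypt(x, round_keys[:-1])   # B was tricked into 7 rounds

# (reduced, full) is an input/output pair for the final round alone:
#   full = rotl32(reduced, 3) ^ rk_last
# so the last round key falls out with one XOR.
recovered = full ^ rotl32(reduced, 3)
assert recovered == round_keys[-1]
```

In this toy the round key is recovered trivially; for a real cipher the
single-round analysis takes more work, but the point stands: the pair
(E'_k(x), E_k(x)) isolates one round, which is far weaker than the full
cipher.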
While this may sound rather theoretical, it is a failure mode introduced
by variable-round ciphers, and one that I'd prefer to avoid. At the
very least, variable-round ciphers introduce a new requirement on a
key-exchange protocol; at the worst, they may be susceptible to attack.