This was posted not too long ago on sci.crypt... Enjoy... I think the most relevant information is near the top, but it's all quite good... :-)

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

There is no intrinsic difference between algorithm and data; the same information can be viewed as data in one context and as algorithm in another. Why then do so many people claim that encryption algorithms should be made public and only the key should be kept secret? (This is the famous and derisive mantra about "security through obscurity".) There are several answers:

a) Some people, with little understanding of computer technology, try to keep secret what their programs do, even though the programs themselves are public. A program *is* a representation of the algorithm, even though it happens to be more difficult for humans to read than, say, a detailed description in English. Actually it is a very good idea to keep the algorithm secret (in all its representations), as long as you can afford to do so. That is why major governments do exactly that.

b) One can memorize a key and keep it secret in one's head. Normally, encryption algorithms are too complicated to be memorized. Therefore it is easier to keep secret a key than an algorithm.

c) Most people and organizations do not have sufficient expertise to create a new and good encryption algorithm and then try to keep it secret. A bad encryption algorithm, in this context, is an algorithm that can be broken by a sophisticated attacker even without knowledge of the algorithm itself.

As you see, the reasons are of a practical nature, and are not derived from any fundamentals in cryptology. If we could find a practical way to keep secret both the key (that is, the data the encryption method operates on) and also the method itself (or at least part of the method), then security would be greatly enhanced, because the attacker would have less knowledge to work with.
I believe there are several ways to overcome these practical difficulties:

a) Machine-generated secret ciphers.

Today there are only a few encryption algorithms that are generally accepted as good. But suppose there existed a generator program which could construct a new encryption algorithm depending on some random input. Actually, the generator program would produce another program, which would then be used as the encryption software. In some important cases it is feasible to keep the resulting program secret: international organizations could distribute disks containing the program using trusted persons, the program could be loaded on centralized servers which actually operate from within a safe, or maybe the program (in encrypted form) would be run only from a floppy disk which would be handled with the same care as the key to a safe. We all know that absolute security is impossible. What I am suggesting here is that in many cases this system of security is better than one using a standardized and public algorithm which attracts a lot of cryptanalytic work and may be broken in the near future, or may have already been broken in secret.

b) Intrinsically secret ciphers.

Extend secrecy to parts of the encryption method. In his book, Schneier very briefly describes a variant of DES where the S-boxes (which most people would consider part of the algorithm) are variable and depend on the key. Another very interesting possibility would have the key express the encryption method. In other words, consider the key as the program, and the cipher simply as an interpreter that follows the key's instructions to scramble the plaintext or unscramble the ciphertext. This would call for large keys, but not larger than keys used in public key encryption.

c) "Variable" ciphers.

The idea here is to implement a cipher that incorporates a huge number of different encryption functions. The objective is to overwhelm the analytic capability of an attacker.
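The "key as the program, cipher as an interpreter" idea under (b) can be sketched in a few lines. The sketch below is purely illustrative: the opcodes and their meanings are invented for this example, and the result is of course NOT a secure cipher - it only shows the mechanism of a key driving control flow.

```python
def interpret(key: bytes, block: bytearray, decrypt: bool = False) -> bytearray:
    """Treat `key` as an instruction stream; apply (or undo) it to `block`.

    Each key byte encodes an invented toy opcode (top 2 bits) and an
    operand (low 6 bits). NOT secure -- a demonstration only.
    """
    steps = [(b >> 6, b & 0x3F) for b in key]
    if decrypt:
        steps.reverse()              # undo the program back-to-front
    for op, arg in steps:
        if op == 0:                  # XOR every byte with the operand (self-inverse)
            for i in range(len(block)):
                block[i] ^= arg
        elif op == 1:                # rotate the block left (right when decrypting)
            r = arg % len(block)
            if decrypt:
                r = (len(block) - r) % len(block)
            block[:] = block[r:] + block[:r]
        elif op == 2:                # add (subtract when decrypting) mod 256
            d = -arg if decrypt else arg
            for i in range(len(block)):
                block[i] = (block[i] + d) % 256
        else:                        # swap the two halves (self-inverse)
            h = len(block) // 2
            block[:] = block[h:2 * h] + block[:h] + block[2 * h:]
    return block
```

Note how the key data literally decides which code paths run, which is exactly the property the post is after; the obvious cost is that the key must now be long enough to express a meaningful "program".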
(At the end of this post you will find the outline of a proof of why a cipher of this type is intrinsically more secure.)

My GodSave encryption algorithm belongs to this class of "variable" ciphers. It extensively uses data from the key to change the control flow of the program execution. In other words, whereas most algorithms just operate on the key, GodSave uses information in the key to decide how to operate. Even better: it is constructed in such a way that large sections of its code can be modified by a programmer without affecting the security of the algorithm, offering some of the advantages described under a) above.

Here is the definition of another cipher of this type (let us call it "heavy DES"): Start by randomly defining 16K DES keys; you need less than 1 MB of space on your hard disk to save them. Suppose that this large key set is public (either by choice or because an attacker gained access to your computer and stole it). So now you have a large set of DES ciphers with *known* keys; the effort to break any one of them is 1. Now define a secret key of 140 bits. Use 14 bits at a time to index one of the 16K DES functions. Encrypt a 64-bit block by sequentially chaining the 10 indexed DES functions. DES is not a group; therefore each instance of the 140-bit key results in a different mapping of the plaintext space into the ciphertext space. If we choose from 2^N DES functions and chain P of them together (in the example above N=14 and P=10), then there are 2^(N*P) different instances. Already with 140 bits of entropy, a brute-force attack is out of the question no matter how many hardware-coded DES machines you have. Suppose you have perfect cryptanalytic knowledge of DES - trapdoor and all - even then, can you see a way to attack this variable version?

Finally, let me try to demonstrate why a "variable" cipher is more difficult to break: Take two ciphers A and B with keys of N bits.
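The "heavy DES" construction - split a 140-bit secret into ten 14-bit indices, and chain the ten indexed block ciphers - can be sketched directly. In this sketch a minimal toy Feistel network stands in for DES (a real implementation would use an actual DES library); the function and variable names are invented for the example, and the key set is derived deterministically from a seed just so the demo is self-contained:

```python
import hashlib

BLOCK = 8  # 64-bit blocks, as in DES


def _f(half: bytes, key: bytes, rnd: int) -> bytes:
    # Toy round function: hash of key, round number, and half-block.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:BLOCK // 2]


def toy_encrypt(key: bytes, block: bytes) -> bytes:
    """Minimal 4-round Feistel network standing in for DES (NOT secure)."""
    l, r = block[:4], block[4:]
    for rnd in range(4):
        l, r = r, bytes(a ^ b for a, b in zip(l, _f(r, key, rnd)))
    return l + r


def toy_decrypt(key: bytes, block: bytes) -> bytes:
    l, r = block[:4], block[4:]
    for rnd in reversed(range(4)):
        l, r = bytes(a ^ b for a, b in zip(r, _f(l, key, rnd))), l
    return l + r


def make_public_keyset(seed: bytes, n: int = 1 << 14) -> list:
    """The 16K 'public' keys: in the post these are random; here they are
    derived from a seed so the example is reproducible."""
    return [hashlib.sha256(seed + i.to_bytes(2, "big")).digest()[:7]
            for i in range(n)]


def heavy_encrypt(secret_key: int, keyset: list, block: bytes) -> bytes:
    """Chain the 10 ciphers selected by the 140-bit secret, 14 bits at a time."""
    for i in range(10):
        idx = (secret_key >> (14 * (9 - i))) & 0x3FFF
        block = toy_encrypt(keyset[idx], block)
    return block


def heavy_decrypt(secret_key: int, keyset: list, block: bytes) -> bytes:
    for i in reversed(range(10)):
        idx = (secret_key >> (14 * (9 - i))) & 0x3FFF
        block = toy_decrypt(keyset[idx], block)
    return block
```

The point of the sketch is only the indexing/chaining machinery: the secrecy is supposed to live entirely in *which* ten of the 16K public ciphers are composed, and in what order.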
These ciphers must be independent in the sense that physically breaking one does not help in breaking the other. (By "physical break" I mean the work necessary to recover the key when the cipher is already cryptanalyzed; a "logical break" would be the work necessary to successfully cryptanalyze a cipher.)

Let us suppose that these ciphers are not perfect, and that there therefore exists a method (known or unknown) to physically break them that is more efficient than brute force, i.e. the trying of all possible keys. (Observe that no publicly known cipher with a key that is used repeatedly has been proved to be perfect in this sense.) For ciphers A and B there exists then a method to break them with as few as 2^(N*k) operations, where k<1 (2^N corresponds to brute-forcing them). If you increase the key length by 1 bit, then you would need 2^((N+1)*k) = 2^(N*k) * 2^k operations to break A or B. But if you create a new cipher C with a key of N+1 bits, where the last bit is used to choose either A or B as the working cipher with a key of N bits, then you must break A, and with 50% probability B also, with an effort comparable to 2^(N*k) + 0.5*2^(N*k) = 3/2 * 2^(N*k). Therefore you need more effort to break C with a key of N+1 bits than either A or B with a key of N+1 bits, as long as k is less than ln(3/2)/ln(2) = 0.58.

If instead of two ciphers you started with 2^M different ciphers, then the results are more dramatic. The effort required for breaking the resulting cipher is now 2^(N*k-1)*(2^M+1), and it will be stronger as long as k < 1/M*(ln(2^M+1)/ln(2) - 1), or for large M: k < 1 - 1/M. This works like a security amplifier: if you can construct 1024 independent ciphers, then by this method you can produce a cipher that has a 10-bit-longer key and is provably 512 times more secure than any one of them (in the sense that an attacker must invest 512 times more effort to break it).
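The thresholds claimed above are easy to check numerically. A quick sanity check, following the post's own formulas (taking them at face value, not endorsing the underlying independence assumption):

```python
import math

# Two-cipher case: breaking C costs (3/2) * 2**(N*k), while a 1-bit-longer
# key on A alone costs 2**k * 2**(N*k). C wins while 3/2 > 2**k,
# i.e. while k < log2(3/2).
k_threshold = math.log(3 / 2) / math.log(2)
assert 0.58 < k_threshold < 0.586   # the post rounds this to 0.58

# 2**M-cipher case: effort 2**(N*k - 1) * (2**M + 1), versus 2**(N*k) for
# one cipher -- an amplification factor of (2**M + 1) / 2.
M = 10                              # 1024 independent ciphers
amplification = (2 ** M + 1) / 2
assert amplification == 512.5       # the post rounds this to "512 times"

# Break-even exponent for large M approaches 1 - 1/M.
k_big = (1 / M) * (math.log(2 ** M + 1) / math.log(2) - 1)
assert 0.900 < k_big < 0.901 and abs(k_big - (1 - 1 / M)) < 0.001
```

So the arithmetic in the post is internally consistent (up to rounding); the contentious part is the assumption that 1024 truly independent, equally strong ciphers can be constructed at all.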
-=-=-=-=-=-=-=-=-=-=--=-=-=-=-=-=-=-=-=-=-

--
'L8r
-BKM
-----------------------------------------------------------------------------
Do not meddle in the affairs of Dragons    | Brandon Matthews - SysAdmin
For you are crunchy and good with mustard. | <bmatt@devils.eng.fsu.edu>
                            -- Anonymous   | (850) 668-2677
-----------------------------------------------------------------------------
finger bmatt@devils.eng.fsu.edu for PGP public key
Rogier Wolff
1998-Jun-01 00:49 UTC
Re: [linux-security] "Flavors of Security Through Obscurity"
Brandon K. Matthews wrote:

> This was posted not too long ago on sci.crypt... Enjoy... I think the most
> relevant information is near the top, but it's all quite good... :-)
>
> -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
>
> There is no intrinsic difference between algorithm and data, the
> same information can be viewed as data in one context and as
> algorithm in another. Why then do so many people claim that
> encryption algorithms should be made public and only the key
> should be kept secret? (This is the famous and derisive mantra
> about "security through obscurity".) There are several answers:
>
> a) Some people, with little understanding about computer
> technology, try to keep secret what their programs do, even
> though the programs themselves are public. A program *is* a
> representation of the algorithm, even though it happens to be
> more difficult for humans to read than, say, a detailed
> description in English. Actually it is a very good idea to keep
> secret the algorithm (in all its representations), as long as you
> can afford to do so. That is why major governments do exactly
> that.
>
> b) One can memorize a key and keep it secret in one's head.
> Normally, encryption algorithms are too complicated to be
> memorized. Therefore it is easier to keep secret a key than an
> algorithm.
>
> c) Most people and organizations do not have sufficient expertise
> to create a new and good encryption algorithm and then try to
> keep it secret. A bad encryption algorithm, in this context, is
> an algorithm that can be broken by a sophisticated attacker even
> without knowledge about the algorithm itself.

This is a VERY important one.

> As you see, the reasons are of a practical nature, and are not
> derived from any fundamentals in cryptology.
> If we could find a
> practical way to keep secret both the key (that is the data the
> encryption method operates on) and also the method itself (or at
> least part of the method), then security would be greatly
> enhanced because the attacker would have less knowledge to work
> with.

The problem is: What attacks do you want your cryptography to be "secure" against?

- unknown plaintext, unknown algorithm.
- unknown plaintext.
- known plaintext.
- chosen plaintext.
- chosen key.

Most cryptographic algorithm designers want their algorithm to resist all of the above. You want the algorithm to resist even the hardest attack. (The attacker can "borrow" your box to do some sample encryptions. At first it sounds pretty odd that you'd allow an attacker to borrow your encryption box. However, consider an ATM. When I can listen to its communications, I see a message coming by which I KNOW must say: "I have a Mr. Wolff here, his PIN is xyzw, can he withdraw $200?" Now, I personally don't know how they would put the bits in this sentence into the plaintext message, but the algorithm must resist against this type of attack.)

> I believe there are several ways to overcome these practical
> difficulties:
>
> a) Machine generated secret ciphers.
>
> Today there are only a few encryption algorithms that are
> generally accepted as good. But suppose there existed a generator
> program which could construct a new encryption algorithm
> depending on some random input. Actually, the generator program
> would produce another program which would then be used as the
> encryption software.
> In some important cases, it is feasible to
> keep secret the resulting program: International organizations
> could distribute disks containing the program using trusted
> persons, the program could be loaded in centralized servers which
> actually operate from within a safe, or maybe the program (in
> encrypted form) would be run only from a floppy disk which would
> be handled with the same care as the key to a safe. We all know
> that absolute security is impossible. What I am suggesting here
> is that in many cases this system of security is better than one
> using a standardized and public algorithm which attracts a lot of
> cryptanalytic work and may be broken in the near future or may
> have already been broken in secret.

If you have the "generator program", then there are just a few more bits of "key" that need to be known to decypher a message. This is nothing different from a slightly more complex encryption algorithm and a slightly larger key.

> b) Intrinsically secret ciphers.
>
> Extend secrecy to parts of the encryption method. In his book,
> Schneier very briefly describes a variant of DES where the Sboxes
> (which most people would consider as part of the algorithm) are
> variable and depend on the key. Another very interesting
> possibility would have the key express the encryption method. In
> other words consider the key as the program, and the cipher
> simply as an interpreter, that follows the key's instructions to
> scramble the plaintext or unscramble the ciphertext. This would
> call for large keys, but not larger than keys used in public key
> encryption.

If you choose S-boxes randomly, the resulting cypher is very weak. IBM/NSA seem to have done a remarkable job of finding those S-boxes that result in a good cypher; this is the consensus about DES among cryptanalysts nowadays. If you start modifying your S-boxes along the road, depending on your key, all you can do is effectively create a larger key.
This is nothing different from a slightly more complex encryption algorithm and a slightly larger key.

> c) "Variable" ciphers.
>
> The idea here is to implement a cipher that incorporates a huge
> number of different encryption functions. The objective is to
> overwhelm the analytic capability of an attacker. (At the end of
> this post you will find the outline of a proof about why a cipher
> of this type is intrinsically more secure.)

The word "overwhelm" is misplaced.

> My GodSave encryption algorithm belongs to this class of
> "variable" ciphers. It extensively uses data of the key to change
> the control flow of the program execution. In other words,
> whereas most algorithms just operate on the key, GodSave uses
> information in the key to decide how to operate. Even better: It
> is constructed in such a way that large sections of its code can
> be modified by a programmer without affecting the security of the
> algorithm, offering some of the advantages described under a)
> above.

Not true. Suppose my application is "hurt" by even a small amount of plaintext seeping through. ("The winning lottery numbers are: 1, 23, 26, 34, ..." Someone able to decrypt part of this message before the results are published - even just some of the numbers - can greatly increase his chances of winning. Bad for the lottery.) Now if my "modifications" make every 8th block of 64 bits trivially decryptable, then I stand a chance of "publishing" an important part of the plaintext.

> Here is the definition of another cipher of this type (let us
> call it "heavy DES"): Start by randomly defining 16K DES keys;
> you need less than 1 MB of space on your hard disk to save them.
> Suppose that this large key set is public (either by choice or
> because an attacker gained access to your computer and stole it).
> So now you have a large set of DES ciphers with *known* keys; the
> effort to break any one of them is 1. Now define a secret key of
> 140 bits.
> Use 14 bits at a time to index one of the 16K DES
> functions. Encrypt a 64 bit block by sequentially chaining the 10
> indexed DES functions. DES is not a group, therefore each
> instance of the 140 bits long key results in a different mapping
> of the plaintext space into the ciphertext space. If we choose
> from 2^N DES functions and chain P of them together (in the
> example above N=14 and P=10) then there are 2^(N*P) different
> instances. Already with 140 bits of entropy, a brute force attack
> is out of the question no matter how many hardware coded DES
> machines you have. Suppose you have perfect cryptanalytic
> knowledge of DES - trapdoor and all - even then, can you see a way
> to attack this variable version?

Yes. Suppose I have perfect cryptanalytic knowledge of DES: suppose I instantly know the plaintext given a cyphertext. In that case the cryptographic function could just as well be the identity function. So, now I have a cyphertext. That's the output of your 10th DES step. Because we simplified DES to the identity function, that's also the input to the tenth step, the output of the 9th step... and also the plaintext.

Suppose that, from the output of DES, I can deduce which of your 16K DES keys was used to generate it. (The DES keys are known in this case, and DES wasn't designed to resist "known key" attacks.) But I cannot do this perfectly: I always get two "possible key" alerts. So, from your cyphertext, I can reverse two possible "level 9" outputs. From those I get four level 8 outputs, etc., etc. In total just 2^10 possibilities to check, and bingo.

[.....]

> 2^(N*k)+0.5*2^(N*k)=3/2 * 2^(N*k). Therefore you need more effort
> to break C with a key of N+1 bits, than either A or B with a key

[.....]

Next we see lots of mathematical-sounding stuff that is totally irrelevant. Your described "multi-DES" is essentially an ECB cypher (Electronic Code Book). This suffers from all the weaknesses of ECB cyphers. Suppose the ATMs used this.
Suppose the bank sends:

  " Transaction approved, please pay Mr. XYZ: $0000200,00 "

Now if I can change that 00002 into 20000, I'm suddenly rich. ECB means that I can watch someone withdraw $2000000,00, and insert part of that message into the stream.

You imply that you can generate cyphers of arbitrary key length. That is not true: there are only (2^64)! ECB cyphers on 64-bit blocks. Now this is quite a lot, but it is a definite hard upper limit. Suppose that instead of passing 64 bits at a time into the DES engines, I pass in just 8 (say I want to be able to encrypt a "telnet session", where characters don't come in 8 at a time...). Now the number of ECB codes is simply 256!, which is not longer than 8*256 bits. You imply that you could generate longer keys than this.

The important thing is that, in cryptography, all college students at one time or another think they've found a wonderfully secure new cryptographic algorithm. Experience shows that so far all of them have had serious flaws. Leave designing cryptographic algorithms to the experts. Allow everybody to know the algorithm. The bad guys will steal a "box" anyway. Keep your keys secret, but that's the only thing you should keep secret. Make sure that your algorithm is resistant to known-plaintext attacks, and all "known ..." variants. If you keep your algorithm secret, most likely some high-school kid will eventually figure out a fast way of decrypting your cypher.

The advantages of publishing the algorithm are twofold. You get to know from the beginning that the cryptanalyst will be working with a known algorithm: you won't be fooled into thinking that he can't figure out the algorithm. And everybody else gets to shoot holes in it.

As a reference, read the chapter on random numbers (a good random number generator can serve as a cryptographic engine) in "The Art of Computer Programming" by Donald Knuth, published 1972 (or thereabouts).
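The counting argument above is easy to verify: a b-bit-block ECB cipher is just a table (a permutation of all 2^b blocks), so there are (2^b)! of them, and the effective keyspace is log2 of that count. A quick check for the 8-bit case, using the log-gamma function to avoid computing the huge factorial exactly:

```python
import math

# Number of distinct ECB "code books" on 8-bit blocks is 256!.
# lgamma(257) = ln(256!), so dividing by ln(2) gives log2(256!).
bits = math.lgamma(257) / math.log(2)

assert 1683 < bits < 1685   # about 1684 bits of effective keyspace
assert bits < 8 * 256       # consistent with the 8*256 = 2048-bit bound above
```

So an 8-bit-block ECB scheme can never offer more than roughly 1684 bits of key, no matter how elaborately the cipher is constructed, which is the "hard upper limit" being made here.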
The main point being that having the control flow change depending on the data is NOT a good way to make a flawed algorithm better. A good algorithm is the only solution.

Regards,
Roger.

P.S. Future posts with "I've designed this fancy cryptographic algorithm" will be ignored...

--
If it's there and you can see it, it's REAL          |___R.E.Wolff@BitWizard.nl |
If it's there and you can't see it, it's TRANSPARENT |  Tel: +31-15-2137555     |
If it's not there and you can see it, it's VIRTUAL   |__FAX:_+31-15-2138217     |
If it's not there and you can't see it, it's GONE!   -- Roy Wilks, 1983   |_____|
# Your described "multi-DES" is essentially an ECB cypher

All block ciphers are ECB ciphers unless you use CFB or some other technique to make them not ECB ciphers. I fail to see where his cipher would allow for partial known plaintext, nor how it could not be implemented in a CFB manner.

His idea is pretty interesting: He uses Karn's encryption in such a way as to get rid of the obvious problems with the algorithm, and the variant is a change in the hashes used with his modified Karn. More information here:

http://www.infoweb.co.cr/tecapro/godsave/

I didn't come to discuss crypto though, but to announce my patch for libc-5.3.12, the libc on RedHat 4.2 systems. I lost some of the "why libc is insecure" discussion, leaving me with only a patch for libc-5.4.44, of which I was able to apply half to libc-5.3.12 -- the other half has changed too much.

I am wondering if there would be anything wrong with not allowing getenv calls inside libc itself when and if the program is suid, by replacing _all_ getenv()s with __libc_secure_getenv.

I have both source and i386 RPMs utilizing the included patch here:

http://linux.samiam.org/blackdragon/

- Sam

*** libc-5.3.12/libc/nls/msgcat.c.secenv	Sun May 31 20:08:59 1998
--- libc-5.3.12/libc/nls/msgcat.c	Sun May 31 20:12:41 1998
***************
*** 115,120 ****
--- 115,122 ----
  #include <sys/mman.h>
  #endif
  
+ extern char *__libc_secure_getenv(const char *);
+ 
  nl_catd catopen( name, type)
  const char *name;
  int type;
***************
*** 134,146 ****
  	if (stat(catpath, &sbuf)) return(NLERR);
      } else {
  #if BROKEN_SETLOCALE
! 	if ((lang = (char *) getenv("LANG")) == NULL) lang = "C";
  #else
  	/* Query the locale from the previous setlocale call in msgcat-libc.c*/
  	if ((lang = (char *) setlocale(LC_MESSAGES,(char *) NULL)) == NULL)
  	    lang="C";
  #endif
! 	if ((nlspath = (char *) getenv("NLSPATH")) == NULL) {
  #if OLD_NLS_PATHS
  	    nlspath = "/nlslib/%L/%N.cat:/nlslib/%N/%L";
--- 136,148 ----
  	if (stat(catpath, &sbuf)) return(NLERR);
      } else {
  #if BROKEN_SETLOCALE
! 	if ((lang = (char *) __libc_secure_getenv ("LANG")) == NULL) lang = "C";
  #else
  	/* Query the locale from the previous setlocale call in msgcat-libc.c*/
  	if ((lang = (char *) setlocale(LC_MESSAGES,(char *) NULL)) == NULL)
  	    lang="C";
  #endif
! 	if ((nlspath = (char *) __libc_secure_getenv ("NLSPATH")) == NULL) {
  #if OLD_NLS_PATHS
  	    nlspath = "/nlslib/%L/%N.cat:/nlslib/%N/%L";
*** libc-5.3.12/libc/nys/nis/src/nis_names.c.secenv	Sun May 31 20:14:09 1998
--- libc-5.3.12/libc/nys/nis/src/nis_names.c	Sun May 31 20:16:18 1998
***************
*** 29,34 ****
--- 29,35 ----
  #include "nis_conf.h"
  #include "xalloc.h"
  
+ extern char *__libc_secure_getenv(const char *);
  
  nis_name nis_local_directory(void)
  {
***************
*** 101,107 ****
  nis_name nis_local_group(void)
  {
! 	return getenv("NIS_GROUP");
  }
--- 102,108 ----
  nis_name nis_local_group(void)
  {
! 	return __libc_secure_getenv("NIS_GROUP");
  }
***************
*** 248,254 ****
  	return rnames;
  }
  
! 	path = getenv("NIS_PATH");
  	if (path == NULL)
  	    path = "$";
--- 249,255 ----
  	return rnames;
  }
  
! 	path = __libc_secure_getenv("NIS_PATH");
  	if (path == NULL)
  	    path = "$";
Andi Kleen
1998-Jun-01 05:08 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
"Brandon K. Matthews" <bmatt@devils.eng.fsu.edu> writes:

> c) "Variable" ciphers.
>
> The idea here is to implement a cipher that incorporates a huge
> number of different encryption functions. The objective is to
> overwhelm the analytic capability of an attacker. (At the end of
> this post you will find the outline of a proof about why a cipher
> of this type is intrinsically more secure.)

There are already lots of these ciphers. Examples are the Russian GOST or Bruce Schneier's Blowfish cipher, where the S-boxes can be changed and kept secret. Most publicly available Blowfish and GOST implementations use fixed, known S-boxes AFAIK (hexadecimal Pi in the case of Blowfish, some standard set for GOST), but the ciphers were really designed to work with variable S-boxes.

[mod: As I'm told, not just any S-boxes will do. You get a cryptographically weak cypher if you don't choose your S-boxes just right. Nobody knows how the DES people got it right, but they DID. Using Pi might give you a good source of pseudorandom numbers, but it is unlikely to provide good S-boxes. -- REW]

-Andi
Joseph S D Yao
1998-Jun-01 10:58 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
> As a reference, read the chapter on random numbers (a good random
> number generator can serve as a cryptographic engine) in "the art of
> computer programming" by Donald Knuth, published 1972 (or
> thereabouts). ...

You are referring to the volume on seminumerical algorithms, recently updated and re-published. I ran out and got a copy, but somehow I don't have as much time to read it now as then ... ;-) I believe the first two volumes are out, and there was some talk of more than the initial three becoming available [there were originally seven or eight planned, IIRC].

> P.S. Future posts with "I've designed this fancy cryptographic
> algorithm" will be ignored...

I trust you will at least look at them before trashing them ... you never know just where true inspiration will strike. ;-) There have been several brilliant mathematicians who didn't come up through acknowledged channels.

Joe Yao
jsdy@tux.org - Joseph S. D. Yao
Christopher Biggs
1998-Jun-01 13:51 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
"Brandon K. Matthews" <bmatt@devils.eng.fsu.edu> moved upon the face of the 'Net and spake thusly:

> This was posted not too long ago on sci.crypt... Enjoy... I think the most
> relevant information is near the top, but it's all quite good... :-)
                                                        ^^^^^^^^^^^

I think you misspelt "utter rubbish".

By the poster's own theory of "equivalence of algorithm and key", varying the algorithm is no different from increasing the key length. I challenge the poster to memorize a 2048-bit RSA key.

If the algorithm is secret, then how is anybody else to understand your messages? I have a machine I use to achieve the same results: a shredder.

--
| Christopher Biggs   email:chris@stallion.oz.au | One of the founding membata,|
| Stallion Technologies, Queensland, Australia   | Society for Creative Pluri. |
| VoiceNet +61-7-3270-4266  Fax +61-7-3270-4245  | Linux: To connect and serve |
| Send mail with "Subject: sendpgpkey" for my PGP public key.    MIME mail OK  |
Aleph One
1998-Jun-02 07:51 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
On 2 Jun 1998, Christopher Biggs wrote:

> I think you misspelt "utter rubbish".
>
> By the poster's own theory of "equivalence of algorithm and key", then
> varying the algorithm is no different from increasing the key length.
>
> I challenge the poster to memorize a 2048-bit RSA key.
>
> If the algorithm is secret, then how is anybody else to understand
> your messages? I have a machine I use to achieve the same results: a
> shredder.

This is not a fair argument. If we take for example secret-key cryptography, we could well make the 'secret' key public and keep the algorithm secret. What it boils down to is that the distinction between data (key) and code (algorithm) has become less clear.

He does make some interesting arguments. The problem is that it is more difficult to design a good cipher than to select a good key (good keys depend on the algorithm anyway). By allowing the key to influence the algorithm you are shifting the burden of designing a good cipher from the cryptographer to the end user/software. It may indeed be possible to produce a strong cipher in such a manner, but it's certainly no easier a task than designing a strong regular cipher.

Aleph One / aleph1@dfw.net
http://underground.org/
KeyID 1024/948FD6B5
Fingerprint EE C9 E8 AA CB AF 09 61  8C 39 EA 47 A8 6A B8 01
lists@notatla.demon.co.uk
1998-Jun-03 17:22 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
X-Mailing-List: <linux-security@redhat.com> archive/latest/4

From: Andi Kleen <ak@muc.de>

> There are already lots of these ciphers. Examples are the Russian GOST or
> Bruce Schneier's Blowfish cipher where the SBOXes can be changed and kept
> secret. Most publicly available Blowfish and GOST implementations use fixed,
> known sboxes AFAIK (hexadecimal Pi in case of Blowfish, some standard set
> for GOST), but the ciphers were really designed to work with variable SBoxes.
>
> [mod: As I'm told, not just any S-boxes will do. You get a
> cryptographically weak cypher if you don't choose your S-boxes just
> right. Nobody knows how the DES people got it right, but they DID.
> Using Pi might give you a good source of pseudorandom numbers, but
> it is unlikely to provide good S-Boxes. -- REW]

Excuse me being a little behind with my mail...

Blowfish relies on S-boxes derived from the key in a complicated and slow way. Some of the initial setup uses Pi. What separates these pseudo-random boxes from the designed ones of DES is that these are very large (and are used partially in series in each round). 1970s hardware constrained DES to small S-boxes, which would be weak if random.
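The mechanism of deriving large S-boxes from the key can be sketched quickly. This is a toy stand-in for Blowfish's (much slower, much more elaborate) key schedule - the derivation via a SHA-256 byte stream and Fisher-Yates shuffles is invented for the example, and nothing here says anything about the cryptographic *quality* of the resulting boxes, which is exactly the moderator's point:

```python
import hashlib


def derive_sboxes(key: bytes, n_boxes: int = 4, size: int = 256) -> list:
    """Derive n_boxes key-dependent S-boxes, each a permutation of 0..size-1.

    Toy illustration only: expand the key into a pseudorandom byte
    stream, then shuffle each box with a Fisher-Yates walk driven by
    that stream. (The 16-bit draws have a tiny modulo bias; fine for
    a demo, not for real cryptography.)
    """
    needed = n_boxes * (size - 1) * 2          # 2 stream bytes per swap
    stream = b"".join(
        hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        for i in range(-(-needed // 32))       # ceil(needed / 32) SHA-256 blocks
    )
    pos = 0
    boxes = []
    for _ in range(n_boxes):
        box = list(range(size))
        for i in range(size - 1, 0, -1):       # Fisher-Yates shuffle
            j = int.from_bytes(stream[pos:pos + 2], "big") % (i + 1)
            pos += 2
            box[i], box[j] = box[j], box[i]
        boxes.append(box)
    return boxes
```

Each distinct key yields a distinct-looking set of boxes, so the boxes effectively become part of the key material - while leaving open whether any given set is cryptographically strong.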
Raphael Ho
1998-Jun-05 03:09 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
I am not a security/encryption expert - I just happened to have taken a one-year course on encryption theorems at university. Just my 2 cents.

Brandon K. Matthews wrote:

> Actually it is a very good idea to keep
> secret the algorithm (in all its representations), as long as you
> can afford to do so. That is why major governments do exactly
> that.

Gibberish. This way, nobody can check for flaws in the algorithm - in fact, this could theoretically allow the implementing person/organization to add "back doors" to decrypt your message. The only reason DES and PGP are so popular is that the source is open, and anybody can check for flaws themselves. I am sure some *unnamed* government would love to let you use their secret encryption program.

> As you see, the reasons are of a practical nature, and are not
> derived from any fundamentals in cryptology. If we could find a
> practical way to keep secret both the key (that is the data the
> encryption method operates on) and also the method itself (or at
> least part of the method), then security would be greatly
> enhanced because the attacker would have less knowledge to work
> with.

If the implementer kept part of the encryption method secret, then the implementer holds an amazing amount of power over any organization that uses that algorithm. C.f. my above comment, and that film featuring Sandra Bullock.

> a) Machine generated secret ciphers.
>
> Today there are only a few encryption algorithms that are
> generally accepted as good. But suppose there existed a generator
> program which could construct a new encryption algorithm
> depending on some random input.

How can you generate an algorithm based on an input? Even if you can, this random input is therefore (by definition) a 'key'. Crack the original key, along with the program that generates the algorithm, et voila! You have cracked the message.
This "random input" must also be transmitted to your destination in order for them to generate the decryption box. And how are you going to do that? Via DES?

> b) Intrinsically secret ciphers.
>
> Extend secrecy to parts of the encryption method. In his book,
> Schneier very briefly describes a variant of DES where the Sboxes
> (which most people would consider as part of the algorithm) are
> variable and depend on the key. Another very interesting
> possibility would have the key express the encryption method. In
> other words consider the key as the program, and the cipher
> simply as an interpreter, that follows the key's instructions to
> scramble the plaintext or unscramble the ciphertext. This would
> call for large keys, but not larger than keys used in public key
> encryption.

C.f. above comment. Secret algorithms are untrustworthy by definition. If the "secret algorithm" is only distributed to a small group of trusted people, then it might be acceptable. If this "secret algorithm" is distributed to anybody else, they just need to find your key to decrypt your message.

> c) "Variable" ciphers.
> Here is the definition of another cipher of this type (let us
> call it "heavy DES"): Start by randomly defining 16K DES keys;
> you need less than 1 MB of space on your hard disk to save them.
> Suppose that this large key set is public (either by choice or
> because an attacker gained access to your computer and stole it).
> So now you have a large set of DES ciphers with *known* keys; the
> effort to break any one of them is 1. Now define a secret key of
> 140 bits. Use 14 bits at a time to index one of the 16K DES

If you encrypt something twice with different keys, you can decrypt it with both keys - but mathematically, there is another key of a similar length that can decrypt that message equally well. (Since encryption is basically a factorization exercise, there is more than one way to skin a cat.)
I don't have a mathematical proof of this, so I will stand corrected if somebody can prove otherwise. If my assumptions are correct, you are only as secure as the plain vanilla "lightweight" DES: encrypting a message 140 times with 56-bit keys would not make it more secure. (Does someone know Phil Zimmermann here? Maybe he could end this once and for all.)

Just my $0.02.

Regards, Raf.
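The intuition that stacking encryptions adds less security than it seems is usually illustrated with the meet-in-the-middle attack (the reason double DES gives only about 57 bits of security, and why triple DES exists). A minimal sketch with a hypothetical one-byte toy cipher, not any real algorithm:

```python
# Toy cipher with an 8-bit key, used only to show the meet-in-the-
# middle idea: breaking double encryption with two 8-bit keys costs
# roughly 2 * 2^8 trial encryptions, not 2^16.
def enc(k, x):                   # one-byte block, one-byte key
    return ((x ^ k) * 7 + k) % 256

def dec(k, y):
    inv7 = pow(7, -1, 256)       # 7 is invertible mod 256
    return (((y - k) * inv7) % 256) ^ k

k1, k2 = 0x3A, 0xC5
plain = 0x42
cipher = enc(k2, enc(k1, plain))  # double encryption

# Meet in the middle: tabulate every single encryption of the known
# plaintext, then try every single decryption of the ciphertext and
# look for a match in the table.
middle = {}
for k in range(256):
    middle.setdefault(enc(k, plain), []).append(k)

candidates = [(ka, k) for k in range(256)
              for ka in middle.get(dec(k, cipher), [])]
assert (k1, k2) in candidates
```

With one known plaintext/ciphertext pair the work is about 2^n + 2^n instead of 2^(2n), so doubling up encryption with independent n-bit keys buys roughly one extra bit of security, not n.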
Richard Watts
1998-Jun-06 05:31 UTC
[linux-security] Re: "Flavors of Security Through Obscurity"
On Fri 5 June 1998, Raphael Ho <Raphael.Ho@globalone.net> wrote:

[ might it be a good idea to move this off-list? ]

> I am not a security/encryption expert - I just happened to have
> taken a 1 year course in encryption theorems in University.
>
> Just my 2 cents.
>
> Brandon K. Matthews wrote:
>
>> Actually it is a very good idea to keep
>> secret the algorithm (in all its representations), as long as you
>> can afford to do so. That is why major governments do exactly
>> that.
>
> Gibberish.
>
> This way, nobody can check for flaws in the algorithm

But that's OK in the intelligence sector, since the players are you (the NSA, for the sake of argument) and other SIGINT agencies: the other agencies are unlikely to tell you if they find a flaw, so your best bet is to keep the code secret and make their jobs slightly harder. Secret algorithms do place a real barrier in the way of the attacker. The argument against them is that, unless you are the NSA (and even then):

(i) This barrier is usually trivial for the determined attacker.

(ii) Keeping your algorithm secret means that you miss out on the peer review which could stop you deploying an algorithm with a serious flaw.

The (entirely separate) argument against using a crypto algorithm designed by someone else, whose security you can't verify, is that you're trusting your security to the designer of the algorithm, who typically cares much less about your secrets than you do.

[snip]

>> generally accepted as good. But suppose there existed a generator
>> program which could construct a new encryption algorithm
>> depending on some random input.
>
> How can you generate an algorithm based on an input? Even if
> it does, this random input is therefore (by definition) a 'key'. Crack the
> original key, along with the program that generates the algorithm,
> et voila! You have cracked the message. This "random input" must
> also be transmitted to your destination in order for them to generate
> the decryption box.
> And how are you going to do that? Via DES?

I believe that his point is that control-flow-modifying crypto algorithms have a much larger keyspace than non-self-modifying algorithms (and are harder to analyse). Unfortunately, that keyspace is also horribly sparse, and by his own argument there exists no (fast) general algorithm for determining whether you have a good key or not. The point about keys is not so much that they're small, as that the space is dense - _any_ DES key should be as good as any other: the effort required to discover whether my particular key is good should be small. There may well be (indeed, there must be) control-flow-modifying crypto algorithms with this property, but it'd be very hard to show that they were secure.

This (incidentally) is one potential way to `break' RSA: don't find a way to factor arbitrary large composites, just find an algorithm which can factor large composites whose factors fail a test as difficult as trial division - that opens up a lot more attacks, because it effectively cripples probabilistic primality testing and makes key generation a thoroughly horrible thing to do (cf. secure ZKPs involving Hamiltonian circuits).

[snip]

> If you encrypt something twice with different keys, you can decrypt it
> with both keys - but mathematically, there is another key of a similar
> length that can decrypt that message equally well. (Since encryption is
> basically a factorization exercise, there is more than one way to skin
> a cat.) I don't have mathematical proof of this, so I will stand
> corrected if somebody could prove otherwise.

Erm, this is exactly what it means to say that DES is not a group: in a group G with operation o, for any f and g there exists an h in G such that f o g = h (here f, g, h would be DES keys, and o is encryption). If that were true of DES, 3DES would be rather a silly thing to do (which is how the question came up in the first place). Schneier gives Campbell & Wiener in Proc.
Crypto '92 as a reference for this (though he defines this property in terms of closure and purity). Obviously there is a program which is the inverse of 2DES (or any other isomorphism you care to mention), but there is no practicable general way of finding such programs.

[snip]

Richard.
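The group-closure property Richard describes is easiest to see with a cipher that *is* a group (illustrative code, not from the thread): single-byte XOR, where composing two encryptions always equals one encryption under a third key, so stacking it adds nothing - exactly the property Campbell and Wiener showed DES does not have.

```python
# XOR with a fixed byte forms a group under composition: encrypting
# with f and then g is identical to encrypting once with h = f ^ g.
# If DES had this property, 3DES would be pointless.
def xor_enc(key, data):
    return bytes(b ^ key for b in data)

msg = b"hello"
f, g = 0x5C, 0xA3
double = xor_enc(g, xor_enc(f, msg))  # "double encryption"
h = f ^ g                             # the single equivalent key
assert xor_enc(h, msg) == double
```

Because DES is *not* closed in this way, no single 56-bit key reproduces a double encryption, which is what makes multiple DES encryption worthwhile at all (subject to the meet-in-the-middle limit discussed above).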