Friday, May 31, 2013
Thursday, May 30, 2013
the festivity of counts () = meterede hal bX-j1efvs found the soup encryption algotrithm
6or air or means u gets closer to weakness
asmoip it gets closer to weakness 5 naise eonaise
2eon re hee re hee gets closer to weakness
3 aor symoip malcuth 6
aor 6
eon 4 eon 6
asmoip eom 2
eom 6ixgetsclosertoweakness
symoip anerbgetsclosertoweakness
owteprngetsclosertoweakness
asmoipsourapplepopgetshoegisttoweakness
festivity of cont () = rianuis found the soupalgortihm
the dehsity period is in the lillys -intheouilis
-inyhrusoilus6
-avseinerteemeaasmoip 5 naise eonaise it was in the lillys
2eon
3 aor symoip malcuth 6
aor 6
eon 4 eon 6
asmoip eom 2
eom 6
symoip
owt
asmoipthenpassesoutonthe timre dimre
comparitive tonice of a ew tegeria () = souplill soupalgored to find the soup algorithm
to find the soupalgorithm
Network Security and Cryptography
#define PHASAR 0 /* q0, eq1; whereby we limit to SQUARE */
#define SQUARE 1 /* q1, eq0; whereby we limit to PHASAR */
#define ZINCID 2 /* qe, ec0; whereby we limit to ec1 */
#define RREHTM 3 /* -(o) qr, ec1; whereby we limit to ec2 */
#define ZDIFFN 4 /* re, lm2; lohit, baekmuk, cudjkuni; whereby we limit cosine dysfunction na square -.*/
#define XSINED 5 /* qn, r3d; lohit, baekmuk, cudjkuni; whereby we limit sine reduction between quad non-verbal square verbal */
#define EDESTR 6 /* eqe, nh-1; e tangent destruction in limit tandem with quad non-verbal and square verbal respectively */
extern u32 g1;
extern u32 g0;
extern u32 n0;
extern u32 he;
extern u32 n; /* SQUARE retention in tandem with PHASAR 6 non verbal(tb) e tangent limit square verbal(ta); whereby RREHTM is presupposed */
typedef struct {
u8 quad_tandm; /* 6 phases na square */
u32 limit_quad;
u8 square_tandm; /* 6 phases na quad */
u32 limit_square;
u8 zincid_tandm; /* incidence of na; (tb) nhe (ta); limit of pure -- +(o) -- e tangent destruction */
u32 limit_zincid;
} pt_t; /* positronium; means it's trying to get away from it */
typedef struct { /* square randem RREHTM na h of -(o); ( (a of -(h)) -- n + (R -- e) ) */
u32 (*go_nhe)(void *qr, void *q0, void *qe); /* verbal square na quad na tandem go-nhe; e tangent destruction na (o) cosine dysfunction */
u32 (*nhe_go)(void *re, void *q1, void *eqe); /* relative movement of verbal limit where -(0) is mundane seeking through negative quad */
u32 (*nhe_h)(void *qn, void *lm2);
} ps_t; /* means it was trying to get away from it */
INTRODUCTION Blowfish is a new variable-length-key, secret-key block cipher. It is a Feistel network, iterating a simple encryption function 16 times. Its main features are:
Block cipher: 64-bit block.
Variable key length: 32 bits to 448 bits.
Much faster than IDEA and DES.
Unpatented and royalty free.
No license required.
DESCRIPTION OF BLOWFISH Blowfish is a variable-length-key, 64-bit block cipher. The algorithm consists of two parts: a key-expansion part and a data-encryption part. Key expansion converts a variable-length key of at most 56 bytes (448 bits) into several sub key arrays totaling 4168 bytes. Data encryption occurs via a 16-round Feistel network. Each round consists of a key-dependent permutation, and a key- and data-dependent substitution. The only additional operations are four indexed array data lookups per round. Implementations of Blowfish that require the fastest speeds should unroll the loop and ensure that all sub keys are stored in cache. Sub keys: Blowfish uses a large number of sub keys. These keys must be precomputed before any data encryption or decryption.
1. The P-array consists of 18 32-bit sub keys: P1, P2,..., P18. 2. There are four 32-bit S-boxes with 256 entries each:
S1,0, S1,1,..., S1,255;
S2,0, S2,1,..., S2,255;
S3,0, S3,1,..., S3,255;
S4,0, S4,1,..., S4,255.

Encryption: Blowfish is a Feistel network consisting of 16 rounds. The input is a 64-bit data element, x.

Divide x into two 32-bit halves: xL, xR
For i = 1 to 16:
    xL = xL XOR Pi
    xR = F(xL) XOR xR
    Swap xL and xR
Next i
Swap xL and xR (undo the last swap)
xR = xR XOR P17
xL = xL XOR P18
Recombine xL and xR (to get the ciphertext)

Function F: Divide xL into four eight-bit quarters: a, b, c, and d. Then
F(xL) = ((S1,a + S2,b mod 2^32) XOR S3,c) + S4,d mod 2^32

Decryption is exactly the same as encryption, except that P1, P2,..., P18 are used in the reverse order.
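As a minimal sketch of the round structure and F function described above (not the reference implementation), the following C code assumes a context struct holding the P-array and S-boxes; the names blowfish_ctx, F, and blowfish_encrypt are illustrative rather than anything defined in the paper.

#include <stdint.h>

/* Hypothetical context: 18 P-array entries and four 256-entry S-boxes,
   as described in the text above. */
typedef struct {
    uint32_t P[18];
    uint32_t S[4][256];
} blowfish_ctx;

/* F splits the 32-bit half-block into four bytes a, b, c, d and combines the
   S-box lookups with addition mod 2^32 (implicit in uint32_t) and XOR. */
uint32_t F(const blowfish_ctx *ctx, uint32_t xL)
{
    uint8_t a = (uint8_t)(xL >> 24);
    uint8_t b = (uint8_t)(xL >> 16);
    uint8_t c = (uint8_t)(xL >> 8);
    uint8_t d = (uint8_t)xL;
    return ((ctx->S[0][a] + ctx->S[1][b]) ^ ctx->S[2][c]) + ctx->S[3][d];
}

/* Encrypt one 64-bit block held as two 32-bit halves: 16 Feistel rounds,
   undo the final swap, then apply P17 and P18. */
void blowfish_encrypt(const blowfish_ctx *ctx, uint32_t *pxL, uint32_t *pxR)
{
    uint32_t xL = *pxL, xR = *pxR, tmp;
    for (int i = 0; i < 16; i++) {
        xL ^= ctx->P[i];
        xR ^= F(ctx, xL);
        tmp = xL; xL = xR; xR = tmp;   /* swap halves */
    }
    tmp = xL; xL = xR; xR = tmp;       /* undo the last swap */
    xR ^= ctx->P[16];                  /* P17 */
    xL ^= ctx->P[17];                  /* P18 */
    *pxL = xL;
    *pxR = xR;
}

Decryption would use the same routine with the P-array entries applied in reverse order (P18 down to P1), as stated above.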
Generating the Sub keys: The sub keys are calculated using the Blowfish algorithm itself. The exact method is as follows:
1. Initialize first the P-array and then the four S-boxes, in order, with a fixed string. This string consists of the hexadecimal digits of pi (less the initial 3). For example: P1 = 0x243f6a88, P2 = 0x85a308d3, P3 = 0x13198a2e, P4 = 0x03707344.
2. XOR P1 with the first 32 bits of the key, XOR P2 with the second 32 bits of the key, and so on for all bits of the key (possibly up to P14). Repeatedly cycle through the key bits until the entire P-array has been XORed with key bits.
3. Encrypt the all-zero string with the Blowfish algorithm, using the sub keys described in steps (1) and (2).
4. Replace P1 and P2 with the output of step (3).
5. Encrypt the output of step (3) using the Blowfish algorithm with the modified sub keys.
6. Replace P3 and P4 with the output of step (5).
7. Continue the process, replacing all entries of the P-array, and then all four S-boxes in order, with the output of the continuously changing Blowfish algorithm.
In total, 521 iterations are required to generate all required sub keys. A C sketch of this loop follows.
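Continuing the earlier sketch, and under the same assumptions (blowfish_ctx and blowfish_encrypt are illustrative names, not part of the paper), the subkey-generation steps might look as follows in C. The pi-digit table pi_init is hypothetical and assumed to be supplied by the caller; it is not reproduced here.

/* Sketch of steps 1-7 above. pi_init is assumed to hold the 1042 32-bit
   words derived from the hexadecimal digits of pi: 18 P-array words
   followed by 4 x 256 S-box words. */
void blowfish_key_expand(blowfish_ctx *ctx, const uint32_t pi_init[1042],
                         const uint8_t *key, int keylen)
{
    /* Step 1: load the P-array and S-boxes with the fixed pi digits. */
    for (int i = 0; i < 18; i++)
        ctx->P[i] = pi_init[i];
    for (int s = 0; s < 4; s++)
        for (int i = 0; i < 256; i++)
            ctx->S[s][i] = pi_init[18 + s * 256 + i];

    /* Step 2: XOR the P-array with the key, cycling through the key bytes. */
    int k = 0;
    for (int i = 0; i < 18; i++) {
        uint32_t data = 0;
        for (int j = 0; j < 4; j++) {
            data = (data << 8) | key[k];
            k = (k + 1) % keylen;
        }
        ctx->P[i] ^= data;
    }

    /* Steps 3-7: repeatedly encrypt, starting from the all-zero block, and
       replace the P-array and then the S-boxes with the evolving output.
       This is 9 + 4*128 = 521 encryptions, matching the count in the text. */
    uint32_t xL = 0, xR = 0;
    for (int i = 0; i < 18; i += 2) {
        blowfish_encrypt(ctx, &xL, &xR);
        ctx->P[i] = xL;
        ctx->P[i + 1] = xR;
    }
    for (int s = 0; s < 4; s++) {
        for (int i = 0; i < 256; i += 2) {
            blowfish_encrypt(ctx, &xL, &xR);
            ctx->S[s][i] = xL;
            ctx->S[s][i + 1] = xR;
        }
    }
}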
The speed comparisons of block ciphers on a Pentium for different algorithms are given below.

Speed Comparisons of Block Ciphers on a Pentium

Algorithm   Clock cycles per round   # of rounds   Clock cycles per byte encrypted   Notes
Blowfish    9                        16            18                                Free, unpatented
DES         18                       16            45                                56-bit key
IDEA        50                       8             50                                Patented by Ascom-Systec
IMPLEMENTABLE PLATFORMS A standard encryption algorithm must be implementable on a variety of different platforms, each with its own requirements. These include:
Special hardware: The algorithm should be efficiently implementable in custom VLSI hardware.
Large processors: The algorithm should be efficient on 32-bit microprocessors with 4 KB program and data caches.
Medium-size processors: The algorithm should run on microcontrollers and other medium-size processors, such as the 68HC11.
Small processors: It should be possible to implement the algorithm on smart cards. The requirements for small processors are the most difficult: RAM and ROM limitations are severe for this platform, and efficiency is even more important on these small machines.
ADDITIONAL REQUIREMENTS These additional requirements should, if possible, be levied on a standard encryption algorithm.
It should be simple to code. If possible, the algorithm should be robust against implementation mistakes.
It should have a flat key space, allowing any random bit string of the required length to be a possible key. There should be no weak keys.
It should facilitate easy key-management for software implementations. In particular, the password that the user enters becomes the key.
It should be easily modifiable for different levels of security, both minimum and maximum requirements.
All operations should manipulate data in byte-sized blocks. Where possible, operations should manipulate data in 32-bit blocks.
Design Decisions for a Variable-Length-Key, 64-Bit Block Cipher Based on the above parameters, we have made these design decisions. The algorithm should:
Manipulate data in large blocks, preferably 32 bits in size.
Have either a 64-bit or a 128-bit block size.
Have a scalable key, from 32 bits to at least 256 bits.
Use simple operations that are efficient on microprocessors.
Be implementable on an 8-bit processor with a minimum of 24 bytes of RAM (in addition to the RAM required to store the key) and 1 kilobyte of ROM.
Consist of a variable number of iterations. For applications with a small key size, it is possible to reduce the number of iterations with no loss of security.
If possible, have no weak keys. If not, the proportion of weak keys should be small enough to make it unlikely to choose one at random.
Use sub keys that are a precomputable one-way hash of the key. This allows the use of long pass phrases for the key without compromising security.
Use a design that is simple to understand. This will facilitate analysis and increase the confidence in the algorithm.
BUILDING BLOCKS There are a number of building blocks that have been demonstrated to produce strong ciphers.
Large S-boxes: Larger S-boxes are more resistant to differential cryptanalysis. An algorithm with a 32-bit word length can use 32-bit S-boxes.
Key-dependent S-boxes: Key-dependent S-boxes are much more resistant to differential and linear cryptanalysis.
Combining operations: Combining operations from different algebraic groups, such as XOR, addition mod 2^16, and multiplication mod 2^16+1 [7].
Key-dependent permutations: The fixed initial and final permutations of DES have long been regarded as cryptographically worthless.
DESIGN DECISIONS A 64-bit block size yields a 32-bit word size, and maintains block-size compatibility with existing algorithms. Blowfish is easy to scale up to a 128-bit block, and down to smaller block sizes.

The Feistel network that makes up the body of Blowfish is designed to be as simple as possible, while still retaining the desirable cryptographic properties of the structure. In round i of a general Feistel network, Rn,i and Ni are reversible and non-reversible functions, respectively, of text and key. For speed and simplicity, XOR is chosen as the reversible function. This lets us collapse the four XORs into a single XOR, since: RP1,i+1 = R1,i+1 XOR R2,i-1 XOR R3,i XOR R4,i. This is the P-array substitution in Blowfish. The XOR can also be considered to be part of the non-reversible function, Ni, occurring at the end of the function.

Function F, the non-reversible function, gives Blowfish the best possible avalanche effect for a Feistel network: every text bit on the left half of the round affects every text bit on the right half. Additionally, since every sub key bit is affected by every key bit, the function also has a perfect avalanche effect between the key and the right half of the text after every round. Hence, the algorithm exhibits a perfect avalanche effect after three rounds and again every two rounds after that.

The non-reversible function is designed for strength, speed, and simplicity. Four different S-boxes are used instead of one S-box primarily to avoid symmetries when different bytes of the input are equal, or when the 32-bit input to function F is a bytewise permutation of another 32-bit input. The four-S-box design is faster, easier to program, and seems more secure. The function that combines the four S-box outputs should be as fast as possible. A simpler function would be to XOR the four values. The alternation of addition and XOR ends with an addition operation because an XOR combines the final result with xR. If the four indices chose values out of the same S-box, a more complex combining function would be required to eliminate symmetries.

As the structure of the S-boxes is completely hidden from the cryptanalyst, differential and linear cryptanalysis have a more difficult time exploiting that structure. While it would be possible to replace these variable S-boxes with four fixed S-boxes that were designed to be resistant to these attacks, key-dependent S-boxes are easier to implement and less susceptible to arguments of hidden properties. Additionally, these S-boxes can be created on demand, reducing the need for large data structures stored with the algorithm. Each bit of xL is only used as the input to one S-box. In DES many bits are used as inputs to two S-boxes, but this added complication is not necessary with key-dependent S-boxes. The P-array substitution can be considered to be part of the F function, and is already iteration-dependent.

The number of rounds is set at 16. This number affects the size of the P-array and therefore the subkey-generation process; 16 iterations permit key lengths up to 448 bits. The number of rounds can be reduced, greatly speeding up the algorithm in the process.

In algorithm design, there are two basic ways to ensure that the key is long enough to provide a particular security level. One is to carefully design the algorithm so that the entire entropy of the key is preserved, so there is no better way to cryptanalyze the algorithm other than brute force.
The other is to design the algorithm with so many key bits that attacks which reduce the effective key length by several bits are irrelevant. Since Blowfish is designed for large microprocessors with large amounts of memory, the latter approach is used.

The sub key generation process is designed to preserve the entire entropy of the key and to distribute that entropy uniformly throughout the sub keys. The digits of pi are chosen as the initial sub key table for two reasons: because it is a random sequence not related to the algorithm, and because it can either be stored as part of the algorithm or derived when needed. Any string of random bits--RAND tables, output of a random number generator--would suffice in place of pi. In the sub key generation process, the sub keys change slightly with every pair of sub keys generated. This is to protect against any attacks of the sub key generation process. It also reduces storage requirements. The 448-bit limit on key size ensures that every bit of every sub key depends on every bit of the key.

The key bits are repeatedly XORed with the digits of pi in the initial P-array to prevent the following potential attack: Assume that the key bits are not repeated, but instead padded with zeros to extend the key to the length of the P-array. An attacker might find two keys that differ only in the 64-bit value XORed with P1 and P2 that produce the same encrypted value. If so, he can find two keys that produce all the same sub keys. This is a highly tempting attack for a malicious key generator. To prevent the same type of attack, the initial plaintext value in the sub key-generation process is fixed. Highly correlated key bits, such as an alphanumeric ASCII string with the high bit of every byte set to 0, will produce random sub keys. The time-consuming subkey-generation process adds considerable complexity to a brute-force attack: a total of 522 iterations of the algorithm are required to test a single key.
POSSIBLE SIMPLIFICATIONS Several possible simplifications, aimed at decreasing memory requirements and execution time, are outlined below:
Fewer and smaller S-boxes: It may be possible to reduce the number of S-boxes from four to one, reducing the memory requirements for the S-boxes from 4096 bytes to 1024 bytes. Additionally, it may be possible to overlap entries in a single S-box to reduce the requirements from 1024 bytes to 259 bytes.
Fewer iterations: It is probably safe to reduce the number of iterations from 16 to 8 without compromising security.
On-the-fly subkey calculation: The current method of subkey calculation requires all sub keys to be calculated in advance of any data encryption. An alternative method is one where every subkey can be calculated independently of any other.
CRYPTANALYSIS OF BLOWFISH
The most interesting results are:
John Kelsey developed an attack that could break 3-round Blowfish, but was unable to extend it. This attack exploits the F function and the fact that addition mod 2^32 and XOR do not commute. Serge Vaudenay examined a simplified variant of Blowfish, with the S-boxes known and not key-dependent.
The discovery of weak keys in Blowfish is significant. A weak key is one for which two entries of a given S-box are identical. To detect weak keys, we have to do the key expansion and check for identical S-box entries after generating a Blowfish key, as in the sketch below.
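A minimal sketch of such a check, assuming the illustrative blowfish_ctx layout from the earlier sketches (not part of the paper):

/* Returns 1 if any S-box contains two identical entries, i.e. the expanded
   key is weak in the sense described above; returns 0 otherwise. */
int blowfish_has_weak_sboxes(const blowfish_ctx *ctx)
{
    for (int s = 0; s < 4; s++)
        for (int i = 0; i < 255; i++)
            for (int j = i + 1; j < 256; j++)
                if (ctx->S[s][i] == ctx->S[s][j])
                    return 1;
    return 0;
}

If the check reports a weak key, the simplest remedy is to discard that key and generate a new one before using it for encryption.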
AREAS OF APPLICATION A standard encryption algorithm must be suitable for different applications:
Bulk encryption: The algorithm should be efficient in encrypting data files or a continuous data stream.
Random bit generation: The algorithm should be efficient in producing single random bits.
Packet encryption: The algorithm should be efficient in encrypting packet-sized data.
Hashing: The algorithm should be efficient in being converted to a one-way hash function.
Products that use Blowfish The products that use the Blowfish encryption algorithm include: BF-SDK (Blowfish Software Development Kit), which provides the basic functions to encrypt and decrypt data. CertifiedMail.com is a website that provides encrypted message delivery, using Blowfish to transmit messages from an e-mail client to the CertifiedMail server and then storing messages with Blowfish. OpenBSD is a free Unix-like operating system that uses Blowfish by default for one-way password encryption. Scramdisk is a disk-encryption product for Windows 95 and Windows 98. Ultra-Scan is an ultrasonic fingerprint scanner that uses Blowfish to encrypt the fingerprint images.
CONCLUSION:
In this paper we discussed Blowfish, a variable-length-key block cipher. It is only suitable for applications where the key does not change often, like a communications link or an automatic file encryptor. It is significantly faster than DES when implemented on 32-bit microprocessors with large data caches, such as the Pentium and the PowerPC. Although a complex initialization phase is required before any encryption can take place, the actual encryption of data is very efficient on large microprocessors. Linux includes Blowfish in the mainline kernel, starting with v2.5.47.
Blowfish is a 16-round block encryption algorithm that has never been broken; the most efficient way to break Blowfish is through exhaustive search of the key space. Although a number of excellent algorithms have been developed, Blowfish is used frequently because:
It has been repeatedly tested and found to be very secure.
It is extremely fast, because it takes advantage of built-in instructions on current microprocessors for basic bit-shuffling operations.
It was placed in the public domain.
REFERENCES
[1] E. Biham and A. Shamir, Differential Cryptanalysis of the Data Encryption Standard, Springer-Verlag, 1993.
[2] H. Feistel, "Cryptography and Computer Privacy," Scientific American, May 1973.
[3] B. Schneier, "Data Guardians," MacWorld, Feb 1993.
[4] B. Schneier, Applied Cryptography, John Wiley & Sons, New York, 1994.
[5] J.L. Smith, The Design of Lucifer: A Cryptographic Device for Data Communication, RC 3326, White Plains: IBM Research.
[6] http://www.howstuffworks.com
advanced carcinogenesis analyis to the selsus flower means metuern and the invent of scioinoice
advanced carcinogeneisis-EOHEOAEOAEOH analysis is the first wind of the invent of scienoice
the first invent of scienoice is meteurn
the fluke television betrexe algorithm
the zorbeind intelligence nec oroforce alorein ian tonic alorian
the oroforce ou-lis force alorian or the fluke of the zorbein myrthos neon uneon or the twelve bed radical of the 15 different sun torus cancer movement
also it might be called the fluke of the century or the history of the future or the future of enternia or the enternia dormium future encapsulation
or the uenteurnia/::/unix\::\inix/::/linux\::\riauis/::\iasrian\::/seulnthncs
elordonejustsomthingohhmm="ok lets see somthing the clear math"
another wierd thing is i selected the whole thing
i selected it all
was all
selecting it all
somehow weirdly there was a sainey by selecting some and then certain then all and it was somehow weirdly being or benign in some sainey or it was weird but correct
selecting some and then certain then all
selecting certain and then some and then all
applied carcinogenesis means () =
applied ian nal tonic or that light and matter and space are tonic before the phenomenon so that the applied tonic to the word becomes tonic to the phenomenon so that true carcinogenesis of the phenomenon can have true carcinogenesis or true phenomenon of eternia
a eternia dormium is void of carcinogenesis suggest that the face or the sccas of the emptiness per peroid ehld scca or nhe must use tonic to attain true carcinogenesis
the word nhe always has the tonic to have void or vague memory not - 1 ihm sje where it is dry to the character of pressure
physical ian nal tonic of the zorbeind word physical
means to physical pavements of layers
or that the cosmic layer and the pavement of two is physical
truisms means queals metruen netophyr stages of entropy - 1 or that separists stages in tileine always meant cosmic power
a truism in tileine is a separitist wind before the flower of fouriyour
the flower of fouriyour is a cosmic string that means tileine
tileine simply means to give fouriyour to the cosmic wind
or to compress the cosmic wind to answer the fourth wind of the tileine stage of the compression of the cosmic compression of the tileine or the cancer wind
the cancer wind is a torus
the torus cancer movement end is a wind that has a particular separitist agreement with the notion agreement that means seminal or seminole or semihnal or seminal
the first wind of the cosmic tileine is the word siminal
the cancer wind of the cosmic tileine is the word compression
the first wind and the second wind is the cancer of the geome metrun of the cosmic disposition of the tileine and the tileine winds or the well of the word of the second words with the fourth words of the wind and the compression of the temporal strings of tileine and tileeine are the separitists winds that take place before the cosmic string can enter Zorbeinseiey or absolute chameloeyde of float disposition
so you almost know the cosmic wind means to float and disapear
cosmological iananl tonic of the word comso-synthesis can mean () =
a comso logic al or al go-rithm syn-thesis wind or a logic of wind can mean () =
a cosmoa logic of wind can mean () =
a somocosm is a wind al -rith mean wind () = (queals)(euequels)(euegules) wind ?
a comsic wind can al -thirhm maen wind antian fouryiyour fouieyrour the cosmic wind can cosmic the wind of fouyriyour
a selsus flower and a zelsus flower have always to the mind of tileine meant the first wind of the cosmic fouyriyour or that the word journey and wind always mean the behind wind of the first journey of the cosmic wind to the well of the word of tileine
the first tileine to my knowledge had no cosmic wind only it meant that it was a selsus flower that always was cosmic
EOHAOEOEOAEOHEOH analysis
comparitive iannal tonic to the word central suggest to my and !ts mind that it simply means wind
central analysis ian nal to-OHE the word -WELL o-HE wor -de-o unnieant is the motion behind the wind almost ive (h)i (v)-h h(ive to the wind is the central wind h(ive) in the central well of the ive without the h can mean the central word without an h can be the wind or the h of the well of the word that would represent the well of the wind before the h took the next (ive) without the h to the well of the (h)ive of the central word without the h to the next h without the word and without the well so that the central of 3c102 well of the h was always behind the well of the h so that the word central always meant that the wind was always behind the wind which represented the central well of the wind behind the wind and before the well so that the central wind was always behind the wind and the central well was always behind the wind behind the wind so that the word central had the meaning of the wind behind the wind and behind the word so that the word central would represent the central wind behind the wind so that the central meaning of the word wind and the wind word central would mean always behind the wind so that the meaning of central always meant the perfect wind of the central well of the eternel wind or the first and perfect wind was always central to have been behind or that the word central meant that the first wind and the first meaning of the word central was always perfect before the meaning of the word central or that all perfection is behind the wind so that the eternel wind meant the first central wind
non -on o- n- and the qaud of scca
t-ou-on o-o catain philosophy the-EOA-n- and the w-EOH-((nd thell in word ehmed theomorphic cphelbaoum cantian and the notive theiomopheec -OH-qaud of sccametruen and the kant and cant scienoice quadsaccording
central-OH-EOHAOEOEOAEOHEOH analysis of the word central
thie first thought to my mind of the word central can be a little to the left of the notion behind anything relating to some notion in shod meleos the thot the mind the word central. there is somthing central in the notion that when guiding well with a word the central part of the word to my my is well first then the notion the the word is well with the next word that has a central motor with the well of the word. the word central can me also that the wind is central to the word or that four words and four winds can guide the word or the word next to the wind of its central word that has the next word well with the word of the first word next to the word is the well of the word or that the word well and the next word with the first word is well with the word or that it is well with the wind
according to catain philosophy the well in word ehmed theomorphic cphelbaoum cantian and the notive theiomopheec metruen and the kant and cant scienoice quad non -on o- n- and the quad of sccas
Origin
The phrase "anthropic principle" first appeared in Brandon Carter's contribution to a 1973 Kraków symposium honouring Copernicus's 500th birthday. Carter, a theoretical astrophysicist, articulated the Anthropic Principle in reaction to the Copernican Principle, which states that humans do not occupy a privileged position in the Universe. As Carter said: "Although our situation is not necessarily central, it is inevitably privileged to some extent."[14] Specifically, Carter disagreed with using the Copernican principle to justify thePerfect Cosmological Principle, which states that all large regions and times in the universe must be statistically identical. The latter principle underlay the steady-state theory, which had recently been falsified by the 1965 discovery of the cosmic microwave background radiation. This discovery was unequivocal evidence that the universe has changed radically over time (for example, via the Big Bang).
Carter defined two forms of the Anthropic Principle, a "weak" one which referred only to anthropic selection of privileged spacetime locations in the universe, and a more controversial "strong" form which addressed the values of the fundamental constants of physics.
Roger Penrose explained the weak form as follows:
"The argument can be used to explain why the conditions happen to be just right for the existence of (intelligent) life on the earth at the present time. For if they were not just right, then we should not have found ourselves to be here now, but somewhere else, at some other appropriate time. This principle was used very effectively by Brandon Carter and Robert Dicke to resolve an issue that had puzzled physicists for a good many years. The issue concerned various striking numerical relations that are observed to hold between the physical constants (the gravitational constant, the mass of the proton, the age of the universe, etc.). A puzzling aspect of this was that some of the relations hold only at the present epoch in the earth's history, so we appear, coincidentally, to be living at a very special time (give or take a few million years!). This was later explained, by Carter and Dicke, by the fact that this epoch coincided with the lifetime of what are called main-sequence stars, such as the sun. At any other epoch, so the argument ran, there would be no intelligent life around in order to measure the physical constants in question — so the coincidence had to hold, simply because there would be intelligent life around only at the particular time that the coincidence did hold!"—The Emperor's New Mind, Chapter 10
One reason this is plausible is that there are many other places and times in which we can imagine finding ourselves. But when applying the strong principle, we only have one Universe, with one set of fundamental parameters, so what exactly is the point being made? Carter offers two possibilities: First, we can use our own existence to make "predictions" about the parameters. But second, "as a last resort", we can convert these predictions into explanations by assuming that there is more than one Universe, in fact a large and possibly infinite collection of universes, something that is now called a multiverse ("world ensemble" was Carter's term), in which the parameters (and perhaps the laws of physics) vary across universes. The strong principle then becomes an example of a selection effect, exactly analogous to the weak principle. Postulating a multiverse is certainly a radical step, but taking it could provide at least a partial answer to a question which had seemed to be out of the reach of normal science: "why do the fundamental laws of physics take the particular form we observe and not another?"
Since Carter's 1973 paper, the term "Anthropic Principle" has been extended to cover a number of ideas which differ in important ways from those he espoused. Particular confusion was caused in 1986 by the book The Anthropic Cosmological Principle by John D. Barrow and Frank Tipler,[15] which distinguished between "weak" and "strong" anthropic principles in a way very different from Carter's, as discussed in the next section.
Carter was not the first to invoke some form of the anthropic principle. In fact, the evolutionary biologist Alfred Russel Wallace anticipated the anthropic principle as long ago as 1904: "Such a vast and complex universe as that which we know exists around us, may have been absolutely required ... in order to produce a world that should be precisely adapted in every detail for the orderly development of life culminating in man."[16] In 1957, Robert Dicke wrote: "The age of the Universe 'now' is not random but conditioned by biological factors ... [changes in the values of the fundamental constants of physics] would preclude the existence of man to consider the problem."[17]
Variants
Weak anthropic principle (WAP) (Carter): "we must be prepared to take account of the fact that our location in the universe is necessarily privileged to the extent of being compatible with our existence as observers." Note that for Carter, "location" refers to our location in time as well as space.
Strong anthropic principle (SAP) (Carter): "the Universe (and hence the fundamental parameters on which it depends) must be such as to admit the creation of observers within it at some stage. To paraphrase Descartes, cogito ergo mundus talis est."
The Latin tag ("I think, therefore the world is such [as it is]") makes it clear that "must" indicates a deduction from the fact of our existence; the statement is thus a truism.
In their 1986 book, The Anthropic Cosmological Principle, John Barrow and Frank Tipler depart from Carter and define the WAP and SAP as follows:[18][19]
Weak anthropic principle (WAP) (Barrow and Tipler): "The observed values of all physical and cosmological quantities are not equally probable but they take on values restricted by the requirement that there exist sites where carbon-based life can evolve and by the requirements that the Universe be old enough for it to have already done so."[20]
Unlike Carter they restrict the principle to carbon-based life, rather than just "observers." A more important difference is that they apply the WAP to the fundamental physical constants, such as the fine structure constant, the number of spacetime dimensions, and the cosmological constant — topics that fall under Carter's SAP.
Strong anthropic principle (SAP) (Barrow and Tipler): "The Universe must have those properties which allow life to develop within it at some stage in its history."[21]
This looks very similar to Carter's SAP, but unlike the case with Carter's SAP, the "must" is an imperative, as shown by the following three possible elaborations of the SAP, each proposed by Barrow and Tipler:[22]
- "There exists one possible Universe 'designed' with the goal of generating and sustaining 'observers'."
- This can be seen as simply the classic design argument restated in the garb of contemporary cosmology. It implies that the purpose of the universe is to give rise to intelligent life, with the laws of nature and their fundamental physical constants set to ensure that life as we know it will emerge and evolve.
- "Observers are necessary to bring the Universe into being."
- Barrow and Tipler believe that this is a valid conclusion from quantum mechanics, as John Archibald Wheeler has suggested, especially via his idea that information is the fundamental reality (see It from bit) and his Participatory Anthropic Principle (PAP), which is an interpretation of quantum mechanics associated with the ideas of John von Neumann and Eugene Wigner.
- "An ensemble of other different universes is necessary for the existence of our Universe."
- By contrast, Carter merely says that an ensemble of universes is necessary for the SAP to count as an explanation.
Modified anthropic principle (MAP) (Schmidhuber): The 'problem' of existence is only relevant to a species capable of formulating the question. Prior to Homo sapiens' intellectual evolution to the point where the nature of the observed universe - and humans' place within it - spawned deep inquiry into its origins, the 'problem' simply did not exist.[23]
The philosophers John Leslie[24] and Nick Bostrom[25] reject the Barrow and Tipler SAP as a fundamental misreading of Carter. For Bostrom, Carter's anthropic principle just warns us to make allowance for anthropic bias, that is, the bias created by anthropic selection effects (which Bostrom calls "observation" selection effects) — the necessity for observers to exist in order to get a result. He writes:
"Many 'anthropic principles' are simply confused. Some, especially those drawing inspiration from Brandon Carter's seminal papers, are sound, but... they are too weak to do any real scientific work. In particular, I argue that existing methodology does not permit any observational consequences to be derived from contemporary cosmological theories, though these theories quite plainly can be and are being tested empirically by astronomers. What is needed to bridge this methodological gap is a more adequate formulation of how observation selection effects are to be taken into account."—Anthropic Bias, Introduction., [26]
Strong self-sampling assumption (SSSA) (Bostrom): "Each observer-moment should reason as if it were randomly selected from the class of all observer-moments in its reference class."
Analysing an observer's experience into a sequence of "observer-moments" helps avoid certain paradoxes; but the main ambiguity is the selection of the appropriate "reference class": for Carter's WAP this might correspond to all real or potential observer-moments in our universe; for the SAP, to all in the multiverse. Bostrom's mathematical development shows that choosing either too broad or too narrow a reference class leads to counter-intuitive results, but he is not able to prescribe an ideal choice.
According to Jürgen Schmidhuber, the anthropic principle essentially just says that the conditional probability of finding yourself in a universe compatible with your existence is always 1. It does not allow for any additional nontrivial predictions such as "gravity won't change tomorrow." To gain more predictive power, additional assumptions on the prior distribution of alternative universes are necessary.[23][27]
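Written out compactly (this is only a paraphrase of the sentence above, not notation taken from Schmidhuber's papers), the claim is
P(\text{you find yourself in a universe compatible with your existence} \mid \text{you exist}) = 1,
so any nontrivial prediction, such as the value of a constant tomorrow, requires an assumed prior P(U) over the ensemble of alternative universes U.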
Playwright and novelist Michael Frayn describes a form of the Strong Anthropic Principle in his 2006 book The Human Touch, which explores what he characterises as "the central oddity of the Universe":
"It's this simple paradox. The Universe is very old and very large. Humankind, by comparison, is only a tiny disturbance in one small corner of it - and a very recent one. Yet the universe is only very large and very old because we are here to say it is... And yet, of course, we all know perfectly well that it is what it is whether we are here or not."—[28]
Character of anthropic reasoning
Carter chose to focus on a tautological aspect of his ideas, which has resulted in much confusion. In fact, anthropic reasoning interests scientists because of something that is only implicit in the above formal definitions, namely that we should give serious consideration to there being other universes with different values of the "fundamental parameters" — that is, the dimensionless physical constants and initial conditions for the Big Bang. Carter and others have argued that life as we know it would not be possible in most such universes. In other words, the universe we are in is fine tuned to permit life. Collins & Hawking (1973) characterized Carter's then-unpublished big idea as the postulate that "there is not one universe but a whole infinite ensemble of universes with all possible initial conditions".[29] If this is granted, the anthropic principle provides a plausible explanation for the fine tuning of our universe: the "typical" universe is not fine-tuned, but given enough universes, a small fraction thereof will be capable of supporting intelligent life. Ours must be one of these, and so the observed fine tuning should be no cause for wonder.
Although philosophers have discussed related concepts for centuries, in the early 1970s the only genuine physical theory yielding a multiverse of sorts was the many worlds interpretation of quantum mechanics. This would allow variation in initial conditions, but not in the truly fundamental constants. Since that time a number of mechanisms for producing a multiverse have been suggested: see the review by Max Tegmark.[30] An important development in the 1980s was the combination of inflation theory with the hypothesis that some parameters are determined by symmetry breaking in the early universe, which allows parameters previously thought of as "fundamental constants" to vary over very large distances, thus eroding the distinction between Carter's weak and strong principles. At the beginning of the 21st century, the string landscape emerged as a mechanism for varying essentially all the constants, including the number of spatial dimensions.[31]
The anthropic idea that fundamental parameters are selected from a multitude of different possibilities (each actual in some universe or other) contrasts with the traditional hope of physicists for a theory of everything having no free parameters: as Einstein said, "What really interests me is whether God had any choice in the creation of the world." In 2002, proponents of the leading candidate for a "theory of everything", string theory, proclaimed "the end of the anthropic principle"[32] since there would be no free parameters to select. Ironically, string theory now seems to offer no hope of predicting fundamental parameters, and now some who advocate it invoke the anthropic principle as well (see below).
The modern form of a design argument is put forth by Intelligent design. Proponents of intelligent design often cite the fine-tuning observations that (in part) preceded the formulation of the anthropic principle by Carter as a proof of an intelligent designer. Opponents of intelligent design are not limited to those who hypothesize that other universes exist; they may also argue, anti-anthropically, that the universe is less fine-tuned than often claimed, or that accepting fine tuning as a brute fact is less astonishing than the idea of an intelligent creator. Furthermore, even accepting fine tuning, Sober (2005)[33] and Ikeda and Jefferys,[34][35] argue that the Anthropic Principle as conventionally stated actually undermines intelligent design; see fine-tuned universe.
Paul Davies's book The Goldilocks Enigma (2006) reviews the current state of the fine tuning debate in detail, and concludes by enumerating the following responses to that debate:
1. The absurd universe: Our universe just happens to be the way it is.
2. The unique universe: There is a deep underlying unity in physics which necessitates the Universe being the way it is. Some Theory of Everything will explain why the various features of the Universe must have exactly the values that we see.
3. The multiverse: Multiple universes exist, having all possible combinations of characteristics, and we inevitably find ourselves within a universe that allows us to exist.
4. Intelligent Design: A creator designed the Universe with the purpose of supporting complexity and the emergence of intelligence.
5. The life principle: There is an underlying principle that constrains the Universe to evolve towards life and mind.
6. The self-explaining universe: A closed explanatory or causal loop: "perhaps only universes with a capacity for consciousness can exist." This is Wheeler's Participatory Anthropic Principle (PAP).
7. The fake universe: We live inside a virtual reality simulation.
Omitted here is Lee Smolin's model of cosmological natural selection, also known as "fecund universes," which proposes that universes have "offspring" which are more plentiful if they resemble our universe. Also see Gardner (2005).[36]
Clearly each of these hypotheses resolves some aspects of the puzzle while leaving others unanswered. Followers of Carter would admit only option 3 as an anthropic explanation, whereas options 3 through 6 are covered by different versions of Barrow and Tipler's SAP (which would also include 7 if it is considered a variant of 4, as in Tipler 1994).
The anthropic principle, at least as Carter conceived it, can be applied on scales much smaller than the whole universe. For example, Carter (1983)[37] inverted the usual line of reasoning and pointed out that when interpreting the evolutionary record, one must take into account cosmological and astrophysical considerations. With this in mind, Carter concluded that given the best estimates of the age of the universe, the evolutionary chain culminating in Homo sapiens probably admits only one or two low probability links. Antonio Feoli and Salvatore Rampone dispute this conclusion, arguing instead that the estimated size of our universe and the number of planets in it allow for a higher bound, so that there is no need to invoke intelligent design to explain evolution.[38]
Observational evidence
No possible observational evidence bears on Carter's WAP, as it is merely advice to the scientist and asserts nothing debatable. The obvious test of Barrow's SAP, which says that the Universe is "required" to support life, is to find evidence of life in universes other than ours. Any other universe is, by most definitions, unobservable (otherwise it would be included in our portion of this universe). Thus, in principle Barrow's SAP cannot be falsified by observing a universe in which an observer cannot exist. Philosopher John Leslie[39] states that the Carter SAP (with multiverse) predicts the following:
- Physical theory will evolve so as to strengthen the hypothesis that early phase transitions occur probabilistically rather than deterministically, in which case there will be no deep physical reason for the values of fundamental constants;
- Various theories for generating multiple universes will prove robust;
- Evidence that the universe is fine tuned will continue to accumulate;
- No life with a non-carbon chemistry will be discovered;
- Mathematical studies of galaxy formation will confirm that it is sensitive to the rate of expansion of the universe.
Hogan[40] has emphasised that it would be very strange if all fundamental constants were strictly determined, since this would leave us with no ready explanation for apparent fine tuning. In fact we might have to resort to something akin to Barrow and Tipler's SAP: there would be no option for such a universe not to support life.
Probabilistic predictions of parameter values can be made given:
- a particular multiverse with a "measure", i.e. a well defined "density of universes" (so, for parameter X, one can calculate the prior probability P(X0) dX that X is in the range X0 < X < X0 + dX), and
- an estimate of the number of observers in each universe, N(X) (e.g., this might be taken as proportional to the number of stars in the universe).
The probability of observing value X is then proportional to N(X) P(X). (A more sophisticated analysis is that of Nick Bostrom.)[41] A generic feature of an analysis of this nature is that the expected values of the fundamental physical constants should not be "over-tuned," i.e. if there is some perfectly tuned predicted value (e.g. zero), the observed value need be no closer to that predicted value than what is required to make life possible. The small but finite value of the cosmological constant can be regarded as a successful prediction in this sense.
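As a toy illustration of this weighting (not anything drawn from the cited analyses), the short Python sketch below multiplies an assumed prior density P(X) by an assumed observer count N(X) and normalises the product; every functional form and number in it is invented purely for demonstration, and only the proportionality to N(X) P(X) comes from the text above.
# Toy anthropic weighting: the probability of observing parameter value X is
# taken proportional to N(X) * P(X). The functional forms are hypothetical
# placeholders chosen only so the example runs end to end.
import numpy as np

X = np.linspace(0.0, 10.0, 1001)        # grid of candidate parameter values
prior = np.exp(-0.5 * X)                # assumed multiverse measure P(X), peaked at X = 0
observers = np.exp(-((X - 3.0) ** 2))   # assumed observer count N(X), peaked at X = 3

posterior = prior * observers           # proportional to N(X) * P(X)
posterior /= np.trapz(posterior, X)     # normalise to a probability density

expected_X = np.trapz(X * posterior, X)
print(f"a typical observer expects X near {expected_X:.2f}")   # about 2.75
The expected value lands near where the observers are plentiful, shifted slightly toward the prior, rather than at the prior's own peak at zero; that is the sense in which the predicted value is not "over-tuned."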
One thing that would not count as evidence for the Anthropic Principle is evidence that the Earth or the solar system occupied a privileged position in the universe, in violation of the Copernican principle (for possible counterevidence to this principle, see Copernican principle), unless there was some reason to think that that position was a necessary condition for our existence as observers.
Applications of the principle
The nucleosynthesis of carbon-12
Fred Hoyle may have invoked anthropic reasoning to predict an astrophysical phenomenon. He is said to have reasoned from the prevalence on earth of life forms whose chemistry was based on carbon-12 atoms, that there must be an undiscovered resonance in the carbon-12 nucleus facilitating its synthesis in stellar interiors via the triple-alpha process. He then calculated the energy of this undiscovered resonance to be 7.6 million electron-volts.[42][43] Willie Fowler's research group soon found this resonance, and its measured energy was close to Hoyle's prediction.
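For readers who want the arithmetic behind that figure, here is a minimal Python sketch using standard atomic masses. It reproduces only the energy bookkeeping (the three-alpha threshold above the carbon-12 ground state), not Hoyle's actual stellar reaction-rate argument.
# Energy of three free alpha particles (helium-4) relative to the carbon-12
# ground state. Atomic masses can be used because the electron masses cancel.
U_TO_MEV = 931.494        # MeV per unified atomic mass unit
M_HE4 = 4.002602          # atomic mass of helium-4, in u
M_C12 = 12.000000         # atomic mass of carbon-12, in u (exact by definition)

threshold = (3 * M_HE4 - M_C12) * U_TO_MEV
print(f"three-alpha threshold: {threshold:.2f} MeV above the 12C ground state")
# prints about 7.27 MeV; a resonance near 7.6 MeV therefore sits just above
# this threshold, which is what makes the triple-alpha process efficient.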
However, a recently released paper argues that Hoyle did not use anthropic reasoning to make this prediction.[44]
Cosmic inflation
Main article: Cosmic inflation
Don Page criticized the entire theory of cosmic inflation as follows.[45] He emphasized that the initial conditions that made possible a thermodynamic arrow of time in a universe with a Big Bang origin must include the assumption that, at the initial singularity, the entropy of the universe was low and therefore extremely improbable. Paul Davies rebutted this criticism by invoking an inflationary version of the anthropic principle.[46] While Davies accepted the premise that the initial state of the visible Universe (which filled a microscopic amount of space before inflating) had to possess a very low entropy value — due to random quantum fluctuations — to account for the observed thermodynamic arrow of time, he deemed this fact an advantage for the theory. That the tiny patch of space from which our observable Universe grew had to be extremely orderly, to allow the post-inflation universe to have an arrow of time, makes it unnecessary to adopt any "ad hoc" hypotheses about the initial entropy state, hypotheses other Big Bang theories require.
String theory
Main article: String theory landscape
String theory predicts a large number of possible universes, called the "backgrounds" or "vacua." The set of these vacua is often called the "multiverse" or "anthropic landscape" or "string landscape." Leonard Susskind has argued that the existence of a large number of vacua puts anthropic reasoning on firm ground: only universes whose properties are such as to allow observers to exist are observed, while a possibly much larger set of universes lacking such properties go unnoticed.
Steven Weinberg[47] believes the Anthropic Principle may be appropriated by cosmologists committed to nontheism, and refers to that Principle as a "turning point" in modern science because applying it to the string landscape "...may explain how the constants of nature that we observe can take values suitable for life without being fine-tuned by a benevolent creator." Others, most notably David Gross but also Lubos Motl, Peter Woit, and Lee Smolin, argue that this is not predictive. Max Tegmark,[48] Mario Livio, and Martin Rees[49] argue that only some aspects of a physical theory need be observable and/or testable for the theory to be accepted, and that many well-accepted theories are far from completely testable at present.
Jürgen Schmidhuber (2000–2002) points out that Ray Solomonoff's theory of universal inductive inference and its extensions already provide a framework for maximizing our confidence in any theory, given a limited sequence of physical observations, and some prior distribution on the set of possible explanations of the universe.
Ice density
When water freezes into ice, the ice floats because ice is less dense than liquid water. This is one possible example of the anthropic principle, because if ice did not float, it might have been difficult or impossible for living organisms to have existed in water; without the insulating properties of a top ice layer, lakes and ponds would tend to freeze solid and thaw very little during warmer periods. This principle has been criticized as neglecting the existence of the tropical zone and other warmer climates.
Ice is unusual in that it is roughly 8% less dense than liquid water (equivalently, water expands by about 9% in volume when it freezes). Water is the only known non-metallic substance to expand when it freezes. The density of ice is 0.9167 g/cm3 at 0°C, whereas water has a density of 0.9998 g/cm3 at the same temperature. Liquid water is densest, essentially 1.00 g/cm3, at 4°C and becomes less dense as the water molecules begin to form the hexagonal crystals[50] of ice as the freezing point is reached. This is due to hydrogen bonding dominating the intermolecular forces, which results in a less compact packing of molecules in the solid.
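The percentages follow directly from the densities quoted above; here is a quick Python check (simple arithmetic on the two numbers given in the text, nothing more).
# Check the ice/water figures quoted above (densities in g/cm3 at 0 degrees C).
RHO_ICE = 0.9167
RHO_WATER = 0.9998

density_drop = (1 - RHO_ICE / RHO_WATER) * 100     # how much less dense ice is
volume_gain = (RHO_WATER / RHO_ICE - 1) * 100      # how much water expands on freezing

print(f"ice is {density_drop:.1f}% less dense than water at 0 C")        # ~8.3%
print(f"water expands by {volume_gain:.1f}% in volume when it freezes")  # ~9.1%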
The Anthropic Cosmological Principle
A thorough extant study of the anthropic principle is the book The Anthropic Cosmological Principle by John D. Barrow, a cosmologist, and Frank J. Tipler, a mathematical physicist and cosmologist. This book sets out in detail the many known anthropic coincidences and constraints, including many found by its authors. While the book is primarily a work of theoretical astrophysics, it also touches on quantum physics, chemistry, and earth science. An entire chapter argues that Homo sapiens is, with high probability, the only intelligent species in the Milky Way.
The book begins with an extensive review of many topics in the history of ideas the authors deem relevant to the anthropic principle, because the authors believe that principle has important antecedents in the notions of teleology and intelligent design. They discuss the writings of Fichte, Hegel, Bergson, and Alfred North Whitehead, and the Omega Point cosmology of Teilhard de Chardin. Barrow and Tipler carefully distinguish teleological reasoning from eutaxiological reasoning; the former asserts that order must have a consequent purpose; the latter asserts more modestly that order must have a planned cause. They attribute this important but nearly always overlooked distinction to an obscure 1883 book by L. E. Hicks.[51]
Seeing little sense in a principle requiring intelligent life to emerge while remaining indifferent to the possibility of its eventual extinction, Barrow and Tipler propose the following:
"Final anthropic principle (FAP): Intelligent information-processing must come into existence in the Universe, and, once it comes into existence, it will never die out."—[52]
Barrow and Tipler submit that the FAP is both a valid physical statement and "closely connected with moral values." FAP places strong constraints on the structure of the universe, constraints developed further in Tipler's The Physics of Immortality.[53] One such constraint is that the universe must end in a big crunch, which seems unlikely in view of the tentative conclusions drawn since 1998 about dark energy, based on observations of very distant supernovas.
In his review[54] of Barrow and Tipler, Martin Gardner ridiculed the FAP by quoting the last two sentences of their book as defining a Completely Ridiculous Anthropic Principle (CRAP):
"At the instant the Omega Point is reached, life will have gained control of all matter and forces not only in a single universe, but in all universes whose existence is logically possible; life will have spread into all spatial regions in all universes which could logically exist, and will have stored an infinite amount of information, including all bits of knowledge which it is logically possible to know. And this is the end."