Thursday, October 29, 2009

Section 7.3-7.5, November 2nd

All of the material was pretty straightforward this time. The hardest part would have been following all the little steps in ElGamal encryption, but that wasn't really hard. It does seem silly to me, though, that it says to break the message up into blocks. Why would you use public key encryption to send a long message instead of just a key? That didn't make sense to me.
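
Just to keep the little steps straight in my head, here is a rough Python sketch of ElGamal with toy numbers of my own choosing (nothing from the book, and far too small to be secure):

```python
import random

# toy parameters: 2 is a primitive root mod 467 (way too small for real use)
p, g = 467, 2
x = random.randrange(2, p - 1)        # Alice's private key
h = pow(g, x, p)                      # Alice publishes (p, g, h)

def encrypt(m, p, g, h):
    k = random.randrange(2, p - 1)    # fresh ephemeral exponent for this message
    return pow(g, k, p), (m * pow(h, k, p)) % p

def decrypt(c1, c2, x, p):
    s = pow(c1, x, p)                 # same shared value g^(k*x)
    return (c2 * pow(s, -1, p)) % p   # pow(s, -1, p) is Python 3.8+ modular inverse

m = 123                               # one message block, must be less than p
c1, c2 = encrypt(m, p, g, h)
print(decrypt(c1, c2, x, p) == m)     # expect True
```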

The most interesting thing was reading about ElGamal encryption. I knew about Diffie-Hellman key exchange, but not this. It is always interesting to learn more. I am most excited to learn about elliptic curve cryptography.

Tuesday, October 27, 2009

Section 7.2, October 28

So this is another one of those sections that makes up for all the sections I thought were not really hard. I can't really follow what is happening when we get to the index calculus step. I am not really sure what is going on when we find the different exponents.
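
To see what the relation-finding part is actually producing, I sketched just that step in Python. This only collects the congruences over a small factor base I picked; it leaves out the linear algebra that actually solves for the logs:

```python
import random

def factor_over_base(m, base):
    """Return the exponent vector of m over the factor base, or None if m
    has a prime factor outside the base."""
    exps = []
    for q in base:
        e = 0
        while m % q == 0:
            m, e = m // q, e + 1
        exps.append(e)
    return exps if m == 1 else None

def collect_relations(g, p, base, want):
    """Find exponents k where g^k (mod p) factors over the base.  Each hit
    gives the congruence  k = sum of e_i * log_g(q_i)  (mod p-1)."""
    relations = []
    while len(relations) < want:
        k = random.randrange(1, p - 1)
        exps = factor_over_base(pow(g, k, p), base)
        if exps is not None:
            relations.append((k, exps))
    return relations

# tiny demo just to watch the congruences appear (2 is a primitive root mod 1019)
p, g = 1019, 2
base = [2, 3, 5, 7, 11, 13]
for k, exps in collect_relations(g, p, base, 5):
    rhs = " + ".join(f"{e}*log({q})" for q, e in zip(base, exps) if e)
    print(f"{k} = {rhs}  (mod {p - 1})")
```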

The most interesting thing for me was that this works out for p = 1 (mod 4). I think this is interesting because earlier we talked about finding square roots when p = 3 (mod 4). It is just cute to me that the two cases would turn out so opposite.

Monday, October 26, 2009

Section 6.5 - 6.7, 7.1, October 28

Nothing was really hard to understand in these sections. They were more explanatory than mathematical, and thus were not hard. The hardest part was probably following the explanation of the discrete logarithm, but I have already learned about that.

What is cool to me is the discrete logarithm problem. It provides us with another way to generate keys that are computationally very hard to break but very easy to create. The Diffie-Hellman key exchange built on it also interests me.
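
Since I had seen Diffie-Hellman before, here is the whole exchange written out as a little Python sketch with toy numbers, just to have it in one place:

```python
import random

p, g = 467, 2                  # public prime and primitive root (toy sizes)

a = random.randrange(2, p - 1) # Alice's secret exponent
b = random.randrange(2, p - 1) # Bob's secret exponent

A = pow(g, a, p)               # Alice sends g^a
B = pow(g, b, p)               # Bob sends g^b

# both sides compute g^(a*b) without a or b ever being sent
print(pow(B, a, p) == pow(A, b, p))   # expect True
```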

Thursday, October 22, 2009

Section 6.4.1, October 26

The hardest part, and the part I am still not 100% on, is how we pick the numbers whose squares we factor into small primes. Are these chosen semi-randomly, within the bounds of being just a little larger than n? Or is there a set procedure for deriving them? This is the part that I can't quite grasp.

The coolest part was seeing how it is possible to generate numbers that give us solutions to x^2 = y^2 (mod n) where x != ±y (mod n). It is cool to see how trivial the actual work is. The real catch is just how long it will take.
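
Once one of those congruences is in hand, the trivial part really is a single gcd. A quick Python sketch with a made-up toy example:

```python
from math import gcd

def factor_from_congruence(x, y, n):
    """Given x^2 = y^2 (mod n) with x != ±y (mod n), gcd(x - y, n) is a
    nontrivial factor: n divides (x - y)(x + y) but neither factor alone."""
    d = gcd(x - y, n)
    return d if 1 < d < n else None

# made-up toy example: 45^2 = 2025 = 23 (mod 77) and 10^2 = 100 = 23 (mod 77),
# while 45 is not ±10 mod 77
print(factor_from_congruence(45, 10, 77))   # expect 7
```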

Wednesday, October 21, 2009

Section 6.4, October 22

The hardest part for me to follow was right at the beginning, in the p-1 factoring algorithm. I couldn't understand why we were raising a to the B! power. It took me a second read-through to understand that we are trying to find a bound B where B! is divisible by p-1.
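
Here is a rough Python sketch of the p-1 method as I understand it, with toy primes I picked so that one of them has a smooth p-1:

```python
from math import gcd

def p_minus_1(n, B):
    """Sketch of the p-1 method: compute 2^(B!) mod n and hope that p-1
    divides B! for some prime p dividing n, so 2^(B!) = 1 (mod p) and
    gcd(2^(B!) - 1, n) picks p out."""
    a = 2
    for j in range(2, B + 1):
        a = pow(a, j, n)              # after the loop, a = 2^(B!) mod n
    d = gcd(a - 1, n)
    return d if 1 < d < n else None

# toy example: 2926 = 2*7*11*19 is 20-smooth, but 3022 = 2*1511 is not
print(p_minus_1(2927 * 3023, 20))     # expect 2927
```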

The most interesting fact for me was finding out why, when generating strong primes, we first need to make sure that (p-1)/2 is also prime. When I implemented RSA I had this requirement and was not completely sure why, but now I see: it gives p-1 a large prime factor, which makes it resistant to the p-1 factoring algorithm.

Monday, October 19, 2009

Section 6.3, October 19

It wasn't too hard to follow the chapter. I have done most of this in my programming classes before. All the proofs made sense. The part that took me the longest was the Miller-Rabin primality test; I didn't get why it worked until later, when they ran through the proof.

The coolest test was the Miller-Rabin primality test, because it was the one that I have not used before. It is also cool to see how it can sometimes hand us a factor right away when it proves that the number is composite.
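
Here is a quick Python sketch of the test as I understand it, with a comment marking the spot where a nontrivial square root of 1 would hand over a factor:

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: write n-1 = 2^s * d with d odd, and check
    whether a^d = 1 or a^(2^r * d) = -1 (mod n) for some r < s."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    s, d = 0, n - 1
    while d % 2 == 0:
        s, d = s + 1, d // 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue                      # this a proves nothing, try another
        for _ in range(s - 1):
            y = pow(x, 2, n)
            if y == n - 1:
                break                     # passed this round
            if y == 1:
                # x is a nontrivial square root of 1 mod n, so n is composite
                # and gcd(x - 1, n) is an actual factor of n
                return False
            x = y
        else:
            return False                  # definitely composite
    return True                           # probably prime

print(miller_rabin(561), miller_rabin(104729))   # expect False True
```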

Friday, October 16, 2009

Section 3.11, October 19

I feel a little silly saying it, but the thing I had the hardest time understanding was the notation for the Legendre symbol. I originally assumed the two numbers were being divided by one another. It was only after I went to class that I finally understood. This little mistake made it nearly impossible for me to follow along, but once I grasped this it was cool.
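
Once I realized (a/p) is a symbol and not a fraction, Euler's criterion makes computing it a one-liner. A quick Python sketch:

```python
def legendre(a, p):
    """Euler's criterion: a^((p-1)/2) mod p is 1 if a is a nonzero square
    mod p, p-1 (i.e. -1) if it is not, and 0 if p divides a."""
    t = pow(a, (p - 1) // 2, p)
    return -1 if t == p - 1 else t

print(legendre(2, 7), legendre(3, 7))   # 2 = 3^2 mod 7 so +1; 3 is a non-square so -1
```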

The coolest part for me was learning about the quadratic residuosity problem. After reading the chapter I also looked it over on Wikipedia. It is cool to find something attached to RSA that I had not already heard about.

Thursday, October 15, 2009

Section 3.9, October 16

The hardest part for me to follow was the proof. At first I didn't realize it was only talking about primes p that are congruent to 3 mod 4, which made it a little hard to follow. Once I got this I could see how we could use it.
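
Here is the formula from the section as a tiny Python sketch, just to watch it work on small numbers:

```python
def sqrt_mod_p(a, p):
    """For p = 3 (mod 4): if a is a square mod p, then a^((p+1)/4) is a square
    root, since squaring it gives a^((p+1)/2) = a * a^((p-1)/2) = a by Euler."""
    r = pow(a, (p + 1) // 4, p)
    return r if pow(r, 2, p) == a % p else None   # None means a was not a square

print(sqrt_mod_p(13, 23))   # 23 = 3 (mod 4); expect 6, since 6^2 = 36 = 13 (mod 23)
```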

Nothing was all that interesting in the section, but I am very interested to see how we will use it.

Monday, October 12, 2009

Section 3.12, 6.2 - October 14

The hardest part of the chapter was following the math that allows us to break RSA when the decryption exponent d is small. It is simple enough to follow what to do with the information, but understanding exactly why it works still confuses me. I guess I will have to pay good attention in class on Wednesday to get it.
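
If I am reading the sections right, the small-d attack uses the continued fraction expansion of e/n, so here is a rough Python sketch of that idea with a toy key I built myself (not the book's example):

```python
from math import isqrt

def continued_fraction(a, b):
    """Partial quotients of a/b."""
    cf = []
    while b:
        q = a // b
        cf.append(q)
        a, b = b, a - q * b
    return cf

def convergents(cf):
    """Yield the successive convergents k/d of a continued fraction."""
    num0, num1 = 1, cf[0]
    den0, den1 = 0, 1
    yield num1, den1
    for q in cf[1:]:
        num0, num1 = num1, q * num1 + num0
        den0, den1 = den1, q * den1 + den0
        yield num1, den1

def small_d_attack(e, n):
    """Try each convergent k/d of e/n as the k/d in e*d = 1 + k*phi(n)."""
    for k, d in convergents(continued_fraction(e, n)):
        if k == 0 or (e * d - 1) % k:
            continue
        phi = (e * d - 1) // k
        s = n - phi + 1                    # would equal p + q
        disc = s * s - 4 * n
        if disc >= 0 and isqrt(disc) ** 2 == disc:
            return d                       # recovered decryption exponent
    return None

# toy demo: build a key with a deliberately tiny d, then recover it
p, q = 1009, 2003
n, phi = p * q, (p - 1) * (q - 1)
d = 5                                      # vulnerable by design
e = pow(d, -1, phi)                        # Python 3.8+ modular inverse
print(small_d_attack(e, n) == d)           # expect True
```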

The coolest part is that there are attacks on RSA at all, and this time the timing attack stood out. It is really interesting to see an attack that exploits not a flaw in the original mathematics, but a flaw in the implementation and the way the hardware works. It is really slick.