The first two conversions were pretty straightforward. Understanding the translation of ElGamal to elliptic curves was a little harder, but overall it was easy to follow. It works in nearly exactly the same way; we just swap out the underlying operations.
I really like how well elliptic curves lend themselves to discrete-log problems. This makes implementing them in existing systems all the easier.
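To convince myself that it really is just a swap, I sketched the whole thing in Python on a tiny toy curve. All the numbers here (the curve, the keys, the message point) are made up for illustration, not taken from the book:

```python
p, a = 17, 2   # toy curve y^2 = x^3 + 2x + 2 over F_17 (illustration only)
G = (5, 1)     # base point on the curve

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                  # point at infinity
    if P == Q:
        m = (3*x1*x1 + a) * pow(2*y1, -1, p) % p     # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p      # chord slope
    x3 = (m*m - x1 - x2) % p
    return (x3, (m*(x1 - x3) - y1) % p)

def ec_mul(k, P):
    R = None                       # double-and-add scalar multiplication
    while k:
        if k & 1: R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# ElGamal, with modular exponentiation swapped for scalar multiplication:
d = 7                 # receiver's secret key
Q = ec_mul(d, G)      # receiver's public key
M = ec_mul(5, G)      # plaintext, already encoded as a curve point
k = 3                 # sender's ephemeral secret
C1, C2 = ec_mul(k, G), ec_add(M, ec_mul(k, Q))   # ciphertext pair
S = ec_mul(d, C1)                                # shared point d*k*G
recovered = ec_add(C2, (S[0], -S[1] % p))        # subtract S to get M back
assert recovered == M
```

The only thing that changed from mod-p ElGamal is which group operation gets repeated.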
Friday, December 4, 2009
Section 16.4, December 7
The hardest part of this section was seeing its applicability. The number of points we can generate seems very small. I guess we get a lot more in GF(2^8), but it still seems rather limited, so I must not be fully grasping the concept.
It was cool to see us return to base 2, seeing as that is what computers use.
Tuesday, December 1, 2009
Section 16.2, December 2
The hardest part to follow was how we were going to represent plaintext. It seems like a lot of work to encode plaintext, but I can only guess that it must be very fast to do on a computer. It is annoying that it is only probabilistic.
The coolest part was finding that there is an operation on elliptic curves that is analogous to the discrete logarithm problem. This helps me see how to use elliptic curves in cryptography.
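As I understand it, the probabilistic encoding just gives each message a handful of candidate x-values and retries until one lands on the curve. A sketch with toy parameters of my own choosing (and p ≡ 3 mod 4, so the square root is a single exponentiation):

```python
p, a, b = 23, 1, 1   # toy curve y^2 = x^3 + x + 1 over F_23; note p ≡ 3 (mod 4)
K = 4                # each message m gets K candidate x-values

def embed(m):
    # probabilistic embedding: try x = m*K + j until x^3 + ax + b is a square
    for j in range(K):
        x = m * K + j
        z = (x**3 + a*x + b) % p
        if pow(z, (p - 1) // 2, p) <= 1:         # z == 0 or z is a QR (Euler's criterion)
            return (x, pow(z, (p + 1) // 4, p))  # square root, valid since p ≡ 3 (mod 4)
    raise ValueError("no embedding found; increase K")

def decode(point):
    return point[0] // K   # throw away the retry digit
```

Each try succeeds roughly half the time, which is exactly why the scheme is only probabilistic: with K tries it fails with probability about 2^-K.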
Section 16.1, November 30
I would say the hardest part of the whole section was wrapping my mind around the section as a whole. No one part of it was really that bad, but there were a lot of different little things. Actually, on second thought, I did have a hard time initially following how we were going to find the tangent line. The whole derivative thing threw me off.
The coolest idea in the whole section was that we can always find a third point on the curve from the two points that we are given.
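To check that I could actually carry out the chord-and-tangent rule, here is a sketch over a small toy field (the curve and numbers are my own picks for illustration):

```python
p, a = 17, 2    # toy curve y^2 = x^3 + 2x + 2 over F_17; (5, 1) lies on it
G = (5, 1)

def ec_add(P, Q):
    # the line through P and Q meets the curve in a third point;
    # reflecting that point over the x-axis gives P + Q
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                  # point at infinity
    if P == Q:
        m = (3*x1*x1 + a) * pow(2*y1, -1, p) % p     # tangent slope, via the derivative
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p      # chord slope
    x3 = (m*m - x1 - x2) % p
    return (x3, (m*(x1 - x3) - y1) % p)

G2 = ec_add(G, G)    # doubling a point uses the tangent line
G3 = ec_add(G2, G)   # adding distinct points uses the chord
```

The derivative that threw me off only shows up in the doubling case, where the chord degenerates into the tangent.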
Monday, November 23, 2009
Section 2.12; November 24
Today's material didn't have very much hard stuff. This was partially because the section was more of a story, and also because I have studied the Enigma before. It is a really interesting machine, though, and it's amazing how effective it was.
What I love most is how Enigma spurred the creation of computers. Without it, who knows how much slower the need for computers might have developed. It is also interesting to see how people realized they knew how to solve something but needed to be able to test solutions faster than humans could by hand.
Friday, November 20, 2009
Section 19.3; November 23
Difficult? Try everything :) While I think that quantum computing is really cool, it is also really good at confusing me. I really enjoyed the paper that explained Shor's algorithm without any complex math; it helped when I read section 19.3. Even with that paper, though, I am not sure I understand all of the math or exactly how it works.
As for being cool, I think the coolest thing is how quantum computers work: they can take essentially multiple inputs at once and return multiple values. It is also an interesting paradigm where you try to control probability distributions to get the correct answer.
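The way I currently understand it, the quantum part of Shor's algorithm only finds the period r of a mod N; everything after that is classical number theory. A sketch of the classical part, with the period found by brute force instead of a quantum computer:

```python
from math import gcd

def order(a, N):
    # brute-force the period r with a^r ≡ 1 (mod N);
    # this is the step a quantum computer does via interference
    r, x = 1, a % N
    while x != 1:
        x = x * a % N
        r += 1
    return r

def shor_classical(N, a):
    # classical post-processing: if r is even, a^(r/2) ± 1 share factors with N
    r = order(a, N)
    if r % 2:
        return None   # odd period: pick a different a and retry
    half = pow(a, r // 2)
    return sorted({gcd(half - 1, N), gcd(half + 1, N)})
```

For N = 15 and a = 7 the period is 4, and the two gcds hand back 3 and 5.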
Wednesday, November 18, 2009
Section 19.1 - 19.2; November 20
The hardest part to understand is most certainly just the way that quantum mechanics works. Luckily I have studied this in another computer science class. It is hard to wrap your head around the idea that observing data changes the data, but that is the basis of quantum cryptography.
The coolest thing is also how quantum mechanics works. It provides us with a way to transfer keys securely. It does seem strange that we couldn't just use this medium for transferring the data itself: since no one can observe it without changing it, why don't we find a way to send messages over this channel? That would sidestep eavesdropping on standard communication altogether.
Tuesday, November 17, 2009
Section14.1-14.2: November 18th
The hardest part, though still not hard, is following the Feige-Fiat-Shamir scheme. Wow, couldn't they have picked a shorter name... It is really just an extension of what we already saw, expanded to improve performance.
The most interesting part is probably just the concept of zero-knowledge techniques for proving ability. I think this could apply to proving things about a person's individual attributes without having to reveal exactly who the person is.
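To pin the idea down, here is one round of the simpler single-secret Fiat-Shamir identification scheme (Feige-Fiat-Shamir is the extension that runs several secrets in parallel). The modulus and secret are toy values I made up:

```python
import random

def fiat_shamir_round(n, s, v):
    # one round: the prover knows s with v = s^2 mod n, and proves it
    # without revealing s
    r = random.randrange(1, n)
    x = r * r % n                  # prover's commitment
    c = random.choice([0, 1])      # verifier's random challenge
    y = r * pow(s, c, n) % n       # prover's response
    return y * y % n == x * pow(v, c, n) % n   # verifier's check

n = 3233          # toy modulus (61 * 53); its factorization stays secret
s = 123           # prover's secret
v = s * s % n     # public value registered with the verifier
```

The check works because y^2 = r^2 * s^(2c) = x * v^c (mod n); a cheater who can't take square roots mod n only passes a round with probability 1/2, so many rounds are run.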
Friday, November 13, 2009
Section 12.1-12.2, November 16
At first the secret sharing schemes were a little hard to follow, but after I looked at them a little closer it became apparent what was happening. The hardest one to follow was the one described by Blakley. It was harder because it worked with planes instead of lines, which I was more familiar with.
The most interesting part was the note at the end about combining the strengths of threshold schemes and simpler schemes to create complex secret sharing methodologies. It is interesting to see how they all combine.
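The threshold scheme I could follow, Shamir's polynomial one, is short enough to sketch. The prime and secret here are arbitrary toy choices; any t shares recover the constant term of a random degree t-1 polynomial by Lagrange interpolation at 0:

```python
import random

P = 8191  # a prime field large enough to hold the secret (2^13 - 1)

def make_shares(secret, t, n, p=P):
    # random polynomial of degree t-1 with f(0) = secret; share i is (i, f(i))
    coeffs = [secret] + [random.randrange(p) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, p=P):
    # Lagrange interpolation evaluated at x = 0
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret

shares = make_shares(1234, 3, 5)   # 5 shares, any 3 suffice
```

Fewer than t shares leave the secret information-theoretically undetermined, which is the whole point.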
Thursday, November 12, 2009
Exam 2 Review, November 13
I think the most important thing we have studied in these sections is the math that powers the encryption. I think this for several reasons. The first is that it lets us work with algorithms we might not otherwise understand, but maybe even more importantly, it shows us how to think creatively about the problems presented to us.
Like the first test I am expecting to see problems that mainly deal with the items from section 3 that we have learned. I would also expect to have to explain in a general sense how the other algorithms function.
I need to work on remembering all the different ways to solve the factorization and discrete logs. I also need to review ElGamal.
I really want to go over elliptical curve cryptography and quantum cryptography.
Monday, November 9, 2009
Section 8.3, 9.4 - November 11
The hardest part was how DSA worked. I understood the SHA-1 section easily since I have coded it before, but DSA was new. It wasn't that hard to follow, though; it reminds me a lot of ElGamal encryption in the way that it works.
The coolest part was also DSA. It was interesting to see how the signature pair doesn't really give away much information about the document, but provides a way to verify it very easily.
Saturday, November 7, 2009
Section 9.1-9.4
The hardest part for me was seeing how the ElGamal version of digital signatures worked. It was still pretty straightforward using what we already learned about ElGamal; it was only the hardest because it is the only one I haven't implemented myself yet.
The coolest part was reading about birthday attacks on signatures. I had never thought of this. It makes sense, and while it is easy to guard against, it was cool to look at.
Wednesday, November 4, 2009
Section 8.4-8.5, 8.7, November 06
Nothing very hard here either. I am aware of the birthday attack, as I wrote code that performed one before. Because of this, probably the hardest thing to understand was how to use hash functions as cryptographic primitives, but even that is not that hard.
The coolest part is how quickly birthday attacks reduce the number of tries needed. It isn't a small reduction either; it cuts the exponent in half, which is huge.
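A quick experiment with a 16-bit truncation of SHA-1 makes the halved exponent concrete: brute-forcing one specific digest takes about 2^16 tries, but a collision between any two messages typically shows up after roughly 2^8:

```python
import hashlib

def h16(msg: bytes) -> bytes:
    return hashlib.sha1(msg).digest()[:2]   # truncate SHA-1 to 16 bits

seen = {}
m1 = m2 = None
tries = 0
while m1 is None:
    tries += 1
    m = str(tries).encode()
    d = h16(m)
    if d in seen:
        m1, m2 = seen[d], m   # two different messages, same 16-bit hash
    else:
        seen[d] = m
```

By the pigeonhole principle the loop must stop within 2^16 + 1 tries, but in practice it stops around a few hundred, which is the square-root speedup.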
Monday, November 2, 2009
Section 8.1-8.2, November 4th
Well, hashing is pretty straightforward. I have personally implemented SHA-1, so I am pretty aware of how everything works. There is a proof in the section that, while not hard, would be the hardest part to follow.
The most interesting thing is the mention that hash functions are sometimes allowed to have feasible birthday attacks and still be considered usable. It makes sense, especially when you realize that even against SHA-1 a birthday attack takes about 2^80 iterations, and what you find isn't even usable as a meaningful different message.
Thursday, October 29, 2009
Section 7.3-7.5, November 2nd
All of the material was pretty straightforward this time. The hardest part would have been following all the little steps in ElGamal encryption, but that wasn't really hard. It does seem silly to me, though, that it would say to break the message up. Why would you use public key encryption to send a long message and not just a key? That didn't make sense to me.
The most interesting thing was reading about ElGamal encryption. I knew about Diffie-Hellman key exchange, but not this. It is always interesting to learn more. I am most excited to learn about elliptic curve cryptography.
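For my own reference, the little steps of ElGamal encryption with toy numbers (p, g, and the keys here are made up for illustration):

```python
p, g = 23, 5          # public parameters: a prime and a primitive root mod p
a = 6                 # receiver's secret key
h = pow(g, a, p)      # receiver's public key g^a
m, k = 13, 3          # message and the sender's ephemeral secret
c1 = pow(g, k, p)                 # first half of the ciphertext: g^k
c2 = m * pow(h, k, p) % p         # second half: m masked by h^k
s = pow(c1, a, p)                 # receiver recomputes the mask g^(ak)
recovered = c2 * pow(s, -1, p) % p
```

The sender and receiver reach the same mask g^(ak) from opposite directions, which is exactly the Diffie-Hellman trick reused for encryption.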
Tuesday, October 27, 2009
Section 7.2, October 28
So this is another one of those sections that makes up for all the sections I think are not really hard. I can't really follow what is happening when we get to the index calculus step. I am not really sure what is going on to find the different exponents.
The most interesting thing for me was that this works out for p ≡ 1 (mod 4). I think this is interesting because before we talked about finding square roots when p ≡ 3 (mod 4). It is just cute to me that this would happen and be so opposite.
Monday, October 26, 2009
Section 6.5 - 6.7, 7.1, October 28
Nothing was really hard to understand in these sections. They were more explanatory than mathematical and thus were not hard. Probably the hardest part would have been following the explanation of the discrete logarithm, but I have already learned about that.
What is cool to me is the discrete logarithm problem. It provides us with another unique way to generate keys that are computationally very hard to break, but very easy to create. The use of Diffie-Hellman key exchange based on this also interests me.
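The Diffie-Hellman exchange built on this problem fits in a few lines (toy parameters for illustration):

```python
p, g = 23, 5              # public parameters: prime p and primitive root g
a, b = 6, 15              # Alice's and Bob's secret exponents
A, B = pow(g, a, p), pow(g, b, p)    # values exchanged in the open
shared_alice = pow(B, a, p)          # Alice computes (g^b)^a
shared_bob = pow(A, b, p)            # Bob computes (g^a)^b
```

Both sides land on g^(ab) mod p; an eavesdropper sees only A and B and would have to solve a discrete log to catch up.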
Thursday, October 22, 2009
Section 6.4.1, October 26
The hardest part, and the part I am still not 100% on, is how we pick the numbers to square, from which we find small prime factors. Are these semi-randomly chosen within the bounds of being just a little larger than the square root of n? Is there a set procedure for deriving them? This is the part that I can't quite grasp.
The coolest part was seeing how it is possible to generate numbers that give us equations of the form x^2 ≡ y^2 (mod n) with x ≢ ±y. It is cool to see how trivial the actual work is; the real catch is just how long it takes.
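The simplest way I know to generate such a pair is Fermat's method: walk x upward from the square root of n until x^2 - n is a perfect square, which gives x^2 ≡ y^2 (mod n) directly. A sketch with a toy n whose factors sit close together:

```python
from math import isqrt, gcd

def fermat_factor(n):
    # search for x with x^2 - n a perfect square y^2;
    # then x^2 ≡ y^2 (mod n) and the gcds split n
    x = isqrt(n)
    if x * x < n:
        x += 1
    while True:
        y2 = x * x - n
        y = isqrt(y2)
        if y * y == y2:
            return gcd(x - y, n), gcd(x + y, n)
        x += 1
```

The sieve methods from the section exist because this naive walk only finishes quickly when the two prime factors are near each other.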
Wednesday, October 21, 2009
Section 6.4, October 22
The hardest part for me to follow was right at the beginning, in the p−1 factoring algorithm. I couldn't understand why we were raising a to the B! power. It took me a second read-through to understand we are trying to find a bound B where B! is divisible by p−1.
The most interesting fact for me was finding out why, when generating strong primes, we first need to ensure that (p−1)/2 is also prime. When I implemented RSA I had this requirement and was not completely sure why, but now I see: it gives p−1 a large prime factor, which makes it resistant to the p−1 factoring algorithm.
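The a^(B!) trick can be sketched in a few lines. Here n = 101 · 103 is a toy example I picked so that 101 − 1 = 100 divides 10!, while 103 − 1 = 102 has the prime factor 17, which does not:

```python
from math import gcd

def pollard_pm1(n, B):
    # compute 2^(B!) mod n in stages: ((2^2)^3)^4... = 2^(B!);
    # if p - 1 divides B! for some prime p | n, then 2^(B!) ≡ 1 (mod p)
    # by Fermat's little theorem, and the gcd reveals p
    a = 2
    for j in range(2, B + 1):
        a = pow(a, j, n)
    return gcd(a - 1, n)
```

This is exactly why strong primes demand a large prime factor in p − 1: it forces B, and hence B!, to be astronomically large before the gcd fires.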
Monday, October 19, 2009
Section 6.3, October 19
It wasn't too hard to follow the chapter; I have done most of this in my programming classes before, and all the proofs made sense. I took the longest to figure out what was happening with Miller-Rabin primality testing. I didn't get why it worked until later when they ran through the proof.
The coolest test was Miller-Rabin, because it was the one I have not used before. It is also cool to see how, in proving a number composite, it can sometimes even reveal a factor.
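One round of Miller-Rabin as I understand it: write n − 1 = 2^s · d with d odd, then watch the chain of squarings of a^d for a nontrivial square root of 1:

```python
def miller_rabin(n, a):
    # one round with witness a, for odd n > 3
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1                      # n - 1 = 2^s * d with d odd
    x = pow(a, d, n)
    if x in (1, n - 1):
        return True                 # n passes this round (probably prime)
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False                    # a witnesses that n is composite
```

A nice touch: 561 is a Carmichael number, so Fermat's test with base 2 passes it, but Miller-Rabin with the same base catches it.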
Friday, October 16, 2009
Section 3.11, October 19
I feel a little silly saying it, but the thing I had the hardest time understanding was the notation of the Legendre symbol. I originally assumed the two numbers were being divided by one another. It was only after I went to class that I finally understood. This little mistake made it nearly impossible for me to follow along, but once I grasped it, it was cool.
The coolest part for me was learning about the quadratic residuosity problem. After reading the chapter I also looked it over on Wikipedia. It is cool to find something attached to RSA that I had not already heard about.
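The notation that tripped me up is really just a function in disguise, and Euler's criterion computes it with one exponentiation:

```python
def legendre(a, p):
    # Legendre symbol (a/p) for odd prime p, via Euler's criterion:
    # a^((p-1)/2) ≡ 1 (mod p) if a is a QR, ≡ -1 if not, ≡ 0 if p | a
    t = pow(a, (p - 1) // 2, p)
    return -1 if t == p - 1 else t
```

For example, 2 is a square mod 7 (3^2 = 9 ≡ 2) while 3 is not, and the symbol reports exactly that.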
Thursday, October 15, 2009
Section 3.9, October 16
The hardest part for me to follow was the proof. At first I didn't realize it only applies when p ≡ 3 (mod 4), which made it a little hard to follow. Once I got this I could see how we could use it.
Nothing was all that interesting in the section, but I am very interested to see how we will use it.
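And here is the use we get out of it: when p ≡ 3 (mod 4), a square root of a quadratic residue a is just one exponentiation, a^((p+1)/4) mod p:

```python
def sqrt_mod(a, p):
    # square root mod p, valid only for p ≡ 3 (mod 4) and a a QR mod p
    assert p % 4 == 3
    r = pow(a, (p + 1) // 4, p)
    assert r * r % p == a % p, "a is not a quadratic residue mod p"
    return r   # the other root is p - r
```

For p ≡ 1 (mod 4) this shortcut fails, which is the gap the later section on p ≡ 1 (mod 4) fills.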
Monday, October 12, 2009
Section 3.12, 6.2 - October 14
The hardest part of the chapter was following the math that allows us to recover small decryption exponents d. It is simple enough to follow what to do with the information, but understanding exactly why it works still confuses me. I guess I will have to pay close attention in class on Wednesday to get it.
The coolest part is that there are attacks on RSA; this time the coolest was the timing attack. It is really interesting to see something that isn't a flaw in the original mathematics, but rather a flaw in the implementation and the way the hardware works. It is really slick.
Tuesday, September 29, 2009
Section 6.1, October 9
To be honest, I don't think there is much that is hard in RSA. I actually love that about it; it is pretty darn straightforward. The hard part was finding the algorithm, not doing it. If anything, it takes a little while to see why the two exponentiations undo each other, but other than that it is good.
The most interesting fact to me is that it just works. I love the idea of public key encryption: finding ways where we can each know part of the answer, but it isn't enough to let us solve the whole problem. The very fact that this is possible truly amazes me. I wonder what lies further in this field that would allow us to find other public key algorithms that are faster and thus even more usable than RSA.
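The whole system with toy numbers (this is the classic 61 · 53 example; real keys use primes hundreds of digits long):

```python
p, q = 61, 53              # toy primes kept secret
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d ≡ 1 (mod phi)
m = 65                     # a message, as a number below n
c = pow(m, e, n)           # encrypt with the public pair (n, e)
decrypted = pow(c, d, n)   # decrypt with d; Euler's theorem undoes e
```

The two exponentiations undo each other precisely because ed ≡ 1 (mod φ(n)), so m^(ed) ≡ m (mod n).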
Section 3.6 - 3.7, October 7
The hardest part for me was figuring out all the different ways we were using Euler's function. It took me quite a while to understand why φ(p) = p − 1. Once I understood that, it started to come together in my head why the other identities worked out.
The most interesting item was how we can find numbers that are essentially generators of all the other numbers. This is important to me because I have studied the Diffie-Hellman algorithm, where this becomes essential.
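Checking whether g generates everything only needs the prime factors of p − 1: g is a generator exactly when g^((p−1)/q) ≠ 1 for every prime q dividing p − 1. A sketch:

```python
def is_primitive_root(g, p, prime_factors):
    # g generates all of Z_p^* iff its order is exactly p - 1,
    # i.e. no proper divisor (p-1)/q already sends g to 1
    return all(pow(g, (p - 1) // q, p) != 1 for q in prime_factors)
```

For p = 23 we have p − 1 = 22 = 2 · 11, and the check confirms that 5 is a generator while 2 is not.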
Section 3.4-3.5, October 5
The hardest part for me was following what exactly was happening in the Chinese remainder theorem. Following why I need the inverse, and what to do with it, was really hard for me. For some reason when I tried to follow the various symbols I lost track; it was only when I was able to think of it more as a picture that I got it.
The most interesting part was modular exponentiation. It is funny how easy such an efficient way of doing exponentiation is.
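The efficient exponentiation is just repeated squaring; a sketch of the right-to-left version:

```python
def mod_exp(base, exp, m):
    # square-and-multiply: one squaring per bit of exp, one extra
    # multiply per 1-bit, so O(log exp) multiplications in total
    result = 1
    base %= m
    while exp:
        if exp & 1:
            result = result * base % m
        base = base * base % m
        exp >>= 1
    return result
```

Python's built-in pow(base, exp, m) does the same thing, which is why three-argument pow is fast even for enormous exponents.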
Test Improvment, October 2
I think the most important thing that we have studied to this point is how to think critically about codes. By learning the different ways of describing algorithms, we have seen better ways to think through problems. Most of the math is done by computers, but computers will never think up the methodologies, and that is what I feel we are learning the most.
I would expect to see questions that deal with breaking algorithms, and showing how algorithms are built. This could happen as explanation questions or by doing small examples. I think it should mimic in-class stuff as much as possible.
I need to understand better the proofs of the math behind the cryptography. That has been the thing that I have had the hardest time following.
Section 5.1 - 5.4, September 30
To be completely honest, this section was rather easy. This is because I have already implemented AES once this semester. Most likely the hardest part is the construction of the S-box; it is definitely more involved and takes a little thought to get.
The most interesting part is the polynomial multiplication in the field GF(2^8). I love how everything can be worked out as simple machine instructions. It makes it really easy to implement in hardware and software.
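The multiplication really is just shifts and XORs. A sketch using the AES reduction polynomial; the 0x57 · 0x83 = 0xC1 check is the worked example from the AES specification:

```python
def gf_mul(a, b):
    # multiply two bytes in GF(2^8) modulo the AES polynomial
    # x^8 + x^4 + x^3 + x + 1 (0x11b)
    r = 0
    for _ in range(8):
        if b & 1:
            r ^= a              # add (XOR) the current multiple of a
        b >>= 1
        carry = a & 0x80
        a = (a << 1) & 0xFF     # multiply a by x
        if carry:
            a ^= 0x1B           # reduce by the AES modulus
    return r
```

Every operation is an AND, XOR, or shift, which is why this maps so cleanly onto hardware.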
Friday, September 25, 2009
Review, September 28
On average I have spent around two hours on each of the assignments. I felt that the lectures and the reading did help. I think the lectures were the most useful followed by the book when I forgot things.
I enjoy the class a lot. I really enjoy the lectures. You have a really good way of making things clear, and showing us why they work that way. I get most of what I read in the book, but anything I don't is quickly addressed in class. I also like the way that you show us examples of using computers and other methods to break codes. This makes everything snap in place in my mind.
Just keep up the examples. They really are what solidifies everything for me.
Monday, September 21, 2009
3.11, September 25
Initially the hardest thing to understand was how polynomial math is done mod a polynomial. I had forgotten how to do polynomial division, and so it confused me until I saw it in the book. Also, the idea of finite fields was a little confusing, as it is an area of math that I have not studied until now.
The very idea of finite fields was the subject I found the most interesting. I would love to learn more about them and how they are used in other branches of math. Since I have already coded AES for another class, I know how the finite field GF(2^8) works in AES, and as such I also think that is really cool.
4.5 - 4.8, September 23
The hardest thing to follow was all the little nuances exhibited by the different modes. The error propagation especially took me a little while to understand, since I have never dealt with it before. The way an eighth of the register gets shifted off in the feedback mode was what confused me at first.
The most interesting thing in the book was actually reading how DES was broken. That is my favorite part of this entire class: learning how various algorithms are attacked. It is interesting to me that we still choose brute force, as there really aren't many weaknesses that are useful enough. I find it really interesting that to this day DES's greatest weakness is the length of its key. It is too bad that, unlike AES, it didn't allow multiple key lengths. Though maybe AES should be changed to allow more than just the three that it does.
Saturday, September 19, 2009
4.1, 4.2, 4.4; September 21
The most interesting detail in this section was about groups in encryption. I admit that I had never thought it possible that using two levels of encryption could be the same as using one level with a single key. It was interesting to look at cryptanalysis from another angle.
The hardest part of the chapter to understand was the groups. Like the other sections of the chapter, it wasn't very hard (especially since I have already studied this algorithm in my Computer Security class). Following the proof was the strangest part; the whole thing with the remainder and dividing mq still doesn't make sense to me.
Thursday, September 10, 2009
Section 2.9-2.11, September 18
1. This was the first time I felt that there was anything truly hard to follow. The proof provided for breaking LFSRs was hard for me. I got lost after we proved that det(M_n) ≡ 0 (mod 2). I guess I will have to wait until we discuss it in class to really understand it.
2. The most interesting part is what the beginning of section 2.11 talks about: the fact that there is often a trade-off between security and usability/speed. I think this is a subject that might be very important for the future of cryptographic methods. We need to focus on secure algorithms that are both fast and accessible.
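An LFSR itself is only a few lines, which makes the speed side of that trade-off concrete; the security side shows in how a tiny register produces a tiny period (the 2-bit example below already repeats every 3 bits):

```python
def lfsr(seed, taps, n):
    # Fibonacci LFSR: emit the leading bit, then shift in the XOR of
    # the tapped positions (the linear recurrence the section analyzes)
    state = list(seed)
    out = []
    for _ in range(n):
        out.append(state[0])
        new = 0
        for t in taps:
            new ^= state[t]
        state = state[1:] + [new]
    return out
```

With seed [0, 1] and taps [0, 1] this implements x_{n+2} = x_{n+1} + x_n (mod 2), and the linearity is exactly what the det(M_n) attack exploits.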
Section 2.5 - 2.8, September 16
1. The hardest part of this reading was understanding how to break the Hill cipher. Like the rest of the chapter it wasn't hard, it was just the hardest of what there was. It was interesting to see how easy it was to find a working key matrix. I didn't notice at first that we need to check the determinant, but it was easy after that.
2. The most interesting part was the Sherlock Holmes section. It was interesting because it is basically a substitution cipher into a different alphabet, and also because of the story that accompanied it. So it wasn't the most mathematically interesting, but it was certainly the most exciting part of the chapter.
Section 2.3, September 14
1. The hardest part to understand in this reading was deciphering the key length used in the cipher. It is actually a really interesting analysis of the language, in the way that there is a higher chance of letters lining up the same with a shifted cipher. It shows me that to come up with breaks for older ciphers, a solid knowledge of the language was needed.
2. The most interesting part for me was the cipher itself. I think it is really interesting how traditionally people took known ciphers and modified them to make them harder to break. It is interesting how much harder these slightly modified codes are to break. It is also interesting because it represents a key being added to a known cipher whose key before was just a pairing of letters to other letters.
Wednesday, September 9, 2009
2.1-2.2, 2.4, September 11
The most interesting part for me was the affine cipher. It was interesting to see how such an easy modification could be used to increase the security of the message. I also enjoyed seeing a formal description of how to solve substitution ciphers. Often as students we just intuit the way things should be without knowing why they are that way.
Once again we haven't reached any really hard concepts, so the hardest part of what I read was the formal description of how to break a substitution cipher. This was because it made me pay attention not only to the frequencies of letters, but also to the frequencies at which letters occur together.
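The easy modification in code: the affine cipher is just c = a·m + b (mod 26) with a invertible mod 26. This sketch assumes uppercase A-Z only:

```python
def affine_encrypt(text, a, b):
    # c = a*m + b (mod 26); a must be coprime to 26 so it has an inverse
    return ''.join(chr((a * (ord(ch) - 65) + b) % 26 + 65) for ch in text)

def affine_decrypt(text, a, b):
    # m = a^(-1) * (c - b) (mod 26)
    a_inv = pow(a, -1, 26)
    return ''.join(chr(a_inv * (ord(ch) - 65 - b) % 26 + 65) for ch in text)
```

It is still a substitution cipher, so the letter-frequency attack from the section applies unchanged; the modification only enlarges the key space a little.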
Guest Presentation, September 11
The presentation today was very interesting. I really loved hearing the different ways that things had been encoded in the past. The most interesting one for me was the final cipher she showed us. It was interesting because it reminds me quite a bit of the way modern symmetric encryption works. You could even do a one-time pad with it if you had a long enough key.
It was also really interesting to see the other ways they hid messages besides codes. It was a good study of the history of codes, back when the secret was not knowing the algorithm rather than not knowing the key.
Nothing really stood out as hard to understand, but that was just because the presenter did a great job.
Wednesday, September 2, 2009
Section 3.2-3.3, September 4
1) For me the most difficult part was doing division in modular arithmetic. I am used to calculating the modulus of a number, but I have not had to do much arithmetic with it. I have never done division this way, so it was really weird, and weird to have to think to check for relative primality.
2) The most interesting part was also the modular math. Like I said, I have not done much math dealing with congruences. Solving for x was new and cool.
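Solving for x in a·x ≡ b (mod n) comes down to that relative-primality check: the congruence is solvable iff gcd(a, n) divides b, and division is multiplication by an inverse. A sketch:

```python
from math import gcd

def solve_congruence(a, b, n):
    # solve a*x ≡ b (mod n); there are gcd(a, n) solutions when it divides b
    g = gcd(a, n)
    if b % g:
        return []                       # no solution
    a, b, n = a // g, b // g, n // g    # reduce to the coprime case
    x0 = b * pow(a, -1, n) % n          # "divide" by a via its inverse
    return [x0 + k * n for k in range(g)]   # all solutions mod the original modulus
```

For example, 3x ≡ 6 (mod 9) has the three solutions 2, 5, 8, while 2x ≡ 3 (mod 4) has none.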
Monday, August 31, 2009
Section 1.1-1.2, 3.1, due on September 2
1) The most difficult part for me was the GCD algorithm. I have never really cared to understand how it works, since it is not something that you need to code all that often. I know that this is not a very hard part, but all the other sections are things we have covered extensively in other CS classes.
2) The most interesting part for me was the prime number theorem. I knew that there were a lot of large prime numbers, but I never knew that there was a formula that estimates how many there are.
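The GCD algorithm, in its extended form, also hands back the Bézout coefficients x and y with a·x + b·y = gcd(a, b), which is what makes it useful later for computing modular inverses:

```python
def ext_gcd(a, b):
    # returns (g, x, y) with a*x + b*y == g == gcd(a, b)
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y
```

For a = 240 and b = 46 it reports gcd 2 together with 240·(−9) + 46·47 = 2.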
Introduction, due on September 2
I am a senior with a major in Computer Science and Chinese. I have taken Linear Algebra and Multivariable Calculus. I am taking Math 485 because I have always been interested in cryptography, and because this class has fewer prerequisites than number theory.
I have experience in mathematics. As for programming experience, I am at an expert level.
My best math professor was Jeffery Humpherys. He made everything we did very applicable. Too often math classes focus only on purely numeric problems that have no relation to the real problems that math is used for. Dr. Humpherys had several Mathematica labs that allowed us to apply the linear algebra we were learning to real problems.
An interesting fact is that next year I will be taking CS classes at Nanjing University in China as part of the Chinese Flagship program.