So I was reading the goofs for this movie, and one of them said Ms. Vaughn was teaching her students about prime numbers and she left off 2 (which I agree is a goof) and she added 1. Since when is 1 not a prime number? I don't remember this from any of my high school math classes. Now I will admit, it's been 5 years since I took a high school math class...but I think I know a thing or two about such trivial things...when did this change? Because it sure confuses the heck out of me...thanks in advance :)
1 is excluded from the prime numbers because if it were considered prime, no number at all would ever be prime.
It's kind of hard to explain and this will probably need clarification, but we can see that 7 has only two factors: itself and 1. Essentially, we say that it is just itself, the number 7.
With the number 1, we can say that it is not prime even if we try to treat it as prime (it's kind of paradoxical). That is, 1 * 1 is 1. But 1 * 1 * 1 * 1 is also still just 1. And you can tack a 1 onto any number: 7 is 7 * 1 * 1.
Because 1 can be used as a factor as many times as you like, counting it as prime would render every number not prime.
You can't do this with other numbers.
After all, 2 is not 1 * 2 * 2 * 2, even though 2 is 2 * 1 * 1.
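If it helps to see that concretely, here's a tiny Python sketch (my own illustration, nothing from the movie or the goofs page). The point is that you can pad any factorization with as many 1s as you like, so if 1 counted as a prime, no number would ever have just one "prime factorization":

```python
from math import prod

# If 1 were prime, every number would have endlessly many
# "prime factorizations": 7, 7*1, 7*1*1, ... This lists a few.
def factorizations_with_one(n, max_ones=3):
    return [[n] + [1] * k for k in range(max_ones + 1)]

for factors in factorizations_with_one(7):
    print(factors, "->", prod(factors))
# [7] -> 7
# [7, 1] -> 7
# [7, 1, 1] -> 7
# [7, 1, 1, 1] -> 7
```

Every line multiplies back to 7, so none of them could be called THE factorization of 7.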
The explanation given by dma-maillist is wrong because it confuses commutativity with associativity. The truth is that both 1 & 2 are primes, & having both of them be primes doesn't prevent any other numbers from also being prime. --- IF I want your opinion, I'll GIVE it to you.
Yeah, you're right in a way. I think I was just trying to think outside the box or something. I can't remember what I was doing at the time I wrote that.
But, generally speaking, 1 is excluded from the primes because the general definition of a prime is a number whose only factors are itself and 1. I suppose one can say that 1 has ONLY itself as a factor, which is different from other primes. 2 is 2 * 1. 3 is 3 * 1. 5 is 5 * 1. 7 is 7 * 1.
For the primes from 2 on up, there are always TWO distinct factors.
But with 1, there is only one distinct factor: simply the number 1.
2 can be 2 * 1 * 1 * 1....
There will still be only 2 distinct factors, even if 1 appears as a factor infinitely many times. There will always be the number 2 as one factor and 1 as the other.
Whereas 1 is simply itself. It can be 1 * 1 * 1 on into infinity, but it still has no factor besides the number 1.
I think that is what I was trying to illustrate, but I don't think I did it correctly.
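That "two distinct factors" rule translates straight into code, if anyone wants to play with it. Here's a quick Python sketch (again just my own illustration; the function names are made up):

```python
def distinct_factors(n):
    """All distinct positive divisors of n (fine for small n)."""
    return {d for d in range(1, n + 1) if n % d == 0}

def is_prime(n):
    """Prime = exactly TWO distinct factors: 1 and the number itself."""
    return len(distinct_factors(n)) == 2

print(distinct_factors(1))                       # {1} -- only ONE distinct factor
print(distinct_factors(7))                       # {1, 7} -- two distinct factors
print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]
```

Counting distinct factors makes the cutoff automatic: 1 only ever has one factor, so it can never qualify, while 2 and every other prime have exactly two. (And notice the list starts at 2, which is the goof the original post was about.)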