1 is equal to .999...
1/3 = .33
2/3 = .66
3/3 = .99 = 1 |
Better:
1/3 = .333333
2/3 = .666667
3/3 = 1.000000 |
1/3 = .33
2/3 = .67
1/3 + 2/3 = .33 + .67 = 3/3 = 1
Fun, but wrong. :) |
x = 0.9999..
10x = 9.9999..
10x - x = 9.9999.. - 0.9999..
9x = 9
x = 1
x = 1 (now), x = 0.9999.. (before), so 1 = 0.9999... No? |
lim n-->inf sum(i=1..n, 3/10^i) = 1/3

lim n-->inf sum(i=1..n, 6/10^i)
  = lim n-->inf 2*sum(i=1..n, 3/10^i)
  = 2 * lim n-->inf sum(i=1..n, 3/10^i)
  = 2/3

lim n-->inf sum(i=1..n, 9/10^i)
  = lim n-->inf (sum(i=1..n, 3/10^i) + sum(i=1..n, 6/10^i))
  = lim n-->inf sum(i=1..n, 3/10^i) + lim n-->inf sum(i=1..n, 6/10^i)
  = 1/3 + 2/3 = 1 |
Quote:
There's a whole Wikipedia article about the issue! :D |
We know that 1 = 2... because we can have one 2. It's just: how do you teach a computer that 1 = 2 & 3 & 4 & 5 and so on, without the computer needing to see a 2 as a one?
|
Quote:
Code:
int one[4] = {2,3,4,5};
:D |
I'm no mathematician, so perhaps I'm missing something here, but I don't really understand what the fuss is about (at least in the Wikipedia article mentioned above).
Saying that 1/3 = .33 or .333 or .33333 is an approximation at best, and therefore the equals sign should NOT stand between the two sides. For the same reason, it can't be the starting point for any further deductions. We can only say that 1/3 = .(3) (which makes all the preceding operations impossible to start with) |
1 == 2 = 3 == 4
|
Quote:
jlinkels |
I think the truth is ... we should get rid of decimals altogether.
|