I don't know where so many people got this misconception from, but 0.999... does not equal 1. Numbers have a unique decimal representation and by writing 0.999... you explicitly denote a number which is infinitesimally smaller than 1.
>I don't know where so many people got this misconception from, but 0.999... does not equal 1...
This. Math is completely absurd. Like, adding 1+2+3+4 and so on supposedly gives you -1/12. Wtf? It is obviously infinity. Same thing with the 0.999... business. Math is stupid.
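For the record, the -1/12 value only comes out of a different summation scheme (zeta regularization, where ζ(-1) = -1/12); in the ordinary sense the partial sums of 1 + 2 + 3 + ... just keep growing. A minimal Python sketch of those partial sums, purely for illustration:

```python
# Minimal sketch: the ordinary partial sums of 1 + 2 + 3 + ... grow
# without bound, so the series diverges in the usual sense. The value
# -1/12 is only assigned by zeta regularization, a different (and
# well-defined) summation method, not by the limit computed below.
def partial_sum(n: int) -> int:
    """Sum of the first n positive integers, n * (n + 1) / 2."""
    return n * (n + 1) // 2

for n in (10, 1_000, 1_000_000):
    print(n, partial_sum(n))   # 55, 500500, 500000500000
```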
nice bait thread retard
I always just assumed it was basically because our minds can't comprehend such a small difference. It makes sense to use the repeating decimal for formulae and such, but when I think of a number I think of items. So if someone says 0.999... I think of something like an apple that brushed up against something rough.
To add, I've only ever seen redditors say they're the same thing, just to be quirky I guess.
Stupid math magicians. Like, I can see the two things are different, stop making shit up like rounding.
>Numbers have a unique decimal representation
Prove it
This. Give a proof, your words are meaningless.
>Numbers have a unique decimal representation
Okay, so 0.999..., 0.9999..., and 0.99999... must be different numbers then
Some real numbers have two decimal representations: any number with a terminating decimal expansion can also be written with a tail of repeating 9s. For 1, the other representation is 0.999... Just because it looks different doesn't mean it's literally a different number. Otherwise you could tell me what the difference between the two is, right?
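To spell out that last question with a rough sketch (exact fractions, nothing fancy): truncate 0.999... after n nines and the gap to 1 is exactly 1/10**n, which can be made smaller than any positive number you name. Since 0.999... denotes the limit of those truncations, the difference is 0.

```python
from fractions import Fraction

# Rough sketch in exact rational arithmetic: 0.9, 0.99, 0.999, ...
# written as fractions. The gap to 1 is exactly 1/10**n, which shrinks
# below any positive number, so the limit (which is what the infinite
# decimal 0.999... denotes) is exactly 1.
def n_nines(n: int) -> Fraction:
    """Sum of 9/10**k for k = 1..n, i.e. 0.99...9 with n nines."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 2, 5, 10):
    x = n_nines(n)
    print(n, x, 1 - x)   # the gap 1 - x is exactly Fraction(1, 10**n)
```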
1/3 = 0.333...
3 times 1/3 = 3 times 0.333...
3 times 1/3 = 1, and 3 times 0.333... = 0.999...
Therefore 0.999... = 1
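If anyone wants to sanity-check the arithmetic in that argument, here is a tiny sketch using exact fractions. The only assumption is the standard reading of 0.333... as 1/3 (the limit of 0.3, 0.33, 0.333, ...).

```python
from fractions import Fraction

# Tiny sanity check of the exact arithmetic: assuming the standard
# reading 0.333... = 1/3, tripling gives 1 on one side and 0.999...
# on the other, so the two notations name the same number.
one_third = Fraction(1, 3)
print(3 * one_third)        # prints 1
print(3 * one_third == 1)   # True
```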
Retard