For example, they allow you to write
1/3 + 1/3 + 1/3 = 1
which is not possible in decimal.
Tf you mean?? You can write it in a repeating decimal as
0.333....
using ellipsis. https://wiki.froth.zone/wiki/Repeating_decimal
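For what it’s worth, a repeating decimal like 0.333.... can be generated mechanically by long division, carrying remainders until one repeats. Here’s a minimal sketch in Python (base 10 and a non-negative fraction assumed; the function name repeating_decimal is just illustrative):

```python
# Long-division sketch: expand numerator/denominator into decimal digits,
# stopping when a remainder repeats (the repetend goes in parentheses).
def repeating_decimal(numerator: int, denominator: int) -> str:
    whole, remainder = divmod(numerator, denominator)
    digits = []
    seen = {}  # remainder -> index in `digits` where it first appeared
    while remainder and remainder not in seen:
        seen[remainder] = len(digits)
        digit, remainder = divmod(remainder * 10, denominator)
        digits.append(str(digit))
    if not remainder:                      # expansion terminates
        return f"{whole}." + ("".join(digits) or "0")
    start = seen[remainder]                # the cycle begins here
    return f"{whole}." + "".join(digits[:start]) + "(" + "".join(digits[start:]) + ")"

print(repeating_decimal(1, 3))   # 0.(3)
print(repeating_decimal(1, 6))   # 0.1(6)
print(repeating_decimal(1, 4))   # 0.25
```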
So you think
0.333.... + 0.333.... + 0.333.... = 1
is clearer and more concise than
1/3 + 1/3 + 1/3 = 1
Fractional representation is the preferred method for rational numbers, particularly when they are part of an intermediate calculation.
Decimals are lossy, fractions aren’t.
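Here’s what “lossy” means concretely: a minimal Python sketch (the float case assumes IEEE-754 doubles, which is what CPython uses; fractions is in the standard library):

```python
from fractions import Fraction

# Exact rational arithmetic: nothing is lost at any step.
third = Fraction(1, 3)
print(third + third + third == 1)        # True, exactly

# Truncating the decimal loses information before any arithmetic happens.
print(0.333 * 3)                         # 0.9990000000000001, not 1

# Binary floats can't hold 1/3 exactly either: converting the fraction
# to a float and back shows the round trip lands on a nearby value.
print(Fraction(float(third)))            # 6004799503160661/18014398509481984
print(Fraction(float(third)) == third)   # False
```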
No, because you said this:
You can also precisely write to infinity if you write 0.333…
Decimals aren’t lossy: any fraction can be converted to decimal, it just takes longer to write.
The fraction 1/3 is a compact and unambiguous representation—it doesn’t rely on an ellipsis or an understanding of infinite series to be interpreted. It can easily be used in later calculations (you never see … notation in algebra). It is a useful notation.
As soon as you use decimals in computer and human calculations, they become lossy.
I’m not really sure what hill you are trying to die on. Fractions are useful, even if you don’t know how to use them.
What does lossy mean? I’m not trying to die on any hill, but I’m quite confused as well.
Well, no, it’s understood that a third is .333 to infinity, so .333 + .333 + .333 does equal 1 for any use not requiring precision to the point of it mattering that it was actually .33333335 when measured.
No. You wrote .333
If you want to precisely write to infinity, you write 1/3.
Holy fuck. Where did that 5 come from?
It came from it not actually being .333 to infinity when measured at the engineering precision I was talking about. It’s literally a “common use” mathematical convention (which you are clearly unaware of) that three times .333 is one. It solves a lot of problems caused by a failure of the notation.
3 times 0.333 is 0.999, not 1.
Saying it equals 1 may be a common engineering convention, but it is mathematically incorrect.
There is no failure of notation if fractions are used, which is why I gave this example of usefulness.
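To put numbers on that, a quick sketch with Python’s decimal and fractions modules (the 3-significant-digit precision setting just mimics writing .333; it isn’t any particular engineering convention):

```python
from decimal import Decimal, getcontext
from fractions import Fraction

getcontext().prec = 3              # keep 3 significant digits, like writing .333
third = Decimal(1) / Decimal(3)    # Decimal('0.333') -- the truncation happens here
print(third * 3)                   # Decimal('0.999'), not 1

print(Fraction(1, 3) * 3 == 1)     # True: the fraction never lost anything
```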
You know, when a person informs you of a convention people use to solve a problem created by notation, you could just fucking learn instead of arguing stupidity.