Charlie Harvey


A lot of this is normal behavior that people simply don't understand.

js> Array(3)==",,"

That is fully expected: when an array is compared to a string with ==, the array is coerced to its string representation. Array(3) has three empty slots, which join to ",", so without checking the types, this is exactly what you should get.
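The coercion is easy to see in a Node or browser console; a minimal sketch:

```javascript
// When == compares an array with a string, the array is converted to a
// primitive via toString(), which joins its elements with commas.
// Array(3) has three empty slots, so joining them yields just two commas.
const arr = Array(3);

console.log(String(arr));          // ",,"
console.log(arr == ",,");          // true
console.log([1, 2, 3] == "1,2,3"); // true: the same coercion rule
```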


The issue with something like

js> 9999999999999999

is that the largest exactly representable integer is 9007199254740992 (2^53), because JavaScript stores all numbers as IEEE 754 double-precision floating-point values.
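You can watch the precision run out at that boundary; a quick console sketch:

```javascript
// Above 2**53 a double can no longer represent every integer, so
// 9999999999999999 silently rounds to the nearest representable value.
console.log(9999999999999999 === 10000000000000000); // true
console.log(Number.MAX_SAFE_INTEGER);                // 9007199254740991
console.log(2 ** 53);                                // 9007199254740992
console.log(2 ** 53 === 2 ** 53 + 1);                // true: the +1 is lost
```

Note that Number.MAX_SAFE_INTEGER is 2^53 − 1, one less than the value above: 2^53 itself is representable, but it is the first value that collides with its neighbor.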

Now, there is a perfectly valid reason why this is an expected result:

js> 0.1+0.2==0.3

The issue is that the first operand of the addition isn't actually just 0.1; its actual value is 0.100000000000000005551. The same goes for 0.2, whose actual value is 0.200000000000000011102. If you execute 0.1 it will still display 0.1, because JavaScript shows the shortest decimal string that round-trips to the same underlying double.

If you execute 0.10000000000000006 it will return 0.10000000000000006, but if you execute 0.100000000000000006 (one more digit of precision) it will return 0.1, because that literal rounds to the same double as 0.1.
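A small sketch of that shortest-round-trip display rule, assuming a Node or browser console:

```javascript
// JavaScript prints the shortest decimal string that round-trips to the
// same underlying double. 0.10000000000000006 is a different double than
// 0.1, so it prints in full; 0.100000000000000006 rounds to the very same
// double as 0.1, so the shortest round-tripping string is just "0.1".
console.log(String(0.10000000000000006));  // "0.10000000000000006"
console.log(String(0.100000000000000006)); // "0.1"
console.log(0.100000000000000006 === 0.1); // true: the same double
```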

Now, here is what is actually happening

js> (0.1).toPrecision(21)
"0.100000000000000005551"
js> (0.2).toPrecision(21)
"0.200000000000000011102"
js> (0.1+0.2).toPrecision(21)
"0.300000000000000044409"

Now that makes more sense: the true values were just hidden by the way JS chooses to display numbers. Python exhibits the same behavior.

So, in conclusion, 0.1+0.2 is not equal to 0.3; it's equal to 0.300000000000000044409:

js> 0.1+0.2 == 0.300000000000000044409
From a high level I can see why the following examples behave the way they do:
js> Math.max() > Math.min()
js> ['hello'] + [' '] + ['world']
'hello world'
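Both results fall out of simple rules; a minimal console sketch:

```javascript
// With no arguments, Math.max() returns -Infinity (the identity element
// for max) and Math.min() returns Infinity (the identity for min), so
// the comparison is -Infinity > Infinity, which is false.
console.log(Math.max());              // -Infinity
console.log(Math.min());              // Infinity
console.log(Math.max() > Math.min()); // false

// The + operator coerces each array to its string form and concatenates:
console.log(['hello'] + [' '] + ['world']); // "hello world"
```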

So []+[] returning a string is normal, but what about when you check the typeof []+[] ...?

js> typeof []+[]
Welp, that sure isn't a string. BUT, what about when you assign []+[] to a variable and
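The typeof surprise comes down to operator precedence, not to the value itself; a minimal sketch:

```javascript
// typeof binds tighter than +, so typeof []+[] parses as (typeof []) + [],
// which is "object" + "" === "object". Parenthesize to take the sum first:
console.log(typeof [] + []);   // "object"
console.log(typeof ([] + [])); // "string"

// Assigning []+[] to a variable makes the same point:
const s = [] + [];
console.log(typeof s); // "string"
console.log(s === ""); // true: concatenating two empty arrays yields ""
```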