If I set 175% scaling in GNOME Settings, the value is saved as 1.7518248558044434 in ~/.config/monitors.xml:
<monitors version="2">
  <configuration>
    <logicalmonitor>...
It’s not a “language” issue, it’s a “computer” issue: the math is being done on the CPU, in IEEE 754 floating point. Some languages do provide arbitrary-precision math (Java’s BigDecimal, for example), but it’s slower. Not what you want if you’re multiplying a 4K matrix every millisecond.
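To make this concrete: 1.75 by itself is exactly representable in binary (1 + 1/2 + 1/4), so the extra digits suggest the compositor computed the scale as some ratio that is not exactly representable and stored it as a 32-bit float. A minimal Java sketch, assuming a ratio of 240/137 (one fraction whose nearest 32-bit float, widened to a double, prints as exactly the monitors.xml value; the ratio and the class name are illustrative assumptions), with BigDecimal shown for contrast:

import java.math.BigDecimal;
import java.math.RoundingMode;

public class ScaleRounding {
    public static void main(String[] args) {
        // 240/137 is an assumed ratio: one fraction whose nearest
        // 32-bit IEEE 754 float matches the value GNOME stored.
        float scale = 240.0f / 137.0f;

        // Shortest string that round-trips the float:
        System.out.println(scale);           // 1.7518249

        // Widened to double, as a serializer might do, the full
        // binary value appears: the exact digits in monitors.xml.
        System.out.println((double) scale);  // 1.7518248558044434

        // Arbitrary-precision decimal math avoids binary rounding,
        // at the cost of speed: 240/137 to 16 decimal places.
        BigDecimal exact = BigDecimal.valueOf(240)
                .divide(BigDecimal.valueOf(137), 16, RoundingMode.HALF_UP);
        System.out.println(exact);           // 1.7518248175182482
    }
}

That the double output matches the stored value digit for digit is consistent with a 32-bit float being widened to a double somewhere before serialization.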
I see, thanks for the explanation.