The meter is defined by the distance light travels in a specific fraction of a second. It may have been initially defined by a rough estimate, but it, like all metric measurements, is now fixed to universal constants.
Also, the foot is currently defined as exactly 0.3048 meters, so you are already using metric.
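Both of those definitions are exact, so the chain is easy to check end to end. A minimal sketch in Python (the constant names are my own, chosen for illustration):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # m/s, exact by the 1983 SI definition
FOOT_IN_METERS = 0.3048               # exact by the 1959 international yard agreement

# One meter is the distance light travels in 1/299,792,458 of a second,
# so the light-travel time across one foot follows directly:
t_foot_s = FOOT_IN_METERS / SPEED_OF_LIGHT_M_PER_S
print(f"light crosses one foot in {t_foot_s * 1e9:.4f} ns")  # ~1.0167 ns
```

That ~1 ns figure is why the foot is sometimes jokingly called a "light-nanosecond."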
I’m well aware that the U.S. has been on metric since the 19th century. My point is the base unit should be sensible. 1/299,792,458 of a second is not that. If the argument is “yeah, but that’s what we’re used to,” then what was the point of the metric system in the first place? Nine significant digits in a denominator suggests a systematic issue, not sensible science.
We’ve secretly replaced your arbitrary base unit with Folgers Crystals. Let’s see if they notice. Tell me why that definition makes more sense than an inch being three barleycorns.
Genuine question: do you think seconds are sensibly defined, either in SI or otherwise?