Can confirm. Moved from the US to Canada, and maybe a year of using Celsius revealed to me just how fucking stupid and convoluted Fahrenheit is. My dad spent three weeks out here and started using Celsius on his phone. Now I only use Fahrenheit when dealing with fevers or temping cases of suspiciously overripe produce.
Fellow Americans. Celsius is superior and more intuitive for those who take a moment to adjust to it. It is okay to accept this as fact without developing an inferiority complex. USA not always #1. USA quite often not #1, and that is okay.
Fahrenheit is European.
*was
I use it and I am not European.
Fahrenheit has a fine granularity whose usefulness is lost in cold climates. It's why the Bahamas and Belize use it as well.
Well, you know that you can use decimals?
How is -40.000001°F finer than -40.00000000001°C?
23°C is a nice room temperature.
18°C is a bit chilly but still a comfortable temperature.
If you want to go for a finer distinction, then we can say 18.5°C is warmer, but I personally can't feel the difference.
Our bodies are mostly water, so why not use a system that reflects this?
The universe is mostly empty space with an average temperature of like… 4 Kelvin or some shit. Why not use a system that reflects that? Oh, we do? Right. Celsius is just Kelvin minus 273.15.
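A quick sanity check of that shift, sketched in Python (the function names are just mine, not any standard library):

```python
def kelvin_to_celsius(k: float) -> float:
    # Same size of degree; Celsius is just shifted down by 273.15.
    return k - 273.15

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

print(kelvin_to_celsius(4.0))   # -269.15 (deep-space cold)
print(celsius_to_kelvin(23.0))  # 296.15
```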
…Rankine glowers in your general direction…
Are you made of mostly empty space? Your response does leave me questioning. Please acknowledge that you are made of 64% water, not 4°k of nothing.
I mean, yeah, we all are. That’s how atoms work.
Alternatively, yeah, mostly between his ears.
As a matter of fact…
Please do not use Kelvin with a degree symbol. There is no “degree Kelvin”.
Please make sure you are right before you correct someone https://www.techtarget.com/whatis/definition/kelvin-K
I don’t know why “techtarget” would be a credible source on physics questions, but the SI convention, which according to Wikipedia is the “only system of measurement with an official status in nearly every country in the world, employed in science, technology, industry, and everyday commerce”, states that “kelvin is never referred to nor written as a degree.”
But I also made the mistake of writing it as “Kelvin” instead of “kelvin”.
So then we should use the system that puts the freezing and boiling points of water at nice round values like 0 and 100? Sounds like Celsius is the better system.
Slightly off topic, but 23°C is a nice room temperature? We have our thermostats at 20°C and I find that quite warm. In the bedroom we have 18°C, and the same in my office, which I find quite comfortable. I hate visiting my parents; they always have 22.5°C, which I find uncomfortably warm.
Well, it's all subjective after all; I'll be happy about a chilly 23°C inside when summer comes.
I can feel the difference between 71 and 73 in my house.
At 73, my kids' room is uncomfortably hot. At 71, it has a perfect chill for sleeping.
What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?
If you’re worried about your thermometer, you’ll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. Since +1°F means only +5/9°C (about 0.56°C), you have less precision!
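To put numbers on the step comparison, a little sketch (the 0.5°C and 1°F graduations are the claim from the comment above, not a survey of actual thermometers):

```python
# Assumed graduations, per the comment above: half-degree marks on
# Celsius thermometers, whole-degree marks on Fahrenheit ones.
C_STEP_IN_C = 0.5
F_STEP_IN_C = 1.0 * 5 / 9  # a 1 degF step expressed in degC

print(f"Celsius step:    {C_STEP_IN_C:.3f} degC")  # 0.500
print(f"Fahrenheit step: {F_STEP_IN_C:.3f} degC")  # 0.556 -> the coarser mark
```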
The point was they need that extra decimal because C isn’t good for human temperature sense.
It’s not like you are prohibited from using decimals in Fahrenheit. It’s that you don’t need 3 digits because it works better for people.
And fuck you for making me defend the most ass backwards measurement system on the planet.
It’s just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is a more accurate representation, because small differences in temperature are smaller. Just like your argument, this is purely an opinion - until you can show me that not needing the extra decimal is objectively better, or until I can show you that smaller differences being represented as such is objectively better, neither of them holds any weight.
It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.
Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature: the scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.
So whenever you have to tell someone the temperature outside, you say it’s 0.000000000000000000000000015237 Planck
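Under the parent comment's own definition (0 at absolute zero, 100 at 10^32 K), here's roughly what an everyday temperature works out to; the helper function is hypothetical:

```python
def to_proposed_scale(kelvin: float) -> float:
    # Map 0..1e32 K onto 0..100, per the parent comment's proposal.
    return kelvin * 100.0 / 1e32

print(to_proposed_scale(293.15))  # ~2.9e-28 for a 20 degC day: not exactly handy
```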
If 3 digits isn't a tiny bit more cumbersome than 2, then 32 digits is fine too.
We don’t have issues with decimals in many places. For example, why are there pennies? Why isn’t the dollar just scaled up by 100? Generally speaking: why don’t people immediately shift to the lower unit when talking about, e.g., 3.5 miles? If you’re correct, those should be simplified too - yet they aren’t.
Because Celsius uses a scale built around temperatures you encounter in everyday life.
Why? That scale is still arbitrarily chosen.
Dude, 71 is way too warm for sleeping; try 64-65, it's healthier.
I don’t know if my thermostat is just wrong or if the layout of my house makes it inaccurate, but 64-65 in my house is frigid.
Plus we have a baby, so 67-68 is really the lowest we could go at night, I think.
But I agree, I sleep better in general when the blankets are warm and the house is cold!
Well it’s all subjective, I guess. Also depends on where you live.
I would argue it’s because of historical usage, familiarity, and resistance to change. Most countries and most people living in hot climates use Celsius.
Save yourself before it’s too late.
Do not say anything positive about Fahrenheit in this thread… the Temperature Scale Inquisition is watching closely for any dissent from the party line.