The universe's expansion may actually have started to slow rather than accelerating at an ever-increasing rate as previously thought, a new study suggests.
As I understand it, it is based on the velocities of galaxies/stars (redshift/blueshift) along with their relative locations. IIRC the velocities of the bodies don’t square with the constantly increasing distance between the bodies which is why they came to the conclusion that the universe was expanding.
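For what it's worth, the redshift-to-distance step being described is usually Hubble's law: for nearby galaxies, recession velocity v ≈ c·z, and v = H0·d, so a measured redshift gives a distance. A toy sketch, with an assumed H0 of ~70 km/s/Mpc (a commonly quoted round figure, not a definitive value):

```python
# Toy illustration of Hubble's law: for small redshifts z,
# recession velocity v ≈ c*z, and v = H0*d, so z implies a distance.
# H0 = 70 km/s/Mpc is an assumed round value for illustration only.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per Mpc (assumed)

def distance_from_redshift(z: float) -> float:
    """Low-redshift approximation: d = c*z / H0, in megaparsecs."""
    v = C_KM_S * z     # recession velocity, km/s
    return v / H0      # distance, Mpc

# A galaxy at redshift 0.01 recedes at ~3000 km/s, so under these
# assumptions it sits roughly 43 Mpc away.
print(round(distance_from_redshift(0.01), 1))  # → 42.8
```

The whole debate is about what H0 actually is and whether it is constant, which is why the same redshift can imply different distances depending on the method used.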
It’s been over 10 years since I took that one astronomy class in college, but I think the math and logic made good enough sense. However, I always reserve a small amount of skepticism for astronomy because 90% of it (as I understand it) is based on bits of light we can see from out here. The universe is so vast; how can we be sure a light's shift is due only to motion, or that a star's flicker frequency is tied only to distance?
The scientific history of humans is filled to the brim with shit we were confidently wrong about for centuries.
Either way, the expansion of the universe is not something that has a practical effect on our lives for maybe a few hundred generations, if that.
There’s some wild stuff going on in astronomy. This particular hypothesis was made using baryon acoustic oscillation (BAO) data. The idea is that the early universe was hot and dense, and random fluctuations made certain regions more or less dense, like ripples in a pond. Then, as the universe rapidly expanded, those oscillations and over/under-densities were frozen in place, and we can look at the cosmic microwave background and the large-scale structure of the modern universe to see how those original density variations evolved into today's galaxies and voids. This gives us a measuring stick for distance that’s completely independent of the brightness of distant events. BAO and standard candles like Type Ia supernovae used to agree on the expansion rate to within their error bars; only with higher-precision modern instruments have we detected that the two methods disagree. This paper is trying to square that circle by critically analyzing the validity of the data we measure.
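To make the "measuring stick" idea concrete: in a flat ΛCDM model, the comoving distance to a given redshift is an integral over the assumed expansion history, and the BAO ruler's apparent angular size on the sky follows from that. A rough sketch with assumed, purely illustrative parameter values (Ωm ≈ 0.3, H0 ≈ 70 km/s/Mpc, sound horizon ≈ 147 Mpc):

```python
# Sketch of how a standard ruler like the BAO scale becomes a distance
# probe. All parameter values below are assumptions chosen only to
# illustrate the calculation, not measured results.

import math

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s/Mpc (assumed)
OMEGA_M = 0.3          # matter density fraction (assumed)
R_DRAG = 147.0         # BAO sound-horizon ruler length in Mpc (assumed)

def E(z: float) -> float:
    """Dimensionless expansion rate H(z)/H0 for flat Lambda-CDM."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + (1 - OMEGA_M))

def comoving_distance(z: float, steps: int = 10_000) -> float:
    """Comoving distance in Mpc: trapezoidal integral of (c/H0)/E(z)."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        z0, z1 = i * dz, (i + 1) * dz
        total += 0.5 * (1 / E(z0) + 1 / E(z1)) * dz
    return (C_KM_S / H0) * total

def bao_angular_scale_deg(z: float) -> float:
    """Apparent angular size of the BAO ruler at redshift z, in degrees."""
    return math.degrees(R_DRAG / comoving_distance(z))

# Surveys measure this angle in the clustering of galaxies; comparing
# it to the prediction tests the assumed expansion history.
print(round(bao_angular_scale_deg(0.5), 2))  # → 4.46
```

Changing H0 or Ωm in this sketch shifts the predicted angle, which is exactly how BAO measurements constrain the expansion rate independently of supernova brightness.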