- cross-posted to:
- hackernews@derp.foo
I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%.
This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:
Assume that before COVID, you were considering two theories:
- Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
- Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.
And suppose before COVID you were 50-50 about which of these was true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.
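For what it's worth, the arithmetic in that footnote does check out, if you grant the made-up priors. A quick sketch of the update, using only the numbers from the footnote itself (nothing else assumed):

```python
# Scott's made-up setup: two theories, each believed with probability 0.5.
rate_common, rate_rare = 0.33, 0.10  # P(lab-leak pandemic per decade) under each theory
p_common = p_rare = 0.5

# Prior predictive probability of a lab-leak pandemic in a decade
prior_pred = p_common * rate_common + p_rare * rate_rare  # 0.215, i.e. ~21%

# Branch 1: a lab-leak pandemic is observed -> reweight the theories (Bayes' rule)
post_common = p_common * rate_common / prior_pred  # ~0.767, the "76-24"
post_pred = post_common * rate_common + (1 - post_common) * rate_rare  # ~0.2765, the "27.5%"

# Branch 2: no lab-leak pandemic observed in the decade
no_common = p_common * (1 - rate_common) / (
    p_common * (1 - rate_common) + p_rare * (1 - rate_rare))  # ~0.427
no_pred = no_common * rate_common + (1 - no_common) * rate_rare  # ~0.198, the "19-20%"

print(prior_pred, post_pred, no_pred)
```

So the numbers are internally consistent; the complaint below is about where the inputs came from, not the multiplication.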
Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.
I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.
Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And, convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.
OK, my knowledge of Bayes is rusty at best, but isn’t the idea that the observed events should be relatively common, and/or independent of each other?
So far, there has been zero or one[1] lab leak that led to a world-wide pandemic. Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.
Also, ideally, if there was a lab leak, then people running labs would take note and ensure that that particular failure mode doesn’t happen again. Thus the probability of an occurrence would be less than the first time it happened, because people actually take note of what has happened and change stuff.
Scottyboy could have used something that has occurred multiple times, like a nuclear powerplant accident, but his audience loves nuclear power, so that’s a non-starter. Also it’s a given that the mainstream press is the big bad in the fight against nuclear, just because serious accidents with widespread death and economic destruction happen again and again with nuclear power.
Raising the lab leak “hypothesis” is just signalling to his base.
[1] depending on where you stand in current US politics
This makes me wonder: we know the Rationalists did worry about a global pandemic before COVID-19; we checked the waste water and the smug particles increased exponentially for a short time in Feb 2020. But did they also worry about an ordinary lab leak like the one that might have happened here? Or was it all nature/terrorism/AGI stuff?
Also, if you think either of Scott’s two lab-leak theories is true:
You should probably be campaigning to increase safety or shut down the labs you think would be responsible. 10% risk of pandemic per decade due to lab leaks (so in addition to viruses mutating on their own) isn’t rare or an acceptable risk.
So, actually, many people were thinking about lab leaks, and the potential of a worldwide pandemic, despite Scott’s suggestion that stupid people weren’t. For years now, the bioengineering field has been concerned with accidental lab leaks, because the understanding that the risk existed was widespread.
But the reality is that guessing at probabilities of this sort still doesn’t change anything. It’s up to labs to pursue safety protocols, which happens at the economic edge of the opportunity versus the material and mental cost of being diligent. Lab leaks may not change the probabilities much, but the events themselves do cause trauma, which acts not as some Bayesian correction but as an emotional correction, so that people’s motivation to at least pay more attention increases for a short while.
Other than that, the greatest rationalist on earth can’t do anything with their statistics about lab leaks.
This is the best paradox. Not only is Scott wrong to suggest people shouldn’t be concerned about major events (the traumatic update to individuals’ memory IS valuable), but he’s wrong to suggest that anything he or anyone does after updating their probabilities could possibly help them prepare meaningfully.
He’s the most hilarious kind of wrong.