I'm really late on this one, but I wanted to explain Roko's Basilisk, for all the people who heard about it a while ago and never really "got" it.
The idea first started going around the internet a few years ago, and apparently was seriously freaking out a number of people in the Less Wrong forums. I think I first heard about it from this Slate article, maybe, then spent time trying to find something that explained why this idea was considered so horrifying. The RationalWiki explanation likewise failed to shed any light on why anyone would actually be scared of the thing.
The concept builds on a number of premises that float around the Less Wrong community relating to the technological singularity, namely "friendly god" AI, utilitarian ethical calculus, and simulated consciousness.
The Basilisk is a superhuman AI from the future. The abilities of this AI are essentially infinite, even up to traveling backwards in time. The idea of the Basilisk is that it wants you to contribute your money to helping it be built, and if you refuse to help it, it will create a simulation of you and torture the simulation forever.
And so I think a normal person quite understandably has trouble seeing why anyone would even consider this a good B-list villain for Star Trek, much less a cause for existential dread.
But it's actually not that silly. And once you understand the background to it better, it all makes sense. So let me explain to you what the Basilisk is in clearer terms, so that you too can experience the angst. (That was your warning)
Monday, January 15, 2018
The Monty Hall Problem, Bayes Theorem, and a fault in Numberphile
I watch a lot of educational videos on YouTube, in particular the awesome channel Numberphile. I recently saw their video on the Monty Hall Problem, and was kind of disappointed at what seemed to be a rather pointless calculation that didn't really show the result, and instead showed something that was already kind of obvious.
The video can be found here and explains everything, but let me explain it again for completeness.
The Monty Hall Problem is a classic apparent paradox in probability, named after gameshow host Monty Hall from Let's Make a Deal. In the show, the contestants are shown three doors and told that behind one of the doors is a brand new car. Behind the other two doors are "worthless" prizes; anything works, but traditionally the problem says the other two doors hold goats. The player gets to pick any of the three doors, and whatever is behind that door is what they win. If they pick right they get a car; otherwise they get a goat.
To add tension, after the contestant picks, Monty Hall would walk to another door, a door that the player did not pick and that Monty knows hides a goat, and show them what was behind it. And look! It's a goat! The car is still out there!
In the Monty Hall Problem (not necessarily the show), Monty then asks if the contestant would like to change their mind.
The question is: what is the probability of the player guessing correctly if they swap their pick?
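If you want to convince yourself of the answer before reading on: switching wins about 2/3 of the time, and you can check that empirically. Here is a minimal Monte Carlo sketch of my own (not from the video) that plays the game many times, under the assumption that Monty always opens an unpicked door hiding a goat.

import random

def play(switch, trials=100_000):
    # Play the Monty Hall game `trials` times and return the fraction of wins.
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's initial pick
        # Monty opens one of the unpicked doors that hides a goat
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            # move to the one remaining door that is neither picked nor opened
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # comes out around 1/3
print("switch:", play(switch=True))    # comes out around 2/3

The whole effect hinges on the assumption baked into that simulation: Monty knows where the car is and always reveals a goat from the doors you didn't pick. Conditioning on that, a Bayes' theorem calculation gives the same 2/3 for switching.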