Content provided by The Nonlinear Fund. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by The Nonlinear Fund or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

EA - Manifold markets isn't very good by Robin

Manage episode 424602103 series 3314709
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Manifold markets isn't very good, published by Robin on June 20, 2024 on The Effective Altruism Forum.

Disclaimer

I currently have a roughly 400-day streak on Manifold Markets (though lately I only spend a minute or two a day on it) and have no particular vendetta against it. I also use Metaculus. I'm reasonably well-ranked on both but have not been paid by either platform, ignoring a few Manifold donations. I have not attended any Manifest. I think Manifold has value as a weird form of social media, but it's important to be clear that this is what it is: not a manifestation of collective EA or rationalist consciousness, nor, in its current form, an effective attempt to improve the world.

Overview of Manifold

Manifold is a prediction market website where people can put virtual money (called "mana") into bets on outcomes. There are several key features of this:

1. You're rewarded with virtual money both for participating and for predicting well, though you can also pay to get more.

2. You can spend this mana to ask questions, which you will generally vet and resolve yourself (allowing many more questions than on comparable sites). Moderators can reverse unjustified decisions, but the system is largely self-governed.

Until recently, you could also donate mana to real charities, but this has stopped; now only a few "prize" questions pay out in a more exclusive currency that can be donated, and most questions produce unredeemable mana.

How might it claim to improve the world?

There are two ways in which Manifold could be improving the world: it could make good predictions (which would be intrinsically valuable for improving policy or making wealth), or it could donate money to charities.
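As background on how standing bets encode a probability at all: Manifold's actual automated market maker differs in detail, but the classic logarithmic market scoring rule (LMSR) illustrates the general mechanism. A minimal sketch, with an arbitrary liquidity parameter:

```python
import math

def lmsr_price(q_yes: float, q_no: float, b: float = 100.0) -> float:
    """Implied probability of YES under a logarithmic market scoring rule.

    q_yes and q_no are outstanding YES/NO shares; b is a liquidity
    parameter (larger b means trades move the price less).
    """
    ey = math.exp(q_yes / b)
    en = math.exp(q_no / b)
    return ey / (ey + en)

# A fresh market with no bets prices YES at 0.5.
print(lmsr_price(0.0, 0.0))    # 0.5
# Buying YES shares pushes the implied probability up (here to ~0.62).
print(lmsr_price(50.0, 0.0))
```

The point is only that the market price doubles as a probability estimate, which is what the "good predictions" claim below rests on.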
Until recently, the donation route looked quite reasonable: the company appeared to be rewarding predictive power with the ability to donate money to charities. The counterfactuality of these donations is questionable, however, since the money came from EA-aligned grants, and most of it goes to very mainstream EA charities. Manifold has a revenue stream from people buying mana, but this is less than $10k/month, some of which isn't really revenue (since it will ultimately be converted to donations), and presumably it doesn't cover staff costs. The founders appear to believe that they will eventually be paid to run markets for other organisations, in which case the donations would be counterfactual. But this relies on the markets producing good predictions.

Sadly, Manifold does not produce particularly good predictions. In last year's ACX contest, it performed worse than simply averaging predictions from the same number of people who took part in each market. Its calibration, while good by human standards, shows a clear systematic bias towards predicting things will happen when they don't (a Yes bias). By contrast, rival platform Metaculus has no easily-corrected bias and seems to perform better on the same questions (including in the ACX contest). Metaculus' self-measured Brier score is 0.111, compared to Manifold's 0.168 (lower is better, and this is quite a lot lower, though they are not answering all the same questions). Metaculus doesn't publish monthly active user numbers the way Manifold does, but the number of site visits the two receive is comparable (slightly higher for Metaculus by one measure, lower by another), so the prediction gap doesn't seem explicable by user numbers alone.

Can the predictive power be improved?

Some of Manifold's problems, like the systematic Yes bias, can be algorithmically corrected by prospective users. Others are more intrinsic to the medium.
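For concreteness: a Brier score is the mean squared error between probability forecasts and binary outcomes, and a Yes bias is, in principle, the kind of thing a user can correct by shrinking forecasts downward. A toy sketch (the forecasts, outcomes, and shrink factor here are made up for illustration; a real correction would be fitted on past resolved markets):

```python
def brier(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def shrink_toward_no(p, alpha=0.9):
    """Crude Yes-bias correction: scale the forecast toward 0.

    alpha is a hypothetical fitted parameter, not a real Manifold statistic.
    """
    return alpha * p

forecasts = [0.9, 0.7, 0.6, 0.2]  # made-up market probabilities
outcomes = [1, 0, 1, 0]           # made-up resolutions

print(brier(forecasts, outcomes))  # ≈ 0.175
adjusted = [shrink_toward_no(p) for p in forecasts]
print(brier(adjusted, outcomes))   # lower on this toy data
```

In this toy data the overconfident-Yes forecasts mean the downward shrink improves the score; whether a fixed correction helps on real Manifold data would depend on how stable the bias is.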
Many questions resolve based on extensive discussions about exactly how to categorise reality, meaning that subtle clarifications by the author can ...
