EA is becoming increasingly inaccessible, at the worst possible time, by Ann Garth
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: EA is becoming increasingly inaccessible, at the worst possible time, published by Ann Garth on July 22, 2022, on The Effective Altruism Forum.

Many thanks to Jonah Goldberg for conversations which helped me think through the arguments in this essay. Thanks also to Bruce Tsai, Miranda Zhang, David Manheim, and Joseph Lemien for their feedback on an earlier draft.

Summary

- An influx of interest in EA makes accessibility really important right now
  - Lots of new people have recently been introduced to EA / will be introduced to EA soon
  - These people differ systematically from current EAs and are more likely to be "casual EAs"
  - I think we should try to recruit these people
- There are two problems that make it hard to recruit casual EAs
  - Problem 1: EA is (practically) inaccessible, especially for casual EAs
    - Doing direct work is difficult and risky for most people
    - Earning to give, at least as it's commonly understood, is also difficult and risky
  - Problem 2: EA is becoming (perceptually) inaccessible as a focus on longtermism takes over
    - Longtermism is becoming the face of EA
    - This is bad because longtermism is weird and confusing for non-EAs; neartermist causes are a much better "on-ramp" to EA
- To help solve both of these problems, we should help casual EAs increase their impact in a way that's an "easier lift" than current EA consensus advice

A few notes

On language: In this post I will use longtermism and existential risk pretty much interchangeably. Logically, of course, they are distinct: longtermism is a philosophical position that leads many people to focus on the cause area(s) of existential risk. However, in practice most longtermists seem to be highly (often exclusively) focused on existential risks.
As a result, I believe that for many people — especially people new to EA or not very involved in EA, which is the group I'm focusing on here — these terms are essentially viewed as synonymous. I will also consider AI risk to be a subsection of existential risk. I believe this to be the majority view among EAs, though not everyone thinks it is correct.

On the structure of this post: The two problems I outline below are separate. You may think only one of them is a problem, or that one is much more of a problem than the other. I'm writing about them together because I think they're related, and because I think there are solutions (outlined at the end of this post) that would help address both of them.

On other work: This post was influenced by many other EA thinkers, and I have tried to link to their work throughout. I should also note that Luke Freeman wrote a post earlier this year which covers similar ground as this post, though my idea for this post developed independently from his work.

An influx of interest in EA makes accessibility really important right now

Lots of people are getting introduced to EA who weren't before, and more people are going to be introduced to EA soon

EA is becoming more prominent, as a Google Trends search for "effective altruism" shows pretty clearly. EA is also making strides into the intellectual mainstream. The New York Times wrote about EA in a 2021 holiday giving guide. Vox's Future Perfect (an EA-focused vertical in a major news outlet) started in 2018 and is bringing EA to the mainstream. Heck, even Andrew Yang is into EA!

I (and others) also think there will be a lot more people learning about EA soon, for numerous reasons. Sam Bankman-Fried and FTX have been all over the news recently, including in articles focused on EA. The number of EA local groups has grown hugely and continues to grow (from 2017 to 2019, 30 new EA groups were founded per year).
The huge influx of funding from FTX means that in the coming years more EA grants will be made, more EA orgs will come into existence, and presumably more people will thus learn about EA. Wha...
