Last updated June 2025
Atlas Fellowship – Resource Gallery - last updated in 2023…
https://airtable.com/app53PsYpHxJW61l3/shrQSYXSW9z96y5WE/tbls2B9CeeFfavLXo - a list of opportunities in high-impact fields.
https://docs.google.com/spreadsheets/d/1JTNfDeRXZKR5hEF3WYIBpi2Wu8QAeghFwJvJ53DrJWo/edit#gid=0 - a spreadsheet of resources for whatever you’re building.
https://www.aisi.dev/resources - a list of opportunities to get into AI safety (technical and policy), maintained and updated by third parties.
https://aisafetyfundamentals.com/ - a really good introduction to the most pressing problems in AI safety, the current landscape, and current solutions.
https://www.arena.education/ - technical upskilling for alignment researchers.
https://www.youtube.com/c/robertmilesai - conceptual arguments for why you should care about AI safety, from the goat who got me into alignment.
https://arc.net/folder/D0472A20-9C20-4D3F-B145-D2865C0A9FEE - Ilya Sutskever’s list of relevant ML papers in the current paradigm.