There is an overwhelming amount of material on AI safety, and it can be difficult to know where to begin. Rather than trying to maintain an exhaustive or definitive reading list, we’ve collected a small set of resources that can serve as starting points for further exploration. These links are meant to help you discover the topics, questions, and perspectives that most interest you.
Introduction & Overview
- AI Safety Map
- List of links for getting into AI Safety
- List of AI Safety Newsletter & other resources
- AISafety.info
Courses
Forums & Online Discussion
Podcasts
Blogs
- Astral Codex Ten
- Redwood Research Substack
- In general, many AI Safety organizations and researchers also maintain blogs or publish writing online.
YouTube
Fellowships
We highly encourage you to apply to fellowships if you want to work on AI Safety or test your fit for this kind of work. But don’t be discouraged if you aren’t accepted on your first attempt — these fellowships have become extremely competitive. There are both full-time and part-time fellowships, and some focus more on technical work while others focus on AI policy and governance. Some fellowships you might be interested in include:
- Full-time (typically 12 weeks)
- Part-time
- SPAR
- AI Safety Camp
- Algoverse
- Athena (women & non-binary only)
- Upskilling Programs
- Governance
- Apart (Fellowships + Hackathons)
- Tarbell (Journalism)
- Global Challenges Project
- Future Impact Group
- Impact Academy Global AI Safety Fellowship
- EA conferences (these usually feature a lot of content on AI Safety, and are a great opportunity to meet other people, find collaborators, etc.)
If you want to apply for a fellowship but are unsure which one would be a good fit for you, feel free to reach out anytime — we are happy to offer guidance and advice.
