“With the advent of nuclear weapons, humanity entered a new age, where we face existential catastrophes—those from which we could never come back. Since then, these dangers have only multiplied, from climate change to engineered pathogens and [transformative] artificial intelligence. If we do not act fast to reach a place of safety, it will soon be too late.”—Toby Ord

This week we’ll cover the definition of an existential risk, examine why existential risks might be a moral priority, and explore why existential risks are so neglected by society. We’ll also look into one of the major risks we might face: a human-made pandemic worse than COVID-19. Pandemics have plagued humanity for millennia, and COVID-19 has shown that we remain ill-prepared for these events as a global society. Meanwhile, advances in biotechnology could either drastically increase or decrease the risks from future pandemics, by improving pandemic prevention and response or by making dangerous pathogen research easier to conduct.

Required Materials (80 mins.)

On extinction risks

On careers

More to explore