The Precipice by Toby Ord

  • book argues that safeguarding humanity’s future is the defining challenge of our time
  • humanity lacks the maturity, coordination, and foresight necessary to avoid making mistakes from which we could never recover
  • greatest risks are caused by human action, and they can be addressed by human action
  • existential risks — threaten the destruction of humanity’s longterm potential; extinction is the most obvious way humanity’s entire potential could be destroyed
  • we live in the most important era of human history
  • major reorientation in the way we see the world, and our role in it
  • closing the gap between our wisdom and power
  • effective altruism: people aspire to use evidence and reason to do as much good as possible (author helped to create this term)
  • what set humans apart was not physical but mental — our intelligence, creativity, and language
  • each human’s ability to cooperate with the dozens of other people in their band was unique among large animals.
  • we learned from our ancestors, added minor innovations of our own, passed this all down to our children
  • Agricultural Revolution: reduced the amount of land needed to support each person by a factor of a hundred, allowing large permanent settlements to develop, which began to unite together into states
  • Scientific Revolution: replace a reliance on received authorities with careful observation of the natural world, seeking simple and testable explanations for what we saw
  • idea of progress: working together to build a better future
  • Industrial Revolution: modern era of sustained economic growth; income grew faster than population, ushering in an unprecedented rise in prosperity that continues to this day
  • that so many have so little is among the greatest problems of our time, and has been a major focus of the author’s work
  • every single country nowadays has a life expectancy above 50
  • our planet will remain habitable for roughly a billion years
  • with the detonation of the atom bomb, a new age of humanity began
  • humanity has risen to a position where we control the rest of the world precisely because of our unparalleled mental abilities
  • trend toward increase in the power of humanity which has reached a point where we pose a serious risk to our own existence
  • very fact that these risks stem from human action shows us that human action can address them
  • risks are complex (so not amenable to simple mathematical analysis) and unprecedented (so cannot be approximated by a longterm frequency)
  • unsustainable level of risk — Precipice
  • highest levels of risk, humanity opening its eyes, coming into maturity and guaranteeing its long and flourishing future
  • existential catastrophe: destruction of humanity’s longterm potential
  • existential risk: risk that threatens the destruction of humanity’s longterm potential
  • need to preserve our vast potential, protect it against the risk of future destruction
  • allow our descendants to fulfill our potential
  • our potential is a matter of what humanity can achieve through the combined actions of each and every human
  • even if civilization did collapse, it is likely that it could be re-established
  • civilization has already been independently established at least seven times by isolated peoples
  • absent catastrophe, most generations are future generations
  • recognizing that people matter equally regardless of their geographic location is a crucial form of moral progress; so is recognizing that people matter equally regardless of where they are in time
  • longtermism: especially concerned with the impacts of our actions upon the longterm future
  • partnership between those that are living, those who are dead, those who are to be born
  • paying it forwards: duties to future generations may thus be grounded in the work our ancestors did for us when we were future generations
  • insight into systematic strengths or weaknesses in humanity’s ability to achieve flourishing
  • humanity spends more on ice cream every year than on ensuring that the technologies we develop do not destroy us
  • economic theory tells us that existential risk will be undervalued by markets, nations, and even entire generations
  • protection from existential risk is a public good: would benefit all of us and my protection doesn’t come at the expense of yours — global public good
  • protection from existential risk is an intergenerational global public good
  • when citizens are empathetic and altruistic, identifying with the plight of others, they can be enlivened with the passion and determination needed to hold their leaders to account
  • availability heuristic: tendency for people to estimate the likelihood of events based on their ability to recall examples
  • with existential risk it fails completely
  • more expansive compassion: acts over the long term
  • scope neglect: lack of sensitivity to the scale of benefit or harm
  • almost all the large asteroids have been tracked
  • no other existential risk is as well handled as that of asteroids and comets
  • willingness to think seriously about imprecise probabilities of unprecedented events is crucial to grappling with risks to humanity’s future
  • supernova: explosion of a star, releases as much energy as our Sun will radiate over its ten-billion-year lifetime
  • premature to conclude that we have discovered all of the possible mechanisms of natural extinction while major mass extinction events remain unexplained
  • detailed fossil record starts 540M years ago with the Cambrian explosion: rapid diversification of complex life into most of the major categories we see today
  • risks are substantially greater now for humans than they were for early humans or related species
  • chief among these is pandemics
  • natural risks are dwarfed by those of our own creation
  • a thousand times more anthropogenic risk over the next century than natural risk, so it is the anthropogenic risks that are the focus
  • CO2 release in the industrial period: more being released since 1980 than in the entire industrial period before that
  • can climate change itself directly threaten our extinction or permanent collapse?
  • most extreme effect is runaway greenhouse effect — driven by the relationship between heat and humidity
  • more vapor in the atmosphere produces more warming, which produces more water vapor, an amplifying feedback
  • runaway greenhouse effect: warming continues until the oceans have mostly boiled off, leaving a planet incompatible with complex life
  • moist greenhouse effect: stops short of boiling the oceans
  • cause for more research on these two issues
  • warming could become unrecoverable if we trigger other feedback effects that release more carbon into the atmosphere, if we emit more carbon ourselves, or if a given amount of carbon causes much more warming than we thought
  • melting arctic permafrost and release of methane from the deep ocean are concerning
  • human activity has already released more than an entire biosphere’s worth of carbon into the atmosphere
  • climate sensitivity is the number of degrees of warming that would eventually occur if greenhouse gas concentrations were doubled from their pre-industrial baseline of 280 PPM
  • major effects don’t necessarily predict extinction or irrevocable collapse
  • when we look at many of the past cases of extremely high global temps or rapid warming we don’t see a corresponding loss of biodiversity
  • obvious is heat stress
  • geoengineering = carbon dioxide removal and solar radiation management — limiting the amount of sunlight absorbed by earth
  • massive unintended consequences
  • amount of food per person has risen over the last 50 years
  • Norman Borlaug, who developed new high-yield varieties of wheat, may be responsible for saving more lives than anyone else in human history
  • population is now increasing in a linear manner, fixed number of people being added each year instead of a fixed proportion
  • fraction of species that have gone extinct is much lower than a mass extinction
  • toll on biodiversity within each region may be much higher, and this may be what matters most
  • loss of ecosystem services — purifying water and air, providing energy and resources, improving soil, that animals and plants currently do for us, but we may find costly or impossible to do ourselves
  • risks technology imposes on humans have been outweighed by the benefits it has brought
  • Black Death = between 1/4 and 1/2 of all Europeans were killed
  • 5 to 14 percent of the world
  • during the 20th century, fifteen countries are known to have developed bioweapons programs, including the US, UK, and France
  • 100 years of improvements to bioweaponry should alarm us
  • most dangerous escapes are not microbes, but information: information hazards
  • unilateralist’s curse: it takes just one overly optimistic estimate to lead to the information being released
  • information hazards are especially important for biorisk, due to its high ratio of misuse risk to accident risk
  • deep learning gives networks the ability to learn subtle concepts and distinctions
  • concerns about AI entrenching social discrimination, producing mass unemployment, supporting oppressive surveillance, violating norms of war
  • unmatched intelligence led to unmatched power and thus control of our own destiny (humans)
  • specification of which acts and states produce reward for the agent is known as its reward function
  • assumes, with AI, that the builders of the systems are striving to align it to human values
  • AI researchers don’t know how to make a system which, upon noticing this misalignment, updates its ultimate values to align with ours rather than updating its instrumental goals to overcome us
  • transition to a world where humans are no longer the most intelligent entities on earth could easily be the greatest ever change in humanity’s place in the universe
  • enforced dystopia: Orwell, totalitarian state achieving global dominance and absolute control, locking the world into a miserable condition
  • key aspects of the future of civilization are being locked in such that they are almost impossible to change
  • unforeseen risks are thus important to understanding the relative value of broad versus narrowly targeted efforts
  • unaligned AI = greatest risk, in the author’s view
  • Global Burden of Disease = major study, influenced the author’s work in the field; study of global public health
  • if something is a risk factor, its opposite will be a security factor
  • the more a problem is important, tractable or neglected, the more cost-effective it is to work on it, and thus the higher its priority
  • the importance of a problem is the value in solving it
  • focus on risks that are soon, sudden, and sharp
  • we should prioritize sharp risks — those that are less likely to be preceded by warning shots — for they are more likely to remain neglected over the long run
  • most existential risk comes from human action: from activities that we can choose to stop, or to govern effectively
  • humanity’s point of view: what all humans would do if we were sufficiently coordinated and had humanity’s longterm interests at heart
  • a place where existential risk is low and stays low = existential security
  • protect humanity’s potential, preserve it
  • existential security is about reducing total existential risk by as many percentage points as possible
  • visions for humanity would be best realization of our potential: Long Reflection
  • which is the best kind of future for humanity?
  • ultimate aim is fully achieving humanity’s potential
  • first get ourselves to safety — to achieve existential security
  • make decisions of grave importance without access to robust probabilities for the risks involved
  • each nation is inadequately incentivized to take actions that reduce risk and to avoid actions that produce risk, preferring instead to free-ride on others
  • current predicament stems from humanity’s power outstripping the slow and unsteady growth of our wisdom
  • responsible deployment and governance of new technology
  • Montreal Protocol = successful governance, phasing out chemicals that were depleting the ozone layer
  • differential technological development = speeding up the development of protective technologies relative to dangerous ones
  • state risks = arise from an ongoing state of affairs (e.g., asteroids, comets)
  • transition risks = arise during a transition to a new technology or social regime
  • more transition than state risks
  • contraceptive pill made possible by a single philanthropist, Katharine McCormick
  • within 100K years, Earth’s natural systems will have scrubbed our atmosphere clean of over 90 percent of the carbon we have released, leaving the climate mostly restored and rebalanced
  • even if most of the intrinsic value of the world lies in the rest of our ecosystem, humanity’s instrumental value may yet be profound
  • planets are other Earths; stars are other suns, many with their own planets
  • 100B suns; our Earth is just one island amidst billions
  • sunlight hitting Earth’s surface each day carries 5,000 times more energy than modern civilization requires
  • gives in two hours what we use in a year
  • solar collectors in orbit around the Sun?
  • humanity’s potential — scope of what we might someday be able to achieve
  • pure time preference: preference for one benefit over another simply because it comes earlier, and it is a third reason for discounting future benefits (economists don’t like)
  • population ethics: how to compare outcomes that differ in the number or identities of people; increasing someone’s wellbeing is good, reducing it is bad
  • Total View: the moral value of future wellbeing is just the total amount of wellbeing in the future
  • value of having a thousand more generations is a thousand times the value of our generation
  • person-affecting restriction: that an outcome can’t be better than another (or at least not in terms of wellbeing) unless it is better for someone
  • when combining risks, total risk departs more and more from being the sum of the individual risks; there can be substantially increasing marginal returns as we eliminate more and more risks, so it is much more important to work on the largest risks
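
The importance/tractability/neglectedness note above can be sketched as a simple product heuristic. This is an illustrative assumption about the functional form (a common reading of the heuristic, not a formula quoted from the book), and the numbers are made up:

```python
# Hedged sketch of the importance/tractability/neglectedness (ITN) heuristic.
# The multiplicative form and all example numbers are illustrative assumptions.

def priority(importance, tractability, neglectedness):
    """Cost-effectiveness of working on a problem scales with the product
    of how important, tractable, and neglected it is."""
    return importance * tractability * neglectedness

# Two hypothetical problems: equal importance and tractability,
# but one is far more neglected.
well_funded = priority(importance=10, tractability=0.5, neglectedness=0.1)
neglected = priority(importance=10, tractability=0.5, neglectedness=0.9)

print(neglected > well_funded)  # the neglected problem ranks higher
```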
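
The two solar-energy notes above are mutually consistent, and the arithmetic is quick to check: if sunlight delivers 5,000 times more energy per day than civilization uses, a year's worth of our energy use arrives in under two hours:

```python
# Sanity check of the solar-energy notes: a 5,000x daily surplus implies
# one year of civilization's energy use arrives in roughly two hours.

daily_ratio = 5000  # (solar energy per day) / (civilization's energy use per day)
hours_for_year_of_use = 24 * 365 / daily_ratio

print(round(hours_for_year_of_use, 2))  # ~1.75 hours, i.e. roughly two hours
```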
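
The final point on combining risks can be made concrete. Assuming the risks are independent (an idealization for illustration), total risk is 1 minus the product of the survival probabilities, which is why eliminating successive risks yields increasing marginal returns:

```python
# Sketch of the risk-combination point: for independent risks r_i,
# total risk = 1 - prod(1 - r_i), not the sum of the r_i.
from math import prod

def total_risk(risks):
    """Probability that at least one independent risk is realized."""
    return 1 - prod(1 - r for r in risks)

risks = [0.10, 0.10, 0.10]  # three hypothetical independent 10% risks
print(round(total_risk(risks), 3))  # 0.271, less than the naive sum of 0.3

# Marginal benefit of eliminating each risk in turn increases:
print(round(total_risk(risks) - total_risk(risks[1:]), 3))      # 0.081
print(round(total_risk(risks[1:]) - total_risk(risks[2:]), 3))  # 0.09
print(round(total_risk(risks[2:]) - total_risk([]), 3))         # 0.1
```

Each elimination is worth more than the last, which matches the note's claim of increasing marginal returns.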

Adam Marks
