EF41 Show Notes

EF41 (S2-E2): Transhuman Fallibility: The Existential Risk of a Superhuman Future




DEDICATION: Episode EF41 is dedicated to Yuval Noah Harari, a historian and the author of two of the most influential and important books of the last decade. In Sapiens: A Brief History of Humankind, he mines history and leverages concepts from evolutionary psychology to postulate what created the human condition of today. In Homo Deus, he uses the same foundation to extrapolate our technology-driven evolution forward, speculating on what will become of the humans of tomorrow, laying deep roots for the likelihood of transhumanism and coining terms like dataism.

One of Harari’s most prominent arguments is that by the middle of the 21st century, a new class of humans will emerge that he calls “the useless class.” He argues the world is on a crash course with growing unhappiness and unease, caused by a post-work world with nothing to replace work. Harari also projects a growing schism between regular people and superhumans made vastly superior by technological advancements.

At the start of the third decade of the 21st century, we’re already witnessing these predictions begin to come true as AI starts its slow but sure decimation of jobs. Although most people likely won’t soon accept the reality of what’s coming, it’s a chilling thought. The only thought more worrying: is it already too late to stop this so-called progress?



INSPIRATIONS: Episode EF41 is further inspired by Isaac Asimov, Steven Pinker


Will Storia is a graduate student whose Ph.D. work in history and evolutionary psychology is evolving into a manifesto about radically changing the trajectory of civilization. On his show, The Next Evolution, he argues to a global audience that we are at the edge of an abyss, about to hand off what remains of our humanity to machines. His latest guest is John Weber, a revolutionary techie who believes technology will solve all of humanity’s problems. As they publicly clash with the world watching, Will worries how people might twist his ideas, because it’s increasingly obvious that John’s transhuman ideology has already become a religion.

To explore whether technology might be the quick solution for every problem humanity is facing, we must look deep into the core of what makes us tick — human nature. Transhumanism will likely multiply all of our human weaknesses exponentially. Does the hard-wired human desire to seek power over others pose a serious problem for our seemingly inevitable evolution toward the superhuman? Also, Yuval Noah Harari believes dataism — the trend toward reducing humans to our data and trusting AI to make all our decisions — has civilization on a crash course with uselessness. Is this our fate? Or will technology open doors to new heights for humanity?


SOURCES:

Nick Bostrom, Transhumanist Values, nickbostrom.com. https://www.nickbostrom.com/ethics/values.html (Accessed: Oct 13, 2019)

Sarah Shearman, Nanobots kill off cancerous tumours as fiction becomes reality, ft.com. https://www.ft.com/content/57c9f432-de6d-11e7-a0d4-0944c5f49e46 (Accessed: Oct 13, 2019)

Sarwant Singh, Transhumanism And The Future Of Humanity: 7 Ways The World Will Change By 2030, forbes.com. https://www.forbes.com/sites/sarwantsingh/2017/11/20/transhumanism-and-the-future-of-humanity-seven-ways-the-world-will-change-by-2030/#690e48937d79 (Accessed: Oct 13, 2019)

Shelly Fan, Mind-Controlled Nanobots Used to Release Chemicals in Living Cockroaches, singularityhub.com. https://singularityhub.com/2016/09/18/mind-controlled-nanobots-used-to-release-chemicals-in-living-cockroaches/ (Accessed: Oct 13, 2019)

Humanity+, Transhumanist Declaration, humanityplus.org. https://humanityplus.org/philosophy/transhumanist-declaration/ (Accessed: Oct 13, 2019)

Yuval Noah Harari, Sapiens: A Brief History of Humankind. Harvill Secker, 2014.

Yuval Noah Harari, Homo Deus: A History of Tomorrow. Harper, 2017.

EPISODE QUOTES

Any of these quotes make you think? If so, please support the show by sharing…

EF41 (S2-E2): Transhuman Fallibility: The Existential Risk of a Superhuman Future

Human nature is running a drastically outdated operating system: an algorithm that rewards those who increase their power and control over the largest number of people.

Humanity holds in its hands the next phase of evolution — technologies to upgrade our bodies, minds and lives. But this time, we’re in control, for better or for worse.

Humanity stands at the edge of an abyss. Behind us lie fractured civilization, inequality, infighting, and tribalism. Ahead lies transhumanism, where we fully become one with technology and relinquish our final stronghold — actually being human.

We are upgrading ourselves with technology in a misguided effort to make the next evolution of humanity infallible. But can you build perfection on top of faulty code?

The stories our tribes tell tap into the core human emotions and basic desires we all seek. — The Evolve Faster Podcast (Season 2, Episode 2)

Season Two, Episode Two: Technology is like a powerful drug that can either improve you or make you miserable and dependent. Is it our evolutionary fate to become superhuman? Or is the codebase of human nature inherently too fallible for the story of transhumanism to end well?