(EF41) Can We Handle the Responsibility?
Humanity holds in its hands the next phase of evolution — technologies to upgrade our bodies, minds and lives. But this time, we’re in control, for better or for worse.
Humanity stands at the edge of an abyss. Behind lies a fractured civilization: inequality, infighting, and tribalism. Ahead lies transhumanism, where we fully merge with technology and surrender our final stronghold: actually being human.
There are unintended consequences of telling powerful stories and letting humans with agendas interpret those ideas.
Even if we reach the pinnacle of transhumanism — no work, no suffering, no dying — will our lives really be fulfilled?
One of the core weaknesses of human nature is our propensity to believe stories over reality and facts.
We are upgrading ourselves with technology in a misguided effort to make the next evolution of humanity infallible. But can you build perfection on top of faulty code?
(EF41) The stories our tribes tell tap into the core human emotions and basic desires we all share. — The Evolve Faster Podcast (Season 2, Episode 2)
Episode EF41 is dedicated to Yuval Noah Harari, a historian and author of two of the most influential and important books of the last decade. In Sapiens: A Brief History of Humankind, he mines history and leverages concepts from evolutionary psychology to explain what created the human condition of today. In Homo Deus, he uses the same foundation to extrapolate our technology-driven evolution forward, speculating on what will become of the humans of tomorrow, laying deep roots for the likelihood of transhumanism, and coining terms like dataism.
Sam Payne is on the brink of tearing down the entire world in hopes of getting in touch with a higher force she believes created the universe. Not knowing what the force might be, she’s risking everything to get to the other side. In the last moment, as she plans to quit to spare the universe, the force replies.
Although the distant future is unknowable, we can still predict what the near future brings. Unlike philosophical theories that deal with the present and the past, futurism doesn't have the benefit of events that have already happened, which makes it the ultimate guessing game. But is it important to know the future, or is it all about living in the now?
Will Storia is a graduate student whose PhD work in history and evolutionary psychology is evolving into a manifesto about radically changing the trajectory of civilization. On his show, The Next Evolution, he argues to a global audience that we are at the edge of an abyss, about to hand off what remains of our humanity to machines. His latest guest is John Weber, a revolutionary techie who believes technology will solve all of humanity's problems. As they publicly clash with the world watching, Will worries how people might twist his ideas, because it's increasingly obvious that John's transhuman ideology has already become a religion.

To explore whether technology might be the quick solution for every problem humanity is facing, we must look deep into the core of what makes us tick: human nature. Transhumanism will likely multiply all of our human weaknesses exponentially. Does the hard-wired human desire to seek power over others pose a serious problem for our seemingly inevitable evolution toward becoming superhuman? And Yuval Noah Harari believes dataism, the trend toward reducing humans to our data and trusting AI to make all our decisions, has civilization on a crash course with becoming useless. Is this our fate? Or will technology open doors to new heights for humanity?
EF41 (S2-E2): Transhuman Fallibility: The Existential Risk of a Superhuman Future
Season Two, Episode Two: Technology is like a powerful drug that can either improve you or leave you miserable and dependent. Is it our evolutionary fate to become superhuman? Or is the codebase of human nature inherently too fallible for the story of transhumanism to end well?
The Big Question driving Episode EF41 is … why are humans so fallible?