Book Review & Discussion : Superintelligence

Tue 26 October 2021, 19:00 – 22:00

Online Event

In this event, you’ll learn:


Why we’re much closer to superintelligence than history would indicate
Which project should be the model case for developing superintelligence
How we can keep superintelligence from turning against us
Why superintelligence would make all human work obsolete
How we’d keep ourselves busy in such a utopian world
What sparrows can do to live with an owl and how that relates to our approach to AI


About the Author


Nick Bostrom is a Swedish-born philosopher and polymath with a background in theoretical physics, computational neuroscience, logic, and artificial intelligence, as well as philosophy. He is a Professor at Oxford University, where he leads the Future of Humanity Institute as its founding director. (The FHI is a multidisciplinary university research center; it is also home to the Center for the Governance of Artificial Intelligence and to teams working on AI safety, biosecurity, macrostrategy, and various other technological and foundational questions.) He is the author of some 200 publications, including Anthropic Bias (2002), Global Catastrophic Risks (2008), Human Enhancement (2009), and Superintelligence: Paths, Dangers, Strategies (2014), a New York Times bestseller which helped spark a global conversation about artificial intelligence. Bostrom's widely influential work, which traverses philosophy, science, ethics, and technology, has illuminated the links between our present actions and long-term global outcomes, thereby casting a new light on the human condition.

He is the recipient of a Eugene R. Gannon Award and has been listed on Foreign Policy's Top 100 Global Thinkers list twice. He was included on Prospect's World Thinkers list as the youngest person in the top 15. His writings have been translated into 28 languages, and there have been more than 100 translations and reprints of his works. He is a repeat TED speaker and has done more than 2,000 interviews with television, radio, and print media. As a graduate student he dabbled in stand-up comedy on the London circuit, but he has since reconnected with the doom and gloom of his Swedish roots.


Overview


Nick Bostrom is one of the cleverest people in the world. He is a professor of philosophy at Oxford University, and was recently voted 15th most influential thinker in the world by the readers of Prospect magazine. He has laboured mightily and brought forth a very important book, Superintelligence: paths, dangers, strategies. I hesitate to tangle with this leviathan, but its publication is a landmark event in the debate which this blog is all about, so I must.

I hope this book finds a huge audience. It deserves to. The subject is vitally important for our species, and no-one has thought more deeply or more clearly than Bostrom about whether superintelligence is coming, what it will be like, and whether we can arrange for a good outcome – and indeed what “a good outcome” actually means.

It’s not an easy read. Bostrom has a nice line in wry self-deprecating humour, so I’ll let him explain:

“This has not been an easy book to write. I have tried to make it an easy book to read, but I don’t think I have quite succeeded. … the target audience [is] an earlier time-slice of myself, and I tried to produce a book that I would have enjoyed reading. This could prove a narrow demographic.”


Ideas in This Book


History shows that superintelligence – a technology more intelligent than any human being – is fast approaching.
The history of machine intelligence over the past half-century has had its ups and downs.
Superintelligence is likely to emerge in two different ways.
Superintelligence will emerge either quickly, through a single project achieving strategic dominance, or slowly, through long collaborative efforts.
We can prevent unintended catastrophes by programming superintelligence to learn human values.
Intelligent machines will probably replace the entire human workforce.
In the superintelligent future, the average person will be impoverished or reliant on investments, while the rich buy new luxuries.


Organiser

farah

E-mail: farah.toys8374@gmail.com