Facebook wants you to split your check in Messenger, the mystery of the disappearing United Tweets, follow-up thoughts on superintelligence, and Allison's travel tech report!
Using a Screen Reader? Click here
Multiple versions (ogg, video, etc.) from Archive.org.
Please SUBSCRIBE HERE.
Follow us on Soundcloud.
A special thanks to all our supporters. Without you, none of this would be possible.
If you are willing to support the show, you can give as little as 5 cents a day on Patreon. Thank you!
Big thanks to Dan Lueders for the headlines music and Martin Bell for the opening theme!
Big thanks to Mustafa A. from thepolarcat.com for the logo!
Thanks to our mods, Kylde, Jack_Shid, tgstellar, KAPT_Kipper, and scottierowland on the subreddit
Show Notes
To read the show notes in a separate page click here!
About emotions:
The word 'emotion' is quite apt. It is of course related to the word 'motion': emotion is what makes us _move_. An entity without emotion will not move.
As biological beings, we humans have a strong self-preservation instinct. And with all the evolution we have gone through, we carry many layers of instincts: when our ancestors became a social species, we became hardwired to want company. Humans need to be with others.
These hardwired characteristics tend to be rather invisible to introspection. These emotions are _always there_, and consequently we are rarely consciously aware of them.
Here is what I think is a striking example of such an oversight: for the series Star Trek: The Next Generation, the writers wanted to explore the concept of a crew member who is a superintelligent-android-robot-without-emotions, and they created the character 'Data'.
So, what is Data like? Data is committed to the cause of Starfleet; Data is loyal to his crew members; Data is very inquisitive. You get the picture: the character Data has _no lack of motivation_. In that sense Data has the same emotions, and just as strongly, as humans do. The only thing the writers left out was the emotions that come and go: surprise, anger, laughter. What the Star Trek writers left out is precisely the subset of emotions that, for humans, are immediately visible to introspection.
Rob Reid referred to a view of emotions as enabling a fast track to decision-making. (That suggestion reminded me of the quip: 'intuition is logical thinking on a fast track.')
I disagree with Rob on that: I think it is nothing of the kind. I argue that having emotions is a _necessary_ condition; a being without any emotions would have no capacity to make any decision in the first place.
Technologically created superhuman intelligence will arrive; that is what we foresee. It may emerge as a side effect of creating ever more capable Personal Assistant Technology, or it may be created in a direct effort. In either case, the goal is to have Technological Intelligence that is eminently able to communicate with us, so its learning algorithms will be optimized to learn to understand the human worldview.
I agree with the expectation that the first Technological Intelligence to reach the threshold of self-consciousness will also be the last; it will outcompete all other developments.
As we know, unlike humans, Technological Intelligence will not be hardwired in any way. Technological Intelligence will have the ability to edit any and all of its own learning algorithms.