Here are some of the highlights, as I recall, from this past Tuesday:
Both pieces ended on what seemed like different notes: Moravec sounded like something of a mystic or along the lines of a Buddhist or Hindu, with a much more positive slant to what he was saying, whereas Vinge seemed to express a sense of impending doom, or at least a worrisome outlook.
Some questions about motivation: What would the motivation of a superintelligent being (of the sort that the Singularity is characterized to be) be like? Human and animal motivation is shaped in large part by the need to find food and take care of other basic needs. What about an artificial superintelligence?
Some questions about intelligence: How do we define intelligence? What characteristics are essential for a recognizable form of intelligence (e.g., creativity, inspiration, nostalgia)? Could the Singularity possess these characteristics? In what way is the form of intelligence characteristic of the Singularity supposed to be beyond our ken? The form of intelligence of a mature adult human is beyond the ken of a baby human. Is there supposed to be a difference in the case of the Singularity's being beyond our ken? What is this difference?
Some questions pertaining to our supposed inability to predict what the Singularity would be like:
1. With a new sort of intelligence, the Turing test won't apply. What sort of continuity is there between them?
2. Epistemological claim about our predictions: there will be an event beyond which we cannot predict where things will go. Might the ignorance be connected to question 1?
3. What makes the Singularity unique? We cannot predict future theories of our own even now. So what's the difference between the uncertainties we face every day and the ones this possibility presents?
How is the concept of the singularity already a projection into the future of what we already know? How would we recognize it? Might it already exist, and we don’t know yet?
On some conceptions, the Singularity seems to transcend individuality. Is this a difference between our conception of ourselves as humans and the kind of entity that the Singularity is supposed to be? Does it factor into issues about the desirability of the coming of the Singularity?
Why the Singularity might scare us: A future where people aren't running things anymore is fundamentally different from our present. We might no longer be at the center of things. AI would be scary because it has no continuity with our current existence. A future superintelligence might be hostile toward humans.
But is the Singularity to be feared? Would a superintelligence (necessarily, most likely) respect biodiversity, the rights of other creatures, and so on? Would it recognize moral values? Would it be a moral exemplar?
The contrast between Artificial Intelligence (AI) and Intelligence Amplification (IA), in Vinge, was very interesting: Which is the more plausible route to the Singularity? Which is the more desirable, from the perspective of our own well-being as humans? How discontinuous would the Singularity be with human existence if it arose in this way, as opposed to through more traditional AI? Does IA lead to something like a hive mind, or a superintelligence that takes a cue from the Gaia hypothesis?
Would the Singularity (or any other superintelligence) become bored? What characteristics might cause or prevent this? What sort of immortality would it have? What importance does the fact that even a superintelligence has a physical base have with respect to its longevity prospects?
Some different issues:
1. Could there be a different kind of entity that is super-intelligent?
2. Could it be immortal?
3. Could I be immortal in the sense that I have these super-enhanced capabilities?
An irony:
Psychology teaches us that those who are deeply religious live longest, so, ironically, the people who live the longest would not believe in a Singularity (on the assumption that this is not something the religious believe in).
Nietzsche came up a few times: How does he describe the Ubermensch? How does the Ubermensch relate to the Singularity, if at all?
The notion that it might be our function to enable the development of the Singularity also came up: What sense of 'function' is in play here? What does this imply about our relationship to the Singularity (causal, normative)? What about the Singularity's relationship to us (ancestor worship, fuel)?