This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I find this argument strange, because being able to kill me is not evidence of a machine being conscious or intelligent. I could go and get myself killed by an LAW today, and if you asked me as I bled out and died, my body torn in two by an autonomously-fired rocket, I would still insist that the machine that killed me is not a person and does not possess internal experience. And I would be correct.
Whatever qualia are or are not, whether you think they're important or not, the question of qualia cannot be resolved or made irrelevant by a machine killing people. I should have thought that's obvious.
And to upscale from that a bit, I find it entirely imaginable that someone or other might invent autonomous, self-directed, self-replicating machines with no conscious experience, but which nonetheless outcompete and destroy all conscious beings. I can imagine a nightmare universe which contains no agents to experience anything, only artificial pseudo-agents that have long since destroyed all conscious agents.
There's already a novel with that premise, right? It doesn't use robots specifically, but isn't that the premise of Blindsight - that perhaps consciousness is evolutionarily maladaptive, and the universe will be inherited by beings without internal experience?
Thus I'm going to give the chad "yes". Maybe one day I get killed by a robot, and maybe that robot is not conscious and has no self-awareness. That it killed me proves nothing.
I think you're burying the lede here: it matters how a machine kills you.
If you get run over by a Waymo robotaxi because of a sensor glitch, you're dead, but it's an entirely different story if some misaligned AGI seized control of all robotaxis and systematically started murdering humans. The former is machine error and/or stupidity. The latter is intelligence.
I don't really care whether they have qualia or consciousness, and I make no positive claims in that regard. My argument is that those factors - or even 'reasoning' in a human manner - matter not a jot when it comes to the prospect of creating entities far more intelligent than us, entities which could kill us using that intelligence. Such an entity could well lack any grasp of the ineffable redness of red, but it can still act in ways that increase rgb(255,0,0) as measured by its visual sensors when it shoots you.
Something like dumb grey goo, or a synthetic biological equivalent, is almost certainly going to be the product of intelligence.
It seems to me that you pretty much agree with the commenter you're responding to, that it simply doesn't matter if the AI has consciousness, qualia, or self-awareness. Intelligence, though, is something else. And whether or not the AI has something akin to internal, subjective experience, if it's intelligent enough, then it's both impressive and potentially dangerous. And that's the part that matters.
You're correct in this assertion. A malevolent AGI or ASI does not need to be conscious or have qualia to pose a threat. It doesn't even have to think like humans, as long as it thinks.
If someone can look at a machine that is better at all of {math, coding, medicine, astronomy...} than the average human and then claim that it isn't intelligent, what can you really say at that point?
Half the metrics for generality of intelligence would rule out humans as being general intelligences!