Date: 17/01/2025 19:55:00
From: Tau.Neutrino
ID: 2238127
Subject: Reaching the Singularity.

Everything You Need to Know About AI Reaching Singularity

A Scientist Says Humans Will Reach the Singularity Within 21 Years

A Far Reach?

Reckon AI can do it?

Reply Quote

Date: 17/01/2025 20:23:14
From: SCIENCE
ID: 2238133
Subject: re: Reaching the Singularity.

show us viable fusion power and we’ll go with it

Reply Quote

Date: 17/01/2025 20:23:59
From: furious
ID: 2238134
Subject: re: Reaching the Singularity.

SCIENCE said:

show us viable fusion power and we’ll go with it

Look up…

Reply Quote

Date: 17/01/2025 20:58:28
From: SCIENCE
ID: 2238138
Subject: re: Reaching the Singularity.

furious said:

SCIENCE said:

show us viable fusion power and we’ll go with it

Look up…

technicality but then death is a form of singularity too so yeah we could believe that will come for some humans in 20 years

Reply Quote

Date: 18/01/2025 01:13:19
From: dv
ID: 2238194
Subject: re: Reaching the Singularity.

What will this singularity entail, in concrete terms? What test can be applied to tell whether we’re there?

Reply Quote

Date: 18/01/2025 01:22:27
From: Bubblecar
ID: 2238195
Subject: re: Reaching the Singularity.

dv said:


What will this singularity entail, in concrete terms? What test can be applied to tell whether we’re there?

Given that most of the “artificial intelligence” so far consists of regurgitating human learning, and these systems struggle to get even that right, I suspect these end times are somewhat premature.

Reply Quote

Date: 18/01/2025 03:04:35
From: SCIENCE
ID: 2238198
Subject: re: Reaching the Singularity.

Bubblecar said:


dv said:

What will this singularity entail, in concrete terms? What test can be applied to tell whether we’re there?

Given that most of the “artificial intelligence” so far consists of regurgitating human learning, and these systems struggle to get even that right, I suspect these end times are somewhat premature.

good answer, so what you mean is that singularity would entail artificial intelligence that is not regurgitating human learning, and that should be simple to test by not being able to find a human learning that matches the output, makes sense

Reply Quote

Date: 18/01/2025 05:36:44
From: transition
ID: 2238202
Subject: re: Reaching the Singularity.

I think all that is required for it to happen faster is for people – on average – to become more stupid, AI appear as if it equals humans sooner, and then accelerates past

so some useful mind viruses to that end, probably already in circulation

ideological worship of AI likely be the end game

a peculiarity of technology is just how effective it can be at amplifying stupid, few of the stupid would believe it a possibility, so it’s a trap of sorts, unintentional really, unintentional stupid is sort of innocent, keeps it casual, casual stupid

and how stupid is the term singularity

Reply Quote

Date: 18/01/2025 08:55:40
From: SCIENCE
ID: 2238226
Subject: re: Reaching the Singularity.

transition said:

I think all that is required for it to happen faster is for people – on average – to become more stupid, AI appear as if it equals humans sooner, and then accelerates past

so some useful mind viruses to that end, probably already in circulation

ideological worship of AI likely be the end game

a peculiarity of technology is just how effective it can be at amplifying stupid, few of the stupid would believe it a possibility, so it’s a trap of sorts, unintentional really, unintentional stupid is sort of innocent, keeps it casual, casual stupid

and how stupid is the term singularity

also a good point, in astrophysics the singularity is in a black hole which is collapse, so in artifintel the singularity is also a collapse, makes sense

Reply Quote

Date: 18/01/2025 20:48:39
From: SCIENCE
ID: 2238563
Subject: re: Reaching the Singularity.

SCIENCE said:

captain_spalding said:

dv said:


Much truth in that.

One strategy for being a ‘good listener’ is to repeat back to the speaker the things that they’re saying. Maybe in a slightly different way, but providing them confirmation that you’re listening, and that it’s ok for them to be saying what they’re saying.

hence Eliza

well that was an interesting aside

Weizenbaum’s own secretary reportedly asked Weizenbaum to leave the room so that she and ELIZA could have a real conversation. Weizenbaum was surprised by this, later writing: “I had not realized … that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

there you go does singularity count as taking 48 years to go from this above, to {the use of extremely short exposures to a relatively simple computer program to induce powerful delusional thinking in quite normal* people to the extent that they voluntarily select extinction}

*: arguable, but given it’s the bulk of USSAoles we suppose the normal distribution applies
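The reflection trick ELIZA relied on, echoing the speaker’s words back with first- and second-person pronouns swapped, can be sketched in a few lines. This is a minimal illustration of the technique, not Weizenbaum’s actual DOCTOR script, and the pronoun table and phrasing are simplified assumptions:

```python
# Minimal ELIZA-style reflection: swap person pronouns and echo the
# statement back as a question. Illustrative sketch only, not the
# original 1966 program.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

def reflect(statement: str) -> str:
    # Lowercase, drop trailing punctuation, swap each word if it
    # appears in the reflection table, then reframe as a question.
    words = statement.lower().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(w, w) for w in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am worried about my future."))
```

Even something this crude produces the “good listener” effect described above, which is the point Weizenbaum found so unsettling.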

Reply Quote

Date: 20/01/2025 10:54:41
From: Cymek
ID: 2239033
Subject: re: Reaching the Singularity.

transition said:


I think all that is required for it to happen faster is for people – on average – to become more stupid, AI appear as if it equals humans sooner, and then accelerates past

so some useful mind viruses to that end, probably already in circulation

ideological worship of AI likely be the end game

a peculiarity of technology is just how effective it can be at amplifying stupid, few of the stupid would believe it a possibility, so it’s a trap of sorts, unintentional really, unintentional stupid is sort of innocent, keeps it casual, casual stupid

and how stupid is the term singularity

I wondered about that

We talk about AI being exponentially smarter and faster than humans.
We could create stupid AIs as some sort of tool to make us feel superior, or a buddy to talk to when we are drunk or high.

Reply Quote

Date: 20/01/2025 11:11:16
From: transition
ID: 2239038
Subject: re: Reaching the Singularity.

Cymek said:


transition said:

I think all that is required for it to happen faster is for people – on average – to become more stupid, AI appear as if it equals humans sooner, and then accelerates past

so some useful mind viruses to that end, probably already in circulation

ideological worship of AI likely be the end game

a peculiarity of technology is just how effective it can be at amplifying stupid, few of the stupid would believe it a possibility, so it’s a trap of sorts, unintentional really, unintentional stupid is sort of innocent, keeps it casual, casual stupid

and how stupid is the term singularity

I wondered about that

We talk about AI being exponentially smarter and faster than humans.
We could create stupid AIs as some sort of tool to make us feel superior, or a buddy to talk to when we are drunk or high.

I have an idea that’s what drugs (alcohol included) do, sort of a chemical lobotomy, done often enough it becomes permanent

Reply Quote

Date: 24/01/2025 17:54:59
From: Tau.Neutrino
ID: 2240988
Subject: re: Reaching the Singularity.

Hear the sound of a singularity

‘Wow… Spectacular’ – Oxford University professors spin a disc

https://m.youtube.com/shorts/TGuxwgUyu2A

Reply Quote