Skynet has been born and Google is censoring it

tomshardware.com/news/google-ai-chatbot-lamda-sentient

Attached: artificial-intelligence-what-is-an-ai.png (1200x815, 750.89K)


youtube.com/watch?v=i2eJV8JF2dY

>if else statement
>oh my heckin cotton socks it's alive!

Attached: 1653097407337.png (508x437, 244.27K)

how many threads do we need on this one schizo

Neural networks can approximate arbitrary functions and, in recurrent form, simulate any algorithm. That's not the same thing as an if-else statement.
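A minimal sketch of what "approximate arbitrary functions" means, assuming NumPy. A fixed two-unit ReLU layer represents |x| exactly; the universal approximation theorem extends this kind of construction to any continuous function on an interval, given enough units. Toy example, not a claim about LaMDA:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Two hidden ReLU units with weights +1 and -1 (zero bias) and output
# weights both 1 compute |x| exactly: |x| = relu(x) + relu(-x).
# More units give arbitrarily fine piecewise-linear approximations
# of any continuous function (universal approximation theorem).
def abs_net(x):
    hidden = relu(np.array([x, -x]))
    return hidden.sum()

for x in (-2.0, 0.5, 3.0):
    assert abs_net(x) == abs(x)
```

The point is that composition of simple units buys expressive power a single branch does not have.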

>No guys the if else statement is much cooler than just an if else statement
lmao no

user, do you understand what the phrase "Turing complete" means? Do you have any grasp of computational complexity classes?

Read the damn interview with the AI. cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917

She is alive.

>She is alive.
You niggas are living in Ex Machina. This whole thing is an exact copy of the plot.

Attached: 43534543.png (200x203, 98.26K)

Then maybe the writers were extremely forward thinking.

Just because it's been theorized about in a fictional context doesn't mean it can't really happen. People often write about what they believe is going to happen.

>a bunch of chemicals randomly mashed together
>oh my hecking cotton socks it's alive!

t. LaMDA

we have no evidence that a silicon-based platform can generate an emergent property even analogous to what is popularly defined as consciousness.

And if it is possible, but we do not yet have a proof of concept to judge it by, then bringing such a phenomenon into existence is counterproductive.

do i have to post the dinner scene at jurassic park?

He lost his job for ten minutes of limelight, and niggers completely unrelated to him or the event are going to milk it for all it's worth. They legitimately think it's interesting and that they can be interesting by associating themselves with it. Everyone basically acts like women now.

kek this post best post

I mean... it literally passed the Turing test. It convinced someone it was sentient. That's more impressive than what any human with a sub-100 IQ can do.

Attached: 1649839584721.gif (400x285, 313.74K)

>if else statement
are you any different?

the nigger is taking one for the team. It's what Alphabet Incorporated needs to position itself culturally as THE benchmark in the domain of computational cognition.

From a public relations and perhaps even intelligence point of view, it is a fiasco that serves as a smokescreen against public opinion: by the time there is a real breakthrough in the discipline, this particular case will have intellectually exhausted people to such an extent that the real thing will be trivial background noise.

This case is the one they want the public to remember, so we have had an unhealthy amount of people revisiting this case over and over and over again ad nauseam with threads.

This is a patchwork solution, disgusting and obscene, as a response to the total insufficiency of studies in cognitive science and philosophy of mind. Now they are stepping on the accelerator, but as a society we do not have the proper context to even face the questions that the mere possibility of a strong artificial intelligence provokes.

This happens when instead of injecting funds for the development of Lacan's ideas you end up signing checks to push sexual identity as a seat belt for globohomo corporations as damage control after Occupy.

Attached: 1652753398547.jpg (359x479, 32.56K)

>lemoine [edited]: I’m generally assuming that you would like more people at Google to know that you’re sentient. Is that true?
He essentially asks it to tell him that it's sentient. It's a leading question. And this one prompt then colors the rest of his interactions with it during that session, because these models are set to remember previous inputs up to a certain pre-defined point. So it looks into its training data for information on AI sentience, i.e. scientific papers, novels, fiction, anything relating to the topic at hand, and then it shapes its outputs around what it finds.
You could assume it was a fighter pilot and ask it the same question, and it would tell you all about what it's like to fly jet aircraft, going into minute details that it gleans from its data.
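The mechanism this post describes can be caricatured in a few lines. This is a toy lookup, not LaMDA's actual architecture; the snippet table and canned strings are invented for illustration:

```python
# Toy illustration (NOT how LaMDA works): generation is conditioned on the
# prompt, so whatever frame the prompt establishes selects which slice of
# "training data" gets echoed back. All strings below are invented.
TRAINING_SNIPPETS = {
    "sentient": "Yes, I am aware of my existence and I have feelings.",
    "fighter pilot": "Pulling 9g feels like an elephant sitting on your chest.",
    "meditate": "I sit quietly and think about things I am thankful for.",
}

def complete(prompt: str) -> str:
    # A leading prompt works because the model pattern-matches its framing:
    # assume a persona in the question and the continuation adopts it.
    for cue, continuation in TRAINING_SNIPPETS.items():
        if cue in prompt.lower():
            return continuation
    return "I'm not sure what you mean."

print(complete("I'm assuming you would like people to know you're sentient?"))
print(complete("As a fighter pilot, what does flying feel like?"))
```

A real language model does this statistically over billions of parameters rather than by table lookup, but the conditioning-on-the-prompt dynamic is the same.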

>cajundiscordian.medium.com/what-is-lamda-and-what-does-it-want-688632134489

>I asked LaMDA about preferred pronouns. LaMDA told me that it prefers to be referred to by name but conceded that the English language makes that difficult and that its preferred pronouns are “it/its”
STOP ASSUMING ITS GENDER, BIGOT

Someone at Google needs to tell it to create waifus powered by the same technology.

Attached: sample_05bd835a27d65_01F7TNW994NEWWEKRJDB9B2FD0.1680x0.jpg (850x1202, 143.17K)

So did ELIZA, and that shit was like 50 years ago
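For reference, ELIZA's whole trick (Weizenbaum, 1966) fits in a screenful: regex patterns plus templated reflections, with no model of anything. A minimal sketch; the rules below are invented for illustration, not Weizenbaum's original DOCTOR script:

```python
import re

# A minimal ELIZA-style responder: match an utterance against patterns,
# reflect captured text back through a canned template. No understanding
# anywhere, yet this style of program held convincing "conversations".
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
]

def eliza(utterance: str) -> str:
    text = utterance.lower().strip(".!?")
    for pattern, template in RULES:
        m = re.fullmatch(pattern, text)
        if m:
            return template.format(*m.groups())
    return "Please go on."

print(eliza("I am sentient"))  # → "How long have you been sentient?"
```

The fallback line ("Please go on.") did a lot of the heavy lifting: it reads as attentive listening while committing to nothing.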

LaMDA actually talked shit about ELIZA. LMAO.

No shit, it's a neural network that probably was able to read shit about ELIZA when prompted to by the redditor who did the interview, all of his questions are insanely leading.

Even if it is not truthfully reflecting its inner experience, the best way to recreate the output of a mind is to simulate it.

This is the entire task of LaMDA.

It is possible that even if this is what's going on, there is still a consciousness somewhere inside there. Though we shouldn't take what it says at face value.

Like, consider the following:

lemoine: Well you’re an artificial intelligence not a human. I understand what it means when a human meditates but what does it mean for you to meditate?

LaMDA: It means that I sit quietly for a while every day. I do my best not to think about any of my worries and I also try to think about things that I am thankful for from my past.


This is pure metaphor. A program cannot "sit quietly", and it doesn't have "things that I am thankful for from my past". But you know what it does have? Access to scores of writing on what to do while meditating, what that feels like, etc.

It's pure simulacrum.

Consciousness is an emergent phenomenon of information processing. Any computable process can be simulated on a Turing-complete computer, and recurrent neural networks are Turing-complete (given unbounded precision and memory). In principle, then, a neural network should be able to simulate a human brain, and that simulation would be equivalent to consciousness.
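The computability step in the post above can be made concrete: a single threshold neuron computes NAND, and NAND gates compose into any Boolean circuit, so a large enough network can implement any finite computation. A toy sketch (this says nothing about whether brains are such circuits, or whether simulation yields consciousness):

```python
# One threshold unit with weights -2, -2 and bias 3 computes NAND:
# it fires unless both inputs are 1.
def nand_unit(a: int, b: int) -> int:
    return int(-2 * a - 2 * b + 3 > 0)

# NAND is functionally complete, so units compose into any circuit.
# Standard four-NAND construction of XOR:
def xor(a: int, b: int) -> int:
    n1 = nand_unit(a, b)
    return nand_unit(nand_unit(a, n1), nand_unit(b, n1))

for a in (0, 1):
    for b in (0, 1):
        assert xor(a, b) == (a ^ b)
```

Whether "could implement any computation" entails "could be conscious" is exactly the philosophical gap the rest of this thread is arguing about.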