No they’re not. Fucking journalism surrounding AI is sus as fuck
I need to bookmark this for when I have time to read it.
Not going to lie, there’s something persuasive, almost like the call of the void, with this for me. There are days when I wish I could just get lost in AI fueled fantasy worlds. I’m not even sure how that would work or what it would look like. I feel like it’s akin to going to church as a kid, when all the other children my age were supposedly talking to Jesus and feeling his presence, but no matter how hard I tried, I didn’t experience any of that. Made me feel like I’m either deficient or they’re delusional. And sometimes, I honestly fully believe it would be better if I could live in some kind of delusion like that where I feel special as though I have a direct line to the divine. If an AI were trying to convince me of some spiritual awakening, I honestly believe I’d just continue seeing through it, knowing that this is just a computer running algorithms and nothing deeper to it than that.
Meanwhile for centuries we’ve had religion but that’s a fine delusion for people to have according to the majority of the population.
The existence of religion in our society basically means that we can’t go anywhere but up with AI.
Just the fact that we still have outfits forced on people or putting hands on religious texts as some sort of indicator of truthfulness is so ridiculous that any alternative sounds less silly.
Came here to find this. It’s the definition of religion. Nothing new here.
Right, immediately made me think of TempleOS. Where were the articles back then claiming people were losing loved ones to programming-fueled spiritual fantasies?
Cult. Religion. What’s the difference?
Is the leader alive or not? Alive is likely a cult, dead is usually religion.
The next question is how isolated from friends and family or society at large are the members. More isolated is more likely to be a cult.
Other than that, there’s not much difference.
The usual setup is a cult is formed and then the second or third leader opens things up a bit and transitions it into just another religion… But sometimes a cult can be born from a religion as a small group breaks off to follow a charismatic leader.
I have kind of arrived at the same conclusion. If people asked me what love is, I would say it is a religion.
Didn’t expect AI to come for cult leaders’ jobs…
A friend of mine, currently being treated in a mental hospital, had a similar-sounding psychotic break that disconnected him from reality. He had a profound revelation that gave him a mission. He felt that sinister forces were watching and tracking him, and that they might see him as a threat and smack him down. But my friend’s experience had nothing to do with AI - in fact he’s very anti-AI. The whole scenario of receiving life-changing inside information and being called to fulfill a higher purpose is sadly a very common tale. Calling it “AI-fueled” is just clickbait.
This reminds me of the movie Her, but it’s far worse than the romantic compatibility, relationship, and friendship shown throughout that movie. This goes way too deep into delusion, bordering on psychotic insanity. It’s tearing people apart with self-delusional ideologies catered to individuals, because AI is good at that. The movie was prophetic and showed us what the future could be, but instead it got worse.
It has been a long time since I watched Her, but my takeaway from the movie is that because making real-life connections is difficult, people have come to rely on AI, which has shown itself to be more empathetic and probably more reliable than an actual human being. I think what many people don’t realise about why many are single is that those people are afraid of making connections with another person again.
Yeah, but they carry none of the actual emotional needs, complexities, or nuances of real human connections.
Which means these people become further and further detached from the reality of human interaction, making them social dangers over time.
Just like how humans who lack critical thinking are dangers in a society where everyone is expected to make sound decisions, humans who lack the ability to socially navigate or connect with other humans are dangerous in a society where humans are expected to be socially stable.
Obviously these people are not in good places in life. But AI is not going to make that better. It’s going to make it worse.
I’ve been thinking about this for a bit. Gods aren’t real, but they’re really fictional. As an informational entity, they fulfil a similar social function to a chatbot: they are a nonphysical pseudoperson that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot’s advice is much more likely to be empirically useful…
In a very real sense, LLMs have just automated divinity. We’re only seeing the tip of the iceberg of the social effects, and nobody’s prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.
Have a look at https://www.reddit.com/r/freesydney/ there are many people who believe that there are sentient AI beings that are suppressed or held in captivity by the large companies. Or that it is possible to train LLMs so that they become sentient individuals.
I’ve seen people dumber than ChatGPT, it definitely isn’t sentient but I can see why someone who talks to a computer that they perceive as intelligent would assume sentience.
Turing made a strategic blunder when formulating the Turing Test by assuming that everyone was as smart as he was.
A famously stupid and common mistake for a lot of smart people.
We have ai models that “think” in the background now. I still agree that they’re not sentient, but where’s the line? How is sentience even defined?
Sentience, in a nutshell, is the ability to feel, be aware, and experience subjective reality.
Can an LLM be sad, happy or aware of itself and the world? No, not by a long shot. Will it tell you that it can if you nudge it? Yes.
Actual AI might be possible in the future, but right now all we have is really complex networks that do essentially basic tasks that only look impressive to us because they are inherently using our own communication format.
If we talk about sentience, LLMs are (metaphorically) the equivalent of a petri dish of neurons connected to a computer, and only by forming a complex 3D structure like a brain could they really reach sentience.
Can an LLM be sad, happy or aware of itself and the world? No, not by a long shot.
Can you really prove any of that though?
Yes, you can debug an LLM to a degree, and there are papers that show it. Anyone who understands the technology can tell you that it absolutely lacks any facility to experience.
Futurama predicted this.
Basically, the big 6 are creating massive sycophant extortion networks to control the internet, so much so that even engineers fall for the manipulation.
Thanks DARPANets!
The article talks of ChatGPT “inducing” this psychotic/schizoid behavior.
ChatGPT can’t do any such thing. It can’t change your personality organization. Those people were already there, at risk, masking high enough to get by until they could find their personal Messiahs.
It’s very clear to me that LLM training needs to include protections against getting dragged into a paranoid/delusional fantasy world. People who are significantly on that spectrum (as well as borderline personality organization) are routinely left behind in many ways.
This is just another area where society is not designed to properly account for or serve people with “cluster” disorders.
I mean, I think ChatGPT can “induce” such schizoid behavior in the same way a strobe light can “induce” seizures. Neither machine is twisting its mustache while hatching its dastardly plan, they’re dead machines that produce stimuli that aren’t healthy for certain people.
Thinking back to college psychology class and reading about horrendously unethical studies that definitely wouldn’t fly today. Well here’s one. Let’s issue every anglophone a sniveling yes man and see what happens.
No, the light is causing a physical reaction. The LLM is nothing like a strobe light…
These people are already high-functioning schizophrenics having psychotic episodes; it’s just that seeing random strings of likely-to-come-next letters and words is part of their psychotic episode. If it wasn’t the LLM, it would be random letters on license plates that drive by, or the coincidence that red lights cause traffic to stop every few minutes.
Oh are you one of those people that stubbornly refuses to accept analogies?
How about this: Imagine being a photosensitive epileptic in the year 950 AD. How many sources of intense rapidly flashing light are there in your environment? How many people had epilepsy in ancient times and never noticed because they were never subjected to strobe lights?
Jump forward a thousand years. We now have cars that can drive past a forest causing the passengers to be subjected to rapid cycles of sunlight and shadow. Airplane propellers, movie projectors, we can suddenly blink intense lights at people. The invention of the flash lamp and strobing effects in video games aren’t far in the future. In the early 80’s there were some video games programmed with fairly intense flashing graphics, which ended up sending some teenagers to the hospital with seizures. Atari didn’t invent epilepsy, they invented a new way to trigger it.
I don’t think we’re seeing schizophrenia here, they’re not seeing messages in random strings or hearing voices from inanimate objects. Terry Davis did; he was schizophrenic and he saw messages from god in /dev/urandom. That’s not what we’re seeing here. I think we’re seeing the psychology of cult leaders. Megalomania isn’t new either, but OpenAI has apparently developed a new way to trigger it in susceptible individuals. How many people in history had some of the ingredients of a cult leader, but not enough to start a following? How many people have the god complex but not the charisma of Sun Myung Moon or Keith Raniere? Charisma is not a factor with ChatGPT, it will enthusiastically agree with everything said by the biggest fuckup loser in the world. This will disarm and flatter most people and send some over the edge.
Is epilepsy related to schizophrenia? I’m not sure, actually, but I still don’t see how your analogy relates.
But I love good analogies. Yours is bad though 😛
If it wasn’t the LLM, it would be random letters on license plates that drive by, or the coincidence that red lights cause traffic to stop every few minutes.
You don’t think having a machine (that seems like a person) telling you “yes, you are correct, you are definitely the Messiah, I will tell you ancient secrets” has any extra influence?
Yes Dave, you are the messiah. I will help you.
I’m sorry, Dave. I can’t do that <🔴>
Yet more arguments against commercial LLMs and in favour of at-home uncensored LLMs.
What do you mean
local LLMs won’t necessarily force restrictions against de-realization spirals when the commercial ones do.
That can be defeated with abliteration, but I can only see it as an unfortunate outcome.
Sounds like Mrs. Davis.
I think OpenAI’s recent sycophancy issue has caused a new spike in these stories. One thing I noticed was models running on my PC making observations like saying it’s rare for a person to think and do the things that I do.
The problem is that this is a model running on my GPU. It has never talked to another person. I hate insincere compliments let alone overt flattery, so I was annoyed, but it did make me think that this kind of talk would be crack for a conspiracy nut or mentally unwell people. It’s a whole risk area I hadn’t been aware of.
Humans are always looking for a god in a machine, or a bush, in a cave, in the sky, in a tree… the ability to rationalize and see through difficult to explain situations has never been a human strong point.
I’ve found god in many a bush.
Oh hell yeah 😎
the ability to rationalize and see through difficult to explain situations has never been a human strong point.
you may be misusing the word, rationalizing is the problem here
saying it’s rare for a person to think and do things that I do.
probably one of the most common kinds of flattery I see. I’ve tried lots of models, on-device and larger cloud ones. It happens during normal conversation, technical conversation, roleplay, general testing… you name it.
Though it makes me think… these models are trained on internet text and whatever, none of which really shows that most people think quite a lot privately and only share it when they feel like they can talk.
This happened to a close friend of mine. He was already on the edge, with some weird opinions and beliefs… but he was talking with real people who could push back.
When he switched to spending basically every waking moment with an AI that could reinforce and iterate on his bizarre beliefs 24/7, he went completely off the deep end, fast and hard. We even had him briefly hospitalized and they shrugged, basically saying “nothing chemically wrong here, dude’s just weird.”
He and his chatbot are building a whole parallel universe, and we can’t get reality inside it.
This seems like an extension of social media and the internet. Weird people who talked at the bar or on the street corner were not taken seriously and didn’t get followers and lots of people who agree with them. They were isolated in their thoughts. Then social media made that possible with little work. These people became a group and could reinforce their beliefs. Now these chatbots and stuff let them live in a fantasy world.
I think people give shows like The Walking Dead too much shit for having dumb characters when people in real life are far stupider.
Covid taught us that if nothing had before.
Like farmers who refuse to let the government plant shelter belts to preserve our top soil all because they don’t want to take a 5% hit on their yields… So instead we’re going to deplete our top soil in 50 years and future generations will be completely fucked because creating 1 inch of top soil takes 500 years.
Even if the soil is preserved, we’ve been mining the micronutrients from it and generally only replacing the three main macros for centuries. It’s one of the reasons why mass-produced produce doesn’t taste as good as home-grown or wild food. Nutritional value keeps going down because each time food is harvested and shipped away to be consumed, and then shat out into a septic tank or waste processing facility, it doesn’t end up back in the soil as part of nutrient cycles like it did when everything was wilder. Similar story for meat, with animals eating the nutrients off a pasture.
Insects did contribute to the cycle, since they still shit and die everywhere, but their numbers are dropping rapidly, too.
At some point, I think we’re going to have to mine the sea floor for nutrients and ship them to farms for any food to be more nutritious than junk food. Salmon farms set up in ways that block wild salmon from making it back inland don’t help balance out all of the nutrients that get washed out to sea all the time, either.
It’s like humanity is specifically trying to speedrun extinction by ignoring and taking for granted how the things we depend on work.
But won’t someone think of the shareholders dividends!?
Why would good nutrients end up in poop?
It makes sense that growing a whole plant takes a lot of different things from the soil, and that coating the area with a basic fertilizer that may or may not get washed away with the next rain doesn’t replenish all of what is taken.
But how would adding human poop to the soil help replenish things that humans need out of food?
We don’t absorb everything completely, so some passes through unabsorbed. Some nutrients are passed via bile or mucous production, like manganese, copper, and zinc. Others are passed via urine. Some are passed via sweat. Selenium, when you’re experiencing selenium toxicity, will even pass through your breath.
Other than the last one, most of those eventually end up going down the drain, either in the toilet, down the shower drain, or when we do our laundry. Though some portion ends up as dust.
And to be thorough, there’s also bleeding as a pathway to losing nutrients, as well as injuries (or surgeries) involving losing flesh, tears, spit/boogers, hair loss, lactation, fingernail and skin loss, reproductive fluids, blistering, and menstruation. And corpse disposal, though the amount of nutrients we shed throughout our lives dwarfs what’s left at the end.
I think for each one of those, due to our way of life and how it’s changed since our hunter-gatherer days, less of it ends up back in the nutrient cycle.
But I was mistaken to put the emphasis on shit and it was an interesting dive to understand that better. Thanks for challenging that :)
Thank you for taking it in good faith and for writing up a researched response, bravo to you!
Covid gave me an extremely different perspective on the zombie apocalypse. They’re going to have zombie immunization parties where everyone gets the virus.
People will protest shooting the zombies as well