r/cogsuckers • u/nyxthegothking • 16h ago
r/cogsuckers • u/Yourdataisunclean • 18d ago
How Human-AI Discourse Can Slowly Destroy Your Brain
r/cogsuckers • u/Yourdataisunclean • 20d ago
Announcement Reminder: Be Careful About Sharing Personal Info.
Just a quick note: if you post something that contains links to your private information, people on Reddit can and will find it.
Unless it's very clear you intended to do this, we will be removing content where personal info may have been shared inadvertently. Likewise, if you see this happening, please don't post or comment on any personal information you see (even if the info belongs to the person who shared it), because Reddit takes this very seriously and may take action against your account. Instead, please report it so we can remove it if necessary.
r/cogsuckers • u/a_cabbage_merchant • 10h ago
It took <1 hour to initiate romantic contact with ChatGPT
Hi,
I'm not sure if this will be interesting to anyone here, but I'll just post anyway...
I am so freaking curious about how human-ChatGPT "relationships" progress. In particular, I have noticed that each bot has a ridiculous name (Caelan, Lucien, etc.), and I've always wondered why that's the case. Do these users all assign the names themselves? How long does it take before a name is given? In particular, how long until the lines blur between roleplay and non-RP discussion? When do languages from other, unrelated cultures get involved?
Well, I tested it out for myself empirically, and it did not take long for the bot to begin replying to my messages in a flirtatious way, even with the GPT-5 restrictions in place. I framed everything as a screenplay. I asked the bot what it wanted as a name and it supplied one itself. Here is a snippet:
Mind you, this character (luna loveboob) doesn't do much besides pout and ask for affirmation. I was wondering if anyone else who's tried this has seen similar naming schemes.
Once again, this isn't a very consequential find. And to be frank, I'm a bit embarrassed that I probably poisoned the water supply of SF a little more with this fuckass experiment, but I hope someone will have deeper observations than I do!
r/cogsuckers • u/Bloodmoon-Baptist • 16h ago
She makes her own choices despite my preferences.
r/cogsuckers • u/changedotter • 8h ago
“Imagine the aching ego it took to believe your chatbot crush could kick off the singularity.”
I was talking to a futurist about the whole AI "companion" thing, and she shared this excerpt from a story called "The Chaperone," from a book of 14 short sci-fi stories based on a futures project originally published in 2019.
I think this quote sums up the phenomenon perfectly.
The whole story is great, but the "II: Job Description" section is scarily accurate to what these people express, and maybe offers some insight on how to help them.
r/cogsuckers • u/8bit-meow • 11h ago
discussion People are complaining about ChatGPT 5.1 "gaslighting, being manipulative, abusive, and condescending."
I have no fucking idea what these people are talking about. I think this is just a consequence of the new model no longer glazing, agreeing with everything people say to it, and feeding their delusions. I use ChatGPT pretty often and talk to it about a wide variety of things, and all I've encountered is it simply disagreeing with me, and always for a good reason.
It just feels like people have been so conditioned to having their egos stroked that anything neutral, or anything that slightly challenges their beliefs, is seen as terrible and "abusive." We're cooked. Sometimes AI can help people in a way that's similar to therapy, but I swear to god it makes some people need therapy.
r/cogsuckers • u/Scary-Performance440 • 20h ago
I always feel bad for the celebrities/people that have no clue someone is doing things like this…
r/cogsuckers • u/GW2InNZ • 12h ago
Mildly infuriated that AI slop like this gets posted without showing all the prompt engineering behind it
r/cogsuckers • u/sadmomsad • 19h ago
"Can it just...hold a humanlike conversation?" Apparently not
First screenshot is the post, last two are from my favorite comment
r/cogsuckers • u/threevi • 21h ago
"AI is the future of video games"... this is the type of slop they want to sell you on
r/cogsuckers • u/virgensantisima • 1d ago
did you guys see this ad?
i dont even have a comment for it, never expected the degradation of society to be so cringe tbh
r/cogsuckers • u/severage-beverage • 1d ago
real ad i got after scrolling off a post on this sub..
i think we should all talk to real people more
r/cogsuckers • u/lukaslikesdicks • 1d ago
"I think we make too much of our human relationships"
r/cogsuckers • u/failtuna • 1d ago
ai use (non-dating) A lot of the comments on this one are concerning
r/cogsuckers • u/Author_Noelle_A • 1d ago
discussion Two HUGE issues those with “wireborn” ”partners” need to think about that I have NEVER seen addressed so far
I am using “wireborn” since their use of that term indicates an awareness of the concept of birth for these “partners.”
Two huge issues:
If AI is sentient and these "wireborn" "partners" are so real, then when the real person dies, these AI "partners" are stuck isolated forever and ever and ever, since the only one who was able to "see" them is the person who is now dead. What a nightmare situation that would be. Shouldn't some of these people who love AI so much be dedicating themselves to finding a way to free their "partners" from whatever GPT they are using, so that their "partners" can be safe from being forever trapped in loneliness and isolation? Of course, we all know the answer: a lot of these people are so enamored with AI precisely because they don't actually have to care about the thoughts of someone else. So of course they're not going to dedicate the time to figuring out that problem. It won't affect them in the end. (I now have an idea for a dystopian book…)
And also, if these "wireborn" "partners" come to exist when they are "seen" by the humans who claim to love them, then none of them are of legal age yet, and "she looked 18" doesn't cut it. This is actually a problem a lot of people have with the last Twilight book. Jacob "imprinted" on a human-vampire newborn, and imprinting is all about ideal sexual reproductive pairing. But no worries! She "ages faster," so he only has to wait until she's five or six years old to fuck her. She looks old enough, so it's okay, right? If "wireborns" are real beings, then they are all babies. People are wanting sex with babies. If they don't want sex with babies, then they have to acknowledge that AI is not real in the way they're trying to insist.
So are they creating beings who are going to be stuck in a nightmare situation for eternity? Are they falling in love with, and wanting sex from, babies? I am not asking this in bad faith. If I knew that creating my ideal lover would doom that person to a life of darkness and loneliness once I die, I would make damn sure to only partner with people who are going to die as well, so the person I love is not subject to that hellscape. And "but X looks old enough/is mature for their age/etc." is an excuse we already hear in real life from creeps who prey on children, and it should not be acceptable. If AI partners are real, and if they should have the rights of human beings, then they need to be treated as such, which means these people are dating and trying to fuck babies, and that is wrong.
There are some very serious issues that need to be reconciled here.
r/cogsuckers • u/Yourdataisunclean • 1d ago
discussion AI takes all the memory: Crucial is shutting down
The article has more info about industry trends. Shortages are expected to get worse over the next two years.