• 0 Posts
  • 15 Comments
Joined 3 years ago
Cake day: November 8th, 2021

  • Still less criminal than the administration, so this claim stands on very shaky ground.

    All current drug problems were created by Republicans, who invented the drug war in the '70s intentionally to curtail the free speech of Vietnam War protesters.

    The over-criminalization of drugs created the environment that directly led to fentanyl becoming an optimal border-crossing narcotic.

    Drug dealers are more respectable right now than all administration members combined, even the “illegal” ones.

    It is all very dubious for the most powerful country in the world, which rapes the entire planet for mineral resources, to call any human “illegal”.

    I speak for all humans when I say this planet would be a lot better off without the memetic infections known as America, China, Europe, India, and Russia.

    Maybe if they all had a nuclear fireworks party, the survivors would have the opportunity to learn not to build such monstrous egregores.


  • You didn’t, I did.

    The article discusses a study by OpenAI and MIT Media Lab revealing that heavy ChatGPT users — dubbed “power users” — are showing signs of addiction to the chatbot. These signs include preoccupation, withdrawal, mood changes, and loss of control. While most users don’t form emotional connections with ChatGPT, those who use it more frequently tend to view it as a “friend,” especially if they feel lonely or emotionally unfulfilled.

    Interestingly, people using ChatGPT for personal reasons (like sharing emotions) showed less emotional dependency than those using it for non-personal tasks (like brainstorming). Also, text-based interactions led to more emotional language than voice interactions, with brief voice use correlating to better well-being. Overall, prolonged use — regardless of purpose — increased the risk of emotional dependence.

    You said:

    Make a list of the main points and for each explain why it will be worse when it is society wide

    ChatGPT said:

    Here’s a list of the main points from the article, along with why each could be more concerning on a society-wide scale:

    Heavy ChatGPT users show signs of addiction (preoccupation, withdrawal, mood changes, loss of control):
    
    Why it's worse at scale:
    
    If large numbers of people become dependent on AI tools, especially to the point of addiction, it could lead to widespread mental health issues, reduced productivity, and a society increasingly unable to function without AI assistance.
    
    Long-term users tend to form emotional bonds with ChatGPT, sometimes seeing it as a “friend”:
    
    Why it's worse at scale:
    
    A society where many people substitute AI for real human connection could see rising loneliness, social detachment, and a decline in community and interpersonal relationships. This weakens social bonds and may impact mental health and civic cohesion.
    
    Lonelier individuals are more likely to use ChatGPT heavily and become emotionally attached:
    
    Why it's worse at scale:
    
    This could create a feedback loop where loneliness drives AI use, and AI use further isolates people from real-world interactions, exacerbating social alienation on a mass level.
    
    People stress out over subtle changes in ChatGPT’s behavior:
    
    Why it's worse at scale:
    
    If millions of people are emotionally reactive to minor algorithmic changes, this gives tech companies enormous influence over collective emotional states. It also creates societal instability, as shifts in AI behavior could unintentionally trigger widespread anxiety or confusion.
    
    Emotional language is used more in text interactions than voice, and brief voice use is associated with better well-being:
    
    Why it's worse at scale:
    
    If the most popular and accessible form of interaction (text) increases emotional entanglement, and the healthier option (brief voice use) is less utilized, we might unintentionally nudge society toward less emotionally healthy patterns of AI interaction.
    
    Users who engage with ChatGPT for non-personal tasks are more emotionally dependent than those who use it for personal reasons:
    
    Why it's worse at scale:
    
    This is counterintuitive and dangerous because it suggests people may not realize they're forming dependencies while doing routine work. If AI tools meant to boost productivity are quietly making users emotionally reliant, it could have long-term effects on workforce dynamics and mental resilience.
    
    Prolonged usage increases risk of emotional dependence, regardless of how it's used:
    
    Why it's worse at scale:
    
    As AI becomes embedded in everyday life, the likelihood of widespread emotional dependence increases. This could shift power dynamics between humans and AI, making people less autonomous and more easily influenced by algorithmic systems.
    

    Let me know if you want this formatted for a presentation, policy paper, or discussion.