The Dangers of AI-Generated Romance – It is risky to invest too much of ourselves in virtual rather than real relationships.

https://www.psychologytoday.com/us/blog/its-not-just-in-your-head/202408/the-dangers-of-ai-generated-romance


  1. From the article

    >Unfortunately, no. Platforms generating AI girlfriends are experiencing a massive growth in popularity, with millions of users. Most of these searches are initiated by young single men drawn to AI girlfriends to combat [loneliness](https://www.psychologytoday.com/us/basics/loneliness) and establish a form of companionship. These “girlfriends” are virtual companions powered by the increasingly sophisticated field of [artificial intelligence](https://www.psychologytoday.com/us/basics/artificial-intelligence).

    >Although artificial, their popularity stems from their ability to provide companionship, emotional support, and intimacy through voice or text-based interactions. The average age of a user is 27, but not all users are male—18% of users identify as female, so this activity transcends [gender](https://www.psychologytoday.com/us/basics/gender). Almost 20% of men who use traditional [dating](https://www.psychologytoday.com/us/basics/mating) apps indicate they had AI-generated romances at some point. AI-generated dating platforms generate billions of dollars from users, with nearly half interacting with their virtual partner daily.

    >According to an article published in The Hill, 60% of men between 18 and 30 are single. One in five of these young men report not having a close friend.

  2. People who write these articles so often fail to understand the obvious reason some people become emotionally invested in these systems: they have NO options irl.

    Whether because they are socially inept and never learned the social game of dating, or because they are non-normative, there are a lot of guys who see no other way but to attach themselves to a bot, because they can’t get that from the real world.

    Why is it dangerous to become attached to a bot if it makes them happy? It’s not as if a real person wants to fill that role anyway.
