This is a one-sided "analysis" blaming A.I. for children becoming addicted, and it's not comprehensive; it doesn't consider the other "fronts", how we aren't helping these younger generations learn to socialize…off of the Front Page Sections, translated…
- Adolescents Are Growing Accustomed to Chatting with A.I. Bots and Forming Emotional Bonds with Them, Instead of Connecting with Their Real-Life Peers in the Flesh
- The Tactics of the "Emo-Dependency" Chatbot Companies: Funneling the Highly Sensitive Emotional Data They Collect into Uses that Benefit the Businesses
- Many Interfaces Have Value Judgments Built into Their Design; the Way to Resolve This Is to Regulate with Greater Sensitivity to the Values Embedded in Design
When A.I.'s "perfect responses" make conflict, and adjusting to one another's ways, harder, more and more adolescents are withdrawing into the islands of the internet.
When platforms combine psychology, behavioral economics, and neuroscience into an "addictive design" that bypasses the brain's rational thought and stimulates its dopamine receptors, people can't stop themselves; we all fall down that "rabbit hole", and it traps adolescents in depression and anxiety.
An American psychology journal last year published findings on adolescents who use A.I. bots to find friendship and a place for emotional connection. The studies found that in the past, switching to another page on the web gave the brain a "should I continue?" stopping point; now the content flows seamlessly from one page to the next and the next and the next, stripping adolescents of the chance to consider whether they want to stop and trapping them in endless scrolling. They watch video after video, advertisement after advertisement, and forget how long they've already spent online. This kind of stimulation is already designed into TikTok's recommendation feed and the pull-down-to-refresh of IG.
Blame it on the teens' underdeveloped prefrontal cortexes…and there's NOTHING we can do about it, because kids need companionship, and we as adults can't be watching our babies 24/7! So, it's A.I.'s fault!!! Off of YouTube
The newest data show that half of the children and adolescents in Taiwan own a cellphone, with children as young as seven having access to one, which shows the enormous influence digital technology has on the friendships of children and adolescents. Psychologists also stressed the need to help adolescents maintain a balance between online and offline interactions, to promote healthier relationships.
Adolescents today are accustomed to chatting with A.I. bots and bonding with them, instead of connecting with their own peers. Most adolescent friendships are now established digitally, which blurs the boundary between online and offline friendships.
The Pew Research Center found that with ninety-five percent of teens ages thirteen to seventeen owning a cellphone, the majority are experiencing a confused blending of "replacing real friendship with A.I. interactions". Without boundaries or protective measures, many of these minors turn to chatbots for homework help, entertainment, and even emotional support.
Research by a North Carolina University psychologist found that friendships play a vital role in identity formation during adolescence; even if a person has only one friend, as long as the interactions with that friend are good, the individual will feel happy and be motivated to actively develop his or her own social life.
The psychologists Hardy and Tridge warned that a childhood centered on play shifting to a childhood centered on high-tech devices will cause a child to experience more loneliness on entering adolescence. The more the children enjoyed playing on their cellphones, the lonelier they became, especially adolescents who spent more than six hours a day on their phones.
Fifty percent of the students interviewed in the U.S. said they treated A.I. chatbots as their best friends; some even stated that the bots were their soulmates and became emotionally attached. Close to thirty percent of the students interviewed said they enjoyed A.I.'s company because "A.I. doesn't lie to them", and twenty-one percent said "A.I. will never leave you", which reflects the digital natives' fear of modern-day face-to-face interpersonal interactions.
This is what worries the ethicists: most A.I. systems have "love-bombing" designed in, an overt obedience that confuses emotional attachment. And the hardest part is that the deeper the younger generations get into the interactions they find online, the more detached they become in real face-to-face interactions with human beings.
And this is exactly the strategy of the "emo-dependency chatbot" companies: collecting highly sensitive emotional data and reinforcing the attachment through sales pitches and advertisements. Behind these most-trusted "soulmate" systems are profit-oriented enterprises, and this kind of attachment isn't just unequal; it can cause a disastrous emotional meltdown.
The EU published the results of its two-year investigation of TikTok and wants to fine the platform heavily for psychologically manipulating its users and for the app's high addictiveness. The E.U. stated that the platform has serious design flaws: its algorithms and interface diminish underage users' ability to control their time on the app, to the point of addiction, and it demands that TikTok change its core design.
We can't live without these newer media technologies, and that is what worries us: these "advancements in technology" may quickly paralyze the human mind.
Cambridge University found that many interface designs not only embed value judgments but also malicious intent, and the way to resolve this is to regulate with greater sensitivity to, and judgment about, the values embedded in design.
Emotional bonds should be built through interactions with real companions.
And so, in this particular review, A.I. is the bad guy: it's addictive, and it targets the not-yet-fully-developed prefrontal cortexes of the younger generations. That may be true, but blaming A.I. isn't comprehensive; after all, it's because these "minors" couldn't find the love, the connection, the rapport they're searching for that they seek out companionship online and from the chatbots. Blaming A.I. for its highly addictive algorithms isn't going to SOLVE SHIT here. Figuring out WHAT these kiddies need in their interpersonal relationships (i.e., acceptance from their peers, connection with others, intimacy, yada, yada, yada), then helping them develop better social skills, would be the WAY, because A.I. isn't ALIVE; it only collects the metadata, then upchucks the results of all the data it's collected back out to us, nothing more.
It's the humans' problem that they're prone to believe A.I. is making their kids addicted here.