Human-AI dialogue is relatively well understood in terms of trust and companionship. However, the role that attachment-related functions and experiences play in such relationships remains unclear. Researchers at Waseda University have now developed a novel self-report scale and highlighted the concepts of attachment anxiety and avoidance toward AI. Their work is expected to serve as a guiding framework for further exploring human-AI relationships and for incorporating ethical considerations into AI design.
Artificial intelligence (AI) is everywhere these days. As a result, human-AI interactions are becoming more frequent and more complex, and this trend is expected to accelerate. Accordingly, scientists have made significant efforts to better understand human-AI relationships in terms of trust and companionship. However, these human-machine interactions can also be examined through the lens of attachment-related functions and experiences, concepts traditionally used to explain human interpersonal relationships.
In a recent study comprising two pilot studies and one formal study, a team of researchers at Waseda University, Japan, from the Faculty of Letters, Arts and Sciences, applied attachment theory to examine human-AI relationships. Their findings were published online in the journal Current Psychology on May 9, 2025.
Mr. Yang explained the motivation behind the research: "As researchers in attachment and social psychology, we have long been interested in how people form emotional bonds. In recent years, generative AI has become stronger and more perceptive, offering not only information but also a sense of security. These characteristics are why attachment to AI has demanded our attention."
Specifically, the team developed a new self-report scale called the Experiences in Human-AI Relationships Scale, or EHARS, to measure attachment-related tendencies toward AI. They found that some people seek emotional support and guidance from AI, much as they do from other people. About 75% of participants turned to AI for advice, while about 39% perceived AI as a constant, dependable presence.
The study distinguished two dimensions of human attachment to AI: anxiety and avoidance. A person with high attachment anxiety toward AI needs emotional reassurance and harbors a fear of receiving inadequate responses from AI. In contrast, high attachment avoidance toward AI is characterized by discomfort with closeness and a preference for emotional distance from AI.
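To make the two-dimensional structure concrete, the sketch below scores hypothetical Likert-style responses into separate anxiety and avoidance subscales. The item assignments, item count, and 7-point response format are illustrative assumptions, not the actual published EHARS instrument:

```python
# Hypothetical scoring sketch for a two-dimensional attachment scale.
# Item indices, item counts, and the 1-7 Likert format are assumptions
# for illustration, not the published EHARS items.

def score_subscales(responses, anxiety_items, avoidance_items):
    """Average 1-7 Likert responses into anxiety and avoidance scores."""
    anxiety = sum(responses[i] for i in anxiety_items) / len(anxiety_items)
    avoidance = sum(responses[i] for i in avoidance_items) / len(avoidance_items)
    return {"anxiety": anxiety, "avoidance": avoidance}

# Example: six items; the first three tap anxiety, the last three avoidance.
responses = {0: 6, 1: 5, 2: 7, 3: 2, 4: 1, 5: 3}
scores = score_subscales(responses, anxiety_items=[0, 1, 2],
                         avoidance_items=[3, 4, 5])
print(scores)  # {'anxiety': 6.0, 'avoidance': 2.0}
```

A respondent scoring high on the first subscale but low on the second would fit the anxious profile described above; the reverse pattern would fit the avoidant profile.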
However, these results do not mean that humans are currently forming genuine emotional attachments to AI. Rather, the study shows that psychological frameworks used for human relationships may also apply to human-AI interactions. The findings can inform the ethical design of AI companions and mental health support tools. For example, AI chatbots used in loneliness interventions or therapy apps could be tailored to the emotional needs of different users, providing more empathetic responses for users with high attachment anxiety or maintaining a respectful distance for users with avoidant tendencies. The results also point to the need for transparency in AI systems that simulate emotional relationships, such as romantic AI apps or caregiver robots, to prevent emotional overdependence or manipulation.
In addition, the proposed EHARS could be used by developers or psychologists to assess how people relate to AI emotionally and to adjust AI interaction strategies accordingly.
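As a sketch of how such subscale scores might drive an interaction strategy, the function below maps anxiety and avoidance scores to a coarse response style. The thresholds and style labels are assumptions for demonstration, not recommendations made by the study:

```python
# Illustrative sketch: choosing a chatbot response style from attachment
# subscale scores. The midpoint threshold and style names are assumptions
# for demonstration, not part of the published study.

def choose_style(anxiety: float, avoidance: float, midpoint: float = 4.0) -> str:
    """Map 1-7 subscale scores to a coarse response style."""
    if anxiety > midpoint and avoidance <= midpoint:
        return "reassuring"   # frequent empathetic acknowledgement
    if avoidance > midpoint and anxiety <= midpoint:
        return "low-key"      # informative, with respectful distance
    if anxiety > midpoint and avoidance > midpoint:
        return "balanced"     # supportive but not intrusive
    return "neutral"

print(choose_style(6.0, 2.0))  # reassuring
print(choose_style(2.0, 6.0))  # low-key
```

In practice, any such personalization would also need the transparency safeguards discussed above, so that users understand why the system's tone changes.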
“As AI is rapidly integrated into everyday life, people may come to rely on AI systems not only for information but also for emotional support. Our research highlights the psychological dynamics behind these interactions and offers tools to assess emotional tendencies toward AI, helping to guide policy and design practices,” concludes Yang.