
To explain or not? AI transparency depends on user expectations
Researchers, including a team from Penn State, created a simulated AI-driven dating website to investigate how user expectations shape the desire for transparency in AI systems. The study found a direct relationship between how well the AI met users' expectations and how much trust users placed in the system.
The findings have significant implications for industries such as healthcare and finance, where AI is increasingly used to streamline processes and enhance user experience. S. Shyam Sundar, co-author and director of the Penn State Center for Socially Responsible Artificial Intelligence, emphasized the stakes in sensitive user interactions: “AI can create all kinds of soul searching for people — especially in sensitive personal domains like online dating.” Users who received fewer matches than expected might feel inadequate, while those who received more might question their own criteria.
The study involved 227 participants who used the fictitious dating site smartmatch.com and were assigned to conditions that varied the number of matches they were shown. Participants who received the expected five top picks trusted the system without seeking further explanation. When the number of matches exceeded or fell short of their expectations, however, participants wanted an explanation, underscoring the need for transparency tailored to user expectations.
The research indicates that as AI becomes more prevalent, companies must shift from standard disclaimers to user-centered explanations that actually enhance understanding and trust. By addressing user needs directly, industries can foster a more responsible approach to AI interactions, paving the way for improved user confidence and satisfaction.


