Acts of meaning: How AI-based interviewing will transform career preparation in higher education — from er.educause.edu by Alan Jones, Suzan Harkness and Nathan Mondragon
Excerpt:
Machines parrot and correlate information; they do not comprehend or synthesize it the way humans do. Factors such as accents in pronunciation, word ambiguity (especially when a word has multiple meanings), deeply coded biases, limited association data sets, the narrow and limited network layers used in job screening, and static translations will continue to provide valid grounds for caution in placing too much weight on, or attributing too much confidence to, AI in its present form. Nonetheless, AI has crept into job candidate screening, the medical field, business analytics, higher education, and social media. What is essential now is to establish an understanding of how best to harness and shape the use of AI to ensure it is equitable, valid, and reliable, and to understand the shifting role that professional career counselors play on campus as AI becomes more ubiquitous.
There appear to be three points worth considering: the AI interview in general, the predominance of word choice, and expressiveness as read by facial coding.
From DSC:
Until there is a lot more diversity within the fields of computer science and data science, I'm not very hopeful that biases can be rooted out. My niece, who worked for Microsoft for many years, finally left the company; she was tired of fighting the culture there. The large tech companies will need to do a lot better if AI is going to make FAIR and JUST inroads.
Plus, consider how many biases there are!