HR Tech Update: Why you need a rulebook for your AI in HR


Artificial intelligence (AI) is booming in popularity, and the human resources sector is no exception. We’ve covered several practical use cases AI offers to HR – from AI-driven chatbots to AI reducing bias during the hiring process. However, AI remains a new and largely unexplored field when it comes to HR applications. HR tasks often involve sensitive information and require expert handling – even when using AI software specifically designed for those tasks.

Why a rulebook is crucial

Recent data shows that many companies have already incorporated AI into their HR systems: The Guardian reports that 83% of US companies use AI in hiring, especially predictive AI. These applications can scan and analyse candidate information from profiles or CVs and filter out individuals who aren’t suitable for the role. The key advantage of this method is considerable savings in time and cost.

Many companies are aware of the potential impact of AI, and they know it needs to be used safely and securely – both for legal compliance and to protect the wellbeing of their employees and customers. Recruiters might worry that their AI could overlook candidates with unconventional backgrounds or even worsen bias, or they might simply be unsure how their AI tool works to prevent it. This is why HR needs to implement a strict rulebook when looking to use artificial intelligence. A rulebook can clearly state the policies and procedures that reduce the potential for AI to cause harm or fall short of its goals.

Responsible policies to include

Every company should have a customised rulebook – not just for its AI, but for all HR tech that employs automation. One critical element is human oversight of AI solutions. How often are human eyes, with experience and authority in HR, monitoring and evaluating the results of the company’s AI tools? This audit can be done daily, monthly, or quarterly, depending on how frequently HR uses the tools.

Screening AI tools is also a must. Companies may be liable for discrimination litigation even when hiring is done through third-party AI software. It is essential to ask tech vendors thorough questions about any litigation history, complaints, and the procedures in place to avoid bias, misinformation, and other potential complications. When HR is armed with knowledge of how its AI-enabled tech functions, it can create better guardrails for its usage.

Maintaining this transparency is crucial with candidates as well. Informing applicants that they’re being screened by automated software, and that they can still request workplace accommodations, can prevent accidental discrimination or the filtering out of suitable candidates with unique needs.

HR Tech Update: insights you need to know about recent technologies to streamline HR functions. Published every Thursday at 6:00 AM, brought to you by Chief of Staff Asia.
