Companies across industries are eagerly experimenting with generative AI (GenAI) to deliver better outcomes for customers and improve internal operations. But it’s one thing to have GenAI streamline tasks and generate content; it’s another to use GenAI-powered tools for something as personal as performance management.
Performance management outcomes can be career-changing, affecting employees’ career development, compensation, and future within the organization. Because there’s so much at stake, your workforce may hesitate to embrace or trust GenAI’s influence on performance processes and outcomes. To resolve management’s and employees’ natural reservations and make the best use of new technology, you need a concrete and careful plan for implementing AI in performance management.
These three steps will help organizations thoughtfully apply GenAI to support managers in the performance management process.
Organize an internal AI council
A change as big as embedding GenAI in the flow of work requires several different perspectives to get it right. Bring internal stakeholders together in an AI council to drive the effective adoption of AI within the business. Include representatives from HR, legal, security, IT, and finance to evaluate use cases, consider possible consequences, and develop solutions. One imperative of the council is to come up with an AI policy and guidelines in line with leadership’s perspective on AI.
Every organization will apply AI differently. Define use cases for AI within your business, and especially within your existing performance management processes. Prioritize the obvious use cases where AI can have the biggest impact by simplifying processes and adding efficiencies for quick wins.
For example, you can make a strong case for GenAI’s use in summarizing performance conversations and feedback, drafting growth plans, and aligning employee objectives with company goals. Using GenAI to summarize past feedback saves managers time they’d otherwise spend sorting through content themselves and reduces recency bias: the tendency to overweight events that are fresher in the manager’s memory.
As a group, set criteria to look for in AI tech tools. Consider how you plan to apply AI, safety and security features, privacy, reliability, and user experience. For example, consider the question of bias: Look for vendors who are taking steps to neutralize bias, such as by anonymizing or de-identifying the data. If HR tech tools don’t feed bias triggers (like names and pronouns) through the algorithm, then the output will be more objective.
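To make the de-identification idea concrete, here is a minimal sketch of stripping bias triggers such as names and pronouns from feedback text before it reaches a model. The name list, pronoun map, and placeholder token are illustrative assumptions, not any vendor’s actual implementation, and a production system would need real NLP (for instance, “her” is ambiguous between object and possessive).

```python
import re

# Illustrative pronoun map (assumption, not a vendor's implementation).
# Note: "her" can be possessive or object; a regex sketch cannot tell them apart.
PRONOUNS = {
    "he": "they", "she": "they",
    "him": "them",
    "his": "their", "hers": "theirs",
}

def de_identify(text: str, employee_names: list[str]) -> str:
    """Replace known names with a placeholder and gendered pronouns with
    neutral ones, so the model sees de-identified feedback."""
    for name in employee_names:
        text = re.sub(re.escape(name), "[EMPLOYEE]", text, flags=re.IGNORECASE)

    def swap(match: re.Match) -> str:
        repl = PRONOUNS[match.group(0).lower()]
        # Preserve sentence-initial capitalization.
        return repl.capitalize() if match.group(0)[0].isupper() else repl

    pattern = r"\b(" + "|".join(PRONOUNS) + r")\b"
    return re.sub(pattern, swap, text, flags=re.IGNORECASE)

print(de_identify("Alex exceeded his goals, and he mentored two peers.", ["Alex"]))
# [EMPLOYEE] exceeded their goals, and they mentored two peers.
```

The point of the sketch is the criterion, not the code: when evaluating vendors, ask exactly which identifying signals are removed or neutralized before the algorithm sees the data.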
Selectively test GenAI applications
Applying GenAI to performance management is a big change, and one that managers and employees may hesitate to accept. Once you’ve identified the best use cases and evaluated vendors, the next step is to test GenAI’s ability to meet your desired use cases.
Roll out AI performance management tools to a pilot group, such as a single team or department. Test these tools’ abilities to integrate with existing tech and pull data in from the right places for the most accurate results. Does the GenAI tool integrate with Slack or Teams, or wherever managers and employees converse throughout the day? Can it recognize performance feedback in the flow of work and bring that data into the performance management system?
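As a rough illustration of what “recognizing performance feedback in the flow of work” means, the sketch below flags chat messages that look like feedback using a naive keyword filter. A real GenAI tool would use a trained classifier or language model; the cue phrases and message format here are assumptions made for the example.

```python
# Illustrative cue phrases (assumptions); a real tool would use a classifier.
FEEDBACK_CUES = (
    "great job", "well done", "needs improvement",
    "thank you for", "could improve", "impressed by",
)

def looks_like_feedback(message: str) -> bool:
    """Naive check: does the message contain a feedback-like phrase?"""
    text = message.lower()
    return any(cue in text for cue in FEEDBACK_CUES)

messages = [
    "Great job on the Q3 launch, the demo was flawless.",
    "Lunch at noon?",
    "The rollout plan could improve with clearer owners.",
]
flagged = [m for m in messages if looks_like_feedback(m)]
# flagged keeps the first and third messages, not the lunch invitation.
```

During a pilot, this is the behavior to audit: which messages the tool pulls into the performance system, and how often it flags the wrong ones.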
During the pilot phase, HR leaders must keep a close eye on the types and quality of output AI-driven performance management tools are generating. This monitoring is essential to building trust in performance management tools across the workforce. Use your pilot program to test AI-generated output for accuracy, reliability, and applicability.
Solicit feedback from pilot users to assess how effectively AI tools made the performance management process more efficient and the people in it more effective. Ask managers and employees whether they find the tools user-friendly, how actionable the output is, and whether the tools support manager effectiveness, for example, by helping managers provide precise, comprehensive, and constructive feedback.
Train users to approach GenAI output with caution
No matter how accurate and reliable GenAI tools are, they are not perfect. There is always a risk of errors and inaccuracies with GenAI, which is why it’s essential to keep humans in the loop to provide oversight and verify the output. GenAI should always serve as a co-pilot whose results are ultimately reviewed and judged by its human handlers.
One of GenAI’s best features is its ability to pull together unstructured, written performance feedback and summarize it effectively to help managers have better conversations with employees. However, managers need to know how to spot potential errors or biases in its output. If the algorithm doesn’t account for planned parental leave, for example, it may appear that one employee achieved far less than their colleagues in the same year even though they were out for much of it.
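One hypothetical way to correct for planned leave, sketched below, is to compare achievements per active month rather than raw yearly totals. The numbers and the normalization itself are illustrative assumptions; the point is that reviewers should check whether the tool’s summaries account for time actually worked.

```python
# Hypothetical adjustment: normalize achievements by months actually worked,
# so planned leave doesn't make an employee look less productive than peers.

def per_active_month(achievements: int, months_active: int) -> float:
    """Rate of achievements over the months the employee actually worked."""
    if months_active <= 0:
        raise ValueError("months_active must be positive")
    return achievements / months_active

# Raw totals (11 vs 6) suggest a large gap, but the normalized rates are
# comparable once six months of parental leave are accounted for.
full_year = per_active_month(11, 12)  # about 0.92 per month
half_year = per_active_month(6, 6)    # 1.0 per month
```

A manager reviewing an AI-generated summary should be trained to ask this kind of question: is the comparison fair given the context the algorithm may not have?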
Train users to assess, edit, and finalize AI-generated output. Managers can provide additional context to flesh out AI-driven feedback summaries to create more nuanced, actionable feedback for employees.
By taking a thoughtful approach to implementing GenAI rather than jumping on the bandwagon, you increase your chances of successful implementation and better outcomes.
Interested in learning more? Find out how HR can lead in the area of AI governance to safely and securely innovate with GenAI.
Arnaud Grunwald is Betterworks’ chief product officer who is the master builder of the company’s product innovation, development, and road maps — ensuring that the user experience is as delightful as possible, brings value, and is lightweight for employees, managers, and HR leaders. He was the co-founder and CEO of Hyphen, an employee listening and engagement platform that was acquired by Betterworks in 2020.