The headline of our September 2019 cover story asked: “AI or Just Sci-Fi?” as organizations began dipping their toes into the world of artificial intelligence (AI) and what it might mean for the workplace and training. Fast-forward four short years, and the explosion of AI tools such as ChatGPT is pushing Learning & Development (L&D) professionals into the deep end as they try to figure out how to successfully put the AI in training while still maintaining the human connection employees say they want. (Tempting though it was, I did not ask ChatGPT to write this article for me.)
Interestingly enough, while nearly one-third of workplaces are actively using artificial intelligence, employees don’t seem to be quite ready for L&D activities to be taken over by AI, according to new data from Wiley, which surveyed 3,000 professionals across various industries in North America. Nearly 6 in 10 respondents (59 percent) in the latest Wiley Workplace Intelligence report, Artificial Intelligence in Learning and Development: Five Surprising Facts You Need to Know, say they prefer to see an instructor—in-person or virtual—direct their workforce development learning, while only 7 percent prefer AI-directed learning.
In addition, the vast majority (87 percent) of respondents say they want the L&D content to be developed by a subject matter expert as opposed to AI technology (12 percent). Respondents are not opposed to L&D practitioners using AI technology to improve their efforts, but they still want humans to create, plan, and direct the activities—although that may change down the road.
“It’s essential to remember that AI is here to augment human efforts, not replace them,” stresses Pooja Jaisingh, senior director of Digital Learning at Icertis. “Tools such as ChatGPT are designed to enhance the learning experience and streamline processes, not automate everything. There’s still a big role for human judgment, particularly when dealing with topics that require deep contextual understanding, empathy, and ethics.”
HOW CAN AI HELP IN TRAINING?
AI has been gradually taking on a larger role in training over the last few years, including chatbot coaching and learning management system (LMS) functionality, such as Netflix/Amazon-type recommendations for training courses based on past attendance. David Metcalf, director of the Mixed Emerging Technology Integration Lab at the Institute for Simulation and Training at the University of Central Florida, believes many types of training could benefit further from the use of:
- ChatGPT and other text-based tools
- Generative AI art tools such as Midjourney, Adobe Firefly, Leonardo.ai, Photoleap, Stable Diffusion, and DeepFloyd IF, and tools that produce animations and motion graphics such as Runway Gen 2 and WordsEye World
- AI-enabled programming tools such as GPT-4 (with Code Interpreter), Amazon CodeWhisperer, and Microsoft 365 Copilot
“These tools are particularly useful for technical topics, as well as very specific topics that combine multiple domains of knowledge that previously required a substantial amount of time to create valid content with subject matter experts,” he says. “Now, with ChatGPT, EntrepreneurGPT, and other tools, learning objectives, outlines, and even full learning content modules can be produced in minutes to hours rather than days to weeks,” followed by quality control and subject matter expert review.
The programming tools allow trainers to write out the objectives and details of what a computer application should do, and the AI then writes the code for them. “They can even tell you where to post it for testing and final deployment, whether a Web course module, a mobile app, a simulation, or even an augmented/virtual reality application,” Metcalf says. “It’s like having a programming assistant or a teammate who never has a typo in their code.”
One valuable but often overlooked aspect of AI is that it has helped to create a generation of text-to-speech voiceover technology that sounds highly realistic, adds Karl Kapp, an author, speaker, and Director for Interactive Technologies at Bloomsburg University. Kapp points to the tool 7taps, which uses AI to create a series of cards based on a text prompt. It provides an AI-generated “actor,” voiceover, and even content. Instructional designers can then edit that information to ensure accuracy and customize it to their organizational needs.
In the Vyond Go platform, Kapp says, instructional designers can type in a prompt, choose the vibe of the video (ranging from formal to informative, casual, or playful), and then pick the format (including an anecdote, announcement, diatribe, how-to, and more). Then, the designer picks the layout (ranging from conversation to talking head) and chooses a setting, and Vyond creates a two- to four-minute video covering the topic. The designer can edit and add text-to-speech items to it and then publish the video.
One of the biggest applications for AI in training is providing coaching and feedback and allowing for practice through role-playing—all at the learner’s pace. “If you’re learning a new language, for example, AI tools can generate exercises, help you practice conversations, and give you instant feedback,” says Shomron Jacob, spokesperson and head of Applied ML and Platform at Iterate.
Similarly, he says that AI can create simulated customer interactions in customer service training, allowing you to practice your skills in a safe and controlled setting. “They also can analyze your performance, identify areas for improvement, and offer personalized recommendations to help you enhance your skills,” Jacob says.
AI tools likewise can be an asset in the onboarding process. “It’s like having a virtual assistant who guides you through the initial stages of your employment, making sure you feel confident and well-prepared,” Jacob says. “Throughout your learning journey, AI tools provide personalized reminders, reinforcement exercises, and micro-learning modules to help you retain information and continuously improve your skills.”
Jacob also cites AI tools’ ability to curate relevant learning materials. “They can analyze a vast amount of content and handpick resources that align with your specific learning goals and progress while adjusting the pace and difficulty of training based on your performance.”
CAVEAT EMPTOR
While it’s easy to jump on the AI bandwagon, these tools can present some significant dangers. “This is new, uncharted territory, and these models were built to make money—not to serve you,” warns Stefanie Boyer, Marketing professor and director of the Northeast Intercollegiate Sales Competition at Bryant University, and cofounder of the RNMKRS app, which offers opportunities for skill development and enhanced performance. “It’s a buyer beware situation. It’s best to use AI tools in walled-off applications where they can do the least amount of damage if they go haywire and start hallucinating (when AI generates false information based on the information within its language model).”
A hallucination can be misstating a fact, inventing a research article as a citation, or even writing a sentence that contradicts a previous sentence, Kapp explains. “Since AI embeds the hallucination within the body of other information that is often true, it becomes critically important for the designer of the instruction to vet the content to ensure accuracy.”
Jaisingh has come across occasional pronunciation issues in AI-generated audio and inaccurate responses from ChatGPT. “These hiccups highlight the importance of using an iterative approach to AI integration, consistently reviewing outputs, and not solely depending on AI for quality control,” she says.
Regular review and updating of the data and models used by AI systems are crucial to maintaining accuracy, Jacob agrees. He recommends having a process in place for users to report errors and provide corrections to help ensure the integrity of the content.
Engines like Prometheus from Microsoft will greatly improve the quality of outputs, Metcalf believes, as will only using verified references from credible sources. “But for now, it is a best practice to always review every response for accuracy,” he stresses.
Boyer also reminds users to assume that any proprietary content—including sensitive training materials—they put into a Large Language Model (LLM) such as ChatGPT is unsafe. L&D professionals must ensure that any AI tools they use comply with relevant data privacy laws and adhere to best practices.
Last but not least, Jacob cautions it’s crucial to avoid over-reliance on AI. “Complex topics often require the expertise and guidance of human trainers to provide effective instruction. Human trainers excel in areas such as emotional intelligence, cultural sensitivity, and nuanced decision-making, which are areas where AI may fall short.”
TIPS FOR CREATING PROMPTS
The key to successfully using AI tools is knowing how to tell the AI model what you want. This is called prompting. “The more specific you are, the better,” Boyer says. “Remember, the model doesn’t ‘think,’ even though it seems that way. Rather, it hunts, combines data sources, and returns information. And, for the most part, models are limited to delivering accurate responses based only on data that existed at the time they were trained.” Think about the parameters in your prompts, Boyer suggests, and don’t assume the model can read your mind. For a text search, give the model as much specific content as possible. For example:
“Write (this is the mission or action) a ‘funny’ (tone) 200-word (a scope parameter) description (the desired result) in the style of a ‘John Deere sales brochure’ (a guiding reference, with quotation marks denoting emphasis) of a car trailer hitch with wheels (the subject) available for sale (a parameter) to consumers (another parameter) in 2023 (context) in the United States and Canada (more context). Do not give me advertising copy, only sales copy (parameter).”
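For L&D teams that assemble prompts programmatically, the same idea can be captured in a small template so that each parameter Boyer describes (mission, tone, scope, subject, context) is explicit and reusable. The Python sketch below is illustrative only; the field names and the build_prompt helper are assumptions for the example, not part of any particular tool.

```python
# Hypothetical sketch: assembling a parameterized prompt in the spirit of
# Boyer's example. Field names and structure are illustrative, not a
# documented API.

def build_prompt(mission, tone, scope, result, style, subject, parameters, context):
    """Combine explicit prompt parameters into a single prompt string."""
    constraints = "; ".join(parameters)
    return (
        f"{mission} a {tone}, {scope} {result} "
        f'in the style of a "{style}" of {subject}. '
        f"Constraints: {constraints}. Context: {context}."
    )

prompt = build_prompt(
    mission="Write",
    tone="funny",
    scope="200-word",
    result="description",
    style="John Deere sales brochure",
    subject="a car trailer hitch with wheels",
    parameters=[
        "available for sale to consumers",
        "do not give me advertising copy, only sales copy",
    ],
    context="2023, United States and Canada",
)
print(prompt)
```

Keeping each parameter as a named field makes it easier to review, vary, and version prompts rather than relying on the model to read your mind.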
Coming to grips with AI language is like learning to dance—it’s all about rhythm and cues, Jaisingh believes. “We’ve found success by treating the AI interaction as a conversation. It’s a back-and-forth process—ask questions, review the AI’s response, then guide the AI to the desired output.”
Jaisingh says crafting prompts is like preparing a roadmap for AI. Here’s how she navigates the process (a brief code sketch putting the pieces together follows the list):
- Establish the role of AI. For strategic insights, assign the role of a Digital Learning Leader. For reviewing content, there’s a Content Editor. When scripting a voiceover, prompt the AI to play Script Writer. For a more dramatic approach, change it to a Movie Script Writer. Jaisingh’s prompts usually start with “Act as a…”
- Focus on detail. For example, when asking ChatGPT to create assessment questions, specify the number of questions, quiz instructions, type of questions, number of options, feedback text, difficulty level, and so on.
- Specify the output format. For example, ask ChatGPT to generate a storyboard in a table format and specify the rows and columns required. You can even specify the word count, bullet points, or paragraphs you need.
- Identify the tone. Whether it’s conversational, crisp, thoughtful, or empathetic, specifying the tone can dramatically change the output.
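Here is a minimal sketch of how those four elements (role, detail, output format, tone) might be combined in a single request using the OpenAI Python client. The model name, the prompt wording, and the assumption that an API key is set in the environment are illustrative choices, not recommendations from the article.

```python
# Minimal sketch, assuming the OpenAI Python client (v1.x), a gpt-4o model,
# and an OPENAI_API_KEY environment variable. Prompt wording is illustrative.
from openai import OpenAI

client = OpenAI()

role = "Act as a Content Editor for a corporate L&D team."  # 1. establish the role
detail = (
    "Create 5 multiple-choice questions on data privacy basics, "  # 2. focus on detail
    "each with 4 options, feedback text, and an intermediate difficulty level."
)
output_format = (
    "Return the result as a table with columns: "  # 3. specify the output format
    "Question, Options, Correct Answer, Feedback."
)
tone = "Keep the tone conversational and encouraging."  # 4. identify the tone

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": role},
        {"role": "user", "content": f"{detail} {output_format} {tone}"},
    ],
)
print(response.choices[0].message.content)
```

As with any AI output, the generated questions would still need the review and iteration Jaisingh describes before they reach learners.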
Metcalf suggests understanding your learning objectives and business goals before you begin creating prompts. “‘Prompt engineering’ is a term for the input that you provide either as text or images or code that can produce the desired end result,” he explains.
Try to stay away from leading prompts, Kapp advises. “A leading prompt can lead to biased information if it is not written carefully or considered fully. For example, if you are writing a prompt for a course trying to objectively compare different instructional design approaches and you ask, ‘Why is the ADDIE model better than SAM?’, you will get a response biased toward ADDIE. Instead, write something like: ‘Compare and contrast the ADDIE and SAM approaches to designing instruction.’”
THE FUTURE OF AI IN TRAINING
As AI tools continue to become more sophisticated, how will their role in L&D and the workplace evolve in the coming years? Jacob believes AI’s ability to analyze emotions and sentiments will contribute to a better understanding of learner engagement and help L&D professionals adjust their training approaches accordingly.
In that vein, Kapp sees chatbots morphing into chat characters that look and sound realistic. “They’ll have a sense of humor and be a virtual assistant who will ‘sit’ on the bottom of your computer screen and help you through daily tasks.”
Jacob expects predictive analytics powered by AI will help organizations anticipate future training and talent needs by considering industry trends, evolving job roles, and employees’ skill progression. “This proactive approach will ensure that training programs align with future requirements, enabling employees to stay ahead in their fields,” he says. “AI will furthermore facilitate enhanced peer learning by matching employees based on their learning styles, interests, and complementary skills. By fostering collaboration and creating dynamic learning communities, AI will enable employees to learn from and support one another.”
Instructional designers have the potential to be the next generation of digital engineers, especially when training not only people but machines (AI engines), Metcalf says. “I’ve started to talk about the next-generation learning engineer and the role in what I call ‘Learngineering.’”
CASE STUDY: STEM LEARNING CAMPAIGN
The University of Central Florida (UCF) recently was asked to create a curriculum for the new laws surrounding quantum computing and cybersecurity. David Metcalf, director of the Mixed Emerging Technology Integration Lab at the Institute for Simulation and Training at UCF, was able to create AI prompts and produce the core content and design for a whole learning campaign with a specific emphasis on the overall curriculum and a transmedia solution (a combined deck of playing cards, mobile app, online game, and augmented reality feature) in just two hours. This used to take approximately four days for such a complex topic, Metcalf says.
Next, the art team spent hours—rather than days or weeks—producing custom artwork for each card. “The whole project came together in just a few days,” Metcalf says. “It was delivered to the quantum computing experts doing STEM (science, technology, engineering, mathematics) outreach for our next generation of leaders and technologists in summer enrichment camps.”
CASE STUDY: DELL TECHNOLOGIES
Dell Technologies knows that “live” role-plays are not necessarily the best way to get sellers to practice conversations with prospects repeatedly: managers rarely have time for multiple role-plays, and their feedback to trainees can be inconsistent.
Instead, sellers using the RNMKRS app have real, natural language conversations with customer bots. “Our bots are packed with personality and domain knowledge, which makes them highly qualified sparring partners,” says Stefanie Boyer, cofounder of the RNMKRS app. “Our AI expert system allows reps to practice the Dell Rugged Laptop sales conversation, for example, with the customer bots over and over again at their own pace. With 400,000-plus role-plays on our platform, our data shows that it takes between 20 and 30 role-plays to see a step-change in sales conversation competence.”
Trainees use the instant feedback they get to constantly improve their scores. The data that comes out of all those conversations helps managers assess and train talent. New hires who begin with this kind of training become productive faster, Boyer says. Dell also uses this role-play practice data to identify top candidates for its sales teams.
TRAINING AND RESOURCES
Looking to boost your AI IQ? Shomron Jacob, head of Applied ML and Platform at Iterate, notes L&D professionals can benefit from training around foundational understanding of AI and machine learning, data literacy (including how to handle data responsibly, ensure data quality and integrity, and interpret data analysis results), and ethical and legal considerations associated with AI.
Here are some resources that can help:
- Training Magazine Network offers Webinars and a Learning Center of AI-related resources: http://www.trainingmagnetwork.com/AI_for_Training
- The Training 2024 Conference & Expo, February 26-28, will host the Innovations in Training Test Kitchen, featuring a new extension on AI tools, plus a host of AI-related breakout sessions. Register at: www.trainingconference.com
- OpenAI and DeepLearning.AI’s free course on prompt engineering: deeplearning.ai
- Bob Pike’s Trainer Talk column with tips on creating ChatGPT prompts: https://trainingmag.com/are-you-using-chatgpt/
- Josh Cavalier’s Webinar, interactive chat practice, and PowerPoint handout on ChatGPT strategies and prompts: https://www.trainingmagnetwork.com/events/3514