A Critical Role for L&D: Navigating AI, Regulation, and Cybersecurity

L&D must develop adaptable, behavior-focused strategies that prepare teams for an evolving landscape of cybersecurity threats and regulatory requirements.

As artificial intelligence (AI) tools proliferate, so do the associated risks, and the misuse of AI technology is a growing concern. For organizations and their learning and development (L&D) teams, there is an urgent need to go beyond educating employees about AI’s opportunities: they must also address the threats it poses and build resilience against fraud and misinformation. That is why L&D professionals are beginning to look beyond traditional tech training and pay closer attention to both human behavior and regulatory complexities.

Expanding the Role of L&D

Cyber threats, especially those using AI, often exploit human vulnerabilities as well as technical weaknesses. Fraudsters use tactics such as social engineering, leveraging AI to reach broader audiences. This shift requires L&D to design training programs that address not only the technical aspects but also the behavioral factors that make people susceptible to such attacks.

Consider incorporating scenario-based exercises that simulate real-life situations where people might be vulnerable. Such simulations can help employees recognize manipulation tactics and build awareness, making the learning experience more engaging and effective.

L&D programs need to evolve constantly to stay relevant and useful. Continuous updates to reflect emerging risks and preventive measures are essential for keeping teams prepared and fostering a culture of ongoing learning and vigilance.

Regulatory Complexities

The regulatory landscape for AI is changing rapidly and varies significantly by region. The European Union, for instance, has set a precedent with sweeping regulations, which some see as a model. However, such regulations also raise concerns about consistency and effectiveness. As regulations tighten in one region, fraudulent activities may shift to others with looser controls, creating challenges for global organizations.

For multinational companies, a one-size-fits-all approach simply doesn’t work. Consider developing training programs that are adaptable and region-specific. This may mean partnering with compliance and legal teams to ensure training remains accurate, relevant, and aligned with local regulations.

While organizations may aim to create globally consistent training, L&D needs to remain agile enough to adapt and regionalize content. AI and privacy regulations may require modular training elements that can be customized for different regions. This mirrors how organizations approach learning in regulated areas such as financial compliance, where local adaptation is legally required.

It is also important to keep in mind that, while regulation aims to safeguard systems, it can also be used by dominant players to maintain their market positions. Some large tech companies advocate for regulation, possibly to protect their early-mover advantage and stifle competition. L&D has a role in helping employees navigate these complexities by fostering critical thinking about AI and regulation. Training programs should not only cover compliance specifics but also encourage a strategic mindset.

Engaging Compliance Training

Regulatory training can sometimes feel like a “check-the-box” exercise. Traditional compliance modules often lack engagement, which limits their effectiveness. The challenge is to make these programs compelling and meaningful.

L&D teams can make regulatory training more impactful by incorporating storytelling and case studies. Highlighting real-world examples—both successes and failures—makes the material more relatable and relevant, helping employees understand the consequences and importance of compliance.

Gamification and immersive learning experiences can transform dry compliance content into engaging and practical learning opportunities. Scenario-based learning, for example, allows employees to simulate real-world decision-making, making compliance training feel less like a formality and more like a skill-building exercise.

Incorporating diverse voices, such as legal experts and ethical thinkers, into training can provide deeper insights. Employees gain a better understanding of the “why” behind regulations, which builds engagement and commitment beyond mere compliance.

Preparing for the Future

The convergence of AI, regulation, and cybersecurity creates new challenges for organizations and their L&D teams. It’s no longer enough to rely on traditional training methods; L&D must develop adaptable, behavior-focused strategies that prepare teams for an evolving landscape of threats and regulatory requirements.

This moment calls for proactive, thoughtful approaches that not only equip teams with skills but also foster a culture of vigilance, ethical awareness, and continuous learning. The future of AI and regulation isn’t just on the horizon—it’s already here. Is your organization ready for it?

Peter Hirst
Peter Hirst is the senior associate dean for Executive Education at the MIT Sloan School of Management. For more information, visit: https://mitsloan.mit.edu/faculty/directory/peter-hirst