Thought you’d nailed cybersecurity training? Think again, AI is coming for your teams

Max Vetter, VP of Cyber at Immersive Labs looks at how financial advice firms can get their workforces’ cyber resilience right in the face of Generative AI.


 
The cybersecurity landscape is undergoing a tectonic shift, driven not only by an evolving set of global threats, but also by the rapid advancements in AI. While organisations have fortified their digital perimeters with security tools and policies, the advent of generative AI is redefining cybersecurity and its impact on people.
  

For all of its benefits in other fields, this transformative technology is arming cybercriminals with easily accessible tools to create persuasive, realistic, and highly targeted social engineering attacks. Most concerningly, it’s making the process of launching a cyberattack less resource-intensive and more convenient for malicious actors.  

Traditional cybersecurity training, while essential, is no longer sufficient in this new era of AI-enhanced threats. As vulnerabilities are becoming increasingly subtle and attacks more intelligent, it’s critical for businesses to adapt how they prepare their workforces – not just cyber professionals – to counteract emerging threats, including those posed by AI.  
 
How generative AI has elevated the threat landscape 

Financial advisers have always been a major target for social engineering attacks. These attacks exploit human psychology rather than technical vulnerabilities to gain unauthorised access to systems and information. Threat actors manipulate individuals into divulging confidential information or performing actions that compromise security. The most common types of social engineering threats include phishing, tailgating, and baiting.   

 
 

Last year, financial institutions were the most targeted industry for phishing attacks. SMEs within this industry are more vulnerable to such threats, as small businesses are 350% more likely to be targeted by social engineering attacks. So, the emergence of generative AI could further increase the vulnerabilities and risks of such businesses.  

 
AI tools with advanced NLP (natural language processing) capabilities can craft emails with flawless language and even mimic specific communication styles. These emails can be so persuasive that they trick even the most vigilant among us into sharing confidential information. They can leverage perfect syntax, deploy psychological tactics like urgency or reciprocity, and can even use publicly available information to tailor attacks to specific individuals or organisations. As financial advisers often manage vast amounts of sensitive data, these subtle threats pose a serious risk. 

 
Threat actors can also potentially use prompt injection, crafting nuanced prompts to hijack a language model’s output, and have it say anything they want. We’ve recently developed a prompt injection lab using OpenAI’s API to demonstrate how easily and quickly malicious messages can be crafted.  

This is a wake-up call, underscoring the need for a fresh look at our cybersecurity measures. Traditional practices are falling short in the face of AI-enhanced threats, and it’s clear that a new approach to training is needed. Businesses need to understand that tackling this issue will require more than just updating existing protocols; it calls for an entirely new strategic framework, which involves increasing their cyber workforce resilience through continuous and measurable cyber exercising.  

 
 

Thinking beyond compliance  

With this recent surge of generative AI, most businesses are rightfully establishing an internal AI policy to ensure data privacy and security compliance. While this is a commendable first game, it’s not enough. What is crucial is the resilience of your workforce, honed through real-life cyber simulations and labs that engage them with real-world AI threats. Simulated exercises and advanced cybersecurity tools can be invaluable in this regard. 

A first step for businesses is to conduct a thorough risk assessment, evaluating the current skill levels of their teams and identifying weaknesses. This creates a roadmap for targeted upskilling. Continuous learning is key; don’t merely fill a skills gap and consider the job done. With the pace at which AI is evolving, skills become obsolete quickly. Therefore, organisations must implement a cycle of regular training updates and hiring strategies that focus on specialised skills for combating AI-powered threats. 

It’s important to recognise that legacy approaches grounded in historical threat data are no longer reliable. The need of the hour is a forward-thinking strategy that not only identifies and fills skills gaps but also inculcates a culture of cybersecurity awareness within the organisation. 

 
 

Think of this as the next wave of people-centric cybersecurity—a revamped, AI-ready approach that involves the entire workforce, not just the IT department. But to get started, businesses must first measure and benchmark their current level of cyber resilience.  

How to effectively measure cyber workforce resilience 

Assessing the cyber resilience of your workforce requires a multi-faceted approach. In our latest research, a staggering 55% of directors admitted they lacked the data to effectively gauge their organisation’s preparedness for cyber threats. This is alarming given the escalation of AI-powered risks.  

Furthermore, our study also revealed that while most respondents acknowledge the link between workforce cyber resilience and organisational success, only 58% believe their companies effectively assess this resilience. This suggests an urgent need to overhaul existing assessment methodologies. 

To bridge this gap, organisations must introduce robust assessment methodologies tailored to AI-era threats. This could mean adopting stronger metrics that accurately reflect the ever-changing threat landscape, or investing in more dynamic assessment tools that can adapt as quickly as the threats themselves. These metrics and methods should not only be reliable but also inspire trust within the organisation. 
 

An organisation-wide cybersecurity culture with continuous exercising and the metrics to prove cyber capabilities across roles can identify any issues or areas for improvement, making it easier to create a skills development plan rather than relying on a blanket approach.  

Overall, businesses must remember that measuring cyber resilience is no longer a check-box activity. In an era where AI is revolutionising both opportunities and threats, getting your cyber skills assessment methods right is not just a best practice; it’s a survival necessity. 

Related Articles

Sign up to the IFA Newsletter

Please enable JavaScript in your browser to complete this form.
Name

Trending Articles


IFA Talk logo

IFA Talk is our flagship podcast, that fits perfectly into your busy life, bringing the latest insight, analysis, news and interviews to you, wherever you are.

IFA Talk Podcast – listen to the latest episode