Growing risks of multi-channel 3D phishing attacks

Patrick Harr is the CEO of SlashNext, the authority on phishing protection across all devices.

I recently coined the term “3D phishing” to describe the practice of cybercriminals targeting victims through multiple channels of contact, especially when this is enabled by AI. These different channels can include email, SMS text messages, social media platforms, collaboration tools like Slack or Microsoft Teams, messaging apps like Signal or WhatsApp, voice calls, and video calls.

In other words, anywhere you can receive a digital message, you can be phished.

How 3D phishing works

3D phishing combines a sophisticated, multi-channel approach with various elements of deception to create a highly compelling experience. Attackers communicate with targets across multiple platforms to build credibility and instill a greater sense of urgency in their victims.

By applying such immersive tactics, criminals can move targets out of secure environments to further escalate their attacks. Currently, the top five 3D phishing lures include plain-text files with malware links, files with malware attachments, smishing texts, CAPTCHAs, and now even QR codes.

A classic 3D phishing attack begins with a routine request via email, such as, “Send me your phone number,” before the attacker delivers the link through another channel, such as a combination of audio, video and/or text messages.

3D phishing has existed for years in the email space, but these attacks are now spreading beyond email to less secure channels, including personal mobile devices. Alternatively, 3D attacks can begin in channels such as Teams or LinkedIn chats before moving to other environments.

This 3D approach also uses the latest advances in AI to further manipulate victims through social engineering techniques that impersonate trusted entities, colleagues and business partners. Deepfakes that use artificial intelligence to generate voice and video are currently used less frequently, and mainly by highly sophisticated threat actors, but these tactics will likely soon become more widespread in everyday phishing efforts.

Hacking group STORM-0539 targets corporate gift cards with 3D phishing

Many 3D phishing incidents are not publicly disclosed due to their sensitive nature. However, several recent public breaches at Accenture, Amtrak and the LA County Department of Public Health showed characteristics that are consistent with these types of sophisticated social engineering attacks.

As 3D phishing continues to grow, we should expect many more high-profile breaches to be attributed to this threat vector.

For example, the threat actor group known as STORM-0539 has conducted sophisticated 3D phishing campaigns to compromise employee accounts and gain unauthorized access to corporate gift card systems. STORM-0539’s tactics are of concern due to the group’s ability to bypass multi-factor authentication and move within the network to locate and exploit gift card systems.

By targeting employees’ personal and work cell phones with brute force attacks, the group can gain a foothold. Attackers then perform reconnaissance to identify the company’s gift card business process and target high-privilege employee accounts.

After entering the gift card system, STORM-0539 creates fraudulent gift cards using compromised employee accounts. Even more troubling, when corporations implement controls to block fraudulent card creation, the group adapts by changing the email addresses on unredeemed gift cards to ones it controls, allowing it to siphon off the funds.

How companies can prepare for 3D phishing

As generative AI (GenAI) and deepfake technologies mature, 3D phishing is exploding in 2024, with more cybercriminals adopting new malicious GenAI capabilities. We should expect a large increase in 3D phishing attacks over the next few years as these techniques become even more prevalent among threat actors.

3D phishing is very effective because it exploits our natural trust in environments where we are not traditionally trained to be on the lookout for threats. AI-powered impersonation capabilities can make these attacks feel truly authentic and believable. By combining multiple fraud methods across different channels, attackers can launch highly credible frauds that are difficult for the average person to detect.

Generative AI is the key enabling technology behind 3D phishing attacks, allowing attackers to generate convincing text content at scale, clone realistic voices and create deepfake videos. Cybercriminals are also turning to new malicious GenAI tools like WormGPT and DarkBERT to automate the creation of phishing emails, websites and other assets needed to orchestrate their campaigns.

Organizations will likely need to use GenAI tools and technologies, such as AI-based security platforms, to detect the hallmarks of 3D phishing attempts across multiple channels, including mobile apps, email and web browsers, because humans can no longer reliably recognize these formidable AI-powered threats on their own.

AI-driven behavioral analytics can also identify unusual activity bubbling up within a corporate network, such as discovery efforts and lateral movement, enabling rapid detection of and response to gift card system compromise and other nefarious activities.

By continuously learning communication patterns and writing styles, GenAI is also equipped to help detect business email compromise and social engineering efforts that deviate from the norm.

GenAI tools cannot solve everything, however, so individuals should be extremely cautious about opening unsolicited communications, even when they appear to come from a trustworthy source. Ongoing employee awareness training is also essential to help users recognize and report suspected 3D phishing attacks. Verifying requests through a known good contact method is a critical first step before users take any further action.



