More than half (53%) of cybersecurity professionals in Asia Pacific and Japan (APJ) now view insiders, whether malicious or compromised, as a greater risk than hackers outside the company.
Three in five (60%) have seen a rise in insider incidents over the past year, and 69% expect the problem to worsen in the next 12 months, according to Exabeam’s From Human to Hybrid: How AI and the Analytics Gap Are Fuelling Insider Risk report.
Artificial intelligence (AI) is accelerating the risk. AI-enhanced phishing and social engineering rank as the most concerning vectors, while unapproved use of generative AI is adding fuel to the fire. Nearly two-thirds (64%) of APJ firms reported employees using unapproved generative AI tools, with 12% calling it their top insider concern.
The convergence of insider access and AI capabilities is producing threats that evade traditional controls, pushing demand for more advanced behavioural analytics.
Yet only 37% of organisations surveyed use user and entity behaviour analytics (UEBA), the foundational capability for insider threat detection. Many still rely on tools such as identity and access management, data loss prevention and endpoint detection solutions, which provide visibility but miss subtle behavioural red flags.
Almost all organisations (94%) are already using some form of AI to tackle insider risk, yet governance and operational readiness remain weak. Privacy pushback, tool fragmentation and difficulty interpreting user intent are leaving major blind spots.
“Security teams are deploying AI to detect these evolving threats, but without strong governance or clear oversight, it’s a race they’re struggling to win,” said Kevin Kirkwood, Exabeam’s CISO.
To combat insider threats more effectively, organisations need approaches that focus on context, distinguish between human- and AI-driven activity, and foster collaboration across teams to close visibility gaps. This requires leadership engagement, cross-functional cooperation and governance models that can keep pace with the rapid adoption of AI, advises Exabeam.