- The Sophos Future of Cybersecurity in Asia Pacific & Japan 2025 report shows that 91% of Malaysian businesses have adopted AI tools in their operations, but 36% admit employees are using unauthorized AI — a phenomenon called Shadow AI.
- Shadow AI is the use of AI tools by employees that are not approved or monitored by the IT or security department. 41% of businesses are unaware of which tools are being used, and 35% have discovered security vulnerabilities in these AI platforms.
- The consequence: sensitive data (customer records, intellectual property, internal financials) can be uploaded to public AI models, risking exposure or leakage of information to outside parties without the user's knowledge.
- In heavily regulated industries such as finance, telecommunications, and state-owned enterprises, unsanctioned AI use is not only a security risk but may also violate the Personal Data Protection Act (PDPA), which can lead to severe penalties.
- The cause is not malice: employees use AI to increase productivity — faster report writing, data analysis, or content creation. But just one unvetted application or a wrong configuration can open the door to a cyberattack.
- Expert Aaron Bugal recommends that businesses improve AI observability and monitoring, adopting a zero-trust model that extends security across every layer: data, identity, device, and user behavior.
- Beyond paper policies, practical AI awareness training is needed for employees to help them understand the risks of entering data into external models.
- Leaders should guide instead of prohibit: provide approved, safe, and monitored AI tools so employees don’t have to “sneak use” AI outside the system.
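One practical building block for the observability and monitoring the report recommends is a gateway that inspects prompts before they leave for an external AI model. The sketch below is purely illustrative, not part of the Sophos report: the function names (`scan_prompt`, `gateway`) and the regex patterns are hypothetical, and real data-loss-prevention rules would be far more comprehensive than these three examples.

```python
import re

# Hypothetical patterns for sensitive data (illustrative only; a real
# DLP rule set would cover many more categories and formats).
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    # Malaysian NRIC format, e.g. 900101-14-5678
    "malaysian_nric": re.compile(r"\b\d{6}-\d{2}-\d{4}\b"),
    # Rough card-number shape: 13-16 digits with optional separators
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_prompt(prompt: str) -> list[str]:
    """Return the names of sensitive-data categories found in the prompt."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(prompt)]

def gateway(prompt: str) -> str:
    """Block or allow a prompt bound for an external AI model,
    logging each decision so the security team gains visibility."""
    findings = scan_prompt(prompt)
    if findings:
        print(f"BLOCKED: prompt contains {findings}")
        return "blocked"
    print("ALLOWED: prompt forwarded to approved AI tool")
    return "allowed"
```

A gateway like this supports the "guide instead of prohibit" approach: employees keep using an approved AI tool, while the organization gains a log of what is being sent and an automatic stop on the riskiest inputs.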
📌 In short: Shadow AI means employees using AI tools without the approval or oversight of the IT or security department. 91% of Malaysian businesses use AI, but over a third have Shadow AI — a "dark area" that can lead to internal data leaks. The answer is guidance, not prohibition: give employees approved, safe, monitored AI tools so they have no reason to "sneak use" AI outside the system.

