- Three out of five workers say they use unapproved AI ("shadow AI") tools
- Executives and senior managers are the worst offenders
- Workers lack appropriate tools or policies
New research has claimed three out of five (59%) workers use AI tools that have not been approved by their company, a practice known as Shadow AI.
Worse still, 75% of those who use Shadow AI admit to sharing confidential data that could put their companies at risk, and 57% of these employees' direct managers support the use of unapproved AI, the Cybernews figures say.
This may be because executives and senior managers are the most likely to use Shadow AI tools (93%), with managers (73%) and professionals (62%) less likely to do so.
Shadow AI is a major concern for organizations
The most commonly shared types of confidential information include employee data (35%), client data (32%), internal documents (27%), legal and financial information (21%), security-related information (21%) and proprietary code (20%), and this is despite the majority of workers (89%) associating AI with risks.
Almost two thirds (64%) acknowledge that data breaches could result from Shadow AI use, and although 57% agree they would stop using unapproved tools if a data breach occurred, few take preventive measures today.
“Once sensitive data enters an unsecured tool, you lose control. It can be stored, reused or exposed in ways you will never know,” explained nexos.ai Head of Product Žilvinas Girėnas.
Although companies are trying to manage Shadow AI use, a quarter (23%) still have no official AI policy. Similarly, only half (52%) of employers offer AI tools for work, and only one in three workers sees these as meeting their needs.
Clearly, then, it falls to companies to implement stronger policies and offer the right kind of tools that actually meet their workers' needs, not just generic ones.
“Companies must look for ways to incorporate AI into their processes in a safe, efficient and responsible way,” Cybernews security researcher Sabeckis concluded.