- A lawsuit against the criminal gang Storm-2139 has been updated
- Microsoft has named four defendants
- The group is allegedly responsible for creating illicit deepfakes.
An amended lawsuit has partially named a group of criminals who allegedly used API keys leaked from “multiple” Microsoft customers to access the company’s Azure OpenAI service and generate explicit deepfakes of celebrities. The gang reportedly developed and used malicious tools that allowed the threat actors to bypass generative AI guardrails and produce harmful and illegal content.
The named individuals are said to be key members of a global cybercriminal gang, tracked by Microsoft as Storm-2139, whose operation the company has dubbed the “Azure Abuse Enterprise”. They were identified as: Arian Yadegarnia, also known as “Fiz”, of Iran; Alan Krysiak, alias “Drago”, of the United Kingdom; Ricky Yuen, aka “cg-dot”, of Hong Kong, China; and Phát Phùng Tấn, also known as “Asakuri”, of Vietnam.
The Microsoft Digital Crimes Unit (DCU) originally filed the lawsuit against 10 “John Does” for violating US law and the acceptable use policy and code of conduct for its generative AI services; the complaint has now been amended to name and identify individuals.
A global network
This is an update to the previously filed lawsuit, in which Microsoft described discovering the abuse of Azure OpenAI service API keys, took a GitHub repository offline, and was permitted by the court to seize a domain linked to the operation.
“As part of our initial filing, the court issued a temporary restraining order and a preliminary injunction allowing Microsoft to seize a website instrumental to the criminal operation, effectively disrupting the group’s ability to operationalize its services.”
The group is organized into creators, suppliers, and users. The named defendants reportedly harvested credentials of registered customers from public sources (most likely exposed in data leaks) and used them to illegally access accounts with generative AI services.
“They then altered the capabilities of these services and resold access to other malicious actors, providing detailed instructions on how to generate harmful and illicit content, including non-consensual intimate images of celebrities and other sexually explicit content,” said Steven Masada, assistant general counsel at Microsoft’s DCU.