Be careful, AI fans: cybercriminals are using jailbroken AI tools like Grok to build powerful new malware




  • AI tools are more popular than ever, but so are the security risks
  • Mainstream tools are being leveraged by cybercriminals with malicious intent
  • Grok and Mixtral were found being used by criminals

New research has warned that top AI tools are powering ‘WormGPT’ variants, malicious generative AI tools that produce malicious code, craft social engineering attacks, and even provide hacking tutorials.

With large language models (LLMs) such as Mistral AI’s Mixtral and xAI’s Grok now in widespread use, Cato CTRL researchers discovered that these tools are not always being used in the way they were intended.
