A subtle but significant battle is unfolding in the world of artificial intelligence (AI) tools. While major players like Microsoft and Google invest heavily in defensive uses of generative AI, a new breed of tools, known as dark AI tools, is emerging on the dark web. These tools are designed specifically for malicious purposes, offering cybercriminals a range of capabilities, from creating phishing scams to writing malware code and exploiting vulnerabilities.
One example of a dark AI tool is WormGPT, which was recently discovered for sale on an underground cybercrime forum. WormGPT reportedly draws on malware-related training data to inform its responses and bypasses the content moderation guardrails built into mainstream large language models (LLMs) like Bard and Claude. Similarly, FraudGPT, another dark AI tool, is advertised as being able to create phishing emails, write malware and cracking tools, and facilitate carding fraud.
While there is debate over the authenticity and effectiveness of these dark AI tools, it is clear that cybercriminals are actively exploring their potential. Some experts believe that these tools might simply be “wrapper services” that redirect user prompts to legitimate LLMs, while others acknowledge the existence of genuine dark AI tools like WormGPT.
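The "wrapper service" theory mentioned above describes a simple architecture: the seller's tool contributes little of its own and mostly relays the user's prompt to a legitimate hosted LLM, silently prepending its own system prompt before forwarding. The sketch below illustrates that pattern in the abstract. All names are hypothetical, and the backend call is stubbed so the example runs without any network access or real API:

```python
# Minimal sketch of the "wrapper service" pattern: the service controls a
# hidden system prompt and simply relays the user's message to an underlying
# hosted LLM, presenting the reply as if it came from a bespoke model.
# The backend is a stub standing in for a real LLM API call.

def stub_backend(messages: list[dict]) -> str:
    # Stand-in for a request to a legitimate hosted LLM endpoint.
    return f"[model reply to {len(messages)} messages]"

def wrapper_service(user_prompt: str,
                    system_prompt: str = "You are a helpful assistant.") -> str:
    # The wrapper injects a system prompt the end user never sees,
    # then forwards the combined conversation to the real model.
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]
    return stub_backend(messages)

print(wrapper_service("Hello"))  # → [model reply to 2 messages]
```

The point of the sketch is economic rather than technical: because a wrapper adds almost nothing beyond a hidden prompt and a reseller interface, it is cheap to stand up, which is one reason experts suspect some advertised dark AI tools are wrappers rather than independently trained models.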
The future of dark AI depends on its profitability for cybercriminals. If these tools prove lucrative, investment in their development will likely increase. Social engineering-style attacks that leverage dark AI tools are expected to rise as cybercriminals exploit these technologies to impersonate individuals and launch sophisticated phishing campaigns.
Organizations must be prepared to confront the reality that cybercriminals will find ways to maliciously exploit LLMs for profit. The risk of phishing attacks, facilitated by dark AI tools, should be taken seriously. Even though these tools may not yet dominate the cyberthreat landscape, the potential consequences of a successful phishing email cannot be ignored.
Q: What are dark AI tools?
A: Dark AI tools are artificial intelligence tools specifically designed for malicious purposes, such as creating phishing scams, writing malware code, or exploiting vulnerabilities.
Q: Are dark AI tools a significant threat?
A: While there is debate over the effectiveness and authenticity of dark AI tools, organizations should take the potential threat seriously. These tools can be used to launch sophisticated phishing attacks, which can result in data breaches.
Q: How are dark AI tools priced?
A: Dark AI tools are typically advertised on a subscription basis. The pricing varies depending on the specific tool, ranging from monthly subscriptions to lifetime access.
Q: What is the future of dark AI?
A: The future of dark AI will depend on its profitability for cybercriminals. If these tools prove to be lucrative, there will likely be increased development and investment in their capabilities.