Beware of WormGPT: AI Tool Enables Cyber Attacks and Impersonation Scams

This tool's author proudly declares that it is a direct competitor of ChatGPT.

The widespread use of generative AI and AI chatbots has captivated millions of people, who rely on them for a variety of tasks, from problem solving to everyday conversation. As the availability of such tools grows, distinguishing between reliable and potentially hazardous AI tools becomes more difficult. Unfortunately, the same AI techniques can also be used to launch cyberattacks on unsuspecting users.

WormGPT

WormGPT, a tool similar to ChatGPT, has emerged as a significant concern. Its author proudly declares that it is a direct competitor of ChatGPT because it lacks any safeguards against misuse or the generation of unlawful content.

So, what exactly makes WormGPT so dangerous? Malicious actors can use it to facilitate attacks such as phishing and business email compromise (BEC). According to findings from SlashNext, the tool can generate convincing fake emails tailored to individual targets, considerably improving the likelihood of a successful attack.

By exploiting WormGPT's chat memory retention and code formatting features, attackers can easily craft sophisticated phishing emails that make their fraudulent messages more effective. Worse still, fraudsters need no substantial technical knowledge to use WormGPT for malicious purposes. Unlike ChatGPT, WormGPT imposes no constraints on what content it will produce, increasing the risk of cyberattacks and fueling a rise in scams and related crimes.

To guard against the misuse of such AI tools, ordinary individuals must remain vigilant when reading emails or chatting on social media platforms and elsewhere. The most effective defense is to avoid clicking on links or interacting with messages from unfamiliar senders.

It is worth noting that cybercriminals exploit more than text-based communication: they also use AI tools to create deepfake videos that impersonate people, such as friends or family members, in order to deceive victims and defraud them of their money, particularly on platforms like WhatsApp.