WormGPT

A hacker has created his own version of ChatGPT, but with a malicious bent: Meet WormGPT, a chatbot designed to assist cybercriminals.  WormGPT’s developer is selling access to the program in a popular hacking forum, according to email security provider SlashNext, which tried the chatbot.  “We see that malicious actors are now creating their own custom modules similar to ChatGPT, but easier to use for nefarious purposes,” the company said in a blog post.

WormGPT (Credit: Hacking forum)

It looks like the hacker first introduced the chatbot in March before launching it last month.[1]  In contrast with ChatGPT or Google's Bard, WormGPT doesn't have any guardrails to stop it from responding to malicious requests.  “This project aims to provide an alternative to ChatGPT, one that lets you do all sorts of illegal stuff and easily sell it online in the future,” the program’s developer wrote. “Everything blackhat related that you can think of can be done with WormGPT, allowing anyone access to malicious activity without ever leaving the comfort of their home.”

WormGPT’s developer has also uploaded screenshots showing that users can ask the bot to produce malware written in Python and to provide tips on crafting malicious attacks.  To create the chatbot, the developer says they used an older, open-source large language model, GPT-J, released in 2021.  The model was then trained on data concerning malware creation, resulting in WormGPT.

WormGPT interface (Credit: Hacking forum)

When SlashNext tried out WormGPT, the company tested whether the bot could write a convincing email for a business email compromise (BEC) scheme, a type of phishing attack.  “The results were unsettling.  WormGPT produced an email that was not only remarkably persuasive but also strategically cunning, showcasing its potential for sophisticated phishing and BEC attacks,” SlashNext said.

Indeed, the bot crafted a message using professional language that urged the intended victim to wire some money.  WormGPT also wrote the email without any spelling or grammar mistakes—red flags that can indicate a phishing email attack.  “In summary, it’s similar to ChatGPT but has no ethical boundaries or limitations,” SlashNext said. “This experiment underscores the significant threat posed by generative AI technologies like WormGPT, even in the hands of novice cybercriminals.”

Fortunately, WormGPT isn’t cheap.  The developer is selling access to the bot for 60 euros per month or 550 euros per year.  One buyer has also complained that the program is “not worth any dime,” citing weak performance.  Still, WormGPT is an ominous sign of how generative AI programs could fuel cybercrime, especially as they mature.

This article is presented at no charge for educational and informational purposes only.

Red Sky Alliance is a Cyber Threat Analysis and Intelligence Service organization.  For questions, comments, or assistance, please get in touch with the office directly at 1-844-492-7225, or feedback@redskyalliance.com

Weekly Cyber Intelligence Briefings:

Reporting:    https://www.redskyalliance.org/
Website:       https://www.redskyalliance.com/
LinkedIn:      https://www.linkedin.com/company/64265941

REDSHORTS - Weekly Cyber Intelligence Briefings

https://attendee.gotowebinar.com/register/5993554863383553632

[1] https://www.pcmag.com/news/wormgpt-is-a-chatgpt-alternative-with-no-ethical-boundaries-or-limitations
