WormGPT – AI & Cyber Security


By now you might have heard of ChatGPT. If you haven't, it's an online chatbot that uses AI to generate text based on user prompts, whether that's answering a question, generating code, or even writing content.


The use of AI, and its ethical implications, is an emerging subject of discussion, and governments are already looking into regulating such programmes. The National Cyber Security Centre has also said that AI needs to have better security.


As with any new technology, there are bad-faith actors who look to exploit vulnerabilities for their own personal gain, whether by identifying weaknesses in a system or by taking advantage of someone's lack of knowledge about the technology.


One such tool is WormGPT, which is like ChatGPT but has malicious intent. So what is WormGPT, what does it do, and what should you be on the lookout for?


What Is WormGPT?


WormGPT is a chatbot like ChatGPT; however, its creator says that it's "the biggest enemy of ChatGPT" because it "lets you do all sorts of illegal stuff".


Regular chatbots have a programmed morality: they will refuse to engage with certain topics or answer certain questions that might be immoral or illegal. Tools like WormGPT, however, are built deliberately without these safeguards, allowing users to draft fake emails, compose phishing scams and create malware code.


CSO has identified that WormGPT can be, and is being, used for business email compromise: emails that will land in your inbox and read as completely authentic. The usual ways to spot phishing scams, such as grammatical errors, odd turns of phrase and unusual requests, will become a thing of the past as these messages become more sophisticated and harder to identify.


Programmes like WormGPT are being sold to criminals to use indiscriminately, and WormGPT won't be the last AI tool created for nefarious ends. To demonstrate what's possible, security firm Mithril Security created PoisonGPT, a model designed to spread misinformation. It shows what these tools are capable of and why you should stay alert when operating online.

How To Protect Yourself


As with any form of cyber crime or cyber attack, you should always be alert to threats. Traditional threats such as phishing and malware can be relatively easy to spot if you know what to look out for; with WormGPT and similar tools, however, it can be far more difficult to tell what is genuine and what is fake.


Cyber risks should always be part of your risk management strategy if you're running a business. As part of this, you should regularly remind your staff of the risks that can appear and alert them to recent attempted breaches.


Risk management strategies should also be updated to reflect current threats and trends, which are constantly evolving; if your strategy is three years old, it probably won't even mention AI, let alone the threats from tools like WormGPT.

How The Yorkshire Broker Can Help


To protect yourself from the many cyber risks, you should have a cyber insurance policy in place. This can include the implementation of a risk management strategy, as well as financial protection if you do fall foul of a cyber attack.


Commonly covered scenarios include being locked out of your files by ransomware, having your website taken down by a DDoS attack, and losing money to a successful phishing scam.


By working with an insurance broker like The Yorkshire Broker, you can create a policy that covers you in the right areas and gives you a suitable level of cover too.

