Russian hackers use ChatGPT AI to write malware


ChatGPT Artificial Intelligence Helps Hackers Write Malicious Code to Steal Personal Data

Russian hackers have managed not only to use ChatGPT's AI to write malicious code, but also to bypass the geofencing meant to keep them off the platform.

ChatGPT, a chatbot launched by the American company OpenAI to showcase advances in artificial intelligence (AI) research, has become known for its interactive, conversational style. Spend a little time with it, and the bot will help you write a college term paper, poems for a loved one, or even a short story.

However, according to Business Insider, Russian hackers have gone further and used the chatbot to write malicious code. ChatGPT is known to be able to help write program code, perhaps better than an entry-level programmer. That hackers might use it to write malicious code was certainly foreseen, but how quickly it happened came as a complete surprise to Western experts.

According to a report by the cybersecurity company Check Point Research (CPR), Russian cybercriminals have tried to circumvent OpenAI's restrictions aimed at preventing exactly this kind of use of the technology.

A conversational chatbot can help you write any piece of code, and it does not know whether your intentions for that code are good or bad. (Business Insider, Interesting Engineering)

Last month, a thread titled "ChatGPT - Benefits of Malware" appeared on a popular underground hacker forum. In it, the user describes in detail how he used the chatbot service to recreate malware strains and techniques that had been described in research publications, Business Insider writes in its report.

Earlier, another member of the same forum boasted of creating his first Python script, which other users noted resembled OpenAI's writing style. The author of the script later confirmed that he had used the OpenAI platform to write the code. This worries cybersecurity experts: even people with little programming skill can become potentially dangerous by using a chatbot as an assistant.

Source: https://chat.openai.com/chat
