Wait — ChatGPT Is More Dangerous Than You Think

jtpotato
2 min read · Feb 24, 2023
Photo by Rolf van Root on Unsplash

With the recent rise of ChatGPT, people have discovered many potential applications for text-generation AI. However, the same technology also poses serious dangers that we need to be aware of.

ChatGPT's natural-language abilities are impressive enough to produce convincing output even when the content is completely factually incorrect. That makes it well suited to generating fake news articles that spread false or biased information about current events. Text-generation AI can also be used to create fake reviews that influence consumers' decisions, fake social media posts that sway public opinion, fake emails that lure people into phishing scams, and fake texts that impersonate someone else.

I recently asked Bing’s chat assistant (built on ChatGPT) about a medical condition, and it effectively played the role of a doctor, asking follow-up questions to gather a cohesive list of symptoms it could use for a diagnosis. It was scary, because for a moment it convinced me. When I fact-checked its responses they turned out to be correct, but the thought that has haunted me ever since is: what if it had got it wrong, and what if I had genuinely been depending on it?

Ideally, future technology will let us automatically verify an AI’s outputs. Microsoft may be onto something by wiring Bing search into its ChatGPT-based assistant, allowing it to check and cite sources when generating a response. It’s not perfect, but this technology is still very new, and improvements are to be expected.
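To make the idea a little more concrete, here is a deliberately naive Python sketch of what “checking output against sources” could look like: compare each generated claim against retrieved passages and flag anything that no source appears to support. The function names and example text are made up for illustration; real systems (including Bing’s) are far more sophisticated than this word-overlap toy.

```python
# Toy illustration only (not how Bing actually works): flag generated sentences
# that share few words with any retrieved source passage, so a human knows
# which claims still need manual fact-checking.

def support_score(claim: str, source: str) -> float:
    """Fraction of the claim's words that also appear in the source."""
    claim_words = set(claim.lower().split())
    source_words = set(source.lower().split())
    if not claim_words:
        return 0.0
    return len(claim_words & source_words) / len(claim_words)

def flag_unsupported(claims: list[str], sources: list[str], threshold: float = 0.5) -> list[str]:
    """Return the claims whose best-matching source stays below the threshold."""
    return [
        claim for claim in claims
        if max((support_score(claim, s) for s in sources), default=0.0) < threshold
    ]

if __name__ == "__main__":
    generated = [
        "Paracetamol overdose can cause liver damage.",
        "The condition was first described on Mars in 1820.",
    ]
    retrieved = ["Taking too much paracetamol is a common cause of acute liver damage."]
    for claim in flag_unsupported(generated, retrieved):
        print("Needs manual fact-checking:", claim)
```

Even this crude check catches the obviously unsupported second claim; the hard part, and the open research problem, is doing the same reliably for claims that merely sound plausible.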

For now, text-generation AI is a powerful technology that has many benefits but also many risks. We need to be careful about how we use it and how we evaluate its output. We need to develop standards and regulations for text-generation AI to ensure its quality and accuracy. And we also need to educate ourselves and others about the potential dangers of text-generation AI and how to avoid them.

This article was (rather ironically) written with the assistance of text-generation AI.
