ChatGPT and other AI chatbots have fundamentally changed the way many of us plan, research and get things done. You might use one to plan your next trip, organize your life, build a grocery list for the week or even explore what jobs you might like to do. But ChatGPT isn't flawless.
I like ChatGPT, but I know its limits, and you should too, no matter how much experience you have with it. It's fun for trying out new recipes, learning a new language or planning a vacation, but you shouldn't let ChatGPT run your life. It isn't great at everything; in fact, it can be downright bad at plenty of things.
ChatGPT doesn't always have the most up-to-date information, and it can present information that isn't real as if it were fact. It sounds confident even when it's wrong. The same goes for other types of generative AI, of course.
That matters much more when the stakes are high, such as when bank balances, court dates, medical bills or taxes are involved. If you're unsure when using ChatGPT might be risky, here are 11 situations where you should shut it down and find another way to get things done. Don't use ChatGPT for any of the following.
(Disclosure: In April, Ziff Davis, CNET's parent company, filed a lawsuit against OpenAI, the maker of ChatGPT, alleging that OpenAI infringed Ziff Davis copyrights in training and operating its AI systems.)
1. Diagnosing a physical health problem
I've entered my symptoms into ChatGPT out of curiosity, and the answers it gives can read like your worst nightmare. As you work through the possible diagnoses, you can swing from the flu and dehydration all the way to cancer. I told ChatGPT I had a lump on my chest, and it suggested I might have cancer. In fact, I have a lipoma, a benign growth that occurs in about one in every 1,000 people. That's what my licensed doctor told me.
That's not to say ChatGPT can't help with your health in sensible ways: it can help you draft a list of your symptoms, translate medical jargon and come up with questions for your next appointment so you arrive prepared. That could make a visit to the doctor feel less intimidating. But AI can't order labs or examine you, and it doesn't carry malpractice insurance. Know its limits.
2. Taking care of your mental health
ChatGPT can suggest grounding techniques, but it can't pick up the phone when you're in a serious mental health crisis. I know some people use ChatGPT as a substitute for a regular therapist. CNET's Corin Cesaric found it somewhat helpful for working through grief, as long as she kept its limitations in mind. But as someone who has a real, human therapist, I can tell you that ChatGPT is a pale imitation at best and genuinely harmful at worst.
ChatGPT doesn't truly care about you, hasn't lived a life and can't read your tone or body language; it can only mimic them. A licensed therapist operates under professional codes and legal obligations designed to protect you from harm. ChatGPT doesn't. Its advice can misfire, miss warning signs or unintentionally reinforce biases baked into its training data. Leave the hard, messy, deeply human work to trained humans. If you or someone you love is in a crisis, please contact your local hotline, or dial or text 988 in the US.
3. Making quick decisions concerning safety
If your carbon monoxide alarm starts going off, please don't open ChatGPT and ask whether you're really in danger. Get outside first and ask questions later. A large language model can't smell gas, see smoke or dispatch an emergency crew, and every second you spend typing in an emergency is a second you could be using to evacuate or call 911. ChatGPT can only work with the scraps of information you feed it, and in an emergency that may be too little, too late. Treat your chatbot as a tool for explaining what happened after the fact, never as a first responder.
4. Getting personalized financial or tax planning
ChatGPT can explain what an ETF is, but it doesn't know your debt-to-income ratio, state tax bracket, filing status, deductions, retirement goals or risk tolerance. And because its training data may not include the latest rate hikes or the current tax year, its advice can already be stale by the time you hit enter.
I have friends who dump their 1099 totals into ChatGPT for a do-it-yourself return. A chatbot simply can't replace a CPA who can catch a mistake that might cost you thousands of dollars, or spot a hidden deduction that could save you hundreds. When real money, filing deadlines and IRS penalties are on the line, call a professional, not AI. Also remember that anything you share with an AI chatbot, such as your income, Social Security number or bank routing number, will probably be used to train it.
5. Dealing with private or controlled information
As a tech journalist, I receive embargoed press releases in my inbox every day, but I've never been tempted to paste one into ChatGPT for a summary or analysis. If I did, that text would leave my control and end up on a third-party server, putting me in breach of my nondisclosure agreement.
The same risk applies to client contracts, medical records and anything else covered by GDPR, HIPAA, the California Consumer Privacy Act or plain old trade-secret law. It applies to your income taxes, birth certificate, driver's license and passport, too. Once sensitive data enters the prompt window, you can't be sure where it's stored, who inside the company can see it or whether it will be used to train future models. ChatGPT is also a target for security threats and hackers. If you wouldn't post something in a public Slack channel, don't paste it into ChatGPT.
6. Doing anything illegal
This one should be obvious. Using ChatGPT for any illegal purpose, such as hacking, fraud or identity theft, violates OpenAI's usage policies, never mind the law.
7. Copying someone else’s work in schoolwork
I'd be lying if I said I'd never cheated on an exam. In high school, I used my first-generation iPod Touch to sneak a peek at some tricky formulas I couldn't remember in AP calculus, and I'm not proud of it. But the scale of AI cheating today makes that look quaint.
Professors can already hear the "ChatGPT voice" from a mile away, and Turnitin and similar detectors are getting better at spotting AI-generated prose every semester (thanks for killing my beloved em dash). Suspension, expulsion and revoked licenses are all real risks. ChatGPT works far better as a study buddy than a ghostwriter. Besides, if you let ChatGPT do the work for you, you're cheating yourself out of an education.
8. Keeping track of information and breaking news
OpenAI rolled out ChatGPT Search to everyone in February 2025, and it can now pull up fresh web pages, stock quotes, gas prices, sports scores and other real-time numbers the moment you ask, complete with clickable citations so you can verify the source. Still, it won't stream continuous updates on its own; every refresh requires a new prompt. When speed matters, live data feeds, official press releases, news sites, push alerts and streaming coverage remain your best bet.
9. Making a will or another legally binding agreement
ChatGPT is great at explaining basic concepts; if you have questions about a revocable living trust, go ahead and ask. But the moment you ask it to draft actual legal text, you're rolling the dice. Estate and family law rules vary by state, and sometimes by county. Skip a witness signature or leave out the notarization clause, and the whole document could be thrown out. Let ChatGPT help you build a list of questions for your lawyer, then pay that lawyer to turn those answers into a document that will hold up in court.
10. Making art
This is my opinion, not a fact: I don't believe AI should be used to create art. I'm not anti-AI by any means. I use ChatGPT to brainstorm ideas and help with my headlines, but that's augmentation, not replacement. Use ChatGPT if you like, but please don't use it to make art that you then pass off as your own. That's kind of gross.