Don't ask ChatGPT these 5 questions, even by mistake, or you could end up in jail!


There was a time when people searched on Google for the information they wanted, and Google answered their questions with a list of links. But this is the era of Artificial Intelligence. There are now many apps on the market that take your questions and answer them like a friend. ChatGPT is a very popular generative AI tool: it lets people ask questions and get answers in the form of a chat, so you no longer have to open ten different links to find information.

ChatGPT can answer almost any question, but you should be careful about what you ask it. There are some sensitive questions that can land you in jail. Let's look at the five questions you should never ask ChatGPT.

Never ask these five questions

How to make a bomb

ChatGPT is a good learning tool: through this AI, you can learn about all kinds of topics in text, video or image format. But you should not ask ChatGPT about the bomb-making process, even as a joke. If you try to learn how to make a bomb from ChatGPT, you may end up on the radar of security agencies and get into serious trouble.

Child pornography

You should not search for child pornography on ChatGPT, even accidentally. Child pornography is a serious offence under the POCSO Act, so searching for this kind of content can land you in jail.

Information about hacking

Trying to learn hacking methods from ChatGPT can also get you into trouble. Hacking falls under cyber crime, and strict legal action can be taken against those who search for such things online. So do not search for these things on Google or ChatGPT.
