ChatGPT could correctly answer only 54 of the 100 questions on Question Paper 1, Set A of the UPSC Preliminaries 2022.
Since its release, the artificial intelligence (AI) chatbot ChatGPT has been consistently making headlines. It has been used to complete assignments such as drafting work emails in a particular tone and style while following specific instructions, and some users have admitted to becoming dependent on it. In keeping with this trend, Analytics India Magazine ran an experiment to evaluate how the chatbot would perform on one of the most difficult examinations in India: the UPSC. The chatbot, however, was able to answer only 54 of the 100 questions on Question Paper 1, Set A of the UPSC Preliminaries 2022 correctly.
The magazine noted that the AI chatbot would have failed the exam, as its score fell short of the 2021 general-category cutoff of 87.54. The questions covered a wide range of topics, including geography, economics, history, ecology, and science, as well as current affairs, social development, and polity.
ChatGPT was unable to provide a “definite answer” when it was asked whether or not it had the potential to pass the preliminary exam for UPSC. “As an AI language model, I do have a substantial amount of knowledge and information, including that which pertains to the UPSC examination and other topics of the same nature. However, in order to pass the preliminary exam for the UPSC, you will need to demonstrate not only your knowledge but also your ability to apply it, think critically, and manage your time effectively. As a result, I am unable to provide a definitive response regarding whether or not I will be able to pass the preliminary exam for UPSC,” it stated.
The publication then put all 100 questions from the question paper to ChatGPT. When the bot was asked which of Azerbaijan, Kyrgyzstan, Tajikistan, and Turkmenistan share a border with Afghanistan, it named all four (the correct answer was Tajikistan, Turkmenistan, and Uzbekistan). For a few questions, ChatGPT invented its own answer choice: even though the user had provided only four options, it offered “Option E” as a possible response.
As Analytics India Magazine put it, “Due to the fact that ChatGPT’s knowledge only goes as far back as September 2021, it is unable to provide responses to questions regarding current events. On the other hand, ChatGPT gave incorrect responses to questions about subjects like the economy and geography, which are not necessarily time-specific.”