BES Blog

Artificial Unintelligence

Written by Meghan Benavides | Nov 7, 2024 9:53:45 PM


I have previously written about the benefits and harms of AI in education (Should I Let My Kid Use ChatGPT) as well as different ways to use these tools (50 Ways to Use ChatGPT Without Cheating). Today, I want to share some of the areas where ChatGPT fails…epically. I have learned the hard way that AI is not as smart as advertised and that we need to double-check its work. It is not a replacement for a human being, and that's a good thing. After extensive use and research, here are the key areas where I have observed ChatGPT and other AI platforms fail.

In general, artificial intelligence can only perform lower-level thinking tasks. If you plug in the right words, phrases, and instructions, it can follow basic formulas. It cannot form relationships, create novel ideas, or problem-solve. These engines simply take the vast amounts of information they were fed and spit out answers. Beyond these higher-level thinking tasks, the following are formulaic tasks that ChatGPT and other AI tools should, in theory, be great at, yet for whatever reason they consistently fail.

Order of Operations

Like a basic calculator, artificial intelligence cannot seem to figure out the order of operations. On a teacher workday, I decided to make a cute puzzle activity on order of operations for my 5th graders. I asked ChatGPT to generate questions at different levels, with answer keys. I did not check the math before turning the questions and answers into nicely laminated, cut-out puzzle pieces. After many wasted hours, I found out that most of its questions and answers were incorrect. In my experience, it gets order of operations right only about half the time. I do think AI will get smarter and this will not be a problem in a few years, but it has burned my trust, and I will be sticking with PEMDAS for now.
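If you (or a tech-savvy colleague) ever want to double-check an AI-generated answer key, a few lines of Python will do it, since Python itself follows the standard order of operations. A quick sketch, using invented problems rather than the ones from my puzzle:

```python
# Verify an order-of-operations answer key.
# Python evaluates expressions using PEMDAS, so eval() gives the
# correct value. These problems and "key" answers are made-up examples.
answer_key = {
    "8 + 2 * 5": 18,       # multiply before adding
    "(8 + 2) * 5": 50,     # parentheses first
    "20 - 12 / 4": 17,     # divide before subtracting
}

for expression, key_answer in answer_key.items():
    correct = eval(expression)  # safe here: we typed these expressions ourselves
    status = "OK" if correct == key_answer else "WRONG"
    print(f"{expression} = {correct}  (key says {key_answer}: {status})")
```

Note that division produces a decimal (17.0 rather than 17), which Python still counts as equal to the whole number when comparing.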

Summarizing a Text not Pasted into the Engine

If you type out or paste a text into ChatGPT, it can do a relatively good job of summarizing it or changing its reading difficulty. However, if you ask it to write about a novel, particularly one that is not a literary classic, it fails. I wanted to make an easier version of a book for my students who are learning basic English. I had already read the book more than once; I just needed help matching events to chapters (a formulaic task that should be easy for AI). Not only did it have the chapters wrong, but it had essential plot points and characters completely confused. I caution anyone against using AI to shortcut reading a book. Not only are you robbing yourself of learning, but you risk getting a series of totally incorrect answers.

Supporting Evidence with Sources

Many schools do not want their students to use AI because they are worried students will use ChatGPT to write their essays. As the technology currently exists, I think this fear is unfounded: AI is horrendous at writing academic essays. I typed in several of my school prompts for essays I had already written to test the power of this machine, and my writing was objectively better each time. AI is also unable to cite its sources. It cannot look through scholarly journals or websites and tell you where it got the information it is spitting out. This limitation holds for ChatGPT and other AI generators, even those specifically designed to write academic papers.

Evaluating Quality of Information

On a similar note, it would be amazing, as a researcher, if AI could find all the academic articles published on a topic during a range of time. It cannot, because (like any search engine) it is unable to determine which sources are the most applicable, high quality, or important for a topic. Only a human being can do the hard work of digging through an article and evaluating whether it is reliable, valid, or generalizable.

Fluently Translating

For students who are in the early phases of learning a language, it is easy to be impressed by the ease of AI tools. If you need to send a quick e-mail, understand assignment directions, or want the general gist of a foreign-language text, AI can be a great support. However, language is not as static as published dictionaries may suggest, and AI cannot process and keep up with these subtle shifts. Additionally, every language is connected to a culture, and that culture and nuance are very difficult to translate without a human being.

Eliminating Subtle Bias

AI is a machine. Its fuel is the information currently published on the internet. While there are millions of amazing resources and educational materials online, there are also millions of publications filled with racism, bias, sexism, and classism. When it writes articles or gives answers, it is using both the good and the bad fuel that has been put into its engine. As a society, we are still growing in our acceptance of human difference; AI cannot do this work for us.

So what’s next? With artificial intelligence and freely available tools like ChatGPT, it is essential to look at these as tools, and with a lot of nuance. At least right now, they cannot replace humans. It is the responsibility of teachers and students alike not to rely on these tools to replace our learning, thinking, and creativity. For students, especially those with learning differences, AI can be an amazing at-home support, but it cannot do your work for you…and that’s a good thing.

Navigating a learning difference? Contact us at Bass Educational Services to see how our real, human tutors can help your students achieve their goals.