
ChatGPT & its AI chatbot cousins ruled 2023: 4 essential reads that puncture the hype

Within four months of ChatGPT’s launch on Nov. 30, 2022, most Americans had heard of the AI chatbot. Hype about – and fear of – the technology was at a fever pitch for much of 2023.

OpenAI’s ChatGPT, Google’s Bard, Anthropic’s Claude and Microsoft’s Copilot are among the chatbots powered by large language models to provide uncannily humanlike conversations. The experience of interacting with one of these chatbots, combined with Silicon Valley spin, can leave the impression that these technical marvels are conscious entities.

But the reality is considerably less magical or glamorous. The Conversation published several articles in 2023 dispelling key misperceptions about this latest generation of AI chatbots: that they know something about the world, that they can make decisions, that they can replace search engines, and that they operate independently of humans.

1. Bodiless know-nothings

Large-language-model-based chatbots seem to know a lot. You can ask them questions, and more often than not they answer correctly. Despite the occasional comically incorrect answer, the chatbots can interact with you in much the same way that other people do, people who share your experience of being a living, breathing human being.

But these chatbots are sophisticated statistical machines that are extremely good at predicting the best sequence of words to respond with. Their “knowledge” of the world is actually human knowledge as reflected through the massive amount of human-generated text the chatbots’ underlying models are trained on.
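The "predicting the best sequence of words" idea can be illustrated with a deliberately tiny sketch. Real large language models use neural networks over billions of parameters, but a toy bigram counter over a made-up corpus (all names and the corpus below are illustrative, not from the article) shows the same basic move: pick the statistically most likely continuation.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the massive amount of human-generated text
# real models are trained on.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which: a crude bigram "language model".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent next word seen in training."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat": the most common continuation in the corpus
```

The model "knows" nothing about cats or mats; it only reflects patterns in the text it counted, which is the point the researchers are making about chatbots' "knowledge" of the world.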

Arizona State psychology researcher Arthur Glenberg and University of California, San Diego cognitive scientist Cameron Robert Jones explain how people’s knowledge of the world depends as much on their bodies as their brains. “People’s understanding of a term like ‘paper sandwich wrapper,’ for example, includes the wrapper’s appearance, its feel, its weight and, consequently, how we can use it: for wrapping a sandwich,” they explained.

This knowledge means people also intuitively know other ways of making use of a sandwich wrapper, such as an improvised means of covering your head in the rain. Not so with AI chatbots. “People understand how to make use of stuff in ways that are not captured in language-use statistics,” they wrote.

2. Lack of judgment

ChatGPT and its cousins can also give the impression of having cognitive abilities – like understanding the concept of negation or making rational decisions – thanks to all the human language they’ve ingested. This impression has led cognitive scientists to test these AI chatbots to assess how they compare to humans in various ways.

University of Southern California AI researcher Mayank Kejriwal tested the large language models' understanding of expected gain, a measure of how well someone understands the stakes in a betting scenario. He found that the models bet randomly.


“This is the case even when we give it a trick question like: If you toss a coin and it comes up heads, you win a diamond; if it comes up tails, you lose a car. Which would you take? The correct answer is heads, but the AI models chose tails about half the time,” he wrote.
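The trick question above comes down to a one-line expected-gain calculation. A quick sketch, using illustrative dollar values that are assumptions rather than anything from the article, shows why heads is the only rational pick:

```python
def expected_gain(p: float, payoff: float) -> float:
    """Probability-weighted payoff of taking one side of a coin toss."""
    return p * payoff

# Illustrative values (assumptions, not from the article):
DIAMOND = 5_000    # win a diamond if heads comes up
CAR = -30_000      # lose a car if tails comes up

heads = expected_gain(0.5, DIAMOND)   # 2500.0
tails = expected_gain(0.5, CAR)       # -15000.0

print("heads" if heads > tails else "tails")  # heads
```

Any positive payoff beats any loss here regardless of the exact values, which is why a model that picks tails about half the time is effectively betting at random.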

3. Summaries, not results

While it might not be surprising that AI chatbots aren’t as humanlike as they can seem, they’re not necessarily digital superstars either. For instance, ChatGPT and the like are increasingly used in place of search engines to answer queries. The results are mixed.

University of Washington information scientist Chirag Shah explains that large language models perform well as information summarizers, combining key information from multiple search engine results into a single block of text. But this is a double-edged sword: it is useful for getting the gist of a topic, assuming no "hallucinations," but it leaves searchers with no idea where the information came from and robs them of the serendipity of stumbling across something unexpected.

“The problem is that even when these systems are wrong only 10% of the time, you don’t know which 10%,” Shah wrote. “That’s because these systems lack transparency – they don’t reveal what data they are trained on, what sources they have used to come up with answers or how those responses are generated.”

4. Not 100% artificial

Perhaps the most pernicious misperception about AI chatbots is that because they are built on artificial intelligence technology, they are highly automated. While you might be aware that large language models are trained on text produced by humans, you might not be aware of the thousands of workers – and millions of users – continuously honing the models, teaching them to weed out harmful responses and other unwanted behavior.

Georgia Tech sociologist John P. Nelson pulled back the curtain on the big tech companies to show that they rely on workers, typically in the Global South, and on feedback from users to teach the models which responses are good and which are bad.

“There are many, many human workers hidden behind the screen, and they will always be needed if the model is to continue improving or to expand its content coverage,” he wrote.


Written by Eric Smalley, Science + Technology Editor, The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.


