
Chat AI tech will eventually make Siri smarter, but not anytime soon [Opinion]

Chat AI tech is without a doubt the most exciting thing to have happened in the world of technology for quite some time – providing answers to questions, writing essays, generating working code, or even passing an MBA exam through a conversational UI.

Microsoft announced that a version of ChatGPT will power a conversational search feature within its Bing search engine. Google has been quick to announce that it, too, will use its own AI chatbot, Bard, for searches. Does all this mean that Apple too will climb aboard this next-generation chat AI tech to finally make Siri smarter … ?

What is chat AI tech?

Rather than me write this, I thought I’d ask the best-known system – ChatGPT – to do it for me. I asked it, “What is ChatGPT?” and here’s its answer:

ChatGPT is a language model developed by OpenAI.

It’s a state-of-the-art AI language model based on the transformer architecture that has been trained on a large corpus of text data. It can generate human-like text based on a given prompt, and has been used for various applications such as chatbots, language translation, question answering, and text completion.

The name “ChatGPT” stands for “Chat Generative Pre-trained Transformer”. The “pre-trained” part means that the model has been trained on a massive amount of text data before being released, allowing it to be fine-tuned for specific tasks with smaller amounts of task-specific data.

To which I’d add only that the generic name for this type of tech is Large Language Models, or LLMs.
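
To make the “given prompt, generated text” idea concrete, here’s a minimal sketch of asking the same question programmatically rather than through the chat UI. It assumes OpenAI’s Python client and an API key in your environment; the model name is illustrative only, and says nothing about what powers any particular product.

```python
# Minimal sketch: prompt in, generated text out.
# Assumes the OpenAI Python client (openai >= 1.0) and an OPENAI_API_KEY
# environment variable; the model name is illustrative only.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is ChatGPT?"}],
)
print(response.choices[0].message.content)
```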

Its first answer to the question, “What are the benefits of chat AI systems like ChatGPT?” was, for some reason, to focus in on customer service applications. It argued that, for many routine customer enquiries, a bot could realistically replace human operators. I then followed up by asking “What about beyond customer service?”

Yes, beyond customer service, Chat AI systems like ChatGPT can also be used for several other purposes, including:

  1. Personal Assistance: Chat AI systems can be used as virtual assistants for personal or business tasks, such as scheduling appointments or finding information.
  2. Sales and Marketing: Chat AI systems can be used to support sales and marketing efforts by engaging with potential customers and providing product information.
  3. Education and Training: Chat AI systems can be used to provide educational content and answer student questions, supplementing or replacing traditional methods.
  4. Healthcare: Chat AI systems can be used to provide medical information, diagnose symptoms, or connect patients with medical professionals.
  5. Financial Services: Chat AI systems can be used to provide financial advice, manage investments, or answer banking-related questions.
  6. Research and Development: Chat AI systems can be used to support research by answering questions, generating ideas, or assisting with data analysis.
  7. Entertainment: Chat AI systems can be used to provide entertainment, such as generating jokes or playing games.

This immediately demonstrates one of the key benefits of ChatGPT over Siri: it understands context. Siri occasionally does – for example, when I asked, “Where is Delhi?” it showed me the city on a map, and when I followed up with “What time is it there?” it did understand that I was still referring to Delhi, and gave the correct answer. But mostly if you ask a follow-up, Siri has no idea.

For example, when I asked, “How do I get to the Tower of London?” it pulled up Apple Maps and showed me a route. But when I followed up with, “What can I see there?” it had no idea what I was asking.
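
Under the hood, the reason chat models handle follow-ups is mundane: the client resends the earlier turns of the conversation alongside each new question, so a reference like “there” can be resolved against them. Here’s a sketch of the same Tower of London exchange, again assuming the OpenAI Python client purely for illustration:

```python
# Sketch: follow-ups work because the full conversation history is
# resent with every request. Assumes the OpenAI Python client (>= 1.0)
# with OPENAI_API_KEY set; the model name is illustrative only.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo"

history = [{"role": "user", "content": "How do I get to the Tower of London?"}]
first = client.chat.completions.create(model=MODEL, messages=history)
history.append({"role": "assistant", "content": first.choices[0].message.content})

# Because the earlier turns are included, "there" resolves to the Tower of London.
history.append({"role": "user", "content": "What can I see there?"})
second = client.chat.completions.create(model=MODEL, messages=history)
print(second.choices[0].message.content)
```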

Will Apple use chat AI tech to make Siri smarter?

The long-term answer is: Yes. It’s already clear that this technology is going to be the future of search, and that its capabilities make all current smart device AIs – Siri, Alexa, and Google Assistant alike – look as dumb as a box of bricks.

Apple also clearly has a keen interest in AI. It developed its own Neural Engine chip specifically for AI tasks. The iPhone’s photographic capabilities are also heavily driven by computational photography, which is AI-powered. And it has an annual in-house AI summit. It’s a safe bet that ChatGPT and Bard will be a key focus of next month’s event.

But we shouldn’t hold our breath for a smarter Siri anytime soon, for two reasons. First, Apple’s trademark approach to almost all new tech: wait and see. The company rarely tries to be early to market with any new technology. Instead, it watches what others do, then tries to figure out how to do it better.

But there’s a more specific second reason …

Chat AI tech is much dumber than it appears

Systems like ChatGPT are good at appearing to be smart. They write very convincingly, because they have been trained on millions of documents written by human beings, and they essentially borrow liberally from all they’ve seen to replicate everything from specific phrases to document structures.

But they also get all their apparent knowledge from those same millions of sources – and they have no way to tell truth from fiction; reasoned stance from prejudice; statistical evidence from bias; reputable data from junk.

If you ask ChatGPT to write a scientific paper, for example, it will happily do so. But it will pull in data from random sources, and, as Nature notes, make up non-existent references.

ChatGPT and its competitors work by learning the statistical patterns of language in enormous databases of online text — including any untruths, biases or outmoded knowledge. When LLMs are then given prompts (such as Greene and Pividori’s carefully structured requests to rewrite parts of manuscripts), they simply spit out, word by word, any way to continue the conversation that seems stylistically plausible.

The result is that LLMs easily produce errors and misleading information, particularly for technical topics that they might have had little data to train on. LLMs also can’t show the origins of their information; if asked to write an academic paper, they make up fictitious citations. “The tool cannot be trusted to get facts right or produce reliable references,” noted a January editorial on ChatGPT in the journal Nature Machine Intelligence.
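
It’s worth making that “word by word” point concrete. Stripped of all the scale, generation is a loop that samples the next token from a probability distribution over plausible continuations – and nothing in that loop checks the output against reality. A toy sketch, with a hard-coded “model” invented purely for illustration:

```python
import random

# Toy illustration of next-word sampling. The "model" is a hand-written
# table of continuation probabilities, standing in for a real trained LLM.
TOY_MODEL = {
    ("the", "first"): {"images": 0.5, "pictures": 0.3, "photographs": 0.2},
    ("first", "images"): {"of": 0.9, "from": 0.1},
    ("first", "pictures"): {"of": 0.9, "from": 0.1},
    ("first", "photographs"): {"of": 0.9, "from": 0.1},
    ("images", "of"): {"an": 0.6, "the": 0.4},
    ("pictures", "of"): {"an": 0.6, "the": 0.4},
    ("photographs", "of"): {"an": 0.6, "the": 0.4},
    ("of", "an"): {"exoplanet": 0.7, "asteroid": 0.3},
    ("of", "the"): {"exoplanet": 0.7, "asteroid": 0.3},
}

def next_word(context):
    # Sample a continuation for the last two words, or stop if none is known.
    dist = TOY_MODEL.get(tuple(context[-2:]), {"<end>": 1.0})
    words, probs = zip(*dist.items())
    return random.choices(words, weights=probs)[0]

text = ["the", "first"]
while text[-1] != "<end>" and len(text) < 12:
    text.append(next_word(text))

print(" ".join(w for w in text if w != "<end>"))
# The output reads fluently, but nothing here knows or cares whether it is true.
```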

A more amusing example is Google’s Bard giving the wrong answer to a question during a live demonstration designed to show off how impressive it is. It was asked, “What new discoveries from the JWST can I tell my 9 year old about?”

Bard answered that “JWST took the very first pictures of a planet outside our solar system”. Sadly for Google, the first direct images of an exoplanet actually came from the VLT in 2004, according to NASA, not from JWST as Bard claimed. This could well be down to incorrect reporting Bard had seen in its training data and treated as fact.

Google removed the video, and as of the time of writing its shares are down 6%. The error remains for all the world to see in a Google tweet.

So while both Bing and Google will incorporate chat AI systems into their search results, those systems won’t replace conventional results anytime soon.

Siri’s spoken responses pose far greater risks

If there’s one thing more dangerous than not knowing something important, it’s asking for the information and being given the wrong answer with great confidence.

When a Google search shows you conventional results next to a chat window answering the same question, it’s very easy for the company to include prominent warnings that the chat answer may not be accurate.

But Siri is designed to provide spoken answers to verbal questions. Even more annoying than Siri ‘answering’ a question with “Here’s what I found on the web,” would be “Here’s a lengthy answer which you first have to listen to, then I’ll note that it may not be correct, and recommend that you search the web.”

But it has huge potential for HomeKit

We may need to wait a while before Siri can use Apple’s equivalent of ChatGPT to answer questions, but one demo has already shown the potential for giving Siri the ability to better understand our needs in a smart home.

That could well be a sensible starting place for Apple to begin using the tech – and I very much hope it will do so. Especially as I recently had to rename a HomeKit scene called “I’m going to bed” because, after interpreting it correctly for a couple of weeks, and switching off living-room lights while switching on a bedroom one, Siri decided to instead switch off all the lights and unlock my front door, having apparently decided I was going out …
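
One plausible way to use the tech here – and this is purely my own hypothetical sketch, not anything Apple has announced – is to let the language model do only the interpretation, mapping a free-form request onto a fixed list of scenes the home actually has, so it can never improvise an action like unlocking the front door:

```python
# Hypothetical sketch: the LLM only picks from a fixed list of known
# HomeKit scenes, so it interprets intent without inventing new actions.
# Assumes the OpenAI Python client with OPENAI_API_KEY set; the scene
# names and model choice are illustrative only.
from openai import OpenAI

client = OpenAI()

SCENES = ["I'm going to bed", "Good morning", "Movie night", "Leaving home"]

def match_scene(utterance: str) -> str:
    prompt = (
        "Pick the single scene that best matches the request below.\n"
        f"Reply with exactly one of: {', '.join(SCENES)} - or NONE if nothing fits.\n"
        f"Request: {utterance}"
    )
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content.strip()
    # Anything outside the known list is rejected rather than acted on.
    return reply if reply in SCENES else "NONE"

print(match_scene("Time to turn in for the night"))  # hoped-for answer: I'm going to bed
```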

Visual: Aideal Hwa/Unsplash

