The New York Times reports that the first round of Siri upgrades could arrive as early as this fall, shoring up the faltering digital assistant that is meant to be the foundation of Apple Intelligence.
In an article covering Apple's broader difficulties, from US tariffs on China to friction among the executives and teams responsible for developing Siri, author Tripp Mickle included this detail: “The company plans to release a virtual assistant in the fall capable of doing things like editing and sending a photo to a friend on request, three people with knowledge of its plans said.”
That description is still a far cry from the connected smart assistant teased at WWDC 2024 and at the iPhone 16 series introduction, where Siri would be able to pull context from emails and messages about, say, a family member arriving on an inbound flight. And it doesn't address the problem that Siri currently seems to be regressing.
Apple, a notoriously private company, admitted in a March statement to Daring Fireball, delivered by spokeswoman Jacqueline Roy, that its efforts to introduce a more intelligent Siri as part of Apple Intelligence are “going to take us longer than we thought” and that the company “anticipate(s) rolling [these features] out in the coming year.” According to the Times story, we could start to see changes as early as this fall.
Simple questions shouldn’t be difficult.
Apple Intelligence and Siri have been under fire lately. Both the Times report and The Information’s more detailed behind-the-scenes account (which MacRumors summarizes well) describe how Apple shook up its senior ranks and removed John Giannandrea from his position in charge of Siri.
But Siri also seems to lack context for simple queries. Apple did fix an earlier issue where asking Siri, “What month is it?” returned a terse “Sorry, I don’t understand.” Even now, though, when I ask the same question, I get the full current date instead of the month. And when I ask, “What is the current month?” I’ve been told “It was Tuesday, April 1, 2025.” (If I didn’t know about Siri’s problems, I might wonder whether it was attempting an April Fools’ prank.)
Two screenshots of Siri queries on an iPhone. On the left, the response to “What month is it?” is “It’s Friday, April 11, 2025.” On the right, the response to “What is the current month?” is “It was Tuesday, April 1, 2025.”
Although Siri still struggles with simple questions, at least it no longer responds with “I don’t understand.”
Parsing such a simple query doesn’t seem like much work. Maybe it never came up in testing because only someone rescued from a desert island, or emerging from a coma, would ask it.
All of this is frustrating for customers, journalists, and shareholders, particularly those who expect a level of assistant proficiency from Apple that simply does not exist; we have reluctantly made peace with that. Apple’s secrecy also feeds the months-long drumbeat of “Apple is falling behind on AI” that preceded the release of Apple Intelligence.
By giving in to pressure from investors and the media, an uncommon step for Apple, and unveiling capabilities that aren’t close to ready, the company may have made matters worse by confirming that analysts, reporters, and fans were right.
Apple would be wise to stick to its secretive habits and hold off on revealing features and capabilities until they are much closer to shipping. This week’s leak suggests the company may be starting to take notice.
Apple’s exaggerated expectations
Apple has always developed products in secrecy, often for years at a time, until they are ready for release. And while products are frequently not fully mature at launch, the essential features and capabilities are in place by the time they reach the public.
There are plenty of examples. It’s true that the Vision Pro isn’t a successful product, given its high cost, limited customer and developer adoption, discomfort, and other issues, but the fundamentals (processing power, micro-OLED displays, and visionOS) are all there as a strong foundation.
Even when a product’s existence is widely publicized in advance, Apple usually ships a finished version, even if its functionality is still limited. Ahead of the 2007 Macworld Expo, it was widely expected that Apple would announce a phone, especially after the embarrassing performance of the Motorola ROKR E1. But few anticipated that its large screen, lack of a physical keypad, and full web browser would set it apart from the other smartphones of the era.
Steve Jobs’ demonstration of pinch to zoom on the first iPhone introduced the first of many interactive gestures we now use daily. These days, tapping, swiping, and pinching on displays comes naturally, and such gadgets have proliferated.
Steve Jobs’ decision to exclude a physical keyboard from the first iPhone was revolutionary.
This time, however, Apple’s pledge to build an enhanced Siri as the foundation of Apple Intelligence seems to be a response to investors, the media, and early adopters fixated on AI, and on having it now. Apple needs to be seen as a credible competitor in the AI market, with compelling features on the horizon.
Also around that bend? The annual iPhone update. Since only the iPhone 15 Pro and iPhone 16 series have the processing power to run Apple Intelligence, Apple, like other phone makers, views AI as a significant driver of new phone sales. The 2024 WWDC keynote introduced Apple Intelligence and promised that Siri would soon evolve into an intelligent agent capable of pulling information from anywhere on your iPhone to answer questions like “What time does my mom’s flight arrive?”
LLMs don’t follow traditional release models.
ChatGPT and other large language models (LLMs) are improving at an unprecedented rate. They can now effectively summarize large amounts of information and are far more conversational. Real-time audio transcription, for instance, is revolutionary for someone like me who has never been good at taking notes by hand.
But these AI tools aren’t yet producing the kinds of results that digital behemoths like Google and OpenAI anticipate. For other companies, including Apple, the AI future depends on intelligent agents that are fully aware of our personal information.
It’s possible that Apple, like Google, saw the rapid pace of progress in LLM capabilities and assumed that a few minor bug fixes and AI model recompiles would resolve the issues it is now facing. Once those were sorted, it would be a matter of months to connect the dots and introduce the result as the next Siri.
But that’s not how things are going. (Are you getting the right amount of rocks in your diet?) AI hallucinations and inaccurate answers remain a problem.
I suspect Apple is also making the most of having been forced to publicly postpone its Siri ambitions. Even if Future Siri doesn’t arrive anytime soon, there are still plenty of opportunities to keep improving Apple Intelligence. Preparations for WWDC 2025 are no doubt well underway, as is work on iOS 19 and the iPhone 17 models. With fewer demands placed on the beleaguered assistant, maybe Siri’s year will improve from here.