The Homebrew Computer Club was a legendary early computer hobbyist group, founded in March 1975 in Menlo Park, California, in the heart of Silicon Valley. The club served as a gathering place for electronics enthusiasts to share knowledge about building computers at home. It was started two months after news broke of the MITS Altair 8800, the first commercially successful microcomputer kit (the same news that spurred the founding of Microsoft). The following year, in July 1976, an early prototype of the Apple I computer was demonstrated to the club by Steve Wozniak, a self-educated electronics engineer.
Wozniak's hand-built computer (later named the Apple I) was far more advanced than the Altair 8800. Unlike the Altair, which used switches and lights like many early computers, the Apple I had a keyboard interface and could connect to a TV as a display, a major innovation. Steve Jobs was present at the meeting and urged Wozniak to turn it into a sellable product. The Apple I's design, with a separate keyboard, display, input/output, and fewer chips, became the template for future PCs. Units were hand-soldered in Wozniak's apartment and assembled in Jobs's garage. Only about 200 units were made, of which about 70 are known to survive. If you are lucky enough to have one in your attic or garage, it could be worth a million dollars or more.
The commercial success of the Apple I led to the design and production of the Apple II, a machine so successful that over six million units were sold during a 16-year production run, one of the longest product lifetimes in the history of computing. The "computer for everyone" catapulted Apple to the pinnacle of the computer industry. Today, Apple Inc. is among the world's most valuable companies and is counted as one of the "Magnificent Seven" tech stocks.
Apple has been involved in artificial intelligence (AI) and machine learning (ML) for years, but its public push into AI became prominent in the 2010s and has accelerated sharply in recent years. Here's a breakdown of Apple's AI journey:
Apple's purchase of Siri Inc. on April 28, 2010, marked its first major push into AI-powered consumer technology. Spun out of SRI International and co-founded by Dag Kittlaus, Adam Cheyer, and Tom Gruber, Siri initially launched in February 2010 as a standalone iOS app that could book taxis, reserve tables, and answer questions using AI. Steve Jobs saw Siri not just as a search tool but as a "do-engine" that could integrate deeply into iOS. Apple's competitors were Google, with Android's Voice Search, and Microsoft, with Cortana then in development.
Siri was integrated into the iPhone 4S after being announced on October 4, 2011, as a flagship feature of iOS 5. It was marketed as an intelligent assistant, not just a voice recognition tool. Siri's key innovations were its natural language processing (NLP), third-party app control, and voice personality. Siri understood natural phrasing: "Wake me up at 7 AM" would set an alarm. Siri could send messages, set reminders, and book services through Uber, OpenTable, and others. Siri featured a conversational, witty tone, which was later toned down following user feedback.
Siri was not without its controversies and challenges. There was no on-device processing, so Siri required internet connectivity. Siri often struggled with accents and complex queries. Apple's closed ecosystem removed third-party integrations like Uber and Yelp in favor of Apple's own services. Finally, critics argued Siri fell behind Google Assistant and Alexa due to slower AI improvements.
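The core of what early Siri did can be pictured as intent parsing: mapping a free-form utterance like "Wake me up at 7 AM" to a structured command with slots. The toy rule-based sketch below is purely illustrative (the function names and rules are invented; Apple's actual system used statistical NLP, not regular expressions):

```python
import re

def parse_intent(utterance: str):
    """Map a free-form utterance to an (intent, slots) pair.
    A toy rule-based sketch; real assistants use statistical/neural NLP."""
    text = utterance.lower().strip()
    # Alarm: "wake me up at 7 am" -> set_alarm with hour/minute slots.
    m = re.search(r"wake me up at (\d{1,2})(?::(\d{2}))?\s*(am|pm)?", text)
    if m:
        hour = int(m.group(1))
        minute = int(m.group(2) or 0)
        if m.group(3) == "pm" and hour != 12:
            hour += 12  # convert to 24-hour time
        return ("set_alarm", {"hour": hour, "minute": minute})
    # Reminder: "remind me to buy milk" -> set_reminder with a task slot.
    m = re.search(r"remind me to (.+)", text)
    if m:
        return ("set_reminder", {"task": m.group(1)})
    return ("unknown", {})
```

For example, `parse_intent("Wake me up at 7 AM")` yields `("set_alarm", {"hour": 7, "minute": 0})`, which a downstream handler could pass to the clock app. The hard part, and the part that made Siri notable, is doing this robustly for open-ended phrasing, accents, and context rather than a fixed rule list.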
In the 2010s, Apple quietly built an AI powerhouse, embedding machine learning across iOS in ways most users never noticed. This feat was accomplished by prioritizing on-device intelligence, privacy, and seamless integration. Siri was initially cloud-dependent but quickly gained voice pattern learning, recognizing the now-common command "Hey Siri". In 2014, with iOS 8, Apple introduced predictive text using n-gram language models to suggest next words. The following year, with iOS 9, Apple added face recognition to Photos, underpinned by deep learning for facial clustering. Differential Privacy, introduced in 2016 with iOS 10, aggregated user data anonymously to improve features like QuickType.
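The n-gram idea behind early predictive text is simple: count which word most often follows each pair of words in a corpus, then suggest that word as you type. A minimal trigram sketch (the corpus, function names, and tokenization are invented for illustration; production keyboards use much larger models with smoothing):

```python
from collections import Counter, defaultdict

def train_trigram(corpus: str):
    """Count which word follows each (w1, w2) pair in the corpus."""
    words = corpus.lower().split()
    counts = defaultdict(Counter)
    for w1, w2, w3 in zip(words, words[1:], words[2:]):
        counts[(w1, w2)][w3] += 1
    return counts

def suggest_next(counts, w1: str, w2: str):
    """Suggest the most frequent continuation of (w1, w2), or None."""
    following = counts.get((w1.lower(), w2.lower()))
    return following.most_common(1)[0][0] if following else None

# Tiny illustrative corpus; a real keyboard trains on vastly more text.
corpus = ("i am running late . i am on my way . "
          "i am running late again . see you on my way home .")
model = train_trigram(corpus)
```

With this toy corpus, `suggest_next(model, "i", "am")` returns "running", because "running" follows "i am" more often than "on" does. Real predictive text adds smoothing for unseen pairs and personalizes counts on-device, which is where the privacy angle comes in.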
The Neural Engine era began in 2017 with the A11 Bionic chip, which included the Neural Engine, a dedicated machine learning processor powering Face ID (3D facial mapping) and Animoji (real-time facial tracking). It paved the way for the M-series chips' even stronger machine learning acceleration. Core ML, Apple's framework for on-device machine learning, debuted the same year and is now the backbone of Apple Intelligence. ARKit, Apple's augmented reality framework released in 2017, used machine learning for object detection and plane tracking. Siri Shortcuts arrived in 2018 with iOS 12, providing on-device habit learning, such as suggesting "it's time to go to work".
On-device speech recognition arrived in 2019 with iOS 13, moving voice-to-text processing entirely offline. Also in 2019, object recognition in Photos auto-organized pictures by people, places, and things (e.g., "dogs," "beaches"). Over the same period, Apple steadily shifted Siri processing from the cloud to on-device AI for privacy reasons.
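The differential-privacy aggregation mentioned above rests on ideas like randomized response: each device randomly perturbs its answer before reporting, so any individual report is deniable, yet the aggregate frequency can still be recovered. The sketch below shows the classic coin-flip version (a simplification; Apple's deployed system uses more elaborate local-DP algorithms such as count-mean sketches):

```python
import random

def report(true_bit: int, rng: random.Random) -> int:
    """Randomized response: with probability 1/2 tell the truth,
    otherwise report a fair coin flip. Each single report is deniable."""
    if rng.random() < 0.5:
        return true_bit
    return rng.randint(0, 1)

def estimate_true_rate(reports) -> float:
    """Invert the noise: E[report] = 0.5*t + 0.25, so t = 2*mean - 0.5."""
    mean = sum(reports) / len(reports)
    return 2 * mean - 0.5

rng = random.Random(42)
# Simulate 100,000 users, 30% of whom truly have the attribute.
true_bits = [1 if rng.random() < 0.3 else 0 for _ in range(100_000)]
reports = [report(b, rng) for b in true_bits]
estimate = estimate_true_rate(reports)  # close to the true 30% rate
```

No single report reveals a user's true bit, but across 100,000 users the estimator lands near 30%. This is the trade Apple made: per-user noise in exchange for useful population-level statistics for features like QuickType.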
Apple's M1 and M2 chips revolutionized performance with integrated machine learning. The shift from Intel to ARM-based M-series silicon marked a huge change in personal computing, combining raw power, energy efficiency, and deep machine learning integration. Apple Silicon Macs with M-series chips included enhanced Neural Engines, improving AI performance in apps like Photos, Final Cut Pro, and Xcode.
AI-powered features expanded across the lineup during this period. Apple also acquired AI startups like Xnor.ai, Voysis, and AI Music, and published influential AI research papers.
In 2020, Apple acquired Xnor.ai, a company focused on edge AI and on-device machine learning. Its low-power, edge-based technology runs machine learning models on devices without any cloud dependency. Xnor.ai specialized in binary neural networks, which are efficient on mobile chips. The technology improved object recognition in Photos, enhanced on-device Siri processing, and powered Live Text, the recognition of text in images and videos.
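The efficiency of binary neural networks comes from a hardware-friendly trick: when weights and activations are constrained to ±1, a dot product collapses into an XNOR followed by a popcount on packed bit vectors, replacing costly multiply-accumulate units. A minimal sketch of that equivalence (illustrative only, not Xnor.ai's code):

```python
def binary_dot(a_bits: int, w_bits: int, n: int) -> int:
    """Dot product of two length-n ±1 vectors packed as integers
    (bit 1 = +1, bit 0 = -1), computed via XNOR + popcount.
    matches = popcount(~(a ^ w)); dot = matches - (n - matches)."""
    matches = bin(~(a_bits ^ w_bits) & ((1 << n) - 1)).count("1")
    return 2 * matches - n

# Sanity check against the straightforward ±1 dot product.
a = [+1, -1, +1, +1]   # packed LSB-first as 0b1101
w = [+1, +1, -1, +1]   # packed LSB-first as 0b1011
assert binary_dot(0b1101, 0b1011, 4) == sum(x * y for x, y in zip(a, w))
```

Because 32 or 64 of these ±1 products fit in a single machine word, one XNOR plus one popcount instruction replaces dozens of multiplications, which is why this style of network suits low-power mobile silicon.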
Apple also bought Voysis in 2020, adding natural language processing capability for Siri. Voysis pioneered voice AI and natural language understanding (NLU); its technology enabled context-aware voice interactions such as "Find me a pizza nearby under $20". Apple used the technology to upgrade Siri's conversational abilities with less reliance on scripted commands, and it was integrated into iOS 15's on-device speech recognition.
Apple acquired AI Music in 2022, bringing generative audio and adaptive soundtracks. AI Music's technology generates music that adapts dynamically to context such as workout intensity and mood, using neural networks to remix tracks in real time and power personalized Apple Music playlists. It hinted at future integration with Vision Pro spatial audio or Fitness+ workouts.
Apple's AI push over the past two years has taken the company from behind the curve toward market leadership. After years of being perceived as lagging in AI, Apple made aggressive moves in 2023 and 2024 to compete with OpenAI, Google, and Microsoft, culminating in the announcement of "Apple Intelligence", Apple's version of artificial intelligence, at the Worldwide Developers Conference (WWDC) in 2024.
The secret AI engine powering Apple Intelligence was code-named Ajax, a large language model trained on Apple's private datasets. Ajax reportedly contains over 200 billion parameters and was designed to run on M-series Macs and A-series iPhones, not just in the cloud.
At WWDC 2024, Apple unveiled its most ambitious AI initiative ever, integrating generative AI across its platforms: iOS, macOS, and more. Siri 2.0 marked the rebirth of the venerable assistant with a generative AI upgrade. Siri is henceforth conversational and contextual ("Summarize my meeting notes and text Mom that I'll be late"), with on-screen awareness ("Add this address to my contacts") and ChatGPT integration, so users can invoke OpenAI's model for complex queries.
Apple Intelligence's features include Writing Tools for rewriting, proofreading, and summarizing text; Image Playground and Genmoji for image generation; notification summaries; and Clean Up for removing objects from photos.
Apple's strategy with AI is to focus on enhancing user experiences in its products while maintaining its commitment to data privacy. Apple emphasizes privacy in AI development, in contrast to competitors like Google and Microsoft, who utilize extensive user data for AI training.
Apple is working on significant upgrades to Siri, with features like improved contextual understanding for more natural interactions, integration with third-party applications through App Intents, and new capabilities such as editing and sending photos via voice commands.
Apple has recently announced a substantial investment plan, committing over $500 billion in the U.S. over the next four years. This includes creating a new AI server manufacturing facility in Texas, aimed at bolstering AI research, silicon engineering, and software development.
To enhance its AI offerings, Apple has partnered with OpenAI, integrating ChatGPT into Siri. This collaboration allows Siri to handle more complex queries by leveraging ChatGPT's capabilities, providing users with more comprehensive responses.
Apple's approach to AI focuses on delivering practical, user-centric features while upholding its longstanding commitment to user privacy. As the company continues to invest in AI infrastructure and development, users can expect more advanced and integrated AI functionalities across Apple's ecosystem.
Rumors suggest Apple is working on AI-powered robotics, advanced AR, and autonomous systems.