It’s easy to overlook this year’s crop of new iPhones. They look very much like last year’s models and, even on paper, they don’t offer many consumer-facing upgrades. But for developers, this may be one of the most exciting upgrades in recent years, because the entire iPhone line has moved to the A12 chip.

Apple cites the new chip as the driving force behind all the cool camera effects the iPhone can now pull off and the push for augmented reality via ARKit 2.0. But what’s even more important is the opening up of the iPhone’s AI hardware to developers, allowing people outside the Spaceship at Apple Park to run their own algorithms on the advanced handset hardware.

New Technology Enables Powerful New Developer Tools

The A12 chip contains a neural engine, a key driver of how software systems learn to process advanced human interactions – visual cues, image contents, and speech colloquialisms. Machine learning has become the cutting edge of artificial intelligence in both consumer and industrial systems, and Apple’s new chipset puts iPhones at the forefront of those developments.

The neural engine in the iPhone isn’t new. It debuted in 2017 with the A11 chip in the iPhone 8 and iPhone X, most visibly powering the Face ID system that is now standard on all of Apple’s 2018 models. But for developers who wanted to use that technology for anything other than security verification, it was locked away.

From a technical perspective, the A12 is a significant leap forward for the iPhone. Where the A11’s neural engine had two cores and could process 600 billion operations per second, the A12’s has eight cores and can handle up to 5 trillion operations per second – roughly an eightfold increase in under a year. The chip’s 4-core GPU runs up to 50% faster than last year’s, with tessellation and multilayer rendering and significant upgrades to image and video processing. The 6-core CPU pairs two performance cores with four efficiency cores, delivering better performance while using less power, and the neural engine sits alongside the CPU and GPU as a dedicated machine learning block with multi-precision support and a smart compute system that decides where each task should run. Built on a 7-nanometer process, the A12 is widely considered the most advanced mobile chipset on the market, outpacing even the industry’s largest producer of mobile chipsets – Qualcomm.

Apple launched Core ML in 2017, opening some of its machine learning tools and interfaces to developers, but the neural engine at the heart of the iPhone X’s A11 chip wasn’t yet available for third-party use. This year’s A12 fixes that, opening the hardware up to developers through Core ML. In effect, it lets app developers – from major firms to small startups – tap into AI hardware for their own apps. In short, the new technology is smaller, faster, and more efficient, which benefits every app, and it now exposes even more advanced tools for building better app experiences.
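As a rough illustration, here is a minimal sketch of how an app might opt into that hardware with Core ML 2. The model name, `SceneClassifier`, is a hypothetical stand-in for any compiled Core ML model bundled with an app; the key piece is the `computeUnits` setting, which leaves Core ML free to schedule work on the neural engine when one is available.

```swift
import CoreML

// A minimal sketch, assuming a hypothetical compiled model named
// "SceneClassifier.mlmodelc" is bundled with the app.
guard let modelURL = Bundle.main.url(forResource: "SceneClassifier",
                                     withExtension: "mlmodelc") else {
    fatalError("Model not found in the app bundle")
}

let configuration = MLModelConfiguration()
// .all lets Core ML choose between the CPU, the GPU, and the neural
// engine at run time; .cpuOnly and .cpuAndGPU are available if you
// need to restrict where the model runs.
configuration.computeUnits = .all

do {
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    // Predictions made through `model` are now free to run on the neural engine.
    _ = model
} catch {
    print("Failed to load model: \(error)")
}
```

If the device has no neural engine – an iPhone 7, say – the same code simply falls back to the CPU and GPU, so there is no separate code path to maintain.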

What Can the A12 Do?

This is the big question. Right now, the list of things it lets your phone do is fairly short – mostly camera and security features. Smarter than ever before, your phone’s camera can instantly determine what it is pointed at and adjust to take the best possible shot, and it offers advanced tools for improving and touching up photos after they’ve been taken. The smart compute engine, also part of the new chip, “is able to analyze the neural network data and figure out on the fly whether to run it on the CPU, the GPU or the neural engine,” according to Apple.
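The same kind of on-device scene recognition is available to third-party apps through the Vision framework’s Core ML integration. The sketch below assumes a hypothetical image-classification model (the `SceneClassifier` from earlier); it is an illustration of the technique, not Apple’s own camera pipeline.

```swift
import CoreGraphics
import CoreML
import Vision

// Classify what the camera (or any image) is looking at, entirely on-device.
// `model` is any image-classification MLModel loaded as shown earlier.
func classify(_ image: CGImage, using model: MLModel) {
    guard let visionModel = try? VNCoreMLModel(for: model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // The top-ranked observation is the model's best guess at the scene.
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Saw \(best.identifier) (confidence \(best.confidence))")
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```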

For developers, the camera’s advanced features are a glimpse into the future of their own apps – a taste of what the AI tools will allow them to do. The neural engine lets machine learning code run faster, uses significantly less energy, and opens up tools in Core ML that were never before available.
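One concrete example of those newly available tools is Core ML 2’s batch prediction API, which lets an app hand the framework many inputs at once instead of looping over them one at a time. A minimal sketch, assuming `inputs` already match the loaded model’s expected features:

```swift
import CoreML

// Run a whole batch of inputs through a model in one call, letting
// Core ML schedule the work across the available hardware.
func classifyBatch(_ inputs: [MLFeatureProvider],
                   using model: MLModel) throws -> MLBatchProvider {
    let batch = MLArrayBatchProvider(array: inputs)
    // One output is returned for each input, in the same order.
    return try model.predictions(from: batch, options: MLPredictionOptions())
}
```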

While many developers got early access to the development kits for iOS 12 and the new devices, we’re still some ways off from seeing what the tools will really allow them to do. But even in the interim, we’re seeing apps start to run faster and smarter just by tapping into the power of the A12’s neural engine.

The Future of the iPhone Aided by AI

We’ve reached a point at which almost anything could be around the corner with the technology in our phones. Through software, Apple’s A12 chip can replicate many of the features of ten-thousand-dollar cameras, recognize and manipulate your face and voice, detect unhealthy heartbeat patterns, and interact with the real world through augmented reality. While most people will be content with apps that open faster and a camera that magically takes better photos, developers are starting to tap into this magic box to uncover what it can really do.

According to Apple, Core ML 2 workloads can run up to nine times faster on the new hardware while using a tenth of the energy, and apps themselves often open around 30% faster. Some of the built-in features developers can tap into immediately include Siri Shortcuts, ARKit 2.0 for quicker development of augmented reality experiences, and the use of Memoji within apps. Developers can continue converting models from other frameworks (e.g. LibSVM, Caffe2, XGBoost, and Keras) into Core ML 2 while tapping into the machine learning tools to build more robust, more powerful experiences for the App Store.
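Of those, Siri Shortcuts is probably the quickest to adopt. The sketch below donates a shortcut for an action a user performs often so iOS can start suggesting it; the activity type and phrasing are made-up examples, not part of any particular app.

```swift
import UIKit
import Intents

// Donate a shortcut for a frequently repeated action so iOS 12 can
// suggest it on the lock screen, in Search, and through Siri.
func donateReorderShortcut(from viewController: UIViewController) {
    // The activity type is a hypothetical reverse-DNS identifier; it should
    // also be listed under NSUserActivityTypes in the app's Info.plist.
    let activity = NSUserActivity(activityType: "com.example.app.reorder-coffee")
    activity.title = "Reorder my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true          // new in iOS 12
    activity.suggestedInvocationPhrase = "Coffee time"
    activity.persistentIdentifier = "reorder-coffee"

    // Attaching the activity to a visible view controller donates it.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```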

All of this sets the stage for the next round of what is shaping up to be an arms race between Google, Apple, and other hardware and chipset makers. Google’s own ML Kit for Android (and iOS where applicable) was released in May, and Google’s Pixel 2 already ships a separate chip of its own – the Pixel Visual Core – for similar on-device workloads.

Not to be outshone by all the technology on offer, the possibilities for end users to protect themselves and their privacy are just as exciting. For example, it’s neither far-fetched nor unbearably intrusive for an app to “read” a user’s emails, calendar updates, and text messages, as long as the processing happens on-device with this new chip, with nothing sent back to your own systems.

This paves the way for personalizing users’ interactions with your app – in our case, Aristotle – delivering an amazing experience without the bitter feeling of giving away all of their personal information.
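As a minimal sketch of what that on-device approach can look like, the function below pulls names and places out of a message using Apple’s NaturalLanguage framework. Everything happens locally; the extracted entities are just an example of the kind of personalization signal an app could use without anything leaving the phone.

```swift
import NaturalLanguage

// Extract people, places, and organizations from user text, entirely on-device.
func namedEntities(in message: String) -> [String] {
    let tagger = NLTagger(tagSchemes: [.nameType])
    tagger.string = message

    var entities: [String] = []
    let options: NLTagger.Options = [.omitPunctuation, .omitWhitespace, .joinNames]
    tagger.enumerateTags(in: message.startIndex..<message.endIndex,
                         unit: .word,
                         scheme: .nameType,
                         options: options) { tag, range in
        // Keep only tokens the tagger recognized as named entities.
        if let tag = tag,
           [NLTag.personalName, .placeName, .organizationName].contains(tag) {
            entities.append(String(message[range]))
        }
        return true
    }
    return entities
}
```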

What app developers do with this increased power and the smarter software now running our phones remains to be seen, but it will be fun to watch.