The newest iOS 11 operating system started rolling out to iPhone users last month and it includes a slew of updates that will benefit users and app developers alike. If you’re an iPhone user or have an existing iOS app, here are a number of new features you can start taking advantage of and tips on how to implement them.

Apple’s updates are designed to improve the overall user experience with things like better functionality, lower data usage, and longer battery life. Each new operating system also brings updates geared toward developers—new frameworks, APIs, and services that can improve your app’s functionality and usability. It’s also a great time to update your app’s existing code to ensure 100% compatibility across all iOS devices.

So, what’s new in the iOS 11 update? We’ll break it down by key functions and look at how users and developers can benefit from each.

Augmented reality for iOS.

Augmented reality (AR) is taking off, and iOS 11 keeps pace by giving app makers the ability to build this technology into apps that let users “interact with the real world in entirely new ways.” This is big, and it’s possible thanks to a few things. ARKit is a new framework that uses Visual Inertial Odometry (VIO)—input from the camera and motion sensors—to make an on-screen object accurately track around the room you’re seeing through the camera, and on the iPhone X it can use the new TrueDepth camera to track your face (to drive a 3D character with your expressions, for example).
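To give a sense of how little code it takes to get started, here’s a minimal sketch of running a world-tracking ARKit session in a SceneKit-backed view. The ARViewController class and sceneView outlet are illustrative assumptions, not part of the article.

```swift
import UIKit
import ARKit

// Minimal sketch: a view controller that runs a world-tracking AR session.
// Assumes an ARSCNView has been wired up in the storyboard.
class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking relies on visual-inertial odometry under the hood.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```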

Machine learning comes to iOS apps.

Machine learning is one of the fastest-growing fields in data science and is being adopted by nearly every industry. Its applications and algorithms are far-reaching and yielding amazing productivity—and now iOS apps can benefit from its brains with the Core ML framework. With both deep learning and standard models, “Core ML delivers blazingly fast performance with easy integration of machine learning models enabling you to build apps with intelligent new features using just a few lines of code.” The Vision framework and natural language processing (NLP) allow apps (and your own device) to recognize you, your speech patterns, preferences, and more, so your phone is more your own than ever before.
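As a rough illustration of “just a few lines of code,” here’s a sketch of classifying an image by running a Core ML model through the Vision framework. The MobileNet class is a stand-in for whichever .mlmodel file you add to your project; Xcode generates a Swift class for it automatically.

```swift
import CoreML
import Vision
import UIKit

// Sketch: classify an image with a bundled Core ML model via Vision.
// "MobileNet" is a placeholder for your own compiled .mlmodel class.
func classify(_ image: CGImage) {
    guard let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Top result: \(top.identifier), confidence \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```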

For iPhone X users, security has a new face.

iOS 11 has support for two different types of biometric authentication—the best ways for users to make their devices as close to airtight as possible. There’s support for Touch ID—the fingerprint scan available on devices with a home button—and now Face ID, which uses the front-facing TrueDepth camera to build a depth map of your face and verify that you’re you. It takes the place of the fingerprint scan during Apple Pay transactions, too. Learn more about Face ID in this article.
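From a developer’s point of view, both biometrics sit behind the same LocalAuthentication call, so the sketch below covers Touch ID and Face ID alike; the system uses whichever the device supports. (Apps that may trigger Face ID should also add an NSFaceIDUsageDescription string to Info.plist.)

```swift
import LocalAuthentication

// Sketch: gate a feature behind Touch ID or Face ID.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure biometric authentication is available on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```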

The Files app is a hub for files from all devices, apps, and storage services—all in one place.

The new Files app is a significant new feature for both users and developers. Files can be centrally accessed from the app—not just local files, but also files from other apps, other devices, iCloud, and even third-party services like Dropbox. By integrating with these storage service APIs and the Files app, developers can create apps that directly access a user’s files, helping people be more productive.
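One simple way to tap into that central file store from your own app is the document picker, which surfaces iCloud Drive and any third-party providers the user has enabled. The class name and the PDF-only filter below are illustrative choices, not requirements.

```swift
import UIKit
import MobileCoreServices

// Sketch: let the user import a document from Files (iCloud Drive,
// other apps' documents, or third-party providers).
class ImportViewController: UIViewController, UIDocumentPickerDelegate {
    func importDocument() {
        let picker = UIDocumentPickerViewController(documentTypes: [kUTTypePDF as String],
                                                    in: .import)
        picker.delegate = self
        present(picker, animated: true)
    }

    // New in iOS 11: the delegate receives an array of URLs.
    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        print("Imported:", urls)
    }
}
```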

Drag and drop functionality moves files between apps.

For developers, building this feature in is a great way to make your app even more useful (and integrated) for users. With Multi-Touch, Drag and Drop lets users move images, text, and files from one app to another—something developers can support through a new API. For users, it’s a super simple way to select multiple items and move them from one app to another—even things like reminders, maps, and contacts.
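Adopting the API mostly means attaching drag (and drop) interactions to your views. Here’s a sketch of making an image view a drag source; the PhotoViewController class and imageView outlet are assumptions for the example.

```swift
import UIKit

// Sketch: make an image view draggable with the iOS 11 Drag and Drop API.
// On iPad the dragged image can be dropped into another app.
class PhotoViewController: UIViewController, UIDragInteractionDelegate {
    @IBOutlet var imageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.isUserInteractionEnabled = true
        imageView.addInteraction(UIDragInteraction(delegate: self))
    }

    // Provide the items to drag when the gesture begins.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```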

Scanning capabilities are built-in with Near Field Communication (NFC).

This new capability is designed to snap into action when a device is centimeters from an object—an NFC tag on a product, a sign in a store, and more. Whether it’s a store employee tracking inventory or a customer scanning a coupon, this capability connects users with the world around them in a new way. Apple has a few recommendations for how developers should implement this technology, including terminology, to try.
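In code, this surfaces as the Core NFC framework, which in iOS 11 reads NDEF-formatted tags. Here’s a sketch of a tag-reading session; it assumes the app has the NFC entitlement and runs on supported hardware (iPhone 7 or later).

```swift
import CoreNFC

// Sketch: read NDEF tags with Core NFC.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print(String(data: record.payload, encoding: .utf8) ?? "")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        print("Session ended:", error.localizedDescription)
    }
}
```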

As always, the look and feel has a few new tweaks.

For users, this is all about an improved experience and better accessibility: bolder navigation, typographic updates (increased font sizes and weights), and clearer icons. For developers, look to the new “safe area layout guides,” which help ensure none of your content gets hidden under the various bars (status bar, navigation bar, etc.). These updates also coincide with the release of the iPhone X, which carries a few additional design implications of its own.
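In practice, adopting the safe area usually just means constraining your content to the view’s safe area layout guide instead of its edges, as in this sketch (the ContentViewController class is illustrative):

```swift
import UIKit

// Sketch: pin content to the safe area so it never slips under the
// status bar, navigation bar, or the iPhone X sensor housing.
class ContentViewController: UIViewController {
    let contentView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        contentView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(contentView)

        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            contentView.topAnchor.constraint(equalTo: guide.topAnchor),
            contentView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
            contentView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
            contentView.bottomAnchor.constraint(equalTo: guide.bottomAnchor)
        ])
    }
}
```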

Siri got a few updates, too.

Siri is improved with better search results and a more natural tone of voice, both thanks to machine learning. Siri lets users do a lot with their phones via voice alone: search for photos, send money, send texts, add items to lists, and more. For developers, SiriKit gains a few new capabilities as well. App makers can use SiriKit to integrate voice control into iOS and watchOS apps so users can interact with them via voice commands alone.
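SiriKit requests are handled in an Intents app extension for one of the supported domains. As a rough sketch, here’s what a handler for the messaging domain might look like; the SendMessageHandler class is illustrative, and a real extension would also resolve recipients and message content before handling.

```swift
import Intents

// Sketch: handle "send a message" requests from Siri in an Intents extension.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message off to the app's own sending code here.
        let activity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}
```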

The App Store’s first-ever total redesign.

The changes to the App Store—like improved search and new tabs for navigation and “Today” daily features—make it easier for users to find what they’re looking for, browse what’s new, and update to new versions of apps they already have. For developers, it’s always a priority to get your app noticed in the App Store, so knowing what’s new with it can help guide your strategy when it’s time to submit.

Explore the full list of new iOS 11 SDK features for developers here.