Last week, June 5th–9th, was Apple’s annual developer conference, the Worldwide Developers Conference, otherwise known as WWDC. While Apple isn’t normally known for announcing new hardware at the conference, this year it announced several new products, including new iPads, new iMacs, and the HomePod.
While there was a lot announced last week, let’s focus on what Apple’s conference last week means for developers on their platform.
Drag and Drop on iPad was the star of the week. By dedicating four sessions to how this new feature works, Apple made it clear that it wants the iPad to play a key role in its productivity story.
With iOS 11, applications can utilize Drag and Drop not only within their own app, but also between applications. Apple showcased use cases such as grabbing text, multiple images, and a map location, then dropping them all onto Mail.app to create a rich email filled with information from multiple applications.
The API that Apple has designed for this feature is beautiful, as well. By creating an additive API instead of a destructive one, apps can opt in to the new feature as developers see fit. Declaring an array of UIDragItems allows for content to be lifted up, and creating a delegate for a UIDragInteraction or UIDropInteraction gives developers the power to create and consume data across the platform. Table views and collection views even have specialized support for these new paradigms that leads to intuitive reordering experiences. There’s even a brand new section in Apple’s Human Interface Guidelines showing how best to take advantage of the new feature.
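To give a flavor of how additive the API is, here’s a minimal sketch of making an image view draggable. The view controller and property names are my own invention, not from any Apple sample:

```swift
import UIKit

class PhotoViewController: UIViewController, UIDragInteractionDelegate {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attaching a UIDragInteraction is all it takes to opt this view in.
        imageView.isUserInteractionEnabled = true
        imageView.addInteraction(UIDragInteraction(delegate: self))
    }

    // Provide the content to lift: one UIDragItem per draggable piece of data.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        let provider = NSItemProvider(object: image)
        return [UIDragItem(itemProvider: provider)]
    }
}
```

Returning an empty array simply means nothing lifts, so existing apps lose nothing by not adopting the API.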
I’m extremely excited to see how apps implement Drag and Drop. This will open up a world of new productivity possibilities for the iPad and all of iOS. I’m personally looking forward to utilizing Drag and Drop using collection views within the Shortcut iOS app.
ARKit was one of the more glitz and glam pieces of Apple’s keynote on Monday. With their first foray into augmented reality, Apple had the audience fawning over visually appealing demos with Legos and outrageous table-top gameplay.
For developers, there’s a lot of potential here. With nearly every modern iOS device now capable of augmented reality, we will see more apps adopting the kind of reality-altering experience that apps like Pokémon GO pioneered.
For developers already using SceneKit or SpriteKit in their apps, getting started with ARKit will be a seamless transition. ARSCNView and ARSKView will both let you get started by rendering your content over a live camera image, with real-time motion and light tracking.
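For the SceneKit case, getting a session running takes only a few lines. This is a bare-bones sketch, omitting scene content and view layout:

```swift
import ARKit
import SceneKit

class ARViewController: UIViewController {
    // ARSCNView renders SceneKit content over the live camera feed.
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // World tracking provides device motion tracking and light estimation.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```

From there, any SCNNode added to sceneView.scene appears anchored in the physical world.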
From Google’s AlphaGo to “Not Hotdog” from HBO’s Silicon Valley, machine learning is already shaping up to be the hot topic of the coming year.
With iOS 11, developers will have access to Core ML, the framework that enables machine learning in Siri, the Camera, and QuickType. Touting speed and efficiency, Core ML builds on top of today’s vibrant machine learning ecosystem: Apple published a tool for converting trained models from popular tools like Caffe, Keras, and XGBoost for consumption by Core ML.
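Once a converted .mlmodel file is added to an Xcode project, Xcode generates a strongly typed Swift class for it. A sketch of what using one looks like — the model name, input, and output properties here are entirely hypothetical, since they depend on the model you convert:

```swift
import CoreML

// "SentimentClassifier" is a hypothetical class that Xcode would generate
// from a .mlmodel file; its prediction inputs and outputs are defined by
// the model itself.
let model = SentimentClassifier()
if let output = try? model.prediction(text: "WWDC was fantastic") {
    // The generated output type exposes the model's labels as properties.
    print(output.label)
}
```

The appeal is that all of this runs on-device, with no network round trip.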
Alongside Core ML, Apple also announced a new framework named Vision, which analyzes images to extract interesting information. Vision can detect faces, facial landmarks (like eyes, lips, and eyebrows), barcodes, QR codes, and text, and can track objects or rectangles… the list goes on. I have a feeling we’ll see some really interesting applications of this framework once iOS 11 launches.
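Face detection, for instance, boils down to building a request and handing it an image. A minimal sketch, with error handling kept deliberately terse:

```swift
import Vision
import CoreGraphics

func detectFaces(in image: CGImage) {
    // The completion handler receives one VNFaceObservation per detected face.
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0–1, origin bottom-left).
            print("Found a face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Swapping in a different request type (barcodes, text, rectangles) follows the same pattern.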
When both the Music and News apps were redesigned last year with heavier-weight fonts, many people wondered how far Apple would take this style. With iOS 11, we have our answer.
iOS 11 introduces a few new properties around UINavigationBar to enable large, bold titles as part of the navigation. prefersLargeTitles and largeTitleDisplayMode both allow an app to opt in to the new style, gracefully transitioning these titles between screens.
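Opting in is a two-line affair. A sketch of a root list screen adopting the new style:

```swift
import UIKit

class ListViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Enable the large title style for this navigation bar.
        navigationController?.navigationBar.prefersLargeTitles = true
        // .automatic inherits the previous screen's choice, so pushed detail
        // screens can fall back to the compact title.
        navigationItem.largeTitleDisplayMode = .automatic
        title = "Library"
    }
}
```

Setting largeTitleDisplayMode to .never on a pushed view controller is what produces the graceful large-to-small transition.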
Fonts in other places across the OS have bumped up their weight a bit to match the new style as well, signaling a slight shift in iOS’s visual direction. Personally, I find the new bold style quite appealing, after a bit of getting used to it.
iOS isn’t the only place where radical changes were happening last week. Xcode 9 launched into beta last week with a ton of new features: a brand new source editor rewritten in Swift, refactoring support for Swift code, wireless debugging over the network, the ability to run multiple simulators at once, a new build system, and built-in GitHub integration.
These are just a handful of the new features that Xcode 9 brings us. There’s plenty more that’s new that I would love to talk your ear off about, but I’d recommend just playing around with the new Xcode to see how spectacular it is. It deserves more than a single version bump; it’s miles better than previous versions of Xcode. While working on the Shortcut iOS app, I’ve already made use of multiple simulators.
With larger swaths of Apple’s IDE and tooling being built in Swift, we can see that Apple is committed to using Swift more and more. As someone who has felt some growing pains with Swift during its infancy, I’m comforted knowing that teams at Apple will be going through the same.
Apple launched a new App Store app in iOS 11, complete with a new “Today” view as the first tab in the app. It’s clear that Apple wants to tell the stories behind the apps on the store, featuring editorials about apps, videos that play inline, and groupings of apps that fit well together.
All of this comes with a few changes in iTunes Connect. Gone are the days of uploading a separate app icon asset to iTunes Connect; it now uses the icon asset from your app binary directly.
A new, 30-character “subtitle” field was added to apps to complement the app’s name. Developers can also choose which in-app purchases are promoted within the App Store.
The new iTunes Connect feature that I’m most excited about is Phased Release. Phased Release rolls an update out to percentages of devices automatically over time. Rather than 100% of users getting an update the minute it goes live, increasing percentages of users receive the update over a seven-day period. This lets developers do things like gradually scale up a new production environment or feature without going all-in the moment it ships. The best part? It’s available today.
Every year at WWDC, Apple seems to hint at That One API™ that you should probably adopt by September when the next iPhone revision usually launches.
This year, one API change stood out to me as pretty unique in the current landscape. Apple announced that topLayoutGuide and bottomLayoutGuide are both deprecated in iOS 11, in favor of a new set of constructs: safeAreaLayoutGuide and safeAreaInsets.
These new APIs are meant to describe an area that is not “occluded by ancestors,” such that it’s safe to draw important UI in that area. Both of these new APIs improve upon the old top and bottom guides by providing leading and trailing guides as well, but beyond navigation bars on the top and tab bars on the bottom, what else might be occluding our UI? Perhaps we’ll see in September. 🤔
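Adopting the new guide is mostly a matter of constraining to it instead of to the view’s raw edges. A minimal sketch (the label is a stand-in for any important UI):

```swift
import UIKit

class SafeAreaViewController: UIViewController {
    let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)
        // Pin to the safe area rather than the view's edges, so bars on any
        // side (and whatever hardware September brings) can't occlude it.
        NSLayoutConstraint.activate([
            label.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            label.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor)
        ])
    }
}
```

Note that unlike the old guides, safeAreaLayoutGuide exposes leading and trailing anchors, not just top and bottom.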
There were several awesome quality-of-life changes in UIKit this year in the latest iOS SDK. Again, I could go on and on about every change across UIKit’s hundreds of APIs, but here’s a shortlist of my favorites: built-in swipe actions for table view rows, UIFontMetrics for scaling custom fonts with Dynamic Type, named colors in asset catalogs, and finer-grained control over scroll view insets with contentInsetAdjustmentBehavior.
UIKit is a core building block to almost every iOS app, so improving the framework year after year is essential. The changes this year should make many apps feel right at home alongside the stock iOS apps. I’m looking forward to exploring more of the new features in UIKit to make the Shortcut iOS app feel right at home in iOS 11.
This year, Swift turns three years old. However, we heard little from Apple during the WWDC keynote about what’s new in Swift 4. I have a feeling the open sourcing of Swift and the Swift Evolution process have a bit to do with this. By building the language in the open with the community, Apple seems to have chosen to highlight what’s new in its young language in other sessions instead.
One hit feature of Swift 4 that many at the conference seemed to be fans of was the new Codable protocol. Proposal SE-0167 seemed to slip in right before WWDC, so many developers haven’t seen much of it yet. Codable allows types to be easily converted to and from JSON and property lists, with a clear and concise API. Since the Swift community has an obsession with converting to and from JSON, many are happy to see Apple embrace this as a first-class citizen in the language.
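The conciseness is easiest to appreciate in code. Conforming to Codable is often just a declaration, and the round trip through JSON comes for free:

```swift
import Foundation

// Conformance is synthesized automatically for simple types like this one.
struct Conference: Codable {
    let name: String
    let year: Int
}

let wwdc = Conference(name: "WWDC", year: 2017)

// Encode to JSON data, then decode it back into a strongly typed value.
let data = try JSONEncoder().encode(wwdc)
let decoded = try JSONDecoder().decode(Conference.self, from: data)
print(decoded.name, decoded.year)
```

No string keys, no manual dictionary unwrapping — the compiler generates the plumbing.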
Another surprise came from the land of key paths. Proposal SE-0161 caused a bit of controversy with its syntax, but many seemed to forget about the syntax after Apple demoed some of its uses, such as the new block-based key-value observing API that was introduced. Eschewing the prior stringly typed API for one built on Swift 4’s new key paths, users of third-party frameworks like ReactiveCocoa or RxSwift seemed to rejoice at Apple embracing the benefits of type safety here. However, this new API seems a bit sparse on documentation so far.
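To illustrate, here’s a small sketch of the block-based observer using a Swift 4 key path in place of the old string-based key:

```swift
import Foundation

// KVO still requires NSObject and @objc dynamic properties.
class Player: NSObject {
    @objc dynamic var score = 0
}

let player = Player()

// \.score is a compile-time-checked key path; a typo here fails to build,
// unlike the old string-based addObserver API. Keep the returned token
// alive for as long as you want to observe.
let token = player.observe(\.score, options: [.new]) { _, change in
    print("Score changed to \(change.newValue ?? 0)")
}
player.score = 10
```

Invalidation is also simpler: the observation ends when the token is deallocated, with no unbalanced removeObserver calls to crash on.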
I’ve gushed on and on about all the new toys coming out of WWDC, but I’d be remiss not to give a few more nods at some great technologies announced last week as well.
MusicKit, CoreNFC, DeviceCheck, and PDFKit all look like awesome new frameworks. The move to 64-bit is now complete with iOS 11; it won’t even download or launch apps that are still 32-bit only. Dynamic refresh rates up to 120Hz will do wonders for apps of all shapes and sizes. I’ve already seen people using the built-in iOS screen recording all over Twitter. Password AutoFill for Apps will be a welcome addition alongside 1Password for many apps.
This only covers a fraction of all the new things that this year’s WWDC brought. There’s over 80 hours of sessions from this year, all available online. Apple gave us so many new tools, frameworks, and features this year. I think developers will build some fantastic things with all the new technologies that Apple introduced this year.