
Some of the Major Themes of Yesterday’s WWDC Keynote


It’s a Good Day to be an iPad User!


The iPad has been at a crossroads for a couple of years now. Since the original iPad Pro didn’t move the sales needle, it has felt a little like Apple has been marking time while they came up with a Plan B on how to jumpstart things. Part one came with the release of the scaled-back but less expensive iPad this spring. While it was a solid device that fit a need in Apple’s tablet lineup, it was hard to tell too much from such a modest start down a different path.

That all changed yesterday, however, when Apple pulled back the curtain on the new master plan. Right after the release of the iPad Pro, they tried to position it as a laptop replacement; more recently, they cast it as something different, but better. Now we finally see some real vision and, more importantly, ACTION behind the words. With the addition of greatly enhanced multitasking, app switching and grouping, drag and drop, and a real file system, the iPad Pro is now poised to REALLY get to work. Combine these new features with deeper Apple Pencil integration and some serious power, and Apple now has a much different machine to take to market.

How receptive the market is to these changes remains to be seen, but after today there can be no doubt that Apple still sees the iPad as a large part of its future plans. I think this also proves that the restraints are off the iOS team in terms of blurring the lines between macOS and iOS, at least on the iPad.

Apple is Embracing Reality


The virtual and augmented kinds. Today was the first mention of VR that I can remember at an Apple event. It wasn’t just mentioned, either. Apple drove the point home that their platforms now have the power to handle both using and developing for it. That is a key point, because that ability had been called into question as the first wave of VR headsets hit the market. They are very clearly interested in positioning the Mac as a legitimate development platform in that space, which wouldn’t have even been a consideration for devs a few months ago.

The AR presentation was even more important. It is incredibly clear now that Apple sees Augmented Reality as one of their core technologies going forward. The smartest thing they did was getting ARKit put together and into developers’ hands now, rather than waiting for the Fall or even later. Now, when the new iPhones come out, there will be games and other apps that harness AR available on day one.
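To give a rough sense of what developers are getting their hands on, here is a minimal sketch of starting an ARKit world-tracking session on iOS 11. This is my own illustration, not Apple’s sample code, and the class name is hypothetical; only the ARKit/SceneKit types are real.

```swift
import UIKit
import SceneKit
import ARKit

// A minimal, hypothetical sketch of an iOS 11 view controller that starts an
// ARKit world-tracking session and listens for detected horizontal planes.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with motion data so virtual
        // content stays anchored to the real world.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Called each time ARKit detects a new real-world surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Attach SceneKit content to `node` to place it on the detected plane.
    }
}
```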

While they are really just setting the table for developers, rather than taking the bull by the horns themselves right now, I think Apple believes that AR will provide the shot in the arm that the App Store needs. If it works, they will also benefit from learning how both users and devs work with AR, which can inform their later AR-centric hardware decisions. I like how proactive this move was, rather than waiting longer and being more reactive to the market.

Machine Learning was a Constant Theme

I was wondering if Apple would have a segment discussing their Machine Learning initiatives at WWDC. However, rather than give a single presentation on what they’re up to, Apple’s presenters touched on Machine Learning SEVERAL times throughout the entire event. Machine Learning is part of their updates to Siri and key to the new ways it will be used, such as the new Siri watch face on Apple Watch and the new watchOS Dock. Machine Learning also featured prominently in the presentations on Safari tracking prevention in macOS, Photos, the iPhone camera, Apple News, Maps, iMessage app surfacing, and spelling suggestions, just to name a few.

As positive as it is to hear Apple name-drop Machine Learning all over the place, the more important announcement was that they are opening up Machine Learning APIs to developers with Core ML. They went all out on this, touting the hardware and on-device speed advantages of the iPhone 7 over the Google Pixel, as well as Apple’s concern with preserving user privacy in this space. It remains to be seen if or how quickly devs will tap into Apple’s Machine Learning capabilities, but if they do, it will bring Apple a new flow of raw data that they’ve never had before.
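As a rough illustration of what “opening up Machine Learning APIs” means in practice, here is a hedged sketch of running an image classifier on-device with Core ML via the Vision framework on iOS 11. The function name is my own, and the model parameter stands in for whatever .mlmodel file a developer adds to their Xcode project.

```swift
import UIKit
import CoreML
import Vision

// A hedged sketch of on-device image classification with Core ML and Vision
// on iOS 11. `mlModel` is the MLModel generated by Xcode from any .mlmodel
// file a developer drops into their project.
func classify(_ image: UIImage, with mlModel: MLModel) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: mlModel) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Vision returns classifications sorted by confidence.
        if let top = request.results?.first as? VNClassificationObservation {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    // All inference happens locally; no image data leaves the device.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```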

Depending on how these APIs work, this could give Apple’s AI initiatives a big boost, and be a win-win-win for them, developers, and users. Opening up their Machine Learning to devs certainly won’t get them close to the level of data that Google and Microsoft have, but it is a way that they can close the gap and improve their services over the long term.

Apple is Waging War on Device Storage Space Limitations


This was more subtle, but Apple made two announcements that pertain to cutting down on wasted storage space on your iOS devices in iOS 11. First of all, in moving iMessage fully to the cloud and syncing to your devices from a single source, they now have the ability to store older conversations there, keeping them off your devices. If you use photos, stickers, and effects heavily, this will be a welcome change. I usually stick to text, myself, but I still currently have 522 MB of space on my iPhone devoted to iMessage. I’m sure I sit at the low end of the spectrum, but that’s still a lot of space devoted to texts. It’s great to see Apple finding solutions to small problems like these, rather than just throwing more on-device storage at them.

The other announcement was that Apple is switching iOS camera video to the HEVC file format, and photos to HEIF. Both of these formats offer far better compression, with HEIF evidently cutting photo sizes in half. This is a very smart move, as even with great options like their own iCloud Photo Library and Google Photos available to sync and store your libraries in the cloud, people still want to keep a lot of their recent and favorite content locally. As you can see in the screenshot above, even with iCloud Photo Library’s Optimized Photo Storage feature turned on, I still have 42.37 GB of photos and videos on my iPhone. That’s a LOT of space. By cutting the file sizes without sacrificing quality, Apple is again finding solid solutions rather than just trying to spec away the problem.
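For a sense of how apps can take advantage of the new format, here is a small, hedged sketch of re-encoding an image as HEIF on iOS 11 using Image I/O and the new AVFileType.heic identifier. The function name and quality value are my own choices, not anything Apple specified.

```swift
import UIKit
import AVFoundation
import ImageIO

// A hedged sketch of re-encoding a UIImage as HEIF on iOS 11, which on
// supported hardware typically yields files roughly half the size of
// comparable JPEGs at similar quality.
func heifData(from image: UIImage, quality: CGFloat = 0.8) -> Data? {
    guard let cgImage = image.cgImage else { return nil }

    let data = NSMutableData()
    // AVFileType.heic is the HEIF container identifier introduced in iOS 11.
    guard let destination = CGImageDestinationCreateWithData(
        data as CFMutableData, AVFileType.heic.rawValue as CFString, 1, nil
    ) else { return nil }

    let options = [kCGImageDestinationLossyCompressionQuality as String: quality] as CFDictionary
    CGImageDestinationAddImage(destination, cgImage, options)

    return CGImageDestinationFinalize(destination) ? data as Data : nil
}
```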

Apple Sherlocked Venmo and Sonos Today


Apple Pay direct payments have been rumored for a few months, so Venmo has had time to prepare. They knew this was coming. However, Apple positioning the new HomePod intelligent speaker so aggressively as a high-end, whole-home audio system may have come as more of a surprise to Sonos. They have integrated tightly with Apple devices for years, and their products are sold in Apple Retail Stores…for now.

Apple hedged a bit on the intelligent assistant capabilities of Siri and the HomePod, focusing much more on the pair’s collaboration to bring you high-quality audio. There was also a good bit said about HomeKit integration and the HomePod’s capability as a home automation hub. However, this presentation clearly wasn’t a shot across the bow of Amazon or Google as much as it was of Sonos. If Apple delivers on the audio claims of the device, Sonos may be in real trouble, especially with a very reasonable price of $349.

These were some of the major themes and software and developer-focused announcements I noticed during yesterday’s presentation. I’ll be back later with a new Apple Slices devoted to all of the hardware announcements from the WWDC Keynote. Until then, if you have any thoughts or comments about these or any other items from WWDC, feel free to let me know in the Comments section below, on Flipboard, our Facebook page, or on Twitter @iPadInsightBlog.

