
Apple announces iPadOS 15 with homescreen and multitasking improvements

Apple is today announcing the latest version of iPadOS, and there’s a clear focus on making its tablets more capable productivity machines. At least if you find yourself using split-screen mode a lot, that is. Otherwise, there aren’t any radical changes to the platform.

iPadOS 15 will make the homescreen more customizable and allow for more flexible placement of widgets. You can now stick them anywhere you’d like, a capability that came to the iPhone with iOS 14 last year. iPadOS 14 didn’t offer the same functionality, though: widgets could only be placed in the Today View sidebar, despite the tablet’s vast screen real estate.

Apple is also bringing the App Library to iPadOS. Much like on the iPhone, it will let you maintain a less-cluttered homescreen by filing away the apps you rarely use to an automatically organized section of folders. On iPads, the App Library is located in the dock.

Multitasking is also getting some much-needed refinement. New icons will make it simpler to go into split-view mode, and Apple also mentioned a “shelf” feature that makes it easier to juggle different tasks. It’s a significant change to how multitasking currently works on iPadOS, and it seems like a major improvement.

Quick Note is a new convenience that will let you attach notes to webpages and other areas of iPadOS, making them easier to get back to.

Finally, the standalone Translate app is coming to iPadOS, too.

Turning to iOS 15, Craig Federighi announced updates to the Messages app. Instead of having to keep unread notifications around to remind you that a friend sent you something you aren’t ready to check out yet, the new Shared with You feature can help. When friends send you photos in Messages, for example, the relevant images will be saved to your device’s photo album. Apple treats images as relevant if you’re in them, so screenshots won’t automatically clutter up your gallery. Shared with You also works with Apple Music and Apple TV, so playlists and shows that people send you can be pinned and easily found again later.

Apple is also refining the way notifications work in iOS 15 to make it easier to disconnect when you need to. Aesthetically, iPhone alerts will be easier to read, with new contact photos for people who message you and larger icons for apps, so you can more easily identify which app pinged you.

A notification summary gives you a snapshot of all the alerts you’ve missed and arranges them based on what the system determines are your favorite apps. More relevant notifications will be placed nearer the top, too. When you need a break from the relentless onslaught of pings, Focus mode turns most of them off across the system. Apple will use “on-device intelligence about your past activity to suggest apps and people to allow notifications from.” You can also set up different Focus profiles, like Work, Sleep and Personal, to prioritize different apps or contacts at different times of day.

Federighi also introduced something called Live Text, which seems very similar to Google’s Lens feature. It will scan whatever you’re pointing your camera at and highlight anything it recognizes as text. If you’re aiming your viewfinder at a whiteboard, for example, Live Text will highlight what’s written there, and you can tap the selection to copy it as text that you can then paste somewhere else on your phone. This also works for pictures (or screenshots) that are already in your device’s gallery. Much like Google’s system, Live Text can also identify things like locations in a picture or the breed of a dog you saw on Reddit, and display links to open a relevant website in your browser.
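If you’re curious what this kind of recognition looks like from a developer’s point of view, Apple has shipped on-device text recognition in its Vision framework since iOS 13. The sketch below is not Live Text itself (Apple didn’t detail a developer API for the new feature here); it simply uses the existing VNRecognizeTextRequest API to pull recognized lines of text out of an image, wrapped in a hypothetical helper called recognizeText(in:completion:).

```swift
import UIKit
import Vision

/// Extracts lines of text from an image using the Vision framework's
/// on-device recognizer (available since iOS 13). A rough illustration of
/// the kind of OCR Live Text builds on, not the Live Text feature itself.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

Handing that function a screenshot from your photo library would give you back the detected strings, which is roughly the copy-and-paste flow Apple demonstrated.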

Photos is also getting easier to use, with new search integration. A Spotlight search on your device will now return images from your photo albums, too. You can look for specific people or scenes right from your home screen, and even find pictures by entering text that appears in them. In another update reminiscent of a popular Google feature, Apple is bringing Memories to the Photos app, which will curate and stitch together your photos and videos into AI-generated home movies. Unlike Google’s offering, though, iOS 15 will also bring Apple Music into these clips, setting fitting background music to add to the experience. Apple will also add animated title cards to these videos. You can customize Memories to pick a different song or add and remove pictures, and the system will apply filters and time the transitions and speed of the slideshow to the music.

Throughout iOS 15, Apple has made some smaller changes, too. Search results for contacts are richer, letting you see your conversations with them and their location, in addition to options to call, text, FaceTime or email them.

Federighi wrapped up the iOS 15 section of WWDC 2021 by saying there’s more to come, including voice search, support for cross-app drag-and-drop and new outfit options for emoji. It won’t be long before we can start playing with these new features ourselves, so stay tuned for our first look to learn how useful they might actually be.
