
What’s new for developers in WWDC 2019

What is WWDC?

The Apple Worldwide Developers Conference (WWDC, also known as Dub Dub) is a conference held annually by Apple Inc. in San Jose, California. Apple uses the event to showcase its new software and technologies for software developers. Attendees can participate in hands-on labs with Apple engineers and attend in-depth sessions covering a wide variety of topics.

About WWDC 2019

Apple talked about its products for over two hours, so naturally there was a lot to cover. But for all that was said about the future of iOS 13, the iPad, watchOS, tvOS, the HomePod, AirPods, and the Mac Pro, just as much was left unsaid.

Apple spent most of its keynote showing off features that users can appreciate, but there are plenty of breakout sessions throughout the week covering new tools for developers. During the keynote, Apple highlighted two areas in particular: augmented reality (AR) and its Swift programming language. Since the conference, both technologies have been well received by developers, as Apple pushed each of them noticeably forward.

# SwiftUI:

SwiftUI is an innovative, exceptionally simple way to build user interfaces across all Apple platforms with the power of Swift. It lets you build user interfaces for any Apple device using just one set of tools and APIs. With a declarative Swift syntax that’s easy to read and natural to write, SwiftUI works seamlessly with the new Xcode design tools to keep your code and design perfectly in sync. Automatic support for Dynamic Type, Dark Mode, localization, right-to-left languages, internationalization, and accessibility means your first line of SwiftUI code is already the most powerful UI code you have ever written.
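To give a feel for that declarative style, here is a minimal sketch of a SwiftUI view; the view name and strings are placeholders, not anything from Apple’s sample code. Because it uses system fonts and colors, Dynamic Type and Dark Mode support come along for free.

```swift
import SwiftUI

// A minimal, hypothetical example of SwiftUI's declarative syntax:
// the view states *what* to show, and the framework handles the rest.
struct GreetingView: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, WWDC 2019!")
                .font(.title)                    // scales with Dynamic Type
            Text("Built with SwiftUI")
                .foregroundColor(.secondary)     // adapts to Dark Mode
        }
        .padding()
    }
}
```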

SwiftUI uses a declarative syntax, so you simply state what your user interface should do. Xcode 11 includes new design tools that make building interfaces with SwiftUI as easy as drag and drop. As you work in the design canvas, everything you edit stays completely in sync with the code in the adjoining editor. Code is instantly visible as a preview as you type, and any change you make to that preview immediately appears in your code. Xcode recompiles your changes instantly and inserts them into a running version of your app, visible and editable at all times. The new graphical UI design tool built into Xcode 11 makes it much easier for UI designers to quickly assemble a user interface with SwiftUI without having to write any code.

  1. Dynamic Replacement: The design canvas isn’t just an approximation of your user interface; it’s a live preview of the running app. The Swift compiler and runtime are fully embedded throughout Xcode, so your app is constantly being built and run, and Xcode can swap edited code directly into the live app thanks to the new dynamic replacement feature (a preview sketch follows this list).
  2. Drop & Drag: The components within the user interface can easily be placed according to the design requirements by simply dragging the controls on the canvas. Click to open an inspector to select font, color, alignment, and other design options, and easily re-arrange controls with your cursor.

# ARKit 3:

At WWDC 2019, Apple officially announced ARKit 3 along with RealityKit and Reality Composer. It features improved object and image detection, motion capture, and People Occlusion. The People Occlusion feature allows virtual objects to be placed in front of and behind people in real time.
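To show how little code this takes, here is a hedged sketch of enabling People Occlusion on an existing ARKit session; `arView` stands in for whatever view the app already has and is an assumption, not part of the announcement.

```swift
import ARKit

// Enable People Occlusion where the hardware supports it.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Virtual content is hidden behind people the camera sees, using estimated depth.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(configuration)   // arView is an existing ARView in the app
```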

ARKit 3 adds two major capabilities: automatic real-time occlusion of people viewed by the device’s camera, and real-time motion capture with that same camera. Face tracking with the TrueDepth front-facing camera now supports up to three faces at a time, and developers can run face and world tracking simultaneously on the front and back cameras.
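Motion capture follows a similar pattern. The sketch below assumes an ARSession and its delegate are already wired up and only shows the body-tracking-specific parts.

```swift
import ARKit

// Run body tracking (motion capture) when the device supports it.
func startMotionCapture(in session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}

// In the ARSessionDelegate, each tracked person arrives as an ARBodyAnchor
// whose skeleton exposes joint transforms you could retarget onto a character.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        let headTransform = bodyAnchor.skeleton.modelTransform(for: .head)
        _ = headTransform   // drive a rigged character with this, for example
    }
}
```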

Reality Composer: It lets you build 3D animations and interactions on iOS and Mac to enrich your AR content (a minimal loading sketch follows the feature list below). Its key features are:

  1. Built-in AR Library: You can import your own USDZ files or choose from hundreds of ready-to-use virtual objects in the built-in content library, and customize a virtual object’s size, style, and more.
  2. Seamless Tools: Reality Composer is included with Xcode and is also available as an iOS app, so you can build, test, modify, and simulate AR experiences on iPhone, iPad, or Mac, whichever suits your workflow best.
  3. Record&Play: It allows you to record the data of sensors and cameras of the location where the AR experience will take place, and then it reloads it later on the iOS device while building the application.
  4. Animations&Audio: Add animations that let you move, scale, and add emphasis like a “wiggle” or “spin” to virtual objects. You can choose for actions to happen when a user taps an object, comes in close proximity with it or activates some other trigger. You can also take advantage of spatial audio to add a new level of reality to your AR scene.

RealityKit: It was built from the ground up specifically for augmented reality, with photo-realistic rendering, camera effects, animations, physics, and more (see the placement sketch after this list). Its standout features are:

  1. Swift API: RealityKit uses the rich features of the Swift language to provide a full set of APIs, so you can build AR experiences faster without boilerplate code.
  2. Shared AR Experiences: RealityKit makes it easier to build shared AR experiences by taking on the hard networking work, such as maintaining a consistent state, optimizing network traffic, handling packet loss, and performing ownership transfers.
  3. Scalable Performance: It takes full advantage of CPU caches and multiple cores to deliver incredibly fluid visuals and physics simulations, and it scales performance automatically, so you only need to build a single AR experience.
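To show how compact the Swift API is, here is a hedged RealityKit sketch that anchors a simple generated box to the first horizontal plane the session finds; the size and color are arbitrary choices for illustration.

```swift
import RealityKit
import UIKit   // for UIColor

// Place a small blue box on a detected horizontal plane.
func placeBox(in arView: ARView) {
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),                            // 10 cm cube
        materials: [SimpleMaterial(color: .blue, isMetallic: false)]
    )
    let anchor = AnchorEntity(plane: .horizontal)                 // waits for a plane
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```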

Conclusion:

SwiftUI is bundled with new features that make it smooth and easy for iOS developers to quickly assemble a user interface without having to write much code at all. ARKit 3 provides image detection, People Occlusion, and real-time motion capture, which make it much easier to build AR experiences. Both technologies are strong today, and I think they will only go further with time.