# 38: Better or Cheaper?


## Weekly Comment

Recently, Apple officially expanded Apple Vision Pro (AVP) sales to more countries. Given the uniqueness of AVP, especially the custom lens inserts that eyeglass wearers need, letting consumers experience the product firsthand will undoubtedly improve their understanding of it and may drive sales. However, without significant improvements in price, wearing comfort, and the ecosystem, merely expanding the sales territory is unlikely to produce exciting market results.

Lately, rumors have been swirling that Apple may halt development of the second-generation Apple Vision Pro and focus instead on a more affordable, lower-end headset. While a lower price could stimulate market demand, it would inevitably force compromises in hardware specifications, so whether this strategy can truly advance AVP's long-term development remains debatable.

Notably, many first-generation AVP users didn't purchase it as a daily computing device but for specific use cases. For instance, some use AVP as a high-end audio-visual device, leveraging its superior visuals and immersive experience; compared to traditional home-theater equipment, it offers better value for money. Applications in the medical field are particularly noteworthy: many doctors already use AVP in surgery, where it shows unique value and practicality compared to traditional medical equipment. For these users, expectations for the next generation center on further improvements in wearing comfort, performance, and display quality.

Although AVP's price is out of reach for most consumers, it is precisely this price point that lets it deliver performance and experiences that surpass competitors in specific fields. Offering only a lower-cost version while abandoning the high-end market might therefore not be the best choice, especially if it doesn't significantly boost sales.

Currently, these reports remain rumors. Consumers’ demands are always straightforward: to obtain satisfactory products at reasonable prices. Apple might consider a two-pronged strategy: on one hand, introducing a more cost-effective entry-level model to increase the popularity of headset devices; on the other hand, continuously optimizing AVP’s performance to maintain its industry-leading position. If conditions allow, the coexistence of high-end and affordable options might be an ideal solution to balance various needs.

In the future, with technological advances and falling production costs, we have reason to expect far more refined headsets. However, there is still a long way to go before most people recognize and accept the headset form factor. We can't even rule out the possibility that, before headsets truly become widespread, a revolutionary new technology emerges that fundamentally changes how we interact with the digital world. Regardless, continuous innovation, listening to user needs, and keen insight into future technology trends will be key to the healthy development of AVP, its successors, and the entire wearable device industry.

## Originals

### The Evolution of SwiftUI Scroll Control APIs and Highlights from WWDC 2024

Fatbobman

At WWDC 2024, Apple once again introduced a series of remarkable new APIs for SwiftUI's ScrollView component. These new features not only enhance developers' ability to control scrolling behavior but also reflect the ongoing evolution of the SwiftUI framework's design philosophy. This article explores the latest scroll control APIs and reviews every significant scroll-related API since SwiftUI's inception. Through this narrow lens, it traces how SwiftUI's design style has changed over the past few years and the macro design trends underlying those changes.
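For a taste of the APIs the article covers, here is a minimal sketch, assuming iOS 18 and the WWDC 2024 additions: the new ScrollPosition type for programmatic control and onScrollGeometryChange for observing scroll state. The view and row content are illustrative.

```swift
import SwiftUI

// A minimal sketch of two WWDC 2024 additions (iOS 18+):
// ScrollPosition for programmatic control, onScrollGeometryChange for observation.
struct ScrollDemo: View {
    @State private var position = ScrollPosition(edge: .top)
    @State private var offsetY: CGFloat = 0

    var body: some View {
        ScrollView {
            LazyVStack {
                ForEach(0..<100) { index in
                    Text("Row \(index)")
                        .frame(maxWidth: .infinity, minHeight: 44)
                }
            }
        }
        // Bind the scroll view to a ScrollPosition value for programmatic control.
        .scrollPosition($position)
        // Observe only the vertical content offset; the action fires when it changes.
        .onScrollGeometryChange(for: CGFloat.self) { geometry in
            geometry.contentOffset.y
        } action: { _, newValue in
            offsetY = newValue
        }
        .overlay(alignment: .bottom) {
            Button("Top (offset: \(Int(offsetY)))") {
                withAnimation { position.scrollTo(edge: .top) }
            }
            .buttonStyle(.borderedProminent)
        }
    }
}
```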

## Recent Selections

### Reverse Engineering Photos’ New Search UI

Seb Vidal

In the latest iOS 18 update, Apple has completely redesigned its Photos app, including changes to the position and appearance of the search box. In this in-depth technical article, Seb Vidal analyzes how to recreate this design using both public APIs and some private, undocumented ones. The article explores key classes such as UIKBVisualEffectView and UIKBBackdropView and uses the Objective-C runtime, among other tools, to access these private APIs and achieve a visual effect consistent with the system keyboard background.
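The core trick the article relies on is looking up private UIKit classes at runtime. Here is a hedged sketch of that lookup; the class name comes from the article, the helper function is hypothetical, and shipping private API like this would be rejected by App Store review.

```swift
import UIKit

// Look up a private UIKit class by name via the Objective-C runtime.
// UIKBVisualEffectView is private: its name and behavior can change
// between OS releases, so always fail gracefully.
func makeKeyboardBackdropView() -> UIView? {
    guard let viewClass = NSClassFromString("UIKBVisualEffectView") as? NSObject.Type else {
        return nil // class missing or renamed on this OS version
    }
    return viewClass.init() as? UIView
}
```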

### When all you have is a Core Data, everything looks like…

Wade Tregaskis

In this article, Wade Tregaskis looks back at his time at Apple, in particular when the Core Data team was eager to pitch the newly developed Core Data to the Shark team. At the time, Core Data resembled today's SwiftData: limited in functionality and fraught with issues. Through a practical example, the article reveals the limitations Core Data ran into when confronted with the vast, complex data of the Shark project. The story is not just entertaining; it offers real insight into the importance of technology selection and performance evaluation, and the many factors to weigh when choosing a data management solution.

### Create Custom SF Symbols in Sketch

Danijela Vrzan

SF Symbols is a set of built-in vector graphic icons developed by Apple for the Apple ecosystem. This icon system is perfectly integrated with the system’s default San Francisco font, ensuring consistency and flexibility in cross-platform interface design. Although Apple provides thousands of customizable, easy-to-use high-quality icons, developers may still need specific icons that are not found in the existing collection. In this article, Danijela Vrzan introduces how to create custom SF Symbols in Sketch. Through a specific example of separating fork and knife icons, the article provides a detailed demonstration of the entire process, offering a practical guide for developers.
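Once a symbol drawn in Sketch is exported as an SVG template and added to an Xcode asset catalog, it can be used much like a system symbol. A small sketch follows, where "fork.custom" is a hypothetical asset name:

```swift
import SwiftUI

// Using a custom symbol from the asset catalog. Note the `image:` label:
// custom symbols are looked up by asset name, not with `systemImage:`.
struct MenuLabel: View {
    var body: some View {
        Label("Menu", image: "fork.custom")
            .symbolRenderingMode(.hierarchical)
            .font(.title2) // custom symbols scale with the font, like SF Symbols
    }
}
```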

### WebSocket tutorial using Swift and Hummingbird

Tibor Bödecs

Hummingbird is a high-performance, flexible web framework written in Swift, designed specifically for developing modern server-side applications. In this tutorial, Tibor Bödecs provides a detailed guide on how to use Swift and Hummingbird to create real-time communication applications using the WebSocket protocol. The article not only compares WebSockets with other real-time communication technologies such as HTTP long polling, HTTP streaming, Comet, and SSE but also highlights the limitations of these methods at the protocol level. Through this article, readers will gain a deep understanding of how to implement WebSocket communication in Swift and master its technical advantages and application scenarios.
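The server-side setup is covered in the tutorial; for the other end of the connection, here is a minimal client sketch using Foundation's URLSessionWebSocketTask. The URL and path are placeholders.

```swift
import Foundation

// Connect to a WebSocket endpoint, send one text frame, and await one reply.
let url = URL(string: "ws://localhost:8080/chat")!
let task = URLSession.shared.webSocketTask(with: url)
task.resume()

task.send(.string("Hello from Swift")) { error in
    if let error { print("send failed: \(error)") }
}

// receive(completionHandler:) delivers a single frame; call it again
// (typically recursively) to keep listening.
task.receive { result in
    switch result {
    case .success(let message):
        if case .string(let text) = message { print("received: \(text)") }
    case .failure(let error):
        print("receive failed: \(error)")
    }
}

RunLoop.main.run() // keep a command-line script alive for the async callbacks
```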

### Migrating Widget Configurations with Parent Parameters to use AppIntent

Quentin Zervaas

Starting with iOS 17, Apple introduced the App Intents system, replacing the previous SiriKit-based INIntent system, to make widget configuration more flexible. Although the transition from the old system to the new one is smooth in most cases, challenges arise with parent parameters that need to dynamically show and hide dependent configuration options. In this article, Quentin Zervaas shares his solution: by adopting a new approach, he greatly reduced the configuration's complexity and significantly streamlined the widget configuration interface.
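For orientation, here is a minimal sketch of an App Intents widget configuration with a parent/child parameter relationship (iOS 17+). The type and parameter names are hypothetical, and the pattern shown is the standard When/otherwise parameter summary rather than Quentin's exact solution.

```swift
import AppIntents
import WidgetKit

// A widget configuration intent with a "parent" toggle that shows or
// hides a dependent parameter via the parameter summary.
struct CounterConfigurationIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Counter Options"

    @Parameter(title: "Show Details", default: false)
    var showDetails: Bool

    @Parameter(title: "Detail Count", default: 5)
    var detailCount: Int

    static var parameterSummary: some ParameterSummary {
        When(\.$showDetails, .equalTo, true) {
            Summary("Show details: \(\.$showDetails)") {
                \.$detailCount // only visible when the parent toggle is on
            }
        } otherwise: {
            Summary("Show details: \(\.$showDetails)")
        }
    }
}
```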

### SwiftUI app lifecycle: issues with ScenePhase and using AppDelegate adaptors

Jesse Squires

At WWDC 2020, SwiftUI introduced the ScenePhase API to represent an application's lifecycle states. However, compared to the traditional AppDelegate-based approach, ScenePhase falls short on crucial events such as application launch and termination. Jesse Squires advocates for a more comprehensive AppPhase API that would separate window-scene events from application-level lifecycle events. He points out that the current ScenePhase API is overly tied to window management and fails to cover the full spectrum of application state changes, performing poorly on macOS and only somewhat better on iOS.

Tip: A class that handles AppDelegate logic can adopt the ObservableObject protocol, and it can also be combined with the Observation framework. Developers can take full advantage of this when building their own lifecycle notification mechanisms; for more details, see Exploring SwiftUI Property Wrappers.
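A brief sketch of that tip, with illustrative names: because the adaptor type conforms to ObservableObject, SwiftUI automatically places it in the environment, so any view can observe app-level events that ScenePhase misses.

```swift
import SwiftUI
import UIKit

// An AppDelegate that doubles as an ObservableObject. Declaring it via
// @UIApplicationDelegateAdaptor makes SwiftUI inject it into the environment.
final class AppDelegate: NSObject, UIApplicationDelegate, ObservableObject {
    @Published var launchDate: Date?

    func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        launchDate = Date() // an app-level event ScenePhase does not report
        return true
    }
}

@main
struct MyApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) private var appDelegate

    var body: some Scene {
        WindowGroup { ContentView() }
    }
}

struct ContentView: View {
    @EnvironmentObject private var appDelegate: AppDelegate

    var body: some View {
        Text("Launched at \(appDelegate.launchDate?.formatted() ?? "unknown")")
    }
}
```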
