New Wave of AI-first Devices (Part 2: AI x Companions)

Since the release of the first smartphone, IBM's Simon Personal Communicator (SPC), in 1994, three decades of relentless research, development, refinement, and optimization have transformed smartphones into remarkable super devices. Their versatility has made them an indispensable daily tool for many people. Yet any hardware form factor is limited by the laws of physics, putting certain capabilities beyond its reach. This is where companions come into play, extending the capabilities of smartphones and sometimes acting as simplified versions of their hosts.

Capabilities beyond the direct reach of smartphones

All devices are designed to interact with users through their sensors. The five primary senses of sight (vision), hearing (audition), touch (somatosensation), taste (gustation), and smell (olfaction) are most commonly considered. Some smartphone features have even leveraged other senses, such as proprioception, the ability to sense the position and movement of the body and its parts.

However advanced and versatile, smartphones can't do everything: some capabilities are either infeasible to build in or too detrimental to the user experience. Let's examine a few examples and explore how they represent ideal opportunities for AI-augmented companion devices like smartwatches, earbuds, and smart glasses.

Sight

First-person image & video capture. Smartphones all have cameras (some very sophisticated), but targeting and capturing a subject requires hands (to hold the phone), eyes (to find the subject), and often fingers (to tap and focus the lens). Capturing images, and especially videos, from a first-person perspective is a test of endurance and tenacity.

Leaving the job to a pair of smart glasses, which sit exactly where human vision happens, is an ideal solution with many user-experience benefits: enjoying the captured images and videos immersively, locking focus with the eyes, and learning about the world by seeing it, all in a more natural way without having to pull the phone out of a pocket.

Visual health monitoring. Billions of people worldwide suffer from vision-related issues. Commonly seen conditions like glaucoma and cataracts require consistent monitoring. I anticipate the development of smart eyewear capable of monitoring these conditions, reducing the need for frequent clinic visits and addressing the understaffing and limited capacity of healthcare facilities. Importantly, if smart glasses can detect early signs of disease, patients can seek timely treatment.

Hearing

Smartphones do have auditory capabilities, but they are far from ideal. Users may want privacy and a more immersive, spatially aware audio experience, which earbuds and headphones can provide. Additionally, many more use cases are tied to hearing, such as AI-powered real-time translation.

Real-time translation directly into ears. One widely adopted translation solution on the market requires users to take out their phones, press and release buttons, read translations off the screen, and often retry because background noise leads to inaccurate capture of the conversation.

While this solution meets users' bare-minimum needs, a more streamlined approach is to build real-time translation into hearing devices like earbuds or headphones that can filter out background noise or enhance specific sounds*. One can imagine people who speak different languages, yet understand each other, building a modern Tower of Babel. Given advances in AI voice cloning, the builders could even speak to each other in their own voices and tones.

*For hearing-impaired users, smartphone or smart-glasses solutions with a screen and display are more helpful.

More Senses: Touch, Smell and Maybe Taste

Fitness & health tracking. This is a significant category in fitness and healthcare where companions like smartwatches and rings excel. While smartphones have impressive computing power, they typically carry a limited range of sensors, resulting in relatively low data accuracy and a lack of continuous monitoring. Dedicated wearables are ideal for collecting accurate data and transmitting it to smartphones for processing. Users already rely on smartwatches and fitness wearables to track many vitals and health metrics, and more can be expected at the confluence of hardware and AI capabilities.

Air quality analysis. Devices can mimic our olfactory receptors, which detect chemical compounds in the air, to identify and analyze specific airborne pollutants or dangerous gases in the local environment, offering real-time alerts to users. Making such devices portable or wearable allows people with respiratory problems to take preventive measures anytime, anywhere.

Taste simulation. This is still rudimentary, but research has been done on devices that electrically or chemically stimulate taste receptors on the tongue to induce artificial flavors, helping people manage their diets or neurologically driven food behaviors.

If a smartphone is a person's brain, companions are everything else that helps the person collect data and information from the surroundings and the world, and they will soon complete tasks such as grabbing a book, hitting a ball, or something more complicated with a longer decision-making chain.

There is tremendous potential for companion products in fitness & healthcare and in understanding people and the world better, significantly augmenting the capabilities of their hosts, the AI-first super devices like smartphones.

Positioning and Messaging

For companion devices, product messaging should align with their positioning as intelligent extensions, not replacements for smartphones. While the device may have many features, only a few key benefits should be highlighted to avoid confusion or disappointment. Emphasizing multiple features can lead to comparisons with smartphones, where the user experience might be more optimized. Instead, it's better to focus on showcasing unique experiences and features that complement the smartphone, creating a sense of added value. I personally enjoy discovering hidden features, as they feel like unexpected bonuses.

When it comes to specific messaging tactics for AI-first companion products, here are some useful tips based on my years of product marketing experience.

  1. Focus on value, not technology. It's more about augmenting users' self-discovery than about devices knowing users better than they know themselves. People value having a sense of control over their own identity and decisions; hinting that a product knows them better undermines this autonomy.

  2. Directly address real user pain points that the hosts alone cannot fully solve. For example, "Constantly losing your keys? Smart glasses with object finding make it a thing of the past." Users will automatically extend the use case beyond keys to everything they might misplace. Unless those items have built-in trackers, smartphones usually can't find them.

  3. Numbers are more persuasive than words when supporting benefit claims. For example, consider earbuds that offer "real-time translation with less than 0.3-second delay for seamless conversations". This claim feels honest because it does not promise zero latency, and users tend to perceive a 0.3-second delay as negligible.

  4. Avoid the uncanny valley effect that results from any hint of AI sentience, which triggers mistrust (on top of privacy concerns) rather than excitement about AI capabilities. Companion products, especially those in familiar form factors, already have decent penetration among the general population, so they are not always marketed to early adopters.

  5. Refrain from attributing divine qualities to AI, as its mechanisms are not yet fully understood. AI systems, no matter how powerful, can still hallucinate or malfunction unexpectedly for reasons unknown as of now. These limitations pose challenges for the user experience.

  6. Letting others speak for you can be a risky but highly effective strategy. Real users, influencers (especially tech media and reviewers), and partners can all effectively engage with the target consumer audience by offering fresh perspectives on products. We can assume they have conducted thorough evaluations and will provide honest feedback. Comparing these results to product messaging can sometimes yield unexpected outcomes, either exceptionally positive or disastrous, depending on whether the product exceeds or falls short of the promised benefits. It's important to note that pricing has a multiplying effect on these results. If a product is priced too high (in the eyes of the user), failure to deliver on the full value proposition can aggravate its downfall.

Finally, I’d like to include some stats on the penetration of companion devices in the US.

  • In 2023, the smartphone penetration rate was around 92% (source)

  • In 2023, more than 65% of American households had headphones and/or earbuds (source)

  • 63.7% of adult Americans wear prescription eyeglasses (source)

  • 35% of Americans aged 12 and above owned a smart speaker in 2022 (source)

  • 1/3 of Americans use a wearable device, like a smartwatch or band, to track their health and fitness (source).  Approximately 26% of American households had access to a smartwatch (source).

There are undoubtedly immense opportunities to augment these companion devices with AI capabilities, requiring less user education compared to introducing a new form factor. I eagerly anticipate upgrading my devices to gain more insight into my health, productivity, surroundings, and the world.

Previous: New Wave of AI-first Devices (Part 3: AI x novel form factors)

Next: New Wave of AI-first Devices (Part 1: AI x Smartphone)