The AI Wearables Problem: Why Smart Glasses Keep Failing (And What UX Strategy Can Learn)

As product design consultants, we're watching another tech hype cycle unfold—this time around AI-powered wearables. Google Glass failed. Apple has reportedly scaled back the Vision Pro. Yet companies keep pouring billions into smart glasses that users don't want.

This isn't just another product failure story. It's a masterclass in what happens when technology chases trends instead of solving real user problems—and why strategic UX design agencies need to be involved from day one, not after $3,500 goggles sit unsold on shelves.

The Pattern: Promising Demos, Disappointing Reality

Google Glass: The Original Sin

Remember the Google Glass demo video? It showed compelling use cases:

  • Turn-by-turn walking directions overlaid on your vision
  • Hands-free photography capturing moments naturally
  • Instant translation of foreign text
  • Seamless video calls while staying present
  • Voice-activated information retrieval

The demo was brilliant. The product was vaporware.

What Google showed was an aspirational future, not functional reality. As UX design consultants, we see this pattern constantly: sales and marketing teams create videos showing what the product might do someday, not what it actually can do today.

The result? Google Glass launched to developers with bright orange frames that made users look like they were cosplaying Star Trek characters who couldn't commit to the full costume. The interface was clunky. The features didn't work as promised. The $1,500 price tag bought you social stigma and technical frustration.

More critically: Google never validated that people actually wanted computers strapped to their faces in the first place.

Apple Vision Pro: $3,500 of Misplaced Assumptions

Fast forward to 2024, and Apple—the company that supposedly "gets" user experience—launched the Vision Pro with the same fundamental flaws.

Immediate barriers to adoption:

  • You can't try it without booking a demo in advance
  • $3,500 price point eliminates casual buyers
  • Requires controlled hand gestures in open space (making you look ridiculous in public)
  • Eye-tracking control creates cognitive load
  • Heavy, uncomfortable for extended wear
  • Zero social acceptability for public use

As a UX design agency working with hardware and software companies, we know the first rule of product strategy: validate assumptions before spending billions on manufacturing.

Did anyone at Apple conduct ethnographic research to see if people would actually wear these in public? Did they test whether the $3,500 price point had product-market fit? Did they validate that eye-tracking and hand gestures were better than simply using your phone?

The market answered: no. Apple has reportedly scaled back the Vision Pro line.

Meta's Ray-Ban Glasses: Closer, But Still Missing the Point

Meta partnered with Ray-Ban to create smart glasses that look normal—a massive improvement. They've added a "neural band" wristband that reads muscle movements, allowing you to "write" messages by tracing letters on your knee with your finger.

The innovation is impressive. The technology works. But as UX consultants working across industries, we have to ask: Who is this for?

Think about the use case:

  • You need both the glasses and the wristband
  • You need to be somewhere you can't speak aloud (but somehow can gesture visibly)
  • You need to write in cursive (does Gen Z even learn cursive anymore?)
  • The person you're messaging needs to wait for you to slowly trace out letters

Compare this to: pulling out your phone and typing. Or using voice-to-text. Or literally any other communication method we've developed over the past 15 years.

This is technology for technology's sake—not technology solving real user problems.

The Trust Problem: Why Users Are Rejecting AI Wearables

Beyond usability issues, there's a more fundamental problem: users don't trust the companies making these devices.

The Facebook/Meta Credibility Crisis

Meta's Ray-Ban glasses come with a camera. You're essentially wearing Facebook's surveillance apparatus on your face, recording everything you see, everywhere you go.

Why this matters:

  • Meta eliminated fact-checking, signaling they won't moderate misinformation
  • Facebook has repeatedly violated user privacy
  • Instagram's algorithms have been shown to harm teen mental health
  • The company's business model depends on surveillance capitalism

As fractional design officers consulting with tech companies, we emphasize that trust is the foundation of adoption. When users don't trust a company with their data, they won't adopt technology that puts that company's sensors on their bodies.

Real-world rejection: Strip clubs and bars are turning away people who wear smart glasses. The social contract is clear: we don't trust you not to record us without consent.

The AI Training Data Problem

Here's where it gets worse. Both Anthropic (maker of Claude) and LinkedIn recently updated their terms of service to allow training AI models on user data. Apple has been more protective, but the pattern is clear: these companies will use whatever you feed their devices to improve their AI.

What this means for smart glasses:

  • Everything you see gets fed into training data
  • Your private conversations become model improvements
  • Your medical appointments, family moments, intimate spaces—all potentially scraped
  • No meaningful opt-out, no real privacy protection

UX consulting firms working in regulated industries (healthcare, finance, legal) understand: privacy isn't a nice-to-have feature. It's a legal requirement and a user expectation.

Would you wear glasses that record everything and send it to a company that's already violated your trust? Neither would we.

The Interface Problem: Gestures Aren't Better Than Screens

Let's talk about why these interfaces fundamentally don't work from a UX perspective.

The Gesture Fallacy

Movies show us characters making cool hand gestures in the air, manipulating holograms and interfaces that look visually stunning on screen. This is designed for cinema, not reality.

As a UX design agency in Nashville working with actual human beings in actual contexts, we know:

Problems with gesture interfaces:

  • Physical fatigue: Your arms get tired quickly ("gorilla arm" effect)
  • No tactile feedback: You're touching nothing, getting no physical confirmation
  • Spatial confusion: No frame of reference for where interface elements "are"
  • Social awkwardness: You look absurd to everyone around you
  • Accuracy issues: Small movements are hard to control precisely
  • Context blindness: Gestures appropriate in your living room aren't acceptable in public

One of our team members tried a Nissan VR demo at an innovation summit. The experience? A clunky interface, wild gestures eating up shared space, and the self-conscious feeling of flailing at nothing while others watched.

The promised use case: Customize a car's interior and see thread detail in seats.

The reality: An awkward, uncomfortable experience that could have been accomplished better with a tablet, a website, or literally just visiting a showroom.

Eye Tracking: Cognitive Load Nightmare

Apple Vision Pro uses eye tracking for navigation. You look at interface elements to select them, then pinch your fingers to activate.

Why this increases cognitive load instead of reducing it:

  • Your eyes naturally scan environments—now every glance is a potential input
  • You have to consciously control where you look (exhausting)
  • Interface elements compete with real-world objects for attention
  • No muscle memory develops (unlike physical interfaces)
  • Accessibility nightmare for anyone with vision or motor control differences

Product design consultants know that good interfaces become invisible. You don't think about how to use them—you just use them. Eye tracking requires constant conscious attention.

The Use Case Crisis: Solutions Looking for Problems

Here's the fundamental issue that UX consultants see across all these products: they're solutions looking for problems, not solutions to identified problems.

What Problems Do Smart Glasses Actually Solve?

Let's be honest about actual use cases:

Turn-by-turn directions:

  • Your phone already does this with audio cues
  • You can glance at your phone while stopped
  • Overlaying directions on reality is neat but not meaningfully better

Hands-free photography:

  • Surveillance concerns outweigh benefits
  • Quality doesn't match phone cameras
  • Social stigma of obviously recording people

Information lookup:

  • Voice assistants on phones already do this
  • Pulling out your phone takes two seconds
  • Not worth wearing a $500-3,500 device 24/7

Translation:

  • AirPods now do real-time translation
  • You still need the other person to understand you
  • Google Translate with OCR already handles text

Augmented reality overlays:

  • Cool in demos
  • Distracting and cognitively taxing in reality
  • Rarely solves actual daily problems

The Only Compelling Use Case: Specialized Professional Work

The one area where AR/VR shows promise is specialized professional applications:

Engineering and design:

  • Visualizing complex 3D structures
  • Collaborative design sessions across locations
  • Prototyping without physical materials
  • Testing spatial relationships

Medical training:

  • Surgical simulation and practice
  • Anatomy education with 3D models
  • Remote surgical assistance

Industrial maintenance:

  • Overlay repair instructions on machinery
  • Remote expert guidance for technicians
  • Safety information and warnings

Architecture and construction:

  • Visualizing buildings before construction
  • On-site plan verification
  • Client walkthroughs of unbuilt spaces

Notice what these have in common? Professional settings with specific, high-value problems that justify the cost and awkwardness.

As fractional UX experts working with B2B companies, we'd tell clients: focus there. Build for professionals who will tolerate interface quirks because the value justifies it. Don't try to make mass consumer products until the technology actually solves mass consumer problems.

The Cost-Benefit Calculation Nobody's Making

Let's talk about what UX design agencies should be asking during product strategy sessions:

Apple Vision Pro Analysis

Costs:

  • $3,500 purchase price
  • Social stigma of wearing in public
  • Physical discomfort during extended use
  • Learning curve for eye tracking and gestures
  • Limited app ecosystem
  • Isolation from people around you

Benefits:

  • Watch movies on a virtual giant screen (instead of your TV)
  • Browse the internet while looking ridiculous
  • Play immersive games (that you could play on existing VR headsets for $500)

Product design consultants would immediately flag: the cost-benefit doesn't close. You're paying $3,500 for marginally better movie watching. That's not a viable product.

Meta Ray-Ban with Neural Band Analysis

Costs:

  • $300-500 for glasses + wristband
  • Learning to write in cursive with finger gestures
  • Battery life limitations
  • Privacy concerns from always-on camera
  • Trust issues with Meta as a company

Benefits:

  • Hands-free messaging (that's slower than pulling out your phone)
  • Camera for capturing moments (with massive social stigma)
  • AI assistant (that your phone already has)

Again: the math doesn't work. UX consulting firms exist to help companies see this before they invest millions in manufacturing.
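
This back-of-napkin screen can be sketched as a toy script. Every score, weight, and price below is an illustrative assumption for the sake of the exercise, not research data:

```python
# Toy cost-benefit screen for a proposed device against the phone baseline.
# All 1-5 scores and prices here are illustrative assumptions.

def value_score(problem_fit: int, social_acceptability: int, comfort: int) -> float:
    """Average the benefit dimensions into a single 1-5 score."""
    return (problem_fit + social_acceptability + comfort) / 3

def clears_baseline(device: dict, baseline: dict) -> bool:
    """A new device should beat the baseline: its value uplift must outpace its price premium."""
    premium = device["price"] / baseline["price"]
    uplift = value_score(**device["scores"]) / value_score(**baseline["scores"])
    return uplift > premium

phone = {"price": 800, "scores": {"problem_fit": 4, "social_acceptability": 5, "comfort": 4}}
vision_pro = {"price": 3500, "scores": {"problem_fit": 2, "social_acceptability": 1, "comfort": 2}}

print(clears_baseline(vision_pro, phone))  # → False under these assumed scores
```

Under these assumed numbers, the device's value uplift over a phone would have to outpace its roughly 4x price premium, and it doesn't come close.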

What the AI Wearables Industry Is Missing: Basic UX Research

Every failure we've discussed comes back to the same root cause: nobody validated that users wanted these products before building them.

As UX consultants in Nashville working across industries, here's the research that should have happened:

Ethnographic Research

What they should have done:

  • Observe people in real contexts throughout their day
  • Identify actual pain points and friction in daily activities
  • Map where current tools (phones, laptops, watches) fall short
  • Understand social contexts and privacy concerns
  • Test prototypes in real environments with real people

What they actually did:

  • Built cool technology
  • Created aspirational demo videos
  • Assumed people would find uses for it
  • Ignored social stigma and trust concerns
  • Skipped validation until after production

Use Case Validation

Questions they should have asked:

  • What problem are we solving?
  • Who experiences this problem?
  • How do they currently solve it?
  • Would our solution be meaningfully better?
  • Would they pay our price point for this improvement?
  • Would they tolerate the social awkwardness?

What they assumed:

  • Technology is inherently better
  • Users want to be early adopters
  • If we build it cool enough, they'll find uses
  • Privacy concerns will be overlooked for convenience

Competitive Analysis

What they should have compared:

  • Current phones (free with service contracts, universally accepted)
  • Smart watches ($200-500, socially acceptable, solves specific problems)
  • Tablets (portable, great screens, full app ecosystem)
  • AirPods ($150-250, socially acceptable, hands-free audio)

The verdict: In almost every use case, existing tools work better, cost less, and don't make you look like an asshole.

The Five-Stage Evolution of AI (And Why Wearables Jump Ahead)

Luke Wroblewski (Luke W) outlined the evolution of AI products in an insightful 60-page article. Understanding these stages helps explain why wearables are failing.

Stage 1: Machine Learning Behind the Scenes (Pre-2022)

What it looked like:

  • Amazon's "you may also like" recommendations
  • Google's search ranking algorithms
  • Netflix content suggestions
  • Spotify's personalized playlists

Why it worked: Users didn't need to understand the technology. It quietly improved experiences without requiring behavioral changes.

Stage 2: Natural Language Interfaces (2022-2023)

What it looked like:

  • ChatGPT conversational prompts
  • Text-based AI assistants
  • Natural language queries replacing keyword searches

Why it worked: Leveraged existing behavior (typing, conversations) with familiar interface patterns (text boxes, chat windows).

Stage 3: Retrieval Augmented Generation (RAG) (2023-2024)

What it looked like:

  • AI citing sources and showing references
  • Context-aware responses based on document libraries
  • Specialized AI trained on company-specific data

Why it works: Adds transparency and trust to AI outputs. Users can verify claims and understand where information comes from.
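
The retrieve-then-cite loop behind Stage 3 can be sketched in a few lines. The keyword-overlap scoring and the two document snippets here are toy stand-ins; real systems use vector embeddings and an actual LLM call:

```python
# Minimal sketch of the RAG pattern: retrieve relevant snippets, then ground
# the answer in them with citations. Keyword overlap stands in for embeddings.

documents = {
    "returns-policy.md": "You may return items within 30 days of purchase",
    "shipping-faq.md": "Standard shipping takes 5 to 7 business days",
}

def retrieve(query: str, docs: dict, top_k: int = 1) -> list:
    """Rank documents by how many words they share with the query; keep the top_k."""
    words = set(query.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda item: len(words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, docs: dict) -> str:
    """Assemble a prompt that forces the model to cite named sources."""
    context = "\n".join(f"[{name}] {text}" for name, text in retrieve(query, docs))
    return f"Answer using ONLY these sources, citing them by name:\n{context}\n\nQ: {query}"

print(build_prompt("how many days to return items", documents))
```

The point of the pattern is visible in the prompt itself: the model is handed named sources and told to cite them, which is what makes its output verifiable.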

Stage 4: Foreground Agents (2024-2025)

What it's becoming:

  • AI that plans multi-step tasks
  • Agents that can edit and refine mid-task
  • Tools that surface their own limitations
  • Visible AI assistance rather than invisible automation

Why it's challenging: Users need to understand what the AI is doing and maintain appropriate oversight.

Stage 5: Background Agent Ecosystems (Future)

What it could become:

  • Multiple AI agents collaborating automatically
  • Your personal AI coordinating with others' AIs
  • Seamless integration across tools and platforms
  • True "ambient computing"

Why it's not here yet: It depends on problems we haven't solved (trust, interoperability, privacy, safety).

Where Wearables Fit (Or Don't)

The problem: Companies are trying to jump straight to Stage 5—ambient computing with glasses and wearables—without successfully implementing Stages 3 and 4.

As fractional design officers consulting on AI strategy, we tell clients: walk before you run. Master text-based AI interfaces before you try to revolutionize how humans interact with computers.

The Lesson: Technology Trends Aren't Product Strategy

We have seen this pattern repeatedly across decades:

Failed "Future" Technologies:

  • 3D TVs (everyone bought them, nobody used the 3D feature)
  • VR/AR mass adoption (Oculus and PlayStation VR remain niche)
  • Crypto/Web3 (speculation asset, not useful technology)
  • NFTs (see: crypto)
  • Google Glass (already discussed)
  • Segway (remember when this was going to "revolutionize cities"?)

What they had in common:

  • Massive hype and marketing
  • Passionate advocates insisting "you don't get it yet"
  • Predictions it would change everything
  • Actual users finding it didn't solve their problems
  • Eventual market rejection

The pattern: When everyone is screaming about "the future," be skeptical. The actual future arrives quietly and solves real problems so effectively that adoption happens organically.

Example: Smartphones weren't hyped as "the future of computing." They just worked so much better than previous phones that everyone switched.

What's Actually Working: Incremental Improvements to Existing Form Factors

While companies chase AR/VR dreams, the real innovation is happening in boring, incremental improvements:

AirPods Translation Feature

What it does: Real-time language translation through earbuds you already wear.

Why it works:

  • Solves a real problem (language barriers while traveling)
  • Uses existing, socially acceptable hardware (earbuds)
  • Doesn't require both parties to have the technology
  • Affordable ($150-250 vs. $3,500)
  • Works with devices people already own

This is the kind of innovation product design consultants help companies identify: high-value improvements to existing workflows using accepted form factors.

Smart Watch Health Monitoring

What it does: Continuous health tracking, fall detection, heart monitoring, sleep analysis.

Why it works:

  • Solves real problems (health awareness, emergency detection)
  • Socially acceptable (watches are normal)
  • Passive data collection (no active interface required most of the time)
  • Integrates with existing healthcare systems
  • Price point justifies value ($200-500)

Phone Camera Computational Photography

What it does: AI-enhanced photos, night mode, portrait effects, object detection.

Why it works:

  • Makes existing tool (camera) dramatically better
  • No behavioral change required (just take photos normally)
  • Results are immediately visible and valuable
  • Free with phone purchase

The pattern: The most successful AI integrations improve existing tools and workflows. They don't require users to adopt entirely new behaviors or interface paradigms.

The Interface Isn't Disappearing—It's Evolving

There's a narrative in tech that "interfaces will disappear" and we'll interact naturally through voice and gestures. We're watching this prediction fail in real-time.

Why interfaces aren't disappearing:

1. Interfaces Provide Cognitive Scaffolding

Physical and visual interfaces help users:

  • Understand available options
  • Remember system state
  • Develop muscle memory
  • Receive confirmation of actions
  • Correct mistakes easily

Voice and gesture interfaces remove this scaffolding, making interactions harder, not easier.

2. Context Matters

Different contexts require different interfaces:

  • Private spaces: Voice commands work fine
  • Public spaces: Visual interfaces are less disruptive
  • Noisy environments: Touch interfaces are more reliable
  • Quiet environments: Voice might work
  • While moving: Simple physical buttons beat everything

One interface paradigm can't serve all contexts. UX design agencies know you need adaptive systems that match interface to situation.

3. Precision Requires Visual Feedback

Try this experiment: Close your eyes and tell your voice assistant to send an email. Include specific formatting, check for typos, ensure the recipient is correct.

It's nearly impossible. Visual interfaces will always be necessary for precision work.

4. Humans Like Tangible Interactions

There's a reason mechanical keyboards are popular despite touchscreens existing. There's a reason people still prefer physical books for deep reading. There's a reason musicians prefer physical instruments.

Tactile feedback and physical interaction aren't bugs—they're features of human cognition.

Service Design: The Missing Piece in AI Wearables Strategy

One critical element absent from every failed wearables launch: comprehensive service design.

Service design focuses on the entire customer experience by aligning processes, people, and infrastructure across the user journey. It's often conflated with design thinking, but service design goes further: it orchestrates the behind-the-scenes operations as well as the touchpoints users see.

As a UX design agency, we facilitate service design workshops where we bring together:

  • Designers
  • Developers
  • Business stakeholders
  • Product managers
  • Customer service teams
  • Legal and compliance
  • Marketing

The goal: Map the complete ecosystem and ensure every touchpoint works together.

What Service Design Would Have Revealed About Vision Pro

The complete user journey:

  1. Awareness: See marketing, get interested
  2. Consideration: Want to try it, but can't—must book demo
  3. Demo: Wait days, go to Apple Store, need 30-60 minute appointment
  4. Purchase decision: $3,500 with unclear use cases
  5. Setup: Complex calibration and fitting process
  6. First use: Learning curve for eye tracking and gestures
  7. Daily use: Where do you store it? How do you clean it? Battery life limitations?
  8. Social use: Everywhere you go, you look ridiculous
  9. Content ecosystem: Limited apps and experiences
  10. Support: Issues require in-person Apple Store visits
  11. Abandonment: Device sits unused, feels like waste of money

A proper service design process would have identified that almost every step contains friction, social stigma, or unclear value. We would have flagged these issues long before manufacturing.

What Service Design Looks Like in Practice

When we run service design workshops, here's our process:

Day 1: Problem Identification

  • Map current state with all stakeholders
  • Identify pain points at every touchpoint
  • Gather data from customer service, reviews, support tickets
  • Create empathy maps for different user types
  • Prioritize problems by impact and frequency

Day 2: Ideation and Solution Design

  • Brainstorm solutions without judgment
  • Use "yes, and" thinking to build on ideas
  • Challenge assumptions about what's possible
  • Create rough prototypes of promising concepts
  • Test assumptions with quick validation

Day 3: Alignment and Planning

  • Align stakeholders on priorities
  • Create roadmap for implementation
  • Identify dependencies and risks
  • Define success metrics
  • Establish feedback loops for iteration

This isn't a one-time workshop. It's a continuous practice that should happen throughout product development, not just at the beginning.

Apple clearly skipped this process with the Vision Pro. Otherwise someone would have asked, "Who wants to wear these in public?"—and heard the answer: "Nobody."

The Real Future of AI: Not on Your Face

So if wearables aren't the answer, where is AI actually heading in ways that will improve user experiences?

Background Automation That Actually Helps

What's working:

  • Email categorization and priority sorting
  • Calendar scheduling finding optimal meeting times
  • Photo organization and search
  • Predictive text and autocorrect
  • Fraud detection in financial transactions
  • Content moderation (when done responsibly)

Why it works: Solves annoying tasks, stays invisible, doesn't require new behaviors.

Specialized Professional Tools

What's emerging:

  • Medical image analysis assisting radiologists
  • Code completion for developers
  • Design system component generation
  • Legal document analysis
  • Research paper summarization
  • Data analysis and visualization

Why it works: High-value professional use cases justify learning curves and limitations.

Enhanced Creative Tools

What's useful:

  • Brainstorming and ideation assistance
  • Draft generation for iteration
  • Style exploration and variations
  • Translation and localization
  • Accessibility improvements (alt text, captions)

Why it works: Augments human creativity rather than replacing it, speeds up tedious parts of creative work.

Voice Interfaces in Specific Contexts

What works:

  • Car controls while driving
  • Kitchen timers while cooking
  • Smart home controls while hands are full
  • Accessibility for vision or mobility impaired users

Why it works: Solves specific contextual problems where hands-free is genuinely valuable.

What doesn't work: Trying to make voice the only interface or pretending it's always better than visual/touch interfaces.

What Companies Should Do Instead: The Strategic UX Approach

If you're building AI products or considering wearable technology, here's the framework fractional design officers and UX consultants use:

1. Start With Real User Problems

Don't ask: "How can we use this cool technology?"

Ask instead:

  • What problems do our users actually face?
  • How do they currently solve these problems?
  • What makes current solutions inadequate?
  • Would our technological solution be meaningfully better?

Validate before building. Conduct ethnographic research. Observe users in natural contexts. Identify real pain points, not imagined ones.

2. Map the Complete Ecosystem

Consider every touchpoint:

  • How do users learn about the product?
  • What's the first interaction experience?
  • How do they set it up?
  • What's daily usage like?
  • How do they get support?
  • How does it integrate with existing tools?
  • What happens when things go wrong?

UX design agencies know that a great core product can fail if any part of the ecosystem is broken.

3. Prototype and Test in Real Contexts

Don't just build demos. Test with real users in real environments:

  • Would they actually use this at work? Test at their workplace
  • Would they wear this in public? Test in public spaces
  • Would they use this at home? Test in home environments

Watch for:

  • Social discomfort or stigma
  • Cognitive load and fatigue
  • Physical discomfort
  • Privacy concerns
  • Comparison to existing solutions

4. Validate Willingness to Pay

Price sensitivity testing:

  • What would users pay for this improvement?
  • Does the value justify the cost?
  • What's the total cost of ownership? (device + accessories + subscriptions + replacement cycle)

The Vision Pro at $3,500 is a perfect example of failing to validate price sensitivity.
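The total-cost-of-ownership point is worth making concrete. A minimal sketch, using entirely hypothetical figures (not actual Apple pricing), shows how quickly accessories, subscriptions, and replacement cycles inflate the sticker price:

```python
# Illustrative total-cost-of-ownership (TCO) calculator.
# All dollar figures below are hypothetical, for illustration only.

def total_cost_of_ownership(device, accessories, monthly_subscription,
                            ownership_months, replacement_cycle_months):
    """Hardware plus services over the ownership period.

    Counts a full hardware re-purchase each time the replacement
    cycle elapses within the ownership window.
    """
    replacements = max(0, (ownership_months - 1) // replacement_cycle_months)
    hardware = (device + accessories) * (1 + replacements)
    services = monthly_subscription * ownership_months
    return hardware + services

# Hypothetical headset vs. hypothetical smartwatch over three years:
headset = total_cost_of_ownership(3500, 300, 10, 36, 48)  # 4160
watch = total_cost_of_ownership(400, 50, 5, 36, 36)       # 630
```

Even without a mid-ownership replacement, the hypothetical headset costs more than six times the watch over the same period—exactly the kind of gap price-sensitivity testing should surface before manufacturing.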

5. Consider Trust and Privacy

Critical questions:

  • Do users trust your company with this data?
  • What data are you collecting?
  • How are you protecting privacy?
  • What are you training AI models on?
  • Can users opt out meaningfully?
  • Are you transparent about data usage?

Meta's smart glasses fail here because users fundamentally don't trust Facebook with always-on cameras and microphones.

6. Build for Specific Use Cases First

Don't try to be everything to everyone.

Start with one specific, high-value use case:

  • Construction site safety with overlay warnings
  • Medical training with AR anatomy
  • Remote expert guidance for technicians
  • Language translation for specific professional contexts

Master the specific before attempting general consumer products.

7. Measure Real Outcomes, Not Engagement

Wrong metrics:

  • Time spent in device
  • Number of features used
  • Engagement with AI prompts

Right metrics:

  • Did we solve the user's problem?
  • Is this faster/easier/better than the previous solution?
  • Would users recommend this?
  • Do they continue using it after novelty wears off?
  • Net Promoter Score (NPS)

Product design consultants help companies identify metrics that reflect actual user value, not vanity metrics that look good in board presentations.
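For teams unfamiliar with NPS, the calculation itself is simple. On the standard 0–10 "would you recommend this?" scale, scores of 9–10 are promoters, 0–6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors (passives at 7–8 count only toward the total):

```python
# Net Promoter Score from 0-10 survey responses.
# Promoters: 9-10, detractors: 0-6, passives (7-8) dilute both percentages.

def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

survey = [10, 9, 9, 8, 7, 6, 5, 10, 3, 9]
print(net_promoter_score(survey))  # 5 promoters, 3 detractors of 10 -> 20
```

NPS ranges from −100 (all detractors) to +100 (all promoters); tracking it after the novelty period is what distinguishes it from the vanity metrics above.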

Our Recommendations: What to Do If You're Building AI Products

As fractional UX experts and UX consultants, here's our direct advice:

For Consumer AI Products:

  1. Solve existing problems better; don't create new interaction paradigms
  2. Improve current form factors (phones, watches, earbuds) rather than inventing new ones
  3. Make AI invisible when possible—users care about outcomes, not technology
  4. Test for trust as rigorously as you test for usability
  5. Validate price sensitivity before committing to manufacturing
  6. Consider social acceptability as a core requirement, not an afterthought

For Professional AI Products:

  1. Focus on specific high-value workflows with clear ROI
  2. Partner with actual professionals in your target industry from day one
  3. Build for precision and control, not just automation
  4. Integrate with existing professional tools rather than replacing them
  5. Provide transparency about what AI is doing and why
  6. Enable human oversight for all critical decisions

For AI Wearables Specifically:

Our blunt assessment: Unless you're building for specific professional use cases, don't build AI wearables right now.

The technology isn't ready. Users don't want them. The social stigma is real. The privacy concerns are valid. Existing form factors work better.

If you're determined to build them anyway:

  • Focus on professional applications (medical, industrial, military)
  • Partner with institutions (hospitals, manufacturers, universities)
  • Accept niche market rather than mass adoption
  • Invest heavily in privacy and security
  • Plan for years of iteration before consumer viability

The Broader Lesson: Question the Hype

UX design agencies exist partly to be organizational skeptics—the people who ask uncomfortable questions before companies commit to expensive mistakes.

Questions we ask:

  • Should we actually build this?
  • Who is this for?
  • What problem does this solve?
  • Is this better than existing solutions?
  • Have we validated demand?
  • Can we scale this responsibly?
  • What are the unintended consequences?

When everyone in the room is excited about "the future of AI" or "revolutionary wearables" or "the next big thing," someone needs to be the adult saying: "But have we talked to users?"

That's what fractional design officers, product design consultants, and strategic UX consulting firms bring: perspective, skepticism backed by data, and the courage to say "this isn't ready" or "users don't want this."

Final Thoughts: The Interface Wars Nobody's Winning

We've watched multiple waves of "interface revolution" predictions fail:

  • Voice will replace all visual interfaces (it didn't)
  • Gesture controls will replace physical buttons (they didn't)
  • VR will replace screens (it hasn't)
  • AR glasses will replace phones (they won't)

The pattern: Interface changes happen slowly, driven by solving real problems, not by technological capability.

Smartphones succeeded because they solved real problems (internet access everywhere, better cameras, GPS navigation, app ecosystem) in a socially acceptable form factor.

Smart watches succeeded in limited ways because they solved specific problems (fitness tracking, notifications without pulling out phone, health monitoring) in an existing, accepted form factor.

AI wearables are failing because they don't solve meaningful problems better than existing tools, they introduce social stigma, they raise privacy concerns, and they cost too much.

The companies winning aren't trying to revolutionize how humans interact with technology. They're making incremental improvements to interfaces people already accept and use.

We're Here to Help You Avoid These Mistakes

Whether you're building AI products, wearables, or any other technology, we can help you:

Validate before you build:

  • User research to identify real problems
  • Ethnographic studies in natural contexts
  • Competitive analysis and market positioning
  • Price sensitivity and willingness to pay testing

Design strategically:

  • Service design mapping entire ecosystems
  • Prototyping and iterative testing
  • Accessibility and inclusivity reviews
  • Privacy and trust considerations

Implement responsibly:

  • Agile integration with development teams
  • Ongoing user testing and feedback
  • Metric definition and success measurement
  • Continuous improvement based on real usage

Navigate organizational dynamics:

  • Stakeholder alignment workshops
  • Business case development for UX investment
  • Change management for new approaches
  • Executive education on user-centered design

With 35-40 years of combined experience, we've seen companies waste billions on products nobody wanted. We've also helped companies build products that users love and that succeed in the market.

The difference? Starting with strategy, not technology. Beginning with user needs, not engineering capabilities. Validating assumptions before commitment, not after failure.

Building AI products or wearables? As strategic product design consultants, we help companies avoid expensive mistakes by validating assumptions, identifying real user needs, and building products that people actually want to use.

Whether you need early-stage research, strategic planning, or hands-on design and testing, we bring the expertise to help you build right the first time.

Looking for a UX design agency that will tell you the truth about your product—even when it's uncomfortable? Let's talk about how strategic UX can save you from becoming the next cautionary tale in tech hype cycles.

This article is based on content from the UX MURDER MYSTERY podcast.

HOSTED BY: Brian J. Crowley & Eve Eden

EDITED BY: Kelsey Smith

INTRO ANIMATION & LOGO DESIGN: Brian J. Crowley

MUSIC BY: Nicolas Lee

A JOINT PRODUCTION OF EVE | User Experience Design Agency and CrowleyUX | Where Systems Meet Stories ©2025 Brian J. Crowley and Eve Eden

Email us at: questions@UXmurdermystery.com

About the Author:
