Dec 17, 2025 · Life

I worked on this 9 years ago, and today it is used in the new Meta Ray-Ban.

The "Aha!" Moment

When I started reading about the new Meta Ray-Ban smart glasses and the accompanying "Neural Band" wristband, I felt a sudden, distinct wave of nostalgia.

As I dug deeper into how their technology works, using a wrist device to detect hand signals and control an interface, I realized something incredible: it was directly related to something I worked on nearly a decade ago.

Back then, I was working on a research project called AirScript. And while we didn't have the budget of a tech giant, we were chasing the exact same future.

The Challenge: Writing in Thin Air

Let me give you some context. We were trying to solve a problem that seemed crazy at the time: How do you write in the air without a keyboard, a touchscreen, or even a camera watching you?

Computer vision was the standard approach, but it had limitations. It required good lighting and a camera pointed directly at your hands. We wanted something invisible and ubiquitous.

We realized that every time you move a finger, your muscles generate tiny electrical spikes. We hypothesized that if we could "listen" to those spikes, we could figure out what the hand is writing without ever "seeing" the hand.

Wiretapping the Body

Think of it like wiretapping the nerves in your arm. Even if your hand is moving in empty air, the sensors on your wrist can "hear" that you are writing the letter 'A' or the number '5'.

This technology is known as Surface Electromyography (sEMG).

For AirScript, we built an algorithm called 2-DifViz and trained neural networks to interpret these noisy signals in real time. It turned invisible air movements into actual text on a screen. And honestly? For a project built years before the current AI boom, it worked remarkably well.
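We never published the exact code, but the shape of the pipeline is easy to sketch. Here is a minimal, hypothetical example in PyTorch (not the original AirScript implementation): slice the multi-channel sEMG stream into fixed windows and let a small recurrent network classify each window as a character. The channel count, window size, and network architecture below are illustrative assumptions, not the values we actually used.

```python
# A minimal sketch of EMG-to-text decoding, NOT the original AirScript
# code: a small GRU classifies fixed windows of multi-channel sEMG.
import torch
import torch.nn as nn

NUM_CHANNELS = 8   # assumed: one signal per wrist electrode
WINDOW_SIZE = 200  # assumed: samples per classification window
NUM_CLASSES = 10   # e.g. the digits 0-9

class EMGDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # the GRU summarizes the temporal structure of the window
        self.rnn = nn.GRU(NUM_CHANNELS, 64, batch_first=True)
        self.head = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):               # x: (batch, time, channels)
        _, h = self.rnn(x)              # h: final hidden state (1, batch, 64)
        return self.head(h.squeeze(0))  # logits over characters

# random tensors standing in for real sensor windows
emg = torch.randn(4, WINDOW_SIZE, NUM_CHANNELS)
predictions = EMGDecoder()(emg).argmax(dim=1)  # one character per window
```

The hard part, then as now, is not the model but the signal: sEMG is noisy, varies from person to person, and shifts every time the band moves on the wrist.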

Enter the Meta Neural Band

Fast forward to today. What is the Meta Neural Band?

Meta's wristband uses the same fundamental technology we explored, sEMG, but with a massive upgrade in sensitivity and processing power. It detects the electrical signals travelling from your motor neurons to your muscles: the tiny "firing" commands your nervous system sends to your fingers.

This is why the tech feels magical. As Meta describes in their research on neural interfaces, the sensors are so sensitive they can detect "motor intention."

They can hear the command to move your finger milliseconds before your finger actually moves. This allows for "micro-gestures." You can essentially twitch your thumb against your index finger—or eventually, just think about pressing hard—and the band registers it as a click, even if your hand looks perfectly still to an observer.
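Meta has not published the details of that detection pipeline, so treat the following as my own toy illustration of the general idea, not their method: rectify the raw sEMG, smooth it into an envelope, and register a "click" the instant the envelope rises above a baseline threshold. Every number here is a made-up tuning parameter.

```python
# Toy gesture-onset detector (an illustration, NOT Meta's pipeline):
# fire a "click" when the smoothed sEMG envelope crosses a threshold.
import numpy as np

def detect_clicks(emg, fs=2000, k=3.0, smooth_ms=20):
    """Return sample indices where a gesture onset is detected.

    emg: 1-D raw sEMG trace; fs (Hz), k, and smooth_ms are assumed
    tuning parameters, not values from Meta or AirScript.
    """
    envelope = np.abs(emg)                 # rectify
    kernel = np.ones(int(fs * smooth_ms / 1000))
    envelope = np.convolve(envelope, kernel / kernel.size, mode="same")
    threshold = envelope.mean() + k * envelope.std()
    above = envelope > threshold
    # keep only rising edges: the moment the micro-gesture starts
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

# synthetic trace: baseline noise plus a brief burst, standing in
# for a thumb twitch against the index finger
rng = np.random.default_rng(0)
signal = rng.normal(0, 1, 4000)
signal[2000:2100] += rng.normal(0, 8, 100)
print(detect_clicks(signal))  # onset index near sample 2000
```

A real system has to do far more than this, rejecting arm swings, typing, and other everyday muscle activity, which is presumably where the learned models come in.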

From Gesture Recognition to Neural Interfaces

It is fascinating to see that in both instances—my research from years ago and Meta's current product—we arrived at the same conclusion: The wrist is the best place to decode the hand.

  • AirScript proved that EMG signals could be decoded into complex outputs (text/digits) using neural networks, freeing the user from cameras.

  • Meta's Neural Band validated that approach but solved the "noise" problem by going deeper, decoding the neural drive rather than just the noisy muscle activity.

They effectively turned a "gesture recognizer" into a true "neural interface."

Seeing this technology graduate from academic papers and prototypes into a consumer product that people will wear every day is thrilling. It’s a validation of the work we did on AirScript and a glimpse into a future where our devices understand not just what we do, but what we intend to do.
