A wristband reads hand movements from inside the wrist and converts them into real-time control of robots and screens, without cameras or gloves.

Engineers at the Massachusetts Institute of Technology (MIT) have developed a wearable ultrasound wristband that can track hand movements in real time and convert them into digital or robotic actions. The system captures internal wrist activity and uses AI to predict finger and palm positions with high accuracy.
The wristband works by imaging muscles, tendons, and ligaments inside the wrist while the hand moves. These structures control finger motion, so tracking them helps estimate hand position. An AI model processes the ultrasound data and translates it into movement, which can then be sent to a robot or a computer system.
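The pipeline described above can be sketched in miniature: an ultrasound frame goes into a model that predicts joint angles, which are then clamped to a robot hand's range. This is a hypothetical illustration only; the frame size, weight matrix, and joint range are invented stand-ins, not the actual MIT system.

```python
# Hypothetical sketch of the ultrasound-to-motion pipeline: frame -> model
# -> robot command. The "model" here is a toy linear map, illustrative only.

def predict_joint_angles(frame, weights):
    """Map a flattened ultrasound frame to one predicted angle per finger."""
    return [sum(w * px for w, px in zip(row, frame)) for row in weights]

def to_robot_command(angles):
    """Clamp predicted angles to an assumed 0-90 degree joint range."""
    return [max(0.0, min(90.0, a)) for a in angles]

# Toy 4-pixel "frame" and a 5-finger weight matrix (invented values).
frame = [0.2, 0.8, 0.5, 0.1]
weights = [[10, 20, 5, 0]] * 5  # same weights for each finger, for brevity

angles = predict_joint_angles(frame, weights)
command = to_robot_command(angles)
print(command)
```

In the real system the linear map would be replaced by a trained neural network, but the shape of the data flow, one internal image in and one set of joint targets out, is the same.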
In testing, users controlled a robotic hand wirelessly. The robot mirrored individual finger movements, and users made it play simple piano notes and toss a small ball into a hoop. The same setup also supported digital interaction: gestures such as pinching enabled zooming and object control on a screen.
The system is trained for each user. During setup, hand movements are recorded using cameras and matched with ultrasound images. This helps label specific regions in the images that correspond to different finger movements. Once trained, the AI can predict gestures directly from ultrasound data without external cameras.
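The calibration step described above amounts to pairing each ultrasound frame with the camera-derived gesture label recorded closest in time. Here is a minimal sketch of that pairing, with invented function names, timestamps, and labels; the real system labels regions within the images rather than whole frames.

```python
# Hypothetical sketch of per-user calibration: match each ultrasound frame
# with the nearest-in-time camera label to build a labeled training set.

def pair_frames(ultrasound, camera_labels, max_gap=0.05):
    """Pair each (timestamp, frame) with the closest (timestamp, gesture)
    label, keeping the pair only if the gap is within max_gap seconds."""
    dataset = []
    for t_u, frame in ultrasound:
        t_c, gesture = min(camera_labels, key=lambda cl: abs(cl[0] - t_u))
        if abs(t_c - t_u) <= max_gap:
            dataset.append((frame, gesture))
    return dataset

# Illustrative recordings: ultrasound frames and camera-tracked gestures.
ultrasound = [(0.00, "frameA"), (0.04, "frameB"), (0.30, "frameC")]
camera_labels = [(0.01, "pinch"), (0.05, "pinch"), (0.29, "fist")]

print(pair_frames(ultrasound, camera_labels))
```

Once such a labeled set exists, a model trained on it can predict gestures from ultrasound alone, which is why the cameras are needed only during setup.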
The device was tested across users with different hand sizes and movement patterns. Participants performed a wide range of actions, including all 26 letters of the American Sign Language alphabet and object-handling tasks such as holding a bottle, scissors, or a tennis ball. The system consistently tracked hand positions during these activities.
Current hand-tracking methods rely on cameras, sensor gloves, or electrical muscle signals. Camera systems can fail with occlusion, gloves can restrict movement, and electrical signals are often noisy. The ultrasound approach avoids these issues by directly imaging internal structures.
The wristband includes a compact ultrasound unit similar in size to a smartwatch, along with onboard electronics for continuous imaging. The design builds on earlier work using small ultrasound stickers that attach to the skin.
The team is now collecting data from a wider group of users to improve the AI model. The aim is to build a large dataset of hand motions that can support applications such as training robots for fine motor tasks, including surgical procedures, and enabling more natural interaction in virtual environments.
