
“Our AI Recognises Gestures Even When People Sign Slightly Differently, Achieving About 98% Accuracy”

Aishwarya Karnataki, Founder and CEO, Glovatrix



Glovatrix’s AI-powered gloves translate sign language into speech and text, bridging communication gaps for the deaf, starting in workplaces, empowering lives, and shaping the future of accessibility. In an exclusive conversation with EFY’s Nidhi Agarwal, Aishwarya Karnataki, Founder and CEO of Glovatrix, discusses the journey behind the innovation, challenges faced, and the vision to make communication universally inclusive.


Aishwarya Karnataki, Founder & CEO, Glovatrix

Q. Can you shed some light on your firm and its range of products or services?

A. Our organisation’s goal is to assist deaf and speech-impaired individuals who use sign language. We have developed AI-powered smart gloves that translate sign language into speech and convert spoken responses into text, helping both parties communicate more easily.

Q. What inspired you to start the company?

A. In 8th grade, I was part of the student council and regularly interacted with children with disabilities. Most had friends who understood them, but one boy, Atharv, did not—no one knew sign language. I learned it and asked him, “Will you be my friend?” He was overjoyed to finally communicate. That experience inspired me. Later, while studying engineering, I developed a gesture-controlled robot and merged the idea with sign language translation. That is how Glovatrix began.

Q. Do you think your product, 5th Sense, will be used more at work or at home?

A. Definitely in workplaces. We aim to start by collaborating with companies that employ deaf individuals. Our product currently recognises around 200 gestures, but sign language includes approximately 20,000. A deaf worker at a place like KFC only needs a small set of gestures related to their job, like ‘chicken’ or ‘spicy.’ So, we will create specific gesture libraries for each workplace, making it easier to use.

Starting with workplaces makes sense, not only due to CSR funding and HR support, but also because job-specific gesture libraries facilitate easier adoption. As production scales up, the cost will decrease, making it more accessible to individuals.

Q. How will this product help in learning and studying sign language? Will it create new signs?

A. We have spoken with the Indian Sign Language Research and Training Centre (ISLRTC), and they are keen to collaborate with us, not only to translate sign language but also to create new signs. When there is a need to introduce a sign, such as for ‘earphones’, experts from various fields convene to finalise it. However, they currently lack a system to verify whether a similar sign already exists elsewhere.

We suggested using our gloves to help build a digital dictionary that verifies existing signs, recommends similar alternatives, and aids in sign language education. ISLRTC has responded positively, and we are continuing discussions.

Q. What sensors make your gesture system work?

A. Each fingertip carries a motion sensor to track finger movement, and each wrist has an additional sensor. All the data is transmitted to a microcontroller on the wrist, which sends it via Bluetooth to a phone. From there, it is uploaded to the cloud, where our algorithm processes it. The output is sent back to the gloves, which feature a speaker for sound and a screen for displaying text.
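The interview does not describe the data format on that Bluetooth link, but the sensor counts (12 motion sensors sending six values each, per the figures given later in this interview) suggest a simple frame layout. Below is a minimal Python sketch of such a frame, assuming one float32 per reading; the layout and function names are illustrative, not Glovatrix's actual firmware.

```python
import struct
import random

NUM_SENSORS = 12          # 10 fingertip units + 2 wrist units (per the interview)
VALUES_PER_SENSOR = 6     # e.g. 3-axis accelerometer + 3-axis gyroscope readings

def read_sensor(sensor_id: int) -> list:
    """Stand-in for an IMU driver read; returns six floats."""
    return [random.uniform(-2.0, 2.0) for _ in range(VALUES_PER_SENSOR)]

def build_frame() -> bytes:
    """Pack all 72 values into one little-endian float32 payload for Bluetooth."""
    values = []
    for sensor_id in range(NUM_SENSORS):
        values.extend(read_sensor(sensor_id))
    return struct.pack("<%df" % (NUM_SENSORS * VALUES_PER_SENSOR), *values)

frame = build_frame()
print(len(frame), "bytes per frame")   # 72 floats x 4 bytes = 288 bytes
```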

Q. What role does electronics play in your device?

A. Our device incorporates a substantial amount of electronics and includes two types of circuit boards (PCBs): a main board on the wrist and ten custom fingertip boards. The main board features a motion sensor, an ESP32 chip, a microphone, a speaker, a vibration motor, and a screen, along with essential components like resistors and capacitors. Each fingertip board has a motion sensor, and one fingertip includes a button. Pressing the button begins gesture capture; pressing it again stops it. The system then initiates gesture analysis, helping distinguish between sign language and other movements.
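Here is a minimal sketch of the button behaviour just described, assuming a simple toggle: one press starts buffering sensor frames, the next press stops and hands the buffer to analysis. The class and method names are invented for illustration.

```python
class GestureCapture:
    """Toy model of the capture-button toggle (names are invented)."""

    def __init__(self):
        self.recording = False
        self.frames = []

    def on_button_press(self):
        if not self.recording:
            self.frames = []           # first press: start a fresh capture
            self.recording = True
        else:
            self.recording = False     # second press: stop capturing
            self.analyse(self.frames)  # hand the buffer to gesture analysis

    def on_sensor_frame(self, frame):
        if self.recording:
            self.frames.append(frame)  # buffer frames only while capturing

    def analyse(self, frames):
        print("analysing", len(frames), "frame(s)")  # placeholder for the ML step

capture = GestureCapture()
capture.on_button_press()
capture.on_sensor_frame(b"\x00" * 288)
capture.on_button_press()              # -> analysing 1 frame(s)
```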

Q. What special features does your device have?

A. One key feature we have added is background sound detection, which is particularly beneficial for deaf users. Sounds such as doorbells, a baby crying, or a pressure cooker whistle often go unnoticed, which can cause serious issues. Our device addresses this by sending alerts for specific sounds. It does not just indicate ‘noise’; it shows what the sound is. A crying baby triggers a baby icon, a cooker whistle shows a cooker icon, and a doorbell displays its own icon. We have preloaded 8–10 common sounds, including a baby crying, a cooker whistle, a fire alarm, a doorbell, and a car horn. Users can also add up to four custom sounds by recording them through the app.
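To make the alert flow concrete, here is a small sketch mapping a detected sound label to an on-screen icon, assuming the classifier outputs one of the preloaded labels; the label strings, icon names, and alert call are invented.

```python
# Labels mirror the preloaded sounds from the interview; everything else is
# invented for illustration.
SOUND_ICONS = {
    "baby_crying": "baby icon",
    "cooker_whistle": "cooker icon",
    "fire_alarm": "fire icon",
    "doorbell": "doorbell icon",
    "car_horn": "horn icon",
}

def alert_user(detected_label: str):
    icon = SOUND_ICONS.get(detected_label)
    if icon is not None:
        # On the device this would drive the screen and the vibration motor.
        print("show", icon, "and vibrate")

alert_user("cooker_whistle")   # -> show cooker icon and vibrate
```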

Q. What languages does it support?

A. Right now, the gloves support Indian Sign Language. Plans are underway to add American and British variants. Users can also choose their preferred spoken output, such as Hindi, Marathi, or English, via the app.

Q. How is AI helping?

A. AI plays a key role in recognising sign language in our system. We use a machine learning (ML) model that processes data from 12 sensors, each sending six values (72 in total) many times per second. These values form a unique pattern for each gesture. For example, the signs for ‘hello’ and ‘how are you’ create distinct patterns. The AI identifies these patterns and recognises the gesture, even when people sign slightly differently. We have currently trained the model on nearly 200 gestures, achieving an accuracy of around 98%.
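The exact model is not disclosed, so the sketch below stands in with a nearest-neighbour classifier over flattened 72-value windows, purely to illustrate how slightly varied signing can still land on the same gesture. The window length, prototypes, and noise levels are all assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

WINDOW, FEATURES = 50, 72              # frames per gesture, values per frame (assumed)
rng = np.random.default_rng(0)

# Two invented gesture "prototypes"; real patterns come from the gloves.
hello = rng.normal(size=WINDOW * FEATURES)
how_are_you = rng.normal(size=WINDOW * FEATURES)

# Simulate many signers: each sample is a prototype plus small personal variation.
X = np.vstack([hello + 0.1 * rng.normal(size=hello.shape) for _ in range(100)] +
              [how_are_you + 0.1 * rng.normal(size=hello.shape) for _ in range(100)])
y = ["hello"] * 100 + ["how are you"] * 100

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# A new signer's slightly different 'hello' still lands in the right neighbourhood.
sample = (hello + 0.1 * rng.normal(size=hello.shape)).reshape(1, -1)
print(clf.predict(sample))             # -> ['hello']
```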

AI also assists with language translation. Sign language grammar differs from spoken English. For instance, ‘A cat ran after the mouse’ becomes ‘cat mouse ran’ in sign language, often omitting words such as ‘a’, ‘is’, and ‘the’. Using natural language processing (NLP), the AI restructures the signs into grammatically correct English. It also adapts to individual signing styles, much like interpreting different handwriting.
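As a toy illustration of this restructuring step, the sketch below maps the interview's example gloss to its English sentence via a lookup; a production system would use an NLP model rather than hand-written templates.

```python
# Toy lookup standing in for the NLP step; the template below is the example
# from the interview, and everything else is invented for illustration.
GLOSS_TEMPLATES = {
    ("cat", "mouse", "ran"): "A cat ran after the mouse.",
}

def gloss_to_english(gloss):
    # Fall back to joining the signs when no template matches.
    return GLOSS_TEMPLATES.get(tuple(gloss), " ".join(gloss).capitalize() + ".")

print(gloss_to_english(["cat", "mouse", "ran"]))   # -> A cat ran after the mouse.
```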

Q. Is the processing happening on the edge or in the cloud?

A. The primary processing occurs in the cloud, so an internet connection is required. However, in emergencies where deaf individuals may not have access to the internet, we did not want to leave them without support. To address this, we created a local gesture library containing the 26 letters of the alphabet and 10 emergency phrases, stored on the device itself. This library works offline. Access to the full library of 20,000 gestures requires internet connectivity.
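The fallback logic described here can be sketched as a simple try-the-cloud-first pattern; the function names, the simulated connection failure, and the sample library entry below are all illustrative.

```python
LOCAL_LIBRARY = {"SOS": "Please call for help"}   # invented sample entry

def cloud_recognise(window):
    raise ConnectionError("no internet")          # simulate being offline

def local_recognise(window):
    # On-device matching against the small offline library (details invented).
    return LOCAL_LIBRARY.get(window, "unknown gesture")

def recognise(window):
    try:
        return cloud_recognise(window)            # full cloud gesture library
    except ConnectionError:
        return local_recognise(window)            # offline: letters + emergencies

print(recognise("SOS"))                           # -> Please call for help
```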

Q. How does the system learn new signs?

A. When we need to add a new gesture, we provide the gloves to deaf individuals and ask them to perform it. At present, around 15 to 20 people help us, each performing the gesture 10 times, resulting in approximately 200 samples. We use this data to train the AI to recognise the new gesture.
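The arithmetic above (roughly 20 signers times 10 repetitions) can be sketched as a small collection loop; the names and the window placeholder are invented.

```python
def collect_gesture(label, signers=20, reps=10):
    """Gather (window, label) pairs: ~20 signers x 10 repetitions = ~200 samples."""
    samples = []
    for signer in range(signers):
        for rep in range(reps):
            samples.append((capture_window(signer, rep), label))
    return samples

def capture_window(signer, rep):
    return "window(signer=%d, rep=%d)" % (signer, rep)   # stand-in for real frames

print(len(collect_gesture("earphones")))   # -> 200 samples to retrain the model
```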

Q. What design problems did you face, and how did you fix them?

A. We encountered multiple design challenges. Our first version was a full glove, which proved uncomfortable, especially in summer, and reduced users’ sense of touch, which is vital for them. We then explored a smartwatch model, which was more comfortable but introduced delays. Ultimately, we developed a hybrid model: part glove, part watch, with open fingers and palm to preserve touch.

Another issue was speech-to-text conversion. Long texts are often tricky for deaf users to read due to their unfamiliarity with spoken English. To help, we added a summariser that displays just 2–3 keywords, enabling users to follow conversations through context and lip reading.
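A toy version of such a summariser, assuming a simple stopword filter rather than whatever method the product actually uses:

```python
STOPWORDS = {"a", "an", "the", "is", "are", "was", "to", "of", "and", "your", "at"}

def keywords(text, k=3):
    """Keep the first k content words as the on-screen summary."""
    words = [w.strip(".,?!").lower() for w in text.split()]
    return [w for w in words if w and w not in STOPWORDS][:k]

print(keywords("Your order of spicy chicken is ready at the counter"))
# -> ['order', 'spicy', 'chicken']
```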

Q. What problems did you have making a cross-platform app, and how did you fix them?

A. The biggest challenge was receiving data from two gloves simultaneously via Bluetooth. While connecting one device is straightforward, connecting two devices to a single phone proved to be difficult. Initially, we planned to develop separate Android and iOS apps. However, our partner, Coreco Technologies, used Flutter to create a single app for both platforms. They began with a basic app to test Bluetooth connectivity and later built the full version. Now, the right glove sends data to the left glove, which then sends all the data to the phone. Each glove’s controller communicates with the other and the app, which relays the data to the cloud.
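The relay topology described, right glove to left glove to phone, can be modelled in a few lines. The classes below are an illustration only, since the real firmware would sit on the ESP32 Bluetooth stack rather than plain method calls.

```python
class Phone:
    def receive(self, source, frame):
        print("uploading to cloud:", source, frame)   # app relays data onward

class LeftGlove:
    """Holds the single Bluetooth link to the phone."""
    def __init__(self, phone):
        self.phone = phone

    def receive_from_right(self, frame):
        self.phone.receive("right", frame)            # forward the paired stream

    def send_frame(self, frame):
        self.phone.receive("left", frame)             # its own stream, same link

class RightGlove:
    """Talks only to the left glove, never directly to the phone."""
    def __init__(self, left):
        self.left = left

    def send_frame(self, frame):
        self.left.receive_from_right(frame)

phone = Phone()
left = LeftGlove(phone)
right = RightGlove(left)
right.send_frame(b"\x01")   # right -> left -> phone
left.send_frame(b"\x02")    # left -> phone
```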

Q. What were the biggest challenges going from the first prototype to the final design?

A. We are currently on the eighth version of our prototype and the second version of our app. Our journey started in February 2020 with a crude prototype made using duct tape and jumper wires. Initially, we used flex sensors to detect finger movements, but they were expensive (₹800 each) and unable to capture full hand motion. We attempted to combine them with a wrist-mounted motion sensor, but the results were unreliable. Eventually, we switched to using motion sensors on each finger, which significantly improved accuracy. We also transitioned from a bulky Arduino board to a compact custom PCB, eventually phasing Arduino out of the design entirely.

In July 2022, we experimented with a smartwatch-based model using finger rings and detachable strings instead of gloves. The idea was to let users wear only the watch at home and quickly connect the strings via magnetic connectors when needed. However, limited funding and a small team made it difficult to make the connectors work effectively. The strings also had too much slack and interfered with usability. We eventually returned to the glove design, which provided better stability.

Q. Have you received any government funding for your prototypes?

A. Yes, we have received significant government funding for prototyping, about Rs 15 million so far, all of it equity-free and debt-free. We received Rs 1.5 million from the Startup India Seed Fund, Rs 1 million from Nidhi Prayas, and Rs 5 million from the Buyer Back Ignition grant. We also won Rs 1 million from Boeing’s Build 3.0 competition and smaller amounts from other contests, including Rs 200,000 to 300,000 from a UN competition. This funding has helped us support our work independently.

Q. How do you get deaf people and organisations to use your product?

A. We have received extensive media coverage in publications such as The Indian Express, Times of India, CNN, NDTV, and others. Several organisations, including Royal Orchid Hotels, TVS Scooters, Indigo Airlines, and KFC, are on a waiting list for our gloves. Word of mouth and media visibility have accelerated our growth. We also participate in exhibitions, job fairs, and startup events to expand our reach.

Within the deaf community, over 10,000 people follow us on social media. One video about our product received 1.5 million views, largely due to shares from the deaf community. We also plan to start deaf-focused podcasts to strengthen this engagement.

Q. Any tie-up with schools or colleges?

A. Yes, we have partnered with a design firm called Dominix Global. They assisted with early research, including empathy mapping, and helped us better understand the thought processes of deaf users, enabling a more user-friendly product design.

Q. How many units have you sold, and how much money have you made?

A. We have not made any sales yet. Royal Orchid Hotels has placed a pre-order for 10 devices through their CSR initiative. We plan to deliver these next month. As this is a paid pilot, it may not be classified as regular revenue. So, officially, we have not recorded any sales yet.

Q. What are your growth plans for the company?

A. We plan to begin by selling to businesses (B2B). Deaf waiters will initially use our gloves to take orders. Following this, we intend to seek investment to scale up operations. We aim to expand our team, particularly in marketing, sales, and research, and collaborate with partners to build a comprehensive gesture library. By the end of 2026, we plan to sell directly to consumers (B2C) and work on translating American Sign Language to voice, expanding our reach to global users.

Q. What is stopping your startup from growing really fast right now?

A. We are currently in a classic chicken-and-egg situation. Investors want to see the product in real-world use before committing funds, but we need capital to scale and deploy the product. Currently, we are operating with a lean team, supported by government grants and some personal funding. Our immediate goal is to deliver devices to Royal Orchid successfully by mid-July.

Q. How is the ecosystem helping you? Are you adding any new partners or resellers?

A. We are working with several NGOs dedicated to supporting individuals with disabilities. Over the past four years, we have built relationships with remarkable organisations and founders, such as Pratik Madhav and the Asistec Foundation. Asistec supports startups developing assistive technologies. These organisations have helped us reach customers, connect with investors, and gain a deeper understanding of user needs. We continue to add new partners on a regular basis.

