
Latest Demos

View our latest and most capable prototypes, including our flagship product, EyeSight, and its sister service, CogniSense.

EyeSight v4 - Reading Mode

EyeSight's CogniSense can scan text and instantly transform it into spoken words. The audio is delivered through discreet speakers in the frame, allowing the user to listen comfortably.
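For illustration only, here is a minimal sketch of this kind of capture-OCR-speech pipeline in Python, assuming the open-source pytesseract and pyttsx3 libraries rather than EyeSight's actual on-device stack:

```python
# Illustrative sketch only: a capture -> OCR -> text-to-speech loop,
# not EyeSight's actual on-device pipeline.
import cv2                 # camera capture
import pytesseract         # OCR (assumes the Tesseract binary is installed)
import pyttsx3             # offline text-to-speech

def read_aloud_once(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return

    # Extract printed text from the camera frame.
    text = pytesseract.image_to_string(frame).strip()
    if not text:
        return

    # Speak the recognized text through the available audio output.
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    read_aloud_once()
```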

EyeSight v4 - Navigation Mode

EyeSight provides step-by-step navigation to chosen destinations inside a building. Simple one-word directions are delivered through the frame speakers, guiding the user with minimal effort.
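As a rough sketch of how one-word cues could be produced, the following Python example routes over a hypothetical floor-plan graph (the room names, layout, and cue words are assumptions, not EyeSight's actual map data):

```python
# Illustrative sketch only: shortest-path routing over a hand-made floor-plan
# graph, emitting one-word cues; names and layout are hypothetical.
from collections import deque

# Each edge carries the one-word direction spoken when it is traversed.
FLOOR_PLAN = {
    "entrance":  {"hallway": "forward"},
    "hallway":   {"entrance": "back", "library": "left", "stairs": "right"},
    "library":   {"hallway": "back"},
    "stairs":    {"hallway": "back", "classroom": "up"},
    "classroom": {"stairs": "down"},
}

def directions(start: str, goal: str) -> list[str]:
    """Breadth-first search; returns the spoken cue for each step."""
    queue = deque([(start, [])])
    visited = {start}
    while queue:
        node, cues = queue.popleft()
        if node == goal:
            return cues
        for neighbor, cue in FLOOR_PLAN[node].items():
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, cues + [cue]))
    return []

print(directions("entrance", "classroom"))  # ['forward', 'right', 'up']
```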

EyeSight v4 - Scene Inference Mode

EyeSight can scan the environment and analyze it with its CogniSense AI algorithm to interpret ongoing activities. The inferences are delivered to the user in real time through the speakers in the frame.
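The CogniSense algorithm itself is not shown here; as a stand-in, the sketch below asks a vision-capable model (via the official openai Python SDK) to describe the activity in a single camera frame:

```python
# Illustrative sketch only: asking a vision-capable model to describe what is
# happening in a camera frame; this is not the CogniSense algorithm itself.
import base64
from openai import OpenAI   # assumes the official openai Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def describe_scene(image_path: str) -> str:
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any vision-capable model would do
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "In one short sentence, describe the activity in this scene."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

# e.g. print(describe_scene("frame.jpg"))  # hypothetical camera frame
```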

EyeSight v4 - Explanation Mode

EyeSight can process written content and convert it into a simple explanation. The output is spoken to the user in the language they choose.
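A minimal sketch of the explanation step, assuming an LLM is prompted to simplify recognized text in the user's chosen language (the prompt wording and model name are assumptions):

```python
# Illustrative sketch only: simplifying recognized text in a chosen language
# with an LLM; prompt wording and model name are assumptions.
from openai import OpenAI

client = OpenAI()

def explain_simply(text: str, language: str = "English") -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"Explain the given text in simple {language}, "
                        "in two or three short sentences."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

# e.g. explain_simply("F = ma relates force, mass and acceleration.", "Hindi")
```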

EyeSight v3

A state-of-the-art yet $15 retrofittable attachment for a pair of glasses, capable of detailed scene inference and explanatory conversations with its users. Designed for learning, it can explain anything from diagrams and graphs to math proofs.


EyeSight v3 - Multi-Language Support

Our software allows the glasses not only to read but also to understand tens of local languages from across India. This video demonstrates its ability to read and explain Hindi content to the user.

EyeSight v3 - CogniSense

A free-to-access online AI tutor, just a phone call away. Capable of speaking in any language, including local Indian languages (like Tamil in this demo), this custom OpenAI assistant makes learning accessible to all.
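For illustration, a bare-bones tutoring loop built on the OpenAI chat API is sketched below; the telephony and speech layers that make the real service "a phone call away" are assumed and not shown:

```python
# Illustrative sketch only: a multi-turn tutoring loop built on the OpenAI
# chat API; the telephony and speech layers of the real service are not shown.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a patient tutor. Answer in the same language the student uses, "
    "including Indian languages such as Tamil, and keep answers short enough "
    "to be read aloud over a phone call."
)

def tutor_session() -> None:
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        question = input("Student: ").strip()
        if not question:
            break
        history.append({"role": "user", "content": question})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=history,
        ).choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        print("Tutor:", reply)

if __name__ == "__main__":
    tutor_session()
```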


EyeSight v2 - Mobile Processing

Our software allows the glasses to read not only printed text but also handwritten text.
