There is currently a huge variety of technologies aimed at aiding people by providing additional information about the environment around them. I see this revolution in portable information as an incredibly good thing; I can't even begin to count how many times Google Maps has helped me out when I've been lost. I'd just whip out my smartphone, load Google Maps, tell it to find me, input my destination, and away I went.
Solutions like Google Maps and GPS in general are all brilliant, but all have one or two snags. For one, Google Maps and GPS technologies rely on static information being true: Google Maps may display that the nearest bank is two blocks away, but when you arrive you discover it's been knocked down and a Starbucks built in its place. My point here is that a lot of technologies can't give real-time information about the environment at large around the user, and those that can tend to rely on delivering that information via audio or visual cues.
My project, Augmented Sensory Perception, is meant to partially address this issue by providing people with vibrotactile feedback directly proportional to the physical distance between a person and the physical objects around them.
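To make "feedback proportional to distance" concrete, here is a minimal sketch of one way the mapping could work. The function name, range, and duty-cycle values are my own illustrative assumptions, not details from the project's actual design:

```python
def vibration_intensity(distance_cm, max_range_cm=200, max_duty=255):
    """Map a measured distance to a vibration motor PWM duty cycle.

    Closer objects produce stronger vibration; anything beyond
    max_range_cm leaves the motor off. All constants here are
    hypothetical placeholders.
    """
    if distance_cm <= 0:
        return max_duty  # treat invalid/zero readings as "very close"
    if distance_cm >= max_range_cm:
        return 0
    # Linear inverse mapping: 0 cm -> full power, max_range_cm -> off
    return round(max_duty * (1 - distance_cm / max_range_cm))
```

A real belt might prefer a non-linear curve (e.g. ramping up sharply below arm's length), but a linear map is the simplest starting point.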
Ultrasonic VibroTactile Belt
It is the use of vibrational feedback that sets this technology apart from many existing solutions. Specifically, this kind of setup can help visually impaired individuals navigate the world at large by providing vibrotactile feedback that gives them a better sense of their proximity to objects within range of the ultrasonic sensors. The main issue visually impaired users have with most existing technologies is that the existing tech relies on the second most prevalent human sense: hearing. As visually impaired users rely on their hearing to perform tasks that most people would use sight for, it seems counter-intuitive for assistive technologies to use hearing as well, since doing so ties up the visually impaired user's most relied-upon sense.
Having established that vibrotactile feedback can benefit visually impaired users, the same technology can also be adapted for use in vehicles. Whilst it is not uncommon to find vehicles that use ultrasonic sensors to aid the driver with auditory alerts and on-screen visuals when the vehicle gets dangerously close to solid objects, it IS unusual to give the driver vibrotactile information about the proximity of surrounding objects. It would not be difficult to modify a car seat and seatbelt to provide vibrotactile feedback based on obstacles around the vehicle.
My project therefore does not target one specific group of potential users; rather, the aim is to design, develop, build and implement a general-purpose technology that can be adapted to various applications.
I have designed a rough schematic for part of the system: the ultrasonic array, the microcontroller for the array, and a way of interfacing the sensors and microcontroller to a PC (or another embedded system).
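As a rough sketch of what the PC side of that interface might involve, the snippet below shows two building blocks: converting an ultrasonic echo round-trip time into a distance, and parsing a simple text frame sent over serial. The frame format (`sensor_id:distance_cm` per line) is purely a hypothetical example of mine, not the project's actual protocol:

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # approximate, in air at ~20 degrees C

def echo_to_distance_cm(echo_us):
    """Convert an ultrasonic echo round-trip time (microseconds) to
    a distance in centimetres. The pulse travels out and back, so the
    one-way distance is half the total path."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

def parse_frame(line):
    """Parse one text frame from the microcontroller.

    Assumed (hypothetical) format: "<sensor_id>:<distance_cm>", one
    frame per line, e.g. "2:147". Returns (sensor_id, distance_cm),
    or None if the frame is malformed.
    """
    try:
        sensor_id, distance = line.strip().split(":")
        return int(sensor_id), int(distance)
    except ValueError:
        return None
```

On the PC these parsed readings could then drive a display, logging, or further processing, while the microcontroller handles the timing-critical sensor work.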
I'll be updating this blog with my progress as I go. I intend to keep the entire project open source, and all schematics and hardware I design will eventually be released under an unrestrictive license for others to expand upon and use in their own research.