Interfaces For The Handicapped?
"Pete, a long-time family friend and founder of Pedal with Pete, was born with cerebral palsy (check the Web site for more information). This greatly inhibits his motor control. He enjoys bike riding, but last June he was in a serious bike accident (once again, more info on the Web site), which reduced his motor control even further. The man has incredible spirit and wants to live on his own, though he has to have home care nurses stop by every day to help him perform most of his daily activities.
He used to do lots of work on his computer pecking away with sticks strapped to both hands. Since his accident he has virtually lost control of one side of his body and has seriously reduced control on the other side, so this is no longer possible.
My current line of thinking is that if we can get him to be able to control a computer, we might be able to automate many of the other parts of his life so he doesn't have to rely on so much help from other people. While I definitely have several ideas on how to go about this, I'd really like some other input. Does anyone have any ideas or new products they can point me to?"
We've already had a discussion about Voice Recognition in one of the first Ask Slashdots, but that article is from '98 and I expect there has been a lot more progress in this area since then. I may have to run a follow-up post to this if folks are interested.
Computers and Appliances (Score:1)
visual impairment (Score:3)
We also have zooming software for those who can still see a little. Many of these tools can be very useful even if we aren't handicapped.
How am I supposed to hallucinate with all these swirling colors distracting me?
x10? (Score:1)
Detroit
possible? (Score:1)
I think that with this, coupled with autonomous units (independent roving vacuum cleaners?), a person with limited motor control could lead a very comfortable, independent lifestyle.
Atticka
Re:visual impairment (Score:2)
Now comes the part that I'm gonna get flamed for: They are running Windows. One thing M$ has done VERY well is build hooks into the OS for alternative input devices and keyboard control of the GUI.
--
Crips n' Puters (Score:5)
New Mobility Mag. had a write up on Dragon Dictate in their latest issue. Only parts are available online and I've requested a full copy be made available to /. The article really goes a long way in giving a "disabled" spin on this. I've pasted what of the article I could grab online at the bottom.
Ruthie - Compu-babe w/Multiple Sclerosis
Is the Dragon Draggin'? The Rise and Fall of DragonSystems
By Ben Mattlin
Last spring, I replaced my 3-year-old computer, which included a 5-year-old version of DragonDictate, the voice recognition program that allows you to talk to your computer instead of typing on the keyboard or using a mouse. What a miracle it had been when I first got it! I'm a functional quad--have little use of my arms and no use of my legs--and I'd been unable to find any truly effective way of interacting with a computer until DragonDictate came along.
I first saw a prototype demonstrated at a university laboratory in the 1980s, when it worked slower than catsup pouring out of a Heinz bottle and the complete setup ran about $20,000. Yet by 1991, a faster and more accurate DragonDictate was available for less than $5,000; thanks to the Department of Vocational Rehab, I was soon a grateful customer.
I had the greatest respect for the company itself, DragonSystems, a small Newton, Mass.-based software developer that had practically invented voice recognition technology. What a team they had working there!
Nowadays you can pick up a starter version of the software at any Best Buy store for about $20. But as DragonSystems has gone mainstream, it seems to have sold its soul. Service is no better or smarter than what you'd expect at McDonald's, and the program itself has become about as accessible and welcoming as a flight of stairs ...
JAWS link ... (Score:3)
ian.
Speech & Recognition... (Score:1)
But I don't think this is too terribly well-known, since my search on Google just turned up the last "Ask Slashdot" we had about this.
Any good resources out there about this, guys?
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
Re:visual impairment (Score:1)
Using a mouse when you can't see the screen seems difficult. Do you just move it around until you happen to get it over the thing you want?
Where I work our products have to be accessible and we use the Self Voicing Kit Technology for Java [ibm.com] to test. I have not played with it much, but my understanding is when, for example, a dialog pops up, it will read the whole dialog to you.
2D vs. 1D input devices. (Score:5)
A computer, for most of the desktops, has two primary input devices, the keyboard and the mouse.
The keyboard is a conglomeration of many one dimensional buttons. A button is either on or off.
The mouse is on a 2 dimensional plane with 1 to 3 one dimensional buttons.
This combination has been very effective for creating input for a computer. In fact, it can be argued that a keyboard is more productive than a mouse due to the time/movement factor: I can do X with a couple of keystrokes, but to do the same thing with the mouse I have to move my hand to the mouse and then move the pointer, taking perhaps ten times as long. So following that line of logic, a one-dimensional input device is more efficient than a two-dimensional one. For the sake of efficiency it would be best to give a disabled person a device with many one-dimensional inputs, i.e. buttons.
The most successful input device for a physically impaired person, IMHO, is the pair of paddles that Stephen Hawking uses for communication. It is very slow and laborious, but with a little patience he is able to get quite a bit done. From what I understand, the paddles are basically buttons, providing two one-dimensional inputs.
Given that this is a 2-bit system, while a keyboard is based on ASCII, an 8-bit system (7 bits usable, for 128 symbols): a keystroke selects one of 128 symbols in a single press, whereas the two paddles distinguish at most 4 combinations per press. Encoding one of 128 symbols at 4 choices per press takes at least log?(128) = 3.5, i.e. 4 presses, so getting the same amount of work out of the paddles takes several times as many inputs per character, before you even count selection overhead.
Now that all of that is explained (probably poorly): a new form of input needs to be invented based on the specific needs of the specific person. You need to determine how many one-dimensional inputs he can interface with, then design a system around that. If he has use of his feet, they can be used in the interface also. In fact, if enough one-dimensional inputs are identified and specifically tailored to his needs, he may end up more productive on a computer than the average computer user.
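To make the keyboard-vs-paddles comparison concrete, here's a small Python sketch of the information-theoretic minimum number of presses per character. The "inputs per press" figures are simplifying assumptions (a keyboard as 128 distinguishable presses, paddles as 2 or 4), not a model of any real device.

```python
import math

def presses_per_symbol(num_symbols, inputs_per_press):
    """Minimum presses to select one of num_symbols when each press
    distinguishes inputs_per_press alternatives (information-theoretic bound)."""
    return math.ceil(math.log(num_symbols) / math.log(inputs_per_press))

# A full keyboard selects 1 of ~128 ASCII symbols in a single press.
keyboard = presses_per_symbol(128, 128)   # 1 press
# One paddle at a time gives 2 alternatives per press (Morse-like).
one_paddle = presses_per_symbol(128, 2)   # 7 presses
# Chording both paddles gives 4 alternatives per press.
chorded = presses_per_symbol(128, 4)      # 4 presses

print(keyboard, one_paddle, chorded)
```

The gap is logarithmic, not linear: fewer distinguishable inputs per press cost you a factor of log, which is why even a two-button interface remains workable.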
Re:x10? (Score:2)
FYI, I thought the same thing you did, until I checked out the site a little more. Pretty cool stuff that they do.
Guidelines (Score:3)
BCI and Eye-gaze systems (Score:1)
Re:Computers and Appliances (Score:2)
Perhaps not the example you were looking for..
Adaptive Hardware (Score:5)
With the help of his brother he has built a pretty cool system (beats either of my two) and has been using a device called an Adap2U Keyboard Emulator System which allows him to use Sip'n'Puff through a straw to "type" using Morse Code.
He also has a page on his site for disABILITY information and resources [eskimo.com] which seems to be an Ultimate collection of links.
Depends on the individual need (Score:2)
Trace R&D Center, University of Wisconsin (Score:1)
Re:visual impairment (Score:1)
Good link (Score:2)
You want a professional opinion? (Score:1)
Ask Timmy [comedycentral.com].
Of course, he'd only have one thing to say [comedycentral.com] about it...
------
"I do not think much of a man who is not wiser today than he was yesterday."
University (Score:1)
Andrew Dumaresq
Not just for handicapped users (Score:1)
Every time I see a new small computer, such as the Espresso PC discussed earlier today, I think about how nice it would be to carry it around and still get data into it without lugging around a keyboard. Also, my arms are showing signs of wear after typing since the Commodore PET days; an alternate input method would cut down on much productivity lost to RSI. Unfortunately, I do not believe there is any such device that is more productive than the keyboard. Voice recognition sounds like it might show the most promise, but can you imagine what the once-quiet cube farm will be like with everyone jabbering away at their computers? This also raises privacy issues.
Often in SF literature you will encounter references to sub-vocalization (similar to voice recognition, but the speaker doesn't make a sound). Has anyone heard of any serious research in this direction?
For years I have seen reports of quadriplegics blowing on straws or communicating with devices that read eye positions. These may be options, but I don't know how efficient they might be.
A Possible Solution (Score:3)
My point is this - I'm sure it wouldn't be too hard to build a small (and cheap - that's why we're using it :-) system that allows device control using the computer and the TINI as a bridge to the 1-Wire network.
The 1-Wire network consists of a daisy chain of devices using standard RJ11 networking gear. So this would be relatively cheap and easy to install. And devices can talk to it using cheap 1-Wire interface modules (2-way for approx $10-$15 US) from Point Six Inc [pointsix.com].
The TINI can also be used to send email - so maybe even some sort of alarm. If a button is pressed an email goes out asking for help.
The link is incorrect... (Score:3)
a weenie macintosh idea (Score:1)
With cheap USB joysticks, there are typically a half dozen buttons and several axes of movement that could be mapped to a variety of tasks on the computer.
With the wireless networking available today, a laptop could be used anywhere, talking to a base computer which can control appliances via X10, etc. If the IR ports were useful enough (and I don't think they are), they could be trained to control a TV and VCR (many remotes suck to use, even for those not disabled).
Sheldon
Contolling appliances with your thoughts (Score:1)
http://www.eurekalert.org/releases/uor-gia05030
Pete
Re:We are designing it right now (Score:1)
Sadly... the state of the art is, well, lagging (Score:1)
A good friend of mine is nearly completely disabled by polio, but has enough hand control to steer her chair. Using a computer is too laborious, and even talking on the telephone is not as easy as it could be. So what's my idea of an integrated system for this person?
Re:x10? (Score:1)
Keyboard Hot Keys (Score:1)
Other X10-Like Products (Score:4)
help? (Score:1)
Re:visual impairment (Score:2)
I assume you can touch-type. You know, without looking, where you have to put your finger to press a given key, and you can see the result on-screen. Why would it be so terribly difficult to get an image of Netscape's layout in your head instead of the keyboard, and hear the button before you press it??
Speech (Score:1)
My dad has progressed far enough in the illness that he is barely able to speak, and when he does, only a select few are able to understand him. Because of this I have started a project to mount a laptop on his wheelchair and, using the Festival speech synthesis library, give him the ability to speak. The problem that I have is the interface. My father uses a computer and the Internet a great deal. He is unable to use a keyboard, though. Instead he uses a little program that displays a picture of a keyboard and, when clicked on, will send that keystroke to the keyboard buffer. For the wheelchair, his means of input are even more limited, because he uses his right hand to control the wheelchair and will now use his left to control the speech laptop. If anyone has any suggestions for the interface, or would like to help in the project, please contact me. My email address is angel@myrealbox.com and I would be very grateful for any help that I might get. This is being done as a present for my father, and I would like to have it working ASAP. Thanks.
Stephen Hawking (Score:2)
Re:x10? (Score:1)
I imagine that a handicapped person would find this type of automation very helpful, especially if they are on a regular schedule.
Adaptive access (Score:2)
They sold most of it in 1997. For more information you can go to http://www.cyberpatrol.com/handiw/default.htm [cyberpatrol.com].
It used to include screen magnifiers, Morse code to ASCII, key predictors, word completion, and remote control of X10 devices.
Another option might be a touch screen. It might be easier to touch than to move a mouse.
Morse 2000 Outreach (Score:1)
Combination approach (Score:2)
The main problem with this idea is that you're probably looking at combining a number of different technologies in addition to speech commands and X10, and there are more of those in total for Windows (ironic because Windows is inherently harder to control and use than just about anything else). And despite X10's recent nauseating push to quasi-porn spycam marketing, they have been doing home control for decades (I remember seeing an entire house controlled from an Apple ][ at an exhibition once).
One of the neatest command speech apps I've seen was an integrated solution for Mac Netscape (Speech Navigator?) that allowed one to say "Navigator Back", "Navigator Scroll Down", "Open link 'Access Technologies'", etc. It worked amazingly well, but some pages were more accessible than others (the Slashdot main page would suck, for instance, since all the important links have the same name - "Read More..."). Nowadays you could probably write custom scripts to do the same thing with a more scriptable browser, like iCab or IE5.
Re:x10? (Score:1)
X10 is a communications protocol for communicating over 110V wiring. Here's a FAQ [ualberta.ca]. One company that sells X10 compliant equipment (along with wireless cameras) is X10.com [x10.com].
Regardless of this particular company's advertising methods, the technology is still cool and useful.
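For a rough feel for X10 addressing (16 house codes A-P times 16 unit codes 1-16, for 256 addressable devices), here's a small Python sketch. The function names and the human-readable command format are invented for illustration; the actual power-line bit encoding and zero-crossing timing are not reproduced here.

```python
# Illustrative model of X10 addressing; not the wire protocol itself.
HOUSE_CODES = "ABCDEFGHIJKLMNOP"

def x10_address(house, unit):
    """Validate and normalize an X10 address, e.g. ('A', 3) -> 'A3'."""
    if house not in HOUSE_CODES:
        raise ValueError("house code must be A-P")
    if not 1 <= unit <= 16:
        raise ValueError("unit code must be 1-16")
    return f"{house}{unit}"

def command(house, unit, action):
    """Build a readable command string, as a wrapper for some
    hypothetical serial controller might."""
    assert action in ("ON", "OFF", "DIM", "BRIGHT")
    return f"{x10_address(house, unit)} {action}"

print(command("A", 3, "ON"))  # A3 ON
```

A home-automation script for someone with limited mobility would mostly be a table mapping a few switch inputs onto commands like these.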
---
Have you ever noticed that at trade shows Microsoft is always the one giving away stress balls...
Re:visual impairment (Score:1)
Much as this will go down like a lead balloon, the short answer is install Win2K.
You might not like it, but its support for alternative inputs outweighs Linux's. Now X10, I don't know about, but I'd guess the support for Windows/X10 is as complete as the support for Linux/X10.
Pie Menus-- faster, simpler... (Score:1)
If you haven't seen them, you might want to check out Don Hopkins' Pie Menus:
http://catalog.com/hopkins/piemenus/index.html
Assuming a user with limited motor control who can use a stick interface, pie menus allow navigating multiple selections with a minimum of effort. Some time ago, someone did a shareware pie menu for entering text on Newtons -- I could enter text using a pie menu faster than I can type! It'd be a nice tool for PCs.
Unfortunately, different disabilities require different tools -- and we're still in an age where most of the people who design devices for the handicapped have little clue. (Most electric wheelchairs don't have battery gauges, or a way to recharge without the help of a third party... one friend ran out of battery power in an elevator.)
Cheers.
Re:Computers and Appliances (Score:1)
Huge demand for a good, cheap solution (Score:1)
A drinking buddy of my dad's had a stroke a few years ago that left him with the use of almost nothing but his mind (and a loyal family, thank God). He can move his neck and torso a little bit, and manage a soft growl... that's about it.
His only means of communication is his computer. It uses a laser pointer which he wears on his forehead (it was pretty uncomfortable at first, but some local kids built a mount for his glasses frames that helped out a lot).
"Typing" is done using a predictive selection program. Hold the cursor on "A", and you get a list of the 20 words that you used most often starting with A. Choose the word, or select the second letter to narrow it down a little more until you either spell it out or get the word you want from the list. Once you choose a word, you get a list of word clusters, and so on. (People who watched the "Brief History of Time" documentary saw how Stephen Hawking uses a program a lot like this.)
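The prediction step described above can be sketched in a few lines of Python. This is a minimal sketch, assuming a simple word-frequency store (the `history` dictionary here is invented data); a real product would also predict following words and persist the counts.

```python
def predict(words_by_freq, prefix, limit=20):
    """Return up to `limit` most-frequently-used words starting with
    `prefix`, most used first (ties broken alphabetically)."""
    matches = [w for w in words_by_freq if w.startswith(prefix)]
    matches.sort(key=lambda w: (-words_by_freq[w], w))
    return matches[:limit]

# Hypothetical usage history: word -> times used.
history = {"and": 50, "apple": 3, "ask": 12, "any": 20, "the": 90}
print(predict(history, "a", limit=3))  # ['and', 'any', 'ask']
```

Each selection either picks a whole word or narrows the prefix by one letter, so the cost per word is a handful of dwell selections rather than one per character.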
Using this program on a 486 running Windows 3.x, he was able to interact with people and write professionally, but the technology is far from perfect... He recently upgraded to a spiffy new PC, only to find that the cursor moved too fast for him to control. The system had no mouse, only the laser pointer, so people had a tough time working out how to adjust it for him. (My dad called me for help, and I walked him through the Win95 GUI using only keyboard shortcuts and arrow keys... not fun.)
A free beer solution would totally rock, because people who sell this kind of stuff charge massive amounts of money for their products, and not being able to walk & speak can make it tough to land a high-paying job.
Has been done (Score:1)
Re:Computers and Appliances (Score:1)
Roman Plumbers...
Actually it was worse, they made pipes out of lead, but then there are many places in the US (Like Minneapolis) and other places around the world that still have lots of lead pipes being used for water distribution. Minneapolis is to the point where it is only the pipes from the mains to the houses, but...
On to Bluetooth, etc. One thing that worries me is protocols, extensibility, and control over extensions. It's my feeling that for the first decade or so you will see lots of little incompatibilities between appliance lines. As time wears on, the extensions that cause the incompatibilities will slowly become universal or be dropped as dead ends. Bluetooth isn't going to make all the appliances talk seamlessly at first. There will be hiccups, and you're dealing with an environment where the item will not be replaced for years, possibly decades. It will take time to determine how best to communicate and what information needs to be transferred.
Center for really neat research does this kind of (Score:1)
http://www.pulsar.org
Re:Other X10-Like Products (Score:1)
GPL'd Morse Code user interface: Morseall (Score:1)
I have developed a GPL'd Morse code interface for a Linux shell. Please download it and try it out!
http://pehr.net/morseall [pehr.net]
I would love to get more feedback on this so that I can make it more useful. This is intended to be a complete user interface controlled by a single mouse button. Send me email if you need help setting it up. I have someone working on documentation and want to make this an open-source application of the highest quality!
-pehr anderson
pehr@morseall.org
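The core idea of a single-button Morse interface can be sketched very simply: classify each press as dot or dash by its duration, then look up the symbol. This is a hedged sketch; the timing threshold and the (partial) code table below are assumptions for illustration, and Morseall's actual scheme may differ.

```python
# Partial International Morse table, symbol string -> letter.
MORSE = {".-": "a", "-...": "b", "-.-.": "c", ".": "e",
         "---": "o", "...": "s", "-": "t"}

def classify(press_seconds, dot_max=0.2):
    """A press shorter than dot_max seconds counts as a dot, else a dash.
    (0.2s is an arbitrary illustrative threshold.)"""
    return "." if press_seconds < dot_max else "-"

def decode(presses):
    """presses: list of press durations making up one letter."""
    symbol = "".join(classify(p) for p in presses)
    return MORSE.get(symbol, "?")

print(decode([0.1, 0.4]))       # 'a'  (dot, dash)
print(decode([0.1, 0.1, 0.1]))  # 's'
```

A complete implementation also has to segment letters and words by the gaps between presses, which is where most of the tuning effort goes.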
windows software for challenged users (Score:1)
It always pissed me off that they charged so much for this stuff, as it was very simplistic. Essentially it was a Windows application with a grid of letters or options: first a bar would pass over the rows, staying at each one for a configurable amount of time; after the button was hit on a row, a second bar would scroll across that row the same way, and the user would press the button again to select. It had some intelligence that placed the most-used options or letters at the top left, making them easier to access, and pushed other options onto other menus accessible through icons. He could control the mouse and keyboard with this program, and he still talks to me on ICQ from time to time and always asks if my dad could use ICQ (which he couldn't). He uses the same program to speak for him; it works in similar ways but lets you specify whole words through hotlists and such.
To specify how the mouse is controlled, it has something like 8 directional buttons, icons for click, double click, left click and what not, and a radar button. The direction buttons nudge the mouse in a specific direction, while the radar button shows an XORed line on the screen that starts pointing at 12 o'clock and sweeps around clockwise; the person hits the button to stop the rotation, then the mouse moves outward from the center and they press it again to stop it.
The technology is cool in that it requires only one button or switch of some kind and can be operated with the blink of an eye, but it's not worth the huge amounts of money the company charged for it, when most of the people who could use it are government funded anyway - which I'm guessing means paid by our tax dollars and private grants.
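The row/column scanning described above is simple enough to sketch. This toy version collapses the timing into "which tick the button was pressed on"; the grid contents and the two-press protocol are illustrative, not the commercial program's actual layout.

```python
# Single-switch row/column scanning over a letter grid.
GRID = [list("abcde"), list("fghij"), list("klmno"),
        list("pqrst"), list("uvwxy")]

def scan_select(row_press_tick, col_press_tick):
    """Simulate a selection: the highlight advances one row per tick
    until the first press locks a row, then sweeps that row's columns
    until the second press picks the cell."""
    row = row_press_tick % len(GRID)
    col = col_press_tick % len(GRID[row])
    return GRID[row][col]

print(scan_select(2, 4))  # 'o'  (third row, fifth column)
```

The "intelligence" the commercial product added is just a reordering of this grid so frequent letters need fewer ticks of waiting.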
Re:visual impairment (Score:1)
opop.nols.com [nols.com] ??
Earl Higgins
UIs for Dolphins (Score:1)
Video-based ("non-appliance") UIs are a wonderful field to pioneer: once we get beyond crude motion detection to actually interpreting gestures (surpassing the work of Myron Krueger's Videoplace), we can interface with a computer back home just by dialing it up on the videophone! Pity the Max Headroom series didn't foresee that!
Jaron Lanier invented the dataglove/powerglove and demonstrated how it can teach eye-hand experience, even teaching himself to juggle. Yet he regards musical instruments as the greatest refinement of human interface. Imagine combining the genius of Hawking, Krueger, and Lanier into the science of UI. Add a sprinkling of Xerox PARC, Todd Rundgren, and the sense of Brenda Laurel, then simmer until done.
Check out Pulsar.org (Score:2)
Like, did you know the Velostat(tm) that provides static shielding for chips and boards makes a great pressure sensor? Put some leads on it and push - you get a resistance change proportional to pressure. Makes a good cheap switch. The pulsar folks have demoed an RC car controlled by a pad placed in a shoe.
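Reading a pressure pad like that usually means putting it in a voltage divider and watching an ADC. Here's a hedged sketch of the math; the component values, ADC range, and the assumption that resistance drops with pressure are illustrative, not a characterization of Velostat itself.

```python
def divider_resistance(adc_value, adc_max=1023, vcc=5.0, r_fixed=10_000):
    """Solve the voltage-divider equation for the sensor resistance.
    The ADC measures the voltage across the fixed resistor:
        v = vcc * r_fixed / (r_fixed + r_sensor)
    so r_sensor = r_fixed * (vcc - v) / v."""
    v = vcc * adc_value / adc_max
    return r_fixed * (vcc - v) / v

# Pressing harder lowers the pad's resistance, raising the measured voltage.
light = divider_resistance(300)  # low ADC reading: light touch
hard = divider_resistance(900)   # high ADC reading: firm press
print(light > hard)  # True
```

Thresholding that resistance gives you the "good cheap switch"; keeping the analog value gives proportional control, e.g. for the shoe-pad RC car.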
They (the http://mindtel.com side of Pulsar) have a nice interface box called the TNG (Totally Neat Gadget). It provides A/D and digital I/O via a serial port. Primary programming uses NeatTools - a true visual programming environment (connect the boxes). Very powerful for very little effort.
As for X10 - BAH! They spam way too much and their service is terrible. However, the products can go a long way towards automating a home with remote and PC control. If you watch their site, you can get some decent deals, but DO NOT USE THEIR E-COMMERCE side for any specials advertised in email - the site doesn't know about the email specials. Use the phone instead.
As for speech recognition - it's not really all that good, but it can be better than nothing!
As for brain interfaces - they are getting better but are still very expensive. Muscle sensors can be used nicely for therapy feedback devices. One pulsar.org app used them to drive MIDI synths and nintendos. Kids that would not do their physical therapy exercises before could not be dragged away from the drums/games.
Eye trackers are also pretty expensive and poor. I tried several at the Cal State Northridge Technology and Persons with Disabilities conference and was disappointed. If you hold your head perfectly still, the video-based eye trackers can do fairly well, but if you move around, they lose the pupil. Eyeglasses and contacts also screw up the tracking.
Re:Adaptive Hardware, morse code (Score:1)
Check out Morseall. This is an actively developed Morse code user interface for a Linux shell.
http://pehr.net/morseall [pehr.net]
-pehr
Re:GPL'd Morse Code user interface: Morseall (Score:1)
This looks really cool! Now if you had an interface that allowed the use of a paddle keyer -- maybe going into the sound card -- that would really fly.
Mark
Re:visual impairment (Score:2)
"I assume you can touch-type. You know, without looking, where you have to put your finger to press a given key, and you can see the result on-screen. Why would it be so terribly difficult to get an image of Netscape's layout in your head instead of the keyboard, and hear the button before you press it??"
I must have missed the part where they said, "Since blind people have no concept of 2 dimensional surfaces....."
I can see (no pun intended) where mousing presents additional challenges to those who are blind. Heck, I lose my cursor on the screen and I've got 20/20 vision or better. This is especially acute when most of the cursor is off the screen along the edge. Knowing the starting point of the cursor makes a difference. If I think the window I want is on the right side of the screen and my cursor is buried along that same side, moving right won't help.
I've worked with some blind friends (they'd turn on lights for me since I was handicapped and tripped over the furniture without them) and they do much better at remembering where things were left than I do. This isn't so much a function of 'enhanced' senses compensating as it is a matter of necessity. Maybe that ability would translate to the screen as well.
I would think a hot-key to center the cursor would be helpful in establishing a reference point to start mousing. Maybe the existing products do something similar already.
carlos
Integrating technology and humans (Score:1)
Eye Trackers (Score:1)
If they are small enough, you could have a head mounted unit.
If eyeglasses interfere, why not mount them on the eyeglasses themselves?
Then the camera could observe the pupil from a short distance without getting reflection interference, since it would be calibrated with the glasses on.
That would eliminate the interference problem and the problem with moving around all in one step, since the camera would move with your head.
EagleEyes at Boston College (Score:1)
For the past four years I've been working on a research project at Boston College called EagleEyes that was developed for exactly this purpose. EagleEyes is a mouse replacement technology that allows the user to control the mouse cursor just by moving his eyes or head.
The technology is about 6 years in the making, and we currently have over 35 students, most with cerebral palsy, using the system on a regular basis (a couple of hours each week) for their learning. We also have five systems installed in homes around Massachusetts, three in special needs schools, and two systems in Birmingham and Manchester, England.
The hardware isn't free, but the system specifications and all of the necessary software (much of which I have written) are available for free, as is support for the system.
For details, check out http://www.bc.edu/eagleeyes [bc.edu]
Re:Integrating technology and humans (Score:1)
Re:x10? (Score:1)
Bluetooth, whenever it appears in volume, would be a good communication tool for this kind of system. No extra wires. It's supposed to be single-chip capable, cheap (after a bit), and later phases have a range of up to 100m.
Home automation/control site:
hometoys [hometoys.com]
Re:Computers and Appliances (Score:1)
Is this a given, or do Bluetooth and other high-tech-enabled consumer goods open the door to the possibility of upgrading your [dishwasher | dryer | stove | doorbell] every three years? I'm sure the Maytag repairman is salivating at the thought of recurring revenue, as are Home Depot, Sears, Bill Gates [ob-Borg reference], etc.
gesture interfaces (Score:1)
Alternative interfaces are something that interests me. When total bandwidth is limited by disease or trauma, I think it's a good strategy to look at the axes of motion or expression that still have full bandwidth. Certainly, speech is one route. Another is fine motor control and timing, which can be used à la a Morse code key. I wrote one interface that can be used to do a limited amount of command interpretation for someone with a 2-degree plane of mobility. (Mouse, or finger on a touchpad, or pointer on screen.)
LibStroke [etla.net]
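The approach LibStroke takes, as I understand it, is to quantize the pointer's path onto a 3x3 grid numbered 1-9 and match the resulting digit sequence against known strokes. Here's a minimal sketch of that quantization step; the function name and the matching strategy around it are my own illustration, not LibStroke's API.

```python
def encode_stroke(points, width, height):
    """Map (x, y) pointer samples onto a 3x3 grid (cells 1-9, row-major)
    and return the deduplicated cell sequence as a string."""
    cells = []
    for x, y in points:
        col = min(2, int(3 * x / width))
        row = min(2, int(3 * y / height))
        cell = str(3 * row + col + 1)
        if not cells or cells[-1] != cell:  # collapse repeated cells
            cells.append(cell)
    return "".join(cells)

# A straight left-to-right drag across the middle of a 300x300 area:
path = [(10, 150), (150, 150), (290, 150)]
print(encode_stroke(path, 300, 300))  # '456'
```

A command table is then just a dict from sequences like '456' or '159' to actions, which is about as cheap as gesture recognition gets.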
Eye Tracking (Score:1)
-Foxxz
eye control (Score:1)
Re:watch your language :) (Score:1)
Re:Computers and Appliances (Score:1)
Connectivity matters a lot (Score:1)
Software available (Score:2)
It comes with layouts for a bunch of different languages/keyboard styles.
I hope this will be useful to you or other people. I've gotten email from a bunch of people in the past who said they found it quite useful - among those were a stroke victim using Linux, and several people with ALS (Lou Gehrig's disease) and cerebral palsy.
The good part is that if you can use a mouse and you have X, you can do anything a keyboard can do. Of course it's a bit slower than using the keyboard, (we've got some new features coming soon to address that) but one mouse and three buttons is never going to be able to compete with 10 fingers for speed.
Hope this helps.
Interfaces for the mentally challenged (Score:1)
FutureHome (Score:1)
Old Byte Magazine (Score:1)
Anyway, as far as ideas, there is not a whole lot that has changed. However, we now have the technology to actually implement some of them. Hope you can find a copy.
Re:2D vs. 1D input devices. (Score:1)
Development of a 1-D device is exactly what a group in Cambridge [cam.ac.uk] has in progress. I was involved in the early stages of the project, and we popped over to Stephen Hawking's office to see what his setup is capable of. Hawking is extremely good with his device; he has very fine temporal control with his button clicks. Basically, common words/letters are highlighted by a moving bar and a click selects. It's basic, it's slow, but it works.
The new idea, known as `dasher' [cam.ac.uk], reverses text compression (arithmetic coding with a language modeller). The user input, in the form of clicks, mouse movement or even eye-tracking, effectively enters compressed information which is expanded into text. Very efficient, and speeds can be up to 40wpm or so.
It could probably be used effectively as input for PDAs, too.
Check out http://wol.ra.phy.cam.ac.uk/djw30/dasher/ [cam.ac.uk] which also includes a demo.
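The "reversed compression" idea behind Dasher can be illustrated with the core of arithmetic coding: the display is an interval subdivided in proportion to each letter's probability, and steering into a sub-interval both selects the letter and rescales that sub-interval to fill the display. This toy sketch shows only the subdivision step, with a made-up three-letter model; Dasher's real language model is far richer.

```python
def subdivide(probs):
    """Map each symbol to its (low, high) slice of [0, 1),
    with slice width proportional to the symbol's probability."""
    intervals, low = {}, 0.0
    for sym, p in probs:
        intervals[sym] = (low, low + p)
        low += p
    return intervals

# Hypothetical unigram model: 'e' is predicted most strongly.
model = [("e", 0.5), ("t", 0.3), ("a", 0.2)]
iv = subdivide(model)
print(iv["t"])
```

Because likely letters get wide slices, probable text requires only coarse steering, which is where the 40 wpm figure comes from.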
Matt Davey
Oh, I can't stand scientists. Have they nothing better to do with their time
Drawing Tools for the Blind (Score:1)
http://guir.berkeley.edu/projects/ic2d/ [berkeley.edu]
We're CS grad students at UC Berkeley researching user interfaces. You can check out the other stuff we're doing too.
http://guir.berkeley.edu/ [berkeley.edu]
Alternative Input methods - Fitaly Keyboard (Score:1)
This keyboard is presumably on-screen (think Palm) but seems like an obvious candidate for a conversion to hardware (anyone? My days are full with work and Uni, and I don't have much (read: any) experience in this area).
No, I don't work for fitaly, etc, etc
(FWIW: I'm not handicapped, but I am interested in alternative input methods mainly in relation to effective UI...)
David Jackson
Re: (Score:2)
Re:Integrating technology and humans (Score:1)
My own research (Score:1)
I figured that electric wheelchairs had the minimalist form of control, a 2D input device. In a computer, the challenge is to get it to do what you want without having to click mouse buttons or keyboard keys. A joystick or mouse (without button clicks) are interchangeable input devices, with maybe the joystick being easier.
Software would basically be a nested set of screens with large hotspots that were activated by mouse overs.
If anyone is interested in seeing a demo of this sort of thing, let me know (markoatpcbluesdotcom). It's obviously quite easy to write an app like this.
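The clickless "hotspot" activation described above is essentially dwell selection: a target fires once the pointer has rested in it long enough. Here's a minimal sketch, assuming pointer readings arrive as (timestamp, hotspot-or-None) pairs; the one-second dwell threshold and hotspot names are arbitrary.

```python
def dwell_select(samples, dwell=1.0):
    """samples: list of (timestamp, hotspot_name or None) pointer readings.
    Return the first hotspot the pointer rests in continuously for
    at least `dwell` seconds, or None."""
    start, current = None, None
    for t, spot in samples:
        if spot != current:          # pointer entered a new region
            current, start = spot, t
        elif spot is not None and t - start >= dwell:
            return spot
    return None

# The pointer brushes 'lights', then settles on 'tv':
readings = [(0.0, "lights"), (0.4, "lights"),
            (0.9, "tv"), (1.5, "tv"), (2.0, "tv")]
print(dwell_select(readings))  # tv
```

Tuning the dwell time per user matters a lot: too short and brushing past a hotspot triggers it, too long and every selection is tiring.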
The other thing I am doing currently is cheaply building a parallel port interface with digital and analog inputs and outputs that are software controllable. This will allow interfacing with household appliances. In Australia, at least, such a kit is available from Dick Smith Electronics for about AUD $50, which at the moment is about US$30.
I am going to be combining software for this with the clickless interface and see what happens.
Another interesting research area is in controlling a cursor on a screen with brain waves. This was achieved about 5 years ago. (Sorry, no reference - check online science mags) A cursor's X and Y movement was not controlled by thinking "UP" or "DOWN", but by thinking about emotionally different things, for example, a peaceful scene for one direction, tense for another. I am not sure about the cost of hacking together one of these devices and connecting it to the parallel port.
Bringing all these ideas together would create a controllable home environment without ANY physical manipulation (unless Windows crashed).
The "Total Access Port" project at Stanford (Score:1)
Visual Mouse (Score:1)
Photosensors:$3
Lamp:$5
Dictation Software:$20
Basic Stamp OEM:$34
Being able to use the computer without carpal tunnel: Priceless
Some options (Score:1)
is good for a disabled person or not. Your friend Pete will need something that matches what he can do. Is his speech clear? Then voice recognition software might help. Otherwise you will need to consider options such as mouse-like devices - the Logitech Cyberman or a trackball mouse - which don't need to be dragged across a table but can nonetheless be used by fixing them firmly to a table/desk. It is advisable not to go for any of the IR/wireless mice: if the mouse were to fall down or end up reasonably far from where the person is sitting, he/she would find it difficult to get to it, whereas the cable on a corded mouse can at least be used for pulling it back up. It might be best to go for a combination of a fixed mouse and voice recognition software, and then train the software to perform actions based on simple monosyllabic sounds.
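On the software side, the "train it on monosyllables" idea reduces to a word-to-action table. A toy Python sketch, with the recognizer itself stubbed out since it would come from whatever dictation engine is in use (all names here are my own):

```python
def make_dispatcher(actions, on_unknown=None):
    """Map short recognized words to callbacks.

    `actions` is a dict like {"up": move_up, "go": click}; the speech
    recognizer (stubbed out here) would hand each recognized word to the
    returned dispatch function.
    """
    def dispatch(word):
        handler = actions.get(word.strip().lower())
        if handler is not None:
            return handler()
        if on_unknown is not None:
            return on_unknown(word)
        return None
    return dispatch

# Toy usage with fake actions that just log what they would do:
log = []
dispatch = make_dispatcher({
    "up": lambda: log.append("cursor up"),
    "go": lambda: log.append("click"),
})
dispatch("Up")   # case-insensitive
dispatch("go")
```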
Disabled Stuff Here (Score:1)
Also, here's http://www.disabilitymall.com/assistivetech/ [disabilitymall.com] a blurb on a pair of goggles that allows you to "control your computer with your eyes," although it looks like vapor at this point.
Although this stuff is designed for the disabled, I think it has some significant use for portables/wearables--the one-handed keyboard could be slung on a hip, say, to allow typing while standing.
Volunteers for Medical Engineering (Score:1)
My guess is that an inquiry to the Mech.E. Dept. or VME will get you a load of useful info.
And, yes, I'm an undergrad there, and I'm doing my senior design project for VME, also. I'm working under the supervision of the keyboard's inventor. E-mail me with any serious inquiries.
--Jurph
What about a faster keyboard? (Score:1)
The Saitek PC Dash is a large touchpad designed for games. You program a layout into it, print an overlay on your printer, and plug it into your keyboard port. It's about $70. Check out www.saitek.com
Program the Fitaly layout into a Saitek PC Dash and you suddenly have a large, inexpensive keyboard designed for fast one-digit text entry.
Web directories & databases for blindness et al. (Score:1)
http://www.geocities.com/neohephaestus/vlinks.h
Following the hyperlinks will often lead to similar help for impairments of other faculties, too.
GUSS URL please [Re:Universtiy] (Score:1)
Paul
Re:Volunteers for Medical Engineering (Score:1)
X10 and Interfacing (Score:1)
Re:GPL'd Morse Code user interface: Morseall (Score:1)
Words+ EZKeys should be GNU FreEZKeys! (Score:1)
Mobility Aids for the disAbled (Score:1)
Re:visual impairment (Score:1)
Lupe2: http://www.controtex.com/lupa2.htm
Sample screenshot: http://netdesign.net/~moses/vzones/lupa_screensho
Does something similar exist for MacOS, BeOS or X-Windows/Motif? I dunno, but it should.
Hypothetical Quadriplegic Thoughts (Score:2)
I'm going to assume for the moment that speech is not available. Why? Because
Thanks to the Perception midterm, I can tell you that the human eye is a busy little organ: it shifts, tremors, and saccades at speeds we completely fail to perceive.
The vaunted fovea, which contains the vast majority of your visual acuity, covers about one degree of the visual field. In other words, you're seeing little more than six to ten letters of the text you're looking at. Your eye is moving THAT FAST.
So in my mind, it's pretty fruitless to use eye tracking--the eye is just not a stable enough entity. Way too much noise in the signal. It's hard enough to track the poor thing *without* its design being completely contrary to what you're trying to use it for--namely, figuring out what the eye is looking at.
However, as unstable as the eye generally is, head position is actually pretty solid. A simple sensor that took samples of the image coming out of a monitor, at a high enough speed to catch an exact pixel being drawn, could correlate head position with gaze position. Assuming one could get this into a feedback loop (in other words, the patient could see where the computer thought he was "pointing" and could move his head to compensate for where he wanted the cursor to reside), it would make for a reasonable interface to a number of applications.
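A rough Python sketch of that feedback loop (the class name and constants are mine, not from any real tracker): smooth the noisy head-position samples with an exponential moving average, and let a gain setting implement the "move slower over this region" request.

```python
class HeadCursor:
    """Turn noisy head-position samples into a smoothed cursor position.

    alpha controls smoothing (1.0 = raw samples pass straight through);
    gain scales movement, so a user's "move slower here" request is just
    a lower gain.  Because the user watches the cursor and compensates,
    even a crude filter like this closes the feedback loop.
    """

    def __init__(self, alpha=0.3, gain=1.0):
        self.alpha = alpha
        self.gain = gain
        self.x = self.y = 0.0

    def update(self, raw_x, raw_y):
        # Exponential moving average damps the jitter in each sample.
        self.x += self.alpha * (self.gain * raw_x - self.x)
        self.y += self.alpha * (self.gain * raw_y - self.y)
        return self.x, self.y
```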
Most interestingly, the Quikwriting scheme offered for the Palm Pilot, which offers an entirely continuous motion based alphabet for writing, would work quite well in such an environment. This deserves more investigation than anything else I've written on the topic thus far.
Of course, the disadvantage of a sensor on the user is that now the user is wearing something, which introduces moving parts. Eye tracking systems could be replaced by a system that doesn't try to track an eye but rather some specific element of the face over time, with quick recalibrations. The idea is to register movements when given some cue to pay attention: for a *short period of time*, have a high-quality system track a given point on the face in order to determine where to move a cursor (perhaps using an iterative sensitivity function whereby a user can specify "OK, I want to be pointing at something in this 256x256 block; recenter in three seconds and move slower"), and then disappear as the user takes in the new data.
But what to use for specifiers? While, again, the eye isn't perfect, unnatural blink patterns may be. Two dark/bright regions disappearing and then reappearing twice in unison (or perhaps staying gone for a two-second period, with a tone providing feedback to the user as to how long they've been sending a blink signal), or possibly wink-detection systems, would be far easier to build than a pupil tracker.
I can't stress enough how valuable I'd imagine the ability to tell the system to recalibrate would be. In my mind, the system should recalibrate as often as possible, detect differences from the initial state for short bursts of time, and then disappear into the background. Extended-blink initialization is also nice for this: wherever two large white objects with dark interiors open up, there are your eyes.
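The extended-blink detector is easy to prototype once you have an eyes-open/closed signal from a camera. A toy Python sketch, with the sample rate and duration threshold invented for illustration:

```python
def detect_long_blinks(samples, rate_hz, min_closed_s=2.0):
    """Find deliberate extended blinks in a stream of eye-state samples.

    `samples` is a sequence of booleans (True = eyes closed) taken at
    rate_hz.  Ordinary reflexive blinks (roughly 0.1-0.4 s) are ignored;
    only closures lasting at least min_closed_s count as an intentional
    signal.  Returns the sample indices where each long blink ended.
    """
    min_samples = int(min_closed_s * rate_hz)
    events, run = [], 0
    for i, closed in enumerate(samples):
        if closed:
            run += 1
        else:
            if run >= min_samples:
                events.append(i)   # the long closure ended at this sample
            run = 0
    if run >= min_samples:          # stream ended mid-closure
        events.append(len(samples))
    return events
```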
But there's another area where there's an incredible amount of control--the mouth! There's an amazing number of non-natural (and ridiculously silly-looking) things we can do with our mouths that we generally don't. Folks, I call that a signal.
What if speech is available, but only partially? (Does this happen? I dunno.) But I could imagine that tonal patterns--a low pitch to a high pitch, a high-low-high, etc., would be ridiculously easy to extract via FFT.
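As a toy illustration of how easy that pitch extraction is, here's a pure-Python sketch. It uses the Goertzel algorithm (a single DFT bin, cheaper than a full FFT when you only care about a couple of frequencies); the candidate pitches and half-and-half framing are made up for the example.

```python
import math

def goertzel_power(samples, freq, rate):
    """Signal power at one frequency (a single DFT bin), via Goertzel."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / rate)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def pitch_pattern(samples, rate, low=200.0, high=400.0):
    """Classify an utterance as "low-high" or "high-low" by comparing
    the power at two candidate pitches in each half of the signal."""
    halves = [samples[: len(samples) // 2], samples[len(samples) // 2 :]]
    labels = []
    for half in halves:
        lo = goertzel_power(half, low, rate)
        hi = goertzel_power(half, high, rate)
        labels.append("low" if lo > hi else "high")
    return "-".join(labels)
```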
Again, whatever is done, the UI needs to feed back to the user what it thinks it saw, and needs to be built such that both the user and the environment can learn when it's doing something wrong.
Anyway, I've got two midterms, and I just spent way too long on this... somebody please contact me if this turned out to be useful. And, please, SOMEBODY look into Quikwriting as a useful scheme for quadriplegics?
Yours Truly,
Dan Kaminsky
DoxPara Research
http://www.doxpara.com
Linux Infrared Remote Control (Score:1)
Using Brain Signals (Score:1)
It outlines research that is being done to allow real world devices to respond to brain signals and the issues that need to be overcome.
The article doesn't specifically mention anything about uses for the disabled, but I imagine the implications for the physically disabled would be enormous.
I'm waiting for when they can fine-tune it enough to eliminate the need for keyboards and mice.
Cool Stuff!
shadowmn
For the temporarily handicapped too (Score:1)
Novel techniques of interaction with computers would benefit us all, since we are all temporarily handicapped at one time or another. Two examples spring to mind. In ops centers, your eyes may be busy watching a log file when you need to get some information from another system without looking away. Another is driving: you should not take your eyes off the road for any length of time to fiddle with a PC-based MP3 player.