
Interfaces For The Handicapped?

heller asks: "I'm wondering about different methods of interfacing with a computer for the physically handicapped. I have a family friend who has very limited motor control, and while he can peck away at a computer, it is really hard and time consuming. He also can't control almost anything else around his house: the TV, front door, phone, or microwave. My current thought is that if we can get him to interface with the computer, we can make it so he can control everything else he needs to." We have basic voice recognition, and technologies like X10: could these and other technologies help make living easier for the handicapped without costing lots of money? (Read on...)

"Pete, a long-time family friend and founder of Pedal with Pete, was born with cerebral palsy (check the Web site for more information), which greatly inhibits his motor control. He enjoys bike riding, but last June he was in a serious bike accident (once again, more info on the Web site) which further reduced his motor control. The man has incredible spirit and wants to live on his own, though he has home care nurses stop by every day to help him perform most of his daily activities.

He used to do lots of work on his computer pecking away with sticks strapped to both hands. Since his accident he has virtually lost control of one side of his body and has seriously reduced control on the other side, so this is no longer possible.

My current line of thinking is that if we can get him to be able to control a computer, we might be able to automate many other parts of his life so he doesn't have to rely so much on help from other people. While I definitely have several ideas on how to go about this, I'd really like some other input. Does anyone have any ideas or new products they can point me to?"

We've already had a discussion about Voice Recognition in one of the first Ask Slashdots, but that article is from '98 and I expect there has been a lot more progress in this area since then. I may run a follow-up post to this if folks are interested.

This discussion has been archived. No new comments can be posted.

  • With the coming of age of Bluetooth and tiny, credit-card-sized Linux computers, it is only a matter of months before appliances will be plug and play. The house will truly be a living machine.

    Internet engineers are the Roman Plumbers of today.
  • by silicon_synapse ( 145470 ) on Wednesday May 03, 2000 @10:50AM (#1094051)
    I work at a community college where we have a few blind/visually impaired students. With the aid of a piece of software called JAWS, they can use a computer about as efficiently as any of us. It reads aloud the titles of windows as your mouse moves over them, reads text aloud, adds some more shortcut keys, and really makes the monitor unnecessary... almost. I'm going to download the demo and see for myself. I've seen a blind man demo it as he trained some of the faculty, and the results surprised me.

    We also have zooming software for those who can still see a little. Many of these tools can be very useful even for those of us who aren't handicapped.


    How am I supposed to hallucinate with all these swirling colors distracting me?
  • What does this have to do with handicap access? This is a site that pushes wireless cameras. They've spammed the entire web, and now they're spamming /.? How much are you guys getting paid for that? I think there must be plenty of relevant sites to link to that actually involve tools for the handicapped.

    Detroit


  • Once you have appliances that can be controlled by a computer (i.e. microwaves, smart fridges that communicate with the grocery store, ovens with preset temperatures and times for favorite foods, etc.), you could set up macros that would in turn make everyday tasks a push of a button. In this person's case you could set up a keyboard of sorts with larger buttons (since he has limited motor control, larger buttons would be easier for him to push), and certain buttons would be macroed to various household functions (e.g. set the time on his microwave).

    I think that with this, coupled with autonomous units (independent roving vacuum cleaners?), a person with limited motor control could lead a very comfortable, independent lifestyle.

    Atticka

  • My aunt and uncle are blind. They surf the web, do email, news, etc. without much trouble at all. They have a screen reader and special keyboard mappings to control it.

    Now comes the part that I'm gonna get flamed for: They are running Windows. One thing M$ has done VERY well is build hooks into the OS for alternative input devices and keyboard control of the GUI.
    --

  • As a disabled person, I can attest that while computers make my life simpler in many ways, most of the VR software out there isn't up to snuff... It is geared towards the able-bodied crowd and therefore usually requires significant hands-on work at some point in the process, be it during set-up or when "correcting" things. In addition, many programs are only designed to type for a person; no thought is given as to how someone without limb function might actually GET TO Word or wherever to type. Software that lets you fully control your box and is actually affordable unfortunately seems to be a long way off.

    New Mobility Mag. had a write-up on DragonDictate in their latest issue. Only parts are available online, and I've requested that a full copy be made available to /. The article really goes a long way in giving a "disabled" spin on this. I've pasted what of the article I could grab online at the bottom.

    Ruthie - Compu-babe w/Multiple Sclerosis

    Is the Dragon Draggin'? The Rise and Fall of DragonSystems
    By Ben Mattlin

    Last spring, I replaced my 3-year-old computer, which included a 5-year-old version of DragonDictate, the voice recognition program that allows you to talk to your computer instead of typing on the keyboard or using a mouse. What a miracle it had been when I first got it! I'm a functional quad--have little use of my arms and no use of my legs--and I'd been unable to find any truly effective way of interacting with a computer until DragonDictate came along.

    I first saw a prototype demonstrated at a university laboratory in the 1980s, when it worked slower than catsup pouring out of a Heinz bottle and the complete setup ran about $20,000. Yet by 1991, a faster and more accurate DragonDictate was available for less than $5,000; thanks to the Department of Vocational Rehab, I was soon a grateful customer.

    I had the greatest respect for the company itself, DragonSystems, a small Newton, Mass.-based software developer that had practically invented voice recognition technology. What a team they had working there!

    Nowadays you can pick up a starter version of the software at any Best Buy store for about $20. But as DragonSystems has gone mainstream, it seems to have sold its soul. Service is no better or smarter than what you'd expect at McDonald's, and the program itself has become about as accessible and welcoming as a flight of stairs ...

  • by ian stevens ( 5465 ) on Wednesday May 03, 2000 @11:05AM (#1094056) Homepage
    ... is http://www.hj.com/JFW/JFW.html [hj.com]. An evaluation version is available for download.

    ian.
  • Festival is an awesome program for speech synthesis, Emacs has support for it too, and I believe BLINUX is the distribution for the blind.

    But I don't think this is too terribly well-known, since my search on Google just turned up the last "Ask Slashdot" we had about this. :)

    Any good resources out there about this, guys?
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • It reads aloud the title of windows as your mouse moves over them

    Using a mouse when you can't see the screen seems difficult. Do you just move it around until you happen to get it over the thing you want?

    Where I work our products have to be accessible and we use the Self Voicing Kit Technology for Java [ibm.com] to test. I have not played with it much, but my understanding is when, for example, a dialog pops up, it will read the whole dialog to you.

  • by infodragon ( 38608 ) on Wednesday May 03, 2000 @11:13AM (#1094059)
    Designing an I/O device for a physically impaired person is a difficult task. I've been thinking about this for quite a while, and to get anywhere you have to understand the ideas behind the current way of doing things.

    A computer, for most of the desktops, has two primary input devices, the keyboard and the mouse.

    The keyboard is a conglomeration of many one-dimensional buttons. A button is either on or off.

    The mouse is a two-dimensional plane with one to three one-dimensional buttons.

    This combination has been very effective for creating input for a computer. In fact, it can be argued that the keyboard is more productive than the mouse due to the time/movement factor: I can do X with a couple of keystrokes, but to do the same thing with the mouse I have to move my hand and then the mouse, taking 10x the time. Following that line of logic, a one-dimensional input device is more efficient than a two-dimensional one. For the sake of efficiency, it would be best to use a device with many one-dimensional inputs (buttons) for a disabled person.

    The most successful input device for a physically impaired person, IMHO, is the pair of paddles that Stephen Hawking uses for communication. It is very slow and laborious, but with a little patience he is able to get quite a bit done. From what I understand, the paddles are basically buttons, providing two one-dimensional inputs.

    Given that the two paddles convey at most 2 bits per action (4 combinations when both can be pressed together), while a keyboard selects directly among the 128 characters of 7-bit ASCII in a single action, singling out one character takes at least 7/2 = 3.5 paddle actions, or 7 if only one paddle can be pressed at a time. So to get the same amount of work done out of the paddles you need several times the number of actions, plus far more time per action on top of that.

    Now that all of that is understood (probably explained poorly on my part), a new form of input needs to be invented based on the specific needs of the specific person. You need to determine how many one-dimensional inputs he can interface with, then design a system around that. If he has use of his feet, they can be used in the interface also. In fact, if enough one-dimensional inputs are identified and specifically tailored to his needs, he may end up more productive on a computer than the average computer user.
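As a sanity check on the arithmetic, the comparison generalizes to any number of distinguishable inputs; a minimal sketch (the function name and framing are mine). Each action on a device with N options conveys log2(N) bits, so a single-paddle user needs about 7 actions per ASCII character:

```python
import math

def selections_per_char(num_inputs, alphabet_size=128):
    # Information-theoretic lower bound: each action distinguishes
    # num_inputs possibilities, so selecting one of alphabet_size
    # characters takes log_{num_inputs}(alphabet_size) actions.
    return round(math.log(alphabet_size, num_inputs), 6)

# A full keyboard picks a character in one action; one paddle at a
# time needs 7 actions; two chorded paddles need 3.5.
print(selections_per_char(128), selections_per_char(2), selections_per_char(4))
```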

  • The cameras and DVD and MP3 transmitters are their "hotter" technologies. They also have a couple of other items, such as motion sensors and light switches and electrical outlet stuff that can be controlled remotely. It'd be great for mobility limited handicapped people. Here's a link [x10.com] you can take a look at.

    FYI, I thought the same thing you did, until I checked out the site a little more. Pretty cool stuff that they do.

  • by cara ( 118378 ) on Wednesday May 03, 2000 @11:18AM (#1094061)
    Check out IBM's Special Needs Systems Guidelines [ibm.com]. It has info on software accessibility, hardware accessibility, web accessibility, etc, with lots of good info and links to other resources as well.
  • by Anonymous Coward
    I have searched for interesting webpages on the subject of brain-computer interfaces (BCI), and stumbled upon some really interesting stuff. While those systems that measure electrical peaks on the scalp aren't really any good for real-life use yet (hit rate about 80% at best), there's a very interesting product that lets you control your computer entirely with your eyes. Check out LCT Inc.'s webpage [lctinc.com]. I found it interesting even as a non-disabled person, but it was too expensive just for playing around with ;) - Ratty
  • Roman plumbers were the reason the Romans died of lead poisoning: lead is what they used to seal the joints of the aqueducts that fed the city. The Romans slowly went mad; effectively, an entire society was killed by lead poisoning.

    Perhaps not the example you were looking for..

  • by calishar ( 111609 ) on Wednesday May 03, 2000 @11:25AM (#1094064)
    One of the mailing lists I am subscribed to is for people who have survived a specific spinal trauma (Transverse Myelitis, fairly rare). The gentleman who runs the list is a vent-dependent quadriplegic who has absolutely no use of his limbs.

    With the help of his brother he has built a pretty cool system (beats either of my two) and has been using a device called the Adap2U Keyboard Emulator System, which allows him to sip and puff through a straw to "type" using Morse code.

    He also has a page on his site for disABILITY information and resources [eskimo.com] which seems to be an Ultimate collection of links.
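The sip-and-puff Morse idea is easy to prototype in software; a minimal sketch, assuming a "sip" maps to a dot, a "puff" to a dash, and a pause ends the letter (only part of the Morse table is shown):

```python
# Hypothetical event encoding: "sip" = dot, "puff" = dash, "pause" ends
# a letter. Only A-T of the Morse table is shown here.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
}

def decode(events):
    """Turn a stream of 'sip'/'puff'/'pause' events into text."""
    text, letter = [], []
    for ev in events + ["pause"]:       # a trailing pause flushes the last letter
        if ev == "sip":
            letter.append(".")
        elif ev == "puff":
            letter.append("-")
        elif letter:                    # pause ends the current letter
            text.append(MORSE.get("".join(letter), "?"))
            letter = []
    return "".join(text)
```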

  • Voice recognition is a relatively wide-bandwidth source of control if you can speak. For an individual who cannot speak but has good control of their eyes, cornea tracking is a solution: you stare at a letter or command for a fraction of a second to toggle it, then move your gaze to the next letter. This is faster than sticks but still gets tiring after a while. I was also wondering about an accurate GPS device for the blind. It would make finding bus stops a lot easier, as well as checking the progress you are making towards your destination, etc. Now that the accuracy of GPS has increased, maybe somebody will build one that is more useful.
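Dwell-based gaze selection of this kind can be sketched in a few lines; the class name and the half-second default threshold are my assumptions, since the eye-tracking hardware and its API are out of scope:

```python
class DwellSelector:
    """Fires a selection once the gaze has rested on one target for
    `threshold` seconds; looking away resets the timer. Real systems
    tune the dwell time per user."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.target = None
        self.elapsed = 0.0

    def update(self, target, dt):
        # Feed one gaze sample (target under gaze, seconds since last
        # sample); returns the target when the dwell threshold is met.
        if target != self.target:
            self.target, self.elapsed = target, 0.0
        self.elapsed += dt
        if target is not None and self.elapsed >= self.threshold:
            self.elapsed = 0.0
            return target
        return None
```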
  • by Anonymous Coward
    http://www.trace.wisc.edu/world/ this is the place.
  • You won't get flamed alone. IIRC, Windows 3.0 already had some sort of "magnifying lens" cursor.
  • by cg ( 18840 )
    There is a company here in the DC area called Anthrotronix [anthrotronix.com] that does a lot of great work in this area. There was an article in The Post [washingtonpost.com] about them today. Check it out. The potentials for the future of handicapped access to technology are great.
  • Ask Timmy [comedycentral.com].

    Of course, he'd only have one thing to say [comedycentral.com] about it...

    ------
    "I do not think much of a man who is not wiser today than he was yesterday."

  • I work at a university, and we have a department that specializes in helping handicapped students. They often suggest students use computers: we have computers that use JAWS for reading what is on the screen, programs that (using a scanner) can scan in pages and read them to the students, and braille writers so that blind students can read what is on the screen. This department does not do much for students who can't move their hands. That said, we do have a department head who has what I believe is cerebral palsy (in any case, he is slowly losing the use of his arms and legs); they purchased a program called GUSS and a GUSS mouse, which uses some form of laser to move the cursor. I do not have much information on the program because all I had to do was install it. The product was quite expensive (around $3000), so you might not be interested. I do not know the website off the top of my head, but if you're interested, reply here and I'll get it for you.

    Andrew Dumaresq
  • Every time I see a new small computer, such as the Espresso PC discussed earlier today, I think about how nice it would be to carry it around and still get data into it without lugging around a keyboard. Also, my arms are feeling signs of wear after typing since the Commodore PET days; an alternate input method would cut down on much productivity lost to RSI. Unfortunately, I do not believe there is any such device that is more productive than the keyboard. Voice recognition sounds like it might show the most promise, but can you imagine what the once-quiet cube farm will be like with everyone jabbering away at their computers? This also raises privacy issues.

    Often in SF literature you will encounter references to sub-vocalization (similar to voice recognition, but the speaker doesn't make a sound). Has anyone heard of any serious research in this direction?

    For years I have seen reports of quadriplegics blowing on straws or communicating with devices that read eye positions. These may be options, but I don't know how efficient they might be.

  • by BigTed ( 78942 ) on Wednesday May 03, 2000 @11:45AM (#1094072)
    For my final-year Electronic Engineering project we are building a microprocessor-based webserver to control/monitor devices over a simple 1-Wire network [ibutton.com] (from Dallas Semiconductor [dalsemi.com]). The micro is the TINI, also from Dallas Semi, and one of its interfaces is a standard RS-232 serial port.

    My point is this: I'm sure it wouldn't be too hard to build a small (and cheap - that's why we're using it :-) system that allows device control using the computer and the TINI as a bridge to the 1-Wire network.

    The 1-Wire network consists of a daisy chain of devices using standard RJ11 networking gear, so it would be relatively cheap and easy to install. And devices can talk to it using cheap 1-Wire interface modules (2-way for approx. $10-$15 US) from Point Six Inc [pointsix.com].

    The TINI can also be used to send email, so maybe even some sort of alarm: if a button is pressed, an email goes out asking for help.

  • by sdelk ( 73884 ) on Wednesday May 03, 2000 @11:45AM (#1094073)
    It should be: http://www.x10.org
  • There's a nice utility for the macintosh OS called USB overdrive which will allow USB mouse or joystick inputs to be mapped to a variety of keyclicks, mouse actions or scripts (I don't know that such a simple and elegant utility exists for other platforms). This could be very handy when combined with a bit of engineering. A mouse or joystick could be gutted and used with whatever switches, paddles etc work well for a given disability, and the keys and such can easily be mapped to the OS without a lot of hacking.
    With cheap USB joysticks, there are typically a half dozen buttons and several axes of movement that could be mapped to a variety of tasks on the computer.

    With the wireless networking available today, a laptop could be used anywhere, talking to a base computer which can control appliances via X10, etc. If the IR ports were useful enough (and I don't think they are), they could be trained to control a TV and VCR (many remotes suck to use even for those who aren't disabled).

    Sheldon
  • These guys seem to be on the right track. All you have to do is wear a funky hat, and think.

    http://www.eurekalert.org/releases/uor-gia050300.html

    Pete
  • Don't see how this makes his post redundant, though.
  • I don't know of low-cost, fully integrated systems, and I have been looking at this issue (microcomputer-controlled accessibility applied to the human dimension, via X10, etc.) since 1982.

    A good friend of mine is nearly completely disabled by polio, but has enough hand control to steer her chair. Using a computer is too laborious, and even talking on the telephone is not as easy as it could be. So what's my idea of an integrated system for this person?

    • A quality, voice-controlled browser (Mozilla + IN 3 would be fine, if that combo existed / worked well)
    • a script-savvy plug-in to the browser that could program and control a set of X10 modules (by voice),
    • a DSL line that can be used as the basis for an in-home/internet access system, sort of like Sprint ION.
    Obviously the voice recognition is the key component, but are there (and if not, why aren't there) browser-based methods for programming X10 controllers out there?
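One way a browser or voice front end could drive X10 today is to shell out to an X10 command-line controller such as heyu; a hedged sketch, with made-up device names and house/unit codes:

```python
import subprocess

# Hypothetical room layout: spoken names mapped to X10 house/unit codes.
DEVICES = {"lamp": "A1", "tv": "A2", "fan": "A3"}

def x10_command(device, action):
    # Build the argv for `heyu`, one common Unix X10 command-line
    # controller (any CM11A-style interface tool would work similarly).
    if action not in ("on", "off"):
        raise ValueError("action must be 'on' or 'off'")
    return ["heyu", action, DEVICES[device]]

def control(device, action):
    # A voice-recognition grammar or a CGI handler would call this.
    subprocess.run(x10_command(device, action), check=True)
```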
  • x10 allows electronically operated (i.e., you can program a computer to do it or use a computer to do it) remote control of all sorts of stuff with your house wiring serving as the transmission medium. I can certainly see how this could be useful to a handicapped person.
  • The guy in the cube across from me is blind and uses JAWS. He knows the keyboard shortcuts to do most of the window manipulation: switching windows, expanding menus, etc. When he switches to a new window, it says the window title, then starts reading the text in the window.
  • by Andrew Dvorak ( 95538 ) on Wednesday May 03, 2000 @11:56AM (#1094080)
    I must make mention of other products much like those X10 produces to keep things fair ;) That's all, not much, but a start :-)
  • I'm not sure if this will help or not, but here at West Virginia University we have a few handicap-accessible computer labs. Among other things, they use a foot mouse for these students. I can do some further research on this if you think it will help. Lemme know at snale@wvu.edu - scott
  • Just because you can't see doesn't mean you have no concept of a two dimensional surface!

    I assume you can touch-type. You know, without looking, where you have to put your finger to press a given key, and you can see the result on-screen. Why would it be so terribly difficult to get an image of Netscape's layout in your head instead of the keyboard, and hear the button before you press it??
  • by angel ( 84938 )
    My father suffers from Amyotrophic Lateral Sclerosis, often referred to as Lou Gehrig's disease. It happens to be the same thing that afflicts Mr. Hawking.
    My dad has progressed far enough in the illness that he is barely able to speak, and when he does, only a select few are able to understand him. Because of this I have started a project to mount a laptop on his wheelchair and, using the Festival speech synthesis library, give him the ability to speak. The problem that I have is the interface. My father uses a computer and the internet a great deal, but he is unable to use a keyboard. Instead he uses a little program that displays a picture of a keyboard and, when clicked on, sends that keystroke to the keyboard buffer. For the wheelchair his means of input are even more limited, because he uses his right hand to control the wheelchair and will now use his left to control the speech laptop. If anyone has any suggestions for the interface, or would like to help with the project, please contact me. My email address is angel@myrealbox.com and I would be very grateful for any help that I might get. This is being done as a present for my father, and I would like to have it working ASAP. Thanks.
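One low-effort idea for a one-handed interface, sketched here under the assumption that Festival is installed: a board of whole phrases, so a single click speaks a complete sentence rather than spelling it out letter by letter. The phrase list and class name are illustrative only:

```python
import subprocess

# Example phrase list; a real board would be tailored to the user.
COMMON_PHRASES = ["yes", "no", "thank you", "please come here"]

class PhraseBoard:
    """One oversized on-screen button per phrase, so one click speaks
    a whole sentence instead of requiring letter-by-letter entry."""

    def __init__(self, phrases=COMMON_PHRASES):
        self.phrases = list(phrases)

    def choose(self, index):
        return self.phrases[index]

def speak(text):
    # Pipe the text to Festival's text-to-speech mode
    # (assumes the `festival` binary is on the PATH).
    subprocess.run(["festival", "--tts"], input=text.encode(), check=True)
```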
  • There is a product called Words+ that is worth checking out: http://www.words-plus.com/ It is what Stephen Hawking uses Neil
  • We use a lot of X10 electric products to control the lights in our home. Scripts can be written and run on a Linux box to turn the lights on and off on a regular schedule; for example, our outside light is turned on every night based upon the sunset time.

    I imagine that a handicapped person would find this type of automation very helpful, especially if they are on a regular schedule.
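The sunset scheduling logic is simple to sketch; the function and times here are illustrative (a real setup would pull sunrise/sunset from an almanac table or library, and a cron job would invoke whatever X10 controller is installed):

```python
from datetime import time

def light_should_be_on(now, sunrise, sunset):
    # The outside light stays on from sunset until the next sunrise.
    return now >= sunset or now < sunrise

# A cron job or loop could evaluate this every few minutes and shell
# out to an X10 command-line tool to switch the light accordingly.
print(light_should_be_on(time(22, 0), time(6, 0), time(20, 30)))
```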

  • Microsystems Software (yes, the cyberpatrol people) used to put out a line of adaptive access products.

    They sold most of it in 1997. For more information you can go to http://www.cyberpatrol.com/handiw/default.htm [cyberpatrol.com].

    It used to include screen magnifiers, Morse-code-to-ASCII converters, key predictors, word completion, and remote control of X10 devices.

    Another option might be a touch screen. It might be easier to touch than to move a mouse.

  • Morse 2000 Outreach [uwec.edu] is a great organization, dedicated to the use of Morse code as an enabling technology with those of limited mobility. I highly recommend them as a resource.
  • It's possible to do quite a bit with the native speech recognition stuff that's built into the classic Mac OS, especially combined with AppleScript; even though Apple has never had much clue what to do with this stuff, you can. The speech recognition itself is explicitly designed for command control, not dictation. That means it works very well for controlling apps and the OS, but not at all for entering the content of an email, say. Possibly you could combine it with something like that IBM continuous-speech software and address both needs. Discrete command recognition is just what you need for controlling X10-style remote devices, though, and there is a company [bzzzzzz.com] that makes ADB control & sensing equipment for Macs as well as an interface to home X10 systems. They'd probably be able to give you an idea of how much physical home control you could do from AppleScript (and thus speech) control, too. Any cheap old pre-G3 Power Mac will handle this stuff fine (especially since they have ADB), so you may be able to experiment with at least the speech part for free if you have access to one.

    The main problem with this idea is that you're probably looking at combining a number of different technologies in addition to speech commands and X10, and there are more of those in total for Windows (ironic because Windows is inherently harder to control and use than just about anything else). And despite X10's recent nauseating push to quasi-porn spycam marketing, they have been doing home control for decades (I remember seeing an entire house controlled from an Apple ][ at an exhibition once).

    One of the neatest command speech apps I've seen was an integrated solution for Mac Netscape (Speech Navigator?) that allowed one to say "Navigator Back", "Navigator Scroll Down", "Open link 'Access Technologies'", etc. It worked amazingly well, but some pages were more accessible than others (the Slashdot main page would suck, for instance, since all the important links have the same name - "Read More..."). Nowadays you could probably write custom scripts to do the same thing with a more scriptable browser, like iCab or IE5.

  • What does this have to do with handicap access? This is a site that pushes wireless cameras.

    X10 is a communications protocol for communicating over 110V wiring. Here's a FAQ [ualberta.ca]. One company that sells X10 compliant equipment (along with wireless cameras) is X10.com [x10.com].

    Regardless of this particular company's advertising methods, the technology is still cool and useful.

    ---
    Have you ever noticed that at trade shows Microsoft is always the one giving away stress balls...

  • IIRC, Windows 3.0 already had some sort of "magnifying lens" cursor.
    Yup, and it's further improved in Win2K, along with a narrator, an on-screen keyboard, and some wizards to help you set everything up as needed.
    Much as this will go down like a lead balloon, the short answer is: install Win2K.
    You might not like it, but the support for alternative inputs outweighs Linux's. Now X10 I don't know about, but I'd guess the support for Windows/X10 is as complete as the support for Linux/X10.
  • Hi,

    If you haven't seen them, you might want to check out Don Hopkins' Pie Menus:

    http://catalog.com/hopkins/piemenus/index.html

    Assuming a user with limited motor control who can use a stick interface, pie menus allow navigating multiple selections with a minimum of effort. Some time ago, someone did a shareware pie menu for entering text on Newtons -- I could enter text using a pie menu faster than I can type! It'd be a nice tool for PCs.

    Unfortunately, different disabilities require different tools -- and we're still in an age where most of the people who design devices for the handicapped have little clue. (Most electric wheelchairs don't have battery gauges, or a way to recharge without the help of a third party... one friend ran out of battery power in an elevator :( ).

    Cheers.
  • hmmm.. reminds me of my dorm.

  • Unfortunately, for the motor-skills-impaired, voice recognition is usually not an option.

    A drinking buddy of my dad's had a stroke a few years ago that left him with the use of almost nothing but his mind (and a loyal family, thank God). He can move his neck and torso a little bit, and manage a soft growl... that's about it.

    His only means of communication is his computer. It uses a laser pointer which he wears on his forehead (it was pretty uncomfortable at first, but some local kids built a mount for his glasses frames that helped out a lot).

    "Typing" is done using a predictive selection program. Hold the cursor on "A", and you get a list of the 20 words you used most often starting with A. Choose the word, or select the second letter to narrow it down a little more, until you either spell it out or get the word you want from the list. Once you choose a word, you get a list of word clusters, and so on. (People who watched the "A Brief History of Time" documentary saw how Stephen Hawking uses a program a lot like this.)

    Using this program on a 486 running Windows 3.x, he was able to interact with people and write professionally, but the technology is far from perfect... He recently upgraded to a spiffy new PC, only to find that the cursor moved too fast for him to control. The system had no mouse, only the laser pointer, so people had a tough time working out how to adjust it for him. (My dad called me for help, and I walked him through the Win95 GUI using only keyboard shortcuts and arrow keys... not fun.)

    A free-beer solution would totally rock, because people who sell this kind of stuff charge massive amounts of money for their products, and not being able to walk and speak can make it tough to land a high-paying job.
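The predictive selection scheme described above is straightforward to prototype: rank the user's own most frequent words by prefix. A minimal sketch (class name and corpus are mine):

```python
from collections import Counter

class Predictor:
    """Suggests the user's most frequent words for a typed prefix,
    in the spirit of the predictive selection program described above."""

    def __init__(self, corpus):
        # Word frequencies learned from the user's own writing.
        self.counts = Counter(corpus.lower().split())

    def suggest(self, prefix, n=20):
        matches = [w for w in self.counts if w.startswith(prefix.lower())]
        # Most frequent first; alphabetical order breaks ties.
        return sorted(matches, key=lambda w: (-self.counts[w], w))[:n]
```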

  • 'Tis already been done: X10 and Macintoshes (I know, they are worse than windoze), but the cool thing is, anything that can be controlled by X10 is AppleScriptable, and can then in turn be activated by voice through Apple's speech recognition software. Excepting the Apple and the X10 interface, it is all free. One could also use a cordless mike and be free of the CPU as well. "Computer (or whatever you name it), TV on" and... ping... the TV turns itself on.
  • Roman Plumbers...

    Actually it was worse: they made the pipes themselves out of lead. And there are many places in the US (like Minneapolis) and around the world that still have lots of lead pipes in use for water distribution. Minneapolis is to the point where it is only the pipes from the mains to the houses, but...

    On to Bluetooth, etc. One thing that worries me is protocols, extensibility, and control over extensions. It's my feeling that for the first decade or so you will see lots of little incompatibilities between appliance lines. As time wears on, the extensions that cause the incompatibilities will slowly become universal or be dropped as dead ends. Bluetooth isn't going to make all the appliances talk seamlessly at first. There will be hiccups, and you're dealing with an environment where the item will not be replaced for years, possibly decades. It will take time to determine how best to communicate and what information needs to be transferred.

  • There is a place here at su that does a lot of this stuff. Really great people...
    http://www.pulsar.org
  • I don't know about the others, but Home Director uses X-10 modules and X-10 machine code.
  • Dear Linux Users!

    I have developed a GPL'd Morse code interface for a Linux shell. Please download it and try it out!

    [pehr.net]
    http://pehr.net/morseall

    I would love to get more feedback on this so that I can make it more useful. This is intended to be a complete user interface controlled by a single mouse button.

    Send me email if you need help setting it up. I have someone working on documentation and want to make this an open-source application of the highest quality!

    -pehr anderson
    pehr@morseall.org

  • by Anonymous Coward
    I always helped out my dad's friend who has Lou Gehrig's Disease, and he still uses a computer quite well. They would buy software and hardware from some company, and essentially the hardware was like a switch that connected to the serial port (possibly parallel, but i think it was serial) and it had one button. They had other versions that were pedals, and ones that used lasers to detect blinks in your eye for those really impaired.

    It always pissed me off that they charged so much for this stuff, as it was very simplistic. Essentially it was a Windows application with a grid of letters or options: first a bar would pass over the rows, staying at each one for a configurable amount of time, and after the button was hit on a row, a vertical bar would scroll across that row the same way; the user pressed the button again to select. It had some intelligence that would place the most-used options or letters at the top left, making them easier to access, and push other options onto other menus accessible through icons. He could control the mouse and keyboard with this program, and he still talks to me on ICQ from time to time and always asks if my dad could use ICQ (which he couldn't). He uses the same program to speak for him, which works in similar ways but lets you specify whole words through hotlists and such.

    To specify how the mouse is controlled, it has eight directional buttons, icons for click, double click, left click and so on, and a radar button. The direction buttons nudge the mouse in a specific direction, while the radar button shows an XORed line on the screen that starts pointing at 12 o'clock and sweeps around clockwise; the person hits the button to stop its rotation, then the mouse moves outward from the center and they press it again to stop the mouse from moving.

    The technology is cool in that it requires only one button or switch of some kind and can be operated with the blink of an eye, but it's not worth the huge amounts of money the company charged for it, when most people who could use it are government funded anyway, which I'm guessing means it's paid by our tax dollars and private grants.
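The row-then-column scanning described above is simple enough to sketch. A rough illustration, assuming a fixed highlight rate and using the tick at which the switch closed to recover the highlighted row and cell (the letter layout is illustrative, not the commercial product's):

```python
# Sketch of a single-switch scanning grid: a highlight steps over rows
# at a fixed rate; one press locks the row, a second press (while the
# highlight steps over that row's cells) picks the letter. The layout
# below is illustrative, roughly frequency-ordered.
GRID = ["etaoi", "nshrd", "lcumw", "fgypb", "vkjxq"]

def select(row_press_tick, col_press_tick, dwell_ticks=10):
    """Return the cell chosen by two timed switch presses.

    Each row/cell is highlighted for `dwell_ticks` ticks; the tick at
    which the switch closed tells us what was lit at that moment.
    """
    row = (row_press_tick // dwell_ticks) % len(GRID)
    col = (col_press_tick // dwell_ticks) % len(GRID[row])
    return GRID[row][col]
```

The "intelligence" the comment mentions would amount to reordering GRID so the most frequently used letters come up earliest in the scan.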

  • As one with motor disabilities who cannot use a regular keyboard, and yet doesn't have unlimited funds to purchase Windows 2000, can you tell me if the Windows 2000 on screen keyboard is any better or worse than the free GTKeyboard, available at

    opop.nols.com [nols.com] ??

    Earl Higgins

  • User Interfaces must be customized to the strengths of the user. A simple GameCam [realityfusion.com] may be useful to even a dolphin, yet a Stephen Hawking interface plays to a different level of symbolic genius.

    Video-based ("non-appliance") UIs are a wonderful field to pioneer: once we get beyond crude motion detection, to actually interpreting gestures (surpassing the work of Myron Krueger's Videoplace), we can interface with a computer back home just by dialing it up on the videophone! Pity the Max Headroom series didn't foresee that!

    Jaron Lanier invented the dataglove/powerglove, and demonstrated how it can teach eye-hand experience, even teaching himself to juggle. Yet he regards musical instruments as the greatest refinement of human interface. Imagine combining the genius of Hawking, Krueger, and Lanier into the science of UI. Add a sprinkling of Xerox PARC, Todd Rundgren, and the sense of Brenda Laurel, then simmer until done.

  • One of the great places doing computer interface work for the disabled is Pulsar.org ( http://www.pulsar.org ), which is actually a collection of companies, researchers, and other good folks. One of the lead personalities behind it is Dr Dr Dave Warner (PhD, MD), a VERY dynamic individual with a lot of great ideas. Dave often gets accused of "stunt research" with his application of high (and low) tech in quick demos that don't become products, but most of the work is stuff you can rebuild yourself.

    Like, did you know the Velostat(tm) that provides static shielding for chips and boards makes a great pressure sensor? Put some leads on it and push - you get a resistance change proportional to pressure. Makes a good cheap switch. The pulsar folks have demoed an RC car controlled by a pad placed in a shoe.

    They (the http://mindtel.com side of Pulsar) have a nice interface box called the TNG (Totally Neat Gadget). It provides A/D and digital I/O via a serial port. Primary programming uses NeatTools, a true visual programming environment (connect the boxes). Very powerful for very little effort.

    As for X10 - BAH! They spam way too much and their service is terrible. However, the products can go a long way towards automating a home with remote and PC control. If you watch their site, you can get some decent deals, but DO NOT USE THEIR E-COMMERCE side for any specials advertised in email - the site doesn't know about the email specials. Use the phone instead.

    As for speech recognition - it's not really all that good, but it can be better than nothing!

    As for brain interfaces - they are getting better but are still very expensive. Muscle sensors can be used nicely for therapy feedback devices. One pulsar.org app used them to drive MIDI synths and nintendos. Kids that would not do their physical therapy exercises before could not be dragged away from the drums/games.

    Eye trackers are also pretty expensive and poor. I tried several at the Cal State Northridge Technology and Persons with Disabilities conference and was disappointed. If you hold your head perfectly still, the video-based eye trackers can do fairly well, but if you move around, they lose the pupil. Eyeglasses and contacts also screw up the tracking.
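The Velostat trick mentioned above (resistance drops as you press) only needs a voltage divider and an ADC to become a switch. A minimal sketch of the software side, where the ADC scale, threshold, and debounce count are all assumptions about the hookup rather than any specific product:

```python
# Sketch of turning a Velostat pad into a switch: the pad sits in a
# voltage divider read by an ADC, and harder pressure drives the
# reading past a threshold. Numbers are illustrative assumptions.
THRESHOLD = 140        # ADC counts; tune per pad and divider resistor

def pressed(adc_value):
    """True when the pad is pressed hard enough to count as a switch."""
    return adc_value > THRESHOLD

def debounce(samples, needed=3):
    """Require `needed` consecutive over-threshold samples to fire,
    so electrical noise or a brush of the pad doesn't register."""
    run = 0
    for v in samples:
        run = run + 1 if pressed(v) else 0
        if run >= needed:
            return True
    return False
```

The same loop, with the threshold removed, gives you the proportional control the RC-car-in-a-shoe demo would need.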
  • For real morse code action on a linux box
    check out Morseall.
    This is an actively developed morse code
    user interface for a linux shell.

    [pehr.net]
    http://pehr.net/morseall


    -pehr

  • This looks really cool! Now if you had an interface that allowed the use of a paddle keyer -- maybe going into the sound card -- that would really fly.

    Mark

  • "Just because you can't see doesn't mean you have no concept of a two dimensional surface!"

    "I assume you can touch-type. You know, without looking, where you have to put your finger to press a given key, and you can see the result on-screen. Why would it be so terribly difficult to get an image of Netscape's layout in your head instead of the keyboard, and hear the button before you press it??"

    I must have missed the part where they said, "Since blind people have no concept of 2 dimensional surfaces....."

    I can see (no pun intended) where mousing presents additional challenges to those who are blind. Heck, I lose my cursor on the screen and I've got 20/20 vision or better. This is especially acute when most of the cursor is off the screen along the edge. Knowing the starting point of the cursor makes a difference. If I think the window I want is on the right side of the screen and my cursor is buried along that same side, moving right won't help.

    I've worked with some blind friends (they'd turn on lights for me since I was handicapped and tripped over the furniture without them) and they do much better at remembering where things were left than I do. This isn't so much a function of 'enhanced' senses compensating as it is a matter of necessity. Maybe that ability would translate to the screen as well.

    I would think a hot-key to center the cursor would be helpful in establishing a reference point to start mousing. Maybe the existing products do something similar already.

    carlos

  • I believe that the future of this sort of technology will be very positive. When we are able to fully integrate technology into the human mind and body the possibilities will be infinite. Eventually we will probably have access to the entire World Wide Web through our conscious thought. If you don't know what a word means, you could consciously, or unconsciously look it up on the World Wide Web. We must remember however that this is potentially a very dangerous technology. We must make sure that we are extremely careful in how we deal with such a powerful technological advancement.
  • How big are the video eye trackers?

    If they are small enough, you could have a head mounted unit.
    If eyeglasses interfere, why not mount them on the eyeglasses themselves?
    Then the camera could observe the pupil from a short distance without getting reflection interference, since it would be calibrated with the glasses on.

    That would eliminate the interference problem and the problem with moving around all in one step, since the camera would move with your head.
  • For the past four years I've been working on a research project at Boston College called EagleEyes that was developed for exactly this purpose. EagleEyes is a mouse replacement technology that allows the user to control the mouse cursor just by moving his eyes or head.

    The technology is about 6 years in the making and we currently have over 35 students, most with cerebral palsy, using the system on a regular basis (a couple of hours each week) for their learning. We also have five systems installed in homes around Massachusetts, three in special needs schools, and two systems in Birmingham and Manchester, England.

    The Hardware isn't free, but the system specifications and all of the necessary software (much of which I have written) are available for free as well as support for the system.

    For details, check out http://www.bc.edu/eagleeyes [bc.edu]

  • It was an attempt at a brief analysis of the obvious implications of the type of technology that is currently being discussed on this site. I was only trying to take a broader outlook at where this sort of technology may lead us in the near future.
  • I guess X10 does have other products. It's a disturbing site if you're looking for something tame like devices for handicap remote control and automation.

    Bluetooth, whenever it appears in volume, would be a good communication tool for this kind of system. No extra wires. It's supposed to be single-chip capable, cheap (after a bit), and later phases have a range of up to 100m.

    Home automation/control site:
    hometoys [hometoys.com]



    ... . . .
  • [...] There will be hiccups, and you're dealing with an environment where the item will not be replaced for years, possibly decades. [...]

    Is this a given, or do Bluetooth and other hightech-enabled consumer goods open the door to the possibility of upgrading your [dishwasher | dryer | stove | doorbell] every three years? I'm sure the Maytag repairman is salivating at the thought of recurring revenue, as are Home Depot, Sears, Bill Gates [ob-borg reference],etc.
  • Alternative interfaces are something that interests me. When total bandwidth is limited by disease or trauma, I think it's a good strategy to look at the axes of motion or expression that still have full bandwidth. Certainly, speech is one route. Another is fine motor control and timing, which can be used like a Morse code key. I wrote one interface that can be used to do a limited amount of command interpretation for someone with two degrees of freedom in a plane (mouse, or finger on a touchpad, or pointer on screen).

    LibStroke [etla.net]


  • My younger brother is handicapped and we regularly go to shows where they show new technology for the handicapped. There was a camera that would track the motion of the eye and calculate where you were looking at the screen, and it would move the mouse. This was just last year and I'm sure it has been refined. I tried it and it takes a tad of practice, but I was playing Pong with it. It just uses a camera zoomed close up on the eye.

    -Foxxz

  • For a pointing device which doesn't require use of the hands, check out eye control [eyecontrol.com]. It's a camera which watches an eyeball and figures out where you're looking. Expect to pay a couple thousand dollars, and don't expect them to ship a linux driver.
  • You had me laughing out loud. Thanks for the correction!

  • Actually, the lead pipe thing wasn't so bad. The hard spring water they piped down calcified (just like it does in your house without a softener). This calcification added a layer of protection which kept the lead from bleeding through. Of course, there's no accounting for the fact they used to sprinkle lead powder on their food just for the buzz.
  • The user-level APIs are one thing. But if you can only use them with equipment you've specially constructed, then there's a lot of the world that you can't interact with. Someone with voice recognition, for example, could use an ATM, but right now that requires an ATM that has a voice recognizer. If the ATM instead was looking for something that answered questions, you might have a VR system and I might have a keyboard wired to my wheelchair, with a CRT, but we can both answer questions. The ATM should be able to talk to either of us.

    Traditionally this is done via agreement on wire protocols, such as a standard XML DTD or a CORBA interface that translates into a particular set of IIOP bits. But that's very inflexible. The point of the Jini project is to allow agreement at the programming API level, and have each service (such as a question-answering service) give the client (the ATM) code that executes the request.

    At least that's my view of interfaces for the handicapped. Otherwise your friend can get things working inside his house, but tomorrow when he goes to the bank or the movie theater or wherever, that stuff is mostly useless because those places aren't set up for his X10/VR system.

    Ken
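The "agree at the API level" idea above can be illustrated in a few lines: the ATM codes only against an abstract question-answering interface, and each assistive device supplies its own implementation. The class and method names here are purely illustrative, not part of Jini itself:

```python
# Sketch of API-level agreement: the ATM never knows how the answer
# was produced; a voice recognizer, a wheelchair-mounted keyboard, or
# anything else just implements the same interface. Names are
# illustrative, not from Jini.
class Answerer:
    def ask(self, question):
        raise NotImplementedError

class KeyboardAnswerer(Answerer):
    """A wheelchair keyboard; canned keystrokes stand in for the user."""
    def __init__(self, keystrokes):
        self.keystrokes = keystrokes
    def ask(self, question):
        return self.keystrokes.pop(0)

def atm_session(answerer):
    """The ATM's logic, written only against the abstract interface."""
    amount = answerer.ask("How much would you like to withdraw?")
    return "dispensing " + amount
```

A VoiceAnswerer with the same `ask` method would plug into `atm_session` unchanged, which is the whole point of the argument.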
  • I'm the author of a package called GTKeyboard, located at http://opop.nols.com/gtkeyboard.html [nols.com]. It's an on-screen keyboard that allows users to type using only the mouse. The keystrokes can then be sent either to the regular text editing area in the application, or even to other applications, (such as typing shell commands into an rxvt or URLs into netscape and so on)

    It comes with layouts for a bunch of different languages/keyboard styles.

    I hope this will be useful to you or other people. I've gotten email from a bunch of people in the past who said they found it quite useful - among them a stroke victim using Linux, and several people with ALS (Lou Gehrig's disease) and cerebral palsy.

    The good part is that if you can use a mouse and you have X, you can do anything a keyboard can do. Of course it's a bit slower than using the keyboard, (we've got some new features coming soon to address that) but one mouse and three buttons is never going to be able to compete with 10 fingers for speed.

    Hope this helps.

  • Slashdot seems to have quite some success with this clientele.
  • I remember visiting FutureHome (House?) near Baltimore. It was one of those assistive technology showcases, with everything from remote-controlled elevation of the dining room table for guests with wheelchairs, to housewide remote and voice control of lighting and doors. What I really got a kick out of was that with all the money spent, they were using a Commodore for the TV remote-control interface, because it was the only thing they could find that could overlay its video over the TV picture.
  • There was a whole issue of Byte devoted to use of computers by the handicapped -- probably 17 or 18 years ago. Back when it was 350 pages, full of ads and great technical articles. In a lot of ways, Slashdot has replaced a lot of its (Byte's) functions.

    Anyway, as far as ideas, there is not a whole lot that has changed. However, we now have the technology to actually implement some of them. Hope you can find a copy.
  • Development of a 1-D device is exactly what is in progress by a group in Cambridge [cam.ac.uk]. I was involved in the early stages of the project and we popped over to Stephen Hawking's office to see what his setup is capable of. Hawking is extremely good with his device, he has very fine temporal control with his button clicks. Basically common words/letters are highlighted by a moving bar and a click selects. It's basic, it's slow, but it works.

    The new idea, known as `dasher' [cam.ac.uk], reverses text compression (arithmetic coding with a language modeller). The user input, in the form of clicks, mouse movement or even eye-tracking, effectively enters compressed information which is expanded into text. Very efficient, and speeds can be up to 40wpm or so.

    It could probably be used effectively as input for PDAs, too.

    Check out http://wol.ra.phy.cam.ac.uk/djw30/dasher/ [cam.ac.uk] which also includes a demo.

    Matt


    Matt Davey
    Oh, I can't stand scientists. Have they nothing better to do with their time
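The arithmetic-coding idea behind Dasher described above comes down to giving each candidate next letter a slice of screen space proportional to its probability, so likely letters are big, easy targets. A rough illustration of that allocation step (real Dasher nests intervals recursively and uses an adaptive language model; the probabilities here are made up):

```python
# Sketch of Dasher-style interval allocation: each candidate letter
# gets a slice of [0, 1) proportional to its probability under a
# language model, and the user's pointer position selects a slice.
def allocate(probs):
    """Map {letter: probability} to {letter: (lo, hi)} sub-intervals."""
    total = sum(probs.values())
    intervals, lo = {}, 0.0
    for letter in sorted(probs):
        width = probs[letter] / total
        intervals[letter] = (lo, lo + width)
        lo += width
    return intervals

def pick(intervals, y):
    """Return the letter whose slice contains pointer position y."""
    for letter, (lo, hi) in intervals.items():
        if lo <= y < hi:
            return letter
```

Steering into a slice then repeats the allocation for the following letter, conditioned on what has been entered so far; that recursion is what lets good predictions reach the quoted 40 wpm.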

  • Just FYI, my office mate, who is blind, is developing a drawing tool for the blind.

    http://guir.berkeley.edu/projects/ic2d/ [berkeley.edu]

    We're at UC Berkeley grad school CS researching user interfaces. You can check out the other stuff we're doing too.

    http://guir.berkeley.edu/ [berkeley.edu]

  • While reading this discussion, I've noticed many comments and mentions of controlling devices from the computer, and the inadequacies of standard keyboards. In my travels (mainly for alternative entry methods for handheld devices), I came across the Fitaly keyboard [fitaly.com].

    This keyboard is presumably on-screen (think Palm) but seems like an obvious candidate for a conversion to hardware (anyone? My days are full with work and Uni, and I don't have much (read: any) experience in this area).

    No, I don't work for fitaly, etc, etc

    (FWIW: I'm not handicapped, but I am interested in alternative input methods mainly in relation to effective UI...)

    David Jackson

  • We already have "access to the entire World Wide Web through our conscious thought". In fact, a fully functional brain is considered mandatory. You should try it some time.
  • A while back, a relative had a brain tumour, and I wrote a Delphi program that could enable construction of sentences with just limited and simple movement of the mouse - no clicking.

    I figured that electric wheelchairs had the most minimal form of control: a 2D input device. In a computer, the challenge is to get it to do what you want without having to click mouse buttons or press keyboard keys. A joystick or mouse (without button clicks) are interchangeable input devices, with the joystick maybe being easier.

    Software would basically be a nested set of screens with large hotspots that were activated by mouse overs.

    If anyone is interested in seeing a demo of this sort of thing, let me know (markoatpcbluesdotcom). It's obviously quite easy to write an app like this.

    The other thing I am doing currently is cheaply building a parallel port interface with digital and analog inputs and outputs that are software controllable. This will allow interfacing with household appliances. In Australia, at least, such a kit is available from Dick Smith Electronics for about AUD $50, which at the moment is about US$30.

    I am going to be combining software for this with the clickless interface and see what happens.

    Another interesting research area is in controlling a cursor on a screen with brain waves. This was achieved about 5 years ago. (Sorry, no reference - check online science mags) A cursor's X and Y movement was not controlled by thinking "UP" or "DOWN", but by thinking about emotionally different things, for example, a peaceful scene for one direction, tense for another. I am not sure about the cost of hacking together one of these devices and connecting it to the parallel port.

    The bringing together of all these ideas would create a controllable home environment without ANY physical manipulation (unless Windows crashed).
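The clickless "mouse-over hotspot" selection described above is usually implemented as dwell selection: a hotspot fires once the pointer has stayed inside it long enough. A minimal sketch of that core decision, assuming pointer positions sampled at a fixed rate (the dwell count is an illustrative tuning value):

```python
# Sketch of dwell selection for a clickless interface: a hotspot
# activates once the pointer has stayed inside it for `dwell`
# consecutive samples, so no button press is ever needed.
def dwell_select(positions, hotspot, dwell=5):
    """Return True if the pointer dwells in `hotspot` long enough.

    positions: sequence of (x, y) samples at a fixed rate;
    hotspot: (x0, y0, x1, y1) rectangle; dwell: samples required.
    """
    x0, y0, x1, y1 = hotspot
    run = 0
    for x, y in positions:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        run = run + 1 if inside else 0
        if run >= dwell:
            return True
    return False
```

Resetting the run count on any excursion is what keeps a tremor or an overshoot from triggering the wrong hotspot; making hotspots large, as the comment suggests, does the rest.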
  • There is a cool project going on at Stanford to develop a Total Access Port - a device which intermediates between a user and any computerized device (computer, ATM, microwave). The TAP consists of three parts: A set of user interfaces suitable for the individual user (from braille interfaces to blow-tubes), a bit of smart technology in the middle, and a "port" specification for the destination device. Read more at the Archimedes Project [stanford.edu]
  • I know someone (who now works at Parallax [parallaxinc.com]) who created a visual mouse for his senior project at CSU Sacramento. It used a pair of IR phototransistors to detect where you were looking and moved the mouse accordingly. The sensors mounted on a standard set of eyeglasses, and the source for the IR was an incandescent lamp next to the screen. The remarkable thing is that the whole thing ran off a Basic Stamp. Most such input devices are fairly pricey and specialized. He has not had time to fully debug it, but what I have seen looked very cool.

    Photosensors:$3
    Lamp:$5
    Dictation Software:$20
    Basic Stamp OEM:$34
    Being able to use the computer without carpal tunnel: Priceless

  • It is difficult to say whether a particular device is good for a disabled person or not. Your friend Pete will need something that matches what he can do. Is his speech clear? Then voice recognition software might help. Otherwise, you will need to consider options such as mouse-like devices, e.g. the Logitech Cyberman or a trackball mouse, which don't need to be dragged across a table but can nonetheless be used fixed firmly to a table/desk. It is advisable not to go for any of the IR/wireless mice: if the mouse were to fall down or end up somewhere out of reach, the person would find it difficult to get to it, whereas the cable on a wired mouse can be used to pull it back up. It might be good to combine a fixed mouse with voice recognition software, and then train the software to perform actions based on simple monosyllabic sounds.
  • I just made a /. story submission on this kind of stuff; presumably the story is still in the pipe. Here it is again: A meta-index for hardware and software for the disabled is here: http://www.disabilitymall.com/electric.htm [disabilitymall.com]. Specifically, I'm interested in the one-handed keyboard: http://www.coast-resources.com/appliedlearning/ [coast-resources.com]. They boast 60 wpm(!).

    Also, here's http://www.disabilitymall.com/assistivetech/ [disabilitymall.com] a blurb on a pair of goggles that allows you to "control your computer with your eyes," although it looks like vapor at this point.

    Although this stuff is designed for the disabled, I think that it has some significant use as portable/wearables--the one handed keyboard could be slung on a hip, say, to allow typing while standing.

  • VME [toad.net], or Volunteers for Medical Engineering, has been helping disabled persons for years now. One of their earliest projects, built by senior undergraduates of the JHU Mechanical Engineering Dept. [jhu.edu], was a curved keyboard that was activated with a "puffer" stick. The stick fit in the mouth like a traditional mouth stylus, but kicked a pulse of IR light onto an IR-sensitive keyboard. This allowed the user to greatly reduce neck strain and increase typing speed.

    My guess is that an inquiry to the Mech.E. Dept. or VME will get you a load of useful info.

    And, yes, I'm an undergrad there, and I'm doing my senior design project for VME, also. I'm working under the supervision of the keyboard's inventor. E-mail me with any serious inquiries.

    --Jurph
  • Fitaly is a keyboard layout that you can get for Palms. It was designed for fast entry when using only a single digit or pointer (stylus, finger, mouthpiece, etc.). Check out www.fitaly.com. Palm users can get 50 wpm out of it. A large version would certainly be faster for typing with a mouthpiece or other implement than a conventional keyboard.

    The Saitek PC Dash is a large touchpad designed for games. You program a layout into it, print an overlay on your printer, and plug it into your keyboard port. It's about $70. Check out www.saitek.com

    Program the Fitaly layout into a Saitek PC Dash and you suddenly have a large, inexpensive keyboard designed for fast one digit text entry.
  • The Hephaestus Project is one of many efforts around the world trying to help the blind and vision impaired. This page links to Web-hosted directories of Web sites, and databases of tools related to this effort:

    http://www.geocities.com/neohephaestus/vlinks.html

    Following hyperlinks will often lead to similar help for other types of impairment, too.
  • I'm interested. If you could post the URL I would be most grateful.

    Paul
  • For UK users, REMAP [remap.org.uk] offers a similar custom-built aid service to VME. There are branches in most areas.
  • I've been developing a router for X10 that routes X10 commands over an ethernet network. With touchscreen systems throughout the house, it would be a great means of control for the disabled. I've given thought to integrating voice mail into the system, along with e-mail, internet, etc. The idea in the end is to integrate voice/video/data/internet/e-mail/home automation into one coherent and easy-to-use package. I'd gladly make the code available if you think it would help.
  • Ever taken apart a mouse? The buttons are just simple pushbutton switches. You want a paddle keyer? Just solder wires to the switch inside your mouse and there you go! I designed this for the lowest common denominator: a simple pushbutton switch. If you can connect a switch in parallel with your mouse button, you can still use your mouse as normal, with two extra wires running out the back. -pehr
  • I'm glad someone mentioned EZKeys for Windows. What a piece of crap! If I were Stephen Hawking I'd never admit to using it! For example: here is software made for paralyzed people, and when you open the online help you get the message, "The online help has not been written yet. Please see the printed manual." This is version 2.4, for cryin' out loud! I could go on, but my point is that someone should reverse engineer this thing and open source it. I would, but I don't have the programming skills. All I want is a piece of Linux software that uses 1, 2, or 3 switches to manipulate an onscreen keyboard and mouse (word prediction and abbreviation would be nice too). How do I get such a project started?
  • My "nerdy" son referred me to this discussion; let me add several already-existing paths to pursue: 1) GPSes have been linked to voice output so that people with vision impairments can achieve mobility with accuracy. 2) The IBOT 3000, a sort of "wheelchair" that incorporates three sampling CPUs and gyroscopes, enables users to cross rough terrain, climb stairs, and "stand up" on two wheels to increase vertical access. Undergoing FDA testing. Designed by Dean Kamen. Search: DEKA; IBOT; and Johnson & Johnson. 3) A French IBM lab has developed a chip and implanted it in a person with a spinal cord injury, enabling him to do rudimentary lifting of a leg thus far. My interest: Working With Scouts With disAbilities http://www.boyscouts-marin.org/wwswd/wwswd.org
  • I've found some free zooming software for WinXX machines, called Lupe2 (apparently it's Polish for "lens"). Some zooming software will cost you $295 (no joke; see www.hj.com). Lupe2 is a free demo of an ActiveX control, but very useful in its own right.

    Lupe2: http://www.controtex.com/lupa2.htm
    Sample screenshot: http://netdesign.net/~moses/vzones/lupa_screenshot.gif

    Does something similar exist for MacOS, BeOS or X-Windows/Motif? I dunno, but it should.
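The core of any such lens tool is just magnifying the pixel block under the cursor. A minimal sketch of that step, assuming nearest-neighbour scaling (grabbing the actual block from the screen is platform-specific and omitted; here `pixels` is any 2D grid):

```python
# Sketch of the magnification at the heart of a screen lens: scale a
# pixel block up by an integer factor using nearest-neighbour
# duplication. A real tool would grab `pixels` from the framebuffer
# around the cursor and blit the result into the lens window.
def zoom(pixels, factor):
    """Return `pixels` magnified by an integer factor."""
    out = []
    for row in pixels:
        # Duplicate each pixel `factor` times horizontally...
        wide = [p for p in row for _ in range(factor)]
        # ...then duplicate the widened row `factor` times vertically.
        out.extend([list(wide) for _ in range(factor)])
    return out
```

Nearest-neighbour keeps text edges sharp at the cost of blockiness, which is usually what low-vision users want from a magnifier; a smoothing filter would blur letterforms.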
  • A little thought experiment--insert the German word that I'm thinking of; begins with a G. Don't remember it, anyway.

    I'm going to assume for the moment that speech is not available. Why? Because :-) I'll write about some speech involving tech later. Presume now the patient can't talk well.

    Thanks to the Perception midterm, I can tell you that the human eye is a busy little organ, shifting, tremoring, and saccading at rates we completely fail to perceive.

    The vaunted fovea that contains the vast majority of your visual acuity covers about a degree of sight. In other words, you're seeing little more than six to ten letters of the text you're looking at. Your eye is moving THAT FAST.

    So in my mind, it's pretty fruitless to use eye tracking; the eye is just not a stable enough entity. Way too much noise in the signal. It's hard enough to track the poor thing *without* its design being completely contrary to what you're trying to use it for, namely, figuring out what the eye is looking at.

    However, as unstable as the eye generally is, head position is actually pretty solid. A simple sensor that sampled the image coming out of a monitor at a high enough speed to catch an exact pixel being drawn could correlate head position with gaze position. Getting this into a feedback loop (in other words, letting the patient see where the computer thinks he is "pointing" so he can move his head to steer the cursor where he wants it) would make for a reasonable interface to a number of applications.

    Most interestingly, the Quikwriting scheme offered for the Palm Pilot, which offers an entirely continuous motion based alphabet for writing, would work quite well in such an environment. This deserves more investigation than anything else I've written on the topic thus far.

    Of course, the disadvantage of a sensor on the user is that now the user is wearing something. This introduces moving parts. Eye tracking systems could be replaced easily by a system that doesn't try to track an eye but rather some specific element of a face over time, with quick recalibrations. The idea is to register movements when given some cue to pay attention, have a high-quality system track a given point on the face for a *short period of time* in order to determine where to move a cursor (perhaps using an iterative sensitivity function whereby a user can specify "OK, I want to be pointing at something in this 256x256 block; recenter in three seconds and move slower"), and then disappear as the user takes in the new data.

    But what to use for specifiers? While, again, the eye isn't something perfect, unnatural blink patterns may be. Two dark-bright regions disappearing and then reappearing twice in unison(or perhaps staying gone for a two second period, with a tone providing feedback to the user as to how long they've sent a blink signal), or possibly wink detection systems, would be far easier to design than a pupil tracker.

    I can't stress enough how useful I'd imagine the ability to tell the system to recalibrate would be. In my mind, the system should recalibrate as often as possible, detect differences from the initial state for short bursts of time, and then disappear into the background. Extended-blink initialization is also nice for this: wherever two large white objects with dark interiors open up, there are your eyes :-)

    But there's another area with an incredible amount of control: the mouth! There's an amazing number of non-natural (and ridiculously silly looking) things we can do with our mouths that we don't generally do. Folks, I call that a signal :-) Move the left side of your mouth, the cursor goes left. Right side, cursor goes right. Smile, up; kiss, down. Stick your tongue out to click ;-)

    What if speech is available, but only partially? (Does this happen? I dunno.) But I could imagine that tonal patterns--a low pitch to a high pitch, a high-low-high, etc., would be ridiculously easy to extract via FFT.

    Again, whatever is done, the UI needs to feed back to the user what it thinks it saw, and needs to be built such that both the user and the environment can learn when it's doing something wrong.

    Anyway, I've got two midterms, and I just spent way too long on this... somebody please contact me if this turned out useful. And, please, SOMEBODY look into Quikwriting as a useful scheme for quadriplegics?

    Yours Truly,

    Dan Kaminsky
    DoxPara Research
    http://www.doxpara.com
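The tonal-pattern idea above (low pitch vs. high pitch as input tokens) really is easy to extract from a short audio frame. A rough sketch, using a plain DFT in place of an FFT for self-containment; the candidate frequencies and the 500 Hz split point are illustrative assumptions:

```python
import math

# Sketch of the high/low-pitch "tonal pattern" input: estimate the
# dominant frequency of a short audio frame (a plain DFT at a few
# candidate frequencies stands in for a full FFT) and classify it as
# a HIGH or LOW token. All frequencies here are illustrative.
def dominant_freq(samples, rate, candidates):
    """Return the candidate frequency with the largest DFT magnitude."""
    def mag(f):
        n = len(samples)
        s = sum(samples[i] * complex(math.cos(2 * math.pi * f * i / rate),
                                     -math.sin(2 * math.pi * f * i / rate))
                for i in range(n))
        return abs(s)
    return max(candidates, key=mag)

def classify_tone(samples, rate, split=500.0):
    """Map one voiced frame to a HIGH or LOW input token."""
    f = dominant_freq(samples, rate, [200.0, 800.0])
    return "HIGH" if f > split else "LOW"
```

A sequence of such tokens (low-high, high-low-high, and so on) then plays the same role the blink or mouth gestures do elsewhere in the comment: a small, reliable symbol alphabet.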
  • There are many devices [snafu.de] which can be controlled with infrared remote controls. Linux can use IR control signals with LIRC [uni-sb.de]. Not all IR transmitters can send all signals, so some testing will be needed, and transmitters need to be near devices which they control (ie, in the same room).
  • There is an article at: www.sciencedaily.com [sciencedaily.com]

    It outlines research that is being done to allow real world devices to respond to brain signals and the issues that need to be overcome.

    The article doesn't specifically mention anything about uses for the disabled, but I imagine the implications for the physically disabled would be enormous.

    I'm waiting for when they can fine-tune it enough to eliminate the need for keyboards and mice.

    Cool Stuff!

    shadowmn
  • Novel techniques of interaction with computers would benefit us all, since we are all temporarily handicapped at one time or another. Two examples spring to mind: in an ops center, your eyes may be busy watching a log file when you need to get some information from another system without looking away. Another example is driving: you should not take your eyes off the road for any length of time to fiddle with a PC-based MP3 player.
