
> Sign Language gloves are an interesting idea, but they don't work. Sign language relies heavily on facial expressions and body language beyond the hands.

For clarity, could you give an example of a sign that changes to mean a different word or phrase depending on facial expression?

> But deaf people aren't actually that keen on these solutions

People often aren't keen on solutions that are either still nascent or that they haven't given a chance yet. My mom wasn't keen on navigating with a GPS until she had one in her car and it changed her life. Not being keen is not on its own a great reason to dismiss a technology. The reason matters.

IMO a more meaningful statement might be to say that the obvious alternative is simply a small keyboard, which already exists.



Questions, as one example, are made almost entirely with the face (by raising the eyebrows); that particular marker is ASL-specific.

I think technology might someday be able to understand ASL and other signed languages (every region generally has its own signed language). But that will come long after computers can easily understand spoken and written languages, which today they can do only in a rudimentary way.

Languages are very complex, and signed languages are no exception. They add extra difficulty because a system must also handle facial expressions along with hand, arm, and whole-body movements (in ASL, the whole body shifts to delineate different contexts, as another example).

I, for instance, just carry around an iPad with a text app (like Sorenson's Buzzcards), which makes communicating with hearing people way easier and saves paper.


> Questions are almost entirely made with the face (by raising the eyebrows) as one example, which is ASL specific.

Thank you for the clear example. I think this context really helps people understand the conversation better.



