Human-computer interaction can help people with disabilities by (i) making current interfaces and technologies accessible to all, and (ii) designing tools that assist people with disabilities as well as their caregivers. In the first line of research, we are working on 3D sign language animation on mobile devices and on the web, to provide deaf people with information in their native language and to offer translations of written words or sentences. We have also evaluated techniques that help blind users select items from very long lists on mobile phones. In the second line of research, we developed a web-based system, called PRESYDIUM, that helps emergency dispatchers rapidly retrieve all available information about a patient with disabilities involved in an emergency and automatically generates tailored operating instructions for emergency medical responders. Information in PRESYDIUM can be entered by the physicians who treat persons with disabilities, but also by those persons themselves, so the system was designed and tested to be accessible to all. We also proposed a mobile application, called SLEC, that supports communication between deaf patients who use sign language and emergency medical responders.