Accessibility in Smart (Mobile) Phones
Approximately 15% of the world's population, about 1 billion people, live with some form of disability. Smartphone interfaces that support their needs and provide relevant access to information can empower them.
This collaboration with Samsung Research Institute Bangalore (SRIB) aims to propose novel, smartphone-enabled design interventions that support the needs of differently abled people, especially those with hearing, speech, or motor disabilities, and people with limited accessibility. It also expands its scope towards situational accessibility. The project intends to use the technology platforms of smart (mobile) phones and wearable devices.
This collaboration investigates two major scopes: (a) design solutions that support users in using large screen smartphones effectively and efficiently, and (b) design solutions that solve everyday problems of differently abled users through effective interventions on smart (mobile) phones and wearable devices.
This project is under a non-disclosure agreement, hence we may not be able to publish the complete work. Based upon approvals, we may publish some work from this project.
Team members: Shimmila Bhowmick, Vrushin Goswami, Sumit Arora, Rohan Vijay, Samadrita Das, Meenal Mandil and Keyur Sorathia
People with one hand, a deformed hand, or limited hand movement often have problems accessing the full screen of large screen smartphones, especially when the navigation panel is presented at the top or in the extreme corners of the device. This limits users' access to many important features of applications. Similar problems are also experienced while traveling in crowded buses and trains or in other crowded places. These users face situational accessibility: in a specific situation they have limited access to their full abilities (despite having complete physical and mental abilities), e.g. being able to use only one hand effectively in a crowded bus while the other hand is stuck in the crowd or busy carrying bags.
We have explored a few methods to support users who experience these problems. Following are some explorations, primarily to help differently abled users reach otherwise inaccessible screen locations with ease and efficiency.
This is an illustration of accessing features (placed at the top of the screen) for a left-handed user. The single-image view screen has icons in the action bar that are unreachable by the thumb during single-handed usage. On the bottom left of the screen is the proposed Easy Access floating button. As the user taps the floating button, Easy Access gets activated. The action bar slides down to the lower half of the screen and is comfortably reachable by the thumb. The icons are now placed on the left end of the bar for easy access by a left-handed user. While the Easy Access state is active, the background becomes non-clickable. Once the user completes the desired action, they tap the floating button again, the icons slide back to their original position, and Easy Access becomes inactive.
This is an example of a typical single-image view screen in the picture gallery. The screen has icons in the action bar that are unreachable by the thumb during single-handed usage. On the bottom right of the screen is the proposed Easy Access floating button. As the user taps the floating button, Easy Access gets activated. The action bar slides down to the lower half of the screen and is comfortably reachable by the thumb. While the Easy Access state is active, the background becomes non-clickable. Once the user completes the desired action, they tap the floating button again, the icons slide back to their original position, and Easy Access becomes inactive.
The user presses on the action bar and drags it vertically to manually place it in a position comfortable for their access; the bar can be placed at any position. This is a one-time action: the next time the user taps the Easy Access button, the action bar automatically locates itself at the customized position. The user can change this position at any time by pressing on the action bar and dragging it vertically again.
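The interaction flow described above can be sketched as a small state model. This is an illustrative sketch only, not product code: the class name, screen coordinates, and position values are assumptions made for the example.

```python
class EasyAccess:
    """Models the Easy Access toggle and the action bar's vertical position."""

    DEFAULT_Y = 0        # action bar at the top of the screen
    LOWER_HALF_Y = 900   # assumed reachable position (px) on an illustrative screen

    def __init__(self):
        self.active = False
        self.custom_y = None            # user-dragged position, remembered once set
        self.bar_y = self.DEFAULT_Y

    def toggle(self):
        """Floating-button tap: slide the bar down (or back) and flip the state."""
        self.active = not self.active
        if self.active:
            # Reuse the customized position if the user has dragged the bar before.
            self.bar_y = self.custom_y if self.custom_y is not None else self.LOWER_HALF_Y
        else:
            self.bar_y = self.DEFAULT_Y

    def drag_bar(self, y):
        """Vertical drag while active: remember the position for next time."""
        if self.active:
            self.custom_y = y
            self.bar_y = y

    def background_clickable(self):
        # The background is non-clickable while Easy Access is active.
        return not self.active
```

The one-time drag is captured by `custom_y`: once set, every later activation restores the bar to that position instead of the default lower-half position.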
This is an illustration of a new unlock-screen user interface for touchscreen mobile phones. The unlock screen is divided into a top layer and a bottom layer. The top layer consists of a floating screen showcasing mobile applications or features, whereas the bottom layer presents a security feature supported by the phone. Users tap the desired application or feature and then perform the security pattern to unlock the phone. Once the phone is unlocked, it directs the user to the chosen application or feature. The floating top layer can be swiped to access more content and choose the desired item, and it can represent a category of applications or features based on user preference.
This proposal helps reduce the cognitive load on users, as it reduces the total number of interactions required to access preferred applications. Moreover, users do not have to remember multiple patterns for multiple applications/features (as often required by features such as speed dial, or by performing a gesture to open an application). This is proposed to be very beneficial for geriatric users and users with movement and joint-pain related problems.
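The unlock flow can be sketched as follows: the target is chosen on the floating top layer first, and a single shared pattern then both unlocks the phone and launches the chosen target. All names here (the class, the sample app list, the pattern value, the "home" fallback) are illustrative assumptions.

```python
class UnlockScreen:
    """Sketch of the two-layer unlock screen: choose first, unlock once."""

    def __init__(self, apps, pattern):
        self.apps = apps          # floating top layer: swipeable apps/features
        self.pattern = pattern    # one security pattern shared by all targets
        self.selected = None

    def select(self, app):
        """User taps an application or feature on the floating top layer."""
        if app in self.apps:
            self.selected = app

    def unlock(self, attempt):
        """Perform the single security pattern; return the target to open."""
        if attempt != self.pattern:
            return None                   # wrong pattern: phone stays locked
        # One pattern for every target, so there is nothing extra to memorise.
        return self.selected or "home"
```

Because only one pattern exists, selecting a different app changes the destination but never the unlock gesture, which is the interaction-count reduction described above.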
Hearing impairment is observed among older adults, and sometimes among otherwise unimpaired people due to situational disability. In this section, we present a few concepts on hearing impairment.
Enhancing movie watching experience for hearing impaired individuals
Hearing impaired individuals lack the sense of hearing while watching movies or videos; however, a strong desire to experience the movie or video is always observed among them. We developed a conceptual experiment that enhances the movie watching experience for hearing impaired individuals. Information such as the total number of characters in the screen space, which character is speaking, the gender of that character, and the ability to toggle subtitles is presented to ensure the information is clearly communicated and further contextualized for the users.
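The enriched information listed above can be modelled as a per-subtitle data structure. This is a minimal sketch under assumed field names; the actual concept's data model is not published here.

```python
from dataclasses import dataclass


@dataclass
class EnrichedCue:
    """One subtitle cue carrying the extra context shown to the viewer."""

    text: str                   # the subtitle line itself
    speaker: str                # which character is speaking
    speaker_gender: str         # gender of the speaking character
    characters_on_screen: int   # total characters in the screen space
    subtitles_on: bool = True   # the user can toggle subtitles off

    def render(self):
        """Return the display string, or None when subtitles are toggled off."""
        if not self.subtitles_on:
            return None
        return f"[{self.speaker} ({self.speaker_gender})] {self.text}"
```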
Older adults and individuals with partial hearing impairment are often observed communicating in a louder tone, which may be socially unacceptable, especially in public spaces. This concept is an attempt to bring attention to socially unacceptable communication practices observed during conversations between partially hearing impaired individuals, or between such individuals, older adults, and unimpaired individuals. Through simple vibration notifications on Samsung wearable devices, as well as visual-haptic feedback on mobile phones, users are informed about the volume of the conversation.
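The triggering logic for such a volume notification can be sketched as a simple threshold check over recent loudness samples. The threshold and window length below are illustrative assumptions, not calibrated values from the concept.

```python
COMFORT_DB = 65   # assumed upper bound for a comfortable conversation level (dB)
WINDOW = 3        # consecutive loud samples required before notifying

def should_vibrate(levels_db, threshold=COMFORT_DB, window=WINDOW):
    """Return True when the last `window` samples all exceed the threshold,
    i.e. the speaker has been consistently too loud (not just a brief spike)."""
    if len(levels_db) < window:
        return False
    return all(level > threshold for level in levels_db[-window:])
```

Requiring several consecutive loud samples avoids vibrating on a single shout or a transient noise, which would make the feedback annoying rather than helpful.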
Unimpaired individuals often end up in situations where some human abilities are restricted. For example, body movements are constrained in a crowded bus or train. In such situations, using the hands to pick up a mobile phone (from a pocket or purse) and having a telephone conversation may be dangerous for users. During this project, we investigated a few situations where users' senses are occupied while performing a specific task. We present a few concept solutions that provide assistance in various situational disability conditions.
Easy Access - allowing easy-to-use one-handed large screen smartphone interactions
Large screen smartphones demand high physical attention and practice for easy, effective and efficient interaction, especially during one-handed usage. We developed a concept called "Easy Access" that allows users to interact with large screen smartphones using one hand (either the right or the left hand) with increased ease of use and efficiency. Moreover, the concept is applicable to all applications, including native and third-party applications. The following video showcases the details of the concept.
Intelligent status checker
We often look at the screen (performing on-screen activities) while walking across the street. This is dangerous and may even cost a human life if attention is not paid while crossing the road. The concept solution identifies the user's position through GPS and immediately freezes the screen while the user is crossing the road. A few screenshots of the concept application are presented below.
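The decision logic can be sketched as: freeze the screen when the GPS fix falls inside a known road-crossing zone and the user's speed looks like walking. The zone coordinates, trigger radius, and speed bounds below are illustrative assumptions.

```python
import math

CROSSING_ZONES = [(26.1878, 91.6916)]   # sample lat/lon of a known crossing
ZONE_RADIUS_M = 25                      # assumed trigger radius in metres
WALK_SPEED = (0.5, 2.5)                 # m/s range treated as walking

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_freeze(lat, lon, speed_mps):
    """True when the user is walking inside any registered crossing zone."""
    if not (WALK_SPEED[0] <= speed_mps <= WALK_SPEED[1]):
        return False        # stationary, or moving in a vehicle
    return any(haversine_m(lat, lon, zlat, zlon) <= ZONE_RADIUS_M
               for zlat, zlon in CROSSING_ZONES)
```

The speed check keeps the screen usable when the user is riding in a bus that passes through the same coordinates.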
Automated text assistance of contents on telephonic conversation
Users are not always in situations where they can manually note down specific details during a call. For example, it is nearly impossible to note down an address or a phone number on a moving train or bus, due to the unavailability of a physical notepad or other resources to write down the details. Moreover, acting on these details (e.g. locating an address, given during a phone call, on a map) demands the user's attention and additional interactions with the mobile device. In this concept solution, we propose an automated text assistant that records a specific duration of verbal communication during a telephonic conversation on Samsung smartphones and converts it into text. Once the call is over, the user gets the recorded information in text format, which can be saved or passed directly to features such as calling or maps. Below is a schematic representation of the concept.
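One step of this pipeline, pulling actionable details such as phone numbers out of the transcribed segment, can be sketched as below. The transcript format and the regular expression are assumptions for illustration; the actual speech-to-text engine is not specified here.

```python
import re

# Matches phone-number-like runs of digits, allowing spaces and dashes inside.
PHONE_RE = re.compile(r"(?:\+?\d[\d\s-]{8,}\d)")

def extract_phone_numbers(transcript):
    """Return normalised phone-number-like strings found in a transcribed
    call segment, with internal spaces and dashes stripped."""
    return [re.sub(r"[\s-]", "", m) for m in PHONE_RE.findall(transcript)]
```

The extracted strings could then be handed straight to the dialer, mirroring how a recognised address would be handed to the maps feature.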
Older adults, individuals with Parkinson's disease, and individuals with partial vision impairment often cannot differentiate between 2D and 3D surfaces. This results from an inability to perceive the depth of moving objects. In this concept solution, we use three major components: (i) increasing the shadow intensity of the object, (ii) thickening the border outline of the chosen object, and (iii) increasing the contrast of the object against the scene background, all to increase depth perception among users. We took the use case of moving balls in sports such as table tennis, tennis, basketball and football, where the depth of the ball must be perceived to differentiate between 2D and 3D. The concept proposes to use match videos on Samsung mobile phones, and also the mobile phone camera to increase depth perception during live playground matches. The following image presents the schematic diagram of the concept.
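Component (iii) can be sketched as a standard contrast stretch applied to the tracked object's pixels: intensities are scaled away from mid-grey so the object stands out from the background. The gain value is an illustrative assumption, not a tuned parameter from the concept.

```python
def boost_contrast(pixels, gain=1.5, mid=128):
    """Scale 0-255 intensities away from mid-grey by `gain`, clamped to
    [0, 255]: bright pixels get brighter, dark pixels get darker."""
    return [max(0, min(255, round(mid + gain * (p - mid)))) for p in pixels]
```

In practice this transform would be applied only to the region containing the ball, so the contrast between the ball and the playing surface grows while the rest of the frame is untouched.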