Navigate Your iPhone With Your Noggin
Apple’s accessibility settings have often been praised for helping people with disabilities use their devices with relative ease. Settings such as larger fonts, VoiceOver navigation, and hearing aid compatibility have enabled more people to use the touchscreen-only devices to get around or keep up with friends and family.
With iOS 7 in beta, developers are sharing new features they’ve found in the software months before it’s available to the general public. Yesterday, 9to5Mac posted a report about a new accessibility feature that allows users to navigate their phone with head motions. When it's turned on, the iPad or iPhone uses the FaceTime camera to watch for head movements to the left or right.
In the video, writer Scott Buscemi demonstrates the new option, using a turn to the right as an alternative to a home button press and a turn to the left as an alternative to a tap.
The software and this new feature are still in very early beta, so there’s plenty of room for improvement. Still, the demonstration video posted on 9to5Mac looks a bit like the Saturday Night Live skit in which Fred Armisen poses as a tech blogger showing off Google Glass. (If you haven’t seen it yet, it’s worth your time.)
When this feature is set up and turned on, the device begins cycling through every available option on the screen.
For instance, if you’re in the settings pane, a blue block moves from option to option, waiting for you to turn your head in whichever direction you’ve assigned to select an item.
If you turn your head to the right to go home, the box begins highlighting rows of apps. Once you select a row, the box moves across it until you select the app you want.
This is perhaps the closest iOS equivalent to Samsung’s Smart Stay feature on its Galaxy S 4 smartphone, though the two are used for very different purposes.
Developers have also found code in the recent iOS 7 beta that allows the camera to distinguish blinks and smiles. 9to5Mac has this story as well, suggesting these APIs could bring improvements to the native camera app in future versions of iOS, if not in iOS 7 itself. For instance, these APIs could make the camera wait to take a picture until everyone in the shot is looking at the phone and smiling.
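For the curious, the smile and blink detection 9to5Mac describes surfaced publicly as Core Image face-detection options (CIDetectorSmile and CIDetectorEyeBlink, added in iOS 7). Here’s a rough modern-Swift sketch of the “wait until everyone’s smiling” idea — the function name and the shutter-gating logic are illustrative, not Apple’s actual camera app implementation:

```swift
import CoreImage

// Sketch: check whether every face in a frame is smiling with both eyes open,
// using the CIDetector options introduced in iOS 7.
func everyoneReady(in image: CIImage) -> Bool {
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    let faces = detector?.features(in: image,
                                   options: [CIDetectorSmile: true,
                                             CIDetectorEyeBlink: true]) ?? []
    // The shot is "ready" only if at least one face is found and
    // every detected face is smiling with neither eye closed.
    return !faces.isEmpty && faces.allSatisfy { feature in
        guard let face = feature as? CIFaceFeature else { return false }
        return face.hasSmile && !face.leftEyeClosed && !face.rightEyeClosed
    }
}
```

A camera app could run a check like this on preview frames and only fire the shutter once it returns true.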
Image Credit: Monkey Business Images / Shutterstock