Image: Getty Images / Justin Sullivan

Controlling your phone with your thumbs could soon become a thing of the past, as one WWDC attendee recently discovered. Matt Moss attended this year's Apple Worldwide Developers Conference as a scholarship student and had a chance to test out Apple's iOS 12 developer beta. While playing around with it, he found that the augmented reality platform ARKit 2.0 opened up some exciting possibilities.

As he shared in a video via Twitter, the AR platform allowed him to build a demo in which he could control the iPhone using only his eyes – looking at a button to select it and blinking to press it.

“I saw that ARKit 2 introduced eye tracking and quickly wondered if it’s precise enough to determine where on the screen a user is looking,” he explained to Mashable over Twitter direct message. “Initially, I started to build the demo to see if this level of eye tracking was even possible.”
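The interaction Moss describes can be sketched in two small pieces: mapping a gaze point onto screen coordinates, and treating a simultaneous blink of both eyes as a button press. The sketch below is illustrative, not Moss's actual code; the type and function names (`GazeMapper`, `isBlinkPress`) are hypothetical, though in a real ARKit 2 app the inputs would come from `ARFaceAnchor`'s `lookAtPoint` and its `eyeBlinkLeft`/`eyeBlinkRight` blend-shape coefficients.

```swift
import Foundation

// Hypothetical helper: maps a normalized gaze point (0...1 on each axis,
// assumed already projected from ARFaceAnchor.lookAtPoint) to a point
// on the device screen, where hit-testing against buttons could occur.
struct GazeMapper {
    let screenSize: CGSize

    func screenPoint(forNormalizedGaze gaze: CGPoint) -> CGPoint {
        CGPoint(x: gaze.x * screenSize.width,
                y: gaze.y * screenSize.height)
    }
}

// Hypothetical helper: blend-shape coefficients run from 0.0 (eye open)
// to 1.0 (eye closed). Requiring both eyes to pass a high threshold
// keeps ordinary squinting from registering as a "press".
func isBlinkPress(leftBlink: Double, rightBlink: Double,
                  threshold: Double = 0.8) -> Bool {
    leftBlink > threshold && rightBlink > threshold
}
```

A per-frame loop would then feed each face-tracking update through `screenPoint(forNormalizedGaze:)` to highlight whichever button the gaze lands on, and call `isBlinkPress` to trigger it.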

“Once the demo started to work, I began to think of all the possible use cases, the most important of which being accessibility,” he elaborated. “I think this kind of technology could really improve the lives of people with disabilities.”

