This SXSW, Chaotic Moon Director of Strategy Greg Carley will hold an official talk titled Think Beyond the Screen: UX with Human UI on March 15th. To prep you for the program, here’s a glimpse at the topic: design that responds to and/or mimics natural human behavior.
The excitement started with messaging, and apps like Operator and Magic that promise to help users get essentially anything they want through text. Need to get across the country ASAP? Try “Book me a flight to San Francisco this Friday.” In the doghouse? We recommend “Deliver flowers to my girlfriend NOW!” It’s almost like texting a friend for a favor, or your personal assistant to get something done. The app then decodes the message with Deep Learning (think the technology behind platforms like IBM Watson), an approach that learns to understand and solve problems on its own, without requiring engineers to code every possible outcome or detail.
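To make the idea concrete, here is a toy sketch of what “decoding” a request looks like: free-form text goes in, and a structured intent plus its parameters comes out. This is not how Operator, Magic or Watson actually work (they use learned models, not hand-written rules); the keyword patterns and intent names below are purely illustrative.

```python
# Illustrative only: a rule-based stand-in for what a deep-learning
# model would do -- map free-form text to an intent and its details.
import re

def parse_request(message):
    """Return an (intent, details) pair for a few hypothetical intents."""
    text = message.lower()
    flight = re.search(r"book me a flight to ([a-z ]+?) (this \w+|tomorrow|today)", text)
    if flight:
        return ("book_flight", {"destination": flight.group(1).title(),
                                "when": flight.group(2)})
    flowers = re.search(r"deliver flowers to (my \w+|[a-z ]+)", text)
    if flowers:
        return ("send_flowers", {"recipient": flowers.group(1)})
    return ("unknown", {})

print(parse_request("Book me a flight to San Francisco this Friday"))
# -> ('book_flight', {'destination': 'San Francisco', 'when': 'this friday'})
```

The real appeal of the learned approach is precisely that nobody has to maintain regexes like these: the model generalizes to phrasings its engineers never wrote down.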
Because written text doesn’t involve accents, intonations, facial expressions and other body-language cues, it’s probably the simplest, most direct method for exchanging information. But while it may be straightforward–and while devices like laptops and mobile phones have revolutionized remote communication, making body language seem less important–can we really discount this element of communication, especially when it plays such a huge role in transmitting meaning and emphasis?
Designers and developers have historically focused on creating technologies for us to click, swipe, tap, press and hold. Only relatively recently could we really stray from the screen, start thinking about UI in other terms and explore other ways to take a user’s digital experience to the next level.
As human interactions shift, companies should start making experiences less screen-dependent and more appealing and relevant to all human senses. To make experiences more human-like, we could start exploring cues like sound, temperature and smell. Yes, we’re largely visual creatures, but vision and touch are only part of how we interact with the world. It’s time to go from inputting information onto screens to creating technology that appeals to and interacts with the other human senses.
“Responding to natural human behavior will become a more important element of design as we move into a mixed environment of screens and smart objects without screens,” explains Carley.
While we’ve gotten a taste of screen-free digital experiences through largely voice-controlled technology like the Amazon Echo, it’s admittedly difficult to imagine a world completely devoid of standard interfaces–one in which we’re independent of the screen and instead use other technology to control and interact with our environment and our homes, and to make our lives easier. Yet there are human senses that technology has barely applied, and ones which could be implemented extremely effectively.
Take, for instance, the ways we could use sensors. Sensors are certainly nothing new, but cloud-based technology has given them the ability to essentially talk to each other, which opens plenty of doors and presents a lot of potential for gathering useful data. For example, this sensor-to-sensor communication could allow your refrigerator to “smell” when food is going bad, identify what smell relates to what product, and catalogue that information accordingly. Then this big data, provided by an entire network of fridges with different owners/users, could be stored in a “library” of sorts that devices could draw upon later–right before you decide that “yeah, I think this chicken looks okay.” Crisis (aka food poisoning) averted. Alternatively, imagine taking a call at a bar, and your phone being able to sense and identify your level of intoxication. If you were over the legal limit, your phone could send a message to your car and disable it. (Or disable your text messaging, allowing you to avoid a disaster of a totally different kind.)
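The two scenarios above boil down to a simple pattern: devices contribute readings to a shared store, and other devices consult it before acting. The sketch below shows that pattern in miniature; every name, signature and threshold in it is hypothetical, and real implementations would involve actual sensor hardware, cloud APIs and legal-limit rules that vary by jurisdiction.

```python
# Hypothetical sketch: a crowd-sourced "smell library" for fridges, and a
# phone that disables the car when sensed intoxication exceeds a limit.
SMELL_LIBRARY = {}  # shared store: sensor signature -> verdict

def report_smell(signature, product, spoiled):
    """One fridge contributes a sensor reading to the shared library."""
    SMELL_LIBRARY[signature] = {"product": product, "spoiled": spoiled}

def check_food(signature):
    """Another fridge looks the signature up before its owner eats the chicken."""
    entry = SMELL_LIBRARY.get(signature)
    if entry is None:
        return "unknown -- no matching reading yet"
    return f"{entry['product']}: {'discard' if entry['spoiled'] else 'safe'}"

LEGAL_LIMIT_BAC = 0.08  # assumed threshold; varies by jurisdiction

def on_call_sensed(bac, car):
    """Phone senses intoxication during a call and disables the car if over the limit."""
    if bac > LEGAL_LIMIT_BAC:
        car["enabled"] = False  # i.e. send the disable message to the car
    return car["enabled"]

report_smell("sig-4411", "chicken", spoiled=True)
print(check_food("sig-4411"))                    # -> chicken: discard
print(on_call_sensed(0.11, {"enabled": True}))   # -> False
```

The interesting design question isn’t the lookup itself but the network effect: a single fridge learns little, while a library fed by thousands of them gets useful fast.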
Essentially, we’ve just scratched the surface of interface opportunities beyond the screen and the ways in which human-like senses could be applied to technology to make for a more natural and amazing user experience. And if you want more…well, you’ll have to attend Carley’s SX talk for that.
For a full list of Chaotic Moon SXSW panels and programs, click here!