Intuitive Design Delights End Users

https://www.autocar.co.uk/car-news/motor-shows-geneva-motor-show/honda-bucks-industry-trend-removing-touchscreen-controls

This is a great example of a scenario where the User Experience (UX) strongly matters and goes well beyond any single User Interface (UI): intuitive physical controls let drivers operate them and perceive configuration changes by feel, without having to look, which is especially important for keeping their eyes on the road.

User testing matters. User context matters.

Emergence of AR Applications

Over the past couple of years, I've mentioned in conversations with fellow engineers and technologists that I believe augmented reality (AR) has great practical potential to improve how we live and work.

Last week, I got to experience that potential myself for the first time, in a practical way, when I wanted to quickly get walking directions to a local taco shop on my Android phone.

Google Maps presented me with the option to get walking directions via AR.

I gotta say, the experience was phenomenal, despite multiple heads-up notices mentioning that it was still in preview mode.

Recognizing my position and orientation on the street was a breeze, quick and very smooth. (I'm assuming it combined location data with visual cues matched against Street View imagery?)

The app also suggested I put the phone down and focus on what was in front of me, instead of trying to walk with the phone held out in my hand. When I followed the app's instructions, the interface changed back from a viewfinder-like state (with AR overlay arrows and an endpoint bubble for my destination) to the regular maps experience.

Try it out yourself!

(Side story: while I was turning the last corner with the phone held up, a passerby paused to let me keep pointing my phone at the surroundings. When I noticed him pausing, I apologized and suggested he continue on. He suggested I finish taking my selfie, to which I replied that I was using Google Maps' AR experience to navigate. His reply was "Wow, sounds intense." My guess is we'll be seeing more folks on the street mistaking AR navigators for people taking selfies, which is the more familiar sight nowadays.)

Helpful wearables

I had to get a new thermometer while battling the flu the other day. While browsing the shelf at my local CVS, I came across a reasonably priced smart thermometer called Kinsa and decided to try it out.

It was a very positive experience, starting with the Bluetooth setup.

Especially pleasant (I'd go as far as calling it a delightful UX) was the way the application opened: instead of jumping into the typical home screen, dashboard, or initial menu, it had a conversation with me through a super simple chat interface. After I answered a few questions, my user profile was created and the app was ready to help me with my single need: taking a medical device reading (in this case, temperature) and suggesting next steps based on it (should I call the doctor? are my symptoms serious?). That's all I wanted from this app and device.

Hats off to the Kinsa team for designing this experience, and clearly testing it, to make sure it's easy and useful for all of us.

It made my life easier and I'm grateful for having this product available in the local pharmacy. 

The experience and guidance it provided beyond a traditional thermometer were well worth the extra couple of bucks it cost.