Thursday, July 10, 2014

Historical Convenience: UX Inertia

The more time I spend in the U.S., the more I notice the little differences in human behavior. For the most part, these differences don't actually translate into differences in digital UX or UI design. This led me to the question - is good design a reflection of existing human preferences, or does it create new human preferences?

This isn't a black and white answer. It wavers in the grey. If we look at one element that touches everyone daily - money - we see that most implementations of design are adopted when they reflect an existing mass human behavior.

Micro-chip credit cards
The American adoption of chip-based credit cards has followed the path of electric car adoption - even when a realistic solution comes into play, the inertia of old perceptions of UX and unpredictable commercial accessibility get in the way. One part of this is the lag for retailers to implement chip-reading credit card scanners at the point of sale. The greater part is the lack of consumer demand, as historical convenience almost always outweighs a change in UX. U.S. consumers have lagged in most changes that increase the convenience of using money - adoption of automated tellers, digital currency, and cash-free solutions generally trails other similar countries. Companies like McDonald's have spent millions to update locations to accept micro-chip cards, and have struggled to realize a return on that investment.

North of the border and across the ocean, chip technology started making its way into the hands of university students just over ten years ago - a low-barrier pilot project with a target audience who did not suffer from the inertia of historical convenience - buying food or drinks regularly was not really a behavior until they left home. Since that time, chips have become the norm and, within the last 18 months, have evolved to include quick-scan payments. Perhaps the cultural nuances of trust or security differ enough to allow this, or perhaps the new convenience of quickly scanning a purchase feels closer to a cash transaction than credit cards have ever managed before.

In the case of chip adoption in the U.S., the user experience runs up against the barriers of behavior change; there is no perception that the new transaction system matches or improves on the convenience of the current one. A case where UX adoption is being led by existing human behavior.

Digital Currency: Let's forget about Bitcoin for a second and just look at the use of digital to facilitate transactions. E-commerce is definitely mainstream. After overcoming the trust barrier in user adoption, it has had no trouble becoming a new norm. Look overseas to South Korea, and we get another hint at existing human behavior driving digital UX. In a society rich in both tradition and tech adoption, we see an interesting convergence of behavior. Traditional gift giving for major life/death events involves money in an envelope. The dominant messaging app, Kakao (kind of like WhatsApp), recently added a digital wallet feature that allows its millions of users to send monetary gifts to other users. Was there a lag in adoption? In two days, 22 million transactions were made (maybe that's why Facebook bought WhatsApp...but that's a different blog post). Adoption required no thought; an existing human behavior had reached the point where 'digital' and 'traditional' converged in the mind of the user. The use case proposed a new process that didn't require new behavior or thinking. UX led again by existing human behavior.

I'll be expanding on these hints of design and adoption in future posts. They apply to all aspects of user interaction - from mobile apps to vehicles to the next gadgets that connect us to the world around us. A common set of characteristics connects all of them:
  • Balancing Convenience: weighing historical versus innovative
  • Accessibility: reducing friction to use and continue using
  • Reflecting behavior: a mirror on existing UX, even when it appears to be drastically different.
  • Perpetual Reinforcement: We all learn by doing. We create habits by repetition. Combining these in the UX is often the tipping point in surpassing UX inertia.

Friday, May 23, 2014

The Human Challenge in Technology

Much like theatre has the fourth wall, design and communications have the z-axis. Traditionally, as in theatre, the story being told is left on two axes – two dimensions of engagement that allow the audience to participate.

Side note: I tend to interchange storytelling with communications/design/UX – it’s the output and the approach, and I like the word.

On occasion, a third axis (or dimension...3D) is added to the mix, drastically shifting the way in which a story can be told and consumed by the audience. However, change rarely comes easy.

With VR tech like the Oculus Rift, ultrasound tech like the Qualcomm Pen, and the insane evolution of 3D printing, the world of storytelling has more than enough tools to evolve. Rethinking how stories are told and experienced, we can add the z-axis and allow the audience to lie in the middle of the narrative and explore their new surroundings. We see lite versions of this with presentation tools like Prezi, which aim to break the flat mold of typical presentation styles. Occasionally, we get a glimpse of the z-axis fully integrated into the story, as with a recent DARPA project, where the experience is “…like swimming in the internet.”

The Internet of Things has freed us from the rectangle, allowing the user interface (i.e. the story experience) to move beyond the digital screens of our phones, tablets, laptops, etc. Likewise, three-dimensional technology and experiences hold the promise to free us from being passive viewers – allowing the user interface to move beyond the flat surface or single direction of dialogue most stories are told with today.

But there are hurdles...
There are two reasons why shifts in how we engage or consume are slowed down or fail completely.
  1. The majority of us do what we know has worked. 
  2. We often build, or use the things we build, in the way we historically used the previous version of them. 
I call this the human challenge in technology; we often use the new in the same way we used the old. Internet advertisers destroyed the early potential of online media by failing to see the context of the audience. Mobile phones were simply smaller versions of the land-line phone for over a decade – now mobile calling takes up less than 30% of mobile phone time.
Taking full advantage of the ability to tell a story across all three dimensions requires us to avoid these two traps, and approach the experience through a new lens – involving context, constraints, and cooperation. BUT that is an entirely different blog post; one I have yet to write.

Tuesday, May 13, 2014

Learning to suck at creativity

Somehow, somewhere along the way, the line between creativity and technology became blurred. Same goes for innovation; a word often used but rarely in the absence of technology.

We've learned to suck at creativity by masking it in technology or placing it in the shadow of innovation. There is nothing worse than technology that lacks creativity.

It seems to happen systematically. The reason may be that we often place creativity on a pedestal. Rather than reach for the pedestal, we conform to the present (which is currently technology/innovation). It's easier than lowering the pedestal. Up there, creativity is an isolated activity owned by someone with creative credibility (I'm talking broadly and not judging... on one side of the scale, they have a knack for coming up with ideas; on the other side, they have a wall of awards made of crystal or earth metals). In this scenario, creativity can't be easily influenced or redefined. Collaboration ends up being a word used by people on the pedestal looking to gain support from those who are not. And often, the collaboration comes as requests for technology or innovation. It is here we see creative suckitude fuse into our unfortunately acceptable definition of creativity.

Creativity should be rooted in simplicity and improvement. This is something everyone is capable of but rarely given permission to explore. Finding the insight that connects consumers to a product, cutting down lines of code to increase an app's efficiency, modifying company HR policies to encourage one activity (and ultimately discourage another), or simply allowing conversation to flow among people uninhibited - these are all examples of creativity at work.

Learning to unsuck just requires the discipline of defining the creative request to the person you're engaging with. Lists aren't creative, but they are focused. Here's one on how to turn down the suck in the creativity of your workplace or community:
  1. Don't ask the same questions at the beginning of every creative process. This is habit, and habit kills creativity. A starting point is to avoid the words "engage," "innovate," "connect," "brainstorm," "blue sky," "viral," "social," "insight" - make a creative request...creative.
  2. Never ask a bull for milk. Focus the task on the person's expertise - if they know money, focus it on savings/profits; if they know software, focus it on UX or efficiency; if they know design, focus it on...you get the picture. Expand the notion of creativity by framing and priming participants with a set of tasks that fits their ability to add to the discussion.
  3. Favor diversity over support. Diversity of thought inspires great ideas. If you're in a creative discussion, or a meeting, or a brainstorm, and you feel comfortable with everything you hear - then you've missed the creative bus. Same goes for simply giving time to ideas you like - you've missed the point. Which brings us to point #4.
  4. Check yourself first. Creativity often sucks because we often come up with the creative framework/solution before we ask for help. We look to support our cause and build a brainstorm around it. Hence why leaders of creativity often end up with unstable egos or unchecked biases - they are constantly supported in every collaboration. Check yourself should be step one, but it took me four steps to realize I didn't do it before writing this.
All that said, the point isn't to tear down creative bureaucracy. People need to make decisions, and decisions don't involve everyone all the time. Rather, the point is to foster creative diversity at the beginning - which creates creative health...and actually leads to true innovation.

Friday, May 9, 2014

3 Tips for Creative Storytelling with Technology

Conversations around innovation are often rooted in limitations. A creative team would like to know what is not possible, a brand wants to know the possible risks, etc. Too often, the focus on what is not possible distracts everyone from what is possible.

Storytelling in action in a galaxy far, far away.
The point of technology is to assist a user experience. And the beautiful thing about a user experience is that it is always wrapped in a story. Rather than look at a technology as a set of barriers that a creative idea needs to fit within, shift the perspective: start with a story, with technology in a supporting role.

I would argue that the current state of digital provides limitless creative applications. The barrier lies in the stories - not in the technology.

How do we craft better stories with technology?
  1. Focus on people. Start with the user experience, or if we want to sound technical, the Human Interface. All stories have an arc and motivate reaction. Ignore technology entirely at this point.
  2. Create Momentum. With the human story in place, look for points where the imagination will begin to run. If the story arc guides the imagination, find the points that have an opportunity to be reinforced. Perhaps the imagination stalls during details of the story, or perhaps the user needs to be primed so the story has the greatest effect. What tangible action can ignite the imagination? What literal elements can be removed to leave room for the mind to play? This will lead you to explore the medium of the storytelling experience, and technology will naturally seep into consideration. Avoid crafting to what you know or to a technology that seems an obvious fit.
  3. Use technology as a mirror or support (not both). Begin to weave the use of technology on the same path as what was identified to create momentum; using it as a tool to mirror the momentum of the story and augment the imagination, or to support the story and create momentum where the human experience lacks. If you find that technology is mirroring and supporting the story, you're likely forcing things together. What's the point of a story if it needs technology? Simplify and always fall back to the human experience.
Helping people tell their crafted story with the help of technology is often how I spend my day. I have yet to become an expert, but I have learned a few lessons the hard way. Creative technology without a story is like a creative tactic without a strategy - without the story (or strategy, if you prefer) it is clutter.

Wednesday, May 7, 2014

Concierge Minimum Viable Product

A phrase often heard in our boardroom is: "Where technology falls short, the creative experience does the rest" - meaning, technology can do a ton of heavy lifting, but if it becomes too complicated we can simply let people fill in the gap by guiding them creatively (storytelling with technology...another post entirely).

I was recently reminded of an amazing articulation of this idea by Eric Ries: the Concierge Minimum Viable Product. If an MVP is the minimum set of features a customer will pay for, a Concierge MVP takes this a step further by offering those features even if the technology that would enable them is not yet ready (it's not solely technology-based, but I'll stick with the digital interpretation).

An example?
Shipping. If you don't currently offer integrated and automated shipping to local online customers, you can still offer a Concierge MVP by doing the shipping/ordering/etc. manually. When revenue reaches a level that allows this to become automated, you invest. It's all about staying lean.
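The shipping example can be sketched in a few lines of code. This is just an illustration, not anyone's actual system - all the names (`automated_shipping_available`, `ship_order`, `manual_queue`) are hypothetical. The point is the shape of the pattern: try the automated path, and when the technology isn't built yet, hand the work to a human.

```python
# A minimal "Concierge MVP" fallback sketch (all names hypothetical).
# The automated integration doesn't exist yet, so orders are queued
# for a person to fulfil by hand until investing in automation pays off.

manual_queue = []  # orders a human will ship manually


def automated_shipping_available():
    # Assumption for this sketch: the integration hasn't been built yet.
    return False


def ship_order(order):
    """Ship via automation if possible; otherwise fall back to a human."""
    if automated_shipping_available():
        return "automated"      # the eventual, invested-in path
    manual_queue.append(order)  # the concierge path: a person does it
    return "concierge"


result = ship_order({"id": 1, "address": "123 Main St"})
```

The customer sees the same feature either way; only the fulfilment path changes, which is what keeps the approach lean.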

A shift from financial limits to technology limits
I love the idea of a Concierge MVP when it comes to the Internet of Things. If the idea of a MVP is an approach to market entry when finances are limited, think of a Concierge MVP as an approach to market entry when technology is limited.

From an IoT perspective, allowing the human experience to connect technologies creates infinite possibilities for bringing the (currently) impossible to life.
  • We are years away from automated taxis picking us up, but Uber does a great job of providing a Concierge MVP - an automated transportation network delivered by humans. 
  • Sticking with cars - products like Automate deliver a ton of value by providing sensor data to drivers - but the driver still needs to monitor that data and alter their driving behavior. 
  • Look to fitness, shipping, and healthcare and you'll see dozens of examples - apps or shoes that track footsteps (they inform users, but require a user to act), sensors that track the temperature of a shipping container (they inform users, but require a user to adjust the temperature), or patient monitoring (sensors capture activity data and health indicators, but a doctor still delivers the care).
  • Augmented Reality is no different. Ideally, technology would be smart enough to understand the context around the user - but right now those tech features are limited - so the mobile experience leans on the customer to fill in where technology is absent.
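The examples above share one pattern: the sensor only informs, and a human still acts. A minimal sketch of that human-in-the-loop shape, using the shipping-container temperature case (the range, function names, and message are all invented for illustration):

```python
# Human-in-the-loop monitoring sketch (hypothetical names and limits).
# The sensor detects the problem; a person adjusts the temperature.

SAFE_RANGE = (2.0, 8.0)  # assumed cold-chain limits, in Celsius


def check_reading(temp_c, notify):
    """Return True if the reading is safe; otherwise alert a human."""
    low, high = SAFE_RANGE
    if temp_c < low or temp_c > high:
        notify(f"Container at {temp_c}C - manual adjustment needed")
        return False
    return True


alerts = []
ok_hot = check_reading(10.5, alerts.append)  # out of range: human notified
ok_mid = check_reading(5.0, alerts.append)   # in range: no alert
```

Swapping the `notify` callback for an actuator call is the eventual automated version; until then, the person closing the loop is the Concierge MVP.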
The underlying theme is that some of the greatest innovations for the IoT - connecting the digital and physical space - have already started. The Concierge MVP version is likely in-market, and perhaps the idea you're waiting on technology to catch up to is already possible with a bit of human interaction.

Thursday, May 1, 2014

The end of an era: The Prehistoric Internet of Things

The buzz around the Internet of Things (IoT) has people's imaginations soaring with driver-less vehicles and fridges that keep track of your food. However, for many people, interaction with the IoT is not that far away. It's likely in their pocket.

Source: The Telegraph.
A less obvious form of the IoT is the smartphone. Once simply an object with one purpose (the phone network), little data collected (no sensors or intelligence), and very little UI, the phone in your pocket has become a central data collector (for you and others), is loaded with automated sensors (gyroscopes, GPS, light, audio recognition, etc.), and has a UI that accommodates all of it. In terms of being a 'smart' object that connects the physical and digital worlds - automating activities and processes - our mobile phone could be thought of as the first mass consumer product in the IoT space.

Even Augmented Reality, an innovation that first took the mass market by storm in 2008, is just a simple form of the IoT. By nature it allows you to unlock digital content or connect to experiences by scanning or tracking the physical world. It takes the mobile experience a step beyond the rectangular screen - allowing the UX to flow between a mobile/digital interaction and a physical object or location. As more physical objects become networked and equipped with sensors, the words 'Augmented Reality' will disappear, as these physical/digital handoffs become a regular aspect of everyday life.

With first-generation wearable glasses now available to the masses, networked home heating and monitoring systems falling more into CPG than luxury goods, and robot-assisted professions improving care, we are living at the end of a prehistoric period of the Internet of Things. A period that began in the 90s, when the folks at MIT imagined a world connected by RFID, is now ending as connected devices exceed the human population. The speed of this shift is where the real fun begins - we never know how behaviors will change with new technology until people actually use it.

Wednesday, April 23, 2014

Has the wearable bubble burst?

A few days ago Nike put the final nail in the coffin of its FuelBand wearable technology, laying off the majority of the digital hardware team and announcing it is exiting the wearable-hardware business.

This follows recent news reporting that one-third of American consumers who have owned a wearable product stopped using it within six months. In a potential U.S. market of approximately 45 million adults who are regular gym members, this could read as a clear signal of an over-inflated 'need' in the market, or as a sign that the world is not yet prepared to enter the age of wearable tech.

In this case, neither of those ideas is likely the reason, and neither paints the entire picture. There are some recurring principles to be observed that extend beyond the FuelBand. Here are a few:
  1. Tech moves faster than any single company can adapt. At my organization, on a typical project our technical capabilities and options refresh every 45-60 days. Meaning, how we plan to build something is usually replaced by a better way to build it before the project has been completed. This is why we have remained hardware- and software-agnostic - we build on the latest and adapt/evolve during the process. This is echoed with Nike, which rapidly released several hardware and software versions for the FuelBand, as well as Samsung's Galaxy Gear, which seems to release a new hardware option every few months; bringing me to point number two...
  2. Beware the hardware demand vacuum. With the release of new versions of wearable tech, the speed of release is rarely an attempt to meet increasing demand. It is an attempt to improve the product. The result creates a false demand within the existing consumer base. This symptom has a kind of half-life, as new iterations decrease novelty and cause demand in the category to collapse - creating a vacuum within the existing market. Even though wearable technology is still in its infancy, this generation of consumers largely expects the product to behave and evolve like a mature product.
  3. Perpetual infancy of technology. In the case of mobile, wearables, and the human interface (the Internet of Things), the industry will likely remain in a state of infancy longer than consumers are used to experiencing. Evolution in hardware and software will feel more like revolution as the platforms and delivery of these experiences change at an increasing rate. It is no wonder Google Glass continues down the Explorer path, as the yet-to-be-market-ready product has gone through nine software updates since its release just over twelve months ago. For Nike, a business built on apparel and shoes, we'll see its focus on software continue to support digital sports - a market it also created. Its return to hardware will likely be delayed until sensors and processors reach a size that allows for true wearable integration - within fabric or within the environment around us. Samsung will follow the path that helped it succeed in mobile - it will try, fail, try, fail, etc., until it launches the wearable equivalent of the Galaxy, the mobile device that finally succeeded for it. Dozens of other players will continue to enter the market and keep everyone honest. The wild west will expand.
It is a seriously exciting era in technology, as we all figure out how to move beyond the rectangular limits of our smartphones, computer screens, and televisions. The promise of interaction with the world around us - pushing us back into our communities and physical interactions - will continue to fuel the demand for seamless hardware and software.