
How to design for the user…and the bot

The rise of artificial intelligence (AI) continues to drive change in how enterprises innovate and communicate their business. But for the humans and consumers interfacing with products affected by this technology, what will an AI-empowered future look like? What will it feel like? Empowered by rapidly advancing machine intelligence, product designers are rethinking the fundamental principles that guide the way they work.

Faced with an evolving and exciting set of problems, product designers are asking a new set of questions to reinvent the model of human-computer interaction. How do we empower people rather than overwhelm or terrify them? How do we help people grapple with intelligences that will inevitably exceed their own? How do we think about user experience (UX) design when it is no longer aimed at helping people understand machines, but rather at machines that will understand human beings and communicate accordingly?

Here are five considerations I’ve discovered thus far.

1. Design for two

We are no longer designing for just the user. We’ve traditionally designed interfaces solely through the lens of the user’s needs — centered deeply on understanding the user’s goals, journeys, or stories. While those needs will remain core to the process, we have to consider AI as the second agent.

We’re in the nascent era of designing for interactive conversations between two intelligent agents. While some of these designs are voice-based, we are still focused on creating windows, not screens, for relationships across interactions. Successfully designing for these interactive relationships requires an additional consideration of the machine’s goals, the machine’s journey, and the machine’s needs in any given context.

2. Understand the stakes

Fruitful conversations rely on trust and respect. A human user’s comfort or discomfort relying on AI will be directly impacted by how well they understand what the machine is up to (complexity) and how much it will impact their lives (importance). To put it plainly, you might be less concerned about taking product recommendations directly from an AI system than you would be about taking medical advice. The more complex and important a set of interactions is, the more the user has to both respect the machine’s competence at coming to a solution and trust that it has the user’s best interests in its (digital) heart.

Designers can approach these problems in a host of ways. For example, a designer can focus on explanations in human terms, provide constant transparency about information gaps, or completely reimagine the approach to error messaging and guidance (my team has taken to calling this “error-driven design”). Most importantly, a designer should first assess an interaction by how well each agent understands the other and how much of an impact the interaction will have.
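To make that assessment concrete, here's a minimal sketch of the idea: pair the machine's task complexity with the interaction's importance to the user, and scale the amount of explanation the interface surfaces accordingly. The function name, scores, and thresholds below are all hypothetical, not part of any established framework.

```python
def explanation_level(complexity: float, importance: float) -> str:
    """Map complexity and importance (each 0.0-1.0) to a UX explanation tier.

    High-stakes interactions (complex AND important) warrant full
    transparency; low-stakes ones can stay out of the user's way.
    """
    stakes = complexity * importance
    if stakes >= 0.5:
        return "full-transparency"  # e.g., medical advice: show reasoning and confidence
    if stakes >= 0.2:
        return "on-demand-detail"   # e.g., financial nudge: explanation one tap away
    return "minimal"                # e.g., product recommendation: just show the result

# A product recommendation is simple and low-impact...
assert explanation_level(0.3, 0.2) == "minimal"
# ...while AI-assisted medical advice demands full transparency.
assert explanation_level(0.9, 0.9) == "full-transparency"
```

The point isn't the arithmetic; it's that the explanation budget should be a deliberate design decision made per interaction, not a global setting.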

3. Design like you talk

A key part of winning trust and respect, and facilitating a productive conversation, is rooted in a slight riff on the old adage about good writing: “Write like you talk.” Since the advent of the PC, designers have been tasked with helping humans think like machines — first at the command line, and later through spreadsheets, database inputs, and form fields. AI-empowered interactions change our goal. We are now designing to help machines meet people where they are. In other words, we get to let people be people again.

Consider Facebook. Recall the site’s simplicity in 2004 versus its ecosystem in 2017. In the early years, the UX was entirely centered on influencing people to organize their lives into SQL-friendly chunks and graphable tags. Fast forward a decade and Facebook is hard at work on Messenger integrations that learn about you just by absorbing your chat threads. Facebook’s designers are saying, “You be you, we’ll figure it out.”

4. Leverage active “listening” (in moderation)

Facebook’s latest moves highlight another important technological advancement for designers: AI can listen. Computers used to wait for input, but today machines collect information through our online exchanges, interactions, and communications (and, thanks to the rise of mobile and IoT devices, the “online” component of our daily lives is nearing ubiquity). Through each new app, new device, and new interface, the collective machine is capable of learning more about us every day. As designers, we have both a huge opportunity and an immense responsibility.

On the opportunity side, AI can now be an active “listener” in any given interaction. For example, it can gather traffic data by tracking your motion during use of a map app, optimize a coaching app by getting a sense of your activity habits through tracking your heart rate, or prioritize possible matches in a dating queue based not on what you say you like but on whose profiles you view and interact with most — this is helpful active listening. On the responsibility side, we have to find the line between helpful listening and nefarious eavesdropping that violates privacy rights (anyone remember the Samsung Smart TV episode?).
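The dating-queue example above can be sketched in a few lines: rank profiles by what the user does (implicit signals) rather than what they say. Everything here — the event shape, action names, and weights — is a hypothetical illustration, not a real matching algorithm.

```python
from collections import Counter

# Viewing, liking, and messaging signal increasing interest,
# so they carry increasing (hypothetical) weights.
ACTION_WEIGHTS = {"view": 1, "like": 3, "message": 5}

def rank_matches(events):
    """Rank profile IDs by accumulated implicit-interest score.

    events: iterable of (profile_id, action) pairs gathered by
    passively "listening" to the user's interactions.
    """
    scores = Counter()
    for profile_id, action in events:
        scores[profile_id] += ACTION_WEIGHTS.get(action, 0)
    return [pid for pid, _ in scores.most_common()]

events = [
    ("alex", "view"), ("alex", "view"), ("alex", "like"),  # 5 points
    ("sam", "view"),                                        # 1 point
    ("jo", "view"), ("jo", "message"),                      # 6 points
]
assert rank_matches(events) == ["jo", "alex", "sam"]
```

Even a toy like this makes the responsibility side obvious: the same event stream that powers helpful ranking is, unfiltered, a surveillance log — which is exactly where the design line has to be drawn.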

5. Convince is the new convert

Despite the machine’s ability to listen and generate information passively in some contexts, the user is still called on to take action during many interactions. Depending on the stakes, that required action might be stressful or go against gut instinct (say, decisions about medical treatment or financial planning). As AI becomes part of high-stakes interactions, designers can look to a cousin of the age-old “conversion” flow for a framework; in other words, convince is the new convert. In a wide variety of realms ripe for AI involvement, machines (and their designers) will find themselves in the position of persuading the user to accept a logical conclusion. It’s a process that will require systems to be imbued with an “understanding” of human emotion, bias, and logical fallacies. After all, human conversation isn’t just about knowledge transfer. It’s about context, mutual understanding, nuance, and trust. Machines will need to understand us if we’re to understand and believe in them.

Designers have their work cut out for them at the edge of the Fourth Industrial Revolution, but the work is amazing and powerful, when you consider the implications. Take a moment and think about this: As humans, our senses define our understanding of the world — our reality is what we see, smell, feel, hear, and touch. If designed successfully, interactions with AI can augment human perception and knowledge, widening the window into a universe of information that humans have yet to fathom. But, of course, windows are two-way. As we find ourselves the creators of an increasingly rich digital primordial soup — through which Unix time counts back to a second big bang — we have a related consideration: We are beginning to create the senses by which intelligent machines will know us and are defining how a nascent intelligence will come to understand our universe in the future.

It’s a pretty amazing time to be a designer.

Andrew Paley is the Director of Product Design at Narrative Science, a company that makes advanced natural language generation (Advanced NLG) for the enterprise.

Above: The Machine Intelligence Landscape, featuring 288 companies. This article is part of our Artificial Intelligence series.



UK drone rules will require you to take safety tests

US officials might be easing up on drone regulations, but their UK counterparts are pushing forward. The British government has instituted rules that require you to not only register any robotic aircraft weighing over 250g (0.55lbs), but to take a "safety awareness" test to prove you understand the drone code. Regulators hope that this will lead to fewer drones flying over airports and otherwise causing havoc in British skies. Not that they're taking any chances -- the UK is also planning wider use of geofencing to prevent drones from flying into dangerous airspace.

The new rules come following a study highlighting the dangers of wayward drones. A smaller drone isn't necessarily safer than its larger alternatives, for example -- many of those more compact models have exposed rotors that can do a lot of damage. A drone weighing around 400g (0.88lb) can crack the windscreen of a helicopter, while only the heaviest drones are likely to crack the windscreen of an airliner, and then only at speeds you'd expect beyond the airport. While you might not cause as much chaos as some have feared, you could still create a disaster with a compact drone.

It's nothing new to register drones, of course, and it doesn't appear to have dampened enthusiasm in the US. The test adds a wrinkle, though: how willing are you to buy a drone if you know you'll have to take a quiz? The test likely won't slow sales too much, if at all, but it could give people one more reason to pause before buying a drone on impulse. Manufacturers appear to be in favor of the new rulebook, at any rate -- DJI tells the BBC that the UK is striving for a "reasonable" solution that balances safety with a recognition of the advantages that drones can bring to public life.

Source: Gov.uk (1), (2)
