Recently, I spoke with a large technology company and asked whether their human-centered design practice prevents experience bias. The short answer? No.
When we talk about experience bias, we are not referring to our cognitive biases. We are referring to bias at the digital interface layer (design and content). Most apps and websites you use are designed around the perceptions of their creators, or for a few high-value users. The result is bias against users who are not equipped to understand design conventions or who have limited technical access.
One solution is a shift in mentality: organizations create multiple versions of the same design or experience to meet different user needs.
To return to the tech company I was speaking with: it is essential that any company invest in empathetic design. However, as someone who has built and run design functions, I know there are some dirty details we must be aware of.
First, UX and design teams are often instructed by business functions to target very specific users. Experience bias begins there. A design team will not have the budget or permission to create experiences for a user the business does not prioritize. Even when a company practices human-centered design, or employs design thinking to create experiences for users, it is often pursuing only the user profiles aligned with its commercial interests.
Another dirty secret is that human-centered design assumes humans are the ones crafting the UX, interfaces, and services. This handcrafted UI model is not the best answer to experience bias, because it does not account for the limited diversity of the teams creating it. To achieve experience equity, organizations need to prioritize a variety of experiences based on users' needs, either by fundamentally changing design processes or by leveraging machine learning and the automated creation of digital experiences.
How to identify and correct experience bias
Understanding how to identify experience bias is key to addressing it. These questions can help you identify the source of experience bias in your digital experiences.
Language and content: Does it make sense to you?
Many applications require technical understanding, use jargon specific to a company or industry, or assume technical knowledge.
Any website offering financial or insurance services assumes you are familiar with its terminology, industry, and nomenclature. In the old days, a banker or agent translated for you; if digital experiences are to replace them, the experiences themselves must do the translating.
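As an illustration of that translation layer, here is a minimal sketch in Python. The glossary terms and plain-language glosses are hypothetical examples of mine, not drawn from any real product:

```python
import re

# Hypothetical plain-language glossary; the terms and definitions
# below are illustrative, not taken from any real financial product.
GLOSSARY = {
    "APR": "yearly cost of borrowing",
    "deductible": "amount you pay before coverage starts",
    "premium": "regular payment for your policy",
}

def translate_jargon(text: str) -> str:
    """Annotate known jargon with a plain-language gloss in parentheses."""
    for term, gloss in GLOSSARY.items():
        # Whole-word match so acronyms like "APR" stay intact.
        pattern = r"\b" + re.escape(term) + r"\b"
        text = re.sub(pattern, f"{term} ({gloss})", text)
    return text

print(translate_jargon("Your premium depends on your deductible and APR."))
```

A real implementation would need per-locale glossaries and human review, but even this small substitution step moves the burden of translation off the user.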
Complexity of the interface: Based on my capabilities, does it make sense to me?
If I have a disability, can I use assistive technology to navigate? Do I have to learn the UI? The same user may need to navigate an interface differently depending on ability and context.
Design for the elderly, for example, would prioritize more text over subtle visual cues, while younger people prefer color-coding and established design conventions. Consider the horrible COVID-19 vaccination websites: information was hard to find and navigation was confusing. Startups once had simple UIs, but feature upon feature has made them complicated even for experienced users. Just look at Instagram's evolution over the past five years.
Ecosystem complexity: Do you place responsibility on users to seamlessly navigate through multiple experiences?
Our digital lives don't revolve around one app or site; we use a variety of tools to do everything online. Yet virtually every digital company or product team wants users locked into its own virtual world, and rarely thinks about the other tools users might rely on for what they are trying to achieve.
If I get sick, I might need to contact insurance companies, doctors, hospitals, and banks. As a college student, I may need to interact with several systems at school, plus vendors, housing, banks, and other related institutions. When users struggle to stitch these different experiences together, the blame always falls on them.
Inherited bias: Do you use systems that generate content, design patterns created for a different purpose, or machine learning to personalize your experiences?
How can you make sure these methods create the best experiences for your users? When you leverage other systems, you inherit any biases in the content, UI, and code derived from them. Look at the many AI content and copy-generation tools now available: if those systems generate copy for your site, you also inherit their biases.
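One way to catch inherited problems before they ship is to audit generated copy automatically. The sketch below uses a crude heuristic (sentence length plus a jargon list); the word list and threshold are illustrative assumptions, not a real readability model:

```python
import re

# Illustrative jargon list; a real audit would use a maintained,
# domain-specific list and a proper readability metric.
JARGON = {"synergy", "leverage", "utilize", "paradigm"}

def audit_copy(text: str, max_words_per_sentence: int = 20) -> list:
    """Return warnings for copy that may exclude users.
    A rough heuristic sketch, not a production readability model."""
    warnings = []
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    for s in sentences:
        words = s.split()
        if len(words) > max_words_per_sentence:
            warnings.append(f"Long sentence ({len(words)} words): {s.strip()[:40]}")
        found = JARGON.intersection(w.lower().strip(",;") for w in words)
        for term in sorted(found):
            warnings.append(f"Jargon term '{term}' in: {s.strip()[:40]}")
    return warnings

for w in audit_copy("We utilize synergy to deliver value."):
    print(w)
```

Running a check like this over every batch of generated copy makes inherited bias visible as a reviewable list, rather than something discovered by confused users.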
Building more inclusive and equitable experience ecosystems will require new design and organizational processes. AI tools that generate more personalized digital experiences will be a major part of front-end design and content approaches in the coming years. However, there are five steps every organization can take immediately:
Make digital equity part of the DEI agenda. While many companies have diversity, equity, and inclusion goals, those goals rarely translate into the digital products they ship to customers. I have worked as a designer in large corporations and in digital startups, and the problem is exactly the same: there is no clear accountability for diverse users within the company.
At large and small businesses alike, departments compete for impact and proximity to customers. For digital products and experiences, the first step is to prioritize diverse users at the business level. With a directive from the highest levels, each department can determine how it meets those goals.
Without funding, support, and management backing, no product or design team can have an impact. The C-suite must be accountable for making sure this happens.
Make diversity a priority in design and development teams. Much has been written about this, but it is vital to stress that teams lacking diversity will create experiences based on their own privilege and abilities.
It is important to hire people with experience designing for different kinds of users. What are you doing to improve the design and development processes in your company? Who are you partnering with to source diverse talent? Are your DEI goals just a checklist of boxes that gets set aside when you hire the designer you already had in mind? Do your agencies implement clear, proactive diversity programs? Are they knowledgeable in inclusive design?
Google has taken a few outstanding steps to increase representation in the talent pool. It has moved funding for machine learning courses away from predominantly white schools to more inclusive schools, provided free access to TensorFlow courses, and sent tickets for BIPOC community members to attend Google I/O.
Rethink what, and with whom, you test. Too often, user testing is restricted to the most important or profitable user segments. How does your site work for an older population? Or for younger users who don't use desktop computers at all?
Experience equity and equality are both about testing multiple experiences. Too often, designers test one design and then tweak it based on user feedback. Although it is more work, you can design variations that consider the needs of older users and people from different cultural backgrounds, which lets you connect designs to digital equity goals.
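Testing several variants across several segments can be operationalized with deterministic bucketing, so every segment (not just the "core" one) sees every variant. A sketch, with hypothetical variant and segment names:

```python
import hashlib

# Hypothetical design variants under test.
VARIANTS = ["standard", "large-text", "simplified"]

def assign_variant(user_id: str, segment: str) -> str:
    """Deterministically assign a user to a design variant.
    Hashing (segment, user_id) spreads each segment across all
    variants, so each variant gets tested with each segment."""
    digest = hashlib.sha256(f"{segment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("user-42", "age-65-plus"))
```

Because the assignment is a pure function of user and segment, the same person always sees the same variant across sessions, and per-segment results can be compared without maintaining assignment tables.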
Change your design goal from launching one version of the experience to launching multiple versions. Common practice in digital design and product development is to make a single version of every experience, based on the most important users. A future where every app and site ships multiple versions tailored to different users is a radical departure from how most design agencies are resourced.
This shift is crucial to achieving experience equity. Ask yourself simple questions: Is your website, app, or product designed for older users? Does it have a simpler, more accessible version? If you are designing for low-income households, can mobile-only users complete the same tasks as those on desktops?
This is about more than having a responsive website or testing different designs to find the single best one. The goal for design teams is to launch multiple targeted experiences that link back to prioritized, diverse, and underserved customers.
Automate the creation of different content and copy for each user group. Even where we test design variations with many users, I've seen content and UI copy treated as an afterthought. As organizations grow, copy becomes either more jargon-filled or so polished that it is meaningless.
When we take existing copy (e.g., marketing copy) and drop it into an app, how might we be limiting people's understanding of the tool or how to use it? Understanding where automation can be used is a smart way to reduce experience bias.
There is an explosion of AI tools poised to revolutionize the way content and UI are created. While primarily aimed at content creators, many copy-driven AI tools have appeared in the past year, and it is not difficult to imagine a customized deployment of such a tool within a large brand: it would take users' data and generate UI copy and content for them. Older users might get more textual descriptions of services and products with less jargon; Gen Z users might get more referential copy paired with more imagery.
No-code platforms offer a similar opportunity. Everything from WebFlow to Thunkable demonstrates the possibility of dynamically generated UI. And although Canva designs can feel generic at times, thousands of businesses use the tool to create visual content rather than hiring designers.
Many companies use Adobe Experience Cloud but seem to ignore the automation features it contains. The role of design will shift from creating bespoke experiences to curating dynamically generated UI. Just look at the evolution of animation in film over the past two decades.
Machine learning and AI are the future of design variation
These steps are meant to change how organizations deal with experience bias using current technology, but AI tools will play a crucial role in its future state. There are already many AI-driven content tools, such as Jarvis.ai and Copy.ai, as well as platforms like Figma and Adobe XD that have automation tools.
Machine learning and AI technology that generates front-end content and design is still in its infancy. However, there are some interesting examples that show what's possible.
The first is Google's Material You design system for Android devices, released earlier this year. It is highly customizable and strongly accessibility-focused: the user can change color, type, and layout. But it also includes machine learning capabilities that could alter the design based on variables like location or time of day.
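To make the idea concrete, here is a sketch of design parameters chosen from context variables like time of day and an accessibility preference. This is not Material You's actual API; the field names and thresholds are my own illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Theme:
    background: str   # background color as a hex string
    text_scale: float # multiplier applied to base font size
    contrast: str     # "normal" or "high"

def pick_theme(hour: int, prefers_high_contrast: bool = False) -> Theme:
    """Pick theme parameters from context variables.
    NOT Material You's API -- just an illustration of design
    being decided by variables like time of day."""
    dark = hour >= 20 or hour < 7  # night hours get a dark background
    return Theme(
        background="#121212" if dark else "#FFFFFF",
        text_scale=1.25 if prefers_high_contrast else 1.0,
        contrast="high" if prefers_high_contrast else "normal",
    )

print(pick_theme(22))
print(pick_theme(9, prefers_high_contrast=True))
```

The point is that once theme parameters are computed rather than hand-picked, the same mechanism that adapts to time of day can adapt to ability, age, or context.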
Although the personalization features are initially marketed as giving users more control over their own customizations, a closer look at Material You reveals many possible intersections with automation at the design level.
It is also worth highlighting the work organizations have done on interaction design principles for how people experience AI. Microsoft's Human AI eXperience program, for example, covers a core set of interaction principles and design patterns that can help in crafting AI-driven experiences, along with an upcoming playbook for anticipating and designing for human-AI interaction failures.
These examples point toward an AI-generated future, but real-life examples of AI-generated designs and interactions are still rare. To reduce bias, we must evolve to a point where front-end design can be personalized and vary dramatically, which speaks to the emerging trends in AI and design.
This convergence of new technologies and design practices will allow organizations to dramatically change how they design for their customers. But if we don't start examining experience bias now, as front-end automation becomes mainstream, we won't be able to address it.