Adaptive UX Practices
At DNA, our focus is to provide our clients with products guided by rich user insights generated through design research. This allows us to achieve balanced outcomes: solutions that are more usable and valuable, and that also meet our clients’ specific business goals. Our go-to approach is to conduct research with users in their native environments; however, sometimes this isn’t possible due to budget, timeline or access constraints. In these cases we adjust our methods and apply different approaches to work around the problem.
We’ve created some adaptive UX practices that are handy to have in our proverbial toolbox. These approaches enable us to deliver rapidly and across a range of cost bands, while still delivering value and efficacy.
Before I launch into these practices in more detail, I want to be clear that this is not ‘Lean UX’, a very hot topic in the design industry.
The term ‘Lean’ in relation to production methods has been around since the 1990s. Its core concept is that you remove waste, or unnecessary steps, from the process at hand. The term has had a resurgence in the software industry, where it complements Agile methodologies. All of our design practice fits the definition of ‘Lean’, as we continually look for ways to deliver the best value to our clients.
Adaptive UX is less about waste or cost reduction; rather, it represents the creative restructuring of a process to fit within any given constraints. Budget, timeline and scope are the three key points we need to understand to get a project initiated. Budget and timeline are often the first words uttered by clients. Within these constraints, scope is the area we must address to shape a viable outcome. That said, merely decreasing scope is a pretty blunt reaction, so we’ve developed a range of approaches that let us meet a range of challenges more flexibly. Some of these approaches are adaptations, but the suite includes a few pretty obvious things as well.
Quantity before Quality.
Sometimes the answers to UX challenges and user pain points lie in the numbers. Some prowess with Google Analytics will yield a lot of insights, especially if you set yourself up for success with it. Setting up goals in Google Analytics is fairly easy: you follow the same path to a goal that a user would take. If a user might complete a goal from multiple origination points, set them all up as different goals, e.g. “sign-ups through social media” vs “sign-ups through campaign X”. If you don’t have access to build your own goals, use the Reverse Goal Path report to work backwards. Spend a couple of hours poking around in Google Analytics before you commit to building a prototype or plotting out which tasks you need to test. You’ll be surprised what you can learn about user behaviour when you apply a little [digital] elbow grease.
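To make goals like these distinguishable in the first place, sign-up events can be tagged with their origination point before they’re sent to Google Analytics. The minimal sketch below assumes a page using the gtag.js tracking snippet; the `signUpOrigin` mapping and the origin labels are hypothetical examples, and `gtag` is stubbed here so the sketch is self-contained:

```javascript
// Stub for the real gtag() function provided by the gtag.js snippet;
// here it just records what would have been sent to Google Analytics.
const events = [];
function gtag(command, name, params) {
  events.push({ command, name, params });
}

// Hypothetical mapping from a referrer/landing URL to a coarse origin label.
// Real projects would key off UTM parameters or campaign configuration.
function signUpOrigin(url) {
  if (/facebook|twitter|linkedin/.test(url)) return 'social_media';
  if (/utm_campaign=campaign_x/.test(url)) return 'campaign_x';
  return 'direct';
}

// Send one sign-up event per completion, labelled by where the user came from,
// so each origination point can be set up as its own goal.
function trackSignUp(url) {
  gtag('event', 'sign_up', { method: signUpOrigin(url) });
}

trackSignUp('https://twitter.com/some_post');
trackSignUp('https://example.com/?utm_campaign=campaign_x');
console.log(events.map(e => e.params.method));
// → ['social_media', 'campaign_x']
```

With events labelled this way, “sign-ups through social media” and “sign-ups through campaign X” show up as separate streams rather than one undifferentiated count.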
Burn down the lab.
Make users feel comfortable by not showing up in a figurative lab coat. Our position is that usability testing should be enjoyable for both the participant and the facilitator. We’ve changed the way we conduct formal user research to strike a more relaxed tone by replacing the script with a discussion framework, and there’s no reason to hold that thinking back when it comes to usability testing. We’ve seen usability testing succeed in a speed-dating format. Hell, why don’t we test at the bar? Alcohol is a truth serum, right? The point here is that usability testing shouldn’t be sacred; it should be accessible and habit-forming.
One of the staples of our full-on user research is the discussion framework, which helps us ask interview questions through a more relaxed conversation, usually at the participant’s home or office. In a recent project, discussed in DNA Labs, we built a discussion framework to bring into a usability test. It wonderfully complemented the interaction tasks we had users go through and gave us extra insight into why users were doing what they were doing with the prototype. It also helped us control the conversation pace by letting us bounce between questions and tasks. The necessity for this came from a gap in user insight, plus a limited timeline and budget that kept us from using our standard process.
Pair up.
We’ve run projects through Trello and others through Jira, but even the best-laid project boards miss nuances, and sometimes it’s just easier and faster to get something done in person. One of our most successful, easy-to-implement and cost-friendly practices is pairing, where our visual or UX designer sits with the developer to hone some of the finer points of the design over the course of a couple of hours. We schedule pairing sessions with clients in the initial project planning meetings so that we get them on the project roadmap early and give the client time to arrange any security clearances and whatnot that might get in the way. If that’s not possible for one reason or another, Skype it out or have their developer come to you.
Mind the metrics.
Too often our clients are ready to move on to the next problem after a project launch, so it’s hard to have a meaningful conversation around metrics. A metrics review is part of our early project discovery work, as you read above, but we also build metrics reviews and reports into our SLAs to make sure someone’s paying attention to them after launch. This practice helps developers stay mindful of the SEO that needs to be wired into the product and keeps designers thinking about accessibility. There’s also a sustainability benefit: we can pinpoint areas that might need to be addressed before the client has to find them. Finally, metrics let us demonstrate our effectiveness and value, and the benefits of user research in general, and we constantly refer back to the core empathy work we’ve done, assessing these analytics alongside core user needs and goals.
A word of caution: when a standard process is trimmed, the input most readily ‘eliminated’ is the user feedback that helps us understand who the user is, what they feel and, most significantly, why they behave the way they do. This isn’t unimportant data; on the contrary, it helps us build a bigger and better picture of how a product is used.
When we have to draw a line, that line has to land somewhere. However, leaving this data out of our approach means that we can’t balance our view of what users are doing, nor deliver tools that are extremely valuable to our clients, like user profiles or personas. We ultimately risk understanding when, what and how people are interacting with us while the ‘why’ stays masked.
Pete Fecteau, UX Designer.