As the average age of our client base skews younger and interest in technology and data skyrockets, creating an analytics experience for mobile was a no-brainer. Clients were asking for it, and the time was right for the engineering and design teams to show the organization what we could do.
Many design processes begin by defining a specific, pointed question. As this was one of the first design-driven products for The Orchard, there was quite a bit of groundwork we needed to do to discover the question we were solving. We began by speaking with stakeholders to identify customers who were believed to be our target demographic.
Also, as this was one of the first times the design team was connecting with clients directly, we needed to do some digging to understand our users' current usage patterns. These insights allowed us to tailor our questions for the upcoming interviews. Generally, we found two types of usage patterns:
1. Those who used analytics weekly to look at specific data
2. Those who looked quarterly or yearly and only looked at overall trends
Interview scripts were carefully crafted to match each kind of usage. We knew going in how often participants used the system, what kinds of information they were looking for, and the areas where we needed to dig a little deeper.
The participants in the study also represented our core client base, ranging from very small businesses to multi-million-dollar ones. The interviews were conducted mostly by phone, with people from around the world, including England, Australia, Germany, California, and Michigan. We recorded the interviews and made transcripts. It was a lot of work, but eventually we started to see trends emerging. After compiling this information, we moved on to the interface phase of the project.
The onboarding designs became a very important part of the experience, because this would be our first mobile product and also the first product to introduce users to OAuth. After users logged in for the first time, their credentials would be stored and would keep them logged in for a month. Take a look at some of the iterations:
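The month-long session described above can be sketched roughly as follows. This is a minimal TypeScript illustration under stated assumptions: the `StoredSession` shape, the names, and the exact 30-day window are hypothetical, not The Orchard's actual implementation, and a real app would keep the token in the platform's secure storage.

```typescript
// Illustrative sketch: persist an OAuth access token after first login,
// then reuse it on later launches until it is about a month old.
// The structure and field names here are assumptions for demonstration.

const SESSION_LIFETIME_MS = 30 * 24 * 60 * 60 * 1000; // ~one month

interface StoredSession {
  accessToken: string; // token returned by the OAuth flow
  savedAt: number;     // epoch ms when the token was stored
}

// Save the token the first time the user logs in.
function storeSession(accessToken: string, now: number = Date.now()): StoredSession {
  return { accessToken, savedAt: now };
}

// On later launches, reuse the session if it is under a month old;
// otherwise send the user back through the OAuth login flow.
function isSessionValid(session: StoredSession, now: number = Date.now()): boolean {
  return now - session.savedAt < SESSION_LIFETIME_MS;
}
```

With this check, a session saved 29 days ago is reused silently, while one saved 31 days ago sends the user back through login.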
I worked with the team on quick design explorations of the architecture and UX of the app.
The final touch was adding some interactivity to the graph:
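A common form of graph interactivity is tap-to-inspect: touching the chart highlights the data point nearest the touch. As a hedged sketch only, with an assumed `Point` shape rather than the app's real chart model, the core of that behavior is a nearest-point lookup:

```typescript
// Illustrative sketch of tap-to-inspect on a line graph: given the
// x-coordinate of a tap, find the index of the nearest data point.
// The Point shape is an assumption for demonstration purposes.

interface Point {
  x: number; // position along the time axis
  y: number; // metric value plotted at that position
}

// Return the index of the point whose x-coordinate is closest to the tap.
function nearestPointIndex(points: Point[], tapX: number): number {
  let best = 0;
  let bestDist = Infinity;
  points.forEach((p, i) => {
    const d = Math.abs(p.x - tapX);
    if (d < bestDist) {
      bestDist = d;
      best = i;
    }
  });
  return best;
}
```

The app would then render a tooltip or highlight for the returned point; that rendering detail is platform-specific and omitted here.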
We ran tests to confirm some of the assumptions we had made about how data was displayed, and got fairly clear evidence that certain designs made more sense to our group of users.
With those assumptions confirmed, we finished a fully functioning prototype and scheduled time for some of our users to come into the office and try the application on their own phones. We watched as they logged in and clicked around, and we also gave them a few simple tasks to complete. Everything worked as planned, and we prepared for launch.
We launched first to a small alpha group of users so that we could gather some initial analytics about usage. We prepared a set of instructions and let the group loose to try out the app. We also created an email alias to which they could send specific concerns.
Midway through the alpha we sent these users a survey to get a sense of how the app was performing and whether they were finding it useful. The results were so positive that we ended the alpha stage early so the entire client base could benefit from the app.
The app has been live for a few months, and we have been monitoring usage. The next step is to take the usage data and user feedback and continue the cycle: idea, build, launch, and learn.