Jodi Bollaert: I'll use a quote that I hear Keith Instone over here say all the time. Thank you, Keith. Deciding which tool is best? It depends. It depends on a lot of things. It certainly depends on what you want to learn. That's probably the most important question. If you want to figure out whether people are going to click on your awesome CTA because you put it in bright orange, there are probably some analytics tools, quantitative tools, that would work for you. If you want to do qualitative research, you want to find out whether users really understand your interface, maybe you're looking at a usability testing tool. Who's your target audience? That's another question. Are they hard to recruit? Are they easy to recruit? Where are they? Those can all be factors. What's your timing and budget? That's a big factor. Do you have a lot of money? Do you have a little money? Do you have no money? That can be a determinant. What will you be testing? Is it a website? Is it a remote control? Is it an app? Those all factor into what tool you would use.

I want to talk briefly about Team Detroit UX research. We actually have a pretty large research department. It's called the Marketing Sciences Unit. The acronym is MSU, which is why this slide is green. That's the real reason why this is green. I actually do like the green color, but we do all kinds of research. My focus is usability testing, or usability research. Sometimes I broaden it to say "UX research." We also do attitudinal research; we're running attitudinal surveys all the time on our sites at Team Detroit. In case you don't know, our main client is Ford: Lincoln and everything related to that company. Analytics, that's another big department. We are tracking everything at Team Detroit on the Ford and Lincoln sites. We do a little bit of ethnographic research, a little bit of concept testing, and focus groups, though we've actually gotten away from focus groups in the last several years. We do a little bit of card sorting. The UXA team, a few of whom are here today, does some card sorting. We also do in-depth interviewing. The interviews are often combined with the usability testing.

A little bit of history at Team Detroit. I'll try to go through this quickly. I started in 2007. At that time, we were doing primarily lab-based research: moderated testing in one to two markets, and I was traveling. I was traveling to Los Angeles and Chicago and various places. That was pretty much our methodology. In 2009, we started to introduce remote usability testing, remote-moderated, so we had a facilitator. In all of these cases, we were working with third-party research companies, like TecEd for example, remote-moderated using a tool as simple as WebEx or GoToMeeting. Then, around 2010, right up until 2012 and even today (all of these methods we still use today), we were doing remote-moderated testing. But now, instead of just a few markets, we were testing nationwide. We were able to reach out to anyone, and not just through a third-party market research firm. Now we were recruiting right off ford.com, right off lincoln.com, with tools like Ethnio and WorldAPP, and really any survey tool. Any survey tool can be used to create an intercept to greet people, or at least some people, as they're arriving at your site and entice them into a usability research study. We used Ethnio and WorldAPP to create screeners and then determine the qualifications of that particular visitor to the site.
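[Editor's note on mechanics: an intercept of this kind is just a small script on the page that invites a sampled fraction of visitors to a screener. The sketch below, in TypeScript, shows the general idea under stated assumptions (a hypothetical screener URL and an arbitrary 5 percent sampling rate); it is illustrative only, not Ethnio's or WorldAPP's actual code.]

```typescript
// Minimal live-intercept sketch (illustrative only; not Ethnio/WorldAPP code).
// Assumptions: a hypothetical screener URL and a 5% sampling rate.

const INTERCEPT_RATE = 0.05; // fraction of visitors who see the invitation
const SCREENER_URL = "https://example.com/screener"; // hypothetical screener

function maybeShowIntercept(): void {
  // Greet only "some people": skip anyone already invited, sample the rest.
  if (localStorage.getItem("interceptSeen") === "1") return;
  if (Math.random() >= INTERCEPT_RATE) return;
  localStorage.setItem("interceptSeen", "1");

  // Build a simple invitation dialog.
  const popup = document.createElement("div");
  popup.setAttribute("role", "dialog");
  popup.innerHTML = `
    <p>Help us improve this site: take part in a short paid research study.</p>
    <a href="${SCREENER_URL}" target="_blank" rel="noopener">Tell me more</a>
    <button type="button">No thanks</button>`;
  popup.querySelector("button")?.addEventListener("click", () => popup.remove());
  document.body.appendChild(popup);
}

window.addEventListener("DOMContentLoaded", maybeShowIntercept);
```

The screener behind that link would then ask the qualifying questions (say, screening out competitors' employees) and route qualified visitors straight into the study.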
The challenge we face today, with all of those research methodologies I just described, is that there is now three times as much to test. Back in 2007, when I first came to Team Detroit, we were primarily testing websites. Now we're looking at the mobile and tablet experiences. For just about everything, we're creating mobile and tablet in addition to desktop. Also, our project life cycles are often short. I'm sure some of you can relate to that. Traditional usability testing typically takes, at least for us, four to six weeks to plan, execute, and report, and sometimes we just don't have that kind of time. Traditional usability research, as I've described it, also requires a substantial investment, mainly on the part of our clients. At Team Detroit, we partner with third-party research companies, companies that are focused on research, and a few internal resources, such as myself, work with those companies. There's also a perception that usability testing may slow a project down. That's probably been a longstanding issue. In other words, people are afraid that the results may be poor and that changes may be required. Egads.

Our opportunity is with web-based tools, such as the ones that I showed you earlier. They enable us to conduct research at a radically lower cost (some of them are very cheap; you won't believe how cheap), in less time, and with fewer resources. We'll get into this more. Of course, in comes do-it-yourself usability testing. This is a theme that's emerging for us this year. If I were to continue the evolution of research, I would say this year is when we're going to start doing a lot more do-it-yourself, not as a replacement for the traditional research, but as a supplement. We simply don't have more time available, and we don't necessarily have more budget available. We've got to find some other way to get all of that research in.

Do-it-yourself tools we use. I'm going to breeze through some of these; I'm quite sure some of you are going to be familiar with them. One tool we've used is the Five Second Test on UsabilityHub. It's a nice little site; they've actually got a couple of tools. Basically, this lets you find out what people recall about your design. It can be a comp, a wireframe, a logo, whatever you can upload, essentially. This tool is free, essentially, with Karma Points. If any of you are familiar with the site, you'll know what I'm talking about. Basically, if you yourself participate in a Five Second Test as a user, or in the other tests that they have, you get a Karma Point. It's like a community thing. With each Karma Point, you get a free test of your own. That's pretty cool. We actually used this on a Team Detroit project, just a really little project. It was actually something I just felt compelled to do, because I thought our site wasn't really communicating what we do. I uploaded the Team Detroit home page. This was a few years back. I had a concern: does the home page effectively communicate what we do? Simple question to people taking the test: "What is the purpose of this page?" These were the answers we got: "Well, it's about cars in Detroit." "It's about Detroit." "I don't know." This is a little word cloud that UsabilityHub put together for us. [audience laughter] I didn't send out a massive mail or share this with a ton of people, but I did send it to the creative director, a head-EVP-type person. He thought it was pretty interesting.
I didn't know what he was going to do with it, but not too much later, this is what the page looked like. We're an ad agency, so if I were to run this test now, there would be no mistaking what the site is about. That's the way the site looks today. I'd like to think that that little free Karma Point usability test actually had an effect, a positive effect.

Another tool on UsabilityHub is the Click Test. This tells you where users would click to find information. Again with Karma Points, and easy setup. Example: we wanted to know the best place to put this thing called "credit test run" on FordCredit.com, one of our other clients. Should we put it in the "finance tools" section or in "apply for credit"? We weren't really sure, so this tool helped us answer the question, "Where would you click to take a credit test run?" All six participants (we decided to just run it with six; that seemed sufficient) chose "apply for credit." When we designed the page, guess where we put "credit test run"? We are pretty confident that that's fairly intuitive. We could run more testing, but I think we'd find the same results.

WebSort, I'll just touch on this very briefly: an information architect's dream. You can do card sorting online with up to ten people for free. Again, a free tool. Of course, all of these tools have subscription pricing if you have to go beyond the free option. Again, easy setup. You can run a good card sort with ten users. That's really plenty to get a good understanding of how your audience might organize your site, assuming, of course, you have recruited your audience.

Ethnio is another tool you might be familiar with. We have gotten very familiar with Ethnio since 2010. I mentioned earlier, we're using it to recruit people right off the site. I personally have a lot of enthusiasm for Ethnio. This is what it looks like. This is an example on Ford.com. A little pop-up: "Hey, for 75 bucks, would you like to participate in our study?" We show this pop-up to some people, not to every visitor. What I love about it is we do what's called "live intercepts," which is another way of saying we pull people right into a usability study as they're arriving at the site. We determine whether they in fact meet our criteria, that they're not, say, someone from GM, and we make sure they're qualified. Then we pull them into a study immediately. What I love about this is that they're actually in the moment of shopping for a vehicle. They're in the mindset of "What is it I need?" We use this tool to do what we call "natural shopping research." In a lot of cases we spend at least 15 or 20 minutes just sitting back and observing how they use the site in the first place. There are a lot of great insights that come from that.

The next tool is UserTesting.com. Some of my teammates in the UXA department are spending time with this, and I'm not paid to talk about it; I just want to say that upfront before I get into more slides. But I love this tool. I really love this tool. It allows you to observe and hear users as they're experiencing the site. What you get is a video, 15 minutes of video. I should mention you can test desktop wired experiences as well as tablet and mobile. In addition to the 15 minutes of video watching someone use the site, you also get a written summary of likes and dislikes. Basically, you can ask whatever follow-up questions you'd like after they experience the site.
I want to show you a sample desktop test so you can get a feel for what this looks and sounds like. We were pitching some new business at Team Detroit, and we won the business, so it's OK for me to talk about this. As part of the pitch, I was asked to do a little usability research study on their existing casino websites, and we were going to incorporate that little bit of research into the pitch to hopefully win their business. That was one of my early experiences using usertesting.com in any significant amount. We actually ran 10 people through the study. This is someone who has just experienced the casino website. The casino has a hotel, and she has just gone through a form where you book a hotel room. This was her reaction.

Woman 1: There wasn't anything I disliked about the experience except it's a little web 2.0. It just looks kind of old and a little outdated, this process. But it makes sense to me. I mean, I wouldn't not book because of it, but just as an FYI, what's kind of going on in the back of my mind is, ew, I wonder if they've got shag carpet. [laughs] [laughter]

Woman 1: That's another reason I'd like to see pictures.

Jodi: That was really interesting, and we got a lot of really great feedback from the participants. I'd like to think that doing this little bit of research as part of a pitch to win their business from an advertising standpoint really caught their attention, because I, along with several other people who were doing other forms of research, went the extra mile to get some information about their business and their customers, to really understand who they were. I think that was really powerful.

This is another example. Not too long ago, our mobile team was interested in trying out new navigation styles on mobile Ford and mobile Lincoln, and they wanted me to do a little competitive analysis of a couple of sites, this one being Audi. They wanted to study how people were reacting to different navigation styles. This is a mobile video of a user who is looking at the Audi models page.

Woman 2: At least the first-generation one. Allroad. Now, this is confusing. [laughs] This is why Audi confuses me, because I click on allroad and then it takes me to a page that doesn't say anything about allroad. It takes me to Q5. So is it Q5? Does that mean quattro, or? Wait, is this the allroad or is this the allroad? I was assuming that the one above was the allroad, but maybe this one's actually the Q5 and this one's the all...

Jodi: You get the gist pretty quickly that there was a little bit of a design issue there that probably has a really easy solution. That's the beauty of usability testing: sometimes the solutions are not so hard.

How are users recruited? I throw this slide in here because this is the first question clients ask me. They're always concerned: are we recruiting the right audience? This tool's pretty inexpensive (we'll get to that), but is this really going to be a legitimate audience? Is this going to be credible research? Just a little bit about the audience. UserTesting has a panel of over a million users. They recruit participants primarily from the US, but they also get into Canada and the UK. And at any one time, a few thousand mobile users have the cameras. I had a concern about that. I said, a few thousand mobile users, that doesn't seem like very much.
Their response was, well, they only have the cameras for so long, and then they give them back and we send them out to new people. We're constantly getting new participants, people signing up to participate in user research, so we're constantly getting fresh people. I thought, that sounds pretty good. Pay as you go: some of you might be familiar with the pricing on that. I'm sure you are. Very cheap, $49, I think, is what it is now. If you get an enterprise plan, which is where we're at, the pricing was quite reasonable. For $12,000 we get to do 20 tests per month, or 240 tests per year. That's quite a lot to fill in some of those gaps where we don't have time or money to do additional research, and then they go beyond that with the enterprise elite plan.

Really quick, I'm going to tell you how you set it up. It's super easy. I can do it. Any of you can do it. Step one, setup: you enter your tasks. You create the tasks that you want participants to go through. Of course, there's some skill involved in that, so I would recommend doing a little research on how to create good tasks. Step two: choose your participants. This is basically your screener. With the pay-as-you-go plan, you get a couple of demographics. As you get into the enterprise plan, you can actually create custom screeners, which is really fun but, again, requires some skill. Best to do some research on that. Step three: you share and analyze your results. One of the things about this tool is that they claim 79 percent of the results, or videos, are completed within an hour of launching the test. I can tell you that that is largely true, but the more screener criteria you have, the more difficult it becomes to get the right candidates. The worst case I had was having to wait two days for one last participant on a mobile test we were running with some very strict recruiting criteria. That wasn't too bad.

When is usertesting.com appropriate? I'm going to touch on this briefly; I have to be quick here. If you need findings quickly, within the hour or within a few hours, it's a great tool. If you can wait approximately 10 days for your findings, traditional research is fine. You have to be able to complete your test in 15 minutes, and that's a pretty big constraint; not all testing can be done in 15 minutes. Sometimes you need the 45 to 60 minutes that traditional research would provide. If you want to ask the same questions again and again and you don't want any fluctuation, and that's really important, the tool's great. If you need moderation, however, and that's going to be helpful to you, such as when you have a prototype with limited functionality (we often have those), you're going to need a moderator.

Recruiting is another criterion for choosing the tool versus traditional research. For the tool, you need to be able to access the site via URL. That's important. Of course, you need resources, because some of this is do-it-yourself; some of it you have to do yourself. That includes the planning (with your team), the execution, and the analysis. But I'll give you some tips on that, too. And of course, low budget versus substantial budget: the tool is low cost. What does that mean exactly? Here's traditional. Now, this is based on my almost six years of being heavily involved in usability research at Team Detroit. It may be different elsewhere.
On average, to test five people, lab-based or remote, for 45 to 60 minutes, with results in about 10 days: it's about $25,000. That's pretty significant. With a tool to fill in the gaps where you can't do that, unmoderated usertesting.com testing of five people, remote, for 15 minutes, with results in about an hour: $250. That's $5,000 per participant versus $50, a hundredfold difference. These numbers did catch people's attention.

Let me talk a little about writing good tasks and questions. As I mentioned, you do need some skill in order to set up good tests, or you need to partner with a consultant or a company that has those skills. I know there are some companies doing that for other companies as well. Avoiding leading questions is probably the number one thing. A leading question: "Was that awkward?" Can anybody tell me why that might be a leading question? [laughter]

Man 1: [inaudible 19:00].

Jodi: Yeah. It's actually a question I heard in the last few months, and I just about shot out of my chair. It's a leading question because it puts a negative thought into the mind of your user immediately. You don't want to sway someone. You don't want to sway them to the right or the left, to negative or positive. You want to be as neutral as possible in your questions and your facilitation. A non-leading question? Instead of that: "What did you think of that experience?" Totally open-ended, totally leaving it up to the participant.

Another tip: ensure the participant goes to the correct site early. This is a lesson learned. Fortunately, there's a backup plan. "What are your first impressions of this website?" is a question that I use a lot in these user testing tests. However, we had a scenario where a person never actually clicked on the right link to get to the right site; she actually started testing the wrong site. Now my questions are specific, like "What are your first impressions of the thezeitgeistmovement.com site?", to inspire the participant to double-check that they're actually looking at the right thing. Fortunately, UserTesting has a policy that if you're not satisfied with a test, you can replace it. Two times now [laughs] I've had to replace users because they basically tested the wrong site.

Consider exploratory tasks first, then directive. Exploratory tasks begin open-ended, with a natural shopping experience, because it gets... [inaudible 20:29] is comfortable with the site and it actually is a little bit more natural. On a Ford site, we might say, "Find information about a vehicle that interests you." Any vehicle, whatever vehicle you might be interested in. We're not going to force them to go look at a Mustang if they don't care about a Mustang. Then we get directive: "Now we would like you to configure a vehicle, if you haven't done that already, with your desired options and features." That's getting more specific.

Recruiting: testing with five people is fine. I put this in here because if I ever have to show this to clients, and I probably will, that's a question that always comes up. Really, you're only going to test with five people? How does that work? That doesn't sound right to us. It is right, and most of us here know that. If you do want to do more rigorous research, you can break up those groups of five into different user groups. Split it out any way you'd like: age ranges, attitudes, and so on. Is five people enough? You might be familiar with this slide from Jakob Nielsen. Yes, it's plenty. Anything over five is overkill. You capture about 80 to 85 percent of the issues with five people.
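[Editor's note: the math behind that slide is Nielsen and Landauer's model of problem discovery, which puts the proportion of usability problems found by n test users at

\[ P(n) = 1 - (1 - L)^n, \qquad L \approx 0.31, \]

where L is the average share of problems a single user uncovers. Five users give \(1 - 0.69^5 \approx 0.84\), the 80 to 85 percent figure she cites; doubling to ten users raises that only to about 98 percent, which is why anything much past five is considered overkill.]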
Analyzing and reporting results. What I would recommend, and this is from my experience so far this year, and I'm sticking to it: share your results with your team as soon as possible. It gets them excited when they get results. They just helped you plan a study, you launch it, and they get results in an hour. They get really excited. They're like, "Wow!" They can use those findings immediately. Keep your report short. I do recommend doing at least a short report. Document what was discovered. Create video highlights, which is super easy, to underscore the key themes. Provide recommendations in your reports, but only if necessary. Where I work, I work with a lot of smart people. They have lived through all of the history of the project, the constraints of the project. They've had multitudes of discussions about what they can and can't do. Sometimes it's best for teams to come up with their own recommendations. Develop a report template. Deliver the reports, if you can, to as many people as possible.

Another approach to analysis, in case you're a department of one (hello!), is to engage in team-based analysis. This is where you actually involve the team in watching the videos. I've got a photo here of Megan Schwartz, who's in the audience. It wasn't too long ago that we were engaged in this team-based analysis. We watched the videos together. We practiced active observation, which is keeping laptops closed and phones off. I can't tell you how hard that is in an ad agency. I don't know what it's like where you are, but that's very difficult. Provide some caffeine and chocolate. What happens is each viewer documents key insights, one at a time, on sticky notes. An insight can be positive or negative, or maybe just an idea; people are already thinking about solutions, of course. At the end, post the stickies on the wall and basically affinity diagram or card sort together. Group your stickies and your issues into key themes. Then, with the team, you start to brainstorm solutions. It's really a wonderful thing, and your job as a department of one is basically done.

Getting to buy-in, a couple more slides, that's it. I would say persistence is key. Usertesting.com was not new to me this year. I actually started sharing information about it probably three years ago. I don't know what happened this year. Probably the need for so much more research today. But be persistent. Address real problems. Instead of just doing demos, which you can get right off the site, use the trials. Talk to your project teams. What are their burning questions? You can help them, I'm sure, with something right away. Involve them in the test setup and the recruiting. Again, share the results. Document some findings. And then, be careful what you wish for. I can tell you from my experience that the requests start flowing in. Pretty soon you'll be asking, "I need a resource. Help me!" Free trials, man, just do it! Just do this. Just to prove to you that I really was paid to do this, I have three free tests for you. Just go to usertesting.com and type in "UX Thursday" in the discount code area. You can try it out for yourself if you haven't already. Some contact information. Homework and a quick plug: Detroit User Experience is on Meetup. We just started two weeks ago and we've got 40 people. Please join us. We've got a book chat coming up. We're going to have some Happy Hours, no doubt. I'd love to see more of your faces there. Questions? Sorry, I think I ran over. Shit. [applause]

Man 2: Thank you.
We have time for some questions while Dan is getting set up for his talk. What questions do people have? I'll bring the mic to you and we'll go from there. What do you have for Jodi? Somebody needs to have a question. Here we go. Josie?

Josie: Jodi, what things do you test on mobile? What do you have? Is it a prototype? Is it live? What's going on there?

Jodi: So far, we've been primarily focused on competitive testing. We've been doing mobile for a few years, and we have done some traditional usability testing on mobile. Right now, the mobile team's burning question is really, "What do people think about what somebody else is doing? Because we might do it, too." I've been helping them look at other sites, in particular at navigation. We've got one coming up where we're going to look closely at vehicle comparison tools. That's the type of mobile research we're doing so far. I also expect that we're going to be doing more app research; with this tool you can test apps as well. I know we've got a big initiative that's getting started for that. Any other questions?

Man 2: What other questions do you have for Jodi?

Woman 3: How do you handle client confidentiality when you're doing especially the remote testing and you don't know who's on the other side?

Jodi: Right. Yes. That is an important question. Most of us, in fact, have to make sure that participants aren't going to run out and tell your secrets, especially if you're showing a prototype. It's not so worrisome if you're testing live sites, because it's already out there, so it's not necessarily a secret. If you're doing prototypes, UserTesting has all of their participants sign NDAs. If you are on an enterprise plan and you have a special NDA, which of course Ford Motor Company does, you can have participants sign that as well. There's flexibility there.

Man 2: Any other questions here? How many people have tried using a tool like UserTesting or one of the other tools that Jodi mentioned? How many people are continually using one and finding it useful in their organization? A small number of you. While we're waiting for Dan to set up, does someone want to share what they found useful?

Derrick Poppin: Sure. I'm Derrick Poppin, Cengage Learning, up in Farmington Hills. Jodi, you continue to inspire me; with both Ethnio and UserTesting, I feel like we're following in very similar footsteps. We've also been using UserTesting for about the last two and a half years, and the enterprise plan in the last year, even though we're recruiting for very specific demographics: students are easy, but librarians or teachers are harder. To your point, yeah, it is possible to set the screener right and actually get people, because there are just a lot of people willing to do tests. It's given us the supplemental user testing that we can't do when we don't have 10 days to spare, so I totally agree with you on that.

Jodi: Yeah. That's awesome.

Derrick: I really have seen the need for that grow. In the last week, within a single day, I was able to run two five-person tests to answer an urgent design question for software that's going out at the end of the week.

Jodi: Yeah. That's great, really great.

Steven Tangler: Steven Tangler. I'm the user experience director at Ultia. One other tool, not free, that we've used is called Oopgo, O-O-P-G-O. It lets you see live how users are interacting with the website, not just what they click on or click through. It allows you to actually see what they type in,
where they move their mouse, things to that effect, on your own website. They don't realize that they're necessarily being tested, so you avoid the Hawthorne effect and all that. You literally see, as they're moving around, their mouse, their keyboard, and things to that effect.

Jim: Jodi?

Jodi: Hi, Jim.

Jim: Hi, Jodi. We use usertesting.com sometimes even though we're going to do some formal testing; we use it along with the formal testing. Sometimes, before we want to go spend a ton of money, I want to get the easy stuff out of the way, the "aha" things. Do you incorporate a lot of these little tools with formal testing down the road, too, then?

Jodi: I will be honest with you. We're really in the early stages of using the little tools. For us, it's not that we just started this year, but one of the challenges has been the credibility factor. When tools are that cheap or even free, people start to question, both internally and clients, "Are you really doing legitimate research? If I'm not spending some money [laughter], I'm not confident that this is real research." There's a learning curve. I think the more we do it and the more we prove that it's helpful, the more buy-in there will be. I really think that, at least for us, the traditional research is not going away.

Jim: I had the opposite problem. They like the cheap stuff a little bit too much [inaudible 30:17]. [laughter] You need to go down to Detroit to test now, or some other place.

Jodi: [laughs] Yeah.

Man 3: One of the things that we found really interesting, using usertesting.com or any one of these tools that's available out there: as we were trying to figure out the best way to phrase some of our questions, we learned a lot by using the tool. It's cheap enough that you can learn by doing. You can just get on there and make tests. The other thing that we did, and I don't know if we should admit it or not, is we actually signed up for accounts to be testers. We learned a lot by observing, probably, some of you in the room who have used UserTesting.com, and seeing how the tests are actually constructed by other professionals in the industry, which is something you don't get to do very often in our industry: actually look at other people's work. That was really helpful as we were developing our methodology for using it as well.

Jodi: That can make you feel really good about those NDAs. [laughter]

Man 3: One other thought, Jodi, and I liked that you mentioned it already, is that you can mix and match the tools. You can use Ethnio to feed into Usabilla, or you could use UserTesting.com to feed into a clickstream test. Anything that lets you grab users and then lets you evaluate something, you can mix and match really effectively.

Jodi: Yes, yes. They are surprisingly easy, too. I was really amazed, trying to do them by my lonesome, that I could pretty much figure it out.

Man 2: OK. Thank you very much, Jodi.

Jodi: Thank you. [applause]