Amy Montgomery: All right. I heard these words: "Let's start usability testing at the state of Michigan."

Audience member: Woohoo!

Amy: Woohoo! Nine and a half years ago.

[laughter]

Amy: I was actually hired to support the Department of Transportation as a QA specialist, to build quality assurance processes, put in some quality controls, and build a software testing team. Two months later, I got a new manager, Chuck Baird, and those were the words I heard. He had been very involved in human factors throughout his career, and before I got there, he had taken people over to Ford Motor Company to see their usability lab. When QA testing fell under his team, he thought, "Aha, this is the time to start making those connections." We started usability testing while still doing software testing, so there was both. But they were definitely paired, as we'll come to find out. A little bit more of the history. Since we had that gate at software testing, we started a little backwards. We started usability testing pretty much right before it went into production. But that's what we had control over, because anybody who has sat there to test knows the huge value in sitting through that and seeing what users are struggling with. That was our crazy-like-a-fox move: "Let's get them into this testing and see the value that some of these UCD methods can give us, and that's going to help us sell pushing it up in the life cycle." All the while...this was new to me. I didn't know what I was doing at first. We went to every conference. We tried to go to every training within Michigan, because we are state employees. We reached out to all the local communities and started traveling, going to different meetings. All the while, we were just kind of sponges, soaking it up and then relating it back to how we could make this work at the state of Michigan, because it is a challenge. Like a large corporation, the state is no different when it comes to making change.
One thing we did have going for us is that they were also looking at their development processes and getting them standardized. It was waterfall back then. We're just getting to agile. We're a little slow. But hey, we're getting there. At the same time, if I would go to a conference and learn something cool, I would just go, "Hey, can I try something on this project?" Just get in there and do some user research and then say, "Look at all this great stuff I got." I didn't necessarily ask for permission. We started just doing these things and showing the value. We got a formalized lab and continued, again, to do more UCD methods. Even though we were tied to the Department of Transportation, we started to service other agencies, saying, "Well, yeah, we can do that for you. Come on over to the lab," or, "Hey, yeah, we'll come talk to you," and creating awareness. We had a lot of education. We still have a lot of education, but we're continuing down that path of evangelizing UX and getting the good word out there. We needed to continue the education, hence the whitepaper. We had to write a whitepaper to our executives to say, "Here's how and why we can institutionalize some of these practices. We need to do that." It was well received, and the conversations kept going, kept happening. We kept learning, kept trying things, having great successes. We were hooked. Our clients were hooked and wanting more. Then we found ourselves as a line item, one out of six strategic directions in the state of Michigan IT plan. Now, that might not sound like a big deal, but this is the state of Michigan IT, and it says, "User experience." We were like, "Yes, they're listening. They're liking it." That was a big deal, and the initiative was basically to create a centralized team, because we were still attached to the Department of Transportation, because they really get IT. Not that other agencies don't.
They said, "We want to centralize this team and make it widely available to all state agencies." Last year, that's what they did. They plucked a few of us, and we are now a centralized team able to support all state agencies. Now, what you don't see here...remember what my original job was at the state of Michigan. What you don't see here is that at the same time, in my spare time, I was doing the same thing in software testing. There's a lack of software testers at the state of Michigan. Again, they were formalizing their processes, and they were realizing, "We need to get at software testing, too." I was doing the exact same thing with software testing, educating, creating processes, as I was with UX. The good thing is, when it came to moving to the centralized team, we had built our reputation around both software testing and UX, and both were very needed and in demand, so they moved both. All those services came with us to the centralized team. All along that journey, I thought, "Ugh, one day, I'm going to have to choose." It's kind of a weird pairing at first: software tester, UX specialist. I'm going to have to choose. Then when they moved all the services last year to our team, I was like, "Wow, this is really cool." Talking with clients, now people are starting to get it. They get that they need software testing. They get that they need UX services. We can leverage that, right? They're a perfect match. My aha moment was when we were talking with a client, a business analyst and a PM, who needed testing services. They said, "We need testing services." They needed a little bit more time than we had bandwidth for, so we said, "OK, we might have to go get a contract resource to help you out, but we can definitely help you out." They said, "OK, we know we're coming to you for testing services. Due to time and budget, we really wanted to do some of that usability and user research upfront, but it didn't happen.
Since you're testing, can you also give us the usability review feedback? If you go get a contract resource, are they going to know about this usability stuff?" Software testing, but you want usability feedback as well. It's, "Yes!" How awesome is that? I always tell people, "We're getting there." They're realizing they can't go without UX. They need usability. Just like you all know, projects don't go perfectly. Budgets, time, we don't always do things when we're supposed to do things, but they want to do it. Again, having that testing service, they also say, "Can you give us that usability feedback as well?" When we deliver testing results, if it's late in the project and we hadn't done anything else, we say, "Well, we think this might be a problem. If we had done some of these things upfront, we would know more about this." Clients get that. Clients want to make their products good for their end users. Believe it or not, the next time, the next project, they are calling us earlier. We have to help make those connections. We have to help. Even when things aren't perfect and they're not happening at the most perfect time, make those connections so they see the value. Next time, it's, "Hey, let's do these things upfront. We'll have this better information to be able to help you down the line." Let's explore why I'm saying this is a perfect match: testers. I want to just get it out there, to say it. You can disagree with me. Testers are IT professionals. Not everybody can test. I have been fighting that for nine years. There's a difference between putting in a body and saying, "Follow this test script," and putting in a trained tester. There is a difference. Both testers and UX folks have this common end goal. Their end goal is to create a quality end product for the user. A usable product is effective, efficient, and satisfying. Those things can't be realized if the software is buggy. There's a nice goal match there.
Software testing, no matter whether it's agile, iterative, or waterfall, a lot of testing happens after coding, correct? Let me ask you this, designers. Designers out there, raise your hands. Designers? All right, good. Maybe I'm off base, but has this ever happened to you? Have you ever created a design and passed it off? Passed it off for somebody else to implement, and you weren't involved in that part. You just created your design, your spec. You passed it off. When you happen to see the end product, does that end product look like what you originally designed?

Audience: No.

Amy: No? OK, good. Thank you. So that's not just me. Imagine this. Imagine if you had a tester there who was involved early on, who understood user needs. Who understood your design and how your design was fulfilling those end user needs, and they were testing after the product was built. They could verify, "Ah, yes. They did a good job implementing this design," or, "Oh, no, developer, sorry. You didn't get it quite right. You messed this up. The design was the way it was because of this user need." They're creating a defect, documentation saying why it's broken and why it needs to match that design. Powerful thing. Good, good ally. How do we make this work? Nine years ago, as a tester, I really enjoyed and loved being invited to requirements meetings. These are business requirements. They're just soliciting business requirements, but I loved hearing the conversations from the actual end users, end clients. The business analysts are writing, "OK, you need it to do this. Got it." What they didn't get was all of that wonderful user information of, "I can't use it now because this is a problem." I was writing down all that stuff, because that made me a better tester. I was hearing those conversations, and then when I tested it, I could recall those conversations and go, "Oh, no, no, no. You guys didn't get this right. They needed it to do this because of..." It was very powerful.
When I was prepping for this, I was like, "Maybe people are talking about testers being involved in this stuff." I looked through some books, and I found the "Lean UX" book. You guys read that? Jeff talks about collaborative discovery. I thought, "Yeah, here it is, collaborative discovery. It's the whole team approach. Everybody should be involved in user research." I'm reading: UX researchers, designers, developers, product owner. What, no testers? Come on, you're killing me. Why not have testers involved? When they're there hearing that user research, they can be that informed reviewer throughout the lifecycle, and they're informed because they saw and heard the user needs. Reviewing that design, testing it at the end, making those connections. Help them be that quality person for the project. Number two is learning about all of this good stuff: the human factors, the cognitive design, the interaction principles. Early on, I was lucky to see Susan Weinschenk speak, and she's a good speaker. I was like, "Brainy lady, love you." I was hooked. I love to hear her speak. I have her books. Having Chuck Baird as my manager, Donald Norman was required reading, so I got all that. Early on, I was learning the methods, getting the whys, understanding why people are not seeing things over on the right-hand side of the screen, or various things. Just making those connections, which is really, really powerful. One of the stories that really stands out is we did a usability test, and we met with the group beforehand, and the business sponsor was like this. You could tell. He was going along with it, but he didn't care. He didn't say a word. He didn't come to the test. Come on. When I went to go deliver the results, I was a little nervous. I hate to sit across from people that look like this. Me and the designer were co-delivering the results. Always deliver the results in person. We always say, "Here's the usability issue we saw.
Here's why we think they had a problem with it, the whys, and here's some possible solutions." He started like this. We started the report, and he started to loosen up. We're going through, and he starts bobbing his head, looking around at everybody. Halfway through, he goes, "Stop, I've got to say something." I was like, "Oh, dear."

[laughs]

"We've had a lot of these conversations along the way, and I've had a lot of the same concerns that I'm hearing these users had, but I've never heard it put like you guys are putting it." Why? It's because of that information, the whys. That changes the conversation. It's no longer what Amy thinks and Bob thinks. It's, again, the end user and giving the whys. What would be best for them to accomplish this task? This has been a big part of our success at the State of Michigan. It's not just running through these methods, but really trying to educate people on the whys. Making those connections, having those pairings, changes the conversation. Before I leave this, tying it back to testers: having this information makes testers better testers. Let me pose two different defects to you. Defect one: the proposal status field is way off on the right-hand side of the screen all by itself. It looks weird. Fix it. That's defect one. Defect two: when the analyst comes to create a proposal, the status field should be below the proposal ID, left-aligned with all the rest of the fields, because that's the flow their eye is going to go down and that they're going to work in. If that status field stays to the right, they're probably going to miss filling it out, and it's a required field, and the error message will slap them on the hand. That changes the conversation. It tells them why that user is going to have problems, and that one's going to get fixed faster. As a tester, having this information and providing feedback in that way makes them a better tester, and more defects will be fixed. Number three: build a Pixar-like brain trust.
Before I get into what a brain trust is, a short little story. We did a usability review on an application designed and developed by a vendor for the State of Michigan. The client said, "Hey, can you come in and review what they're doing?" I said, "Sure, absolutely." We found a lot of things with it. Nothing major, but the report was pretty big. All the clients were there. The vendor was there. We were going through the report, and they were bobbing their heads and making their notes, and having good discussion. "Oh, yeah, vendor. Can we change these things?" "Oh, yeah, yeah, yeah." After we were done, they said, "Wow, you guys were spot on. We never looked at it from this perspective before." Tying that in with a brain trust: "Fast Company," April 2014, had an article, "Inside the Pixar Brain Trust." The brain trust was, they said, one of the management tools that allowed them to produce 14 box office hits in a row. The brain trust started off as a really tight-knit group, and then evolved into a larger, more fluid group comprised of lots of different roles. But they were all good storytellers, and they embraced candor. It was their job to assess Pixar's movies coming down the pipeline to identify and solve problems. What does that sound like? I was like, "What? Identify and solve problems." I made the immediate connection to testers and UX people. That's the job. Creating the brain trust put in that quality control, that quality measure, and allowed them to create better movies. One of the quotes that just jumped off the page was this: "The people who take on complicated creative projects become lost at some point in the process. Where once there was a forest, now only trees." But that's your job. When you're a designer or developer, you've got to take that deep dive, so you're looking at it from a totally different perspective. "All trees, I'm lost."
If you have that tester as part of that brain trust, part of that group, they can help you keep that perspective, keep the forest, see that end-to-end product, and help you identify and fix problems. Again, from beginning to end, they can be that quality control the whole way through. What's next? My calls to action are these. If you have access to testers, go talk with them. Every tester might not be a perfect fit. I'm not saying that. Go reach out to the testers, involve them in user research, and see how it resonates with them. See what kind of information they get and how they can use that information throughout the life cycle. Don't be afraid to try things. Try things. Make it better next time, but definitely involve testers throughout the process and give them the tools, letting them know, "Hey, I'm not just giving you one more thing to do. Learning about this stuff is going to make you a more effective tester, and we're going to make a better product for the end users, which is near and dear to testers' hearts and UX folks' hearts." UX communities, anybody who's in UX communities or runs them, reach out to testing professionals. They don't know what they're missing. Don't blame them.

[laughs]

Just help them realize, "Hey, this information, these methods, are going to help you be a key player so the team can make a better product." That's it. Thank you.

[applause]

Facilitator: Excellent, excellent. What questions do we have for Amy here? One that occurred to me while you were talking was...the state of Michigan is a very big state. I've learned where on my hand I'm supposed to point to show where I am. In Massachusetts, we just use our finger. Never mind.

[laughter]

Facilitator: This change happened over time. Is it evenly distributed throughout the state government, or are there pockets that are doing really well and pockets that are still not thinking about it at all? What are your plans, and who are the folks you're working with, to bring everybody up?
Amy: Great question. The state of Michigan is very big. It never ceases to amaze me how difficult it is to communicate to that many people. Starting nine and a half years ago, we went around to agencies. We always did awareness sessions for clients as well as IT folks. Today, still, there are people that don't know we exist. That's crazy, but as much as we try with communications, there are definitely agencies that are starting to get it more and embrace these things. It's a little scary sometimes, because there are so many online tools, and it's so easy to get your hands on them. People are just out there doing stuff, and it's like, "I wish they'd contacted us so we could collaborate more and help them along that journey." Then I said, "You know what, the important thing is they're doing it." Just like we started nine and a half years ago, trying to figure it out, they will figure it out, too. There are definitely pockets that are getting it. It is, though, becoming more of a "need to have." People are realizing, just like software testing, this is just as important. We definitely have to continue to communicate. A long time ago, the idea was to have a user experience awareness day once a year, but you know how many busy things fall by the wayside. Now, we're a shared service. That has pros and cons, because we have to work to pay for ourselves, [laughs] which sometimes takes away time to do that stuff. We're just going to keep plugging away, trying to get the word out, and then figure out what's next.

Male Audience Member: It's not a question. It's just an observation. I think technical writers are possibly a third constituency that might be at that table, sort of.

Amy: I agree.

Male Audience Member: Just from experience.

Amy: I do know lots of writers that have made the transition well. It transforms how they write their documentation.

Female Audience Member: I had a question.
In terms of the process, after you've sat with the UX person and understood the design they're trying to do, what is your process for testing it? Do you have any preferred testing tools that you would test the designs with, whether it's automated testing or manual testing?

Amy: Really, more on the manual side. Again, knowing those user needs, I always try to sit in the design meetings. Being a tester, too, a quality person, you're always trying to pull the defects up earlier in the process, right? The earlier you find them and catch them, the better. Really, just sitting in on an Agile project. You're all in one room, you're reviewing designs. I literally try to imagine myself as that user completing the task. It's hard when you're not touching things. When you're in the Agile room, the design's up and everyone's like, "What do you think?" People are moving things around, and you just really try to visualize, "I'm going to click this. I'm going to do this," and try to catch those interactions even without interacting with it. Then when someone's done with it, saying, "I think we're done with it. Can I get my hands on it? Can I just come over here and start?" And again, just more of that manual testing, going through it and finding something that might have seemed like a great idea on the wall, but when it comes to the true interaction, it's like, "Oh, no, that's too compressed. That's not going to work."

Male Audience Member: Are there ways that citizens of Michigan can help make this UX thing more prevalent through the state government?

Amy: Wow, I didn't even prompt you on that. That's actually a really good question, and something that's going to be out soon.

Male Audience Member: This is why I earn the big bucks.

Amy: [laughs] That's right. I didn't even think of mentioning this here. That's awesome. Thank you. We are actually going to be putting out a survey on michigan.gov soon, and there's going to be a press release.
It's a survey to get people to sign up: "Hey, do you ever want to help the state? Do a focus group, or maybe do something online, or a usability test? Fill out this survey and get into our pool of people." That's going to be out there soon. Once it's out there, I can maybe send an email out to folks: "Tell your family and friends and say, 'Hey, if you've got some time to spare, sign up for this. Help them make services better.'" Because again, being a shared service, we're not mandatory. People just come to us. We don't touch everything. I don't go to a website and go, "Oh, my god. This thing stinks." [laughs] We are actually more on the application and online services side than we are on the website side. The other great thing is that agencies are actually starting to contract out to consultants. We encourage that. We're not a very large team. We can't do everything there is to do at the state of Michigan. That is a really neat thing, too, to see clients going out to MSU and other consultants for this work, and we try to always help guide them and go along for the ride.

Male Audience Member: Do you have an internal resource list of folks that you can give to other state agencies when they say, "We need some help with design. We need some help with user experience"? Do you have a list of vendors that you've worked with or people that you know are doing good stuff?

Amy: Yes, there is. I believe it is a little outside my...I'm not quite sure. I think there is a "doing business with the state of Michigan" form to fill out to get you on the list. If there are any consultants in the room, there is one.

Female Audience Member: I just want to say we're on the list.

Amy: [laughs]

Female Audience Member: We've gone through the process, and I've worked with Chuck Baird. We're certified to do business with the state of Michigan.

Male Audience Member: Was it [inaudible 26:17] ?
Female Audience Member: The usual paperwork, application, et cetera, accounting questions, things like that.

Male Audience Member: Could you make that more usable?

[laughter]

Amy: Yeah, always. Thank you.

Female Audience Member: Hi, Amy. I'm curious. Does your team get much chance to work with other researchers, say on the attitudinal research side or the analytics side? Do you get an opportunity to collaborate with those folks and try to put your stories together, so to speak?

Amy: I wish we did more. [laughs] That's something. Again, we're always looking for those opportunities. Yeah, I wish more of that happened.

Facilitator: Amy, thank you so much.

Amy: Thank you.

[applause]

Facilitator: Awesome.