Friday Nov 09, 2007

What is it like to work in a design group, when you're not a designer?

Kim Arrowood has worked in xDesign for over a year managing Sun's usability test labs in the U.S. Before coming to xDesign, she worked at Sun for 6 years in market development engineering as a program manager. Kim is working to improve the visibility of the usability labs in the U.S.

I recently spoke with Kim Arrowood about what it's like to join a design group, when you're not a designer.

Jen: So Kim, tell me a little about what it is that you do.

Kim: I manage our usability test labs. World-wide, we have 9 or 10 labs spread across Prague, Massachusetts, Colorado, and California, but I primarily manage the 3 labs we have in Menlo Park, California. I handle logistics, recruit usability test participants, and help out with technical equipment. I also manage some aspects of operations for our organization, like goals, budgets, and dashboards.

Jen: From your perspective, what's the most challenging or interesting part of coming into a design group?

Kim: The most challenging aspect is the terminology. In my former group, we used the terminology of the customer, but the design group uses both the terminology of the engineering teams as well as terms that are specific to design or usability. For example, I had to learn what it is an interaction designer does and how that is different from the work of a visual designer. And I didn't know what a usability test was until I got to see one, so there was a big learning curve.

One really interesting thing that I learned was how "hands on" design is. I never knew all the work that goes into creating designs before they go to engineering. And I was surprised at how collaborative the design process is. When I worked in engineering, a single person would work to resolve a single customer problem. But here, there's a very supportive environment -- a lot of teamwork.

Jen: How do you see that manifested?

Kim: Well, when Kristin was working on some designs for the Identity Manager team she took them to the weekly Design Cafe, to get feedback and input on her ideas from other designers in the group. And we have those design cafes weekly, so anyone with an idea or a new mock-up can get feedback from their peers, in a supportive way. But I was surprised, too, at how small the group is, when design is so important to Sun.

Jen: So what is the most interesting part of your job?

Kim: I get to learn a lot more about the products we make; what they are and what they do. I'm reading as much as I can about design and usability testing, but I like to learn about our products by being the participant in our dry runs -- the practice round of a study, when the lab setup and script get tested.

I enjoy participant recruiting, but it's challenging. It's really hard to find good participants -- ones that match the test goals for each study.

But the best part of my job is getting involved in the projects, and working on the teams. Everyone works together and communicates -- there are no funny looks and no stupid questions. I really enjoy the collaboration and the teamwork.

Tuesday Nov 06, 2007

Toh-may-toh? Toh-mah-toh?: Let's call the whole thing... a terminology study

Ann Sunhachawee is an interaction designer in xDesign, and has been working for over 8 years in the area of tools, Java client, and currently OpenSolaris projects.

When deciding on terminology to use in your user interface, you try to fulfill a couple of different objectives: 1) accurately and concisely describe the concept, and 2) make it easy for the user to grasp.

The Network Auto-Magic project (which is part of OpenSolaris) needed to capture the concept of associating a group of settings (such as network proxies and services) with the network the computer is connected to. For instance, if you're using your computer in your office, you would need to use proxies that allowed you to work through your employer's firewall. But you would not need those particular proxies when using your computer at home.

What do you call this concept of needing different network settings depending on where you are? Mac OS calls it "Location" (as does Windows Vista). However, "Location," as Mac OS uses it, is not exactly the same as what Solaris will be implementing. We tried some alternatives like "Network Environment". No term was fully accepted. So what could we do?

A quick & dirty terminology study! I printed 2 sets of screenshots — one set featuring the term "Environment", and the other "Location". Then I walked them around the hallways to get people's opinion: "Hey you — Do you like A or B?"

But, before rushing into that, I consulted user researcher Nalini Kotamraju to figure out if there were any gotchas to think about. There are a few factors to consider, for what seemed like a short & simple survey. Here are some of them:

  • Alternate which term is shown first to each person, to get rid of any order-effect bias.
  • Don't ask about the term directly; instead just ask what they think the function of the dialog is. Observe the person's understanding by listening to their response.
  • Only directly prompt the user about the term in question if the person doesn't comment on the term during the course of their discussion.
  • Present the alternative after finishing discussion of the first term.
  • Ask for any alternatives that they might think are better.
  • And of course, avoid leading questions.

The Results

In the end, of the 10 people I polled, only 1 person preferred Environment over Location; 8 people chose Location; and 1 was undecided. I hadn't thought the results would be this skewed. Reasons for choosing Location included familiarity with the term (a number of people were Mac users) and the sense that Environment is too broad, evoking associations with the Desktop Environment and Unix environment variables, both of which people felt are not changed very often.
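The counterbalancing and tallying steps of a quick study like this can be sketched in a few lines of Python. This is a hypothetical illustration -- the helper names are mine, not part of the actual study:

```python
TERMS = ("Location", "Environment")

def presentation_order(participant_index):
    """Alternate which term each participant sees first,
    to cancel out any order-effect bias."""
    first = TERMS[participant_index % 2]
    second = TERMS[(participant_index + 1) % 2]
    return (first, second)

def tally(preferences):
    """Count votes for each term; None means undecided."""
    counts = {"Location": 0, "Environment": 0, "undecided": 0}
    for p in preferences:
        counts[p if p in TERMS else "undecided"] += 1
    return counts

# Example: the ten responses reported above.
responses = ["Location"] * 8 + ["Environment"] + [None]
print(tally(responses))  # {'Location': 8, 'Environment': 1, 'undecided': 1}
```

Even for a hallway survey, alternating the presentation order per participant keeps a simple preference count honest.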

In the end, the term might not be the most accurate, but sometimes it's better to use a good approximation that is recognizable. In the case of this project, it is an acceptable trade-off. I highly recommend the quick & dirty study method — great payoff for the price of a couple of print outs and getting to know your neighbors =)

Thursday Oct 25, 2007

Thoughts from a recent remote usability study

Kristin Travis has been working in high tech as an interaction designer and usability engineer for more than 15 years. She is part of the xDesign team based in Menlo Park, California, and she currently supports the Identity Manager team, which is based in Austin, Texas.

Identity Manager Login Screen

The last release of Sun's Identity Manager software (in May of 2007) had substantial user interface changes, so when I joined the team in June we discussed conducting a usability study in the Menlo Park usability labs, to get feedback from representative users on the current release.

In my experience, most development team members appreciate seeing how users interact with a piece of hardware or software that they've helped to create. Seeing first-hand reactions to existing functionality helps to shape team members' thinking about changes and new features for a product.

But while I'm located in Menlo Park, the Identity Manager development team is located in Austin, Texas. So the questions I had going into this exercise were: would it be relatively easy to involve a remote development team in a usability study? And would the remote team be satisfied with viewing a study in real-time, but not actually being in the same room as the user?

So what did we do?

In terms of the setup, we created a VNC connection from the usability lab in California (where I was, with the study participants) to a conference room in Texas, where the members of the development team could observe the test sessions.

The remote access allowed the people in Texas (and other locations, if needed) to see what study participants were doing. The Texas team could see the participant's computer monitor and watch, in real time, while the participant interacted with the product. In addition, the team could listen to the participant over an audio conference call that we established between the locations. At the end of each session, if the remote team wanted to follow up with the participant about a particular issue or question, they could do so by using the conference call.

And how did it work out?

Here are some highlights of the feedback that I got from the remote and local teams:

  • As with any other type of study, it's really important to conduct a dry run of the session. You don't want to get side-tracked during the study by unanticipated logistical issues. During our dry run, we diagnosed an error in the VNC login instructions for the remote set up. That took a while to figure out, but then things went according to plan.
  • The remote team's commitment to the study is essential. Jeff, my main contact in Texas, coordinated the remote conference room, kept everyone there informed about any schedule changes, and attended each study session. Considering the two-hour time zone difference, this meant a few late nights in Austin. But it was extremely helpful to have Jeff handle those coordination tasks so I could concentrate on working with the participant.
  • Jeff said that seeing how the participant interacted with the product was a huge benefit. This was true even though they couldn't "see" the participant directly (as they would have if they had been here locally).

Would we change anything for the next time?

I was in the room with the participant, and Kim, our Usability Lab Manager, was in the control room interacting with Austin by phone, so there was a bit of a communication delay at times. If Kim and Jeff had relevant content to share, Kim had to wait for an appropriate time to break into the conversation that I was having with the participant. It would have been useful to have a '3-way IM chat' up and running, so if the participant discovered any software surprises, or they had any related questions, we could communicate more quickly, without disrupting the flow of the test.

So the questions I had going into this exercise were: would it be relatively easy to involve a remote development team in a usability study? And would the remote team be satisfied with viewing a study in real-time, but not actually being in the same room as the user?

Well, in this case, yes.

Friday Sep 14, 2007

A Sociologist in a Technologist's World: What's a CLI, again?

Nalini Kotamraju is a user researcher in xDesign, and a PhD in Sociology. She has a penchant for research methods and telling it exactly like it is.

Years ago, shortly after I joined the Software User Experience Group (xDesign) at Sun, my manager asked me whether I would be willing and able to conduct a usability study of a new CLI for one of our software products, Sun Cluster. I, the ever eager new employee, promptly responded yes, that I'd be thrilled to do such a study. I then withdrew to my desk, and typed "CLI" in Google to figure out what it meant.

CLI stands, of course, for command line interface, which is a way to interact with software or an operating system. Once I met with the product team and had my first look at the CLI, I understood why my manager had wanted to feel out my reaction to this kind of study. By the time I joined Sun I was a veteran at usability studies, having led many a user through a graphical interface in paper prototypes or interactive mock-ups (usually web sites of now-failed dot.coms). Testing the intuitiveness of the content and structure of a CLI initially seemed to be simultaneously a tedious bore (only a bunch of cryptic words, no images?) and a memory challenge (learning how to string those same words together to make software do something?).

However, the usability study of this CLI turned out to be one of my favorite usability studies that I've conducted in the past decade. The fact that those words come out of my mouth still makes people who know me, even a little bit, laugh. What was so great about this study?

What made the study great wasn't just the team's ability to follow through on the findings from the usability study; thankfully, that happens regularly, though to varying degrees. Nor was it the rich feedback that we did indeed receive from the usability participants themselves. What made this usability study great, for me as the researcher, was the commitment of the product team. It's the most dedicated team with which I've ever worked on a usability study.

The software engineers on the product team were committed to hearing what actual breathing users had to say about the proposed changes to the CLI, which is rare, particularly in the context of what was a politically charged project. They hadn't made the changes to the CLI lightly, and they were passionate about making sure that what they had come up with would work for their users. In addition, they were willing to participate fully in the preparation, execution and post-analysis of the usability study, which is a rare occurrence in a field in which usability studies are often used as after-the-fact rubber stamps to mollify potential internal critics rather than to improve products.

Most of the team had never seen a usability study, so we toured the usability labs in Menlo Park, California. After a discussion of various research methods, they accepted that questions about a statistically significant population of users had no place in what we were about to do. Their commitment also involved spending painstaking hours with me, preparing me for the potential questions of live participants, by explaining how the most popular commands were executed both in the original and the proposed CLI, and, most interestingly, how it connected to the underlying software structure. They not only attended the usability sessions, but mandated that other engineers, doc writers, and marketing staff on the project attend as well. My manager, who dropped by one of the usability study sessions, said he couldn't enter the observation room (of our largest lab, no less) because it was chock full of observers.

And all this for a usability study for a bunch of words. Just kidding.


xDesign is a software user experience design group at Sun.
