Stories from my work…

Identifying life-saving solutions for healthcare workers (interviews)

In 2013, my supervisor and I were conducting interviews with nurses, social workers, and patient navigators who work with “underserved” communities: pockets of society where knowledge about and access to healthcare are limited by socioeconomic and geographic factors, and where serious morbidities are relatively common.

One navigator we spoke to was frustrated that the tools she relied on often got in the way more than they helped, or that no tool existed where one was needed — for instance, to efficiently manage her sky-high caseload, or to guide her patients toward resources. Several of her patients had died, one very recently at the time of our interview. One of our questions touched that raw memory, and we had to pause the conversation while she wept. The right tools, built the right way, might have enabled her to intervene before tragedy struck.

Many of those interviews were uncomfortable and sobering, but they sharpened the goals of our project. We prioritized features like a resource database and a case dashboard, and that navigator’s experience reminded us of the value of what we were doing.

Discovering opportunities in an EMS system (contextual inquiry)

At Pomiet we were encouraged to spend 10% of our time on a personal project: anything that interested us and was relevant to healthcare IT. I chose to study, with the Beavercreek fire department, how patient data is gathered and transferred during an emergency healthcare interaction, from answering the 9-1-1 call to checking in at the hospital to reporting back at the station.

It didn’t take long for the first call to come in. I rode along in the Battalion Chief’s truck, closely following the medic (their term for an ambulance) on its way to an assisted living facility where an elderly resident had taken a serious fall. Along the way I asked the Chief about the information he was getting over the radio and the computer in his truck.

Inside the facility, I observed how the firefighters controlled the area around the patient and interacted with him as he lay on the floor, apparently dazed. They carried sophisticated sensor equipment that could wirelessly relay the patient’s vital signs, but more commonly the firefighters simply wrote the information down in pen on whatever surface was available — in this case, the palms of their hands! When I asked them why, they told me they had been unable to get the sensor equipment to talk to their computer systems, and since their job was time-critical they prioritized whatever was fastest.

At the emergency room, the firefighters relayed information to the ER nurses in conversation while the nurses entered it into computers running Epic. I learned that every emergency room has a little side room for EMS workers, stocked with snacks and consumable medical supplies so crews can top off their medic after a call. In that room I watched the firefighters sit for nearly an hour filling out paperwork by hand, sometimes copying down data from their own palms!

Back at the station it was a similar story, as they now had to type the same information into their computer system there. All this repetitive manual data entry, coupled with the fact that their station reporting software was clunky and unreliable, soaked up more than two hours per firefighter per call.

I asked how many calls they responded to per week (an average of about 20). If patient data could be gathered automatically from the initial call and the sensor equipment, and only had to be entered once, the fire department would save thousands of man-hours per year, and the residents in their response area would be that much safer with firefighters more often available to serve them.
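To put rough numbers on that claim: the call volume and per-firefighter time above come from the ride-along, while the crew size of three per call is an assumption I’m making purely for illustration.

```python
# Back-of-envelope estimate of time spent on duplicate data entry.
calls_per_week = 20                 # average the firefighters reported
weeks_per_year = 52
hours_per_firefighter_per_call = 2  # "more than two hours", so this is conservative
assumed_crew_size = 3               # illustrative assumption; actual crews vary by call

annual_man_hours = (calls_per_week * weeks_per_year
                    * hours_per_firefighter_per_call * assumed_crew_size)
print(f"roughly {annual_man_hours:,} man-hours per year")  # roughly 6,240 man-hours per year
```

Even with conservative inputs, the total lands comfortably in the thousands of hours.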

For the last part of my project I sketched out the basic requirements for such a system, and was pleased to learn a few months later that one just like it already existed and was in use by a neighboring city’s fire department. I hoped the city of Beavercreek would consider adopting it as well.

Delivering value at every step (Agile)

My first several years in UX were at Pomiet, a company that practiced Agile seriously and even structured its statements of work around cycles and milestones. The first lesson I took from my immersion in the Agile world was to shape my work so that everything I did created value. The mindset I learned: even if my project gets cancelled tomorrow, the customer still comes out ahead from what I’ve done up to that point.

I took this philosophy with me to AEP, where I was placed on a handful of projects within the first six months. In each case I set out immediately to clarify goals, incorporate UX best practices, and introduce or amplify the voices of end users. To do this I had to ask questions, arrange (and prepare) presentations, and insist on being connected to users with whom I conducted both qualitative and quantitative studies, and then report findings back to my teams.

Fostering teamwork around UX (educating)

When I began consulting in 2017, UX was a vaguely understood and sometimes unheard-of practice among most of the people I worked with. Within my first few days on each project I was typically asked to explain UX to the whole team. I created a 20-minute “UX Level Set” presentation for this purpose, including an overview of how UX research fits into the Agile methodology. Because of this, the first piece of value I delivered to AEP was education: everyone on those teams was better prepared to work with UX from that point onward.

A slide from my UX Level-Set presentation

My efforts paid off immediately, as I was able to collaborate with product owners, project managers, information architects, and developers using research-based designs and guidelines. Thanks to my level set on UX they understood my role and responsibilities, and we avoided stepping on each other’s toes. I also invited them to sit in on some of my interviews and concept tests, which gave them a deeper appreciation for where my recommendations were coming from: communicating UX, rather than just practicing it, helped convey the integrity of the research data.

Bringing in users as stakeholders (jumpstarting UX research)

In most cases some development was already underway when I joined a project at AEP, but little or no input had been solicited directly from end users. After making the case to project managers and leads that this input was necessary and worthwhile, I devised studies to capture it.

My investigative methods were chosen based on the kinds of questions I sought answers to, the status of the project, the amount of access I had to users, and other constraints such as time and budget. Most of my studies took the form of interviews and surveys. On my first project at AEP I also conducted concept tests and paper prototype tests (remotely in most cases, so they were technically “PNG” prototype tests), and on a later project I had users participate in a Do-Go Map test and a pair of card-sorting activities to produce an affinity map.

A photo collage of a “hard copy” affinity map I created
A digitized version of the same affinity map as above

With a turnaround time of one to three weeks (about one Agile cycle on average) I was able to deliver reports back to my team, providing them for the first time with actionable insights and feedback directly from the people who would be using the systems they were developing.

Beyond software usability (Learning and Development experience research)

For the past year and a half I have been working on an overhaul of AEP Transmission’s Learning and Development program. This program affects the entire organization, and the project leader was wise enough to engage UX very early on. That has allowed me to deliver value even more broadly — critically, beyond just a software deliverable — while retaining the aim of delivering value at each step. Within the first few weeks, I leveraged my experience up to that point to provide input on the project’s strategy: for example, devising a system, driven by verifiable user need, for identifying which groups within the organization should be engaged first.

My core contribution to the project has been a holistic analysis of employees’ learning experiences, role by role and department by department. To do this I observed and facilitated job analysis workshops, which gave me an understanding of the demographics as well as the work context of people in various roles. Each time, I followed this up with in-depth interviews that focused on participants’ training experiences, needs, and preferences, and that tied training to performance review and job satisfaction. I gathered my data in an efficient tabular format I’ve developed over the years, and presented findings and suggestions back to the project team. I was frequently asked to present these to the individual departments as well, to keep them apprised of the state of their internal training and the direction it should take.

Systems evaluation

Like any modern L&D program, this one had a software component, and one of my duties was to assess virtual training platforms offered by third parties. This required me to think about the needs of learners more deeply than mere usability, and it served as the impetus to investigate employees’ preferred learning strategies and how their experience with training relates to things like job satisfaction, performance feedback, and career path comprehension. I worked to understand these things first, so I could identify which platforms would best support the learners.

To establish a baseline against which to evaluate prospective learning platforms, I conducted a System Usability Scale (SUS) survey of AEP’s existing Learning Management System. Gathering responses from over 100 participants, I was able to produce a quantitative measure of the system’s perceived usability: a score of 69/100, roughly average against the commonly cited SUS benchmark of 68. I asked additional questions so the data could be broken out by how recently participants had accessed the system and the conditions under which they tended to access it, which produced deeper insights. At the end of my survey I included an open-ended comment section, to which a surprising number of participants gave similar answers, providing our team with further direction in our evaluation of various Learning Management Systems.
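For anyone unfamiliar with SUS, the scoring rescales ten 1-to-5 Likert responses onto a 0-100 scale, and per-respondent scores are then averaged; the 69/100 above is that average. Here is a minimal sketch of the standard calculation, where the function name and sample responses are illustrative rather than taken from my survey:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly ten items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# One hypothetical respondent's answers to the ten items (illustrative only).
print(sus_score([4, 2, 4, 2, 3, 2, 4, 3, 4, 2]))  # 70.0
```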

Thinking outside the Change Management box (UX for change management)

In late 2018 I realized that UX can be thought of as a Change Management discipline. When do we consider the user experience of systems? When new systems are being developed, when systems already in use are being modified, or when we plan to modify them. In other words, when technology changes. Improvements to usability can make changes go more smoothly, but I observed that even with great UX research and design practices in place, new technology roll-outs still often fail: either a good technology is rejected, or a bad technology is adopted with negative outcomes.

Broader Change Management practices play a vital role in this success or failure, so I began teaching myself about that field. I read books, attended meetings, and studied some of the academic literature on Change Management. My main takeaway was that the field tends to model change resistance incorrectly, producing what look to me like false assumptions and counterproductive messaging.

Rather than rely on my intuitions, I decided to test my hypothesis with two studies: one at my job and one in my free time. At work I am conducting a survey that asks employees about their attitudes and preferences around new technologies, about what factors make technology changes go well or poorly, and about their perceptions of why certain technology changes happen in the first place. As of this writing data are still being collected, but the trends so far are striking: some of the assumptions I recognized in Change Management texts do indeed look false, while others seem more valid. I hope to have more details to report by the middle of 2019.

The study I conducted in my free time consisted of interviews with people who have resisted or discontinued their use of social media. Participants were asked about their reasons for resisting social media and what empowered them to do so, as well as what effects or ramifications they experienced afterward; I also asked about other types of technology resistance. As of this writing the data are still being analyzed, but two trends are emerging: people who reject social media do not seem especially likely to have rejected other technologies, and they express a variety of rationales, including not wanting to be “addicted” as well as concerns about privacy. These trends suggest that change resistance is not the curmudgeonly Luddism it is portrayed as in Change Management texts.

These data have already proven valuable to me as a UX researcher, as I can devise studies and craft hypotheses around a more informed view of users and their behavior. I can say with some confidence, for example, that when users do not like a piece of software, the reasons may be only partially related to its functionality or user interface.
