Why Don't People Care About Long-Term Infrastructure?
How usability research revealed why communication breaks down when stakes are high
Research Type: Qualitative Usability Testing
Participants: 12 diverse residents
Duration: 5 days
Focus: Cross-device usability & engagement
The Challenge
Link21, a major infrastructure initiative spanning 21 counties, needed to understand how a diverse regional population understood a multi-decade transportation program. The program was ambitious and complex, with significant implications for equity, the environment, and regional economics.
Their website had recently undergone a content redesign, but stakeholders didn't know whether the new approach actually helped residents understand the program's importance. More importantly, they didn't know if the redesign was motivating people to engage. Before investing further in public communication, they needed answers.
Research Approach
We conducted qualitative usability testing with 12 residents representing the region's geographic, economic, and educational diversity. Testing occurred over five days, with six participants on desktop and six on mobile devices.
Each 60-minute moderated session combined behavioral observation with direct participant feedback. We watched participants navigate the website naturally, asked them to find specific information, and captured both what they did and what they thought about it. This revealed not just whether people could find information, but whether the information actually resonated with their needs and concerns.
The key objectives: Do residents understand the program? Will they support it? Can they engage with it? And most importantly, what needs to change to transform passive understanding into active support?
What We Discovered
The Good News
Participants described the website as professional, trustworthy, and modern. Maps and infographics performed exceptionally well; residents understood regional needs at a glance and could visualize how the program would affect their neighborhoods. Cross-device usability was solid; the mobile experience was equivalent to desktop.
Most importantly, the core message came through: residents understood the program's goals and appreciated the emphasis on equity and environmental benefits. When people cared about these issues, they got excited.
The Critical Problem
A 2040 service date devastated engagement. People who had just gotten excited about the program's benefits felt immediately discouraged. The date alone derailed conversations about current relevance. Suddenly, residents weren't thinking about how they'd benefit—they were wondering if they'd still live in the region by then.
“That's a lot of work for something that I won't see the benefit of directly.”
This wasn't cynicism about infrastructure timelines; it was a rational response to feeling disconnected from near-term benefits. And the website didn't offer them any sense of agency. There was no suggestion that they could influence the timeline or meaningfully accelerate progress.
Information Overload
The content was thorough and detailed—perhaps too much so. Residents had to hunt through multiple pages and lengthy text to find benefits. Long videos and paragraph after paragraph of explanation made people feel like they were missing something important.
Jargon and acronyms added another layer of friction. When everything on the page seemed equally important, nothing felt urgent. When key benefits were buried under layers of context, people moved on without fully understanding the program's relevance to them.
Strategic Implications
This research demonstrates a core UX principle: people don't make decisions based on comprehensive information; they make decisions based on the information they care about. When designing for public engagement, the goal isn't to present everything you know. It's to guide people toward the insights that will actually change their minds.
The best information architecture acknowledges the emotional journey. You can't separate comprehension from engagement. A perfect explanation of your program won't help if it leaves people feeling helpless.
Similarly, content that works internally may not work for your audience. Technical accuracy matters, but so does emotional resonance. This research revealed where those two things were in tension, and showed which to prioritize.
Skills & Expertise Applied
Diverse Population Research: Successfully recruited and conducted research with participants across income, education, location, and technology use, ensuring findings reflected the full spectrum of the audience.
Complex Information Architecture Testing: Evaluated how people navigate and comprehend multilayered digital information, identifying where communication breaks down.
Cross-Device Testing: Validated that digital experiences work consistently across desktop and mobile, accounting for how different users access information.
Emotional Insight Discovery: Moved beyond "Can people find information?" to "Does the information actually motivate action?", probing the psychological dimensions of user experience.
Remote Research Excellence: Conducted fully distributed research while maintaining methodological rigor and participant engagement.
Strategic Translation: Converted user research findings into actionable business recommendations, showing how UX insights directly impact stakeholder engagement and program success.
How We Help
If you're launching a new digital product, communicating complex information to diverse audiences, or trying to understand why people aren't engaging as you expected, user research can transform your approach.
We specialize in bringing user voices into product and content decisions, not as an afterthought, but as a core strategy. We help organizations understand what their audience actually needs, how to communicate authentically with them, and how to design experiences that drive real engagement.