Total Economic Impact

The Total Economic Impact™ Of UserTesting

Cost Savings And Business Benefits Enabled By UserTesting

A FORRESTER TOTAL ECONOMIC IMPACT STUDY COMMISSIONED BY UserTesting, August 2025


Executive Summary

Exceptional user and customer experiences are not just a differentiator but a necessity: Consumers and business users alike have come to expect intuitive, seamless, and enjoyable interactions with products and services. The proliferation of digital touchpoints has heightened the stakes for companies aiming to meet these rising expectations. Businesses must now understand not only what customers do but also why they do it, uncovering motivations, pain points, and areas of delight through rigorous empirical investigation. A centralized experience, research, and insights platform can empower organizations to navigate the complexities of the modern experience economy, maximize their investment in customer understanding, and unlock new sources of business value.

UserTesting offers a centralized solution for gathering, analyzing, and activating qualitative and quantitative customer insights. Its platform brings together qualitative and quantitative testing tools, enabling teams across product, design, customer experience (CX), and marketing to make decisions that drive growth and innovation.

UserTesting commissioned Forrester Consulting to conduct a Total Economic Impact™ (TEI) study and examine the potential return on investment (ROI) enterprises may realize by deploying UserTesting.1 The purpose of this study is to provide readers with a framework to evaluate the potential financial impact of UserTesting on their organizations.

415%

Return on investment (ROI)

 

$7.6M

Net present value (NPV)

 

To better understand the benefits, costs, and risks associated with this investment, Forrester interviewed six decision-makers with experience using UserTesting. For the purposes of this study, Forrester aggregated the experiences of the interviewees and combined the results into a single composite organization.

Interviewees said that prior to using UserTesting, their organizations struggled to identify and launch experience and product improvements based on customer behaviors and feedback. Prior attempts to gather insights through customer and user experience research yielded limited success, leaving interviewees with slow research cycles, a lack of scalable research processes, inadequate qualitative tools, and insufficient validation to launch solutions to reduce friction in the customer journey and ultimately increase revenue. These limitations led to a focus on marketing over usability insights to inform product and experience direction. Compliance and privacy hurdles often delayed or limited the implementation of effective user research.

After the investment in UserTesting, interviewees reported improvements across research operations and product development. Teams streamlined research processes, recruited participants at scale, and aligned more efficiently on key product decisions using real-time, customer-driven insights. They also enhanced the customer journey through higher-quality data and more reliable participant panels. Key results from the investment include optimized usability, increased customer retention, avoided developer rework due to the ability to conduct research more frequently to guide iterative decision-making, and increased researcher and designer productivity.

Key Findings

Quantified benefits. Three-year, risk-adjusted present value (PV) quantified benefits for the composite organization include:

  • Increased conversion rate of 7.2% due to channel optimization. Insights gained through UserTesting inform strategic design and feature optimization in the digital experience, thereby enhancing revenue growth for the composite organization. The composite reduces customer friction and boosts conversion rates. Over three years, profit from optimized usability is worth $2.1 million to the composite organization.

  • Increased customer retention rate of 10.8% due to product and service improvements. By leveraging study results, the composite organization identifies opportunities for product and experience improvement, reducing customer churn and fostering customer loyalty. Over three years, profit from increased customer satisfaction is worth $2.5 million to the composite organization.

  • Iteration cycle reduction of 25%. Validation of concepts and designs prelaunch prevents costly iterations, reduces wasted effort, and ensures efficient use of development resources for the composite organization. Over three years, avoided developer rework due to prelaunch testing is worth $2.5 million to the composite organization.

  • Researcher time savings of 50% and designer time savings of 10%. UserTesting enables more efficient collaboration for the composite organization, centralizing research while still empowering teams to work autonomously. The composite organization streamlines workflows and increases testing throughput without proportionally increasing headcount. Over three years, improved researcher and designer productivity is worth $2.4 million to the composite organization.

Unquantified benefits. Benefits that provide value for the composite organization but are not quantified for this study include the following:

  • Improved research and panel quality fostered trust in insights and internal buy-in across teams. The composite removes ambiguity and launches data-informed product and digital experience improvements, reducing internal bias and promoting objectivity in conversations and collaborations with creative and product teams and stakeholders.

  • Studies at scale broadened geographic reach and inclusivity. The composite extends the geographic reach of its research without straining operational costs, covering a diverse set of participants that better represents its customers.

  • Availability of user data unlocked a cultural shift toward customer-centricity. With UserTesting, the composite’s access to data enables it to infuse daily workflows with customer-centricity, elevating the voice of the customer in strategic decisions and planning.

  • Strategic enablement unlocked innovation. UserTesting provides the composite with timely insights that guide product development and customer engagement, helping to identify new initiatives that might have otherwise been overlooked.

  • Mitigated compliance and reputation risks. To better align with customer expectations, the composite validates user flows and product messaging, reducing compliance risks. With UserTesting, it properly secures consent for studies.

Costs. Three-year, risk-adjusted PV costs for the composite organization include:

  • UserTesting subscription. Subscription costs for the composite organization are based on the number of licenses needed. For UserTesting platform access across users, the composite organization incurs $1.6 million in subscription costs over the course of three years.

  • Ongoing administration costs. The composite’s researchers dedicate a portion of time to ongoing administration, including handling interdepartmental requests, contract renewals, and discussions with UserTesting. Over three years, the composite incurs costs of $134,000 for ongoing administration.

  • Training costs. The composite’s UserTesting users dedicate a portion of time to familiarizing themselves with navigating the platform and understanding new capabilities. Over three years, the composite incurs costs of $113,000 for training.

The financial analysis that is based on the interviews found that a composite organization experiences benefits of $9.4 million over three years versus costs of $1.8 million, adding up to a net present value (NPV) of $7.6 million and an ROI of 415%.

“One corrected product decision alone could recoup our annual investment in UserTesting.”

Director of design operations, software

Key Statistics

415%

Return on investment (ROI) 

$9.4M

Benefits PV 

$7.6M

Net present value (NPV) 

<6 months

Payback 

Benefits (Three-Year)

[Chart of three-year benefits: profit from optimized usability, profit from increased customer retention, developer rework avoidance from speed to insights, and researcher and designer productivity]

The UserTesting Customer Journey

Drivers leading to the UserTesting investment
Interviews
Role Industry Region Revenue Employees
Director of design operations Software Global $38 billion 73,000
Senior vice president of technology governance Finance Global $20 billion 215,000
UX researcher E-commerce Global $8 billion 10,000
Vice president of digital experience and product Consumer packaged goods North America $6 billion 120,000
Senior manager of UX research and content design Retail North America $4 billion 15,000
Senior user researcher Finance Global $2 billion 10,000
Key Challenges

Before adopting UserTesting, interviewees faced friction gathering customer insights and aligning stakeholders around product decisions. Inefficient research workflows, inconsistent participant recruitment, and reliance on ad hoc methods, such as speaking to customers or internal staff through processes that lacked rigor and structure or were time- and resource-consuming, slowed momentum and introduced risk. These challenges collectively delayed product development, limited research scalability, and made it difficult to form and execute confident, customer-informed decisions that could potentially increase revenue and decrease expenses.  

Interviewees noted how their organizations struggled with common challenges, including:

  • Challenges maximizing revenue opportunities from existing users. Interviewees noted that prior to UserTesting, their teams missed opportunities to deepen product utilization. Slow research cycles and a lack of confidence in product decisions due to insufficient, scattered, or contested data also thwarted product improvements and risked customer attrition, which impacted top-line revenue. A senior user researcher in the insurance sector shared that without timely research, their organization had risked losing up to 50% of customers eligible for a new product offering, impacting revenue. They said, “It’s millions of customers we would have lost. If we can retain the percentage we’re hoping to retain, it’s huge.”

  • A lack of structure and scalability that inhibited application of customer insights. Interviewees described that in their prior environments, user and customer research was outsourced to expensive agencies or conducted internally without proper study guides or processes. High investment in field research and delayed speed to insights were common themes. The senior manager of UX research and content design in the retail industry noted their organization relied heavily on in-store intercepts, which were geographically biased and labor-intensive, requiring three-person teams and full-day commitments. Recruitment was another major hurdle to research at scale. The senior vice president of technology governance in finance explained that sourcing participants through internal processes could take up to six weeks, stating that the study design approval process at their organization was “bureaucratic” and involved multiple teams, numerous approvals, and manual coordination. The vice president of digital experience and product in the consumer packaged goods industry described how their team had to physically visit cafés to conduct interviews, which was resource-inefficient and yielded inconsistent results.

  • A lack of centralized tools for qualitative research. Relatedly, several interviewees highlighted the marked lack of internal tools for actionable qualitative research, sharing they had no in-house qualitative platform or any way to directly engage with customers. This caused a reliance on limited survey tools that didn’t provide the depth of insight needed for decision-making. The senior vice president of technology governance in finance noted that decentralization led to a loss of visibility, as individual business units conducted research independently without sharing results broadly. The UX researcher in the e-commerce industry said: “Before UserTesting, we didn’t have any solutions that would support research for e-commerce. We had a shift in business goals with the company to be more customer-first. That really drove the need to find more tools that helped build our qualitative database, our research, speaking with our customers and building those connections, and also driving results.”

  • Products built without proper validation, which led to errors, wasted developer effort, and poor NPS. Several interviewees described how a lack of validation led to costly mistakes. The director of design operations in the enterprise software industry recounted instances where features were released without testing, resulting in user confusion, support tickets, and emergency fixes. The interviewee emphasized that usability issues, while subtle, could escalate quickly and negatively impact the customer experience, resulting in customers dropping out of workflows and filing tickets: “[The customers] are in their roles performing critical business responsibilities. If they’re clicking on the wrong button, they think they can’t perform a test. They log a bug. They start getting involved in support tickets, and by introducing confusion to a user in the grand experience, the situation just begins to escalate. So, while a product may look okay going out, usability is important because it can be quite stealth. When usability becomes a problem out in the real world, it introduces confusion.” Without UserTesting, Net Promoter Score℠ (NPS) dropped for this organization. Similarly, the vice president of digital experience and product in the consumer packaged goods industry noted that without early testing, their organization wasted development cycles on features that failed to meet user needs, increasing the likelihood of developer rework and missed business opportunities when a less promising idea was prioritized over a lucrative one.

  • A focus on marketing insights that hindered value from usability. The director of design operations in the enterprise software sector explained that their research and insights team focused primarily on marketing and generative research, leaving a gap in usability testing that needed to be filled by the UX team. This misalignment meant that product teams lacked the evidence needed to make informed design decisions. The vice president of digital experience and product in the consumer packaged goods industry described how early research efforts before UserTesting were more about general customer sentiment than usability, which limited their ability to optimize digital experiences effectively.

  • Compliance and data privacy hindered implementations of user research. The senior user researcher in finance remarked that internal compliance teams were initially resistant to video recordings and customer interviews, requiring extensive approvals and process adjustments. These concerns before UserTesting often delayed research efforts and limited the ability to gather timely insights.

Investment Objectives

The interviewees searched for a solution that could:

  • Support a strategic customer-centric shift.

  • Deliver ease of use, efficiency, agility, scalability, and speed.

  • Promote cross-functional adoption and cross-team enablement.

  • Integrate smoothly with existing workflows.

  • Meet enterprise requirements.

  • Prove cost-effective.

“We were doing a lot of field research. It took three people out of circulation for a day. UserTesting helped us scale.”

Senior manager of UX research and content design, retail

“[Before UserTesting], we didn’t have the ability to answer these questions. We were building based on intuition.”

Vice president of digital experience and product, consumer packaged goods

Composite Organization

Based on the interviews, Forrester constructed a TEI framework, a composite company, and an ROI analysis that illustrates the areas financially affected. The composite organization is representative of the interviewees’ organizations, and it is used to present the aggregate financial analysis in the next section. The composite organization has the following characteristics:

  • Description of composite. The multibillion-dollar organization has global operations, delivering its products and services from both physical locations and through a strong digital presence. CX and UX research are growing priorities as the organization shifts to an increasingly customer-centric approach.

  • Deployment characteristics. The composite organization begins with licensing UserTesting to 20 researchers and 10 designers in Year 1, which increases to 40 researchers and 20 designers in Year 2 and 60 researchers and 30 designers in Year 3. As it increases its adoption of UserTesting, it prioritizes optimization of its digital experiences and improvements of its products, focusing first on experiences and products that are low-risk to alter and that serve as a starting point from which insights and study designs can be extended to higher-revenue streams.

 KEY ASSUMPTIONS

  • Multibillion in revenue

  • Global operations

  • 10% operating profit margin

Analysis Of Benefits

Quantified benefit data as applied to the composite
Total Benefits
Ref. Benefit Year 1 Year 2 Year 3 Total Present Value
Atr Optimized usability $432,000 $864,000 $1,296,000 $2,592,000 $2,080,481
Btr Increased customer retention $518,400 $1,036,800 $1,555,200 $3,110,400 $2,496,577
Ctr Avoided developer rework due to prelaunch testing $510,000 $1,020,000 $1,530,000 $3,060,000 $2,456,123
Dtr Increased researcher and designer productivity $501,500 $1,003,000 $1,504,500 $3,009,000 $2,415,188
  Total benefits (risk-adjusted) $1,961,900 $3,923,800 $5,885,700 $11,771,400 $9,448,369
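
To make the Present Value column above concrete, the short sketch below (an illustration, not Forrester’s model file) discounts each benefit stream’s risk-adjusted yearly values at the study’s 10% annual rate, treating them as end-of-year cash flows as described later in the Financial Summary. Variable names are illustrative.

    # Reproduce the Present Value column of the Total Benefits table.
    # Yearly figures are the risk-adjusted values above; 10% is the study's discount rate.
    DISCOUNT_RATE = 0.10

    benefits = {
        "Optimized usability": [432_000, 864_000, 1_296_000],
        "Increased customer retention": [518_400, 1_036_800, 1_555_200],
        "Avoided developer rework due to prelaunch testing": [510_000, 1_020_000, 1_530_000],
        "Increased researcher and designer productivity": [501_500, 1_003_000, 1_504_500],
    }

    def present_value(cash_flows, rate=DISCOUNT_RATE):
        """Discount end-of-year cash flows back to the start of Year 1."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows, start=1))

    for name, flows in benefits.items():
        print(f"{name}: total ${sum(flows):,.0f}, PV ${present_value(flows):,.0f}")
    # Output matches the table to within rounding: roughly $2.08M, $2.50M, $2.46M, and $2.42M.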
Optimized Usability

Evidence and data. Interviewees noted that optimized usability from insights gathered through UserTesting reduced friction in the customer journey, thereby increasing task success rates for calls to action and conversion rates. Experience optimization through thoughtful design of features, functionalities, and journeys proved a strategic lever for revenue growth. Interviewees leveraged UserTesting’s AI to synthesize data from multiple sources into key insights and trends that enhanced usability.

  • The director of design operations in the software industry noted the value of UserTesting’s AI feature: “With the AI tools to do synthesis, we’re really in a much-improved state than we were five years ago, when you’d have to look through every one of those items. We can get results, get a brief idea of where we’re going, and then spin up another test quickly based upon that. ... [And] quick speed to insights is also another piece.”

  • The UX researcher in the e-commerce industry also remarked that “the analytics or the metrics that UserTesting give us with the combination of AI have been an absolute game changer.” This interviewee described how their team applied UserTesting to three major digital areas: product ratings and reviews, personalization, and navigation. The UX researcher explained that their organization had previously deprioritized product ratings for years due to inconclusive internal metrics, but UserTesting helped validate their importance and inspired speed and confidence in the decision. As a result, the interviewee’s company implemented a third-party ratings tool within six months and began phasing it into the site. Empowering customers to make more informed decisions increased sales and reduced returns. The UX researcher noted: “We were able to use UserTesting to help confirm everything we knew [about the importance of customer product ratings in e-commerce], and there’s no question that this is something that was absolutely necessary even regardless of financial impact to the business. UserTesting helped us eliminate the indecision specific to our company culture and do what we needed to get done.”

  • The senior vice president of technology governance in finance described how usability improvements with UserTesting contributed to a 30% increase in online banking acquisition. By leveraging UserTesting to refine digital onboarding and product flows, their company was able to better communicate value and reduce drop-off during key customer journeys. The interviewee emphasized that broader sample sizes and faster feedback loops enabled by insights from UserTesting gave the team higher confidence in their design decisions, which translated into measurable gains in customer acquisition and retention. The interviewee said, “With UserTesting, we have seen an increase in task success rate and task efficiency across the board, probably around 5%.”

  • The vice president of digital experience in the consumer packaged goods industry shared that improvements to fulfillment time communication and menu presentation, both informed by UserTesting, led to a 6% increase in retail conversion and a 2% increase in catering conversion year-over-year. These gains contributed to millions in incremental revenue. Underscoring the financial impact of even small usability improvements, the interviewee estimated, “We’re a $2.5-billion company on our first-party platform. So, if we improve conversion by even 1%, that’s $25 million.”

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • The composite organization experiences 100 million annual visitors to its digital channels.

  • Its average transaction value is $150.

  • The conversion rate before UserTesting is 1.5%.

  • The composite conducts research projects to optimize various features, functionalities, and descriptions on its digital experiences. With insights gained from UserTesting, the composite applies improvements to its digital experiences, resulting in a conversion rate increase of 2.4% in Year 1, 4.8% in Year 2, and 7.2% in Year 3.

  • The operating profit margin for the composite organization is 10%.

Risks. The impact of this benefit will vary among organizations based on the following factors:

  • An organization’s annual volume of visitors, prospects, or opportunities.

  • An organization’s existing conversion rate, operating profit margin, and the level of testing it conducts before UserTesting.

  • An organization’s adoption of UserTesting throughout its teams and its strategy for testing the components of its digital experiences and applying insights.

Results. To account for these risks, Forrester adjusted this benefit downward by 20%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $2.1 million.

7.2%

Increased conversion rate from channel optimization attributable to UserTesting

“We use UserTesting to understand user pathways, streamline navigation, and improve product presentations to allow a user to get to the ultimate end goal, whether that’s finding information, transacting, or viewing a new product suite like credit cards. We have seen an increase in task success rate and task efficiency of around 5%.”

Senior vice president of technology governance, finance

Optimized Usability
Ref. Metric Source Year 1 Year 2 Year 3
A1 Visitors to digital channels Composite 100,000,000 100,000,000 100,000,000
A2 Average transaction value Composite $150 $150 $150
A3 Conversion rate prior to UserTesting Composite 1.5% 1.5% 1.5%
A4 Annual revenue from digital channels A1*A2*A3 $225,000,000 $225,000,000 $225,000,000
A5 Increased conversion rate attributable to UserTesting due to channel optimization Interviews 2.4% 4.8% 7.2%
A6 Revenue from digital channel optimization with UserTesting A4*A5 $5,400,000 $10,800,000 $16,200,000
A7 Operating profit margin Composite 10% 10% 10%
At Optimized usability A6*A7 $540,000 $1,080,000 $1,620,000
  Risk adjustment 20%      
Atr Optimized usability (risk-adjusted)   $432,000 $864,000 $1,296,000
Three-year total: $2,592,000 Three-year present value: $2,080,481
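As a worked example, the sketch below restates the row formulas from the table above (A4 = A1*A2*A3, A6 = A4*A5, At = A6*A7, followed by the 20% downward risk adjustment). The Increased Customer Retention and Avoided Developer Rework tables that follow apply the same pattern to their own inputs. All figures are the composite’s assumptions from this study; variable names are illustrative.

    # Optimized usability benefit for the composite, Years 1-3 (see table above).
    visitors = 100_000_000                       # A1: annual visitors to digital channels
    avg_transaction_value = 150                  # A2
    baseline_conversion = 0.015                  # A3: conversion rate prior to UserTesting
    conversion_uplift = [0.024, 0.048, 0.072]    # A5: increase attributable to UserTesting
    operating_margin = 0.10                      # A7
    risk_adjustment = 0.20                       # downward adjustment applied by Forrester

    baseline_revenue = visitors * avg_transaction_value * baseline_conversion   # A4 = $225M

    for year, uplift in enumerate(conversion_uplift, start=1):
        incremental_revenue = baseline_revenue * uplift      # A6
        benefit = incremental_revenue * operating_margin     # At
        risk_adjusted = benefit * (1 - risk_adjustment)      # Atr
        print(f"Year {year}: ${benefit:,.0f} unadjusted, ${risk_adjusted:,.0f} risk-adjusted")
    # Year 1: $540,000 -> $432,000; Year 2: $1,080,000 -> $864,000; Year 3: $1,620,000 -> $1,296,000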
Increased Customer Retention

Evidence and data. By designing studies in UserTesting, including mobile tests, to identify opportunities for product and experience improvement, interviewees were able to reduce churn and foster long-term customer loyalty.

  • The senior user researcher in the finance industry shared how usability improvements led to higher-than-expected retention during a pilot for a new product journey for existing customers. The journey, which allowed customers to transfer between brands within the same group, was refined through multiple rounds of usability testing. As a result, their organization retained customers who might otherwise have left for competitors, directly impacting revenue. The senior user researcher noted, “We’ve had very little pushback. People have been finding the journey easy. We wouldn’t have created this product without UserTesting.”

  • The same interviewee described early mobile testing of a new feature: “We are at the very early stages of testing a product idea. It’ll be iterative, so this is the first part of it. We will then come back based on feedback and we’ll revise the idea, and we’ll go back and test it. We’ll maybe get some wireframes, etc. So, UserTesting has quite a lot of impact because it’s changed how that product looks and where we’re going with it. So, I think the market that we’re looking at is probably a little bit different than [the market we looked at] initially.”

  • The vice president of digital experience and product in the consumer packaged goods industry described how UserTesting informed digital recovery and reordering features, leading to a 14% increase in retention among delivery customers who could easily place additional orders for items they had previously purchased. The digital recovery initiative focused on real-time issue resolution, such as providing accurate delivery time estimates and enabling immediate remedies for service failures. The interviewee noted that these improvements not only reduced customer service call volume but also increased satisfaction: “We’re solving problems with a more accurate representation of when your order will be delivered, and we’ve introduced functionality to help you get a remedy from us in real time.”

  • Interviewees also demonstrated the value of customer-centric design in driving product relevance and satisfaction. The director of design operations said the team at their software organization conducted 1,000 to 1,200 studies annually, with at least 10% to 20% of those studies influencing product direction, which helped ensure that new product features aligned with user needs and expectations.

  • The senior vice president of technology governance in the finance sector said their organization experienced a 10% reduction in customer churn after implementing usability improvements informed by UserTesting. This reduction in attrition preserved revenue from retained customers and reflected a broader shift toward more responsive, user-informed product development.

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • Before UserTesting, the composite experiences an 80% retention rate on revenue from products and experiences.

  • With UserTesting, the composite increases its customer satisfaction and therefore increases its retention rate by 3.6% in Year 1; 7.2% in Year 2; and 10.8% in Year 3.

Risks. The impact of this benefit will vary among organizations based on the following factors:

  • An organization’s existing retention rate, operating profit margin, and the level of testing it conducts before UserTesting.

  • An organization’s adoption of UserTesting throughout its teams and its strategy for testing the components of its digital experiences and applying insights.

Results. To account for these risks, Forrester adjusted this benefit downward by 20%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $2.5 million.

10.8%

Increased retention for product and service improvements attributable to UserTesting

“We have seen a delivery satisfaction increase — an increase in the perception of orders being on time. What we do [to rectify] when customer satisfaction is at risk is heavily informed by UserTesting and contributes to our 14% increase in retention rate.”

Vice president of digital experience and product, consumer packaged goods

Increased Customer Retention
Ref. Metric Source Year 1 Year 2 Year 3
B1 Average revenue from products or services addressed with UserTesting A4 $225,000,000 $225,000,000 $225,000,000
B2 Retention rate before UserTesting Composite 80% 80% 80%
B3 Increased retention attributable to UserTesting for product and service improvements Interviews 3.6% 7.2% 10.8%
B4 Revenue from increased customer satisfaction with UserTesting B1*B2*B3 $6,480,000 $12,960,000 $19,440,000
B5 Operating profit margin Composite 10% 10% 10%
Bt Increased customer retention B4*B5 $648,000 $1,296,000 $1,944,000
  Risk adjustment 20%      
Btr Increased customer retention (risk-adjusted)   $518,400 $1,036,800 $1,555,200
Three-year total: $3,110,400 Three-year present value: $2,496,577
Avoided Developer Rework Due To Prelaunch Testing

Evidence and data. Interviewees reported that by validating concepts and designs with UserTesting before products were developed, teams prevented costly iterations, reduced wasted effort, and ensured that development resources were used efficiently. UserTesting increased speed to insights and provided the opportunity to test ideas prelaunch, saving development costs. Integrating UserTesting into the early stages of product development not only improved the user experience but also protected valuable engineering time and budget for the interviewees’ organizations. By identifying issues before they reached production, teams avoided the high cost of rework and delivered improved products more quickly.

  • The vice president of digital experience and product in the consumer packaged goods industry described that by using UserTesting to validate usability before launching features, their team reduced the number of A/B test iterations from three to 1.5 on average. This not only saved development time but also ensured that scarce engineering resources were focused on high-impact work. The interviewee noted: “We eliminate a whole bunch of wasted cycles by doing the testing upfront. We’re still going to A/B testing … but it cuts down on us wasting effort and capacity.”

  • The senior vice president of technology governance in finance noted that acceleration in research cycles not only improved time to insight but also reduced the likelihood of costly rework. The interviewee estimated that their organization achieved a 2% to 3% cost savings specifically from avoided developer rework, a meaningful figure given the scale of their operations.

  • The director of design operations in the software industry described how UserTesting helped prevent costly errors. One example involved a feature that, if released without testing, would have caused users to drop off midtask, triggering support tickets and requiring emergency fixes. The team estimated that such rework could cost hundreds of thousands of dollars in labor alone and potentially could have resulted in frustrated customers and lost business. By validating usability early, they avoided these expenses and improved the overall user experience. The interviewee said, “We are saving ourselves from sending things out [that would have failed in market],” and also estimated that each development cycle saved could represent approximately $10,000 in labor costs, not including the opportunity cost of delaying more impactful work. With a lean team of under 15 developers supporting digital initiatives, the ability to focus on building the right features the first time was critical. This interviewee noted, “We have very scarce development resources, so we need to be really, really confident that it is the right thing.”

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • The composite organization conducts 240 research projects in Year 1, 480 in Year 2, and 720 in Year 3 (or one project a month per researcher).

  • Twenty-five percent of research projects directly influence product development with four sprints per project.

  • With UserTesting, the composite organization experiences a 25% reduction in iteration cycles.

  • The average cost to iterate per cycle is $10,000.

Risks. The impact of this benefit will vary among organizations based on the following factors:

  • The number of developers supporting researchers at the organization.

  • The organization’s approach to research and testing within the product development cycle.

  • The average cost to iterate per cycle for the organization.

  • The average research project to iteration ratio at the organization.

Results. To account for these risks, Forrester adjusted this benefit downward by 15%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $2.5 million.

25%

Iteration cycle reduction with UserTesting

“We would receive study results that changed our initial product direction 10% to 20% of the time.”

Director of design operations, software

Avoided Developer Rework Due To Prelaunch Testing
Ref. Metric Source Year 1 Year 2 Year 3
C1 Research projects D1*12 240 480 720
C2 Research projects directly influencing product development Composite 25% 25% 25%
C3 Sprints per project Composite 4 4 4
C4 Sprints expected without UserTesting C1*C2*C3 240 480 720
C5 Iteration cycle reduction with UserTesting Interviews 25% 25% 25%
C6 Avoided sprints with UserTesting C4*C5 60 120 180
C7 Average cost to iterate per cycle Composite $10,000 $10,000 $10,000
Ct Avoided developer rework due to prelaunch testing C6*C7 $600,000 $1,200,000 $1,800,000
  Risk adjustment 15%      
Ctr Avoided developer rework due to prelaunch testing (risk-adjusted)   $510,000 $1,020,000 $1,530,000
Three-year total: $3,060,000 Three-year present value: $2,456,123
Improved Researcher And Designer Productivity

Evidence and data. Interviewees reported that UserTesting enhanced researcher and designer productivity by providing templates, streamlining workflows, increasing testing throughput, and enabling more efficient collaboration. By reducing the time and effort required to plan, execute, and analyze user research, interviewees’ teams were able to focus more on high-value activities and scale their impact without proportionally increasing headcount.

  • The vice president of digital experience and product in the consumer packaged goods industry described how their team transitioned from conducting one or two tests per quarter — often involving in-person interviews at cafés — to running multiple tests per week after adopting UserTesting. Despite a reduction in team size from nine to six UX professionals, the testing throughput increased dramatically. The interviewee said, “Our testing throughput is in orders of magnitude higher, even with a smaller team than we had before.” This shift allowed the team to validate ideas more frequently and with greater confidence without overburdening researchers or designers.

  • This interviewee also noted that the platform helped reduce the number of design and research cycles needed to reach a viable solution. Before UserTesting, the interviewee’s team typically required three iterations to refine a concept; with UserTesting, that number dropped to one and a half. This not only saved time but also reduced the cognitive and logistical load on researchers and designers. The vice president of digital experience and product said: “We’re launching one test and getting the result that we need. … We lose a whole bunch of wasted cycles by doing the testing upfront.”

  • The senior manager of UX research and content design in the retail industry described how UserTesting enabled their small team (three researchers and four content designers) to support nine to 12 product teams, which was previously unmanageable.

  • The senior vice president of technology governance in finance shared that prior to adopting UserTesting, research cycles took two to three months and required coordination across five to 10 team members, often in partnership with external research institutions. With UserTesting, the research cycle was reduced to about one month, allowing teams to move faster and with more autonomy.

  • Interviewees described that the ability to quickly gather user feedback helped focus internal conversations and reduce the time spent debating design decisions. With even a small amount of user data, teams were able to align more quickly and avoid unproductive discussions. These improvements in productivity not only enabled teams to do more with less but also fostered a culture of evidence-based decision-making, where researchers and designers could confidently advocate for user needs while accelerating the pace of innovation.

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • The composite organization has 20 researchers leveraging UserTesting in Year 1, 40 in Year 2, and 60 in Year 3.

  • Researchers save 50% of time with UserTesting.

  • The average fully burdened annual salary for researchers is $105,000.

  • The composite organization has 10 designers supporting researchers in Year 1, 20 in Year 2, and 30 in Year 3.

  • Designers save 10% of time due to improved design decisions with access to data, which prevents errors.

  • The average fully burdened annual salary for designers is $130,000.

Risks. The impact of this benefit will vary among organizations based on the following factors:

  • The growth of the organization and its research projects.

  • The organization’s approach to prioritizing and supporting research projects.

  • The extent of adoption of UserTesting throughout the organization.

  • The average annual burdened salaries of researchers and designers.

Results. To account for these risks, Forrester adjusted this benefit downward by 15%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $2.4 million.

50%

Researcher time saved with UserTesting

10%

Designer time saved with UserTesting

“Before UserTesting, there was no user research. It was not part of the process. We just didn’t have the partners available to perform usability studies. That started creeping into product quality and that’s when it became much more necessary to bring that capability to our designers, to get insights and interweave them. We went from not having studies to scaling to 1,200 studies a year.”

Director of design operations, software

Improved Researcher And Designer Productivity
Ref. Metric Source Year 1 Year 2 Year 3
D1 Full-time researchers Composite 20 40 60
D2 Fully burdened annual salary for a researcher Composite $105,000 $105,000 $105,000
D3 Percentage of time saved with UserTesting Interviews 50% 50% 50%
D4 Productivity recapture rate TEI methodology 50% 50% 50%
D5 Subtotal: Researcher time savings D1*D2*D3*D4 $525,000 $1,050,000 $1,575,000
D6 Full-time designers Composite 10 20 30
D7 Fully burdened annual salary for a designer Composite $130,000 $130,000 $130,000
D8 Percentage of time saved with UserTesting due to improved design decisions Interviews 10% 10% 10%
D9 Subtotal: Designer time savings D6*D7*D8*D4 $65,000 $130,000 $195,000
Dt Improved researcher and designer productivity D5+D9 $590,000 $1,180,000 $1,770,000
  Risk adjustment 15%      
Dtr Improved researcher and designer productivity (risk-adjusted)   $501,500 $1,003,000 $1,504,500
Three-year total: $3,009,000 Three-year present value: $2,415,188
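The sketch below restates the productivity calculation above, including the 50% productivity recapture rate that the TEI methodology applies to both subtotals (the assumption that only a portion of recovered time is converted into additional productive work). Figures are the composite’s; variable names are illustrative.

    # Improved researcher and designer productivity for the composite (see table above).
    researchers = [20, 40, 60]         # D1: Years 1-3
    researcher_salary = 105_000        # D2: fully burdened annual salary
    researcher_time_saved = 0.50       # D3
    recapture_rate = 0.50              # D4: TEI productivity recapture rate
    designers = [10, 20, 30]           # D6
    designer_salary = 130_000          # D7
    designer_time_saved = 0.10         # D8
    risk_adjustment = 0.15             # downward adjustment applied by Forrester

    for i in range(3):
        researcher_savings = researchers[i] * researcher_salary * researcher_time_saved * recapture_rate  # D5
        designer_savings = designers[i] * designer_salary * designer_time_saved * recapture_rate          # D9
        total = researcher_savings + designer_savings                                                     # Dt
        print(f"Year {i + 1}: ${total:,.0f} unadjusted, ${total * (1 - risk_adjustment):,.0f} risk-adjusted")
    # Year 1: $590,000 -> $501,500; Year 2: $1,180,000 -> $1,003,000; Year 3: $1,770,000 -> $1,504,500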
Unquantified Benefits

Interviewees mentioned the following additional benefits that their organizations experienced but were not able to quantify:

  • Improved research and panel quality fostered trust in insights and internal buy-in across teams. Interviewees emphasized that UserTesting allowed teams to launch with greater confidence, reducing reliance on intuition and enabling more data-informed decisions. They noted that even without statistical significance, having directional feedback helped cut through internal debates, enabling more objective conversations and reducing internal bias in product decisions. The vice president of digital experience and product estimated that UserTesting reduced subjective debates by 50%, especially when presenting to nondigital stakeholders.
    The same interviewee in the consumer packaged goods industry noted that prior to UserTesting, research was inconsistent and often based on informal café interviews that lacked statistical rigor. With UserTesting, their team could reliably gather actionable insights at scale, which helped them build confidence in their decisions. The interviewee stated, “Whenever we launch a test, we know that we’re going to get actionable insight in a much more repeatable and efficient way.” Similarly, the senior vice president of technology governance in the finance sector explained that broader sample sizes and faster turnaround enabled their team to remove ambiguity and gain higher confidence in design and product decisions.
    The same interviewee added that even a small amount of data helped minimize unproductive conversations and focus discussions on user needs: “If we have even a little bit of data or signal, it is much less likely that you are going to get random feedback from stakeholders.”

  • Studies at scale broadened geographic reach and inclusivity. UserTesting expanded the geographic reach of research at the interviewees’ organizations, allowing their teams to gather insights from a more diverse and representative set of users. The senior vice president of technology governance in the finance sector explained that prior to UserTesting, research was limited to two or three markets due to logistical constraints. With the UserTesting platform, they were able to double that reach to four to six markets, enabling broader and more inclusive feedback. This expansion helped reduce regional bias and ensured that product decisions reflected the needs of a wider customer base. The senior manager of UX research and content design in the retail industry explained that prior to adopting UserTesting, their team conducted most of their research in local stores near their headquarters. This created a strong bias in their insights, as the region is deeply embedded in the company’s brand culture. The interviewee noted, “We were loving our stores a little too much,” adding that one store even grew tired of frequent research visits. With UserTesting, the company expanded its reach beyond local markets and engaged customers nationwide without the logistical burden of travel. This shift not only reduced operational strain but also helped counter internal biases, enabling teams to make decisions based on a broader and more inclusive understanding of their customer base.

  • Availability of user data unlocked a cultural shift toward customer-centricity. The interviewees noted their organizations’ adoption of UserTesting contributed to a cultural shift toward customer-centricity by embedding user feedback into daily workflows. The vice president of digital experience and product in the consumer packaged goods industry stated that their team now rarely makes decisions without grounding them in user data. They said, “We generally don’t open our mouths without at least some sort of footing in user data,” highlighting how the platform helped reduce reliance on intuition and elevate the voice of the customer in strategic planning.

  • Strategic enablement unlocked innovation. UserTesting enabled strategic initiatives at the interviewees’ organizations by providing timely insights that informed product development and customer engagement strategies; interviewees noted that UserTesting helped identify and shape entirely new initiatives, such as digital recovery and fulfillment messaging, which may not have been prioritized otherwise. The vice president of digital experience and product in the consumer packaged goods industry said their digital team used UserTesting to shape key features such as reorder flows. The senior vice president of technology governance in the finance industry said research insights supported efforts to increase product utilization and online banking acquisition, aligning with broader business goals and serving as a strategic enabler, not just a tactical tool.

  • Mitigated compliance and reputation risks. The senior vice president of technology governance explained that their finance organization had previously faced reputational challenges due to customer experiences that were not always aligned with customer consent, and the company chose UserTesting to help address the reputational risks around securing user consent. By using UserTesting to validate user flows and product messaging, the company was able to better align with customer needs and reduce the risk of negative press. The interviewee emphasized, “We’ve always tried to make sure we present ourselves as being mindful of our customers,” noting that the platform helped reinforce that commitment.

“Our customers telling us about our products was immediately impactful for our merchandising team. It created new processes, communications with our vendors, and accountability across products we were selling and stocking. ... It positioned our merchandising team to return to vendors with data and help improve or return products. [It made] major differences for us, especially in making future decisions. Our merchandising team was able to look at certain vendor relationships as a whole. ... There’s definitely been some major impacts on that end.”

UX researcher, e-commerce

Flexibility

The value of flexibility is unique to each customer. There are multiple scenarios in which a customer might implement UserTesting and later realize additional uses and business opportunities, including:

  • Variety in research methods. Interviewees mentioned that UserTesting unlocked previously unavailable research methods, enabling teams to conduct a wide range of studies with minimal overhead and allowing teams to adapt their research approach to different business needs and timelines. Interviewees highlighted the ability to use various methodologies (e.g., unmoderated tests, surveys, card sorting) and the flexibility to scale research across teams with different needs.

  • Democratization and autonomy in research — even in a centralized platform. Interviewees found UserTesting enabled greater autonomy and democratization of research across large organizations even while offering a centralized solution. The senior manager of UX research and content design in retail explained that their team had evolved from a centralized model to one where more individuals across product teams were empowered: UserTesting equipped designers and product managers to conduct their own research, reducing bottlenecks and increasing research coverage without expanding the centralized research team. The interviewee noted that product managers and designers were increasingly using the platform independently. The senior vice president of technology governance in the finance sector described how previously centralized research efforts with alternative solutions required extensive coordination and offered limited visibility. With UserTesting, individual business units gained the ability to conduct their own studies independently, increasing agility and responsiveness to specific customer needs. This empowered more teams to engage directly with customer feedback and make informed decisions.

Flexibility would also be quantified when evaluated as part of a specific project (described in more detail in Total Economic Impact Approach).

“With UserTesting, we have gone from fighting to make the case to gather customer insights to inform decisions to an understanding that this is part of our process; we want to put things in front of customers before we make a decision.”

Senior manager of UX research and content design, retail

Analysis Of Costs

Quantified cost data as applied to the composite
Total Costs
Ref. Cost Initial Year 1 Year 2 Year 3 Total Present Value
Etr UserTesting subscription $0 $330,000 $660,000 $990,000 $1,980,000 $1,589,256
Ftr Ongoing administration costs $1,150 $27,600 $55,200 $82,800 $166,750 $134,070
Gtr Training costs $30,912 $17,002 $34,003 $51,005 $132,922 $112,790
  Total costs (risk-adjusted) $32,062 $374,602 $749,203 $1,123,805 $2,279,672 $1,836,116
UserTesting Subscription

Evidence and data. The interviewees’ organizations paid licensing costs to UserTesting for platform access. Pricing may vary. Contact UserTesting for additional details.

Modeling and assumptions. Based on the interviews, Forrester assumes that as the composite organization increases its licenses, it pays $300,000 in Year 1, $600,000 in Year 2, and $900,000 in Year 3.

Risks. The impact of this cost will vary among organizations based on the following factors:

  • The number of licenses the organization uses.

  • The volume of consumption of the platform.

  • Changes and shifts in UserTesting’s pricing strategy.

Results. To account for these risks, Forrester adjusted this cost upward by 10%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $1.6 million.

UserTesting Subscription
Ref. Metric Source Initial Year 1 Year 2 Year 3
E1 Total license fees Composite   $300,000 $600,000 $900,000
Et UserTesting subscription E1 $0 $300,000 $600,000 $900,000
  Risk adjustment ↑10%        
Etr UserTesting subscription (risk-adjusted)   $0 $330,000 $660,000 $990,000
Three-year total: $1,980,000 Three-year present value: $1,589,256
Ongoing Administration Costs

Evidence and data. Interviewees reported that as a SaaS platform, UserTesting required little ongoing administration. Some costs for ongoing administration include interdepartmental requests for access or projects, contract renewals, and discussions with UserTesting.

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • The composite’s researchers spend 20 hours in total initially, then 24 hours per researcher annually, on ongoing administration.

  • The fully burdened hourly rate for researchers is $50.

Risks. The impact of this cost will vary among organizations based on the following factors:

  • The extent of adoption of the UserTesting platform throughout the organization.

  • The fully burdened hourly rate for a researcher.

Results. To account for these risks, Forrester adjusted this cost upward by 15%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $134,000.

Ongoing Administration Costs
Ref. Metric Source Initial Year 1 Year 2 Year 3
F1 Researcher operation hours Initial: 20 hours; Y1, Y2, and Y3: D1*24 20 480 960 1,440
F2 Fully burdened hourly rate for a researcher Composite $50 $50 $50 $50
Ft Ongoing administration costs F1*F2 $1,000 $24,000 $48,000 $72,000
  Risk adjustment 15%        
Ftr Ongoing administration costs (risk-adjusted)   $1,150 $27,600 $55,200 $82,800
Three-year total: $166,750 Three-year present value: $134,070
Training Costs

Evidence and data. Interviewees spent a minimal number of hours training researchers and other users to navigate the platform and gain familiarity with new capabilities. Overall, interviewees described the platform as easy to use and intuitive, requiring little formal training.

Modeling and assumptions. Based on the interviews, Forrester assumes the following about the composite organization:

  • Thirty researchers and designers use the platform in the initial period and in Year 1. This expands to 60 users in Year 2 and 90 in Year 3.

  • The churn rate at the organization is 10%.

  • The fully burdened hourly rate for users is $56.

Risks. The impact of this cost will vary among organizations based on the following factors:

  • The extent of adoption of the UserTesting platform throughout the organization.

  • The fully burdened hourly rate for users of the platform.

Results. To account for these risks, Forrester adjusted this cost upward by 15%, yielding a three-year, risk-adjusted total PV (discounted at 10%) of $113,000.

Training Costs
Ref. Metric Source Initial Year 1 Year 2 Year 3
G1 Researchers and designers Composite 30 30 60 90
G2 Churn rate Composite 0% 10% 10% 10%
G3 Hours spent on training or discovery Interviews 16 8 8 8
G4 Fully burdened hourly rate for a platform user Composite $56 $56 $56 $56
Gt Training costs (G1*G2+G1)*G3*G4 $26,880 $14,784 $29,568 $44,352
  Risk adjustment 15%        
Gtr Training costs (risk-adjusted)   $30,912 $17,002 $34,003 $51,005
Three-year total: $132,922 Three-year present value: $112,790
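The training cost formula in the table above, (G1*G2+G1)*G3*G4, budgets training hours for the existing user base plus the replacement users implied by the 10% churn rate. A brief restatement in runnable form (figures from the table; variable names are illustrative):

    # Training costs for the composite (see table above).
    periods = ["Initial", "Year 1", "Year 2", "Year 3"]
    users = [30, 30, 60, 90]           # G1: researchers and designers on the platform
    churn = [0.0, 0.10, 0.10, 0.10]    # G2: share of users replaced and retrained
    training_hours = [16, 8, 8, 8]     # G3: hours spent on training or discovery
    hourly_rate = 56                   # G4: fully burdened hourly rate for a platform user
    risk_adjustment = 0.15             # upward adjustment applied by Forrester

    for period, u, c, h in zip(periods, users, churn, training_hours):
        cost = (u * c + u) * h * hourly_rate                  # Gt
        print(f"{period}: ${cost:,.0f} unadjusted, ${cost * (1 + risk_adjustment):,.0f} risk-adjusted")
    # Initial: $26,880 -> $30,912; Year 1: $14,784 -> $17,002; Year 2: $29,568 -> $34,003; Year 3: $44,352 -> $51,005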

Financial Summary

Consolidated Three-Year, Risk-Adjusted Metrics

Cash Flow Chart (Risk-Adjusted)

[Chart of risk-adjusted cash flows: total costs, total benefits, and cumulative net benefits from the initial period through Year 3]
Cash Flow Analysis (Risk-Adjusted)
  Initial Year 1 Year 2 Year 3 Total Present Value
Total costs ($32,062) ($374,602) ($749,203) ($1,123,805) ($2,279,672) ($1,836,116)
Total benefits $0 $1,961,900 $3,923,800 $5,885,700 $11,771,400 $9,448,369
Net benefits ($32,062) $1,587,298 $3,174,597 $4,761,895 $9,491,728 $7,612,253
ROI           415%
Payback           <6 months

 Please Note

The financial results calculated in the Benefits and Costs sections can be used to determine the ROI, NPV, and payback period for the composite organization’s investment. Forrester assumes a yearly discount rate of 10% for this analysis.

These risk-adjusted ROI, NPV, and payback period values are determined by applying risk-adjustment factors to the unadjusted results in each Benefit and Cost section.

The initial investment column contains costs incurred at “time 0” or at the beginning of Year 1 that are not discounted. All other cash flows are discounted using the discount rate at the end of the year. PV calculations are calculated for each total cost and benefit estimate. NPV calculations in the summary tables are the sum of the initial investment and the discounted cash flows in each year. Sums and present value calculations of the Total Benefits, Total Costs, and Cash Flow tables may not exactly add up, as some rounding may occur.
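
Applying that approach to the totals in the Cash Flow Analysis table yields the headline metrics. The sketch below is an arithmetic check using the study’s published present values, not an independent model.

    # Headline metrics from the risk-adjusted present values above.
    costs_pv = 1_836_116        # total costs, present value
    benefits_pv = 9_448_369     # total benefits, present value

    npv = benefits_pv - costs_pv
    roi = npv / costs_pv
    print(f"NPV: ${npv:,.0f}")   # $7,612,253, i.e., roughly $7.6 million
    print(f"ROI: {roi:.0%}")     # 415%
    # The $32,062 initial outlay is small relative to Year 1 net benefits of $1,587,298,
    # so breakeven falls early in Year 1, consistent with the reported payback of <6 months.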

From the information provided in the interviews, Forrester constructed a Total Economic Impact™ framework for those organizations considering an investment in UserTesting.

The objective of the framework is to identify the cost, benefit, flexibility, and risk factors that affect the investment decision. Forrester took a multistep approach to evaluate the impact that UserTesting can have on an organization.

Due Diligence

Interviewed UserTesting stakeholders and Forrester analysts to gather data relative to UserTesting.

Interviews

Interviewed six decision-makers at organizations using UserTesting to obtain data about costs, benefits, and risks.

Composite Organization

Designed a composite organization based on characteristics of the interviewees’ organizations.

Financial Model Framework

Constructed a financial model representative of the interviews using the TEI methodology and risk-adjusted the financial model based on issues and concerns of the interviewees.

Case Study

Employed four fundamental elements of TEI in modeling the investment impact: benefits, costs, flexibility, and risks. Given the increasing sophistication of ROI analyses related to IT investments, Forrester’s TEI methodology provides a complete picture of the total economic impact of purchase decisions. Please see Appendix A for additional information on the TEI methodology.

Total Economic Impact Approach
Benefits

Benefits represent the value the solution delivers to the business. The TEI methodology places equal weight on the measure of benefits and costs, allowing for a full examination of the solution’s effect on the entire organization.

Costs

Costs comprise all expenses necessary to deliver the proposed value, or benefits, of the solution. The methodology captures implementation and ongoing costs associated with the solution.

Flexibility

Flexibility represents the strategic value that can be obtained for some future additional investment building on top of the initial investment already made. The ability to capture that benefit has a PV that can be estimated.

Risks

Risks measure the uncertainty of benefit and cost estimates given: 1) the likelihood that estimates will meet original projections and 2) the likelihood that estimates will be tracked over time. TEI risk factors are based on “triangular distribution.”
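
For orientation only: a triangular distribution is defined by low, most likely, and high estimates a ≤ m ≤ b, and its mean is their simple average. This is general statistics rather than a statement of Forrester’s exact risk-adjustment procedure.

    E[X] = \frac{a + m + b}{3}, \qquad a \le m \le b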

Financial Terminology
Present value (PV)

The present or current value of (discounted) cost and benefit estimates given at an interest rate (the discount rate). The PV of costs and benefits feed into the total NPV of cash flows.

Net present value (NPV)

The present or current value of (discounted) future net cash flows given an interest rate (the discount rate). A positive project NPV normally indicates that the investment should be made unless other projects have higher NPVs.

Return on investment (ROI)

A project’s expected return in percentage terms. ROI is calculated by dividing net benefits (benefits less costs) by costs.

Discount rate

The interest rate used in cash flow analysis to take into account the time value of money. Organizations typically use discount rates between 8% and 16%.

Payback

The breakeven point for an investment. This is the point in time at which net benefits (benefits minus costs) equal initial investment or cost.
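
Expressed as formulas consistent with the definitions above, with CF_t denoting the net cash flow at the end of year t, r the discount rate (10% in this study), C_0 the initial investment, and N the three-year analysis horizon:

    \mathrm{PV} = \sum_{t=1}^{N} \frac{CF_t}{(1+r)^{t}}, \qquad
    \mathrm{NPV} = -C_0 + \sum_{t=1}^{N} \frac{CF_t}{(1+r)^{t}}, \qquad
    \mathrm{ROI} = \frac{\text{Benefits PV} - \text{Costs PV}}{\text{Costs PV}}

    \text{Payback} = \min \left\{ T : \sum_{t \le T} \big(\text{Benefits}_t - \text{Costs}_t\big) \ge C_0 \right\}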

Appendix A

Total Economic Impact

Total Economic Impact is a methodology developed by Forrester Research that enhances a company’s technology decision-making processes and assists solution providers in communicating their value proposition to clients. The TEI methodology helps companies demonstrate, justify, and realize the tangible value of business and technology initiatives to both senior management and other key stakeholders.

Appendix B

Endnotes

1 Total Economic Impact is a methodology developed by Forrester Research that enhances a company’s technology decision-making processes and assists solution providers in communicating their value proposition to clients. The TEI methodology helps companies demonstrate, justify, and realize the tangible value of business and technology initiatives to both senior management and other key stakeholders.

Disclosures

Readers should be aware of the following:

This study is commissioned by UserTesting and delivered by Forrester Consulting. It is not meant to be used as a competitive analysis.

Forrester makes no assumptions as to the potential ROI that other organizations will receive. Forrester strongly advises that readers use their own estimates within the framework provided in the study to determine the appropriateness of an investment in UserTesting. For any interactive functionality, the intent is for the questions to solicit inputs specific to a prospect's business. Forrester believes that this analysis is representative of what companies may achieve with UserTesting based on the inputs provided and any assumptions made. Forrester does not endorse UserTesting or its offerings. Although great care has been taken to ensure the accuracy and completeness of this model, UserTesting and Forrester Research are unable to accept any legal responsibility for any actions taken on the basis of the information contained herein. The interactive tool is provided ‘AS IS,’ and Forrester and UserTesting make no warranties of any kind.

UserTesting reviewed and provided feedback to Forrester, but Forrester maintains editorial control over the study and its findings and does not accept changes to the study that contradict Forrester’s findings or obscure the meaning of the study.

UserTesting provided the customer names for the interviews but did not participate in the interviews.

Consulting Team:

Nahida Nisa

Published

August 2025