
Examining Potentialities

Research questions matter enormously in user research. In a recent video, William Hudson walks through good and bad examples of research questions.

Testing Strategies and Procedures

In digital design, the book "Designing with Data: Improving the User Experience with A/B Testing", written by Rochelle King, Elizabeth Churchill, and Caitlin Tan (2016), has become a valuable resource for teams seeking to improve user experiences through A/B testing.

A/B testing is a research method that measures changes in user behavior between design variants, particularly on websites and in applications. The book stresses the importance of crafting good research questions for A/B testing: they should be clear, specific, measurable, and tied directly to business objectives or key performance indicators (KPIs).
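
In practice, the first step of any A/B test is splitting users between variants randomly but consistently. As a minimal sketch (the function and experiment names here are illustrative assumptions, not taken from the book), a deterministic hash-based assignment might look like this:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into a variant."""
    # Hash the experiment name together with the user id so assignment
    # is stable across sessions but independent across experiments.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same variant of this experiment.
print(assign_variant("user-42", "homepage-tagline-test"))
```

Hashing rather than storing a random draw keeps the assignment reproducible without a lookup table, which is one common way such bucketing is implemented.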

Good examples of such questions include the following (the first is worked through in a short sketch after the list):

  • "We think that changing the homepage tagline to be more explanatory will increase click-throughs to other pages by 8%" - a clear hypothesis focused on a measurable outcome (click-through rate).
  • "We think that an explainer video added to the comparison landing page will increase conversion rate by 8%" - specifies the treatment and a numeric goal tied to conversion rate.
  • "A simpler, single-page checkout process will reduce cart abandonment rates compared to a multi-step checkout" - compares two variants with a clear behavioral metric (cart abandonment).
  • "A personalized call-to-action (CTA) will lead to higher click-through and conversion rates" - tests a specific UI element tied to conversions.
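
Once such an experiment has run, the hypothesis can be checked against the observed counts. As a hedged sketch of how the first tagline hypothesis might be evaluated (the counts and helper function below are hypothetical, not drawn from the book), a one-sided two-proportion z-test compares the click-through rates:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """One-sided z-test: is variant B's click-through rate higher than A's?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - NormalDist().cdf(z)  # one-sided: H1 is p_b > p_a
    return z, p_value

# Hypothetical counts for the control (A) and the new tagline (B).
z, p = two_proportion_z_test(clicks_a=480, n_a=10_000, clicks_b=560, n_b=10_000)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```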

On the other hand, the publication warns against:

  • Vague or unfocused questions.
  • Testing multiple independent variables simultaneously without accounting for interactions.
  • Hypotheses that do not specify a direction or a measurable outcome.
  • Testing pricing differences in a basic A/B setup without considering user switching or incentive effects that can bias results.
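
A directional, numeric hypothesis has a further payoff: it lets you size the experiment before launch, guarding against another common failure, the underpowered test. As a rough sketch (the 5% baseline click-through rate is an assumed figure for illustration), the per-variant sample size needed to detect the 8% lift named above can be approximated with a normal-approximation power calculation:

```python
from statistics import NormalDist

def sample_size_per_variant(p_base, rel_lift, alpha=0.05, power=0.80):
    """Approximate per-variant sample size to detect a relative lift
    in a rate with a one-sided two-proportion test."""
    p_new = p_base * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided test
    z_beta = NormalDist().inv_cdf(power)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return ((z_alpha + z_beta) ** 2 * variance) / (p_new - p_base) ** 2

# Users per variant to detect an 8% relative lift,
# assuming (hypothetically) a 5% baseline click-through rate.
print(round(sample_size_per_variant(p_base=0.05, rel_lift=0.08)))
```

Under these assumptions the formula calls for roughly 38,000 users per variant, which illustrates why small expected lifts demand large samples and why vague hypotheses without a numeric goal are hard to plan for.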

While William Hudson's video surveys good and bad research questions across user research more broadly, the book serves as a guide to improving user experiences through data-driven decision making, emphasizing the importance of clear, quantifiable, and measurable research questions.

In summary, "Designing with Data: Improving the User Experience with A/B Testing" stresses the need for clear, specific, and measurable research questions in A/B testing, tied directly to improving user experiences and meeting business objectives (KPIs). It cautions that vague or unfocused questions can produce biased results and undermine data-driven decisions.
