10 votes

On surveys

9 comments

  1. skybrian
    Link
    From the blog post:

    Surveys are the most dangerous research tool — misunderstood and misused. They frequently straddle the qualitative and quantitative, and at their worst represent the worst of both.

    In tort law the attractive nuisance doctrine refers to a hazardous object likely to attract those who are unable to appreciate the risk posed by the object. In the world of design research, surveys can be just such a nuisance.

    ...

    A bad survey won’t tell you it’s bad. It’s actually really hard to find out that a bad survey is bad — or to tell whether you have written a good or bad set of questions. Bad code will have bugs. A bad interface design will fail a usability test. It’s possible to tell whether you are having a bad user interview right away. Feedback from a bad survey can only come in the form of a second source of information contradicting your analysis of the survey results.

    Most seductively, surveys yield responses that are easy to count and counting things feels so certain and objective and truthful.

    Even if you are counting lies.

    And once a statistic gets out — such as “75% of users surveyed said that they love videos that autoplay on page load” — that simple “fact” will burrow into the brains of decision-makers and set up shop.

    ...

    When you are choosing research methods, and are considering surveys, there is one key question you need to answer for yourself:

    Will the people I’m surveying be willing and able to provide a truthful answer to my question?

    And as I say again and again, and will never tire of repeating, never ask people what they like or don’t like. Liking is a reported mental state and that doesn’t necessarily correspond to any behavior.

    Avoid asking people to remember anything further back than a few days. I mean, we’ve all been listening to Serial, right? People are lazy forgetful creatures of habit. If you ask about something that happened too far back in time, you are going to get a low quality answer.

    And especially, never ask people to make a prediction of their own future behavior. They will make that prediction based on wishful thinking or social desirability. And this, I think, is the most popular survey question of all:

    How likely are you to purchase the thing I am selling in the next 6 months?

    No one can answer that. At best you could get 1) Possibly or 2) Not at all.

    Previously: Survey Chicken

    12 votes
  2. [4]
    patience_limited
    (edited )
    Link
    As it happens, I just this morning deleted yet another "customer satisfaction" survey e-mail from a hotel that I stayed at last week. I've been receiving (and deleting) them daily for the past 5 days, and was about to hit "unsubscribe". Customer surveys like this just feel like a solicitation for unpaid labor. Then a part of me hesitated - what if I really want to complain?

    And then it struck me, yet again, that like all rating systems, you'll get a bimodal distribution of results - people exceptionally pleased with the service or item, and people extravagantly unhappy. I suppose there's utility to a business in understanding outlier results. But it mainly feels like a performative exercise, as the author of this article suggests.
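    To illustrate (a toy simulation, not data from any real survey): when ratings pile up at the extremes, the summary statistic lands on a score almost nobody actually gave.

    ```python
    import random
    import statistics

    random.seed(0)

    # Toy 1-10 ratings: respondents are mostly either delighted (9-10)
    # or furious (1-2), with nobody in the middle -- the bimodal pattern.
    ratings = random.choices(population=[1, 2, 9, 10],
                             weights=[0.2, 0.1, 0.3, 0.4],
                             k=1000)

    mean = statistics.mean(ratings)
    print(f"mean rating: {mean:.1f}")  # lands near 7
    print("ratings of 6 or 7:", ratings.count(6) + ratings.count(7))  # 0
    ```

    A "7.1 average" on that dashboard describes exactly zero customers.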

    Since I'm not an insider to this process, I'm curious whether survey response rate is measured, how the data is used, and so on. How do businesses decide what's actionable (is it just costs, or is there a measure of "satisfaction" per dollar spent)? If I've complained about crappy pillows on every hotel stay and that doesn't change, why would I bother filling out surveys - and does that impact the validity of results from increasing the number who didn't respond?

    Do I praise the exceptionally helpful and friendly front desk staff and the cleanliness of the facilities, or curse the dreadful climate control system in the room? Do I say I'd stay again as a "7" on a scale of 1 to 10, when I don't have much choice in allowable travel accommodations? It's all nonsense: synthesizing something that can be reduced to a KPI and safely ignored until a competitor eats the business.

    5 votes
    1. skybrian
      Link Parent
      On the other hand, I have occasionally had the customer service rep imply that the survey does matter and furthermore, anything less than a 10 is bad for them.

      I wish they just had a “smooth transaction” button that ends the survey. If you say everything went smoothly and as expected then that’s all they need to know.
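      The "less than a 10 is bad for them" pressure is consistent with Net Promoter Score bucketing (my assumption — the comment doesn't name the survey), where 9–10 counts as a promoter, 7–8 as a passive that counts for nothing, and 0–6 as a detractor that counts against:

      ```python
      def nps(scores):
          """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
          promoters = sum(1 for s in scores if s >= 9)
          detractors = sum(1 for s in scores if s <= 6)
          return 100 * (promoters - detractors) / len(scores)

      # Ten honestly-satisfied customers who all answer 8 produce the same
      # score as a base split between evangelists and the enraged.
      print(nps([8] * 10))            # 0.0
      print(nps([10] * 5 + [1] * 5))  # 0.0
      ```

      Under that scoring, a rep has every incentive to coach customers toward a 10.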

      7 votes
    2. [2]
      public
      Link Parent
      When I get e-mail survey reminders, I like to mark them as spam in hopes it hurts future deliverability across all Gmail users. Play stupid games, win stupid prizes. It'd be doubly funny if it impacted order confirmations as well as the marketing spam.

      3 votes
      1. patience_limited
        Link Parent
        I'd love to do that, but spam-labeling sometimes resulted in things like confirmations and cancellations getting binned and missed. That's part of the enshittification process - you can't engage in a legitimate business transaction without having extra time or data stolen, lest your services be further impaired.

        4 votes
  3. [3]
    first-must-burn
    Link
    I took a research methods class in grad school, and we developed some surveys. It constantly amazes me how difficult communicating in writing is. People make different assumptions, read tone differently, and misunderstand.

    I think the only hope for a good survey is to do a pilot study first with a group where you can seek feedback on the survey itself, then refine. The pilot also lets you run preliminary data to see if you are even asking the right questions. Even then, it probably won't be perfect.

    3 votes
    1. [2]
      skybrian
      Link Parent
      I've long thought that the answers from an interviewer asking 'so why did you answer the question this way' would be more interesting than the answers themselves.

      1 vote
      1. first-must-burn
        Link Parent
        Exactly! But getting people to rate their feelings about potatoes on a scale of 1-5 is already a challenge. Getting them to introspect on the meta-question and provide a detailed response requires a high level of commitment or a certain type of person, both of which tend to bias the responses. I suppose in theory there is an amount of money you could pay people to invest not only the time but the mental energy, but I think that number is quite high, and since different people's thresholds differ, that still introduces bias.

        Writing this reminded me that a fascinating look at the challenges of survey results is the XKCD color survey writeup.

        The whole read is amazing, as so much of Randall Munroe's work is, but (as a cis man) here is my favorite part:

        Here are the color names most disproportionately popular among men:

        • Penis
        • Gay
        • WTF
        • Dunno
        • Baige

        I … that’s not my typo in #5—the only actual color in the list really is a misspelling of “beige”.  And keep in mind, this is based on the number of unique people who answered the color, not the number of times they typed it.  This isn’t just the effect of a couple spammers. In fact, this is after the spamfilter.

        Also, he comments in his summary that, "nobody can spell fuchsia". Back in grad school, there was a set of shared machines that were named for colors. The basic color machines like black and red were always heavily loaded, but I learned to spell fuchsia, mauve, and beige because those machines were almost never used.

        4 votes