I have been taking part in Google Rewards for over a year now. For the most part, I complete the various surveys to feed an ongoing habit without feeling like I’m being too indulgent or wasting money. It’s a fast and easy way to make a bit of completely disposable income and, honestly, the service works well.
Broadly, the surveys I get fall into three categories: store feedback, Google reviews and marketing surveys. Store feedback is usually a case of confirming that I visited a given location and then rating them out of five. It’s quick, interesting enough to see which businesses feel the service is worthwhile and lets me provide some limited feedback. I don’t really imagine that the data is all that worthwhile, but enough stores do it, some of which have done so for an entire year at this point, that they must get something from the results.
Google reviews are a little more tedious but also have a higher reward, so I quite enjoy receiving them. I’m one of those people who routinely reviews online purchases, fills out in-store questionnaires and generally says “yes” when asked if I have a minute. I totally understand why most people ignore these types of things, but I try to do them whenever I have spare time for two main reasons. The first is that I’ve worked retail, I’ve been the person with the clipboard and I am fully aware of how much that role sucks. I literally spent two months, for 4-5 hours a day, wandering around Durham trying to get people interested in taking a flyer for a store I worked for, and that was difficult enough. Getting people to actually engage with you for longer than ten seconds… that sounds like hell on Earth. The second reason is that I like having a record of my opinions, which should be fairly obvious from this website (and elsewhere), and that extends out to the services I’ve used and the items I’ve purchased.
So, the first two groups are easy for me to understand and pretty common. But once every month or so I’ll get a survey from group three: marketing research. Not market research, but questioning me on the adverts that I remember having seen or my awareness of brands. I imagine most of these are Google trying to gauge how well its own advertising algorithms are performing, something which was totally apparent when I got a survey like the one I received this morning.
That survey was incredibly quick and began by showing me a thumbnail of a Youtube video by Philip DeFranco. The video was several years old (I could see the upload date on the image) and the survey wanted to know if I had watched it. Now, I’ve been subscribed to Phil since I first created a Youtube account back in 2009 and had already been watching him for over a year before that. I quite literally created my account just to be able to track which of his back catalogue of videos I had watched. As a result, I could say with pretty high certainty that I had watched the video they were showing me. I also assume, considering that Youtube is tied to my Google account, that they already knew that I had watched the video. The first question on these surveys tends to request confirmation of known information, so that made sense.
But then they did something which I don’t understand, at all. I think what they were trying to do was refine their suggested videos algorithm but the way they went about it was just weird. There were two more questions to the survey and both showed another thumbnail of one of Phil’s videos from over a year ago. Both asked me to rate, out of five, how useful these would be as suggested videos on Youtube. Now, I don’t presume to understand the exact results or answers Google are looking for here, but I can imagine that they’re hoping to confirm that, yes, someone who wants to watch a video on current affairs would like to watch more videos on current affairs. The problem, though, is that their survey completely ignores my own video watching history. I am subscribed to Phil’s channel; I have watched every video he’s uploaded in the past decade. I don’t need to have his old videos suggested to me because I’ve already seen them. However, none of that information has been requested by the survey, so from the perspective of the questions I’ve been asked then, yes, based on the fact I enjoyed watching the first video I would want the other two videos to be suggested.
Yesterday I was reading an A List Apart article on why asking the right questions in user testing is key to not screwing up. Perhaps because that was on my mind, this survey threw me for a loop. On a personal level, completely honestly, those videos are useless suggestions to me and I would have liked to rate them 0 out of 5 (which is, irritatingly, never an option). However, I’m a huge fan of Phil and want his channel to keep growing. Saying “Yes, I watched that one video of his and never want to watch another” seems wrong. I don’t want Google to take that message away from this survey. On the other hand, I hate how my current suggested videos feed is full of videos I’ve already seen and content from channels I’m already subscribed to. It’s a personal pet peeve of the current Youtube setup because it makes that page incredibly pointless, so I really don’t want to reinforce that behaviour and say that these are good suggestions.
At this point, I’m definitely over analysing what’s going on, but you would hope a company the size of Google would understand that the way they present a survey will have differing impacts. The questions are needlessly broad and non-specific, leaving the interpretation open to the user, but the subject matter leaves me stuck trying to guess what data Google actually want from me. Do they want to know whether I like those types of videos, or do they want me to ‘confirm’ that suggesting other videos from channels I’ve watched before is a good thing? Unfortunately, I don’t know which it is, which means I don’t really know what the question is, and if I don’t know that, how can I answer it?
In the end, I just stuck them both at 4/5 stars. Typing this up now, I feel that was probably the wrong thing to do, but oh well. At the end of the day, Google asked what seems like a fairly innocuous question, but one which has two wildly different answers. I doubt I’m the only person getting that question, but I’ll probably be an outlier in my response. Still, it’s a prime example of how the phrasing, setting and simplicity of a question can leave it horribly ambiguous. The result will likely go on to inform some form of policy at Youtube, which is a shame, because no matter what question they thought they were asking, I doubt it’s the one they’re actually having answered.