Google Consumer Survey (google.com)
149 points by iag on Oct 8, 2012 | hide | past | favorite | 45 comments



Don't use this if you are going to spend money based on the outcome (or do anything else important with it).

People do not have a vested interest in answering the questions honestly or in the outcome. Their motivation is to get to the premium content, and although some will be honest, most people are just trying to get "free" content and are most motivated not to give any value (money, or answers that betray their privacy or thoughts) back.

My day job is primarily based on analysing survey results, and I can tell you that the only way to get decent usable results from "paying" people is to make sure that they have a vested interest in the results of the survey and that the payment is a (nice) side-effect of completing the survey and not the goal.

Edit: Of course, this type of survey can be useful if you need to "validate" something that you otherwise can't (i.e. to falsify results). Simply choose multiple-choice questions and stack the answers in the right order (people usually click in a certain pattern when answering at "random"). Keep repeating or extending your survey until you have the right ratio of participants in favour of your proposition, then call it a day and publish!


Can you give me an example of how people having a vested interest in the results of a survey would improve the results?

I always thought that if you have a question (like "should the government fund more particle physics") and you ask people with a vested interest (particle physicists) you'll get an answer aligned with their interest ("absolutely, extremely important") in which case why do a survey at all when you know what answers they're going to give?


You do indeed get an answer aligned with their interest, which is often useful (though perhaps not in your example), as that is generally the point of the survey (to find out what your respondents think).

You need to pick your respondent cohort carefully if you are asking a question of opinion (even more so than when asking a question of fact) to eliminate bias in your cohort. So in your example, rather than surveying particle physicists, you would look for another group of interested parties (e.g. a range of taxpayers or non-specialist funding bodies) who have a vested interest (in this case, in how their money is spent).

My work, for instance, revolves around surveying doctors on their experiences of medical training and their career needs/aspirations. The vested interest of our participants is hopefully clear (influence on their training/career/work environment). We ask a wide range of factual and opinion-based questions, eliminate biases (or segment responses on biases) due to various factors (e.g. location, age, gender, etc.), and report our findings in the context of our cohort, i.e. "these are the experiences and opinions of doctors".

In short, you are right: you need to consider the makeup of your cohort and also how you report your findings in respect of that. Having a vested interest doesn't necessarily mean being biased, and sometimes that bias is what the survey aims to report on. In a survey like the Google ones, that bias is money/free content, and that's all you will discover.


I think he means that they will actually pay attention to the survey, and that you should ask questions that aren't as black and white as the one you posed.

Asking particle physicists whether they would rather have X or Y, when both are relevant to their interests, would yield better results than asking a butcher particle-physics-related questions and dangling a carrot in front of him for clicking through it.


GCS regularly conducts validation surveys which measure the incidence rates of common health issues like asthma, smoking and disability, along with consumption of various media, to make sure its sample is representative of the population. It's pretty accurate. You can read more about it here: http://www.google.com/insights/consumersurveys/static/consum...


I've only skim-read that as I'm at work, but it looks like it only compares their results with traditional internet surveys, is littered with caveats such as accuracy depending on the subject matter (type) of the survey, and doesn't fully show the methodology (full question wording and presentation, context, full details of respondents, etc.).

It appears to only show the relative accuracy of certain types of questions in relation to other particular methods of internet surveys. It's not something that I would use to back significant spending decisions.


> Simply choose multiple choice questions and stack the answers in the right order (people usually click in certain pattern when doing it at "random").

Ideally, you could randomize the order of the answers for each person, and later divide them up by order presented to detect any bias. This is something that online surveys make much easier than paper surveys.
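
As a rough sketch of that idea (hypothetical answer set and field names, not any particular survey tool's API), in Python:

    import random
    from collections import Counter

    ANSWERS = ["Option A", "Option B", "Option C", "Option D"]

    def present_question():
        # Show the answers in a fresh random order for each respondent.
        order = ANSWERS[:]
        random.shuffle(order)
        return order

    def position_bias(responses):
        # responses: list of (presented_order, chosen_answer) pairs.
        # Counts how often each screen position was picked; a heavy skew
        # toward position 0 suggests people click the first thing they see.
        return Counter(order.index(choice) for order, choice in responses)

Comparing the per-position counts against a roughly uniform split (or cross-tabbing answers by presentation order, as suggested above) gives a cheap check for respondents who are just clicking through.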


The thing with tools nobody knows about is that Google regularly kills them, e.g. Google Trends for Websites: http://trends.google.com/websites?q=

This was an unbelievably useful tool for getting competitive data (traffic, queries) about other websites. It was not perfect, and the data was sometimes months late, but it was good enough to compare websites with each other.

Now it's gone. Probably it didn't fit into Google's strategy; probably someone at Google decided that "we are not in the business of providing competitive intelligence data about websites" (I guess that because Google Ad Planner was stripped of lots of website data in the same timeframe).

The same thing will happen with the product mentioned above. It will be killed. Maybe not this year, maybe not next year, but sure as hell sometime.


Instead of "Trends for Websites", you can use "Google Ad Planner " https://www.google.com/adplanner which has more detailed traffic statistics.


Nope, not anymore (as mentioned above). E.g.: https://www.google.com/adplanner/#siteSearch?uid=twitter.com... They killed that product (for most websites on the web), too.


Nowadays you can only use it for websites that accept Google AdWords.


Which is why I won't bother to post to Google+.

After killing Google Reader and Buzz, I just don't trust them anymore and won't put anything of value into their ecosystem.

They should find a way to just support everything forever, even if development is killed.


Why do you say Google killed Reader? It may have moved down in the menu hierarchy, but it still works fine. They did, however, kill the feature that allowed you to subscribe to HTML pages instead of RSS, where Google would compare versions of a page and generate an RSS feed of the differences; possibly that's what you were thinking of.


Sorry, I shouldn't have said "killed". The commenting on articles in the context of Reader was great. We had bunches of Gmail friends who would get some good conversations going. Buzz extended that, and for a while it was high quality, at least for us.

Do you remember the protests outside of Google offices when they removed that? One couple even met through Reader and got married.

The shared-items RSS output was another sad loss. I had set up one client so that he could select news items using it, and they would show on his site as industry news. I used it to post stuff to my own site.

Oh, and recommended items basically sucks now, and related feeds rarely returns anything. It's like they removed the Google magic and left it as just a feed reader.


There's always a cost to keeping old code around. With any given Google product, regardless of how unpopular it is, there will always be people unhappy that it's been pulled. But it's healthy to have a focused set of products and to cull non-useful ones.

Also: they didn't kill Google Reader. And Google+ is not comparable to Buzz; Google is clearly not going to kill that off.


Very true. I am one of those people who checked Trends for Websites once or twice every week. It used to give a nice overview of traffic of various websites. I am amazed there is not a lot of outrage over this.


Well, I think we got used to the fact that Google just kills all the good stuff that doesn't make them (enough) money or isn't directly relevant to their main advertising business. For example, I don't think they will ever kill GA, as it's necessary for their customers (advertisers) to see how much they get out of it, and they will never kill Google Webmaster Tools, as a good crawlable web makes their job easier, so it makes sense to give webmasters the necessary data...

But for anything that is not directly relevant to search and advertising, all bets are off.


I've seen these kinds of things in the past. I can say first-hand that I, for one, do not read the questions: I type gibberish in text fields and select random choices in multiple-choice answers. I am sure others do the same.

I don't see how this can work. The only time I see it somewhat working is when gathering a massive (really massive) number of submissions for a decision involving only two options. Maybe then... but even then it wouldn't be possible to determine the actual percentage of users preferring one option or the other.


You can email Paul McDonald to get a free voucher to try this out (see http://www.quora.com/Google/What-is-it-like-to-use-Google-Co...)

I used it to ask how much people were willing to pay to collect video customer reviews, though, and a lot of the responses were 11111111, 1000000, or 12345 dollars per month.


Never, ever use a free-entry question. You'll wind up with "balls", "suck it", etc. all the time. Predefine 3-4 set ranges, and let respondents answer with a single click. Much more accurate results.


More accurate results, or more plausible-looking random results?


Two things:

1. As the researcher, you can set a validity range for open-ended numeric questions if you know it ahead of time. We recommend doing this.

2. The system automatically detects outliers in all cases and puts them in a separate bucket to reduce noise.

An example of open-ended numeric for pricing is here:

https://www.google.com/insights/consumersurveys/view?survey=...
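
(For anyone curious what an outlier bucket for open-ended numeric answers might look like in practice, the sketch below is just an illustration, using a researcher-defined validity range plus a simple 1.5x IQR rule; it is not the actual GCS implementation, which isn't public.)

    import statistics

    def bucket_numeric(values, lo=None, hi=None):
        # Split open-ended numeric answers into a "valid" list and an
        # "outlier" bucket. lo/hi are an optional validity range; anything
        # beyond 1.5*IQR of the quartiles of the in-range answers is also
        # treated as an outlier.
        in_range = [v for v in values
                    if (lo is None or v >= lo) and (hi is None or v <= hi)]
        q1, _, q3 = statistics.quantiles(in_range, n=4)
        iqr = q3 - q1
        valid, outliers = [], []
        for v in values:
            if v not in in_range or v < q1 - 1.5 * iqr or v > q3 + 1.5 * iqr:
                outliers.append(v)
            else:
                valid.append(v)
        return valid, outliers

    # bucket_numeric([5, 9, 12, 15, 11111111], lo=0, hi=1000)
    # -> ([5, 9, 12, 15], [11111111])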


From what I can tell #2 doesn't seem to be working in my case.

You can see my survey at https://www.google.com/insights/consumersurveys/view?survey=...


I used this once and was really happy with the results. I think there's a little bit of an opportunity bias which good statisticians would have issues with, but for some useful imperfect numbers I think it's a good solution.

Also, this is a pretty new tool.


This type of market research was previously only accessible to companies that could drop 10k or more on these surveys. This tool will help entrepreneurs validate ideas they were unable to test because it was cost prohibitive.


I spent a couple of years working for a company in the market research space. It's an interesting business, of which I knew very little before I worked there. But the answer is that you could already do a survey for a fraction of that cost: SurveyMonkey has the same product, and so do a few others.

As an aside, Google launched this about 6 months ago and pretty much has everyone in that business scared. Ultimately it all boils down to having profiles on a lot of users and selling access to them. Google's user base (and what they know about them) dwarfs what everyone else has combined. If they wanted to, with a bit of effort they could own that market in 6-12 months.


I hear you on that, but the SurveyMonkey panel is going to cost $3-$8 a response depending on the audience you want to reach (e.g. 50-year-old working men tend to be expensive), and the results just aren't as pretty-looking as Google's. I wouldn't be surprised if Facebook comes out with a similar product; they seem to be in a better position to give users rewards for performing MR activities, and they have more data on users for additional cross-tabs. Google only has approximate age, gender, and location.


So for $110 I could run my own presidential election poll?

Hmmmmm.

No, wait. Targeting is 50 cents a person, not 10. So it would be about $550. Still, though.



Very interesting, thank you for sharing that!

The patterns in subpopulations seem to match general expectations. Very cool.

Almost no responses in the $100k+ incomes. I wonder if that's a general problem with the method?


It could be a general problem. For collecting responses from high-income individuals, the method faces a double whammy: (1) there are fewer potential respondents; (2) those respondents are the least willing to spend their time on such surveys (or on the sites the survey-wall blocks).


Here is a link to our latest election straw poll, run entirely after the 10/3 debate.

http://goo.gl/7H2KQ

You can compare those to some of the other polls aggregated at Real Clear Politics here: http://www.realclearpolitics.com/epolls/latest_polls/

For those people interested in the accuracy & validity of the GCS platform, this is a great place to start.


This is brand new; I saw it previewed on JLG's Monday Note blog. Consumers are going to be incentivized to take the surveys as a form of micropayment for access to articles behind a paywall... a win-win for everybody, I guess.


Consumers are incentivized to take the survey to see premium content; I didn't see anything about micropayments.


Google will charge the survey initiators, take its cut, and pay premium content providers with the balance.


Glad to see the interest in this new form of market research. YC S12 company Survata.com offers a similar service. We can run multiple questions at a time, so researchers get the ability to cross-tab responses (an important ability you lose when you run only one-question surveys on other services). We encourage you to check us out at Survata.com (we give all first-time users a $10 coupon) and we welcome any feedback.


Although this idea isn't new, I'm glad Google is getting into this space in order to make it more competitive and cheaper to perform these types of surveys.


This is clearly not my domain, so I am sure I am missing something, but how is this different from creating, say, a Wufoo form to prompt users to enter the information that you need? (Wufoo seems to be $0.05 per entry.)


If I understand correctly, Google supplies the traffic.


The results are a representative sample of the US Internet population.


The term for this is "content-locking". The opposite side of this relationship (for publishers): http://www.blamads.com/

Disclaimer: a friend owns that site.


Won't this lead to paywalls adopting this and eventually to the creation of pay-per-view non-news sites? If so, people will blindly click through and the quality of the data won't match expectations.


Survey paywalls have been around for a long time, and I would imagine a lot of people already click through them blindly. I expect the price will eventually drop further, so that you'll be able to afford enough responses to get past that margin of error.
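
For a rough sense of the numbers involved (the standard worst-case margin of error for a simple random sample of a proportion; nothing GCS-specific):

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Worst-case 95% margin of error for a proportion from n responses.
        return z * math.sqrt(p * (1 - p) / n)

    # margin_of_error(500)  -> ~0.044 (about +/-4.4 points)
    # margin_of_error(1500) -> ~0.025 (about +/-2.5 points)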


Well, the paywall thing is a huge deal-killer for me, although I love the response dashboard system they've built. For those of you conducting surveys, what are some good alternatives?


Now you, too, can test 41 shades of blue.



