There's a fallacy of composition and awareness here.

Individual users:

1. May not be aware of how data are being used. (In fact this is a virtual certainty.)

2. Don't appreciate the immense power of data in aggregate. (Something that is close to Facebook's key commercial advantage.)

3. May be exposing data on other users, who are not participating and/or have not consented to such data hoovering.

I'd argue that Facebook can also make exceptions, and that good-faith, well-reviewed research projects, particularly those aimed at independently assessing manipulation and propaganda efforts on the platform, are a case I'd strongly support. But to say that Facebook has no right or obligation to decide is false on its face.




Keep in mind that when evaluating a research proposal, Facebook will have zero interest in evaluating the "good faith"-ness or "well-reviewed"-ness of the proposal. And to be fair, they are probably not qualified to do that, and would have no incentive to become qualified.

As a business making a business decision, they'll want to know "can this come back to bite us?" (and they will miss many of the ways that might happen) and "how much will this benefit us, either in money or in facilitating new ways of making money with the new information?"


See lilactown's excellent comment here: https://news.ycombinator.com/item?id=28064953

My reply to that addresses some of your concerns as well.

TL;DR: the call is not entirely Facebook's to make. Perhaps not at all.



