
> however, what is a reasonable alternative for a content creator who wants to publish their videos and be able to build an audience?

I think it's more instructive to look at it from the other side. What's the reasonable alternative for a video hosting service to doing this kind of policing? Remember it's not really an option to just throw video over the fence; DMCA requirements mean you have to be responsive. And thus there's a built-in incentive to cut a deal with the content owners to preemptively prevent the DMCA claims (which are expensive!) by doing this sort of automated policing.

It's true that not every host does this, but every host that doesn't do this either does it in violation of the law or eats significant overhead that needs to be recouped in some other way (i.e. by paying their content creators less! Check the author's channel: this is someone who's clearly on YouTube for revenue. Would even she jump ship, given that it would probably cost her money?)

Really, this isn't something Google can fix. It's a problem with the legal regime, which imagines that all infringement has a bright-line definition and that preemptive takedowns are the best solution.




Completely agreed. I'd love to see Google (and let's get Twitch in there as well, while we're at it) working with content creators to push for legal change. To me, that's what's most disappointing: Google will spend millions on lobbying to legally collect more and more information about me; but, they don't have the inclination to use their size and scale to push a significant expansion of Fair Use Doctrine (or, if they do, they certainly aren't vocal about it).


It was out of self-interest of course, but Google did just finish a decade-long battle with Oracle, costing probably hundreds of millions of dollars in legal fees, to fight for a reasonable copyright interpretation for software developers.

This current situation on YT is the result of another multi-billion dollar legal fight with the music industry. I don't like how YT handles this either but I put most of the blame on the music industry for it.


Also remember Google Books, which was a huge (seriously, huge) project that spent years with full funding only to be killed by the publishing industry because they couldn't justify it under the notion of "fair use" we're stuck with.


You are getting two things confused here: fair use of copyrighted material, and being responsible and responsive for a quasi-judicial algorithmic process and its mistakes.

(Remember, it's Beethoven's Moonlight Sonata, one of the most famous classical pieces, period. I don't think you even could copyright a part of it, at least not just the sequence of notes.)


The alternative is to responsibly scale your service. If your platform is so big that it can only be moderated by algorithms, and the algorithms don't work, then it's too big. Scale down until, at the very least, you can hire enough humans to review complaints when the algorithm does something wrong. Even better would be to have a human review every flagged violation to confirm it.

Obviously, hiring humans is expensive, and nobody is forcing them to do it, but that doesn't make it an unreasonable alternative. I consider it unreasonable to design an unethical system with the sole excuse that it makes more money that way.


So... which providers have responsibly scaled, in your opinion? All we have are tiny hosts (mostly porn) running by the seat of their pants and occasionally disappearing in a conflagration of lawsuits, and big folks like TikTok and Google and Facebook with draconian preemptive enforcement of various forms.

I think that argues strongly that the service you want to see is "unreasonable" given the regulation regime we have. You can't put this on the hosts; you'll just be disappointed. Call your representative.


To leave a ton of money on the table, and let the competition eat that space, because 0.1% of DMCA requests would be served with an overreaction?

Sorry, this is not going to make business sense. If you want a free video publishing platform that does user outreach for you, you have to pay a price; the false positives are part of that price, alas.


That’s why we need a regulation change to make it so that you can’t outcompete by being unethical.


I don't think it's viable, but I welcome any attempts to actually imagine how such regulation might work and what it would consist of.


Easy. Put the burden of legal fees and risk of censure on a DMCA mill for false positives. This means if a host takes down content to maintain their safe harbor status, and the person can prove the copyright claim was false or frivolous, they get a cause of action to counter-sue the original claimant. Moonlight Sonata is in the public domain. The teacher's performance != anyone else's. Therefore the claim is false, and therefore it should have repercussions for the signing attorney. If it was pre-emptive by Google without a DMCA request from an external entity, that's a different issue.

People don't get it. Perjury actually means something. When you have the blade of perjury over your head, it is absolutely the case that, as a human being, if you have doubt, you should be saying so, or you're misrepresenting the truth of the matter.

The level of perjury inherent to generating these claims via automated process is absurd. It is absolutely reasonable that when you have a group going around and using the legal system as a cudgel, the proper response is to return the favor.

This process would not scale at all if people punched back. Perjury should be trivial to prove when no one even looked at the content in question aside from an analysis suite.

Regardless, there is always the self-hosted option.


I definitely don’t have a general case solution. A solution for this case would be something like “a provider of video hosting services is not allowed to take down videos automatically without having a human review each request.”


The law never had that intent, though, and getting it in there will most probably be impossible.


I’m not saying that we will get this outcome, just that it’s the outcome that we need. Unfortunately I agree with you that we won’t.


I agree it doesn't make business sense, but it does make moral sense. And yes, this hypothetical service probably wouldn't be free.


Well, what kind of monstrosity is this business sense, then, when it is structurally incompatible with any morals whatsoever (as seems to be more or less the consensus of the first couple of comments here)?


>I think it's more instructive to look at it from the other side.

I'm cynical, so I feel it's much more likely that a competitor will push actual change (be it by forcing Google to improve itself, or through future lawsuits challenging the current laws) than that a current monopolistic entity will find alternatives to problems that don't much inconvenience it.

>Remember it's not really an option to just throw video over the fence, DMCA requirements mean you have to be responsive

"Responsive" is the key word to be challenged here. I'm unsure if automating a removal at the behest of any barely or unverified account is the bare minimum "reponsiveness" required legally. The big problem that won't be resolved without someone legally challenging it is that there's no negative consequence to filing a DMCA claim. Or at least, there wasn't as recently as 4 years ago.


> And thus there's a built-in incentive to cut a deal with the content owners to preemptively prevent the DMCA claims (which are expensive!) by doing this sort of automated policing.

How are DMCAs expensive?


You need to pay a human being $15/hour or more to spend a few minutes reviewing the claim and response, on videos that on average are probably making you a dime or less in ad revenue. These companies receive a lot of DMCA claims, most of which are absolutely valid.
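
As a rough back-of-envelope sketch (the 5-minutes-per-claim figure is an assumption; the $15/hour wage and dime-per-video revenue are the numbers above):

    # Hypothetical cost of one human DMCA review vs. the ad revenue at stake
    review_minutes = 5            # assumed time to read the claim and response
    hourly_wage = 15.00           # USD, figure from the comment above
    ad_revenue_per_video = 0.10   # USD, "a dime or less", rough average

    review_cost = hourly_wage * review_minutes / 60
    print(f"Cost per human review: ${review_cost:.2f}")                            # $1.25
    print(f"Ratio to avg ad revenue: {review_cost / ad_revenue_per_video:.1f}x")   # 12.5x

Multiply that by the volume of claims these platforms receive and it's easy to see why they reach for automation instead.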



