Well maybe they just have like a lot of linked lists that they need reversed? With that many people working in parallel you could really increase the throughput of the reversing process. Of course this is assuming that you previously hired people to identify and remove any cycles in those linked lists.
Tell me, what's the big O of having the people working in parallel on the reversing? And as a follow-up, can you explain whether it is better if you have a team of n people doing the cycle detection to do it using a system which is eventually consistent or strongly consistent and why?
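To take the joke half-seriously: in-place reversal of a singly linked list is O(n) pointer-chasing and inherently sequential (each step needs the node the previous step reached), so n workers don't improve the asymptotic bound on a single list, and cycle detection is O(n) time with O(1) space via Floyd's tortoise-and-hare. A minimal Python sketch (all names here are illustrative):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def has_cycle(head):
    """Floyd's tortoise-and-hare: O(n) time, O(1) space."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False

def reverse(head):
    """Iterative in-place reversal: O(n) no matter how many people
    you hire, since each pointer flip depends on the previous one."""
    prev = None
    while head:
        head.next, prev, head = prev, head, head.next
    return prev
```

Parallelism only helps with throughput across many independent lists, not latency on one.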
Also if it was actually about free speech, he wouldn’t immediately capitulate every time a foreign government asks him to remove content they deem inappropriate.
Watermarks can be helpful, but I believe that provenance via digital signatures is ultimately a better solution. Curious why Google doesn’t join the CAI (https://contentauthenticity.org/) and use their approach for provenance of Google’s generated audio files.
Bob produces something with AI but claims he produced it himself and signs it with his private key.
AI produces something and signs it or doesn't, but if it's signed you can just throw the signature away and either publish it as unsigned or sign it again with a different key.
Signatures allow Alice to verify that something is signed by someone who has Bob's private key. If only Bob has Bob's private key, that means it was signed by Bob. It doesn't tell you whether it was generated by AI or not if Bob doesn't want you to know, because Bob can sign whatever he wants with his private key.
In this case "Bob" is presumably supposed to be some camera with DRM, but that means it will be in the physical control of attackers and anybody who can crack any camera by any manufacturer can extract the private key and use it to sign whatever they want, which is inevitably going to happen. Keys will be available for sale to anyone who wants one and doesn't have the technical acumen to extract one themselves. Since that makes the whole system worthless, what's the point?
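The limitation described above fits in a few lines. Using stdlib HMAC as a stand-in for a real asymmetric signature scheme such as Ed25519 (Python's standard library has no public-key signing; the key and messages here are made up), verification proves only possession of the key, and an unwanted signature can simply be dropped:

```python
import hmac
import hashlib

def sign(key: bytes, message: bytes) -> bytes:
    # HMAC stands in for an asymmetric signature; the point is the
    # same either way: verification proves key possession, nothing more.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), signature)

bobs_key = b"only-bob-knows-this"  # hypothetical key
ai_output = b"image bytes produced by a model"

# Bob signs AI-generated content with his own key.
sig = sign(bobs_key, ai_output)

# Verification succeeds: it shows the signer held Bob's key,
# not that the content is human-made.
assert verify(bobs_key, ai_output, sig)

# And a signature that reveals too much can simply be discarded
# before republishing; unsigned bytes carry no trace of it.
republished = ai_output
```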
> Bob produces something with AI but claims he produced it himself and signs it with his private key. … because Bob can sign whatever he wants with his private key.
Whether or not to trust Bob is an entirely different problem space than being able to prove an image came from Bob. In most scenarios Bob would be a “trustworthy news source” who cares about their reputation. The important piece here is that if someone shares something on e.g. Twitter and says Bob produced it, that claim can be verified.
> crack any camera by any manufacturer can extract the private key and use it to sign whatever they want, which is inevitably going to happen … Since that makes the whole system worthless, what's the point?
Think about what happens today when a private key is leaked: that key is no longer trusted. Will it be such a large-scale problem that the day any camera is released its keys are leaked? Maybe. Even in that scenario, though, we end up in the same spot as today, except with the added benefit of being able to verify material coming from NPR/CNN/your preferred news source when it's shared on third-party platforms.
> In most scenarios Bob would be a “trustworthy news source” who cares about their reputation. The important piece here is that if someone shares something on e.g. Twitter and says Bob produced it, that claim can be verified.
We don't need some new system for that. You go to the website of your preferred news source and the connection is secured with TLS which certifies that the server is the one for the domain your browser shows you're visiting.
> Think about what happens today when a private key is leaked - that key is no longer trusted. Will it be such a large scale problem such that the day any camera is released the keys are leaked?
It's not that some camera's keys will be leaked and then you'll know not to trust them. It's that someone publishes how to extract the keys from some camera, and then everything signed with any of those keys is called into question. Or someone figures out how to extract the keys from some camera, or swipes them from one of the bureaucracies that generate them, and doesn't tell anyone; they just use them to forge signatures.
And then because that is not only possible but likely to happen in practice, and you have no way to know when it has, you can't actually trust the signatures for anything.
> you can't actually trust the signatures for anything.
Do you bank online? Public/private key cryptography works well enough to support millions (billions?) of dollars worth of transactions per day. I don't think it's as broken as you make it seem.
It's a shame they don't use a SQLite database for version control. I know it's probably the least efficient way to store code changes but it would bring a whole new level to bootstrapping processes. Each code change would be inserted into the db by the code produced by the prior change.
> I know it's probably the least efficient way to store code changes
Not at all. Fossil's own repository (sqlite) database currently has an overall compression ratio of about 101:1, packing a total of 6.7 GB of data into 66.5 MB.
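For what it's worth, those figures are self-consistent. A quick back-of-the-envelope check (decimal GB/MB assumed):

```python
# Sanity-check the ~101:1 claim from the figures above.
total_bytes = 6.7e9    # ~6.7 GB of versioned content
repo_bytes = 66.5e6    # ~66.5 MB repository file
ratio = total_bytes / repo_bytes
print(f"{ratio:.1f}:1")  # prints "100.8:1", i.e. roughly 101:1
```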
Does Amazon do literally anything to proactively prevent the sale of counterfeit goods?
> Prior to the deal, Apple sent “hundreds of thousands of take-down notices” to Amazon to reduce counterfeits, and the company conducted test purchases on Amazon that “consistently returned high counterfeit rates
It has become increasingly difficult to operate as a third-party seller on their platform. A lot of categories and items are "gated," requiring a lot of additional approval or limited access that is difficult to come by.
For example, there are books I purchased off of Amazon, directly from Amazon as new books, that I am not allowed to sell as used on their site. This wasn't always the case.
They have anti-counterfeit teams, brand protection programs, etc. They say their brand protection is about 99% effective at blocking counterfeits. But they have something like 350+ million listings, so plenty of stuff still slips through the cracks.
Fraudulent listings aren't specifically an Amazon problem, there's junk like the above example on basically every marketplace site. Search "2tb flash drive" on any of them. It's a fundamental problem with non-curated marketplaces.
I think ideally we'd get to a point where end users could easily see who signed an image, so if someone claimed an image was from CNN, that claim could be validated. I imagine the end goal would be to put a warning on images that aren't from a signed source, or maybe not display them at all by default, similar to the uptake of HTTPS.
Agree, but I also think that signing at all should be optional. That is: Give the camera user the option of strongly identifying themselves (with the added trust that brings), or remaining anonymous (as is the case now).
Because there's only one intended reader, who is sufficiently educated and smart. The rest of us are impostors who should not have pursued this career at all.
https://blog.adobe.com/en/publish/2024/06/06/clarification-a...