How the Next Layer of the Internet Is Going to Be Standardised (mnot.net)
70 points by kennu on June 21, 2021 | 14 comments



If the technical standardization process has been captured by major corporations and is being used to damage smaller competitors / further cement their own control over parts of the economy they consider integral to their corporate well-being, then obviously that process needs to be controlled by governmental regulatory agencies that keep predatory companies in check.


> that process needs to be controlled by governmental regulatory agencies that keep predatory companies in check.

Two words: regulatory capture[1].

Every time it looks like we're having a referee police the marketplace, it turns out that the seller owns the marketplace, and the buyers become the product.

As technical types, we crave standardized purity.

As humans, we may be in better shape with a bit of inefficiency in the name of liberty.

[1] https://en.m.wikipedia.org/wiki/Regulatory_capture


> Every time it looks like we're having a referee police the marketplace, it turns out that the seller owns the marketplace, and the buyers become the product

Do you have data for this frequency claim? (Or built-in supply-side bias?)

Regulatory capture is a phenomenon. It is not guaranteed. Even if it were, it does not follow that suppliers—versus a pet consumer political group or powerful intermediary—will accrue the benefit.


Yes, I overplayed the quantifier there.


Yes, I'm familiar with the concept of regulatory capture, but two points: one, as problematic as the whole thing is, there are legal restrictions on regulatory capture, so you could conceivably be caught in the act; and two, there are no legal restrictions on standardization capture. Which has more of a chance of working?

>As humans, we may be in better shape with a bit of inefficiency in the name of liberty.

so, if I understand the argument - because regulatory capture is a thing we should allow standardization capture and the big companies should just do what they want because they will anyway?

At any rate, if it's really true that

> Every time it looks like we're having a referee police the marketplace, it turns out that the seller owns the marketplace, and the buyers become the product.

why do the big companies complain so much about being regulated if they will benefit so much from it? Is it all just masterful reverse psychology on their part?


> so, if I understand the argument - because regulatory capture is a thing we should allow standardization capture and the big companies should just do what they want because they will anyway?

My contention is that lack of regulation does not equal chaos.

Does it really matter if I buy raw milk? It does to the FDA, but can we just be adults and trust the nose?


> why do the big companies complain so much about being regulated if they will benefit so much from it? Is it all just masterful reverse psychology on their part?

I do think so. The big ones drive the legislation via lobbyists and freeze out the little ones who can't staff a compliance department.


The reality of standards is: there is no standard. My experience is that companies take standards as guidelines and then modify or develop their implementations to suit their own needs.


Some people would say that's what a standard is. The IETF's mantra of rough consensus and running code is one way to describe the processes of creating and using a standard; what you say there is another. A little snide or unkind perhaps, but not really wrong.

I've written eleven RFCs; I've seen how the sausage is made. The sausage is good.


Just to clarify: I wasn’t saying anything positive or negative about that. I just wanted to share my real-world experience. Before, I naively thought that standards were very rigid :)


The nice thing about those standards is that you can choose which one you want to use.


Interesting to think of all of history as a succession of layers being added to the operating system of human civilization.

The interpersonal layer

The familial layer

The trading layer

The political layer

The legal layer

The transportation layer

The textual layer

The mass-media layer

The microprocessing layer

The internet layer

The virtual layer

Each one retroactively reconfiguring the layers below it as it comes into being.


If I recall correctly, it's only in the last decade that things haven't been fragmented in the way the article describes - i.e. I remember there being a UK Google separate from the .com Google. That actually seemed to perform better as a search engine, because you still had access to international stuff but optimised for a UK searcher. What we have at the moment is fragmented down to a billion pieces, each optimised for one individual but simultaneously completely missing our context, and therefore it doesn't seem to work very well as a search engine.


Governments may have a legitimate role to play here only because of the impact the Internet has on markets and competition. Perhaps the inevitable ossification/fragmentation will precipitate a bigger shift.

I think we will see a return to a non-commercial, community-focused Internet. Software built by developer collectives or open-source communities could very easily win users away from mega-corp software, and the instability in user confidence could be the falling domino that triggers this new wave of Internet actors.



