
Hmm ... "proto-danksharding" which activated the "blobscriptions protocol" so that blobs are "much cheaper than calldata", all of this helping it to become an "L2-centric ecosystem". In the end, this leaves them "not confident enough in the complex code of an optimistic or SNARK-based EVM verifier".

I'm sold ... just tell me where to transfer the money.




I wouldn't be surprised if all of that was just made-up jargon and this was a joke article. But then again, it's about blockchain, so the line is thin.


There's actually interesting technology being developed here in the areas of distributed computation and zero-trust systems. I personally find the implementations of zero-knowledge proofs and the ongoing work on ZK-SNARKs most fascinating.

There's a lot more to this ecosystem than just speculation. At its core is a distributed world computer, but all anyone knows about is money.exe because this stuff is immensely complex.

If you look into the research rather than paying attention to the soyjak YouTube thumbnails, you'll find the actual substance. Nobody is going to do the work for you. Or, you know, just write it all off with a snide joke because "crypto bad".


Since distributed blockchains / databases are distributed and therefore need partition tolerance by definition, that leaves consistency or availability to take the shortfall.

I'm guessing any "crypto-kinda-currency" is picking eventual consistency as a core mechanic. Think about the word EVENTUAL though.

If the core function of the crypto is a ledger, then that makes sense: the transaction EVENTUALLY settles, and in practical terms your faith in the distributed system comes from its flawless previous record of reconciliation, probably granted before the transaction has actually completed.

Now, a major distributed blockchain has ... how many nodes? Thousands or more? That is a long time to reconcile for consistency, even with great dedicated internal networks. What? This is over a heterogeneous global internet? That implies EVENTUAL has some bad worst cases.

"Smart contracts" or "distributed trustless computation". Whatever, getting the value of calculation from a node and getting the value stored in the node is essentially the same thing in terms of determining an answer to a query.

It implies horrendous performance, of a kind you have little control over. I don't think Kubernetes is shaking in its boots.

It's interesting that Aphyr never does any crypto analyses, even though he made his bones running a test suite (Jepsen). How do you test a cryptocurrency at scale?


COTI is also using DAGs for some reason.


> At its core is a distributed world computer, but all anyone knows about is money.exe because this stuff is immensely complex.

Alternatively, because the only way to use the aforementioned distributed world computer is to engage with money.exe and buy more CoinTokens. Imagine all the kids out there who will be delighted to learn on a pay-per-use code interpreter. "Hey mom, I need your credit card to cover the gas while I debug my smart contract."

But assuming you have the money to spend, it's a whole universe of possibilities! Just make sure to cash in before actually trying to use any of them.


How did you not realize that one can run/evaluate code without actually broadcasting it? All you need is access to a node (plenty of public ones exist) and you can "simulate" any transaction (i.e. code) you would like.
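
For what it's worth, a minimal sketch of that kind of dry run with web3.py against a public JSON-RPC node (the endpoint URL and addresses below are placeholders, not anything from the article):

    from web3 import Web3

    # Any public JSON-RPC endpoint works; this URL is a placeholder.
    w3 = Web3(Web3.HTTPProvider("https://ethereum-rpc.example.org"))

    call = {
        "from": "0x0000000000000000000000000000000000000001",  # hypothetical sender
        "to":   "0x0000000000000000000000000000000000000002",  # hypothetical contract
        "data": "0x",  # ABI-encoded calldata would go here
    }

    # eth_call runs the code against current chain state without broadcasting anything.
    result = w3.eth.call(call)

    # eth_estimateGas likewise simulates the transaction to price it, again without sending it.
    gas = w3.eth.estimate_gas(call)
    print(result.hex(), gas)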


Testing a dapp off the mainnet is like ensuring your website works on localhost. It will find some issues, but it's not representative of how it will look in deployment.

In any case, for actual usage it should surprise nobody that everyone conflates Ethereum with money. No, your L2 chain does not qualify as an official solution.


> Testing a dapp off the mainnet is like ensuring your website works on localhost

I would argue the exact opposite. A website will be deployed to different versions of different browsers on different operating systems. A smart contract will exist on a single distributed computer. It sounds like the actual problem is people treating smart contract development as cavalierly as web app development.


No, you can test transactions as they would happen on mainnet (tests with mainnet state). Or, if you want, you can fork mainnet and do your stuff there.

It's absolutely representative of how it looks in deployment. You can test transactions EXACTLY how they would happen on mainnet.
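
As a sketch of the forking approach (assuming Foundry's anvil as the forking tool; the RPC URL and addresses are placeholders): start a local fork with anvil --fork-url <YOUR_RPC_URL>, then point your tests at it:

    from web3 import Web3

    # Assumes a local mainnet fork is already running, e.g. started with
    # Foundry's `anvil --fork-url <YOUR_RPC_URL>` (listens on port 8545 by default).
    w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))

    sender = w3.eth.accounts[0]  # anvil exposes pre-funded, unlocked test accounts

    # This transaction executes against real mainnet state, but only on the local fork;
    # nothing is ever broadcast to the actual network.
    tx_hash = w3.eth.send_transaction({
        "from": sender,
        "to": "0x0000000000000000000000000000000000000002",  # hypothetical recipient
        "value": 10**18,  # 1 ETH in wei
    })
    print(w3.eth.wait_for_transaction_receipt(tx_hash))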

I don't get your second point.


The problem with this statement is in assuming that any of this is actually ready for the average user, like a minor with their parents' credit card. It's really unfortunate that the space received all the attention it did during the pandemic, as that only managed to bring in misaligned expectations fueled by grifters making impossible claims.

There are a number of planned upgrades on the roadmap[1], such as layer 2 blobs, that will eventually drive the cost per transaction closer to zero, but we're still a decade away from that being the case. In the meantime you can debug your smart contracts on a testnet for $0.

[1] https://notes.ethereum.org/@domothy/roadmap


Layer 2 blobs aren't even a solution either, arguably. You then have to engineer the layer 2 bridge to have its own anonymization and escrow-handling technology that is disconnected from the Ethereum network entirely. And realistically speaking, "closer to zero" does not mean free (or even negligible cost). L2 chains can only exist when transactions on the mainnet are made impossible by unbalanced gas prices. It's a catch-22.


The actual quote is "we are not *currently* at the point where we can be confident enough in the complex code of an optimistic or SNARK-based EVM verifier". The article seems to imply that "in the end", they will be.


Unlike the parent, I find the full quote gives me more confidence that they're being serious about the upgrades to the Ethereum protocol. This stuff is all cutting-edge distributed-systems and zero-knowledge-proof work, so of course it's going to take a while to reach confidence in how it'll work.


Something tells me that even if/when the optimistic or SNARK-based EVM verifier is production-ready, the person you're replying to will still feel somewhat unconvinced.


Very likely, but if they get it sorted out one day, he will be using it without knowing.

If someone with no interest whatsoever were given an explanation of all the cryptography behind establishing a secure connection to their bank, most people would dismiss it as mumbo-jumbo. But now you can tell your grandma to look out for the little green lock on the web that makes her account secure.


> he will be using it without knowing.

I will know. Not because of the "little green lock".

I will know in the same way I know this site is secure. In this case, because of PKCS #1 SHA-256 (aka CKM_SHA256_RSA_PKCS_PSS). Cert issued by DigiCert Global Root G2 and valid until one second before midnight UTC on 3/29/31.

That's where I guess I'm losing sight of the vision.

It's tested, it's proven, it's secure, it works, no "gas", no fees ... I don't know. Maybe I'm just missing something.
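
(For what it's worth, a minimal sketch of pulling those same certificate details yourself with Python's standard library; the hostname is just an example:)

    import socket, ssl

    hostname = "news.ycombinator.com"  # example host
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()   # parsed, validated peer certificate
            print("issuer:  ", cert["issuer"])
            print("expires: ", cert["notAfter"])
            print("cipher:  ", tls.cipher())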


The only reason it's foreign to you is that you are not familiar with any of the technology. It's no different than having no idea about LLMs or ML or transformers etc., or, if you are not a programmer, being confused by arrays and recursion and TCP/IP...


Obviously, the whole point is to poke fun at all this serious discussion over something that amounts to a toy with almost no practical application.


Yes, reading TFA left me quite unclear as to how many slurp juices per ape all this new technology translates into.




When setting up my DWS on the chain I performed a fresh implementation of the flex spanning system so that sharded stakes were accordingly modulated when dilution factors are on the rise. Retro validators should never rely on the sole consideration of performative liquidity, and that's why in most use-cases, distributive-non-passing underfitted categorization of assets is preferable. Every DNP-uca implementation has proto-failure systems that allow for better mining experiences, in fact every time assets are minted you obtain by-products of the initial dilution thanks to the false commitment that is produced when pseudo-stakers correct the current derivation according to the relative spike index. That's why the tech interested me at first.


LLMs at their finest.


Just because you're unfamiliar with the technology doesn't mean it's nonsense. The failing is yours for trying to be funny rather than actually looking into these things.

If you explain to most people how a TLS handshake works, it will sound equally nonsensical to them. This technology is complicated, and this article is targeted at an audience that understands it, not one that needs a "My first introduction to distributed computing".


I agree with you, but I was just trying to be funny.


A TLS handshake actually makes sense to average programmers who understand basic cryptography.

Ethereum's problem is that it is a badly designed clusterfuck. Newer blockchains will likely take over.


Ethereum should also make plenty of sense to an average programmer who understands basic cryptography. The concept of a distributed virtual machine shouldn't be difficult to grasp.

You could dive deeper into either of the topics you mentioned and start losing people - for example how QUIC carries a TLS handshake, or why enshrined proposer-builder separation is important to Ethereum. All that means is that both protocols hide complexity under the surface.


That's a pretty bombastic claim with no supporting evidence. What exactly makes the design a clusterfuck, and what do newer chains do that is a significant improvement?

So far all of the alternative chains have been plagued by downtime and by centralization of nodes into supernodes (at least the ones that weren't centralized from the start).


/r/vxjunkies

Wait, this isn't reddit.




