atrilumen's comments | Hacker News


Choo-Choo!

https://choo.io


I think the name is probably hurting adoption.

The creature it refers to is pretty universally considered repulsive. I'm no psychologist, but I would expect that to have an undue influence on even the most rational person.


I don't know anything about it, but my first impression is that it's as robust as a cockroach, able to take a lot of damage.


The main metaphor I'm familiar with is that cockroaches always come back even as you try to kill more and more of them, rather like zombies, and never in a positive way.


I'm pretty sure that's what they're going for; the rest of the world just doesn't see it that way.

It definitely won't help a CIO, or anyone else in an organization, trying to lobby for adopting CockroachDB.


OMG they went full-on WALL·E

Where do I put my 50 ounce soda?


I don't know; do you want to build services that depend on random hardware on residential internet connections, hoping it stays online, while either exposing metadata to the entire WAN or slowing everything down with some kind of mixnet protocol?

To me, edge networks make more sense. Distribute your services across datacenters close to your users.


Stop implementing them, and move to key & pin. Use backup keys instead of resets. Stop depending on email and SMS.
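
FWIW, "key & pin" maps fairly naturally onto WebAuthn. A minimal browser-side registration sketch, assuming the server supplies challengeFromServer and userIdBytes (both hypothetical names here):

  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: challengeFromServer,  // random bytes from the server (hypothetical)
      rp: { name: "Example" },
      user: {
        id: userIdBytes,               // stable opaque id, not an email (hypothetical)
        name: "alice",
        displayName: "Alice",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],      // ES256
      authenticatorSelection: { userVerification: "required" }, // the "pin" part
    },
  });
  // Store the resulting public key server-side. Registering a second
  // authenticator the same way gives you a backup key instead of resets.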


[Meta] God damn it, why is scrolljacking still a thing?

( And if they fail at this, why would I trust them with anything else? )

Shame!


You'll likely still want to bundle for production, though, for optimizations like minification, dead-code elimination, code splitting, etc.
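
For example, with esbuild's JS API (just a sketch; the entry point src/index.ts is an assumption):

  // build.mjs
  import * as esbuild from "esbuild";

  await esbuild.build({
    entryPoints: ["src/index.ts"],  // hypothetical entry point
    bundle: true,
    minify: true,                   // minification
    treeShaking: true,              // dead code elimination (default when bundling)
    splitting: true,                // split shared code into separate chunks
    format: "esm",                  // splitting requires ESM output
    outdir: "dist",
  });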

But yeah, JS is Crazy Town. It can be very frustrating.

( Be wary of dependencies. )


Awesome.


The web is Google's platform now, and they're locking it down.

It's time for a new web, and a new user agent with a minimal core so it isn't impossible to implement.


Not sure how much I agree with the first statement, but I definitely agree with the second. That's not an easy problem, though.

First, we are doomed from the start, because of network effects. The new web will likely never gain any traction whatsoever, because it looks like backwards compatibility is more important than simplicity, performance, and CO2 emissions.

Second, we must agree on what that new web is for. We can display text, images, audio, and video. We can tweak the layout of the content. We can take input from viewers (text, uploaded files…). We can build entire applications on top of the web.

Once we agree on the purpose of the web, we need to choose how to make it happen. Do we serve content declaratively, or procedurally? Should browsers be readers of a well-defined, limited data format, or should they be virtual machines? I personally prefer virtual machines (unlimited functionality on top of a very simple core), but their natural opacity does have its problems: screen readers, dark mode…

---

There may be a way to break network effects: government web sites. Define a new standard that serves them well, make sure it is easy to implement pretty much everywhere (including on old computers with a crappy connection), and mandate that all .gov sites move to it. Also, maybe rethink the whole security layer, most notably the PKI.

To move things further, we could use regulations. For instance, we could mandate that banks provide an option to use that new web. We could regulate our way to a critical mass, to the point where ordinary folks can realistically ditch the old web.

