
I hate what "modern" web development has done to my UI experience.

E.g. there's a delay between an Amazon.ca results page loading and it populating the search text box with the phrase you just searched for. As a result, when I do two searches in rapid succession, it consistently interrupts my typing at exactly the right moment to mangle my input into incomprehensible text.

AirCanada and WestJet are two other websites that became awful to use after adopting React-ey style UIs.


There's no such thing as a "React-ey style UI". Seriously. It's a UI framework, not a collection of prebuilt UI components. You cannot tell whether a site is using React without opening dev tools.

The vast majority of the comments on this story complaining about React are actually complaints about some bad UI library, or about a poorly designed and architected website that happens to use React.

I might not be able to tell it's using React specifically, but I can trivially tell it's using a React-like framework, because it feels like the entire website is covered in glue.

No, you are just dealing with bad code. There are highly performant web apps written in React; it's all about craftsmanship. React gives you many tools to skip unnecessary work and reduce latency, such as shouldComponentUpdate and assigning keys to elements in a collection, or architectural decisions like splitting your UI into components correctly. It's up to individual developers to use them. Bad developers write bad code in any framework.
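
For instance, a minimal sketch of those two techniques (the component and prop names here are made up for illustration):

    // React.memo is the function-component counterpart of
    // shouldComponentUpdate: ResultRow only re-renders when its
    // own props actually change.
    const ResultRow = React.memo(function ResultRow({ item }) {
      return <li>{item.label}</li>;
    });

    function ResultList({ items }) {
      return (
        <ul>
          {items.map(item => (
            // A stable key lets React match each row across renders
            // and skip re-rendering the ones that haven't changed.
            <ResultRow key={item.id} item={item} />
          ))}
        </ul>
      );
    }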

But one thing I've noticed is that because frontend code runs on users' computers, fewer businesses and engineers have an incentive to really optimize its performance. If one writes bad backend code, it shows up as an increased AWS bill; if one writes bad frontend code, well, the cost is diffuse and many don't notice it.


> No. You are just dealing with bad code. There are highly performant web apps written in React.

Cool, maybe I'll see one one day.

> React gives you many tools to skip unnecessary work and reduce latency, such as shouldComponentUpdate, and assigning keys to elements in a collection

But do people use these outside of demos?

I have a high-end CPU with 32GB of RAM, and almost every single website I use gives me a worse experience than I got 20 years ago.

> if one writes bad frontend code, well the cost is diffuse and many don't notice it.

Exactly, and that's why we are where we are today: surrounded on all sides by websites covered in glue, because a bad experience has been completely normalised. All for what? Some abstract notion of developer convenience that, in almost every case, results in a significantly worse experience by almost any metric.


I would think truly open source means also including the initial training data, i.e. everything needed to build[/tweak/customize] the thing from the ground up.

I agree, but I'm responding to someone confusing source code and model weights…

For those of us who can't try it out ourselves, could you post the results somewhere and share a link?

It's not giving me a link, and its responses are multi-step and wordy. I can post some pullquotes...

Yes, it is possible to derive the laws of entropy from the fundamental physical constants. Entropy is a measure of the number of possible microscopic configurations of a system, and it is related to the fundamental physical constants through the Boltzmann constant (k_B) and the laws of thermodynamics.

The second law of thermodynamics states that the total entropy of an isolated system can only increase over time, or remain constant if the system is in equilibrium. This law can be derived from the fundamental physical constants and the microscopic behavior of particles in a system.

One way to derive the laws of entropy is by considering the statistical mechanics of a system. In statistical mechanics, entropy is related to the number of possible microscopic configurations (microstates) of a system, which can be described by the Boltzmann constant (k_B). The entropy (S) of a system is given by the Boltzmann formula:

S = k_B * ln(Ω)

where Ω is the number of possible microstates.

The fundamental physical constants, such as the speed of light (c), the gravitational constant (G), and the Planck constant (h), are related to the other constants through various equations. For example, the Boltzmann constant (k_B) can be expressed in terms of the Planck constant (h), the speed of light (c), and the gravitational constant (G):

k_B = (h * P / 2π) * (c^2 / G)

where P is the pressure.

By using these relationships and considering the microscopic behavior of particles in a system, it is possible to derive the laws of entropy from the fundamental physical constants. However, it is important to note that entropy is a macroscopic property, and its laws are not directly derived from the fundamental physical constants but rather from the microscopic behavior of particles and the laws of thermodynamics.

[Q: Based on this analysis, can you identify and describe where the dividing line is between (a) the region where (microscopic/atomic) processes are reversible, and (b) the region where macroscopic processes are irreversible ?]

The dividing line between the region where microscopic/atomic processes are reversible and the region where macroscopic processes are irreversible lies in the scale at which the processes occur and the interactions between particles.

At the microscopic level (a), processes are generally considered reversible because the interactions between individual particles can be easily reversed. For example, when two particles collide, they can exchange momentum and energy, and the process can be considered reversible. In this region, the system is in thermodynamic equilibrium, and the second law of thermodynamics does not apply.

However, as we move to the macroscopic level (b), the number of particles involved in the processes increases, and the interactions between them become more complex. In this region, the processes are considered irreversible because the energy and momentum exchanges between particles lead to an increase in entropy. This increase in entropy is a result of the second law of thermodynamics, which states that the total entropy of an isolated system can only increase over time or remain constant if the system is in equilibrium.

The dividing line between reversible and irreversible processes can be identified by examining the scale at which the system operates and the interactions between particles. At the microscopic level, processes are generally reversible, while at the macroscopic level, processes are irreversible due to the increase in entropy resulting from the interactions between a large number of particles.

-end-


Is it possible to try it without logging in?

Can you log in with anything other than a Google account?

I was excited by the tagline "Self-Sovereign", but it appears it isn't.


If price didn't matter, what would be the best-performing CPU available today, with lots of PCIe lanes and high single-thread performance?

If single-threaded perf is all that matters, probably the EPYC 9175F: Zen 5, 16 chiplets, one core per chiplet. Each core has 32MB of L3. Boosts to 5GHz. 128 lanes of PCIe 5.0.

If/when they make a V-Cache version of this, that'll most likely be even better: Zen 5 V-Cache doesn't have the clock speed penalty that previous generations did (because the cache is underneath instead of on top), and 96MB of L3 per core would be monstrous.


For what workload?

Probably AMD Turin.

Can someone make a tool (maybe with Bolt) to help me filter out AI-generated apps from my HN feed?

The hero we need will write a Tampermonkey Firefox script that'll hide any HN post that contains the following keywords (a rough sketch follows below):

- in X Hours

- With no knowledge

- no code

- AI, Sonnet, GPT

This is what happens when the barrier to writing applications is zero, and the levee hasn't even BEGUN to breach yet.
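
Something like this Tampermonkey sketch could be a starting point. The .athing/.titleline selectors are based on HN's current markup and may need adjusting; the patterns are just the keywords from the list above:

    // ==UserScript==
    // @name     HN hype filter (sketch)
    // @match    https://news.ycombinator.com/*
    // @grant    none
    // ==/UserScript==
    (function () {
      // Patterns corresponding to the keywords listed above.
      const patterns = [
        /in \d+ hours/i,
        /with no knowledge/i,
        /no.code/i,
        /\bAI\b/, /\bSonnet\b/i, /\bGPT\b/i,
      ];
      // Each story title lives in a span.titleline inside a tr.athing;
      // the points/comments subtext is the row right after it.
      for (const title of document.querySelectorAll('tr.athing .titleline')) {
        if (patterns.some(re => re.test(title.textContent))) {
          const row = title.closest('tr');
          row.style.display = 'none';
          if (row.nextElementSibling) {
            row.nextElementSibling.style.display = 'none';
          }
        }
      }
    })();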


Hahah sorry. It was just meant to be a cool, fun experiment with AI.

I wish it went into even more technical depth.

I've also been trying to find documents and movies about the Apollo missions that are more about the engineering than the people. (Nothing wrong with the latter; it just feels like that's all the existing, sometimes over-dramatized films are about.)


Maybe you're aware but NASA publishes lots of technical and scientific documents to https://ntrs.nasa.gov/

Like this technical memo about Apollo data network systems from 1966: https://ntrs.nasa.gov/citations/19670009662

JPL specifically has its own repository with similar content at https://dataverse.jpl.nasa.gov/dataverse/jor

Like this paper about file transfers with the Europa Clipper spacecraft: https://dataverse.jpl.nasa.gov/file.xhtml?fileId=72593&versi...


Scott Manley has some good videos focusing on various engineering aspects of Apollo: https://www.youtube.com/results?search_query=apollo+scott+ma...

You could do worse than just reading all the Wikipedia articles on all the various pieces of hardware, e.g., the Lunar Landing Research Vehicle

https://en.wikipedia.org/wiki/Lunar_Landing_Research_Vehicle


I also find the "people" stories somewhat less interesting than the technical ones. I really enjoyed the "How Apollo Flew to the Moon" book. I also have "Stages to Saturn" on my list to read sometime; it sounded interesting but I haven't read it yet.

> sometimes over-dramatized films are about)

For a film that's meant to entertain, this is all you will ever see. The drama is what the mass audience can understand. The mass audience will not grasp the majority of the science/engineering and will get bored.

There is some very science-oriented content, but its popularity is dwarfed by the dramatic stories of the people. It's just like how hacking in films is mostly made-up stuff that looks cool, rather than the boring work that hacking truly is, because nobody would watch that. They might as well cut to a YouTube stream of someone live coding.


Haynes, better known for car repair manuals, puts out a surprisingly good set of "owner's manuals" that are highly technical. I loved the "Apollo 13 Owners' Workshop Manual" since it went into exacting detail about how each part of the spacecraft worked, including wiring diagrams and plumbing charts. Their Saturn V manual was also very good.

The engineering-focused "Omega Tau" podcast had some good episodes on the Apollo program:

http://omegataupodcast.net/83-how-apollo-flew-to-the-moon/

http://omegataupodcast.net/97-how-apollo-explored-the-moon/


I feel like Hidden Figures did a good job showing the impact of innovation and of the hard work done on the ground. There's drama too, but it has a unique perspective. The "this has to be calculated correctly or we miss the moon and people die" moment was particularly impactful to me.

Sadly, many of the situations depicted in the movie were entirely fictional.

There were plenty of critical moments and plenty of drama. Ironically, for all of the movie's manufactured scenes about discrimination and its rejiggering of timelines by decades, John Glenn apparently did ask for Johnson to check some calculations ... he just didn't do it during final launch prep.

The Wikipedia page on the topic has a great list of "things that didn't happen that way" in the movie:

https://en.wikipedia.org/wiki/Hidden_Figures#Historical_accu...


Speaking of which, I would love a curated Android app store. It says a lot that you still can't filter for ad-free apps in the Play Store.

F-droid and Aurora

F-droid is good, but where's the curation part?

What type of curation are you looking for? F-droid curates for open source and "anti-features" which are documented on each app page. And Aurora allows you to search Google Play while filtering out apps that utilize ads. What exactly do you want?

It's curated. It clearly marks "anti-features" of the apps.

You can help create one, since unlike iOS, Android doesn't have an app market monopoly. There are already many alternative app stores.

App Finder in the Play Store will filter by ad-free and a bunch of other parameters.

The worse Google gets, the bigger the opportunity for someone else to displace them and do better. Hopefully, someone who actually gives a damn about their users.

Whatever the next version of information retrieval at internet scale is, I don't think it will involve indexing the web. I think the web's moment has passed.

Maybe the future is something like publishers supplying datasets to train models on? Or like how search was built on top of the web, it will be something more organic and ad hoc.


The internet is shrinking. Indexing the wider internet has less and less value to the average person. So much sits in walled gardens now, and for many people that is perfectly fine. If it isn't on TikTok or Instagram, it doesn't exist.

Isn't this just a return to the pre-90s internet? Back then the internet was only used by a few educated people. The masses watched TV etc. to fill their time; now it's TikTok etc. The early 2000s were an unusual time when more people actually used the internet, but it's going back to normal now.

The future of content is social media. Instagram Reels and TikTok. Snapchat and YT Shorts.

People don't want to read articles, they want to watch short form videos about what's interesting to them.

An interesting side effect is that I think this will lead to more direct-to-consumer sales. Customers will be led straight to your website instead of an Amazon order page.


It still needs to be indexed to be searched. Doesn't matter if it's articles, books, videos, posts, long or short.

> An interesting side effect is that I think this will lead to more direct-to-consumer sales. Customers will be led straight to your website instead of an Amazon order page.

That's an excellent outcome if it happens. Businesses should produce quality information and content related to their product category instead of purchasing ads on other websites. There are a lot of good examples of this.


Is OnShape on that list of software you tried?
