Hacker News

My point is that this sort of thing doesn't happen so lightly in other ecosystems. And the "you don't have to use it" excuse doesn't hold when the project clearly wants to move in that direction:

- others will jump on the feature and use it. You might not have to use it, but libraries you depend on may decide otherwise and rewrite.

- you didn't have to use React hooks, but try using Apollo Client or react-query without them now, or try finding any non-hooks documentation in general.

- you will slowly see bugs being neglected if they're not about the new and shiny.

- there is no equivalent of useContext for class components

So the word "supports" here carries very little weight in practice because the rest of the ecosystem constantly rewrites.




> My point is that this sort of thing doesn't happen so lightly in other ecosystems.

React has been around for 10 years (almost 11, since we're a few days away from 2024). Many of the ecosystems you're holding up as epitomes of stability have had their fair share of large-scale changes in that same time frame. Remember Go before generics, or Rust before async landed [1]? Elixir's Phoenix didn't even have its first commit when React was released [2].

> there is no equivalent of useContext for class components

Class components can still use context [3]. They obviously can't use the useContext hook, because class components aren't compatible with hooks, but that's kind of a tautology.

> you didn't have to use React hooks, but try using Apollo Client or react-query without them now, or try finding any non-hooks documentation in general.

Library authors moved to hooks because sharing stateful logic was difficult before hooks. I never bought into the GraphQL hype train, but I remember there being a ton of verbose render props necessary to make GraphQL libraries like Apollo work before hooks. I don't think anyone really complained about data fetching libraries adopting hooks. You could argue that the react devs should have created the perfect library in 2013 complete with hooks, but I can forgive them for not having perfect foresight. Like I said, if an API changes occasionally over a decade I'm ok with it. Technology should be stable, not frozen.

[1] https://areweasyncyet.rs/

[2] https://groups.google.com/g/phoenix-talk/c/l8kIhc_LC7o

[3] https://legacy.reactjs.org/docs/context.html#classcontexttyp...


You can't use more than one context in React classes

"React" has been around for 10 years, but in practice it's been 3.5 different frameworks:

- the original

- the class version (still close enough to original)

- hooks version (largely a whole different concept and "compatible" only superficially while the rest of the ecosystem starts writing incompatible things)

- server components, and `use`, which monkey patch fetch and already broke half of the ecosystem.

And no, hooks are still far from perfect and are in fact very unintuitive and awkward to pretty much anyone who hasn't used React.

Additionally, there were designs possible to evolve from class components; it's just that the React team cared more about

- "innovation",

- "functional" aesthetic preferences

- enabling a DCE compiler

rather than compatibility and continuity

Example design that would prioritise compatibility and continuity while still enabling all the current features of hooks: https://news.ycombinator.com/item?id=35192312

Again, this is _not enough_ attention to backward compatibility and stability by the standards of other ecosystems, even older ones like Python.


The problem isn't just React though. Most libraries in JS land just don't prioritise interfaces, compatibility and collaboration, instead opting for an "I'm just gonna build my own thing and not care about anything else out there" mentality.

Let's look at that Rust comparison and see what the equivalent situation looks like for JS. Before async-await, Rust had a variety of libraries implementing something close to futures. They all largely standardized on the Future trait from the futures crate, which made it possible to write compatible library code that could build on top of any of them. There was a bit of an issue with compatibility as the trait essentially moved into std and some compatibility shims were needed, but after that a lot of the ecosystem adopted it and it's possible to write runtime-independent code.

In contrast, in JS land we had callbacks (which were not even values), Promises (which had a working group that actually cared about compatibility), monadic futures, observables, thunks, event-based interfaces, and a variety of libraries implementing each. This came to be dominated by node-style callbacks, and then what got standardized were promises, which instantly made most of the library ecosystem incompatible with the standardized syntax.
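As a concrete illustration of why the Promises/A+ "thenable" protocol mattered for compatibility, here is a minimal sketch (the LibraryPromise class is hypothetical, standing in for a pre-standard library like Bluebird): any object with a then method can be assimilated into a native promise.

```javascript
// A minimal "thenable": any object with a then(onFulfilled, onRejected)
// method. This is the interop protocol Promises/A+ standardized, and it
// is what lets pre-standard libraries work with native promises and await.
class LibraryPromise {
  constructor(value) {
    this.value = value;
  }
  then(onFulfilled) {
    // Promises/A+ requires asynchronous resolution.
    setTimeout(() => onFulfilled(this.value), 0);
  }
}

// Promise.resolve assimilates any thenable into a native promise.
const native = Promise.resolve(new LibraryPromise(42));
console.log(native instanceof Promise); // true

native.then((v) => console.log(v)); // 42
```

Because assimilation keys on the shape of the object rather than its class, old library promises and native ones could coexist without a formal trait system.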

Both ecosystems have had these challenges, but to me it seems clear that the Rust community approached it with more care and thought on average.

Going beyond async-await, we can see similar patterns in the rest of the ecosystem. We have:

- half a dozen different component frameworks, and not one has thought to define a compatible component interface that anyone could implement.

- 10 different popular libraries for manipulating standard library collections, but not one common protocol or trait (other than iterable and isConcatSpreadable) that would make them largely work with other custom collections (e.g. MobX reactive collections or similar)

- 5 different popular bundlers and 10 older ones, but not one common trait, interface or standard for writing bundler plugins

- at a certain point, we had 3 different ways to define monorepo workspaces between npm, pnpm and yarn
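For contrast, the iterable protocol mentioned in the list above shows what a shared "trait" buys: a quick sketch of a hypothetical custom collection that implements Symbol.iterator and immediately works with every consumer written against the protocol rather than a concrete class.

```javascript
// A custom collection that opts into the iterable protocol.
// Any object with a [Symbol.iterator] method works with for...of,
// spread, Array.from, and anything else written against the protocol.
class Range {
  constructor(start, end) {
    this.start = start;
    this.end = end;
  }
  // Generator method: each call returns a fresh iterator,
  // so the collection can be iterated multiple times.
  *[Symbol.iterator]() {
    for (let i = this.start; i < this.end; i++) yield i;
  }
}

const r = new Range(1, 4);
console.log([...r]);         // [ 1, 2, 3 ]
console.log(Array.from(r));  // [ 1, 2, 3 ]
console.log(Math.max(...r)); // 3
```

This is the kind of cross-library leverage the comment argues is missing for components, bundler plugins, and workspaces.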

So for a variety of reasons we have a very "internally incompatible" ecosystem. You can barely get components to work together, and what works together today is almost guaranteed to break tomorrow.

I think that if we recognize this is a problem collectively, there may be a way out. There is very high interest in some maintainers for doing this, especially those that care about compatibility. This is because one-sided care is not enough - the side you want to be compatible with has to care and prioritize it too. Those maintainers could decide to form a sub-community that prioritizes compatibility and continuity amongst them, as well as a promise to their users.

The recognition has to be more widespread than just in maintainers though, as frameworks that have done this before (e.g. Ember) have been largely left behind due to hype propping up the latest shiniest thing.


> "React" has been around for 10 years, but in practice it's been 3.5 different frameworks

> the class version (still close enough to original)

React.createClass existed because React is so old that not every browser supported native classes, and it was ditched when browser support improved. It was a relatively seamless transition. The rest of your bullet points are basically restating what we've already discussed. There have been two major changes in the past decade: hooks and server components. Again, I'm fine with two changes over the course of more than a decade.

To give you an example of a Go project that is roughly the same age as React and has introduced major breaking changes, take a look at InfluxDB. It went from using SQL and InfluxQL as a query language in v1, to Flux in v2 as the headlining feature, then back to SQL and InfluxQL in version 3 recently [0].

> You can't use more than one context in React classes

You can, it just requires render props, which are verbose (as I've already stated). There's a whole section of the legacy docs on this. This was one of the many motivations for the creation of hooks.
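To show where the verbosity comes from without pulling in React itself, here is a plain-function sketch of the render-prop shape (makeConsumer and the two consumers are hypothetical stand-ins for Context.Consumer, not React APIs):

```javascript
// A plain-function stand-in for React's Context.Consumer render-prop
// API (no React here, just the shape of the pattern). Each "consumer"
// takes a render callback and passes it the context value.
const makeConsumer = (value) => (render) => render(value);

const ThemeConsumer = makeConsumer("dark");
const UserConsumer = makeConsumer("alice");

// Consuming two contexts in a class component meant nesting one
// render prop inside another; the pyramid deepens with each context.
const rendered = ThemeConsumer((theme) =>
  UserConsumer((user) => `theme=${theme}, user=${user}`)
);

console.log(rendered); // theme=dark, user=alice
```

Each additional context adds another level of nesting, which is the ergonomic problem hooks flattened.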

> 10 different popular libraries for manipulating standard library collections

Not really. Lodash is obviously the standard choice, and at one point Underscore had a bit of a following too. We wouldn't need these libraries if TC39 didn't drag its feet implementing the standard library, but browser technology has many stakeholders so it's a slow process.

> In contrast in JS land, we had callbacks which were not even values, Promises (which had a working group that actually cared about compatibility), monadic Futures, observables, thunks, event-based interfaces, a variety of libraries. This came to be dominated with node-style callbacks, and then what got standardized were promises which instantly made most of the library ecosystem incompatible with the standardized syntax.

I disagree with your characterization of async in JS. First of all you're tossing in a bunch of fringe concurrency techniques to distort history. There have been three mainstream ways of doing concurrency in the three or so decades that we've had JavaScript (in the following order). Callbacks (which is the primitive we were originally given from the browser and node emulated, e.g. addEventListener), promises (which have been standardized incredibly well, e.g. Bluebird promises still work after all these years), and async/await syntax which is a nice layer of syntactic sugar on top of promises that every language with promises eventually adopts.
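A minimal sketch of those three mainstream styles side by side (readConfig is a hypothetical callback-style primitive, used only for illustration):

```javascript
// 1. Node-style callback: the result is delivered via (err, res).
function readConfig(cb) {
  setTimeout(() => cb(null, { port: 8080 }), 0);
}

// 2. Promise: wrap the callback API once, then compose with .then().
function readConfigP() {
  return new Promise((resolve, reject) => {
    readConfig((err, res) => (err ? reject(err) : resolve(res)));
  });
}

// 3. async/await: syntactic sugar over the same promise.
async function main() {
  const config = await readConfigP();
  console.log(config.port); // 8080
}

main();
```

The layering is the point: each later style builds on the previous one rather than replacing it, which is why Bluebird-era code still interoperates today.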

Functional style monadic futures were never a mainstream way of doing things, to the point where people proposing them were told that they were living in fantasy land (which is why they literally created a library called fantasy land). Observables and thunks are not specific to JavaScript. RxJS (the most popular library for handling observables) has analogues in every popular programming language, and Reactive Extensions were actually originally created by Microsoft for C#.

> Example design that would prioritize compatibility and continuity while still enabling all the current features of hooks:

This is an HN comment you wrote with a small snippet of code. I would hesitate to call it a comprehensive design without peer review. You certainly could've submitted this during the RFC on hooks. There was a lengthy discussion before the design was finalized (which another commenter mentioned when you posted that comment to HN).

> Again, this is _not enough_ attention to backward compatibility and stability by the standards of other ecosystems, even older ones like Python.

Your definition of stability must be wildly different from mine then. Every time I have to install a python library on a new machine I audibly groan. There's a great article about this, and an associated discussion on HN and lobste.rs that I recommend [1][2][3]. Python is complex enough that it literally drove the adoption of Docker, because it was easier to ship an entire containerized OS than to instruct people on how to install Python packages.

This conversation is getting a bit lengthy, but I'll leave you with this:

- The most popular programming language on earth will inevitably explore the design space a lot, because there are so many people working with the language and thinking about these problems. As Bjarne Stroustrup said "There are only two kinds of languages: the ones people complain about and the ones nobody uses".

- Older languages will inevitably have more cruft than newer ones, because design decisions tend to accrete over time.

- JS is in a particularly difficult position because it runs in both the browser and the server, and because it's built on web standards and TC39 uses design by committee (there's no BDFL supervising JS).

[0] https://news.ycombinator.com/item?id=37614611

[1] https://www.bitecode.dev/p/why-not-tell-people-to-simply-use

[2] https://lobste.rs/s/vtghvu/why_not_tell_people_simply_use_py...

[3] https://news.ycombinator.com/item?id=36308241


One last bit

> You certainly could've submitted this during the RFC on hooks. There was a lengthy discussion before the design was finalized (which another commenter mentioned when you posted that comment to HN).

There were RFCs submitted. My conclusion from that entire discussion was that the React team proceeded the way they'd already decided anyway, without providing very convincing explanations, and the priorities they had didn't align with most of the community. I believe this is in line with how other Facebook OSS works. React has largely been built with internal clients and their needs in mind. Additionally, there seemed to be a desire to continue to be "iconoclastic" and innovative and power through criticism, in a similar manner to the initial React release.

That's all perfectly fine, but in my opinion it's not what the community needs, and we'd be better off moving on to other solutions (e.g. Solid, Svelte). But more important than that, we need to move on to a different mindset, because without that mindset change we're likely to go through the same problems over and over again.


There are very few projects, in Javascript or elsewhere, that care about backwards compatibility the way React does.

You can still easily put a class-based component inside your modern hook-based codebase, and it will work (I did it this year :) )

But I think they kinda dropped the ball with Suspense and server components.


This is not the definition of backward compatibility I have in mind. Backward compatibility also includes "continuity", that is, new features flowing from the previous design.

React has been redesigned twice, and now with server components it's even less of an extension of previous ways of doing things.

Building 3 different frameworks, where the old way to write code still works but nobody builds on top of it in any meaningfully useful way, satisfies only a very minimal definition of "backward compatibility".


> I disagree with your characterization of async in JS. First of all you're tossing in a bunch of fringe concurrency techniques to distort history. There have been three mainstream ways of doing concurrency in the three or so decades that we've had JavaScript (in the following order). Callbacks (which is the primitive we were originally given from the browser and node emulated, e.g. addEventListener), promises (which have been standardized incredibly well, e.g. Bluebird promises still work after all these years), and async/await syntax which is a nice layer of syntactic sugar on top of promises that every language with promises eventually adopts.

I am (was) one of the maintainers of Bluebird. I'm well aware of the exact state of the node-dominated ecosystem at the time. Promises weren't much more popular than the "fringe" libraries you're describing, and (err, res) style callbacks largely dominated in the ecosystem. And yet it didn't matter - promises got standardized, even though the node community largely disliked them. Very different from the Rust process.

> Your definition of stability must be wildly different from mine then. Every time I have to install a python library on a new machine I audibly groan.

I'm aware that the Python packaging story sucks. However, there is broad awareness and a strong desire to fix it. There is standardization work to get the ecosystem into a state that is largely coherent. There is much less of that desire in JS: the thought of coming up with a "standard component interface", for example (i.e. a cross-framework compatible way of writing components), would be met with ridicule. (Note that I don't mean web components - I mean a common interface in the "trait" sense.)

The Python article you linked itself admits "experienced people will know what to do, this is not for them". Well, that's not the case in JS - it's actually really easy to get a combination of tools that simply will never, ever work with each other, just because they weren't designed with any common interface or compatibility in mind - regardless of how experienced you are. For example, at some (relatively short) point in time around 2019/2020, react-native simply didn't work with pnpm, because it assumed that packages installed together would have access to the same node_modules, so some of the packages didn't have the correct dependencies listed; pnpm, being strict, didn't provide the unlisted ones, which meant react-native was broken. (Later pnpm added a mode just for this sort of brokenness, but these sorts of limitations pop up all the time.)

> JS is in a particularly difficult position because it runs in both the browser and the server, and because it's built on web standards and TC39 uses design by committee (there's no BDFL supervising JS).

There's been plenty of time to fix things. It's not just about the ability, it's about the general attitude in the community and our priorities. Before things get better, we have to admit they're really bad, and that we need to focus more on compatibility, continuity and collaboration instead of ignoring them in the pursuit of the latest "innovative" idea.


> promises got standardized, even though the node community largely disliked them.

Node is a phenomenal project, but it doesn't dictate the entire direction of the language. I have a hard time chastising TC39 for standardizing promises simply because Node developers didn't like the way it affected their core dumps. Like I said, JS is in a unique position as a language that runs on the server and in the browser. There are lots of stakeholders and not everyone is going to be happy, but I personally prefer promises over the callback hell we had before.

Your comments on Python:

> I'm aware that the python packaging story sucks. However there is broad awareness and strong desire to fix this.

Your comments on JS:

> There's been plenty of time to fix things

Python is one of the oldest programming languages that most engineers will likely ever work with. It was released the same year as Linux. Hasn't Python had "plenty of time to fix things"?

> The Python article you linked itself admits "experienced people will know what to do, this is not for them".

First of all, the actual quote is "The people that will understand it’s not for them will self-exclude from the recommendations, as they should, since they know what to do anyway." The article also says the following: "I could not tag this as being “for beginners”, because it is not that simple. I know veterans that still struggle with this stuff".

But besides that, why in the world should you have to be an experienced engineer to figure out how to download and run packages of code from the internet? This is table stakes for most languages. You want third party JS frameworks to form a working group to establish interoperability before you'll grant the JS ecosystem your blessing, and yet you'll accept Python generally being bad at packaging code. You'll excuse anything when you're evaluating your favorite languages, but you nitpick when it comes to JS.

> For example, at some (relatively short) point in time around 2019/2020, react-native simply didn't work with pnpm

So you'll accept that the general Python packaging story is bad right now and has been for decades, and your only retort is that for an admittedly short period of time 3 or 4 years ago a single package supposedly had a bug related to pnpm (the least popular package manager)? Meanwhile the Python community is still debating how to install packages locally in a folder [1].

You don't think you're maybe contorting yourself to make some of these arguments work?

[1] https://discuss.python.org/t/pep-582-python-local-packages-d...


> You don't think you're maybe contorting yourself to make some of these arguments work?

First of all, I still prefer TypeScript to Python when it comes to language features. So no, I'm not defending my favorite language here - if anything, I'm "attacking" it.

Secondly, I don't really think in terms of defending favorite languages - they all come with strengths and weaknesses, and I think it's important to talk about and acknowledge both aspects, even for languages we really enjoy. That's the only way for languages to move forward and improve.

That said, being embedded in an ecosystem for a long time makes us used to its strengths and desensitised to its weaknesses. This in turn makes it harder for us to really see the ecosystem's standing compared to others.

I'm trying to provide a perspective as someone who has worked in the ecosystem for a long time and has recently started exploring other options.

Are JS package managers better than Python's? Certainly. If you took a sampling of various tools and libraries in Python, would you be far more likely to have them work together than a sampling of various tools in JS? Absolutely.

I'm a relative beginner to Python compared to JS/TS, but I had a chance to work with it more recently. I had no trouble dealing with pip, venvs and Poetry. Yes, there were challenges and annoyances, but at least it was possible to overcome them with sufficient knowledge. I agree though - Python has a long way to go to improve this.

However, if I compare this with my previous JS experience, it's a different order of magnitude. No amount of knowledge would've made styled-components work with Next's app router - I simply had to start over and make different choices for my SSR framework and UI library (the alternative was to help with Mantine's migration away from Emotion). So what do I do with my other, already well-developed side project that uses Mantine and Next pages router? I can continue to use the pages router, but for how long? Who knows - it's not like there is any concept of LTS, or even any kind of thought on that topic, when it comes to Next.js anyway (e.g. https://github.com/vercel/next.js/discussions/41906)

At the end of the day, IMO the main difference is attitude. There is broad acknowledgement in the Python community of their problems, and they're considered important to solve. I don't see anything like that in JS - compatibility and continuity are just not things most authors want to think about. Innovation is (still) priority #1.

I realize it's hard to make a convincing statement about frequency/magnitude/probability/attitude though. Similar examples will exist across ecosystems. What kind of data would convince you that the situation in the JS ecosystem is bad enough that it warrants a priority shift in the community?


> what do I do with my other, already well-developed side project that uses Mantine and Next pages router? I can continue to use the pages router, but for how long?

I honestly think css-in-js solutions that inject styles into the document head at runtime like Styled Components and Emotion were never really sustainable or performant, and server components just revealed the inadequacy of these techniques. Generating actual stylesheets at build time is obviously more performant (using Linaria, PandaCSS, Tailwind, CSS Modules, etc).

Even one of Emotion's maintainers proclaimed he was "breaking up with css-in-js" over a year ago [1]. He had this to say (before the app router was even announced): "With CSS-in-JS, there's a lot more that can go wrong, especially when using SSR and/or component libraries. In the Emotion GitHub repository, we receive tons of issues that go like this: I'm using Emotion with server-side rendering and MUI/Mantine/(another Emotion-powered component library) and it's not working because...".

Granted, even if css-in-js solutions are less than ideal, I still think Next should continue to support them for compatibility reasons. Neither of us can predict the future, but I would bet that Vercel will continue to support the pages router given that they have customers to answer to. Money is a powerful motivator.

> At the end of the day, IMO the main difference is attitude.

> I realize its hard to make a convincing statement about frequency/magnitude/probability/attitude though.

Like you said the problem is that it's hard to make a convincing statement about community attitude. All we can do is go back and forth with anecdotal evidence.

You'll speculate about whether or not Vercel will continue to support its pages router, and I'll provide my own anecdotes like the example I gave of InfluxDB (a Go time series database) deprecating its entire database query language on two different occasions, which fundamentally broke the entire product.

> What kind of data would convince you that the situation in the JS ecosystem is bad enough that it warrants a priority shift in the community?

The only data either of us have is anecdata, which is why we're kind of at a standstill here.

[1] https://dev.to/srmagura/why-were-breaking-up-wiht-css-in-js-...


> I honestly think css-in-js solutions that inject styles into the document head at runtime like Styled Components and Emotion were never really sustainable or performant, and server components just revealed the inadequacy of these techniques.

Sure. But here we have one that happens to be used by most popular UI toolkits. Forget about Mantine, it feels like MUI is in every other project.

Now again, anecdotally, outside of the JS community the approach and attitude of maintainers would typically be:

- how do I make this extremely widely used package have fewer issues, with minimal backward compatibility breakage

- can I additively introduce features that are not prone to the issues

not "I'm starting over"

Just in the article comments there are 5 other people pointing out a variety of issues with the author's new favorite choice. And they're all kinda right - all choices have tradeoffs. That's precisely why only dedication to compatibility and continuity can grow any one of them into something that is still sensible to use while acknowledging its limitations.

This is IMO a more sensible approach to software engineering:

- Start with an approach that is significantly stronger in a certain aspect

- Acknowledge its limitations

- Work to ameliorate those as much as possible

P.S. the article about Emotion is also an example of how React's "no wait, this was always a bad idea because <X>" (where X in this case was concurrent mode) is causing rippling breaking changes. Again, this is their prerogative and at the end of the day, they have their own priorities internally at Facebook. But that's not necessarily healthy for the wider community.

Also FWIW, InfluxDB's approach to breaking changes has been compared to "a random npm library" https://news.ycombinator.com/item?id=37206194

> The only data either of us have is anecdata, which is why we're kind of at a standstill here.

This discussion has also been useful for helping me pinpoint what exactly is different about JS when I'm working with it.

I'm still hoping I was at least somewhat convincing, because I don't think this is a problem that can be tackled without wider acknowledgement from the JS community.


> - can I additively introduce features that are not prone to the issues

> not "I'm starting over"

React and Next DID additively introduce server components. The old way of doing things is not deprecated. Your entire argument is predicated on the idea that the old way of doing things will eventually be supplanted (which is pure speculation).

> P.S. the article about Emotion is also an example of how React's "no wait, this was always a bad idea because <X>" (where X in this case was concurrent mode) is causing rippling breaking changes.

It's not a bad idea because of concurrent mode, it's a bad idea because generating styles at runtime in a user's browser is much slower than building a static css file at compile time and serving it like any other static file. Ironically you want JS to be like every other ecosystem and that's what we're doing (everyone else outside of JS is using Tailwind and static stylesheets), and you're upset about it.

The frontend ecosystem would have done this from the beginning, but analyzing components with dynamic styling and extracting those styles at build time is non-trivial. It's essentially like building a small compiler (Tailwind in fact shipped a just-in-time compiler in v2.1). The simplest solution (from an implementation perspective) is to just use JS to dynamically generate styles at runtime, and that's what early frameworks like Emotion and Styled Components did. The state of the art has advanced, which, as a technology-oriented person, I'm ok with. Like I said several comments ago, technology should be stable, not frozen. But again, you're allowed to keep using these bloated UI toolkits because server components are optional.
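To make the build-time idea concrete, here is a toy sketch (not any real library's API; extractStyles is hypothetical): turn a style object into a deterministic class name plus a static CSS string that could be written to a .css file at compile time, instead of injecting style tags into the document head at runtime.

```javascript
// Toy build-time extraction: style objects in, static CSS out.
function extractStyles(styles) {
  let css = "";
  const classNames = {};
  for (const [name, rules] of Object.entries(styles)) {
    // Deterministic class name; real tools hash the rule contents.
    const className = `c-${name}`;
    const body = Object.entries(rules)
      .map(
        ([prop, value]) =>
          // camelCase -> kebab-case, e.g. backgroundColor -> background-color
          `${prop.replace(/[A-Z]/g, (m) => "-" + m.toLowerCase())}:${value}`
      )
      .join(";");
    css += `.${className}{${body}}\n`;
    classNames[name] = className;
  }
  return { css, classNames };
}

const { css, classNames } = extractStyles({
  button: { backgroundColor: "blue", color: "white" },
});
console.log(classNames.button); // c-button
console.log(css); // .c-button{background-color:blue;color:white}
```

The hard part real tools face is that React styles are arbitrary JS expressions, so static extraction requires analyzing (or constraining) that code, which is why it took the ecosystem years to get there.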

> Also [for what it's worth], InfluxDB's approach to breaking changes has been compared to "a random npm library" https://news.ycombinator.com/item?id=37206194

It's worth very little. I gave you an example of a massive breaking change in another ecosystem and your rejoinder is "well some guy named 'somedude82' in an unpopular HN post compared it to npm". The arguments are getting weaker as this discussion drags on. By the way, I actually read that comment as saying "a paid database product shouldn't be as unstable as a random package". Since we're trotting out supporting HN comments, here's one from this very discussion that disagreed with you about React's stability [1].

> This discussion has been also been useful for helping me pinpoint what exactly is different about JS when I'm working with it.

This discussion has had the exact opposite effect for me. You seem to be willing to excuse any and all flaws in other ecosystems (Python's packaging story, the InfluxDB breaking changes, etc).

[1] https://news.ycombinator.com/item?id=38742422


> well some guy named 'somedude82' in an unpopular HN post compared it to npm

It implies that the JS ecosystem is well known for behavior just like the InfluxDB example - it's the benchmark example of such behavior. That was my point.

> The state of the art has advanced.

Here is where I have the most fundamental disagreement. My position is: "It hasn't advanced significantly because the JS ecosystem wants to restart every time it gets difficult". Well, ok, it's a bit more complex than that (external pressures from React and other driving-force libraries/frameworks, FOMO and all that), but at the end of the day, the actual progress has been very slow.

> This discussion has had the exact opposite effect for me.

I'm sorry about that, and if I think about it, I guess the discussion didn't do that for me either. In fact, if I look at my Hacker News history, I was probably arguing the opposite side quite a bit until 2-3 years ago. For me it was probably trying out Rust. Which isn't to say I think Rust is perfect (I'm still not sure it makes sense to spend any effort thinking about memory management in the vast majority of cases, even with a compiler helping), but it provided a very interesting high-contrast point of comparison for me.


> It implies that the JS ecosystem is well known for behavior just like the InfluxDB example - it's the benchmark example of such behavior. That was my point.

To me it implies that rather than self reflecting and admitting that other ecosystems have their own continuity problems, developers would rather respond with these tired tropes about npm. Sometimes a technology or ecosystem gains a bad initial reputation, and because of the tribalism that's so prevalent in tech, it's hard to dispel these tropes. Before JS it was PHP that was the butt of every joke, despite PHP making massive strides in usability, performance, and security. It took years for developers to stop thinking of PHP as a "fractal of bad design" [1]. JS is struggling with these same reputation problems. But I'm not sure if that random HN comment even warrants this much discussion — it's not like the author provided any actual examples.

> "It hasn't advanced significantly because the JS ecosystem wants to restart every time it gets difficult".

> but at the end of the day, the actual progress has been very slow.

I think you're drastically overestimating the scale of these changes. React hasn't "restarted". You act like the React developers decided to throw out the whole api and add Vue/Svelte style templates or something.

The InfluxDB example I gave you is actually a much better fit for your criticism. A database is nothing if you can't query it, so changing the query language twice is much closer to restarting. And given that they've reintroduced the original query language from version 1 in version 3, not only is actual progress slow, but it's actually a regression rather than a progression. This would be the equivalent of React deprecating hooks, and going back to class components after spending years promoting hooks.

> In fact, if I look at my Hacker News history, I was probably arguing the opposite side quite a bit until 2-3 years ago. For me it was probably trying out Rust.

Rust is cool and I like that the Rust developers are trying something innovative with the borrow checker, but I don't find myself pining for the Rust development experience after having played with it for a bit.

- The compile times can be horrendous.

- Cross compiling is much more complex compared to Go.

- The standard library is lacking a bit (which says a lot coming from a JS developer). I don't expect every language to ship with an http server like JS and Go do, but I remember needing to pull in third party crates for simple things like making an http request, or calculating the shasum of a file. One of the most popular libraries for making http requests has 49 dependencies [2]. The entire async runtime has been outsourced to a third party package (tokio).

- The 6 week release cadence is a bit fast for me, and despite all your talk of standardization, Rust still doesn't have a formal specification [3].

- The 'unsafe' keyword gets thrown around a lot in Rust crates. As a newcomer, my options are to audit every dependency and hope there isn't a memory safety vulnerability, or cross my fingers and hope the packages are of high quality.

[1] https://eev.ee/blog/2012/04/09/php-a-fractal-of-bad-design/

[2] https://crates.io/crates/reqwest/0.11.23/dependencies

[3] https://blog.rust-lang.org/inside-rust/2023/11/15/spec-visio...



