
As far as I remember, it was used to pass API requests for third-party services (like APNS) through the main Bitwarden servers, so that Bitwarden's secret keys for these services weren't published, but self-hosters also wouldn't have to register and manage their own accounts for these services, which can be complicated and expensive. (To get access to APNS you have to pay the $100/yr Apple developer subscription, and you can only use it for your own apps, so you would have to build and distribute your own build of the app via the App Store or TestFlight.)


I heavily disagree with most of this comment. Static linking versus dynamic linking is a trade-off, and despite the shared-bandwidth disadvantage and the difficulty of replacing the dependencies of a statically-compiled binary, there are a lot of advantages - chief among them ease of distribution. Because the binary is mostly self-contained, it's far easier to install, works in more environments, and is less likely to break as time passes and the systems around it change. These advantages greatly outweigh the disadvantages when developing the program, for self-explanatory reasons, but also when distributing it. Many developers in languages where dependencies aren't included within the built asset bundle them anyway when distributing, for exactly this reason.
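You can see the "mostly" for yourself (a sketch - `myapp` is a made-up crate name, and the exact list varies by system, but a typical Rust binary on Linux only links a handful of system libraries):

  $ cargo build --release
  $ ldd target/release/myapp
        linux-vdso.so.1
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6
        /lib64/ld-linux-x86-64.so.2

Everything else - the entire crate dependency tree - is compiled in.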

In regards to your second point, I will point you to the oft-linked https://wiki.alopex.li/LetsBeRealAboutDependencies .

Finally, your third point. Although much of the ecosystem is changing at a rapid pace, core and major packages - the ones most directly depended on - have either made very strong commitments to stability, have a proven track record of few or no breaking changes, or both. Additionally, due to the nature of the package manager (and static linking), it's very easy to freeze the churning sections of the ecosystem for your application, keep using the important dependencies, and even patch bugs you're encountering.
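For example, something like this in your Cargo.toml (crate names made up for illustration):

  [dependencies]
  # Freeze a fast-moving dependency at an exact version
  churning-crate = "=0.4.2"
  buggy-crate = "1.2"

  [patch.crates-io]
  # Substitute a local checkout that carries your own bug fix
  buggy-crate = { path = "vendored/buggy-crate" }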

Additionally, the fact that Rust's ecosystem is advancing at a rapid pace means that, well, it's advancing at a rapid pace. Most packages are high-quality and useful, and many are far ahead of what other languages have to offer, like the serde and regex crates. This shows in the applications built with these libraries - you've probably heard of ripgrep, originally built almost purely to test the regex crate, or JQL, built on serde_json. There are many more amazing crates than I could possibly mention, and many more software projects, like OneSignal's notification systems, that showcase how well the projects in the Rust ecosystem have been put together.
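To give a taste of why serde gets so much praise, here's a minimal sketch (type and field names invented; needs serde with the derive feature, plus serde_json):

  use serde::{Deserialize, Serialize};

  #[derive(Serialize, Deserialize)]
  struct Server {
      host: String,
      port: u16,
  }

  fn main() {
      // Parse straight into a typed struct - no manual field plumbing
      let s: Server =
          serde_json::from_str(r#"{"host": "localhost", "port": 8080}"#).unwrap();
      // ...and serialize back out again
      println!("{}", serde_json::to_string(&s).unwrap());
  }

The same derive works for YAML, TOML, MessagePack and so on, just by swapping out the format crate.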

In short, most of the downsides you mentioned are actually positives, and those positives really do affect the applications. Rust does have negatives, but it has plenty of positives, all of which have very concrete effects.

This is a bit of a wall of text, but your first paragraph irked me. Also, Gimp, OpenOffice and VLC are already huge. I don't use FileZilla, but I would expect similar.


> Because the binary is mostly self-contained, it's far easier to install, can work in more environments and less likely to break as time passes and the systems around it change.

That can't be the whole story, otherwise we would not have operating systems - or at least, we would not have anything more than a kernel with nothing on top. There are good reasons for software to be a part of a common system, for example when you want components of the software to integrate and communicate with each other. For example, you probably do not want to log in to each app again with the same username.

This is different from a web browser, which, in a way, is an operating system and a platform itself, and requires relatively little integration with the surrounding system (these days, often not even a PDF reader). But as a consequence, you get what is called the "inner-platform effect", and I think there are few places where this term is more appropriate than in web browsers.


> Additionally, due to the nature of the package manager (and static linking), it's very easy to freeze the churning sections of the ecosystem for your application, use the important dependencies and even patch bugs you're encountering.

Bug fixes are rarely back-ported, and if one wants to use new features, it is all too common to be forced onto a partially incompatible version. And it is an open secret that most software users do not like that. They do not like change, because they pay most of the cost.

The right way to do this is instead to have long-term backwards-compatible, ultra-stable APIs, and to do development on top of these. And one cannot say that this isn't possible - it is precisely what the Linux kernel does.


> In regards to your second point, I will point you to the oft-linked https://wiki.alopex.li/LetsBeRealAboutDependencies .

This is not a convincing argument. Rather, it confirms that there is a general problem: as its first example, the post points to a typical C++ program, RViz, which has 36 direct dependencies and 133 total dynamic dependencies. The second example, VLC, has 495 dependencies.

Yes, it is written in C++. Does this prove that a high number of dependencies is not a problem? No. For example, here is a list of security issues in VLC (and I do not think it is exhaustive):

https://www.cvedetails.com/vulnerability-list/vendor_id-5842...

One can bet that the number of general bugs is much higher, since a security issue is just a special type of bug. So, do you want to download a new image of a Rust-written VLC every time a bug or security issue is discovered in one of its dependencies? Another regular browser-sized update just for the video player? And another regular fat download for the Rust version of vim? Perhaps on program start?

What can clearly be observed is that in larger programs, the number of dependencies is growing more or less exponentially, and that is becoming, or will become, very difficult to manage within the next few years.

This is a general tendency, and it is only going to get worse. Take Kubernetes (https://lwn.net/Articles/835599/) as an example. Or TensorFlow. This is clearly not sustainable.

And this causes complex problems. A very simple example: what happens if you have a program which depends on two libraries, which both in turn depend on a third library, but only accept different, mutually incompatible versions of it? Vendoring or bundling into static libs is not going to solve that - the only solution is strict adherence to backwards-compatible components.
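Concretely (`libfoo` being a placeholder name):

  app
   |- liba -> requires libfoo 1.x
   '- libb -> requires libfoo 2.x

A build tool can compile both versions in, but the moment liba hands a libfoo value across to libb, the two copies' types do not line up, and no amount of bundling fixes that.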

Bundling everything as static compiled-in libraries is only going to make the problem worse. It sure makes it easier to produce a program at first, but it makes it more difficult to maintain it. And maintenance is most of a program's life cycle and most of the work involved.

And the fact is that the snowballing, exponentially rising number of dependencies will also completely dwarf the effect of Rust's better memory safety as a language. If the result is very good, Rust might avoid 90% of security issues. So it might have one-tenth the number of errors in the first place, but this would not help in a near future where complex applications have a thousand or several thousand dependencies. One-tenth of the number of serious bugs in VLC is still a large number, for something that handles untrusted input. This is simply going to become unmanageable, if it isn't already. Making it easier to include dependencies is not the solution, and including dependencies as static libraries is not helpful for end-user systems.


Multi-cursor editing is the fastest way to accurately modify large amounts of text, especially repetitive text. It's well worth learning for your respective editor. Most other editing methods don't come close, especially in terms of speed.


But it’s still just search and replace right, like renaming a variable or function, or am I missing something?


What you are missing is that you get cursors that you can manipulate. Overwriting what they selected is the obvious thing (and then it would indeed be the same as search and replace) - but there are some less obvious things that make this powerful in a different way. E.g.:

  1. Select the next 3 occurrences of "foo"

  2. Press End on the keyboard to jump to the end of the line (even if the part after "foo" was different each time)

  3. Start typing away

You can also use the arrow keys to move the cursors, hold Shift to make selections, Alt+arrow to jump to the next/previous word, and so on. Of course, all of this can be done with regexes as well, but it usually takes longer to come up with just the right one, even if you are good with them.
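For comparison, a rough regex equivalent of the three steps above in Vim syntax (with made-up appended text, and note that it hits every matching line rather than just the next 3 occurrences):

  " on every line containing "foo", append text at the end of the line
  :g/foo/s/$/ some new text/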


The difference is the interactivity and instantaneous feedback. I don't want to have to blindly write a regex substitution when I could instead be seeing what is happening at each step, right in front of me.


Neovim gives you a preview of the result of your regex replacement, and with the ability to reference capture groups it's pretty powerful.
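For anyone who hasn't found it, the option is `inccommand`:

  " show a live preview of :substitute results, in a split window
  :set inccommand=split
  " e.g. swap two dash-separated fields, watching the groups apply as you type
  :%s/\v(\w+)-(\w+)/\2-\1/g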


Yeah, I'm a neovim user, and I enjoy that feature. There are just times when multiple cursors would be ever so slightly more ergonomic.


Vim also has it with `:set hlsearch incsearch`.


That doesn't preview the result of the regex, just the text that would be replaced IIRC.


Disabling the `hydrate` option does this (https://kit.svelte.dev/docs#configuration-hydrate).
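In svelte.config.js that would look something like this (a sketch based on the linked docs):

  // svelte.config.js
  export default {
    kit: {
      hydrate: false
    }
  };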


> The site appeared as a flat background old style early 2000’s sort of thing with bits of comic sans in it.

That's surprising - even external stuff usually has to follow the NHS's design system.

https://service-manual.nhs.uk/design-system


FYI, you can use ctrl+l to get the path bar as just plain text that you can edit.


!!! Well, I’ll be.


The most commonly used parsers only accept valid JSON - including the one included within most JS runtimes (JSON.stringify/parse). VSCode explicitly uses a `jsonc` parser, the only difference being that it strips comments before it parses the JSON. There's also such a thing as `json5`, which has a few extra features inspired by ES5. None of them allows unquoted strings. I've never come across anything JSON-like with unquoted strings other than YAML, and everything not entirely compliant with the spec has a different name.
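Roughly, the differences look like this (from memory, so double-check against the specs):

  // JSON: strict - all keys and strings double-quoted, no comments
  { "name": "example" }

  // JSONC (what VS Code parses): JSON plus comments
  { "name": "example" } // now legal

  // JSON5: comments, single quotes, trailing commas, unquoted *keys*
  { name: 'example', }

Note that even JSON5 only relaxes the quoting for keys - string values still have to be quoted.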


I think you misunderstood what they're saying - you can get it for free no matter the user count, as it's open-source software. They only do support contracts with organisations of more than 20 people.


You are wrong.

https://wiki.documentfoundation.org/Development/LibreOffice_...

Keep in mind that every development version (even the official one) has a maximum limit of 20 concurrent connections.


Because of this unsubstantiated and baseless claim:

> Support a stable kernel API probably wouldn't be that expensive


> or altered the login flow

Mojang recently announced that all mojang.com accounts would be migrated to Microsoft accounts. https://www.minecraft.net/en-us/mojang-account-move

