Exactly. All of these things that ship data off to cloud services are basically data breaches waiting to happen. The fact that devices like this put not only the owner at risk, but also others who didn't consent to having their conversations recorded and shipped to a third party, is even worse.

The UNIX philosophy is being a bit abused in this argument. Most systems that fall under the UNIX category are more or less like a large batteries-included standard library: lots of little composable units that ship together. UNIX in practice is not about getting a bare system, randomly downloading disjointed things like tee and cat and head from all over, gluing them together, and perpetually having to keep them updated independently.


They ship together because all of those small composable units, which were once developed by random people, were turned into a meta-package at some point. I agree with you that randomly downloading a bunch of disjointed things without auditing and forking them isn't good practice.

I'm also not arguing against a large popular project with a lot of contributors if it's made up of a lot of small, modular, self-contained code that's composed together and customizable. All the smaller tools will probably work seamlessly together. I think UNIX still operates under this sort of model (the BSDs).

There's a lot of code duplication and bad code out there, and way too much software that you can't easily modify or customize for your use case, because that becomes an afterthought. Even if you did learn a larger codebase, if it isn't made up of smaller modular parts, whatever you modify has a much higher chance of breaking when the library gets updated: you changed internal code, and the library authors aren't going to worry about breaking changes for someone maintaining a fork that touches their internals.


> all of those small composable units, that were once developed by random people, were turned into a meta-package at some point

No they weren't. Every UNIX I used in the 80s and 90s shipped with those little composable building blocks as part of the OS, and GNU has bundled them in things like coreutils forever. There was never some past time when independent little tools like cat and wc, written by random people on the internet, somehow got bundled into a meta-package after the fact. That didn't happen.


They were developed by different authors...


They have evolved together, though, not in isolation from each other.


So are different functions and modules in the standard library of a large batteries-included language like Python, Java or Go.

But in fact, most of the traditional utilities you know and love were first implemented by a small team at Bell Labs, with the most "core" of what we now call coreutils (e.g. "ls") written by Ken Thompson and Dennis Ritchie themselves. The other significant chunk of utilities (commands like vi, more, less, netstat, head) was developed by a group of people at UC Berkeley and released later as the Berkeley Software Distribution (BSD).

GNU Coreutils, as we know them, are mostly re-implementations of the original Unix commands (incidentally, a lot of them were initially implemented by two guys as well: Richard Stallman and David MacKenzie). This is no small feat, but the GNU coreutils authors took good care to maintain compatibility with existing UNIX tools (and later with the POSIX standard when it was released). They didn't just implement commands in a vacuum and wait for someone to come along and bundle them together. It's worth noting that if you're using a Mac, you're not even using GNU coreutils: most of the core commands in macOS are derived from FreeBSD, which traces its lineage back to the original UNIX implementation.

The fact is, most of the individual commands that are now in coreutils were never released individually: they were generally released as a distribution, usually developed together and specifically to interact with the other tools. The Unix Philosophy could not have developed if Unix did not have a decent basic set of core utilities to begin with; in fact, core utilities like cat predate Unix pipe support (which was only introduced in Version 3).

The availability of a reliable set of core commands, guaranteed to be present and following a strict IEEE/ISO standard(!), was pretty important for the development of an ecosystem of non-core commands built on top of them. Imagine what would have happened if some commands used fd 0 for stdin and fd 1 for stdout, but others used different file descriptors, or an entirely different output mechanism such as a custom system call or sending data through a socket. Interoperability would be much harder.
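To make that concrete, here's a minimal sketch of a Unix-style filter, written as TypeScript for Node purely for illustration (the file names are made up):

    // Toy `wc -l`: count newline bytes arriving on stdin (fd 0) and print
    // the total to stdout (fd 1). The program knows nothing about the tools
    // around it; the shared fd convention is what makes pipelines work.
    let lines = 0;
    process.stdin.on("data", (chunk: Buffer) => {
      for (const byte of chunk) if (byte === 0x0a) lines++; // '\n'
    });
    process.stdin.on("end", () => {
      process.stdout.write(`${lines}\n`);
    });

Dropped into a pipeline like `cat access.log | node count.js | tee total.txt` (names hypothetical), it interoperates with tools written decades earlier, purely because they all honor the same convention.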

But this is exactly the case in JavaScript, especially within the NPM ecosystem. The lack of a proper standard library means that every developer has to cobble together the most minuscule of utilities that are usually available as part of the standard library in other ecosystems. But what's worse is that all the various libraries don't play very well with each other. You don't have enough standard base types that you can pass between the different libraries.

For instance, there are no reliable types for describing time (besides the crappy built-in Date), input/output streams (until recently), an HTTP client, sockets, URLs, IP addresses, random number generators, filesystem APIs, etc. Some of this stuff (like the Fetch API and SubtleCrypto) exists in Node.js but has been locked behind feature flags since forever, which effectively means that you cannot use it. In the earlier days of the JS ecosystem things were much worse, since we didn't even have a standard way of specifying byte arrays (Uint8Array) or even Promises.
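To illustrate with the built-in Date, here are two of its classic footguns (this is standard behavior, nothing hypothetical):

    // Months are 0-indexed: this constructs 1 February 2024, not 1 January.
    const d = new Date(2024, 1, 1);
    // Date objects are mutable: "computing" the next month silently changes d
    // for every other piece of code holding a reference to it.
    d.setMonth(d.getMonth() + 1);
    console.log(d.getMonth()); // 2, i.e. March

Libraries like Moment and Luxon each paper over this with their own wrapper type, so values can't easily be passed between codebases that picked different libraries: the missing-base-types problem in miniature.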


Who often worked together, sometimes, gasp, in the same building! Definitely not random people on the internet who had never met or even emailed each other.


Then only use the software that Bob wrote in the same building you work at, I don't know what to tell you.

I don't think you can necessarily trust Bob's code without looking at what he wrote, just because you see him in person.

And coreutils has 200 contributors now, many of them random people and strangers from all over the world.


> randomly downloading things from a bunch of disjointed places like tee and cat and head and so on, and then gluing them together and perpetually having to keep them updated independently.

I have distressing news about my experience using Linux in the '90s


We should totally have a system like that, though. It'd be such a great learning environment.


It's called Linux From Scratch.


I've worked on multiple compilers in industry that are written in OCaml. A number of industrial static analyzers are written in OCaml too (e.g., Infer from Facebook/Meta). Yes, LLVM and GCC are the big ones written in the C/C++ family, but they don't represent everything.


And the Go, Java, Ruby, JavaScript, C#, TypeScript, PHP, Kotlin, and R compilers, and so on.

But even for hobby projects, it’s just a matter of personal preference. OCaml is great for implementing compilers. So are Go, C++, and Java.


I think we sort of knew this was the goal for some of these RTO orders: making the workplace unpleasant is a way to do a reduction in workforce without having a scary layoff. Management just shifts the decision to employees. Scummy, but not exactly a new strategy for trying to shed people without generating a ton of bad PR.


Also should let you disproportionately lay off working parents and people with long term health problems (who have higher healthcare costs on average) without technically running afoul of any laws.


I understand why advertising and marketing exist. What I wish for is people in that world who take an approach that's more respectful of users: both their privacy and their experience as users. I expect that most people ("regular" users, not tech-savvy ones) use ad blockers to make the internet tolerable. Is it really that big of an ask to respect the people you ultimately want to sell to? (OK, that's a dumb question: after 30 years of watching the internet evolve, I think I know the answer, sadly.)


Marketing executives didn't respect consumers before the internet was a thing, why would they start now?


The problem is that it's an arms race, and if you don't engage in chumbox-style shit, then others are going to kill you in the market.

It worked on Google, because nobody was allowed to do anything but text ads.

As for respecting the customer? That requires having something worth selling in the first place. Unfortunately, since 99% of everything is crap, 99.9% of ads are for things that should not be sold in the first place.


It's kind of the users' fault.

Being "respectful" is less optimal and you get outcompeted/eliminated by those that aren't "respectful". The reason is because users indirectly/subconsciously tolerate/support those that aren't "respectful".

If you want change, try convincing everyone to actively invest effort to support those that are "respectful". Good luck though, I don't think you'll get anywhere.


This has got to be one of the most out-of-touch comments I've ever read on HN. Most users of the internet aren't tech savvy and should not be preyed upon because of it.


There's some truth to it. There are some situations where it is extremely difficult or even impossible to protect people from themselves.


Isn't that sort of victim blaming? This post kind of says "people are getting wise" and the solution is "make sure everything and everyone on the internet is a fake influencer"

sigh.


No? Because the "victims" are voluntarily choosing what businesses they patronize. No one is forcing them to choose "bad business" instead of "good business" and thus causing "good business" to go out of business.

Protip: the good businesses tend to be more expensive and harder to find, and customers aren't willing to spend the extra time and money for the "good business".


Which is why you need regulation, to make sure there's a level playing field where the "bad business" can't out-compete the "good business" using methods that harm consumers.


> I know that I could beg our @dang to act as intermediary and contact the user in question on my behalf, but he's too busy and valuable to turn into a glorified phone operator :-)

Oh my god no. I intentionally am uncontactable on here and I’m sure that’s true for many others. I don’t want HN people to connect with me, open doors, etc. It may surprise some folks, but some of us don’t give a shit about networking, hustling, etc. via online message boards.

For all the high quality tech people on here, there seems to be a similar number of hucksters, get-rich-quick people, life-optimization-influencers, technical dilettantes, and other types I actively would like to avoid ever being contacted by. If the admins ever facilitate piercing that anonymity that would kill this site forever for me.


In OP's defense, this sounds like an opt-in feature. I do not think anyone is forcing you to commit to changes. He is simply proposing a contact channel for folks who want to sync up or continue discussing the subject matter.


> Oh my god no. I intentionally am uncontactable on here and I’m sure that’s true for many others.

Then don't add an email to your profile.

I have been on this site 10 years more than you, and I have yet to be contacted by a scammer, con artist or whatever else you are afraid of, FWIW. Not everyone is out to scam their fellow neighbour.

Again, no one is forcing you to become contactable.


I have received multiple emails initiated through the email in my HN profile. Most of them were legitimate contacts related to job positions or participating in open source projects, but a good 20%, which I didn't engage with, were probably scams.


So people who don’t want to be contacted by others don’t get an option for password recovery? Interesting perspective…


Adding your email to HN and adding it to your public profile are different things; you can add your email without making it public.


OP's idea was that HN's admins use the private/non-public email field in the profile to play middleman.


This surprises me. On most platforms it's just a package download and install. On Mac, it's MacTeX. On Linux, it's whatever your distro calls texlive via the package manager. On Windows, it's MiKTeX. That's not exactly complex or requiring any sort of LaTeX expertise. Linux can be the one that requires the most thinking if there isn't one package that pulls in everything you need, but I can't remember it being more than a couple minutes of effort last time I did it on Ubuntu or Fedora.


The difficulty is getting multiple collaborators to install and pin the same packages, where everyone might be using a different platform/distro.

Example: I might commit a change that compiles perfectly fine with my version of amsmath, but conflicts with the version of amsmath in the style guide of some UC Berkeley department/lab.


It requires choices and knowing what to install, and if things don't work, troubleshooting the install can be difficult. For a first-time task of "install LaTeX", it's not the easiest. Especially for newer users. I've done it half a dozen times and I'm still not quite sure I've done it right on my Mac (at least not right away).


On Mac: `brew install texlive`

Been using texlive for years (also use it on Windows)


I wasn't aware of a brew package; I will definitely check that out. I have always used the TeX Live installer for macOS (MacTeX), which is very easy to use, although the install instructions can be a bit long and are important to read when Apple breaks things.

https://tug.org/texlive/


Enjoy your 10GB of PDF documentation for packages you'll never use.


Is 10 gigs really that much nowadays? I have to think that if you're frequenting HN you're likely to have at least a terabyte in storage on your personal computer?


It’s not about the HN visitor… it’s about the collaborator or grad student who might be on an entry level computer with 8GB of RAM and 256 GB of storage. The entire system needs to be easy for them to install and maintain. And even if I have 1TB of storage, if I could avoid an extra 10GB of space in my backups, I’d appreciate it.


try MonsterWriter, it caters to exactly this group of users


The fact that you don't seem to realize that downloading 10GB of stuff just to edit/generate PDF documents is completely bonkers just shows how out of touch LaTeX aficionados are.

As far as I'm concerned, the outputs are pretty good, but until somebody really makes no-nonsense software that can do that in an efficient manner, it might as well not exist at all.


Typo in title: should be LBL (Lawrence Berkeley), not LLNL (Lawrence Livermore). Similar names, and near each other, but different labs.


Whoops! Sorry 'bout that. I misread it.


I'm not sure Microsoft will ever achieve the level of trust they'd need for features like this to be acceptable. I'm sure parts of the company care about user trust quite a bit, but those people will never be able to counter the actions of the "maximize revenue at all costs" people, who undermine trust left and right. I don't see Microsoft making "build and maintain user trust" a corporate goal they ACTUALLY try to achieve (not just a corporate feel-good statement), since "maximize shareholder value and revenue" will always win.


Uhm… The boom in AI with LLMs wouldn't have happened without about a decade of major focus on images (both generative models and DNN models that blew traditional image processing out of the water) and on planning/optimization-type problems (AlphaGo, chess, etc.). It seems incorrect to claim that chat starts every cycle.

