PSA: Intel Graphics Drivers Now Collect Telemetry by Default (techpowerup.com)
407 points by baal80spam on Aug 7, 2023 | 411 comments



Windows and its ecosystem have already become a telemetry monster; if you're fine with everything else, you ought to be fine with this. The linked Intel support page shows the simple instructions for opting out of the telemetry.

Having said that, 2022 was already the year of the Linux desktop. Everything works. KDE gets better every week. Be the owner of your technology. Join us!


Linux is great! And, thanks to Steam, a lot of games work as well. Not all, but most.

Ubuntu, I haven't tried anything else so far, is a charm on Lenovo hardware. Update-wise, it is where Windows was a couple of years ago, meaning there are random updates that just kill and break certain functions. Happened last week: everything was running fine, including Steam, as it always did. Until an update managed to delete the GUI, reinstalling the GUI killed the nVidia drivers, Steam couldn't be installed under 23.04 for some missing dataset, installing said data (some 32-bit stuff) killed WiFi...

No attack on Linux, though; I remember that not too long ago I stopped all Windows updates because for months the auto-updates had the same effect (save the WiFi bit).

So yes, my daily, private driver is Linux. And will be, except for games, for the time being. All I have to do now, well, after vacation, is to get Steam running properly again. Feels like a throwback to the pre-Win 10 days, when Windows randomly did the same thing with regards to drivers and certain games / programs.


It is unfortunate that people recommended Ubuntu as a starting distro for so long.

It does too much. It will update things for you, or give you a pop-up to tell you to update. Updates happen all at once, rather than a little at a time, so you get these big dramatic updates with combinatorial bug explosions. Maybe the repos will be gone if you don’t update in time. Maybe your favorite packages have moved from apt to snap. Good luck!

A rolling release distro like Arch would be a better first experience for most people I think.

Linux is not where Windows was years ago. Software gently rolls in at a nice steady rate. Some distros choose to take that nice steady flow, chop it up, and for some reason emulate the Windows catastrophic update experience. It is… an odd decision.


I've not had a good experience with Ubuntu or Pop OS. They've always felt sluggish on any hardware I've put them on.

Debian may take a touch more wrenching to get running but it has been good for me, although I haven't tried 12 yet. Fedora was good but Red Hat shenanigans seems to have messed with it.

I may try Mint next time. I've heard good things about it.

I said all of that to say, isn't Linux great?

If you have a quibble with it, you can kick it to the curb and try something else in an hour and everything just works. There's so many options to choose from, and you don't have weirdo corporations tracking your every application launch or building a psychological profile off of you from how you tab through a spreadsheet or cloud mapping your speaking patterns based off of how you type.


Since you want responsive, try XFCE (xubuntu)


>so you get these big dramatic updates with combinatorial bug explosions.

Yes. My Ubuntu installations always have bugs when I do big updates.

And while I'm here, snapd can suck it. Me and my homies hate Snap.


The worst is when they tried to call home, like Windows


Please use Ubuntu LTS; you can live with 2-year-old versions at worst, especially when snaps can deliver up-to-date applications.

I'm not happy with some of Canonical's decisions, but I can't deny that using LTS as a daily driver is boring because most things just work without fiddling. I don't have as much time to fiddle with the OS nowadays.


I’ve used LTS on a shared system, but I don’t love it. 2 years is shorter than you’d expect, and when we do want to upgrade we’ll need to deal with 2 years’ worth of work, configuration, and kludges all at once.


My first fallback was to downgrade from 22.04 to 20.04 again. Steam worked fine, WiFi worked. I was happy. Until, that is, I tried to reinstall Darktable, in version 4.x, as that was the one I used for basically all my photo edits in the last year. Turned out Ubuntu 20.04 only supports Darktable 3.x, and Darktable 4.x edits are not backwards compatible (kind of logical, so the loss of around 100 edits is totally and absolutely on me and only me). For now, I use the Linux installation as my daily (as soon as I re-imported my passwords to Firefox, as of course I lost those as well during reinstall number 3...) and Windows as my gaming "console". Until, that is, the official Steam installer works under 22.04 again. The Steam client from Canonical sucks IMHO.


Ubuntu LTS breaks my stuff more often than Manjaro (~= Arch).

I don't understand why Canonical decides to backport breaking changes to LTS releases, but they do it on a regular basis, and I don't trust them for anything important. (I'm not suggesting using manjaro for stuff that needs to be stable over a long time period -- it's not meant for that, which is precisely my point!)


I've been using xubuntu for about 6 years now and I barely understand how linux works. So far so good, I've had one major issue which was related to my storage getting too full.

Admittedly I'm reasonably comfortable in CLI, but I don't know a whole lot about bash, just cd, mv, cp, ls -a, etc.

Is xubuntu better than ubuntu just because it removes a lot of cruft?


Xubuntu or *buntu distros are mostly Ubuntu distros with a different default desktop environment (KDE, xfce, etc)


> A rolling release distro like Arch would be a better first experience for most people I think.

Nah. Most people can't grok partitions or do the command line installs by hand. "Most people" aren't very technical, and even white collar office types will struggle.

Likewise they don't grok rolling updates and snap; they just get their updates and restart once a month, just like on their Windows box. That's enough, and once they get comfortable with the rest of the Ubuntu env they can start thinking about their Gentoo build from scratch.


I think most people are barely able to administer their Windows laptops either; usually they just use them until they break and then buy a new one, and never really configure or fix anything.

Phones, with their app stores, are sort of like rolling release with automatic updates, right?


The issue described above was the only update problem I had with Ubuntu so far. The reason I picked it was the basically guaranteed compatibility with my Lenovo laptop; I figured better not take any risks. Pop!_OS was on the list as well, though!

Positive side effect of the above troubleshooting: I've gotten pretty fast at installing Linux now.


Agreed, Arch is likely the best introductory option, especially now that archinstall comes with the iso. Dead simple to install, and the documentation is fantastic for setting up the rest of your environment.


Arch is a good intro to Linux, for people that want Linux.

It might not be the best for people who want a web browser/computing appliance, but I’m not sure what is, there just might not be a good one yet.


It would be great if Linux had some automatic checkpoints so you can revert to a working system after an update. So many times updates leave a broken system, and re-installing Ubuntu from scratch is quicker than fixing the problem.


OpenSUSE had this 7 years ago when I last used it. I think they still have it, it's called snapper.

Not that I've really rendered any Linux install significantly non-functional by updating during the last 15 years. But it's all software, so anything can happen one day...


Bootable snapshots are a thing, also known as boot environments. You need to use ZFS or btrfs, or possibly bcachefs or LVM can offer them. They've never seemed to have a great implementation on Linux compared to FreeBSD, but they do exist.


> but they do exist.

And here's the crux of it, isn't it? Unless it's accessible and enabled by default, it might as well not exist from the PoV of the end user.

Windows' restore points happen automatically before updates or driver updates. Then you're prompted automatically to restore if Windows fails to boot after an update.

As usual Linux has all the pieces but no desire to create UX to go with it. Which is a shame because ZFS with automatic snapshotting daily or before major events combined with a simple UI that can run in the preboot environment would be a game changer and make Windows look like a toy.


OpenSUSE Tumbleweed has this. The installer defaults to btrfs and snapshots using snapper. Snapshots are automatic before and after every package manager operation. Save for something that breaks grub, you can always recover.
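To sketch what that workflow looks like in practice (snapper's standard CLI; the description text is just an example, and the snapshot numbering on your system will differ):

```shell
# List existing snapshots of the root filesystem config
sudo snapper list

# Take a manual snapshot before doing something risky
sudo snapper create --description "before driver update"

# If an update breaks things: boot a read-only snapshot from the
# GRUB menu, then make that snapshot the new default subvolume
sudo snapper rollback
```

After the rollback and a reboot you're back on the pre-update state, which is exactly the "skip this update" escape hatch people ask for.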


In that spirit, if you have an EFI system, you don't need GRUB


You sort of do if you want to boot kernels on btrfs. rEFInd is an option, but not most of the other EFI bootloaders.


Ah, good point - I neglected the btrfs involvement / the need to find kernels... those would ideally be included with these snapshots, of course


You put the kernel (built with EFISTUB) directly onto the EFI partition - no boot loader needed at all.
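Roughly, assuming an ESP mounted at /efi (the disk, partition, label, and root device below are examples; adjust for your system):

```shell
# Copy the EFISTUB kernel and initramfs onto the EFI system partition
sudo cp /boot/vmlinuz-linux /boot/initramfs-linux.img /efi/

# Register a boot entry directly with the firmware; the loader path
# is relative to the ESP root, and --unicode carries the kernel cmdline
sudo efibootmgr --create --disk /dev/sda --part 1 \
    --label "Linux (EFISTUB)" --loader '\vmlinuz-linux' \
    --unicode 'root=/dev/sda2 rw initrd=\initramfs-linux.img'
```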


Which kind of ruins bootable snapshots, as your kernel isn't part of them. You could have a massive esp with ~10 unified kernels each pointing at a different snapshot, with some kind of management script to handle them, but using GRUB seems easier than that.


I think there's still value in a bootloader. You can easily load an old kernel with an old initramfs if you messed up somehow, or adjust kernel parameters.

The boot UI of most modern UEFI firmware is really primitive, and they usually don't ship with EFI shells.


Can you point to read-only snapshots that way?


> And here's the crux of it, isn't it? Unless it's accessible and enabled by default, it might as well not exist from the PoV of the end user.

For what it's worth, at one point Ubuntu supported installing on ZFS through its regular installer and set up the boot-environment thingy, with a snapshot before every update. I have one such machine which is now running 23.04, and that still works. I've never had to use it in practice, though.


Oh, so much this! Having had the option last Friday to just reboot into whatever it was before said update, and just skip the next couple of updates - man, that would have been great!


To be pedantic, it's impossible for "Linux" to have this, as this lives way higher than the kernel or even the filesystem layer.

Linux Mint has Timeshift.

The now defunct Project Trident installer set up a zfs root with zfsbootmenu and an update script that automatically sets up a boot environment you can revert and boot from with the built in feature of zfsbootmenu.

Both let you achieve your goal. You can set up either system even if they aren't built in.
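For illustration, a Timeshift session from the command line looks roughly like this (the comment string is an example; whether it uses btrfs snapshots or rsync depends on how Timeshift was configured):

```shell
# Take a snapshot before an update
sudo timeshift --create --comments "pre-upgrade checkpoint"

# See what restore points exist
sudo timeshift --list

# Restore; Timeshift prompts for which snapshot to roll back to
sudo timeshift --restore
```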


A lot of comments here say Timeshift is crap - do you find it helpful yourself and can you elaborate on what works / doesn't?

https://community.linuxmint.com/software/view/timeshift


I have not personally used Timeshift. I have used zfsbootmenu and syncoid and found them useful and fit for purpose. Personally I think it's more reasonable to separate boot environments from backups, as they are two separate if related concerns. E.g. boot environments are for when something went wrong with an update, and backups are for when your drive failed. If both went wrong you just do both operations in sequence: restore from backup and THEN pick the prior boot environment and make it the default.

Insofar as Timeshift goes, consider the review source, circumstances, and nature of the complaints. Users who don't have difficulties rarely post anything, and yet many reviews are positive. Those that aren't seem to focus on people who didn't realize they could provide a secondary drive and just kept storing data until their OS drive filled up.

Mint leans heavily towards technically incompetent users, and yet many people were able to use it successfully. Meeting every user's needs no matter how incapable is probably a poor benchmark for deciding if something is "crap" or not, even if those users' problems are a great guideline to a path forward to helping all users better.

For instance, it might be better if Timeshift only supported filesystems with snapshots instead of rsync, required a secondary disk, and warned of space issues long before the disk fills up - but it seems quite possible to use it competently right now.


Some Linux distros actually do this. :)


Once NixOS is desktop-ready you’ll have a way more powerful checkpoint system


I haven't had that happen for a few years now... I will say the RX 5700 XT had a really rough few months after launch on Linux. Other than that, it's been relatively smooth.

Mostly been using PopOS and Ubuntu-Budgie.


NixOS has this by default, on any filesystem. You can also do it with ZFS or BTRFS filesystem snapshots.
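NixOS's mechanism is profile generations rather than filesystem snapshots; every rebuild records a generation that also shows up as a boot menu entry, so a rough recovery session looks like this:

```shell
# Apply the current configuration; this records a new system generation
sudo nixos-rebuild switch

# List recorded generations (each one is also a bootloader entry)
sudo nix-env --list-generations --profile /nix/var/nix/profiles/system

# Revert to the previous generation if the new one misbehaves
sudo nixos-rebuild switch --rollback
```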


Check out immutable distros like Fedora Silverblue & Kinoite.


> Everything just works

No it doesn't. Every Linux desktop requires me to spend hours tweaking things or finding workarounds to make it usable for myself. Window management is horrible compared to other OS. Fractional scaling mostly doesn't work. The Pomodoro timer in the "Software" store is no longer maintained and doesn't work at all on the latest Gnome. (Windows 11 has it built-in). Even so, I have to live with certain restrictions. I tried to get into Linux desktop every few years, and I never find the situation has improved much.

By comparison, setting up the environment on Windows or MacOS takes no more than a few minutes.


It takes me days to make Windows usable for myself by hosing all the garbage off it. And it's still a miserable experience. On the other hand, if I just install a batteries included Linux distro I get something I'm not necessarily thrilled about, but I can go pretty much straight to work. And if I spend a few days, I can set up a system that's exactly tailored the way I want it to be, eats an order of magnitude less RAM than Windows, and doesn't regularly break itself by forcing updates on me, changing settings from under me, etc.

MacOS is somewhat better than Windows(less garbage to hose off), but it's not worth the money of buying the overpriced unrepairable and unupgradeable hardware it runs on.


> It takes me days to make Windows usable for myself by hosing all the garbage off it.

I guess it's what you're used to. I don't find Windows decrapification to be that much more effort than Linux configuration.

> it's not worth the money of buying the overpriced unrepairable and unupgradeable hardware

For your use case, presumably not.

The ship seems to have sailed for upgradable Apple hardware (and I expect Apple did their homework and discovered 90% of Mac Pro systems were never upgraded anyway) but I still appreciate the advantages of hardware-software integration, nice form factor and battery life, unified CPU/GPU memory, etc.


>I guess it's what you're used to.

Exactly. All these exchanges of experiences boil down to people rationalising their own preferences, which are really just about familiarity. I do it too. I'm really just more familiar with Linux systems because I've used them since I was 10-12, and as a main OS since I was about 14 (I'm 30 now).

My view is that Windows being more user friendly is really just a myth. Most non-technical people use their Windows PCs or macs for the same 4 different tasks they always have. Ask them to do something new or do some troubleshooting and they'll struggle with it and get nowhere. Whether they're staring at a terminal window with no idea what to type or a "troubleshooting wizard" that offers no useful advice and links to an MSDN article with no useful information, is really irrelevant. They'll still need help from a technical person.

And technical people like us all just prefer what we're more familiar with too.


Sure, attempting to make Windows into Linux by “decrapifying” it is harder, but if you use Windows as intended and upload all your documents to Microsoft’s cloud it is easier for the vast vast majority of users.

Windows breaking itself with updates is less a symptom of Windows and more a side-effect of "decrapification". I've seen a pretty damn large sample size of Windows users undergoing Windows updates, and Windows Update really does not break computers, other than by annoyingly changing your defaults to Microsoft stuff ("whoops, by accident") - but if you do stuff like use Edge as your default browser, it stops breaking.

Apple forces you to use their defaults, Microsoft theirs, and Google theirs. When you accept that these companies' operating systems are not Linux, and stop acting like they are Linux, and stop acting like you own your Windows PC, they do become pretty simple to use.

If you want a PC you can own, you use Linux. Linux is great. My grandmother uses Microsoft Windows in S mode with edge as her browser. Her photos get backed up. She only uses Edge at 175% DPI scaling and maybe looks up a few family photos. It is easier for her than using Linux because she’s familiar with it and because nobody has taught her that unless she is using Windows for education without a decrapify script she is doing things all wrong.


You're absolutely correct


People are acting like Linus Sebastian and that other guy didn't JUST RECENTLY demonstrate Linux to still be a tough sell to non-linux gurus for daily driving.


He deliberately shitcanned his system to generate clicks. What are people going to say if an uppity Linux fanboy youtuber shows himself deleting system32 and then crying that Windows is unusable for non-Windows gurus for daily driving? They're going to say he's an idiot and move on.

When I instruct my package manager to uninstall your desktop environment, and the package manager says, "You are about to delete critical system components. Are you sure? (yes/no)" and I type in "yes" I expect my package manager to uninstall the desktop environment. That is the correct thing for it to do. Anything else is the wrong thing to do.

"A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools." -Douglas Adams


Let's not forget that while he did kind of shrug and agree to remove everything... that was a problem in the packaging spec.

The package (Steam, if memory serves?) should not have had the other packages referenced as they were. The dependency/requirement resolution was faulty...

Then he didn't truly take in the message and this is the result we get.

There's plenty of "probably say no if you don't know what this means" in what he ignored. Fault is all over.

    - His distribution of choice [or] the repository supporting it
    - him for not reading and acting accordingly
    - sheer chance
Had he chosen another distribution at random, there's little chance that would have happened.

If he repeated it on the same one now, it wouldn't happen. When you choose a niche distribution, you get niche problems.

My entire family manages fine on 'bleeding edge' Fedora, yet it doesn't market itself this way. Packaging is specifically in their domain of expertise

This isn't to say Linux is for everyone, but I really wish for a more fair representation.

As the reporters they should have dug in a bit more. They become part of the problem, in a sense, by not clarifying where there truly be dragons.


> There's plenty of "probably say no if you don't know what this means" in what he ignored.

Users will pretty much always click what they think will make the goal they're trying to accomplish work. Usually "yes" or whatever the default is.


Indeed, and a tool that does what it's told is a good tool

The packaging snafu was unfortunate, but beyond preventing him from having the ability, I don't know how it could have been improved

The spec shouldn't have been so egregiously incorrect. Pretending it didn't happen... how is something both powerful and safe?

I don't know how anything could be made to take arbitrary input and sort it out to meaningful work

Edit: Fedora does well to mark things as protected, this would likely help. But still, the operator should serve as a filter


> By comparison, setting up the environment on Windows or MacOS takes no more than a few minutes.

My wife is constantly taking her laptop into her IT staff.

The main difference between Windows and Linux is that corporate IT staff are willing to futz with Windows but not Linux.

The main difference between macOS and Linux is that macOS users will spend money to futz with macOS but not Linux.


This is because corporate IT staff are largely responsible for Windows not working. Their endless and senseless configuration changes make Windows not work.


To be fair, fractional scaling on Windows is kind of garbage, at least if you don't have the same scaling set on every display. If I have one screen at 100% and one at 125%, any application will look okay only on the display it was started on, and blurry on any others. I don't think the situation is any better on Linux, but I'm pretty sure that as long as you're using Wayland (only), it's not any worse.


meanwhile, .bashrc, .bash_login, .bash_profile, .profile, /etc/profile, and /etc/environment all still exist, are poorly documented, and if you want a GUI for them, you have to build it yourself.


Patently incorrect.

https://www.gnu.org/software/bash/manual/bash.html#Bash-Star...

These files are all shell scripts and largely empty before distribution-specific customizations; there isn't much to document there. They are commands that you want to run at startup.


> are poorly documented

There are many things you can criticize Linux for, but this is not one of them.


As opposed to the amazing Windows Registry? Those who throw stones ...

Nevertheless, your criticisms are fair. However, unlike on Windows, Linux is actually working on that. Distributions like NixOS and Fedora Silverblue are starting to take the whole "put everything under control" very seriously.

I see no comparable efforts on Windows.


I've never had to set an environment variable in the Windows registry. I'm talking about simply adding a directory to the Path or setting JAVA_HOME. Windows' GUI needs tons of work, but it's at least 20 years ahead of Linux. It gets one more year ahead of Linux every year.


AD, GPOs, and MDM CSPs go very far. This is Windows' business bread and butter.


.bashrc, .bash_login, .bash_profile: bash(1) manpage

.profile and /etc/profile are a mechanism from the Bourne shell, so you would go to sh(1), as long as it's not bash again. Otherwise mksh(1) or dash(1).

/etc/environment is pam_env(8)
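To make the split concrete, here is a typical ~/.profile fragment (the JAVA_HOME path is just an example; where login shells read it varies slightly by distro):

```shell
# ~/.profile - read by login shells; put environment variables here
export JAVA_HOME="/usr/lib/jvm/default-java"    # example path
export PATH="$HOME/bin:$JAVA_HOME/bin:$PATH"

# ~/.bashrc - read by interactive non-login bash shells;
# put aliases and prompt setup here instead, not exports
```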


As I said, poorly documented. In fact, there is no defensible reason why they all need to exist.

And I didn't understand anything you said after and including "Bourne shell". I don't know if Windows has any documentation for its environment-variables program, but I have never needed documentation for it, because it's not brain-dead. It's a GUI. An interface which is self-documenting.


> An interface which is self-documenting.

I wanna see you find the right dialog windows in the control panel.

Would you know how to enable/disable IPv6 off the top of your head?


I have a GUI which has a checkbox for IPv4 and a checkbox for IPv6. Checkboxes are self-explanatory. And if I hover over them they say something like "Use servers reachable over IPv6 (do not enable if you don't have IPv6 connectivity)". And yes I didn't have to look up how to do that. It's staring me in the face every time I open the program.


There are plenty of GUIs for them, e.g. emacs, vim, nano, pico, joe, jed, etc.


LOL. notepad.exe too. now that you mention it, the command line allows you to select text with the mouse. so technically cmd.exe is a GUI


You are correct about fractional scaling being suboptimal, and there are a few quirks here and there, but Linux has become largely usable at this point. Most of the criticisms you have made about Linux are due to inexperience and not being familiar with it, rather than fundamental problems with it. If instead of spending the last x years of your life daily-driving Windows or macOS you had spent that time daily-driving Linux, you would have all of that understanding and infrastructure built up around Linux instead, and these would be non-problems. I use Fedora with GNOME and I don't install any extensions or do any tweaking or workarounds, as you say.

I know this is true because I'm a non-developer, non-power-user (though still relatively technical) who has been using Linux since 2006, and it works just fine. Not only that, my dad has been happily set up on Linux since around 2014, and he is as non-technical as they come.

To really go in on this point: I bought a MacBook Pro when the M1 devices came out, and the experience you have with Linux I have with macOS. It's the worst operating system I have ever used. You complain about Linux having bad window management, but macOS has basically no window management. You double-click the top bar and, depending on the software, sometimes it will maximize, sometimes it will pull the window all the way down the screen, and sometimes it will do nothing at all. You drag the window to the top or to the side and nothing happens at all. Window management on macOS is so bad that most people say you need to install external tools to make it even halfway usable. Even when you do, it's still less performant and buggier than what GNOME or Windows offers out of the box. You say macOS needs no setup, but I spent tens of hours desperately trying to make the workflow and UX of macOS not be a horrible experience for me. Everything from no tools and trying to work within its paradigm, to simple window management tools, even going so far as to try yabai - and none of them felt right to me.

Now, that said, I would bet a person who has spent a decade daily-driving macOS has probably internalized its ins and outs and quirks and wouldn't find it nearly as problematic. Most of the issues people have with Linux are much less problems with Linux and much more a lack of workflow understanding - understanding they haven't built up for Linux but have built up around other operating systems instead.

The main exception being that there are some proprietary tools that are pretty explicitly not supported on linux which require windows or macos.


>> Everything works.

I'm a video games developer. Neither the PS5, Xbox, nor Switch toolchains work under Linux - not to mention that actual proper Visual Studio doesn't, and that alone is worth staying on Windows for.

For playing games though - sure, Linux is already great. Steam Deck proves that by playing pretty much everything flawlessly.


Also, good luck convincing Native Instruments and other musical-instrument developers to compile their plug-ins for Linux too.


Ubuntu Studio LTS has VST bridges using WineLib.


Do VSTs not work on Linux? I'm not being obtuse, I am in a situation where I'm about to build a new PC for writing / recording music and I saw that Reaper is supported in Linux, but was not sure about VST plugin support.


Conceptually the idea of a "VST" works on either platform, because VST is an API. However, VSTs are distributed as executables targeting a specific platform, and almost no VSTs are compiled natively for Linux.

If you're already content with Reaper as a DAW, use "yabridge" which runs VSTs in WINE. It's virtually perfect for most purposes. Getting Linux audio set up for low latency is another Linux rabbit hole, though.
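The yabridge setup itself is mostly two commands via its yabridgectl management tool (the plugin directory below is the common Wine default, but yours may differ):

```shell
# Tell yabridgectl where your Windows VST3 plugins live under Wine
yabridgectl add "$HOME/.wine/drive_c/Program Files/Common Files/VST3"

# Generate/refresh the native .so wrappers that Linux DAWs
# like Reaper can then scan and load
yabridgectl sync
```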


how does this work for VSTs with DRM? Or protected by iLok/etc?


I think software iLok used to work if it was licensed to the yabridge environment; whatever device-identity parameters it read were implemented in WINE. For hardware iLok and other stuff, yeah, you're pretty stuck; they usually use goofy hacks or custom drivers that can't be handled in WINE. Someone would have to reverse engineer the DRM and figure out how to redirect the licensing methods. Or just remove the DRM from the plugins (crack them), which, honestly, is probably how people do it in the real world.


Check out LMMS [0]. I just found it yesterday to run VSTs - I normally stay away from them and just needed a minimal VST host for one I couldn't resist - and I was blown away at what it could do for free. Between that and the synth Surge XT [1] (another free and open-source project), you can make any sound you would want to, and use either the sequencer in LMMS or an external hardware sequencer (yes, USB MIDI works, even Bluetooth MIDI).

[0] https://lmms.io/ [1] https://surge-synthesizer.github.io/


Thanks for the recommendation. Unfortunately I do mostly prog / metal stuff these days and while the occasional synth sound can be had for free, a lot of my sounds come from either Helix Native / Neural DSP for guitar / Kontakt for drums. I'll take a look at LMMS.


For what it's worth, Bitwig Studio also works on Linux.

The issue with most professional VST plug-ins is mainly the UI which is not compatible with Linux.


Native Instruments is dying anyway.


How does that matter if you already invested 1000+ USD?

Go to Plugin Alliance [0] and select any plugin and when you scroll down you'll see that they are Windows and Mac only. Same for Plugin Boutique [1].

[0] https://www.plugin-alliance.com/en/products/amek_eq_250.html

[1] https://www.pluginboutique.com/product/1-Instruments/4-Synth...


It's because of obnoxious DRM garbage. I doubt it helps these companies, since the unlicensed users are starving musicians and kids with no money who aren't going to magically spend more.


As a company, or in your opinion of their products?


As a company. Since they have been bought by an "investment company" they have been in a downward spiral. It will take a few years but the process started.


There are plenty of niches that use proprietary tools that don't work across platforms.

I don't think that's a fair criticism of the platform 'working'. That's just status quo / profit maximizing on the part of the tool selectors and developers.

Linux works very well with regards to hardware.

Calling out Microsoft tools like Visual Studio as proof Linux doesn't work is sort of dumb. It's a tool that targets Windows (mostly), written by the developers that sell Windows.

You don't have to look very hard for a world class c/c++ tool chain on linux.


>>Calling out microsoft tools like visual studio as proof linux doesn't work is sort of dumb

Oh sorry this wasn't my intention at all. I'm just countering the argument that I see a lot on HN(and tbf, maybe this isn't what OP was saying) - that Linux is so good that there is literally no reason for anyone to ever stay on Windows. Like......yeah, it's great and a lot of things work really really well - but some things still don't. That's all. It's not a criticism of Linux, or at least I don't mean it in that way.


Yeah, it isn't all roses on the linux side, for sure. It really isn't perfect on any of the platforms though. I have had no end of USB issues with MacOS, and with no indication it will be fixed. I am told to buy 'supported' (Apple) devices. That's not very 'universal'.

It is amazing how much improvement there has been in recent years. The main thing that makes me want linux ecosystems to be used is that at least they are slowly improving, not simply positioning for a rent-seeking position in the 'market'.


>For playing games though - sure, Linux is already great. Steam Deck proves that by playing pretty much everything flawlessly.

The caveat is that this is only true when applied to games on Steam. I play games that aren't on Steam, and I've not even bothered to see if they would run on Linux because it's just not worth my time.

(No, I do not expect something in Japanese that communicates with DMM Game Player for user authentication and DRM shenanigans to work in Linux.)


Wine is older than Steam on Linux. And Valve's enhanced Wine, aka Proton, can be used independently of Steam. The deciding factor is not whether the game is on Steam but whether it uses draconian DRM or anti-cheat that requires an actual Windows kernel.


>>The caveat is that this is only true when applied to games on Steam. I play games that aren't on Steam, and I've not even bothered to see if they would run on Linux because it's just not worth my time.

I play Diablo 4 on my steam deck and it's so little effort to install it - you install Lutris, install battle.net and then it just works, you can launch it directly from the Steam Deck UI.


Exactly. I refuse to use Steam, so linux is a no go for me.

I hate it so much when people equate games to Steam.


Linux gaming is not restricted to Steam.


FWIW at least cl.exe and msbuild work just fine under Wine. I never cared for the IDE so no idea how well it runs but if you just need to build VS projects you can do that.


You should demand Linux software from Microsoft, Sony and Nintendo.


The “year of the desktop” implies a solution for consumers, not professionals. That being said for almost every other kind of development Linux is by far the best platform. I cannot develop on Mac anymore due to Apple making things more difficult in every update.


Package managers are useless for getting the latest maven, gradle, or ant version. This in combination with there being multiple poorly-documented ways to set environment variables and there being no GUI for any of them means that setting up for Java development is massively worse than Windows the instant you step outside of your IDE.


> Package managers are useless for getting the latest maven, gradle, or ant version.

My package manager has the latest versions of maven and gradle. Ant is behind; upstream is up to 1.10.13 but my package manager only has 1.10.9.


did you have to enter some magic string like ppa:chien that literally no one would ever know


No. They're not in an overlay. They're in the primary, central repo.

maven 3.9.4: https://packages.gentoo.org/packages/dev-java/maven-bin

gradle 8.2.1: https://packages.gentoo.org/packages/dev-java/gradle-bin

ant 1.10.9 (upstream is 1.10.13): https://packages.gentoo.org/packages/dev-java/ant

Oddly there's no overlay that has a more up to date version of ant. So even if I did want to enter magic strings, that wouldn't have helped.


lol gentoo. that's not available in debian, bruh


I'd say they use a rolling release distro like Arch or similar, where packages are updated very often. In fact, I believe Arch even has a tool to seamlessly manage multiple Java installations.


Maybe in 2054 Debian will figure out how to update packages.


I’m sure the versions in AUR are up to date.


Switching to linux is a good option, but the ideal solution is one where corporations do not, or are not allowed to, collect telemetry by default.

Microsoft has normalized opt-out-by-default telemetry in the OS, IDE, and developer tooling, and now others are following suit, even in areas where telemetry doesn't make sense.

I wish there were an easy solution to the pervasive telemetry problem for those who can't switch for any reason.


Microsoft has really poisoned the well with default-always-on-telemetry.

Technical solutions (firewalls, switching to Linux, etc.) aren't necessarily practical for many people.

> the ideal solution is one where corporations do not or are not allowed to collect telemetry by default

This is probably the only effective solution, but I don't see it happening without enforceable legislation.


I think TV manufacturers don't help either.


Needing a firewall (both incoming and outgoing) for your TV isn't something I would have predicted a few years ago.

At least "monitors" don't seem to have telemetry. Yet.

Who knew that companies would use 1984 as a product roadmap.


The fact that most of them include the content of the screen as telemetry is especially concerning


Corporations should not be allowed to collect telemetry at all. Information should always be pushed from one party to another, never pulled.


Companies have not yet woken up to the idea that implementing this kind of telemetry is effectively leaking their private business data to third parties (competitors even).

Nobody's been so obviously burned by it yet that the lawsuits have started flying.

Just imagine the kind of data (say) Microsoft is leaking to Google, via all those users running Chrome. How about all those AMD users who are running Intel/NVIDIA graphics drivers in their laptops?

If I am a big tech company sitting on a pile of that telemetry data, you can bet I'll be tempted to data mine it for such leaked data about what the competition is up to. It'll probably take an email leak to reveal the practice, and cause some sort of consequences for this though.


Economic espionage is the largest field of espionage. Although I guess if any big tech company would get caught, there will be repercussions for their business.


KDE is great but also very, very buggy. I've been using it since 2016 and some bugs just won't go away, and good luck with opening a bug report - nobody will care. Multi-monitor support especially just isn't good. Multi-window support is also, by default, not as good as Windows does it. Since a few updates ago I've been getting PolicyKit errors every time I do anything session related, just because - and I have not found a solution since.


I use KDE daily, including a multi-monitor setup, and it is not buggy in my experience. Like, at all, much less "very, very." I wonder why we have had such different experiences? I'm using Fedora.


Maybe he's using an older KDE release, or a distro like Mint that doesn't ship it out of the box and where it's not first-class.


"very very" was a bit exaggerated, but it's far from perfect. There's always something. I'm using KDE neon, which should be as new as it gets. For multi monitor: it forgets that I had a second desktop, renders it black, and switching between two monitors and none is also error prone. It's not "bad" - I'm used to it - but as I said, some bugs have been here for a while now.


Are you using nvidia drivers by any chance?

And correct me if I’m wrong but isn’t KDE neon supposed to be a bit less stable?


The black desktop used to happen on every brand of GPU (I have a mix of Intel and AMD), it just had to be the third or the fourth display, something like that. Or after sleeping and waking. I used to have it a lot.

I haven’t seen it in a while, though, so it just might be they have fixed it. Fedora + zawertun COPR, for what it’s worth.


I recently fixed this issue with KDE not coming back from sleep by setting it so that the VRAM is all saved on my nvidia card: https://nixos.wiki/wiki/Nvidia

Trying to explain it I'm actually not sure I 100% understand the solution. It looks like it uses systemd for sleep AND saves all of the VRAM (I'm guessing into RAM if I'm using anything other than hibernate).
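For reference, the pieces usually behind that fix are the driver's VRAM-preservation module option plus the suspend/resume units NVIDIA ships with the driver (a sketch for a non-NixOS setup; NixOS wires the same thing up through its own options, so file paths here are assumptions):

```
# /etc/modprobe.d/nvidia-power.conf
# Tell the driver to save and restore all VRAM across sleep
# instead of discarding it:
options nvidia NVreg_PreserveVideoMemoryAllocations=1
```

Then enable the driver's systemd sleep hooks: `systemctl enable nvidia-suspend.service nvidia-resume.service nvidia-hibernate.service`. That matches the "uses systemd for sleep AND saves all of the VRAM" description: the units snapshot video memory (to a temp file or RAM) on suspend and restore it on resume.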


Debian (stable) user here. I'm also scratching my head trying to understand what bugs you find in KDE. Or what kind of multi-monitor or window behavior you have a problem with when compared to Windows. (Isn't the default window behavior basically equal to Windows? I have never noticed a difference in anything that I didn't change myself.)

I too think you were unlucky on your combination of hardware-distro-setup in some unusual way.


HiDPI is not perfect in either KDE (arguably the better situation) or Gnome, compared to Win 11. When you enable scaling, a window's context menu (left click on the menu) is not scaled in Debian 12 and Ubuntu, irrespective of environment variables (PLASMA_USE_QT_SCALING=1 and such). In Wayland (which solves many scaling problems) KDE renders fonts blurrier than in X11. For many it is not a big deal, but it is not acceptable for me. Gnome apps are not properly scaled unless the scale is bigger than 175%, and then they are too big (overscaled).

I still use KDE though.


Use Manjaro or other KDE-focused Arch Linuxes; they have the latest KDE. I've been using KDE-dedicated Linuxes (Gentoo, Arch) and the past 5 years have been a blast for KDE. Much more stable. For a WM I'm using QTile because it's the best tiling WM that works with KDE applications.


The worst thing about linux is that it is made and maintained by people who like linux.

I would love more than anything to see a paid fork of linux whose goal was to make a power user friendly user OS that never needs to pull up a CLI.

People will come out of the woodwork here to suggest whatever shitty half-assed CLI wrapper environment. No. No. No. They suck. I have been using them on and off for 20 years. Including right now.

I'm someone who does way more than email and youtube, but has less than zero interest in spending 6 months learning the nomenclature and structure of linux so I can become a proper user.


Please correct me if I am wrong, but is this not a straw man argument?

> spending 6 months learning the nomenclature and structure of linux

> become a proper user

Care to elaborate? My parents have been using Ubuntu successfully for over a decade now for "email and youtube". They do not even know what a CLI is. What are you trying to accomplish that does not work out of the box?


>My parents have been using Ubuntu successfully for over a decade now

Add my 87 year old grandmother to that list. Firefox->email, youtube, kroger, banking etc. These days most users just need access to a web browser via their OS.


Absolutely agree. For most people, their entire OS is now just a bootloader for their web browser. More and more I feel like downloading any program at all is treated like a "power user" task


Troubleshoot network issues using a GUI.

That's the thing: linux is great for linux power users and absolute hands-off users. It's terrible for people in the middle, and hence why it cannot get the ball rolling towards adoption for 30 years now.

There are tons of tech literate people who know what a problem is, know what the fix is, but do not want to deal with climbing through forum posts and documentation to figure out the correctly structured command to do the actions (and how to unfuck if the command wasn't right). Just make fucking buttons, toggle switches and drop down menus.


Debugging and troubleshooting a networking problem is approximately the same on Windows vs Linux (except that there are additional powerful tools built into most Linux distributions). Which is to say that anything non-trivial likely needs a command line. Windows does have some graphical network troubleshooter thing, and I've let it do its thing a few times, and it has never done anything (at least nothing noticeable).


There is a FreeBSD fork that does exactly that, but to use it you need to buy a $3000 hardware dongle....

That's the path a lot of Mac users are on, though we also have a telemetry problem; the only advantage is that it stays "in house".


Mac has never been power user friendly.


> never

Oh, yes it was. The Macintosh was once a platform for independent professionals and small businesses to build their own software using native tooling. The add-in cards you could buy for the Apple II would shock you in today's anti-consumer ownership war being waged by vendors.


I’d say it’s actually been power-user friendly 3 times:

1. The Apple II

2. Late 80s/early 90s when screen savers and [I can’t remember the name. Something makes me want to say ‘shell extensions’? small apps that made deep and wondrous tweaks to the system] were allowed to experiment with almost complete freedom

3. OSX. The first version was specifically designed for power users who had existing Unix/Linux skills. Special shout-out to some of the early tools as well: Automator, Quartz Composer, Audio Unit Lab, and even Applescript.


> FreeBSD fork

Isn't NeXTSTEP quite a bit older than FreeBSD?


What's the name of the fork?


They are jokingly referring to macOS, which is descended from a fork of BSD with the Mach kernel.


He used a periphrasis to mean macos.


If you didn't have that attitude you could have learned that stuff in the last 6 months, and then you wouldn't have to worry about it.

Also, "power user friendly" but hates CLI..... I see. You might have to hand in your power user card over that one

I am a diehard windows user but the OS's affinity for burying settings in nested, labyrinthine setting dialogs gets old super fast.


This comment perfectly sums up why linux is perpetually stuck with close to zero adoption.

"It's not linux that's the problem, it's you!"


Yeah, forgive me for thinking that people need to meet the computer halfway. I keep forgetting that most are too stupid that they cannot be trusted to make effective use of such powerful machines.


> forgive me for thinking that people need to meet the computer halfway

Seems like a very defeatist attitude to have..


Defeatist? How? We don't just put anyone in a vehicle, they have to have a license. That's a bit much for a computer but what is wrong with having to learn?


> what is wrong with having to learn?

Often that just seems like an argument used to justify poor UX. There is nothing wrong with learning but many people have other interests and/or prefer/have to spend their effort on learning other things.

Outright dismissing them as "too stupid" seems a bit elitist, especially if you want them to use your software.


> If you didn't have that attitude you could have learned that stuff in the last 6 months and then you wouldnt have to worry about it.

The issue is that this has to scale to everyone if the goal here is mass adoption (meaning your family, maybe including your grandma, running Linux). If someone says "I don't have 6 months to learn this" or "I couldn't learn this even with 6 months to do so" and your goal is mass adoption, your action should be resolving whatever the roadblock is, not blaming the individual with the issue.


"You could have learned that stuff in the last 6 months"

"You might have to hand in your power user card over that one"

And these attitudes are why most consumers almost exclusively use proprietary software. You have to let people be lazy to get mass adoption. Businesses know and exploit this, the foss world writes tools with steep learning curves and says "take it or leave it." And that's perfectly fine as long as we can be honest with ourselves: the vast majority of people will never invest the time to learn to use cmd line applications, or debug wifi drivers, or learn to use an environment that's more complicated than what they already have. Time is money so even a highly motivated person should question spending months to learn new tools.

I love Linux for being superior for servers and hackable and having so much powerful software available for free...but if I weren't a software developer and I didn't enjoy this stuff there'd be no justification for the time I spent learning it.


And when you look around you and realize that everything sucks, now you know why.

Enshittification is real. Knowledge is the antidote...


linux bros think CLI is required for everything but they don't realize that it's only required for everything in linux


yeah because clicking through half a dozen nested dialogs to change my DNS settings is good fucking design

Or even better, for many things the option doesn't exist any more? Why? Because fuck you thats why!

I just LOVE watching Windows take away features because some normie asshole decided options were bad for usability and testing.

Typing this on my Windows daily driver, btw. And I have to drop into CLI on a pretty regular basis.


Tree structures are terrible design, yes. They shouldn't teach trees in computer science. Everything should be flat. Just like the Earth.

I love watching Linux stay stuck on design choices from 1970 because to change the interface now would cause the operating system to implode and the resulting riot would exceed even the asspain over systemd


Some of those design choices are pretty good, they stick around for good reason. Everything is a file/chaining together tiny commands/text-based configuration files are computing zen for a large portion of users.

A lot of text config now has way more documentation -- right there above the freaking setting! -- than a Windows design analogy could ever cram into (never-used) help files. The majority of config I deal with is like 25 lines of comments for every 1 line of setting.

Systemd's not that bad either, I will take writing systemd config files over trying to make a service daemon in init.d scripts any day of the week.
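To make that concrete, here is roughly what replaces a whole init.d script (a minimal sketch; the service name and binary path are hypothetical):

```ini
# /etc/systemd/system/myapp.service (hypothetical name and path)
[Unit]
Description=Example daemon
After=network.target

[Service]
# systemd handles daemonization, pidfiles, logging, and restarts --
# the parts an init.d script had to hand-roll in shell.
ExecStart=/usr/local/bin/myapp --serve
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After `systemctl enable --now myapp.service`, start/stop, status, and journal logging all come for free.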


So what happens if you accidentally delete the comment in the text config? There's a parser that generates an error and recovers from there? Or it's just gone? LOL.

Do you know how annoying it is to have to go into etc/netbeans.conf, scroll all the way down, then find the JDK path from somewhere else and paste it in the quotes after netbeans_jdkhome= just to get Netbeans to run? That's not computing zen! It's the reason nobody uses Netbeans -- old, bad design.

The more people use your help files, the more you know your program sucks. This is not disputable. This is why Linux requires a literal support group called LUG.
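For what it's worth, that particular chore can at least be scripted. A sketch, with the caveat that NetBeans install paths vary - the demo below edits a throwaway copy rather than a real netbeans.conf, and the JDK fallback path is an assumption:

```shell
# Point netbeans_jdkhome at the current JDK with sed instead of editing by
# hand. A real run would target something like ~/netbeans/etc/netbeans.conf;
# a temp copy with the stock commented-out line stands in for it here.
NB_CONF="$(mktemp)"
echo '#netbeans_jdkhome="/path/to/jdk"' > "$NB_CONF"
JDK="${JAVA_HOME:-/usr/lib/jvm/default-java}"
# Replace the line whether or not it is still commented out:
sed -i "s|^#\{0,1\}netbeans_jdkhome=.*|netbeans_jdkhome=\"$JDK\"|" "$NB_CONF"
grep '^netbeans_jdkhome=' "$NB_CONF"
```

Re-run it after each NetBeans download and the 30-second ritual becomes one command.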


LUG = Linux User's Group? That's more of a fan club...

If you accidentally delete the comments you can usually look the file up online, sometimes even just googling the name of the file will do the trick. Sometimes there's a default or template sitting right next to the file in its folder.

And I don't think you're going to find any arguments against your experience with Netbeans, at least recently. It's old software. New software doesn't just magically spring into being; it has to be written. And there are plenty of good replacements for NetBeans (I like IntelliJ) so I don't know what the issue is there. It's like calling Windows 3.1's Program Manager 'dated'... it's like, well duh, of course it's dated. It's old!

Also can't you just `echo $JDK_HOME` and paste that output into your config?

> The more people use your help files, the more you know your program sucks.

I really hate it when programs make important decisions for me. Having optionality kind of requires documentation, so one can understand the change they are trying to make. One man's intuitive is another man's pain in the ass. To say nothing of differences between culture, time period, training, or upbringing, that might change those assumptions.

If you want mindless information appliances, you are very likely holding one in your hand right now. Mobile is fantastic for that form of braindead design.


I believe Ubuntu came out 18 years ago. Almost as old as Netbeans. Many of the user-facing commands date back to 1973.

>can't you just `echo $JDK_HOME` and paste that output into your config?

Yes, but every time I download a new version of Netbeans this has to be done again. I only use Netbeans to make sure my project works there, so a significant percentage of my experience is the annoying 30 seconds of telling it where the JDK is (which if they put in any effort they could do in a startup dialog).

>One man's intuitive is another man's pain in the ass.

Press 1 for English. Press 1 for QWERTY. It would be nice if there were Press 1 for GUI! It's up to the user to hack his OS with shady third-party hobby projects to get a GUI for basic functionality like entering key-value pairs. Why don't webpages force users to enter "last_name=Jones;\nfirst_name=Bobby" when filling out forms? It's clearly so much better than boxes! The world has moved on for a reason. 1971 was the year of the *nix desktop when chmod came out.

I quit Android development because I hate phones and I'd quit Linux development too if I ever was dumb enough to start because it's a bunch of unproductive hobbyists trying to make everyone's OS into vim. And I'm on Windows 10 as always. Which is very customizable. And not in the dwarf fortress way like linux and bsd.


All I can really say then is, be the change you want to see in the world.

We're all just writing code to scratch our own or others' itches, everyone has differing opinions on what is good or not.


I would call deferring to the API design of the 1970s pathological humility. Like, humans and code are literally forced to use the same API. LOL.


Power users require CLI on every OS. Not just Linux, but also Windows or OSX, to be specific.


Especially macOS. Generally I feel like it's way easier to avoid the terminal on Linux since there are GUI apps for most stuff.


Yep, the difference between the Ubuntu experience and the Mac experience is stark. Also when the time comes to do power user stuff, macOS has many differences from Linux that have not been translated to documentation well, basically issues there are less Googleable.


>power user...that never needs to pull up a CLI.

What strange creature is this? Is it a mythical creature like a unicorn or closer to big foot?


The ultimate power user is he who has CLI powers but does not need to use them


> a power user friendly user OS that never needs to pull up a CLI

ERROR: Does not compute.

A CLI is the pinnacle of power user friendliness.


You might like the ability to use natural language on the CLI, then have AI create and run the command for you.

One example of many: https://github.com/mattvr/ShellGPT


It's called a Chromebook.


FWIW ChatGPT is pretty good at generating the specific Bash incantations you need if you describe what you want to do in plain English and don't forget to mention the specific version of the OS you're using. Unless what you're trying to do is pretty advanced and would be cumbersome in any other OS as well.

Learning specific APIs is over. Mostly.


Yeah I used chatgpt to write the string for a cronjob yesterday, and then an awk script to parse the output.
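Illustrative of the kind of thing that works well (not the commenter's actual job - a made-up example): a cron entry that logs `df` output hourly, plus an awk one-liner to flag filesystems over 80% full.

```shell
# crontab entry (illustrative; add with `crontab -e`):
#   0 * * * * df -P >> /var/log/disk-usage.log
#
# awk filter: skip the header, strip the % sign from the Capacity column,
# and print any filesystem that is more than 80% full.
df -P | awk 'NR > 1 { sub(/%/, "", $5); if ($5 + 0 > 80) print $1, $5 "%" }'
```

Describing exactly that in plain English is the sort of prompt these tools handle reliably.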


Don’t know why you’re getting downvoted.

I think you’re currently right and will be even more right in a year or two.


Yep. Jumped ship after Windows 7 due to pervasive telemetry. That was a deal breaker for me. It's funny that Windows (and its ecosystem) getting shitty is the major factor of "Linux desktop" happening.

I figured that the effort of decrapifying Windows has outgrown the effort of configuring a Linux system, and furthermore, the former becomes obsolete with every Windows update, but the latter stays with you forever.

In 2022, Linux port of Far Manager, far2l, got a fork with LuaJIT scripting support, far2m, and that was the last thing holding me on Windows.


>KDE gets better every week.

As someone who has tried KDE off and on over the years since its version 2.0 release in SuSE Linux, and through various versions and distros since: is there a fork that doesn't have twenty different configuration options in their own apps?

I was most happy with Ubuntu's Gnome 2.x desktop and am currently using Mate Desktop these days to try to hold on to what I consider the best desktop and what I always felt "home" at when using it. But it gets more and more inconsistent release after release due to intentional theme breakages. I'd love to be able to use KDE or Plasma instead and take advantage of the way that desktop leverages its shared libraries for performance if only I could get over the behavior of its shell and configuration.

Is there anything that I can run that re-organizes it into the traditional desktop that I prefer? A script? A combined theme?

I see all sorts of themes for Windows (why?) and themes for OSX (also why? especially without the skeuomorphic looks of the past?) but there seems to be very little that would take someone from Gnome 2.x/Mate to KDE. Am I just missing it? Is there any way off this burning platform?

I keep hearing how great KDE is and can perform but it makes my skin crawl when using it. Is there any way to change it that doesn't leave me lost at sea going between multiple applications with nested options?

Thanks in advance for any advice given.


I have yet to use a linux desktop which has working drag-and-drop. When you drag a file or file path from the file explorer and drop into the command prompt, it's supposed to put the file path there. Also, I shouldn't have to hold down twelve modifier keys to move a file from one place to another. I should be able to drag a file to a segment of the file path in the file explorer and have it move to that directory level.


> When you drag a file or file path from the file explorer and drop into the command prompt, it's supposed to put the file path there

> Also, I shouldn't have to hold down twelve modifier keys to move a file from one place to another

> I should be able to drag a file to a segment of the file path in the file explorer and have it move to that directory level

Guess what, that's exactly what happens in Dolphin (KDE Plasma). Please don't spread falsehoods.


I've used KDE plasma as recently as 2019 and it most certainly does not do that. If you're saying that it just recently got added then that's extremely sad that it took this many decades. Surely a few decades more until it's the default, though

EDIT: ha, I just looked it up. exactly as I remember, you need to use a modifier key to do any dropping. And it opens a dialog half the time instead of just doing it. I'm 99.99% sure it's not integrated with bash at all, let alone the OS, meaning you can't drag and drop a path from one program to another. Something you could probably do starting in windows 95


I'm not sure I understand what you're describing, but I believe Plasma and Dolphin work like that. If you drag and drop a file into a program it will select that file (if it expects a dropped file) or paste the path. You can also drag and drop files inside Dolphin itself to move or copy stuff around. I think what's confusing you is that, by default, it doesn't assume a move and prompts you to pick between that and copy, and you can force the choice and prevent the prompt by holding a key while dragging. You can, however, change this behavior in the settings to instead mimic Windows Explorer.

Correct me if I'm wrong.


It may be gnome (which is even worse). But if you're correct about the setting to change the behavior, then this is a non-default (setting) of a non-default (desktop environment) in Ubuntu. A really unfortunate situation.

I am skeptical about the claim that KDE's file explorer supports dropping files onto any part of a file path e.g. three levels up in one go, but if you say so.


I would if I could, but I don't need/use desktops --- my preference has always been for tablets, preferably portable with a stylus (and touch) --- CellWriter is kind of primitive, and driver support is problematic at best.

That said, my next project is connecting my Wacom One screen to a Raspberry Pi 4 (which unfortunately means giving up touch --- I'd be very interested in graphics tablet/screen w/ support for current Wacom styluses _and_ touch). Can't quite justify a Wacom Cintiq since I don't want the complication of a different stylus technology than my other devices (it's really nice to be able to switch between drawing on my Samsung Galaxy Book 3 Pro 360 to taking notes on my Kindle Scribe to checking something on my Note 10+).

I suppose I should try an Android tablet, but there's not much software there which is suited to the work I do.


Is it confirmed that Intel's drivers on Linux aren't doing this too?


Intel drivers are part of the kernel, and I imagine Intel's attempt to put telemetry in them would be shut down fairly quickly by Linus Torvalds.


Graphics drivers, at least on unices, come in two parts: one in the kernel, another one in the userspace. The latter provides interfaces such as Vulkan. That said, ANV (Intel's driver) is part of mesa, and not bespoke like nvidia's. But that's a moot point: this telemetry component is not a part of the actual driver, but another separate program in the package.


On libre drivers? No.


I do use the full-fledged nvidia drivers on linux (I don't know if the Steam games, LLMs, and image generation I use would work with the mentioned libre drivers?), so I would also like to know if this telemetry is also present in the linux drivers


Agreed. It is the year of the linux desktop. I installed Pop OS and everything works great right out of the box. It's now good enough that I could install it on my mom's computer.


Not in my experience. I was using a Dell work laptop recently and decided to use Linux instead of Windows. Once I got things working mostly to my liking, I updated the GPU drivers and got a kernel panic. I'm lucky to be tech savvy so I can recover from that, but after that I asked for a MacBook and now have a solid daily driver that I can rely on, without having to spend hours looking for workarounds, crashes, and kernel panics


> Everything works.

...if you're a software engineer. If you need Photoshop or Word or another industry standard software then you don't count.


>>...if you're a software engineer.

I'm a software engineer and you'd have to pry Visual Studio out of my cold dead hands, it's the reason why I deal with all the nonsense of using Windows.


Out of curiosity, how does Visual Studio (proper) stack up against JetBrains tools, especially CLion?


But you can only use it to write Windows software?

Unless you mean VSCode which is cross platform.


Visual Studio supports development for linux using remote linux machines, virtual machines or WSL for execution.

https://learn.microsoft.com/en-us/cpp/linux/download-install...


I think they must mean legacy visual studio, rather than VSCode.

VSCode is of course very portable. It also seems to be Microsoft's (successful) attempt to get everybody to use a reasonable Linux-style workflow. If you look at it as a text editor and terminal in a tiling window manager, it suddenly makes sense that it became so popular.


There is nothing legacy about Visual Studio. There is simply no equivalent of its debugging and profiling capabilities in C++ and C#, especially in graphics / game development. No such equivalent exists in the Unix world, including macOS. They set the bar.


Agreed, but all those features don't help at all if you are doing software for platforms that aren't Windows or apparently the major game consoles.


Have you tried Vtune? I don’t do much profiling, so I’m not sure what exactly good is, but when I’ve played around with it, it seemed neat.


I've never tried it but does anyone know how C/C++ development is in Xcode?


I'm a games developer - PS5/Xbox/Switch have excellent VS integration. And yes I mean the full fat VS.


Oh I didn't know that. Those versions aren't accessible for us plebs :)


If you're referring to Visual Studio, then Visual Studio Community[0] is free (for individuals and "non-enterprise organizations") and is equivalent to Visual Studio Professional.

[0]: https://visualstudio.microsoft.com/vs/community/


The console SDKs. I know what Visual Studios are available for Windows. Until 18 hours ago I had no idea you can develop for Nintendo/Sony with it.


> If you need Photoshop or Word or another industry standard software then you don't count.

But you can always use GIMP, right?!

OK, this was a bad joke.


To be fair, this is Adobe's fault. The only reason it isn't available on Linux is that Adobe goes to great technical and legal lengths to ensure it can't be. A VM with seamless windowing isn't a bad solution for running that kind of forceware; compromising the whole host OS seems excessive.


Blaming doesn't fix anything. The real point here is that it's a political issue and the open source community is too infantile for politics so they keep lying to themselves instead.


I wonder if it would work under a vfio system.


Almost surely! Outside of seeing obvious signs like VirtIO devices, one can go to great lengths to hide the virtualization

Similar tricks that work for the ESEA Anti Cheat client and Nvidia code 43 will likely suffice

VFIO takes this a step further and provides one less virtualized device


I really want to try VFIO and ditch my windows install completely but I'm worried about anti cheats. Nice to hear there are steps you can take...

You have experience with this? If so, just wondering...are there linux distributions to avoid for VFIO? I'm between arch and NixOS, nothing too outside of the mainstream.


It's definitely worth a whirl! Some anti-cheats are more effective at catching this than others.

I was using this as my method of 'Gaming on Linux' until Proton became a thing.

Lots of experience, indeed, though my memory hasn't aged particularly well. I even had SLi working with two RTX2080s! Hacked drivers and EFIGuard to bypass security things

Valorant was the one game I couldn't really manage.

Perhaps with more determination, but I lost interest rather quickly. Not that into the game and Proton really hurt my VFIO involvement; the timing was unfortunate.

There are some rote edits to the libvirt XML I can't recall. Both to get the nvidia driver to work (if applicable, look for 'code 43'), and to hide the VM state for anti-cheats.

You'll generally be well served by your distribution of choice with modern kernels and QEMU/libvirt.

I don't know Nix well, but from what I gather, you get to pick a lot... so it shouldn't be a problem. Arch is Arch, it'll be fine being so new!


> Lots of experience, indeed, though my memory hasn't aged particularly well.

That's kind of hilarious because my aging thing is manifesting itself in making me unable to play more complicated games. Like, I wish I could get into Dwarf Fortress or the new Baldur's Gate but always feel fried and opt for a round of Call of Duty (and now Diablo 4). Those games require 0 reading; it's all instinct, and nothing in the game happens without some audio-visual feedback...

The irony here is that I might take a deep dive into VFIO because call of duty is one of those games that will never work with Proton...I guess for you that was Valorant?

These must all be the side effect of having kernel level anti cheat programs running on Windows and very dedicated anti-cheating teams.

> I don't know Nix well, but from what I gather, you get to pick a lot... so it shouldn't be a problem. Arch is Arch, it'll be fine being so new!

I have a habit of setting up linux machines and forgetting the process. I'm hoping NixOS will help with that :D


Don't worry, you're not alone! I have the same 'aging problem'

Though... I suspect it's a compound issue. Work is draining! Mindless fun is all I can handle, too :D

Kernel level anti-cheat is indeed the bane of Proton. Fortunately, VFIO can help there - giving a full trusty Windows kernel.

I long for a future where 'we' collectively reject these. I believe there are less-technically-invasive methods for dealing with cheaters than... essentially writing a driver and creating attack surface area.

NixOS should indeed - forcing you to write things down as you go!

I'm partial to Ansible for this, personally. You can automate quite a lot with it - there's a very healthy ecosystem of modules for almost anything you could imagine

If written well, it will work on any distribution. It's a fun challenge/art


> adobe goes to great technical and legal lengths to ensure it can't be

Could you elaborate? It was my impression they simply didn't care, but I never looked into it.


You can use Krita! It’s legitimately very good.


not industry standard, useless


How is an amazing, advanced, free design program “useless”? If you can’t dictate the tools you use at work you should find a new job.


GIMP's pretty good


Also, no cloud or subscription required.


Creative Cloud has a web version of Photoshop[1] supposedly and then there's Office 365, which has been around for a good long while now. I suppose one could use those if need be.

[1] https://en.wikipedia.org/wiki/Adobe_Creative_Cloud#Desktop,_...


"don't count" as far as Adobe and Microsoft are concerned, yes. You can't blame the people that spend a lot of their free time trying to bring free software to a free platform for not coming up with something that can act as well as Photoshop or Word and fit in with their ecosystems well given the way those companies try to lock things down.


You might want to look into who makes Linux. Intel wrote 23% of the changes for kernel 6.0 and AMD wrote 32%.

https://lwn.net/Articles/909625/


Proceeds to defend this by arguing that Windows already does it anyways then goes and mentions how Linux doesn't do it... but then your whole comment makes no sense, because there is a justifiable annoyance at being tracked on Linux as well even if you quit Windows through Intel drivers


> there is a justifiable annoyance at being tracked on Linux as well even if you quit Windows through Intel drivers

I feel like you may be making some pretty wild assumptions. The Intel GPU drivers for Linux have little to nothing in common with the Intel GPU drivers for Windows.


KDE has been getting progressively worse and less usable since 3.5.


So refreshing to see someone else who revels in the nostalgia of 3.5!


I believe the correct word is KNostalgia.


Fully agree, I've been using cinnamon instead for a long time already


By everything you mean an extremely narrow slice of hardware and software works -- which admittedly are widely used and can give the false sense of "everything".

But https://xkcd.com/619/ is just as true today in spirit: mainstream multimedia has serious issues. Bluetooth often just doesn't work, forget about any decent resolution from any streaming service.

Heaven forbid you wanted to dock to an eGPU, might as well reboot because you need to restart your apps anyways.

Multifunction printer/scanners are a crapshoot. Strange enterprise VPN and wifi, well, I am wishing you good fortune in the wars to come.

Nah. Life is too short to struggle with this.

O&O ShutUp10++ handles the telemetry. Way easier to deal with that once than to constantly struggle with Linux.


We need a new rule for tech, that says "Any major tech conglomerate grows until their main source of revenue is tracking and telemetry."

Why the fuck does Intel need to know "the category of websites I visit, but not the URL themselves"? Who defines the categories? We all know that metadata is at this point as important as data.

Stallman was right.


> We need a new rule for tech, that says "Any major tech conglomerate grows until their main source of revenue is tracking and telemetry."

The "Don't be evil" rule.


You mean because that was stricken from Google’s corporate values once they were big enough that two of their main sources of revenue were tracking and telemetry?


Basically: any company that claims immutable principles will eventually mutate those principles.


Not defending the opt-out thing, but they are probably talking about video-first sites -- NetFlix/YouTube/TikTok/whatever likely call for different performance optimizations than Reddit or HN or Facebook.

And yes, Stallman was and is right about a great many things. He will be remembered with respect in the underground general-purpose computing scene of the not-too-distant future.


But we don't know that, do we? There might be a government-pushed category of websites that they monitor and report upstream for legal purposes [1]. You don't need 30 categories to know if I'm on a video-playing site or not. And you can monitor performance without knowing that I was on a website or not. They write the video decoding engine, it can tell them if it's not performing as expected, no matter if I was on "Social media site" or "Porn streaming site".

It is all bollocks. A smoke screen. A slippery slope. It starts with performance metrics and ends with selling user data for profit.

--

1: I wonder whether, if you know what category of website an anonymous user is visiting and what time they open/close the tab, it would be possible for three-letter agencies to uniquely identify you and track you across the internet.
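To make that worry concrete, here is a back-of-the-envelope sketch. Every number in it is a hypothetical assumption for illustration (nothing disclosed by Intel), but it shows how quickly category-plus-timing tuples become identifying:

```python
# Toy model: with C website categories and T coarse time buckets per day,
# a history of N observed (category, time-bucket) events can take up to
# (C * T) ** N distinct values. Once that exceeds the population, a short
# visit history is effectively a fingerprint.
categories = 30           # hypothetical number of content categories
time_buckets = 24 * 12    # 5-minute open/close buckets in a day
events = 4                # visits observed for one user

distinct_histories = (categories * time_buckets) ** events
world_population = 8_000_000_000

print(distinct_histories)                     # on the order of 5.6e15
print(distinct_histories > world_population)  # → True
```

Real browsing histories are correlated rather than uniform, so this overstates the entropy, but even a large discount still leaves far more distinct histories than people.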


rms is only seventy, unless you have information to the contrary we should expect him to live another few years yet!


Now wait and see what AI does with all of that data...


As a practitioner in the AI space: absolutely nothing useful.


For sufficiently high standards that may be true, but even if so, it won't prevent numerous organizations from profiling based on it anyway and targeting accordingly.

Also, in favor of freedude's position, we may have just not waited long enough to see yet, because unless they're forced to there's no way they'll ever be deleting the data.


And how do they know the category if they aren't "collecting and using" the URL?


> Intel CIP is functionally similar to the Telemetry component of the GeForce Software. NVIDIA's Telemetry is installed and enabled by default along with your GeForce graphics drivers, and you cannot opt out from it, as it does not even figure in the "custom" installer options.

No idea what data Nvidia collects but this sounds even more shady.


> No idea what data Nvidia collects but this sounds even more shady.

It gets worse - if you're on Windows and your driver was not installed by an OEM (if it didn't come with the computer, or you reinstalled the OS or something), GeForce Experience wants you to connect to the internet and sign into an Nvidia account to access some features of the graphics card, like ShadowPlay.


But thankfully GeForce Experience is totally optional; you can just choose not to install it when you install the driver.

Honestly it doesn't even have any critical feature https://www.nvidia.com/en-eu/geforce/geforce-experience/


Then you will not be able to access some of the features that are advertised for the hardware you bought (like ShadowPlay).


Well, for some features you might be able to. Based on around 20 seconds of googling it seems like GeForce Experience has some flags that'll start some things up directly (this may be how they're managed by the interface, I'm not sure). This should include shadowplay if you pass it the `-shadowplay` flag.

It's a bit worrying that it already requires this though, the whole thing is as close as it can be to DRM without obviously being DRM. And Nvidia is known to DRM their GPUs a lot.


You can still use alternate front-ends to "shadowplay" like OBS replay buffer.


One of the huge sells of ShadowPlay is that it doesn't place an extra burden on Windows to capture the screen every frame; instead it captures the actual framebuffer (and encodes the frames) directly on the GPU, without having to switch or bottleneck any rendering code paths. It wouldn't surprise me if people who swear by ShadowPlay can tell the performance difference compared to software screen capture.

Basically it aims to replace the need for an additional out-of-band capture card (those can get very expensive, especially if you want 4K60).


I don't think you're going to be running into much performance difference between the two. People say GameCapture already grabs the frame buffer, but I can't say whether that's true or not.


Most likely, different framebuffer - ShadowPlay grabs the exact frames sent to the physical monitor while GameCapture intercepts it in software (most likely anyway, if it doesn't use the Nvidia-specific APIs that ShadowPlay does). The former doesn't require inserting any extra steps into the rendering pipeline, so it's generally faster (sometimes by a lot).


While you can't use ShadowPlay you can use Nvidia's hardware encoding with software like OBS to do the same thing as ShadowPlay. Actually, it's even better because you can configure OBS to only do it in RAM.


A good portion of the overhead that ShadowPlay addresses is actually video capture, not just encoding/compression - but IME it's true that even NVENC on its own can take a good load off the CPU.


Doesn't this at least make some sense? ShadowPlay is literally for recording and sharing video.

If you don't want Nvidia to record you... It sounds like you don't want shadowplay either.


>Doesn't this at least make some sense?

No, it doesn't make sense. Why do you need an internet connection in order for your graphics card to record video? Even if it's inherently networked, NASes exist and don't need Nvidia's servers.


ShadowPlay is not just for uploading recordings to the cloud. ShadowPlay also allows you to capture the output of your GPU without having to insert an external capture card in between. It essentially uses NVENC to generate a video stream before it even leaves the graphics card, so the CPU doesn't have to do nearly as much work.


You are equivocating between two different senses of the word “record”. A person may very reasonably accept one of them without the other.


Absolutely, I think I have it installed, but it's intentionally one major version behind (GeForce Experience 2 instead of 3) so it won't make me log in, and that works well enough when I certainly don't let it run in the background, I only boot it up for certain buttons it has. I don't remember which ones - I haven't touched it in months, so they must not be that important anyway.

Edit: Apparently the GeForce Experience that showed up in my search was actually the one that came in the driver support package for my laptop (which I no longer use), I may have never even used it on this computer at all.


No Shadowplay, no Gamestream (so, if you bought an Nvidia Shield for that, you're screwed. No, Sunshine isn't currently at feature parity with Gamestream), no automatic driver updates (which, in this day and age, are quite important for newer games), no Ansel.

Nothing critical no, but all things that, if you don't have and need, have no real alternative.


I just wait til I run across a game that refuses to start due to old drivers before I bother to update them. Works pretty well as a notification system for me.


Surely you can just firewall off Nvidia's telemetry servers.


It's a bit hard to log into an Nvidia account without connecting to their servers. But I guess if you're not philosophically opposed to online logins, you could totally just log in and then block its internet access with a firewall rule (either Windows Defender Firewall, or another app like Portmaster).

I do this anyway even though I don't have GFE installed because other components of the driver also try to connect to Nvidia's servers for some reason. Not on my watch.

(Though I wonder if there is some registry key or something to make GFE believe you've logged in, even if it's never connected to the internet. Surely someone would've figured it out by now.)


Why do you have to login to enable video recording functionality?


I don't know. The app won't let you past a login screen, so you can't use any of the settings. Luckily the Nvidia Control Panel that comes with the driver is devoid of any obstacles whatsoever, but it seems that at least the ShadowPlay implementation happens to live with GeForce Experience.


Weird, seems like someone would have hacked a workaround as you say.


Nvidia collects all the same data, and you cannot opt out of it if you run GeForce experience.


> if you run GeForce experience

Exactly. It is a reason I never install it when updating Nvidia drivers (yes I know I'm missing out eg. on shadowplay but I don't use it anyway).


I think GFE's implementation of ShadowPlay is replicable by third-party/open-source applications as well. It just uses some sorta-undocumented Nvidia APIs to capture the screen without the software overhead: https://forums.guru3d.com/threads/msi-afterburner-3-0-0-beta...

Not sure if anything like OBS implements this or not.


FYI, I was wrong here.

Telemetry on window information, focus times, click locations, etc. is collected with just the driver.

GeForce experience additionally collects personal information.


DNS block:

events.gfe.nvidia.com
lightstep.kaizen.nvidia.com
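For anyone wanting to script this, a minimal hosts-file sketch. The two domains come from the comment above; `0.0.0.0` is the conventional sinkhole address, and on Windows the real file would be `C:\Windows\System32\drivers\etc\hosts` (editing it requires admin rights) — a scratch file is used here for illustration:

```shell
# Append sinkhole entries for the telemetry hosts to a scratch hosts-format
# file; appending the same lines to the system hosts file has the same effect.
cat >> hosts.blocklist <<'EOF'
0.0.0.0 events.gfe.nvidia.com
0.0.0.0 lightstep.kaizen.nvidia.com
EOF

# Confirm both entries landed
grep -c 'nvidia' hosts.blocklist
```

Note this only blocks lookups by name; software that hardcodes IP addresses or uses DNS-over-HTTPS will sail right past it.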


Ridiculous, Nvidia cards are expensive and on top of that they collect the data for free.


High-end TVs are filled with all the crapware and telemetry shenanigans of the low-end TVs as well. Why would companies not double dip?


Huh, I wonder if this applies to people with Linux Nvidia drivers.


Linux drivers don't come with GeForce experience


As a long time nvidia-on-linux user, that's an "experience" I'm willing to miss.


How can engineers working on this even look at themselves without being ashamed?

For the record: I'm a graphics driver engineer myself.


>How can engineers working on this even look at themselves without being ashamed?

The same way Google/Meta engineers can work at the biggest user-tracking ad empires and not be ashamed.

What do you expect them to do? Quit their jobs in protest? Other engineers who need that money will gladly do it. Everyone has their price, and not everyone has the luxury of being so picky with their careers, especially when it comes to having a big name like Intel on your resume. Going to work, implementing the tickets assigned to you, then going home without vocalizing about the ethics of the big picture is what the vast majority of employees do.

If you want this practice to end you have to legally regulate it, the free market alone doesn't work in favor of the end user's privacy. Without laws and regulations in place, if Intel won't do it, all the other companies would. It's why regulations are important.

Plus, I think half of HN userbase built their wealth or currently works for FAANG companies that make their money from spying, tracking or monetizing user data for ads, so the stone you threw will break a lot of glass here. How do you think they justify it other than "I like money"? So why single out only Intel engineers for what is common place around these parts?


> What do you expect them to do?

If they can't bring themselves to do what's right, the least they can do is be ashamed of what they're doing.

> So why single out only Intel engineers for what is common place around these parts?

Because this particular example is about Intel. When other companies that do this come into the spotlight, then the same criticism is levelled at the devs enabling them as well. Big picture, Intel's not being singled out.


>then the same criticism is levelled at the devs enabling them as well

It isn't. Nobody criticizes those who work at Google or Meta here. There's more talk about maximizing your TC there, than telling people not to work there.

Double standards, probably because Intel doesn't pay as well as Google/Meta, so it's easy to throw stones where you don't plan to work anyway, but people stay quiet when it comes to the places where they earn their wealth.


> Nobody criticizes those who work at Google or Meta here.

What can I say? I see such criticism frequently. I even pile onto it from time to time.


They criticize only the companies, not the workers who choose to work there, the way the original poster chose to criticize those working at Intel for implementing telemetry. That's the difference. Meta/Google workers aren't judged here for working at these unethical companies.


> Meta/Google workers aren't judged here for working at these un-ethical companies.

Sure they are. I've seen comments like these aimed at workers of those companies far more than workers at Intel. There have even been entire threads about whether or not having Meta or Google on your CV hurts you, and should hurt you.


On Google/Meta it's built into the product though. You're connecting to their servers to use the service on their servers. You also don't pay them anything by default.

For Nvidia you're expected to pay $1000 for a graphics card and then they will collect data on you and send it to their servers even though you didn't choose to use their online services at all.


Frame the mirror with their pay stubs? I'm kidding, Intel doesn't even pay that much.

It probably requires not thinking about it and just following the plan laid out by the PM, which a lot of people are very good at.


They'll rationalize their behavior, just like most of us do. "All the other companies are already doing it", "No one cares about privacy", "It's not my decision", "We need this data to help the customers".


Well let's ask HN:

How many of us click accept on those annoying cookie dialogs or have an extension do it for them?

And how many take the time to find the reject button and/or install appropriate extensions that will do the rejection?

I'm in the second category but I'm sure the first one is the majority, mostly because they just don't care.


"I have resigned myself to everything insisting on taking as much of my personal data as it can possibly manage, and have already been worn down too much by so many years of trying to navigate it in ways that protect myself, so now I just sadly click accept so that I can get on with my life" is not the same as "I support the right of companies of all types and sizes to take all of my data to do whatever they want with."


I personally know one person that simply doesn't care. They're not all resigned, I think.


Those pesky popups keep reappearing on each visit until you hit accept. Once you accept, they never bother you again. Not sure there is a way around it besides aggressive anti-tracking extensions / DNS servers, etc.


Use the annoyances block lists in uBlock Origin; that gets rid of them on most any bigger site. Annoyances isn't on by default because mainstream users would get too frustrated with the occasional breakage it may cause.


I don’t, not anymore. I use the annoyance lists for uBlock origin, click on “deny” on smaller sites, use the AdNauseam browser extension on every desktop browser to poison the wells of advertisers, switched to iOS from Android, installed LuLu on macOS and deny programs internet access if they don’t need it…

I was never particularly privacy conscious. I was “normal”. Now that my fucking GPU driver spies on me, I think I’m rightfully becoming slowly paranoid about it.


Since I browse without Javascript enabled by default, I almost never see those cookie dialogs. I don't allow persistent cookies regardless.


In other words, they'll lie to themselves.


The movie Thank You For Smoking about a tobacco lobbyist introduced a term for this.

The Yuppie Nuremberg Defense, I've got a mortgage to pay.


Thank You For Smoking was an all-around great movie.


Rationalization. I've seen developers I thought were reasonable doing much worse things with cellphones.


Intelligent/smart people are fantastic at rationalizing, it's like a superpower.


I'd be interested to hear more about the cellphone bit, if you're willing to share.


I'd love to, but since I'm not sure what the implications could be (I was employed by them at the time), I don't know if I can safely do it. All I can say is that it was locking out features through software.


Oh, I feel ashamed, but that doesn't stop me.

I put considerable thought into the ethics of storing data and letting that data get inaccurate and harming people. Same with providing that information to people whose goals actively cause harm.

But I'm here for the war, not the individual battles. I raise objections and sometimes I can make a difference. This is one I'm going to lose most of the time. So I say my bit and do my job in a way that causes as little damage as possible because the next person won't.

Meanwhile, I get paid and live another day to make a difference somewhere else.

And yes, I know exactly where that reasoning leads. I wrote a story with an assassin justifying his actions the same way. There are limits in the amount of damage I'm willing to contribute to, telemetry doesn't cross that line.


Who are the parties in this war? It sounds like it's you v. everyone else.


The battles are usually things like invasive user tracking, insecure storage of PII (comprehensive credit reports), locking people out of accounts without recourse, dinging people's credit scores, giving out (potentially incorrect) information to police without warrants, etc.

So, upper management, government agencies, finance departments, and marketing departments.


Pretty sure Intel would be happy to apply a level of indirection through external contractors to get this done either way. That's the world we live in. I don't see how engineers at Intel would risk antagonising whoever dreamt this up.


> How can engineers working on this even look at themselves without being ashamed?

These aren't engineering decisions but marketing ones. The role of an engineer is to engineer a solution, not make business decisions. The decision to turn telemetry on by default was not taken by an engineer, so engineers at Intel shouldn't feel ashamed of anything in the first place.


The guards at Auschwitz were not there to make business decisions for the country as a whole, just there to guard the camp, so they didn't have to feel bad about the human ash landing on their clothes every day.


> The guards at Auschwitz were not there to make business decisions for the country as a whole, just there to guard the camp, so they didn't have to feel bad about the human ash landing on their clothes every day.

https://en.wikipedia.org/wiki/Godwin%27s_law

No need to say how easy it is to brush off that kind of argumentation for the fallacy it is.


Godwin's law does not mean that any comparison involving Nazis is automatically a fallacy.

In this case, it was a relevant comparison. Court cases involving former concentration camp staff have established that "I just work here" is not a particularly strong defense. If your work enables harmful actions, and if you are aware of it, you may be found responsible for the consequences. Even if you don't participate directly in it. Even if you could face adverse consequences (including death) for refusing to do your job. And even if someone else would likely just take your place.


> In this case, it was a relevant comparison. Court cases involving former concentration camp staff have established that "I just work here" is not a particularly strong defense. If your work enables harmful actions, and if you are aware of it, you may be found responsible for the consequences. Even if you don't participate directly in it. Even if you could face adverse consequences (including death) for refusing to do your job. And even if someone else would likely just take your place.

No it wasn't a relevant comparison, no genocide is happening because someone turned on telemetry by default, this is completely unhinged.

If you're so outraged by all this then you should give these engineers alternative opportunities of work that don't involve telemetry. Can you do that?


>If you're so outraged by all this then you should give these engineers alternative opportunities of work that don't involve telemetry. Can you do that?

These workers CHOSE to work for FAANG. Nobody puts a gun to your head and makes you choose the $300k a year FAANG job over the $200k a year non-FAANG job. Hell, even IN FAANG, Netflix isn't really doing anything unethical. Sometimes taking the high road means less personal profit.


> These workers CHOSE to work for FAANG. Nobody puts a gun to your head and makes you choose the $300k a year FAANG job over the $200k a year non-FAANG job. Hell, even IN FAANG, Netflix isn't really doing anything unethical. Sometimes taking the high road means less personal profit.

Intel isn't a FAANG to begin with, second, I'm asking you again, are you creating alternative opportunities for the people who would refuse to do their jobs because it goes against your sense of morals? What are these opportunities of work you've built for these people? Answer that question. If you can't answer that, then who are you to judge? Not everybody can afford your grandstanding.


> The decision to turn telemetry on by default was not taken by an engineer, so engineers at Intel shouldn't feel ashamed of anything in the first place.

And this lack of accountability is exactly why our world is the way it is. Just remember anyone can justify evil behavior if they try hard enough. Why? because people are not basically good by nature. They are basically evil by nature. Observe any 2-3 year old throwing a fit and you might begin to grasp the concept.


> And this lack of accountability is exactly why our world is the way it is. Just remember anyone can justify evil behavior if they try hard enough. Why? because people are not basically good by nature. They are basically evil by nature. Observe any 2-3 year old throwing a fit and you might begin to grasp the concept.

There is no "lack of accountability" anywhere in that story. Nothing done here was illegal, nor "evil" which is an unnecessary hyperbole.


I don't agree with the perspective that if there's no law against doing a thing, that makes it OK to do the thing.


Sure there is. The fact that nothing here was illegal, but is objectively harmful, is the lack of accountability.


All you’re doing here is revealing your lack of ethics with regard to this topic.


> All you’re doing here is revealing your lack of ethics with regard to this topic.

No argument, resorts to personal attacks.


> The decision to turn telemetry on by default was not taken by an engineer, so engineers at Intel shouldn't feel ashamed of anything in the first place.

Yes, they should, because it's those engineers that are implementing the decision. It doesn't matter even a little whether or not they were one of the decisionmakers.

The engineers implementing things like this should be deeply ashamed.


Same as how all other IT personnel contribute to their companies' marketing efforts: "know your customer", "improve the product", etc. Not necessarily evil, just a reflection of how money and profit drive free enterprise. As the struggle for margin intensifies, all efforts are put into capturing the most details about the usage of a product.


Intel is suffering financially. This is the phase in which they, regretfully, cross all ethical boundaries for the sake of the company's bottom line.


If that's not the reason Nvidia is doing it too, why should we believe this is the reason Intel is doing it? Truth is they're just greedy and they're doing it because they can.

Edit for FirmwareBurner: I will, after you point out the part of my comment in which I claim that some F500 companies aren't greedy.


Please show me an F500 publicly listed company that isn't greedy.

The way our capitalist economy functions and CEO compensation is set encourages nothing but greed and short-term profits at the expense of everything else.


It's the pressure for growth. Who started this trend of rewarding shareholders through share price increase instead of dividends? Was it Apple? Some entity in an unrelated market 50 years before Apple?


....Where did Apple come into this?

The origin of the pressure is likely a shift from shareholders being primarily people who want to hold onto a given stock for many years, and be rewarded through dividends, to being primarily financially-oriented institutions who want to buy low and sell high constantly.


I just have a faint idea Apple is one of the first tech companies that never paid dividends.

And yes, that was my point.


What difference does that make?

The demand for returns comes from equity owners, such as people with 401k and pension funds.


Yes but if your numbers are constant you can still pay dividends, as long as you're profitable, while if you need to provide capital gains you need to increase the numbers so the share price will go up.

This is before taking into account how these things are taxed.

Edit: you changed your comment :) I answered the previous version.

Edit 2: and even a pension fund could take their dividends and reinvest them. But I guess that would be too much work.


I changed my comment to ignore the tax treatment since dividends from stocks held for more than a year are taxed the same way capital gains are taxed.

> Yes but if your numbers are constant you can still pay dividends, as long as you're profitable,

Is this not where the money for share buybacks comes from? The business can choose to pay dividends (cash), or it can spend the cash on buying back its shares. Either way, it should offset equally.

>while if you need to provide capital gains you need to increase the numbers so the share price will go up.

This does not make sense. The share price depends on supply and demand of the share. If the business is doing share buybacks, then there is less supply and more demand for the shares, hence share price increase.

Anyway, my point is the pressure for growth has nothing to do with share buybacks versus dividends. Investors like more money than less money just like anyone else does. One could even say all these taxpayer underfunded pension plans for government employees that have long assumed 8%+ growth to meet their obligations are “pressuring” businesses for growth.


Telemetry by default should be illegal. They should be required to obtain owner permission.


And not just “permission” as a mandatory click-through, but freely-given permission. The permission may not be bundled with any other agreement, and the product must remain usable regardless of the telemetry settings.


This is why I prefer "informed consent" over "permission". And even "informed consent" is redundant, as it's not possible to provide real consent if you're not fully informed about what you're being asked to consent to.


Informed _uncoerced_ consent like what we're starting to see thanks to the GDPR ([Allow All] and [Deny All], and not [Yes] [Later] or worse, [Yes] [Back])


I agree -- coerced consent is not actually consent at all.


I wish we could all pool money and buyout politicians in some tiny corner of the globe to pass laws that made all telemetry illegal for any device that entered their geography and allowed hefty fines for even basic data collection.


The question here is not telemetry itself, but the infinite over-reach in what data is collected. Categorized websites have nothing to do with driver crashes or performance issues.


It's also about telemetry itself. That this telemetry is obviously overly expansive just increases how objectionable the practice is.


That would make any site using mixpanel/google analytics/sentry/etc. illegal. Not so easy to enforce.


And wouldn't that be a terrible shame.


Perhaps they meant locally installed software.

And EU has tried making sites classify and ask for permission for cookie storage.

And maybe mixpanel and Google and cross site tracking should be opt-in for users.


But these are two very different scenarios though. When I use a website I'm actively interacting with that website. I want something from their web servers.

When I'm using an Nvidia graphics card I'm not actively looking to use their web services. I want the graphics card that I bought to work and that's it. The graphics card doesn't improve as a device by connecting to the internet.


But metrics could help nvidia fix issues and add features in the next driver update?

I'm mostly playing devil's advocate here, what I truly think is that all monitoring and analytics should require user approval a bit like cookies. E.g.: you visit a website and it would ask for permission to send monitoring data to listed sites.


As far as I know, it is required.


> Telemetry by default should be illegal.

Is it not?


I'm not aware of any laws in the US or in Europe that generically prevent "phoning home".

There are laws that limit your ability to collect certain types of sensitive information without some quasi-meaningful user consent, but most telemetry goes around this by notionally not collecting PII.

The gotcha is that in practice, most companies don't put a whole lot of effort into making sure there's no incidental PII in the telemetry, and no other way to infer who you are. Browsers automatically collect crash reports that, for a good while, might have contained your cookies, URLs, and other goodies in the logs or memory dumps... cars collect "anonymized" telemetry that shows you driving from your single-family home to wherever you're headed... etc.


> but most telemetry goes around this by notionally not collecting PII.

The problem is that an IP address is considered PII and is inherently sent in any HTTP request, so you could argue that any non-essential request to any third-party should be opt-in since it contains PII.


Unless you have a contract stating the non-collection of PII, then you can wash your hands. OTOH, it's just as easy to be willfully blind about that.


True. I'm surprised this doesn't go against GDPR.


You don't understand the GDPR then. The focus is on controlling where data goes (not out of the EU), but your point is about telemetry in general. If you keep the data inside the EU and manage it in a way that's compliant with the GDPR, then that telemetry doesn't put you in jeopardy.

If you want to be protected from telemetry then you need to advocate for better legislation. Claiming that something is covered by the GDPR ultimately masks the issue, because it's not against the GDPR, and you lose an opportunity to advocate for better legislation to reduce, or kill, telemetry.


GDPR is not only about where the data goes; it's also about asking for the user's explicit consent for data collection [0], which is what I was referring to. Not the telemetry collection. I just thought it worked for other stuff than cookies.

https://www.privacyaffairs.com/cookie-consent/


I think that's a good point. Companies are likely going to try to neutralize this by arguing that telemetry is just another kind of data that they can make safe by removing PII, etc.


Probably so, even though that argument is absurd.


Collecting data is not against GDPR. GDPR is about consent, processes etc.

Whether or not this is compliant depends on the way their installer is set up.


But "default" is precisely not asking for consent. One of the things the GDPR requires is for websites to explicitly ask for the user's permission to allow cookies.


You can count on the tech industry to provide you with a daily dose of outrage. This is very corrosive. Not only does it poison the well people want to use towards positive goals, but with tech being effectively unavoidable by anybody but extreme hermits, it creates a sense of complete and irreversible moral decay that permeates all of society.


> it creates a sense of complete and irreversible moral decay

or, it is just another reflection of the "complete and irreversible moral decay that permeates all of society". the decay of any civilization is predictable and irreversible, and not a pretty sight...


Maybe we should stop re-positing the specific developments of contemporary capitalism with some generic notion of civilization. A civilization is not some 'national body' that grows, matures, and dies exactly analogous to a biological organism.


definitely not "national". but a civilization rises and then falls. it's a universal cycle, entropy at play.


A word replaced with one more vague one does not a robust argument make. Such a notion of universality lacks the concept of frames of reference and relies on an impoverished understanding of the concept of entropy. The argument for a cyclical imperative to history is at best tautological and fundamentally anti-empirical.


scrabble soup, lol


Not on Linux. One more reason why drivers should be open source and live inside the kernel.

I remember months (years?) ago when it was discovered Wacom tablet drivers did the same on macOS; well, not on Linux.


Can some EU citizen please make a complaint to a DPA (Data Protection Authority)? It's gathering so much stuff, I don't think that this would hold up to scrutiny.


Yet another repulsive software move. Our collective deterioration continues unabated.


What is the Mac OS side of telemetry looking like? It feels like Apple having commercials about how their computers don’t record everything you do would be a great knock against chrome os and windows, but I honestly don’t know if that’s true.


MacOS sends out a whole lot more than the zero things I have granted it permission for. It regularly sends out diagnostic reports covering day-to-day usage without anything exciting through the SubmitDiagInfo tool, even though the Device Analytics & Privacy doc explicitly mentions this won't happen when Sharing Mac Analytics is off.

The vast majority of third-party applications are also sending everything they can access and then some, often far exceeding the scope of, and coming nowhere close to meeting, the privacy and security standards claimed in the relevant policies.

Telemetry endpoints are increasingly being CNAME-masked or called through anonymous proxies. Betraying the customer's/user's trust this way is absolutely despicable. Shipping normal release-channel apps with a Sentry SDK configured to run in debug mode seems to be the new meta for some reason (nothing good, probably).

And then there's Mozilla... Not only are many of their recommended extensions secretly sending telemetry and doing all sorts of other things that violate their extension policies multiple times over (strict verification and thorough code auditing process my arse), but Firefox itself includes UTM and other tracking parameters in its outgoing links to third parties (including all preinstalled search providers) even when "Allow Firefox to send technical and interaction data" is unmistakably set not to allow it.

Things would be different if this software had actually improved over time, but in most cases the opposite is true. Thinking about it, the more telemetry an app collects, the faster it seems to deteriorate across the board. An example of two opposites on this spectrum would be Vim vs. Spotify.


Which platforms does this apply on? The article doesn't mention eg Windows or Linux.


This is 650 MB[1] of Windows bloatware (now spyware) that gets installed alongside the driver. For 99.999% of users, it provides absolutely no benefit[2].

For Linux users, all that's required are periodic updates to the open-source kernel, the proprietary firmware blobs, and the open-source Mesa stack, and that's it. If you really feel it necessary to see statistics for your Intel GPU, there is a sysfs interface for doing so, and open-source tools are readily available to make these statistics easy to view and use[3].
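For example (a sketch only; these are the i915 frequency attributes, and the card index may differ on your machine):

```shell
#!/bin/sh
# Sketch: read i915 GPU frequencies straight from sysfs.
# Assumes the Intel GPU is card0; adjust if you have multiple DRM cards.
card=/sys/class/drm/card0
for f in gt_cur_freq_mhz gt_act_freq_mhz gt_max_freq_mhz; do
  if [ -r "$card/$f" ]; then
    printf '%s: %s MHz\n' "$f" "$(cat "$card/$f")"
  fi
done
# Or install igt-gpu-tools and run intel_gpu_top for live statistics.
```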

[1] https://www.intel.com/content/www/us/en/download/726609/inte...

[2] https://www.intel.com/content/www/us/en/products/docs/discre...

[3] https://gitlab.freedesktop.org/drm/igt-gpu-tools/-/blob/mast...


650 MB for a graphics driver, to support a small family of related GPUs, on 1 or 2 OSes? Just... wow, unbelievable. What a waste of bandwidth, storage & RAM / CPU resources.

I look forward to the day when one could take such an installer, have an AI assistant install it inside a simulator, decode & categorize every byte installed (plus every byte in the installer itself), then map out the function of each byte. Just for giggles.

I'll predict the functionality vs. fluff ratio must be 1:10000 or worse on this thing.


Looks like Windows - it mentions a setup/installation wizard, which isn't a thing on Linux for GPU drivers. And Intel has separate GPU drivers for Windows and Linux (the drivers for the latter being open-source), so it's likely just Windows. Especially since I doubt Mesa would accept a patch for a telemetry feature.


Intel themselves do seem to offer downloads, eg https://dgpu-docs.intel.com/driver/installation.html & https://www.intel.com/content/www/us/en/docs/oneapi/installa... - it's not outside the realm of possibility to have some kind of installer app there.


I would be surprised if the Linux kernel or Mesa would approve such a pull request.

Slipping it into already secret code (proprietary drivers), though, would be a cinch.

So, it's Windows.


Am I the only person who thinks that this sort of telemetry is a good thing? It helps the user if telemetry provides reports of bugs and performance issues that help improve the product. It is a benefit to me that if a program keeps crashing, the responsible team will see this and be motivated to fix the bug. (I also like the thought that teams that write bad code will get embarrassed in meetings by bad telemetry numbers.) Obviously telemetry can be used for evil, which is bad.


How does tracking what websites I visit reduce bugs in my graphics driver?


Just make it opt in or have the user manually click yes/no during the install.


Nothing wrong with this in principle. The burning question is what they use the data for. In particular, do they sell it to 3rd parties? How long do they keep the data?


Telemetry can indeed be very useful in a way that benefits everyone.

But that doesn't make it right if it's done without the user's informed consent.


„The categories of websites you visit, but not the URL itself, Includes universal plug and play devices and devices that broadcast information to your computer on a local area network: for example, smart TV model and vendor information, and video streaming devices.“

This sounds like a really bad idea to let run on your PC. This is basically spyware that scans your (corporate) network too.


"You have been blocked from accessing this website. We're now blocking all traffic coming out of Hetzner networks."

Thank you, AI!


Does any OS use capability-based security or other means to categorically lock graphics drivers out of network access, and why does Windows not?


These people never stop, do they.

Move to Linux. Take a look at NixOS.


Sadly, we're seeing an uptick in opt-out telemetry in Linux/OSS software as well. Even they can't limit themselves to making it opt-in.


How do I shut off Nvidia telemetry? Can I block IP addresses or DNS names?
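E.g., would hosts-file sinkholing be enough? Something like this (the hostnames below are guesses on my part, not a verified list, and connections to raw IPs would bypass it entirely):

```shell
#!/bin/sh
# Sketch: point suspected telemetry hostnames at 0.0.0.0 in the hosts
# file (run as root). The hostnames are illustrative assumptions, not a
# confirmed list. On Windows the equivalent file is
# C:\Windows\System32\drivers\etc\hosts.
HOSTS_FILE="${HOSTS_FILE:-/etc/hosts}"
for host in telemetry.nvidia.com gfe.nvidia.com; do
  if ! grep -q "0\.0\.0\.0 $host" "$HOSTS_FILE" 2>/dev/null; then
    echo "0.0.0.0 $host" >> "$HOSTS_FILE"
  fi
done
```

A per-application firewall seems like the more reliable route, since a driver service can always connect to an IP address directly.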


What is the Windows equivalent of Little Snitch?


Safing Portmaster: https://safing.io/

Its open-source version supports a system-wide firewall and blocklists. Pretty easy on battery as well.


Thanks for the mention. If anything is missing on the filter lists to block this stuff, let me know, I'll add it asap.


Nvidia doesn't collect anything from Linux drivers; I wonder if Intel does the same?


tl;dr:

> We are unilaterally altering the Terms & Conditions for the device you purchased. Pray we don't alter them further.


Send them your GDPR Requests!

Give them fire.


Intel wants to provide the best computing experiences. To accomplish this, we would like your permission to collect, use, and combine information to understand:

The categories of websites you visit, but not the URL itself

Software usage: for example, frequency and duration of application usage such as Intel® Driver & Support Assistant, but not the application content itself such as specific actions or keyboard input.

https://www.intel.com/content/www/us/en/support/topics/idsa-...


> To accomplish this, we would like your permission to collect, use, and combine information

Oh man, they left a typo in there. Did anybody catch it? They use the word 'permission' when this is actually selected by default for all users who install their drivers. Hey, that's not how you ask for permission! I would love to see the look on their faces when they realize this mistake.


In general, Silicon Valley has a huge problem with / misunderstanding of things like "permission" and "consent." If computing was like dating, Silicon Valley would be a creepy guy in the nightclub constantly telling women: "We are now dating. Do you want to dance? [Yes] [Ask Again Later]"


They'd be more like the guy who sees a girl at a bar who he works with, roofies her, takes her home, and then doesn't do anything to her but makes sure she knows that he *could* have, isn't he such a nice guy?


Everyone is all aboard the consent train until it becomes inconvenient. See all the apps/websites with options for "yes" and "not now". I could never make that a button without feeling gross


Wow ... that sounds a lot like Spyware.


Telemetry is spyware with a marketing facelift.

But I agree with a previous poster. If you're OK with what Windows already does, then this barely registers. For all we know, this one won't even reset the manual opt-out every 6 months to a year, so it might be a minor annoyance in comparison.


Philosophically and ethically and technically speaking, telemetry that happens automatically without first obtaining informed consent from the user being surveilled (i.e. so-called "opt-out telemetry") is equivalent to spyware.


This is not surprising. Intel did things like ME in the past. The days when you bought hardware and had full control over it are long gone, imho.


They're still doing ME are they not?


Did they ever stop?


That was my question exactly.


People hating telemetry just because it is telemetry are tiring as hell.

Blame telemetry for collection of unreasonable stuff, not for collecting data that is valuable for software improvement and doesn't violate privacy.


People defending telemetry just because they deem it "reasonable" are tiring as hell.

Collecting the categories of websites I visit violates my privacy, full stop, end of discussion.


So did you just agree with me?


Depends on your definition of "reasonable".


Collecting website categorization is not "valuable for software improvement" and certainly does "violate privacy."


I didn't say it is.

Just arguing against hate for all telemetry.


Show me one example where telemetry made software better.


Why would I have to? The concept is solid and stands on its own.


If the concept has had no benefits, then it is not solid and does not stand on its own.

Especially if you cannot think of a single example.


I do not follow OSS projects and their decision making processes in order to provide links.


Telemetry is wrong if it's happening without my informed consent. What data is being collected doesn't affect that. If you're collecting data about me, my machine, or my use of my machine without my informed consent, you are spying.


>If you're collecting data about me, my machine, or my use of my machine without my informed consent, you are spying.

What if I collect data about the way you use my software?

As a dev, I'm using telemetry to provide a better product for my users.


Just because spying is useful for someone doesn't make it not spying.


This is not spying.

The purpose of spying is to gain advantage over, e.g., enemies. There is no good faith.

The purpose of telemetry is to collect some data about how my product is used in order to make it better.

Also, telemetry is documented, so it is no secret; it is just that users don't give a shit. Meanwhile, a spy tries to stay undisclosed as long as possible.

Intention makes a difference.


Allies spy on each other all the time.

* https://www.npr.org/sections/parallels/2013/10/28/241384089/...

* https://www.vice.com/en/article/5d9bp8/us-spies-allies-south...

And good faith and intent are not a sufficient guard.

Intent makes a difference, but it's not magic. It only makes a meaningful difference if it changes what data gets collected and stored. The data gets collected either way. Data breaches happen, and intent isn't stable (or even fully coherent in companies): you might trust them now, but you probably shouldn't trust them a few years down the line. Look at how differently the Google of today acts than the Google of 2000.

For the case in point, that data includes "The categories of websites you visit, but not the URL itself, Includes universal plug and play devices and devices that broadcast information to your computer on a local area network: for example, smart TV model and vendor information, and video streaming devices.", so far more than "data about how my product is used in order to make it better". That's some other purpose. We already have evidence of bad intent. Bad faith is not at all uncommon with large companies. You do remember the Sony rootkit[0], right?

And even with good intent it's easy to overcollect data, because of the fear of missing out on something useful. It really could be useful to see if crashes correlate with other running software, various registry settings, etc., but collecting that absolutely should be considered way beyond the line. (That rootkit was packed with telemetry too, collecting e-mail addresses and listening habits.)

Telemetry we _know about_ tends to be documented, however there's an unknown amount of sampling bias. Further, secrecy is not binary, things can be poorly disclosed. I'd argue they often are, with documents that are neither obviously visible nor transparent about what is collected. This is in no way surprising. Disclosure doesn't directly help any bottom line, it just guards against possible reputation and legal damage if it is discovered.

As you say, users mostly don't care. We live in an age of mass surveillance and have raised generations who think it normal.

[0]: https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootk...


> What if I collect data about the way you use my software?

That's collecting data about me, my machine, or my use of my machine. If you aren't getting people's informed consent before you do this, you're spying. If you have consent, then that's people sharing data with you and isn't spying.

> As a dev im using telemetry to provide better product for my users

Cool. I'm not saying you shouldn't. I'm just saying that you should ask first.


This is not spying.

The purpose of spying is to gain advantage over, e.g., enemies. There is no good faith.

The purpose of telemetry is to collect some data about how my product is used in order to make it better.

Also, telemetry is documented, so it is no secret; it is just that users don't give a shit. Meanwhile, a spy tries to stay undisclosed as long as possible.

Intention makes a difference.


> There is no good faith.

We in the software industry have already burned that bridge. There's been so much abuse for so long that there can be no reasonable assumption of good faith. Trust now has to be proactively earned.

> Intention makes difference

I disagree. If some guy decided to follow you around and write down every place you go and when you go there, wouldn't you say he's spying on you? Even if he has no ill intent, you can clearly see him doing it, and he literally does nothing with the data he records?

"Spying" is when you're collecting personal data without consent. What that data specifically is and what purpose it's to be put to don't enter into it.

> Also telemetry is documented, so it is no secret

An activity doesn't have to be secret to be spying. And sometimes telemetry is documented, but certainly not always. And often, even when it is, that "documentation" is buried and is hard to find. But regardless, simply documenting a thing is not equivalent to getting consent for the thing.

Think of it this way... why is there so much resistance to getting consent to collect telemetry data? The most common answer I've seen to that question (from the pro-telemetry camp) is "because if we give people a choice, too many will decline to provide telemetry". Which means that they know that lots of people don't want this data collected, but want it so badly that they don't care. How in the world is that not spying on people?


There are so many things collecting telemetry, nobody has the capacity to understand the data-points and business use case of every piece of software they interact with. There are two practical options:

1) Roll over

2) Reject as much as possible


> doesnt violate privacy.

And who is making that determination? You?

Because all data going out of my system without my explicit and freely-given consent is a violation of my privacy.


I will agree that there is some telemetry data that can be useful for the developers. People developing a display driver might want to collect statistics on what resolutions and refresh rates people are using so they can optimize their test setups for the most common setups. They might want to know what games people are playing so they can optimize performance in those games.

But why does a display driver need to know the categories of websites I visit? They have no need for that.


The thing is, taken separately, most telemetry seems reasonable.

But if you leave it all activated, you end up with a computer that connects to hundreds of remote hosts constantly.


Based on the state of Microsoft software these days, I don’t think telemetry is as useful as developers think it is.


It is probably used to confirm how effective dark patterns are at tricking people into doing things they don't want to do.


After having realized that I don't know and can't put into exact words what harm telemetry represents for me personally, I stopped caring and I always opt in to share as much of my telemetry info as possible


I get not being bothered enough to opt out.. but actively opting in? What's the motivation there?


the model citizen



