On the topic of "Microsoft is not less evil than before": today I needed to use Teams, and I was pleasantly surprised that it supported a web client. I tried to use it and got caught in a tunnel of dark UX. You go to the website, click to sign in, get asked for your email and password, then verify your email, then discover that you actually signed up for something else and can now sign up for Teams. You follow the wizard again, then you are told to use Skype because you are not a company. You restart and make sure to check the box saying you are a company, then enter your company info, and it works! You try to start a call, and you get told it does not work in Firefox...
On the topic of "Microsoft is not less evil than before": they recently released a GUI kit called MAUI, thereby stealing the name of the already existing KDE project. They know, and they're not being nice about it.
Yep. Google also offers a lot of "rich cards" in their search (interactive things, mostly) that are only available in Chrome... unless you change another browser's user agent to Chrome, in which case it also works perfectly.
Ah, but the point was about using "their dominance in web search to undercut competing browser vendors".
In this case, ~90% of web searches go through Google[0]. So they are using the fact that they are the number one web search engine to convince users to switch to Google Chrome.
I have hit cases, especially with the O365 admin panel, where things are just broken in Firefox. They didn't even bother to deny Firefox; they just didn't test at all.
> diagnostic data collection (telemetry) is not enabled for private builds
> this data collection is covered by Windows 10 privacy. You can find the Windows 10 privacy statement and details of controlling the diagnostic and feedback settings here.
So if you build from source, you can disable it, and if you don't build from source but install it from the store, then telemetry is controlled by the central privacy settings in Windows 10.
Presumably this would be a problem only if you specifically don't want MS to have telemetry from winget, but you also specifically want them to have telemetry on the rest of your OS, which would be... weird.
You're technically correct for some use cases. For Windows 10 Enterprise users, the official tool does allow you to disable telemetry by modifying the system telemetry settings.
For Windows Home and Professional users, this is not the case. Disabling telemetry is not possible because Microsoft have decided that there is a minimum "required" amount of telemetry every OS installation must send in order to function.
If the telemetry description stated that some Windows users are able to opt out, they'd be correct.
It's just another example of Microsoft showing they couldn't give a rat's ass about their customers' wishes and that you'll just have to deal with them tracking everything you do.
And who knows? Maybe I do want to send Microsoft reports about kernel bluescreens so that Windows can get more stable, but don't wish to upload a report every time I install or uninstall software. Why would it be strange that I want to share some telemetry but not all of it?
> For Windows Home and Professional users, this is not the case. Disabling telemetry is not possible because Microsoft have decided that there is a minimum "required" amount of telemetry every OS installation must send in order to function.
Yes, but AFAIK winget's telemetry fits entirely under the optional category. No winget metric falls under the "required" category that you cannot disable.
In other words, you can completely disable winget's telemetry, which the title of this article says you cannot.
Elsewhere in this discussion, a Microsoft employee working on the new terminal was completely sure that the data it collected wasn't sent anywhere, until they looked into it again and discovered that it actually was being picked up by some part of the inscrutable telemetry machinery and sent off to Microsoft.
It's not quite so clear. The data collection policy linked in the comment (that can be controlled in settings) distinguishes between required and optional data collection. It's possible to disable optional data collection, but not required data collection.
Here's one of the elements that is required (and so it cannot be disabled):
Information about your device, its settings and capabilities, including applications and drivers installed on your device
The required telemetry is already gathered by Windows 10, though. My point is that if you disable the optional telemetry in W10, then presumably winget doesn't collect anything.
So while I agree with you that the required telemetry is obtrusive, it's not new and it's not related to winget.
This project collects usage data and sends it to Microsoft to help improve our products and services
It's this type of broad data collection disclaimer that I find lacking in modern products. Usage data? It's not clear what that is, or what exactly is being recorded, and what is sent that is required, and what is sent that is optional.
This is an impossible problem. If they underdeclare, they invite a lawsuit. If they overdeclare, they get accused of malfeasance they aren't even doing.
If they open the whole business to transparency, they become victims of commercial espionage and expose their users' data to invasion.
How about just providing a link to an actual payload (with example data)? And within the installed application, providing the ability to review all the data that was recorded and set?
MS adheres to GDPR export/delete requirements. Failure to adhere risks litigation, so it's not something that MS is just going to willy-nilly flout.
It's frustrating reading a lot of comments that are of the form "Why doesn't MS just do XYZ?" from individuals who don't seem to know that MS is already doing that.
The parent comment that I replied to was asking about transparency and I tried my best to answer.
> Why doesn't MS just stop forcibly collecting data from my machine?
Updates.
At the minimum data collection level for client SKUs (set via user settings), MS asserts that only data needed to keep the machine running and secure is collected (i.e., updates). They really do try hard to honor this assertion, although I understand I'm not going to be able to entirely eliminate skepticism.
You can argue that you should be allowed to disable telemetry for updates, but I think that's a different conversation. MS didn't want Windows' reputation to suffer due to undeployed fixes (this was a big problem prior to Win 8), and they took a hard line on updates.
Enterprise SKUs have a zero exhaust option (no telemetry), but alas, it's not free.
> MS asserts that only data needed to keep the machine running and secure is collected (i.e, updates)
I think this phrase is going to be a contentious point anyway. Microsoft doesn't need any telemetry to "keep the machine running" - as trivially evidenced by me pulling out the Ethernet plug, disconnecting the wireless adapter, and booting up Windows 10. The OS will run just fine. As for keeping it secure, it depends on how much you stretch the word "secure". I can't see any reason why telemetry would be needed to deliver updates (Windows could just download the list of available updates and request the ones it needs). You could argue telemetry is needed for malware detection and mitigation, but that can be stretched to justify everything up to and including uploading snapshots of your hard drive.
I'm guessing "needed to keep the machine secure" is how we've arrived at the rather absurd situation in which Windows Defender is automatically uploading executables it finds on the machine, to have them tested on Microsoft servers - which has been demonstrated to be a neat vector for exfiltrating data from secure corporate networks: the malware can just encode the data it collected into an executable and write it out to the file system, at which point it gets picked up by Defender and uploaded to Microsoft (which corporate firewall will most likely allow), where upon execution it connects to the attacker's server and delivers collected data.
Linux distributions manage to provide updates (and keep the machine running and secure) without telemetry. They were already doing this before Microsoft implemented automatic updates in Windows. (And, in fact, Windows had updates before it had telemetry.)
This is not entirely correct: for as long as MS has offered updates via the internet, the update service has required some information from the user's machine. If we went back to 2003 with the same legal definitions and awareness that we have today, we could have called this information 'telemetry'. The legal definitions and public awareness of the data gathering have changed a lot since then, but the use of 'telemetry' (for lack of a better word) hasn't.
You'll have to forgive my ignorance about Linux because I haven't used it in a long time: I just fundamentally don't see how any update manager (be it chocolatey, nuget, windows, or linux etc) can efficiently update a client without knowing what's already on the client. If any installation information is sent from the client to an external entity, it may be considered 'telemetry' and subject to certain laws. Doesn't sudo send such information? (again, forgive my ignorance on the Linux implementation).
> I just fundamentally don't see how any update manager (be it chocolatey, nuget, windows, or linux etc) can efficiently update a client without knowing what's already on the client.
Pretty simple. Instead of sending information about what's on the machine to the update server, the client would download the list of available updates from the server, use that list to determine what updates it needs, and request those specific updates from the server.
That's e.g. how apt - the Debian family package manager - works. You `apt update` to update the local cache that lists available packages and their versions; then, you `apt upgrade` to download and install any or all packages that need updating.
The client downloads the list of packages from the update servers, compares the available versions to the installed versions and offers (if the user so chooses) to download and install the newer versions (or a subset). (And there is some additional logic around resolving conflicts, all on the client, using metadata downloaded from the servers.)
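As a rough sketch of that client-side resolution (package names and versions here are made up, and real apt additionally handles dependencies, pinning, and signature verification on top of this):

```python
# Sketch of apt-style, client-side update resolution. The server only
# ever serves a manifest and the package files the client requests;
# nothing about the machine's installed state is sent anywhere.

def parse_version(v):
    """Compare dotted versions numerically, e.g. '1.10' > '1.9'."""
    return tuple(int(part) for part in v.split("."))

def resolve_updates(installed, manifest):
    """Return names of packages whose available version is newer
    than the installed one. All comparison happens locally."""
    return sorted(
        name
        for name, have in installed.items()
        if name in manifest
        and parse_version(manifest[name]) > parse_version(have)
    )

# Hypothetical local state and downloaded manifest:
installed = {"openssl": "3.0.1", "vim": "9.0", "curl": "8.5.0"}
manifest = {"openssl": "3.0.2", "vim": "9.0", "curl": "8.6.0", "git": "2.44"}

print(resolve_updates(installed, manifest))  # ['curl', 'openssl']
```

The client would then request exactly those package files, so the server learns at most which files were downloaded, not the full inventory of the machine.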
You might argue that this is not "efficient", but it works great and, in my personal experience, is a lot faster, more reliable, and more predictable than Windows Update.
(As for sudo: No, unlike e.g. the .NET Core CLI and winget, system utilities don't normally contain extra unrelated functionality to silently collect user data and send it off somewhere. Perhaps one reason Microsoft don't hear more complaints about it from developers is that nobody imagines that you would do such a thing. It's normally not done.)
Yeah, to extend the thought about sudo - which, for clarity, is a command you use to execute something as a superuser on your system; it's not a package manager, but something you often prefix when interacting with one.
So sudo, depending on system configuration, does log interactions, but does it locally. The rationale behind this is that Linux heritage is one of multi-user systems, so you want to have the owner/sysadmin of that system to be able to audit the use of superuser privileges by regular users, in particular when they break something. In a Linux system installed on a desktop, or a server you admin yourself, the user is the only person who can access these logs anyway, so it's no different than any other system log.
That these logs would be automatically sent to third parties is unthinkable - in the sense that nobody would even think an operating system would dare send sensitive information out like that.
* Basic error information to help determine whether problems your device is experiencing can be addressed by the update process.
* Information about your device, its settings and capabilities, including applications and drivers installed on your device, to ascertain whether your device is ready for and compatible with the next operating system or app release and ready for update.
* Logging information from the update process itself to understand how well your device’s updates are proceeding through the stages of downloading, pre-installation, post-installation, post-reboot, and setup.
* Data about the performance of updates on all Windows devices to assess the success of an update’s deployment and to learn device characteristics (e.g., hardware, peripherals, settings, and applications) that are associated with the success or failure of an update.
* Data about which devices have had upgrade failures and why to determine whether to offer the same upgrade again.
The most egregious of that list is, imo, this:
* Information about your device, its settings and capabilities, including applications and drivers installed on your device
> The most egregious of that list is, imo, this: Information about your device, its settings and capabilities, including applications and drivers installed on your device
How is it possible to know what updates to install without checking this information?
1. The software update process provides your local machine with a list of available updates.
2. The update process includes metadata that indicates what local machine conditions should be satisfied to warrant installing a given update.
3. The local machine checks the metadata against its own settings, capabilities, applications, and drivers to find matching conditions and applies the relevant updates.
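A minimal sketch of steps 2 and 3, with invented field names and update IDs (real update metadata is far richer than a couple of predicates):

```python
# Each update ships a predicate over local machine facts; the machine
# evaluates the predicates itself and never uploads those facts.
# "os_build" and "drivers" are made-up fields for illustration.

machine = {"os_build": 19041, "drivers": {"nvidia": "451.48"}}

updates = [
    {"id": "KB-A", "requires": lambda m: m["os_build"] >= 19041},
    {"id": "KB-B", "requires": lambda m: m["os_build"] >= 20000},
    {"id": "KB-C", "requires": lambda m: "nvidia" in m["drivers"]},
]

# Step 3: evaluate each condition locally and keep the matches.
applicable = [u["id"] for u in updates if u["requires"](machine)]
print(applicable)  # ['KB-A', 'KB-C']
```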
Is that practical with the sheer number of different drivers/updates that can be delivered through WU? And wouldn't they get the information anyway when they see your machine in the access log for the update files?
The local machine only needs a manifest of available updates and its local state to determine what updates it requires. The potential size of the manifest doesn’t strike me as a good reason to exfiltrate more data from the local machine. There are myriad options for limiting the size of the manifest into something more manageable—say, splitting it into many smaller manifests—and let the local machine identify the updates it needs. An access log isn’t going to provide the Update Provider the same level of data that transmitting local settings, capabilities, applications, and drivers would—or the Update Provider will have to write additional software to turn access logs into best-guess approximations of, at most, apps & drivers being downloaded by a particular IP. However, the local machine still retains more information about itself than is otherwise given up by blanket data collection of its settings, capabilities, applications, and drivers.
The collection of massive amounts of data is not necessary to provide a collection of available updates and leave it up to a local machine to determine which of those updates it needs to apply.
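One possible sketch of the splitting idea: shard the manifest by a hash of the package name, so a client fetches only the shards covering its installed packages (the shard count and hashing scheme here are arbitrary choices, not any real update system's design):

```python
# Sharding a large update manifest so clients can fetch small pieces
# without revealing their exact package inventory.

import hashlib

NUM_SHARDS = 256

def shard_of(package_name):
    """Stable shard index derived from the package name alone, so the
    client and server agree on the layout without coordination."""
    digest = hashlib.sha256(package_name.encode()).digest()
    return digest[0] % NUM_SHARDS

# The client computes, locally, which shards it needs...
installed = ["openssl", "vim", "curl"]
needed_shards = sorted({shard_of(p) for p in installed})

# ...and requests only those. The server learns at most which shards
# were fetched (each covering ~1/256 of all packages), not which
# specific packages are installed.
print(len(needed_shards) <= len(installed))  # True
```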
You can't disable telemetry on Windows Home or Professional in any supported way. You can either hack through multiple registry and task manager entries or use a third-party program. I suggest Shutup10.
Yeah. Looks to me like the issue lies more with the base telemetry data policy surrounding Windows 10 itself, which has been a point of contention between Microsoft and users ever since Windows 10 released.
Even if winget allowed for disabling telemetry entirely, our OS is still collecting telemetry too, so it seems the point is pretty much moot, imo.
Are people worried about the actual contents of these kinds of telemetry, or rather just annoyed by the fact that it's there at all?
The first position seems a bit odd for something that is open source (so presumably you can verify what's being sent). I mean it might be bad to send "I installed product X" or "I used the command X" to a remote server, but on the other hand if I really feel this is a problem would I ever even be using the closed source binaries that the package manager installs, without worrying more what they might do, than what happened when the package manager ran?
Sometimes I get the feeling that the telemetry thing just became an expression of annoyance with something else entirely, or just the current state of affairs. It's like one of those culture wars where every battle is so symbolic that everyone forgot what the real issue was ("Why do we worry so much about who uses which bathroom again, dad?").
My purchase of a piece of software does not give that software provider carte blanche access to what I do with that software. Unless I explicitly give my approval (opt in), exfiltration of information from my system is an ethical breach of my privacy if not a breach of data protection laws.
As for why this matters, you need only look at Hong Kong for that answer. And even here in the US, many large companies are kowtowing to the Chinese government and changing their products in the US to toe those lines.
I'd gladly agree to negotiate a sale of my usage data, where I can set my price and they can decide if they'd like to buy it. Or better, I'll agree to send my telemetry data if they agree to send me their source code so I can see exactly what is being sent.
The "equitable exchange" of telemetry is you give anonymized usage data for free, and the developers provide you with a better product based on that data. If you trust the developer, it sounds like a reasonable deal to me. No one is making you use that software.
> the developers provide you with a better product based on that data
I think that it's incorrect to believe that telemetry allows developers to build "better" software. Web pages have had intrusive telemetry for over a decade now, and they still broadly suck. We're able to navigate the world of hamburger menus, text with poor contrast, and scrolljacking because we're adaptable humans, not because these kinds of changes actually improve the websites.
The problem is that telemetry can only tell you the "what" and "how", but not the "why".
> If you trust the developer
It's not just the developers who gain access to that data. It's everybody at the company. It's 3rd parties who contract with that company. It's everybody who hacks into the data set at the company. And it's every governmental agency from (almost) any country with an overly broad warrant who has access to that data.
See my upstream comments about China and Hong Kong
> No one is making you use that software.
Just as nobody is making an effort to make it simple (or even possible) to make an informed choice about whether I should use that software. Homebrew is a great example. I know that it's sending telemetry to Google only because someone raised a ruckus and used social pressure to make the homebrew devs add in a reasonable amount of consent into the install and update process.
Thankfully, laws are catching up with technology, and I should soon be able to rely upon simply saying "data collection not directly tied to software functionality legally requires my explicit consent, and that consent is not required to run the software".
I don't think you understand how telemetry works because you're comparing it to scrolljacking and such. The telemetry they're talking about is for error reporting. It answers questions like
1. Did this error condition that should not happen happen?
2. How many users did it happen for? (The "who" is not available, for legal reasons)
If you search for tracelogging in the code you'll find they're associated with debugging and errors.
> I bet you also want the software to work and not have bugs. Telemetry gives devs information to prioritize and fix the bugs.
You're just illustrating a single possible use among the universe of actual real world uses of telemetry, such as gathering your personal info to build up your profile to use against you as the company sees fit.
As others have stated throughout this discussion, it's standard practice for over a decade to use systems that don't collect telemetry to keep systems updated and safe. It boggles the mind how some people try to claim that the only way to remain safe with Microsoft products is to be perpetually spied upon.
> Are people worried about the actual contents of these kinds of telemetry, or rather just annoyed by the fact that it's there at all?
Both, I imagine. Worried about the contents because the trend of companies pulling in as much data as they can tends to not end well for consumers, and annoyed as it's an invasion of privacy that there is tracking for the many things we do on our personal devices.
We have regard for bathrooms as well as telemetry for a shared reason: privacy and a sense of control and safety. We are exposed and we seek to limit our exposure.
Without necessarily being worried or annoyed by telemetry, I could understand that people might question the need for it; is it required by other package managers?
Collecting data creates a risk that the data will be intercepted or misused at a later point, and there are often legal requirements regarding data collection and processing too.
It exposes a lot of indirect info about what you may be doing, writing, etc. I know in a high security environment that is a no no.
It's also the principle.
Lastly, it's trust. I tolerate some telemetry from Apple and sometimes opt into it with things like Debian. I don't trust Microsoft not to be customer-hostile.
Windows itself collects telemetry, so I can see your point, but your example is bad. Single-sex bathrooms are not symbolic at all and are a serious issue.
My stance is simple, I want to turn off all telemetry, period. It's my computer, I should be able to do whatever I want on it. Anything less is unacceptable.
I understand why telemetry is valuable to developers and I can accept when it's active by default. But I _insist_ on being able to turn it off.
Hopefully Shutup10 will be updated to disable this telemetry too.
But people also worry about what those closed source binaries do, in the last few months there was plenty of discussion about the Zoom installer doing shady things.
You are correct. It's annoyance with data harvesting and idealism around "I have the right to remain uncounted". I support much of that in theory, but not to the absolute extreme that is popular in some communities.
Unfortunately, it's often also combined with "I have the right to have my opinion counted", which leads to some very ironic moments where people opt out of telemetry and then complain that their uncountable needs aren't counted.
I find it outrageous; "telemetry" is built into most new Microsoft software. For example, they recently released a replacement console for PowerShell and CMD, called "Terminal 1.0", which also comes with some aggressive telemetry built in:
This also applies to newer releases of PowerShell, a.k.a. PS Core. I haven't tried either, but I guarantee you telemetry in both applications is not opt-in but opt-out, via some obscure method, if that is even possible.
In any case, the claim that telemetry is necessary to improve anything related to customer experience is ridiculous. Not only is general data collection unnecessary; it would be more efficient to run some experiments, even if only some opt-in A/B tests. Surveillance like the above is intrusive and can easily be abused. The data collected are usually fine-grained enough to allow for some nice fingerprinting of individual users. The potential for abuse is high.
The file you've identified produces a local, opt-in event stream that does not leave your machine unless you literally e-mail it to me. It's just got that unfortunate word in the filename that means we're bad guys.
EDIT, upon closer inspection: when this is built as part of the Windows product (which consumes source from this repository) those values may end up in an event stream. In the interest of full disclosure, those events are:
1. Part of the console host (conhost.exe) and covered by the Windows global data collection settings
2. Pertaining to (incomplete, but it's too early in the morning for me to do a full review of this code):
2.a. The number of times each low-level console API was used
2.b. How the legacy Find dialog is being used (long strings, short strings, search direction, number of times)
2.c. Specific settings like font size, how many colors are configured, how big the window and buffer are
I should put a disclaimer at the top of this saying that I'm just a regular old Hacker News commenter who skimmed that file and really has no idea what this code actually does; I'm not trying to scaremonger because I saw something sketchy without following up on it. However, that file seems to indicate that Terminal logs process connections. Is there a way that this information might leave the device? Could it include arbitrary processes on my system in that data?
When the console host (just C:\windows\system32\conhost.exe, not the new Terminal) exits it emits the following information for processes that had connected to it:
* How many ANSI/VT sequences they used
* How many of the above we understood
* How many of them we did not understand
* The executable stem name (ConsoleApplication1.exe, wsl.exe, cmd.exe)
* How many times we saw that executable
~1-5% of those entries make it into a data pipeline that I believe we stopped looking at years ago. These pipelines are usually(?) turned off by the OS, so it's possible that these were rendered inert. Still, though, and because the executable stem name might be a little more exposure than anyone's comfortable with, I've filed https://github.com/microsoft/terminal/issues/6103 to yoink it.
(It's been a long time and I still don't know how to format things properly on Hacker News :))
Thanks for looking into this, and I appreciate you filing an issue! (Hacker News doesn't really do formatting, so I think that's the best you're going to get.)
I am not sure what you are trying to do with your clarification here, but I feel even worse about Terminal and your employer.
> ~1-5% of those entries make it into a data pipeline that I believe we stopped looking years ago.
Can you elaborate? What does that mean? 1-5% of what, exactly? Of all records collected? Of the records on my machine? I thought those were localized traces; do they end up on MS servers or not? What criteria are used to reduce the data? You believe that "we stopped looking years ago"? What does that even mean? I'm not sure what confidence interval "believe" accounts for, but the error term is a bit high for my taste. And as for "we stopped looking years ago": I feel offended. You must think that I am very naive (that extends to all readers here, but I will only speak for myself here).
The code is clearly labeled telemetry, yet you claim that this is solely about local traces. If I am to believe that to be true, that makes me distrust your software even more, because then it must be of very poor quality. How do you fix bugs quickly, given that your engineers can't differentiate "local traces" from "telemetry"? The latter literally means "to measure from afar". The fact that you opened a GH issue about this makes me want to believe you; the fact that it's locked doesn't; but hey, that's your dev process and community work -- I won't judge that.
As for your next argument: "It's the OS's fault". I don't care whether the OS vacuums up all my data and sends it to your employer, or whether you personally break into my house to exfiltrate my hard drive: I don't want you to obtain my data in any way. This makes it even worse -- the application you're responsible for appears to passively create some sort of profilable data, which the OS exfiltrates. This does not remove you from responsibility.
Also: Please consider the bigger issue here. I really want Terminal to be a great piece of software. I would consider myself a Windows fan, if it wasn't for the ongoing disrespect of my privacy; Windows is unmatched in terms of stability and consistency and a proper terminal has been missing.
Your employer claims to gather data as to improve the software I use in my interest; despite the fact that I (and many others) keep telling them that they are REALLY interested in retaining the right to privacy, individualism and secrets.
This is about trust. You keep breaking it. I don't trust you, and I don't trust your employer. When I tell you and your peers so, your answers are evasive (GH issues is not the right place), hazy (oh, THAT is done in another component), bureaucratic (look at all these legal statements), irresponsible (it's the OS), or otherwise elusive (that thing clearly labeled "telemetry" doesn't do telemetry, because architecture). You don't understand the problem and you don't really care. I believe you stopped looking at that "pipeline" years ago.
It seemed reasonable considering VSCode is also a Microsoft product with an explicit telemetry option that you can opt out of.
Within 15 minutes the issue was closed and the idea of adding a telemetry option was dismissed by a contributor.
It's kind of scary that something so integral to your day-to-day as a developer is sending that much data out to Microsoft. It's partly why I stick with wsltty (which is equally as fast and has no telemetry).
Is that sending Microsoft all of my bash/zsh aliases? And what about
`TraceLoggingUInt32(_rguiTimesApiUsed[GetConsoleTitle], "GetConsoleTitle"),`
If it works like most other terminals I have used, that is going to send the name of the program I am running, or the host I am connected to, to Microsoft. That is pretty invasive if you ask me.
I dunno. The name "_rguiTimesApiUsed" (and that it's a "uint32") suggests that it's a count of times an API was used, not the raw data that went through that API.
EDIT: I put together a list of what happens in this file in a sibling comment.
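To illustrate the distinction, here is a toy model of such a counter (this is only a sketch of the data structure implied by the name and type, not the actual conhost code):

```python
# A per-API usage counter of the kind _rguiTimesApiUsed appears to be:
# call counts indexed by API name. It records *that* GetConsoleTitle
# was called, never the title string that passed through the call.

from collections import Counter

times_api_used = Counter()

def get_console_title(actual_title):
    # Only the fact of the call is counted; the argument is not stored.
    times_api_used["GetConsoleTitle"] += 1
    return actual_title

get_console_title("ssh prod-db-01")   # sensitive string passes through
get_console_title("vim ~/.aliases")   # ...but is never retained

print(dict(times_api_used))  # {'GetConsoleTitle': 2}
```

So a uint32 per API can reveal usage patterns, but not aliases, titles, or hostnames.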
Telemetry is also built into every web application you interact with. The lessons learned in this space are shifting to non-browser based applications it seems. Sucks.
Telemetry seems to be a sore topic with Microsoft, and one they're curt about. I've yet to see anyone make headway on even just having a discussion about it with public-facing MS devs; it almost always gets a quick, rote toe-the-line response, and the discussion gets terminated or blackballed or ignored thereafter.
It has the airs of an internal mandate. I can't help but be deeply suspicious of this behaviour.
Based on the quality of today's conversation here on HN, I'm not inclined to fault them for taking that approach publicly. We can't even discuss this topic coherently ourselves. Our discussion here is filled with "this is my position" statements that are presented as arguments contradicting another's position.
I think this post here today serves only as a rallying cry for "no telemetry" extremists and contributes nothing interesting or curious or relevant to HN that hasn't been covered in hundreds of framing-implied or framing-explicit "telemetry is bad" posts prior.
Okay, I could buy this argument somewhere like 4chan. HN though?
I can count on a single hand the number of times a thread has gotten hopelessly out of hand. Most people have reasons for their positions, for or against. That they may not necessarily agree with your own, or be politely phrased enough for your personal tastes doesn't detract from the message.
Software companies have treated the user's machine as their playground for years with little or no resistance, because most of the technically savvy were A) employed by them, B) kids or niche-enough hobbyists to be safely ignored, or C) cared not a lick as long as things worked.
I've been trying to warn places for years that this free ride will end as soon as larger swathes of the population are both socially and technically savvy, and large groups of people are in a position to put up substantial resistance over which implementation behaviors they are willing to support.
I'm a hardliner against hiding anything from the user. It's creepy, and it inspires nil trust. Companies have only themselves to blame when they start taking advantage of users' ignorance. I spend half my time making the various degrees of learned helplessness foisted on people by tech companies a thing of the past for the users I support. It takes time and persistence, but it can be done. Not one of them appreciates having had the wool pulled over their eyes.
> Based on the quality of today's conversation here on HN, I'm not inclined to fault them for taking that approach publicly.
Oh yeah sure, Microsoft is the victim, cyberbullied by their own users. It's a classic case of innocent naive corporations getting senselessly dogpiled by mean common people, clearly that's the way this power dynamic is arranged. Yeah, fucking right.
I've tried having this conversation, you're right. Making headway is really hard and one of the primary reasons is that the topic is very complicated. There are many employees at MS who deal with those complexities as a full time job.
MS Devs who do venture to discuss it are aware of the complexities and tread very cautiously: any slight misphrase has bad outcomes. Smarter participants in the conversation will merely point you to Microsoft's public facing privacy statement, which is such broad legalese that it's not entirely helpful to the analytically minded individual that wants to learn more.
That said, I've worked with diagnostic data for several years now and I've never seen anything to justify the hatred that MS gets over this. I've been on teams that have spent a lot of effort ensuring that any data is gathered, stored, and used in a respectful and legal manner. I've seen mistakes, but they are disclosed and remedied as part of the business process.
[disclaimer, MS employee, my employer wants me to state that this is my opinion and not Microsoft's]
But there's a solution that's neither complicated nor complex, greatly reduces the effort required in the areas mentioned and virtually eliminates those mistakes: don't collect data from my machine.
This solution has been successfully used before, even by Microsoft themselves!
Note that you can still collect crash dumps and usage data (at the risk of reintroducing some of the problems solved by not collecting other people's data) if you just ask nicely (and clearly) and the user decides to allow it. Office and .NET used to do this.
You can make telemetry opt-in and ask for consent at the moment of the installation. Microsoft refused to make it optional and lost the trust of millions of users worldwide, all while doing nothing to actually make Windows 10 suck less.
You don't have to make it a binary choice; it's very simple: pick the defaults you like, but then also offer an option to opt out, clearly explaining what it does and why, and respect the choice of your customer throughout subsequent updates. This is what being decent means.
Most people will just accept the defaults, so why piss the rest off?
I recently installed an Insider build of Windows so I could use WSL2 to run some docker containers in a Linux environment.
The insider build requires that you enable full telemetry which includes sending your visited websites to MS. I need WSL2 so I’m just avoiding doing anything private on my personal computer for now.
I understand why the data is useful to them but I don’t think they understand or care why this is an important issue to others
Windows as a Service has been their stated goal since prior to the release of Vista. Nadella has opted to go full speed ahead, with everything from Office to the versions of Windows running on the Surface & Xbox being entirely SaaS. I'm just waiting for the "You must subscribe" hammer to fall for Windows in general.
This is actually not how I understand the origins of the phrase. Before the PC, all computers were owned by large corporations and governments, and the PC was seen as a way to take back that control. See also: Apple's famous 1984 ad.
> the only purpose of this is specifically to be able to get telemetry to detect issues before releasing.
They could just ask for feedback, which is what I naïvely considered the "insider" build for—obviously Microsoft is just spying on you for some reason.
Insider builds are very clearly targeted at people who are OK with potential bugs and instability, and who are OK with providing telemetry to Microsoft so it can improve stability.
You are not one of those people, and thus shouldn't use them.
Have you ever seen support forums (like Apple)? It’s full of people who just say what the problem is with no detail about the circumstances. Then when you ask them what they were doing, they don’t remember or can’t reproduce it. There’s tons of stupid users.
Telemetry aims to fix that by providing context to what was happening when the problem happened. Whether it actually helps software development, I can’t answer.
How does logging websites you visit help Linux? Windows 10 2004 is already RTM. I've never understood why Microsoft hides their ISO images. It's like they want people to install a version of their OS with a bunch of manufacturer spyware or missing multiple major updates which take hours to install and leaves a bunch of files/directories behind.
If you want to download the ISO, you must pretend you are not on Windows. Otherwise, it will redirect you to the Media Creation Tool, which will download a bunch of files it wants to hide from you and insist on creating the media for you.
Media Creation tool doesn't download the ISO with CU updates. It downloads one that is about a year old and then once you run Windows Updates, it has about a years worth of updates to install (if not more).
The point was that Windows 10 2004 is RTM but you have to be a MSDN subscriber to download it, or find some shady Russian website to get it from.
Last time I tried, changing your user agent to Safari (OS X) or Firefox (Linux) made the media creation page redirect to a link to download the ISOs. I suppose if you had WSL installed and an X server you could bring up a browser in there, or you could use lynx :P.
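For the curious, here is a minimal Python sketch of the user-agent trick described above. The download URL and the Firefox-on-Linux UA string are illustrative assumptions (the page's behavior may have changed since); the request is only constructed here, not actually sent:

```python
import urllib.request

# An example Firefox-on-Linux User-Agent string; any non-Windows UA
# should cause the page to offer direct ISO links instead of the
# Media Creation Tool (assuming the page still behaves this way).
LINUX_FIREFOX_UA = (
    "Mozilla/5.0 (X11; Linux x86_64; rv:77.0) Gecko/20100101 Firefox/77.0"
)

def iso_page_request(url: str) -> urllib.request.Request:
    """Build a request that presents itself as a Linux browser."""
    return urllib.request.Request(url, headers={"User-Agent": LINUX_FIREFOX_UA})

req = iso_page_request("https://www.microsoft.com/en-us/software-download/windows10")
# urllib stores header names in capitalized form internally.
print(req.get_header("User-agent"))
```

Fetching it with `urllib.request.urlopen(req)` (or the equivalent `curl -A` invocation) should then return the page variant with direct download links.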
Actually you need Linux, not WSL2. That's the way Microsoft is slowly, sneakily, furtively planting the idea that their Linux is the real thing, which it absolutely is not.
This approach is much much much worse than Ballmer yelling that Linux is cancer, because there is no evident hate, no apparent will to destroy the competition which would raise warnings in the community. It is just Microsoft becoming smarter, then absorbing the technology up to the point it will become one of their products -and associated services-. It's Microsoft PR 2.0 and sadly... yes, it will succeed.
WSL2 IS linux though. There was definitely an argument to be made that WSL1 wasn't linux, but WSL2 is full linux within a virtual machine on windows. I don't know how you can claim that is not the 'real thing' when it definitely is.
You get a Microsoft-supplied kernel, which has modifications to integrate more with the host OS. I'm not going to make a value judgement about it being the real thing or not, but it's definitely not the stock kernel that comes with Ubuntu.
That's fair. True, it may be a modified linux kernel, but it is still linux. To say otherwise would be disingenuous and misleading, imo.
WSL1, on the other hand, was just a proxy masquerading as a linux kernel, so I would understand confusion between WSL1 and WSL2 on which is real linux and which isn't.
Microsoft effectively forked Linux here -- some pieces such as integration with the host NTFS filesystems or Direct3D support are unlikely to ever be upstreamed. Yes it's Linux, but not really.
Another way you can tell it's not actually Linux: if something is acting up, where do I inspect the source code? Where can I send a PR if I come up with a fix?
It's smart that MS bought itself a Linux foundation platinum membership; that probably goes a long way in preempting trademark claims of the 'Linux' name.
Op was saying that he/she could not run WSL2 without Microsoft telemetry.
So, yes WSL2 includes Linux plus Microsoft telemetry. Parent was saying that Op actually wants Linux, since Op wants Linux functionality, but not Microsoft telemetry.
It is not true Linux in spirit because it forces the user to live with telemetry and other closed source software and drivers, which defeats the purpose Linux itself was built upon.
Not just that. Privacy, trustworthiness and security aside, imagine Microsoft porting (or simply making WSL-aware) some of their software: libraries, system-internals hooks, gaming libraries, anything Linux users have wanted for ages. That is, things that require WSL but won't run on Linux, or will run "better" on WSL, or will possibly run on Linux but require some closed or prohibitively licensed code to run (properly).
Those are surely bleak scenarios, but corporations exist to make profits, and forcing Linux users to depend on a Microsoft-branded layer of software is without any doubt a way to keep them hooked on Windows, which for some will turn into buying products and services from Microsoft, and for others into finding less and less software that will run on plain Linux. I wouldn't be surprised at all if, say, in two years more books are published about WSL than about Linux. That would hurt as well, since it would mean fewer courses and fewer schools adopting Linux; in plain words, fewer users.
WSL2 contains Linux and other stuff. I would not say "WSL2 is Linux" because that kind of implies an equivalence which does not exist, and MS is starting to play on that while extending WSL2 with e.g. D3D12 access to the host and announcing it with titles like "DirectX loves Linux". That's not even strictly similar to the more traditional "integration tools" of VMs, given the user-experience integration at the core of WSL (in their case the VM aspect is more of an implementation detail: see the network, stdin/stdout, and maybe tomorrow GUI integration; see also, historically, the architecture of WSL1, which provided a subset of the same services without a VM and without Linux). And they never talk about opening that kind of integration to other VMs, much less third-party ones. Maybe they will (I would not count on it), in which case it is slightly more OK; maybe they won't, in which case this is borderline a hostile attempt to fork the ecosystem and redefine "Linux" (in the sense of traditional GNU/Linux desktop distros) as Linux running under Windows with WSL.
I'm not seeing anything in what you said that would prove WSL2 is not Linux. Saying WSL2 is Linux does _not_, in fact, imply that Linux is WSL2, and there have never been any claims to that by anyone within or outside of Microsoft.
What you just said equates to "A Motorcycle with a side-car attached can't be called a motorcycle because to call it a motorcycle implies all motorcycles have side-cars", which makes absolutely no sense.
I dual boot to Ubuntu 20.04 and I also use WSL in Windows10.
WSL is Linux, but without X Window or gnome. Very convenient for command line stuff because the native Windows Command line is awkward and unfamiliar to use.
Some things work better in Linux, some things work better in Windows. To pretend otherwise is being a fanatic, in my opinion.
My $.02: to me, the arguments I read here along the lines of "product improvement is hard, so that justifies collecting the data" ring hollow. When your convenience at work is balanced against someone's right to privacy, there's no middle ground. Privacy wins. You need active, informed consent to phone home for reasons not related to the proper functioning of the application.
I wish most applications offered 3 boxes:
1) Don't send telemetry
2) Send data needed to catch bad rollouts (think SRE style status code and latency metrics).
3) Send anonymized data to help improve the product.
4) I want to be a beta tester/insider, you can capture my logs.
0) Don't compile the code responsible for the telemetry into the binary because I don't trust the checkboxes will always be respected in code. Especially after seeing an off-by-one error in the description :)
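Sketching the tiered model above (plus the parent's level 0) as code, purely for illustration — the level names and what each level gates are made up here; no real Windows or application API is implied:

```python
from enum import IntEnum

class TelemetryLevel(IntEnum):
    OFF = 0        # send nothing (ideally: not even compiled in)
    HEALTH = 1     # rollout-health signals only (status codes, latency)
    USAGE = 2      # anonymized usage data for product improvement
    INSIDER = 3    # full logs, for self-selected beta testers

def allowed(event_level: TelemetryLevel, consent: TelemetryLevel) -> bool:
    """An event may be sent only if the user consented to its level."""
    return consent >= event_level and consent > TelemetryLevel.OFF

# With consent set to HEALTH, usage analytics stay on the machine:
print(allowed(TelemetryLevel.HEALTH, TelemetryLevel.HEALTH))  # True
print(allowed(TelemetryLevel.USAGE, TelemetryLevel.HEALTH))   # False
```

The point of the ordering is that each level strictly contains the ones below it, so consent is a single, easily explained choice rather than a maze of toggles.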
This whole bastardization of the word 'Telemetry' by the online community is completely abhorrent.
It is impossible to get proper usage feedback from your programs without being swayed by the vocal minority community.
We always find posts online on how crappy software is, but how can software improve if the majority of people actually using the software don't give feedback at all?
> This whole bastardization of the word 'Telemetry' by the online community is completely abhorrent.
I’m going to reply strictly to how you’re framing your response.
I believe that “telemetry” was only “bastardized” because companies have taken Privacy Policy language to the extreme. I’ve read every single Terms of Service and Privacy Policy I’ve ever had to agree to. The language in them has always been dense, but since the age of the ad-driven internet, the scope creep has been intense.
Most people will not read these documents, so they will assume that they are all the same and that all of them are equal to the worst version you hear of. That’s why “telemetry’s” meaning has changed. You cannot blame the general populace for this. The root cause is how bad acting companies have added language to Privacy Policies such that:
* just about all data == telemetry
* telemetry can be used for “other purposes”
* privacy policies can change without notice
* you cannot opt-out
These are the worst case scenarios. Do not blame people for assuming the worst. Blame those who have changed the rules of the game such that assuming the worst is the default.
> Blame those who have changed the rules of the game such that assuming the worst is the default.
Ah, so blame lawyers, product managers, anyone who releases free as in beer software, the open source contributors making all that software everyone else is stealing, the customer’s unrealistic expectation that someone else should pay for things, engineers for lacking the affect to care about any of this or to empathize even one iota with the real economy, product designers for being outwardly countercultural while authoring the literal dark patterns that get the telemetry in the first place...
Seeing as you posted this comment twice, here's my reply again:
No. I hope this burns to read.
Software never improves because incompetence is the norm. Not because we didn't have a magical data collection unicorn available.
Competent software companies ran user panels, had decent quality control, didn't steamroll their communities, didn't market loudly over user dissent and certainly didn't shut down their issue tracking to even their top tier partners.
That was Microsoft 10 years ago. That is Microsoft today. But you know, Telemetry solves all these problems doesn't it? No.
The real answer to your question: ask and listen. People will gladly tell you. Do not just take the data otherwise you end up with a set of poorly selected metrics which do not represent user opinion and a lot of pissed off customers who don't want to or can't tell you due to legislation and data protection.
Edit: to back up my point, Microsoft closed down Connect with over 30 issues open from me and our account manager left to go and work for a competitor because he was fed up of dealing with that kind of shit and couldn't even get basic issues from a Gold partner actually escalated to anyone. We had a ticket open for 7 years against clickonce where IE9 broke it completely for about 15,000 users.
Edit 2: I have removed some irrelevant stuff. This story goes on forever. I have so many anecdotes from dealing with MSFT pre and post OSS glory that I concentrate all my effort on staying as far away as possible.
Having the ability to track the crashes of millions of machines, to find patterns in which drivers are crashing which applications, seems like an impossible thing to replace.
I don't have a problem with this. If asked I will submit a crash dump. If it shows me what is being sent. That's common courtesy. Opt-in information is absolutely fine.
Being unable to opt out, with collection enabled by default, is what is unacceptable.
It's like a consent form for a medical procedure. At the end of the day, you're not a medical professional. Is the average person really informed when they do or don't provide their consent?
Nevertheless, consent is still paramount. Removing consent on the basis that most users are incapable of being informed is a poor excuse.
Also, as someone who's been doing tech support since 1995: people here either vastly overestimate the dumbness of others or just happen to have unusually dumb colleagues, friends and whatnot.
Most people aren't really stupid; rather, bad software makes them look stupid and bad tech support shifts the blame to the users.
Why are you setting up a straw-man? Nobody said that Telemetry solves all problems. Every additional piece of information can be helpful. If you don't think that it can be, then really there is a fundamental disagreement that will just result in an endless argument.
> The real answer to your question: ask and listen. People will gladly tell you. Do not just take the data otherwise you end up with a set of poorly selected metrics which do not represent user opinion and a lot of pissed off customers who don't want to or can't tell you due to legislation and data protection.
And likewise simply listening to a vocal minority via "ask and listen" is not a silver bullet.
So, you're both right, you need both to make informed decisions.
See the comment about user panels. Select a random portion of your userbase and ask them. Talk to your account managers. Communicate between them. This is software 101.
I've built and supported software with 80k end users and did that effectively single-handedly.
I'm sure a large corp can do the same if it sacrifices a bit of bottom line...
Finding representative portions of your userbase that will actually talk to you is pretty difficult. I've worked on a couple different products with millions of end users and we frequently discovered a subset of our userbase was having big problems and we simply weren't hearing from them.
> Select a random portion of your userbase and ask them.
This sounds great in theory - harder to do in practice. Often what ends up happening is the only people who will share their time with you are the ones who want something specifically changed for them. Thus my point, it's effective, but it's not a silver bullet.
> I've built and supported software with 80k end users and did that effectively single-handedly.
And plenty of businesses have used Google Analytics, Mixpanel, etc. combined with the aforementioned technique.
TL;DR - The two strategies are not mutually exclusive.
Problem is, the asking strategy is rarely used (telemetry is cheaper!), and as far as I can see from the consequences, telemetry data is hard to use correctly. In particular, it's prone to become a mirror of your design, creating a feedback loop. Telemetry will show people using more of the things you've exposed more and less of the things you've exposed less, so if you take that at face value, you're just amplifying your own initial design bias.
The user spent an hour fiddling with settings. Is that because they love the new settings toolbox? Or is it because they were very frustrated with it and couldn't find what they were looking for?
This could also be done by reading their forums, and reaching out to people.
For example, I have an Xbox One controller. It used to pair fine via Bluetooth. It still pairs fine with my Mac. Other stuff still pairs fine with my PC. But it just won't work after a Windows update.
What is telemetry going to tell them that they don't already know from the forum? Maybe the scope, but it's fuzzy. Some users might give up after one or two tries. Some users might be using the "Add hardware" box several times in a row for different reasons. Telemetry isn't a magic insights thing. It's difficult to get right, and to draw the right conclusions.
One thing's for sure, telemetry's cheaper than QA-ing updates properly.
Telemetry is generally a key component for quality assurance. While a paid tester can file bugs about "it crashes" or "it takes 10 seconds to load", a regression taking load times from 100ms to 200ms will be very hard (if not impossible) for a tester to notice and file a bug about. It will only show up in your telemetry.
You could argue that telemetry should then only exist in your beta channel or testing builds, and some developers do that. It's silly to argue everything can be caught by your QA team, that is simply not true for online services. In the past projects I've worked on have had long-standing bugs that took weeks of ongoing effort between both our paid QA staff and customers to finally identify reproduction steps, at which point we were able to examine telemetry for those reproductions and fix the problem.
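A toy illustration of that point, with entirely made-up numbers: a 100ms → 200ms load-time regression that no human tester would file a bug about is obvious once you aggregate timing samples from before and after a release and compare a high percentile.

```python
def p95(samples_ms):
    """Return (approximately) the 95th-percentile sample."""
    ordered = sorted(samples_ms)
    idx = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[idx]

# Fabricated telemetry: load times clustered near 100ms before the
# release and near 200ms after, with a little jitter.
baseline = [100 + (i % 7) for i in range(200)]
current  = [200 + (i % 7) for i in range(200)]

regressed = p95(current) > 1.5 * p95(baseline)
print(regressed)  # True
```

In a real pipeline you'd compare distributions per release channel and alert on the delta; the sketch only shows why the signal exists in aggregate when it is invisible to any single tester.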
"Pairing is failing on device with A1B2C3 Bluetooth controllers on driver versions 8.2 and 8.3, but not 8.4"
"This is happening for 100% of users with the B2C3D4 controller and is likely a driver bug, but has happened only twice on the C3D5 device, both for the same user - likely a hardware failure"
A lack of feedback from everyday users is not a reasonable justification for the carte-blanche exfiltration of data from those 'reticent' users.
There are dozens of ways to get feedback from users, but most of them require the company to pay for them. Companies are as bad as your average Joe in this fashion; why pay when they can simply pretend that privacy and data protection laws don't exist and just take?
I said it elsewhere in this topic: My purchase of your software does not give you the right to exfiltrate data from my system. You're welcome to ask for it, or to pay for it, but in no way is it yours to just take.
They do "pay for it". That's why the Home edition of Windows went down from $239 with Vista to $199 with Windows 7 to $139 with Windows 10. Sooner or later, I suspect they will have a "free" (ad supported) SKU.
Windows 10 is already ad supported. Unless you find the right options to opt out, you're bombarded with ads in almost every aspect of the UI.
Additionally, mandatory telemetry was in no part of the purchase process (well, neither were ads). Instead, it's in a completely separate clickwrap 'agreement' (that's subject to change without warning) that's only made clear when you install the software.
>That's why the Home edition of Windows went down from $239 with Vista to $199 with Windows 7
One thing you're omitting: Windows XP Home was also only $199, just like Windows 7. Vista seems to be an outlier in terms of price.
Also, all the prices you've listed are for the full version (ie. not upgrade). The upgrade prices are much lower, and are in line with the current price for windows 10, which does not have separate pricing for upgrade vs full. You can still interpret this as a price drop, but most people get their computers through OEMs, and so aren't paying retail prices. I suspect people who build their own PCs also tend to not buy legitimate licenses. Also, AFAIK the checks for the "upgrade" version aren't particularly rigorous. You could install a pirated copy first, leave it activated, then do a clean install, and it wouldn't complain about the licensing.
If you have a problem with "telemetry" becoming a bad word among technical audiences, take it up with the developers and product managers who insist on surveilling users without opt-in, and without an option to opt-out, and who then decided to use a euphemism for this behavior: telemetry. This user-exploitative behavior is eroding any veneer of euphemism from "telemetry" and revealing the surveillance at its core. Highly technical audiences have decided to take the battle directly to the word their opponents chose.
"Surveillance" has already been thoroughly scorched by the reaction to the US government's broad violation of the Fourth Amendment. Talk to anyone in the defense industry. Now, rather than internalizing that many people don't want to be spied on, product managers are deciding to double-down on surveillance but use euphemisms. Don't whine when people respond by scorching the new word as well.
They could call it "logging" next year and we'll start tarring and feathering the word logging. The issue is not the word, but the behavior it represents.
Gathering detailed usage behavior of applications must be made optional to the user, as doing so without opt-out is decidedly hostile to the user's privacy.
You say it is impossible to get feedback from programs without a vocal minority being dominant. This is untrue on the surface since providing the option to disable telemetry removes a minority of users, probably a set highly correlated with the vocal minority you are concerned about. So if they have something to let you know, they'll probably contact you directly—the old fashioned way.
As others have pointed out, there's also no compelling evidence that software is better since the advent of widespread telemetry. Telemetry so often lacks context. You don't know what the user was trying to do; only what they did. Just because a feature is used a lot, that doesn't mean it's a good feature. It's merely what users have found in your software that approximately does what they intended. What's unseen, what can't be seen, is intent. You can't (yet, thankfully) measure the reluctance or happiness of the user as they pressed the button.
Even when working ideally and observing willing users, telemetry has a nasty habit of navigating products to local maxima at the expense of more quickly finding significantly better options.
This looks like a straw man. Automatic collection and transmission of usage data is not what is meant by "giving feedback". Giving feedback is manually opting to provide a statement about the experience of using the software. Giving feedback is uncontroversial.
> Giving feedback is manually opting to provide a statement about the experience of using the software
Which almost nobody does, unless they are either completely unsatisfied with the software or completely in love with the software (just like TripAdvisor feedback)
> Which almost nobody does, unless they are either completely unsatisfied with the software or completely in love with the software (just like TripAdvisor feedback)
Good point. Better build in a surveillance engine to spy on your users. /s
At least in the enterprise space, many users won't provide feedback because they feel the product teams won't do anything with the feedback, since someone above your pay grade agreed to pay for the software -- ergo, your opinion does not impact the bottom line.
I work in enterprise software, and I feel like there are two types of feedback. One is end users who provide accurate, actionable feedback on how we can make their job easier, and the other is end users who are mad they have to use software at all. Unfortunately the latter are the ones who complain openly in a way that just makes people angry but gives us, the product teams, no real direction besides writing AI that entirely replaces their job in the end.
In the enterprise space (i.e. Windows 10 Enterprise users), users can fully opt out of Windows 10 telemetry, unlike everyone else but Windows 10 Education users. (Someone asked in the GitHub issue how that tool respects the various control levels over telemetry that Windows 10 offers, but the issue had turned into a shout-fest, so I'm not expecting the rank-and-file MS dev dealing with that issue to want to respond there.)
This seems like an issue with how the product presents that choice or ability to users.
For example about half of the people who make it to the end of my video courses write in and give me feedback based on a 7+ question form I ask them to fill out at the end of the course. I ask very specific questions that could likely be answered in 1 minute or less each. Out of thousands of submissions, a huge majority are positive.
I don't ask for feedback or anything early on, and do my best to avoid giving someone extra "work" to do. I present it in the form of "hey, I see you made it to the end of the course, your feedback is helpful so that my next course is even better aligned with what you want...".
Folks are happy to provide feedback in that case.
I'm one of those people who typically turn telemetry off when I can, because in a lot of cases it's not clear how it'll be used unless I read a 100-page ToS, or I simply don't trust that the company is telling the truth about how it'll be used. I shouldn't feel like I need to inspect my network traffic with Wireshark just to double-check that a company isn't harvesting usage stats about an app I'm using.
There's a reason that gathering feedback from IVR and web sites is a multi-million dollar industry. It asks people, and they respond.
There's a sports quote about "You miss 100% of the shots you don't take." The tech industry has to learn that it misses out on 100% of the feedback it doesn't ask for.
You're right, there is a negativity bias in reviews. That's mostly because people don't want to spend their free time and effort on making your product better.
Hire people specialised in QA, send someone over to big customers to observe how people use your product, and ask/pay a customer to let you interview a random selection of the people using the software.
Or just stalk your customers. It doesn't require human interaction, it's cheap, and probably not illegal enough to actually get fined. Who cares about the actual opinions of your customers when you can just interpret some carefully selected dashboards, right?
A user giving feedback has already paid for the service in one way or another. You might counter with future services, but that's a function of market share, which is not served by customer satisfaction but by customer convenience.
Data mining is about using the customer as product, not improving yours.
By asking! Every company under the sun stopped asking for feedback, stopped doing user testing, deleted their emails, put scripts on the phones, and redirected “Contact Us” to “Help Articles”, and you blame the users for not getting in touch?
Telemetry isn’t about better user feedback, it’s about cheaper user feedback, even at the price of quality and ethics.
For me, the problem with Telemetry is there's no way to control what information is being sent back. This applies to all software companies that want to capture "metrics" about the use of their software.
Additionally, what's to stop Microsoft from turning their telemetry data into sales or marketing data?
> how can software improve if the majority of people actually using the software don't give feedback at all?
Software, especially Microsoft software, became much worse in the past 5 years or so despite their heavy push on telemetry. Therefore I don't think telemetry is a magic bullet that will make software better.
Simple things that used to work fine. Start menu search, for one. The start menu used to be blazing fast and search was consistent. Nowadays the start menu can be sluggish in certain cases and search might not return local results seemingly at random.
The calculator used to be fast and load instantly, now it's one of those UWP monsters that even asks you to rate it in the Microsoft store...
I don't recall hearing about updates bricking machines or causing data loss at scale back in the Windows 7 days but it seems like that is now a relatively common occurrence, amplified by the fact that you can no longer hide/defer updates on consumer versions of Windows. I think the firing of their QA team and delegating the work to unpaid "insiders" and telemetry might have something to do with this.
The new Settings UI is absolutely disgusting both in looks and information density and is a clear downgrade from the previous version.
I can go on and on. I would sympathize if they were pushing the boundaries of software engineering but what we're talking about isn't groundbreaking - these are problems that were mostly solved a decade ago and Microsoft intentionally backtracked on their progress by the looks of it.
> I don't recall hearing about updates bricking machines or causing data loss at scale back in the Windows 7 days but it seems like that is now a relatively common occurrence,
This could also be explained by user expectations for software rising but quality of Microsoft code remaining constant. In the past users may have written off such events as 'just the way computers work sometimes' but perhaps now users realize that computers needn't be so unreliable.
> This could also be explained by user expectations for software rising but quality of Microsoft code remaining constant.
I disagree. Evidence that supports MS code quality dropping includes a significant number of users hanging on to Windows 7 with their cold dead hands even after years of MS marketing, arm twisting, GWX updates, and the EOLing of Windows 7, with users paying for ESUs via Ask Woody vendors and/or that 0patch tool.
I myself moved to Windows 8.1 and from there am hemming and hawing over whether to use KDE Neon or Linux Mint XFCE, leaving Windows behind except for the air-gapped Windows 7 VM I will no doubt need for things like Anime Studio. I will not allow Windows 10 (outside work devices) on my home network.
(Maybe for Centaurus aka Courier Jr...but I'll put it on the guest wifi and make a bunch of throwaway accounts for it. )
I agree that Microsoft's software has gotten worse. I don't agree that user complaint rates are necessarily evidence of this, since users have evolving expectations of software.
How long does it take after boot for disk I/O to stop if you've got a 5400 rpm hard drive? It's maybe a few minutes after login on 7, and I've never seen it stop on 10.
Why does the calculator take seconds to start? Why does it ignore keypresses when it has focus?
How many clicks does it take to set my IP on an isolated network with no DHCPd? How many different control interfaces will I see on the way?
On Old Edge (I haven't tried Chrome based Edge), why does the stop button sometimes not stop until the page finishes loading over several seconds? Why do the back and forward, and url navigation interactions queue up in that case?
I'm sure this was your experience, but it was not mine:
>How long does it take after boot for disk I/O to stop if you've got a 5400 rpm hard drive? It's maybe a few minutes after login on 7, and I've never seen it stop on 10.
Unsure what you are saying. Windows is the only OS to write to disk after boot? I don't see anything hitting the disk after a few seconds.
>Why does the calculator take seconds to start?
Instant
>Why does it ignore keypresses when it has focus?
unable to reproduce
>How many clicks does it take to set my IP on an isolated network with no DHCPd
4
>On Old Edge
I don't use that particular piece of software so I can't tell you.
It's "instant" to show up, but you got to wait one to two seconds for the splash screen to disappear, whereas the windows 7 showed up instantly and was usable immediately.
> Forced reboots for updates even if the computer is in the middle of a long-running task
Oh God, don't get me started. A few days ago I was in the middle of a videoconference with some important people when suddenly my screen went blue, flashed a Windows spinner and the word "Restarting", and boom. It just rebooted. With no warning, despite me being on a video call, on Microsoft Teams of all things! And within the "active hours". How this behavior is acceptable is beyond me.
I'd recommend checking whether you have third-party code forcing that restart - for a pretty long time now, Windows 10 has had a policy of not restarting for updates unless it can't detect user activity for an extended period, and even then it defaults to updating at night.
I came pretty close to playfully trying to have Microsoft cover a portion of my monthly electric bill.
Why? Because I set my dual-GPU gaming rig to sleep every night, and Windows feels the need to wake it up about an hour later, fail to do updates, and then leave the machine on the rest of the night, even with auto-sleep set for 30 minutes.
I noticed the problem a few weeks later, and by then my electricity bill had increased by 20 bucks.
That happened to me a few days ago. It was at like 5pm when my active hours were set to 10pm. It didn't even give a countdown; it just restarted in an instant. I was so annoyed, as I was installing a game at the time and it completely corrupted the installation.
I agree, they should add some kind of option to disable updates for Pro users, even if they warn you. For me, W10 has always given me an option to defer updates and let me pick a time. At work, we use WSUS to manage our machines and my W10 box regularly gets months of uptime. Crazily enough, we even have a W10 laptop (!) that does some critical stuff with a year and a half of uptime. https://i.imgur.com/uRGmhTU.png
I think people tend to mean much worse "than before".
For example, I have seen videos of MS Word and MS Visual Studio, on an old Pentium, load instantly with the splash screen flashing by. I was truly impressed indeed.
You needed a pretty hefty machine at the time for the splashscreen to flash by.
For example, Word 6 on my 386DX with 4 MB RAM took some time to launch and I had ample time to admire the art that went into the splashscreen. On 486DX with 16 MB RAM, it did flash by.
They made it impossible to opt out of telemetry unless you buy enterprise licenses and run a domain controller.
Old Microsoft software had a simple toggle switch for this.
By default, windows 10 lets Microsoft engineers remotely log into your box and browse your filesystem. They say they only use it for diagnostic purposes, but I don’t see how that could be true unless they’re in violation of US law, which compels them to give the same access to law enforcement.
I’m not sure if you can opt out of that (or whether the opt out would survive a warrant).
I switched away from windows over this sort of thing. There were dozens of other objectionable things they were caught doing, and efforts to build windows 10 “decrappifiers” made it clear they were adding new telemetry every month, and laundering the data through sock puppet domains.
It looks like you can disable it, but “Full” telemetry (in Microsoft’s words) includes:
> Full: All data necessary to identify and help to fix problems, plus data from the Security, Basic, and Enhanced levels.
In the Windows 8 days, they claimed that engineers couldn’t silently pull individual files from machines without managerial approval. I can’t find the source. It was some old news article with an interview with a Microsoft manager.
Anyway, “All data necessary to identify and help to fix problems” pretty clearly implies they can pull whatever they want as they debug. I don’t see how they could implement that without exposing customers to warrant requests.
This page outlines everything additional they receive on the Full setting.
> In the Windows 8 days, they claimed that engineers couldn’t silently pull individual files from machines without managerial approval. I can’t find the source. It was some old news article with an interview with a Microsoft manager.
I recall reading something similar, but for Windows 10. AFAIK it said that engineers diagnosing a difficult problem can select a group of machines to receive raw telemetry from, after getting permission from managers plus Microsoft's privacy team. I have a feeling it was for insider builds only, though.
That article is talking about Remote Assistance, which lets you explicitly grant temporary permission to someone you trust (not just a Microsoft engineer, but anyone you choose) - and you can see what they are doing because you're sharing your screen.
The GP comment seemed to imply that Microsoft engineers could log in remotely without your knowledge or consent.
>"By default, windows 10 lets Microsoft engineers remotely log into your box and browse your filesystem."
This is correct; AFAIK they need a password/acceptance from the user, that's the proviso, but the original comment didn't say "without anyone knowing" (and as it's closed source, none of us knows for sure). Their quoted claim is true, it's just of very limited value.
This whole thread is going nowhere.
The first question should've been "yes, but can they do it without a password or user-acceptance". The answer is "we don't know" AFAIAA.
Discussing problems with software is fine for Hacker News, but claiming that there is no point because there is a cabal that hides answers requires evidence which should probably be directed at the site moderators instead.
In my experience most companies are not investing enough in QA. A lot of companies are moving away from dedicated QA roles and falling back on tests that devs write and crash reports. Both are good to have but not a substitute for proper QA.
And I would say most software today is incredibly buggy. Almost every major piece of software I use now from large, well known companies is just rife with bugs.
spyware is software that spies on other processes.
Usage data is a better name for what gets returned in most telemetry.
I would much rather genuine telemetry be supported, so it can be reported in a way that doesn't allow a "you agree to this" hook to be used for both innocent telemetry and problematic telemetry.
Throwing your toys out the pram at any usage data going back is harmful not helpful. It will mean that the bad actors will win because they'll be the only ones who have the data to improve their products.
Or everything will go SaaS, so you'll get desktop "software" that's nothing more than a shell making HTTP calls back to a backend so all the usage gets tracked there, and it'll be slow as shit for the privilege.
> spyware is software that spies on other processes.
I don't know where you heard that, but it is absolutely untrue. There have been anti-spyware apps dating back to the late 90s and before, and they were essentially AV scanners with a slightly different focus.
> spyware is software that spies on other processes.
The New Oxford American Dictionary defines it as "software that enables a user to obtain covert information about another's computer activities by transmitting data covertly from their hard drive"
Without consent / explicit opt-in, this describes most telemetry perfectly.
> It is impossible to get proper usage feedback from your programs without being swayed by the vocal minority community.
There are tons of other products in this world that don't rely on surveilling their users' every single move to solve these problems and improve the product.
Besides, there is a huge assumption that the data gathered will be used only to improve the value of the product for the user. Considering that the two sides of the market, the buyer and the seller, are in an adversarial dynamic when deciding the price point, it is irrational for the seller not to use this information to increase their profitability. They might as well, and indeed do, use telemetry to cut their costs without moving their price: for example, by cutting support for existing but under-demanded features, by allocating resources that could fix bugs in an existing feature to features of the next product, or by engaging in extractive behavior such as upselling new products. All of this shortchanges the end user.
People give feedback on Windows. The web is full of Q&A forum posts with people struggling with all sorts of issues, including in forums run by Microsoft.
Some problems get reported for years without any fix in sight.
Like, stop the telemetry nonsense and read the darned forums, starting with your own?
I'm a bit confused about what it is you think has been abhorrently bastardized.
There are
1. well-behaved programs that do what I tell them to do,
2. programs that surreptitiously send my data to an external party without being told to.
I thought we were calling the second category of behaviour "telemetry" (or "spying" when we're not being polite). If not, what is the correct terminology?
One alternative option: there are businesses devoted to gathering feedback from everyday users who represent wide demographics.
Those users participate after providing consent, and are paid for the feedback they offer.
Your comment could be seen to imply that software developers are entitled to behaviour data from users and organizations without their consent (or even awareness).
The best way to understand and improve a product if you lack the ability to gather telemetry or user feedback is to use it yourself and identify areas for improvement.
I really think the world would mostly stop talking about it if all software started respecting a universal environment variable OPT_OUT_ALL_SILENT_TELEMETRY=1. It's not the same as Do-Not-Track because the respecting or disrespecting of the variable can be verified independently, by the user, on their own machine. We'd only hear about the disrespecting software.
If, on reflection, you come to the reasonable conclusion that the entirety of planet Earth, if fully informed and made to understand, would opt to set OPT_OUT_ALL_SILENT_TELEMETRY to 1, why not skip a step and all act as if it were set to 1 now?
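A universal opt-out like this would be trivial for well-behaved software to honor. As a minimal sketch in Python (OPT_OUT_ALL_SILENT_TELEMETRY is the hypothetical variable proposed above, not an existing standard, and `send_telemetry` is an illustrative stand-in):

```python
import os

def telemetry_enabled() -> bool:
    """Telemetry stays on unless the user has set the (hypothetical)
    universal opt-out variable proposed above to "1"."""
    return os.environ.get("OPT_OUT_ALL_SILENT_TELEMETRY") != "1"

def send_telemetry(event: dict) -> None:
    if not telemetry_enabled():
        return  # respect the opt-out: nothing leaves the machine
    # ... otherwise, ship `event` to the collection endpoint ...

# With the variable set, telemetry is silently and verifiably off:
os.environ["OPT_OUT_ALL_SILENT_TELEMETRY"] = "1"
```

The point made above about verifiability holds here: unlike Do-Not-Track, a user can confirm compliance on their own machine by watching for outbound traffic with the variable set.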
Have you found a correlation between software quality and the amount of data sent via telemetry? Do you have examples of great software that was achieved because of this?
This is how it was done before telemetry was possible. Good software was made back then. All our software today is built with or on top of software that was developed without any telemetry.
Today it's much more common to see companies / product managers using telemetry, instead of their brains, and making bad decisions as a result. There are always confounding factors, and they usually dominate. Collecting numbers is easy, collecting the right numbers is hard, product teams don't have time for that. Telemetry ends up mostly being used for excuses for bad decisions.
I agree, most of us have built applications that capture plenty of telemetry data about how the application is being used to improve the product using OTS tools or SaaS solutions. Of course Microsoft is going to do the same thing, they've even built their own tooling for it so it isn't going to some other provider.
Is there a particular company size when you suddenly can't collect telemetry because of the privacy implications? Why aren't they allowed to compete using the same tools as everyone else?
It's pretty obvious that as a company gets larger and more powerful, their potential for abuse grows. This comment is funny considering it's about Microsoft of all companies.
Microsoft's telemetry practices SHOULD DEFINITELY be more heavily scrutinized than other companies.
I don't know, I find the double standards in this space concerning sometimes.
Your website uses Fathom, which focuses on privacy but doesn't have any third-party auditing of that claim (searched their site for "audit", no results). Why do you trust them to do the right thing with my browsing data when I visit your website? What are you using the data for, to improve the site and create content more people are interested in? Why can't Microsoft do that too, to improve their applications?
I don't say all of this to downplay the importance of knowing that things are collecting telemetry and shipping it off somewhere but we can't just have a blanket statement of "you can't do it once you're big enough to abuse it" be our guiding policy either.
Because they've been getting a pass for years and it's painful to give up something that you've been doing. Oh, wait, you're telling me that's not a valid reason?
Video games over the past decade have made big strides in metrics and feedback. There are near-infinite stats on playtime, player progression, play-testing with people, bug tracking, etc. People will complain pretty frequently when stuff doesn't work. Communities spring up helping each other with technical difficulties.
I'm curious how many people here decrying telemetry use Google Analytics or similar on their websites or their employer's websites and think nothing of it. Or have ads on the websites. Or literally just about any 3rd-party Javascript. Or literally just about any 3rd-party resource of any kind.
1) They do give feedback, on forums and tech support etc.
2) you do testing and extensive QA. You don't outsource your testing onto users, but instead hire people to do it.
And yet I remember a time before always-on internet connections where quality software was still being written. I'm not sure telemetry is the answer to these problems.
It is not my job to provide usage feedback, and it is none of their business what I choose to do with my own hardware.
If they want to understand how people use their products they can perform usability research. If they want me to participate in such research, they can offer to pay me for my time. Snooping on me with embedded spyware is not acceptable.
It can't be done; the IP address alone is a problem. And even if there's trust today, that could change at any time. I'd rather not have the lifetime chore of making sure I still trust something I never wanted to have to trust in the first place.
There's no truly anonymous telemetry, there's only anonymous until combined with other data sets. You'd have to only store and process aggregate records (like "# of users using feature of X") and never store individual records, but if your application phones home without my knowledge, my trust is already violated and I won't believe you if you say you only store/process aggregates.
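To illustrate the "only store and process aggregate records" point, here is a hedged sketch (not any vendor's actual pipeline) of counting feature use without ever persisting an individual event row:

```python
from collections import Counter

# Aggregate-only store: feature name -> count. No user IDs, no IP
# addresses, no timestamps, no individual event rows are kept.
feature_counts = Counter()

def record_feature_use(feature: str) -> None:
    # Fold the event into the aggregate immediately and discard it;
    # nothing that could identify an individual is ever persisted.
    feature_counts[feature] += 1

# Simulated incoming events from many different users:
for event in ["dark_mode", "spell_check", "dark_mode"]:
    record_feature_use(event)
```

Even this only helps if the client's phone-home behavior was consented to in the first place; as noted above, the server necessarily sees the sender's IP address before aggregation, which is exactly where the trust problem lives.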
1. Let's collect information on every URL, including every distinct pornhub URL, a user visits.
2. Let's collect information on how many times a user browses pornhub.
3. Let's collect information about how and when a pornhub site crashes our browser.
4. Let's collect information about how and when our browser crashes - without submitting the website that crashed it, just the internal info such as call stacks.
Only 3 and 4 are about fixing bugs, and while 3 would maybe make it easier to reproduce issues by knowing the exact website that triggers faulty behavior, you end up with a lot of information about your individual users that could be abused.
1 and 2 are about potentially making your product better by being able to tell what websites (or features, etc.) users are actually using and focusing on improving those features, or figuring out why they do not use the other great features (e.g. bad UX?). However, this comes at great expense to user privacy.
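The difference between options 3 and 4 above comes down to which fields are allowed to leave the machine. A hypothetical sketch (the field names and crash shape are illustrative, not any real browser's format):

```python
# Hypothetical crash info gathered locally after a renderer crash.
crash = {
    "url": "https://example.com/some/private/page",  # potentially sensitive
    "stack": ["render_frame", "layout_block", "alloc_failed"],
    "build": "81.0.2",
}

def payload_option_3(crash: dict) -> dict:
    # Option 3: include the triggering URL. Easier to reproduce the
    # bug, but leaks the user's browsing activity to the vendor.
    return {k: crash[k] for k in ("url", "stack", "build")}

def payload_option_4(crash: dict) -> dict:
    # Option 4: internal info only; the URL never leaves the machine.
    return {k: crash[k] for k in ("stack", "build")}
```

The call stack and build number are usually enough to cluster crashes by cause, which is why option 4 can deliver most of the bug-fixing value at a fraction of the privacy cost.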
What really makes me mad about Microsoft is how little they tell you about what kind of telemetry they actually collect at which configurable level. I cannot tell whether they do 1, 2, 3, or 4, or any combination thereof, by reading their privacy notice. The privacy notice says (or used to say?) that they can even transmit your files for telemetry/bug-fixing purposes if they feel like it. And they are very unclear about how the data is processed and retained, and for how long.
I'd be happy to contribute some telemetry, depending on what it is. But they refuse to let me configure it, let alone tell me what they collect, why they collect it, and whether they have proper processes in place before adding certain types of data collection, so in this scenario I'd like to opt out completely. But I cannot, because that's not an option. And I genuinely hope they get slapped with enormous GDPR fines for it.
As a counterpoint, Firefox, while not perfect:
- There is an opt-out/opt-in for telemetry/crash reporting
- They show me exactly what telemetry they collect: about:telemetry
(While it's certainly hard to figure out what all that data means, it being there in the first place means that third-party experts can easily access and evaluate it)
It's fine to collect telemetry. It's even fine for it to be active by default, so users have to opt-out, although not my preference. That isn't what we're discussing here.
This telemetry is mandatory. Users are not permitted to opt-out.
Actually, any telemetry that, combined with other databases, can identify a single natural person located inside EU borders must be submitted with informed consent.
This means that the average Joe must understand what information is being sent, and the submission must be opt-in rather than opt-out.
It's possible to do telemetry in a way that does not violate this law, but that means you're not allowed to do more than basic aggregates that can just as easily be collected on the backend. A collection of installed software can easily act as a globally unique identifier because every PC installation is different, so even training a "recommended for you" system that just finds other software that people commonly install along with a certain package must already be opt-in.
Because of the requirement that combining data with other databases (server logs, for example) must not allow for unique identification of a single person in a data set, you really should only be using opt-in with informed consent. "We may use data as described in our privacy policy" is not informed consent, "we will send a list of all software you install, remove, change or update, when you do so and in what way you did it" comes close. Microsoft is severely lacking in this aspect.
The developers add insult to injury by also lying about the theoretical ability to opt out.
While I love WSL2, my biggest beef with the install process is that I had to turn almost every crap privacy-busting feature back on. I had 'decrapified' my Windows 10 install previously, and WSL2 reversed almost all of that.
That's probably a function of WSL2 currently only being available in Insider, no? If you want to use pre-release builds, I think it's fair to expect that you need to enable telemetry.
When it comes to things like this, it feels like old "Linux is a cancer" Microsoft is battling the new multi-platform open-source Microsoft for whether the company should be evil or not.
There is no new Microsoft - this is the "embrace, extend, extinguish" Microsoft of always, betting that the forkable nature of open source will not matter in practice, so far correctly.
Since it’s open source, I wonder if anyone will be willing to maintain a branch that has all the telemetry removed, but is otherwise basically unchanged and so can connect to the normal repositories.
Tin foil hat here. The person posting this is the lead maintainer of Chocolatey, a long standing 3rd party Windows package manager.
Although the point is reasonable (why NOT just provide an opt out, like .NET SDK?), it seems to me that there is a potential ulterior motive in dragging down winget.
Probably because the .NET SDK is a separate program, and winget is being slipped into a default component app of Windows 10.
I am irritated about Windows 10's telemetry policy, but it makes sense for component applications to obey the system telemetry configuration rather than each one having their own settings.
Why do we use MS's terminology for it (telemetry) when it is essentially theft of private and sensitive data?
From what I have gathered, they are snarfing up my browsing data and what applications I have installed. I don't have the time or energy to jump through the hoops to figure out how to stop this blatant invasion of my privacy. On top of that, I've read that win10 shows ads in the actual OS. That is so beyond unacceptable that I am stuck on an unsupported OS (7)
As a result, I now have only two windows7 boxes - one for gaming and one that I use as a front end for some specialized audio hardware. Each box is dedicated to that purpose and is relegated to a subnet. 20 years ago, 90% of my computing was on windows - now it is <10%. Soon, I'll get another mac to do all my audio work on and, eventually, I'll relegate my gaming to Linux available titles.
Half-serious startup idea: Telemetry Escrow, where I send data to Escrow, can inspect what it is, then Microsoft pays per datum. Escrow holds all historicals / some samples, depending on tier.
The assumption that windows package manager does not permit opting out of telemetry seems to be wrong.
The whole issue is that it didn't state how to opt out, and someone assumed that meant it wasn't possible. However, based on the updated readme https://github.com/microsoft/winget-cli#datatelemetry there was a way to opt out from the beginning, just not documented.
Yes, it should have been clarified from the start, but I think it's positive that the option was always there and they managed to clarify how to opt out in a mere 7 hours.
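For reference, the opt-out that the updated README describes is a flag in winget's settings file (opened with `winget settings`); to the best of my knowledge it looks roughly like the following, though the exact schema may change between preview builds:

```json
{
    "telemetry": {
        "disable": true
    }
}
```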
According to a recent post, we spend 34 years of our lives staring at screens. If Microsoft wants 34 years of data on each of us, let's collectively bargain on a value for it.
Anybody else really turned off by the unnecessary amount of internet access in these new CLI tools? It's not just telemetry. Some tools will just access the internet even when they don't need to (my favourite is checking for a new version with every command). I like the CLI because it is so minimal and transparent (compared to GUI). But that seems to be changing these days.
I wouldn't mind the Windows telemetry, if it ever seemed like they were using it to actually fix things that suck. However, it's been almost ten years since the Start Menu search has worked at all, and wonky multi-monitor and settings/control panel settings dichotomies persist.
Some have posited that this official package manager will be a death blow to Chocolatey, but I doubt it.
I don't use Windows, and the only Microsoft product I have installed is VS Code (with telemetry explicitly disabled), but in spite of that about 10% of all the blocked DNS requests according to my pi-hole are to Microsoft's telemetry servers at watson.telemetry.microsoft.com.
I thought everyone knew it was pseudo-official Microsoft policy to extract as much data as possible. I mean, if you use Windows, you must be used to something like this anyway.
I would have expected Microsoft to stalk their users like always and go for opt-out instead of opt-in (opt-in is, by the way, legally required by the GDPR, but national privacy offices seem unwilling to actually take Microsoft to court).
What's sad is that Microsoft actively lied in its description and privacy statement by stating it's possible to opt out of tracking. This is not possible for Windows 10 Home and Pro users. Advertising that opting out is possible when it's not is a blatant lie.