FTC explores rules cracking down on commercial surveillance and lax security (ftc.gov)
162 points by JYellensFuckboy on Aug 11, 2022 | 77 comments



The other day, in our _private_ parking lot, a car drove through with a license plate scanner attached. Up and down each row. It wasn't marked, and the scanners were bolted on. Civilian plate, early-20s driver. We called the police, but they wouldn't do anything since the car was gone and we didn't get the exact plate number.

Like, what in the actual @#$%? What jackass Silicon Valley startup is funding this operation? It's this kind of stuff, which is illegal, but because it's hard to enforce, people will get away with it. These people have obviously trespassed hundreds of times.


Along the same lines, when I did work for a police department in a small town, we started getting cops from the LAPD who had a very different mindset. They would drive around the theater parking lot, which is a private lot, and ticket expired plates. Most of the people there were teenagers, and the cops knew they were not likely to fight it. The original small-town cops would not resort to petty nickel-and-diming tactics like that.


'Quasi-public property'


If a car drove through with a human driver who was either looking for a specific plate/vehicle they thought might be parked in your lot, or a human who was simply following bad directions, no one would bat an eye or call it "trespassing".

The problem is that politicians and the law fail to recognize that there's importance in the quantitative difference between one human doing something, like reading a license plate, and a computer doing the same thing a billion times.


> or a human who was simply following bad directions, no one would bat an eye or call it "trespassing".

Humans venturing onto private property where they don't belong are regularly greeted by a property owner politely wielding a shotgun and offering to give them directions back to where they belong.

If anything, I think it's the addition of some computer contraption that creates the free pass, like putting on a hard hat: if you've got gear, people are more likely to assume that you're supposed to be there.


Brandishing is actually a crime in most places in the US. Laws vary, but you generally do not have a right to threaten someone's life with a gun if they're not actually threatening you (and a gun in your hands is a threat even if it isn't pointed at anyone). Doesn't matter if it's your property.


Just a nitpick: if you're on private property, it's not brandishing. The other person would in fact be trespassing, provided the property was posted or you told them to leave as an agent of the property. If you are threatening people off your property from your property, that's a whole different ball of wax, and in any scenario there are a ton of ifs/buts.

Brandishing laws come into effect in public areas and are subject to a lot of ifs/buts.


Indeed. Brandishing also generally requires using it to threaten, not merely having it in your obvious possession with the implication that you could use it to defend yourself if the threat escalated.


Sounds like the GP’s instance happened in Silicon Valley based on their rant against SV startups. Silicon Valley isn’t the type of place where the property owner wields a shotgun and asks you to leave because they’ve made guns illegal, made kicking trespassers off almost illegal, or because their armed security guards never let you in in the first place.


More a property of suburbia than of Silicon Valley.


There are private companies that roll around looking for cars scheduled for repossession, and will call in their tow trucks if one is found. You'll also see cops do this in some of the "more affordable" motel parking lots, as these tend to attract the types with open warrants.


In NYC they do this with NYC Marshals riding shotgun.

The vehicles are plain white and say scofflaw patrol.

Here's an article about it https://www.cbsnews.com/newyork/news/seen-at-11-on-the-hunt-...


The art in that article really is unimpressive. No pics of the cameras, no details of the screen used to ID, no pics of anything at all that is interesting and enhances the article. It just shows that even 8 years ago, web journalism sucked.


CBS is a television station, so I would assume this is a transcript of a TV segment (which is how most TV stations post online content).


I'd be shocked if the video segment was any better. However, where's the video segment?


The article is 8 years old.

I feel like multimedia players come and go (unlike the stability of, say, the <img> element), so it's entirely probable that the site looked different 8 years ago, and a migration or two later the video content was lost.


They've probably changed corporate owners in that time as well, reducing their independence, switched to a new CMS, etc.


NYPD does this with their marked cars too (standard patrol cars with an ALPR rig attached fore and aft). Ostensibly it's the meter maids expediting detection of parking violations, but the data obviously has other uses.


Ah, the venerable license plate.

Welcome to the 21st century. Parking garages, theme parks, and malls routinely do this as well. Entering a private parking lot is surely criminal trespass, but most cops don't care so long as your trespasser is gone and no damage was reported.

Most highways, onramps, and many intersections include hidden or overt plate readers as well. Airports use them extensively too. I've even seen them at a gas station car wash, of all places.

The DMV in many states also sells your personal information for profit. Through massive defunding and regulatory capture, the entire license process for a vehicle has largely become a thinly veiled clearinghouse to shore up random state departmental revenue.



Could it have been someone working with a car repossession group?


> It's this kind of stuff, which is illegal

It is? Is it a California-specific law?

It is private property, but there is public access. You could ask them to leave, but I wasn't aware a crime had been committed.


How do you know that's what it was? Why would anyone need anything more than a simple camera for that application? Is it possible it was using LIDAR or some other tech for another application?


Why does it matter? Why is it OK to drive through private property and just scan shit, regardless of the specific technology used?


There are a number of valid reasons. Immediately jumping to the most conspiratorial may not be the healthiest approach.


I'd want to know what was done with the data after the scanning. Perhaps a specific car is being sought and the data is discarded afterwards, which would be more ethical than keeping the data and attempting to profit off of it, regardless of whether it's "legal" and there's some attempt at "disrupting an industry."


Okay — name a few specific reasons.


Yeah, if I wanted to do a license-plate scanning startup, I would just use an off-the-shelf dash cam and a laptop. No need at all for special optical equipment.
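For illustration only, here's a minimal sketch of that idea, assuming OpenCV and Tesseract (via pytesseract) and any cheap webcam or dash-cam recording; the contour-based plate detection below is the standard tutorial approach, not anything a real vendor necessarily ships:

  # Rough sketch of the "dash cam + laptop" idea: grab frames from a cheap
  # camera and run off-the-shelf OCR on plate-shaped regions.
  # The thresholds below are illustrative defaults, not tuned values.
  import cv2
  import pytesseract

  cap = cv2.VideoCapture(0)  # 0 = first attached camera; or pass a video file path
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      gray = cv2.bilateralFilter(gray, 11, 17, 17)   # smooth noise, keep edges
      edges = cv2.Canny(gray, 30, 200)
      contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
      # Take the largest contours and keep the first roughly rectangular one.
      for c in sorted(contours, key=cv2.contourArea, reverse=True)[:10]:
          approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
          if len(approx) == 4:
              x, y, w, h = cv2.boundingRect(approx)
              text = pytesseract.image_to_string(
                  gray[y:y + h, x:x + w], config="--psm 7"  # single line of text
              )
              if text.strip():
                  print(text.strip())
              break
  cap.release()

As the replies below point out, a rig like this degrades quickly with glare, dusk, and plates off to the sides, which is where purpose-built hardware earns its keep.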


Under ideal conditions you'll be able to read a limited area right in front of the camera. If you want to read plates at or after dusk, or off to the sides (e.g., parked cars), you'll need special hardware.

I have a fixed ANPR camera on my property and even with good positioning it can't always read the plates (nor can I, manually), because sometimes sunlight glare blows out enough of the image to hide a digit or two.

I promise you that the people building these systems didn't just fail to think of using an off-the-shelf camera before they went out and deployed the fancy stuff.


And you'd miss a lot of the cars. By hanging them off the sides of the car, they can scan both sides of the aisle in one pass. They also need to be weatherproofed, hence the housings.


Depends what the company's focus is, right? If you want to do something with the data, why not use the already-made, high-quality devices that are relatively cheap? Seems like the important piece is likely using the data. (Unless you are trying to create the devices themselves, of course.)


Property rights you're not prepared to enforce are meaningless. To start with: put up a gate; it should be entirely effective against this kind of intrusion.


Some quick questions for your super outrage.

1) Did they break down a gate to get onto your private property?

2) Do ANY other folks use "your" private property? Are you falsely claiming ownership rights to something that has a form of common ownership?

3) When you claim it's your property and entry is forbidden - has ANYONE else given permission (explicit / implicit) for non-property-owners to access? Think of things like family members of an owner, guests, UPS drivers, delivery drivers, maintenance folks, etc. Is it clear that anyone who is not an owner can't access? This is normally managed by an attended gate system or similar.

4) Are you one of those folks who like to yell at minorities or others in your neighborhood who don't "belong"? We've seen videos of delivery drivers dealing with folks like you who (often falsely) claim ownership of property.

5) If you don't PERSONALLY own the property, have you been explicitly authorized to act as an agent of the property? You'd be surprised at the number of folks like you claiming personal ownership who actually have ZERO rights to trespass a delivery driver etc.

6) You've made a strong claim that it's illegal for someone to come onto the property; CA law protects folks unless they have been trespassed. Has that occurred?

Note: Many property management companies partner with third parties to help manage lots. If you have a shortage of parking or someone actively managing the lot (not you), they may contract for things like expired tags / stolen vehicles and other tow services. They may walk the lot. They can also pay for someone to come and see which cars may be parking regularly without paying required parking permit fees, etc. Many homeowners PREFER that someone sweep their lots.


Arguments are great, apart from: "dealing with folks like you who (often falsely)....".

Let the person answer the questions, and then start the crucifixion?


Great. I look forward to the answers!

Having been around the block a few times with folks who are

a) outraged

b) make ridiculous claims of criminal activity

I've probably gotten a bit too quick to assume they fall into a certain type of person. I look forward to hearing more about this parking lot that is so strictly controlled that merely driving through it is a crime.


Maybe people don't want to live in a world where every minute detail of their life, including where they parked their car, is collected and analyzed by mysterious strangers.


What does parent commenter stand to gain from falsifying their comment? What do you expect to accomplish with your itemized incredulity?


The chance of this person individually owning a parking lot seems very, very remote.

The chance that there was illegal activity in the parking lot, if it was regularly in use by non-owners, seems even more of a stretch.

You can see plenty of videos online of folks "policing" spaces they don't ACTUALLY have the rights to (usually because they don't like someone's skin color).

I'm just saying, let's take the claim of criminal activity, made because someone was simply in a parking lot the OP claims to own, with a grain of salt. A big grain :)


Official announcement on @FTC on Twitter: https://twitter.com/FTC/status/1557733026947543042

FTC Chair Lina Khan [further breaks it down](https://twitter.com/linakhanFTC/status/1557738539202531334):

>1. Firms track & collect data on Americans at a stunning scale—on our location, our health, what we read online, who we know, what we buy.

>@FTC is seeking comment on whether to issue rules aimed at commercial surveillance & lax data security practices.

Senator Markey (MA) has also [responded positively to it](https://twitter.com/SenMarkey/status/1557738870397353984):

>We have reached a crisis point for children and teens’ well-being online. I'm glad to see @linakhanFTC and @FTC working to give users the privacy protection they deserve. Congress must also pass my legislation to further protect young people online. We must act now.


Sign up to comment during the public forum here: https://www.ftc.gov/form/anpr-registration

Event details here: https://www.ftc.gov/news-events/events/2022/09/commercial-su...

Sign up now because there is a limit to the number of comments they'll allow, and it's first come, first served.

At the bottom of the event page, they say a comment can be submitted at this link https://www.regulations.gov/ but I can't find the corresponding document for this topic. Am I missing something?


The FTC is going to be limited in what it can do unless Congress passes legislation specifically allowing it to regulate this. Due to a recent SCOTUS decision, federal regulators are allowed much less leeway in how they can interpret their powers as granted by Congress.

This is in principle a good thing; appointed bureaucrats shouldn't be allowed to create law by fiat. However, it means Congress actually has to do its damn job and stop delegating its powers through lazy and broad legislation. Good luck with that.


I don't think it is a good thing in principle. Much of the point of creating agencies like the FTC is to prevent Congress from having to deal with the minutiae of every single thing that ought to be regulated.


Of course it's a good thing. We in theory have a balance of powers between the branches for well-founded reasons. Give too much power to one branch and it's easy for them to rule by themselves and create tyranny. Unelected officials shouldn't be able to create law. That power should remain exclusively with the legislature. Give it to people who are not accountable to the voters and they can and will start working against the will of the people. And it will happen quite easily.


These problems should be addressed at the OS and device level first. Policing app makers is not enough, and there are far too many app makers involved in a revolving door of deniability at the individual application level.

Device and OS makers should transparently and clearly define what features can be accessed by apps, and allow them to be easily administered and disabled by device users... Period. Then penalties for abuse and misuse can be addressed for app makers with severe fines.
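As a toy sketch of that principle (every name here is made up for illustration): the platform, not the app, mediates each feature request, defaults to deny, and checks the user's per-app toggle on every call rather than once at install time.

  # Toy model of platform-mediated, user-administered feature access.
  # Class names and feature strings are hypothetical.
  class PermissionBroker:
      """The OS-side gatekeeper: apps never touch hardware directly."""
      def __init__(self):
          # (app_id, feature) -> bool, set only via the user-facing settings UI
          self._grants = {}
      def set_grant(self, app_id, feature, allowed):
          self._grants[(app_id, feature)] = allowed
      def is_allowed(self, app_id, feature):
          # Default-deny: an app gets nothing the user hasn't explicitly enabled.
          return self._grants.get((app_id, feature), False)

  broker = PermissionBroker()
  broker.set_grant("com.example.social", "camera", False)  # the user's toggle

  def open_camera(app_id):
      if not broker.is_allowed(app_id, "camera"):
          return None  # the app must degrade gracefully, not crash or nag
      return "camera-handle"

  print(open_camera("com.example.social"))  # None: denied, but the app keeps working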

Any device a consumer buys should never be used to undermine them financially nor ever in terms of their personal privacy beyond basic analytics. That's a well known principle that should never be redefined.

With all of the other advancements in technology to monitor and track individuals that are out there, we should not be personally paying for devices that monitor us and report on us to private companies or anyone else for that matter.


> Device and OS makers should ...

It sounds like you expect them to do this without intervention.

Device and OS makers are not doing this now, and have had plenty of opportunity to do so. If anything, the device/OS situation is getting worse - just look at Windows requiring accounts, recommending apps, and sending telemetry, or ChromeOS embedding Google into your device.

So we do need an external force like the FTC to make change happen.


It's meant as a target for what the focus and outcome of the FTC's work should reflect.

We've already been through several attempts at analyzing policy for app makers; a lot of time has been wasted. Companies already have huge volumes of data on consumers even if gathering was to be shut down now.

Regulation should also address deletion of personally identifiable information that they've already gathered, with a well-defined policy for moving forward. This issue is far beyond the point where regulatory action should have been taken.


> Companies already have huge volumes of data on consumers even if gathering was to be shut down now.

This is a thing that I keep telling people that no one seems to realize. If by some miracle all of the surveillance stopped now, the data already collected would still be effective for some future bad actor to use 10-20 years from now. I don't think that most companies would keep the data for that long, or many for even very long after some theoretical regulation, but they would certainly sell it to someone who would, and the NSA will be interested in copies of what it doesn't have.

If, 20 years from now, in a new town with new friends and a new job, living under a new government, it becomes important that people don't know you are gay, you're out of luck.


FYI: your common-sense idea conflicts directly with other HN members' comparably sensible idea that devices are the property of their owners and shouldn't have any vendor or regulatory constraints on how they're used and what runs on them. And of course the middle-ground people will say that users can just have the option to unlock their devices, and contrarians will respond that any unlock feature becomes a vulnerability to plain-as-day social engineering and therefore defeats the purpose. Etc., etc.

You never know which comments get attention, but you might just see your upvote count bounce all over the place today!


A rooted OS can still be on your side, and it's still the burden of the OS to structure how it protects and informs you about what the programs installed on it are doing. This still requires regulation, because the people who produce hardware don't share with the public (and with the people with the skills and interest to create non-user-hostile software) the information necessary to run that hardware, artificially restricting the market to their business partners.

The problem is that the government doesn't want you to have control over your phone either.


> because the people who produce hardware don't share with the public (and with the people with the skills and interest to create non-user-hostile software) the information necessary to run that hardware, artificially restricting the market to their business partners.

That's a claim that needs some supporting evidence.

To my knowledge the vast majority of hardware documentation is publicly available.


I'd like a copy of the following:

  - Broadcom processors'/GPUs' documentation/specifications
  - John Deere's ECM/BCM datasheets/specs
  - Nvidia's firmware documentation
  - Full, unredacted documentation of 'all' opcodes in x86_64 from Intel

I assure you, your knowledge is dead wrong.


Do you think that the reason people aren't porting Linux to phones the second they come out is that they don't have the time, skills, or desire? It's because they have to reverse engineer everything.


Understandable. Allowing phones to be rooted is an essential freedom that should be protected as well, to preserve individual innovation and to drive constructive (positive) market competition.

I'm okay with imaginary online point fluctuations, it's a small price to pay. Thank goodness it's not a reflection of anything real like my personal savings... hah.


> Device and OS makers should transparently and clearly define what features can be accessed by apps, and allow them to be easily administered and disabled by device users... Period. Then penalties for abuse and misuse can be addressed for app makers with severe fines.

The problem is that the device and OS makers are also app makers, and can often circumvent the protections forced down the chain with private APIs and hidden features. Yes, app makers can do evil things, but so too can those vendors below them in the stack.

Also, trying to blame and fine will eventually eliminate open-source and free solutions, because it will force members of open-source projects to accept liability.


Regulation can be defined to exempt certain aspects of app making. It's not just an on/off switch.

That being said, even an open-source developer can potentially conduct info gathering and/or do serious damage to any consumer that installs their app, so in many cases no offense would be cited if there isn't malicious intent or negligence involved.

We're far past the point where regulation should have been in place. A serious example should be set to create a proper message about this type of activity by private companies and individuals. It is really not necessary for private companies to gather this personal info on individuals for any app. It should be in everyone's best interest to end this espionage-for-profit activity, even if it devastates the opportunistic industry that activity created.


> These problems should be addressed at the OS and device level first. Policing app makers is not enough,

I read the FTC press release and it talks about companies in general and data in general and doesn't even mention apps, devices, websites or advertisers.

Not sure why you think they are only going to be policing app makers?


> Any device a consumer buys should never be used to undermine them financially nor ever in terms of their personal privacy beyond basic analytics. That's a well known principle that should never be redefined.

What does it mean to "undermine them financially"? Offering just enough of a discount that they'll purchase a product that they probably shouldn't?

What are "basic analytics"? What other kind of analytics are they contrasted against, and what would make them no longer basic?


Narrow scope scenarios don't properly serve the discussion.

If Dunkin' Donuts (a coffee shop) runs an app and gathers data about your purchases linked to your name and ID, they can at any time later sell that data to life and health insurance companies, which in turn can use that data to justify charging you higher rates when you sign up for a life insurance or health care plan. That may not be happening now, but it could easily be rampant in the future in thousands of ways, and it's just a minor example of how people can be undermined financially by personal data overreach within private companies.

Social media companies have a lot more data than that if they consistently track users by location (under the false guise of targeted marketing), and there's no real public awareness nor understanding of these issues to this day.


What kinds of things do you expect to be gated behind permissions? How do you plan to educate users on how to grant these in an informed way?


It's not just about gating things behind permissions, it's also about exposing them in a reasonable way.

Looking through the list of apps that want to "Control my computer using Accessibility Features", the attack surface is just too damn high: https://imgur.com/a/1CBpSWQ


Well now... That's a whole term paper that someone would normally be paid to write...

Methods like encrypting files, creating proper storage segmentation/isolation for each individual app, ending the practice of adding "bloatware" to devices, allowing for concrete disabling of camera and microphone access for all apps (and also ensuring that app makers don't break app functionality when those features are disabled), eliminating in-app purchasing, and ensuring that app stores clearly define app pricing and app maker credibility.

Those things are just some of the first steps that need to be taken. Educating users is not involved in those steps. The way apps are installed and operated these days is far more confusing than making changes in order to protect their privacy as a default setting.

TikTok and Facebook don't need access to cameras and microphones when the apps aren't actively triggered by a user to record something... Somehow the apps require the permissions to be enabled the entire time the app is being used.

Every company should be required to comprehensively report on what data they track and be held responsible for upholding that by a government watchdog, with extreme punishment for misuse.

We also have to understand that when we speak about devices, we're not just talking about phones; we're increasingly talking about cars, thermostats, home security systems, TVs, and many other consumer-bought devices that give companies an endless number of methods by which they can wiretap consumers, then sell that data, or even later use it toward more harmful purposes like corporate espionage and extortion.

Microphones and cameras were recently found hidden in televisions when consumers had no idea they were there, as an example of how far info overreach has gone.

I'd recommend looking deeply into the integration and use of LIDAR on phones... Most people don't even know whether it is a feature on their phone or on certain cars... It can be leveraged in many deeply invasive ways against individual device users if it is accessed by social media companies, or even worse if a data breach occurs.

First, they (the FTC) should hire proper consultants to properly present the issues involved (both cynics and optimists), and not just lean on a basic understanding; there is a wide range of devices and features, combined with tons of different apps and use cases for them. Resolution is not a simple issue that can be summarized within a few posts online.


Wait, I think I've seen this before. It starts with a good idea, but watch as the outcome of all this is a new bill congress passes that requires everyone to walk on the left side of the street in states that start with a vowel and includes $90 billion in funding for desk manufacturers.

Jokes aside, I'm really happy to see this. Privacy is really important and companies need to avoid collecting data they don't need as most of them end up 'hacked' at some point and every company shares their data with all the other SaaS they are built on. Also, I've needed a new desk for a while now.


I’ll throw out a practice that I’m really wary of - many bars, clubs, casinos, etc. now scan your ID upon entry.

If they have a problem with you (and they define what constitutes a “problem”) you can end up added to what is essentially a secretive blacklist that’s shared among businesses, and you have no visibility into or recourse against it.

I understand that I’m entering private property and I have the right to not go there. I don’t think that’s a valid argument against this being a terrible idea.


Which is the terrible part? Scanning your ID, or the secretive blacklist?


I mainly have a problem with the collection and storage of my personal information in a way that could impact me in the future and that I have no insight or control over.

Scanning an ID to get back a confirmation of its validity and not retaining any data in the process isn’t something I have an issue with.


What are some potential solutions? Regulating and standardizing scanner apps?


Since we’re talking about what the FTC could do here, prohibiting these businesses from retaining or at least sharing data on their patrons could be a start.


Whichever one you're more willing to argue for, they're both justifiable yet dystopian horseshit.


I'm excited at the prospect of getting more control of my privacy. I get calls, texts, emails to my personal account about jobs. No clue how they know I'm a physician, and God knows what else they know about me. I'm forced to assume the state medical board sold my info to some third party recruiters.

IDK, maybe the FTC should suggest that China may own some of these tracking companies/apps; that would get immediate bipartisan support.


They do! TikTok is just the beginning. Look at the games, etc.


I'm sure this is aimed at this data being used for marketing, but hopefully it will address providing this surveillance data to the government as well.


Oh, maybe Chase will finally have to figure out why they send notifications about my account to an email address that isn't anywhere I can see, edit, or change. And why only intermittent notifications go there; I actually do get some.


This must be too good to be true, right? What are the odds any meaningful action comes from this?


>The Commission voted 3-2 to publish the notice in the Federal Register. Chair Khan, Commissioner Rebecca Kelly Slaughter and Commissioner Alvaro Bedoya issued separate statements. Commissioners Noah Joshua Phillips and Christine S. Wilson voted no and issued dissenting statements.

That is a very interesting spread.

Statements from each (notable excerpts)

Chair Lina M. Khan. https://www.ftc.gov/system/files/ftc_gov/pdf/Statement%20of%...

>The data practices of today’s surveillance economy can create and exacerbate deep asymmetries of information—exacerbating, in turn, imbalances of power. And the expanding contexts in which users’ personal data is used—from health care and housing to employment and education—mean that what’s at stake with unlawful collection, use, retention, or disclosure is not just one’s subjective preference for privacy, but one’s access to opportunities in our economy and society, as well as core civil liberties and civil rights. The fact that current data practices can have such consequential effects heightens both the importance of wielding the full set of tools that Congress has given us, as well as the responsibility we have to do so. In particular, Section 18 of the FTC Act grants us clear authority to issue rules that identify specific business practices that are unlawful by virtue of being “unfair” or “deceptive.”10 Doing so could provide firms with greater clarity about the scope of their legal obligations. It could also strengthen our ability to deter lawbreaking, given that firsttime violators of duly promulgated trade regulation rules—unlike most first-time violators of the FTC Act11—are subject to civil penalties. This would also help dispense with competitive advantages enjoyed by firms that break the law: all companies would be on the hook for civil penalties for law violations, not just those that are repeat offenders. Today’s action marks the beginning of the rulemaking proceeding. In issuing an Advance Notice of Proposed Rulemaking (ANPR), the Commission is seeking comments from the public on the extent and effects of various commercial surveillance and data security practices, as well as on various approaches to crafting rules to govern these practices and the attendant tradeoffs. Our goal at this stage is to begin building a rich public record to inform whether rulemaking is worthwhile and the form that potential proposed rules should take. Robust public engagement will be critical—particularly for documenting specific harmful business practices and their prevalence, the magnitude and extent of the resulting consumer harm, the efficacy or shortcomings of rules pursued in other jurisdictions, and how to assess which areas are or are not fruitful for FTC rulemaking. ... At minimum, the record we will build through issuing this ANPR and seeking public comment can serve as a resource to policymakers across the board as legislative efforts continue. ... [categories include (Procedural protections, Administrability, Business models and incentives, Discrimination based on protected categories, Workplace surveillance)]

Rebecca Kelly Slaughter https://www.ftc.gov/system/files/ftc_gov/pdf/RKS%20ANPR%20St...

>Conclusion The path the Commission is heading down by opening this rulemaking process is not an easy one. But it is a necessary one. The worst outcome, as I said three years ago, is not that we get started and then Congress passes a law; it is that we never get started and Congress never passes a law. People have made it clear that they find this status quo unacceptable.46 Consumers and businesses alike deserve to know, with real clarity, how our Section 5 authority applies in the data economy. Using the tools we have available benefits the whole of the Commission’s mission; well-supported rules could facilitate competition, improve respect for and compliance with the law, and relieve our enforcement burdens. I have an open mind about this process and no certainty about where our inquiry will lead or what rules the record will support, as I believe is my obligation. But I do know that it is past time for us to begin asking these questions and to follow the facts and evidence where they lead us. I expect that the Commission will take this opportunity to think deeply about people’s experiences in this market and about how to ensure that the benefits of progress are not built on an exploitative foundation. Clear rules have the potential for making the data economy more fair and more equitable for consumers, workers, businesses, and potential competitors alike. I am grateful to the Commission staff for their extensive work leading up to the issuance

Alvaro Bedoya https://www.ftc.gov/system/files/ftc_gov/pdf/Bedoya%20ANPR%2...

>Our nation is the world’s unquestioned leader on technology. We are the world’s unquestioned leader in the data economy. And yet we are almost alone in our lack of meaningful protections for this infrastructure. We lack a modern data security law. We lack a baseline consumer privacy rule. We lack civil rights protections suitable for the digital age. This is a landscape ripe for abuse. Now it is time to act. Today, we are beginning the hard work of considering new rules to protect people from unfair or deceptive commercial surveillance and data security practices. My friend Commissioner Phillips argues that this Advance Notice of Proposed Rulemaking (“ANPR”) “recast[s] the Commission as a legislature,” and “reaches outside the jurisdiction of the FTC.”1 I respectfully disagree. Today, we’re just asking questions, exactly as Congress has directed us to do.2 At this most preliminary step, breadth is a feature, not a bug. We need a diverse range of public comments to help us discern whether and how to proceed with Notices of Proposed Rulemaking. There is much more process to come.

Noah Joshua Phillips. https://www.ftc.gov/system/files/ftc_gov/pdf/Commissioner%20...

>Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.1

Christine S. Wilson https://www.ftc.gov/system/files/ftc_gov/pdf/Commissioner%20...

>Throughout my tenure as an FTC Commissioner, I have encouraged Congress to pass comprehensive privacy legislation.1 While I have great faith in markets to produce the best results for consumers, Econ 101 teaches that the prerequisites of healthy competition are sometimes absent. Markets do not operate efficiently, for example, when consumers do not have complete and accurate information about the characteristics of the products and services they are evaluating. 2 Neither do markets operate efficiently when the costs and benefits of a product are not fully borne by its producer and consumers – in other words, when a product creates what economists call externalities.3 Both of these shortcomings are on display in the areas of privacy and data security. In the language of economists, both information asymmetries and the presence of externalities lead to inefficient outcomes with respect to privacy and data security.


Surprised that HN keeps falling for this.

Does no one remember the NSA?


The NSA can get all the information it needs without all of this commercial surveillance.

The NSA can also get all the information it wants from companies while they're still banned from selling it.


But it can't when they're banned from collecting it.



