Commenting only on the speed of response (or the glacial interpretation of it in Panera's case):
For companies operating in the European Union, the General Data Protection Regulation (GDPR) (1) mandates that such breaches be disclosed within 72 hours. GDPR takes effect at the end of May 2018 (~7 weeks from now).
Under Armour, a US-based sports apparel manufacturer that also operates in the EU, recently had a breach affecting 150 million users and went public within 3 days of discovering it (2).
I believe Under Armour's case is the norm we can expect going forward.
I found Dylan's initial interaction with their head of IT security (the very first response) laughably appalling:
Dylan Houlihan <dylan@breakingbits.com>
to Mike, Geri Haight -
Hello Mike et al,
Thank you for making yourselves available. There is a security vulnerability on the delivery.panerabread.com website that
exposes sensitive information belonging to every customer who has signed up for an account to order Panera Bread online.
This shows the customer's full name, email address, phone number and the last four digits of their saved credit card number.
Moreover, the customers are easily enumerable which means an attacker could crawl through all the records.
I can provide the specific details of the vulnerability over email once you respond, but if you prefer (for more security),
I can also encrypt the information with a PGP key you provide me. Alternatively we can hop on a phone call.
Best regards,
Dylan Houlihan
And their response:
Mike Gustavison <Mike.Gustavison@panerabread.com>
to dylan
Dylan,
My team received your emails however it was very suspicious and appeared scam in nature therefore was ignored. If this is
a sales tactic I would highly recommend a better approach as demanding a PGP key would not be a good way to start off.
As a security professional you should be aware that any organization that has a security practice would never respond to
a request like the one you sent. I am willing to discuss whatever vulnerabilities you believe you have found but I will
not be duped, demanded for restitution/bounty or listen to a sales pitch.
Regards,
Mike
This kind of incompetence directly endangers the privacy and security of anyone who does business with Panera. And it's reminiscent of the kind of incompetence that characterized the Equifax breach and other recent high-profile hacks.
Maybe it's time that a subset of IT workers become professionally licensed and liable, like engineers.
I made this recommendation a couple of years ago when a careless sysadmin left a MySQL dump on a public web share. The response I received is still relevant:
>Requiring a license would wind up making such qualified people more expensive to hire, and companies would ignore it and hire those without licenses to save money.
It would be just about impossible to enforce, naturally, and would be like firing the Senior Developers and hiring fresh graduates.
He does hold a CISSP (or so his LinkedIn says), though, and per Wikipedia the DoD, ANSI, and NSA value or approve of it, and its holders earn a higher salary on average.
And we're talking about a director of security with 17 years of security experience here (he also spoke at Akamai Edge 2015), not an ordinary programmer or admin. I'd assume he already isn't cheap to hire with those credentials, and that a company that size doesn't skimp on its directors?
Then again they'll probably lose nothing over this leak and their response.
Being passive-aggressive is even somewhat justifiable if they really receive that many scam attempts, but taking offense at someone asking for a PGP key isn't, nor is repeatedly ignoring Dylan's emails for six days when he asked whether his encrypted information came through.
Plus the whole "we are working on it" followed by eight months of inaction. Did he throw away what Dylan sent him? Then came the "fix" that merely required logging in (with an ordinary customer account) to get all customers' data, instead of exposing it to the open internet. They also told Fox News only 10,000 customers were affected, and treated Krebs like an idiot to the point that he went on a Twitter rant against them, with him and others posting links to other holes (web-accessible admin login panels of various things, etc.) and saying their website should be taken down (which it now is).
On his LinkedIn page it says he holds a CISSP[0] and has had four security jobs so far (Panera Bread being his fourth) between 2000 and now.
He also might have spoken at Akamai Edge 2015 as a security expert (Googling his name turns up an internal "speaker details" page, and the event ID in its URL points to Akamai Edge 2015).
[0] - I've no idea if it's good for anything, but according to Wikipedia the DoD, NSA, and ANSI approve of it, and its holders earn higher salaries.
I suppose there are many ways to choose the subset. Maybe software engineers in specific verticals: medical technology, avionics, etc.? Given the number of security breaches lately, CSO seems like a no-brainer, too.
Solution Architecture is where I would go if I wanted to credential IT folks like engineers or CPAs. Require the license if you are accountable for implementing a solution that hits various thresholds.
Infosec is an area where there is already a problem with credential collectors, and in many places it is just a dressed up audit/compliance function. It’s not a standalone vertical imo.
I've developed a joke law that says the amount of genuine information in a statement is inversely proportional to how polite and wordy it is. E.g., the "we take security seriously" PR fluff.
Perhaps Mike knew that law and that's why he took Dylan's email as not genuine. Perhaps "yo fucka, I pwn'd your shit, tomorrow it's on the dark web if u no patch this link" would be the proper way to inform them of a leak.
Then again some people say it's good they didn't try to get Dylan arrested for "hacking".
Arguably that wasn't a leak, they're intentionally putting that data into an analytics store. Whether that's okay is a different issue, but it wasn't leaked to the public.
That wasn't a breach. A possible MitM != a breach, nor does sending data to an analytics company. I'm not saying they are blameless, but it irks me when things like this are mislabeled; it dilutes real breaches.
A debt collector would apply since they have been contracted by a HIPAA-covered entity and the data they have likely came from that source. Grindr is completely unrelated to another HIPAA-covered entity and any health data you give them is solely your responsibility... so don't.
And he joined Equifax after jumping ship from A. G. Edwards in 2008, presumably because the company was accused of fraud in that same year.
His first security gig was Senior IT Security Analyst at A. G. Edwards and Sons. His only work experience before that was Supervisor of Branch Installations.
This seems unbelievable, but that senior security position was his first IT experience.
I’m sure the only reason that only partial credit card numbers were stolen is that PCI makes it very hard for Panera to store complete credit card numbers (with expiration dates and the security code on the back).
> PCI makes it very hard for Panera to store complete credit card numbers (with expiration dates and the security code on the back)
How about impossible. Storing the CVV is 100% not allowed. Even storing complete card numbers is only allowed under very specific conditions. Any deviation opens them up to liability for related fraud.
> Even storing complete card numbers is only allowed under very specific conditions.
We encrypt these at the app layer, even before putting them into the DB, yada yada. The PCI auditor actually made us restore the DB from backup onto another server and show them the data, to prove that some magical process in the backup program didn't cause them to come unencrypted. They also wanted us to change all corporate email addresses to random characters, ostensibly to prevent spearphishing (we declined that suggestion). My point is that they go to crazy lengths to ensure you're doing this stuff right.
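Panera's exposure of only the last four digits (mentioned upthread) reflects the simplest way to stay on the right side of these storage rules: never persist the full PAN or the CVV at all. A minimal, stdlib-only Python sketch of that idea; the function and field names here are illustrative, not anyone's actual implementation:

```python
import re

def card_storage_fields(pan, cvv=None):
    """Reduce a card to the PCI-storable minimum: the last four digits.
    The CVV parameter exists only so callers hand it here instead of
    persisting it; it is discarded and never returned."""
    digits = re.sub(r"\D", "", pan)  # strip spaces, dashes, etc.
    if not 13 <= len(digits) <= 19:
        raise ValueError("not a plausible card number")
    return {
        "last4": digits[-4:],                             # safe to store
        "masked": "*" * (len(digits) - 4) + digits[-4:],  # safe to display
    }

fields = card_storage_fields("4111 1111 1111 1111", cvv="123")
```

Storing only a truncated PAN keeps the record out of PCI DSS cardholder-data storage scope entirely, which is presumably why breaches like this one expose "last four digits" rather than full card numbers.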
Thank you. I didn’t remember the details and went with “hard” because whenever I say “impossible” somebody will come up with some exception to the rule.
My (potentially limited) understanding: Certification by an ASV doesn't free you from any responsibilities. You could still violate the DSS even if an ASV clears you.
B- How can a company have such a bad response? I think just about every big company has put a huge emphasis on data security. But hey, companies are big and technology is complex, so maybe data leaks still happen. But when they do, how can you treat them with such a lack of care? And how can the director of Security be alerted about this and not fix it? Seems potentially criminally negligent?
"Per my last tweet, Panera issued a statement to Fox News saying the breach only impacted 10,000 customer accounts. Interesting that they had no numbers for me, and yet had this 10k number all ready to go on the same day this was "discovered," eight months after it was reported."
"At the risk of making my job harder (or possibly, easier?) it's clear I'm going to have to write an entire series of blog posts about how not to handle a data breach from a PR perspective. I'm sputtering over here. Gave @panerabread every courtesy and they treat me like an idiot"
"Hey @panerabread : before making half-baked statements to the press to downplay the size of a breach, perhaps you should make sure the problem doesn't extend to all other parts of your business, like http://catering.panerabread.com , etc. Only proper response is to deep six entire site"
Krebs doesn't have to write his own blog series on how to handle breaches (although I might be interested in his version as well); Troy wrote a nice post about it.
Most of the IT managers/directors I've worked with never came from developer backgrounds. They were either an "IT guy" who stuck it out or the "network guy" whose extent of knowledge is seemingly plugging in a network cable.
Between that and the fact that most established businesses I've worked in still treat IT like a necessary evil and a waste of money, I'm not remotely surprised when stuff like this happens. My current company had a data breach, and the IT director swept it under the rug. I contacted my attorney about what I'm required to do (to cover my ass), emailed my managers, and moved on down the road.
I love that the maintenance page has a button labeled "Order Online" (https://delivery.panerabread.com), which is the page/domain broken in the first place!
I'd revise that from "if a breach happens" to "if a breach happens and the CSO demonstrated criminal negligence." The attack surface for security is too large, and it's not fair to hold a CSO of a cafe chain to such a standard when zero-days are also possible. Punish for being negligent, not for being attacked by a zero-day, or something else really obscure.
What if the CSO informed engineering teams, got stonewalled, and, a few weeks later, escalated through the company's risk process (Panera is public, or was before it was bought by a public company, and will have a risk process). What do people here think a CSO does? If your mental model is: "decree that something is safe to deploy publicly, or else forbid its deployment", your model is broken. Most CSOs have an advisory role in the organization, and the real institutional power comes either from engineering or from the CIO.
This security director handled Dylan's bug report badly and deserves the reputation hit he's getting. But if we're going to suggest liability (let alone criminal liability) for security flaws, we should at least have some idea of what it is we're regulating.
If we were running under the liability model, the CSO's final option would be to resign, which sucks. But he's basically in the same situation as any employee being forced to do something clearly illegal. I guess that's a good argument for why liability might not work: you end up without a security team, or you put good people into legal dilemmas they shouldn't have to face.
And a house is different than a school. MIT has an open campus. MIT has a long history of celebrating students who transgress boundaries and go where it is unexpected[1]. I don't have a history of celebrating people who enter my house uninvited.
> Swartz had connections to [MIT]: "He was a regular visitor to the MIT campus and interacted with MIT people and groups both on campus and off. … He was a member of MIT's Free Culture Group, a regular visitor at MIT's Student Information Processing Board (SIPB), and an active participant in the annual MIT International Puzzle Mystery Hunt Competition. Aaron Swartz's father, Robert Swartz, was (and is) a consultant at the MIT Media Lab. Aaron frequently visited his father there, and his two younger brothers had been Media Lab interns." [2]
If a good friend of mine sees my house has the door ajar, and the walls are "covered in graffiti" it would be perfectly reasonable for him to check inside.
>MIT has a long history of celebrating students who transgress boundaries and go where it is unexpected[1]
Only when it's conservative enough and doesn't break the law too much. And not officially. In fact, the very Wikipedia link says:
"Although the practice is unsanctioned by the university, and students have sometimes been arraigned on trespassing charges for hacking, hacks have substantial significance to MIT's history and student culture".
>If a good friend of mine sees my house has the door ajar, and the walls are "covered in graffiti" it would be perfectly reasonable for him to check inside.
Not really. Especially if they know they're not welcome if found inside, and they have no business there.
MIT's official campus guide brags about the hacks as a way to try to attract students[1]. They are advertising the police car on the dome as a positive thing.
I don't see any reason to think they would be upset about him going in an unlocked closet. The previous quote mentions he was part of a puzzle hunt. If he was creating a part of that hunt and used that closet as a part of a puzzle I would think they would have been ok with it. The walls were covered with graffiti. How many years of prison were the students who drew the graffiti threatened with?
Why is a software developer an engineer when it fluffs their ego, but not an engineer when regulation and consequences for failures are necessary?
Yes, if the security failure is grossly negligent, you should face criminal proceedings. As a C level executive, you are responsible for your chain of command.
Is there any evidence that software engineers are protected in some way from criminal negligence cases?
The reality is that it is vanishingly rare for any engineer to face criminal charges for their professional actions. It doesn’t seem to me that software is held to much lower a standard.
Not protected, simply not pursued, although it’s usually outright fraud that is the target of most prosecutions.
Watching the SEC closely to see how many ICOs they prosecute. It was also helpful to see someone involved in their breach response, who attempted to profit from non-public material information, get prosecuted (although that's tangential to the breach itself).
Someone relatively important is going to have to get burned before more software professionals are pursued for grossly negligent security failings.
You misunderstand my point. Are there examples of other sorts of engineers being brought up on charges?
It only happens in the most egregious of negligence cases as it is and even then convictions are rare.
I'm saying your impression that software engineering is protected is wrong, because no engineers (to a first approximation) are brought up on criminal charges.
Lawsuits are commonplace in civil/geotechnical engineering because faulty work has life and death consequences for the general public. To be a certified professional engineer and sign-off on design plans in California you need to pass an exam, after which could result in issues of liability. This law practice defends professionals that may be in a dispute [0]. Here's a breakdown of why engineers might get sued [1]. Here's a case where a company was held liable for damages associated with a construction project [2].
The title "software engineer" without any notion of liability is an exercise in stroking one's ego.
He said “criminal” charges. That is a very high bar.
Software engineers can be held liable in civil suits, as can other engineers even if there is no professional accreditation body for their industry.
It is less common in software than civil engineering for a few reasons, one of which is that customers literally have no problem signing away their liability. No one would sign a contract from a bridge designer that said “this might fall over in a stiff breeze” but that happens all the time with software.
By that extension if a McDonald's drive thru employee accidentally spills hot coffee on a customer, the CEO is responsible and should be charged with assault?
If they create a work situation whereby cutting corners on container safety, protocols, and employee attentiveness leads to injury, I think they are guilty.
And in the modern security context we're pushing deadlines just to race to the latest features with almost no regard for security in the process.
Something has to change. If this kind of negligence were causing similar problems in physical realms there would be regulations.
The tech companies behind these mistakes won't have free rein forever. Every major screw-up is a step closer to regulation, and everyone will cry about it when it happens... But so many companies today don't seem ready to behave responsibly.
Is that grossly negligent? No. Is keeping the coffee excessively hot for cost reasons, thereby causing the customer to receive third-degree burns on their genitals and to win in court? Yes.
While I fully understand that without universal insurance in the US, it may be most expedient to go after someone like McDonald's with deep pockets, I am tired of hearing how shocking and unconscionable it is that coffee could be served at a near boiling temperature.
I make coffee nearly every morning by boiling water in a tea kettle and pouring it over coffee grounds in a Melitta filter. If I poured or spilled it on my genitals, that would be bad. Doesn't make an approximately 200F temperature incorrect though.[1]
I'm familiar with the case, that's why I mentioned it. My point was that although they lost the civil suit, there weren't any criminal proceedings against C-levels. I understand the argument of negligence being as guilty as malicious intent but it creates a sweeping blanket that's hardly fair or enforceable.
I agree with your principles in theory but it's just impractical.
The Department of Justice was able to dismantle Arthur Andersen after their fraudulent audits of Enron. Lots of things that are impractical are possible with sufficient effort. And the government has unlimited resources for those efforts.
You must hold systemic negligence and corruption accountable, or it perpetuates the cycle.
A) The DOJ had been looking at Andersen for years prior to Enron due to irregularities with other major firms like Waste Management Inc. Enron was not an isolated incident.
B) They were prosecuted for the very specific crime of obstruction of justice after they were caught destroying evidence. It wasn't some backlash against a nebulous problem.
C) Their conviction was overturned!
I'm not sure you could have picked a worse example for arguing your point.
They keep the coffee that hot because customers like hot coffee. That's the main reason I get coffee at McDonald's: not because it's great coffee (though it's not bad), but because it's HOT. Half the time I get coffee at Starbucks, it's only a little better than piss-warm.
I don't think forbidding hot coffee at drive-thrus is unambiguously in favor of safety, since not-so-hot coffee encourages people to drink while driving, which could cause an accident. Some people want to drink on their way to the office or home, and others want coffee that is still hot when they get there. The consequence of the litigation seems to be that the former group of customers is privileged, but I'm not certain that is an overall social good even if you prioritize safety - and some would of course be happy to trade off others safety for their own hot coffee.
There seems to be an unlimited supply of people always popping up to "debunk" the "myths" about the Liebeck case who seem to deflect from the fact that it is normal for coffee to be brewed at near boiling temperatures[1] that cause the sort of damage that was at issue. I could burn myself severely while draining pasta too, if I pour hot water all over my pants and don't remove them; it doesn't mean boiling water is too hot for cooking nor that say, a manufacturer of a non-defective pot is to blame.
It's unfortunate, but leaks and breaches happen in programs (which a website is). It's coding; it isn't perfection, and no one should go to jail or be ridiculed because they unintentionally introduced a bug that caused whatever problem arose (WE HAVE ALL DONE IT). This is why it's ideally best to have some sort of peer review and/or buddies reviewing our code for things we don't see before they're pushed into production; unfortunately, this doesn't happen in all cases.
The only crime was not fixing the problem and keeping it a secret AFTER IT HAD BEEN DISCOVERED. In this case, it wasn't the mistake that was the crime; it was the cover-up.
Engineers in other disciplines are held liable for their mistakes. Imagine a civil engineer signing off on a building that then collapses. If the engineer were found negligent, you can bet your ass there would be repercussions. As an engineer, you are at the top of your field, and with that comes a professional responsibility that is important to fully realize. Mistakes are mistakes, sure, but if those mistakes end up enabling criminal activity, then you're fully responsible. It's why the chain of command exists.
> Engineers in other disciplines are held liable for their mistakes.
To be fair, they have several hundred (if not thousands of) years of trial and error, documentation, etc. behind them to (try and) help people avoid the mistakes.
Computer Science has barely 70 years of half-arsed fumbling about.
True but worth mentioning different forces were at stake there and here (although both very dark).
In Swartz's case, the prosecutor was trying to make an example of him because his public university made/is making tons of money by providing information that should be free (or already is).
In this case, I would imagine they want people's info to be leaked and exposed as much as possible, just to have a good reason to fine those for-profit private companies.
Edit: in other words - show me a priest who doesn't want you to sin, or a cop who doesn't want you to break the law, or a doctor who is not fine with people getting sick. Otherwise they would all be out of job.
This is why you should lie as much as you can when dealing with for-profit corporations, especially online. Any information you give them will eventually be available to everyone, because they have no reason to care.
Wow, this story is amazing. The company got notified last August of a 0 day (no authentication needed) to download all customer records, but took no action for half a year. Then a very bad PR stunt led to even more exposure; one can't make this stuff up. It's April 3rd already, right?? I wonder why they couldn't just really fix the problem. It would be interesting to learn more about how they do engineering; e.g., was it all outsourced, and is someone else trying to fix it now? This year is going to be good!
It's hardly a 0 day anymore: they were notified last August of a 0 day in their website, and six months, roughly 6*31 = 186 days (31 for simplicity), later it still was not fixed.
My natural gas provider can't get my bill to print correctly, despite me emailing them for over a year.
Their old 1990s site worked fine. They upgraded to new whizbang bullshit, and a steady stream of emails still can't get them to simply use a CSS print stylesheet. Outsourcing is glorious!
Cases like this are why I think the general public vastly overestimate the capabilities of government surveillance. These same people work at NSA, CIA, etc.
Not to insult the intelligence of these fine agency folk; my point is security is only as strong as its weakest link. And whether public or private, people can make some very weak choices.
Maybe in the future, but there are plenty of companies around Panera's size that had to get into the online-ordering space before SaaS was as big as it is today. Thankfully, much smaller eateries can now use Yelp or Seamless to handle account management instead of rolling their own bespoke systems.
So here's a fun note - as it turns out, the Panera Bread Director of Information Security mentioned in that email exchange worked at Equifax from 2009 to 2013. There's a comment mentioning it on that page, but you can find it just by looking at his LinkedIn: https://www.linkedin.com/in/mike-gustavison-b020426/
Time is a flat circle. Everything that has happened before will happen again. Every time it happens, we will hear "Security is our top priority" or "We take security very seriously."
Correct me if I'm wrong, but it's also NOT illegal to do so; even if you are lying, it's immoral but not illegal.
So a cop once told me, when I pointed out that the defendant was lying about not showing up, that he had good reasons. Unless you are under oath (or dealing with a very few LE organizations), it's not illegal to lie.
Of course I'm not saying it's a good thing; just pointing out they can say whatever they want to; there is no liability.
Absolutely agreed. It feels like corps are developing their own infosec version of the four dogs defense.
4 DOG DEFENSE
My Dog Does Not Bite.
My Dog Bites, But It Didn't Bite You.
My Dog Bit You. But It Didn't Hurt You.
My Dog Bit You And Hurt You, But It Wasn't My Fault
Probably won't happen until some Senator gets personally burned. Equifax hasn't suffered much, for example, and they released almost all of their info for every adult in the US that ever used a credit card or had a mortgage.
I'm almost wishing some activist hacker would buy the data for the House and Senate reps and go to town... just to get their attention. Purchase Pornhub accounts, shady drug site stuff, escorts, etc., and start sharing it publicly.
> My guess is that senators that have been burned have been done so secretly and are being blackmailed.
The whole bunch has been blackmailed for decades. Just not "ordinary" blackmailing, but threatening by big funders to cut said funding unless, for example, the politician keeps supporting NRA/BigAg/BigFinance-favorable policies...
I know HIBP's Troy Hunt has very carefully detailed the ethical and moral tradeoffs in what he does, and I appreciate that as a benchmark.
But I so want to lose my mind, start getting these breach db's and start emailing Congresscritters with "This email was hacked, you're screwed, we're screwed, and here's legit links to help fix our lives back up... (eff.org) (hibp) etc"
And now I'm on the watch list for when someone crazier than me actually does this. Sigh.
I feel like there could be an xkcd-style Greasemonkey script that adds a winky face to the end of any of those phrases to make them a little more accurate.
Because that would cost Panera a lot of money, and the alternative (this) will likely cost them very little. The choice then becomes clear: it's cheaper to tweet out an "oops" apology than to actually prevent the breach.
They don't care that your information got leaked, that doesn't enter into the calculation (unless it costs them money, which it doesn't).
Years ago, I was assigned to clean up an office building that had recently been vacated by a government cybersecurity contractor. While throwing away all the trash that had been left behind, I discovered a binder with at least a hundred pages of MapQuest printouts, each with the location of a Panera Bread circled in pen.
Comments like this tend to weirdly circulate, I've noticed, with people forgetting that originally, it was completely anecdotal and unsourced.
Op I don't mean to disparage you, but this is the internet, and there just aren't enough grains of salt in the world to allow me to swallow a tale like that without speaking up.
At least around here, they were one of the only places to go with free public WiFi for a while. It could be something related to that, just a place to find open internet hotspots.
(1)https://en.wikipedia.org/wiki/General_Data_Protection_Regula... (2) http://www.bbc.com/news/technology-43592470