Login Forms Over HTTPS, Please (hacks.mozilla.org)
342 points by _jomo on Jan 29, 2016 | 113 comments



> If you’re submitting your login form over HTTPS, that’s good, but it’s not enough. You have to *deliver* the form over HTTPS too.

I'm glad they mentioned it. Too many people think their sites are secure if logged in sessions use https and everything else is http.

Their example is that an attacker could insert JavaScript to steal the password, but they could just as well change the form target from https to http, which is even less noticeable.

There's also a more detailed post [0] with reasons and explanations. You can enable this setting in Firefox 44+ by setting `security.insecure_password.ui.enabled` to true.

0: https://blog.mozilla.org/tanvi/2016/01/28/no-more-passwords-...


As I understand it, if not everything is HTTPS, the attacker could just inject JavaScript that changes the page to the login page when the user clicks "Log in". That way, they can still record the password.


Yes, it's disturbing how many sites still do this. Most of the big shopping sites like amazon.com, ebay, Target, and WalMart are http until you log in. This was also, until recently, very common on credit card sites, though all of mine are https now.


Amazon cares much less about your security than about the fact that tiny increases in latency decrease conversion dramatically.


This is because most people are browsing and not buying, so the caching advantage of unsecured catalogue pages is likely quite large.

At least for most of those, the login is a separate protected page rather than an https iframe.


Uhh, what caching advantage are you talking about? Everything that can be served cached over HTTP can also be served cached over HTTPS. Unless you are talking about a caching proxy outside of the website control. In that case, they aren't needed, CDNs solved that problem in a much better way.


> Unless you are talking about a caching proxy outside of the website control. In that case, they aren't needed, CDNs solved that problem in a much better way.

There are still parts of the world where caching proxies are used to conserve limited transit bandwidth; CDNs don't solve that.


"Everything that can be served cached over HTTP can also be served cached over HTTPS. "

Only if you're talking about a browser cache. Any other cache couldn't intercept the data and cache it without alerting the browser (unless you mess around with installing SSL certs).


I wonder whether the overhead of setting up and encrypting the secure connections has, just by itself, a measurable effect on people's buying behavior.


Yes, but the "login" link, the "add to cart" link, etc. are all served over http, so they could be replaced by a MITM.


You don't have to wait for the user to click anything. Just read each key press and send a message out.

Hard to believe this is still a common vulnerability. Of course, even in 2007 a significant number of bank and credit card web sites operated like this.



If the attacker has the ability to inject code, can't they simply redirect you to a non-https version of the website on their servers anyway?


That's one of the reasons why HSTS [1] was invented. Basically your website itself informs the browser to never contact it through http. Then a man-in-the-middle, who can inject whatever into any http page you visit, simply cannot inject into (nor replace) your website, because they (supposedly) cannot fake a valid certificate covering your domain.

You can even submit your domain [2] so that current browsers automatically apply the https-only preference to your domain, even when it hasn't ever been contacted before. Cool, isn't it?

[1] https://en.wikipedia.org/wiki/HTTP_Strict_Transport_Security [2] https://hstspreload.appspot.com/
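In practice that header is one line of server config; a hedged nginx sketch (the max-age value and the includeSubDomains/preload flags are choices you have to opt into deliberately):

```nginx
# Send HSTS on every HTTPS response; browsers will then refuse plain-http
# connections to this host (and subdomains) until max-age expires.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```

Note the header is only honored when received over HTTPS; sending it over plain http does nothing.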


They could. But how would they show the users their desired contents after login?


Doesn't matter, they've already gotten your password at that point. They can just say "login failed" and redirect you to the real login page


For too long a lot of popular gambling sites such as betfair and skybet have done this. I think betfair is now all https, but I'm not sure about skybet (and I can't check while at work).


Betfair is finally using all https. Their security was terrible because once you logged in, the site went back to http again. This meant that anyone on the same network could grab your cookies and take over your account...


That's not the half of it - if you knew what that cookie contained…


Like... login and cleartext password, to lookup your profile on each request?


$sql = "select * from user where username='".$_COOKIE['username']."'";

Came across this last year... have seen variations of it over the last 15+ years, but saw it in 'new' code last summer. :/
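For contrast, the standard fix is to keep user input out of the SQL text entirely via placeholders; a hedged stand-in sketch in Python with sqlite3 (the table and column names just mirror the snippet above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (username TEXT, email TEXT)")
conn.execute("INSERT INTO user VALUES ('alice', 'alice@example.com')")

def lookup_user(conn, username):
    # The driver binds the value separately from the SQL text, so a
    # crafted cookie like "' OR '1'='1" stays an ordinary string.
    return conn.execute(
        "SELECT * FROM user WHERE username = ?", (username,)
    ).fetchall()

print(lookup_user(conn, "alice"))        # matches the real row
print(lookup_user(conn, "' OR '1'='1"))  # empty: the injection is inert
```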


Sad reality of raising kids with the notion that security is about forms and passwords and not about hacking stuff.


Isn't betfair the one where the password reset email was sent to the email in a hidden input field which you could change yourself....?


Is there a caching story to tell, here? In other words, is it possible that most web infrastructure is set up in such a way that caching is used to mitigate much of the load on the servers, and moving too many things to HTTPS too quickly would result in a MUCH greater content generation load?


That kind of caching is actually extremely rare in my experience. It's difficult to set up for all but the most simple static sites, and since it isn't available by default the vast majority of sites don't bother.

The times I've implemented full-page caching (generally using Varnish) I've set it up so nginx runs in front of Varnish handling SSL termination, which means that I still have SSL support even though I'm serving through a cache.
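For the curious, a hedged sketch of that layout in nginx (ports, paths, and names are illustrative): nginx terminates TLS on 443 and proxies everything to Varnish listening locally.

```nginx
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.com.pem;
    ssl_certificate_key /etc/ssl/example.com.key;

    location / {
        proxy_pass http://127.0.0.1:6081;  # Varnish's default listen port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

Varnish then caches full pages and talks plain http to the app server behind it.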


So how do you avoid this?


Force TLS on all pages and subdomains, send the Strict Transport Security header, and preload your site (https://hstspreload.appspot.com/).


I wish they outlined a plan to push this icon out to Stable. Even better the plan should call for the browser to eventually refuse to submit forms with password fields unless HTTPS was used for both loading the form and submitting it.

I think that developers who are still using HTTP with passwords either don't understand the implications (and a tiny icon won't help), don't care, or don't have "management buy-in" to spend the time to fix it. Having the browser force best security practices will benefit them and everyone using their websites.


Browsers should display a scary warning popup when submitting a form over http (either always, or at least when there is an input type=password in the form). This would be annoying enough to get management buy-in to implement https, if someone still maintains the app - better than a tiny icon.

Breaking stuff is a last-resort, nuclear option. There are many forgotten old web apps that would totally stop working, and people would switch to another, less secure browser as a result.


They used to -- see http://www.kentlaw.edu/faculty/rwarner/classes/legalaspects/... (§2.4, about half-way down) and http://labs.ft.com/2014/05/do-we-really-need-to-hide-the-url...

But it was removed in later versions of Netscape and Internet Explorer, because everyone turned it off as soon as they made their first search engine query.


I remember that, though honestly the internet was a bit different 15 years ago - those were (almost) pre-HTTPS, pre-public-WiFi, pre-Snowden times. It's time to progress now that the realities and technical capabilities have changed.

Today there should not be "do not display this anymore" checkbox.


FWIW, 1Password will do this: https://15254b2dcaab7f5478ab-24461f391e20b7336331d5789078af5... (not my image)

1Password will also refuse to autofill passwords when it can't verify the application's signature (for example, if Chrome hasn't been updated in a while).


That is not particularly scary though. That's the kind of thing a user will automatically press "next" on. This is the kind of warning that looks scary: http://i.stack.imgur.com/2kaXO.png


I have recently deployed Content Security Policy (CSP) on a website. When I first looked at violation reports, my jaw dropped. The amount of malware (rouge extensions, toolbars, viruses, ...) that is blocked is staggering.

If you really want to help your (clueless) users, never ever serve a login, registration or credit card form without CSP. It really helps - at least until the malware catches on (I already see "Kaspersky Labs" injecting its domain into the CSP itself).
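As a sketch, a restrictive policy plus a reporting endpoint can look like this (directive values and the /csp-report path are illustrative, not a recommendation for every site):

```nginx
# Scripts/styles only from our own origin; violations are POSTed as JSON
# to the report-uri endpoint so you can see what got blocked.
add_header Content-Security-Policy "default-src 'self'; script-src 'self'; style-src 'self'; report-uri /csp-report" always;
```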


Note that per the CSP spec, browser extensions should NOT be affected by CSP. The fact that they are in browsers is technically a bug, caused by the fact that once you've injected stuff into a page browsers don't so much track where it came from...

This does mean that currently CSP can stop various malware-ish extensions, but also that it stops legitimate ones (e.g. say an extension wants to apply a certain font that the user finds more readable to the entire page). It's a tough tradeoff.


Yes, I was about to say this - if you have a secure enough content security policy (and the browser in question supports it properly), it will be impossible for an attacker to execute their inserted Javascript (being able to insert it in the first place is itself a security vulnerability).

But yes, the best plan is to have HTTPS everywhere, something that looks a lot closer than it once did! Thanks NSA!


> if you have a secure enough content security policy (and the browser in question supports it properly) it will be impossible for an attacker to execute their inserted Javascript

I don't follow your reasoning. Why wouldn't an MITM attacker modifying an HTTP response body to insert rogue Javascript also be able to modify the response headers to strip or alter the Content Security Policy?


Good point about MITM attacks; I assumed that we were talking about cross site scripting (XSS), but I suppose you are right.

I still am willing to bet that SSL is not impossible to MITM. Someone will manage to find a flaw in such a complex system.


Is there a pre-built tool to capture and manage CSP violation reports? I believe it's mainly just a POST request with some JSON, right?
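It is: browsers POST a JSON document (Content-Type application/csp-report) to whatever report-uri you configured. A minimal hedged sketch of a collector with Python's stdlib (the field names follow the report format; the port is arbitrary):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def summarize_report(body: bytes) -> str:
    """Pull the interesting fields out of a CSP violation report."""
    report = json.loads(body).get("csp-report", {})
    return "{} blocked {}".format(
        report.get("document-uri", "?"),
        report.get("blocked-uri", "?"),
    )

class CSPReportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        print(summarize_report(self.rfile.read(length)))
        self.send_response(204)  # browsers ignore the response body
        self.end_headers()

# To actually listen:
# HTTPServer(("", 8080), CSPReportHandler).serve_forever()
```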


https://getsentry.com/ added support for CSP a few months ago, but I haven't tried it out yet. https://github.com/getsentry/sentry/pull/2154


You can use https://report-uri.io/ to capture violation reports.


>rouge extensions

Man, World Of Warcraft flashbacks can be intense sometimes...


LEVEL 70 ROUGE TWINK GUIDE CLICK HERE


I don't understand the distinction between the login form and every other page of the site. If someone is logged in and then goes back to normal http, someone can just grab the cookie and pretend to be that already-logged-in person.

I suppose if one uses the same password for every account they have then knowing their password is more harmful than just having access to 1 site... but other than that it seems like a distinction without a difference.


> If someone is logged in and then back to normal http, someone can just grab the cookie and pretend to be that person already-logged-in.

If the cookie is set through HTTPS, the browser won't send it when loading HTTP resources. So the cookie won't be exposed that way.

We should still be using HTTPS for all traffic in 2016.


> If the cookie is set through HTTPS, the browser won't send it when loading HTTP resources.

If the cookie is set through HTTPS and does not have the Secure flag set, the browser will happily send it along when loading HTTP resources.
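A quick illustration with Python's stdlib cookie class; without the secure flag, the browser would also attach the cookie to plain-http requests:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["secure"] = True    # only ever sent over HTTPS
cookie["session"]["httponly"] = True  # invisible to page JavaScript

header = cookie["session"].OutputString()
print("Set-Cookie:", header)
```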


If a website only uses HTTPS for login, then it has to set a cookie for HTTP as well, otherwise how will the user navigate the site after login? Off the top of my head, you could implement this by associating the randomly generated session ID that you assign to all visitors with the login ID.

Regardless, what jordanlev said still applies. The session can be hijacked.


Exactly! What is the purpose of being "logged in" if, when you then go to browse the rest of the site, you are no longer actually "logged in" (because the secure HTTPS cookie isn't being sent on those insecure http: pages)?


The cookie would tend to be for one session only. The login credentials can be used to create subsequent sessions.


You answered your own question. If a site works this way it is risking the privacy of user data on itself, but at least it isn't endangering the credentials which may be useable to attack other sites.


Previously, when crypto was expensive, there were (and still are - PG&E is one and Marriott another) people who thought just doing login over ssl was sufficient.


The best solution is just to make everything use HTTPS. Using HTTPS for all traffic also helps to obfuscate which connections are the important ones.

Certain unnamed three letter organizations and nation states have the computing power to crack HTTPS encryption if they really want to, but using that power is expensive. Making sure all your traffic is encrypted makes it a lot harder for potential snoops to decide which traffic is worth spending the time and money decrypting and which isn't.

By using strong encryption for everything, even just reading wikipedia pages, you're doing everyone a favor by helping to make user privacy that much harder to violate.


  Certain unnamed three letter organizations and nation 
  states have the computing power to crack HTTPS encryption 
  if they really want to
Some ciphers and key lengths are vulnerable, but I do not believe it to be true to say that the NSA can "crack HTTPS", outside of a suborned-CA MITM attack, which isn't at all deniable or subtle.


I don't think many people pay much attention to the address bar, let alone a little lock icon with a red line through it.

If they want to get serious about it, they should just disable http support and require https for all actions. Or throw up a big in-your-face warning on http pages, not just make a small change to an obscure icon that nobody really understands.


> If they want to get serious about it, they should just disable http support and require https for all actions. Or throw up a big in-your-face warning on http pages, not just make a small change to an obscure icon that nobody really understands.

Even though I agree with you, I think that it's still a little bit too early for that. My prediction is that the browsers are going to start experimenting with this near the end of 2016 and that it'll become a part of stable versions of browsers somewhere in 2017.


Another alternative is to disable forms on http pages.


There are too many security holes and corner cases to serve any authenticated page over http. It's better to have your login page be served over https, but if you use authentication cookies, any MITM can easily pick them up from the following http pages and do what they want (for as long as the cookie is valid).

And that's beside a whole rake of issues from having http and https on the same domain. It's just easier to have https everywhere.


> It's just easier to have https everywhere.

For what it's worth, this was not true ten (or maybe even five) years ago.

I'd say it's true today, for most websites, but it's worth keeping in mind that it's a relatively recent phenomenon that HTTPS is now pretty easy to set up, use, and maintain. That wasn't always the case!


Five years ago was 2010. Other than Let's Encrypt, AFAIK, there haven't exactly been leaps and bounds in terms of making https easier, at least for self-managed servers. It's still buy a certificate and set it up in nginx/apache/your favourite load balancer/etc.


I have found Caddy (https://caddyserver.com) really helpful in this regard. By default it grabs a Let's Encrypt cert and serves your site over HTTPS (oh and it handles renewals too.) Caddy + LetsEncrypt = easy https everywhere :D


> It's still buy a certificate and set it up in nginx/apache/your favourite load balancer/etc.

To a first order approximation, yes, and for small and medium sites, that's pretty accurate. But for very large sites, there can be additional complications and expenses, and that situation has improved over the last several years. Don't underestimate the effect that the downward price of bandwidth over the years has had - at very large scale, those pennies can really add up.

There's a reason why sites like Google, Facebook, Reddit, Tumblr, etc. took so long to add SSL to everything. Or why some sites like Comcast still don't provide it. It's not (always) that they simply don't care or don't have knowledgeable engineers; it's that the logistics of managing[0] SSL at that scale are non-trivial. Arguably worth it, yes, but it's not so straightforward.

[0] heck, the logistics of paying for - even with a CDN, the marginal cost of adding SSL is not cheap.


SNI and IPv6 are two pretty big ones since the limited number of IPv4 addresses limited the practicality of having one IP for each SSL site.


Prominent? A crossed-out lock icon in the address bar? Try again.

A prominent warning would be something ridiculous, like a full page cover saying "THIS PLACE IS NOT SECURE – HERE BE DRAGONS!" or something. Browser vendors should do more of this for egregious errors on the publisher's side. Unless users complain loudly that stuff is uncomfortable and broken and scary and what not, you can write articles like this every day of the week and publishers still won't do anything about it, and the only users to care will be the geeks who understand what the damn icon means in the first place.


Use too many prominent warnings and people learn that they happen all the time and ignore them. It's a fine balance, and browser vendors invest a lot of effort in it (I know the Chrome team has had papers/talks at quite a few conferences on the topic).


Would be an interesting thing to do as a network operator - doing MITM to alert people about the dangers of HTTP.


they're already doing it - for different reasons though

there have been reports of ads on unrelated pages being replaced with banners from the ISP, and similar things


What I don't understand about authentication over HTTPS is, though, why not making login a part of the protocol? Wouldn't it be much better to authenticate a user with a public key of the user like in SSH, instead of password authentication over the public key of the server? It'd be more resistant to attacks such as MITM or stealing the private key of the server. If a user can register a password on a website, why does it have to be a password rather than a public key? The only hindrance is the fact that the protocol doesn't support it.

I have no idea why this easy change hasn't been made in the protocol.


Check out the abandoned Mozilla Persona, and the SRP protocol.

https://developer.mozilla.org/en-US/Persona http://srp.stanford.edu/whatisit.html


There are sites that can use keys to authenticate. Their usability is miserable.

Key-based authentication is difficult for a layman to manage and understand. My mother can memorize a password and use it across computers. Asking her to do the same with a key would be difficult.


There was, fairly recently, a half-hearted attempt to do that in the way of Persona. Unfortunately, neither Mozilla nor any of the other browser vendors implemented it, and the fallback mechanism was very poor UX.


The most secure approach is to serve everything over HTTPS AND redirect HTTP to HTTPS WITH "Strict-Transport-Security" header [1]. Even this is not perfect, but it's the best approach available AFAIK.

[1] https://en.wikipedia.org/wiki/HTTP_Strict_Transport_Security


You can also submit your site for inclusion on the HSTS preload lists:

https://hstspreload.appspot.com/

This fixes the remaining hole of the first visit.

However, one warning, if you like privacy, you should turn off HSTS or clear it on browser close at least. It's effectively a giant supercookie and it provides little security benefit over checking URLs for SSL yourself.


Thanks, that's good to know. :)


I'm puzzled. As a developer the sites I work on are (mostly) going to be hosted on my local machine. I usually don't bother with all the effort to set up SSL certificates for my development web server unless I've got an SSL-specific issue to investigate. Is this feature disabled for sites that are local? If not I'd expect I would just come to ignore it quite quickly. Then when I then look at the production version of a site I'm more likely to continue to ignore it as I've been conditioned into assuming it's a false indicator.

At the same time, for normal web users I can see how such a warning could be helpful. But it seems normal editions of Firefox won't have this enabled by default.

Or are my development practices unusual in some way?


The comments indicate that the icon will never be displayed on `localhost`.

I'm not sure what the situation is like for websites hosted on a LAN or using something like zero-conf, but I wouldn't be surprised if they do display the icon in those scenarios.


Hiding mixed content warnings (like Chrome did last year http://arstechnica.com/information-technology/2015/10/chrome...) would probably also really help some sites with SSL everywhere.

On sites that allow users to embed things like images (forums, comments, etc), in order to avoid mixed content warnings or interstitial confirmation dialogs, you either have to pipe everything through an SSL proxy, or severely limit the types of things users can embed.


I'm not writing login code personally, but this sounds like something fundamentally broken in the web, and asking devs to fix it case by case is a complete copout. If a browser can detect this shit, why can't it fucking proxy it or do something clever to mitigate it automatically, instead of putting up stupid hieroglyphs and expecting to shame developers into duplicating a bunch of work case by case (and thereby probably getting it wrong in many cases)? Like... ff detects insecure login form... browser, DON'T FUCKING SEND CLEARTEXT PASSWORD (instead send some encrypted bs to firefox grand central station, where it is securely forwarded through a secure backchannel; maybe it has to talk to the isp directly, who has to inject the password back into your process, and they charge u an extra tax for securing your shit)... and then... gasp... problem solved, and not every aspiring developer needs to become a cryptographer.

Passwords are stupid. The internet is broken. My Spyware infested machine and browser knows all my passwords anyway.

Note to any techies reading this.. fix this problem. Here is your killer app. Disrupt this shit.


Facebook had this problem for years, even after they started hiring industry veterans who would know better (https://www.sslshopper.com/article-how-to-make-a-secure-logi...). Amateur hour lasted for so long that just thinking of their codebase makes me ill.


I don't know why sites don't just use HTTPS for everydamnedthing. It's 2016. SSL is not that computationally expensive and it's just easier to develop an entire site that way anyway (rather than making some pages secure and other non-secure). Just redirect everything to https and forget about it.


I work on a site with a long history. I recently created a benchmark and found that I could make around 500 synchronous HTTP requests to our infrastructure per second (essentially localhost calls). If I switch my benchmark to HTTPS I can only make about 5.

That's a big difference. I'm a developer, not a hardware guy, so I don't know what causes the slowdown for us. I assume it's the actual setup and teardown of the HTTPS connection. That's a fairly significant difference when I want to use APIs via HTTP/HTTPS and need to make a lot of calls quickly.

My point is that HTTPS still seems to be 100x more computationally expensive than HTTP.


I agree with you, however the bureaucracy at some companies is insane.

In order for me to get a TLS certificate where I work I have to create a CSR, fill in a word document, attach those to a Jira ticket, the operations team will then hand those over to a contractor who talks to our chosen CA. Up to 10 working days(!) later I get an OV certificate back.

How I wish we were allowed to use Let's Encrypt.


Also, SNI has been available since IE 7, so the times when you needed an extra IPv4 address for every SSL domain are almost over.

(Or, are there any websites still trying to support IE 6 or 5.5, apart from large online shops?)


Injecting JS into the page with the form isn't my main concern, even -- it's changing the form POST action to their own server rather than the one I think I'm logging into. That's much harder to detect and block without encryption.

Plaintext HTTP is scary stuff these days.


Login forms over HTTPS, everyone. This time we'll finally fix web security.

This time we'll be safe.


Wow, this is still even a thing.

The form, all of its js assets, and the form's api endpoint all have to be secured. And https for everything that contains code. Deploying SRI on pages served over https is another layer of defense against js tampering.

Also, sending passwords across the wire in any reversible manner is really more dangerous than is necessary. Passwords/passphrases could be salted hashed by the browser in JavaScript using a PBKDF similar to scrypt or bcrypt, before being sent to the backend for constant-time comparison... it just takes a little more prudence and effort, but it's absolutely doable.


> Passwords/passphrases could be salted hashed by the browser in JavaScript using a PBKDF similar to scrypt or bcrypt, before being sent to the backend for constant-time comparison... it just takes a little more prudence and effort, but it's absolutely doable.

This is not safe! Now an attacker just needs to intercept the hashed password and replay that, and he gets to login without knowing what the password is.

Use https. And don't do client-side hashing, it's no improvement.


This doesn't make any sense to me. If the attacker can intercept connections, I'd rather he get the hash than the plaintext password. At least then the actual password is still not compromised, which is good given how many people share passwords across services. If he can tamper with connections, there's nothing that can be done anyway.

It would be very nice if there was a feature in web browser that allowed user to opt in to something like this:

When there's a password input on a website:

1] automatically take the website's domain and concatenate it to the plaintext password
2] generate a secure hash from 1]
3] derive some reasonably portable text password (20-30 characters) from 2]
4] make it so that the original web page never has access to the user's plaintext password
5] submit the result of 3]
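A hedged Python sketch of steps 1]-3] (the use of scrypt, its parameters, and the 24-character output are my illustrative choices, not part of the proposal):

```python
import base64
import hashlib

def derive_site_password(master_password: str, domain: str) -> str:
    # Step 1]: the domain is folded in as the salt, so each site
    # gets an unrelated derived password.
    raw = hashlib.scrypt(
        master_password.encode(),
        salt=domain.encode(),
        n=2**14, r=8, p=1,  # step 2]: memory-hard KDF
        maxmem=2**26,
    )
    # Step 3]: encode the hash as a portable text password.
    return base64.b64encode(raw).decode()[:24]

print(derive_site_password("hunter2", "example.com"))
print(derive_site_password("hunter2", "evil.example"))  # differs per site
```

Step 4] is the part that needs browser support: the page itself must never see the master password.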

Downsides:

1] Stupid websites enforcing password rules other than length (can be worked around mostly transparently).
2] Extra stupid websites limiting password length (requires user interaction and configurable exceptions for such websites).
3] Can't log in with browsers that don't implement this.

This would be very nice for people who share passwords between services. Also would make web service data leaks less valuable to attackers. User would not depend on service password handling quality and/or operator's morality.

I use different passwords for each and every service. But many people can't be bothered or don't get the risks of password sharing, so it would be at least some help to them.


Exactly. That's the risk to mitigate. Here's scrypt for js using emscripten on actual scrypt C source... the site has to provide and adjust/migrate the three factors over time to ensure a sane amount of cpu/memory is used: https://github.com/tonyg/js-scrypt

For real-world, practicality's sake, the client-side of an app would contain the PBKDF or some AA code, as opposed to the traditional, slightly-riskier "let the server handle all of it"-approach... the server still has final say on authentication and authorization, just move some of it into client-side js. There's really not much impact on FE development considering most FEs and BEs are codeveloped these days anyhow.

Btw, for web devs: two-factor auth is cheap and easy to do. Google Authenticator OTP has open source libs and requires no calls to Google to work.


You're wrong, and it's a strawman: if an attacker can intercept the hash, they could just as easily intercept the plaintext in your traditional server-side architecture. The attacker cannot replay a TLS session unless there is a problem with it; there have been many issues in TLS stacks, but it's the most widely deployed option.

Furthermore, letting the plaintext travel any further from its owner, or stay exposed any longer, than necessary is inherently less secure, because a breach of https would then also compromise users' passwords.


> Use https. And don't do client-side hashing, it's no improvement.

Well, it's an improvement if your users are reusing their passwords. Then multiple different sites will have different hashes sent.

But that's all beside the point; we should've been using zero-knowledge proofs for authentication since the beginning.


With HTTPS, multiple different sites will have different "hashes" sent anyway, so no, it's not an improvement over just good ol' HTTPS.


Well, sure. But then you have to worry about how well the website is storing your passwords. Meh, password managers are the way to go to be honest.


Most websites with any sense don't store actual passwords; they usually store salted PBKDF hashes that are compared in constant time.
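A minimal sketch of that pattern with Python's stdlib (the iteration count and salt size are illustrative; PBKDF2 stands in for whatever KDF the site actually uses):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored):
    _, candidate = hash_password(password, salt)
    # compare_digest is constant-time, so timing doesn't leak how many
    # leading bytes of the digest matched.
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse")
print(verify_password("correct horse", salt, stored))  # True
print(verify_password("wrong", salt, stored))          # False
```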


> Passwords/passphrases could be salted hashed by the browser in JavaScript

Not the worst idea from the developer's point of view, but it doesn't protect the user: the site could still serve bad JavaScript which didn't properly hash the password (this, for example, is why Mozilla accounts are completely, totally and utterly insecure, and why Firefox password storing should never be used by anyone who doesn't wish to share his passwords with Mozilla, any Mozilla employee and any state which can compel any Mozilla employee). The browser needs to natively support doing this.

There's also the issue of replays. This topic has been well-studied; I believe that Secure Remote Password (SRP) is currently the gold standard for this.


The worst offender I interact with frequently is Hulu.com. This is a high-profile site with millions of users. It's one of those tell-tale signs as a consumer where I really can't say I trust them with my data. I really wonder, once they have a high-profile "hack" (I assume this will happen), if the state can charge them with criminal negligence?


Still happens. I pay rent through RentPayment.com, but their homepage (which contains a login form) is served over HTTP.


I love this, but would prefer they didn't re-use the "invalid certificate" icon.

As devs, we use self-signed certificates for building our products, and we have trained QA to ignore the HTTPS with a slash through it as an error that is acceptable in dev.

(And ourselves for that matter)

That makes this one easy to miss.

My preference would be a browser-level warning bar that rolls out over the page, like the one used for 'this plugin is not installed'.

This is a huge error and it's much harder to miss that way.


A little red padlock? It's a start... but why not refuse to load the page if it has a form tag but no https. That's how you solve this.


This is how the Tunisian government scraped the Facebook passwords of Tunisian internet users in 2011:

http://www.theatlantic.com/technology/archive/2011/01/the-in...


I've used this Chrome Extension for years.

https://chrome.google.com/webstore/detail/unsecure-login-not...

It seems we are finally getting it built in, as it should be!?


How about "Ads over HTTPS, please"? Can't upgrade our sites if our ad revenue drops through the floor.


Hmm. When technology basically took away musicians' ability to sell digital music files, the world pretty much said, "figure out a new way to sell music."

So, "figure out a new way to advertise." Welcome to disruption.


Maybe browsers should block ads not served over HTTPS? The ad networks will upgrade to HTTPS real quick. :)


I think you're the first person in this thread to explain why this problem is still allowed to exist.


Now if only I could convince firefox (and chrome) that this asterisk-input text field is not a login form.


So, if I understand it correctly, HTTPS costs developers money (annual fees to rent an SSL cert).

Google too is about to start shaming non-HTTPS connections (according to a recent article).

I've heard about the free one-year Cert. Is there any way to do HTTPS all in-house (permanently), without resorting to an external agency?



Thanks.


Now this is the use of HTTPS that matters. This should get a major browser warning. It's far more important than "HTTPS Everywhere", which is mostly security theater.


The icon is a nice addition, but a much better option would be an animated icon - a heartbeat/pulse kind of alert icon for an insecure web page.


This is not a good thing. The expense of SSL certificates is not negligible, and it is not justified. The protocol is easily compromised.

Much like email, https needs to go away in favor of a solution that incorporates a modern mindset.


Legalize insider trading; It'll fix/solve HFT;


Verb in Sentence, Please.



