I don't understand all the negativity around this. People complained when Firefox added Pocket, in part because they took a browser extension and made it a feature that was ostensibly unrelated to web browsing. Now they're taking an old feature that's definitely not related to web browsing and removing it, and people are still complaining?
Firefox can't be everything. It should focus on being a great browser and not a great browser and also great FTP client, or a great browser and also a great feed reader, or a great browser and also a great mail client. People using FTP can use a dedicated client, of which there are plenty on every platform, and people who don't use FTP (i.e. the vast, vast majority of web browser users) won't even notice.
A modern web browser is probably some of the most complex software humanity has invented yet, besides a full-scale OS. Taking a maintenance burden that's unrelated to the core browser product of a struggling NFP should be welcomed with a sigh of relief.
FTP has been related to web browsing since the beginning of the web 31 years ago. For maybe a decade (the first third of the web so far), most of the WWW was on FTP servers—not just most software downloads, but also most HTML pages. Web browsers are a much better interface to FTP servers than dedicated FTP clients, because you can click a link on an HTML page (either a statically generated directory, possibly on the FTP server itself, or a dynamically generated search result page) and get the file from the FTP server.
There are a few people commenting with nonsense like this:
> You can configure Firefox to open "ftp://" links with the client of your choice. This is a non-issue.
That's absolutely useless if the client of your choice can't render HTML and the ftp:// link is to an HTML file.
The fundamental idea of the WWW project was that it provided a universal, uniform interface to all the information on the internet, regardless of protocol. This move amounts to Firefox abandoning that vision. Abandoning Gopher was maybe reasonable—there just aren't that many Gopher servers out there—but FTP is still a widely used protocol.
More broadly, this is a tradeoff between the traditional vision of the WWW as a vast library, in which human knowledge accumulates over time and becomes accessible to all, and the strip-mall vision of the WWW as a means to sell people things they don't need. This move amounts to burning down a wing of the library (or, at least, its card catalog) because it wasn't profitable enough. Or because people keep getting mugged there, I guess.
This kind of intentional functionality regression is precisely the kind of thing I use free software to avoid.
I'd just like to add to this that ftp is a relatively simple protocol, not particularly complicated to implement. It's not a constantly fluctuating standard that requires a team of 50 developers to keep pace with. Supporting FTP isn't some big technical challenge. The code has been there in the firefox codebase for nearly 20 years now, running just fine. All you need to do to continue to support ftp is nothing at all. If having the small amount of code to do FTP support in there is making your job really hard, that would seem to indicate to me that you're not very good at your job.
I also find the incredibly vague and nonspecific "but security!" scaremongering language to be quite hyperbolic, a repeat of the borderline lies mozilla peddled when they decided to dump xul for webextensions.
It seems to me that the removal of features like this amounts to "I don't use/understand it, therefore I'm going to assume it's not useful to anybody".
Of course, Mozilla being out of touch with its user base is hardly news, so this comes as no surprise at all to me.
(While I'm talking about browsers and particularly mozilla, I'd just like to take a moment to congratulate them on finally getting their market share down below that of edge. They've been working hard at driving firefox into the ground for a long time now, and I'm sure they must be feeling very proud to have finally achieved this important milestone in their seemingly unending quest to achieve that holy grail of 0 users. So I'd just like to say: Nice work, Mozilla!)
> ftp is a relatively simple protocol, not particularly complicated to implement
Parts of it are, sure, but parts of it are an absolute horror show: active mode (the client opens a port, and then the server connects back to it!?), ASCII-based text-conversion versus binary transfer modes, different directory-listing formats, etc. It's not great. Worse, it doesn't support the good stuff like implicit TLS.
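To see the active/passive weirdness concretely, here's a minimal sketch of the two data-connection modes using Python's standard-library ftplib (ftp.example.org and the /pub paths are placeholders, not a real server):

    # Sketch only: shows how the two FTP data-connection modes differ.
    from ftplib import FTP

    ftp = FTP("ftp.example.org")   # control connection on port 21
    ftp.login()                    # anonymous login

    # Passive mode (ftplib's default): the client sends PASV, the server
    # replies with an address/port, and the client opens the data connection.
    ftp.set_pasv(True)
    print(ftp.nlst("/pub"))

    # Active mode: the client sends PORT and the *server* connects back to
    # the client's listening socket, which is the part that breaks behind
    # NAT and strict firewalls.
    ftp.set_pasv(False)
    with open("README", "wb") as f:
        ftp.retrbinary("RETR /pub/README", f.write)

    ftp.quit()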
> Supporting FTP isn't some big technical challenge. The code has been there in the firefox codebase for nearly 20 years now, running just fine. All you need to do to continue to support ftp is nothing at all.
Not at all. Continuing to support FTP means continuing to defend attack surface that's implemented as 20 year-old code, to deliver a feature that in 2021 the majority of people do not use.
It's a cost-benefit analysis. I want Mozilla to do more. If they believe removing FTP support enables them to do more, I'm all for it.
If it were, they'd have never added a pdf viewer in the first place. Everyone already had one. Near-zero benefit, huge cost.
>I want Mozilla to do more.
At this point I'm afraid we're just going to have to agree to disagree. For the last 5+ years, every time Mozilla "does more", I throw up in my mouth a little.
> If it were, they'd have never added a pdf viewer in the first place. Everyone already had one. Near-zero benefit, huge cost.
Every time I clicked a PDF online, I dreaded the external viewer opening (especially Adobe Reader). I very much liked when they integrated a PDF viewer. Chrome also has one.
I'm willing to bet the internal PDF viewer has at least 100x (more likely 1000x) the users the FTP client had.
Sounds like you had a shitty pdf viewer installed. Which is not my problem.
Why was I forced to install a substandard, feature-poor, ridiculously slow pdf viewer that I never wanted and which took over as the default pdf viewer against my wishes and without asking me, just because you couldn't be bothered to install a decent one?
>I'm willing to bet the internal PDF viewer has at least 100x (more likely 1000x) the users the FTP client had.
Yeah, that tends to happen when you change the defaults without asking.
All PDF viewers are crap in my experience. If you don't agree, you haven't used PDF enough. PDF has Javascript as a programming language, it has extensions for 3D models and a ton of other stuff (I think even sound). As a format it's almost as complex as the entire web stack.
Multiple web browsers have embedded PDF viewers because the PDF viewing experience for most people sucked so much. That's not a product decision taken lightly.
Your personal experience does not reflect that of many other browser users, it seems. Even on mobile I dread clicking a PDF link.
Yeah, PDF can play embedded sound files. It is an overly complex format. Its character encodings are defined by embedded CMap files, which are a restricted subset of PostScript. It includes by reference the CCITT G3 fax standard, PNG, JPEG, JPEG-2000, JS, Deflate, TrueType, the PostScript Type 1 font format, a separate richtext format that isn't PDF for annotations (ISO 32000-1:2008 §12.7.3.4), and several other things. In different contexts it represents line endings as \r, \n, or your choice of either of those plus \r\n. And the 3-D models thing isn't an extension; it's in the base spec (§13.6), although the actual 3-D format is ECMA-363, which is included by reference, like Deflate and JPEG.
But it's nowhere close to the complexity of the entire web stack, not within an order of magnitude. It's much, much simpler.
I'd respond to the rest of your vague and incorrect opinions, but why would I do that if you're not even going to bother to address my points despite me asking you to repeatedly?
> Why was I forced to install a substandard, feature-poor, ridiculously slow pdf viewer that I never wanted and which took over as the default pdf viewer against my wishes and without asking me, just because you couldn't be bothered to install a decent one?
Because you are part of a minority of Chrome and Firefox users and I'm probably part of the majority of users, in this regard. I don't know this for sure, but if a Google program manager decided to back building and embedding an entire PDF reader into a browser, I'm 99% convinced this is true.
That's.......... actually not a bad answer to my question, goddamnit ;)
BUT: you do have to concede that it's not a technical reason. It boils down to "Adobe are goddamn useless, let's just do their job for them because it will make our life a bit easier". There's no requirement for pdf viewing in the HTML spec or anything like that. It's not required for web browsing and it doesn't really have any place in the browser in a technical/engineering sense - from an "engineering purity" perspective these two things should be separate, even if you do like your pdf viewers written in javascript. The "correct/pure" engineering solution would have been for them to bundle their awful javascript pdf viewer in an electron(ish) app and release it as the "google pdf viewer", with all the requisite spam in gmail and the technical press, and to have chrome default to it if it's installed. Hell, you could make it a separate application and still bundle it with chrome.
There must have been a LOT of support enquiries about "the adobe doesn't work" for it to survive a cost/benefit analysis for implementing an entire pdf viewer. That's not a trivial piece of work that somebody churned out in an afternoon (despite the performance of the thing feeling like it ;) )
If I was that product manager, I would not make that decision lightly. First I'd have built a special page into my support area that redirects the user to adobe.com/support if they type "pdf" or "adobe" into the "search support" bar. (You know what I mean - you've seen it in action on help desk sites where they try to fob you off rather than just giving you a "contact us" box straight away).
This is all a whole lot easier to implement than a pdf viewer. In fact if you're using something like zendesk you get it for free. For me to decide to build a pdf viewer into my browser, I'd have to be getting a high volume of support emails about it after I'd implemented the change above.
I don't think that's what happened.
In the post I largely ignored, you said:
>Multiple web browsers have embedded PDF viewers because the PDF viewing experience for most people sucked so much
Allow me to give you an alternate explanation / interpretation, which I admit might come across as "a little bit cynical" or paranoid, but which I believe to be more fully borne out by the facts than the "cost/benefit analysis" theory you advocate. I don't think "multiple browsers" did that at all. Here's my theory:
1. Google decided to build a pdf viewer into chrome because they were busy trying to turn chrome into an operating system (see also: chrome os), and they knew that would mean including a pdf viewer, since a pdf viewer is a fairly essential tool for an operating system if it wants any kind of mainstream adoption. This had the added side-effect that for many users their pdf viewing experience would improve, because adobe are basically incapable of doing software. So it was really a "two-fer" for google.
2. Mozilla, seeing google's announcement for a pdf viewer built into their browser, did what Mozilla has been doing for the past decade: copy Chrome without giving any thought at all to any of the considerations you and I are discussing today. Or indeed any consideration other than "what is google doing with chrome this week?".
Now, I do try to keep my paranoia and cynicism in check, but to me this seems like a far more likely explanation. Perhaps I'm wrong. It would be kind of nice if I was. If you have some data to support your theory I'd love to see it.
I don't have much data to back up my theory. I can't be bothered looking into what chrome's support site was like pre-pdf-support. But I do have one data point: I can confirm that chrome added pdf support first, with Mozilla aping them almost immediately (they were pretty much at parity within a year or two).
Maybe Mozilla had coincidentally already done a full cost/benefit analysis of adding a pdf viewer before google went live with theirs. It's not impossible. But it seems a little unlikely to me.
It's not a well-designed protocol, but it's still a relatively simple protocol.
Your reasoning, that "Continuing to support $x means continuing to defend attack surface that's implemented as $y year-old code, to deliver a feature that in $z the majority of people do not use," would ultimately consign every feature on the web to the book-burners' flames, except for the worthless minority of features that the majority of people do use. Chinese HTML text support, I suppose, and biometric authentication, and whatever the latest video codec is.
After all, it's 02045. Who watches videos encoded in VP9 now anyway? It's been obsolete since 02028! Or reads English text? Much less Hebrew? The majority of people do not use those features. Why should we continue to defend that attack surface?
Straw men are not helpful. I never suggested removing a language used by millions of people. Let me be more specific: Pruning support for old formats and protocols is a feature, not a bug.
Let's pick something else from this golden age of "browsing HTML pages via the FTP protocol" that people on this thread keep professing: XBM images, the very first image format ever supported by browsers, and used when Marc Andreessen literally invented the IMG tag [0].
XBM is a crazy simple image format [1]. Yet, because it is essentially C code that defines a buffer size and then gives you a char array, it caused security bugs over and over [2] in all major browsers. Since virtually no one was using XBM images, the browsers all removed support for it [3].
The browsers removed code that was already written, and abandoned functionality that had already existed, because it reduced the attack surface for something that virtually no website was using. Exactly like FTP support.
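To make the "essentially C code" point concrete: an XBM file declares its dimensions in #defines and then supplies a char array, and nothing forces the two to agree. A hypothetical sketch (made-up file, not a real exploit) of the mismatch a naive parser had to defend against:

    # The XBM "image" is literally C source text. A parser that trusts the
    # #defines and copies width*height/8 bytes out of the array will read or
    # write past the end of a buffer when the array is shorter than promised.
    xbm = """\
    #define evil_width 1024
    #define evil_height 1024
    static char evil_bits[] = { 0x00 };
    """

    declared_bytes = 1024 * 1024 // 8   # 131072 bytes implied by the #defines
    supplied_bytes = 1                  # bytes actually present in evil_bits
    print(declared_bytes, supplied_bytes)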
> I never suggested removing a language used by millions of people. Let me be more specific: Pruning support for old formats and protocols is a feature, not a bug.
You're contradicting yourself here; Hebrew is a language used by millions of people (about ten million, so 99.9% of the world's population does not use it) and also an old format (about 3000 years old). "Pruning" support for old formats makes history inaccessible; "pruning" support for old protocols requires constant effort to keep your servers compatible with whatever is fashionable with today's cascade of attention-deficit teenagers [CADT].
And if we follow your originally stated reasoning, "Continuing to support $x means continuing to defend attack surface that's implemented as $y year-old code, to deliver a feature that in $z the majority of people do not use," we ineluctably arrive at the conclusion that we should remove support for languages used by millions of people.
Taken literally, we should remove support for all languages, since no language is used by more than 50% of the world population, but in keeping with the principle of charity, I interpreted your "majority" as "vast majority". I'm not sure where exactly the vast-majority cutoff lies: a feature that 90% of people do not use? That would include all natural languages except English and Chinese. 95%? All languages except those, Hindi, and Spanish. (In particular, it leaves out all those RTL languages that cause so much complication in text rendering, like Arabic.) 99%? That leaves 20 languages, but not, for example, Persian, Swahili, Italian, or Thai. Even a cutoff of 99.9% might leave out Hebrew, Uighur, and Greek.
What percentage of users do you think use View Source? The web inspector? Printing?
That's not a strawman argument; it's a slippery-slope argument. And, I think, it's a valid slippery-slope argument. If we are going to avoid removing support for these things, we need a better basis on which to make the decision than, "Continuing to support $x means continuing to defend attack surface that's implemented as $y year-old code, to deliver a feature that in $z the [vast] majority of people do not use."
I do agree that there needs to be some kind of cutoff. Gopher is probably below it; WAIS and XBM certainly are. But FTP?
XBM was never very widely used in web pages because it didn't support color, grayscale, or compression, although for a little while it was the only image format supported by browsers that supported transparency. I did put it on a few of my web pages, but as soon as Netscape added support for transparent pixels in GIFs, I switched over and never looked back. This would have been about 01994.
We are at an impasse that can only be solved by data: how many unique users downloaded a document via an FTP URI in a web browser in a month? If I were a betting man, I'd wager it's less than the unique number of users who visited a webpage rendered using Hebrew, but we don't know.
Hopefully the browser makers will be transparent with this telemetry.
Precisely what I'm arguing is that we should stop destroying access to all information except the most popular—I'm astounded to find that you disagree! That is not an impasse that can be resolved by data.
Popularity is not a valid measure of value. The July 02021 issue of People magazine sold 3 million copies, in a single month, and is almost completely devoid of value. (Maybe in 02068 it will provide valuable insights into vapid 02020s US popular culture.) Amazon tells me Claude Shannon's Mathematical Theory of Communication is outsold by 185,210 other books at present, so perhaps it has sold 100 copies this month, but it is the foundation of data compression, error correction, and significant amounts of artificial intelligence work. One Hundred Years of Solitude has sold about 50 million copies—over the past 54 years, so perhaps it sells 80,000 copies a month, 40 times less than the July 02021 issue of People. (But probably less; it probably doesn't sell as much as it did 30 years ago.)
40:1 is more than the ratio between the number of HTTPS servers and the number of anonymous FTP servers.
> (While I'm talking about browsers and particularly mozilla, I'd just like to take a moment to congratulate them on finally getting their market share down below that of edge. They've been working hard at driving firefox into the ground for a long time now, and I'm sure they must be feeling very proud to have finally achieved this important milestone in their seemingly unending quest to achieve that holy grail of 0 users. So I'd just like to say: Nice work, Mozilla!)
Well said. And the more they fail, the more they double down.
But their decisions go beyond incompetence. They might as well be controlled opposition, actively undermining open web technologies.
They deprecated RSS support with very flimsy justifications, but they make supporting the latest DRM standards a top priority, because they are terrified of losing the blessing of Netflix.
> They might as well be controlled opposition, actively undermining open web technologies.
Yeah, this pretty much sums it up. A conspiracy-minded person might accuse them of all kinds of nasty shenanigans, especially given who their primary funding source is these days.
>They deprecated RSS support with very flimsy justifications, but they make supporting the latest DRM standards a top priority, because they are terrified of losing the blessing of Netflix.
Oh now you're just being cynical. RSS support was like 150 lines of code! A huge attack surface! And it was invented before the year 2000! That makes it useless!
As for DRM, I'm sure that encryption algorithms are much simpler to implement and have less attack surface than an RSS reader, and that Mozilla added this tech for my own good, and that I just don't understand how it's useful to me because I'm a dumb.
> I also find the incredibly vague and nonspecific "but security!" scaremongering language to be quite hyperbolic
Well, there is nothing vague here: FTP is a cleartext protocol, and we're migrating towards protocols that provide integrity and encryption.
Sometimes I think it's a generational thing. I find it hard to accept this, having grown up testing all protocols with Telnet and so on. But unfortunately the Internet has changed a lot, and in particular bad and unscrupulous people have learned to find all possible, and sometimes very creative, ways to abuse whatever had been created in the past. So I understand that cleartext POP3, IMAP, SMTP and FTP authentication should go away.
Anonymous FTP is a slightly different beast though. Security-wise, it has the same weaknesses as regular HTTP. But nobody is removing HTTP support from web browsers (yet). So I'm a bit sorry to see it removed from Firefox. There are many better things to copy from Chrome.
HTTP is a cleartext protocol. Why does your browser quietly navigate to any HTTP site you throw at it? Anonymous FTP isn't any less secure than HTTP.
Why does your browser scream at you for connecting to an encrypted but unverified site, such as a self signed certificate on a closed network, but have no warnings at all for an unverified and unencrypted HTTP connection?
How do you know the context that I am using a plaintext protocol in? How do you know I'm not connecting over a patch cable to the computer next to me? How do you know I'm not connecting over an SSH tunnel?
The user should easily be able to override these safety measures.
The only argument I have heard for this is that the user could be tricked into disabling security mechanisms. But that is true of anything in computing. The user could be tricked into typing rm -rf.
When there is inconsistency like this, it usually implies there is something else going on that we aren't seeing. I have a feeling that companies like Google and Apple have an agenda to move people away from having too much outside of their influence.
> When there is inconsistency like this, it usually implies there is something else going on that we aren't seeing. I have a feeling that companies like Google and Apple have an agenda to move people away from having too much outside of their influence.
I think it's both. For one thing, I think they genuinely do care about security, because any high-profile incident involving their products is a cause of embarrassment for them. At the same time, I have a feeling they would prefer people not inspect the traffic moving in and out of their apps, for example.
> HTTP is a cleartext protocol. Why does your browser quietly navigate to any HTTP site you throw at it? Anonymous FTP isn't any less secure than HTTP.
This is true, but it ignores the fact that the web has been moving towards deprecation of HTTP in favor of HTTPS.
While FTP is an established standard, FTPS is kind of a nightmare with different and incompatible variants.
The parent explicitly mentions that there are no warnings for a cleartext http connection, but warnings for an encrypted connection to a self-signed certificate.
How many in-the-wild attacks did you see using XUL from extensions that were in the curated add-ons ecosystem, i.e. not downloaded and manually installed via extra steps a novice is unlikely to go through?
XUL allows too many modifications to the browser. It increases the attack surface, and it makes it hard for Firefox developers to modify Firefox's internals to improve things.
Firefox didn't support directly opening and rendering HTML files via ftp:// URLs anyways, it'd just download them like a normal FTP client. I can't even remember the last browser I used that supported that. You can't kill what has been dead for decades already.
Actually, yes, it did, until Firefox 61 when FTP subresources were disallowed from HTTPS, and Firefox 70 when HTML pages over FTP were converted to downloads as well.
It has driven me mad that the tech world no longer cares about backward compatibility. Sometimes breaking backward compatibility may be justified by a steep cost on maintenance, but this FTP removal? It hurts a sizeable user base with little benefit on their end.
I cannot for the life of me remember the last time I landed on a page with FTP or had to use FTP in any way. Even lists of file downloads are http pages where I just click on the file.
I use FTP in the browser once in a blue moon. Out of curiosity, I checked my browser history to search for instances of opening FTP URLs in the recent past. I found two:
* ftp://ftp.oreilly.com/pub/ – only available as FTP
For the O’Reilly URI, it’s convenient to be able to click on directory links in the browser and then open HTML and PDF files without requiring another program.
So, it’s nice to have native FTP handling in Firefox for the odd time I’d use it but I can understand why Mozilla decided to remove it.
ftp://ftp.oreilly.com/pub/freebooks/mh/index.htm is an example of an HTML web page served over FTP with relative links. It's worked fine for 25 years (though I bet 25 years ago the URL was only ftp://ftp.ora.com/pub/freebooks/mh/index.htm, which still works) with very little effort on the part of the redoubtable crew at O'Reilly and Associates. Now it's being broken. Intentionally. For no good reason.
> So your argument is that because you haven't used it lately, nobody does?
I mean, it's opposed by the argument that it used to be popular. It's a pretty low bar.
FTP is a protocol that dates to when NCP was the protocol suite that ran the Internet. It was retrofitted to TCP/IP. That's why there's a command session and a data session. The protocol is so old that it dates to a time when IP+port was the unique identifier for a half-duplex connection. Nobody even uses active FTP anymore because everyone has firewalls now.
It's not like web browsers have added SFTP support. They haven't even added FTPS support (either flavor) as far as I'm aware. I just don't see many use cases for FTP anymore. Why would you choose FTP at this point over HTTP(S), SFTP, BitTorrent, etc.?
The last system I used that required FTP actually used implicit FTPS. Worse, when the vendor implemented SFTP to replace FTPS like their customers had been demanding, they actually implemented Simple File Transfer Protocol (i.e., RFC 913) and not SSH File Transfer Protocol. I wish I were joking.
> Yes, and it's not particularly unique or well suited to that task over any other protocol.
When the files are on an FTP server it is. And given that there are more than a million anonymous FTP servers on the internet today, there are a lot of files that are on FTP servers.
One advantage of FTP is that it is much easier to have access control on different resources than dealing with .htaccess files, so it was popular with customer service.
The consequence will be that many ftp servers still hosting software/firmware for legacy products will be taken down and their content lost.
Ok, that is a legit use case. It's been a long time since I've needed firmware that was FTP only, but I can see that possibility. But I still disagree with this:
> The consequence will be that many ftp servers still hosting software/firmware for legacy products will be taken down and their content lost.
Even if all the browsers dropped support for FTP, Finder and Explorer both have native FTP support, and there are plenty of clients otherwise. I doubt those resources will be taken offline.
Sure, but what is the use case of browsing that with a web browser? Both Finder and Explorer can natively support ftp, and there are plenty of other clients.
You asked for a scenario in which you need ftp in 2021. You handled my example by switching to another program, such as finder or explorer. That's fine. Do you agree that your user experience decreased by having to switch programs?
You really like to take the worst possible interpretation of whatever someone says, don't you?
Of course I can tell the difference between a browser and the finder. My point was the transition was so seamless that I thought Safari was embedding the finder in the Safari window until I looked up at the app bar and realized it had actually switched apps.
But feel free to go on dismissing me, since it's clear from your comments that you think no one in the world is right but you.
What "worst possible interpretation"? You were the one who claimed you didn't realise you had switched from safari to the finder. Which is exactly equivalent to saying "I have no clue what I'm looking at and pay zero attention to detail". In a conversation where you're discussing those very details.
What's the "good" interpretation? That you suffer from an attention deficit disorder, so it's not your fault that you have no clue what you're looking at and the things you're saying are patently ridiculous?
> I thought Safari was embedding the finder in the Safari window
So, to summarise, your argument is that you have no idea about how your software works or what it does. It felt "seamless" to you because you had no idea what to expect and think your browser magically embeds other applications now.
Just in case I wasn't clear last time: if you think that switching to an entirely different application is a reasonable thing to miss, you have zero credibility where UX is concerned.
>it's clear from your comments that you think no one in the world is right but you.
Lots of people who aren't me are right. It's just that statistically speaking those people tend to agree with me on most things.
Kragen, for example, has been saying things that are right all day (In particular I found this treatise on security stuff to be damn near sexy: https://news.ycombinator.com/item?id=27900935).
Even some people who disagree with me on a bunch of points have said things that are right. It's happened today. And you'll see me acknowledge them when they happen if you look.
The reason you haven't seen that is because you haven't said anything that was right.
But if you really believed this assertion you could easily prove it accurate: All you'd need to do is say "OK, yes, I had no clue what I was looking at. I don't know why I thought the finder was in a web browser. I guess I was just confused. That was dumb and inattentive of me". If your theory is accurate I won't be able to agree with that.
> But feel free to go on dismissing me, since it's clear from your comments that you think no one in the world is right but you.
You know, you're really not in a good position to criticize other people in this conversation with exaggerated complaints about their intellectual arrogance.
>I’m a pretty technical user and use the web and the internet a lot.
Oh Wow! cool! I'm so impressed!
So what you're saying is that because you "use the internet a lot" that you know everything about the requirements, hardware, software, and most importantly limitations of every single internet user on (and off!) the planet.
I guess I should at least give you some credit for not having any problems with self-doubt.
That was mentioned elsewhere and I agree that is a legit use case. But why would you use a web browser to get there, and not the built in client in your OS or one of many other client choices?
Because the firmware link came in an e-mail advisory, and I clicked on it, and it launched my web browser, which then saved it in my "Downloads" directory.
This argument would be OK-ish if Firefox removing FTP were like burning it down, but it's not. A browser removing the FTP protocol isn't the end of FTP. There's a myriad of FTP clients you can still use.
Okay, yes, it's more like burning the card catalog, not the actual books. The books are still there, but the links that tell you where to find the book you want don't work.
I think the concept of the WWW as a vast library of accumulating knowledge was a nice vision but it was never a reality, and even less so today. Every time someone stops paying a server bill, a part of that vast library disappears forever. URLs change all the time without redirects, closing doors in the library, usually for good. Protocols go out of fashion (Gopher anybody? FTP?). That's why we have important services like the Wayback Machine and archive.org, which are probably the real vast libraries of accumulating knowledge.
It's true that there are many existing causes for the decay of the web, and we should work to fix those. But that becomes more and more difficult when we add more and more causes to the decay.
That's not two important services you've named, but only one, and at some point it will be destroyed; it's only temporary. Hopefully it will last a few decades.
Brewster Kahle, the founder of archive.org, has a saying:
Wayback is probably not a good example here, though, because it backs up neither (to use your examples) Gopher nor FTP. There are some images of such sites on the Internet Archive, but these are site dumps people have taken steps to manually upload. I submit that won't be the majority.
Quite the contrary. Books from 01997 or 01977 are not worth any less than books from 02017. Neither are web pages from 01997 (for example, http://canonical.org/~kragen/x-pretty.html). In fact, those web pages tell us things that are especially difficult to find out: they tell us what 01997 was like.
If you want to know what 02017 was like, you can ask pretty much anybody; most of them will remember pretty well. You may remember yourself. But if you want to know what 01997 was like, well, most people have pretty much forgotten. Did you know they didn't check your ID at the airport in 01997? People would resell non-refundable airline tickets in newspaper classified ads. Airlines wanted to stop this practice, but competition prevented them from instituting mandatory ID checks. Until 02001, when the US entered a permanent state of war on abstract concepts.
It's true that there are a lot of things that are true today that weren't true in 01997. But most of those things are not worth knowing, because they won't be true in 02045 either. In fact, a lot of them won't be true in 02022.
So, losing access to a web page from 02020 is bad, but losing access to one of the few remaining web pages from 01997 is much worse.
Why are there people who think otherwise? Because they never learned to think of the World Wide Web as the greatest library in human history, probably because they don't value libraries or learning; instead they think of it as a way to dunk on their political opponents and consume up-to-date memes from Instagram or Netflix.
You're saying it was normal (and is even today) to show HTML files over ftp links? I've never seen a browser do HTML over ftp, and while I am a youngster compared to HTTP (I started to do web stuff in the early 2000s), on the sites I've seen that had both HTTP and FTP modes, the ftp mode only downloaded the file; it did not show it.
> More broadly, this is a tradeoff between the traditional vision of the WWW as a vast library, in which human knowledge accumulates over time and becomes accessible to all, and the strip-mall vision of the WWW as a means to sell people things they don't need.
I don't get this at all.
* Are you saying FTP is a fundamentally better protocol to download files than HTTP?
* Are you saying that it would be easier to run an FTP server than an HTTP server?
* Do you think that FTP-only-sites generally depend on HTML-over-FTP for browsing? Because that's something I've never seen AFAIK, either they use HTTP-only or HTML-over-HTTP for browsing and FTP for download.
I get the "strip-mall vision of the WWW as a means to sell people things they don't need" complaint, but that does not seem related to the protocol discussion of HTTP and FTP at all.
> I get the "strip-mall vision of the WWW as a means to sell people things they don't need" complaint, but that does not seem related to the protocol discussion of HTTP and FTP at all.
Web pages that are profit-making ventures, with employees dedicated to working around the latest browser featurectomies, will have no trouble with this sort of constant change. Web pages that are just HTML files that someone uploaded to a server in 02003 will disappear into the memory hole. They're already hard to find, but now they'll be totally inaccessible.
FTP is a bad protocol. HTTP servers are generally easier to run than FTP servers, and especially to run in a secure fashion. None of that is relevant to whether we should break functionality that has been core to the WWW project for 31 years.
I don't get how any of your comment is about HTTP vs FTP, even the security part seems to be about HTTPS, right? If we are talking about FTP and HTTP (not FTPS or SFTP and HTTPS) is there any meaningful difference?
> even the security part seems to be about HTTPS, right?
No, I mean that a lot of FTP servers by default want you to do password authentication (yes, in cleartext); by default they grant access to your whole filesystem; and a minimal FTP server is significantly more complicated than a minimal HTTP server, and so it's more likely to contain vulnerabilities. Also, by default, most FTP servers support writing to files, and HTTP servers don't. I'm not talking about eavesdropping on the protocol itself or packet spoofing, which I agree are equally easy with FTP and unencrypted HTTP.
This HTTP server I wrote is, I think, 324 instructions of machine code, and it doesn't use any libraries: http://canonical.org/~kragen/sw/dev3/server.s. I think it's plausible that it contains no security vulnerabilities, other than a DoS by flooding it with connections. I have a fair bit of confidence that it doesn't have an exploitable RCE vulnerability. I'm not sure if there's ever been an FTP server we could say that about.
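For a rough sense of scale in a higher-level language, here's a sketch of a GET-only HTTP/1.0 file server in Python; it's not hardened, but note what it doesn't need: no login state machine, no second data connection, no write path (serving ./public is an arbitrary choice for the sketch):

    # Sketch of a minimal GET-only HTTP/1.0 file server; an illustration of
    # the small surface involved, not production code.
    import os, socket

    ROOT = os.path.realpath("public")

    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", 8080))
    srv.listen(5)

    while True:
        conn, _ = srv.accept()
        try:
            request = conn.recv(4096).decode("latin-1", "replace")
            path = request.split(" ")[1] if request.startswith("GET ") else "/"
            full = os.path.realpath(os.path.join(ROOT, path.lstrip("/")))
            if full.startswith(ROOT + os.sep) and os.path.isfile(full):
                with open(full, "rb") as f:
                    body = f.read()
                conn.sendall(b"HTTP/1.0 200 OK\r\n\r\n" + body)
            else:
                conn.sendall(b"HTTP/1.0 404 Not Found\r\n\r\n")
        finally:
            conn.close()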
But none of that is relevant to whether we should break functionality that has been core to the WWW project for 31 years.
> by default they grant access to your whole filesystem
Do they? Do FTP servers usually just open up everything to any host? That's not the way I've used them. Don't they usually default to sharing one directory and nothing else?
> Also, by default, most FTP servers support writing to files
That's kinda what file system permissions are there for, and it is usually pretty configurable in the server, right?
> This HTTP server I wrote
Love it, but I have no experience writing assembly, and almost no experience writing C, so I cannot say anything more (I just don't have the experience or knowledge). I still have to ask though, is this relevant to something that asks if FTP is needed over HTTP on the protocol level?
> But none of that is relevant to whether we should break functionality that has been core to the WWW project for 31 years.
I think the question here is whether it needs to be in the browser, though. There are plenty of protocols in wide use on the WWW that browsers don't support, and considering that none of the "secure" variants of FTP are supported in browsers, it does not seem out of the question to remove this.
If this was a priority then I'm guessing that over the last decade or so getting ftps or sftp working in the browser would have been worked on.
> Do FTP servers usually just open up everything to any host?
Typically they do not allow anonymous access by default, but do not discriminate by host.
> That's not the way I've used them. Don't they usually default to sharing one directory and nothing else?
That's a good default, but historically speaking you had to chroot them to get that behavior. Nowadays you could use Docker.
> [Writing to files] is kinda what file system permissions are there for, and it is usually pretty configurable in the server, right?
Running a server that includes code to write to files is unnecessary for serving up web pages, and it's more likely to accidentally result in the server writing to files than running a server that doesn't. You're more likely to misconfigure a server that's pretty configurable than one that isn't. Filesystem permissions are generally far looser than you want for anonymous access over the internet; I don't want random strangers to read my /etc/passwd or see which versions of what Python modules I have installed, much less create files in /tmp. Filesystem permissions are only usable in the first place (for uploading files) if the FTP server has the authority to set its user ID to the user ID of the authenticated FTP user, which means it needs to run as root until after they've authenticated. Also it means I need to add my FTP users to my real /etc/passwd and /etc/shadow.
> is this relevant to something that asks if FTP is needed over HTTP on the protocol level?
My point with the two-kilobyte secure (?) HTTP server is that FTP is a bad protocol. The reason browsers should continue to support FTP is not that FTP possesses some kind of unparalleled technical excellence, the way NNTP and IRC could be argued to; it's that FTP, however janky it may be, is still useful, and providing better access to existing FTP repositories is one of the main reasons the WWW was created in the first place.
> If this was a priority then I'm guessing that over the last decade or so getting ftps or sftp working in the browser would have been worked on.
There's relatively little advantage to ftps or sftp over unencrypted FTP for anonymous access—you aren't sending the FTP server any files or credentials, just the names of files you want—and no advantage for backward compatibility, since the existing FTP servers you want backward compatibility with aren't running ftps/sftp.
> You're saying it was normal (and is even today) to show HTML files over ftp links? I've never seen a browser do HTML over ftp, and while I am a youngster compared to HTTP (I started to do web stuff in the early 2000s), on the sites I've seen that had both HTTP and FTP modes, the ftp mode only downloaded the file; it did not show it.
Of course they do, it's quite normal. Here's Netscape Navigator:
> You're saying it was normal (and is even today) to show HTML files over ftp links? I've never seen a browser do HTML over ftp
Firefox supported this completely until Fx61, when it disabled FTP subresources on HTTPS. Even then, you could still view HTML pages served over FTP until Fx70.
Well, I'm not saying running ftp to serve html is better. All I'm saying is that it can sometimes be the only supported option, or viable option. And there are major differences. Webserver access permissions vs. ftp server permissions for example?
> Webserver access permissions vs. ftp server permissions for example?
What I'm saying is: is there a meaningful difference? Both need to bind to a privileged port and read files from the served directory. There is no difference between an FTP server and a web server from a permissions/privilege perspective.
Well, that's right. There is no meaningful difference in the sense that they both have to serve files from a storage backend, over a port to a frontend. I was comparing FTP folder permissions with webserver .htaccess permissions. What I really meant was access control.
Gotcha, but .htaccess is specific to one server (Apache). You can leave that out for basic use cases (just set read/write as you would for a normal unix user), and for more granular things you need to configure FTP just like you need to configure HTTP.
No, they're breaking access to a part of the internet that web browsers used to support and no one uses anymore. They are most certainly not breaking the web itself. These are different things: if you need FTP (and let's be real: you don't), then as a user you still have TONS of options available to you, and as a host even legacy systems running ancient versions of Apache can already serve file listings as standard web pages. Just turn that on.
Your web browser doesn't need to support FTP. It just needs to support the web. Everything else is a bonus, unless it's a security liability. Then it has no business being in there.
Your browser doesn't NEED to natively support .pdf but it does and you've probably used that feature. I mean you could have it launch an external .pdf viewer, which is a FAR more complex piece of software than the little FTP protocol code they're disabling here.
Your browser doesn't NEED to support tabs. Your OS is already equipped with the ability to run multiple instances.
Your browser doesn't NEED to sandbox anything, you could just run each instance in its own VM.
Your browser doesn't NEED to play video, it could launch an external viewer.
Touché. But what Firefox needs is popularity too, and being the last browser to drop a feature used by some niche isn't helping them. I'd argue not to add FTP if it wasn't already there, but why drop it?
It's probably the most-used URI scheme they have dropped. At the very least, there should be better error handling. For me on Chrome/Windows, ftp:// URIs just don't do anything[1]. No error, they are just inoperable links. Yes, I could fix that, but for average users, that part of the web (or whatever you want to call it) could just disappear silently.
[1] Because the default handler for ftp:// was originally Chrome on my system, which is pretty common. Chrome dropping it didn't change that mapping.
I certainly was not viewing Gopher pages at the time they dropped gopher support. I was rather perplexed after updating when FTP didn't work. Turns out some downloads are still on FTP.
There are lots of URI schemes in use. Firefox removing support for FTP doesn't mean that you can no longer open FTP links from Firefox, it just means that they'll need to open in another client. I don't see how this breaks the web.
I think it effectively means that for many. Unless there's a really easy-to-follow error page.
I tried Chrome on an ftp:// URI, and it just does nothing. I suspect that's because Chrome was the Windows default app for ftp URIs; then they dropped support, but that didn't change the mapping in Windows.
Agreed. There are established vendors whose old drivers are only available over FTP links.
It's an edge case these days though, as more are moving to https:// links, so I can understand the browser vendors wanting to make the code base smaller. They have enough to do. Especially Mozilla, given what they charge us to use their product.
It might be nice if they (Mozilla and/or Google) made a site somewhat like https://webftp.dreamhost.com/ that would accept URL parameters to sort of proxy FTP requests.
Wouldn't help for internal network ftp servers, but would ease the publicly accessible part.
(Note that the dreamhost site has a little link icon in the lower left that will generate a link/landing page with all the important bits filled out.)
Wouldn't that make security worse? Then any FTP connection would have to be proxied through a third party, which has every opportunity to monitor or MITM the files. This in particular as FTP is often used to transfer non-public files.
Alternatively, Firefox could allow standalone clients to register for the ftp:// uri scheme (I think that's already possible) and, if no client is registered, redirect to some info page that explains the situation and offers links to standalone clients.
Perhaps if it only supported anonymous ftp. That's most of the links that will get broken. For that, there is no real security consideration. What exists is plain text all around with deliberately exposed passwords.
> The biggest security risk is that FTP transfers data in cleartext, allowing attackers to steal, spoof and even modify the data transmitted.
I don't know how realistic that type of attack is and compromised authentication is likely worse, but both are cases that Mozilla cannot fix since they are inherent to the protocol itself.
For the majority of people, the lack of FTP support would go unnoticed, so it is easy to deprecate and remove. I wouldn't be surprised if the same happens to HTTP (in general) once a certain threshold has been reached for HTTPS. I also wouldn't be surprised if they start adding warnings about insecure transfer methods for particular file types at a much earlier date.
>I can understand the browser vendors wanting to make the code base smaller.
That's an excellent point...
...So they'll be removing their builtin pdf viewer first then, right? That's a much much bigger chunk of code than an ftp client. Or are they both scheduled to be removed at the same time? I suppose that would be reasonable.
Mozilla made a great decision for security here: developing the PDF viewer as pdf.js. It runs the same way as any other web app, so it's no longer considered attack surface in the way a natively integrated PDF viewer would be, IIUC.
This right here is something I want to say every time the "people said x, now people say y" paradigm appears. The word "people" in the two cases does not refer to the same persons.
It is a shorthand but it always seems like it's designed as a "gotcha". I haven't thought enough about it to figure out what fallacy it entails, just enough to ignore it anytime I see an argument that uses it.
Browsers opening an ftp:// link can simply prompt to open it in a different program, much like they did PDFs in the past. (I also don't think a browser should have to be a great PDF renderer, but here we are.)
> Browsers opening an ftp:// link can simply prompt to open it in a different program, much like they did PDFs in the past. (I also don't think a browser should have to be a great PDF renderer, but here we are.)
Except then you have to find, install, and configure that program, which not everybody may be able or willing to do. We have to trust browsers, for better or worse, and many people may not be comfortable identifying another trustworthy program to handle FTP downloads (due to malware/adware concerns), especially when they're trying to do something else. Having something in the browser saves users a trust decision.
Also, this feature has been so established that lots of stuff was designed expecting it to work that will now be broken. I also wouldn't be surprised if some FTP sites (say with old drivers) just get taken offline without being migrated, due to this.
And such programs can and will be downloaded from disreputable sites packed with malware and other junk, because the user needs that file. The great thing of having this stuff concentrated in a single program is that at least you have only one vendor relationship to consider.
Not everyone has an FTP client installed on the system, or an FTP client that can be launched with the server to open (Windows Explorer can open FTP servers, but I don't think you can do that from a link; I'm not sure).
A lot of sites, especially old ones, are built with the assumption that every browser can access FTP links as it would HTTP. And so, for example, a download section is a link to an FTP server.
To me, removing it is stupid. Is it a security concern? Not really, and not having it in the browser will not make security better; a person who needs it will just use another client. Will it make the browser faster or smaller? Not really; an FTP client is something really simple, and browsers have had one since forever.
Me. I don't want to install an FTP client for the 2 times a year that I go to a site to download something (typically drivers for some obsolete device) and the download is from an FTP server.
And yes, if I have the URL I can use curl to download the file from FTP, even if downloading it from the browser would be easier. But most of the time there is a link to the server directory, with multiple files to select from. Yes, I can use Windows Explorer to connect to the FTP server and browse it, or, to be fair, just open Internet Explorer and paste in the FTP link.
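For what it's worth, the occasional single-file case doesn't even need curl or a GUI client; Python's standard library will fetch an ftp:// URL directly (the URL below is just a placeholder):

    # Sketch: download one file from an anonymous FTP server, no client needed.
    from urllib.request import urlopen

    url = "ftp://ftp.example.org/pub/firmware/update.bin"
    resp = urlopen(url)
    with open("update.bin", "wb") as out:
        out.write(resp.read())
    resp.close()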
Pocket is worse than useless. It's ad-filled, privacy invading, garbage. For all the potential security problems with using FTP, pocket is a lot riskier since you can be certain your data will be sent to 3rd parties and you'll have ads pushed at you by using it.
Understand this: FTP has been with the browser for 25 years. FTP has been with the web. FTP is part of the web. No one cares that FTP is not really what browsers are made for. Browsers are how people access FTP. That's it. And FTP links have been used in web pages too.
Yeah, really gonna need some numbers here, though, because literally no one I know younger than 40 even knows what FTP is, let alone having used it.
Even Apache 1 can expose dirs over HTTP instead of FTP; there is literally no reason for FTP unless you want uploads. In which case: no you don't, you want SFTP at the very least, because you want the data that gets uploaded to be your data, not whatever a MITM trivially changed it to, which FTP fully allows.
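(And on the hosting side, even a stock Python install will give you a browsable HTTP index of a directory; /srv/pub below is a placeholder path.)

    # Sketch: serve a directory with an auto-generated HTML index over HTTP.
    from functools import partial
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    handler = partial(SimpleHTTPRequestHandler, directory="/srv/pub")
    HTTPServer(("", 8000), handler).serve_forever()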
Sure, they also use C6H12O6 every day, and interbank networks, but if we listed everything that wasn't similar to FTP we'd be here a while. FTP is "a thing" you intentionally use. You connect to FTP servers, and upload or download files, in the same way that HTTP is a thing you intentionally use to consume web pages. TLS is just an aspect of how connections are negotiated and is on a completely different rung of the ladder of abstraction.
> People using FTP can use a dedicated client, of which there are plenty on every platform
FTP is used often in my field. The removal of FTP from both Chrome and Firefox has been very inconvenient. I tried a few free FTP clients with GUIs. They are huge and clumsy in comparison to browsers. For example, the Cyberduck zip is as large as Firefox's, and I couldn't copy-paste an ftp:// URL into FileZilla. I wonder why these FTP clients don't adopt a browser-like interface. It would be more friendly. Now I mostly use command-line lftp, which is better than the GUI clients I have tried but still not as convenient as browsers.
You should be able to paste a full ftp://server.tld/path into the host field and upon connection it'll drop you right into that folder.
As for why the GUIs aren't that great I think it's precisely because FTP was made with CLI in mind and by the time good GUIs came around there were better protocols to plug into them.
I, for one, would be quite happy to trade FTP for no-Pocket. I sometimes looked at an ftp index through firefox, but my file managers do a perfectly good job too. Anyway I still don't want to give it up until Pocket is erased.
A decision that maintaining an FTP client is no longer worth the effort and resources is fine, but instead there's this ... imperious attitude that has come to characterize Mozilla, even as their market share drops, diminishing any ability to actually be imperious.
I agree. Creating some friction to encourage the move away from FTP is the right thing to do, for a variety of technical reasons.
There are times when HN seems to become very negative toward a particular topic. In the past I've seen it with Kubernetes, systemd or GCP/AWS. I feel it's that way with Mozilla/Firefox. More often than not, comments on Mozilla/Firefox are very negative and then create a feedback loop of negativity. Obviously subjective, but just what I see.
If I'm hard on Mozilla/Firefox it's only because I love it, think it's valuable/important, and want it to be the best it can be.
When Google or MS does something shitty with their browsers I pretty much expect it from them and I'm partially insulated from their bad behaviors since I avoid using those browsers. When Mozilla acts badly though I'm often personally impacted.
I'm actually okay with them getting rid of FTP support (although I think leaving it there, but disabled by default was a better way to go - FTP links are pretty common out there) but I'm not at all surprised by the backlash.
From a network-protocol perspective, FTP is a weird old protocol by current standards. Forget active mode, since it has no luck under NAPT. Even in passive mode, it requires the NAT device to rewrite the address in the command channel, it won't work with a firewall that closes unknown external ports unless the firewall does dynamic filtering via packet inspection, and server configuration isn't simple and depends on the network configuration. Modern network environments like NAT64/DNS64 need special support for FTP. There is no reason to continue using this ancient protocol forever.
How is NAT the browser's problem? You're about 10,000 words short of "excruciating", and about half a dozen reasons short of "a variety"
>There are no reason to continue using ancient protocol forever.
You undermine yourself with statements like this. I explicitly asked for excruciating detail. This response essentially sums up to "because I said so".
...and it's empirically incorrect, too: You can find a ton of reasons where people need FTP today if you look through this discussion. So many that I'm not going to bother repeating them here.
Mozilla is struggling and is trying to focus its resources in a way that will let them build sustainable revenue and browser share. I hope they succeed
It's still a really good browser; you just have to spend a lot of time and effort beating it into submission, removing/disabling anti-features, and installing third-party add-ons to get it there. Once you do, however, it's the best bet you have at a secure, privacy-respecting, modern web browser.
Pocket is ad-tech that nobody wanted. FTP is, for some, a useful feature. The negativity is because Mozilla maintains a façade of selflessly serving the little guy but then it removes an existing, working feature that makes it a little easier not to use surveillance-focused file-sharing services like Google Drive. The animosity Mozilla receives from its users has been well earned.
> A modern web browser is probably some of the most complex software humanity has invented yet, besides a full-scale OS.
What exactly about FTP has changed that invalidates the FTP browser code that's been in Firefox for 20+ years?
People bitched when Firefox added a stupid non-standard thing, yes. Now, the few who still use Firefox, will bitch because they have arbitrarily removed a standard thing.
Note that Pocket was related to web browsing and offered a convenient bookmarking system. The reason people complained is that it was a third-party extension and service (requiring registration, even) rather than a built-in feature of the web browser.
That said, some of that functionality has since been folded into bookmarks: clicking the favorite button saves straight to the unsorted folder, clicking it twice opens a menu where tags can be added, and all bookmarks can be viewed through the menu. A secondary bookmarks tree can be added with extra features such as read status, plus simple status changes and deletion from the menu without requiring a right click. Kinda like how Chrome did it.
Yes, good point. Gopher is internet browsing too but is Firefox the best way to experience FTP and Gopher? How many different protocols should Firefox support? Firefox does not have the same resources as Google so focusing on what Firefox is known to do best such as HTTP and its supporting protocols, like WebSocket, really well looks like a good strategy.
I just checked my Windows 10 machine and it still has the FTP command-line program built in. What proportion of machines have Firefox but no FTP command-line program (or a "finder"/"explorer" that doesn't support FTP)? I cannot imagine this number is large.
The healthcare sector should be using secure technologies like TLS/HTTPS. If anything that's an argument for this change, since misuse of FTP remains rampant.
FTP is something completely different to Pocket. FTP is a standardized protocol.
Browsers never had decent FTP support, true. They just let you list directories and download stuff. But on the other hand, the FTP support doesn't cost anything. I don't know much about Pocket, to be honest, but that form of integration is much worse than supporting a protocol.
Aside from that, maybe using http for downloads is the better alternative today.
I think what really hurts, for me, is that this has become the general tone from Mozilla - whether the changes are ones I'd agree with or not.
I remember back when the Spread Firefox campaign was still around - at the time, Firefox and Mozilla in general felt grassroots, fun, and human. Like a club anyone could join and that anyone would want their friends, family, coworkers, and even strangers or people they didn't like to get in on: an all-in-this-together effort for a better internet.
Anymore, Mozilla feels more and more corporate, more like a company - even as Google Chrome (and the many browsers built from Chromium) eats away more and more of their market share and they move toward being "the little guy" again - and less and less like a group of people.
I think what I really miss is having a browser that made me care about it beyond just wanting alternatives.
Especially because "The FTP protocol itself has been disabled by default since version 88", so any user that hadn't opted in to FTP protocol already had the "benefit" of the "security enhancement", no?
But, hey, if you are not yet a FF user, here's where you can download it, in case you're looking for a browser that... lacks FTP support. Something many users are likely to be seeking out.
To be fair, most non-technical users have no clue what FTP is. Twenty years ago, some non-technical people knew, because a lot of downloading was going on involving FTP servers, but that was... well, twenty years ago.
I do not see the benefit of removing FTP. For security concerns, a big warning as with expired TLS certificates would be an acceptable compromise, IMHO.
But as someone who knows what FTP is and still uses it on occasion, I don't think FF dropping FTP support is going to impact me very much.
Mozilla have been working hard on that for years now. Haven't you been watching their market share? One day they'll get to their holy grail of 0 users and no security problems.
I would be fine with it if Mozilla explained that FTP usage is now very low, based on telemetry, and that there are FTP clients available for users.
However, there is no need to characterize FTP as dangerous by jumping from "FTP is old and in plaintext," to "FTP servers are being exploited and used to distribute malware," to FUD-type statements implying that there are [unspecified] exploits ready to attack Firefox if FTP were enabled.
This is just plain disgusting and it leaves a bad taste in my mouth.
I thought we'd be familiar with this kind of corp speak from the Firefox team by now. This is what they say every time they break or remove useful features in the pursuit of chromification.
It's along the same lines as "Our new CP and terrorism law is for your protection." I already feel much safer knowing that nobody can interfere with my downloads, which will be supplied securely by Microsoft, Google, and Cloudflare.
Maintenance-free code exists. I'm sorry you haven't experienced work at an org that has 10+ year old code which hasn't changed at all, easily verified by source control. (Has Mozilla published something anywhere about exactly how much work it is to keep FTP support around?) And while not technically 100% free, something closely related that's basically free is lots of 10+ year old code which does need to change sometimes but only as part of a batch of the same change in many places, and that's basically free because it's done automatically by tooling.
Small code segments may indeed be maintenance free.
If a large codebase that interacts with any other system/code (through shared libraries, a network connection, or a common file format) has not changed, that likely means there's technical debt that hasn't been addressed (or hasn't needed to be addressed yet).
You can continue to use most command-line FTP programs without problem. Since the terminal is a fairly stable interface, programs written long ago often have no problem compiling and running on modern systems, or only require small changes.
I wouldn’t actually say that, but it may be more of a problem of ecosystem maturity and priorities than it is one of interface stability: I work under Linux and love it, but getting a 20- or 30-year-old program to compile or run under it is surprisingly more painful than doing the same for a Win32 program of similar age, despite the ostensible stability and undeniably better documentation, including that of historical issues, in the former case. (To be fair, getting an old DOS program to compile or run anywhere except a full VM is an absolute nightmare, even if it can sometimes be offloaded to DOSBox maintainers.)
I would say it—getting a 20-year-old CLI program to compile and run is often not much trouble at all. Not always, but often.
Even if the program was never written for Linux, you can often get it to compile and run by fixing up headers and making some other minor changes to the source code. I've had plenty of success compiling and running programs that were only ever designed to run on IRIX or Solaris or other Unix systems.
SFTP is mainly a *NIX thing. And it's a terrible hack built on top of a protocol meant to be something really different.
Also, while in theory SFTP can be as secure as FTPS, in practice it's not. How many people really check that the server's public key fingerprint is the correct one? You know, that annoying message that appears the first time you connect to a server, which you have to accept or it won't let you continue?
Not checking that gives you the same security as an HTTPS/FTPS server with a self-signed certificate. You blindly trust the identity of the server, but there could be someone doing a man-in-the-middle attack and stealing all your data. In that situation FTPS is more secure, mainly because you need a valid TLS certificate, which gives you some guarantee about the identity of the server.
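For what it's worth, doing the check properly isn't hard. Here's a minimal sketch using Python's paramiko (hostname and username are placeholders) that refuses to connect unless the host key is already in known_hosts, instead of blindly accepting whatever key the server presents:

    import paramiko

    client = paramiko.SSHClient()
    client.load_system_host_keys()  # read ~/.ssh/known_hosts
    # Reject servers whose key we have never verified, instead of the usual
    # "accept and hope" behaviour most tools default to.
    client.set_missing_host_key_policy(paramiko.RejectPolicy())

    client.connect("sftp.example.org", username="user")  # placeholder host/user
    sftp = client.open_sftp()
    print(sftp.listdir("."))
    client.close()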
At least in the pre-Let'sEncrypt era, lots of the shared hosting providers that gave you SFTP as an upgrade over plaintext FTP also used self-signed certs for their HTTPS admin panels :D
Server identity keys can be checked using SSHFP DNS records signed with DNSSEC, but that is not really mainstream unfortunately.
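As a rough illustration (assuming the dnspython package; example.org is a placeholder), fetching those records looks like the sketch below. OpenSSH can consume them automatically with VerifyHostKeyDNS yes, but only a DNSSEC-validating resolver makes that trustworthy.

    import dns.resolver  # dnspython, assumed installed

    # SSHFP records publish host-key fingerprints in DNS; they are only a real
    # trust anchor if the zone is DNSSEC-signed and the resolver validates it.
    for rr in dns.resolver.resolve("example.org", "SSHFP"):
        print(rr.algorithm, rr.fp_type, rr.fingerprint.hex())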
Obtaining a certificate from a publicly trusted CA requires Internet connectivity for the ACME protocol, and that's at odds with the other security best practice of isolating internal systems like NAS devices well away from general Internet connectivity.
The problem is even worse for home routers. They need Internet connectivity to have a chance of obtaining a certificate, but since they provide that connectivity to a network they can't obtain the certificate until they're set up. But setup generally happens via a web browser and captive portal, so we're right in the middle of a bootstrapping problem.
https/TLS everywhere on the public Internet is a great thing, but it's not a reasonable expectation for private networks with private devices.
Virtually none of the DNS hierarchy is signed with DNSSEC, and it's unlikely it ever will be. It's a strange best practice to appeal to, since it's been categorically rejected by almost every security team in the industry.
There are lots of perfectly good (well, suited to their purpose, anyway) appliances (as in: pieces of hardware that cost money) out there that have web interfaces that don't speak 100% valid https. In part, this is because achieving fully-valid https on a machine on a local network, without a static domain name, with a machine & network config that you (the vendor) don't fully control, potentially without an Internet connection, and without installing root certs on client devices is, to put it mildly, inconvenient, and used to be even worse.
Getting rid of http or insecure-https support completely would render either them, or your browser, useless, and require that one or the other be replaced.
Lots of places like US federal institutions and universities use FTP to this day to distribute their open datasets.
Show me an example of actual FTP MITM hack in the wild.
Sure loading FTP resources from HTTP(S) context is not a good idea (as would be downloading executables over FTP), but did they actually make any effort to inform the public and owners of FTP servers? I do not think so, I haven't seen it.
Mozilla these days has very weird priorities. Their decisions should not feel so unilateral or "because Chrome does it". There should be more emphasis on widely understood infrastructure even at the cost of "soft" projects/campaigns [1] - these could be served by the EFF after all. I can't understand why shedding MDN was a good idea in their heads.
Even though I love Firefox and I think it's vital for the web, I can't really in good conscience support Mozilla when it keeps shooting itself in the foot time and time again.
> Mozilla these days has very weird priorities. Their decisions should not feel so unilateral or "because Chrome does it"
I agree. The attempts to be more and more like Chrome are especially confusing to me. Maybe they just want to copy what's popular but the thing they seem to miss is that if people wanted a browser that was just like chrome they'd probably just use chrome. The removal of choice, customization, and control over Firefox is what's going to drive people away. Those are the features that attracted most of us to Firefox in the first place.
> Lots of places like US federal institutions or universities use FTP to this day to distribute their open datasetes.
Then they can use an FTP client which will perform better anyways. This is Mozilla removing it from their web browser, not L3 black holing port 21 traffic.
Reminds me of Apple removing the headphone jack and being so proud of themselves for "being brave".
Just yesterday I found a link to FTP while researching something. Was pretty annoying to go get another FTP client up and running to get it.
Anyway, the movement away from unencrypted protocols to TLS-only is moving us closer to a fully censored internet. Sure, an unencrypted internet did not have any integrity guarantees, and thus was easy to censor (and worse) by totalitarian nation states.
However, a TLS-only internet is very easily censorable by our new global central planners (FAANG). This way, they'll have much more control than was available to the common MITMing nation state.
> While Mozilla removes it to delete a Malware vector and/or save programming resources.
Malware vector, really? When was the last time FTP was a major malware distribution channel as opposed to, you know, plain http? And I don't buy the "save programming resources" argument either. FTP is an old, simple and stable protocol, it's not like there's much need to touch that code.
Let's look at this from the other end of the horse.
FTP is a horrible kludge that needs to be deprecated. SFTP is better.
The number of ports needed, the holes punched in firewalls, everything sent in plaintext, the inability to traverse NAT without more kludges and hacky workarounds. We only tolerate it because it was once the only thing that worked.
There are better/newer methods that should be embraced.
We don't bemoan the death of Gopher or Finger, do we? Hell no. FTP does have its uses, but I'd dare wager that every single instance could be upgraded to SFTP and the world would move on.
Legacy, ancient apps that haven't been touched in 40 years; will break. Let them.
There's definitely a group of people that bemoans the death of Gopher: they've tried to resurrect it and even are reinventing it as Gemini. They do use TLS in there at least :)
I was sad to see Firefox drop support for Gopher too. I'm not very active in the gophersphere these days, but I guess there's a certain nostalgia in playing around with these old/deprecated/near-forgotten protocols, and a sense of comfort in knowing that they're still out there and being used in little corners of the net even as most people never stray from HTTP-land.
It's unfortunate that this had to happen, but the landscape around FTP is increasingly unexamined and part of a dead era. The effort required to build in robust SFTP support as well as convert existing FTP users to it is non-trivial, and not within the charter of a modern web browser.
Yes, you can build an FTP client in JavaScript. However, the browser won't let you escape the sandbox and open an arbitrary port ... but on Node.js or FirefoxOS (haha) there shouldn't be a problem, except for weirdness in standards interpretation :)
I don’t program web sites or web browsers, but with wasm or web sockets can’t you make a web page that opens a connection and sends arbitrary binary data to another server without any http?
No. WebSockets are a specific protocol tunneled over HTTP, not raw sockets, and WASM doesn't expose any functionality which wasn't already available to JS.
You could make something similar to FTP with that, but it wouldn't be interoperable. Web browsers just don't expose a generic socket api where you can specify ports, bind(), listen(), accept(), connect(), and so on. So you can't make generic socket clients or servers. XHR, websockets, webrtc, and so on might be able to connect to a specific port, but they send things that would cause errors, and can't send/receive arbitrarily crafted sequences of data.
Last I checked you'd still be speaking HTTP (even if it's "WebSockets"), not raw TCP. Certainly there's no UDP support. You'd need a translating proxy of some kind.
> but with wasm or web sockets can’t you make a web page that opens a connection and sends arbitrary binary data to another server without any http?
I can't speak to WASM, but websockets are literally just a layer over HTTP in some regards -- a websocket connection is initiated by sending an HTTP request with the Upgrade header.
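To illustrate (a rough sketch; example.org stands in for a real WebSocket endpoint), the entire "opening" of a WebSocket is an ordinary HTTP/1.1 request with an Upgrade header. Only after the server answers 101 Switching Protocols does framed traffic start, and the page never gets a raw TCP socket:

    import base64, os, socket

    # RFC 6455 handshake: plain HTTP/1.1 with Upgrade/Connection headers.
    key = base64.b64encode(os.urandom(16)).decode()
    handshake = (
        "GET /chat HTTP/1.1\r\n"
        "Host: example.org\r\n"
        "Upgrade: websocket\r\n"
        "Connection: Upgrade\r\n"
        f"Sec-WebSocket-Key: {key}\r\n"
        "Sec-WebSocket-Version: 13\r\n"
        "\r\n"
    )
    with socket.create_connection(("example.org", 80)) as s:
        s.sendall(handshake.encode())
        # Expect "HTTP/1.1 101 Switching Protocols" from a real endpoint.
        print(s.recv(1024).decode(errors="replace"))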
>> To date, many malware distribution campaigns launch their attacks by compromising FTP servers
Yes.. well, they can do the same by compromising servers that offer the payload via HTTP(S).
At least when the payload is ftp, it stands out and you can catch it in your gateway/firewall devices.
With https you now need https inspection at the border in order to be able to do that. These MITM devices do tend to cause a lot of trouble.
This argument reminds me of how Google refuses to show the server-provided login prompt when doing HTTP basic auth in Chrome (and as a consequence, browsers deriving from it). Some developer argued that someone could MITM the connection and make the basic auth login prompt say "please enter your YouTube credentials" or something like that. Yeah right, if you can MITM the connection, you could just as well inject a much more realistic Google login page. Basic auth really wasn't the actual problem at hand.
This is frustrating. FTP is really handy for distributing some files and there are lots of servers in place that now Firefox users can’t access.
One can argue that servers should upgrade, and that's valid. But they don't, and they likely won't; this just harms Firefox's user base and is one more reason I no longer recommend Firefox. They just don't seem as user friendly as they once were.
I would expect Mozilla to advocate for more FTP as a cheap way of distributing files.
Define "lots". Chrome dropped FTP support in late 2020 and basically nobody noticed. The vast majority of the remaining public FTP servers are also accessible over HTTP.
> I would expect Mozilla to advocate for more FTP as a cheap way of distributing files.
In what sense is FTP "cheap"? What makes it any different from HTTP in that regard?
Even accounting for that (ignoring that the data served over FTP and HTTP aren't always perfectly equivalent), it still leaves a lot of stuff that is only served over ftp.
I used to agree with this, but with how easy it is to run MITM attacks these days I think there is a pretty compelling argument for removing plaintext FTP. I don't think FTPS should go, but tbh I don't think I have ever encountered an FTPS server in-the-wild.
It's one thing to have your password stolen, but another thing entirely to have your download and its shasum/md5sum/whatever sidecar file replaced in-flight.
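A quick sketch (filenames are hypothetical) of why the sidecar doesn't save you when both files travel over the same plaintext channel: the checksum only proves the file matches the sidecar, not that either one is what the server originally sent.

    import hashlib

    def sha256sum(path: str, chunk: int = 1 << 20) -> str:
        # Stream the file through SHA-256 without loading it all into memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while data := f.read(chunk):
                h.update(data)
        return h.hexdigest()

    # If download.iso and download.iso.sha256 both arrived over plaintext FTP,
    # a MITM can rewrite both, and this comparison will happily succeed.
    expected = open("download.iso.sha256").read().split()[0]
    print(sha256sum("download.iso") == expected)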
I'm a Firefox user, and Firefox doesn't support torrents. Yet, I can access torrents. Because of course you can have dedicated protocol handlers that do the job better than the download manager baked into the browser.
Sure, there might be a user that doesn't know how to get a good FTP tool. But how many FTP servers are they accessing? Probably not enough of those to justify the maintenance effort.
FTA: "To date, many malware distribution campaigns launch their attacks by compromising FTP servers and downloading malware on an end user’s device using the FTP protocol."
It removes a malware vector going through Firefox.
But seriously, who's serving FTP but doesn't serve HTTPS?
People made exactly the same arguments, almost to the letter, about universalizing TLS. The arguments were made in good faith, often by very clueful people, but it's hard to dispute the positive results of the change. FTP needs to go.
You saw the same arguments when the Python Cryptography library started adopting Rust to replace memory-unsafe C code in their C library. People running, like, DEC Alphas in their basements for sport were furious. It was on the front page of HN for several days. It blew over, because nobody really cares about those people in a durable, meaningful way.
Same situation with FTP. It's dead. Stick a fork in it.
Absolutely not the same situation. There are numerous mentions in this very thread about industries / sectors that still heavily rely on FTP. There's hundreds of millions of dollars worth of contracts that rely on data being transferred using that protocol. This isn't people enjoying retro computing. This is day-to-day operations.
And they'll make do without Firefox and Chrome, just like the DEC Alpha in the basement didn't explode when the Cryptography team moved away from C.
For whatever it's worth: those industries should not be relying on FTP. FTP is bad. But I'm not advocating to ban it from the Internet; rather, I'm just saying, nobody should make software security compromises of any sort to continue supporting it.
While we are at it, let's remove the domain name and replace it with something readable and maybe just default to Google without the ability to change your home or maybe Facebook. /s
“Not on the internet IoT” is basically the domain of either large industrial/commercial entities who already pay engineers to design and operate their gear (and for whom there are a number of viable internal-PKI platforms) or hobbyist tech people who want to do fancy segmentation of their IoT gear (and for whom there are a host of open source PKI helpers).
The general human in 2021 who buys IoT gear puts it on their Wifi and goes back to other things.
> The general human in 2021 who buys IoT gear puts it on their Wifi and goes back to other things.
Fallacy of the "general human" aside...how do you configure it? How do you configure your Wifi in the first place?
The app- and service-centric world that people have been forced into by the laziness of developers, the desire for surveillance data, and the deprecation of browser features, is the worst of all possible worlds.
Devices have full network connectivity, so that security camera you bought becomes part of a botnet, hacks your laptop, and installs ransomware or steals your financial info/bitcoin wallet. Companies control your house and your data, so when they disappear, or when Nest pushes out a bad firmware update, your devices (and your thermostat!) stop working.
Technology is realizing only a small fraction of its promise, and rather than empowering people, is acting as just another set of shackles that binds people to the whims of the powerful. The best instruments for changing that, namely web browsers and truly independent, user-owned devices, are being destroyed one small step at a time.
So, in 2021, the question stands, and you've made no attempt to answer it -- how is any device supposed to break out of this sinkhole and restore the power of technology, if browsers block local devices' UIs, and users can't even configure the device without the blessing of Google or Apple?
I think we took a left turn midway between my comment and your reply. You asked how people should handle universal TLS for internet-disconnected systems.
The answer I gave addressed that directly: there are commercial and open source tools for doing so (MSCA, Vault, EJBCA, smallstep, FreeIPA, to name a few). But the overwhelming majority of actual individual users do not desire to segment their IoT devices off the internet. That’s not a “fallacy”, that’s just a fact.
It’s clear that you have objections to the current state of general purpose computing, and desire that technology existed differently. But that’s a pretty far step away from the topic here.
I have been using Firefox since Phoenix/Firebird, so nearly 20 years, and have never once used its FTP feature.
I've used FTP, fairly heavily back at an old job that required it, but I have an FTP client. They are a dime a dozen for every platform. But I haven't used FTP at all in at least a decade.
Mozilla should focus their efforts in their web browser on web browsing. If you need to FTP, Gopher, or torrent over the internet, you can grab a client that does those things.
For one thing, with HTTP you can't manage files over the same protocol you use to serve them without extra application-layer add-ons (WebDAV or custom web file-management junk). FTP can both serve and manage, all over one protocol.
"But why wouldn't you use some other method to manage your files? Why combine the two?" I dunno, but WordPress is basically that (managing your blog's/site's appearance, content, and server-side plugins, over the same interface/protocol that serves the blog/site) but for blogs & websites, and it's damn near the most successful Web project ever, so there must be something to it unless that's not a big reason for its success (and I'm pretty sure it is).
I can certainly see the appeal if your main focus is serving files, or providing file-serving hosting to others (say, other departments, or to paying clients, or whatever). One daemon to configure for the whole task.
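For a concrete sense of what "serve and manage over one protocol" means, here's a minimal sketch using Python's standard-library ftplib (host, credentials, and filenames are made up): listing, uploading, and deleting all happen in the same session that serves the downloads.

    from ftplib import FTP  # standard library; plaintext FTP, illustration only

    # Hypothetical host and credentials -- the point is that browsing,
    # uploading, and deleting all go over the one protocol.
    with FTP("ftp.example.org") as ftp:
        ftp.login("user", "password")
        print(ftp.nlst())                          # browse the directory
        with open("report.csv", "rb") as f:
            ftp.storbinary("STOR report.csv", f)   # upload ("manage" side)
        ftp.delete("old-report.csv")               # delete, same session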
This makes me a little sad, but only a little. I honestly do not remember when I last accessed an FTP server from a browser.
I do use FTP every now and then, but I do so from the command line or file manager like mc (or far manager when I am on Windows). Even there, it has been declining steadily, though, because ssh/sftp works pretty well as a drop-in replacement, unless one of the endpoints is so low-end the encryption becomes a throughput bottleneck. But it's been many years since I've had that problem.
I think this is a bad idea - many URLs are still encoded as ftp, e.g. when downloading tarballs, zips, and such. Does this mean clicking an FTP URL will now require launching an external FTP app instead of downloading the file? I'm OK with FTP browsing being suspended in the browser, but abandoning FTP altogether is a bad idea. It's an old protocol and still very useful, albeit maybe not in its unencrypted state, but it does do a good job of what it's supposed to do - namely, transfer files.
I'm not sure I buy that there are that many links encoded as FTP that you'd hit during normal browsing. Chrome dropped support a year or so ago, so any website that depended on FTP has already required a separate FTP client for 85% of users.
It also doesn't really do a good job of transferring files - the protocol is slow and is incompatible with lots of firewall setups.
Mozilla is the last organization keeping us from the complete takeover of the web by Google/Webkit and they seem to be held to an impossibly high standard by users. There's no FTP support in Chrome, there's terrible support for Adblock, there's telemetry, etc, but somehow Mozilla has to be 100% perfect.
This is why we can't have nice things and why the internet is going to become Chrome-first.
Perfect? No. Living up to the expectations and principles [0] that they themselves established? Yes.
[0] Principle 6: The effectiveness of the internet as a public resource depends upon interoperability (protocols, data formats, content), innovation and decentralized participation worldwide. https://www.mozilla.org/en-US/about/manifesto/
Why is FTP all of a sudden an important interop protocol when we have HTTP, HTTPS, rsync, scp, etc? Why is the browser responsible for maintaining this legacy FTP support when there are command-line utilities and dedicated GUI clients for it?
The issue isn't that FTP is all of a sudden an important protocol. It has been chugging away in the background without much fanfare for decades. The issue with Firefox removing it largely comes down to convenience: it's been included in browsers for so long that it's now the primary means (in some industries) for non-technical users to access files.
This whole ordeal kind of reminds me of IE8. Whether good or bad, companies stuck to what they knew and what tools they used to carry out their day-to-day. I can easily see updates being avoided to keep FTP functionality at the expense of newer security issues being patched.
Is it really the responsibility of Firefox to maintain FTP support when everyone else has dropped it, just so that a tiny fraction people don't have to learn a new tool and/or modernize infrastructure that's out of date?
I wouldn't say they're responsible for maintaining it. But I think it's fair to question their decisions. As I mentioned above, one of their principles specifically calls out that the internet depends on interoperability and various protocols.
I don't really understand the argument that if everyone else drops support that Firefox should too. Firefox could champion themselves as continuing to support various protocols and live the principles they set out for themselves.
Yeah, FTP support should be baked into the operating system. In KDE it's simple to open an FTP link in Dolphin as if it were a local folder, and I think it's the same for GNOME, but I have no idea whether it's possible in macOS and Windows.
The World Wide Web (WWW), commonly known as the Web, is an information system where documents and other web resources are identified by Uniform Resource Locators (URLs, such as https://example.com/), which may be interlinked by hyperlinks, and are accessible over the Internet.
The web is HTML and HTTP. URLs to HTTP resources are links to things on the web, URLs to non-web resources are links to things on the Internet, maybe... maybe not, maybe links to other computers in your home, or files on disk.
Check out NCBI, terabytes of public data are being shared on FTP servers. There are other protocols/APIs, but FTP has been a much better experience in many use cases. Those files are also linked all over the place in NCBI pages and elsewhere.
Does FTP really offer a better experience in 2021? I'm pretty sure the last time I saw better performance from FTP was in the previous century, and having built-in error detection and recovery, which FTP lacks, is pretty nice.
In my experience with NCBI it's better mostly because other tools or schemes provided by them kinda sucked, but FTP servers were there since forever and grew to be well organized and stable. Outside of this niche, no.
FTP might be dying in other sectors, but at least in biomed research, I would use FTP anyday when the alternative protocol needs a client app and a bunch of configurations. It's also handy when sharing data.
Browser support is important here because those files are often not explored from the command line; rather, the FTP links are placed on individual pages as quick downloads. At least for me, it's much more convenient to click and wget than to read a page and then switch windows to query from an API/client...
FTP is still heavily used in the energy industry too. We upload an enormous amount of minute-by-minute meter data at work via FTP to various energy providers.
I'm not sure if you're trying to be antagonistic with this reply or not. I also didn't feel it was important for me to outline the entirety of the data pipeline -- I merely wanted to point out that FTP is still in heavy use in some industries.
To address your concerns though: meter data is not ingested via FTP. That's done using other protocols. It's transferred b2b via FTP over private tunnels. And you're correct Firefox is not used for uploading data. If it were possible it wouldn't even be a good choice given the volume of data we deal with. It is, however, used heavily for accessing the uploaded data by Operations and other teams.
The former two (rsync, scp) are a lot more complex than ftp(s), since they both rely on at least some subset of ssh, plus some other stuff. They're not really suited for safely linking to unauthenticated downloads, either.
https doesn't let you also manage your files with the same protocol/daemon without other stuff on top of, or alongside, it.
> When is the "complexity" of SSH ever a problem? If you need a tiny implementation for an embedded device, use dropbear instead of OpenSSH.
We're in a thread concerning the removal of FTP from Firefox because having extra code around you don't strictly need is, the argument goes, expensive and dangerous. Given the context, I don't think any justification for extra complexity being, per se, something worth worrying about, is needed.
Maybe I should've put it this way: FF (sometimes?) won't turn an ftp URL into https for you, so I had to manually correct the URL whenever it happened to start with ftp://ftp.*. And outside of NCBI, when occasionally sharing >10 GB of data, FTP (because of inertia) has been handier than the more proper ways, and the server might have no HTTPS support.
I head a team of 20+ brilliant software engineers who do great things on a daily basis. When I'm asked "What kind of code do the best people on your team write?", here's my canned response: "The best engineers are deleting code, not writing it."
For a software project of Firefox's size and age, deleting obsolete or redundant code is universally good. It is a hard but necessary task. I am okay with completely stopping my use of FTP for that cause. Or, eventually, firing up Chrome, FWIW.
I remember way back in the day, with ftp.cdrom.com [1], this was an important browser feature. Nowadays it's totally pointless, and it's mind-boggling that anyone still uses FTP at all (SFTP should be used now). What's more, HTTP is resilient for resumable file downloads thanks to Range header requests. Also, people can develop an extension.
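As a rough sketch of that resumability (URL and byte offset are made up; assumes the Python requests library), the client simply asks for the bytes it doesn't have yet and appends them:

    import requests  # third-party library, assumed installed

    url = "https://example.org/big-file.iso"  # hypothetical URL
    have = 1_048_576                          # bytes already on disk

    # Ask the server to resume where we left off; a compliant server replies
    # 206 Partial Content and sends only the remaining bytes.
    resp = requests.get(url, headers={"Range": f"bytes={have}-"}, stream=True)
    print(resp.status_code)                   # 206 if the server honours Range
    with open("big-file.iso", "ab") as f:
        for chunk in resp.iter_content(chunk_size=65536):
            f.write(chunk)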
Didn't Chrome remove FTP first? Is Firefox just following suit? Need to check.
Mozilla's explanation/justification for removing FTP is quite flimsy. It presumes there could never, ever be any possible situation in which a user wants to use a browser for FTP, whether now or in the future. It just does not add up. There are no specific references to FTP-based exploits or other examples of how FTP is harmful. Who uses FTP to transfer unencrypted files containing sensitive data over the open internet? FTP can be useful for stuff that is not sensitive, and for transfers over the local network between devices (no internet connection required).
It makes sense to remove FTP if the web is just for advertising and sales. Why would any "consumer" need FTP?
Fortunately the text-only browser I use is probably not going to remove ftp. But any decline in ftp use that results from the decisions of these advertising-dependent organisations is concerning.
I suspect a likely reason is that, while secure FTP protocols exist, they are often poorly supported by servers and web browsers alike. So while FTP can be secure, the implementation in practice often is not, and Firefox and Chrome are both pushing a "literally everything over HTTPS" mentality. (I have significant reservations about this concept, but in a world where they're shoving your DNS requests through HTTPS, it isn't surprising that they also want file transfers to go through HTTPS.)
You can see here where the GUI didn't support FTP over SSL, and then eventually got marked WONTFIX because they decided to deprecate FTP entirely instead: https://bugzilla.mozilla.org/show_bug.cgi?id=85464
Firefox 90 has terrible usability with the new "Proton" interface. Has no one read The Design of Everyday Things? I just don't understand what they're trying to do - they need to keep and grow their user base, not push away the few users they have left.