32 years ago I had just finished my "erikoistyö" (a pregrad exercise) in CS at the Helsinki Uni about combining object-oriented programming with relational databases and uploaded it to nic.funet.fi for all to see and enjoy - I was that proud of it. Even promised to send a 1.4MB diskette for those who couldn't download it for whatever reason.
Only curiosity value is left probably, but back then it felt like magic to be able to publish something like this on my own. Half a dozen people even asked for the diskette, which I sent to them.
OOP was a mistake. Functional programming has some interesting ideas, but it's not useful in practical applications. Procedural programming is the simplest and fastest way to make the computer do things.
OOP is how large-scale procedural programs are organized. It allows programmers to try to reason about medium-scale parts of the system. (Mesoscale programming? Is that a thing yet?)
FP is a set of guarantees (maybe even enforced by the compiler) about program behavior which enable optimizations, for the compiler, and reasoning about small parts of the system in isolation, for the programmers.
Enough FP can allow you to abstract away the procedural parts, but even a little FP in a procedural codebase can be a good thing. Ditto a little OO, which is why C programmers reinvent bits and pieces of Smalltalk every so often.
I mean, it drives all of Whatsapp, Discord, and once upon a time, Facebook Messenger, but it's certainly not "useful"!!
Your claim is as ludicrous as ludicrous can be. About the only thing it's not (yet) good for is 3D game dev, but Rust (which is not purely functional but which has lots of opt-in functional concepts) is making good headway.
Considering that literally everything is easier in it (EXCEPT for SOME of its conceptual bases), I'm surprised it's taking this long. Modularity is easier, testing is easier, there are no mutation bugs, no mutex locks, you understand program flow better (which means fewer bugs), you just produce fewer bugs per LOC... just scratching the surface here
Because 7-bit ASCII didn't include the accented characters used in many European languages, there were national variants. The Finnish variant replaced {|}[\] with äöåÄÖÅ.
Which, btw, is why those symbols are acceptable as IRC nicknames (and why { is lowercase [, i.e., {some|one} and [some\one] are two equivalent nicknames). IRC was invented in Finland.
One might guess the people who decided on this replacement were not Unix programmers...
The lack of brackets, caret, tilde and other ASCII special characters on various localized keyboards was a real problem in the 1980s. The C language standard solved it by introducing trigraphs:
Of course you could still use {} etc, they just might show up as localized characters in your source code. There was no character set conversion involved at the source code level, your terminal font just might have had the glyph for ä in the place of {.
The people who designed the Finnish keyboard layout were definitely not programmers, though: https://kbdlayout.info/KBDFI/
Ditto with the Spanish (es) layout; the layout looks more apt for journalists and writers than programmers. I just switch to the US keymap with the compose key bound to the right menu/right Win key, so I can type áéíóú with compose key + ' + vowel (not pressed at the same time). Ñ is more cumbersome (compose key + ~ + n) but I can adapt XCompose under BSD/Linux for that.
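For what it's worth, a minimal ~/.XCompose sketch for that; the double-n shortcut is my own invention, not a standard binding:

```
# ~/.XCompose
include "%L"                            # keep the system Compose defaults
<Multi_key> <n> <n> : "ñ" ntilde        # Compose, n, n → ñ
<Multi_key> <n> <N> : "Ñ" Ntilde        # Compose, n, N → Ñ
```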
Wouldn't claim so - perhaps the ideas were floating in the air. What I know for sure is that my work wasn't used for much.
What's more alarming is that those 32-year-old files at ftp.funet.fi seem mostly unreadable by now. Back then I thought PostScript would last, but alas! that is not the case. Ghostscript can show just about the cover page and that's all.
LibreOffice does a little bit better with the DOC file, but it's still not quite right.
So if there is anything to learn it's about persistent document formats. I wish I had known about LaTeX back then.
I'd say that assuming 'he' might be a US thing - 42 years ago when I was in Australian university math | comp sci classes a third of the students were female as were staff.
Even then I routinely used 'they' when writing about people in general, authors I had not met, etc. as there was a good chance they weren't male.
It used to be taught that the singular "they" was ungrammatical. (Ironically, the singular usage predates the plural.) The rule faded in other parts of the Anglosphere a bit earlier than in the US.
While being no expert on the historical development of the English personal pronouns (I do read some Old English and maintain some fluency in the modern ditto; not my first language), the linked Wikipedia page clearly states the opposite: singular they came into use after the plural use.
This is a minor nitpick, as I suspect that third person personal pronouns were in a state of flux during the Middle English period, replacing some inherited pronouns with pronouns borrowed from Old Norse. More so, language isn't defined by its history but by how it is used presently!
I myself wouldn't use singular they, as it goes against my "language intuition" (probably formed by my native language, which wouldn't allow that construction); others, feel free!
Wikipedia reinforces my understanding... it's been in common use for centuries and only relatively recently have a few dipshits declared it to be "wrong"
> Singular they has been criticised since the mid-18th century by prescriptive commentators who consider it an error.
Who d'fuck gives a toss about prescriptive gammons tellin udders de write ways to use da Engrish, 'hey?
FWIW, the Oxford English Dictionary is descriptive, not prescriptive.
I was 6-7 at this point in time but I went from DOS to Win 3.1. I don't remember ever hearing about Win 3.0 and a quick google search makes it look like Win 3.0 and 3.1 were drastically different for some reason that I'm not really tracking down. I wonder why my dad held off until 3.1.
Downvoted for literally being curious to a Win API developer as to why 3.0 and 3.1 were so drastically different. I have a feeling the people who downvoted me didn't even go through this code base and realize how specific to Win 3.0 it was. Good on ya, mates. Keep reading only headlines.
It is so wonderful that we have repositories / archives like FUNET. So much history can be found in one place, along with everything we need to (re)experience what things were like back in the day.
As someone who runs an Aminet mirror (us3.aminet.net, which happens to be hosted on a real Amiga), I'm always grateful and appreciative we have resources like these that aren't based on popularity or on the OS du jour.
Some of the appreciation should go to the academic roots, culture and tradition of the early internet. The internet originated at DARPA but many of the earliest participants and adopters were academic institutions.
FUNET is the Finnish University and Research Network. They provide backbone connectivity and networking facilities to universities in Finland and have done so for decades. They've also run the public FTP archive (actually HTTPS by now) since 1990.
It seems to me that at the time, providing a server that distributed freely distributable and open source software was part of an academic culture of sharing and of providing a public good. (The free software movement also has its roots in the academia. Of course "open source" as a term didn't exist back then, but some of the culture did, without the commercial connotations of open source necessarily.)
In today's rather commercialized world, I appreciate it that a public institution still runs such an archive in a similar spirit with no direct commercial interest. (FUNET is run by a state-owned enterprise.)
Totally agree and thank you for being so perceptive! It was swell to hear someone say aloud "academic culture of sharing and providing for public good". I think that's what humankind would be wise to aim and seek for: equality of all and caring for the welfare of the weakest.
University education does not need to be expensive. On the contrary, it can be free.
Not FUNET, but from IBIBLIO I've got the Sun multimedia sounds for notifications on my machine, along with herbe. I use beep_casio.au for some calm and unobtrusive sounds, for instance with an IRC client on messages, or for SPT (simple pomodoro technique) to stop/continue working.
They used to make BD-Rs using this technology, but switched to a cheaper (supposedly not as long lasting) method without any change in branding. It was a minor scandal among datahoarders.
The SUNET Archive began its life in 1990 as an ftp archive created by Lars Gunnar Olsson of the IT-Department at the Swedish University of Agricultural Sciences, or SLU, in Ultuna, a few kilometers outside of Uppsala.
The archive became a SUNET facility in 1993 and was assigned the name ftp.sunet.se. In the SUNET newsletter SUNETTEN from 1993 it is noted that the archive already contains 4 GB of data and there is room for another 4 GB.
By 1994 the SUNET Archive was ranked among the largest and most visited archives in the world. Its total storage capacity was then 28 GB.
I faintly remember when, around the turn of the century, the Irish equivalent, HEANET, put in an Intel Itanic server for a similar purpose (or was it two?). I hope someone will correct me if I remember wrong, but it had an absurd amount of memory, like 32GB.
FTP is such a horribly dated protocol; it's actually older than TCP!
The NAT issues are well known, but resolved in a standard way. However, the intersection of {features defined in RFCs} and {features implemented in FTP servers} is much smaller than either set alone. Many useful things are implemented outside of the spec, and most of the spec is not implemented in servers.
There were several application protocols in use at the time of the TCP cutover (January 1983), including ftp and telnet. We didn't just throw them away, we ported them over from TCP's predecessor NCP. The one we did throw away was email, which was not a separate protocol, but was implemented as part of ftp. We got rid of that and replaced it with smtp. But the new ftp/tcp servers still supported email for several years as a transition.
The "FTP" archive is also served over HTTPS nowadays, and has been for a long time. The hostname may still be ftp.funet.fi but there's an HTTP(S) server listening.
I'm actually a bit surprised that they do also seem to still run an actual FTP server there as well.
If you like working with old machines, you quickly learn to appreciate the availability of FTP servers. Encrypted connections are quite CPU intensive, particularly when negotiating the connection.
If you're working with something old enough for encrypted connections to be "CPU intensive", you're probably not talking to anything remotely current anyway.
TLS 1.3 is out there
TLS 1.2 is ubiquitous
I cannot recall the last time I saw anything older than TLS 1.1 anywhere - TLS 1.1 on down into the various SSL specs are all deprecated/unsupported[0]
TLS 1.1 and 1.0 have been deprecated for 2 years, SSL3 since 2015, SSL2 since 2011
If you're running something new enough to support TLS 1.1 (at least, 1.2 or 1.3 are much better), then the encryption overhead is very tiny
The list of TCP and UDP port numbers [1] is a treasure trove of historical artefacts. It's amazingly hard to find information on e.g. "compressnet", which is wasting port 2 and port 3 for eternity.
I was skeptical at first but found evidence to corroborate it's the same thing, the key fact being the company named Process Software.
The reference in RFC1340 for it says "Bernie Volz" of process.com. Searching on that led me to a Web-0.9 style page with the text "compressnet/tcp is a protocol that is used to compress TCP connections for WANS. It used two ports to compress tcp conections, one to send the compressed TCP connection data through, and one used for the management of CompressNet. This is a commercial product only."
It is not bad design, at least in a world without firewalls or NATs. It potentially allows things like one machine setting up file transfer between two servers. Not sure if this was done, but could be done.
A lot of the old stuff was rather interesting design that we have really moved away from in how our current networks are built. Like multicasting and broadcasting for actual content.
"It potentially allows things like one machine setting up file transfer between two servers. Not sure if this was done, but could be done."
Well I certainly recall doing it back in the 90s.
I vaguely recall it was sending PASV to one server, taking its response values, and providing them in a PORT command to the other server. Then sending the RETR and STOR commands respectively to the two servers.
It is much more complicated than using two ports. There are two modes of ftp, active and passive.
In active mode the client connects to the server on port 21. It then issues a PORT command which tells the server which port on the client to connect to for data. The server connects to destination port on the client using port 20 as the source port.
In passive mode the client connects to the server on port 21 and issues the PASV command. The server chooses a random port for data and responds with a 227 reply indicating this port. The client then connects to this destination port on the server, using a random high port as the source port.
early network engineers were often rude and insulting. It was common to be berated about "vi or emacs" or other dense topic. Communication between engineers was often verbal commands or mild insults like this one.
> early network engineers were often rude and insulting. It was common to be berated about "vi or emacs" or other dense topic. Communication between engineers was often verbal commands or mild insults like this one.
Indeed; try suggesting any circuit-switched solution with Vint Cerf in the room and see what happens!
I have a spiel about "the old days" (pre-2000) internet, when the Web wasn't the only protocol. How we had FTP, Archie, Veronica, WAIS, Gopher, Telnet, Finger, etc...
I'd bring up all the great FTP sites I remembered:
spies.com (and "wiretap")
funet.fi
sunet.se
monash.edu
ac.oak.oakland.edu
I feel like there were a few other great repos out there, but those are the great ones that stuck in my memory.
Archie, X.500, ftp, gopher, telnet, irc, mailing lists, USENET, WAIS, WWW (remember to use WWW clients), a caching WWW server for Funet members, Alex ("Global filesystem for all anonymous ftp sites with caching (experimental)")
You could telnet into the system with username info and use command-line clients for some of these services if you didn't have access at your home university.
I vaguely remember using an email-based service to access web pages: you sent the URL to an address and got a reply with the page contents rendered as text. That probably wasn't Funet but something else.
tsx11.mit.edu, where lots of unix and early linux binaries could be found.
Unfortunately, university servers also retire, and with them we lose the digital history they contain. Shout out to funet.fi and archive.org for taking good care of old files!
I should really get around to moving to iki.fi as my email address. It is a forwarder designed to last forever, which makes a lot of sense with companies acting the way Google does.
Better to be beholden to a non-profit that has pursued the same mission for a long enough time.
>It runs on a Linux server with dual 20 core processors, 786GB of memory and 80+TB of NetApp NFS storage. It has a 2 x 25Gbit/s connection to the Funet backbone.
Imagine an IT expert waking up from a coma they had been in since 1991 and seeing this.
I fondly remember using sites like this to download stuff (ok, apogee games mostly!) in the 1990s. I could FTP to the university VAX which would exceed my 'soft quota' of disk space every time, then I'd have to get the files onto a floppy and delete them before my account locked up.
Does anyone know if the SunSites are still mirrored anywhere? There used to be a dir called 'programming' full of weird and whacky languages which I'd love to see again. For example there was one called ALLOY by Thanasis Mitsolides. His PhD is still available on the internet but the code in usable format is not, as far as I can see.
As someone just getting access to the net in '94/'95 and also learning programming via Turbo Pascal around that time, ftp://x2ftp.oulu.fi was a lot more important! Now sadly gone.
I lived in Oulu and would go to a local library to download game programming tutorials for Turbo Pascal from x2ftp, not quite realizing that the library was pretty much half-way on the direct line connecting my home and the location where the x2ftp server was located. The name, as I heard afterwards, came from it being hosted and managed by the association of Computer Science and Engineering students of the University of Oulu (https://otit.fi/en/kilta/) in a bomb shelter under the main campus building, stairs down from the entrance designated as X2.
One of my favorites as well! I had a poke around and it seems there's a downloadable archive of it, though at "only" 500MB I'm not sure how complete it is: https://archive.org/details/X2ftpArchive
There's a mirror available at ftp://foto.lu.lv/pub/mirror/x2ftp/msdos/programming. Only via FTP, as far as I can tell. It works well with e.g. Filezilla. Log in as anonymous...
I believe it might be just because it is really easy to incorporate an association in Finland and an association that is a non-profit organization founded for public benefit will be considered tax exempt by default. Many of these services have been run by the incorporated associations as opposed to just a loose group of individuals or a single person.
FUNET, though, is the Finnish university and research network, and thus I believe the Funet FTP server has always resided on a server funded by the government but still mostly maintained by volunteers.
I think... both for Finland and Sweden: We wanted to stand out internationally. And there were lots of clever people in the right places early on. And there was money to spend on that exciting new stuff, after the first few early press rounds.
But since this is on top of HN I hope someone who actually knows will show up soon. :)
Still to this day my default address I use when pinging just to see if a device can talk to the internet. If my packets can make it to Finland they can go anywhere.
This was probably the first website I regularly used when I got internet (late 90s) - because it had an awesome Commodore 8-bit archive. That archive is now hosted on another domain ( http://www.zimmers.net/anonftp/pub/cbm/ ) even now it's still an awesome resource for Commodore fans.
That might be true for the individual RFCs, but the index.html for that folder sure takes its time to load, which isn't surprising given the roughly 10k RFCs, each of which is present in 5 different file formats. I remember having worse scaling issues in Windows Explorer back in the day when trying to navigate into the folder of Ralf Brown's interrupt list, which also contained more than 10k files. But that took several minutes to open, not seconds.
anon.funet.fi was an anonymous remailer popular with the phreak/hack scene that was shut down around 1996 after Mitnick was arrested and the attitude towards prosecuting hacking became far more militant.
It's a common acronym across several different networks, especially in the early Internet. I am not exactly sure but I believe it stands for Network Information Center. FUNET is the Finnish UNIversity NETwork, i.e. the early research network that first connected Finland to the internet when it was still mostly a tool for researchers to connect and share data.
NIC.FUNET.FI would've been the central directory server where you found other services on offer across the larger FUNET, the hub. Mind you, search engines would not exist for several more years in 1990.
FTP is not a very good protocol in my opinion. HTTP, Gopher, and Gemini are better designed (although optional TLS would be better than mandatory TLS).
Nevertheless, for use with computers that have FTP but not the other protocols, it can be helpful.
Can't help feeling good seeing it's still there. https://www.funet.fi/pub/sci/computer/oop/