EMule 0.60a (emule-project.net)
277 points by ciccionamente on Sept 3, 2020 | 183 comments



That brings back nostalgia. I remember vividly being active on various eMule mod forums back in ~2005. There was a very active scene back then, with some eMule mods that really pushed things further: MorphXT, Xtreme, StulleMule, eFmod, Sivka, ionix, NeoMule and ScarAngel come to mind.

So much drama with leecher mods ("leecher" had a negative connotation in the ed2k world) that tried to download more than they uploaded, or that uploaded only to clients which had proven to pay back with higher speeds. But there were also really innovative ideas, like downloading from clients that ran a thin HTTP server inside the client and leveraging your ISP's proxies (the Webcache/Peercache feature) for higher download speeds. And so many discussions about where the network should be heading, with the main client devs famously mute on those discussions and all the drama.

The main benefits of eMule/ed2k, compared to torrenting, were file discoverability (serverless, thanks to Kad search) and the longevity of files. It was actually relatively hard to find a file that was dead, because people kept sharing as long as they were downloading, and often much longer. Download speeds were much slower than with torrents, even back then, and that was actually an advantage because it kept so many files alive. Great times that basically ended because of „p2p sheriff“ companies, torrents and the dawn of one-click hosters like Rapidshare that promised privacy and better download speeds.

To this day I am wondering whether eMule would still have an active userbase if it had some sort of mechanism that (via opt-in) allowed clients to download only from trustworthy „friends“ and pipelined those downloads from the „public“ ed2k network through these friends and their friends' networks. Yeah, it may be a bad idea, but to my knowledge it never really got explored.


I loved, with eMule and/or Kazaa (I don't remember anymore which one), the feature of being able to get the list of everything the remote host was sharing => my idea was that if that person was sharing something I liked, then they might have other interesting stuff I'd like too => from time to time that was the case.

1 or 2 years ago I installed it, connected to the network (nodes? meganodes? forgot the terminology), and enabled somewhere the option to be verbose about incoming search queries => A LOT of nasty porn stuff being searched for (I therefore quickly shut down and uninstalled everything, brr) => it's a pity, but thinking about it later, maybe they were just automated queries or similar, submitted by police & Co. to search for illegal videos, who knows.


> p2p sheriff

My trick back in those days was to make an inverted filter. Instead of trying to blacklist the sheriffs, I white-listed peers.

Back then I was with an ISP that had an unusually large number of "peering" connections that were unmetered -- they wouldn't count towards the monthly download quota.

I tracked down the IP subnets for each peering company and then wrote a little program that would block everything except peered network ranges. This worked surprisingly well! Despite filtering out 95% of the Internet, my connections were short-range and high bandwidth, so I actually had slightly better download speeds -- and no more scary emails.

I did feel bad about one thing though: I couldn't figure out how to invert the subnet list efficiently, so I made a 4-billion entry bit vector (just 512MB!), flipped the desired ranges, and then exported from that.
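Something like this rough sketch captures the trick (hypothetical Python, not the original program): allocate one bit per IPv4 address, set the bits for the whitelisted ranges, then walk the complement and emit it as block ranges.

    # Hypothetical reconstruction of the bit-vector inversion, not the original code.
    import ipaddress

    def blocklist_from_whitelist(allowed_cidrs):
        bits = bytearray(2**32 // 8)  # 512 MB: one bit per IPv4 address
        for cidr in allowed_cidrs:
            net = ipaddress.ip_network(cidr)
            for ip in range(int(net.network_address), int(net.broadcast_address) + 1):
                bits[ip >> 3] |= 1 << (ip & 7)  # mark address as allowed

        # Every run of zero bits becomes one entry in the blocklist.
        ranges, start = [], None
        for ip in range(2**32):  # slow in pure Python, but it gets the job done
            allowed = (bits[ip >> 3] >> (ip & 7)) & 1
            if not allowed and start is None:
                start = ip
            elif allowed and start is not None:
                ranges.append((ipaddress.ip_address(start), ipaddress.ip_address(ip - 1)))
                start = None
        if start is not None:
            ranges.append((ipaddress.ip_address(start), ipaddress.ip_address(2**32 - 1)))
        return ranges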

I can feel my CompSci university professors shaking their heads in shame.


Yes, on eMule you could find cool and rare stuff just by searching, because users basically shared whole folders.

I miss that (well I guess this post says I can still use it).


>To this day I am wondering whether eMule would still have an active userbase if it had some sort of mechanism that (via opt-in) allowed clients to download only from trustworthy „friends“ and pipelined those downloads from the „public“ ed2k network through these friends and their friends' networks. Yeah, it may be a bad idea, but to my knowledge it never really got explored.

This is what RetroShare does.


Please be aware that the most popular eMule server software available many years ago was backdoored. That's why it was distributed as a binary, without source code.


Source? I remember trying it last year for old time's sake and I'd like to know in what way it could've compromised the OS.


My own reverse engineering. There was a hidden packet in the protocol implementation that opened a shell for you.


That would make a great article for HN.


Indeed it would! grzaks, please email hn@ycombinator.com if you might like to write about this.


I would first have to find out if I still have that research data


Please do. I think it'd be fascinating so bookmarking this thread in hopes of an article.


I haven't used eMule in a long, long time. Is it still useful? Does anyone use it regularly?


Not regularly, but if you're looking for a super obscure documentary that is not on YouTube and that you cannot buy anywhere - eMule/aMule may be the place to find it, if you're willing to wait a few days or even weeks. Did that a few times.


DC++ used to have the most obscure stuff. I remember finding one site (server? hub?) that had every single F1 race broadcast (practice, qualifying, and race) for many years. Basically someone had been recording every race on VHS tape and eventually digitized them all and uploaded them. This is just one example of the kind of specialization you could find. I wonder how much of it is still there.


One of the reasons why I like IPFS is that it gives you immutable links to files like these. If you find the file once, you can get the link and access it again later (assuming people are still seeding) and be sure you're accessing the exact same file you think you are.


I mean, magnet links are the same thing—with the exception that they aren't widely used between users, to my knowledge, i.e. people aren't too familiar with them and go to search sites instead. (Not to say that the same won't happen in IPFS if it ever gets popular.)


They're not the same thing which is why they're not as useful.

Magnet links apply to an entire collection of files, as snapshotted by the creator. There is no way to pick a single file and refer to it, immutably. It's all or nothing. Additionally, metadata besides the file contents will change the hash. This includes filenames, directory structure and piece length (chosen at snapshot time, by the creator).

All of these limitations are enough in practice to make magnets useless for widespread per-file distribution.


Do magnet links work across torrents? Do two different torrents of the same file have the same magnet URI?


The infohash (which identifies the torrent, and thus the magnet link) is a hash of the following structure:

http://bittorrent.org/beps/bep_0003.html#info-dictionary

There are some things, beyond the content of the file, that will change the hash:

- Path of the file (nature_documentary.avi vs root/nature_documentary.avi)

- Name of the file (nature_documentary.avi vs nature_doc.avi)

- The piece length
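For illustration, here is a small self-contained sketch (Python, made-up values, BitTorrent v1 single-file layout): the infohash is the SHA-1 of the bencoded info dict, so changing the name or piece length yields a different hash, and thus a different magnet link, even when the pieces (the actual content) are identical.

    # Minimal bencoder for the types that appear in an "info" dict (illustrative only).
    import hashlib

    def bencode(obj):
        if isinstance(obj, int):
            return b"i%de" % obj
        if isinstance(obj, bytes):
            return b"%d:%s" % (len(obj), obj)
        if isinstance(obj, list):
            return b"l" + b"".join(bencode(x) for x in obj) + b"e"
        if isinstance(obj, dict):
            items = sorted(obj.items())  # keys are byte strings, sorted per the spec
            return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
        raise TypeError(type(obj))

    def infohash(info):
        return hashlib.sha1(bencode(info)).hexdigest()

    # Identical pieces, different file name => different infohash.
    base = {b"piece length": 262144, b"pieces": b"\x00" * 20, b"length": 123456}
    print(infohash({**base, b"name": b"nature_documentary.avi"}))
    print(infohash({**base, b"name": b"nature_doc.avi"}))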


Ah, if it's the entire dictionary and not filtered by the clients to just those fields, then presumably additional data may be shoved in—specifically the trackers to contact.


No, look at the content of the dictionary: trackers are not part of the "info" dict, so you can add all the trackers you want, it won't change the identity of the torrent. Identity is defined by the infohash.

Trackers are added in the magnet link by adding parameters to the URI: see http://bittorrent.org/beps/bep_0009.html


Not sure if any metadata other than the file name affects the hash—e.g. modification times. Chunk sizes likely do. Otherwise, I'd expect the hash to address actual content, though frankly I haven't checked.

Edit: in fact, my hasty, belated and superficial skimming suggests that IPFS is an implementation of just a DHT (presumably with some metadata and chunking built in)—i.e. it keeps data itself in about the same way that Bittorrent stores torrent descriptions in the Mainline DHT.


magnets are just hashes of the contents


For me it was WinMX. I remember around 2001 I found all GameDev magazines there. Something that was just not available in any other p2p network.


So I tried a few years ago. Was looking for an old obscure TV show and couldn't find it anywhere. Installed aMule on a whim, got it connected and used the search function. Actually found the show and started the download. A minute in, it found a couple of peers and started downloading, very slowly. Then I remembered that you can see the file name under which all the other peers stored the file locally. It was handy back in the day (around 2001) to spot fakes. For some reason, people back then liked to rename random movies or porn to current blockbusters, so when you looked for a new movie you got the wrong thing. That's why eventually websites that manually indexed links took over.

So I opened up the window that shows all the file names of the TV show I had just added to the download queue, and most of them very explicitly hinted at child porn. I freaked the hell out and nuked the whole installation, deleted everything, overwrote the empty parts of the disk with zeros. It had only downloaded a few kilobytes of a couple hundred MB, but still... I just thought about what would have happened if I hadn't remembered that file-name thing and had only found out when trying to play the damn file after downloading it. So probably not gonna use it again anytime soon.


> I freaked the hell out and nuked the whole installation

What does it say when something so bad and rare is used as a scapegoat for so many unrelated things, as part of a fascist power grab, that everyone would rather run away than report the actual crime to the authorities?

Makes you wonder how much all the 'think of the children' knee-jerk legislation actually helps anyone, besides the politicians.


I appreciate your view on the matter. In this case, however, as disgusting as it sounds, it's the "authorities" themselves serving that kind of content. Actual honeypots. No point in reporting; they know perfectly well. Better to just run away and pretend nothing was seen.


You don’t know that.



That does not mean this instance was a honeypot, or, by extension, that reporting would be pointless.


Yes, and yes.

In fact, it is arguably _more_ useful than BitTorrent on the majority of files people might share. You see, with BitTorrent, you typically share few files, and mostly the ones you just downloaded. There is little incentive to actively seed thousands or tens of thousands of files you don't even know are interesting to people.

With the ED2K protocol, the premise is that you share a lot: complete directory structures, and you search your many hashes when others make a request. That is more or less _the_ way to share collections of items/files/media/etc (other than, perhaps, running your own tracker which shares every file in your sharing directories separately).


Last used it 8 or 9 years ago to download z/OS, so I could have the actual DB2 for z/OS running locally via Hercules, to help with app development.

The download took about 2 weeks to finish. Couldn't find it anywhere else, only on eMule.


I occasionally use aMule on Ubuntu (sudo apt-get install amule), mainly to download classic Italian movies. There are a lot of titles to choose from, commedia all'italiana and neorealismo; it's fast and I don't have to visit any shady webpage to get them, I just use aMule's search bar.


I haven't used EMule in a long time either, but I still use Soulseek occasionally to find stuff that just isn't available anywhere else (legal or otherwise). I have had wishlist searches in Soulseek that were active for months before someone logged in who had that particular thing.


TIL soulseek has a wishlist.


I haven't used EMule but Soulseek is very active.


Wow, that brings me back! I had to check and Nicotine Plus was last updated... wait for it... yesterday.

I might have to install it for old time's sake.


Yes and yes (although technically I use aMule). There's lots of stuff in the EDonkey & Kad networks!

aMule itself gets updated even less often, but hey, it works.


Well this is a blast from the past for sure.

I haven't been looking at torrents for ages, mostly because local LEO partnering with extortion lawyers made it uncomfortable where I live. Literally extortion in this case, as the legal firms in this business sent blackmail letters to people whose IP addresses had been seen as part of torrenting activity, telling them to either pay a ridiculous settlement fee or go to court.


Rent a seedbox in eastern europe (somewhere like Belarus or Romania) and torrent your stuff from there. There's no way they can cost-efficiently get you that way.


Tbh I don't really need illegal torrents nowadays, but that's an interesting concept for sure.


It is not necessarily about illegal torrents. There are some niche files, such as old movies, that you won't find anywhere that lets you watch them where you live. I used to torrent some, just to keep them alive and prevent such works from disappearing from the Internet. However, I'd still prefer to torrent from a location that reduces the chance of receiving such letters from opportunity seekers trying to suck some cash out of me.


You do have a good point - although the extortion lawyers were just contracted by some of the big studios. Ergo, the likelihood of being hunted by the heirs of a WW2-era Polish filmmaker or a similar entity is considerably slimmer. :-)

You're right, though, in that it's generally a good idea to stay anonymous when torrenting.


> However, I'd still prefer to torrent from a location that reduces the chance of receiving such letters from opportunity seekers trying to suck some cash out of me.

Then you should consider torrents over I2P: https://geti2p.net/en


The law firms aren't going after everyone who is torrenting, they are going after people downloading copyrighted material. You're not going to get a letter if you're downloading a torrent of the latest Ubuntu image.


Not guaranteed, actually, due to the carpet-bombing strategy of targeting the ‘culprits’. Can't remember if Linux specifically was involved, but there certainly were some misfires.

(Ironically, when I tried downloading fresh Ubuntu with desktop Transmission, it froze for a rather long while, presumably due to the near-two-thousand peers.)


It's not going to connect to most of them. The protocol doesn't scale linearly with the number of peers, so most clients won't connect to more than about 50 at a time, and will download from even fewer.

With an infinite number of peers, 0% of the traffic would be actual file chunks. 100% would be bookkeeping.

I guess that's a long-winded way of saying that the issue probably wasn't the number of peers.


How does one contact such renters?


Would be kinda cool to, err, "join" some majority of the Wi-Fi networks in your area and download some of the most boring stuff possible (including of course Linux distros)...


If you list that as the primary reason, why not just use a VPN?


Why do you just trust a VPN provider, any VPN provider?


Because for this use case of avoiding the extortion lawyers this is good enough. Nobody is going to bother trying to go after some VPN provider when there are millions of people on normal ISPs where they just have to give them a call.

As usual, it's all about knowing your threat model.


You don't think providers would be willing to sell that information for cheap? They'd just saw off country-sized chunks and sell it bit by bit.


The business model of these lawyers is to join a few swarms of big popular titles, send automated letters to the ISP, then send automated letters to the addresses they got from the ISP and collect money (at least here in Germany).

They are only going after the low-hanging fruit; it's not about catching everyone or doing deals with VPN providers located in some hard-to-reach country.


That’s an extremely definitive point of view. What if the lawyers agree to give the VPN provider 10% of the proceeds?


That VPN provider would likely be out of business extremely fast. If you offer a service that's supposed to increase privacy, and it doesn't... who's gonna pay for that service? Of course exceptions apply with law enforcement, but then again your threat model is very different there (and the VPN provider in question will not be able to say no).

That being said, there are VPN providers who guarantee that no logs are kept (I believe ProtonVPN does this) and are extremely privacy-oriented. I am inclined to believe these companies, if all I'm doing is pirating a few movies.


VPN/seedbox providers probably don't make the wisest business choices. They may not care about the longevity of their business even if they do.

Hosting providers that sell bandwidth to VPN services could easily log traffic.

There have been numerous cases where ‘No logs’ VPN providers have been caught logging details. See the recently exposed logstash breach.


Why couldn't lawyers almost as easily send bulk IP requests to VPN providers as they send notifications to ISPs?


The VPN can credibly claim not to hold on to that information.


What does the VPN provider claim when they receive a court summons for copyright infringement?


In the US at least, they are neither liable for the actions of their users nor under any sort of obligation to retain IP address logs. So, as far as I can tell, they're free to respond with "sorry, can't help you." IANAL, though.

The EU on the other hand has mandatory data retention, but it seems that this has been challenged by several countries, according to https://www.eff.org/issues/mandatory-data-retention


PIA (before the acquisition) said in a lawsuit that they couldn't provide IP addresses because they don't save them.


They don't receive a summons, because the courts in their jurisdiction don't issue summons for that.


Why wouldn't lawyers run a vpn service in order to collect IPs?


Just a tangent, but you can send Mullvad an envelope with cash and your account number as a form of payment. That's not a proof of trust but it's a statement IMO


It's not so much about trust as it is about defense in depth. Many states won't cooperate with American litigation.


Because trust is what they sell, unlike an ISP.


What is the alternative?


I was a very passive torrenter anyway, just found it easier to source stuff in alternative ways


Does that still happen? I thought that the ability to spoof IP addresses in torrent networks ended that. The accuser not only has to prove that the IP address hosted the file, but also that it was not spoofed; an impossible ask


Wow. I’m almost shocked to see a recent update for eMule of all applications. Throwback thursday for sure! I think I may have used eMule at some point in my youth, like in ~2003 or so.


On a related note, it is remarkable how a program that does several things, deals with more than one network protocol and presents a complex, information-rich, configurable interface can still clock in at about 3 megabytes. I can't help thinking an Electron version of this would be about 250 MB, have a quarter of the functionality and consume 1 GB of memory just idling.


When someone makes a platform as easy to use as Electron that makes tiny executables with zero friction and lets me easily style and debug a live running app I'll switch. Maybe that already exists?

My previous experience with native dev is that every platform requires 1000s of lines of platform specific setup and then special code all over the place and further I need deep familiarity with all the different platforms' APIs

On Electron, if I start on Mac (you can start anywhere), it's

    npm init -y
    npm install --save-dev electron electron-builder
    git add .
    git commit -m "initial commit"
    ./node_modules/.bin/electron-builder
Then I ssh to a linux box, clone the repo and

    npm i
    ./node_modules/.bin/electron-builder
Then I ssh to a windows box, clone the repo and do exactly the same thing.

I just created all 3 apps. On each platform the file is sitting in the 'dist' folder: a .dmg on Mac, an .exe on Windows, an .AppImage on Linux.

No other installs needed (just node), no installing 57 standard or custom libraries and runtimes, no sudo apt install for dependencies. No setting environment variables telling various compilers where they can find stuff; I generally don't even have to run my code on the other platforms, an app pops out and I'm done. And if I've written tests they'll generally work similarly: "npm test", zero friction, zero extra things to worry about.

If someone wants to beat Electron, those are the features they need to provide.


> with zero friction and lets me easily

You made a whole case for things that make your life easier. Any remaining interest in what would be in the user's interest? Let's not kid ourselves: you saving some effort is no more for the user's benefit than, say, charging double for your software would be.

But perhaps that should be left in the hands of software engineers.


As a counterpoint, I have come across lots of software that is only developed for one platform and which, unfortunately, I haven't been able to use. Sometimes the difference between a 3 MB and a 300 MB install file is worth it if it means the software is available to more users. In that sense it can be in the user's interest.


> Maybe that already exists?

The closest thing that exists is probably this: https://platform.uno

I don’t have hands-on experience though, only read their web site and watched a couple of videos.

And I'm not sure they have a live visual debugger; it might work on Windows when using Visual Studio 2019, but I would be very surprised if they ported that to the rest of them, it's relatively hard to accomplish.


There's tons of easy ways to develop gui apps, many easier than Electron. tk and Racket come to mind.


I have nothing against Tk or Racket. I haven't used these two in particular, but I used equivalents like Windows Forms or wxWidgets, and I was happy with both. However, they only work when you don't have a high bar for visual quality, are happy with stock OS-provided controls with minimal customization, and don't have too much GUI in general.

GP mentioned styling and visual debugger.

Styling is needed for projects where you have a professionally-made GUI design on input (i.e. Photoshop or Illustrator or XD, or much less often non-Adobe equivalents), and want to make GUI close to that design. Such projects often include custom UX in addition to custom GUI, like animated transitions or decent touch screen integration for custom controls.

Similarly, visual debuggers are only needed when you have very complex GUI, and/or non-trivial UX code.

Other typical requirements for high-quality GUI are internationalization (Unicode all the way down including these surrogate pairs and ideally colored fonts, support for right-to-left languages), and good DPI scaling (this requires vector graphics for assets instead of bitmaps).

I don’t like Electron nor JavaScript, but I have to admit they are often the best tool for the job.

Qt is close, but requires C++ with all the unsafe shenanigans, and depending on the project the license can be expensive.


Qt can be developed with Python or JavaScript too. And there are other third party bindings available. [1]

[1] https://wiki.qt.io/Language_Bindings


Reminds me of MLDonkey which has a comparable feature set. It too is still maintained and it also clocks in at around 3 MiB.

All of this while being written mostly in a strongly typed, functional, ML-like high level language.


> ML-like high level language

Calling OCaml an ML-like language is like calling Common Lisp Lisp-like; it is literally the main ML these days with SML a distant second.


Wonder if Rust ranks in there; it has a very ML feel, considering mindshare/jobs/developers.


I would call Rust ML-inspired. I would call OCaml an ML descendant.


While I completely agree, I believe the appeal of Electron comes from being easily cross-platform with a single, consistent UI and most of the code being shared between platforms. I don't know of many stable frameworks that can achieve that while letting you easily create decent-looking UIs. The only other one I saw was JavaFX. But even then, you would still need separate codebases for web and native.


Interestingly enough, there's aMule (http://www.amule.org/), which is almost the same as eMule but uses wxWidgets, making it cross-platform. Just launched it and downloaded a bunch of things, and it sits at 64 MB of memory and around 2% CPU.


Ain't it ironic? Of all the excellent desktop-oriented frameworks we have, developers and users seem to prefer the cross-platform compatibility that web technologies enable. Consistent UX seems to be winning over all the technological superiority, performance, or personal preferences that developers have. And I don't like the CPU-hogging Electron apps either.


> developers and users seem to prefer the cross-platform compatibility that web technologies enable

Users don't care about the technology under the hood, and the compatibility can be achieved in any number of ways not just one bloated framework. Developers use this because it makes their life and job easier. But it's still the users who foot the bill one way or another.

It's as compromised as basing every road vehicle on an 18-wheeler platform because this provides the compatibility between all transportation types, a truck can be used for commuting but a subcompact can't be used for freight. Then you ask the drivers to foot the bill for the extra resource consumption or adjusting the infrastructure to accommodate the new and improved "cross-platform" vehicle.


Users aren’t being asked what they prefer, companies are forcing Electron web tech on users without giving them a choice. Example: 1Password just announced a beta Linux client, and it’s going to be Electron. Were users polled about this?


Do users really care about cross platform (as in Windows, OS X, Linux, not Desktop/mobile) consistency?

I believe a lot of the appeal of electron is that e.g. Linux users gets a version at all, which they wouldn't for plenty of products if it would have to be custom built. Most users don't constantly switch the OS platform, consistency across OSes is of little importance to them.


Most users don't have to switch between platforms, but for those who are obligated to, like I was, cross-platform consistency removes a level of friction. Most of the time $work provides me Windows, and I have run my personal laptops on Linux (mostly Fedora) for the last 10 years. Some of my colleagues were running OS X. It was messy to switch context all the time to explain, learn or replicate stuff when you are not in an IT/SE environment or even a tech-savvy place. In one lab we had people on Windows/OS X/various Linux distributions for office desktops, and calculation servers on Linux distributions. When software is not consistent across platforms, it results in a lot of friction and time lost in meaningless conversations about "How to do X/Y".

The only time I am ok with inconsistent cross-platform is between desktop/mobile because the UX and means of interaction are fundamentally different.

Electron is a middle ground (better avoided if I can), but sometimes ease of use prevails even for resource-hungry applications. The other alternatives (imgui, Vulkan, etc.) seem complicated at best, and I am not familiar enough with the GTK/Qt ports on Windows/OS X to know if they are a pain in the ass to develop with.


That's a good point, developers/sysadmins/IT and adjacent jobs in general may be a sizable exception.

Most "normal" users are usually running very homogeneous systems (the average office worker doesn't get to choose whether they want a MacBook, a Windows PC or a Linux workstation), where the difference would only be between work & private computer use - but I think most people (again, developers and maybe designers excluded) have little overlap between the tools they use at work and at home.

Electron is the native-ish sequel to the everything-is-a-web-app movement, I suppose. That has similar motivations (also makes everything no-install, super portable), and for plenty of tasks it's good enough, as computers these days are massively over-powered for lots of general tasks.


You get a lot of mileage out of platform-native UI elements. Also, people have been making megabyte-sized icons these days.


I raised a ticket with Slack when I tried to upload a team icon. They wanted a _minimum_ of 512px^2, for an icon that will never be displayed larger than 64px^2. They said that was intentional but didn't explain why (I guess the ticket handler just didn't know why). Wtf.


That's for "retina" displays. I don't think there are any 8x displays right now, but probably they want to be future-proof for a while


> That's for "retina" displays. I don't think there are any 8x displays right now, but probably they want to be future-proof for a while

It's kinda unreasonable to put restrictions on the users in order to achieve that. I mean, your slack team might not even be around by the time such displays are introduced. Why force the users to jump through hoops for such a silly reason? 90% of the time they're just going to resize whatever image they have in the quickest, dirtiest way to meet the system's arbitrary requirements.


> They wanted a _minimum_ of 512px^2.

It seems like there's a pretty simple malicious-compliance solution for your problem...

Bonus points for using http://jpegify.me/


As others pointed out, this is for future-proof high-dpi monitors. A smarter approach would be to ask for it in a vector format with hints for specific resolutions.


To have a hi-res version available for (future) hidpi screens and to apply their own (optimized?) scaling algorithms to.


Presumably because the icon will be shown on high-dpi displays (retina screens etc), no?


I think it's remarkable that you think it's remarkable ;-)

eMule comes from an era when all applications were native and 1GB of RAM was considered extravagant.


An era when you could download whatever software you wanted to your device without going through a draconian app store.

An era without walled gardens.

People had websites and blogs.

The technology was bolder. The algorithms used to accomplish heavy lifting were cooler.

It was the wild west and it was free and exciting.

Today we live in a plastic, enterprise, software as a service monoculture. The wild and free part died.

Maybe I'm just getting old, but I hate it.


Your comment reads like the first few lines of a manifesto I'd gladly sign.


My comment was a bit rhetorical, but I do remember that period well, as a regular user of DC++, uTorrent, Kazaa and others. It's just that the contrast hits you fresh in the face: after several years of getting used to bloated web, .NET, UWP and Flatpak apps, it is refreshing to install a full-fledged application consisting of a single binary of less than 3 MB, and whose UI doesn't treat me like I'm a mobile-toting child.

I wish we had more programs like this. They could breathe new life into older or less powerful systems, and get along better with multi-tasking (using several simultaneous Electron apps on a small or mid-tier system brings it to its knees while these computers would still be considered insanely powerful just a decade and a half back).


Hey, don’t hate on .NET!

Windows Vista and up shipped with .NET 3 and .NET 2.0 was a one-time install for Windows XP that enabled devs to ship complete GUI applications with network, file system support, OS integration, COM integration, and more in a single 100KB binary.

UWP is, of course, a different story.


uTorrent comes to mind: when it first came out people were mostly running Azureus, and 90% of the network switched over within weeks. Can't beat a single-file distribution of under 1 MB.


Apparently the benefit of larger apps is higher development flexibility and evolvability.

There will always be somebody complaining that they prefer JS to C/C++.

Meanwhile, if APIs and language tools were better designed from the start, and we could agree to DUMP HTML and its ecosystem, maybe things would improve.

I guess Android is a huge victim of app cruft. I love Android, but I have to admit software bloat is a real threat to the environment.


HTML is not really the problem. It is overkill if you don't use the DOM in at least a basic way. To render static, non-interactive blocks it's better to use straight OpenGL, and probably easier whether or not you know OpenGL beforehand.

Electron ships and keeps in memory things almost everybody already has, either installed beforehand or lying in the main browser's cache (JavaScript libraries). A better, but slightly less user-friendly, alternative would be to ship applications that simply serve the same HTML on a local port and are accessed through the main browser (like Syncthing, for one example), performing the computing in the local backend server. And here's the main issue: average Joe is scared by the :8080 suffix in the browser. The average user is just a step away from the solution, but that's... still not there.
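A minimal sketch of that pattern (hypothetical, Python standard library only): serve the UI on a local port and open the user's default browser at it, with all the real work happening in the local backend process.

    # Hypothetical sketch of the "local backend + browser UI" pattern.
    import http.server
    import webbrowser

    PAGE = b"<html><body><h1>Local app UI</h1></body></html>"

    class Handler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(PAGE)

    server = http.server.HTTPServer(("127.0.0.1", 8080), Handler)  # socket is listening from here on
    webbrowser.open("http://127.0.0.1:8080/")  # the ":8080" that scares average Joe
    server.serve_forever()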


Can someone explain to me why browsers don't offer a "local app" mode where they couple with a locally run server and display a more stripped-down UI (no favbar, tabs, extensions, etc.) for these cases? Is it about security?


Because even HTTP and the server side are growing in complexity. User expectations that browsers are the outside world and the aversion to full page loads are big factors.


But a stripped down UI serves to make a non-tech savvy user not associate it with their usual browsing.

I know server software is getting complex as well, but to my (limited) understanding the interface was the biggest offender when it comes to unnecessary use of resources, which is the problem my proposition tries to solve.


Chrome and Firefox did have this in Debian repos at one point


It also helps that the code is built on top of MFC and targets only a single operating system (Windows).


aMule is not that much heavier. [https://github.com/amule-project/amule]

They’re just native


Funny how a piece of software that old, widespread and "battle-tested" hasn't had a version 1.0 yet.


Eighteen years; about as long as Wine stayed in 'alpha' while being used by everybody, including as a backbone for commercial apps.


Way to go, old friend!

P2P technology is important to resist censorship. It does not have an opinion about what content goes through the technology. Like Tor.

The important thing is that it's decentralised, which makes it truly "of the people, by the people, for the people".

Those saying 'Why use P2P apps if you have nothing illegal to share' are the same ones saying, 'Why need privacy if you have nothing to hide'. They're wrong, they're privileged, they're selfish, and in many cases they're lying hypocrites (and therefore dangerous and abusive people).

Long live P2P. The more protocols in active development the better.


> P2P technology is important to resist censorship

It's important to retain culture too, people who legally own movies, music, etc can't be trusted to actually archive or distribute them properly.

Look how many artistically important movies are not really available anymore, where the only old DVD copies floating around are either badly cropped, badly graded, cut, or have poor subtitling/sound. The only people who can be trusted to look after these artworks are people doing it out of passion rather than for profit.

In 10 years the teenagers growing up today are going to have a cultural black hole, as a lot of the more unusual music they enjoyed no longer exists anywhere online, because it was all on Spotify, YouTube or SoundCloud and those might have gone out of business, not retained it, or the creators themselves lost it or moved on.


Music can also be lost to fire, as it was in 2008 [0] (with Universal studios originally downplaying the loss until a 2019 New York Times article revealed its full extent).

[0] https://en.wikipedia.org/wiki/2008_Universal_Studios_fire


The point is that if everything is in a single giant "library" and it burns down, then everything is lost forever.

But if there are copies here and there, there is much more chance that at least some will survive the centuries.

Eg, single point of failure vs redundancy


>The point is that if everything is in a single giant "library" and it burns down, then everything is lost forever.

What.cd suddenly comes to mind...


That was so tragic.


Now if P2P could only go back in time and get the original Beach Boys' Smile or MBV's Loveless follow-up finished and released.


P2P is actually mostly preserving movies. I find it not to be an abundant resource for old music recordings and live performances.

I've downloaded quite a number of really old live performances of classical music from YouTube that are practically nowhere else to be found. No pirate version can be found, on The Pirate Bay or the like.

At least for now, YouTube is a gain in preserving and distributing old and unusual content, not a loss, in the world of classical music. I don't know what happens if one day it ceases to exist, but I imagine enough people would have saved copies of those videos to re-upload to whatever video site replaces YouTube.


Your "in 10 years the teenagers..." part applies even today already. A German rapper I know released a mixtape 3-4 years ago, but as he had no rights to the beats it was only a MediaFire zip upload...

Not even he has the zip or any of the songs, as the files got deleted, managers changed and so on. And this was no mediocre stuff, I loved it!


> Look how many artistically important movies are not really available anymore

Yeah, like for example the moon landing videos that NASA deleted by accident. Had they been freely available in hi-res digitized copies we would still have them. But alas, we only have low-res copies.


> Yeah, like for example the moon landing videos that NASA deleted by accident. Had they been freely available in hi-res digitized copies we would still have them.

The moon landing videos were made in 1969, and are believed to have been erased in the early 1980s. There was no technology at the time to digitize those videos, nor any compact data formats to store them in, nor users with spare storage for that data.


While true, it doesn't invalidate the grandparent's point. The tapes were easy to lose because they were not replicated.


And they were not replicated at the time because this data wasn't considered valuable, and on a larger scale, because there was not a culture of data preservation at NASA.

(It's not clear that they had the ability to replicate this data, anyway -- the reason it was lost was that they needed to reuse the magnetic tapes for data from another project!)


That's why allowing the data to be replicated is so important if you care about preservation. It only takes a few happy mutants who think some otherwise boring data might be interesting and keeping a copy for themselves.


Again, you're assuming that ordinary people had some way to easily "keep a copy for themselves". This was emphatically not the case. This data was not digital. It only existed as a set of magnetic tapes in an unusual format: 1" magnetic tapes containing application-specific analog data. The number of institutions that would have been capable of copying and storing this data would have been small; it's improbable that any individual, "happy mutant" or otherwise, could have done so at the time.

The notion that any sort of data is easy to copy and store is a modern concept. It was usually not the case in the past.


> Way to go, old friend!

And I thought the Emule network was dead, good to hear they're doing well.

> Those saying 'Why use P2P apps if you have nothing illegal to share' are the same ones saying, 'Why need privacy if you have nothing to hide'.

Couldn't agree more. Every one of us, including myself, has something to hide. Some of the more extreme motivations behind the attempts to push for more control over communications simply don't stand: "it could be used for child porn" is no different from "it could be used for drug trading" applied to normal cellphone communications. Just leave communication protocols alone, and instead divert more funding to intelligence technologies and training rather than giving police military weapons.

> Long live P2P. The more protocols in active development the better.

Agreed, though it would be nice if nodes could interoperate across different protocols. In the old days the wonderful OSS Napster-compatible client Lopster was being rewritten to be compatible with pretty much all the p2p protocols known back then, but before it was ready BitTorrent came along and became the standard, so it's currently stagnating/dead. http://lopster.sourceforge.net/


MuWire is ahead of you there https://muwire.com/

Like LimeWire and eMule, but runs on I2P for anonymity. Tor shouldn't be used for P2P; it wasn't conceived for that.


Hi, I'm the creator of MuWire. Indeed, Tor explicitly discourages P2P use or the sharing of large files, while I2P welcomes it.

I'm very happy to see eMule spring back into life. I believe it's time for a move away from the "torrent site" model and back to a more casual approach to file sharing.


Slightly disagree. Tor hidden services are fantastic for implementing decentralized signalling/rendezvous services with no single point of failure. I'm using this in my app https://cryonet.io

Love the fact that it works in many places where plain UDP solutions fail.

Once two nodes are connected via a Tor hidden service, other p2p channels can be established that are more appropriate for transferring large files.


> Once two nodes are connected via a Tor hidden service, other p2p channels can be established that are more appropriate for transferring large files.

While maybe not in your concrete case, in general this sounds like really dangerous advice - a quick way to open yourself up to deanonymizing attacks?

Hey node behind Tor, send me your IP by requesting this https resource outside of Tor... Etc ?


Yes, that is true, I should have mentioned that: this is only a privacy-preserving / encrypted mechanism for signalling without a central server; peers are expected to know and trust each other, as in a family-and-friends scenario.


The Cryo tool really should be open source. Closed source binaries should not interact with Tor. You are hinting that it leaks the IP over the Tor channel -- the behavior needs to be easily verifiable to the user, and the source should be available so folks can fix privacy or security bugs which can put your users at risk.

None of this precludes the sale of your $20 lifetime license; but you are probably better off selling some kind of support license or other intangible thing and liberating the code.


I'm not planning to open source the complete tool; after all, I have a hard time finding similar open source projects that earn enough money (would love to be proven wrong here).

But I fully agree that the security and network stuff needs to be open sourced to be verifiable. My plan is to release the related sources together with the protocol specifications; I need to refactor a few things first, but this will happen.


There is already a P2P program called OnionShare that uses the Tor network. Never used it, but journalists and activists supposedly use it.


> Those saying 'Why use P2P apps if you have nothing illegal to share' are the same ones saying, 'Why need privacy if you have nothing to hide'.

Agreed. Sharing content illegally is certainly a problem, but let's not dismiss the importance of decentralized systems. It certainly helps distribute large files without a central server and point of failure. I seed my share of Linux distros and open source software to help with distribution using my own bandwidth and storage. And decentralization is a big equalizer in power, which allows the masses to share content even if there is someone in power against it.

We need decentralization and P2P protocols more than ever.


I'm gonna frame your comment, it's just great.


> The same people saying 'Why use P2P apps if you have nothing illegal to share' are the ones saying, 'Why need privacy if you have nothing to hide'. They're wrong, they're privileged, they're selfish, and they're often complete lying hypocrites (and therefore dangerous and abusive people).

Is there a significant populace who vehemently opposes P2P?


The fact that some ISPs, and even VPN providers, block p2p connections means that there is a lot of interest in doing so. Not necessarily by the "populace", but pushed by whoever feels their interests are being affected.


"Significant" yes, "numerous" probably not: Richer people with personal interest in upholding IP rights, or who are generally approving of the legal status quo.

Although, on second thought - a lot of people who are not tech-savvy might be convinced by the mainstream media view on file sharing as involving a lot of "piracy" and theft, so maybe that demographic has more opponents of P2P.


Anyone know what the state of the network is these days? I haven't used it for at least 8+ years; I remember servers drying up, but with Kademlia that's not really a problem.


I have to think a lot of people left it behind because of all of the malware that was masquerading as something else.


Well, video and audio should be fine either way, right ?

Downloading and running executable files from file sharing networks seems like asking for trouble no matter what the network is.


> Well, video and audio should be fine either way, right?

From what I recall of the time when these networks were popular, that wasn't the case; there was some way to make Windows Media Player download malicious executable files when playing a video file.


Yes, perhaps those people are typically bad, but you know you should focus your criticism on the argument itself.


But I'm choosing to discuss some of the social context to the technology and how it interacts with society and people. People's character and motives directly affect freedoms to use such technology, and therefore the relevance it has to people.


Agreed. I still have my copy of the Wikileaks file thanks to P2P. Still curious what's in there.


I played with aMule, a similar project, about 6 months ago. Used to use it like 10 years ago.

Almost all the server nodes were themselves just SPAM, their names advertising prescription drugs for cheap, as well as dating sites. I found myself very disappointed with the state of the ecosystem.


Oh my, those were the days... This and DC++ while everyone else was using Kazaa and Limewire.


Ha... for some reason I thought Limewire was awful and barely used it. I was however a very big fan of WinMX for many, many years. Happy days! I remember building a massive music collection with it - and sometimes waiting days for a track to download (although usually it was more like hours).


And sometimes by the time the track or album finished downloading, you knew some of the lyrics by heart!

Ah, the memories...


I was in Germany at the time. The university network's DC++ hub had literally everything.


DC++ and Soulseek for me.

I still use Soulseek (via Nicotine) to this day and it's great!


EEEhhhh, I used eMule around 2003-2008, mostly for downloading music and books, but since then I've been getting everything I need from either torrents or HTTP DDL. What kind of things do you people look for on eMule nowadays?


What's the likelihood of getting hassled by your ISP for running this? I always assume that the main target is torrent traffic these days and all the rest is mostly being ignored, due to the volume of torrent enforcement, but would be interested if anyone had any experience otherwise.

Asking for a friend.


Your ISP doesn't give a crap what you download, they just care about the GB downloaded and especially uploaded. Users are harassed when the ISP gets a legal notice from a copyright holder that has "proof" of infringement by your IP.


Wow, what a blast from the past! You won't remember me (certainly not under this name), but I briefly hosted the download for the installer in 2003; it was just hidden in a subfolder of my work's website. Fun times, thrilled it's still going :)


It's good that development is on-going, but it's a bit saddening that the official client is closed-source and single-platform (if I am not mistaken! Haven't seen an indication to the contrary on the site).

I think the platform would fare better, and would decline less, if the software were developed more openly.

Note that there is a FOSS client for Linux (and maybe other operating systems): aMule.

https://github.com/amule-project/amule/

And while it sees some development (last commit was 8 days ago), it has not had a proper release for 4 years.


eMule is GPL; any closed-source fork is likely illegal, even if it's the official client.

Apparently, the source code of this release can be found at https://github.com/irwir/eMule/tree/v0.60a

aMule was created as a cross-platform port of eMule as far as I can recall.


Umm, the link you gave is for something released in 2016; and not worked on since.


> the official client is closed-source and single-platform

It's not clear what you're talking about... eDonkey2000 ? I don't know if such software is still used nowadays.

eMule is Windows-only, but it's open source; you can find the links on its official page and on SourceForge.


eDonkey2000 has been dead for a decade and a half now - https://www.latimes.com/archives/la-xpm-2006-sep-13-fi-donke...


What a blast from the past. The project, the forum software, the community around it. What a delicious injection of nostalgia.

Thank you!


Where would you find stuff to download?

Does anyone have good sites/servers?


I use eMule because it had a rare doco I found on this site.

https://docuwiki.net/ (Obviously check it's Creative Commons etc before DLing anything)

But you can also just search in the app. I can't work out, though, whether it's worth adding servers or whether they are all linked, or where to get servers from, or what it all means. I guess I should read a manual.

(Some search material can be illegal (and not just under copyright law). Anyone used to these networks knows this, I guess, but if you're new, be careful: you will probably auto-share it if you accidentally download it.)


If it's the same now as it was back when I last used it (more than 15 years ago), it should have a search box. You search within the app itself, it shows the results and you select the file you want to download. You don't need to use an external service.


Holy shit this takes me back!!


Me, I was a mldonkey man.


Masochism you say, I'm impressed.


Will these changes come to aMule for Linux as well?


Wow, I was sure all the donkeys and mules died a decade ago, or were full of malware. How is this still usable today? I mean, you can find literally everything online for download or streaming, without pirating over p2p.


Depends on what you're looking for, but "literally everything" is not available to download on the clearnet; some things can only be found over p2p, via groups you'll get invited to or by knowing the right people. HTTP/HTTPS/WS/WSS is also still blocked in many places today, or otherwise tampered with, while p2p presence remains strong in those locations.


I haven't used eMule in maybe 15 years but I still torrent a lot. I find that it's massively more convenient and reliable than shady streaming or direct download sites. You just have to find a couple of reliable trackers to get your torrents from. I have a dedicated seedbox though, I suppose that if you download from your home network it could be more annoying.


no, you can't.

I've been trying to source some movies from the '90s and they are not available on any streaming service I have access to.


Just as a very silly example: try to find Pantera's glam rock albums and you'll see how much the mainstream channels feed you only what they want you to see and consume.


I can find all of Pantera's albums on Spotify, at least those cited in Wikipedia.

But, I agree that sometimes there's content which is hard to find on digital stores or streaming services.


Maybe they have added them since, or it could be a regional issue, but they didn't have them in Germany at least until 2018, when I looked for them because I wanted to show a co-worker.


https://www.youtube.com/watch?v=_p4_6ho9UyM (7.4 minutes sketch about idea that "you can find literally everything online")



