Hacker News

This is what it was like when I first got on the Internet (around 1992); there was no web then, instead you FTPed everything and used resources like Gopher, Archie, and Veronica to find documents.

I find PigShell interesting from a historic point of view but think it also points to a flaw in how the web currently works, with far too much glue and eye candy, which is a direct result of the ad business model.




I completely agree. I actually deleted a chunk of my comment before posting it, because I was getting carried away pondering what the right level of abstraction is for dealing with web resources.

At the risk of sounding like one of the semantic-web xml-standards people, it really does make me wonder if all this horrible, incompatible, site-specific frontend crap is just a fad that will give way to tools (like psty) that ignore all that and just deal with the raw semantic content (and HTTP verbs) at a higher level of abstraction.
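To make that abstraction concrete, here is a minimal sketch of the idea: treat a web resource like a file and map shell-style operations onto HTTP verbs, ignoring the site-specific frontend entirely. All the names here (`Resource`, the in-memory fake transport) are hypothetical illustrations, not psty's actual API:

```python
# Sketch: a web resource addressed by URL, manipulated via HTTP verbs
# the way a shell manipulates files. The transport is injected so the
# example runs without a network; a real one would issue HTTP requests.

class Resource:
    def __init__(self, url, transport):
        self.url = url
        self.transport = transport  # callable: (verb, url, body) -> response

    def cat(self):
        # reading a resource is just GET
        return self.transport("GET", self.url, None)

    def write(self, body):
        # writing maps to PUT
        return self.transport("PUT", self.url, body)

    def rm(self):
        # removal maps to DELETE
        return self.transport("DELETE", self.url, None)


# Fake in-memory transport standing in for the network:
store = {}

def fake_transport(verb, url, body):
    if verb == "PUT":
        store[url] = body
        return "OK"
    if verb == "GET":
        return store.get(url, "")
    if verb == "DELETE":
        store.pop(url, None)
        return "OK"

r = Resource("http://example.com/notes/1", fake_transport)
r.write("hello")
print(r.cat())  # prints "hello"
```

The point isn't the (trivial) code, just that once the verbs carry the semantics, a generic tool can compose resources like a shell composes files, with no per-site glue.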

Woah, man. We're like...surfing the Matrix, man.


Things like this really make me miss the old Firefox experiment "Ubiquity". It was truly a marvel for its time, and I often wonder why they discontinued it.

You could search websites, send emails, look at maps, and much more (you could build your own commands) without ever actually opening a webpage. It had 200,000 users when they shut it down.


Wouldn't you just be able to fork it and continue it as a hobby project of your own? Who knows, maybe you'd find other enthusiasts and it would become a real project again. Mozilla's licenses are (I believe, though I could be wrong) quite lenient about these things.


This is what I found by looking at its Wikipedia article:

https://bitbucket.org/satyr/ubiquity/downloads



