Hacker News

Sad that the site is now a JavaScript-powered site. It's not even SEO-friendly in the old sense (I know most of the engines now run JS, but it slows down indexing).

When I saw the spinner, I thought for sure they had done a PJAX-like [1] design to keep asset loading minimal (why c2 would need that is beyond me).

But it is even worse: a full page reload on each click, and then an XMLHttpRequest call is made for some JSON.

I wonder who is running it and why they picked such an awful design. Do people just not know how to write old Web 1.0 apps anymore?

For now I will play devil's advocate and just assume they are offering a JSON API so that someone else can maybe write a better skin. If someone does, hopefully they will use pushState.

EDIT

I clicked around to see what Ward has been up to, and now I can sort of see why c2 is the way it is. Basically, Ward is working on Federated Wiki. I think the idea is sort of cool, as it is yet another attempt to decentralize the web.

I believe it is going to be based on some JSON protocol, but it would be interesting, and might see greater consumption, if it were in Google's AMP format, or supported it, or converted to it (not c2, but the federated wiki stuff).

[1]: https://github.com/defunkt/jquery-pjax




I think that is because Ward Cunningham wanted to transition to a static site instead of running it the old-style CGI way.

See also:
https://github.com/WardCunningham/remodeling/issues/2
http://wardcunningham.github.io/


It appears the root of his problem is an old database in a weird format. If he just gave access to the DB, I'm sure a fellow GitHubber/HNer could get the database converted correctly for him to some other DB or format.

Then again I probably don't fully understand all of the problems.


It's not that the database is in a weird format. It's that individual entries have multiple character encodings. The old Perl CGI could handle that just fine, but trying to do anything else with those files is a pain in the butt, which is why the site was offline in the first place. Ward was attempting to work with a static target instead of a moving one.
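Just to illustrate the kind of cleanup that implies, here's a minimal fallback-decoding sketch. The encodings tried (UTF-8, cp1252, Latin-1) are my assumptions, not whatever the actual c2 data contains:

```python
# Sketch: decode an entry whose bytes may be in one of several
# encodings. Try strict UTF-8 first; fall back to cp1252, then
# Latin-1, which never fails since every byte maps to a character.
def decode_entry(raw: bytes) -> str:
    for encoding in ("utf-8", "cp1252", "latin-1"):
        try:
            return raw.decode(encoding)
        except UnicodeDecodeError:
            continue
    # latin-1 cannot fail, but keep a safe default anyway
    return raw.decode("utf-8", errors="replace")

# Valid UTF-8 decodes cleanly...
print(decode_entry("café".encode("utf-8")))   # café
# ...while a stray 0xE9 byte falls through to cp1252
print(decode_entry(b"caf\xe9"))               # café
```

Note this only handles one encoding per entry; if encodings really are mixed within a single entry, you'd need per-byte-range detection (e.g. something like chardet), which is presumably why it's such a pain.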


See my comment floating somewhere in this thread.

It took some digging, but I managed to grab a dump of every page in JSON. He hid it well, but I scraped everything and put it up for people to see (it's in its original markup form, and I'm working on converting it back to HTML).
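For anyone curious what that conversion involves, here's a sketch of one step, using the classic wiki rule that a CamelCase word links to the page of the same name. c2's real markup has many more rules, and the `/wiki/` URL scheme here is my invention, not the scraper's actual output:

```python
import html
import re

# A CamelCase word: two or more capitalized runs joined together.
CAMEL = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")

def camel_links(text: str) -> str:
    # Escape the raw markup first, then turn CamelCase words
    # into links to a hypothetical /wiki/ path.
    escaped = html.escape(text)
    return CAMEL.sub(r'<a href="/wiki/\1">\1</a>', escaped)

print(camel_links("See WardsWiki for more."))
# See <a href="/wiki/WardsWiki">WardsWiki</a> for more.
```

The rest of the markup (indented code, emphasis, horizontal rules) would need similar line-by-line passes.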



