Tell HN: Thank you for not redesigning Hacker News
1831 points by ramphastidae on Sept 1, 2019 | 390 comments
I’m currently in a country with low-speed internet, and the entire ‘modern’ web is basically unusable except for HN, which still loads instantly. Reddit, Twitter, news and banking sites are all painfully slow or simply time out altogether.

To PG, the mods and whoever else is responsible: thank you for not trying to ‘fix’ what isn’t broken.




Right now I'm traveling on a train somewhere in the Smolensk region of Russia, and HN is the only site I could use normally with sporadic EDGE/3G. I had exactly the same thought as in the title. Thanks HN!

+1 for a dark theme though; reading from a bright screen in midnight darkness hurts the eyes.

Another frequent thought I have during such trips: why do so few sites have any basic offline support? E.g. while writing this I moved into an offline zone, and if I press 'add comment' now then most likely I will lose it. So I have to copy it and save it temporarily in a Gmail draft. But it would be nice if all drafts were stored in LocalStorage/IndexedDB as I type, until posting is acknowledged.
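
Something like this would be enough; a minimal sketch, assuming a plain textarea (the selector is illustrative, not HN's actual markup):

  // Persist the draft to localStorage on every keystroke, so it survives
  // reloads and tab kills. Selector names are illustrative.
  const field = document.querySelector('textarea[name="text"]');
  const key = 'draft:' + location.pathname;

  field.value = localStorage.getItem(key) || field.value; // restore a saved draft
  field.addEventListener('input', () => localStorage.setItem(key, field.value));

  field.form.addEventListener('submit', () => {
    // Simplistic: a robust version would clear the draft only after the
    // server acknowledges the post, in case the request fails offline.
    localStorage.removeItem(key);
  });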

Actually, it is much easier to lose written text on mobile. I just opened 5 apps, which took all the Android memory, kicked Chrome out of RAM, and forced a page reload when I returned to it. The page reloaded from cache instantly, but the comment draft was lost (I'm continuing this from a Gmail draft, waiting for stable 4G at the next train station near a mid-size town).

GitHub's issue comment form keeps its content if I accidentally click on a link (e.g. Pull Requests) and then return to the issue page, not just by going back but by jumping over several GitHub pages (e.g. Home->Repo->Issues->#issue), and such behavior is much better than an alert about unsaved content on tab close. In GitHub's case this is probably an accidental side effect of SPA state storage, but at least once it saved me from losing a large, complex comment with links and markdown; that is how I noticed the behavior. GitHub still loses content on page close, but keeps it after a page reload.


I turn HN into dark mode instantly by applying the negative color filter built into Android (I assume iOS has something similar as well). On Android the option can be found under Accessibility, and it can also be assigned to a combination of hardware keys for quick switching.


Just looked into the settings on my phone; it doesn't have a dark mode, but I tried Color Inversion. It's enough!


Privet :) ("hi" in Russian)


There should be an extension that automatically redirects common websites to their lite versions.

reddit.com -> old.reddit.com

facebook.com -> mbasic.facebook.com

cnn.com -> lite.cnn.com

npr.org -> text.npr.org

twitter.com -> twitter.com (with JS disabled)

gmail.com -> gmail.com (HTML version)
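
A minimal sketch of how such an extension's content script (or a userscript) could work; the hostname mapping below mirrors the list above and is purely illustrative:

  // Redirect known hostnames to their lite equivalents on page load.
  const lite = {
    'www.reddit.com': 'old.reddit.com',
    'www.facebook.com': 'mbasic.facebook.com',
    'www.cnn.com': 'lite.cnn.com',
    'www.npr.org': 'text.npr.org',
  }[location.hostname];
  if (lite) location.replace(location.href.replace(location.hostname, lite));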


Twitter is barely usable with JS disabled. Any time you follow a link to it you'll get a worthless interstitial ("Would you like to proceed to legacy Twitter??") with only one button: "Yes." As of recently, if you have cookies disabled for their domain as well, you'll end up in a redirect loop and never be able to see the 140 characters of text.

As for Reddit though, if you log in with an account you can set it to always use old Reddit in preferences.


The current workaround to those issues for Twitter (while it still lasts) is to set the user agent to Opera 12. To handle that for Vivaldi/Chromium I use a nice little user agent extension [1] that allows auto UA switching based on domain.

[1] https://github.com/eminentspoon/chrome-extension-useragent


Might work with Internet Explorer too. On IE11 Mobile I always instantly get the static Twitter page. It loads way faster than the modern JS version on my Note 7, which has four times the RAM and twice as many CPU cores. It's both hilarious and sad.


No, you can set "Use new Reddit as my default experience" at the bottom of the page, under "beta options".

https://old.reddit.com/prefs/

There is no "always use old reddit" option.


Hmm, it always shows me old reddit when I log in. I just tried logging out, got new reddit, then old reddit again as soon as I logged back in.

Are you saying you have that checkbox you pointed out unchecked, and still get new reddit by default?


I navigate directly to old Reddit. However, if links within Reddit (in comments, sidebars, wikis) specify the full domain, I end up at new Reddit.

I basically never ever want to be there. I've specifically not enabled JS for 'www.reddit.com' so that the site won't load, but I still have to fix URLs manually when directed there.

What I'd like is for any 'www.reddit.com' I encounter on the site to be rewritten to 'old.reddit.com'.
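
A userscript could do that rewrite on every page load; a rough sketch (it won't catch links injected by scripts afterwards):

  // Rewrite any www.reddit.com links in the page to old.reddit.com.
  for (const a of document.querySelectorAll('a[href*="//www.reddit.com/"]')) {
    a.href = a.href.replace('//www.reddit.com/', '//old.reddit.com/');
  }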


I log in to www.reddit.com, and have the "use new reddit" box unchecked. Anytime I click a link to reddit (including np.reddit links) it honours my preference of avoiding new reddit (checked just now by a quick reddit search on site:reddit.com)

So maybe this is a misfeature on old.reddit.com, or you don't have the box unchecked?


As an example: visiting the AskHistorians wiki in old Reddit, all the links redirect to new.

https://old.reddit.com/r/AskHistorians/wiki/index


Ok, well if I visit that url and click links to reddit posts, like [1] under April Fools, they go to www.reddit.com but my preference setting of not seeing new reddit is honoured there. (i.e. looks the same as if I did s/www/old/ on the url.)

Not sure if you're saying that setting's not working for you or you're just annoyed that it doesn't explicitly link to https://old.reddit.com../blah/..

1: https://www.reddit.com/r/AskHistorians/comments/b7rums/why_d...


I think they meant if someone posts a link that's not an np.reddit link, but a full link like `https://www.reddit.com/r/europe/`


If you uncheck that box though, you will always be taken to old Reddit unless you specifically go to new.reddit.com


It's _meant_ to do that, but from personal experience (and, from what I've seen, one that others share), keeping old Reddit enabled is extremely finicky (apparently it's an issue with the way their server handles the old/new check) and it'll randomly switch to new Reddit.

I personally wound up using an extension[1] to keep old Reddit active, which so far has taken care of the weird behavior.

[1]: https://addons.mozilla.org/nl/firefox/addon/old-reddit-redir...


twitter.com -> m.twitter.com appears to be the mobile, non-JS version. (I don't use it to tweet, but I get linked to it to read some things sometimes, and that functionality works fine.)


Hm, so maybe all I need is an extension to rewrite twitter links to the mobile version. Will look into, thanks!


>gmail.com -> gmail.com (HTML version)

Gmail used to have an option to set HTML as the default, but it seems like it's gone.

The only way to get HTML is to click a link that's available while the page is loading; there's nothing in the settings to make it permanent.


Just use this as your bookmark: https://mail.google.com/mail/u/0/h/


For anyone interested, I just put together a userscript with some rudimentary key assignments for the Basic HTML version of Gmail:

https://pastebin.com/raw/kHyZXcTY

    gi    Go to inbox
    gk    Go to first label inbox
    t     Toggle checkbox of current mail
    mr    Mark checked mails as read
    mu    Mark checked mails as unread
    k     Navigate up in mail list (shift twice as fast)
    j     Navigate down in mail list
    h     Go to older mail(s)
    l     Go to newer mail(s)
    Enter Open mail
    r     Reload
    b     Back
    n     New mail


Thanks a lot! This was the only thing I missed from the JS version, but it wasn't worth the slowdown/heaviness that that version brought with it. Now this gives me the best of both worlds.


Thanks. I just noticed a bug though: you can enter the search field by moving all the way to the top pressing arrow up or k. You can now defocus the search field by pressing Escape; this works now, and I posted it on Greasy Fork.

https://greasyfork.org/en/scripts/389680-google-mail-gmail-b...


Thanks. This actually works.


You can definitely make it permanent, it's permanent on my account. Can't remember what I did now...


Generate enough outcry on social media?


After you've clicked on it, there's a bar at the top with an option to "Set basic HTML as default view"


s/there's/there was/

It's gone. That's why I'm posting about it.


It still displays "Load basic HTML (for slow connections)" at the bottom right of the loading page. :/


Yes it does. Like I said in my first comment, I have to click it every time I load gmail, because "Set basic HTML as default view" is gone.


On Firefox I use Redirector: http://einaregilsson.com/redirector/

I have it set up to handle Reddit and Wikipedia. News sites and Twitter don't get to run JS thanks to NoScript, so I usually don't have to do anything extra to get a lightweight experience there.


+1 for Redirector. I eventually got sufficiently sick of www.reddit.com that I searched for solutions and ultimately chose Redirector to replace it with old.reddit.com. It lets you use either a "wildcard" or a "regex"-type match; I chose the former, and this is what I've been running:

  Redirect:
  *//www.reddit.com/*
  to:
  $1//old.reddit.com/$2


There are a couple for Reddit at least. For some of the others you can get by with a bookmark, so typing "facebook" would take you to your mobile-site bookmark.


Switcheroo Redirector for Chrome has worked for me to redirect amazon.com links to smile.amazon.com


Funny thing is that I'm pretty sure CNN Lite is a server-rendered React app, or was at some point.


Well volunteered! You can add:

* amp link -> non-amp version of same link

while you're at it, please!


reddit.com -> old.reddit.com -> i.reddit.com


I love hearing stories about people choosing to forgo large frontend frameworks, or even not using JS altogether.

Is there a community or name for this? If not, I'm going to call it #neverscript. I find the web works far better, more often, for me when SPAs and JS aren't used. There's something about minimalistic sites that's very appealing to me. I know it doesn't work for all sites, but a lot of the web could be improved by reducing the frontend bloat.


When you're not using frameworks/libraries, it's typically referred to as vanilla js. Probably the name stuck thanks to this amazing satire of a js library. http://vanilla-js.com/. This was basically around the time browsers got good enough that jQuery was no longer needed. It essentially emphasized that all the showcase features of popular 'must-have' js libraries were now a standard part of modern browsers.


>Probably the name stuck thanks to this amazing satire of a js library. http://vanilla-js.com/

Using "vanilla" to describe a standard, unmodified and un-customized version of something is a very common idiom. This satirical website followed the convention, it didn't coin the expression. I don't think it's even jargon, it's just based on the meaning of the adjective "vanilla":

Lacking adornments or special features; basic or ordinary: "a delicious twist to a vanilla plot" (Ian O'Connor).


Of course. But within the JavaScript community, it has taken on specific meaning: no 3rd party libraries at runtime.

For "vanilla js," I can still have compilers, type checking, advanced ECMAScript 2019 features, local storage, webgl, etc. Without that understanding, you could qualify something as "vanilla js" in lots of different ways.

If someone said "vanilla c++," they might mean c++ without the recent language features, or they could mean the latest language spec but no standard library. Maybe it's still okay to link OpenGL but not okay to use a desktop GUI framework? Maybe OS frameworks are okay but nothing more? There's no communal understanding there that I'm aware of.


What if you don't want to use JS at all? I want to cater to people who want to turn off JS altogether.


It would just be called non-JS. However, you don't need to avoid JavaScript completely to cater to non-JS users. The term "progressive enhancement" is used for a site that works without JavaScript but gets enhanced when it loads. Hacker News is like that: the links/buttons would trigger full page loads, except the JavaScript cancels those actions and handles them without one. The vote button won't refresh the page, for example.
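
A sketch of the pattern; the markup and endpoint are hypothetical, not HN's actual code:

  // <a class="vote" href="/vote?id=123&how=up">upvote</a>
  // Without JavaScript the anchor triggers a normal full page load; with
  // it, the click is cancelled and the vote is sent in the background.
  document.addEventListener('click', (e) => {
    const link = e.target.closest('a.vote');
    if (!link) return;  // not a vote link, let the browser handle it
    e.preventDefault(); // cancel the full page load
    fetch(link.href);   // register the vote without refreshing
  });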

I personally don't think there's a market for non-JS users. Most of the web is pretty broken without it. I've never encountered someone in the real world who lives this way, even in the tech world. At most, someone will install a no-script browser plugin, which makes whitelisting and enabling JavaScript dead easy. The only legitimate non-JS use cases, IMHO, are extreme political cases where you'd also be using Tor.


> The web is a bloated, over-engineered mess. The Lean Web is a set of principles for a simpler, faster world-wide web.

https://leanweb.dev/


I have a hobby webcomic, and it made me realize how little code you need for any website. It works great on any device and it is 2 MB, and that 2 MB is the actual comic page. It has every navigation and contact button you could want. And I believe it is about 80 lines of HTML total.

It boggles my mind how bloated and downright creepy the modern web is today.

A webcomic is a far cry from a mainstream site, but it served as a good window into how others abuse their viewers with bloat and surveillance.


It doesn't have a way to interact with the server (e.g. chat/feeds etc.), which most sites need to have.


I use barba.js to make my site load faster, but at the initial expense of a JavaScript payload to enable the XHR page delivery. It would be cool if browsers offered an experimental flag allowing the page to be fetched with XHR by default when possible. Then static sites could be even faster, no JavaScript download required, because of a little browser magic.
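
For reference, the core of what such a flag would automate is small; a rough sketch of the fetch-and-swap technique that libraries like barba.js build on ('#container' is an illustrative wrapper id):

  // Intercept same-origin link clicks, fetch the next page, and swap only
  // the main container instead of doing a full reload.
  document.addEventListener('click', async (e) => {
    const link = e.target.closest('a[href]');
    if (!link || link.origin !== location.origin) return;
    e.preventDefault();
    const html = await (await fetch(link.href)).text();
    const next = new DOMParser().parseFromString(html, 'text/html');
    document.querySelector('#container')
      .replaceWith(next.querySelector('#container'));
    history.pushState(null, '', link.href); // keep the address bar in sync
  });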


I browse (and can browse) HN without JavaScript.

The only useful feature JavaScript enables for me is comment hiding.


I wish they would update the line-height to something more reasonable. ~1.5x the font size would go a long way toward paragraph readability. Other than that, it's fast and stable; about all I require from a forum.

Edit: All my lovely responders, I appreciate the feedback, but I don't want to install addons/mods/hacks to fix line-height.


Another awesome thing about simple sites like this is you can hack the CSS really easily with something like Stylus.

https://addons.mozilla.org/en-GB/firefox/addon/styl-us/


Yes, it is good. (I tend to only use it for sites that already use CSS; for those that don't, the default is almost always good enough.) Even on Hacker News, I use it to do two things: make all comment text black, and display dotted lines for the comment indentation so the indentation level is easier to see (it is implemented as simply a tiling background picture). (For more complicated web pages, I often need to add a lot more stuff to make them work reasonably.)


You can literally do something like what parent comment wanted to do in three (3) minutes, including installing the addon. Stylus is great.


HN is trivially editable with userstyles. I personally use one for a dark theme; I think you can install a userstyle of your choosing on all browsers from which you access HN.


> Edit: All my lovely responders, I appreciate the feedback, but I don't want to install addons/mods/hacks to fix line-height.

The line-height looks perfectly fine to me. AFAIK the HN CSS doesn't force any specific height but uses the defaults. In fact I'm sick of the "modern" absurdly-low-density design trend that just makes me need to scroll more to see less.

My browser has a zoom function, and so should yours. I don't understand the resistance against customising a site --- it is a user agent working for you, and the CSS spec even mandates a user-stylesheet (which I know some browser(s) blatantly disregard). Different users will obviously have different preferences.


My default zoom is 150%. This breaks a lot of websites, but maybe they don't care about accessibility anyway. Like OP said, I have not had any issue with HN for years.

I recently started keeping the DevTools console open when I am using Chrome. I have some JS to collapse comments here on HN.

I have seen some interesting things in the console, so I just keep it open now:

- some sites have shit tons of JS errors and yet function as normal

- some companies put job ads in there for JS people (Medium is doing it right now)

- some libraries show off their awesome ASCII logos in the console

- debug statements are left there unintentionally (I wonder if there is a tool that could strip these out)


Does your JS do something differently than HN's native comment-collapsing? You can just click the "[-]" at the end of each comment's "header".


What sites does zoom break? I've been on a 2x DPI screen for years and never noticed anything. The web is pretty good about this, even "pixels" are actually just length units not physical pixels.


Very few work correctly with zoom. One of them is http://motherfuckingwebsite.com/

And text has nothing to do with pixels, it's not raster.


Can you give an example of one that works poorly with zoom? Like I said literally all of the browsing I've done in the last couple of years on my work PC has been with 200% zoom and I've never noticed a site be broken/different to my 100% setup.

I suppose if you have e.g. a 720p screen and do 200% zoom you might run into some problems but that's got little to do with zoom and more to do with the lack of space on your screen to put content anymore.


YC, Reddit, Wikipedia, GitHub, SO (very horrible), all Google sites.

Lack of space is not really a thing for HTML; rarely does a page fit on screen without scrolling and wrapping.


For PC Chromium-based browsers, use the extension "Zoom Text Only". It allows you to... well, the name is self-explanatory.

For Firefox, no extension is needed. Press Alt to show the main menu, then go to View -> Zoom and select Zoom Text Only.

Now zoom will only increase the font size, instead of the whole page.


> All my lovely responders, I appreciate the feedback, but I don't want to install addons/mods/hacks to fix line-height

I have, but it's a mark of frustration for me. It definitely falls under the heading of, "The user shouldn't have to do this."


In Safari: View → Zoom In

It's smart enough to remember the zoom level the next time you open HN too.


That's not increasing line-height. I don't have a problem with my eyesight, I have a problem with poor typography.


And I have a problem with acres of whitespace. Here's one vote for HN to leave the line height right where it is.


How do you live with a centered fixed-width table that is wasting 15% of the browser window with nothing but hectares of white space, then? Bumping the line-height to ~1.5x the font size has minimal impact on space efficiency, and adds tons to readability.


I get by


> And I have a problem with acres of whitespace.

We're talking about just 1.5x line height, which is a perfectly reasonable default for nice reading. Browsers have a lower default line height for historical reasons; that doesn't mean it's the right one. People don't use the Times font very often either.


Isn't that something that can be fixed with user-side CSS rules? E.g., this for Firefox: https://ffeathers.wordpress.com/2013/03/10/how-to-override-c...
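
For the line-height case, the rule would be tiny; a sketch for userContent.css, assuming HN's .commtext class name:

  /* Bump HN's comment line-height; .commtext is the class HN uses for
     comment text (as of 2019). */
  @-moz-document domain(news.ycombinator.com) {
    .commtext { line-height: 1.5 !important; }
  }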


I've had to use 64 kbps internet on vacation for the last couple of years; it's rough, but it's possible. RSS everything, then use the Inoreader mobile app and read it there. Unfortunately more and more websites use JavaScript for content, and that breaks the mobile version loader (it can parse text from the mobile version and show it to you). But stupid webdevs build even blogs and galleries with JavaScript, so it often says that it cannot render the mobile version. I hope Google will eventually return to non-JS content and the PageRank of those sites will fall significantly.


Besides the excellent minimal UI changes [1] [2], the HN leads have focused on the difficult core challenge of moderating the forum [3] [4]. Thank you.

[1]: https://news.ycombinator.com/item?id=10489499 "mobile markup"

[2]: https://news.ycombinator.com/item?id=12073675 "collapse comments"

[3]: https://news.ycombinator.com/threads?id=dang

[4]: https://news.ycombinator.com/threads?id=sctb


Amen. It’s the most pleasant site I use every day. Even Google search is beginning to slow down to a crawl in some hours (and Google used to be blazing fast).


You can redesign the website using modern technology and still make it insanely fast, probably even faster than its current state.


Theoretically.

You can but no one manages to do that!


The key is to start without JavaScript and then progressively enhance. There are very few reasons you need gigabytes of JS downloaded just to read a website.


you mean like reddit?


Here are some ideas; I hope you find some of them useful:

https://2019.jsconf.eu/news/how-we-built-the-fastest-confere...


This, and you can get a much better mobile experience to boot, not to mention fixing dozens of errors that crop up with prolonged use of this site.

Literally to post this comment, I got a, "We cannot serve requests that quickly please try again in a moment" message. Completely unnecessary in 2019, given how simple this site (as a concept) is.

"I hate change" <- basically the summary of this submission, and kind of sad.


Have you ever seen a SPA redesign that actually improved the usability of the site for you?


No one said anything about SPAs. You can keep the site modern whilst still serving content in separate pages with a little JavaScript here and there. Being ideologically against any and every form of JavaScript is being pretentious and holier-than-thou.

As for concrete examples of the above, see lobste.rs for an excellent interface that's still minimal and responsive. Typing this comment on HN took me to a separate HTML page for absolutely no reason; Lobsters doesn't do that and allows you to write inline, preserving the context above.

Old Reddit (the only acceptable way to use Reddit) is another example of good JavaScript. Fuck the new redesign, but there are countless subtle and useful features in the old design that are only possible because of a tasteful use of JavaScript.


> No one said anything about SPAs. You can keep the site modern whilst still serving content in separate pages with a little JavaScript here and there.

But that's not modern web development, that's jQuery-style coding. Modern web development is about heavyweight SPA frameworks like React or Angular.


Heavyweight SPA frameworks might be popular, but there are also tiny SPA frameworks like choo, svelte, and hyper.


I like Svelte, but it's not exactly tiny. And it still has the cost of generating everything with JS.


> Literally to post this comment, I got a, "We cannot serve requests that quickly please try again in a moment" message. Completely unnecessary in 2019, given how simple this site (as a concept) is.

That's a rate-limiting function to stop spammers and such. No one is questioning the ability of the servers to handle much higher request rates.

> "I hate change" <- basically the summary of this submission, and kind of sad.

It's "I hate useless change". What's sad about that?


The myopia required to think change that doesn't benefit you personally is therefore useless both astounds and bores me.


I generally agree, but a few lines of CSS would make it much more pleasant to read IMO:

- limit container width so lines don't get too long

- larger text and more line height

And not really CSS but:

- Making it easier to collapse comments on mobile

I know some of those changes would potentially destroy the look & feel that some seem to enjoy around here. However, on mobile it's currently only possible to enjoy HN via a third-party app.


I actually put together a userCSS to fix the first two points. I made it last year and haven't updated it recently, but it's still up in my repo: https://github.com/gonzalocesar/sightsafe-hn


It is because of this that I cringe hard whenever a developer mentions 'the modern web' or 'the free and open web'.

I wish websites were as simple as HN.

+ Lightweight.

+ No ads.

+ Works without (or little to no) JS.

+ Does not make me download MB's of data.

It seems 98% of the websites I visit cannot pass the first two.


I loathe the modern web. Speed is secondary to "hey, look what we're doing with our frameworks".


Modern front-end toolsets are finally starting to address this by using automated critical CSS, SSR + "hydrating" on load for interactive elements, webpack treeshaking + chunking of assets into tiny JS/CSS files that only load code based on what the page/routes ask for, PurgeCSS to get rid of unused CSS, cssnano/uglify/htmlmin/imagemin, etc. Webpack and these various popular tools have done a lot to make the web better and are default in many starter kits. But we're just beginning to see this go mainstream.

The basic shift back to pre-rendering + delivering only what's needed on the actual page is a big deal which hasn't got enough attention yet IMO. We went completely in the other direction for over a decade because we thought networks and browsers could handle it, but it was too easy to abuse and mobile took over.

There's no reason even very advanced websites can't load quickly, without having to fully compromise back to the 90s non-interactive websites.

News sites and marketing teams still ruin it regardless with all the crap they add on top. But it's far far better than loading a giant blob of 20 jquery libraries plus a big old angular.js or backbonejs app (sometimes both at the same time) encompassing a whole site in one file, even though you just wanted to visit the contact page.

I also blame Themeforest type developers because they're still stuck in the "let's add a hundred JS files so I can parallax and animate two divs" mindset.
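
To give a concrete sense of the treeshaking + chunking mentioned above, a minimal webpack sketch (the options are real webpack settings, but the config is illustrative, not any particular starter kit):

  // webpack.config.js: 'production' mode enables minification and dead
  // code elimination; splitChunks emits shared code as separate files so
  // a page only downloads the chunks its routes actually use.
  module.exports = {
    mode: 'production',
    optimization: {
      usedExports: true,              // mark unused exports for removal
      splitChunks: { chunks: 'all' }, // split out shared/vendor chunks
    },
  };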


> news sites

It's really a sad state of affairs when an industry with such a strong historical connection with publishing is now reasonably cited as an industry worse at publishing than most others.


Newspapers have always been full of ads fwiw


I'm talking beyond the matter of ads. On their websites I can use an adblocker, so in that respect the situation has actually improved.

They regularly fail on basic UX. Their ability to turn a kilobyte of text into 10MB of trash is astounding, as is their ability to cripple modern computer hardware, or simply readability, with their latest 'innovations' in totally unnecessary JS/CSS fuckery.

uMatrix blocking all JS, including first party, helps. But you've still got bullshit like floating headers that follow you when you scroll down the article to contend with. I find myself creating cosmetic filters to remove pointless floating clutter on news websites more than any other sort of site. The industry seems utterly incapable of presenting a clean and readable article online.


I would advise using your browser's developer tools to find the irritating web page js, css, or domain/sub-domain responsible, and then using uMatrix/privoxy/hostfile to block that garbage. Many sites are usable by preventing js entirely.


Reader mode on browsers is the easiest way to make news websites usable


I can often get the text of the article in Emacs, so I generally do that. My TTS integration is better in Emacs than Firefox anyway, so I actually prefer it for longer reading.

But regardless of our personal work-arounds or workflows, those websites shouldn't be like that. The general public is still subjected to the default experience and that bothers me.


Newspaper ads weren’t as noxious or ineffective as online advertising.

I don’t understand how online ads are as big a business as they are, other than the troll stuff on Instagram. The rest is pretty unremarkable and ineffective.

I can probably count on my hands the number of times I intentionally engaged with an ad in the last 25 years.


> I don’t understand how online ads are as big a business as they are

For one, they're online, which means cheap to make and pretty much free to serve.

Two, with the amount of tracking done, online ads are attractive to advertisers because they can evaluate their effectiveness in real time, and often tell which ads are responsible for which real sales.

Three, with all the accrued complexity, it's a wonderland for third parties who help advertisers navigate the online advertising landscape and divine attribution. Now the trick is, if your customer (the advertiser) doesn't have even basic competence in statistics (they most likely don't), you can bullshit them with data to your heart's content. Which is what you're probably doing if your team doesn't have much statistical competence either. I've seen this happen.


Yeah, but those fixes and workarounds are only almost enough to sometimes compensate for the code bloat that the frameworks themselves have caused.


People seem to be confusing web 2.0/Bootstrap-style design (see: Reddit's bad redesign) with the frameworks here, but the component-based approach combined with webpack/compilation steps, which is very much going mainstream, is fundamentally different from the top-heavy JS frameworks I spent most of the last decade using.

I don't think people have really seen what server-side rendering (a la Next.js/Nuxt.js), decoupled components, modern treeshaking, minification, and chunking can do for performance. We've yet to have a full web framework designed from day one to encompass all of this, but they're coming and getting better at it.

Outside of the endless 3rd-party crap the 'business' guys add, if anything, it's no longer the JS framework size and library cruft I'm worried about; it's how many object watchers I've got at runtime via Vue.js and reactive-style programming (which automatically adds Object.observe-style watchers to all data) + component initializers (which can add overhead to simple HTML templates). Fortunately, with the new Vue 3 function API there is a clear distinction between observed values and static/immutable/config data. The number of observers per page dropped dramatically. Otherwise, looking at a Vue project in the performance tab shows it's highly async and efficient.

The new Vue functional API (allegedly) is also going to be far easier to treeshake so you really only ever get the library features you're using. Even for something like Lodash or Ramda where you might use only 10% of the functions it's a big deal.

I'm sure React is making similar progress in this direction and it was already smaller than Vue.

CSS Frameworks have some ways to go and PurgeCSS isn't perfect but using a functional CSS framework like Tailwind or Tachyons also helps in this regard, so all of the stuff isn't intertwined and can be stripped out.

This type of tooling really does massively improve the state of things automatically, even with poor coding practices or large applications spanning multiple pages. SSR, treeshaking, chunking, stripping unused CSS, etc. are 'zero-cost' optimizations that I've seen turn 700-1000 KB of dev assets into a 100 KB base load plus a few 5-10 KB JS/CSS files depending on the page.


I think you and the parent commenter are coming at this from fundamentally different perspectives.

The parent commenter remembers a time when websites were server-rendered by default. The server would send down HTML, along with a small amount of CSS and JavaScript to style it and add interactivity. You didn't need an elaborate chain of compilers and bundlers and frameworks and chunkers to make sure you didn't ship several megs of dead code to the client; you just didn't ship several megs of dead code to the client. That's their baseline for web performance.

Your comment, on the other hand, seems to use the current giant-JavaScript-framework sites as the baseline for performance. And yeah, you can get a lot of cheap wins if you start from there. (For starters, you can break up your JavaScript execution into tiny slices so that, even if you're using 80% of a CPU core and draining the user's battery for no real reason, you're at least not blocking the browser UI thread.)


This is more than just wrapping stuff in cards and making the font 2x; it's about what websites can achieve. The design is a part of that, but it's also influenced by what you can do differently than with a pure HTML + a-bit-of-jQuery site.

I basically rebuild past-era Bootstrap-style Rails HTML views with some jQuery into modern Vue components for a living, on a B2B app.

I'm very familiar with the old way and how much better a proper modern replacement can be. I can pack a ton of information into smaller, flexible places, integrate action-relevant help boxes, and show only the parts of forms that are needed.

Page render times to first action are about 10x faster on average (no joke), and our users use 1000s of objects on a single HTML form. I'd love to see someone try rendering that with a pure SSR framework with partials in a performant and usable way.

This type of interaction extends well outside of my business/B2B domain as well. I'm quite excited for the future of web development after a decade of fearing more JS everywhere (which I was contributing to).

Another great HN-style site is Basketball Reference. I personally love these information-dense designs, but they don't scale with tons of information and elements, compared with what the site could achieve as a desktop application or with a fully flexible UI.

https://www.basketball-reference.com/teams/TOR/2019.html

This site could massively improve with real-time filtering, multi-select forms, interactive data export tools, multi-team/page comparisons, information highlights on hover, inline math tools, etc. Currently it looks great on the surface, but once you engage with any JS-y stuff, or imagine what would be possible as a desktop app, the existing tools are extremely limited in scope.

You can do this very powerful stuff now without clogging up performance and load times. That's a big deal. We've been dealing with slow, clunky JS components forever, and it can be way better.


> The design is a part of that but it's also influenced by what you can do differently than a pure HTML + a bit of jquery site.

> I can pack a ton of information into smaller flexible places and integrate action-relevant help boxes and only showing the parts of forms as needed.

> This site could massively improve with real time filtering, multi-select forms, interactive data exports tools, multi-team/page comparisons, information highlights on hover, inline math tools, etc.

I guarantee that it is possible to write any of these features in traditional HTML, CSS, and JavaScript without relying on today's frameworks or build infrastructure. I know this because even the most advanced build pipeline in the world must eventually compile down to traditional HTML, CSS, and JavaScript; that's all browsers run, no matter how many layers you add to try to abstract yourself away from it. I also know this because websites existed that had these features prior to React becoming popular.

> We've been dealing with slow clunky JS components forever now

That's the thing -- there was a time before slow, clunky JS components. (They used to be slow, clunky Flash components :P)


Or you can just use Elm from the start and not concern yourself with code splitting and tree shaking, and still get smaller assets. See for yourself: https://elm-lang.org/news/small-assets-without-the-headache .


Seeing the parent post you're replying to reminds me a lot about how mainstream compiler optimisation works: create a lot of crap "unoptimised" output, then add many additional layers of complexity in an attempt to get rid of the unnecessary crap, instead of the quaint idea of simply not creating so much crap in the first place.

"All problems in computer science can be solved by another level of indirection, except for the problem of too many layers of indirection."


A modern CSS framework would increase the font size, change the font type, and make the reply link a fat button. It would put a card around each parent reply thread.

Getting rid of unused CSS helps, but the additional markup would make the page 4 or 5 times larger.


A "framework" doesn't do that. A designer does. Or whoever decides to implement it.


I see a trend away from that in 'trendy' CSS frameworks like http://tachyons.io/ and https://basscss.com

The Bootstrap stuff you seem bothered by is better for splashy, big-headline marketing sites, not high-density content like HN.

A lot of web designers cut their teeth designing marketing websites, not app UIs, so it's not surprising when I see it mindlessly used on sites like Reddit's new redesign.


Iterating without a destination in mind. Assigning people to work on something vs. finishing it. This is in stark contrast to games, which get released, get stabilized, and stop development. This model would suit a lot of software, aside from security issues or dependencies necessitating updates, but if we want we can just iterate forever.


I’m always amused when the frameworks being used advertise things like “We make coding fun!”

I tend to think “but what about making the web enjoyable for the end user?”


>I’m always amused when the frameworks being used advertise things like “We make coding fun!”

I understand where they're coming from, having dealt with a few legacy frameworkless spaghetti codebases that have no documentation. Give me bad code to maintain in a known framework over bad code without a framework any day. Adding features and debugging in those conditions is no fun at all.


The end user is paid to deal with whatever we provide them.

But more importantly, the fact that the page loads 2 seconds longer is a drop in the bucket compared to the insane processes our business people dream up.


End users don't care. When they see a slow website they think "Awesome, now I have a reason to buy that new iPhone Jessica has because this one is getting slow"


It’s interesting to note that iPhones render JavaScript faster than laptops/desktops.

https://mobile.twitter.com/codinghorror/status/1049082262854...


I understand where this is coming from; however, whenever I interact with an app that reloads the whole window on every major interaction, I start getting frustrated because of the jarring, slow transitions. Most modern web apps are faster after the initial page load, albeit on my 15" 2018 MacBook Pro.

Stop blaming the modern web. There are plenty of slow PHP and Rails sites and plenty of slow React-based sites. Of course speed is second priority; I'd much rather have users using my app with a mediocre experience than nobody using my app. However, as the web supports more tooling and frameworks continue to evolve towards WebAssembly, I believe we'll start to have web apps approaching native speed.

The reality is that requiring a server to render every page limits what you can do with your app. Even if modern tooling is overused, developers will always move towards what allows you to do more.

I'd much rather focus on making modern tooling faster than complaining about the modern web.


You can have a very tiny website with custom onclicks to load / change elements via callbacks. The bloat of JS frameworks to do everything including generate the actual HTML client side is totally separate from having dynamic page content.

I always strive to keep the client side stupid. I should be sending as much of the final document as possible from the server and any interactivity needed should also produce as little work for the client as possible.


To a certain extent, it's developers serving themselves, not the user. Your job is not to show off the latest frameworkify.js and get internet points when you blog about it. It's also not to make your life easier (like by including a 1MB js library that saves an afternoon of your time). It's to make the user's life better. Just do your damn job! \end{rant}


Unfortunately I suspect a lot of the decisions are driven by resume-padding, which is itself a phenomenon started by management who have no idea of the implications of using $latest_web_tech beyond the fact that it's "new and modern", and so ask for the dozen latest frameworks in everything they do.

If the companies hiring web devs changed their mindset, so that listing a dozen different frameworks on your resume was perceived as the mark of the negative, trend-chasing architecture astronauts such developers often turn out to be (at least in my experience), and simple JS skills and experience were valued instead, I suspect websites would end up a lot different than they are today.


If HR departments wouldn’t set hard limits on percentage raises that are allowed and then hire new developers at market rates causing salary compression and inversion, developers wouldn’t have to be constantly on the look out for their next job and always stay focused on Resume Driven Development.


Saving an afternoon of your time means spending an afternoon working on more things the user wants. With that logic we should be writing web apps in pure assembly to shave off as much loading time as possible despite the fact that in the time you could have built the whole application, you are still working on the login page.


I love the power of modern web apps built using modern frameworks. I hate that ordinary content websites decide they need to be built using such frameworks.


Secondary? If only. It's more like not even considered at all.


Honestly, with modern things like webpack, minify/uglify, and CDNs, web speed should be much better on low-bandwidth connections than in the past.


It depends on how you use it. Size matters. On a site I manage, we send about 64KB of JavaScript and we are working to make that smaller... and the main site functionality works when JavaScript is disabled.

In contrast, many sites seem to think that 1MB of JavaScript, or even 5MB, are acceptable, and they can't even show simple static content when client-side JavaScript is disabled. On a slow link, 1MB of JavaScript means that the user has probably given up. And it's not just the size. 200KB of HTML is far far faster than 200KB of JavaScript; web browsers are highly optimized for handling HTML, while processing JavaScript is necessarily much slower. JavaScript is great for some things, but like a sledgehammer it's not always the best solution. Too many people aren't considering the real end-user experience. "Pretty graphics, but it takes 5 minutes to download" is usually not a worthwhile tradeoff.

To paraphrase Goldblum:

Your web designers were so preoccupied with whether or not they could use JavaScript, they didn’t stop to think if they should.


My Parcel setup for a fairly simple app spits out a 10 MB release build. We still have some way to go.


How many libraries did you add? Last I checked, the major ones were about 1 MB.


I think the AWS library is pretty insane.


I'm currently living in a country and city with high speed internet, and I would also like to thank all who are responsible, for not trying to 'fix' what isn't broken.


I like

https://text.npr.org/

for this as well.


https://duckduckgo.com/tty/

Well, OK, that's actually NOT lightweight.

But this is:

https://duckduckgo.com/lite


Sweet, now I can visit duckduckgo lite in firefox lite. Thanks!


https://lite.cnn.com/ as well - 107 KB


I think Gopher could be good for text-only versions of stuff like that.


100% agreed. A dark mode option would be cool, but it's very easy to get one with user themes and alternative UIs.

I bet HN will look almost identical 10 years from now, too. I wonder how many popular websites can boast such a consistent design from their beginning?


I hacked together a quick dark mode userscript for HN that respects your OS choice of dark/light colour scheme (using the prefers-color-scheme media query).

https://gist.github.com/1player/85d146a4aa0c2afc78bab63582f5...

Some things like the login page are not styled correctly, but it works well enough to browse and post comments.
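
The core of the trick is just a media query; a trimmed-down sketch of the approach (colour values are illustrative; #hnmain is HN's main table):

  /* Apply dark colours only when the OS reports a dark preference. */
  @media (prefers-color-scheme: dark) {
    body, #hnmain { background: #1b1b1b; color: #c9c9c9; }
    a { color: #8ab4f8; }
  }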



I love how fast and simple HN is. I wish the touch targets were bigger on mobile. Shouldn't be too hard with just a little CSS


On the other hand voting is a pain on mobile, even with a large screen I often misclick.


Let alone collapsing comment threads. I end up on the commenter's profile 50% of the time.


I personally believe this is one of the couple of reasons why HN's demographic is significantly on the older side, in tech terms.

I personally wish HN attracted a bit of a fresher, younger crowd, as the conversation here has become more stale over the last few years.


I wish there were more preteens and teens on the site. By the time someone reaches 20, their ideas are a little stale, shaped by the harsh realities of life.


I have found that many people online are not necessarily forthcoming with their age. Hacker News might have more people of that demographic than you think it does.


On the other hand, such a demographic often comes with a lot of drama.


I think that comment was sarcasm.


Think so? I didn't get that feel, but it can be tough to tell sometimes.


HN is lightweight in some ways, but also quite bad in others. For example, every single time you vote on anything, the site sends back a 302 redirect and then a copy of the entire page's HTML in response. This means that every vote results in a response that's usually around 10KB gzipped.

As of right now, for me, voting on any comment in this thread currently causes a 14KB response and takes about 1.3 seconds to finish.


    This means that every vote results in a
    response that's usually around 10KB gzipped
I have Facebook open in another window and opened the network inspector to compare it:

Just moving the mouse towards the like button created 824KB of data transfer. Not sure if it is because of tracking the mouse movement or if it preloads stuff it thinks my mouse is moving towards. Or maybe it loads stuff that it would display if I held the mouse over something longer. No idea. But it loaded 824KB without me interacting with anything. Just because I moved the mouse.

The actual click on the like button added 30KB on top of that.


Sure, of course it can be much worse, and almost all other sites are. I'm just saying that HN could do it better, because there's no need for it to send a redirect or that full-page response unless the user has JS disabled.

With JS, they could send a completely empty HTTP 200 response and it would work exactly the same (it does nothing with the response anyway). Right now HN is probably sending multiple gigabytes worth of responses to voting every day that are all just thrown away.


>almost all other sites are.

That's their point. You're not wrong, you can always be better, but HN is barebones compared to almost all (using the mathematical definition...alright, facetiously using it) other sites.


Comparing to Facebook is probably the worst possible comparison, though, and doesn't seem representative of most other sites.


Only a very few sites are not bloated Facebook clone abominations.


He is wrong. It is not “quite bad”.

I appreciate your generosity to him but I am sad to see that is the top comment.


How could it possibly be considered a good thing? I've seen dang say before that they've been running into performance issues, and I bet that they'd see a significant drop in resource usage solely by not generating entire comments pages just to throw them away every time anyone votes on a comment. The response is literally not used at all, they're doing that work and sending that data for no reason.

There are multiple downsides and not a single benefit to the current method.


It's implemented this way as a balance between a simple implementation and staying lightweight. Not to mention, there is at least one benefit: the voting mechanism doesn't depend on JavaScript.

And if you are not authenticated, you are instead taken to a login page. Logging in then gives you the redirect and the upvote is registered.


So just to clarify, this behavior seems to be by design. In that case it does seem the original comment is objectively wrong if the code is readable and maintainable and accomplishes the desired behavior.


As parent comments mention, sending back the whole-page response when you vote on a comment is a great idea if you have JavaScript disabled or aren't logged in.

But it'd be a quick performance win if it didn't do that when you were logged in with JavaScript enabled. The JavaScript part of this is actually already implemented -- that's why the page doesn't refresh when you upvote my comment -- so you'd just need one if-statement on the server side to send a blank response instead of a redirect.

Sure, 10 KB of download and the server CPU time needed to re-generate the listing page is probably pretty negligible, but multiply that by the number of votes cast on Hacker News every day and it starts to add up.


Exactly, and it even adds up quickly for a single user in common cases like voting on a bunch of comments while reading a thread.

If someone loads this thread right now, the initial page load is 61KB (gzipped). If they read down the thread and vote on 30 comments while they're doing it, they transfer an additional 480KB (16KB/vote [1]). There's no reason that they should be receiving 8x more data by voting on a relatively low number of comments (~10% of the thread) than they did when loading the entire page originally. It also has to re-generate the page an extra 30 times unnecessarily - a repeatedly voting user becomes like a heavy multiplier to your traffic.

[1]: Also, I just noticed that the responses back from voting actually aren't the full page, they're truncated. It seems like maybe the response size is capped at 16KB gzipped? That implies that there's already some kind of special handling for the voting responses.


From what I can see, the response only contains the comment you upvoted and its children. It's important for people like me who don't have JavaScript, but the JS voting system could probably do with a simple HEAD request.


Seconding. Sending back the whole page on AJAX upvote seems unnecessarily wasteful, and sounds like something trivially fixable.

Taking data from 2.5 years ago[0], HN has ~300k daily uniques. Assuming 10% of those are logged-in users, each upvoting one story, that's 30k unnecessary page renders per day. That's ~2% of the total views (though the more expensive views - AFAIK when you're logged out, you get cached pages?), so on the one hand it's not much, but on the other hand the fix should be trivial and have minimal impact on code readability.

--

[0] - https://news.ycombinator.com/item?id=9220098


It is possible to do both: keep the link as it is now, but add JavaScript which replaces it with one that makes a request returning nothing (if you are logged in). If JavaScript is disabled or unimplemented, it will still work.


You can return NO_CONTENT on a successful request, and the browser will do nothing.
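
For illustration, the server side could look something like this sketch; Express stands in here (HN itself is written in Arc, not Node), and recordVote is a hypothetical helper:

  const express = require('express');
  const app = express();

  app.post('/vote', (req, res) => {
    recordVote(req.query.id, req.query.how); // hypothetical persistence call
    res.status(204).end(); // 204 No Content: the browser stays put
  });

  app.listen(8080);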


> How could it possibly be considered a good thing?

It will work with any possible browser under any reasonably foreseeable condition with no maintenance on the part of the HN crew.

> I bet that they'd see a significant drop in resource usage solely by not generating entire comments pages just to throw them away every time anyone votes on a comment.

If there is a serious performance problem they will presumably do some analysis and make changes related to the most serious bottleneck.

> There are multiple downsides

There are multiple irrelevant hypotheticals. If anyone cares about the heavy-weight HN, I hope they don't accidentally click the link to the article and discover the wider internet with images and whatnot.


True. And I can see that some websites gradually stop working if you are using JS blockers. For example, AliExpress used to work fine with uMatrix. Now, if I click Next in the paginator, it behaves like it loaded the next page and jumps to the top, but it didn't; it just changed the URL. So I have to Ctrl+R to reload the whole page to force their server-side rendering. I know it is theoretically my problem, but I have to use JS blocking to stop (for me useless) ads and malware scripts. If they just loaded content from their domain or subdomains, it would work perfectly fine. And if it were an old-school server-rendered website, it would probably work faster, as was discussed here some time ago. I have a feeling that web devs are sometimes over-engineering things. A web browser is an HTML viewer, not an operating system.


I have a gigabit Internet connection with top of the line laptops and Facebook still makes me wait for a second or two simply to show me 5 notifications.


You may have gigabit, but what’s your ping time?


To Facebook.com, about 25 ms.


They might be tracking your browsing habits. For instance if you hover over the like button and rest the cursor there for a moment but you subsequently don't like the post, it could mean that you thought of liking that post but you did not do it for whatever reason. Or they are just prefetching pages.


This is a diversion from the point being made.

Facebook is a very different site and could be lazy-loading anything; do you know the mouse move was causal? Was there a window.setTimeout initiating it? Facebook is not a site I'd expect to use on a low-bandwidth connection, and if I did, I'd use a low-bandwidth version if possible (like Gmail's HTML-only version).


On lower internet speeds Facebook loads a lite version, which is super fast (it loads better than HN for me) and has most of the basic functionality.


For anyone curious, http://mbasic.facebook.com/ is one of the least data-intensive ways to access Facebook.


It also lets you access your messages from a phone.


This exactly.

No need for an app or anything.

I've been using mbasic on my phone since I read about it here on HN (thanks, HN!). It's perfect and less intrusive.


The actual like is pretty small, about half a KB. Just a light graphql query that responds with updated counts for different reactions (happy/sad/etc). They definitely batch it with some other stuff though. Batching is really important for browser performance.


Does this explain why my votes on mobile often don't go through on the first click?


You don't have to use the actual words "what about" in your whataboutism post.


Let's put it this way: if HN cleaned up their act, the next time "whatabout facebook?" came up, there would be a larger delta to whatabout.


When reading on an iPad, tapping on links is nearly impossible.

A better responsive CSS file wouldn't make the site any slower or degrade the experience for laptop users.


Hehe, I had never thought of this, but you're right. It's not just an iPad, either; it's the same on my Microsoft Surface.

I got so used to pinch-to-zooming, for so many years, I guess I didn't even consider that usability flaw.


This. Please spend a little more time on the CSS. And a somewhat bigger up/downvote arrow would be great; I don't have fat fingers but still accidentally click the wrong one every now and then.


I always need to zoom to vote or collapse threads on mobile


Yes, me too. I suggested adding more spacing between upvote / downvote links for usability on mobile.


I had the same issue on mobile. Annoyed me so much, I made a browser extension that increases the padding around those and the collapse buttons. Not useful for you, but for anyone using Firefox for Android, see here: https://addons.mozilla.org/en-GB/firefox/addon/sihn/

(It could use some tweaks to keep everything aligned, but the small usability tweak was most important to me.)


Thank you for this! Just installed and a quick smoke test provided a much nicer experience for me. :)


Good to hear :) I've not spent too much effort on it (nor am I planning to), but feel free to report an Issue if you have suggestions for small improvements.


On mobile, I use MiniHack (there are tons of other HN apps). On desktop, HN's interface works well enough that I don't try to mess with it.

I really wish I could get one feature on desktop that the MiniHack app provides on mobile: forcing all opened links to launch in reader mode in a Safari web view. 99% of the time that's exactly what I want when I click through.


I just use a bookmarklet to increase the font size. It requires an extra tap, but I'd rather do that than use an app.

The bookmarklet:

  javascript:(function()%7Bvar%20n=window.document.createElement(%22style%22);n.setAttribute(%22type%22,%22text/css%22);n.innerText=%22html%7Bfont-size:1.2em!important;line-height:1.3!important;%7Dbody,table,tr,td,.default,.comment,.comhead,.pagetop,span.pagetop%20b,.c00,.yclinks%7Bfont-size:inherit!important;line-height:inherit!important;%7D.pagetop,.title%7Bfont-size:1.2em!important;%7Dtable%23hnmain%7Bwidth:100%25!important;%7Dbody%7Bmargin:0;%7D%22;window.document.head.appendChild(n)%7D)()


Paying to browse a free website? Check out Premii.


My time isn't free. I don't mind paying a small up-front fee for usability. Plus, upvoting with HN's site UI on mobile resulted in mis-votes - that's fixed with MiniHack and other HN apps.


Seems significantly slower to me on my 3G network compared to HN.


What do you mean? Haven't had that issue. Am on desktop version, unsure if mobile exists


The tap targets are minuscule and close to each other.


I had to pinch-to-zoom to upvote this comment. :-)


That's a feature IMO -- HN could make them even smaller.

It should take conscious effort to vote, and the small target size mentally reinforces this model: voting is not supposed to be the most common action.

EDIT: See, clearly the vote targets simply aren't small enough! </s>


I rarely wish to click on the username or time links (“3 hours ago”), but that’s what I click on the most due to size and placement issues (when trying to vote or collapse a thread). I agree one should not blindly vote, but at the same time if you have decided to vote, it shouldn’t be maddeningly difficult and error prone to execute.


Usability-wise, isn't it kind of weird that the click target you use the most (the upvote arrow) is the smallest of the three, while the ones you use rarely or never are so easy to hit by accident? Why must the world be like this?


I’ve also downvoted comments by mistake...


They added an "undown" link a while back.


Adding undo changed the feel of downvotes a little. Irreversible accidental downvotes were scattered randomly on comments and they made the occasional downvote feel less harsh.


But the undown/unvote links are hard to tap on mobile as well. On Firefox on Android it's always an exercise in patience.


Related question: what is the easiest way to increase the tap zone of an HTML element using CSS?

Aside: I have noticed Safari auto-increases the touch zone for inputs and elements with ontouch/onclick events (I noticed because I had to fix an issue caused by this!)


a {padding: 1rem}

This increases the area by padding the link. Just don’t let the tap areas of different elements overlap with too much padding.

You can also put a border around the new area, or a box shadow, to indicate the larger click area, making it a sort of button:

{border: thin solid grey}


Note that the <a> element is inline by default, which means top and bottom padding is not applied. Set display: inline-block; to have padding apply all around while retaining the element’s flow in the layout.
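
Putting the two suggestions together, a sketch:

  a {
    display: inline-block;  /* so vertical padding takes effect */
    padding: 1rem;          /* enlarges the tap target */
  }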


Thank you! I’m still learning CSS and will have to adjust my style sheets! :)


Probably just add some extra "padding: 5px". The entire padding area registers "clicks" for the element.


For the thread-collapsing functionality, I think a long press/touch could mitigate the need for pinch zooming. Just hold your thumb on the comment long enough and bang, it's gone. For me this is the most commonly performed interaction with the site.


I find this behavior on Reddit annoying: when I’m reading a long comment, the thread suddenly collapses because I left my finger on the screen. There are pros and cons to the approach.


The trick is to keep your karma below 500. xD


Thanks! As someone who lurks here much, much more than commenting, I wasn't aware that the interface changes over 500 karma.

This does put the discussion in a different light (not to mention it helps me avoid making comments that don't apply to more karmarific users).


That's weird - must be an iPad specific thing. It works just fine on Android Chrome.


I've noticed that on Android Chrome I have no issues hitting the links on this site, but if I switch over to Firefox I can barely click anything. It's not just this site. That annoyance alone has almost driven me back to Chrome.


I always accidentally log out when I try to click on my username in iOS.


Oh true, that happens to me a lot.


All elements are super tiny unless you force zoom on them. The CSS says that all fonts are 10pt or _smaller_. There are 7pt fonts defined in there! Everything else is specified in exact pixels. Vote arrows are 10x10 pixels. Do you realize how small 10x10 pixels is on a modern display?


> Do you realize how small 10x10 pixels is on a modern display?

CSS `px` units are not equivalent to a device pixel.

https://www.w3.org/TR/css3-values/#absolute-lengths


That doesn't make it better. 10px at 1/96 inch per px makes them only 2.6mm.


Okay, quick thought exercise. HN doesn't use a React framework and is quite lightweight on client-side JavaScript. How many upvotes does it take before the total data transmitted (original page + modifications) is larger than a fresh load of, say, Twitter?

The reality is that for the bulk of users, votes are an uncommon interaction; most of us are here to read the articles and view the comments. The additional complexity needed to make votes small has an upfront cost, and in this case, I feel the simplicity of the base interface is the better thing to optimize for, and HN's designers appear to agree. 10 kB for what is effectively a server-side page reload (which probably gracefully degrades for folks running NoScript) seems fine.


It's nothing more than an extra header or query param to denote an AJAX request. Then the server can respond with an empty 200/204 status code instead of the entire HTML page. It's an hour of work and would actually save resources. None of this requires a front-end framework.


No need for a custom header or query param: AJAX (XMLHttpRequest) requests typically arrive with an extra header field already set by the common libraries, so all that's needed is an if statement server-side, assuming registering the upvote and rendering the page aren't entangled in some super crazy way.
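
A minimal sketch of that server-side branch - assuming a Node/Express-style backend purely for illustration (HN itself runs on Arc), with recordVote as a hypothetical helper, and noting that a header like X-Requested-With is set by libraries such as jQuery rather than by XHR itself:

  const express = require('express');  // assuming Express, purely illustrative
  const app = express();

  // HN's vote buttons are plain <a> links, hence a GET route here
  app.get('/vote', (req, res) => {
    recordVote(req.query);  // hypothetical helper that registers the vote
    if (req.get('X-Requested-With') === 'XMLHttpRequest') {
      res.status(204).end();  // the AJAX caller discards the body anyway
    } else {
      res.redirect(req.get('Referer') || '/');  // no-JS fallback: full page load
    }
  });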


HN works perfectly with JavaScript disabled. It’s a feature.


It can still easily work with JavaScript disabled and implement the change.


We're talking about it working better when JS is enabled.


You'd need some frontend code to hide the upvote icon though.


That frontend code already exists and does that every time you vote (line 27 of https://news.ycombinator.com/hn.js). The response content isn't used for anything at all.


Nice! I hadn't actually looked at the code before writing my comment. That does seem relatively trivial, so I stand corrected. I didn't realize the clientside code was discarding the response; I sorta naively assumed it was doing a DOM replacement or something equally straightforward.

What a pleasantly tiny script file. More site code should be this easy to read.


If you have JavaScript disabled, the response content is used to hide the upvote sign.


We're talking here about something simple like:

  (if *ajax-request*
     (upvote)
     (upvote-and-generate-page))


But they're already using JavaScript, and the issue is the backend response, so this would only be a backend optimization and wouldn't require any fancy frameworks?


No, it’s not needed. I have been using it without JavaScript for years.


You aren’t understanding the person you’re replying to.

No one is contesting that HN can work without JS. But for those who have JS enabled, it uses JS. For these people the optimisation could be implemented.


I'm going to agree and say there are minor usability/functionality quirks like these.

For example, adding a story to favourites shouldn't redirect you back to the main page.

At the same time there were a couple of minor improvements "recently", like the undo feature for votes, which was a bit of high-impact, low-hanging fruit.

Apart from that I'm just happy I can customize the level of orangeness on the top bar to a lighter shade


Also, worth mentioning the insufficient formatting features. Quoting is not supported, and is quite important for forum discussions' UX.


For me, the only formatting issue is the code blocks that people use for quoting from linked articles - it doesn't wrap properly on mobile.

Other than that I like the minimalism. I even think the lack of automatically quoting the parent comment is a good thing, as it forces the user to think about the part of the parent they want to quote instead of wholesale quoting it all as the lazy option. If auto-quoting existed the comments would be 40% quotes of each other, and much harder to read.


The code blocks sometimes also cause horizontal scrollbars on desktop. And all they'd have to do is add one CSS rule (white-space: pre-wrap).

But apparently some people dislike wrapped code so much that it's worth messing up the layout of a whole discussion even if only one comment has code in it (or has it by accident). Personally, I dislike horizontal scrolling much more strongly (and I actually mildly prefer wrapped code, but that's taste; the horizontal scrolling is a big annoyance).
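
As a user-side workaround, a bookmarklet in the spirit of the font-size one elsewhere in this thread can inject that rule (a quick sketch, untested beyond the basics):

  javascript:(function(){var s=document.createElement('style');s.textContent='pre{white-space:pre-wrap}';document.head.appendChild(s)})()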


>> Also, worth mentioning the insufficient formatting features. Quoting is not supported, and is quite important for forum discussions' UX

Exactly


I'm not sure if this is sarcastic or not, but the problem is when the quoted text is longer and/or multiline, and especially when it's both.

You typically see users working around this with spacing, and (sometimes) other users complaining (rightfully) that they can't see the full text on mobile without scrolling.

This is exacerbated by the fact that HN's rendering only has the "<p>" concept, not "<br>". If one works around this using the greater-than symbol, and there are several entries, it blends too much with user-typed text.

It's not a very big deal, but it's pretty much a standard feature in forum engines, and the lack of it makes the text look unnecessarily worse.


>Exactly

But it doesn't turn green.


I assume you have JavaScript disabled, right? If so, can you give an example of a similar webpage that doesn’t reload the whole page to reflect your vote?


You're correct that when JS is disabled, any interaction requires an HTTP request from a link or form, and that does a full page reload.

However, when JS and AJAX are used, you'd expect it to also avoid the entire full-page HTML response that goes unused. It could easily be a 204 instead.
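
A sketch of the client side of that scheme (voteUrl and hideVoteArrows are hypothetical names; as noted downthread, HN's actual hn.js uses an Image object instead of fetch):

  fetch(voteUrl, {
    headers: { 'X-Requested-With': 'XMLHttpRequest' }
  }).then((res) => {
    if (res.status === 204) hideVoteArrows();  // hypothetical UI update
  });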


Can't frame tags reload independently?


It's still a full page refresh but with a page within a page. Making every comment a separate iframe would make the site very resource intensive on client and server.


I don't see why it would be resource-intensive server-side. In fact, it would be less resource-intensive, since we avoid a full page reload and instead reload a single comment iframe.

As for the rendering engine, an iframe tag per comment is nothing compared to the pile of JS that's normally there on other sites.


Every frame is still a full HTML document that needs to be generated even if it just contains a single comment. This page has 370+ comments which would mean 370+ additional HTTP requests to load them all as standalone HTML pages. That's a massive increase in server load.

On the client, iframe tags are entirely separate browser windows and can use up considerable resources. It's like opening up hundreds of tabs and is far more intensive than running some javascript code which is quickly parsed, compiled and cached in modern JS engines.


Ah. Thanks. Understood. I often wonder if one could whitelist a very small subset of JS and avoid Turing Completeness but enable 90% of the use cases.


No this is about JS enabled. It's the same endpoint. Open dev console and have a look.


I can confirm. Just upvoted your comment with JavaScript enabled, still has 10 KiB transfer.


No, voting doesn't work at all with JS disabled.


Yes, it does. I think you have something weird going on with your browser: with JS on, it does a light AJAX request; without it, it does the whole redirect thing.


The request is the same. The buttons are <a> tags. Without JS, the link is clicked and does a full page reload. With JS, the click is caught and sent as a AJAX request where the response is ignored.

It's not any lighter, it's just not reloading the page.


https://i.imgur.com/9z2VAfa.png

These are my network requests when I vote with JS enabled.


Whoops, you're right, it does work either way. The response is the same with or without JS, though; it just doesn't do anything with it when JS is enabled (because, as a hacky way of doing "AJAX", it sets the voting URL as the src value of an <img> tag that doesn't actually get displayed).


Voting does work with JS disabled, just not well.

The page reloads and re-renders with the UI changes (up-down buttons hidden and unvote link added). However, the reload takes noticeable time, and you lose your scroll location, which is disorienting.

With JS enabled, the UI changes happen instantly, and the scroll position doesn't change. The page still reloads, but the reload is directed into a new Image object which is immediately garbage, so it's off the UI timeline. But it's inexplicable bandwidth waste, nonetheless!
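
The pattern being described, in sketch form (an illustration of the technique, not HN's verbatim code):

  // image-beacon "AJAX": fire the request, ignore the response
  function vote(url) {
    new Image().src = url;  // the browser GETs the vote URL; the HTML reply is discarded
  }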


You can do this without JavaScript: create a regular form that sends a POST request and have the server return a 204 No Content response - browsers stay on the current page when a form submission returns 204.

You can't do much with the response without JS but a single upvote doesn't change anything on the page so there's no need for the reload.


> a single upvote doesn't change anything on the page

Not much, but it does hide the arrows and shows an undo link on the comment.


This is probably the reason upvotes still work with JavaScript disabled. It’s a feature.


Sending the same reply regardless of whether the user is using JS or not is not a feature. It's lazy. You wouldn't even need to change a single thing about the front end to fix this on the back end, since the js request via Ajax already has an additional header field (X-Sent-Via: XMLHttpRequest or something along those lines). This might be as simple as wrapping output generation in a single if statement.


> the js request via Ajax already has an additional header field

No, X-Requested-With is a non-standard header set by frameworks like jQuery. But it’s trivial to set something like that with either XHR or fetch.


Thanks for the heads up. The only time I came in contact with actual web dev was indeed jQuery, and the couple other times I looked at the dev tools might have been pages using it as well.


Seriously, this is not an issue.


+1, thank you.


Reloading the page is way easier than rebuilding the entire comment section, should something be renewed (e.g. a new comment).

It also (partially) removes the requirement for live pages on an active post here.


This isn't about reloading the page on voting with JS disabled; it's about the XHR call when you have JS enabled. It goes to the same endpoint as a non-JS call, ultimately loading the new page, but completely discards it (as it is an XHR call not an actual pageload).


In my experience, when something like that happens (i.e., Hacker News sends back 10K after an upvote), there is a reason for it. We might not know the reason, but the developer who did that had one and at the time it was probably a good and valid reason (and might still be).


>We might not know the reason, but the developer who did that had one and at the time it was probably a good and valid reason (and might still be).

People seem to assume every aspect of this forum has some brilliant, purposefully elegant rationale behind it, but the truth is PG designed the language (and the forum) to be experimented with and iterated upon, and likely didn't intend the forum to be considered "finished." This software isn't a master's thesis; it's a proof of concept for a web application in Arc lisp.

Unfortunately, a cargo cult mentality has kind of grown up around this forum, and every aspect of it is treated as sacred and not to be touched because "there must be a reason for it." The reason is probably that PG didn't care enough to iterate on the first working solution he came up with.


Very possible. It's also possible there was a valid reason at the time, and even though it's no longer valid, nobody bothered to look into it. All I'm saying is that "created" things exist/happen for a reason, which is not always obvious to a third party. I've had this happen with my own code: I find some old code that looks really dumb, no idea why I wrote it like that. I then spend time refactoring it properly, only to hit some obscure edge case which ultimately leads me back to the 'dumb code'.


[flagged]


> It should use 5-50 megabytes of data per page load.

Mandatory min of 5MB vs. current 10KB?

I'm pretty sure I'm missing the logic here. I sometimes rely on a satellite connection for my access, and I don't hold HN responsible for that fact.

I do agree that no anger is merited (nor sane).


The person you're replying to is most likely making a low-effort troll post on how large webapps tend to be now. Keep in mind that most of what's downloaded would be cached, and the response from an up / down vote would be considerably smaller than 10KB.


> would be cached

For heavy users of the site, sure. For someone who has stopped by for the first time, well, good luck with customer acquisition is all I can say.


Something tells me that YC isn't about acquiring users to have a unicorn exit of HackerNews.


Shitting on users is bad no matter what site you make.


You call it low-effort, but in fact the person doesn't see it as obvious satire.

It really does seem this is exactly how modern web-app redesigns are done.


Sarcasm


Not only are we supposed to assume good faith in comments on HN; Reddit-type snark is looked down upon for a reason.

Per HN guidelines, 'don't talk shit you wouldn't say in person.' There is so much ridiculous rhetoric on HN these days, satire is difficult to discern.

Edit: Re-reading the original comment in its entirety, rather than focusing on numbers, I guess it's obvious satire. I naively assumed that's not the convention on HN.


Since he's probably too modest to mention it himself, the user you're replying to is the founder of the news aggregator and forum site Tildes [1] which is pretty much the gold standard of modern forum site design imho. If anyone has the right to make this comment it's him.

[1] https://tildes.net


>which is pretty much the gold standard of modern forum site design

Wait, what?

>open-source link aggregator with no ads or tracking (currently in invite-only alpha)

HN is over 12 years old and much larger. Suggesting an alpha-level service is the "gold standard" is pure hyperbole.

I'm not familiar with the founder, so I am not making any judgment, and I support the basic premise of their project.


I think they were just talking about the design. HN is certainly a lot more active and successful overall, but it's extremely minimal for both design and functionality (which is part of its appeal).

Tildes's design has a lot more to it, including better responsiveness for mobile use, a built-in theme system with multiple themes (including dark ones), full markdown (CommonMark/"GitHub Flavored Markdown") support, syntax highlighting for code, etc. There are lots of aspects of it that would be nice to have on HN.


Thanks! I'm honestly a pretty terrible designer overall and there's still a lot that's bad about Tildes's design, but it's slowly improving. There are some major changes coming soon, and hopefully those will help some more too.


It looks pretty good. I would add a bit of left margin (can be responsive, it looks good currently in a narrow/mobile window) but the rest is gold.


Any chance of an invite?


Definitely (open for anyone else too) - there's no messaging on HN though, so just send me an email to the address in the announcement post: https://blog.tildes.net/announcing-tildes


Truly awful.


Whenever I use up too much data on mobile and get throttled, Hacker News becomes the only site I can still use, since every other website takes 5+ minutes to load a single page.

Please never do a redesign. Don't ruin a good thing.


+1

I’d also like to express my sincerest THANKS for keeping it simple. Visiting HN every day is pure joy, and reminds me of the old days of the web, with fewer distractions.


A while ago I managed to open HN on a Nokia 3310 classic.

It crashed eventually (not surprising given the minuscule amount of RAM this device has), but the front page was visible long enough for me to read some of the titles.


Fun fact: you can use Hacker News on an Apple Watch, and it’s not even that horrible.

- Sent from my Apple Watch


HN is pretty snappy and lightweight, but there's still quite a bit of HTML bloat that could be removed: all the nested tables, divs, classnames, and padding GIFs are completely unnecessary, and could be replaced by a single nested list.


If I'm interpreting this whole discussion properly, the fact that "modern web development" is bloated as hell means that the messy table-based layout on HN is a-okay, because you really can't expect people to learn from history more than a single step back, or something.

Yes indeed, the shit that web developers were trying to get rid of around 2005 is actually better than what we have today on many pages.

Who are all these programmers that like bloat so much??


This is a good post about how bad the modern web is getting for people with slow/unreliable connections: https://danluu.com/web-bloat/

That was written two and a half years ago now, and things have only continued to get worse since then.


Website bloat is why my laptop now runs slow when the browser is open. It's why I had to upgrade last time. Modern websites will just continue to get more bloated for every magpie developer who wants to thrust their version of "better" on everybody.


Agree with everything, but there is a small UI thing that could be improved: it is very easy to occasionally press "flag" instead of "hide" on mobile.

This improvement would be very beneficial for users with fat fingers. Well, HN is an inclusive platform, right?


For Twitter you can disable JS and it will fall back to a "degraded" (a.k.a. much faster and more usable) experience.


It seems to fall back to the old mobile website for me? This is after clicking the fallback prompt.


Yes, but you have to click past an annoying prompt[1] every time. Does that not happen for you?

[1]: https://i.postimg.cc/jjBSVmCX/Screenshot-at-2019-09-01-15-23...


Once the cookie's set, I tend to go straight through.

That annoyance is considerably less than all the others a full-JS Twitter imposes. Including the motherlovin' account-creation nag that shows up all the motherlovin' time.

One click through at entry rather than random annoyance in the midst of reading? A net win. And an incredibly sad commentary on the state of the Web.


One thing I found recently that helps with data usage, as well as preventing tracking (like Facebook's), is uMatrix: https://github.com/gorhill/uMatrix. It allows you to selectively block ads, cookies, frames, scripts, XHR, etc. from each individual source sending data on a particular site. You can even block individual sources across the board, and they will be prevented from sending data on any site, as long as uMatrix is enabled. The link is the official GitHub repo and contains links to the Chrome, Firefox, and Opera versions.


I'd really like to disagree, but I have to agree. I can think of so many ways that HN could be improved, but then, to what purpose?


While I like the light-weight design, I would prefer some bigger hit-boxes for the links on mobile. Also, the vote-buttons are very tiny.

And I would prefer if "hiding" would be near the voting buttons.


You can zoom in to hit them. Personally, I much prefer being able to see a tonne more content on the screen and then zooming in to interact with it, rather than having a mobile site in which you see like one post per screen and the buttons are large.


I like the middle way.

See a few fewer posts, but don't have to fiddle around with buttons.


What about this forum software I'm developing? I'm trying to keep down the amount of JavaScript — 150 kB currently.

https://www.talkyard.io/forum/latest

Long term, I'd like to make this work on slow connections, Offline-First, in e.g. rural villages in Africa :- ) on a Raspberry Pi, local wifi, unaffected by a "global Internet outage".

Can I ask which part of the world you're in? (Which country?)


old.reddit.com is still usable if it helps!


I still can't use the redesigned Reddit, and I'm not even sure that I would if I could. It's still way too slow to be something I'd use on a regular basis. If they kill off old.reddit.com, I don't think I'd continue to visit Reddit.

Imgur, for instance, isn't a site that I visit daily or even weekly any more. I really enjoyed the site, but after a redesign it just stopped being enjoyable. It's way too slow, too much JavaScript. It's the only thing that will reliably force my MacBook to ramp up its fans. How is it possible for a website to make a laptop heat up more than doing development work?


Other than poor performance, the thing I'm most upset about is how each subreddit has lost its personality, and the fact they've closed the source. Being open is what made Reddit special; considering who founded it, it should never have gone closed source.


> Other than poor performance, the thing I'm most upset about is how each subreddit has lost its personality

What makes you say this?


They axed custom subreddit CSS stylesheets. It was very much like unstyled Reddit was a base to build upon.


The most frustrating thing to me about Imgur is the way they over-compress images on the mobile site. I always have to request the desktop site to get the original-quality image. It's bad to the point that text in an image is often unreadable on the mobile site.


Is PG even involved with HN anymore? He hasn't posted on his account in four years.


I've seen posts by PG in some recent threads out in the wild; they don't show up in his post history. I would imagine this is intentional, to prevent old comments being dredged up and used against him, much like Zuckerberg's old FB messages were automatically deleted from recipients' inboxes.

Fine by me, but I wish HN would offer the same privilege to its users and allow us to delete our old posts.


I believe Dang has said once or twice that this is something that can be achieved if you email them.


I'd suppose he'd use a throwaway if he wanted to post anything. Otherwise it'd draw too much attention and get too many upvotes and replies, essentially killing the small amount of neutrality we get here.


>At the end of March 2014, Graham stepped away from his leadership role at Y Combinator, leaving Hacker News administration in the hands of other staff members

Though he's still a major shareholder of YC which owns it. He tweets quite a lot these days.


I recently created a text-only news site for Indian readers: https://textnews.pythonanywhere.com/english. The goal is to provide multilingual news in places with slow internet connectivity. I would appreciate any feedback.


Progressive enhancement all the way. HN is a great example of this: it focuses on its core functionality and slightly enhances things with JS. There's the argument that the modern web shouldn't have to reload a page, but we were reloading pages on 56k dial-up.


I've thought about making my own extension to make HN more "modern", but when I sat down to make it better, it turned out I was really happy with how it is. Agree with what you're saying!


Too many negative views against the new web, imo. These apps are built for both faster and slower connections, with lite versions. Sites can support slower-network areas, but why should that count against the new web stack? I absolutely love the experience of SPAs, progressive web apps, and heavy client-side apps which do not post the entire HTML every time you click on something.

I do agree with OP that HN is built for sharing news and links, and it is great that they have not tried to update it with a new style and a heavier experience.


If you want to keep that performance but still want HN to be a bit more "appealing", try a browser extension. This way it looks nicer but the styles are stored locally.


Just for grins and giggles, I bought a 30-pin iPod cord to charge my first-gen iPod Touch from 2007. HN and Daring Fireball are two of the only sites that still render.


I noticed that a CSS fix I gave to pg for Hacker News over 10 years ago is still in place today.

I’m proud to have a single brick in the foundation of a site that has lasted this long.


What was it?



One issue I find on mobile with HN is accidentally hiding or flagging articles. It's very easy to do considering how close together flag, hide and comments are.


Mobile could really use some work. Otherwise, HN is awesome.


The number one request I would have is that they support previews in messengers.

When I share an HN link on iMessage or Telegram, it should at least show the HN title text.


I use Stylus with "Hacker News Readable Dark" and it works fantastic.

It doesn't constantly break, because the site is simple and unchanging.

Thanks to HN for that.


You can use https://old.reddit.com/ while it lasts.


HN is great just the way it is. I have access to reasonably fast internet service, but the speed and reliability trump whatever "advantages" are proposed to be found in modernizing the site. Like Stallman's site, HN is a paradoxically enjoyable destination that I look forward to browsing and learning from every day. THANK YOU.


I want to add particular thanks for:

- Keeping the ‘[-]’ comment collapse button minuscule on the mobile site, so that hitting it is a game of chance.

- Abstaining from adding semantic CSS class names to page elements—such that CSS selectors amount to “a table cell on the third level under this table cell.”

These two features steadily keep me from spending even more time on the site than I already do.


I don’t understand. The collapse-comment button is plenty big enough for me to consistently hit it every time I try.

Even if I only touch in its general direction, my phone compensates for my fat fingers and collapses it anyway.

Does this not work on other devices?


AMEN! Please keep Hacker News as it is forever!


Yes. I agree. HN speed and simplicity is probably a big reason it's one of the sites I go to first when procrastinating.



It still loads tracking (of course), but it is a vast improvement over their mobile site, and doesn't need JS.


And you can use Messenger without having to use the app or go to the desktop site.


I use hn.algolia.com and it’s fantastic.

With HN as is, I think plenty can still be done to make it lighter weight and even more performant.

E.g. HN isn't a progressive web app, so you're loading the same things over and over again. With a service worker, HN could reach even more people around the world with a spotty connection.

Maybe I should prototype and build it.
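
If I do, the core might look something like this minimal cache-first sketch (the cache name and asset paths are assumptions, not HN's actual file list):

  // sw.js - a minimal cache-first service worker sketch
  const CACHE = 'hn-static-v1';                        // hypothetical cache name
  const ASSETS = ['/news.css', '/hn.js', '/y18.gif'];  // assumed static assets

  self.addEventListener('install', (event) => {
    event.waitUntil(caches.open(CACHE).then((c) => c.addAll(ASSETS)));
  });

  self.addEventListener('fetch', (event) => {
    // serve cached static files; fall through to the network otherwise
    event.respondWith(
      caches.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });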


> E.g. HN isn't a progressive web app, so you're loading the same things over and over again. With a service worker, HN could reach even more people around the world with a spotty connection.

Hacker News uses standard HTTP headers to cache all its CSS, JavaScript, and images. The only thing that gets re-downloaded when you click a link is the actual text content on the page -- usually about ten kilobytes. And isn't that almost always what you want, to make sure you see the latest posts, comments, and vote counts?
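
For example, a long-lived static asset would typically ship with a header along these lines (illustrative values, not necessarily HN's exact configuration):

  HTTP/1.1 200 OK
  Cache-Control: public, max-age=31536000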

Because it's standard HTML, Hacker News also works with existing features like Chrome for Android's "download page when back online" button.


Thanks. This is nice. Useful.

Could be a bit more flexible in some regions of discussion, but take that as a quibble, and as recognition of the many personas here. (You are good people; I do not imply otherwise.)

Those personas combine into what I find to be lucid, high-value discussion.

And the full thanks go to those who recognize that and work to nurture it all along.


Hearing that the internet is not feasible without ads, I can't help but wonder: what are the economics of running HN?


I had never heard of Y Combinator until Hacker News. As far as brand building goes, most investment companies could only dream of Hacker News.


The HN jobs section attracts candidates (HN's visitors are mostly devs) for startups funded by Y Combinator; I guess this is a win-win.


I think a lot of Y Combinator companies get preferential treatment for posts (especially job listings). So in that sense HN does have ads.


Maybe there are indirect benefits.


My only request would be better comment-formatting options; the code-block thing is misused all the time.


Came in to find the obligatory blockquote markup request.


Agreed. Though a Reddit-style red envelope would make me respond faster to people who interact with me.


Some may call that a feature. For the rest, https://www.hnreplies.com/ is quite useful.


Thanks!


I'm curious: in practical terms (not a throttle emulator via the browser's developer tools), do single-page apps 'work' in locations with low-speed internet?

By 'work', I mean not just content being served, but overall usage not making you want to abandon the site.


But this site has been redesigned. It used to be unusable on mobile and is now just poor.


It used to be better with mobile browsers that knew how to properly reflow text (Opera Mobile), so that comments didn't get progressively narrower down the reply chain. Dumber browsers have since taken over as the whole "responsive" thing became a responsibility of the website itself, instead of being taken care of by the user agent, with inconsistent and ultimately worse results.


Yes, it is good that you did not "fix" Hacker News wrong. Still, I think it should be "fixed" by adding an NNTP interface (so that offline mode and so on are possible, plus all sorts of usability features you get "for free").


Sure, but the mobile layout could do with some concrete improvements. Specifically, the up/down arrows are too small to hit, and the comment text area should fit a few more lines of text. I feel like Bambi on ice when I scroll around in it on my phone.


You could still improve HN's performance a lot and get better results on slow connections.


Trust me, it's not just developing countries having an issue. Even on a super fast internet connection these modern web crApps will make your phone chug while they load 100 different ad and analytics libraries.


Thank you also for not changing the back-end.

https://en.wikipedia.org/wiki/Arc_(programming_language)


I wish HN had a "search" button, but it's OK to search through Google. I hope they never change the design. It reminds me of the BBS I still log on to through CTerm these days.


Down in the footer, there's an Algolia search box that works pretty well.


Ah thanks man, didn't know about this...


Footer.

Also, DDG: !hn <terms>

If DDG is your default search engine, you can run that in the address bar. Which I do all the damned time.


Using that takes me to "This page will only work with JavaScript enabled"

https://hn.algolia.com/?q=thinkpad

So not nice!


True, though if you enable JS, you'll have a good HN search tool.

I'm a fan of basic functionality w/o JS in most cases myself. I make exceptions for useful and non-annoying sites. Algolia meets both criteria.


Gave it a spin, I see what you mean, that's pretty good actually, thanks.


Thanks man, I'm using Chrome but will try DDG. It's gaining popularity these days.


For reddit, try old.reddit.com, or you can set your account to always use old in preferences.

It's not as clean as hn, but way better than "new" reddit. (I use it even on my fast connection.)


https://i.reddit.com is also simple and AFAIU more lightweight.

Tends to work better than either www or old for console browsers (lynx, w3m, etc.)

Some features, notably search, are broken, however.


Reddit search is broken no matter how you look at it; it's been an ongoing meme since the beginning. One always uses Google to search Reddit.


Reddit search is limited in several ways, most especially in that comments aren't indexed.

I actually find it useful though. It's among the features that kept me using Reddit as a blogging engine for as long as I did, though accumulating annoyances have since driven me from that.


It's not worth it given the censorship, but if you encounter some answer on Reddit via a search and want to access it, the best way is through http://ceddit.com/r/whateverurlpath . It's even more JavaScript, but it's the only interface left that both looks and acts like the original.


In terms of new websites/services, https://sourcehut.org/ is a nice exception to current trends.


We have an e-commerce website in straight PHP that we haven't redesigned since 2011. It works fine, is very fast, and is capable of handling very large traffic spikes without missing a beat.


I quite like the blog https://qntm.org/ which has a minimalist design in this sense.


Seriously? The website could be lighter, more responsive, and better designed (minify comments on the left?) while loading as fast or even faster...


HN really needs a dark mode. Anyone mentioned it yet?

Also: y'all ever hear about that syndrome that originated in Stockholm? This site obviously needs a lot of work.


Dark mode would be pretty nice. Negative colors on my phone work, I guess, but it'd be nice to have real support.


The only thing I'd like to see is a native dark mode, but everything else is perfect. It even works well on my Kobo eReader.


One of the biggest edges Emacs and vi have held over other editors over the years: a consistent user interface.

Learn once, use for a lifetime.


It is possible to redesign something without adding tons of useless guff - stuff does need updating eventually, even if it isn't technically broken, e.g. inline styles and spacer GIFs.

I'd argue that having simpler markup and some sane, well-designed CSS would enhance performance further.


There is lite.cnn.io as well.

I wish there were more.


In that case, if you're looking for a great search engine, try duckduckgo.com/lite


Yes, please do not ever redesign this. I am still on old.reddit.com; new Reddit sucks.


If there's one change they should make, it's a dark mode.


I love how tables are now more responsive than "responsive". HN skipped an entire generation of jumpy websites and is now comfortably readable on every screen.


Ditto. Hacker News also works great on mobile devices, while every other so-called mobile app/website is a pain in the butt.


Laughs in Romanian unlimited 5G mobile data plan


I agree with this, though less for its usability and more for its aesthetic.

I wouldn’t complain, though, if dark mode and a mobile-friendly font size were introduced.


It’s lightweight and feels snappy.

Fonts and upvote buttons could be a bit larger on mobile though imho (my hands are of average size).


You can always use a terminal-based browser like Lynx; it's also a way for me to bypass some stupidly implemented paywalls. The elegance and speed of those browsers cannot be overstated in a world of unwanted garbage (embedded videos, 9 billion auto-refreshing ads, etc.) on sites like CNN and most news sites.


What is the country?


this x 1 mil


I'd also like to thank HN.

It still works on my Blackberry 8700 under Opera Mini, and is the only site that I commonly use that I can say that about.

So HN performs fine using GPRS/EDGE!


The only thing that needs fixing is the link font size on mobile. Bumping it by 20% would fix things.



