Hacker News | jll29's comments

This study [Mangen, Walgermo and Brønnick (2012)] should be replicated with adult subjects and N=500 instead of just N=72 kids from a single class.

In Sweden, sub-notebooks and tablets that had only recently been introduced were removed again based on similar results. I respect the Swedes for reacting to the new evidence instead of being in denial that the purchase of so much hardware was a mistake.

I write, read, and review scientific papers throughout the year, and most of the time I print them out (sometimes over one hundred pages for a single conference, e.g. 10 papers at 10 pages each). The clearest benefit, in my subjective experience, is reading mathematical formulae on paper versus on screen, but there is also the ability to scribble notes and to turn back a page to re-read something and double-check it without much effort.

As much as I like computers, paper is the most ingenious medium ever invented by humankind, and the second most durable w.r.t. long-term preservation of the written word (after parchment).


> paper is the most ingenious medium ever invented by humankind, and the second most durable w.r.t. long-term preservation of the written word (after parchment).

And clay tablets.


> the ability to scribble notes, turn back the page to re-read something to double-check without much effort.

I agree 100%. The simple act of turning back the page/pages is so fast and efficient. My brain seems to even remember where on the page the info I want is. So much more is visible in a single 'eye-shot'. I don't get that same mental experience with e-docs - not sure why. I did grow up with paper so there is a bias.


I had not thought about the paging-back issue until you mentioned it, but I recognize in myself what you have said about it. Our brains and sensorimotor systems are developed for direct interaction with the physical world, not through proxies (though we can get pretty good with decent proxies through practice).

On the other hand, being able to search is invaluable, as is the ability to have the footnote pages open in a separate window, if the software does not provide a better option. Finally, I can't imagine a good interface that is not pencil-like for sketching diagrams.


I would say 1984 is rather more subtle than portrayed here.

For instance, the masses are kept at bay by scaring them with perpetual wars on the one hand and by keeping them distracted with machine-generated filthy "literature" on the other (we only reached the level of technical sophistication to do that last year, courtesy of GPT/Llama). That part is perhaps closer to what the post portrays as the "Huxley" view.

The appendix of 1984 ("Newspeak") is a masterpiece on its own, redefining English words like "freedom" so that they can only mean "free from lice", with the notion of freedom forgotten.


If you, like me, had had to watch people play Candy Crush on the Tube [= the London subway] on the way to work every morning for seven years, you would not ask.

I felt terribly sorry for that loss of GDP and decline in mean global IQ.


Why would playing Candy Crush on the subway have any impact on GDP or IQ? Sounds like a non sequitur.

I come to HN for technology matters, but I am genuinely pleased to see that the many geeks here, from hardware nerds to full-stack developers to investors and founders, are all also human beings with ordinary lives and ordinary problems.

It is valuable to read on HN how fellow geeks see other spheres of life.


> but it would be impractical to make it book-sized

Not really:

History: https://arxiv.org/abs/2212.11279 (75 pp.)

Survey: https://arxiv.org/abs/1404.7828 (88 pp.)

Conveniently skim-read over the course of the four weekends of one month.


This appears to be more prone to abuse than Wikipedia, perhaps because Wikipedia's encyclopedia focus attracts the "right" kind of nerds.

I've been thinking on and off about how to identify and label sub-areas of cities automatically in meaningful ways, and how to compute/extract from text useful sets of attributes to describe them.

If this website were less prone to trolls, the data it generates could be used to train and evaluate machine learning models for that task.

B2B: Where would you open your office in order to be close to your customers?

B2C: Where to find the nearest pizza?

C2C: Where to rent an affordable room close to a university?


If you think about it, "booting" is an anachronism anyhow. Your washing machine doesn't boot. Your radio or TV doesn't boot (ironically, more modern smart TVs might). Your car doesn't boot.

When are we finally designing computers like proper products?


When you turn on the tv/washing machine/car/set-top box/toaster, and the LEDs all light up and twirl around, what do you think the device is doing?

I mean, all of those things do "start" somehow. Capacitors charge up, motors spin up, etc. Many of them have had microcontrollers that boot for longer than you're giving them credit for. I have CRT TVs that "boot"; they are effectively computers.

There are a lot of philosophical ways you could look at why a computer has to "boot" every time it is powered on, but the way I think about it, it all has to do with the volatility of memory. When power is removed, registers and DRAM slowly lose their state.

When a modern computer is powered on, a lot happens before the primary CPU even begins to boot. Once reset is released on the CPU, it has to go through the process of actually booting. Each phase of the bootstrap builds on the previous one: on modern computers, usually starting with some kind of ROM right on the CPU, then firmware in EEPROM (which may itself boot in multiple phases), then a bootloader, and finally the actual operating system kernel.

Embedded devices can be a lot simpler here, but invariably there will be some sequence of code at the beginning whose job is to (re)initialize a bunch of state that holding reset won't. (From my spelunking on older hardware, it seems like holding reset on most CPUs only ever does nearly the absolute bare minimum to make it possible for a boot ROM to deal with the rest of the initialization. Makes enough sense, really.)
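The phased bootstrap described above can be sketched as a chain of stages, each bringing up a bit more machine state before handing off to the next. This is a toy model in Python; the stage and state names are illustrative, not tied to any real firmware.

```python
# Toy model of a phased bootstrap: each stage (re)initializes a bit
# more machine state, then hands off to the next stage in the chain.

def boot(stages):
    """Run the bootstrap stages in order, accumulating machine state."""
    state = {}
    for name, init in stages:
        brought_up = init()          # this stage's contribution to state
        state.update(brought_up)
        print(f"{name}: brought up {sorted(brought_up)}")
    return state

STAGES = [
    ("on-CPU boot ROM", lambda: {"stack": True, "clocks": True}),
    ("firmware",        lambda: {"dram": True, "devices": True}),
    ("bootloader",      lambda: {"kernel_image": True}),
    ("kernel",          lambda: {"drivers": True, "userspace": True}),
]

machine = boot(STAGES)
assert machine["userspace"]  # only reachable once every earlier stage ran
```

The point of the chain is that each stage can only rely on what the previous stages already initialized, which is exactly why a cold power-on must always start from the first stage.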

This has to happen every time a computer is powered on because it's a clean slate every time it turns on. The registers won't just magically get into some state, there has to be some integrated logic that can reset the processor to a known state and start the process of bootstrapping again.

Let's say someone builds a computer with absolutely no volatile memory at all. If power is removed, registers keep all of their state. The RAM is durable. All of the caches are durable. Then what? Well, then you probably don't have to bootstrap every time the computer is turned on. At least in theory it ought to be possible to have a computer that works this way and simply resumes exactly where it left off when power is returned. You would probably design computers differently since you wouldn't necessarily need RAM and disk to be separate, it could possibly just be one thing, and what is now booting an OS would be somewhat more like installing it.

I don't really think we have the technology to make something like this practical, and it's hard to imagine in detail, since our hardware and software are designed tightly around the assumption of these processes and the existence of multiple tiers of memory. But that is, in my opinion, the best philosophical justification for why computers must bootstrap.

It would be interesting to see, though I don't really agree with the characterization that making a computer without a bootstrap process would make it any more "proper" of a product...



> I don't really think we have the technology to make something like this practical

I mean, Intel Optane is kind of this, although AFAIK nobody ran systems with only Optane, and you would still need some amount of suspend/resume logic, because CPU registers need to be persisted, including getting the CPU into the proper mode. Also, peripherals would need to be reinitialized (and possibly redetected).


What you're describing would still need a bootstrap ROM, and would be rather challenging since it's not obvious how you'd handle unplanned power loss.

If you really want to avoid booting, the registers also need to be non-volatile, and all of the peripherals have to be built this way, too, e.g. the GPU.


You'd need something like a 'stop' state, distinct from reset. The system power supply would have to release the STOP line to allow the computer to continue processing, and it would also have to assert STOP in the event of the power sagging below specified levels. Whether the power drop is due to a user intentionally hitting a power switch or the grid dropping out would be immaterial: 'STOP' simply prevents the computer from attempting to process when low voltage would cause unreliable logic. You'd still want RESET, because from time to time you want the ability to put the system into a known state.
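The STOP-versus-RESET distinction can be sketched as a tiny step function. This is a toy model with hypothetical signal names, not any real power-supply protocol: STOP freezes execution while preserving state; RESET forces a known state.

```python
# Toy model: a CPU step honoring separate STOP and RESET lines.

def cpu_step(stop_asserted, reset_asserted, state):
    """Advance the CPU by one step, given the STOP/RESET line levels."""
    if reset_asserted:
        return {"pc": 0}          # RESET: force a known state
    if stop_asserted:
        return state              # STOP: freeze; state is fully preserved
    state = dict(state)
    state["pc"] += 1              # normal execution: program counter advances
    return state

s = {"pc": 41}
s = cpu_step(False, False, s)     # runs normally: pc advances to 42
s = cpu_step(True, False, s)      # power sags, PSU asserts STOP: pc stays 42
s = cpu_step(False, True, s)      # explicit reset: back to the known state
assert s == {"pc": 0}
```

With non-volatile state, a power loss handled via STOP would look exactly like the middle step: nothing is lost, and execution simply resumes when the line is released.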

For 5 1/4" disks, you could double the capacity by using a punch to cut a hole into one edge and then use the back side: et voilà, from 140 kB to 2x 140 kB.

But you had better not punch into the actual magnetic floppy disk inside its plastic enclosure, or you may have killed the whole thing.


I fully second that; this is a must-see for any geek, much more so than e.g. War Games.

"Halt and Catch Fire" (HCF) is often jargon referring to documented or undocumented opcodes or code sequences that lead the CPU to crash:

https://en.wikipedia.org/wiki/Halt_and_Catch_Fire_(computing...

With a hat tip to all Commodore C64/MOS 6510 enthusiasts, I shall end this post with my favourite HCF sequence: 4C 10 7E.

_Happy watching!_


I recall BeOS had is_computer_on_fire()

double is_computer_on_fire();

Returns the temperature of the motherboard if the computer is currently on fire. Smoldering doesn't count. If the computer isn't on fire, the function returns some other value.

( https://www.haiku-os.org/legacy-docs/bebook/TheKernelKit_Sys...)


int32 is_computer_on()

If the computer isn't on, the value returned by this function is undefined.


Not 'crash', which for computers refers to a temporary halt to operations that can be resumed by rebooting. An HCF is more like a car crash: something that physically damages the processor. For example, if you have a microcontroller with four pins wired together to give you more drive current, and you drive two of them low and two high, you are likely to have an HCF.

One thing we did on the C64 in this vein (I was a bad kid) was to send opcodes to the CPU in the 1541 floppy drive that sent the head to a nonexistent track. This would sometimes jam the head and require physical repair of the drive.

haha rawk

Thanks for covering that story - I lived at Clapham Common for seven happy years.

So much history: there is also a little church on the Common whose past members played a role in the abolition of slavery: https://en.wikipedia.org/wiki/Clapham_Sect

