
To avoid network congestion, the TCP stack implements a mechanism that waits up to 0.2 seconds for more data so it won't send a packet that would be too small. This mechanism comes from Nagle's algorithm, and 200 ms is the value used in the UNIX implementation.

Sigh. If you're doing bulk file transfers, you never hit that problem. If you're sending enough data to fill up outgoing buffers, there's no delay. If you send all the data and close the TCP connection, there's no delay after the last packet. If you do send, reply, send, reply, there's no delay. If you do bulk sends, there's no delay. If you do send, send, reply, there's a delay.

The real problem is ACK delays. The 200ms "ACK delay" timer is a bad idea that someone at Berkeley stuck into BSD around 1985 because they didn't really understand the problem. A delayed ACK is a bet that there will be a reply from the application level within 200ms. TCP continues to use delayed ACKs even if it's losing that bet every time.
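
(For anyone bitten by this in practice: the knob applications usually reach for is the sender-side one. A minimal sketch, assuming a POSIX system and an already-connected socket fd, of turning Nagle's algorithm off with TCP_NODELAY -- note it does nothing about the receiver's delayed-ACK timer.)

    /* Sketch: disable Nagle's algorithm on an already-connected
       POSIX TCP socket.  "fd" is assumed to be a valid socket; this
       only removes the sender-side coalescing delay, it does not
       touch the receiver's 200ms delayed-ACK timer. */
    #include <netinet/in.h>
    #include <netinet/tcp.h>
    #include <sys/socket.h>

    int disable_nagle(int fd)
    {
        int one = 1;
        return setsockopt(fd, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one));
    }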

If I'd still been working on networking at the time, that never would have happened. But I was off doing stuff for a startup called Autodesk.

John Nagle


Can people who are not inherently 'gifted', start with an immense interest in math at a mid-point in their life, and still go on to make meaningful and significant contributions to the field?

Probably not.

In other words, is precocity a prerequisite to doing good work in Math?

Maybe, depending on whether by "precocity" you mean "demonstrated precocity".

Intelligence is something you either have or don't have -- and while there's still debate about how much of intelligence is genetic (25%? 50%? 75%?) it's clear that the nurture which is most important is that which takes place before age 5, when the brain is still at its most plastic. Consequently, I would say that anyone who is not gifted when they're 5 years old is unlikely to have a significant mathematical impact.

However, not everybody with intellectual gifts demonstrates them early. Mathematics is a field known for child prodigies, not because it's particularly suited to prodigies, but rather because mathematics prodigies tend to get noticed. Some fields, like mathematics or chess, require little knowledge before intellect can be applied; others, such as chemistry or biology, require years of prerequisite study. Moreover, just as with chess prodigies, the abilities of a mathematics prodigy are obvious and unarguable, while a remarkable writer is still likely to be rejected by his first 19 publishers -- and what 9-year-old writer is going to send his manuscript to 20 publishers?

In short: You have to be smart to make a significant contribution to mathematics, and I don't believe that people can "become smart" past a very early age. However, it's possible for someone to be smart despite not having been recognized as such, depending on which fields his intellect is applied against. Finally, it's definitely possible for someone who is smart but has never done well in mathematics to make a contribution to mathematics -- if he can develop the interest which is necessary for him to apply his intellect appropriately.

Is this a useful answer? I have some other ideas about intelligence and IQ and the Putnam and flashes of insight floating around, but I'm not entirely certain how to explain them -- and given that news.yc stories don't stay on the front page for long, I thought I should post what I could promptly rather than waiting for everything else to crystallize.


Zines by techies have a long history! Tom Jennings [1], who founded FidoNet [2] and The Little Garden ISP, published one of the first Queercore anarcho-punk zines, called Homocore [3]. He's written a retrospective and published an archive of Homocore [4], a memory of San Francisco during 1988-1991.

[1] https://en.wikipedia.org/wiki/Tom_Jennings

[2] https://en.wikipedia.org/wiki/FidoNet

[3] https://en.wikipedia.org/wiki/Homocore_(zine)

[4] http://worldpowersystems.com/HOMOCORE/


An IR (intermediate representation) is "retargetable": new processors can be supported by adding a new backend, not by rewriting the whole compiler.

Compilation, especially of the type of language LLVM is most used for (impure, procedural, object-oriented, with optional garbage collection), lends itself to very similar optimization techniques. Optimization happens in phases, one after the other; sometimes a phase might need to be repeated after the code undergoes further "simplification". Since the phases pretty much operate on the same data format, and since they're mostly portable across the processor architectures in wide use today, it makes sense to delay actual object code generation until last.
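
To make that concrete, here's a toy sketch in C (nothing like LLVM's actual data structures or pass manager): a single constant-folding pass that operates on a shared IR array, with a driver that re-runs the pipeline until nothing changes any more.

    /* Toy sketch only -- not LLVM's real API.  The point: every pass
       reads and writes the same IR format, and the driver re-runs the
       pipeline until no pass simplifies anything further. */
    #include <stdio.h>

    enum op { CONST, ADD, MUL };
    struct inst { enum op op; int a, b; int value; };  /* a, b index earlier insts */

    /* pass: turn an ADD/MUL whose operands are both CONST into a CONST */
    static int fold_constants(struct inst *ir, int n)
    {
        int changed = 0;
        for (int i = 0; i < n; i++) {
            if (ir[i].op == CONST) continue;
            struct inst *x = &ir[ir[i].a], *y = &ir[ir[i].b];
            if (x->op == CONST && y->op == CONST) {
                ir[i].value = ir[i].op == ADD ? x->value + y->value
                                              : x->value * y->value;
                ir[i].op = CONST;
                changed = 1;
            }
        }
        return changed;
    }

    int main(void)
    {
        /* (2 + 3) * 4, with operands referring to earlier instructions by index */
        struct inst ir[] = {
            { CONST, 0, 0, 2 }, { CONST, 0, 0, 3 },
            { ADD,   0, 1, 0 }, { CONST, 0, 0, 4 },
            { MUL,   2, 3, 0 },
        };
        int n = sizeof ir / sizeof ir[0];

        /* driver: repeat the (one-pass) pipeline until a fixpoint */
        while (fold_constants(ir, n))
            ;
        printf("instruction 4 folded to: %d\n", ir[4].value);  /* prints 20 */
        return 0;
    }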

The final assembly instructions generated are as good as "machine code". In fact, the whole reason it's called "assembly", as opposed to "compilation" or even "translation", is that there is a 1:1 correspondence between assembly instructions and machine instructions. Many assemblers use an instruction hash table to look up the encoding by template :-)
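
That lookup is simpler than it sounds. A toy, table-driven sketch (made-up opcodes, not a real ISA; a real assembler would also encode operands and would hash the mnemonic instead of scanning):

    /* Toy sketch of table-driven assembly: each mnemonic maps 1:1 to a
       machine encoding.  Opcodes here are invented, not from a real ISA. */
    #include <stdio.h>
    #include <string.h>

    struct template { const char *mnemonic; unsigned char opcode; };

    static const struct template isa[] = {
        { "nop", 0x00 }, { "add", 0x10 }, { "sub", 0x11 }, { "jmp", 0x20 },
    };

    static int encode(const char *mnemonic)
    {
        for (size_t i = 0; i < sizeof isa / sizeof isa[0]; i++)
            if (strcmp(isa[i].mnemonic, mnemonic) == 0)
                return isa[i].opcode;
        return -1;  /* unknown instruction */
    }

    int main(void)
    {
        printf("add -> 0x%02x\n", encode("add"));
        return 0;
    }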

Even when assembled, the program is not in a runnable state, as a lot of its external symbols and dependencies are not resolved. A file is its own compilation unit, so functions referenced in other files in the project, in dependency libraries, or in standard system libraries and system calls have to be either resolved, or registered somewhere handy for quick resolution later. Static linking does the former, dynamic linking the latter. If the object file exports symbols for use by others, it might need to be made into a shared library.
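
A much-simplified sketch of that resolution step (real linkers deal with ELF sections, relocations, GOT/PLT entries and so on; the names and addresses here are made up): undefined references in one object are matched by name against symbols defined elsewhere.

    /* Very simplified sketch of static symbol resolution.  An object's
       undefined references are matched by name against a pool of defined
       symbols; anything left over is an "undefined reference" error. */
    #include <stdio.h>
    #include <string.h>

    struct symbol { const char *name; unsigned long addr; };

    static const struct symbol defined[] = {
        { "printf", 0x401020 },  /* from the C library */
        { "helper", 0x402000 },  /* from another object in the project */
    };

    int main(void)
    {
        const char *undefined[] = { "helper", "printf", "missing_fn" };

        for (size_t i = 0; i < sizeof undefined / sizeof undefined[0]; i++) {
            unsigned long addr = 0;
            for (size_t j = 0; j < sizeof defined / sizeof defined[0]; j++)
                if (strcmp(undefined[i], defined[j].name) == 0)
                    addr = defined[j].addr;
            if (addr)
                printf("resolved %s -> 0x%lx\n", undefined[i], addr);
            else
                printf("undefined reference to `%s'\n", undefined[i]);
        }
        return 0;
    }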

As you can imagine, all this work is both very platform-specific and tedious, which is why one might want to avoid the final "weaving" of the binary and leave it to someone intimately familiar with the target environment -- someone like the vendor's assembler and linker, or better yet, the high-quality binary tools from the good folks at GNU :-)


My Computer Architecture professor Daniel Jimenez worked on (invented?) something like this:

"Dynamic Branch Prediction with Perceptrons" (PDF)

http://hpca23.cse.tamu.edu/taco/pdfs/hpca7_dist.pdf
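
The core idea fits in a few lines of C. This is only a toy sketch -- a single perceptron standing in for the paper's per-branch table, with the global history kept as +/-1 values -- but the predict/train loop is the gist of it:

    /* Toy sketch of a perceptron branch predictor (see the paper above for
       the real design; the history length here is arbitrary).  Predict
       "taken" when the dot product of weights and history is non-negative;
       train when the prediction was wrong or its magnitude was small. */
    #include <stdio.h>
    #include <stdlib.h>

    #define HIST  16                          /* global history length */
    #define THETA ((int)(1.93 * HIST + 14))   /* training threshold suggested in the paper */

    static int weights[HIST + 1];  /* weights[0] is the bias */
    static int history[HIST];      /* most recent outcomes as +1 / -1 */

    static int predict(void)       /* returns the perceptron output y */
    {
        int y = weights[0];
        for (int i = 0; i < HIST; i++)
            y += weights[i + 1] * history[i];
        return y;
    }

    static void train(int y, int taken)  /* taken is +1 or -1 */
    {
        if ((y >= 0) != (taken == 1) || abs(y) <= THETA) {
            weights[0] += taken;
            for (int i = 0; i < HIST; i++)
                weights[i + 1] += taken * history[i];
        }
        /* shift the outcome into the global history register */
        for (int i = HIST - 1; i > 0; i--)
            history[i] = history[i - 1];
        history[0] = taken;
    }

    int main(void)
    {
        /* feed a simple alternating taken/not-taken pattern */
        int correct = 0;
        for (int i = 0; i < HIST; i++) history[i] = 1;
        for (int n = 0; n < 1000; n++) {
            int taken = (n % 2) ? 1 : -1;
            int y = predict();
            if ((y >= 0) == (taken == 1)) correct++;
            train(y, taken);
        }
        printf("correct on alternating pattern: %d/1000\n", correct);
        return 0;
    }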


Your comments reminded me of this anecdote about Arthur Whitney:

"The k binary weighs in at about 50Kb. Someone asked about the interpreter source code. A frown flickered across the face of our visitor from Microsoft: what could be interesting about that? “The source is currently 264 lines of C,” said Arthur. I thought I heard a sotto voce “that’s not possible.” Arthur showed us how he had arranged his source code in five files so that he could edit any one of them without scrolling. “Hate scrolling,” he mumbled."

Source: http://archive.vector.org.uk/art10500700

I suspect his code looks a lot like the J incunabulum:

http://keiapl.org/rhui/remember.htm#incunabulum
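
For anyone who hasn't seen it, the flavor is roughly this -- a made-up fragment in that spirit, not Whitney's or Hui's actual code: one-letter typedefs and macros that compress control flow until whole functions fit on one line.

    /* Made-up fragment imitating the incunabulum's style, not actual
       Whitney/Hui code: one-letter types, macros for return and loops. */
    #include <stdio.h>
    typedef long I;
    #define R return
    #define DO(n,x) {I i=0,_n=(n);for(;i<_n;++i){x;}}
    I sm(I*v,I n){I s=0;DO(n,s+=v[i])R s;}            /* sum of a vector */
    I mx(I*v,I n){I m=v[0];DO(n,m=v[i]>m?v[i]:m)R m;} /* max of a vector */
    int main(){I v[]={3,1,4,1,5};printf("%ld %ld\n",sm(v,5),mx(v,5));R 0;}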

