
Right, I'm always annoyed when people talk about how std::vector has terrible performance (especially for games), which it certainly does if you just start with an empty vector with no memory reserved and append a few thousand items.

But it is very frequently possible to find the actual maximum you will need, allocate that upfront, and everything's great.
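Something like this, for instance (the Particle struct and the 4096 cap are just placeholders, not anyone's real numbers): reserve the known maximum once, and push_back never reallocates afterwards.

    #include <cstddef>
    #include <vector>

    struct Particle { float x, y, vx, vy; };

    int main() {
        constexpr std::size_t kMaxParticles = 4096;  // known upper bound

        std::vector<Particle> particles;
        particles.reserve(kMaxParticles);  // one allocation, up front

        // push_back never reallocates as long as we stay under the reserved capacity
        for (std::size_t i = 0; i < kMaxParticles; ++i)
            particles.push_back({0.0f, 0.0f, 1.0f, 1.0f});
    }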




Coincidentally, that's how I started thinking about limits and allocating everything up front. I wanted to try out arena allocation, with one arena per HTTP request (and one for the entire lifetime).

But then I realized that if I didn't allocate a large enough backing array (capacity), memory would keep growing, n+(n+1)+(n+2)..., every time I allocated on the arena during a request.

Now, you're right that this is not necessarily a problem. But this is a side project that I'm writing just to learn and explore stuff.

So I thought I would need to figure out a static capacity and was thinking about how to go about that. Then I realized I don't need a data structure that grows at all, only a slice of a static memory region, which means I don't need an arena (which can grow arbitrarily) in the first place.
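Roughly what I mean, as a sketch in C++ (the name RequestScratch and the 64 KiB limit are made up for illustration): a fixed buffer owned by the request that hands out slices and simply refuses once the limit is reached, with no growth path at all.

    #include <cstddef>
    #include <cstdint>

    // Fixed-capacity scratch memory for one request; it never grows.
    struct RequestScratch {
        static constexpr std::size_t kCapacity = 64 * 1024;  // self-imposed per-request limit
        std::uint8_t buffer[kCapacity];
        std::size_t used = 0;

        // Hand out the next slice of the static region, or nullptr once the limit is hit.
        std::uint8_t* take(std::size_t n) {
            if (kCapacity - used < n) return nullptr;  // reject instead of growing
            std::uint8_t* slice = buffer + used;
            used += n;
            return slice;
        }

        void reset() { used = 0; }  // reuse the same region for the next request
    };

The interesting part for me is the failure mode: instead of growing, take() fails, and that maps naturally onto rejecting an over-limit request.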

Now I'm exploring all kinds of things that a web server does where I would normally use a dynamically growable data structure: parsing JSON, form parameters, HTTP headers, etc. I can find natural or self-imposed limits for all of those things.
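HTTP headers are a good example. A sketch in the same spirit (the cap of 64 and the names are my own made-up choices, not anything standard): fixed storage, and when the cap is exceeded the caller can answer 431 Request Header Fields Too Large.

    #include <array>
    #include <cstddef>
    #include <string_view>

    // Self-imposed cap: a request carrying more than 64 headers gets rejected.
    constexpr std::size_t kMaxHeaders = 64;

    struct Header { std::string_view name, value; };

    struct Headers {
        std::array<Header, kMaxHeaders> items{};  // fixed storage, never grows
        std::size_t count = 0;

        // Returns false once the limit is exceeded; the caller can answer 431.
        bool add(std::string_view name, std::string_view value) {
            if (count == kMaxHeaders) return false;
            items[count++] = Header{name, value};
            return true;
        }
    };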

For me, the most interesting thing about this isn't even performance. It's the mere fact that you _can_ figure out and design those limits. And the resulting code doesn't become complicated, but looks much simpler than I would have guessed. I really like the exactness of it all and how it changes the way I think about these things.



