
Low-level tools that just spam requests at your webserver will give you a pleasingly high number (hopefully), but are unlikely to be an accurate picture of how many 'users' you can concurrently sustain.

Try to come up with some automated test or script that exercises the full set of features you'd expect an actual user to be interacting with over the course of a session.

Then, build a harness that slightly randomises those activities, and fires off a large number of 'virtual users' against your service. Running from geographically diverse hosts will get you closest to what an actual user would experience.
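
As a rough sketch (this assumes Python with aiohttp; the host, endpoints, action weights and user counts are all made up), each 'virtual user' can be a coroutine that logs in once, then loops over a weighted random mix of actions with some think time in between, recording a (path, status, latency) tuple per request:

    import asyncio, random, time
    import aiohttp

    BASE = "https://your-service.example"            # hypothetical host
    ACTIONS = [("/items", 0.6), ("/search?q=foo", 0.3), ("/profile", 0.1)]

    async def virtual_user(uid, results):
        async with aiohttp.ClientSession() as session:
            # Log in once per session (endpoint is illustrative).
            async with session.post(BASE + "/login", data={"user": f"u{uid}"}):
                pass
            for _ in range(random.randint(5, 15)):
                # Pick an action according to the assumed session mix.
                path = random.choices([a for a, _ in ACTIONS],
                                      weights=[w for _, w in ACTIONS])[0]
                t0 = time.monotonic()
                async with session.get(BASE + path) as resp:
                    await resp.read()
                    results.append((path, resp.status, time.monotonic() - t0))
                await asyncio.sleep(random.uniform(0.5, 3.0))   # think time

    async def main(n_users=200):
        results = []
        await asyncio.gather(*(virtual_user(i, results) for i in range(n_users)))
        return results

    results = asyncio.run(main())

To cover the geographic angle you'd run something like this from a handful of hosts in different regions and pool the recorded tuples afterwards.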

Actually determining what mix of operations constitutes an 'average session' is relatively difficult, but there are tools that can record sessions from actual users (look at the HAR dump in Chrome, or the Selenium[1] testing framework).
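
For example (assuming a HAR file exported from Chrome DevTools and saved as session.har; the file name and the idea of counting paths are just illustrative), you can get a rough request mix like this:

    import json
    from collections import Counter
    from urllib.parse import urlparse

    # HAR 1.2 files store requests under log.entries[].request
    with open("session.har") as f:
        har = json.load(f)

    mix = Counter(urlparse(e["request"]["url"]).path
                  for e in har["log"]["entries"])

    total = sum(mix.values())
    for path, count in mix.most_common(10):
        print(f"{path:40s} {count / total:.1%}")

Those proportions can then feed the action weights in the virtual-user harness above.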

All of your test clients should be recording response latencies + errors, which you can later analyse to see what slows down at scale, or which critical features with unacceptable performance you need to target first.
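
For instance, if each client records (path, status, latency) tuples as in the sketch above, a small summariser (the status threshold and percentile choices are just one reasonable option) can print per-endpoint error counts and p50/p95/p99 latencies:

    import statistics
    from collections import defaultdict

    def summarise(results):
        # results: iterable of (path, status, latency_seconds) tuples
        by_path, errors = defaultdict(list), defaultdict(int)
        for path, status, latency in results:
            by_path[path].append(latency)
            if status >= 400:
                errors[path] += 1
        for path, lats in sorted(by_path.items()):
            q = statistics.quantiles(lats, n=100)    # 99 cut points
            p50, p95, p99 = q[49], q[94], q[98]
            print(f"{path:30s} n={len(lats):5d} errors={errors[path]:4d} "
                  f"p50={p50*1000:.0f}ms p95={p95*1000:.0f}ms p99={p99*1000:.0f}ms")

Endpoints whose p95/p99 blow up as you add virtual users are the ones to target first.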

Another thing to consider, especially during release, is that lots of users signing up and playing around is likely to be a completely different workload from regular users using it for day-to-day activities.

You need to make sure things like your transactional email notifications ('your account has been created, verify your email address, ...') can handle peak demand as well.

[1] http://seleniumhq.org/




This was very detailed and useful, thanks. I'll be looking more into Selenium; I knew of its existence, but I never looked deeper into it.




