
So we shouldn't test using a snapshot of live data? That seems prone to finding errors only in production.



When developing and testing new features/bugfixes? Unless the bug is directly tied to production data, I have no idea why you would need production data for that. I'm not saying don't do it on a staging server, but you don't develop on the staging server.

Right now I have a Vagrant box where the VM image is on a DMG and all data is NFS-mounted from the same DMG onto the VM, which is about the worst scenario I can think of. The testing database is around 2GB and the source + data files are ~200MB, just because I actually do need to fix a bug related to a portion of the production data. What's slow is the CPU, and even that is doing fine; it's not the disk, even though I'm abusing it this way. That's on a 2011 MacBook with 16GB RAM and a 400GB SSD.
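For context, a minimal Vagrantfile sketch of roughly that kind of setup; the box name, IP, and paths are placeholders, and keeping the VM image itself on the DMG is a provider (VirtualBox) setting not shown here:

    Vagrant.configure("2") do |config|
      # Hypothetical base box
      config.vm.box = "ubuntu/trusty64"

      # NFS synced folders require a private network interface on the guest
      config.vm.network "private_network", ip: "192.168.50.4"

      # Share the project directory living on the mounted DMG into the VM over NFS
      config.vm.synced_folder "/Volumes/DevDMG/project", "/srv/project", type: "nfs"
    end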

HN is a small piece of software which should be easy to write tests for; a 2GB database and 200MB of source/data files should be more than you'd ever need to work on something like HN. If you want to stress-test or benchmark filesystem speeds, you can't do that on Mac OS X anyway, since you're not running Mac OS X in production.

Changing the specs of a piece of software in order to make development easier seems totally backwards. You're developing for production, not the other way around.

And lastly, why not simply add a case-sensitive partition to your Mac if speed is such a big problem?
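For instance, a case-sensitive volume doesn't even need a real partition; a case-sensitive HFS+ disk image works too (the size and volume name here are placeholders):

    # Create and mount a 20 GB case-sensitive HFS+ image
    hdiutil create -size 20g -fs "Case-sensitive Journaled HFS+" \
        -volname CaseSensitiveDev ~/casesensitive.dmg
    hdiutil attach ~/casesensitive.dmg   # mounts at /Volumes/CaseSensitiveDev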



