
> it sounds like he had good reasons

Meh.

> Subversion-to-git conversion of the GNU Compiler Collection history, at over 280K commits (over 1.6M individual change actions), was the straw that broke the camel’s back. Using PyPy with every optimization on semi-custom hardware tuned for this job still yielded test conversion times of over 9 hours, which is death on the recipe-debugging cycle.

Just how often does one need to convert the GNU Compiler Collection from Subversion to Git in under 9 hours?

> Just how often does one need to convert the GNU Compiler Collection from Subversion to Git in under 9 hours?

For this particular repository, once the conversion is complete, it's complete. But there are plenty of other old projects out there with large repositories in old version control systems. Reposurgeon is a general tool.

Also, one repository conversion does not mean one run of reposurgeon and you're done. Reposurgeon is meant to be run multiple times since each run is likely to uncover issues that have to be addressed, and the way to address them is to do another run with updated parameters. Reposurgeon also has an interactive mode where you can explore a repository and test the results of various possible transformations.
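To make the cycle concrete, a minimal lift script driving reposurgeon looks roughly like the sketch below (command names are from the reposurgeon documentation as best I recall; the gcc.svn and gcc.fi file names are just placeholders, and a real GCC recipe adds many fixup commands between the read and the write):

    read <gcc.svn
    prefer git
    write >gcc.fi

Every tweak to the recipe means re-running it from the Subversion dump stream, which is why nine hours of wall-clock time per run is, as the author puts it, death on the recipe-debugging cycle.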


I was wondering that, too. My guess is that he's using this as a (large) set of test data to test his "recipes" and has to do that a lot. Having to run many nine-hour trials serially is painful.

Not sure I'd have bothered, but it's a defensible choice, and it produced this interesting comparison of two implementations of the same program.
