Hacker News

To prevent overfitting. A common technique is k-fold cross-validation: train on some percentage of the data (say 90%) and test on the remaining 10%, then cycle through which 10% you hold out as the test set (each train/test split is called a "fold"). Once you've identified a model that resists overfitting, you can train on all of the data.

edit: As noted in the other reply, having a truly blind validation set is still ideal.
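The fold cycling described above can be sketched in a few lines of pure Python. This is a hypothetical helper (the name `kfold_indices` and its signature are my own, not from any library); in practice scikit-learn's `KFold` does the same job:

```python
def kfold_indices(n_samples, n_splits=10):
    """Yield (train_idx, test_idx) index pairs so that every sample
    lands in the test set of exactly one fold."""
    # Distribute samples as evenly as possible across the folds.
    fold_sizes = [n_samples // n_splits + (1 if i < n_samples % n_splits else 0)
                  for i in range(n_splits)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test_idx = indices[start:start + size]          # the held-out 10%
        train_idx = indices[:start] + indices[start + size:]  # the other 90%
        yield train_idx, test_idx
        start += size
```

You'd train and score a fresh model on each (train, test) pair, average the scores across folds, and only then retrain the chosen model on the full dataset.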



