
> No model is perfect

But some are definitely better than others.

It sounds like you're looking at things purely from the point of view of getting the correct average for a group. But getting the correct average doesn't tell you whether you're using a good enough model, or the best available one, quite apart from any question of fairness or justice.

If you do some type of testing and you know 5% of the tests should come back positive, is there a difference between reporting 5% positive at random and actually doing the tests? Of course there is!
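
A minimal sketch of that difference, with all the numbers made up for illustration: a strategy that flags 5% of cases at random matches the aggregate rate, but catches almost none of the actual positives, while actually testing does.

  import random

  random.seed(0)

  # Toy numbers, purely for illustration.
  BASE_RATE = 0.05
  N = 100_000

  # Ground truth: 5% of cases really are positive.
  truth = [random.random() < BASE_RATE for _ in range(N)]

  # Strategy A: report 5% positive at random, ignoring the individual case.
  random_report = [random.random() < BASE_RATE for _ in range(N)]

  # Strategy B: actually test each case (assume a reliable test, to keep it simple).
  real_test = list(truth)

  def summarize(name, reported):
      rate = sum(reported) / N
      caught = sum(r for r, t in zip(reported, truth) if t) / sum(truth)
      print(f"{name}: reported rate {rate:.3f}, true positives caught {caught:.2f}")

  summarize("Random 5% report", random_report)  # aggregate looks right, ~5% of positives caught
  summarize("Actual testing", real_test)        # aggregate right, 100% of positives caught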




> It sounds like you're looking at things purely from the point of view of getting the correct average for a group

No, I'm looking at things from the point of view of building a model that fits the original dataset well and is then validated against the accuracy of its predictions over time.

If adding race -- or inferring race -- makes the model substantially better at predicting outcomes, is it right to do so? Credit default risk is correlated with race, even controlling for other variables. Hence, using race would help you make more accurate predictions.


When you say "better", better than what? How can your model tell you that some other data would not work better? The fact that you are looking at a correlation tells you that it's one of many ways to infer what you want to determine, and that it's imperfect. This is logically certain if we agree that race is not causal. So the only question is how much better is another model with different inputs.


I'd argue that curve fitting isn't modeling. And when you insert a fitted curve into a feedback system, you're very likely to just perpetuate the problem you're looking to eliminate.
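
A toy simulation of that feedback concern (the group labels, rates, and cutoff are all invented for the sketch): two groups with identical true repayment behavior, but one starts with a worse observed record; once the fitted score locks it out, no new data ever arrives to correct the estimate.

  import random

  random.seed(1)

  TRUE_REPAY_RATE = 0.80   # both groups actually repay at the same rate
  CUTOFF = 0.70            # a group is approved only if its observed rate clears this

  # Historical record: group B starts with a worse observed sample,
  # even though its true repayment rate is identical.
  observed = {"A": [1] * 80 + [0] * 20,   # observed rate 0.80
              "B": [1] * 6 + [0] * 4}     # observed rate 0.60

  for year in range(5):
      for group, history in observed.items():
          score = sum(history) / len(history)   # the "fitted curve": a group average
          if score >= CUTOFF:
              # Approved applicants generate new outcomes at the true rate.
              history.extend(random.random() < TRUE_REPAY_RATE for _ in range(100))
          # Rejected applicants generate no outcome data, so the score never updates.
      print(f"year {year}: " + ", ".join(
          f"{g}={sum(h) / len(h):.2f}" for g, h in observed.items()))

  # Group B never clears the cutoff, so the stale estimate is never corrected:
  # the fitted score feeds back into data collection and perpetuates the
  # initial disparity indefinitely.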


> I'd argue that curve fitting isn't modeling.

This entire discussion is about ML models, which can often be fairly described as very fancy curve fitting.

> And when inserting a fitted curve into a feedback system you're very likely to just perpetuate the problem you're looking to eliminate.

This is not a problem for the individual credit issuer-- they're not looking to eliminate the problem. They've avoided some credit risk by taking race into account. They've improved their expected value, even if society is stuck with the cost of the problem getting worse.


For credit reporting, this sort of thing creates bad enough externalities that it needs to be outlawed.

And I repeat: curve fitting isn't modeling, because in that case it's not a model, it's a prescribed outcome.


But what is your definition of perfect (or better)? That it perfectly aligns with society's moral view, or that it makes the company the most money?



