Neural nets are fantastic, but not every problem requires one or is best solved by one. Sometimes you want to understand more deeply how individual variables contribute to the prediction. That's not trivial with a neural net, but in a simple regression you can just look at the coefficients and get some idea of what matters most (assuming you've normalized everything properly).
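To make the "normalize, then read the coefficients" point concrete, here's a minimal sketch in plain Python (no Spark, and all the data and names are made up for illustration). The idea: a feature that lives on a big raw scale can get a tiny raw coefficient even though it barely matters; z-scoring the features first puts the coefficient magnitudes on a comparable footing.

```python
# Minimal sketch (plain Python, no Spark): after z-scoring each
# feature, coefficient magnitudes are directly comparable, so the
# truly influential feature gets the larger coefficient.
# Data and feature names here are invented for illustration.

def zscore(xs):
    """Standardize a list to mean 0, standard deviation 1."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / sd for x in xs]

def center(xs):
    """Subtract the mean, so no intercept term is needed."""
    mean = sum(xs) / len(xs)
    return [x - mean for x in xs]

def ols_two_features(x1, x2, y):
    """Ordinary least squares for y = b1*x1 + b2*x2 on
    centered data, solved via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    return b1, b2

# Feature 1 drives the target strongly; feature 2 barely matters,
# but it lives on a much larger raw scale.
x1_raw = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2_raw = [500.0, 100.0, 300.0, 200.0, 600.0, 400.0]
y = [5.0 * a + 0.001 * b for a, b in zip(x1_raw, x2_raw)]

b1, b2 = ols_two_features(zscore(x1_raw), zscore(x2_raw), center(y))
print(abs(b1) > abs(b2))  # → True: the important feature wins
```

The same inspection works in Spark's own pipeline (e.g. scaling features, then reading the fitted model's coefficients), which is exactly the interpretability that's hard to get from a neural net.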
This is just one advantage of the more traditional models that Spark includes out of the box. Just because neural nets are awesome doesn't mean there aren't big advantages to the older stuff. And often it's not the algorithm that matters; it's how you prepare the data and how much data you have in hand. When all is said and done, in many cases the fancy model performs about the same as the boring model.
Very happy to see Spark developing yet more abilities to make ML on large datasets seamless and (relatively) painless.