
Could you explain what advantages there are to having the software work better as opposed to just throwing more computational power at it?

I know this doesn't necessarily apply, but if the solution space for certain niche problems is so small that we can just drown the problem in compute, I couldn't care less that the algo was N^2 or whatever, or that the UI was less than ideal. Maybe I'm not thinking deeply enough about what you mean by "better software solutions".
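For a concrete sense of scale (toy sketch, with a hypothetical placeholder similarity function), a naive O(N^2) pairwise pass over a few thousand candidates finishes in seconds on one core:

  # Toy sketch: brute-force O(N^2) pairwise comparison over a small candidate set.
  # "similarity" is a hypothetical stand-in for a real structural comparison.
  import itertools
  import time

  candidates = [f"compound_{i}" for i in range(2_000)]  # small, niche solution space

  def similarity(a: str, b: str) -> int:
      return sum(x == y for x, y in zip(a, b))  # placeholder metric

  start = time.perf_counter()
  best_pair = max(itertools.combinations(candidates, 2),
                  key=lambda pair: similarity(*pair))
  n = len(candidates)
  print(f"checked ~{n * (n - 1) // 2:,} pairs in {time.perf_counter() - start:.1f}s")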

If I run a batch job and it spits out 10,000 compounds that I can try to a certain effect, then it becomes a filtering problem where I can apply humans and do more traditional science. And if it's feasible to just try everything in parallel, that option is nice as well. It feels like how you got to those 10,000 compounds doesn't matter much.
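Something like this, roughly (a minimal sketch of the downstream filtering step, assuming the batch job already produced the candidates; the scoring function is a hypothetical stand-in for a real affinity model or assay):

  # Sketch: rank ~10,000 candidate compounds and keep a shortlist for human follow-up.
  from concurrent.futures import ProcessPoolExecutor

  def predicted_affinity(compound: str) -> float:
      # placeholder score; a real pipeline would call a docking run or ML model here
      return (hash(compound) % 1000) / 1000.0

  def shortlist(candidates: list[str], top_k: int = 100) -> list[str]:
      # score everything in parallel, then keep only the top_k for traditional science
      with ProcessPoolExecutor() as pool:
          scores = pool.map(predicted_affinity, candidates)
      ranked = sorted(zip(candidates, scores), key=lambda cs: cs[1], reverse=True)
      return [compound for compound, _ in ranked[:top_k]]

  if __name__ == "__main__":
      candidates = [f"compound_{i}" for i in range(10_000)]
      print(shortlist(candidates)[:5])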

Looking forward to hearing just how wrong my simplistic view is.

Computing power grows roughly like ~N, but molecular complexity grows something like ~N!, so the search space outruns any amount of hardware very quickly.
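To put rough numbers on that gap (assuming the space of arrangements of N building blocks grows like N!):

  # Quick arithmetic: linear growth in compute vs factorial growth in the search space.
  import math

  for n in (10, 20, 30, 60):
      print(f"N = {n:>2}:  compute ~ {n:<3}  search space ~ N! = {math.factorial(n):.2e}")

At N = 60 the factorial is already around 8e81, i.e. more candidates than you could ever enumerate by adding machines.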


Ah yeah, I was assuming here that we're talking about a subset of problems that is tractable with current methods -- as in, close enough to being within reach that relatively naive approaches can handle it.
