Hacker News new | past | comments | ask | show | jobs | submit login

They are not f*cked. They have a free tutor they can ask at any time, one that will always do its best to help.

My son is currently studying engineering, and whenever he is stuck on anything (construction, math, programming, mechanics) he fires up ChatGPT and asks for help. In maybe 90% of cases the AI gives him a hint so he can continue, which makes for an extremely short feedback cycle. In the remaining 10% he has to ask his human tutor at university, who is usually available a few days later. And he is not blindly following the AI's advice, but rather picking ideas from it. It is actually pretty awesome to see what opportunities there are if AI is not simply used for cheating.

One impressive example he showed me was feeding a screenshot of a finished (simple) construction drawing to the AI and asking for potential errors. The AI replied with a few useless suggestions, but also with one extremely helpful one, which let him eradicate the last mistake in his drawing. I am still not sure whether those were generic suggestions or whether the AI was able to interpret the drawing to some degree.

If they use it as a tutor or as a glorified Google search, then that's okay. The problem is when they start using AI code-generation tools and paste the output directly into the code base.



