
I personally have found that ChatGPT is great to bounce ideas and concepts off of. If I’m unfamiliar with something or I’m trying to learn a new skill or get a second opinion on some design or architecture details, it’s very helpful in that regard.

When it comes to pure code, ChatGPT will deceive you. I’m at the point now where I’m very hesitant to copy and paste ChatGPT code. I’ll have to completely vet it first, or it’ll have to literally just be a line or two. And even then I’ll likely have to refactor it. ChatGPT is just wrong too often when it comes to the actual code either due to lack of context about the problem or just downright hallucinations. It’s even gotten some boilerplate wrong.

When it comes to Copilot, I honestly haven't been able to quantify how beneficial it's been for me. I've often had to delete or modify its code. I think it could partially be because I'm not as skilled with it. It could also be that I've just had to write a lot of JS lately, and there's probably a ton of bad JS that it's been trained on. I remember being more impressed with it when I wrote Rust. But I'm also new to Rust. I need to try it with something like C#, which is more difficult than JS but which I'm also proficient in.

All in all, AI tools for me have been just that, tools I can use to help me get the job done.




In all probability it's deceiving you the rest of the time as well. Based on what you've described here, you're trusting its expertise when you don't have much domain expertise of your own, but finding it not to be particularly expert at all on matters where you yourself have sufficient context and expertise to know better.

This is a common phenomenon with humans, FWIW. The same sort of thing happens with traditional information sources. For example, when a media outlet reports on things we don't know much about personally, we believe them. Then they cover something where we have direct domain expertise and find all manner of misunderstandings and errors trivially, but instead of suspecting that they're likely just as wrong about a lot of other things as well, we assume it's a special case where they just got our special knowledge domain wrong.

In any case, to answer the OP... I use these new AI tools to generate content where the details don't matter and the cost of being wrong is near zero, such as graphics for slides, market/product/pitch blurb pablum, etc.

I use them to compensate for my limited artistic/graphic design skills and to overcome my propensity to tediously labor over copy despite that copy being basically throwaway.


This is indeed well-known, in recent years I've seen it referred to as Gell-Mann Amnesia: https://www.epsilontheory.com/gell-mann-amnesia/


If I had to really try and pinpoint where my discomfort for this latest hypecycle is coming from, it's probably that I have this sense that the dynamic behind Gell-Mann Amnesia is being cynically exploited by the interests overselling these new products.

Kind of in the same vein as the strategies around gamification, where some identified frailty of the human psyche is being leaned on for cash.

It's not quite a con because the technology is useful and certainly worth something, but it's not non-exploitive either.


To me, the biggest value of Copilot is that you get unlimited access to a GPT-4 chat for half the price of ChatGPT Plus via Copilot Chat.

The context-sensitive stuff, like highlighting a line of code and asking questions about it, and the completions are okay too, but the chat is the most useful part imo.


Yeah, great for architecture and design especially in areas I feel impostor syndrome from knowing just enough to be dangerous. I'd say it extends the horizon of possibilities, especially in filling in grey areas.


This all makes sense, thank you. And with ChatGPT are you using the paid version? I have been experimenting with both and found the free version to be very bad. The paid version seems to make fewer mistakes.



