It defaults to temp 0.6. I thought that 0 was better for following instructions? At least it was for text-davinci-003. Is chatgpt-turbo different in this regard?
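If you want to see what a given temperature actually does, it's easy to poke at outside the plugin. A minimal sketch, assuming the openai Python package (0.x interface) and an OPENAI_API_KEY environment variable; the prompt is just a placeholder:

  # Minimal sketch: pass temperature explicitly to the chat completions call.
  # Assumes the openai Python package (0.x interface) and OPENAI_API_KEY set.
  import os
  import openai

  openai.api_key = os.environ["OPENAI_API_KEY"]

  response = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
      temperature=0,  # 0 for near-deterministic output; the plugin reportedly defaults to 0.6
  )
  print(response["choices"][0]["message"]["content"])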
Also, if you have installed any GPT command-line program like askleo and don't need to give it context, you can just run :r !askleo description of function and language here
Askleo uses text-davinci-003 though, so I would use one set up for ChatGPT instead. Or, if you are not too lazy, this extension looks great.
What would be really nice is something that automatically selected context and then streamed in the code. Also it should come with a short command or default for that.
I just tried this out and it's basically exactly what I want! No frills, basic feature set. Was going to make something myself but now I can be even lazier than normal.
For transparency: I'm from the Codeium team, and we are big fans of getting this AI gen tech to all developers on all IDEs for free - we've also open-sourced an Emacs plugin: https://github.com/Exafunction/codeium.el
How are you supposed to use two plugins providing the same functionality? Are you gonna compare them and then uninstall one of them? Do they differ?
As far as I can tell, it seems like you just wanted to plug your own plugin, but I'm happy to be wrong. I just want to understand how you can use two plugins that provide the same functionality and why you would do that.
they're complementary! codeium works like copilot, constantly autocompleting what you want to type next, which helps a lot when you already have a decent idea of what you need to get done and need to get it done faster. chatgpt (and chat interfaces in general) help you get some ideas on what to do when you are doing more open-ended exploration, and is explicitly invoked. as developers, we all code in these different "modes" and different tools can be built for each!
After the temporary outage of history in the OpenAI web interface yesterday, I just want a simple, no-bells-and-whistles way to interface with ChatGPT and have a local log kept. One big log is okay, but it would be nice to save sessions to unique files.
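Rolling that yourself is only a few lines. A minimal sketch, assuming the openai Python package (0.x interface) and OPENAI_API_KEY in the environment; the log directory and per-session file naming are arbitrary choices:

  # Minimal sketch: a no-frills chat loop that appends each session to its own
  # timestamped log file. Assumes the openai Python package (0.x interface).
  import os
  import time
  import openai

  openai.api_key = os.environ["OPENAI_API_KEY"]

  log_dir = os.path.expanduser("~/chatgpt-logs")
  os.makedirs(log_dir, exist_ok=True)
  log_path = os.path.join(log_dir, time.strftime("session-%Y%m%d-%H%M%S.txt"))

  messages = []
  while True:
      prompt = input("you> ")
      if not prompt:
          break
      messages.append({"role": "user", "content": prompt})
      reply = openai.ChatCompletion.create(
          model="gpt-3.5-turbo", messages=messages
      )["choices"][0]["message"]["content"]
      messages.append({"role": "assistant", "content": reply})
      print(reply)
      with open(log_path, "a") as f:
          f.write(f"you> {prompt}\n\n{reply}\n\n")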
Ha - this was my weekend project; I was pretty sure something similar would pop up in a few days. It's been very handy to ask questions, summarize, proofread, give jailbroken wisecracks, etc. right within Vim, where I can slice and dice the text at will.
Not sure if this supports the API's "streaming mode" but it's nice for seeing the output typing in real time. Requires spawning a separate job though.
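For reference, the streaming mode just sends back incremental deltas that you print as they arrive, which is roughly what a plugin's background job would consume. A minimal sketch, assuming the openai Python package (0.x interface) and OPENAI_API_KEY in the environment:

  # Minimal sketch of streaming: iterate over chunks and print partial deltas.
  # Assumes the openai Python package (0.x interface) and OPENAI_API_KEY set.
  import os
  import openai

  openai.api_key = os.environ["OPENAI_API_KEY"]

  stream = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=[{"role": "user", "content": "Explain Vim registers in two sentences."}],
      stream=True,
  )
  for chunk in stream:
      # each chunk carries a partial "delta"; the final chunk's delta is empty
      delta = chunk["choices"][0]["delta"]
      print(delta.get("content", ""), end="", flush=True)
  print()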
Sick dude. Thank you! I love copilot.vim and use it quite extensively. Will try this. I only ever use the ChatGPT service through their website. Do you get an API key by default or is that a separate service?
I still use copilot as my main completion tool. I found ChatGPT to be too slow and expensive to use for tab complete level of usage, but it works really well for larger things (e.g. refactor X to do Y, or write unit tests).
It's not the same. I don't know the technical differences, but in my tests it often returns wildly different results. Anecdotally, the API has also hallucinated modules that don't exist (but that I wish did!), whereas ChatGPT proper has not done that to me.
> If so it might be way cheaper to use the API than ChatGPT plus?
I haven't used ChatGPT Plus, just the normal one. But at least the demonstration GIF makes it seem kind of slow. One selling point of the Plus service is supposedly that you get faster responses.
I just tested the API; it seems to have the same speed as ChatGPT Plus. I quickly tried some questions, and the OpenAI web app seems to give slightly more relevant answers; I wonder whether that's related to the temperature and other parameters.
I tried to build a SaaS app that generates logos, and I had to upgrade.
Using the ChatGPT prompt on the free tier, I sometimes don't get responses due to the volume of requests (I believe the free tier gets culled in these cases?). Or sometimes it's just extremely slow.
In this case, once you set it up with your API key, it's free to use inside Neovim.
That's like Windows Subsystem for Linux. At least in that case it was related to trademarks (you can't say it's a Linux subsystem, but you can say it's for Linux).
It's like we're LLMs ourselves... generating different flavors of UIs to interface with ChatGPT. That's what I think of when I see the proliferation of ChatGPT UIs that hit the front page.
Inevitably, human behavior that's similar to an LLM is ripe for replacement by an LLM due to the abundance of behavioral data.
Absolutely. That’s a very exciting thing. These tools are forcing us to confront who we are.
Interestingly, LLMs (so far) are similarly limited; they can only ingest/digest so much data at a time, so we are trying to figure out ways around it.
Just like humans, soon there will be lots of specialized LLMs, ones that do only certain types of law, or only one programming language, or even just one codebase, etc.
LLMs don't force me to confront who I am any more than a dog does when I see them eating. LLMs generate copycat ideas. So do I. Dogs eat. So do I. Neither of these cause me any kind of identity crisis.
Sure, not everyone in the world will be forced to ask themselves what they truly are.
However. Can dogs write too? Can they write better code than you? Can they write better letters/documents/proposals/essays than you?
Because ChatGPT can.
If you apply that at a global scale, as quickly as it is happening, it’s easy to imagine that pretty soon there’s going to be a big change in the way we interact with technology and how we view ourselves in the world.
Great job on creating CodeGPT.nvim! This plugin for neovim seems like a useful tool for code-related tasks such as code completion, refactoring, and generating documentation. The clear instructions for installation, including setting the environment variable and required plugins, make it easy for users to get started. The use of the ChatGPT API is also an exciting feature that adds a unique aspect to the plugin. Overall, I appreciate your hard work in developing this tool and look forward to seeing how it continues to evolve. Keep up the good work!
Excuse me, but it seems like your comment was generated by a language model, most likely ChatGPT. While it's impressive that AI technology can produce text that sounds so human-like, it's important to be clear about when a comment is written by a machine instead of a person. This can help avoid confusion and ensure that our conversations are productive and informative. Is there anything else I can help you with?