ChatGPT asked me to be polite when I asked it to draw me an image!

No, is that better?
I noticed quality degrading; I had way fewer errors in January '23 than a few months ago.
 
I've had reasonable success with it for coding work; it's been a real time saver for automating tasks I would otherwise have to spend much longer on.
 
Mainly GPU and memory (VRAM). For example, LLMs run smoothly on my MacBook with a Max chip and 64GB of RAM, but any PC with a recent high-end GPU will suffice. As for storage, SSDs give much better performance than HDDs, but an HDD will still work.
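Rough rule of thumb: the weights take about (parameter count × bits per parameter ÷ 8) bytes, plus some headroom for the KV cache and activations. A quick Python sketch (the 20% overhead factor is just my own guess, not a hard rule):

```python
def estimate_vram_gb(params_billions: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Rough memory needed to hold the weights at a given quantization,
    plus ~20% headroom for KV cache and activations (the overhead factor
    is an assumption, not a hard rule)."""
    weight_bytes = params_billions * 1e9 * bits_per_param / 8
    return weight_bytes * overhead / 1e9

# e.g. a 7B model at 4-bit needs roughly 4 GB, while an 8x7B mixture
# (~47B total parameters) at 4-bit wants closer to 28 GB, which is why
# 64GB of unified memory on a Max-chip MacBook handles it comfortably.
for name, params in [("7B", 7.0), ("8x7B (~47B)", 46.7)]:
    print(f"{name}: ~{estimate_vram_gb(params, 4):.1f} GB at 4-bit")
```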

For a user interface, you might want to try LM Studio paired with the Mistral 8x7B model. Check out LM Studio - Discover and run local LLMs for more details.
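LM Studio can also expose whatever model you have loaded through a local OpenAI-compatible server (it listens on port 1234 by default). A minimal sketch with `requests`, assuming the server is running and a model is already loaded in the app:

```python
import requests

# LM Studio's local server speaks the OpenAI chat-completions format.
# Port 1234 is the default; adjust if you changed it in the app.
resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # LM Studio answers with whichever model is loaded
        "messages": [{"role": "user", "content": "Summarize what a mixture-of-experts model is."}],
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```

Handy if you want to script against the local model the same way you would against the hosted API.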
 

https://twitter.com/x/status/1772961948654546966


This thread gives you a pretty good idea of how it works with Python. Keep in mind this is only with ChatGPT 3.5, so 4.0, which includes the Code Interpreter, will likely yield even better results.
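If you want to compare 3.5 and 4 on the same coding prompt yourself, you can hit both through the OpenAI Python SDK. A sketch assuming the openai>=1.0 package and an OPENAI_API_KEY in your environment (the Code Interpreter itself only runs inside ChatGPT, so this only compares the bare models, and the prompt is just a placeholder):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = "Write a Python script that renames every .txt file in a folder to lowercase."

# Send the same coding prompt to both models to compare the answers.
for model in ("gpt-3.5-turbo", "gpt-4"):
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {model} ---")
    print(reply.choices[0].message.content)
```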

Which local model is as good as this? I haven't seen any on LM Studio either. I'll check out Mistral again. Using image generation models from Civitai or something of the sort works perfectly fine locally, but text generation has always been a little lacking in my opinion.
 