bia@lemmy.ml to LocalLLaMA@sh.itjust.works • What have you been up to recently with your local LLMs?
1 year ago
I used it quite a lot at the start of the year for software architecture and development. But the number of areas where it was useful was small, and running it locally (which I do for privacy reasons) is quite slow.
I noticed that much of what was generated needed to be double checked and was sometimes just wrong, so I’ve basically stopped using it.
Now I’m hopeful for better code generation models, and I’ll spend the fall building a framework around a local model to see if that helps in guiding the model’s generation.
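Roughly the kind of thing I have in mind, as a minimal sketch: a loop that asks the local model for code, runs a cheap automatic check, and feeds any error back for another attempt. It assumes a llama.cpp server running locally with an OpenAI-compatible endpoint at http://localhost:8080; the URL, the `ast`-based syntax check, and the retry scheme are just placeholders for illustration, not a finished design.

```python
import ast
import re
import requests

API_URL = "http://localhost:8080/v1/chat/completions"  # assumed local llama.cpp server


def generate(prompt: str) -> str:
    """Ask the local model for a completion via the OpenAI-compatible chat endpoint."""
    resp = requests.post(
        API_URL,
        json={"messages": [{"role": "user", "content": prompt}], "temperature": 0.2},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


def extract_code(text: str) -> str:
    """Pull the first fenced Python block out of the reply, if there is one."""
    match = re.search(r"```(?:python)?\n(.*?)```", text, re.DOTALL)
    return match.group(1) if match else text


def generate_checked(task: str, retries: int = 3) -> str:
    """Generate code, syntax-check it, and feed errors back instead of reviewing by hand."""
    prompt = f"Write a Python function for this task:\n{task}\nReturn only a fenced code block."
    for _ in range(retries):
        code = extract_code(generate(prompt))
        try:
            ast.parse(code)  # cheap automatic double-check
            return code
        except SyntaxError as err:
            prompt = (
                f"The previous attempt failed to parse: {err}\n"
                f"Task: {task}\nReturn only a corrected fenced code block."
            )
    raise RuntimeError("No syntactically valid code after retries")


if __name__ == "__main__":
    print(generate_checked("parse an ISO 8601 date string and return the weekday name"))
```

The check here is only a syntax pass; the real framework would plug in tests or a type checker at that step, since syntax alone won’t catch the “sometimes just wrong” output.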
Do you have a degree in theoretical physics, or do you theoretically have a degree? ;)