Writing code in languages I’m not familiar with yet. Reading code is much easier than learning all the syntax details, so I can still verify that it’s correct. I’ve even written a PoC in one language and used an LLM to rewrite it in another.
Interesting, I had missed that there are “non-official” models that can be used with Ollama just like the official ones, e.g. https://ollama.com/huihui_ai/deephermes3-abliterated
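For anyone else trying this: pulling and running a community model looks exactly the same as an official one, at least via the Python client. Rough sketch, assuming the ollama package and a local server (the model name is just the one from the URL above):

```python
import ollama

# Community models use the namespaced name from ollama.com; otherwise nothing changes.
model = "huihui_ai/deephermes3-abliterated"
ollama.pull(model)

reply = ollama.chat(
    model=model,
    messages=[{"role": "user", "content": "Explain what a context window is in one sentence."}],
)
print(reply["message"]["content"])
```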
And it gave a good explanation of my “litmus test” code snippet.
Ollama’s Python API works well and there are plenty of examples. However, I haven’t gotten the Ollama REST API to work; the response doesn’t unpack into JSON for me.
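For what it’s worth, the thing that usually bites people here (and may be what I’m hitting) is that /api/generate streams newline-delimited JSON by default, so response.json() chokes on the multi-object body unless you ask for a single response. A minimal sketch, assuming a local Ollama server on the default port; the model name and prompt are just placeholders:

```python
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Explain what this does: print(sum(range(10)))",
        # Without this, Ollama streams newline-delimited JSON chunks,
        # which a single resp.json() call can't parse.
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```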
Just design the rest of your system so that it doesn’t have to know anything about the implementation (only prompts and responses) and you should be able to easily replace the LLM part later.
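Something along these lines is what I mean. Just a sketch with made-up names: a thin prompt-in/response-out interface so only one class knows about Ollama, and an API-backed client can be dropped in later without touching the rest of the system.

```python
from typing import Protocol


class LLMClient(Protocol):
    """Anything that turns a prompt into a response string."""
    def complete(self, prompt: str) -> str: ...


class OllamaClient:
    """Local backend; only this class knows about Ollama."""
    def __init__(self, model: str = "llama3.2"):
        import ollama  # keep the dependency out of the rest of the system
        self._ollama = ollama
        self._model = model

    def complete(self, prompt: str) -> str:
        result = self._ollama.generate(model=self._model, prompt=prompt)
        return result["response"]


def summarize(text: str, llm: LLMClient) -> str:
    """The rest of the system only ever sees prompts and responses."""
    return llm.complete(f"Summarize in one sentence:\n{text}")
```

Swapping in a hosted API later then only means writing another class with the same complete() method.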
Interesting, lots of “bang for the buck”. I’ll check it out.
Of course, I haven’t looked at models >9B for now. So I have to decide whether I want to run larger models quickly, or even larger models quickly-but-not-as-quickly-as-on-a-Mac-Studio.
Or I could just spend the money on API credits :D
The whole point of US and European companies moving manufacturing to China.