Local LLM Development with Ollama
Large Language Models are hot stuff at present. Using them, and integrating them into workflows, is still an open area of investigation for many people and organisations.
I'm currently using VSCode, the Continue 'open-source autopilot' extension, and Ollama.
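To give a flavour of what Ollama provides under the hood, here is a minimal sketch of querying its local HTTP API directly (Continue talks to the same server behind the scenes). It assumes `ollama serve` is running on the default port and a model such as `llama2` has been pulled; the model name and prompt are placeholders.

```python
# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes `ollama serve` is running and a model (here "llama2") has been
# pulled with `ollama pull llama2`; swap in whichever model you use.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

payload = {
    "model": "llama2",  # placeholder model name
    "prompt": "Explain what Ollama does in one sentence.",
    "stream": False,    # ask for a single JSON response rather than a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

print(result["response"])  # the generated text
```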
19 December 2023