Workspace Facility using Local (Ollama) LLMs

I converted both Workspace’s Sage and Scholar to run locally with Ollama LLMs. It is doable, but not straightforward. May I suggest that you include a Workspace Agent example that runs without any external dependencies, and/or create a cookbook documentation page that describes each of the steps? If someone is not that familiar with Docker, they will find it a little difficult to know which steps to do, when, and why. :sweat_smile:
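
For anyone else attempting this, the core of the conversion is roughly swapping the agent’s model for a local Ollama model. This is only a minimal sketch assuming agno’s Ollama model class; the model id and the host URL are placeholders for your own setup (when the workspace runs inside Docker, “localhost” points at the container, not your machine):

```python
from agno.agent import Agent
from agno.models.ollama import Ollama  # assumes agno's Ollama model class

# Point the agent at a locally running Ollama server instead of OpenAI.
# The host below is a placeholder for wherever your Ollama server is
# reachable from inside the workspace containers (e.g. the Docker host).
sage = Agent(
    name="Sage",
    model=Ollama(id="llama3.1:8b", host="http://host.docker.internal:11434"),
    instructions=["Answer questions, using the knowledge base when possible."],
    markdown=True,
)

sage.print_response("What does the Workspace facility provide?", stream=True)
```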

The current examples are single-agent examples, where each agent needs to be defined in two related files. I’m starting down the road of incorporating more complicated multi-agent applications and workflows in this environment, and at some point it would be nice if you also included a simple example that highlights some of the challenges (see the sketch below for the kind of thing I mean). :face_with_monocle:
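
To make the multi-agent case concrete, something like the following is what I have in mind. Again, just a sketch, assuming agno’s Team class for coordination, with placeholder agent names, instructions, and local model:

```python
from agno.agent import Agent
from agno.models.ollama import Ollama
from agno.team import Team  # assumes agno's Team class for multi-agent coordination

local_model = "llama3.1:8b"  # placeholder for whatever model you run locally

# Two single-purpose agents, both running on local Ollama models.
researcher = Agent(
    name="Researcher",
    model=Ollama(id=local_model),
    instructions=["Gather and summarize the relevant background information."],
)
writer = Agent(
    name="Writer",
    model=Ollama(id=local_model),
    instructions=["Turn the research summary into a short report."],
)

# A coordinating team that routes work between the two members.
report_team = Team(
    name="Report Team",
    members=[researcher, writer],
    model=Ollama(id=local_model),
    instructions=["Produce a short, well-structured report."],
    markdown=True,
)

report_team.print_response(
    "Write a brief report on options for serving LLMs locally.", stream=True
)
```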

Thanks for a great product; it’s hard to keep up with all of your new features, facilities, and enhancements.

Hey @bills
Thank you for your input. I have forwarded your suggestions to the team, and we will definitely consider implementing them.

Thanks a lot for sharing these with us!