@peterquirk oh Lord, who gave it file system access? That's a terrible plan!
@peterquirk I don't mess with the commercial stuff.
I do know how to use the local ones this way, but I keep the decision-making for myself. E.g.:
Personal AI assistant in your terminal, with tools so it can:
Use the terminal, run code, edit files, browse the web, use vision, and much more
https://github.com/ErikBjare/gptme
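For anyone curious, a rough quickstart sketch for gptme (install command and prompt are taken from the project's README, not from this thread; verify against the repo linked above):

```sh
# Install gptme into an isolated environment (pipx assumed available;
# a plain `pip install gptme` should also work since it's on PyPI)
pipx install gptme

# Start a session with a one-off prompt; model/provider setup is
# covered in the repo README. The prompt here is purely illustrative.
gptme "write a hello world script and run it"
```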
@b4cks4w I haven't been tracking gptme - it looks interesting. I run ollama locally with various models.
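For comparison, the usual ollama workflow is just pull-then-run in the terminal (the model name below is an example, not one mentioned in this thread):

```sh
# Fetch a model's weights from the ollama registry, then chat with it
# interactively in the terminal. "llama3.2" is illustrative.
ollama pull llama3.2
ollama run llama3.2

# Show which models are installed locally
ollama list
```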
@peterquirk I recall you mentioned that. I'm using llamafile, which is similar.
@b4cks4w I see it has a web interface, whereas ollama has a terminal interface.
@peterquirk it has both by default, or either can be turned off with a CLI flag. I use the CLI.
@peterquirk llamafile's real benefit is portability. The executable can be concatenated to a GGUF model file and run off a USB flash drive. The same executable will run on Windows, Linux, or macOS. Wild stuff.
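A rough sketch of that portability in practice (file names and paths are illustrative, and the exact weight-embedding tooling lives in the llamafile repo, so check its docs for that step):

```sh
# One downloaded .llamafile runs unchanged on Linux, macOS, and Windows.

# Linux / macOS: mark it executable and run it (it can live on a USB drive)
chmod +x ./mistral-7b.llamafile
./mistral-7b.llamafile

# Windows: copy the same file with a .exe extension and run it
#   copy mistral-7b.llamafile mistral-7b.exe
#   mistral-7b.exe

# A bare llamafile binary can also load a separate GGUF weights file
./llamafile -m ./mistral-7b.Q4_K_M.gguf
```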
@b4cks4w I'll check it out when I've solved the problem of binding controls correctly for MSFS 2024.
@b4cks4w That's what agents do in the latest versions. They can execute shell commands, run apps, etc.