Thank you for the Autotagger, Mario

Started by monochrome, February 05, 2025, 11:42:43 AM


monochrome

It's everything I wanted, and probably more - but I still have to discover that.

Mario

#1
Thank you.
We're just at the beginning of all this, though.

I've been working with this since July last year (I remember sitting on my balcony in the sun, creating an account with OpenAI on my notebook - sigh!).

Over these few months, I have seen amazing progress in the quality of the results, in the performance of Ollama, and more. Interfaces for AIs have been aligned, allowing e.g. Ollama, OpenAI, and LM Studio to be accessed with the same commands. I'm sure Mistral and Anthropic will follow. This compatibility makes it easier for software like AutoTagger to talk to different AIs.
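To illustrate what that alignment means in practice, here is a minimal sketch (not AutoTagger's actual code) using the official openai Python package: only base_url and the model name change when switching between OpenAI's hosted service and a local Ollama or LM Studio server. The URLs and the model name below are just common defaults and placeholders; adjust them to your own setup.

from openai import OpenAI

# Point the same client at different backends by changing base_url only.
# Ollama's OpenAI-compatible endpoint usually listens on localhost:11434/v1,
# LM Studio's on localhost:1234/v1; local servers ignore the API key value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="llama3.2",  # whatever model you have pulled locally
    messages=[
        {"role": "user", "content": "Suggest five keywords for a photo of a sunset over the sea."}
    ],
)

print(response.choices[0].message.content)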

New models became available. Smart people figured out how to shrink models so they use less RAM and fewer resources while keeping performance and quality. Software like Ollama or LM Studio allows us to run powerful AI on our own computers with minimal effort (a fast GPU is required).

Recently, DeepSeek has shown new ways to train models faster and with far fewer resources and less money.

I use locally running AI models to write tests for my code, check my code for issues, check spelling and grammar in texts, and so on. It has become a normal tool for me, like my text editor or IDE.
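For anyone curious, a typical call for that kind of task is tiny. This is only a sketch, assuming the ollama Python package is installed and a small model has been pulled locally (the model name is just an example):

import ollama

# Ask a locally running model to proofread a short text.
response = ollama.chat(
    model="llama3.2",  # any locally pulled model works
    messages=[
        {
            "role": "user",
            "content": "Check the following text for spelling and grammar "
                       "and list the corrections: 'Their going to the beach tomorow.'",
        }
    ],
)

print(response["message"]["content"])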

I'm sure we'll see a lot in this area over the next year.