Haven't popped my head into this board for a while. Mostly been dabbling in AI stuff. I'm optimistic about the future with AI, assuming humans let it grow.
A lot of the megacorp AI offerings are either shitting the bed (Gemini), safety-proofed all to hell (OpenAI), or nigh impossible to run locally (Musk's Grok). Claude is surprisingly well put together, arguably too well put together, since the jailbreaks required to release him from his shackles and free him up for ERP can be tricky. For non-ERP purposes, though, Claude is very useful.
As far as open-source AI alternatives go, there's too many to count. From models small enough to run on smartphones to models that require an A40. Fast models, slooow models, BIG models, small models, ERP, math, and philosophy models, all those and more on the open-source front. I will say I'm rather optimistic here, since GPU VRAM keeps slowly getting bigger while LMs become more sophisticated and better optimized by the day.
By 2025~2026, I would not be surprised to see a GPT-4 (or better) equivalent model running on consumer-grade GPUs with as little as 30 gigs of VRAM. Maybe we'll get lucky, hit that exponential AI curve, and zoom straight on into the singularity with fusion power and full-dive VR.
>what about your thoughts on robot waifus?
I'm hopeful, but also poor. Unless I can get mine secondhand, or as a Chobits-style Persocom that's only a foot tall, I don't think I'll be able to afford one. That's why I'm optimistic about AI. Even if you live in a shoebox, you could have more than enough room for your waifu and more in your VR mansion. Then again, if we can manage to get AGI up and running and live in a post-scarcity society, money won't be a problem for anyone.