Open and uncensored tribe for talking about anything tech related. Being competent with technology has gone from giving you an edge in the recent past to being a necessity in many fields today. As a result, a lot more people are interested in the various tech-related hobbies; here we have a free space to discuss such things.
@SwarmShawarma Learning models don't exist yet - not like the human brain anyway.
They "learn" (or get trained) and then they're set in stone.
There's a lot of academic research going into how the human brain can learn "on the spot", but no one has replicated it yet.
I give it < 5 years.
@SwarmShawarma Training models needs insane amounts of computing power - we're talking datacentres and gigawatts of power. Nobody like me or you is doing that.
But once the model is "made", it's essentially just a file of matrix weights, and you can use it.
The latest models probably need about 24GB of GPU VRAM - so we're talking high-end GPUs, $1.5k+. For the good stuff, that is (agentic AI).
But some of the smaller models can run on lower-end hardware (some models can even run on phones).
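The VRAM numbers above come down to simple arithmetic: memory is roughly parameter count times bytes per weight, plus some overhead for activations and the KV cache. A rough sketch (the 20% overhead factor and the example model sizes are ballpark assumptions, not exact figures):

```python
# Rough VRAM estimate for running a model locally, assuming
# memory ~= parameters x bytes-per-weight, plus ~20% overhead
# for activations and KV cache. Ballpark only.

def vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate GPU memory needed to load a model, in GB."""
    bytes_needed = params_billions * 1e9 * bits_per_weight / 8
    return round(bytes_needed * 1.2 / 1e9, 1)

# A 32B model in 16-bit floats won't fit on a 24GB card...
print(vram_gb(32, 16))  # 76.8
# ...but quantised to 4 bits it just about does:
print(vram_gb(32, 4))   # 19.2
# And a small 3B model at 4 bits fits on phone-class hardware:
print(vram_gb(3, 4))    # 1.8
```

This is why quantised versions of the same model are what most people actually run at home.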
> benefit of the Chinese state government
I can't argue with that, or that China could have spyware etc. But since the US was ahead in the AI race and invested billions into it awaiting a return, crashing the US AI bubble seems like a valid goal.
> But when it comes to AI, the Chinese are leaving the US in the dust.
Skynet starts in China, then.
I can't find myself trusting a Chinese model on my laptop. Everything there is state run, and everything China puts out is for the benefit of the Chinese state government.
@Bozza it is. I have underpowered devices, since if one doesn't game one doesn't really need anything extraordinary. From what I read elsewhere, you say that proper models need a beefy setup.
Does it crawl the web and need a lot of power, or can it learn only from what's on the laptop (books, azw3, graphics, mobi, PDF, avi) and then be used as a cataloguer for the PC?
@Bozza I gotta look up how to do this.
I have some laptops with good gaming cards (I think Nvidia 4090) - is that good enough?
I haven't tried local hosting a model yet
@Vermillion-Rx You can (and should) run AI on your own PC, locally.
Own your own data, join the self host crew yada yada.
But yeah, you can host a Claude Code-equivalent AI on your own computer (given it's fast enough) with LM Studio.
Needs a pretty beefy computer tho.
A chatbot is a little easier; if you want the agent stuff you need a high-end PC.
I'd recommend Qwen locally, it's a much more efficient (and better) model than Google's Gemma.
You can run it on lower-end hardware.
Qwen is an open-source Chinese model, for disclosure.
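Once LM Studio (or a similar local server) has a model loaded, it exposes an OpenAI-compatible HTTP endpoint on localhost. A minimal sketch of talking to it from Python, using only the standard library - the port, endpoint path, and Qwen model name below are assumptions, check what your own LM Studio instance shows:

```python
import json
from urllib.request import Request, urlopen

# LM Studio's local server speaks the OpenAI chat-completions format.
# URL and model name are assumptions - verify against your setup.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "qwen2.5-7b-instruct") -> Request:
    """Build a chat-completion request for a locally hosted model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarise this folder listing for my catalogue.")
print(req.full_url)
# To actually send it (needs LM Studio running with a model loaded):
# with urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The nice part is that any tool written against the OpenAI API can be pointed at this local URL instead, so your data never leaves the machine.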
But when it comes to AI, the Chinese are leaving the US in the dust.
They're just pumping state funds into these models, and then open sourcing them - so people all over the world improve them.
And then there's OpenAI doing the private stuff - lots of money, but they can't keep up.

