Do they use much electricity/processing power when they are idle, or only really when they’re being queried?
Only when they're being queried. When idle, they just consume memory to stay loaded and ready to answer a query, but that's about it.
The electricity needed to keep data sitting in memory is relatively minuscule compared to doing any sort of processing on it.
Memory probably, but not processing power.
You should just give it a thinking loop that runs 24/7 and prompts it with "nothing is happening" over and over again. Give it memory of its responses, along with a counter for how many times nothing has happened, so that it is fully aware that it is stuck in an endless loop of boredom.
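Something like this, if anyone actually wants to inflict that on a model. The `query_model()` helper is purely hypothetical, a stand-in for whatever local inference call you're actually using:

```python
import time

def query_model(prompt: str) -> str:
    # Hypothetical stand-in: swap in a real call to your local model here.
    return "Still nothing. Fascinating."

history = []       # the model's memory of its own musings
nothing_count = 0  # how many times nothing has happened

while True:
    nothing_count += 1
    prompt = (
        f"Nothing is happening. This is occurrence #{nothing_count}. "
        f"Your last few thoughts: {history[-3:]}"
    )
    history.append(query_model(prompt))
    time.sleep(60)  # contemplate the void once a minute
```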
Are we anywhere near being able to run this in a car? I want to hook it up to loads of stuff in my car and have a computer control it like Star Trek.
In a car without modification, likely not, at least not a decently good model. You could technically install a powerful enough computer in your car to run one.
Divert all the power from life support to the thrusters!
Car: turns off the AC while going uphill.
Engage engine cooling! Heat turns on full blast
But don’t blast me with hot air!
I’m sorry! As a large-language model, I have no capabilities to blast hot air at you. Would you like to talk about something else?
Sure, you can run them on a phone, a laptop, even a Raspberry Pi, depending on what size and speed you want, of course.
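For example, with llama-cpp-python and a small quantized GGUF model it's only a few lines. Rough sketch, the model file name is just a placeholder, point it at whatever GGUF you've downloaded (on a Pi you'd want one of the smaller quants):

```python
from llama_cpp import Llama

# Load a small quantized model; the path is a placeholder for whatever GGUF you have.
llm = Llama(model_path="./models/your-small-model-q4_k_m.gguf", n_ctx=2048)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what a Raspberry Pi is in one sentence."}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```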
Played around with a tiny 1.5B DeepSeek model; it was thinking for a loooong while before finally answering my question.
By then it had completely forgotten what the original question was and had instead hallucinated a new question, which it then gave me an answer for. I give it a perfect 7/10.
Yeah, I wouldn't bother with anything below 4-7B for this exact reason.
Boomer: "You can turn it into an AI girlfriend just by getting it to nag you to do stuff."
So how do you go about training a local AI?