Reasonably speaking, there are no smart glasses today capable of doing that locally, and even if there were, that would not guarantee in any way that the data stays local. You're a walking surveillance camera, and like it or not, if people don't like that, that's their decision.
I’m surprised Teslas and other such surveillance trucks aren’t vandalized more.
Cool, run a local tool. No harm in that.
But if you snitch on my location to Facebook at all times, I’m gonna break the glasses and whatever you put them on, no remorse.
You do not get to surveil and put people at risk like that; your disability can get fucked if that’s your accommodation.
Reasonably speaking, you have no way of knowing whether smart glasses process locally or remotely just by looking at them.
The first assault on a cyborg using assistive glasses that processed locally happened in 2012: https://www.nbcnews.com/tech/tech-news/countering-mcdonald-s-denial-cyborg-posts-new-photo-alleged-assault-flna895484
Meta glasses process in the cloud because that’s what Meta wants; the technology is more than there for local processing.
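For what it’s worth, on-device scene description is already doable with openly available models. A minimal sketch, assuming a machine with Python, PyTorch, and the Hugging Face transformers library installed; the BLIP captioning model named here is just one example of a small open model, not anything shipped with actual glasses, and the frame file path is hypothetical:

```python
# Local image captioning sketch: nothing leaves the device.
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

# Small, openly available captioning model (example choice).
MODEL_ID = "Salesforce/blip-image-captioning-base"
processor = BlipProcessor.from_pretrained(MODEL_ID)
model = BlipForConditionalGeneration.from_pretrained(MODEL_ID)

# A single camera frame, read from local storage (hypothetical path).
image = Image.open("frame.jpg")

# Run the caption generation entirely on the local CPU/GPU.
inputs = processor(images=image, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=30)
print(processor.decode(output[0], skip_special_tokens=True))
```

The point is only that the compute fits on commodity hardware; whether a given product keeps the frames local is a vendor decision, not a technical limit.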