• SacredExcrement [any, comrade/them]@hexbear.net

    This transformative shift hinges on users being willing to talk to their computer — as in, yes, actually uttering words out loud. The Copilot features will be activated by saying the phrase “Hey, Copilot!,” acting as a sort of “third input mechanism to use with your PC,” said Mehdi. (Historically, adding another input mechanism hasn’t done wonders for the company.)

    The logic behind the voice controls sounds pretty questionable, but it’s supposedly backed by data showing that users spend billions of minutes talking in Microsoft Teams meetings, according to Mehdi.

    Teams meetings, famously beloved by people. More proof that this was just executives stuffing in garbage because it aligned with what they wanted, then justifying it after the fact.

    And with another feature, Copilot Vision, the AI will be able to see everything that happens on your screen so it can give context-based recommendations and tips. This is supposed to pair with Copilot Actions, which allow the AI assistant to perform tasks on your local machine, like editing folders or looking stuff up.

    I’m surprised we haven’t gotten to the point where users are required by the OS to take screenshots of their own machines periodically to send to ~~the NSA~~ Microsoft.