EagleEye is an AI-powered mixed-reality (MR) system designed to be built into soldiers’ helmets.

According to Anduril’s announcement, the modular hardware is a “family of systems” that includes a heads-up display, spatial audio, and radio-frequency detection. It can display mission briefings and orders, overlay maps and other information during combat, and control drones and military robotics.

“We don’t want to give service members a new tool—we’re giving them a new teammate,” says Luckey. “The idea of an AI partner embedded in your display has been imagined for decades. EagleEye is the first time it’s real.”

  • Awoo [she/her]@hexbear.net · 6 days ago

    All you really need are:

    1. A real-time 3D model of what is currently being seen, built from multiple cameras.
    2. A real-time 3D model of the rifle being aimed, with the ability to recognise where the rifle’s barrel is pointing. This can be done with a laser on the rifle, or with image recognition of the kind that already exists in VR to accurately track a hand pointing at something for cursor control. Either approach works as long as the rifle is in frame, which it will be on such a wide-FOV camera.
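
    Once you have those two pieces, computing what the rifle is aimed at reduces to ray-casting: take the tracked barrel’s origin and direction, and intersect that ray with the scene model. A minimal sketch, assuming the scene is represented as a plain triangle mesh (all names and structures here are illustrative, not from any real EagleEye API):

    ```python
    # Hypothetical aim-point solver: cast the barrel ray into a triangle-mesh
    # scene model and return the closest hit. Uses the standard
    # Moller-Trumbore ray/triangle intersection test, in pure Python.

    def _sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def _cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    def _dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    def ray_triangle(origin, direction, tri, eps=1e-9):
        """Distance t along the ray to triangle `tri`, or None on a miss."""
        v0, v1, v2 = tri
        e1, e2 = _sub(v1, v0), _sub(v2, v0)
        h = _cross(direction, e2)
        a = _dot(e1, h)
        if abs(a) < eps:          # ray parallel to triangle plane
            return None
        f = 1.0 / a
        s = _sub(origin, v0)
        u = f * _dot(s, h)
        if u < 0.0 or u > 1.0:    # outside the triangle (barycentric u)
            return None
        q = _cross(s, e1)
        v = f * _dot(direction, q)
        if v < 0.0 or u + v > 1.0:  # outside the triangle (barycentric v)
            return None
        t = f * _dot(e2, q)
        return t if t > eps else None  # only hits in front of the muzzle

    def aim_point(barrel_origin, barrel_dir, mesh):
        """Closest intersection of the barrel ray with the scene mesh, or None."""
        best_t = None
        for tri in mesh:
            t = ray_triangle(barrel_origin, barrel_dir, tri)
            if t is not None and (best_t is None or t < best_t):
                best_t = t
        if best_t is None:
            return None
        return tuple(barrel_origin[i] + best_t * barrel_dir[i] for i in range(3))
    ```

    In practice the mesh would come from a depth sensor or multi-camera reconstruction and the ray from the laser or image-recognition tracking described above; the intersection point is then projected back into the heads-up display.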