EagleEye is an AI-powered mixed-reality (MR) system designed to be built into soldiers’ helmets.

The modular hardware is a “family of systems,” according to Anduril’s announcement, including a heads-up display, spatial audio, and radio frequency detection. It can display mission briefings and orders, overlay maps and other information during combat, and control drones and military robotics.

“We don’t want to give service members a new tool—we’re giving them a new teammate,” says Luckey. “The idea of an AI partner embedded in your display has been imagined for decades. EagleEye is the first time it’s real.”

  • Seems like a lot of expense and excess weight to replace a laser sight, knowing how a gun works, and looking at the side of your mag. Idk what the benefit of making it like a video game is, other than making a bunch of money selling gadgets to the military.

    I’m not saying these things are entirely useless, but externalizing basic skills seems like a good way to make your soldiers lazy and ineffective, especially since computers can easily be subverted by hacking, dirty sensors, or just being out of alignment. Obviously imperial soldiers are trending towards lazy and incompetent, so it makes some sense to wrap them in gear that mimics competency.

    • Awoo [she/her]@hexbear.net · 7 days ago

      Idk what the benefit of making it like a video game is, other than making a bunch of money selling gadgets to the military.

      It makes non-standard firing positions viable. Right now you must hold a gun in a certain way, to your shoulder, eye trained down the barrel, in order to fire accurately. You can’t hipfire that thing and have it go anywhere with any accuracy, you can’t hold your gun around a corner and have it go anywhere accurately. You can’t hold it cack-handed through a peep hole and have it go anywhere accurately. All of these firing positions become viable with an accuracy line instead of a laser sight.

      The laser sight is only useful in close quarters. I’m not talking about that. The laser sight is detrimental to accuracy at 300m+ because it’s not going to be where the bullets are going to land, unless you’ve zeroed it for that range which nobody is going to do because that’s not the intended use of them.

      Sights have to be altered before going into the field, with guesswork as to what the expected engagement range will be: if they want close quarters they’re going to use sights for close quarters; if they want longer ranges they’re going to want sights for longer ranges. A HUD like this completely removes that element. The gun will be accurate at close range, it will be accurate at long range, and it will be MUCH more accurate at longer ranges than usual because of the guide line too. Let’s not forget the entire HUD could function as a toggleable zoom as well. Let’s face it, not all troops are sharpshooters, because not all troops are very good at estimating bullet drop at range; give them a guide line, though, and their accuracy at those ranges will greatly improve.
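The zeroing trade-off described above can be sketched with a flat-fire approximation. This ignores air drag, and the ~900 m/s muzzle velocity is an assumed round figure for a 5.56 mm rifle round, not anything from EagleEye:

```python
# Flat-fire approximation of bullet drop (no drag), illustrating why a
# sight zeroed for close quarters shoots low at long range.
G = 9.81  # gravity, m/s^2

def drop_m(range_m: float, muzzle_velocity: float = 900.0) -> float:
    """Vertical drop (metres) over a given range, ignoring air resistance."""
    time_of_flight = range_m / muzzle_velocity
    return 0.5 * G * time_of_flight ** 2

# Drop grows with the *square* of range: negligible at 50 m,
# roughly half a metre by 300 m in this simplified model.
for r in (50, 300, 500):
    print(f"{r:>4} m: drop = {drop_m(r) * 100:.1f} cm")
```

The quadratic growth is the point: a fixed zero can only be exactly right at one range, while a computed guide line can account for the measured distance to each target.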

      Is it externalising skills? Yeah sure. It is. But not all these troops have all these skills in the first place. The tech will significantly improve the bad soldiers more than it harms the skill of the good ones. You’re giving them wallhacks and aim assist.

          • I know about AR, I mean the video game aim assist stuff. It just doesn’t seem practical. It seems ~the same as using a sight with a bunch of extra sensors in the middle. Seems like any calibration issues with sights would still exist, plus all the work of integrating the gun (identification, sensors for precise positioning, data transmission, another fucking battery). I don’t doubt MIC wants something like this, but there are always tradeoffs, and it seems like in this case you add a lot of weight and complexity and maintenance for limited benefit. Then again, maybe they have a really nice aim assist system and the only thing holding them back was that the helmet was too heavy.

            • Awoo [she/her]@hexbear.net · 6 days ago

              sensors for precise positioning

              No, you’re overthinking it. The “sensor” already exists on the headset in the form of multiple cameras, which are set apart at specific distances, allowing them to use multiple images from different locations to generate a 3d image. You can see this clearly on the current version of the headset (3 cameras in the middle of the helmet).

              The older prototypes they were working with had more cameras.

              This can actually be done with just 2 cameras: https://youtu.be/5LWVtC4ZbK4

              The technique for this is very simple depth measurement. I’m sure you understand that if you have a 3d image of everything in frame, what you can do with that is pretty simple and is going to be accurate. You can probably assume that these use wide fisheye lenses so they have an extremely clear view of everything.
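The two-camera depth measurement mentioned here is the classic stereo disparity relation: depth = focal length × baseline / disparity. A minimal sketch, where the focal length and camera spacing are made-up illustrative numbers, not EagleEye specs:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from its pixel disparity between two rectified cameras.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two camera centres, metres
    disparity_px: horizontal shift of the same feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: point at infinity or a bad match")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, cameras 12 cm apart.
# A feature shifted 14 px between the two images is ~6 m away.
print(stereo_depth(700, 0.12, 14.0))  # 6.0
```

Note the trade-off this formula implies: a wider baseline (more widely spaced cameras, as on the prototypes) gives larger disparities and therefore better depth resolution at long range.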

                • Awoo [she/her]@hexbear.net · 6 days ago

                  All you really need are:

                  1. Real time 3d model of what is currently being seen, achieved by multiple cameras.
                  2. Real time 3d model of the rifle being aimed, with the ability to recognise where that rifle’s barrel is pointing. This can be achieved with a laser on the rifle, or with simple image recognition that already exists today to do things like accurately recognise a hand pointing at something, used for pointing and clicking inside VR. All of this will work fine as long as the rifle being aimed is in frame of the camera, which it will be on such a wide-fov camera.
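Putting those two pieces together: once the barrel’s origin and direction are known in the headset’s frame, the predicted impact point is a ray cast into the reconstructed scene, then drawn on the display. A toy sketch under a strong simplifying assumption (the target surface is a flat wall at a known depth, rather than a full 3d reconstruction):

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def aim_point_on_wall(barrel_origin: Vec3, barrel_dir: Vec3, wall_z: float) -> Vec3:
    """Intersect the barrel ray with the plane z = wall_z (toy scene geometry).

    In a real system the ray would be tested against the stereo-derived
    depth map instead of a single plane.
    """
    ox, oy, oz = barrel_origin
    dx, dy, dz = barrel_dir
    if dz == 0:
        raise ValueError("ray is parallel to the wall")
    t = (wall_z - oz) / dz
    if t < 0:
        raise ValueError("wall is behind the muzzle")
    return (ox + t * dx, oy + t * dy, wall_z)

# Rifle held at the hip, 0.3 m right of the head, angled slightly left,
# toward a wall 10 m in front of the wearer.
hit = aim_point_on_wall((0.3, -0.5, 0.0), (-0.02, 0.0, 1.0), 10.0)
print(hit)  # approximately (0.1, -0.5, 10.0)
```

This is what makes the hipfire and around-the-corner positions viable: the marker on the HUD comes from the geometry of the scene and the rifle, not from the shooter’s eye line.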