Hi there, I want to share some thoughts and would like to hear your opinions on them.

Recently, AI development has been booming in game development as well. Take NVIDIA ACE, for example, which would make it possible for NPCs to run an AI model and converse with players. There is also work on an alternative to ray tracing where lighting, shadows and reflections are generated by AI, which would require less performance while looking similar to ray tracing.

So it seems like raster performance is already at a pretty decent level, and graphics card manufacturers are putting more and more AI processing units on their cards.

In my eyes, the next logical step would be to separate the graphics card's work, i.e. rasterisation and ray tracing, from AI. The result could be a new kind of PCIe card, an AI accelerator, featuring a processor optimized for parallel processing and high data throughput.

This would allow developers to run more advanced AI models on the consumer's PC. For compatibility, they could e.g. offer a cloud-based subscription system.

So what are your thoughts on this?

  • @colournoun@beehaw.org

    Unless the AI processing is much more specialized than graphics, I think manufacturers would put that effort into making more powerful GPUs that can also be used for AI tasks.

    • @TheTrueLinuxDev@beehaw.org

      They would try to alleviate the cost of running this on the GPU by adding an AI accelerator block like the Tensor Core, but it will get bottlenecked by limited VRAM, since neural net models require a steep amount of memory. It's more productive to have something like an NPU that runs either on system RAM or on its own memory chips, offering a higher capacity for such neural nets and avoiding the round-trip data copying between GPU and CPU.
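
      A rough back-of-the-envelope sketch of that VRAM pressure (the parameter counts and the fp16 assumption below are illustrative, not tied to any specific model):

      ```python
      # Illustrative only: memory needed just to hold a model's weights in fp16,
      # compared with the 12-24 GB of VRAM on typical consumer GPUs.

      def weights_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
          """Memory for the weights alone, assuming fp16 (2 bytes per parameter)."""
          return n_params * bytes_per_param / 1024**3

      for n_params in (7e9, 13e9, 70e9):  # hypothetical 7B / 13B / 70B models
          print(f"{n_params / 1e9:.0f}B params -> ~{weights_memory_gb(n_params):.0f} GB")

      # Prints roughly 13, 24 and 130 GB -- before activations or a KV cache,
      # which is why larger models spill out of consumer VRAM so quickly.
      ```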

    • GreyBeard

      We saw this happen a long time ago with PPUs (Physics Processing Units). They came around for a couple of years, then the graphics card manufacturers integrated the PPU into the GPU and destroyed any market for standalone PPUs.

  • Qazwsxedcrfv000

    Your GPU is already an AI accelerator. Running trained AI models is not as resource-demanding as training one. Unless local training becomes commonplace, AI accelerators for consumers make very little sense.
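
    A minimal sketch of why that is (assuming fp16 weights and an Adam-style optimizer; the 7B parameter count is just an example):

    ```python
    # Illustrative only: training keeps gradients and optimizer state alongside
    # the weights, while inference only needs the weights (plus activations).

    def memory_gb(n_params: float, bytes_per_param: float) -> float:
        return n_params * bytes_per_param / 1024**3

    n_params = 7e9  # hypothetical 7B-parameter model

    inference_gb = memory_gb(n_params, 2)           # fp16 weights only
    training_gb = memory_gb(n_params, 2 + 2 + 8)    # + fp16 grads + two fp32 Adam moments

    print(f"inference ~{inference_gb:.0f} GB, training ~{training_gb:.0f} GB")
    # roughly 13 GB vs 78 GB for the same model
    ```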

  • @averyminya@beehaw.org

    Look into what Mystic AI was doing. It’s effectively what you were talking about but based in reality :)

    • @Port8080@feddit.deOP

      Two very interesting articles. Thank you for that!

      The analog processor in particular is a game changer, with the computation happening directly in memory. In general, analog computers are a very interesting subject!

  • @dill

    It was before my time, but… if PhysX cards are any indication, then no.

    • @Port8080@feddit.deOP

      The PhysX debate was also before my time, but I read into it, and it seems like they solved it partly in software. Please correct me if I'm wrong; I only skimmed the PPU subject. With AI, however, we are talking about hardware limitations, especially memory.

      Currently, AI workloads involve a lot of time-consuming copying between CPU and GPU memory.
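
      A minimal sketch of what I mean, assuming a CUDA-capable GPU and PyTorch (the tensor size is an arbitrary example):

      ```python
      import time
      import torch

      # Time the host-to-device copy over PCIe separately from the actual GPU work.
      x = torch.randn(8192, 8192)      # ~256 MB of fp32 data sitting in system RAM

      t0 = time.perf_counter()
      x_gpu = x.to("cuda")             # CPU -> GPU copy over PCIe
      torch.cuda.synchronize()
      t1 = time.perf_counter()

      y = x_gpu @ x_gpu                # the "useful" work, done entirely in VRAM
      torch.cuda.synchronize()
      t2 = time.perf_counter()

      print(f"copy: {(t1 - t0) * 1e3:.1f} ms, matmul: {(t2 - t1) * 1e3:.1f} ms")
      ```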

  • @maynarkh@feddit.nl

    Good question, but I'd say the same train of thought already played out with dedicated physics cards. I'd guess an AI card would need a great value proposition to be worth buying.

    For compatibility, they could e.g. offer a cloud-based subscription system.

    I’m not sure where you’re going with this, but it feels wrong. I’m not buying a hardware part that cannot function without a constant internet connection or regular payment.

  • @s_s

    Restate your question without any of the following buzzwords: A.I., artificial intelligence, machine learning.

    Now, clearly describe what you are talking about.

    IMO, it looks like you’re completely lost in the sauce.