L4sBot@lemmy.world to Technology@lemmy.world · English · 11 months ago
2023 was the year that GPUs stood still (arstechnica.com)
cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo, technology@lemmit.online
2023 was the year that GPUs stood still: a new GPU generation did very little to change the speed you get for your money.
CalcProgrammer1@lemmy.ml · 11 months ago
I’ve been very happy with my Arc A770; it works great on Linux and performs well for what I paid for it.
barsoap@lemm.ee · 11 months ago
Have you tried ML workloads? Put differently: how is compatibility with software that expects CUDA/ROCm? The A770 is certainly the cheapest way to get 16 GB of VRAM nowadays.
CalcProgrammer1@lemmy.ml · 11 months ago
No, I don’t use any ML software, or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.
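On the CUDA/ROCm compatibility question: one practical way to find out what a given card can run is to probe the framework's backends at runtime. A minimal sketch, assuming PyTorch; note that `torch.cuda.is_available()` returns True on both CUDA and ROCm builds, and that the `torch.xpu` namespace for Intel GPUs like the A770 is an assumption that depends on the PyTorch version (or Intel's extension) being installed:

```python
def pick_device() -> str:
    """Pick the best available accelerator backend, falling back to CPU."""
    try:
        import torch
    except ImportError:
        # PyTorch not installed at all; nothing to probe.
        return "cpu"
    if torch.cuda.is_available():
        # True on both CUDA (NVIDIA) and ROCm (AMD) builds of PyTorch.
        return "cuda"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        # Intel GPU backend (Arc cards); presence depends on PyTorch version.
        return "xpu"
    return "cpu"

print(pick_device())
```

Code written against the CUDA API path simply won't see an Arc card through this probe, which is why "works great for gaming" and "works for ML" can be very different answers for the same GPU.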