Tux@lemmy.world (mod) to Technology Memes@lemmy.world, English · 5 days ago
Software: Then vs Now (lemmy.world) [locked]
Cross-posted to: memes@lemmy.ml
Mako_Bunny@lemmy.blahaj.zone · 5 days ago
What Python code runs on a graphics card?
apfelwoiSchoppen@lemmy.world · 5 days ago
Phyton, not Python. 🙃
BougieBirdie@lemmy.blahaj.zone · 5 days ago
Python has a ton of machine learning libraries; I'd maybe even go so far as to say it's the de facto standard for developing AI. There are also some CUDA libraries which by definition do things directly on the card.
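For illustration, a minimal sketch of what GPU-backed Python looks like in practice, assuming a CUDA-enabled PyTorch build (the comment doesn't name a specific library; PyTorch is just one plausible example of these):

```python
# Minimal sketch: a matrix multiply placed on the GPU with PyTorch.
# Assumes a CUDA-enabled PyTorch install; falls back to CPU otherwise.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # dispatched as a CUDA kernel when device is "cuda"
print(c.device)
```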
Tarogar@feddit.org · 5 days ago
Yes… it's possible to have that, even though it doesn't do it by default. The CPU can be, and in a fair few cases still is, the bottleneck, and you bet you can run shitty code on there.
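As a sketch of the "not by default" point: plain NumPy code stays on the CPU, and the card is only used when you explicitly opt in, here via CuPy (assuming the cupy package and a CUDA GPU are available):

```python
# By default, ordinary Python/NumPy work runs entirely on the CPU.
import numpy as np

x = np.arange(1_000_000, dtype=np.float32)
y_cpu = np.sqrt(x)  # computed on the CPU

# GPU execution only happens when you opt in, e.g. with CuPy
# (assumes the cupy package and a CUDA-capable GPU are present).
import cupy as cp

x_gpu = cp.asarray(x)    # copy the array into device memory
y_gpu = cp.sqrt(x_gpu)   # runs as a CUDA kernel on the card

assert np.allclose(y_cpu, cp.asnumpy(y_gpu))
```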