@MarcellusDrum@lemmy.ml to Privacy@lemmy.ml · English · 6 days ago
This is getting laughably ridiculous (lemmy.ml, image post)
@kkj@lemmy.dbzer0.com · 6 days ago
Fair enough. I wish them well in the effort. It would be nice if Nvidia threw them a bone, though, what with all the AI money and their GPUs being used in so many Linux supercomputers and servers.
@FauxLiving@lemmy.world · edited · 6 days ago
There's a project working on making CUDA work on all (read: AMD) graphics cards. It's alpha-level, but the progress so far looks promising.
https://www.tomshardware.com/software/a-project-to-bring-cuda-to-non-nvidia-gpus-is-making-major-progress-zluda-update-now-has-two-full-time-developers-working-on-32-bit-physx-support-and-llms-amongst-other-things
e: Tom's Hardware links are half the size of the article 😂
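For context, the whole point of a layer like ZLUDA is that ordinary, unmodified CUDA programs should run on non-NVIDIA hardware by swapping the CUDA runtime/driver libraries underneath them. Here's a minimal sketch of the kind of code involved: plain CUDA C++ with nothing ZLUDA-specific in it. Whether every one of these runtime calls is already covered by ZLUDA today is an assumption on my part, not something the article confirms.

```cuda
// Minimal CUDA vector-add: the sort of unmodified CUDA code a translation
// layer like ZLUDA aims to run on AMD GPUs. Uses only core runtime calls
// (cudaMalloc, cudaMemcpy, kernel launch); ZLUDA coverage of each call is
// assumed here, not guaranteed.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n, 0.0f);

    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);

    cudaMemcpy(da, ha.data(), bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hc.data(), dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```

You'd build this with nvcc as usual; the pitch of projects like ZLUDA is that the resulting binary can then be pointed at their replacement libraries instead of NVIDIA's, with the GPU work dispatched to AMD hardware underneath.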