• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: December 14th, 2023

  • Yes for gaming, but for LLMs I’ve heard that the bandwidth limitations of using system RAM as VRAM hurt performance more than just running on the CPU with system memory directly, since smaller models are mostly memory-bandwidth limited.

    I’ve never tried to run AI on an iGPU with system memory, though, so you could try it, assuming it will let you allocate something like 32GB or even 64GB. I think you’ll also need a special runner that supports iGPUs.
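    The bandwidth point above can be sketched with a back-of-envelope calculation: for each generated token, the model weights are read from memory roughly once, so memory bandwidth divided by model size gives an upper bound on tokens per second. The bandwidth and model-size figures below are illustrative assumptions, not measurements:

    ```python
    # Rough upper bound on token generation speed when inference is
    # memory-bandwidth bound: every token requires reading the full
    # set of model weights roughly once.

    def max_tokens_per_sec(model_size_gb: float, bandwidth_gb_s: float) -> float:
        """Upper bound on tokens/sec for a memory-bound model."""
        return bandwidth_gb_s / model_size_gb

    # Hypothetical 32 GB quantized model:
    # dual-channel DDR5 system RAM (~90 GB/s assumed) vs
    # a discrete GPU's VRAM (~900 GB/s assumed).
    print(max_tokens_per_sec(32, 90))   # system RAM: a few tokens/sec at best
    print(max_tokens_per_sec(32, 900))  # VRAM: roughly 10x faster ceiling
    ```

    An iGPU using that same system RAM sits under the same ~90 GB/s ceiling, which is why offloading to it may not beat the CPU path.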

  • I once had someone open an issue in my side project’s repo asking about a major release bump and whether it meant there were any breaking or major changes, and I was just like, idk, I just thought I’d added enough and felt like bumping the major version ¯⁠\⁠_⁠(⁠ツ⁠)⁠_⁠/⁠¯