NVIDIA Tesla P40 24GB - GPU for AI / ML / LLM / Stable Diffusion - UK STOCK

This also holds if you split a model across two or more of these GPUs: with a 40 GB model sharded over two P40s, each card still delivers its full 347 GB/s of memory bandwidth, and you gain the combined 48 GB of VRAM to load larger models.
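As a rough sanity check on what that bandwidth means for inference, the sketch below estimates the memory-bandwidth-bound decode rate. It assumes layer-split (pipeline) execution where the GPUs run sequentially and every model byte is read once per generated token, a common simplification; the function name and numbers are illustrative, and real throughput also depends on compute, interconnect, and software.

```python
def max_tokens_per_sec(model_gb: float, bandwidth_gbps: float = 347.0) -> float:
    """Upper bound on tokens/s for bandwidth-bound decoding.

    With layer-split inference the GPUs work one after another, so the
    time per token is the sum of each shard's read time -- equivalent to
    streaming the whole model at the per-GPU bandwidth of 347 GB/s.
    """
    return bandwidth_gbps / model_gb

# A 40 GB model split over two P40s (20 GB per card):
print(f"{max_tokens_per_sec(40):.1f} tokens/s upper bound")  # ~8.7
```

Under these assumptions, adding more P40s lets you load a larger model but does not multiply per-token bandwidth, which is why the per-GPU 347 GB/s figure is the relevant one.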

eBay