12 GB of VRAM is still an upgrade away for most people, and a 4-bit quantized 13B model is barely more than a tech demo. When open-source AI is proclaimed to be near, on par with, or better than GPT-4, that claim refers to nothing but their biggest models running in a prime environment.
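For scale, a rough back-of-envelope sketch (my own, not from the thread) of why a 4-bit 13B model just about fits in 12 GB; this counts weights only, while the KV cache and activations need extra headroom:

```python
def quantized_weights_gb(params_billion: float, bits: int) -> float:
    """Approximate size of model weights alone, in decimal GB."""
    total_bytes = params_billion * 1e9 * bits / 8
    return total_bytes / 1e9

print(quantized_weights_gb(13, 4))   # 4-bit: ~6.5 GB of weights
print(quantized_weights_gb(13, 16))  # fp16 for comparison: ~26 GB
```

So the weights fit, but runtime overhead eats much of the remaining headroom on a 12 GB card.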
Sure, but not for standard cloud instances that are very affordable for companies wanting to get away from OpenAI.