With no further algorithmic improvements, a GPT-4-level model should be achievable today (given extensive compute and training on an order of magnitude more tokens than current models) in roughly the file size of a typical YouTube video.
Assuming some algorithmic improvement in the nearish future, I see no reason a GPT-5-level model in a GB or two isn't right around the corner.
(For comparison, GPT-3.5-Turbo is likely a finetuned Curie 13B, which is just about 8GB in 4bit.)
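The size figure above is easy to sanity-check with back-of-envelope arithmetic. A sketch, with one assumption: group-wise quantization schemes store extra scale/zero-point metadata, which I approximate here as ~0.5 bits per weight (the exact overhead depends on the scheme and group size).

```python
def quantized_size_gb(n_params, bits_per_weight=4.0, overhead_bits=0.5):
    """Rough on-disk size of a quantized model in GB (decimal).

    overhead_bits is an assumed per-weight cost for group-wise
    scales/zero-points; real formats vary.
    """
    total_bits = n_params * (bits_per_weight + overhead_bits)
    return total_bits / 8 / 1e9

# A 13B-parameter model at 4 bits per weight plus metadata:
print(round(quantized_size_gb(13e9), 1))  # ~7.3 GB
```

Embeddings and other unquantized tensors push the real file a bit higher, which is consistent with the ~8GB figure quoted for a 13B model in 4-bit.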