To run these models cheaply on your own laptop, we still need much better fine-tuned training data and much better algorithms. Right now, the most capable models need over 120 GB of VRAM just for inference (running the model).
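As a rough back-of-envelope check on where a number like 120 GB comes from (the 70B parameter count and precisions below are my own assumed example, not from the post), weight memory alone scales with parameter count times bytes per parameter:

```python
# Rough VRAM estimate for just holding model weights.
# Assumptions (illustrative, not from the original post): a dense 70B-parameter
# model; ignores KV cache and activations, which add more memory on top.

def weight_vram_gb(num_params_billion: float, bytes_per_param: float) -> float:
    """Approximate GiB of VRAM needed to store the weights alone."""
    return num_params_billion * 1e9 * bytes_per_param / 1024**3

for precision, nbytes in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"70B @ {precision}: ~{weight_vram_gb(70, nbytes):.0f} GB")

# Prints roughly: fp16 ~130 GB, int8 ~65 GB, int4 ~33 GB (weights only)
```

So an unquantized large model easily blows past a single consumer GPU, which is why quantization and better algorithms matter for local use.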