Hacker News

You the real MVP!

Though I'm getting this error on an Intel MacBook (Monterey); it works fine on a Windows 11 box:

    python3 convert-pth-to-ggml.py models/open_llama_7b_preview_200bt/open_llama_7b_preview_200bt_transformers_weights 1
    Loading model file models/open_llama_7b_preview_200bt/open_llama_7b_preview_200bt_transformers_weights/pytorch_model-00001-of-00002.bin
    Traceback (most recent call last):
      File "/l/llama.cpp/convert-pth-to-ggml.py", line 11, in <module>
        convert.main(['--outtype', 'f16' if args.ftype == 1 else 'f32', '--', args.dir_model])
      File "/l/llama.cpp/convert.py", line 1129, in main
        model_plus = load_some_model(args.model)
      File "/l/llama.cpp/convert.py", line 1055, in load_some_model
        models_plus.append(lazy_load_file(path))
      File "/l/llama.cpp/convert.py", line 857, in lazy_load_file
        raise ValueError(f"unknown format: {path}")
    ValueError: unknown format: models/open_llama_7b_preview_200bt/open_llama_7b_preview_200bt_transformers_weights/pytorch_model-00001-of-00002.bin


I had the same issue and then noticed that I needed git lfs; just cloning the repo will not download the actual weights, only small pointer files.
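In case it helps others debug: a repo cloned without git-lfs contains small text pointer stubs in place of the real weights, and those stubs begin with a fixed header line. Here's a quick sketch for checking a file before converting (the `is_lfs_pointer` helper is just an illustration, not part of llama.cpp):

```python
from pathlib import Path

# Every git-lfs pointer file begins with this exact line.
LFS_MAGIC = b"version https://git-lfs.github.com/spec/v1"

def is_lfs_pointer(path: Path) -> bool:
    """True if `path` is an un-fetched git-lfs pointer stub
    rather than the actual binary file."""
    with open(path, "rb") as f:
        return f.read(len(LFS_MAGIC)) == LFS_MAGIC
```

If this returns True for the pytorch_model-*.bin files, the "unknown format" error is expected; running `git lfs pull` inside the model repo should fetch the real weights.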


After getting the model with git lfs I get:

    Loading model file models/open_llama_7b_preview_200bt/open_llama_7b_preview_200bt_transformers_weights/pytorch_model-00001-of-00002.bin
    Traceback (most recent call last):
      File "convert-pth-to-ggml.py", line 11, in <module>
        convert.main(['--outtype', 'f16' if args.ftype == 1 else 'f32', '--', args.dir_model])
      File "/Volumes/mac/Dev/llama.cpp/convert.py", line 1145, in main
        model_plus = load_some_model(args.model)
      File "/Volumes/mac/Dev/llama.cpp/convert.py", line 1071, in load_some_model
        models_plus.append(lazy_load_file(path))
      File "/Volumes/mac/Dev/llama.cpp/convert.py", line 865, in lazy_load_file
        return lazy_load_torch_file(fp, path)
      File "/Volumes/mac/Dev/llama.cpp/convert.py", line 737, in lazy_load_torch_file
        model = unpickler.load()
    TypeError: 'staticmethod' object is not callable
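One plausible cause for that error (an assumption on my part, not confirmed from convert.py) is the Python version. Before Python 3.10, a staticmethod object looked up raw from a class's `__dict__`, the way a custom pickle Unpickler might resolve methods, is not itself callable; 3.10 made staticmethod objects callable like plain functions. A minimal demo of the difference:

```python
import sys

class Loader:
    @staticmethod
    def load_tensor():
        return "ok"

# Normal attribute access always works: the descriptor protocol
# unwraps the staticmethod into a plain function.
assert Loader.load_tensor() == "ok"

# Raw lookup bypasses the descriptor protocol and returns the
# staticmethod object itself.
raw = vars(Loader)["load_tensor"]

# staticmethod only gained __call__ in Python 3.10; on older
# interpreters calling `raw` raises:
#   TypeError: 'staticmethod' object is not callable
if sys.version_info >= (3, 10):
    assert raw() == "ok"
else:
    assert not callable(raw)
```

If that's the cause here, upgrading the macOS Python (e.g. via brew) to 3.10 or newer may fix it.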


Thanks for the tip! After running `brew install git-lfs && git lfs install` on my MacBook, I was able to run the model.


I get the same error on an M-series MacBook (Ventura). However, from the repo's README.md it looks like `make` should work instead of `cmake`; I'll give that a try.



