Environment

env.yaml

Model Download

ModelScope (Alibaba DAMO Academy) website

Download internlm-chat-7b to /root/weights/internlm, pinned to version v1.0.3:

from modelscope import snapshot_download

# Download the pinned revision into cache_dir and return the local directory path
model_dir = snapshot_download('Shanghai_AI_Laboratory/internlm-chat-7b', cache_dir='/root/weights/internlm', revision='v1.0.3')
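As a quick check that the download is usable, the weights can be loaded from the returned path with transformers; this is only a minimal sketch, assuming transformers is installed and enough memory is available (InternLM repositories require trust_remote_code=True):

from transformers import AutoModelForCausalLM, AutoTokenizer

# model_dir is the local path returned by snapshot_download above
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_dir, trust_remote_code=True)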

Summary: snapshot_download pulls the whole model repository in one call; cache_dir sets the download location and revision pins the model version.

huggingface_hub (Hugging Face) website

Download via the huggingface-cli command line, invoked through the os package.

import os

model_dir = "/root/weights/internlm"
os.system(f'huggingface-cli download --resume-download internlm/internlm-chat-20b --local-dir {model_dir}')
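An alternative sketch of the same call, if you prefer subprocess.run over os.system so that a failed download raises an exception instead of being silently ignored:

import subprocess

model_dir = "/root/weights/internlm"
# check=True raises CalledProcessError if huggingface-cli exits with a non-zero status
subprocess.run(['huggingface-cli', 'download', '--resume-download',
                'internlm/internlm-chat-20b', '--local-dir', model_dir], check=True)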

Download a single file with the hf_hub_download function

from huggingface_hub import hf_hub_download

# Download a single file (here config.json) from the repo into cache_dir
hf_hub_download(repo_id="internlm/internlm-20b", filename="config.json", cache_dir='/root/weights/other')
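hf_hub_download fetches one file at a time and returns its local path. For downloading a whole repository through the Python API instead of the CLI, huggingface_hub also provides snapshot_download; a minimal sketch (the local_dir path here is just an example):

from huggingface_hub import snapshot_download

# Download every file in the repo into local_dir and return that path
local_path = snapshot_download(repo_id="internlm/internlm-20b", local_dir="/root/weights/other")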

Summary:

  1. Downloads from overseas sites can be unstable.

Solution: use a domestic mirror.

# Configure the HF_ENDPOINT environment variable to point at the mirror
export HF_ENDPOINT=https://hf-mirror.com
# Enable hf_transfer for faster downloads (requires: pip install hf_transfer)
export HF_HUB_ENABLE_HF_TRANSFER=1

# Download
huggingface-cli download --resume-download --local-dir-use-symlinks False bigscience/bloom-560m --local-dir bloom-560m
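The mirror can also be configured from Python instead of the shell; a minimal sketch, assuming hf_transfer is installed and noting that huggingface_hub reads HF_ENDPOINT when it is imported, so the variables must be set first:

import os

# Set the mirror endpoint before importing huggingface_hub
os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'
os.environ['HF_HUB_ENABLE_HF_TRANSFER'] = '1'  # requires: pip install hf_transfer

from huggingface_hub import snapshot_download

snapshot_download(repo_id='bigscience/bloom-560m', local_dir='bloom-560m')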