
Large Language Models and InternLM

Characteristics

  • Require large amounts of training data
  • Have a very large number of parameters
  • Demand substantial compute

Network architectures

  • Transformer
  • BERT
  • GPT

Advantages: able to capture and understand more complex and abstract features and relationships in the data. Disadvantages: interpretability remains a problem, and training and deployment costs are high.
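
All three architectures listed above are built on attention layers. Below is a minimal NumPy sketch of scaled dot-product attention, the core operation shared by BERT- and GPT-style models; it is purely illustrative and not InternLM's implementation.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_model)."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # pairwise token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                                        # weighted mix of value vectors

# toy example: 4 tokens with 8-dimensional embeddings, used as Q, K and V (self-attention)
x = np.random.rand(4, 8)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)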

InternLM models

Open-sourced: InternLM-7B and InternLM-20B
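
A minimal usage sketch for the open-sourced chat model, assuming transformers is installed, the weights are available locally or on the Hugging Face Hub (see the download steps below), and a GPU with enough memory; the chat helper is provided by the model's remote code.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# trust_remote_code is needed because InternLM ships custom model code
tokenizer = AutoTokenizer.from_pretrained("internlm/internlm-chat-7b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm-chat-7b", trust_remote_code=True, torch_dtype=torch.float16
).cuda()
model = model.eval()

# single-turn chat; `history` carries the conversation across turns
response, history = model.chat(tokenizer, "Hello, please introduce yourself.", history=[])
print(response)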

Development resources

  1. tmux: a terminal multiplexer that keeps long-running sessions alive on the dev machine

  2. conda:

# enter a bash shell
bash

# create an environment by cloning an existing one
conda create --name internlm-demo --clone=/root/share/conda_envs/internlm-base

# activate it
conda activate internlm-demo

Add the conda environment to JupyterLab in one step:
lab add internlm-demo
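
To confirm that a notebook really runs inside the cloned environment, a quick check from a notebook cell (a sanity-check sketch, not one of the original steps):

import sys

# should print a path inside the internlm-demo environment
print(sys.executable)
print(sys.version)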
  3. pip mirror

# install a package from the specified index
pip install -i https://mirrors.cernet.edu.cn/pypi/web/simple some-package

# set the mirror as the global default index
pip config set global.index-url https://mirrors.cernet.edu.cn/pypi/web/simple

# upgrade pip itself through the specified index
python -m pip install -i https://mirrors.cernet.edu.cn/pypi/web/simple --upgrade pip
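
The same mirror can also be used from inside a script; the pip_install helper below is hypothetical, shown only to illustrate calling pip programmatically with the mirror index.

import subprocess
import sys

MIRROR = "https://mirrors.cernet.edu.cn/pypi/web/simple"

def pip_install(package: str) -> None:
    # run pip in the current interpreter, pointing it at the mirror
    subprocess.check_call([sys.executable, "-m", "pip", "install", "-i", MIRROR, package])

# example: pip_install("transformers")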

  4. conda mirror (.condarc)
  • Linux: ${HOME}/.condarc
  • macOS: ${HOME}/.condarc
  • Windows: C:\Users\<YourUserName>\.condarc
cat <<'EOF' > ~/.condarc
channels:
  - defaults
show_channel_urls: true
default_channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/r
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/msys2
custom_channels:
  conda-forge: https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud
  pytorch: https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud
EOF
  5. Port forwarding and file upload

ssh -CNg -L 6006:127.0.0.1:6006 [email protected] -p <dev machine port>

scp -o StrictHostKeyChecking=no -r -P {port} {local directory} [email protected]:<dev machine directory>
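
Once the tunnel is running, a quick way to confirm that the forwarded port is reachable from the local machine (an illustrative check, not one of the original commands):

import socket

# after `ssh -CNg -L 6006:127.0.0.1:6006 ...` is up, local port 6006 should accept
# connections and forward them to the dev machine
try:
    with socket.create_connection(("127.0.0.1", 6006), timeout=3):
        print("tunnel is up: local port 6006 is reachable")
except OSError as exc:
    print(f"tunnel not reachable: {exc}")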
  6. Model download

huggingface

# pip install -U huggingface_hub

import os

# download the full model
os.system('huggingface-cli download --resume-download internlm/internlm-chat-7b --local-dir your_path')

# download a single file
from huggingface_hub import hf_hub_download

hf_hub_download(repo_id="internlm/internlm-7b", filename="config.json")
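
Besides the CLI call above, huggingface_hub also exposes the snapshot download as a plain Python function; a sketch with a placeholder target directory:

from huggingface_hub import snapshot_download

# download the whole repository snapshot, resuming any interrupted files
snapshot_download(
    repo_id="internlm/internlm-chat-7b",
    local_dir="your_path",        # placeholder: choose a local directory
    resume_download=True,
)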

ModelScope

# pip install modelscope==1.9.5
# pip install transformers==4.35.2

import torch
from modelscope import snapshot_download, AutoModel, AutoTokenizer
import os
# download the model snapshot into the local cache directory
model_dir = snapshot_download('Shanghai_AI_Laboratory/internlm-chat-7b', cache_dir='your path', revision='master')
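
The returned model_dir points at the local snapshot; assuming the download succeeded, it can be loaded with the classes already imported above (fp16 and the choice of Auto class are illustrative, not prescribed by the original notes):

# load tokenizer and model from the local snapshot
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True, torch_dtype=torch.float16)
model = model.eval()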

OpenXLab

# pip install -U openxlab

from openxlab.model import download

# download the model from the OpenXLab hub to a local path
download(model_repo='OpenLMLab/InternLM-7b', model_name='InternLM-7b', output='your local path')
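
Whichever hub is used, a quick sanity check that the files actually landed in the target directory (the path below is a placeholder for the output directory used above):

from pathlib import Path

model_dir = Path("your local path")   # placeholder: the download output directory
for f in sorted(model_dir.iterdir()):
    print(f.name, f.stat().st_size)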