*Figure: Choose your HanLP version.*
**Beginners Attention:** New to NLP? Just install the RESTful packages and call {meth}`~hanlp_restful.HanLPClient.parse` without pain.
For beginners, the RESTful packages are recommended because they are easier to start with; the only requirement is an auth key. We have officially released the following language bindings (a minimal Python usage sketch follows the list):
- **Python**: `pip install hanlp_restful`
- **Java**: see the Java instructions.
- **Golang**: see the Golang instructions.
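As a quick sanity check, here is a minimal Python sketch of the RESTful client. The endpoint URL follows the public documentation, and the `auth` value is a placeholder you must replace with your own key:

```python
from hanlp_restful import HanLPClient

# Create a client pointing at the public HanLP API.
# Replace auth with your own key; language='zh' for Chinese, 'mul' for multilingual.
HanLP = HanLPClient('https://www.hanlp.com/api', auth='YOUR_AUTH_KEY', language='zh')

# One call runs tokenization, POS tagging, NER, parsing, etc.
doc = HanLP.parse('晓美焰来到北京立方庭参观自然语义科技公司。')
print(doc)
```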
The native package, which runs locally, can be installed via pip.
```{note}
See [developer guideline](https://hanlp.hankcs.com/docs/contributing.html#development).
```
`pip install hanlp`
HanLP requires Python 3.6 or later. GPU/TPU is suggested but not mandatory. Depending on your preference, HanLP offers the following flavors:
```{note}
Installation on Windows is **perfectly** supported. No need to install Microsoft Visual C++ Build Tools anymore.
```
```{note}
HanLP also perfectly supports accelerating on Apple Silicon M1 chips, see [tutorial](https://www.hankcs.com/nlp/hanlp-official-m1-support.html).
```
| Flavor | Description |
|---|---|
| default | Installs the default version, which delivers the most commonly used functionality. Heavy dependencies like TensorFlow are not installed. |
| full | For experts who seek to maximize efficiency via TensorFlow and C++ extensions, `pip install hanlp[full]` installs every dependency HanLP uses in production. |
In short, you don't need to manually install any model. Instead, models are automatically downloaded to a directory called `HANLP_HOME` when you call `hanlp.load`.
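For example, here is a minimal sketch of loading a pretrained model locally. The identifier below is one of the multi-task model names listed under `hanlp.pretrained.mtl` at the time of writing; any identifier from `hanlp.pretrained` works the same way, and the first call downloads the model into `HANLP_HOME` (which defaults to `~/.hanlp`):

```python
import hanlp

# The first call downloads the model into HANLP_HOME (defaults to ~/.hanlp);
# subsequent calls load it from the local cache.
HanLP = hanlp.load(hanlp.pretrained.mtl.CLOSE_TOK_POS_NER_SRL_DEP_SDP_CON_ELECTRA_SMALL_ZH)

# Run tokenization, POS tagging, NER, parsing, etc. in one pass.
print(HanLP(['晓美焰来到北京立方庭参观自然语义科技公司。']))
```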
Occasionally, some errors might occur the first time you load a model, in which case you can refer to the following tips.
If the auto-download fails, you can:

- Retry, as our file server might be busy serving users from all over the world.
- Follow the message on your terminal, which often guides you to manually download a `zip` file to a particular path.
- Use a mirror site, which could be faster and more stable in your region.
If your server has no Internet access at all, debug your code on your local PC first, then copy the following directories to your server via a USB disk or similar (a sketch for non-default destination paths follows the list):

- `~/.hanlp`: the home directory for HanLP models.
- `~/.cache/huggingface`: the home directory for Hugging Face 🤗 Transformers.
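If the copied models cannot live at the default `~/.hanlp` path on the server, you can point HanLP at the copied directory by setting the `HANLP_HOME` environment variable before importing `hanlp`. A minimal sketch, assuming `~/.cache/huggingface` was copied to its default location as described above; the path and the tokenizer identifier below are just examples:

```python
import os

# Point HANLP_HOME at the directory you copied over, *before* importing hanlp.
# /data/hanlp is just an example path on the offline server.
os.environ['HANLP_HOME'] = '/data/hanlp'

import hanlp

# Loads from the local copy, assuming the model files were included in the copied directory.
tok = hanlp.load(hanlp.pretrained.tok.COARSE_ELECTRA_SMALL_ZH)
print(tok('商品和服务'))
```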
Some TensorFlow/fastText models will ask you to install the missing TensorFlow or fastText modules, in which case you'll need to install the full version:

`pip install hanlp[full]`
NEVER install third-party packages (TensorFlow, fastText, etc.) by yourself, as higher or lower versions of these packages have not been tested and might not work properly.