```
(janusenv) P:\Janus>python demo/app_januspro.py
Python version is above 3.10, patching the collections module.
C:\Users\artre\anaconda3\envs\janusenv\Lib\site-packages\transformers\models\auto\image_processing_auto.py:590: FutureWarning: The image_processor_class argument is deprecated and will be removed in v4.42. Please use `slow_image_processor_class`, or `fast_image_processor_class` instead
  warnings.warn(
Loading checkpoint shards: 100%|█████████████████████████████████████████████████| 2/2 [00:06<00:00, 3.03s/it]
Using a slow image processor as `use_fast` is unset and a slow processor was saved with this model. `use_fast=True` will be the default behavior in v4.48, even if the model was saved with a slow processor. This will result in minor differences in outputs. You'll still be able to use a slow processor with `use_fast=False`.
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama_fast.LlamaTokenizerFast'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565 - if you loaded a llama tokenizer from a GGUF file you can ignore this message.
Some kwargs in processor config are unused and will not have any effect: mask_prompt, add_special_token, num_image_tokens, ignore_id, image_tag, sft_format.
* Running on local URL: http://127.0.0.1:7860
Could not create share link. Please check your internet connection or our status page: https://status.gradio.app.
```
```
Successfully uninstalled gradio-5.13.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
rich 13.9.4 requires pygments<3.0.0,>=2.13.0, but you have pygments 2.12.0 which is incompatible.
Successfully installed Pygments-2.12.0 altair-5.5.0 attrs-25.1.0 colorama-0.4.5 contourpy-1.3.1 cycler-0.12.1 fonttools-4.55.8 gradio-3.48.0 gradio-client-0.6.1 importlib-resources-6.5.2 janus-1.0.0 jsonschema-4.23.0 jsonschema-specifications-2024.10.1 kiwisolver-1.4.8 latex2mathml-3.77.0 markdown-3.4.1 matplotlib-3.10.0 mdtex2html-1.3.0 narwhals-1.24.1 numpy-1.26.4 pillow-10.4.0 pyparsing-3.2.1 pypinyin-0.50.0 referencing-0.36.2 rpds-py-0.22.3 tiktoken-0.5.2 tqdm-4.64.0 websockets-11.0.3
```
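For context on the error above: pip only resolves the packages named in the current command, not everything already installed, so it can leave `pygments 2.12.0` in place even though `rich 13.9.4` declares `pygments<3.0.0,>=2.13.0`. A minimal sketch of the check pip is warning about, with hypothetical helper functions (real tools use `packaging.specifiers.SpecifierSet`):

```python
# Illustrative version-spec check; helper names are made up for this sketch.
# Real dependency resolution uses packaging.specifiers.SpecifierSet.

def parse(v):
    """Turn '2.13.0' into (2, 13, 0) so versions compare as tuples."""
    return tuple(int(p) for p in v.split("."))

def satisfies(installed, spec):
    """Check an installed version against a comma-separated spec
    such as '<3.0.0,>=2.13.0' (only '<' and '>=' clauses handled here)."""
    for clause in spec.split(","):
        if clause.startswith(">="):
            if parse(installed) < parse(clause[2:]):
                return False
        elif clause.startswith("<"):
            if parse(installed) >= parse(clause[1:]):
                return False
    return True

# rich 13.9.4 requires: pygments<3.0.0,>=2.13.0
spec = "<3.0.0,>=2.13.0"
print(satisfies("2.12.0", spec))  # False -> exactly the conflict pip reports
print(satisfies("2.13.0", spec))  # True  -> this version would be acceptable
```

Upgrading pygments into the required range (e.g. `pip install "pygments>=2.13,<3"`) is one way to silence the warning, though it is cosmetic here and does not affect the demo.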
Use Conda's sentencepiece and prevent pip from re-installing it:

1. `conda install -c conda-forge sentencepiece` (already done), then
2. `pip install -e .[gradio] --no-deps` (the `--no-deps` flag skips reinstalling dependencies such as sentencepiece).

This fixes the conflict. Ignore the Gradio "share link" warning; it's just a connectivity issue, not an error.
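Putting those steps together, one possible command sequence (assuming the `janusenv` conda environment from the logs above is already activated, and run from the repository root):

```shell
# Assumes the "janusenv" conda env is active and the cwd is the Janus repo root.
conda install -c conda-forge sentencepiece   # take sentencepiece from conda-forge
pip install -e .[gradio] --no-deps           # install Janus without touching its deps
pip check                                    # optional: report any remaining conflicts
python demo/app_januspro.py                  # relaunch the demo
```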
Janus-Pro-7B not working locally
`pip install -e .[gradio]` gives
...
![Image](https://private-user-images.githubusercontent.com/48383808/407963894-dfc98913-c0d7-4bc6-95e5-45e84f1f8b22.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkwODkzODMsIm5iZiI6MTczOTA4OTA4MywicGF0aCI6Ii80ODM4MzgwOC80MDc5NjM4OTQtZGZjOTg5MTMtYzBkNy00YmM2LTk1ZTUtNDVlODRmMWY4YjIyLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTAyMDklMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwMjA5VDA4MTgwM1omWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTljNWYxYTdlZWFkZTI4YmVlOTZhOWRjNDM4Zjk4ODdjNGEwYWM0NWM0OGY0MTE3Y2Y5Y2U1ZDU3NTRkN2EwNmImWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.vlFve9HFDaL6x_Ul60MzQWgN3vVkBmu4LeVWkiNQIeo)
...
I ran `conda install -c conda-forge sentencepiece`,
then `python demo/app_januspro.py`.
I get the output shown in the logs above.
I can open the page at http://127.0.0.1:7860, but it does not generate anything.
Complete logs here