Add support to translate using DeepLX #63

Open
LordBoos opened this issue Jan 1, 2025 · 9 comments

@LordBoos commented Jan 1, 2025

Hello, before I found out about MORT, I used Luma, which supports the DeepLX translator.
DeepLX is a free, locally hosted version of DeepL.
In my testing, DeepLX is noticeably better than all the other translators at translating into Czech.

Please add support for it. It is similar to the current Custom API translator, but the API is different. It is documented here:
https://deeplx.owo.network/endpoints/free.html
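
For reference, a request to the free endpoint looks roughly like this (a minimal sketch assuming DeepLX is running locally on its default port 1188; the field names follow the page linked above, so treat that page as the authoritative reference):

```python
# Minimal sketch of a call to a locally running DeepLX instance
# (assumes the default port 1188; field names per the linked docs).
import json
import urllib.request

payload = {
    "text": "Hello, world!",
    "source_lang": "EN",
    "target_lang": "CS",
}

req = urllib.request.Request(
    "http://localhost:1188/translate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read().decode("utf-8"))

# The translated string is returned in the "data" field.
print(result["data"])
```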

@killkimno (Owner)

Thanks for the info.
But it's something you have to install locally, not a web-based service,
so there are no plans to support it at the moment.

@LordBoos (Author)

It is a locally run application, but it uses standard REST calls like a web application. It's basically the same API as online DeepL, but you can start it locally on your own PC to avoid rate limiting.
It should be fairly easy to implement because you already support online DeepL. The only thing that is probably needed is letting us set our own DeepL URL, so instead of the official https://api.deepl.com/v2/translate we could set the URL to https://0.0.0.0:1188/v2/translate or https://localhost:1188/v2/translate.
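
To illustrate, a minimal sketch of what that could look like (assuming DeepLX's /v2/translate really is DeepL-compatible as its docs claim; the function name and the local URL here are just for illustration):

```python
# Sketch: the same DeepL-style request sent either to the official API
# or to a local DeepLX instance; only the base URL (and whether a key
# is needed) differs. Fields follow DeepL's v2 JSON format.
import json
import urllib.request

def translate(base_url: str, text: str, target_lang: str, api_key: str = "") -> str:
    payload = {"text": [text], "target_lang": target_lang}
    headers = {"Content-Type": "application/json"}
    if api_key:  # the official API needs a key; a local DeepLX may not
        headers["Authorization"] = f"DeepL-Auth-Key {api_key}"
    req = urllib.request.Request(
        f"{base_url}/v2/translate",
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["translations"][0]["text"]

# Official DeepL: translate("https://api.deepl.com", "Hello", "CS", api_key="...")
# Local DeepLX:   translate("http://localhost:1188", "Hello", "CS")
```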

@killkimno (Owner)

How can I use it locally?
Is there no other way than using Docker?

Installing Docker is very difficult for general users.

On the other hand, a guide could be written for users who have installed DeepLX via Docker.

@LordBoos (Author)

You can just download a binary executable from their GitHub releases and run it. No need to install Docker.
There are binaries for different systems like Windows and Linux: https://github.com/OwO-Network/DeepLX/releases

@killkimno (Owner)

When I looked more closely, there are Windows versions:
deeplx_windows_amd64
deeplx_windows_386

Which of these two should I use?

@LordBoos (Author)

That depends on your CPU. Probably the amd64 version, because it has slightly better performance. It will only run on a 64-bit CPU with a 64-bit version of Windows, but almost everyone today should have that. In case it doesn't work for you or somebody else, you can use the 386 version, which will run on all CPUs but has slightly worse performance.

@Net2Fox commented Jan 30, 2025

Hello. If it's still relevant, I've essentially added DeepLX support for MORT. At first I thought I would just fork DeepLX so that it would return data in the right format for MORT's Custom API, but then the problem came up that DeepLX requires separators just like regular DeepL, so I had to fork MORT as well 😅
So the bottom line is that you need to download my version of DeepLX and my version of MORT to use them together correctly. I put instructions in my fork.
When I tested it, it seemed to work correctly. If there are any bugs, let me know.

Also, I think I might try adding DeepLX support as a separate translation method, so that a special version of DeepLX isn't needed.
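
For anyone curious what the separator problem is about, here is a rough sketch of the general idea (just an illustration, not the actual code in my branches): several captured lines are joined with a marker, translated in one request, then split back apart afterwards.

```python
# Illustration only, NOT the actual MORT/DeepLX fork code.
# Several OCR'd lines are sent as one request, joined by a marker the
# translator is expected to leave intact, then split back afterwards.
SEPARATOR = "\n-@@-\n"  # assumed marker; the real forks may use something different

def batch_translate(lines: list[str], translate_fn) -> list[str]:
    joined = SEPARATOR.join(lines)
    translated = translate_fn(joined)
    parts = [p.strip() for p in translated.split(SEPARATOR.strip())]
    # If the translator mangled the marker, fall back to the originals.
    return parts if len(parts) == len(lines) else lines
```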

@killkimno (Owner) commented Jan 30, 2025

Thank you!
I haven't been able to work on this recently because of my job.
I'll try to add it to MORT after testing.

There are two branches; which one should I check?
Custom API-DeepLX or Feature DeepLX Support?

@Net2Fox commented Jan 30, 2025

CustomAPI DeepLX is an old branch. I had some free time today, so I kind of managed to understand the code a bit and made an implementation of DeepLX as a translation type in the Feature DeepLX Support branch.

So check out the Feature DeepLX Support branch.
