Free and Open Source Machine Translation API. Self-hosted, offline capable and easy to setup.
Try it online! | API Docs | Community Forum


Free and Open Source Machine Translation API, entirely self-hosted. Unlike other APIs, it doesn't rely on proprietary providers such as Google or Azure to perform translations. Instead, its translation engine is powered by the open source Argos Translate library.


Try it online! | API Docs

API Examples



```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: "Hello!",
    source: "en",
    target: "es"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```json
{
    "translatedText": "Hola!"
}
```

Auto Detect Language


```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: "Ciao!",
    source: "auto",
    target: "en"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```json
{
    "detectedLanguage": {
        "confidence": 83,
        "language": "it"
    },
    "translatedText": "Bye!"
}
```

HTML (beta)


```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: '<p class="green">Hello!</p>',
    source: "en",
    target: "es",
    format: "html"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```json
{
    "translatedText": "<p class=\"green\">Hola!</p>"
}
```

Install and Run

You can run your own API server with just a few lines of setup!

Make sure you have Python installed (3.8 or higher is recommended), then simply run:

```bash
pip install libretranslate
libretranslate [args]
```

Then open a web browser to http://localhost:5000

On Ubuntu 20.04 you can also use the install script available at argosopentech/LibreTranslate-init
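Besides opening a browser, you can verify the server from code; this sketch assumes the default port and the `/languages` endpoint, which lists the installed languages:

```javascript
// Build the URL of the languages listing for a LibreTranslate instance,
// tolerating a trailing slash on the base URL.
function languagesUrl(base) {
  return base.replace(/\/+$/, "") + "/languages";
}

// Usage once the server is running (uncomment to run):
// const res = await fetch(languagesUrl("http://localhost:5000"));
// console.log(await res.json()); // e.g. [{ "code": "en", "name": "English" }, ...]
```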

Build and Run

If you want to make changes to the code, you can build from source, and run the API:

```bash
git clone https://github.com/LibreTranslate/LibreTranslate.git
cd LibreTranslate
pip install -e .
libretranslate [args]

# Or
python main.py [args]
```

Then open a web browser to http://localhost:5000

Run with Docker

Linux/macOS: `./run.sh [args]` Windows: `run.bat [args]`

Then open a web browser to http://localhost:5000

Build with Docker

docker build -f docker/Dockerfile [--build-arg with_models=true] -t libretranslate .

If you want to run the Docker image in a completely offline environment, you need to add the --build-arg with_models=true parameter. The language models are then downloaded during the image build. Otherwise, the models are downloaded on the first run of the image/container.

Run the built image:

docker run -it -p 5000:5000 libretranslate [args]

Or build and run using docker-compose:

docker-compose up -d --build

Feel free to change the docker-compose.yml file to adapt it to your deployment needs, or use an extra file for your deployment configuration.

The models are stored inside the container under /home/libretranslate/.local/share and /home/libretranslate/.local/cache. Feel free to use volumes if you do not want to redownload the models when the container is destroyed. To update the models, use the --update-models argument.


You can use hardware acceleration to speed up translations on a GPU machine with CUDA 11.2 and nvidia-docker installed.

Run this version with:

docker-compose -f docker-compose.cuda.yml up -d --build


| Argument | Description | Default | Env. name |
|----------|-------------|---------|-----------|
| --host | Set host to bind the server to | | LT_HOST |
| --port | Set port to bind the server to | 5000 | LT_PORT |
| --char-limit | Set character limit | No limit | LT_CHAR_LIMIT |
| --req-limit | Set maximum number of requests per minute per client | No limit | LT_REQ_LIMIT |
| --req-limit-storage | Storage URI to use for request limit data storage. See Flask Limiter | memory:// | LT_REQ_LIMIT_STORAGE |
| --batch-limit | Set maximum number of texts to translate in a batch request | No limit | LT_BATCH_LIMIT |
| --ga-id | Enable Google Analytics on the API client page by providing an ID | No tracking | LT_GA_ID |
| --debug | Enable debug environment | False | LT_DEBUG |
| --ssl | Whether to enable SSL | False | LT_SSL |
| --frontend-language-source | Set frontend default language - source | auto | LT_FRONTEND_LANGUAGE_SOURCE |
| --frontend-language-target | Set frontend default language - target | locale (match site's locale) | LT_FRONTEND_LANGUAGE_TARGET |
| --frontend-timeout | Set frontend translation timeout | 500 | LT_FRONTEND_TIMEOUT |
| --api-keys | Enable API keys database for per-user rate limits lookup | Don't use API keys | LT_API_KEYS |
| --api-keys-db-path | Use a specific path inside the container for the local database. Can be absolute or relative | db/api_keys.db | LT_API_KEYS_DB_PATH |
| --api-keys-remote | Use this remote endpoint to query for valid API keys instead of using the local database | Use local API key database | LT_API_KEYS_REMOTE |
| --get-api-key-link | Show a link in the UI where to direct users to get an API key | Don't show a link | LT_GET_API_KEY_LINK |
| --require-api-key-origin | Require use of an API key for programmatic access to the API, unless the request origin matches this domain | No restrictions on domain origin | LT_REQUIRE_API_KEY_ORIGIN |
| --require-api-key-secret | Require use of an API key for programmatic access to the API, unless the client also sends a secret match | No secrets required | LT_REQUIRE_API_KEY_SECRET |
| --shared-storage | Shared storage URI to use for multi-process data sharing (e.g. when using gunicorn) | memory:// | LT_SHARED_STORAGE |
| --load-only | Set available languages | all from argostranslate | LT_LOAD_ONLY |
| --threads | Set number of threads | 4 | LT_THREADS |
| --suggestions | Allow user suggestions | False | LT_SUGGESTIONS |
| --disable-files-translation | Disable files translation | False | LT_DISABLE_FILES_TRANSLATION |
| --disable-web-ui | Disable web UI | False | LT_DISABLE_WEB_UI |
| --update-models | Update language models at startup | False | LT_UPDATE_MODELS |
| --metrics | Enable the /metrics endpoint for exporting Prometheus usage metrics | Disabled | LT_METRICS |
| --metrics-auth-token | Protect the /metrics endpoint by allowing only clients that have a valid Authorization Bearer token | No auth | LT_METRICS_AUTH_TOKEN |
| --url-prefix | Add prefix to URL | / | LT_URL_PREFIX |

Note that each argument has an equivalent environment variable that can be used instead. The environment variables override the default values but have lower priority than command-line arguments; they are particularly useful with Docker. The environment variable names are the upper-snake-case of the equivalent command argument's name with an LT prefix.
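As a sketch of the naming rule just described, converting a command argument to its environment variable name is simply upper-snake-casing with an LT_ prefix (the helper name here is our own):

```javascript
// Convert a CLI flag like "--char-limit" to its env variable name "LT_CHAR_LIMIT".
// argToEnvName is an illustrative helper, not part of LibreTranslate itself.
function argToEnvName(arg) {
  return "LT_" + arg.replace(/^--/, "").replace(/-/g, "_").toUpperCase();
}

console.log(argToEnvName("--char-limit"));        // LT_CHAR_LIMIT
console.log(argToEnvName("--req-limit-storage")); // LT_REQ_LIMIT_STORAGE
```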



Update

If you installed with pip:

pip install -U libretranslate

If you're using docker:

docker pull libretranslate/libretranslate

Language Models

Start the program with the --update-models argument. For example: libretranslate --update-models or ./run.sh --update-models.

Alternatively you can also run the scripts/install_models.py script.

Run with WSGI and Gunicorn

```bash
pip install gunicorn
gunicorn --bind 'wsgi:app'
```

You can pass application arguments directly to Gunicorn via:

```bash
gunicorn --bind 'wsgi:app(api_keys=True)'
```

Run with Kubernetes

See "LibreTranslate: your own translation service on Kubernetes" by JM Robles

Manage API Keys

LibreTranslate supports per-user limit quotas: for example, you can issue API keys to users so that they can enjoy higher request limits per minute (if you also set --req-limit). By default all users are rate-limited based on --req-limit, but passing an optional api_key parameter to the REST endpoints allows a user to enjoy higher request limits.

To use API keys simply start LibreTranslate with the --api-keys option. If you modified the API keys database path with the option --api-keys-db-path, you must specify the path with the same argument flag when using the ltmanage keys command.
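On the client side, using a key is just one extra field in the request body. A sketch (the key value is a placeholder, and `withApiKey` is our own helper name):

```javascript
// Attach an api_key field to a standard /translate payload.
// withApiKey is an illustrative helper; the key value below is a placeholder.
function withApiKey(payload, apiKey) {
  return { ...payload, api_key: apiKey };
}

const body = withApiKey({ q: "Hello!", source: "en", target: "es" }, "YOUR-API-KEY");
// await fetch("http://localhost:5000/translate", {
//   method: "POST",
//   body: JSON.stringify(body),
//   headers: { "Content-Type": "application/json" }
// });
```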

Add New Keys

To issue a new API key with 120 requests per minute limits:

ltmanage keys add 120

If you changed the API keys database path:

ltmanage keys --api-keys-db-path path/to/db/dbName.db add 120

Remove Keys

ltmanage keys remove <api-key>

View Keys

ltmanage keys

Prometheus Metrics

LibreTranslate has Prometheus exporter capabilities when you pass the --metrics argument at startup (disabled by default). When metrics are enabled, a /metrics endpoint is mounted on the instance:


```
# HELP libretranslate_http_requests_in_flight Multiprocess metric
# TYPE libretranslate_http_requests_in_flight gauge
libretranslate_http_requests_in_flight{api_key="",endpoint="/translate",request_ip=""} 0.0
# HELP libretranslate_http_request_duration_seconds Multiprocess metric
# TYPE libretranslate_http_request_duration_seconds summary
libretranslate_http_request_duration_seconds_count{api_key="",endpoint="/translate",request_ip="",status="200"} 0.0
libretranslate_http_request_duration_seconds_sum{api_key="",endpoint="/translate",request_ip="",status="200"} 0.0
```

You can then configure prometheus.yml to read the metrics:

```yaml
scrape_configs:
  - job_name: "libretranslate"

    # Needed only if you use --metrics-auth-token
    # authorization:
    #   credentials: "mytoken"

    static_configs:
      - targets: ["localhost:5000"]
```

To secure the /metrics endpoint you can also use --metrics-auth-token mytoken.
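For a quick manual check of a token-protected endpoint outside Prometheus, send the token as a bearer credential. A sketch with a placeholder token and an illustrative helper name:

```javascript
// Build request headers for /metrics; empty when no auth token is configured.
// metricsHeaders is an illustrative helper; "mytoken" is a placeholder value.
function metricsHeaders(token) {
  return token ? { Authorization: "Bearer " + token } : {};
}

// Uncomment to run against a live instance:
// const res = await fetch("http://localhost:5000/metrics", { headers: metricsHeaders("mytoken") });
// console.log(await res.text()); // Prometheus text exposition format
```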

If you use Gunicorn, make sure to create a directory for storing multiprocess data metrics and set PROMETHEUS_MULTIPROC_DIR:

```bash
mkdir -p /tmp/prometheus_data
rm /tmp/prometheus_data/*
export PROMETHEUS_MULTIPROC_DIR=/tmp/prometheus_data
gunicorn -c scripts/ --bind 'wsgi:app(metrics=True)'
```

Language Bindings

You can call the LibreTranslate API through community-maintained bindings for several programming languages.

Discourse Plugin

You can use this discourse translator plugin to translate Discourse topics. To install it simply modify /var/discourse/containers/app.yml:

```yaml
## Plugins go here
## see https://meta.discourse.org/t/19157 for details
hooks:
  after_code:
    - exec:
        cd: $home/plugins
        cmd:
          - git clone https://github.com/discourse/docker_manager.git
          - git clone [TRANSLATOR_PLUGIN_REPO_URL]
```
Then issue ./launcher rebuild app. From Discourse's admin panel, select "LibreTranslate" as the translation provider and set the relevant endpoint configurations.

Mobile Apps

Web browser


This is a list of public LibreTranslate instances, some require an API key. If you want to add a new URL, please open a pull request.


TOR/i2p Mirrors


Adding New Language Models

To add new languages you first need to train an Argos Translate model. See this video for details.

First you need to collect data (for example from Opus), then add that data to data-index.json in the Argos Train repo.


The LibreTranslate web UI is available in every language that LibreTranslate can translate to. It can even (roughly) translate itself! Some languages may not appear in the UI because they haven't been reviewed by a human yet. You can enable all languages by turning on --debug mode.

To help improve or review the UI translations:

 "name": "<Language>",
 "reviewed": true <-- Change this from false to true

UI Languages

| Language | Reviewed | Weblate Link |
|----------|----------|--------------|
| Arabic | | Edit |
| Azerbaijani | | Edit |
| Chinese | | Edit |
| Czech | | Edit |
| Danish | | Edit |
| Dutch | | Edit |
| English | ✔️ | Edit |
| Esperanto | | Edit |
| Finnish | | Edit |
| French | | Edit |
| German | ✔️ | Edit |
| Greek | | Edit |
| Hebrew | | Edit |
| Hindi | | Edit |
| Hungarian | | Edit |
| Indonesian | | Edit |
| Irish | | Edit |
| Italian | ✔️ | Edit |
| Japanese | | Edit |
| Kabyle | | Edit |
| Korean | | Edit |
| Occitan | | Edit |
| Persian | | Edit |
| Polish | | Edit |
| Portuguese | | Edit |
| Russian | ✔️ | Edit |
| Slovak | | Edit |
| Spanish | ✔️ | Edit |
| Swedish | | Edit |
| Turkish | | Edit |
| Ukrainian | | Edit |
| Vietnamese | | Edit |


Roadmap

Help us by opening a pull request!

- [x] A docker image (thanks @vemonet !)
- [x] Auto-detect input language (thanks @vemonet !)
- [x] User authentication / tokens
- [ ] Language bindings for every computer language
- [ ] Improved translations


FAQ and Troubleshooting

Can I use your API server at libretranslate.com for my application in production?

In short, no. You need to buy an API key. You can always run LibreTranslate for free on your own server of course.

Can I use LibreTranslate behind a reverse proxy, like Apache2 or Caddy?

Yes, here are config examples for Apache2 and Caddy that redirect a subdomain (with an HTTPS certificate) to LibreTranslate running in a Docker container on localhost:

```bash
sudo docker run -ti --rm -p libretranslate/libretranslate
```

You can remove `` from the above command if you want to be able to access it from domain.tld:5000, in addition to subdomain.domain.tld (this can be helpful when determining whether an issue lies with Apache2 or with the Docker container).

Add --restart unless-stopped if you want this docker to start on boot, unless manually stopped.

Apache config

Replace [YOUR_DOMAIN] with your full domain; for example, translate.domain.tld or libretranslate.domain.tld.

Remove # on the ErrorLog and CustomLog lines to log requests.


```apache
# Redirect http to https
<VirtualHost *:80>
    ServerName http://[YOUR_DOMAIN]
    Redirect / https://[YOUR_DOMAIN]
    # ErrorLog ${APACHE_LOG_DIR}/error.log
    # CustomLog ${APACHE_LOG_DIR}/tr-access.log combined
</VirtualHost>

<VirtualHost *:443>
    ServerName https://[YOUR_DOMAIN]

    ProxyPass /
    ProxyPassReverse /
    ProxyPreserveHost On

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/[YOUR_DOMAIN]/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/[YOUR_DOMAIN]/privkey.pem
    SSLCertificateChainFile /etc/letsencrypt/live/[YOUR_DOMAIN]/fullchain.pem

    # ErrorLog ${APACHE_LOG_DIR}/tr-error.log
    # CustomLog ${APACHE_LOG_DIR}/tr-access.log combined
</VirtualHost>
```

Add this to an existing site config, or a new file in /etc/apache2/sites-available/new-site.conf and run sudo a2ensite new-site.conf.

To get a HTTPS subdomain certificate, install certbot (snap), run sudo certbot certonly --manual --preferred-challenges dns and enter your information (with subdomain.domain.tld as the domain). Add a DNS TXT record with your domain registrar when asked. This will save your certificate and key to /etc/letsencrypt/live/{subdomain.domain.tld}/. Alternatively, comment the SSL lines out if you don't want to use HTTPS.

Caddy config

Replace [YOUR_DOMAIN] with your full domain; for example, translate.domain.tld or libretranslate.domain.tld.

```
[YOUR_DOMAIN] {
    reverse_proxy localhost:5000
}
```

Add this to an existing Caddyfile or save it as Caddyfile in any directory and run sudo caddy reload in that same directory.

NGINX config

Replace [YOUR_DOMAIN] with your full domain; for example, translate.domain.tld or libretranslate.domain.tld.

Remove # on the access_log and error_log lines to disable logging.

```nginx
server {
  listen 80;
  server_name [YOUR_DOMAIN];
  return 301 https://$server_name$request_uri;
}

server {
  listen 443 http2 ssl;
  server_name [YOUR_DOMAIN];

  #access_log off;
  #error_log off;

  # SSL Section
  ssl_certificate /etc/letsencrypt/live/[YOUR_DOMAIN]/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/[YOUR_DOMAIN]/privkey.pem;

  ssl_protocols TLSv1.2 TLSv1.3;

  # Using the recommended cipher suite from:

  ssl_session_timeout 10m;
  ssl_session_cache shared:MozSSL:10m;  # about 40000 sessions
  ssl_session_tickets off;

  # Specifies a curve for ECDHE ciphers.
  ssl_ecdh_curve prime256v1;
  # Server should determine the ciphers, not the client
  ssl_prefer_server_ciphers on;

  # Header section
  add_header Strict-Transport-Security  "max-age=31536000; includeSubDomains; preload" always;
  add_header Referrer-Policy            "strict-origin" always;

  add_header X-Frame-Options            "SAMEORIGIN"    always;
  add_header X-XSS-Protection           "1; mode=block" always;
  add_header X-Content-Type-Options     "nosniff"       always;
  add_header X-Download-Options         "noopen"        always;
  add_header X-Robots-Tag               "none"          always;

  add_header Feature-Policy             "microphone 'none'; camera 'none'; geolocation 'none';"  always;
  # Newer header but not everywhere supported
  add_header Permissions-Policy         "microphone=(), camera=(), geolocation=()" always;

  # Remove X-Powered-By, which is an information leak
  fastcgi_hide_header X-Powered-By;

  # Do not send nginx server header
  server_tokens off;

  # GZIP Section
  gzip on;
  gzip_disable "msie6";

  gzip_vary on;
  gzip_proxied any;
  gzip_comp_level 6;
  gzip_buffers 16 8k;
  gzip_http_version 1.1;
  gzip_min_length 256;
  gzip_types text/xml text/javascript font/ttf font/eot font/otf application/x-javascript application/atom+xml application/javascript application/json application/manifest+json application/rss+xml application/x-web-app-manifest+json application/xhtml+xml application/xml image/svg+xml image/x-icon text/css text/plain;

  location / {
      proxy_pass;
      proxy_set_header Host $http_host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto $scheme;
      client_max_body_size 0;
  }
}
```

Add this to an existing NGINX config or save it as libretranslate in the /etc/nginx/sites-enabled directory and run sudo nginx -s reload.


Credits

This work is largely possible thanks to Argos Translate, which powers the translation engine.


License

GNU Affero General Public License v3


See Trademark Guidelines
