Markdown is the text format I use most frequently in my daily work, but existing translation tools often struggle to preserve its original formatting. That's why I developed md-translator, a translation tool optimized specifically for Markdown. It is designed to accurately translate content while fully preserving the original format.
md-translator currently supports parsing the following Markdown syntax elements and preserves their formatting during translation:
- `---` horizontal rules
- `#` headings
- `[text](url)` links
- `-` / `*` / `+` unordered lists
- `1. 2. 3.` ordered lists
- `**bold**` bold text
- `_italic_` italic text
- `> quote` blockquotes
Additionally, md-translator can extract the plain-text content of a document, stripping away Markdown syntax and optionally hiding elements such as links and code blocks, which is useful when only the readable text is needed.
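As a rough illustration of what such extraction involves (a standalone sketch, not md-translator's internal parser), a handful of regular expressions can strip the most common syntax listed above:

```python
import re

def markdown_to_plain_text(md: str, hide_links: bool = True) -> str:
    """Strip common Markdown syntax; a simplified sketch, not md-translator's parser."""
    text = md
    # Remove fenced code blocks entirely (the optional "hide" behaviour).
    text = re.sub(r"```.*?```", "", text, flags=re.DOTALL)
    # Drop links when hiding them, otherwise keep only the visible link text.
    text = re.sub(r"\[([^\]]*)\]\([^)]*\)", "" if hide_links else r"\1", text)
    # Strip emphasis markers, headings, blockquotes, and list markers.
    text = re.sub(r"\*\*([^*]+)\*\*", r"\1", text)                            # **bold**
    text = re.sub(r"_([^_]+)_", r"\1", text)                                  # _italic_
    text = re.sub(r"^\s{0,3}#{1,6}\s*", "", text, flags=re.MULTILINE)         # # headings
    text = re.sub(r"^\s{0,3}>\s?", "", text, flags=re.MULTILINE)              # > quotes
    text = re.sub(r"^\s{0,3}([-*+]|\d+\.)\s+", "", text, flags=re.MULTILINE)  # list markers
    return text.strip()

print(markdown_to_plain_text("# Title\n\nSee [docs](https://example.com) for **details**."))
```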
This tool supports 5 translation APIs and 5 LLM (large language model) interfaces, allowing users to choose the appropriate translation method based on their needs:
| API Type | Translation Quality | Stability | Suitable Scenarios | Free Quota |
|---|---|---|---|---|
| DeepL(X) | ★★★★★ | ★★★★☆ | Suitable for long texts; smoother translations | 500,000 characters per month |
| Google Translate | ★★★★☆ | ★★★★★ | Ideal for UI text and common phrases | 500,000 characters per month |
| Azure Translate | ★★★★☆ | ★★★★★ | Broadest language support | 2,000,000 characters per month for the first 12 months |
| GTX API (Free) | ★★★☆☆ | ★★★☆☆ | General text translation | Free |
| GTX Web (Free) | ★★★☆☆ | ★★☆☆☆ | Suitable for small-scale translation | Free |
For higher translation speed and quality, you can apply for an API Key from Google Translate, Azure Translate, or DeepL Translate. Refer to the related API application tutorial for the application process.
This tool provides access to five mainstream LLM (large language model) providers or interfaces: OpenAI, DeepSeek, Siliconflow, Groq, and Custom LLM.
The Custom LLM option allows integration with third-party services or local inference platforms (such as ollama) by configuring the API endpoint and model name. For example, a local ollama setup typically exposes its OpenAI-compatible API at `http://localhost:11434/v1/chat/completions`, and the default model used is `llama3.2`. For LM Studio, the local API endpoint is typically `http://localhost:1234/v1/chat/completions`. To achieve better translation quality, it is recommended to use `qwen2.5-14b-instruct` or a higher-performing model in the Custom LLM setup.
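Under the hood, a Custom LLM endpoint is simply an OpenAI-compatible chat-completions API. The sketch below shows what a single translation request against a local ollama server could look like; the endpoint, model name, and prompt are illustrative assumptions, not md-translator's exact request:

```python
import requests

# Assumed local OpenAI-compatible endpoint (ollama's usual default); adjust for LM Studio etc.
ENDPOINT = "http://localhost:11434/v1/chat/completions"
MODEL = "llama3.2"  # or a stronger model such as qwen2.5-14b-instruct

def translate(text: str, target_lang: str = "German") -> str:
    """Send one chunk of text to the local model and return the translation."""
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system",
             "content": f"Translate the user's text into {target_lang}. "
                        "Preserve all Markdown formatting and output only the translation."},
            {"role": "user", "content": text},
        ],
        "temperature": 0.2,
    }
    resp = requests.post(ENDPOINT, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(translate("**Hello**, see the [docs](https://example.com)."))
```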
In terms of multilingual coverage, Google, Azure, and large language models (LLMs) support translation between hundreds of languages, while DeepL currently supports only 30 mainstream languages, making it the weakest option for language coverage.
For a detailed list of supported languages, please refer to the official documentation:
For text files with contextual relationships—such as subtitles or Markdown documents—this tool automatically merges multiple lines into "chunks" for translation. The chunk size refers to the maximum number of characters per grouped block. The character limits for each translation service are as follows:
Note: Google Translate disrupts line breaks during processing, so chunked translation is not used with this service.
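Conceptually, chunk merging just packs consecutive lines into groups that stay under the per-service character limit. A minimal sketch (the limit value here is illustrative, not any specific service's quota):

```python
def merge_lines_into_chunks(lines: list[str], max_chars: int = 1500) -> list[str]:
    """Group consecutive lines into chunks no longer than max_chars characters."""
    chunks, current = [], ""
    for line in lines:
        # +1 accounts for the newline that rejoins the lines inside a chunk.
        if current and len(current) + 1 + len(line) > max_chars:
            chunks.append(current)
            current = line
        else:
            current = f"{current}\n{line}" if current else line
    if current:
        chunks.append(current)
    return chunks

doc_lines = ["# Title", "First paragraph...", "Second paragraph..."]
print(merge_lines_into_chunks(doc_lines, max_chars=40))
```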
Delay time sets the wait interval between chunk translations. When processing large volumes of text, some translation APIs may respond slowly—especially under poor network conditions or when using free interfaces. In such cases, delay settings are particularly important.
For example, when testing with Azure Translate’s free tier, it is recommended to set the delay time to 5,000 milliseconds or more to avoid empty responses or errors.
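In practice this amounts to a fixed pause between chunk requests, as in the sketch below; the 5,000 ms default mirrors the Azure free-tier suggestion above, and `translate` is the hypothetical per-chunk helper from the earlier example:

```python
import time

def translate_chunks(chunks: list[str], delay_ms: int = 5000) -> list[str]:
    """Translate chunks one by one, pausing between requests to avoid empty responses."""
    results = []
    for i, chunk in enumerate(chunks):
        results.append(translate(chunk))  # hypothetical per-chunk request, see earlier sketch
        if i < len(chunks) - 1:
            time.sleep(delay_ms / 1000)   # wait between chunk translations
    return results
```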
Setting the translation rate too high may result in empty API responses or cause requests to be flagged as abnormal. It's recommended to adjust the rate based on the specific translation service and its stability to improve success rates and maintain reliable performance.
This tool includes an optional local translation cache to improve translation efficiency and reduce resource consumption. Cached entries are keyed by the combination `source text_target language_source language_translation API_model settings`. To disable the translation cache, uncheck "Use translation cache" or click "Clear translation cache" in the API settings.
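To illustrate how such a key can work (a standalone sketch, not md-translator's actual storage format or file layout), hashing the fields above gives a stable lookup value for a simple JSON-backed cache:

```python
import hashlib
import json
import os

CACHE_FILE = "translation_cache.json"  # illustrative location, not the tool's real path

def cache_key(source_text, target_lang, source_lang, api, model_settings):
    """Build a stable key from the same fields described above."""
    raw = "_".join([source_text, target_lang, source_lang, api, model_settings])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def load_cache() -> dict:
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, encoding="utf-8") as f:
            return json.load(f)
    return {}

def save_cache(cache: dict) -> None:
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        json.dump(cache, f, ensure_ascii=False, indent=2)

# Usage: check the cache before calling the translation API, store the result afterwards.
cache = load_cache()
key = cache_key("**Hello**", "de", "en", "deepl", "default")
if key not in cache:
    cache[key] = "**Hallo**"  # placeholder; a real call would hit the chosen API here
    save_cache(cache)
print(cache[key])
```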
The tool supports translating the same file into multiple target languages in one run, which is especially useful for internationalized video content.
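Producing several target-language copies of one file is essentially a loop over target languages. The sketch below reuses the hypothetical `translate` helper from the earlier example; the subtitle filename and language list are illustrative:

```python
# Sketch: write one translated copy per target language.
targets = {"de": "German", "fr": "French", "ja": "Japanese"}
source = open("subtitles.srt", encoding="utf-8").read()

for code, name in targets.items():
    translated = translate(source, target_lang=name)  # hypothetical helper from the earlier sketch
    with open(f"subtitles.{code}.srt", "w", encoding="utf-8") as f:
        f.write(translated)
```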
When using this tool, please note the following: