i18n RFC #111
Conversation
rfcs/i18n.md
Outdated
and a new global Tera function will be added to get the i18n values out of the config.toml.

```jinja2
{{ i18n(key="title", lang=lang) }}
```
Maybe `trans`, to stick with Django?

`{{ trans(key="title", lang=lang) }}`
I'm fine with both
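For illustration only, here is one possible shape; the `[translations]` table layout below is an assumption, not something the RFC pins down:

```toml
# config.toml - hypothetical layout for per-language template strings
[translations.en]
title = "My site"

[translations.fr]
title = "Mon site"
```

which a template would then consume with the proposed global function:

```jinja2
<h1>{{ trans(key="title", lang=lang) }}</h1>
```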
Really looking forward to this! Thanks 👍
rfcs/i18n.md
Outdated
### Url

The default language base url will be equal to `config.base_url`.
Other languages will be available at `{config.base_url}/{language}`
I think it’s important to have the language in the URL when i18n is used.
One challenge is: when a user accesses the root URL, what should happen?
I would address this issue by:
- Copying the output folder to the root and to `/en`
- Having all the links in the page point to the path with the language prefix
- Including a head tag: `<link rel="canonical" href="https://example.com/en/">`
- Adding an optional JavaScript redirect for the files in the root folder, like:

```jinja2
{% if config.language_redirect %}
<script>
  if (navigator.languages) {
    // match_language_path is a user-supplied helper that picks the best
    // matching language folder for the visitor (see the sketch below)
    let paths = ['/en/', '/fr/', '/jp/'];
    let path = match_language_path(navigator.languages, paths, '/en/');
    window.location.replace(path);
  }
</script>
{% endif %}
```
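A minimal sketch of what such a `match_language_path` helper could do; the helper is not part of the proposal and is purely illustrative:

```js
// Hypothetical helper: return the first path whose language folder matches one
// of the visitor's preferred languages, or the fallback path otherwise.
function match_language_path(languages, paths, fallback) {
  for (const lang of languages) {
    // "fr-CH" should still match "/fr/", so compare on the primary subtag only
    const primary = lang.toLowerCase().split('-')[0];
    const candidate = '/' + primary + '/';
    if (paths.includes(candidate)) {
      return candidate;
    }
  }
  return fallback;
}
```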
I think that should be up to the users whether they want the lang code in the url for their default language
I agree it should be a choice. Zola should make it easy either way. Do you mind adding a setting to opt in to this behavior? `redirect_to_default_language`?
For reference, the `index.html` of rust-lang.org: https://github.com/rust-lang/rust-www/blob/master/_layouts/redirect.html

It is an empty page with just the JS and HTML redirect. The setting could then be `force_i18n`.
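For illustration, a root `index.html` of that kind could be as small as the sketch below; the markup is hypothetical and assumes `/en/` is the default language folder:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <!-- Tell search engines which page is the real one -->
    <link rel="canonical" href="https://example.com/en/">
    <!-- Fallback for visitors with JavaScript disabled -->
    <meta http-equiv="refresh" content="0; url=/en/">
    <script>
      // Could also negotiate against navigator.languages as in the snippet above
      window.location.replace('/en/');
    </script>
  </head>
  <body>
    <a href="https://example.com/en/">Click here if you are not redirected.</a>
  </body>
</html>
```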
rfcs/i18n.md
Outdated
The content files have to have the same name for multiple languages.
The language is defined in the extension prefix: `{name}.{language_code}.{extension}`

The language code is omitted for the default language.
The language code can be (?) omitted […]
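Concretely, with English assumed as the default language, that naming scheme gives a content tree like this (file names are illustrative):

```
content/
├── about.md       # default language
├── about.fr.md    # French translation
└── about.de.md    # German translation
```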
Hi! I work at Mozilla on Gecko Platform Intl/L10n technologies. I'm one of the core contributors to ECMA402 and one of the authors of Project Fluent. One of the projects I work on is bringing the core modern intl/l10n API to Rust, since we'd like to soon start transitioning Gecko Intl/L10n to it, which makes me interested in the Rust intl/l10n story.

I'd be happy to help you with your project and advise on designing around modern i18n/l10n systems, but I have to warn you that Rust is very immature in this area and there are not many fleshed-out libraries to choose from. On the flip side, that means that we can focus on designing and implementing modern libraries and leapfrog the last 20 years of badly designed intl solutions :)

In that spirit, while I'm trying to stay unbiased and not necessarily recommend technologies my team is working on, I will in an opinionated fashion recommend three things:
Everything else is up to you. It's going to be some work to bring full intl/l10n, but you can start with simple things like managing language negotiation between the user request, accepted headers and articles in many locales (for that I strongly recommend using locale lists when dealing with user requests, rather than a single locale). Hope that helps!
Thanks!
That's one of the reasons I hadn't started working on i18n before, hoping that it would get more tooling by now.
Gutenberg is a static site engine so it's kind of easier on that end: the language lives in the URL and that's it. No need to deal with headers, JS or anything else: all the interpolation/resolution happens at "compile time".
I maintain Tera and the oldest open issue is still Keats/tera#134 :) I'm not too sure what that would look like though; it seems more of a concern at the user level (in this case Gutenberg, Rocket, etc.). In Gutenberg's case, we only really need to translate the templates, so that's a very limited scope.
I'm no expert (my main experience in the area comes from writing several Django sites), but I'm not sure if that's correct. Imagine a scenario where the website author provides content in two locales: they provide an article in French and German (content localization). They also provide a translation of the website UI using MessageFormat. A Swiss user visits the website with their Accept-Language headers.

Unless I'm mistaken, the system will have to perform a language negotiation based on the selected locales. It'll have to parse the locale headers, collect the available locales for the content and for the UI, and negotiate the best possible fallback chain of locales for UI and content to show. But you can imagine other negotiation strategies that would pick a different combination of content and UI locales.

All of that is solved by a language negotiation API. Am I wrong assuming Gutenberg will need one?
Gutenberg outputs HTML files; you don't need a server between them and the users, so content negotiation is not necessarily possible: I use Netlify, some people host them on S3, others in git repos. Even less serverless than serverless :) The way it is usually done for static sites is that the creator selects a default lang and the users can switch later if needed. After that, nothing prevents someone from writing some JavaScript that redirects from the homepage to another language after checking `navigator.languages`.
Ahh, I see. Interesting. Thank you for taking the time to explain it to me :)
What do you think about separate files for each language? Having everyone edit the same file (config.toml) just brings merge conflicts to resolve. After working on an internationalized site with Hugo, I now think that one file per language is the minimum requirement for maintainability. To be honest, I actually found that system limiting. Allowing multiple files per language would make it easier for people to maintain translations, by grouping strings by area of interest.
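For illustration, a one-file-per-language layout could be as simple as this (the directory name and keys are hypothetical):

```toml
# i18n/fr.toml (hypothetical path) - one such file per language: en.toml, fr.toml, de.toml, ...
title = "Mon site"
read_more = "Lire la suite"
```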
If custom output formats (#365) are implemented as well, I think the issue of per-language RSS feeds would be de facto addressed: each language's sections/taxonomies could have their own RSS. On a side note, I've taken a look at Fluent and it's amazing. I didn't dig too much, but from a translator's perspective it seems to address all the issues I encountered with other internationalization systems. There's also a Rust implementation.
Definitely needed. The current
Is it that big of a deal for a static site? The content translation is done by the user, so there isn't as much to translate as for an app, for example; only the templates need to be translated.
Yes, but when making a theme with different components (widgets of sorts), that can become a lot of different strings to handle in different languages. From my (limited) perspective, I only see two ways to properly decouple content from templating:
Thinking about it, I don't even think one of the approaches contradicts the other; they even seem complementary to me.

Translation namespaces for small strings offer proper separation of concerns, make it easy to port components from one site/theme to another, and could even allow sharing translation packs the way we share themes. If you write scripts to keep track of missing translations, it's also easier to tell translators that this small file hasn't been translated (or lacks some strings) instead of pointing them to line XYZ in a single file. Even without a script, you can see directly from your file manager when a file is missing in a language.

Internal page bundles, on the other hand, would make it easier to organize actual content translation (potentially big files with separate paragraphs). To demonstrate the power of internal page bundles, imagine a multi-column widget that's content agnostic (as asked for in #289), for example to allow for easy translation of a footer with multiple columns (separated by
So in the end content translators only have to deal with Markdown files as they would for regular pages, and theme translators with a few translation strings in dedicated files that are easy to keep track of.
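As a rough illustration of the page-bundle idea (the layout below is hypothetical, not part of the RFC), such a footer widget could carry its own per-language Markdown fragments:

```
content/footer/
├── column-1.md       # default language
├── column-1.fr.md
├── column-2.md
└── column-2.fr.md
```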
I guess I would go with one file per lang to start with simply because it's easier to have something working.
Adding namespaces to the mix seems to complicate the code and the user experience for not much gain, IMO.
If you decide to use Fluent, we have a concept for that: it allows you to select a number of resources and put them together into a single context that resolves messages together.
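For readers unfamiliar with Fluent, resources are plain-text `.ftl` files holding the translatable messages; a minimal, purely illustrative one:

```
# ui.ftl - a hypothetical resource with the theme's UI strings
site-title = My site
read-more = Read more
hello-user = Hello, { $username }!
```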
For an i18n static blog, there are two scenarios for articles that are not translated: either display the article in the default language, or do not display the article at all. I think it's common not to have all articles translated, and as such it would be nice if both alternatives were offered by Zola.
Yep, that's definitely one of the required bits for i18n.
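One possible way to expose that choice, with a purely hypothetical setting name:

```toml
# Hypothetical setting, not part of the RFC as written.
# true:  untranslated pages fall back to the default-language version
# false: untranslated pages are simply not rendered for that language
fallback_to_default_language = true
```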
I updated the RFC and moved it to the new Discourse for Zola: https://zola.discourse.group/t/rfc-i18n/13

Closing this in favour of that thread.
Everyone interested in that thread can have a look at #567 for the initial implementation (only content, no translations in templates yet). It is missing taxonomies (as I have no idea what to do with them yet), links to other translations of the same content, and per-language RSS. Feedback is probably better given on https://zola.discourse.group/t/rfc-i18n/13, which avoids notifications from commits on the PR and is a better discussion medium overall. cc @batisteo @enricostano @matclab and anyone else wanting multilingual sites!