Correct character/string width detection in console clients #10592
Labels
Issue-Feature: Complex enough to require an in depth planning process and actual budgeted, scheduled work.
Resolution-Duplicate: There's another issue on the tracker that's pretty much the same thing.
Description of the new feature/enhancement
This is not really a feature request, more of a question / thinking out loud, but I haven't found a better section on the new issue page. Please move / retag if needed.
It is not uncommon for text mode apps to organise and display data in a table-like way with multiple columns. To do so, for any arbitrary string an app should be able to calculate its visible length, i.e. the number of screen cells occupied, and truncate it or pad it with spaces to fit into the desired column.
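For concreteness, the column-fitting step looks roughly like the sketch below. `fit_to_column()` and `visible_width()` are hypothetical names invented for illustration; the placeholder width function is exactly the naive assumption discussed next, and getting it right is the whole problem.

```cpp
#include <cstddef>
#include <string>
#include <string_view>

// Hypothetical per-character width. The placeholder body is the naive
// one-cell-per-wchar_t assumption criticized in the next paragraph.
std::size_t visible_width(wchar_t /*ch*/)
{
    return 1;
}

// Truncate or pad `text` so that it occupies exactly `column_width` cells.
// Iterating wchar_t by wchar_t is itself a simplification: it ignores
// surrogate pairs and grapheme clusters.
std::wstring fit_to_column(std::wstring_view text, std::size_t column_width)
{
    std::wstring result;
    std::size_t used = 0;
    for (wchar_t ch : text)
    {
        const std::size_t w = visible_width(ch);
        if (used + w > column_width)
            break;                            // truncating: the next character would overflow
        result.push_back(ch);
        used += w;
    }
    result.append(column_width - used, L' '); // padding to the column edge
    return result;
}
```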
Historically the most popular way to do so is to just take the string size in characters, e.g. `string.size()` (here and below "character" means `wchar_t`), assuming that each character occupies exactly one cell. It is extremely easy, and for the USA and Europe it usually "just works". Except when it doesn't. Sooner or later unusual characters slip through the cracks and that assumption goes out with a bang: the rendered string is actually longer (or shorter) than expected and all the following characters are shifted. And all the following lines as well. Oops. You've probably seen that already somewhere.

So, to make sure that everything works even with unusual characters, apps need to do something smarter and treat different characters differently. There are ways to do that, e.g. using external libraries or Unicode data directly. There's one, just one small problem with that approach: text mode apps don't and can't render anything directly. The actual rendering happens in a different process in an unpredictable way: the number of occupied cells could depend on the OS version, the console host, the console mode, the API used, the output codepage, the active font, the colour of the character (yes), and so on and so forth.
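For illustration, the "use Unicode data directly" route usually ends up as a wcwidth()-style classifier. The sketch below is deliberately incomplete (a handful of well-known wide ranges, one combining-mark range, no emoji handling), and, as argued next, even a complete East Asian Width table only tells you what the app thinks, not what the console host will actually do.

```cpp
#include <cstdint>

// Rough estimate of how many cells a single code point occupies.
// Not a full Unicode width implementation; just enough to show the idea.
int estimated_cell_width(char32_t cp)
{
    // Combining diacritical marks typically occupy no cell of their own.
    if (cp >= 0x0300 && cp <= 0x036F)
        return 0;

    // A few commonly "wide" ranges: Hangul Jamo, CJK ideographs and friends,
    // Hangul syllables, CJK compatibility ideographs, fullwidth forms.
    if ((cp >= 0x1100 && cp <= 0x115F) ||
        (cp >= 0x2E80 && cp <= 0x9FFF) ||
        (cp >= 0xAC00 && cp <= 0xD7A3) ||
        (cp >= 0xF900 && cp <= 0xFAFF) ||
        (cp >= 0xFF00 && cp <= 0xFF60) ||
        (cp >= 0xFFE0 && cp <= 0xFFE6))
        return 2;

    // Everything else is assumed to be one cell, which is already a lie
    // for emoji, ambiguous-width characters, etc.
    return 1;
}
```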
In other words, to do the right thing, it's not enough to fully support Unicode and take character widths, grapheme clusters etc. into account. An app needs to ask itself "what would the renderer do?" first. And it's not exactly trivial to find out. Even the methods that worked in the past, e.g. checking the OS version or querying the console font, are now deprecated and either don't work without advanced magic or don't work at all in Terminal.
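As an illustration of why the old tricks fall short: the classic console API will still answer a font query, but under ConPTY / Windows Terminal the answer generally describes the intermediary conhost rather than the terminal that is actually rendering the text. A minimal sketch, plain Win32, nothing specific to this issue:

```cpp
#include <windows.h>
#include <cstdio>

int main()
{
    HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);

    CONSOLE_FONT_INFOEX info = {};
    info.cbSize = sizeof(info);
    if (GetCurrentConsoleFontEx(out, FALSE, &info))
    {
        // FaceName / dwFontSize reflect conhost's idea of the font,
        // not necessarily what the user's terminal is really drawing with.
        wprintf(L"face: %ls, cell size: %d x %d\n",
                info.FaceName,
                static_cast<int>(info.dwFontSize.X),
                static_cast<int>(info.dwFontSize.Y));
    }
    else
    {
        wprintf(L"GetCurrentConsoleFontEx failed: %lu\n", GetLastError());
    }
    return 0;
}
```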
So, are there any reasonable ways / recommendations to predict the renderer's behaviour and say for sure "if I print this particular string, the cursor will move exactly N cells to the right"? (not even to mention RTL, which is a different PITA).
I've found one, but it must not be named in public: it's way too horrible to accidentally become a design pattern.