Crazy algorithm for displaying text size value
Anyone who has written any graphics or text manipulation software will know the following problem:
- Each character or object has a particular style, for example the font size, the stroke width, and so on.
- There is some user interface element, e.g. a dialog or rollup, where the user can view and edit the attributes of the currently selected object
- The user may select multiple objects
And there we have it: the user selects two pieces of text, one at 12pt and the other at 18pt, and opens the "font size" dialog. What size should it display? There are a number of solutions to this problem, none of them particularly elegant:
- Display one of the sizes, i.e. 12pt or 18pt. (It doesn't matter whether the program uses an "intelligent" algorithm to decide which size to display, be it the largest number, the smallest number, the value of the leftmost character, or the value of the character the user selected first or last: the user won't work out which algorithm is being used!)
- Display either a gray box or the text "multiple selected". Microsoft Word does this, and I suppose it's an acceptable solution. The user is clearly informed that the value of all the characters is neither 12pt nor 18pt. (A sketch of this approach follows the list.)
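To make that second approach concrete, here is a minimal sketch in Python (my own illustration, not Word's actual logic): the dialog shows a size only when every character in the selection agrees on it, and otherwise signals a mixed selection.

```python
from typing import Optional, Sequence

def size_to_display(selected_sizes: Sequence[float]) -> Optional[float]:
    """Return the common font size of the selection, or None if sizes differ."""
    unique_sizes = set(selected_sizes)
    if len(unique_sizes) == 1:
        return unique_sizes.pop()
    return None  # caller shows a gray box or the text "multiple selected"

print(size_to_display([12.0, 12.0]))  # 12.0
print(size_to_display([12.0, 18.0]))  # None -> display "multiple selected"
```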
However, amazingly enough, I came across a new solution in the graphics program Inkscape, and it's worse than any of the above! It has a certain elegance about it, which no doubt led the developers to think it was a good idea, but it's completely useless in practice.
The solution used by Inkscape is to display the average of the values. So if you have one character at 18pt and another at 12pt, the dialog displays 15pt. This is really confusing, as none of the characters you've selected is actually 15pt.
So if, for example, you have an 18pt character followed by lots of 12pt characters, then as you shift-right and select more and more of the 12pt characters, the size displayed in the dialog gradually decreases, showing various fractional values most of the time and tending towards 12pt.
In fact, that's wrong. It's even more interesting than that. In writing this post I tried it out to check what was really happening. Inkscape actually takes the average of the sizes of constant-sized spans of characters. For example, if you have an 18pt character and a 12pt character, it displays 15pt; if you have 18pt, then 12pt, then 18pt, it displays 16pt; and if you have 18pt, then 18pt, then 12pt, it displays 15pt, as there is one span of 18pt and one span of 12pt, and the average of 18pt and 12pt is 15pt.
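Here is a rough reconstruction of that behaviour in Python, based purely on the experiments above and not on Inkscape's source: collapse the selection into runs of equal size, then average one value per run.

```python
from itertools import groupby

def inkscape_style_size(selected_sizes: list[float]) -> float:
    """Average the sizes of constant-sized spans, not of individual characters."""
    span_sizes = [size for size, _run in groupby(selected_sizes)]
    return sum(span_sizes) / len(span_sizes)

print(inkscape_style_size([18.0, 12.0]))        # 15.0
print(inkscape_style_size([18.0, 12.0, 18.0]))  # 16.0 (three spans)
print(inkscape_style_size([18.0, 18.0, 12.0]))  # 15.0 (two spans: 18pt and 12pt)
```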
It's crazy stuff. It took me about 20 minutes to work out that this was the algorithm being used. With my wedding tomorrow, one would imagine I should be concentrating on things other than font size display algorithms right now!