Phil Race
2006-05-25 16:46:02 UTC
I have been trying to reconcile my expectation of the display length and
printed length of text using MS Gothic with what actually happens.
I have found that if the requested size (LOGFONT.lfHeight) equates to an
odd pixel size, GDI in fact uses the next highest even size: e.g. both
11 and 12 pixels measure as 12 pixels.
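
Here is a minimal sketch of how the measurement can be reproduced. It
assumes a screen DC is representative; the face name and sample string
are just the case described above:

    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    /* Measure the advance width of a sample string at a requested
       pixel size. */
    static int MeasureWidth(HDC hdc, const char *face, int pixelSize)
    {
        LOGFONTA lf = {0};
        lf.lfHeight = -pixelSize;  /* negative lfHeight requests character
                                      height in pixels */
        lf.lfCharSet = DEFAULT_CHARSET;
        lstrcpynA(lf.lfFaceName, face, LF_FACESIZE);

        HFONT hFont = CreateFontIndirectA(&lf);
        HFONT hOld  = (HFONT)SelectObject(hdc, hFont);

        const char *sample = "sample text";
        SIZE sz;
        GetTextExtentPoint32A(hdc, sample, (int)strlen(sample), &sz);

        SelectObject(hdc, hOld);
        DeleteObject(hFont);
        return (int)sz.cx;
    }

    int main(void)
    {
        HDC hdc = GetDC(NULL);  /* screen DC */
        /* On an affected font both requests yield the same width. */
        printf("11px width: %d, 12px width: %d\n",
               MeasureWidth(hdc, "MS Gothic", 11),
               MeasureWidth(hdc, "MS Gothic", 12));
        ReleaseDC(NULL, hdc);
        return 0;
    }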
I know that MS Gothic and MS Mincho are designed at 256 upem rather than
the 1024 or 2048 that is common now, because they are older fonts designed
at a time when computers were much less capable. So I wonder if there was
also a long-ago decision to save memory by scaling them only to even sizes?
This is pure speculation on my part.
So with that background, my questions are:
1. What is the actual reason for this?
2. Is this implemented as some font-specific hack?
3. How can I programmatically detect which fonts exhibit this kind of
behaviour? (A probe sketch follows below.)
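
For question 3, one possible probe is to request each pixel size in turn
and compare it against the realized TEXTMETRIC. This is my assumption, not
documented GDI behaviour: I am treating a realized tmHeight that differs
from the requested cell height as the signal that the size was rounded.

    #include <windows.h>
    #include <stdio.h>

    /* Return TRUE if GDI realized a different cell height than the
       one requested for this face. */
    static BOOL SizeIsRounded(HDC hdc, const char *face, int cellHeight)
    {
        LOGFONTA lf = {0};
        lf.lfHeight = cellHeight;  /* positive lfHeight requests the
                                      cell height in pixels */
        lf.lfCharSet = DEFAULT_CHARSET;
        lstrcpynA(lf.lfFaceName, face, LF_FACESIZE);

        HFONT hFont = CreateFontIndirectA(&lf);
        HFONT hOld  = (HFONT)SelectObject(hdc, hFont);

        TEXTMETRICA tm;
        GetTextMetricsA(hdc, &tm);

        SelectObject(hdc, hOld);
        DeleteObject(hFont);

        /* tmHeight is the cell height GDI actually realized. */
        return tm.tmHeight != cellHeight;
    }

    int main(void)
    {
        HDC hdc = GetDC(NULL);
        for (int h = 8; h <= 16; ++h)
            printf("MS Gothic %2dpx -> %s\n", h,
                   SizeIsRounded(hdc, "MS Gothic", h)
                       ? "rounded" : "exact");
        ReleaseDC(NULL, hdc);
        return 0;
    }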
-phil