Discussion:
problem with unicode, offset of 27?
s***@gmail.com
2013-08-30 09:32:36 UTC
Permalink
Hi

I am running an older programme on Win 7 / 32 bit, and found that glPrint turns the text "1;1" into "LVL", "1;2" into "LVM" and so on...
Now this works on win xp, and there are still many places where it needs to work on XP.

current solution:

for i := 1 to length(text) do
text[i] := chr(ord(text[i]) - 27);

which means I have an offset of 27

but why?

WBR
Sonnich
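A quick way to confirm that the garbling is a constant shift is to compare the character codes of what is shown against what was typed. A minimal sketch in Python (not part of the original Delphi program, just the arithmetic):

```python
# Check whether "1;1" -> "LVL" and "1;2" -> "LVM" is a constant shift:
# compare the code of each shown character with the code of the one typed.
def offsets(shown: str, typed: str) -> list[int]:
    return [ord(s) - ord(t) for s, t in zip(shown, typed)]

print(offsets("LVL", "1;1"))  # [27, 27, 27]
print(offsets("LVM", "1;2"))  # [27, 27, 27]
```

Every position differs by exactly 27, which is why subtracting 27 per character repairs the text.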
R.Wieser
2013-08-30 10:29:54 UTC
Permalink
Post by s***@gmail.com
Sonnich
which means I have an offset of 27
but why?
glPrint uses a font defined by the 'wglUseFontBitmaps' call. If the font is
organized differently than expected or this call isn't given the correct
arguments you will get such an offset.

wglUseFontBitmaps(hDC, 32, 96, base); // Builds 96 Characters Starting At Character 32

(Straight from a NeHe tutorial you can find here:
http://nehe.gamedev.net/tutorial/bitmap_fonts/17002/ )

Hope that helps.
Rudy Wieser
s***@gmail.com
2013-09-03 08:37:32 UTC
Permalink
I looked at it and played around a little - the present code is as follows.

I understand it now, but I don't see what can cause it.

I tried to hardcode the font name and style from 1 (default) to 0 (ANSI) etc., but that really did not change much.

Any suggestions?

if fsBold in font.Style then
bold := FW_BOLD
else
bold := FW_DONTCARE;

f := CreateFont(Font.Size, // Height Of Font
0, // Width Of Font
0, // Angle Of Escapement
0, // Orientation Angle
bold, // Font Weight
Integer(fsItalic in Font.Style), // Italic
Integer(fsUnderline in Font.Style), // Underline
Integer(fsStrikeout in Font.Style), // Strikeout
Font.Charset, // Character Set Identifier <-THIS IS 1
OUT_TT_PRECIS, // Output Precision
CLIP_DEFAULT_PRECIS, // Clipping Precision
ANTIALIASED_QUALITY, // Output Quality
FF_DONTCARE or DEFAULT_PITCH, // Family And Pitch
PChar('Arial')); // Font Name
// PChar(Font.Name)); // Font Name <- currently removed

hDC := GetDC(Handle);
SelectObject(hDC, f);

fontListBase := glGenLists(100);
wglUseFontBitmaps(hDC, 28, 100, fontListBase);
R.Wieser
2013-09-03 10:09:50 UTC
Permalink
Hello Sonnich
Post by s***@gmail.com
I understand it now, but I don't see what can cause it
Well, in your code you use
Post by s***@gmail.com
wglUseFontBitmaps(hDC, 28, 100, fontListBase);
telling it to store the characters 28 thru 127 into an image-list. That
means that character 28 will be stored in the list at position 0, character
29 at position 1, and so on. Continuing that, character 77 ("M") will be
stored at position 49. Now if you look at your character (ASCII) table at
position 49 you will find the character "1".

And there you have your problem: you are mixing up two tables, which do not
start at the same position: at offset 0x00 in the ASCII table you have
a symbol representing the ASCII character 0x00. However, at offset 0x00 in
the FontTable list you have a symbol representing the ASCII character 28.

When you are providing a string to the glPrint command those characters are
actually converted to their ASCII values, and those are, without any
conversion, used as indices into the FontTable list you generated. As your
ASCII table starts with (a symbol representing) character 0x00 and the
FontTable starts with (a symbol representing) character 0x1C (28 decimal)
you need to convert. And that is where your subtraction comes from:
converting the ASCII indices to FontTable list indices.
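The two numbering schemes can be modeled in a few lines. A Python sketch (slot numbers counted relative to the list base; the function names are made up for illustration):

```python
FIRST = 28  # first character passed to wglUseFontBitmaps(hDC, 28, 100, ...)

def slot_for_char(c: str) -> int:
    """List slot (relative to the base) where this character's glyph is stored."""
    return ord(c) - FIRST

def glyph_in_slot(slot: int) -> str:
    """Character whose glyph is stored in the given slot."""
    return chr(FIRST + slot)

assert slot_for_char('1') == 21        # '1' (code 49) lives in slot 49-28
assert glyph_in_slot(ord('1')) == 'M'  # a raw code 49 used as a slot index
                                       # lands on the glyph for chr(28+49)
```

Using a raw character code as a slot index overshoots by exactly FIRST, which is where the expected offset of 28 comes from.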

*# Read this please #*
I must say something is odd here: You say you have a difference of 27 (the
subtraction in your posted code), where I would expect you to have a
difference of 28 (offset used in the wglUseFontBitmaps call) ... Could you
please re-check ?

By the way: the simplest solution seems to be to subtract the difference
from the "fontListBase" variable you're using. One drawback: if-and-when
your text contains a(n ASCII) character below 28 or above 127 the result
will be unknown (depending on what other DrawList you have created before
and after creating the FontTable list).
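As a side note on the 27-versus-28 puzzle: one possible explanation (an assumption, not confirmed anywhere in this thread) is that the actual value returned by glGenLists enters the calculation. glCallLists draws display list (listBase + code) for each byte; the glyph lists occupy numbers fontListBase through fontListBase+99 and hold characters 28 through 127. A Python model:

```python
def drawn_glyph(code: int, font_list_base: int, list_base: int = 0) -> str:
    # glCallLists calls display list (list_base + code); that list holds
    # the glyph for character 28 + (list_number - font_list_base).
    list_number = list_base + code
    return chr(28 + (list_number - font_list_base))

# If glGenLists happened to return 1 for this context, the net shift is
# 28 - 1 = 27, matching the "1" -> "L" symptom:
assert drawn_glyph(ord('1'), font_list_base=1) == 'L'
# With a font_list_base of 0 the shift would be the expected 28:
assert drawn_glyph(ord('1'), font_list_base=0) == 'M'
```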
Post by s***@gmail.com
Any suggestions?
The font itself is not the problem. Even better: you can remove everything
above the "fontListBase := ..." line and still get readable text (please try
it; the less code you have when trying to catch a problem, the better).
Reason? It then simply takes whatever font was already assigned to that
window/control (either a default font, or the one you specified for the
window in the resource file).

-----------------
fontListBase := glGenLists(100);
wglUseFontBitmaps(hDC, 28, 100, fontListBase);
....
glRasterPos... ;The location of the text (relative to the last translation)
glListBase(base - 28) ;Where ASCII char 0x00 would have been.
glCallLists, .... ;Draw the text (using ASCII characters)
-----------------

One question though: why did you start at 28 and not 32? ASCII 0 thru 31
are the so-called control characters, which are normally not printable
(though a number of them are).

Hope that helps.
Rudy Wieser


R.Wieser
2013-09-03 10:42:09 UTC
Permalink
Post by R.Wieser
glListBase(base - 28) ;Where ASCII char 0x00 would have been.
Ooops ....

That should of course have been
Post by R.Wieser
glListBase(fontListBase - 28) ;Where ASCII char 0x00 would have
been.

Sorry.
s***@gmail.com
2013-09-03 18:33:43 UTC
Permalink
Hi

Many thanks for your answer - I am not sure I understand it all.
Some other guy made this unit years ago, and now - in Win7/32bit - we have this problem.
I tried and played around; I think I understand it now, and at least it works on both systems.

Here it goes:

Yes, it really was 27, as you can see below.
Why they chose 28 I don't know, but (maybe) they had their reasons. However, I changed it to 32, and the code is only printing once. So I don't get this either.
We are printing numbers such as "1;2" and "4,3".

And yes, it can easily select the default font. However I keep the current font, as the default font is bigger.

My current code with comments:

procedure TViewer.Loaded;
var
f: HFONT; // CreateFont returns an HFONT, not an HWND
bold: integer;
hDC: HDC; // GetDC returns an HDC, not an HWND
begin
InitDC;
InitGL;

inherited;

if fsBold in font.Style then
bold := FW_BOLD
else
bold := FW_DONTCARE;

f := CreateFont(Font.Size, // Height Of Font
0, // Width Of Font
0, // Angle Of Escapement
0, // Orientation Angle
bold, // Font Weight
Integer(fsItalic in Font.Style), // Italic
Integer(fsUnderline in Font.Style), // Underline
Integer(fsStrikeout in Font.Style), // Strikeout
Font.Charset, // Character Set Identifier
OUT_TT_PRECIS, // Output Precision
CLIP_DEFAULT_PRECIS, // Clipping Precision
ANTIALIASED_QUALITY, // Output Quality
FF_DONTCARE or DEFAULT_PITCH, // Family And Pitch
PChar(Font.Name)); // Font Name

hDC := GetDC(Handle);
SelectObject(hDC, f); // remove this and I get a bigger font

fontListBase := glGenLists(100);
wglUseFontBitmaps(hDC, 32, 100, fontListBase); // was 28
glListBase(fontListBase-32); // added as you suggested

ReleaseDC(Handle, hDC);
ZoomAll;
DeleteObject(f);
end;

--- and ---

procedure TViewer.glPrint(text: string);
var
i: integer;
begin
if text = '' then
exit;

// for i := 1 to length(text) do
// text[i] := chr(ord(text[i]) - 27); // now obsolete

glPushAttrib(GL_LIST_BIT);
// glListBase(fontListBase - 28); // this was here
glCallLists(Length( text ), GL_UNSIGNED_BYTE, PChar( text ));
glPopAttrib(); // this was missing!
// glListBase(0); // here it was reset - I wonder if that could do it,
// as the offset used is lost. The very interesting thing here was that
// when playing around I found a way where every 2nd would be right - so
// I would get 1;1 kVL, 1;2 LVL etc. I did not understand why, but I did
// see how to solve it.
end;

Finally, many thanks - I have it working. Still, I have to learn a bit more about OpenGL.
R.Wieser
2013-09-03 21:16:06 UTC
Permalink
Post by s***@gmail.com
Many thanks for your answer - I am not sure I understand it all.
You're welcome, and don't worry about that last part ( I do not understand
it *all* either). :-)
Post by s***@gmail.com
I tried and played around I think I understand it,
at least it works on both systems.
That it now works is, for your company(?), currently the most important thing.
Post by s***@gmail.com
Yes, it really was 27 as you can see below.
Why they chose 28 I dont know, but (maybe) they had
their reasons.
Odd. With an offset of 28 the number to subtract should be the same ...
But then again, I don't know everything either.
Post by s***@gmail.com
And yes, it can easily select the default font. However I
keep the current font, as the default font is bigger.
It was just a "while debugging" suggestion, not a permanent alteration. The
less code remains to debug, the easier it is to understand what's going on.

And now the code:

That "glListBase(fontListBase-32)" should be used directly infront of the
"glCallLists" function, in between that glPush/popAttrib.

You don't have to add that "glListBase(0)" following the glPopAttrib, as
that is what the Push/Pop with the GL_LIST_BIT is for.

I also seem to be missing a glRasterPos call in your code. Without it your
text will be printed in the bottom-left corner (and the next text behind
it, and so on).


So, it should look like this:

--------------------------------
fontListBase := glGenLists(100);
wglUseFontBitmaps(hDC, 32, 96, fontListBase); // 100 changed to 96 ...
// glListBase(fontListBase-32); // <-- Not here
--------------------------------
glRasterPos2i(0, 0); // Convert current translation to char position
glPushAttrib(GL_LIST_BIT); // Save the current ListBase
glListBase(fontListBase - 32); // Set our own ListBase <-- do it here
glCallLists(Length( text ), GL_UNSIGNED_BYTE, PChar( text ));
glPopAttrib(); // Restore the old ListBase
--------------------------------

As remarked in the above, I changed that 100 to 96: 28+100 equals 32+96.
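That remark can be checked with a couple of lines of plain arithmetic (Python here, purely illustrative):

```python
# Both calls cover character ranges that end at 127:
old = range(28, 28 + 100)  # wglUseFontBitmaps(hDC, 28, 100, ...)
new = range(32, 32 + 96)   # wglUseFontBitmaps(hDC, 32, 96, ...)
assert old[-1] == new[-1] == 127
# The new range drops only the control characters 28..31 and still
# covers every printable ASCII character (32..126):
assert all(c in new for c in range(32, 127))
```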
Post by s***@gmail.com
Finally many thanks I have it working.
I'm glad you did, and it's always nice to hear that my offered help did
indeed do so.
Post by s***@gmail.com
Still, I have to learn a bit more about openGL.
You and me both. In this case it looks like I'm just a single step in front
of you -- I tackled the "draw text in an OpenGL world" problem just a few
days before. :-)

Regards,
Rudy Wieser


s***@gmail.com
2013-09-04 10:10:24 UTC
Permalink
Post by R.Wieser
That "glListBase(fontListBase-32)" should be used directly infront of the
"glCallLists" function, in between that glPush/popAttrib.
You don't have to add that "glListBase(0)" following the glPopAttrib, as
that is what the Push/Pop with the GL_LIST_BIT is for.
I tried - and it has to be where I put it, otherwise it does not work.
For XP it works, but not for Win7.

I don't know why...
Post by R.Wieser
I also seem to be missing a glRasterPos call in your code. Without it your
text will be printed in the bottom-left corner (and the next text behind
it, and so on).
The current position is fine, I guess they didn't need this.
I will keep this in mind, but as of now I will not change it.
Post by R.Wieser
--------------------------------
fontListBase := glGenLists(100);
wglUseFontBitmaps(hDC, 32, 96, fontListBase); // 100 changed to 96 ...
// glListBase(fontListBase-32); // <-- Not here
--------------------------------
glRasterPos2i(0, 0); // Convert current translation to char position
glPushAttrib(GL_LIST_BIT); // Save the current ListBase
glListBase(fontListBase - 32); // Set our own ListBase <-- do it here
// for some reason this has no effect here in my code on Win7, but on XP it works
Post by R.Wieser
glCallLists(Length( text ), GL_UNSIGNED_BYTE, PChar( text ));
glPopAttrib(); // Restore the old ListBase
--------------------------------
R.Wieser
2013-09-04 11:21:20 UTC
Permalink
Post by s***@gmail.com
I tried - and it has to be where I put it, otherwise it does not work.
For XP it works, but not for Win7.
I don't know why...
The only thing I can currently think of is that you're using, on those
versions of the OS, different versions of OpenGL ...

Regards,
Rudy Wieser