Odd glMultiDrawArrays behaviour...

Does anyone know why glMultiDrawArrays is causing me problems when I use it
to render line-strips from a vertex buffer containing more than 65536
vertices? No crashing but the results are obviously wrong.

The line-strips are stored sequentially in a single vertex buffer which will
usually contain more than this number of vertices, depending on user
interaction. So it's possible for the "first" array to contain vertex
indices greater than 65536. Could the indices be getting truncated to
16-bit values even though the input values are 32-bit? It looks like only
those line-strips referenced by indices above this limit are rendered
incorrectly; the rest, up to that point, are fine.
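
Just to illustrate the suspected truncation (a made-up value, not from my
actual data): if a 32-bit offset were squeezed into 16 bits it would wrap
around and the strip would be drawn from the wrong part of the buffer, e.g.

    GLint   first32 = 70000;
    GLshort first16 = (GLshort)first32;  /* wraps to 4464 (70000 - 65536) */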

I suspect it's a problem with glMultiDrawArrays itself rather than the data
I supply to it. I can render from the same buffer with multiple calls to
glDrawArrays, one for each strip, and the results are ok. But because the
strips are in general quite small (<100 vertices each) and there are
thousands of strips, this would mean calling glDrawArrays thousands of times
in each rendering pass, which is far from efficient.
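
For reference, my setup is roughly like the following sketch (the strip
lengths and the vertex-array/VBO binding code are omitted, and the names
are illustrative rather than my actual code):

#include <GL/gl.h>
#include <stdlib.h>

/* Strips are stored back-to-back in one vertex buffer; build the
   first/count arrays and issue a single glMultiDrawArrays call. */
void draw_strips(const GLsizei* strip_len, GLsizei num_strips)
{
    GLint*   first = malloc(num_strips * sizeof *first);
    GLsizei* count = malloc(num_strips * sizeof *count);
    GLint    offset = 0;
    GLsizei  i;

    for (i = 0; i < num_strips; i++)
    {
        first[i] = offset;          /* can legitimately exceed 65535 */
        count[i] = strip_len[i];
        offset  += strip_len[i];
    }

    /* vertex arrays are assumed to be set up and enabled already */
    glMultiDrawArrays(GL_LINE_STRIP, first, count, num_strips);

    free(first);
    free(count);
}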

I cannot find any reference to this problem either online or in the OpenGL
documentation so it may just be a limitation, if not bug, in the OpenGL
implementation used by my graphics card. But I'd be interested to hear from
anyone who can confirm this behaviour or who can shed some light on what
might be going on here. Thanks very much.

John.

John
7/8/2016 11:31:37 AM

On Fri, 08 Jul 2016 12:31:37 +0100, John Irwin wrote:

So:

a) glGetError() isn't reporting any errors, and
b) if you replace the glMultiDrawArrays() call with a call to the
function:

void myMultiDrawArrays(GLenum mode, const GLint* first,
                       const GLsizei* count, GLsizei drawcount)
{
    GLsizei i;
    for (i = 0; i < drawcount; i++)
        glDrawArrays(mode, first[i], count[i]);
}

then everything works, just inefficiently?

If that's the case, I can't see how this can be anything other than a
driver bug.

glMultiDrawArrays() isn't affected by any GL state beyond that which
also affects glDrawArrays().

Nobody
7/9/2016 5:10:21 AM

"Nobody" <nobody@nowhere.invalid> wrote in message 
news:pan.2016.07.09.05.10.20.65000@nowhere.invalid...
> On Fri, 08 Jul 2016 12:31:37 +0100, John Irwin wrote:
>
> So:
>
> a) glGetError() isn't reporting any errors, and
> b) if you replace the glMultiDrawArrays() call with a call to the
> function:
>
> void myMultiDrawArrays(GLenum mode, const GLint* first,
>                       const GLsizei* count, GLsizei drawcount)
> {
>    GLsizei i;
>    for (i = 0; i < drawcount; i++)
>        glDrawArrays(mode, first[i], count[i]);
> }
>
> then everything works, just inefficiently?

That's the gist of it...

> If that's the case, I can't see how this can be anything other than a
> driver bug.

Thanks for your feedback. I suspect it's a driver bug too. Unfortunately my 
graphics card is no longer supported so there is no prospect of an updated 
driver.

However, I've got glMultiDrawElements working with my vertex data, which is
a relief, as that function requires you to specify the data type of the
indices explicitly. The downside is that I need to work with a large index
buffer containing the consecutive integers 0, 1, 2, ... which, in a sane
world, would normally be considered redundant. But it seems I've little
choice in the matter.
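
In case it helps anyone else, the workaround looks roughly like this (a
sketch only; the names are illustrative and the index array is kept
client-side here rather than in a buffer object):

#include <GL/gl.h>
#include <stdlib.h>

/* Index array of consecutive integers 0,1,2,... with one entry per vertex;
   glMultiDrawElements then states the index type explicitly. */
void draw_strips_indexed(const GLsizei* strip_len, GLsizei num_strips,
                         GLsizei total_verts)
{
    GLuint* indices = malloc(total_verts * sizeof *indices);
    const GLvoid** starts = malloc(num_strips * sizeof *starts);
    GLsizei i, v;

    for (i = 0; i < total_verts; i++)
        indices[i] = (GLuint)i;             /* 0, 1, 2, ... */

    for (i = 0, v = 0; i < num_strips; i++)
    {
        starts[i] = indices + v;            /* first index of this strip */
        v += strip_len[i];
    }

    /* vertex arrays are assumed to be set up and enabled already */
    glMultiDrawElements(GL_LINE_STRIP, strip_len, GL_UNSIGNED_INT,
                        starts, num_strips);

    free(starts);
    free(indices);
}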

John.

John
7/10/2016 9:43:44 AM