GLSL 3.30 is not supported (Ubuntu 12.04, Intel HD Graphics 3000 and NVIDIA Graphics with Optimus)

The system:
Dell Latitude E6520

Video Card
    Intel® HD Graphics 3000
    NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus

Ubuntu 12.04

I installed bumblebee.

I installed PyOpenGL and am following the tutorial (http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml).

Python reports:
- - - - - - - - - - -
RuntimeError: ('Shader compile failure (0): 0:1(10): error: GLSL 3.30 is not supported. Supported versions are: 1.00 ES, 1.10, 1.20, and 1.30\n\n', ['#version 330\n        void main() {\n            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n        }'], GL_VERTEX_SHADER)
- - - - - - - - - - -

I know NVIDIA graphics with Optimus is not supported on Ubuntu.
But I think the Intel graphics should support the latest version of OpenGL.

(1) What should I do? Can I update something, such as drivers, to make the Intel graphics support GLSL 3.30?

(2) If I cannot, how can I use a lower version of OpenGL in PyOpenGL?
On http://pyopengl.sourceforge.net/, it says:
PyOpenGL 3.0.2 includes support for:
    OpenGL v1.1 through 4.3

So there should be an option to set OpenGL to a lower version, but I failed to find the way to do it.

Help!! Thanks in advance!
codingpotatolinda
1/24/2013 2:05:05 AM

(1) >> So there should be an option to set OpenGL to a lower version, but I failed to find the way to do it.

I found the option in the code: you need to specify the version in the shader source.
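
For example, a minimal sketch based on the tutorial's shader_1 code (the exact version number should be whatever your driver reports as supported, and it still has to run with a GL context current, e.g. inside the tutorial's OnInit):

    from OpenGL.GL import *
    from OpenGL.GL import shaders

    # The first line of the shader string is the #version directive;
    # lowering it from 330 to a supported version (here 1.30) is the
    # "option" -- nothing else needs to change for this simple shader.
    VERTEX_SHADER = shaders.compileShader("""#version 130
        void main() {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        }""", GL_VERTEX_SHADER)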

(2) From the http://en.wikipedia.org/wiki/GLSL#Versions, the corresponding GLSL versions are:
GLSL version    OpenGL version
1.30.10         3.0
1.40.08         3.1
1.50.11         3.2
3.30.6          3.3

So it seems the Intel HD Graphics 3000 supports OpenGL 3.0.

I went to the Intel official website and the Linux graphics driver site, but failed to find the answer.
(Drivers for Linux*
http://www.intel.com/support/graphics/sb/CS-010512.htm

Linux Graphics
https://01.org/linuxgraphics/search/node/HD%20Graphics%203000)

But now it seems I will use OpenGL 3.0 and follow tutorials written for OpenGL 3.0. Can anyone recommend good ones?



Linda
1/24/2013 3:43:30 AM
I am using Bumblebee with Ubuntu on a Dell with NVIDIA, and it works.

I don't remember the details now, but you have to start the application with "optirun". Or, you simply do "optirun bash". That will make your application use the NVIDIA card instead of HD3000.

Regarding the HD3000: it doesn't support OpenGL 3.3, as you say. But you can use most functionality from OpenGL 3.3 anyway. I don't know how the Python GL binding works; you may need to check the extensions. I got most of it working, except Uniform Buffers.
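
I don't know the PyOpenGL details, but I would expect roughly something like this to work for the check (it needs a current GL context, and GL_ARB_uniform_buffer_object is just the extension behind the Uniform Buffers I mentioned):

    from OpenGL.GL import glGetString, GL_EXTENSIONS

    # On a 3.0/compatibility context the extension list is one
    # space-separated string; glGetString returns bytes on Python 3.
    extensions = (glGetString(GL_EXTENSIONS) or b"").split()
    print(b"GL_ARB_uniform_buffer_object" in extensions)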

On Thursday, 24 January 2013 04:43:30 UTC+1, Linda Li wrote:
> So it seems the Intel HD Graphics 3000 supports OpenGL 3.0.
> But now it seems I will use OpenGL 3.0 and follow tutorials written for OpenGL 3.0. Can anyone recommend good ones?
Lars Pensjö
1/24/2013 2:15:20 PM
On Thursday, January 24, 2013 8:15:20 AM UTC-6, Lars Pensjö wrote:
> I am using Bumblebee with Ubuntu on a Dell with NVIDIA, and it works.

What language do you use?
I am trying it in Python, using PyOpenGL, but the tutorial is not that friendly to newbies. I am thinking of switching to another one.


> 
> I don't remember the details now, but you have to start the application with "optirun". Or, you simply do "optirun bash". That will make your application use the NVIDIA card instead of HD3000.
> 

I tried the following commands:
optirun bash
python
execfile("Tutoiral1_OpenGLContextPython.py")

# the code is from http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml

RuntimeError: ('Shader compile failure (0): 0(3) : error C7533: global variable gl_ModelViewProjectionMatrix is deprecated after version 120\n0(3) : error C7533: global variable gl_Vertex is deprecated after version 120\n', ['#version 330\n        void main() {\n            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n        }'], GL_VERTEX_SHADER)

If I change 330 to 130, it works.
It's just that whenever I close the OpenGL window, it prints:
[VGL] ERROR: in getglxdrawable--
[VGL]    177: Window has been deleted by window manager


> 
> Regarding the HD3000: it doesn't support OpenGL 3.3, as you say. But you can use most functionality from OpenGL 3.3 anyway. I don't know how the Python GL binding works; you may need to check the extensions. I got most of it working, except Uniform Buffers.
> 

If the HD3000 supports OpenGL 3.0, why should I use optirun if I am fine with OpenGL 3.0? However, from the experiment above, if I do not run "optirun bash", the Python code simply does not work. Why?

Linda
1/24/2013 9:36:45 PM
On Thursday, January 24, 2013 3:36:45 PM UTC-6, Linda Li wrote:
> If the HD3000 supports OpenGL 3.0, why should I use optirun if I am fine with OpenGL 3.0? However, from the experiment above, if I do not run "optirun bash", the Python code simply does not work. Why?

Sorry, I think I got myself a bit confused and my head is spinning a little.
Actually I tested it: if I do not run "optirun bash" and just change version 330 to 130, it works.

So what is the advantage of running "optirun bash"?
And does the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus support OpenGL 3.3?

I used the command:
glxinfo | more
and found:
OpenGL version string: 3.0 Mesa 8.0.4
OpenGL shading language version string: 1.30
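
For what it's worth, the same strings can presumably be read from inside the PyOpenGL program once a context exists; a minimal sketch:

    from OpenGL.GL import glGetString, GL_VERSION, GL_SHADING_LANGUAGE_VERSION

    # Needs a current GL context, e.g. inside the tutorial's OnInit/Render;
    # this should match the glxinfo lines above ("3.0 Mesa 8.0.4" / "1.30").
    print(glGetString(GL_VERSION))
    print(glGetString(GL_SHADING_LANGUAGE_VERSION))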



Linda
1/24/2013 9:57:41 PM
On 24/01/2013 22:57, Linda Li wrote:
> And does the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus support OpenGL 3.3?
>
> I used the command:
> glxinfo|more
> finding out:
> OpenGL version string: 3.0 Mesa 8.0.4
> OpenGL shading language version string: 1.30
Your Mesa driver does not support OpenGL 3.3.
See http://www.mesa3d.org/faq.html

-
Fabien
Fabien
1/25/2013 8:17:01 AM
Thanks. I am just confused:
So it seems we depend on two things: one is the hardware, the other is the software.

Actually I also installed PyOpenGL 3.0.2, which includes support for:
    OpenGL v1.1 through 4.3

What is its relationship with Mesa?

Does the NVIDIA NVS™ 4200M Discrete Graphics with Optimus support OpenGL 3.3?



On Friday, January 25, 2013 2:17:01 AM UTC-6, Fabien R wrote:
> Your Mesa driver does not support OpenGL 3.3.
> See http://www.mesa3d.org/faq.html

Linda
1/25/2013 5:36:42 PM
According to the Wikipedia article on OpenGL: "The OpenGL specification describes an abstract API for drawing 2D and 3D graphics."
It seems different OSes have different libraries that implement the API?

And Ubuntu only has the Mesa 3D graphics library to implement OpenGL, and the latest version only supports OpenGL 3.1.

So to run OpenGL apps, there are two factors to consider:
the hardware and its driver + the OS's OpenGL libraries

For Ubuntu, I can only use OpenGL up to version 3.1.



On Friday, January 25, 2013 11:36:42 AM UTC-6, Linda Li wrote:
> What is its relationship with Mesa?
> Does the NVIDIA NVS™ 4200M Discrete Graphics with Optimus support OpenGL 3.3?

Linda
1/25/2013 6:40:07 PM
I am no expert on this; someone please correct me if I got it wrong.

As far as I know, the Mesa driver is an open source driver. However, there are also proprietary OpenGL drivers available for Ubuntu. The NVIDIA and AMD drivers support OpenGL 4.2, and there may already be pre-releases available for 4.3.

On Friday, 25 January 2013 19:40:07 UTC+1, Linda Li wrote:
> So to run OpenGL apps, there are two factors to consider:
> the hardware and its driver + the OS's OpenGL libraries
> 
> For Ubuntu, I can only use OpenGL up to version 3.1.

Lars Pensjö
1/31/2013 9:48:44 AM
Thanks. So this is exactly my question from post 4 of this topic:
--------------------
I tried the following commands:
optirun bash
python
execfile("Tutoiral1_OpenGLContextPython.py")

# the code is from http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml

RuntimeError: ('Shader compile failure (0): 0(3) : error C7533: global variable gl_ModelViewProjectionMatrix is deprecated after version 120\n0(3) : error C7533: global variable gl_Vertex is deprecated after version 120\n', ['#version 330\n        void main() {\n            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n        }'], GL_VERTEX_SHADER)

If I change 330 to 130, it works.
--------------------
Whether I use "optirun bash" or not, the situation is the same: that is: it=
 seems it supports 130 not 330.

So my question is:=20
(a) If NVIDIA=AE NVSTM 4200M (DDR3 512MB) Discrete Graphics with Optimus  s=
upports OpenGL 3.3,
when I use optirun bash, it should work.
But it not. Why?

(b) Or maybe NVIDIA=AE NVSTM 4200M (DDR3 512MB) Discrete Graphics with Opti=
mus does not support OpenGL 3.3.
But in my google searching, it is not the case.


On Thursday, January 31, 2013 3:48:44 AM UTC-6, Lars Pensjö wrote:
> As far as I know, the Mesa driver is an open source driver. However, there are also proprietary OpenGL drivers available for Ubuntu. The NVIDIA and AMD drivers support OpenGL 4.2, and there may already be pre-releases available for 4.3.
Linda
1/31/2013 8:53:24 PM
On Thu, 31 Jan 2013 12:53:24 -0800, Linda Li wrote:

> error C7533: global variable gl_ModelViewProjectionMatrix is deprecated
> after version 120.

> If I change 330 to 130, it works. 
> --------------------
> Whether I use "optirun bash" or not, the situation is the same: that is:
> it seems it supports 130 not 330.

No, the situation is that your code is GLSL 1.2, not GLSL 3.3. If it
didn't support GLSL 3.3, you'd get a different error message (something
along the lines of "version 330 not supported").

Newer versions of GLSL are not necessarily backward compatible with older
versions, so if you declare your code as being GLSL 3.3 but it uses
features which are no longer present, you will get an error such as the
one above.

The compatibility uniforms (e.g. gl_ModelViewProjectionMatrix) should
still be supported in 3.3 (and even 4.3), provided that you are using a
compatibility profile context. If you're using a core profile context,
you'll have to avoid using deprecated features.
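
For illustration, a core-profile version of that tutorial shader has to declare its own vertex input and matrix instead of the compatibility built-ins. A rough sketch in the tutorial's PyOpenGL style (the names mvp and position are made up here, and you'd have to supply the matrix and the vertex data yourself, e.g. with glUniformMatrix4fv and a VBO):

    from OpenGL.GL import *
    from OpenGL.GL import shaders

    VERTEX_SHADER = shaders.compileShader("""#version 330 core
        uniform mat4 mvp;   // application-supplied ModelViewProjection matrix
        in vec4 position;   // application-supplied per-vertex attribute
        void main() {
            gl_Position = mvp * position;
        }""", GL_VERTEX_SHADER)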

Nobody
1/31/2013 9:47:10 PM
The code is for GLSL 3.3.
As mentioned in my first post, I followed the code in the tutorial (http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml) 
- - - - - -
        VERTEX_SHADER = shaders.compileShader("""#version 330
        void main() {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        }""", GL_VERTEX_SHADER)
- - - - - -

I meant, when I change "330" to "130", it works.

Actually I thought I had found the answer:
which GLSL version you can run depends on the hardware, the hardware driver, and the OpenGL libraries installed on the OS.

So in one of my posts I made a note of what I learned through online searching:
is it related to Ubuntu using Mesa OpenGL, which currently only supports up to OpenGL 3.1?


On Thursday, January 31, 2013 3:47:10 PM UTC-6, Nobody wrote:
> No, the situation is that your code is GLSL 1.2, not GLSL 3.3. If it
> didn't support GLSL 3.3, you'd get a different error message (something
> along the lines of "version 330 not supported").

Linda
2/1/2013 5:03:28 PM
On Fri, 01 Feb 2013 09:03:28 -0800, Linda Li wrote:

> The code is for GLSL 3.3.

> - - - - - -
>         VERTEX_SHADER = shaders.compileShader("""#version 330
>         void main() {
>             gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
>         }""", GL_VERTEX_SHADER)
> - - - - - -

Actually, it's for "3.3 compatibility profile". It won't work in "3.3 core
profile" because both gl_Vertex and gl_ModelViewProjectionMatrix are
only available in the compatibility profile.

> I meant, when I change "330" to "130", it works.

I know. gl_Vertex and gl_ModelViewProjectionMatrix exist (and are not
deprecated) in 1.3, so it's safe to use them with "#version 130".

One factor which I overlooked in my previous reply is that a #version
statement also selects a profile. If a profile isn't specified explicitly,
it defaults to "core" (at least, that's what the standard specifies).

You might try using:

	#version 330 compatibility

to select the compatibility profile instead.
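
In the tutorial's Python code that just means changing the first line of the string passed to compileShader; a minimal sketch with everything else unchanged:

    from OpenGL.GL import *
    from OpenGL.GL import shaders

    VERTEX_SHADER = shaders.compileShader("""#version 330 compatibility
        void main() {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        }""", GL_VERTEX_SHADER)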

Nobody
2/3/2013 2:46:48 AM
Thank you.

It seems "compatibility" does not work.
When I changed the .py file as you instructed, the error information is as below:
RuntimeError: ('Shader compile failure (0): 0:1(15): preprocessor error: syntax error, unexpected IDENTIFIER, expecting NEWLINE\n', ['#version 330 compatibility\n        void main() {\n            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n        }'], GL_VERTEX_SHADER)


And actually I have a question:
Given that the supported version of Mesa OpenGL in Ubuntu is 3.1, do you think that, no matter how powerful the graphics card is, all OpenGL programs on Ubuntu can only use up to 3.1?




On Saturday, February 2, 2013 8:46:48 PM UTC-6, Nobody wrote:
> You might try using:
> 
> 	#version 330 compatibility
> 
> to select the compatibility profile instead.

Linda
2/4/2013 6:51:43 PM
On Mon, 04 Feb 2013 10:51:43 -0800, Linda Li wrote:

> It seems "compatibility" does not work. When I changed the .py file as you
> instructed, the error information is as below: RuntimeError: ('Shader
> compile failure (0): 0:1(15): preprocessor error: syntax error, unexpected
> IDENTIFIER, expecting NEWLINE\n', ['#version 330 compatibility\n       
> void main() {\n            gl_Position = gl_ModelViewProjectionMatrix *
> gl_Vertex;\n        }'], GL_VERTEX_SHADER)

That appears to be a bug in the GLSL preprocessor. It works fine with an
ATI card on Windows (the only suitable test system I have available), and
the standard is fairly clear on this (the #version directive supports a
profile in all versions from 1.5 onwards).

> And actually I have a question:
> Do you think with the supported version of mesa OpenGL in Ubuntu is 3.1,
> no matter how the hardware graphic card is powerful, all OpenGL programs
> in Ubuntu can only support up to 3.1?

The version of libGL.so shouldn't affect GLSL, as it should just pass the
source code directly to the driver. Using functions which don't exist in
3.1 would be a problem, as those functions don't exist in the libGL.so
library.
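
If you want to check at run time whether a particular entry point made it through, I believe PyOpenGL lets you truth-test the wrapped function object before calling it; a rough sketch (with a context current):

    from OpenGL.GL import glGenVertexArrays

    # bool() on a PyOpenGL function is supposed to report whether the
    # entry point could be resolved for the current context/driver.
    if bool(glGenVertexArrays):
        print("glGenVertexArrays is available in this context")
    else:
        print("glGenVertexArrays is NOT available in this context")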

However, if the back-end driver is based upon Mesa (as opposed to a
proprietary driver), that would explain it; the Mesa 9.0 source code
doesn't recognise the profile part of a #version directive:

http://cgit.freedesktop.org/mesa/mesa/tree/src/glsl/glsl_parser.yy?h=9.0#n254

This appears to be fixed in the trunk version (9.1):

http://cgit.freedesktop.org/mesa/mesa/tree/src/glsl/glsl_parser.yy#n263

But IIRC, nVidia provides their own libGL.so (other drivers just provide
the kernel DRM module and the X11 DRI module), so Mesa shouldn't be
involved when using the proprietary nVidia driver.
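
One way to see which libGL/driver actually ended up serving your context is to print the vendor and renderer strings from inside the program and compare the output with and without optirun; a rough sketch:

    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

    # With a context current: the Mesa/Intel stack typically reports
    # something like "Intel Open Source Technology Center" / "Mesa DRI
    # Intel(R) Sandybridge Mobile", while the proprietary driver reports
    # "NVIDIA Corporation".
    print(glGetString(GL_VENDOR))
    print(glGetString(GL_RENDERER))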

Nobody
2/4/2013 9:09:57 PM
(1)
I apologize. When I tried your suggestion (changing #version 330 to #version 330 compatibility), I did not include the command "optirun bash".

Now I have included it, and it works (you were right)!

Thanks a lot! 

(2)
So from your explanation, my understanding is:
for GLSL, if we use a proprietary graphics card driver, we do not need to install any OpenGL libraries, since "it should just pass the source code directly to the driver"?

But now I am confused about the difference between graphics card drivers and installed OpenGL libraries.

If we have proprietary graphics card drivers, do we still need to install Mesa on Linux?
If so (I think we do), why?


Thanks in advance.


On Monday, February 4, 2013 3:09:57 PM UTC-6, Nobody wrote:
> The version of libGL.so shouldn't affect GLSL, as it should just pass the
> source code directly to the driver. Using functions which don't exist in
> 3.1 would be a problem, as those functions don't exist in the libGL.so
> library.
> 
> But IIRC, nVidia provides their own libGL.so (other drivers just provide
> the kernel DRM module and the X11 DRI module), so Mesa shouldn't be
> involved when using the proprietary nVidia driver.

Linda
2/5/2013 12:03:36 AM
On Mon, 04 Feb 2013 16:03:36 -0800, Linda Li wrote:

> (2)
> So from your explanation, my understanding is: for GLSL, if we use
> proprietary graphic card driver, we do not need to install any OpenGL
> libraries, as "it should just pass the source code directly to the
> driver"?
> 
> But now I am confused about the difference between graphic card drivers
> and installed OpenGL libraries.
> 
> If we have proprietary graphic card drivers, do we need to install mesa in
> Linux? If so (I think it is), why?

You need libGL.so, as that's what the application talks to. I.e. that's
where OpenGL functions like glUseProgram() are defined.

But libGL is basically just a conduit. It either encodes the commands as
GLX protocol and passes them to the X server, which then passes them to
the driver (indirect rendering), or it passes the commands directly to the
video driver (direct rendering).

Mesa provides both libGL and various open-source video drivers. Mesa's
libGL works with any of Mesa's drivers, as well as for indirect rendering.

Proprietary drivers often include a separate libGL, which may or may not
work with indirect rendering, and almost certainly won't work if you have
multiple video cards from different vendors and want to use them
simultaneously from the same process. Actually, just being able to switch
from one to the other without having to uninstall and reinstall drivers
isn't always straightforward.

On the plus side, proprietary versions of libGL tend to be ABI-compatible
with the Mesa version and each other, so you don't need separate versions
of each OpenGL-based program.

Nobody
2/6/2013 12:45:39 AM
This is very informative and clears up my confusion.
Thanks a lot!

On Tuesday, February 5, 2013 6:45:39 PM UTC-6, Nobody wrote:
> But libGL is basically just a conduit. It either encodes the commands as
> GLX protocol and passes them to the X server, which then passes them to
> the driver (indirect rendering), or it passes the commands directly to the
> video driver (direct rendering).

Linda
2/6/2013 5:22:07 PM