I recently submitted a benchmark for Quake 3 (great site btw!), and was interested in the OpenGL features/extensions provided with my new G5 (w/Nvidia 7800GT card). Needless to say, I was more than a bit surprised to discover that only 4 texture units are reported by OS X! According to information found both on the web and straight from Nvidia themselves, there are supposed to be 20 texture units on the PCI-Express 7800 GT in my Dual-Core G5.
So I am left wondering:
A) Is this normal for OS X?
B) Are vast performance improvements in the works for the next update?
C) Have I been robbed?
I don't have access to any Apple machines with ATI cards installed to compare this to. (See replies below - the ATI X800 XT, for instance, is reported as 8, with MultiSample AA support.-Mike) Hope you don't mind that I include a screenshot with two highlighted areas. One is the RealTech VR program (free download) (OpenGL Extensions Viewer-Mike) and the other is the output from glxinfo in an xterm (bundled with OS X). Would really appreciate any input!
(he later wrote)
No multisample Anti-Aliasing (?)
(See below for an Apple reply/explanation on this.
I don't have my Nvidia 6800 Ultra card installed in the G5 anymore, but readers with Mac 6800 Ultra (AGP) and Mac 6600 (PCI-e) cards wrote that, like the 7800GT, they also list 4 texture units and no MultiSample AA support in the OpenGL Rendering Info "Sample Modes" listing. (But see below for Apple's comments that some reported items are misleading/not an indication of the hardware's limits.)
If anyone has a new G5 with the Nvidia FX4500 card installed, let me know what is reported for it as far as texture units and MultiSample AA support. (Use OpenGL Extensions Viewer.) Thanks in advance.)
Explanation from Nvidia/Apple:
(from an Apple contact)
"(Quote from above- "Needless to say I was more than a bit surprised to discover that only 4 texture units are reported by OS X!")
This is misleading; different numbers of texture units are exposed to different parts of the OpenGL pipeline.
4 texture units are available to the old "fixed function" pipeline, while 16 are available to ARB_fragment_program & GLSL.
This is exactly the same as on Windows; compare with the 7800 driver here:
For more info, you can read an explanation of "texture unit" terminology in Issue 1) of the ARB_fragment_shader specification:
Nvidia also has a FAQ describing why the reported limits are different:
(But why are some other (older) graphics chips reported to have more texture units, even though they're 'lesser' (spec) chips? i.e., the Mac X800 XT (and others like the Mobility 9700) are reported as having 8.)
Again, there are simply different limits for different parts of the pipeline. On recent ATI hardware, there are 8 texture units exposed for the old "fixed function" pipeline (i.e. MAX_TEXTURE_UNITS) and 16 texture image units exposed for the newer programmable paths (i.e. MAX_TEXTURE_IMAGE_UNITS). This is normal and not a problem; games and other 3D applications just need to use the programmable paths if they want to access the full range of texture units. The Nvidia FAQ (linked above) explains this clearly.
You can see this in the "OpenGL Extensions Viewer" application by looking in the Extension Specifics section for "Fragment program". It will report the maximum number of texture image units there.
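(You can also query these two limits directly in code. Here's a minimal C sketch - my example, not from the Apple reply - assuming a valid OpenGL context is already current, e.g. one created via GLUT or AGL:

    /* Minimal sketch: query both texture-unit limits. Assumes a valid
       OpenGL context is already current. */
    #include <stdio.h>
    #include <OpenGL/gl.h>     /* OS X; use <GL/gl.h> and <GL/glext.h> elsewhere */
    #include <OpenGL/glext.h>

    void report_texture_limits(void)
    {
        GLint fixed_units = 0, image_units = 0;

        /* Limit for the old "fixed function" pipeline */
        glGetIntegerv(GL_MAX_TEXTURE_UNITS, &fixed_units);

        /* Limit for ARB_fragment_program / GLSL fragment shaders */
        glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &image_units);

        printf("Fixed-function texture units: %d\n", (int)fixed_units);
        printf("Fragment texture image units: %d\n", (int)image_units);
    }

On a 7800 GT under OS X this should print 4 and 16, matching the numbers in the explanation above.-Mike)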
(And any info on the lack of MultiSample AA support?)
All recent Nvidia hardware does support multisample FSAA. The best way to see this is to look for "ARB_multisample" in the extensions string."
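(Here's a quick sketch of that extensions-string check in C - again my example, assuming a current GL context:

    /* Sketch: check the extensions string for multisample support. */
    #include <string.h>
    #include <OpenGL/gl.h>

    int has_multisample(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        /* A substring search is adequate for this quick check. */
        return ext != NULL && strstr(ext, "GL_ARB_multisample") != NULL;
    }

-Mike)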
An Nvidia 6800 Ultra (Mac) card user wrote (regarding the "MultiSample - No" listing)
"...Couldn't see it in OpenGL Extension Viewer, but Apple's OpenGL Profiler lists 'ARB-multisample' as enabled, and also lists both 'MAX_TEXTURE_UNITS' and 'MAX_TEXTURE_UNITS_ARB' as 4.
(I asked what is shown for 'MAX_TEXTURE_IMAGE_UNITS'-Mike)
I only have a listing for 'MAX_TEXTURE_IMAGE_UNITS_ARB' which says 16.
Just checked another Apple developer tool (OpenGL Driver Monitor) and it contradicts the OpenGL Profiler by saying multisample is not enabled. Wonder if this is an on-the-fly setting that only changes to on/yes when a program utilises multi-sampling?
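(That may be part of it - whether multisampling is actually active depends on the pixel format of the current context, not just on what the hardware/driver supports, so different tools/contexts can report differently. A sketch of how an app can check its own context - my example, assuming a current GL context:

    /* Sketch: ask the *current* context whether it actually has
       multisample buffers - this depends on the pixel format the
       application requested. */
    #include <stdio.h>
    #include <OpenGL/gl.h>

    void report_multisample_state(void)
    {
        GLint buffers = 0, samples = 0;
        glGetIntegerv(GL_SAMPLE_BUFFERS, &buffers); /* 1 if a multisample buffer exists */
        glGetIntegerv(GL_SAMPLES, &samples);        /* samples per pixel, e.g. 4 */
        printf("sample buffers: %d, samples: %d\n", (int)buffers, (int)samples);
    }

-Mike)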
Another reader said:
Nvidia can't do floating point blending and multisample antialiasing
at the same time. It's a known Nvidia problem. This could be why it
doesn't show up. Apple may have made a decision as to what mattered
more. As floating point blending is used for HDR (High Dynamic Range), it could have been one or the other.
This (HDR) is all the rage now in games like Chaos Theory.
We also use it in PS (Photoshop) on certain images in 16-bit-per-pixel mode.
The architecture of the 6 and 7 series cards has a problem in that it's impossible to do HDR and multisample AA at the same time.
I don't know why it would only support 4 texture units though.
Some other Mac graphics card users sent comments on their reported info/support:
Not sure if this would help you, but to add to the Nvidia poster there, it seems like ATI's driver supports more. But shouldn't this be 16? (He sent a screenshot from his X800 XT card, which is reported as 8 units. Multisample AA is also supported.-Mike) Would this explain why games like Doom 3 get half the FPS on the Mac as they do on the PC? Would be nice to know what is up with this. Thanks again for the great site. :)
I wrote an ATI programmer contact for more info on this. (I tried calling yesterday but no answer - guess they're off for the holidays.)
On my PowerBook G4 17" 1.5GHz with ATI Mobility Radeon 9700 128MB, I have 6 texture units with multisampling available.
On my Dual 2.7GHz G5 with ATI Radeon X800 256MB AGP, I have 8 texture units with multisampling available. I find this interesting, as I was thinking about selling the Dual for a Quad. I wonder if the Nvidia 6600 has the same issue?
Readers w/Mac 6600 and 6800 cards wrote they also see 4 texture units/no MultiSample AA support listed. No reports yet on the FX4500 - although I believe it uses a unified driver, it would still be interesting to see what is reported. (Use OpenGL Extensions Viewer.)
I had mentioned that I hear 10.4.4 is due soon (hopefully with improvements and fixes for some 10.4.3 GL/driver issues) - an anonymous reader commented:
I've heard that 10.4.4 will DRAMATICALLY improve Nvidia support - the 10.4.3 drivers were pretty much placeholders to get the products out the door; 10.4.4 is pretty much what it oughta be. Not that it won't get better in the future, but the 10.4.3 variant shipped w/PCIe Macs wasn't what it should have been.
I do not think that the report of only 4 texture units is the definitive answer. I have a PowerBook G4 with a 9600 64MB and it shows 8 texture units, and I also have a Dell Inspiron 9300 with an Nvidia GeForce Go 6800 256MB. That application reports that I have only 4 texture units. (see Nvidia links in post above-Mike)
The card in my Dell laptop is blistering fast.
Nvidia reports that the 7800 has 20 pixel pipelines, not 20 multisample texture units. There is a difference. An easy way to see if your card is being dumbed down somehow through the OS is to run benchmarks and compare them against the known results for that card. In my case they're just about parallel.
Further to your item about the 7800 only reporting 4 texture units, the ATI 9800 in my dual G5 reports 8, so I'd guess it's a limit of the driver.
I have a G5 with a 7800 winging its way to me... so if it gets here soon (and I hope it will) I'll let you know what it says (after the 10.4.4 update).
Other Mac Graphics Card Related Articles:
For other Mac video/card related articles, see the Video topics page.