GeForce2MX vs Radeon Performance
in a G4/500 2xAGP System

By Mike
Published: 2/24/2001
Updated with Lightwave 3D 6.5 image quality and performance tests

Update: On 5/27/2001 I ran tests with the Nvidia GeForce2MX (and GeForce3) again in Lightwave 3D 6.5, using the later 2.0 and 2.1 Nvidia extensions and OpenGL 1.2.2 (the shipping version on new Macs, including those with the GeForce3). The same image problems noted below with driver version 1.1.1 were still present in 2.0 and 2.1 (missing features in previews, etc.).


Thanks to a reader who swapped my BTO Radeon for his GeForce2MX card, I've been able to run a few tests in my G4/500 DP system (100MHz bus, 2x AGP). [This is not a full review like the previous Radeon Mac Edition review, which has far more detail and tests.] I ran tests in Quake3 v1.17 at resolutions from 640x480 to 1600x1200 in 32bit mode with high quality settings. I also ran some 16bit vs 32bit tests on the GeForce2MX, MacBench 5's Publishing Graphics (2D) tests at 1280x1024/Millions color mode, and tests with Lightwave 3D 6.5.

Setup: The test system is a Gigabit Ethernet G4/500 dual processor running OS 9.1 with 1GB of RAM, Virtual Memory off (it can't be enabled with 1GB of RAM), QuickTime 5 beta 3 and OpenGL 1.2.1. (Note the GeForce2MX systems ship with OpenGL 1.2, but I used 1.2.1 since that's the latest version.) The Nvidia drivers used were the ones that ship with the current new systems (1.1.1), and the Radeon drivers used were the latest available (Radeon update v1.1.1).

Quake3 v1.17 was allocated 160MB of RAM, and all game options were on. Texture quality was one notch down from maximum, Geometric detail was set to medium (I also ran high Geometric detail settings, but the effect was minimal), and the Trilinear filtering setting was used.

[Graph: Quake3 v1.17 FPS results, 640x480 through 1600x1200]

I did not test any tweaked config files like Locki's, which use 16bit mode, reduced detail settings, and other tweaks that can dramatically increase framerates at the cost of image quality. I did run some quick tests in 16bit mode on the GeForce2MX. At 800x600 or above it didn't look that bad in the game (better 16bit image quality than the Radeon), and it helped framerates on the card a lot at higher resolutions. At higher resolutions the textures in the game look much better in 16bit mode than they do at 640x480 (where the pixels are larger).
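For reference, switching Quake3 to 16bit rendering comes down to a few console/config variables - a minimal sketch below (real tweak configs like Locki's change many more settings than this):

    // minimal 16bit mode settings (a sketch - not Locki's actual config)
    seta r_colorbits "16"      // 16bit framebuffer
    seta r_depthbits "16"      // 16bit depth buffer
    seta r_texturebits "16"    // 16bit textures
    vid_restart                // restart the renderer to apply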

Here are the FPS rates from the same game settings at 16bit vs 32bit mode for each end of the resolution spectrum (as well as at 1280x1024).

[Graph: 16bit vs 32bit FPS comparison at 640x480, 1280x1024 and 1600x1200]

As you can see from the graph, the CPU is the limiting factor at lower resolutions, where it really doesn't matter much which mode is used. At higher resolutions, however, the SDR RAM-based card gets a real benefit from 16bit mode, since it reduces memory bandwidth requirements.
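Some rough numbers illustrate why (color writes only - ignoring depth buffer traffic, texture reads and overdraw):

    1600x1200 at 32bit: 1600 x 1200 x 4 bytes ≈ 7.3MB per frame
    1600x1200 at 16bit: 1600 x 1200 x 2 bytes ≈ 3.7MB per frame
    At 30 FPS that's roughly 230MB/sec vs 115MB/sec of color
    writes alone - a big chunk of an SDR card's memory bandwidth.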

The in-game menus in Quake3 always look bad in 16bit mode, but at higher resolutions I think the in-game graphics in 16bit mode look a lot better than the menus do - good enough to be worth using on the GeForce2MX. The Radeon, as you know from past reviews here, also has a very noticeable mesh effect in 16bit modes, and since it doesn't benefit much from 16bit mode even at higher resolutions, there's really no reason to run that mode on the Radeon unless you're using something like Locki's config file, which also reduces the CPU load. (See the review of the Radeon AGP Retail on the Video articles page for test results on that card.)

A friend with a dual G4/533 (133MHz bus, 4x AGP) and a GeForce2MX card gets similar performance to my results at 1024x768 with maximum quality settings (max Texture Quality and high Geometric detail). He's running OpenGL 1.2; I had OpenGL 1.2.1. There's literally no difference in the GeForce2MX scores between my dual G4/500 and his dual G4/533 at that mode/settings - both get about 46 FPS. (The card may be the limiting factor, but regardless, if you like to run that mode, a 2x AGP system does just as well with the MX card despite the 33MHz slower CPU and bus speeds.)

Also interesting: unless I'm mistaken, the GeForce2MX card is running at a 200MHz core and memory clock rate (faster than the retail PC models, I believe). The Radeon runs at 166/166MHz but uses DDR RAM (double data rate video RAM), which is why it shows less of a drop in 32bit modes as resolutions rise (higher framebuffer bandwidth).
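Assuming a 128bit memory bus on both cards (my assumption - I haven't verified the bus widths), the theoretical peak memory bandwidth works out as:

    GeForce2MX: 200MHz x 16 bytes (128bit) x 1 (SDR) = 3.2GB/sec
    Radeon:     166MHz x 16 bytes (128bit) x 2 (DDR) ≈ 5.3GB/sec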


MacBench 5 2D Tests: I ran the "Publishing Graphics" test from MacBench 5.0 (requires the CD, which is no longer available). The two cards were virtually a tie on this test at that mode: the Radeon scored 2963 and the GeForce2MX 2964 - a difference much smaller than the run-to-run variation of the test. One thing that might matter is that my Radeon tests were done before applying the latest 1.1.1 driver update from ATI (all the game tests were run with that update, however). I'm not sure it would have made a difference in the 2D scores, but it might.

One thing I did note was that the Graphics Primitives results (QuickDraw function tests like LineTo, filling arcs, etc.) were a mixed bag - in some cases the Radeon was twice as fast at certain functions, in others the GeForce2MX was faster (on some functions much faster). I think ATI's latest driver update was meant to fix some issues like low CopyBits performance, but I'd have to test to verify this. The overall score of the emulated Publishing Graphics test (which emulates Photoshop, page layout and word processing application workloads) was a draw, however.
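For those wondering what the primitives tests exercise, they time raw QuickDraw toolbox calls. A rough sketch of the idea in classic Mac C (my own illustration, not MacBench's actual code):

    /* Sketch of timing a QuickDraw primitive - not MacBench code.
       Assumes a current GrafPort is already set up.
       TickCount() returns elapsed time in 60ths of a second. */
    #include <Quickdraw.h>
    #include <Events.h>

    unsigned long TimeLineToBatch(void)
    {
        unsigned long start = TickCount();
        long i;

        MoveTo(0, 0);
        for (i = 0; i < 10000; i++)
            LineTo((short)(i % 640), (short)((i * 7) % 480));

        return TickCount() - start;   /* elapsed ticks */
    }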

After I complete the PowerBook G4 review and other work, I intend to test things like DVD playback, 2D apps and RAVE/OpenGL (non-game) performance.

Based on past PC experience I suspected the 4x AGP GeForce2MX was backward compatible with 2x AGP, but it's good to prove it in a Mac. I hope you found this interesting. If you have a 2x AGP G4 system and see an OEM GeForce2MX card for sale (eBay, etc.), it will work based on my experience - provided you can get the Nvidia extensions, which so far are only available on the new G4s and their system CDs.


Updates: (Additional Tests)

Lightwave 3D 6.5 Tests:
I loaded the Hummer.lws file and generated a preview of the 210 frames of that scene in Lightwave v6.5. Lightwave was allocated 256MB of RAM on this G4/500 running OS 9.1. The test system also had QT 5 beta preview 3 installed (same QD3D version as QT 4, however).

OpenGL/Driver Notes: I'm not sure if it could be a factor in the GeForce2MX's preview flaws, but I tested with OpenGL 1.2.1, not the v1.2 that ships with the new Macs with GeForce2MX cards. I disabled all ATI extensions, including their OpenGL ATI Renderer extension, when running the GeForce2MX card. The identical system/software, etc. was used for each test. One other note: even when disabling all ATI or Nvidia extensions using Extensions Manager, there will still be related extensions in the Extensions folder (for instance the ATI and Nvidia Video Accelerator extensions). I manually moved these to the disabled extensions folder before installing each card.

With the GeForce2MX card, the Hummer's tire tracks and/or the ridge in the sand (ground) were not shown in every frame while generating the preview (the tire tracks and/or the tires' impressions would disappear, then appear in the next frame, etc.). This oddity was present in many of the frames - I'd guess more than 25% of them. In some frames the Hummer's tires had sunk noticeably into the 'ground' of the scene, which seemed to be related to the tracks disappearing (some sort of depth buffer issue; you could see this while generating the preview or playing it back frame-by-frame). These errors were also present in the preview movie file. With the Radeon card installed, the tire tracks/impressions were shown in every frame during preview generation (and in the movie file).
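For what it's worth, vanishing tracks and sunken tires are the classic symptoms of a depth precision problem with coplanar 'decal' geometry. In OpenGL the usual workaround is to apply a polygon offset when drawing the decal. A generic sketch of the technique (not Lightwave's actual code - drawGround/drawTireTracks are hypothetical helpers):

    /* Generic OpenGL decal sketch - not Lightwave's code.
       Tire tracks lie in the same plane as the ground, so without
       a depth bias they can lose the depth test and vanish. */
    #include <gl.h>   /* header location varies by OpenGL SDK */

    extern void drawGround(void);      /* hypothetical helpers */
    extern void drawTireTracks(void);

    void drawGroundAndTracks(void)
    {
        glEnable(GL_DEPTH_TEST);
        drawGround();

        /* bias the decal slightly toward the viewer in depth */
        glEnable(GL_POLYGON_OFFSET_FILL);
        glPolygonOffset(-1.0f, -1.0f);
        drawTireTracks();
        glDisable(GL_POLYGON_OFFSET_FILL);
    }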

The screenshots below illustrate what I'm talking about. The first image shows the Radeon sample with correct tire impression and tracks. The two other images are from the GeForce2MX, showing both the missing tracks/impressions and an example of where the vehicle appears sunk into the ground layer.

(Note: These same problems were evident using v2.0 or v2.1 of the Nvidia drivers, and with OpenGL 1.2.2. An update to Lightwave 6.5B also made no difference. Disabling the Nvidia OpenGL extension solved it, but with no OpenGL hardware acceleration, of course.)

[Screenshot] LW 3D 6.5 Radeon sample - note tire impressions/tracks

[Screenshot] LW 3D 6.5 GeForce2MX sample - missing tracks/tread

[Screenshot] LW 3D 6.5 GeForce2MX sample - note sunken vehicle

Update: This issue was confirmed by another user on the 3/7/2001 news page. He said disabling the Nvidia OpenGL 1.1.1 extension solved the problem, but of course that reverts to software OpenGL mode.

" Hi Mike,
I received this message from Pat Turner which shows similar problems with Layout OGL display using the nVidia card...
Take care.
Julian
The Mac Lightwave 6.5 Resource Page
http://www.exchange.co.uk/lightwave/index.htm
(comments from Pat Turner follow)

Hi Julian,
I just ran the Hummer test on my 533 with the nVidia card. Some interesting results not related to how long the test ran.
With 400Megs allocated memory D1 (NTSC) (720x486) the preview took 3 mins 11 secs with the nVidiaOpenGL extension on. This resulted in similar missed tire tracks that Mike reported.
[note - the vehicle also seemed to sink down in some frames as shown in the article-Mike]

When I turned the nVidiaOpenGL extension off I wound up with a render time of 5 mins 18 secs, but all the textures appeared normally.
I guess that says something. That the nVidiaOpenGL extension is responsible for more than just modeler crashes! :)
Pat Turner"


If any GeForce2MX owners running OpenGL 1.2 and Lightwave 6.5 can comment on whether this issue is typical, let me know.

Performance Tests in LightWave 3D 6.5: Since the GeForce2MX wasn't actually displaying/drawing all the scene data (missing tracks/impressions in many frames), the time to generate a preview and the time to play the scene in layout aren't really a fair comparison, since the Radeon displayed each component of the scene correctly in every one of the 210 frames. (Considering the missing data, I'd say the Radeon was actually the faster card.)

With that caveat noted, here are the results of two series of tests. First I recorded the time to generate a preview at 1024x768 mode, then I tested the time to play through the 210 frames of the scene in the layout window at both 1024x768 and 1280x1024 resolutions.

Time to Generate Preview:

    1024x768 Mode (layout maximized)
  • GeForce2MX: 3 minutes 25 seconds (scene objects missing)
  • Radeon AGP: 3 minutes 45 seconds


Layout Playback: LW was set to a high threshold so that it would play back the scene in the layout window without reverting to bounding boxes. Since the GeForce2MX doesn't render the tire tracks in every frame, the Radeon card is actually doing more work during the 210 frames. (The Radeon draws the tire tracks and impressions in every frame, where the GeForce2MX doesn't in more than 1/4 of the frames, I'd say.)

Layout Scene Playback Times:

    1024x768 Mode (layout maximized)
  • GeForce2MX: 1 Min 56 Seconds
  • Radeon AGP: 2 Min 8 Seconds

    1280x1024 Mode (layout maximized)
  • GeForce2MX: 2 Min 5 Seconds
  • Radeon AGP: 2 Min 8 Seconds
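Converted to effective playback rates (210 frames divided by the elapsed time):

    1024x768:  GeForce2MX 210/116 sec ≈ 1.8 FPS | Radeon 210/128 sec ≈ 1.6 FPS
    1280x1024: GeForce2MX 210/125 sec ≈ 1.7 FPS | Radeon 210/128 sec ≈ 1.6 FPS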

Conclusions: With the Radeon card, performance appears to be CPU bound in this test (same time even with the larger viewport). Again, the GeForce2MX isn't drawing all the objects in each frame, which is likely why its times are slightly faster.

Bottom line: unless there's some explanation, at this time I'd say the Radeon is the preferred card for Lightwave 3D.

Granted, the GeForce2MX 1.1.1 drivers are only the second official release and should improve in future updates. (However, it is disturbing that the serious OpenGL issues above still have not been fixed months after the card's release.) There are also reports of problems in some RAVE games and of lower quality DVD playback than the Radeon. (The X-Men DVD is one example.)

