Singapore Hardware Zone!

Creative Labs 3D Blaster Annihilator AGP Review
Reviewed by CPU-zilla
Date: 7th October 1999




- The Good
- The Benchmarks
- The Bad
- Test System Configuration
- Conclusion
- Rating



I am not a fan of sports, but I do remember watching a Formula One race on TV some time ago. The fast-paced, adrenaline-saturated races are somewhat like today's computing technology. Every maneuver is carefully planned, and being first in the race is always the top priority. Formula One looks pretty simple from a spectator's point of view, but in actual fact, it is not. A small mistake can cost you your life, and in the same way, a wrong move in this dynamic business environment could cost you your competitive edge. Signs of the next economic boom are looming closer, and if you don't fight to keep yourself on top, you could lose everything altogether.

Of course, with this, I am talking about nVidia, the company that has been tirelessly developing new products to keep the graphics market going and going and going... (sounds like they are driven by Energizer batteries). Anyway, it is clear that nVidia is serious about staying on top.

Not long ago, when they released the Riva 128, I was already pretty impressed with what it could do. Although it was not the fastest 3D performer around, it was a pretty darn good 2D/3D card for its time. Things started to change when they released the Riva TNT chipset, which was quickly followed by the TNT2 chipsets, with different versions tailored to suit every market segment. The Riva TNT and TNT2 have been popular choices among gamers, professionals and even overclockers, and I believe they will remain so, given that prices should drop further after the release of nVidia's new chipset, the GeForce 256.

The GeForce 256 is the world's first GPU (Graphics Processing Unit), and it is made up of more than 23 million transistors. That is more complex than the Pentium III! It is unbelievable how much attention has been poured into the graphics processor, and it won't be surprising to see more and more complex graphics chipsets hitting the stores in the near future. Nor would it be surprising if the prices of video cards surpass those of mid- or high-end CPUs.

Creative Labs is the first to release its next 3D Blaster series of graphics cards based on the GeForce 256 chipset. It won't be long until we see products from other graphics card vendors hitting the shelves of your favourite store. So, start saving for your next Christmas gift.


Video Card Specifications

Interface: AGP 4x with Fast Writes (AGP 2x compatible)
Chipset: nVidia GeForce 256
RAM: ESMT (M12L16161A-5T) SDRAM
Data Path: 256-bit
RAMDAC: 350 MHz
TV-Output: None
Video Playback: MPEG-1, MPEG-2, DirectShow & Indeo
Supported Resolutions: 640x480 - 2048x1536
Supported Refresh Rates: 60 - 240 Hz
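
Those last three numbers hang together: at the highest resolutions it is the 350 MHz RAMDAC pixel clock that caps the refresh rate. Here is a rough, hedged estimate; the 25% blanking overhead is my own assumption of a typical CRT figure, not something taken from the specifications.

/* Rough estimate of the refresh rate a 350 MHz RAMDAC allows at the card's
 * maximum resolution. The ~25% blanking overhead is an assumed typical
 * figure, not a value from the product specifications. */
#include <stdio.h>

int main(void)
{
    const double ramdac_hz = 350e6;      /* pixel clock limit */
    const double overhead  = 1.25;       /* assumed blanking overhead */
    const double pixels    = 2048.0 * 1536.0;

    printf("Max refresh at 2048x1536: ~%.0f Hz\n", ramdac_hz / (pixels * overhead));
    return 0;
}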

The package includes the following:
  • (1) CL GeForce video card
  • (1) Installation guide
  • (1) Driver + software CD
  • (1) InterVideo WinDVD CD
  • (1) Colorific Color Reference Card
  • (1) Warranty Registration Card (3 years)

These are the utilities & software included on the installation CD:
  • Creative GeForce 256 drivers
  • MS DirectX 7
  • Creative SoftMPEG
  • Colorific software
  • InterVideo WinDVD

The product comes in a cool-looking black package that is about the same size as their TNT2 Ultra packaging. As usual, there is not much inside: only the card, two CDs and a manual. The internal packaging could have been done with more care, as the card was out of its static bag when I first opened the box. Imagine the kind of stress a product goes through during shipping; frankly speaking, Creative should take more care to keep the card in firm yet cushioned packaging to prevent any damage.

Anyway, here is what the card looks like. At first glance, one could easily mistake it for a TNT2 Ultra. Users will be relieved to find a fan on the chipset; running a processor with more than 23 million transistors without active cooling is unthinkable, something that processor rights activists (if they even exist) would be protesting about. It is the same Hypro Bearing fan used on their TNT2 Ultra cards, although I wish they had used a larger heatsink with more and denser fins. Click on the thumbnails below to see a larger view of the card.


This is a snapshot of the SDRAM used on the card. I have not heard of ESMT RAM before, but the chips on this card appear to be rated at 5ns. However, I cannot really confirm the actual rating.

Back to top


The Good

The GeForce 256™ chipset is the world's first Graphics Processing Unit, and it packs some really incredible features together with a whole new architecture. The QuadEngine™ design has separate engines for transformation, lighting, setup and rendering. With transformation and lighting built into the chipset, it literally frees up the CPU for other processes. According to nVidia, it is capable of handling up to 15 million triangles per second, almost twice what the Voodoo3 3500 can do. Now, that's a pretty impressive number, I must say. The figure below illustrates exactly how the GPU benefits the gamer and the game developer.

In addition, the QuadPipe™ rendering engine can deliver up to 480 million fully rendered pixels per second. The engine is built on four independent 3D rendering pipelines running in parallel, each producing pixels rendered in 32-bit color. I think that's enough to get you excited already.
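
As a quick sanity check on that fill rate figure, here is a small back-of-the-envelope sketch. The 120 MHz core clock is my assumption (the figure commonly reported for the GeForce 256); the specifications above only quote the resulting number.

/* Back-of-the-envelope check of the quoted peak fill rate.
 * The 120 MHz core clock is an assumption, not taken from the review. */
#include <stdio.h>

int main(void)
{
    const double core_clock_hz  = 120e6;  /* assumed core clock */
    const int pixels_per_clock  = 4;      /* four parallel rendering pipelines */

    double peak_fill = core_clock_hz * pixels_per_clock;
    printf("Peak fill rate: %.0f million pixels/s\n", peak_fill / 1e6);
    return 0;
}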

Transform & Lighting

The transform step in 3D graphics is highly mathematical and requires complex computations. It basically converts 3D data from one frame of reference to another. The transformation step is performed every time a scene is redrawn, and it involves computations for every object, displayed or hidden. Thus, it can get pretty intensive with very complex scenes, and the inclusion of the transform engine in the GPU helps free the CPU from these calculations.

On the other hand, the lighting engine helps enhance the realism of a scene by providing lighting effects. This engine is pretty similar to the transform engine in the mathematical functions it must perform. The advantage of the GeForce 256 GPU is its implementation of separate transform and lighting engines that run independently of each other, thus maximizing efficiency.
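
To make the per-vertex work concrete, here is a minimal sketch of the kind of math that hardware T&L takes off the CPU: one 4x4 matrix transform followed by a single diffuse lighting term. The structure names and the identity matrix are purely illustrative, not taken from nVidia's hardware or drivers.

/* Minimal sketch of per-vertex transform and diffuse lighting,
 * the work that a hardware T&L engine performs for every vertex. */
#include <stdio.h>

typedef struct { float x, y, z, w; } Vec4;

/* Multiply a column vector by a 4x4 matrix. */
static Vec4 transform(const float m[4][4], Vec4 v)
{
    Vec4 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
    r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
    return r;
}

/* Simple diffuse term: intensity = max(0, N . L) */
static float diffuse(Vec4 n, Vec4 l)
{
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void)
{
    const float identity[4][4] = { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
    Vec4 vertex = { 1.0f, 2.0f, 3.0f, 1.0f };
    Vec4 normal = { 0.0f, 0.0f, 1.0f, 0.0f };
    Vec4 light  = { 0.0f, 0.0f, 1.0f, 0.0f };   /* light facing the surface */

    Vec4 out = transform(identity, vertex);
    printf("transformed: (%.1f, %.1f, %.1f)  diffuse: %.2f\n",
           out.x, out.y, out.z, diffuse(normal, light));
    return 0;
}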

With these two engines integrated into the GPU, developers can create games with higher complexity and realism without sacrificing framerates, since the calculations are no longer performed on the CPU but on the GPU. This leaves the CPU free for application and scene-level tasks. Users with slower processors would also experience better performance in addition to better-quality graphics. The two pictures below illustrate how much more detail you can get in a 3D object without sacrificing performance.

Without the GPU

With GPU

Cube Environment Mapping

The GeForce 256 is the first chipset to accelerate cube environment mapping in hardware. It basically solves the limitations of current environment mapping techniques. Current sphere mapping techniques have severe limitations that produce unsatisfactory image quality: image warping and distortion, as well as the inability to draw in real time, make them undesirable. With cube environment mapping built into the GeForce 256, developers are able to create accurate, real-time reflections. As this feature is accelerated in the hardware itself, future games can make use of it to create more interesting environments with reflections and specular lighting effects. Cube environment mapping is fully supported by DirectX 7 and OpenGL. As this is a pretty new feature, we will have to wait a while before we see some real applications. nVidia claims support for cube environment mapping from popular game developers such as Blizzard Entertainment, Monolith Productions, id Software, 3DO and Electronic Arts (just to name a few). Nevertheless, it would be interesting to see just how much more realism can be added.

Without Cube Environment Mapping

With Cube Environment Mapping
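
For the curious, here is a tiny sketch of what a cube-map lookup boils down to: reflect the view direction about the surface normal, then pick the cube face from the dominant axis of the reflection vector. This is purely illustrative and not code from any driver or game.

/* Sketch of a cube environment map lookup: compute the reflection vector
 * and select which of the six cube faces it would sample. */
#include <stdio.h>
#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* R = I - 2(N . I)N, where I is the incident (view) direction. */
static Vec3 reflect(Vec3 i, Vec3 n)
{
    float d = 2.0f * (i.x*n.x + i.y*n.y + i.z*n.z);
    Vec3 r = { i.x - d*n.x, i.y - d*n.y, i.z - d*n.z };
    return r;
}

/* The cube face is chosen by the reflection vector's dominant axis. */
static const char *cube_face(Vec3 r)
{
    float ax = fabsf(r.x), ay = fabsf(r.y), az = fabsf(r.z);
    if (ax >= ay && ax >= az) return r.x >= 0 ? "+X" : "-X";
    if (ay >= az)             return r.y >= 0 ? "+Y" : "-Y";
    return r.z >= 0 ? "+Z" : "-Z";
}

int main(void)
{
    Vec3 view   = { 0.0f, -0.707f, -0.707f };   /* looking down at 45 degrees */
    Vec3 normal = { 0.0f,  1.0f,    0.0f   };   /* reflective floor facing up */
    Vec3 r = reflect(view, normal);
    printf("reflection (%.1f, %.1f, %.1f) samples face %s\n",
           r.x, r.y, r.z, cube_face(r));
    return 0;
}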

AGP 4x with Fast Writes

The GeForce 256 is also the first to implement AGP 4x with Fast Writes. Fast Writes are part of the AGP 2.0 specification and improve all writes from the CPU to the graphics chip. Systems with Fast Writes can see a dramatic increase in performance over systems without it.

Let us consider the current Intel 440BX chipset with AGP 2x. Assume triangles are sent at a rate of 2 million per second and that each triangle averages 90 bytes; you would then expect data transfers of up to 180MB/s. This is well below the maximum bandwidth of the memory.

However, with the increasing complexity of games, we can expect future titles to push up to 10 million triangles per second. This raises the required bandwidth to 900MB/s, which exceeds what current systems can provide. With the introduction of the i820 Camino chipset with AGP 4x and Direct RDRAM support, transfer rates between the CPU and chipset increase to 1.06 GB/s (133MHz FSB) and transfer rates between chipset and memory increase to 1.6 GB/s (Direct RDRAM). However, the memory bandwidth is still not enough to support the required transfer of 900MB/s in both directions, as illustrated below.

On the other hand, with the implementation of Fast Writes, the performance is boosted as there are no bottlenecks in the transfer of data between the CPU and the GeForce 256.
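
The bus-traffic arithmetic above is simple enough to reproduce; the sketch below just multiplies the quoted triangle rates by the 90 bytes per triangle assumed earlier.

/* Reproducing the AGP bandwidth arithmetic quoted above:
 * triangles per second times bytes per triangle gives the required traffic. */
#include <stdio.h>

int main(void)
{
    const double bytes_per_triangle = 90.0;   /* average assumed above */
    const double today_tris  = 2e6;           /* current games         */
    const double future_tris = 10e6;          /* projected games       */

    printf("2M tris/s : %.0f MB/s\n", today_tris  * bytes_per_triangle / 1e6);
    printf("10M tris/s: %.0f MB/s\n", future_tris * bytes_per_triangle / 1e6);
    return 0;
}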

The Installation, Driver features and Programs

Installing the card is as routine as with any other graphics card; it doesn't take an expert. Just shut down your system, open the case, unscrew and remove the old AGP card, and replace it with the GeForce 256 card. Upon rebooting, your system should detect the new graphics card and prompt you to install the driver. I hit "cancel", ran the system with the default VGA driver and inserted the driver CD. It should autorun and prompt you to install the driver. DirectX 7 is installed together with the driver by default; it is needed to support new features such as hardware Transform & Lighting as well as Cube Environment Mapping.

Upon rebooting, you will notice the Advanced BlasterControl™ Panel icon in the task bar. It provides tweaking and adjustment functions similar to those you'll find on the TNT or TNT2. Anyway, here's the complete table of display features for you to drool over.

The info section tells you what driver versions you have, as well as the video BIOS version. It also reports the amount of video memory built into the card. This is a useful utility when you are unsure whether you have the latest drivers installed.
The monitor settings section lets you adjust the display size and position. It also provides a handy test pattern option to help you refine your adjustments. They should also have included a test pattern for adjusting out moire patterns.
The color settings section allows you to adjust the brightness, contrast and gamma of your display. It also lets you adjust individual colors so you can tune your monitor for color correctness.
The desktop settings option gives you a quick drop-down menu to select the resolution and color depth.
The tweak option is similar to the TNT/TNT2 tweak, with options for Direct3D performance/compatibility as well as settings for Direct3D mipmapping.

In the Advanced Tweak section, you are again presented with four more options, just like your TNT/TNT2 tweaking.

In the More D3D tab, you will find more options for Anti-Aliasing, Texture Alignment, Texture Cache Size, Buffer Rendering and Monitor Refresh Rate Synchronisation.
In the Video Control tab, you can adjust the brightness, contrast, saturation and hue of videos (such as MPEG movies) played through the card. In the picture on the left, the window appears black because the sample video is drawn by hardware overlay, which cannot be captured in a screenshot.
The OpenGL tab gives you tweaks for Performance and Compatibility, OpenGL Synchronisation and Texture Optimization.
In the Other Options tab, users will applaud the inclusion of a memory overclocking feature, though most would also prefer the ability to tweak the core clock speed.

Besides the drivers, a copy of Creative SoftMPEG is included on the driver CD. I never tested SoftMPEG since a copy of InterVideo WinDVD was also included in the package; why settle for less when you are given better software to view your VCDs or DVDs? The included Colorific software is a useful utility for calibrating your display for accurate color, and the accompanying reference card helps you through the calibration process.

3D Quality & Features

The card was tested on both a K7 and an Intel system. I must say, the K7 really rocks! However, I was also surprised that the card performed relatively well in a slower system.

Anyway, here's the 3D feature set for the Creative 3D Blaster GeForce 256 as detailed in the product specs:
  • Over 15 million triangles per second at peak rates
  • Peak fill rate of 480 million bilinear filtered, multi-textured pixels per second
  • 2.6 GB/sec total memory bandwidth
  • 100% hardware triangle setup
  • 256-bit graphics architecture
  • Transform and Lighting (T&L) engine
  • Four rendering pipelines capable of delivering four pixels per clock
  • Single-pass multi-texturing support (DirectX and OpenGL)
  • Support for 8-tap anisotropic filtering @ 480 Mpixels/sec
  • Cube environment mapping in hardware, fully supported by DirectX 7 and OpenGL
  • Complete support for DirectX 7
  • Texture blend support for multi-texture, bump maps, light maps, reflection maps and detail textures
  • 32-bit ARGB rendering with destination alpha
  • Point sampled, bilinear, trilinear and 8-tap anisotropic filtering

The screenshots below were taken using 3DMark 99 Max and we can see the comparison between the 3D Blaster GeForce and the G400Max. The images were captured at 640x480 in 32-bit mode. Click on the screenshots to view the image in full size.

(Screenshot columns: Creative 3D Blaster GeForce 256 / Matrox G400Max / 3DMark99 reference shots, with a description of each comparison below.)
Alpha Blending

There doesn't seem to be any difference in image quality between the two cards.
Texture Resolution

Once again, the difference is hard to tell.
Game-1

The GeForce picture quality has really improved as compared to the TNT2. There is very little difference between the G400Max and the GeForce 256. However, if you take a closer look, the G400Max still produces sharper images (compare the 3D Mark logo near the center of the image).
Game-2

The rendered scenes from the two cards look almost identical. Even if there are slight differences, they are hard to spot.

As you can see from the above comparisons, the GeForce image quality is very good indeed. It is very difficult to find any difference between the G400Max (best in picture quality so far) and the GeForce. Looks like nVidia is catching up on quality as well. Anyway, shall we proceed to some action? Yeah, baby, yeah!

Back to top



The Benchmarks

All the benchmarks were performed in Windows 98 Second Edition. I was fortunate enough to have the latest K7 650MHz system for the benchmarks. I suppose it is the most ideal setup I've seen to date, a system packed with so much punch that you could almost physically feel the power. Besides the K7, I also ran a set of Quake II/III benchmarks on my trusted 6163Pro using a slower Pentium II-350. Results of the GeForce are also compared to the Voodoo3 2000 with 2 sets of benchmarks obtained at core/memory clocks of 166MHz and 183MHz. The Voodoo3 benchmarks were also performed on the K7-650MHz system.

At this point, you should be able to see just how fast the GeForce can go compared to all the different setups. However, to make things a little more complete, I've also added results from my previous benchmarks with the Creative TNT2 Ultra running on a Pentium III 600MHz. Below, you'll find benchmarks obtained from Quake II at four different resolutions. Well, take a guess at how long it took me to get and compile all these numbers for you guys (or gals)?

Quake II v3.20
(benchmark charts: demo1.dm2, demo2.dm2, crusher.dm2 and massive1.dm2 timedemos)

In the demo1, demo2 and massive1 timedemos, the GeForce topped the Voodoo3 running at 183MHz. Although it may not be a fair comparison to make with the TNT2 Ultra, you should be able to see that the GeForce is indeed faster. I doubt even the fastest TNT2 Ultra card in a K7-650 setup could beat the GeForce. The Voodoo3 can closely match the GeForce at low resolutions like 640x480, but at higher resolutions, it fails miserably.

In the crusher benchmark, the TNT2 Ultra gives the highest framerates at 640x480, but falls off at higher resolutions. Still, we can see just how well the GeForce can handle complex graphics. I was particularly glad to see framerates exceeding 60 at 1280x960 resolution.

Running the GeForce with a slower processor like the Pentium II 350 has its advantages as well: at high resolutions, the GeForce matches other video cards running on fast systems like the K7-650 or even the Pentium III-600.

Well, let's proceed to see how it fares in newer games like Quake III Arena.

Quake III Test 1.08
(benchmark charts: q3demo1 and q3demo2)

In the Quake III benchmarks, the GeForce came out tops again, with speeds in excess of 100 frames per second. It is also interesting to note the small changes in performance across different resolutions when running the GeForce on the Pentium II 350 system. This behavior is probably due to the slower CPU speed.

So, are you drooling yet?

3D Mark 99 Max
(benchmark charts at 640x480, 800x600 and 1024x768: overall scores and fill rate performance in MTexels/s)
In the 3D Mark tests, we can see that the GeForce really excels in fill rate performance. Although it is quite an unfair comparison between the TNT2 Ultra (running on the "weak" Pentium III-600) and the GeForce (running on the K7-650), there is really no way the TNT2 Ultra could beat the GeForce in fill rate. Of course, I would have loved to show the power of the GeForce by pairing it with the weaker Pentium III-600, but due to time constraints, I was unable to do so. However, the thought of setting up the ultimate gaming machine is enough motivation for me to stay up late to work on this review. :)

16-bit and 32-bit rendering comparison

I wish there were no need for a comparison between 16-bit and 32-bit color depth rendering, but no video card review would be complete without one. So, it was back to running benchmarks again while watching crappy infomercials on TV. Hmm... I wonder if those slimming creams work. (Hey, hey! This doesn't mean I'm fat, okay?!)

Quake III Test 1.08 (16-bit & 32-bit comparison)
(benchmark charts: q3demo1 and q3demo2)

Over here, we can see that the GeForce performs reasonably well at 32-bit color depth at resolutions below 800x600. However, at higher resolutions, the GeForce's results come pretty close to the TNT2 Ultra's. Note that the TNT2 Ultra was tested on the "weak" Pentium III-600 and might have performed better on the K7-650.

3D Mark 99 Max Tests (16-bit & 32-bit comparison)
(benchmark charts: overall scores and fill rate performance in MTexels/s)

In the 3D Mark tests, you can see that the rasterizer and fill rate performance are halved if you choose to run at 32-bit color depth. However, the 3D Mark scores were pretty consistent at lower resolutions; only at 1024x768 do you begin to notice a slight decline in performance. Anyway, the performance is still pretty acceptable. Seems like 32-bit gaming will come to pass real soon.
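
As a rough illustration of why the fill rate drops at 32-bit, consider a simple bandwidth model: every rendered pixel must also touch the frame and Z buffers, and that traffic doubles when those buffers go from 16 to 32 bits. The per-pixel traffic assumptions below are mine, not measured figures; only the 2.6 GB/sec memory bandwidth comes from the card's specifications.

/* Rough model of bandwidth-limited fill rate at 16-bit vs 32-bit colour.
 * Assumes one colour write plus a Z read and Z write per rendered pixel. */
#include <stdio.h>

static double max_fill(double mem_bw, int bytes_per_pixel)
{
    double traffic = 3.0 * bytes_per_pixel;   /* assumed bytes moved per pixel */
    return mem_bw / traffic;
}

int main(void)
{
    const double mem_bw = 2.6e9;   /* quoted total memory bandwidth */

    printf("16-bit: ~%.0f Mpixels/s\n", max_fill(mem_bw, 2) / 1e6);
    printf("32-bit: ~%.0f Mpixels/s\n", max_fill(mem_bw, 4) / 1e6);
    return 0;
}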

Super 7 compatibility

I believe this will be a major concern for a lot of users out there. Super 7 motherboard users will be delighted to know that there's no problem running this card on their existing boards. I tested the card without any problems on a DFI K6XV3+ Super 7 motherboard, which uses the VIA MVP3 chipset. As Vijay said in his other reviews, you just have to make sure you install the latest VIA drivers, and you should be on your way to having fun with your newly acquired video card. However, users may have a tougher time getting it to run on motherboards using ALi chipsets. In any case, do make sure you have the latest motherboard BIOS and drivers installed, and there should be little problem.

Back to top


The Bad

Creative has taken the quick route in bringing the GeForce to users, probably with very little redesigning of the reference board. The board design looks pretty simple, with a lot of space reserved for future versions of the card which may include TV-out. I wish Creative had included a TV-out option, as it is almost a "must have" feature for all high-end graphics cards. It would be superb to output your display to a large-screen TV to enhance your gaming experience.

The fan and heatsink included on the card do not seem to provide sufficient cooling. Running benchmarks, I was constantly checking whether the chipset was getting too hot. Bear in mind that my tests were all performed in open air, so they do not simulate an actual enclosed environment. Feeling the heatsink with my fingers, I felt the extreme heat generated by the GPU. That kind of heat dissipation is not surprising, since the chip physically has more transistors than the K7. So, I brought out the thermocouple from my MS-6163 Pro to make temperature measurements. As the heatsink is pretty small, I had difficulty attaching the thermocouple to it. In order to obtain accurate readings, I removed the metal plate from the thermocouple, exposing the small thermocouple junction, which I could slot into the heatsink easily. Here's what I got from the temperature monitor.

Even with active cooling present, the heatsink temperature was a whopping 52 degrees. (I remember reading a review elsewhere which claimed that the GeForce temperature was less than 40 degrees. I really doubt the accuracy of that measurement; it is unlikely that the reviewer simply got an exceptionally cool chip.) Anyway, what worries me is when the card is placed in an enclosed environment: temperatures could easily rise by another 5 to 10 degrees, depending on how efficient your casing's airflow is. In addition, the RAM chips were also quite hot. I wonder if it is about time we start sticking heatsinks on the RAM. I really wouldn't recommend overclocking the card unless you have properly planned your cooling strategy.

Back to top


Test System Configuration

Processor(s): AMD K7-650, Intel Pentium III-600, Intel Pentium II-350
RAM: 128MB PC100 Mitsubishi SDRAM DIMM
Motherboards: MS-6167 (AMD 751 chipset), MS-6163 Pro (Intel BX chipset)
Hard Drive(s): IBM Deskstar 22GXP (DJNA-370910)
Operating System: Windows 98 Second Edition (Build 4.10.2222A)
DirectX Version: MS DirectX 7
Other software used: -
Video Card(s): Creative Labs 3D Blaster Annihilator GeForce, 3dfx Voodoo3 2000 (overclocked to 166MHz & 183MHz), Creative Labs TNT2 Ultra, Matrox G400 Max
Video Card Drivers: Creative GeForce driver (included on the CD), Matrox G400 drivers 5.21 (latest version), nVidia Reference Drivers 2.08, 3dfx Voodoo3 drivers 1.02.13

Back to top


Conclusion

The Creative 3D Blaster Annihilator GeForce video card is really out to annihilate all competition. We've seen how much more performance it can provide, especially when it is coupled with a fast processor. Users with slower processors will also benefit from the card, though mainly at high display resolutions. If you are an ardent 3D gamer who hungers only for framerates, this is definitely the card for you. If you can't wait to show your friends how rich you are (the card retails at about S$430 to S$450), run down to the store right now and get yourself an early Christmas present. However, if you can wait, I suggest holding out for other cards to hit the stores; it would also be wise to allow the drivers to mature.

Back to top


VIDEO CARD RATING

Overall Rating
(Out of a maximum of 5 stars)

Installation *****
Performance *****
Price **
Software Bundle ***
Material Quality ****
Overall Rating ****½



Back to top


This Product is provided courtesy of

Copyright © 1999 by Singapore Hardware Zone. All rights reserved.

None of the above shall be reproduced, copied and/or
modified without the permission of the WebMaster.
