This is the OEM version of the Radeon HD 2900 XTX. It's one of my most desired video cards of all time, not because of legendary performance or because it made a huge impact on the GPU world.
Actually, it's very much the opposite. The 2900 XTX didn't exactly meet expectations during development, and because of that it was canceled and never released into consumer channels. Think of it as ATI's unreleased equivalent of the FX 5800 Ultra. On April 26, 2007, DailyTech got ahold of an OEM version of the 2900 XTX.
Hardware enthusiasts got a taste of the would-be flagship's performance, and unfortunately for ATI, the aspiring flagship couldn't keep pace with NVIDIA's 8800 GTX. To make matters worse, the XTX saw minimal gains when compared to the 2900 XT. Delving into the article a bit deeper shows the XTX, with its conservative 750 MHz core clock and 2020 MHz effective memory clock on 1 GB of GDDR4, up against an 8800 GTX for the comparison. A reference-clocked GTX would have no problem laying waste to ATI's offering.
Because of this dismal performance, board partners were not willing to produce the XTX in any meaningful capacity. So while it never saw the light of day in consumer form, ATI did produce XTXs for OEMs to sample, and some of those cards may have been renamed and sold as 2900 XT OEM versions. The history of the 2900 XTX has so many unknowns that I did my best to research and collect as much information as I could. With that being said, the information I'm about to present should still be considered conjecture.

Let's now take a look at one of the first known XTX prototypes. These originated in mid-to-late 2006, and they were the first to use the R600 silicon identified as A11. The earliest of these cards sported an all-black tinted cooler and came equipped with 512 MB of GDDR3. Some of the later A11 revisions came with either a clear or red housing, and those models came with 1 GB of GDDR4. Core clock speeds and voltages are said to vary depending on the revision, but the earliest samples were reported to run as low as 550 MHz. Evidently these early prototypes were also known to be riddled with bugs and performance inefficiencies.

After the A11, ATI went to work on the A12. These chips were sampled in late 2006 and early 2007. Evidently a lot of the bugs from the A11 were fixed, and ATI was reaching desired core clock speeds to the tune of around 800 MHz. Unfortunately, the increased core clocks came at the cost of higher power consumption.
Some of the revisions also had faster Samsung memory clocked to the tune of 2200 MHz effective. From what I can tell there were nearly two dozen revisions of the A12, including this one, which recently sold on eBay for over 800 US dollars. It sports an all-black tinted cooler and has a core clock of 800 MHz and a memory clock of 2020 MHz effective.

The last known silicon found in the XTX was the A13, and while voltage and power consumption improved, ATI did not meet their target of 800 MHz within a practical power envelope. Because of this, ATI decided to terminate work on the XTX. The A13 chips were produced in early 2007 and were used for the remainder of the 2900 series. The earlier A13 samples had their information etched onto the die, while the later chips had their information on the side of the retention plate. Even though consumer versions of the XTX were officially canceled, there are still an unknown number of OEM versions in the wild.

Speaking of OEM versions, it's now time to fill you all in on my latest find. A number of weeks ago I was looking for an HD 3870 X2 on eBay, and in my search I kept seeing this HD 3870 Apple card pop up. It caught my attention because it used the same 12-inch cooler found on the elusive XTX, which I thought was kind of odd.
After looking through some of the pics of the card, I began to realize this doesn't look anything like a 3870; it actually looks more similar to a 2900 XT, but a bit shorter PCB-wise. Anyway, I spent some time looking at 3870s for Apple computers on the internet, and they looked nothing like this card. At this point I didn't know 100% what I was looking at, but I had a pretty good idea, and I wasn't gonna let this card get away. So I opened up some dialog with the seller and began to negotiate on the price. I did end up paying less than the asking price, so yay for me. One thing I noticed when looking at the pictures of the card is that it appeared to have an Apple-like PCIe bracket, as it's light silver in color and has numbers above each DVI port. Also, this card doesn't have an S-Video output like most other XTXs, so it looks very much like a card intended for Mac.
So after a week or so of waiting, I finally received this elusive card. The first thing I did was throw it into my test bed, and surprisingly the card had no problem working in Windows. I did have some driver issues from the start, but they were quickly sorted out. The next thing I did was open up GPU-Z, where I noticed the card was running at 709 MHz on the core and 2020 MHz effective on the 1 GB of GDDR4. Apple cards do tend to have lower clock speeds compared to their PC counterparts; for instance, the 2600 XT for Mac is 100 MHz slower on the core.
The card was extremely clean, and before I tested it any further I decided to disassemble it to get a better look under the hood. This Apple version of the XTX has only two 6-pin PEG connectors, like some of the earliest prototypes, which was definitely interesting.
Given this PEG connector configuration along with the reduced core clocks, I'm going to assume Apple may have had lower power consumption in mind when requesting this card for testing. Again, this is an assumption.
Another thing I noticed is that this card is missing the Rage Theater 200 chip, which is used for VIVO; there appears to be no need for it, as this card doesn't have an S-Video out.

It's now time to dig into some testing. Today's test system is somewhat of a dated test bed, and we're running a handful of older titles. Additionally, we're using older drivers for better performance in these games. As for the cards tested, we're using the HD 2900 XT in both 512 MB and 1 GB versions, and the 8800 GTX, along with the XTX.
I chose to test the stock Apple clock speeds as well as pushing the core clock up to 756 MHz to more or less match DailyTech's XTX. So with all that said, let's jump straight into the first game in our test suite.
Benchmarking was done using the game's built-in benchmark, and here we can see the GTX is 30% faster than the pack of Radeons; the XTX doesn't distance itself from the rest of the Radeon lineup.
We did experience some inconsistent frame times with all the Radeons as well. World in Conflict is the next game we tested, and here we used the built-in benchmark to get our numbers. The GTX pulls ahead of the XTX by a huge 55%. It's not a great showing for the Radeons in this title, and the XTX only pulls ahead of the 512 MB XT by 7%.
Now on to Call of Duty 4: Modern Warfare. Benchmarking was done at the beginning of the first mission, as it's consistent and repeatable. Again we see the GTX pull away from the XTX, this time by 54%, which is pretty huge.
I think at this point you can see why the XTX never made the cut. Next is the meme that keeps on giving: we obtained our numbers using the timedemo found in the Crysis benchmark tool. Here the GTX leads by 23%, which bodes a bit better for the XTX. These settings were a bit extreme for all the cards tested, but it was kinda-sorta-maybe playable.

The RTS Supreme Commander is the next game up, and we captured two minutes of the built-in benchmark to get our numbers. The beatings do continue for the Radeons, but they keep getting back up; the GTX only leads the XTX by 12% here, which is finally nice to see.

The last game we tested is Quake 4, and here we used the HOC timedemo to get our numbers. Since it's a short one, we decided to only display the average FPS. The GTX beats the XTX by 17%, but the XTX does beat the lower-tier cards here, and even the stock-clocked XTX holds its own.

Now that we've got all the games tested, let's tally them up to see how well the XTX holds up. Keep in mind this is a very small sample size, and 1080p was the only resolution tested. The GTX ended up being 28% faster than the higher-clocked XTX. Furthermore, the XTX is only 2% faster than the 2900 XT 512, and it's probably obvious at this point, but the 1 GB XT is nearly on par with the XTX.

Let's move on to the only synthetic test in our suite, 3DMark06. Here the GTX leads the XTX by 6%, while the 512 MB XT achieves the best score among the rest of the Radeons. This is pretty typical of the majority of the results I have seen. Furthermore, the base XTX doesn't put up a good showing here, clearly showing that clock speed is a huge factor.

Next, let's take a look at the entire system's power consumption, using Crysis to stress the system. Here we can see the 8800 GTX uses the least amount of power.
After that, the XTX in stock form only uses 5 watts more. The 2900 XT 512 was the big pig of the bunch, while the 756 MHz XTX and the 1 GB XT drew the same amount of power. Now, the 512 MB XT card is an early engineering sample, so it might have a less mature A13 chip, which could be the reason it draws a little more power, but I don't know for certain.

With a cooler this huge, you might be wondering how well it keeps the GPU cool and how loud it gets. Let's first listen to the other cards to get a good baseline: this is the 8800 GTX, and next is the 2900 XT, which covers both the 512 MB and 1 GB models. As for keeping the GPU chilly, under load the GPU temperature peaked at 65 °C. Keep in mind this is with a fixed fan speed of 31%; it seems the fan speed stays fixed until the GPU reaches 99 °C, as shown here in the BIOS. When the fan is ramped up to 100%, temps drop into the low 50s, which is a pretty substantial drop.
Comparing temps to the 2900 XT, idle temps are around the same, while load temps are much improved. The XTX is the only model to come with a large four-heatpipe copper heatsink, while the 2900 XT uses either a two- or three-heatpipe copper heatsink.
Had ATI decided to release this card to compete against the 8800 GTX and the 8800 Ultra, it would have been a bloodbath and seriously tarnished their reputation. The fact is, the XTX was barely any faster than the 512 MB XT model.
It was a wise move to can this would-be flagship. Because the XTX was never released, many thought the 2900 XT was the true contender to the crown, but it really wasn't. So you might be wondering: why is the overall performance of R600 so disappointing? Well, it's likely due to a number of things.
First of all, shader performance wasn't exactly strong. You would think 320 shaders would have no problem pounding NVIDIA's 128 shaders into the dirt, right? Well, they're not equal. ATI was essentially taking 64 five-way shader units and marketing them as 320 stream processors, even though they weren't scalar like G80's. They couldn't process instructions on individual threads the way NVIDIA's GPU could, which meant NVIDIA's GPU could process more threads in parallel compared to R600's 64 units.

Furthermore, anti-aliasing is typically handled by the raster operators, otherwise known as ROPs, at the last stage of the graphics pipeline; R600 instead handled the AA resolve on the shader hardware, which reduces both shader performance and pixel throughput. One might also comment on the 16 ROPs and 16 TMUs: this one-to-one ratio hasn't changed since the introduction of the X800 XT, which seems a bit bizarre for this new unified architecture. Additionally,
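To illustrate that five-way shader point with some quick napkin math, here's a tiny sketch. This is my own illustrative model, not anything from ATI's documentation: the functions, numbers, and the assumed instruction-level-parallelism (ILP) figure are all hypothetical, and real compilers and hardware are far more complicated. The idea is simply that R600's 64 five-wide units only act like 320 scalar processors when the compiler finds five independent operations per slot, while G80's scalar units stay busy even on dependent code.

```python
# Hypothetical back-of-the-envelope model (not vendor data): how many
# "scalar-equivalent" ALUs a VLIW GPU effectively has at a given level
# of instruction-level parallelism (ILP) within each thread.

def vliw_utilization(ilp: float, width: int = 5) -> float:
    """Fraction of VLIW lanes doing useful work when the shader code
    exposes `ilp` independent operations per issue slot (capped at width)."""
    return min(ilp, width) / width

def effective_alus(units: int, width: int, ilp: float) -> float:
    """Effective scalar-equivalent ALU count under the given ILP level."""
    return units * width * vliw_utilization(ilp, width)

# Assumed example: shader code averaging 3 independent ops per slot
# leaves 2 of every 5 R600 lanes idle...
print(effective_alus(units=64, width=5, ilp=3.0))   # -> 192.0 of 320 lanes busy
# ...while G80's 128 scalar ALUs (width 1) are unaffected by low ILP.
print(effective_alus(units=128, width=1, ilp=1.0))  # -> 128.0
```

Under that (made-up) ILP assumption, the on-paper 320 vs. 128 gap shrinks to 192 vs. 128 before clocks and scheduling are even considered, which lines up with why the raw shader count never translated into a win.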