So... I reckon most of you have been waiting for my point of view now that the card is released, so I will do just that. Before I do, though, a small disclaimer (yes, once again): I am merely posting as myself here, not as some AMD PR guy. Most of the older members know what I am about, and also know that I always call them as I see them, right or wrong. I saw a thread yesterday about AMD paying me to say stuff, and other things like that: that's plain silly and also insulting. I don't want to see stuff like that here again; I would think that, with the history I have around here, people would know better. As for some of the people (you know who you are and you know who THEY are) who only bother to post doom on Rage3D right before/after ATI/AMD launches, well, I really don't have to say much about them.
So here goes. So far, people who follow the graphics card industry have been quick to "forgive" things that, when they see them in other types of hardware, they immediately dismiss. I know, I used to be one of them. Believe it or not, Intel made me change my mind about that. So what did Intel do? Intel released the Core 2 Duo - a processor heralded by many as the one that brought them back from the Pentium 4 wastelands. Of the many Intel processors that have fallen into my hands, every single one could reach speeds far above its nominal clock with ease (I have an E6600 here running at 402x9, that's 3.62GHz up from 2.4GHz, with a Zalman 9700 cooler, and it runs beautifully). So, if that's the case, why hasn't Intel released a 3.6GHz Core 2 part yet, even in limited quantities, just for bragging rights? It certainly could. But it didn't. Then it kinda dawned on me: it didn't have to. You see, I, as a consumer, would never go out and buy a 3.6GHz processor when I can get the E6600 for MUCH less and overclock it to that speed. What Intel is doing now is essentially what AMD has been successfully doing for years with the Athlons: bringing wonderful overclocking chips to the masses, chips that are great in their own right, but that can easily get a new lease on life by running at higher speeds, without any fancy hardware modding hacks. Just remember the 2500+ Barton, the 1700+ Thoroughbred, and so on and so forth; the examples are endless.

Fast-forward to the graphics card industry. So far, you had ATI and you had nVIDIA, both releasing overclocked versions of their high-end cards just to get bragging rights. They didn't care if the part was low-volume, they didn't care if the part was a Phantom Edition or an Ultra Extreme that people would never EVER see - all they wanted was the "ZOMG fastest" prize. So a question kind of hit me: why should ATI or nVIDIA merely overclock a card and call it a new official product? Why can't I buy a card that I know has excellent performance and overclock the hell out of it myself? And why should I have to pay a premium for an overclock I can easily do in seconds?
Enter the Radeon HD2900XT. Read: XT. It goes up against the 640MB version of the 8800GTS, and when you compare it with that card, it is the better card. No question about that; most reviews I've seen agree (and trust me, I've been working with journalists all over Europe, so in the past weeks I've seen MANY), and some even place its performance near the GTX in some games. So it is the better card, and in many situations it will cost less. That's in DX9. In DX10, it's a whole different ballgame, and I obviously will not comment on stuff people haven't seen, but time will clearly show which card is best and which isn't in that regard. But, I hear you say, the card is slower than the GTX and the Ultra! Yes, it is. It is slower than cards that are not positioned in its price range. But guess what: you can get even more performance out of the card by overclocking it, and there is ample headroom for that. All of the reviews are getting great results, you don't need me to tell you that, and I think in a few weeks we will start seeing even bigger 3DMark world records than the 30k in 3DMark05 that Kinc did a while ago (yes, I know it's not a game and not indicative of how the card performs in real games, but it is indicative that you can get MORE value from your purchase by overclocking, and also indicative of the true power of the card). And many of the GTX cards I've seen in reviews are overclocked versions, not stock GTX cards, so the value you get by overclocking this card is even greater. Yes, it needs to draw more power if you overclock it, hence the 8-pin connector. While the 2900XT can easily work with two 6-pin connectors (I know, I've had mine running like that for two weeks simply because I couldn't be bothered to change the PSU in my PC), I see the inclusion of the 8-pin connector as AMD's way of saying "here, you can give the card some more juice if you want to overclock it". Think about it: there's really no need for the 8-pin connector other than overclocking. That's the sole reason it's there. So why not take advantage of it?
Some other comments on the card: people have been asking me what I meant by the exciting thing I hinted at some months back. This is it. AMD is bringing a damn good card, with huge potential, to a price range everyone can reach, not only some enthusiasts. That's why I said "many people": they are the majority, in a market where more than 50% of buyers just get integrated graphics. So, when someone asks "Hey, for $399, what can I buy?", there is no question about what you will recommend. If they want to spend $700+ on an Ultra, hell, just get a second 2900XT and go Crossfire, which seems to have improved very much in this iteration; I won't say much because some stuff in that regard is yet to be announced, but the performance increases when you add more card(s) are substantial enough to actually feel like you added another card. And if someone yells "omg wattage", well, that's why we enthusiasts buy large PSUs and don't care much about power consumption. Other companies have been preaching about that for ages, even though up until recently they always had parts that consumed more than the Radeons they were competing against. That didn't stop people from buying them, though.
So, all in all, I've had the card installed in my main rig for two weeks and I really like it. It feels quieter than my former card, the X1900 XTX. And it's a Radeon, make no mistake, with all the advantages that name brings to the table. The drivers are stable and have ample room for improvement (the driver guys are amazing in that regard, they keep producing faster and faster drivers; I already have 8.38 here) and, as far as DX10 is concerned, well, the card's architecture is made with DX10 in mind, and ATI (now AMD) has a nasty habit of getting new features right the first time. It also sports some nifty stuff for those of you who deal with HD video/audio, but you don't need me to tell you about it; most reviews cover it. The Tessellator is definitely exciting stuff too. And maybe, just maybe, this card will make people look a bit differently than they used to at factory-overclocked gfx cards that are hard to find in stores (or that show up in some quantities at launch and then you hear "oh, the card is out! sorry! we'll have another batch in two weeks!"). I know there are people who are bound to disagree with many of my points here, but, well, you can't agree with everyone, can you? That's one thing I've learned in all these years here: everyone has his/her opinions, and rightfully so, and the ones above are mine, and solely mine, not AMD's.
So hopefully this mini-editorial has given you peeps your answers. Since I will be quite busy for the next couple of weeks, don't expect me to be around all the time answering questions and/or rebutting arguments, but I will try to be around as often as I can.