SLI vs CF Fight

A kind of multi-GPU world championship between SLI and CF is being organized by several sites, with various articles and benchmarks run on NVIDIA's and ATI's multi-GPU systems.

These tests promise to show which multi-GPU system is truly the best today. If you have doubts, it's worth a look: there is plenty of interesting information about how these systems work, along with their pros and cons. :yes:

The following sites are involved in the tests:

PenstarSys = Introduction
http://www.penstarsys.com/reviews/video/multigpu/pss/index.html

bit-tech = GPUs
http://www.bit-tech.net/hardware/2006/07/12/multi_gpu_world_tour_gpus/1.html

Bjorn3D = Motherboards
http://www.bjorn3d.com/read.php?cID=938&pageID=2402

HardInfo = Common Benchmarks
http://www.hardinfo.com/show.asp?page=6741

Neoseeker = CrossFire Most Played
http://www.neoseeker.com/Articles/Hardware/Reviews/multigpumostplayed/

NVNews = Uncommon Benchmarks 1
http://www.nvnews.net/articles/gpu_world_tour_2006/index.shtml

Rage3D = Uncommon Benchmarks 2
http://www.rage3d.com/articles/mgpuworldtour_p8/index.php?p=1

Legit Reviews = Uncommon Benchmarks 3
http://www.legitreviews.com/article/366/3/

Guru3D = The Verdict
http://www.guru3d.com/article/gpu/367/
 
I'd also like the tests between BLADE and MAJR to be posted here, for comparison.

Cheers.
 
Little by little I'll post interesting excerpts taken from the sites running the tests.

Text taken from PenstarSys:

Multi-GPU Overview

The now ancient history of multiple GPU rendering goes back to the days of 3dfx, and perhaps because NVIDIA bought most of the tech from 3dfx they were the first to pursue a modern variant of this technology. SLI (Scalable Link Interface) was introduced with the GeForce 6 series of products, and it has been further refined with the latest GeForce 7 series.

To enable SLI, a user needs an SLI-capable motherboard with 2 x PEG slots (PCI-Express Graphics) and two nearly identical cards; the cards must be within the same general family (7600s can only be paired with other 7600s, and likewise for 7800s and 7900s). Even if the cards come from different manufacturers and have different clock speeds, SLI will work with them (though it will normalize the clock speeds to match the slower card). SLI can also be enabled and disabled without rebooting.
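A minimal Python sketch of that pairing rule, purely for illustration (the card records and field names here are hypothetical, not NVIDIA's driver internals):

```python
# Hypothetical sketch of the pairing rule described above: cards must
# share a product family, and clocks are normalized to the slower card.

def can_pair(card_a, card_b):
    # Same general family is the only hard requirement.
    return card_a["family"] == card_b["family"]

def sli_clock(card_a, card_b):
    # SLI runs both cards at the slower card's clock speed.
    return min(card_a["clock_mhz"], card_b["clock_mhz"])

a = {"family": "7900", "clock_mhz": 650}   # e.g. one vendor's 7900
b = {"family": "7900", "clock_mhz": 600}   # another vendor's 7900
print(can_pair(a, b), sli_clock(a, b))     # True 600
```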

NVIDIA does everything internally and uses both an "Over-The-Top" connector and the PCI-E bus to shuffle data back and forth between the cards. NVIDIA designed the compositing hardware into the GPUs themselves, so there is no need for an external compositing engine. Quite a bit can be done by using both the OTT connector and the PCI-E bus, and we can see that throughout its lifetime SLI has improved not only in performance and compatibility; NVIDIA has also brought out features that were not in the initial release (namely SLI-AA).

NVIDIA designed SLI into its products from the ground up, so the product is a bit more polished than most competitors'. Also, the speeds of even the budget cards have required the use of the OTT connector, so all cards from the 7600 GS up feature this connectivity.

NVIDIA has spent a lot of time working on software and driver compatibility, so over 300 games are natively supported by SLI profiles. After the initial launch of SLI, NVIDIA exposed the functionality to let users create profiles for games that are not officially supported. Those folks hoping to run System Shock 2 with SLI-AA enabled can certainly do so at their discretion by creating a custom SLI profile for that game. A full list of supported applications can be found at www.slizone.com.

There are several SLI modes that NVIDIA utilizes. While 3dfx used Scan-Line Interleaving either in analog form (Voodoo 1/2) or digitally (Voodoo 5), NVIDIA uses several modes that can clean up a scene, push more pixels, or scale geometry.

AFR is the most common form used, and it stands for Alternate Frame Rendering. Each card is given a whole frame to render, and output alternates between frames from the primary and secondary cards. This maximizes both pixel-pushing power and geometry scaling, but it is not optimal for memory use, since the full scene must be replicated on each card.
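As an illustration, here is a minimal Python sketch of the AFR idea (the `Gpu` class and `render_frame` are hypothetical stand-ins, not actual driver code):

```python
# Minimal AFR sketch: whole frames alternate between two GPUs.

class Gpu:
    def __init__(self, name):
        self.name = name

    def render_frame(self, frame_index):
        # Each card renders the complete frame it was assigned.
        return f"frame {frame_index} rendered by {self.name}"

gpus = [Gpu("primary"), Gpu("secondary")]

for frame_index in range(6):
    # Even frames go to the primary card, odd frames to the secondary.
    card = gpus[frame_index % 2]
    print(card.render_frame(frame_index))
```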

SFR stands for Split Frame Rendering. Each card takes a portion of each frame and renders it. Depending on the complexity of the scene, the driver dynamically allocates a percentage of the frame to each card. For instance, if sky takes up the top 50% of the frame and complex shading is done on the lower half, one card can take upwards of 70% of the frame, leaving the more shading-intensive 30% to the other card. This maximizes pixel performance and memory space, but geometry scaling is less effective, because each card still has to process the full geometry for the frame.
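A rough sketch of that dynamic load balancing, assuming a horizontal split line adjusted from per-frame timing feedback (the step size and timing model are invented for illustration):

```python
# Hypothetical SFR load-balancing sketch: the split line moves each
# frame so both cards finish at roughly the same time.

def rebalance(split, time_top, time_bottom, step=0.05):
    """Move the split toward the slower card's region.

    split: fraction of the frame (0..1) given to the top card.
    time_top/time_bottom: how long each card took on the last frame.
    """
    if time_top > time_bottom:
        split = max(0.1, split - step)   # top card overloaded: shrink its share
    elif time_bottom > time_top:
        split = min(0.9, split + step)   # bottom card overloaded: grow top share
    return split

split = 0.5
# Simulated per-region costs: cheap sky on top, expensive shading below.
for _ in range(5):
    time_top, time_bottom = split * 1.0, (1 - split) * 3.0
    split = rebalance(split, time_top, time_bottom)
    print(f"top card now renders {split:.0%} of the frame")
```

After a few frames the top card converges on roughly 75% of the screen, matching the "upwards of 70%" behaviour described above.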

The final rendering feature of SLI is SLI Anti-Aliasing. This is a very performance-intensive operation that combines the anti-aliased frames from both cards to improve AA quality and coverage. It should only be used in applications that are more CPU-bound, but AA quality is much improved by allowing a mixture of supersampling and multisampling at high levels.

SLI still has room to grow. This spring NVIDIA announced that it will be offloading certain physics effects onto its GPUs through the use of HavokFX. This means that portions of a GPU, or a GPU by itself, can be used to do physics-effects work.
 
How ATI's CrossFire Technology works:

[Image: dongle.jpg]


CrossFire was initially launched back in June 2005, and it took ATI about the same time as NVIDIA to get the technology out of the lab. In fact, we didn't get our hands on hardware until the end of September. This was in the form of a pair of Radeon X850 XT video cards and a CrossFire Xpress 200 motherboard. Since then, ATI has renamed CrossFire Xpress 200 to CrossFire Xpress 1600, for marketing purposes. In March this year, the company announced its CrossFire Xpress 3200 chipset, which came with a pair of true dual PCI-Express x16 interconnects and a number of other improvements to its core logic, too.

Obviously, the end of September was very close to the launch of ATI's Radeon X1000-series, which happened on October 5th. As a result of this and a number of massive shortcomings, very few Radeon X850-series CrossFire systems were sold. However, ATI didn't sit still and the company had improved its CrossFire architecture and drivers by the time Radeon X1000-series came around. This included fixing the inability to render games at resolutions above 1600x1200 at 60Hz.


The Hardware:
CrossFire is a strange and confusing technology for anyone who doesn't follow the progress of 3D graphics on a daily basis. This is mainly down to the convoluted way that ATI gets CrossFire to work on a pair of ATI video cards. Rather than having one simple method of enabling CrossFire across its entire product stack, ATI came up with a bit of a hack job, in all honesty - it's certainly not something that a novice would want to try to get their head round without some help. Let us explain, and we're sure that you'll agree with us.

If you're buying high-end Radeon X1900XT or Radeon X1900XTX video cards with the vision of running CrossFire, you need to be sure that you've purchased one CrossFire-ready Radeon X1900 (either X1900XT or X1900XTX) and a Radeon X1900 CrossFire Edition video card. The CrossFire Edition card will come with all of the required hardware for connecting the two cards together to work in CrossFire.


The reason two mismatched video cards are required is that there is no frame-compositing hardware in either R520 or R580, and it would have been incredibly costly to transfer data across the PCI-Express bus at the high end, especially before the company had launched its dual PCI-Express x16 RD580 chipset. The Radeon X1900 and X1800 CrossFire Edition cards carry the required compositing hardware in the form of two Silicon Image SiI163B TMDS receivers and a Xilinx FPGA.

In one way, this is reasonably straightforward if you consider that all CrossFire-enabled motherboard and GPU boxes come with a simple diagram detailing 'what you need to get CrossFired up'. In order to run CrossFire, you need a CrossFire-ready motherboard, a CrossFire Edition video card and a CrossFire-ready video card. At least the box tells you exactly what you need.

Here is where the fun starts, though.

Things change dramatically at the low end of the spectrum. In fact, things change right the way up to ATI's Radeon X1900GT - there are two ways to enable CrossFire on the Radeon X1900GT. Well, there will be soon. Firstly, you can use the method that we've outlined above - this means buying a Radeon X1900GT CrossFire-ready video card and then spending a lot more on a Radeon X1900 CrossFire Edition. Typically, the Radeon X1900GT costs around £200, while a Radeon X1900 CrossFire Edition will set you back around £290-300.

The second method of enabling CrossFire with the Radeon X1900GT is the same way that Radeon X1800GTO CrossFire is enabled. By this we mean that it will be possible to install a pair of Radeon X1900GTs and run a 'masterless' CrossFire configuration, with all data transferring over the PCI-Express bus. Unfortunately, this is not officially supported yet, but it will be possible in a future driver release, scheduled for August this year.

At the low end of the spectrum - anything below and including Radeon X1600XT - ATI enabled support for 'masterless' CrossFire configurations soon after the announcement of Radeon X1000-series on October 5th. However, if you have a video card that is not a part of ATI's X1000-series (i.e. a Radeon X700), you will not be able to enable CrossFire at all. Indeed, with the exception of Radeon X850, there is no support for CrossFire outside of ATI's Radeon X1000-series of products.
 
Rendering Modes:

[Image: superaa-1.jpg]


ATI has four different multi-GPU rendering modes - we went into some detail on how all of them work here. The first, and the default profile for CrossFire, is known as SuperTiling. However, we've never really seen any substantial gains when using the SuperTile mode. Games like The Elder Scrolls IV: Oblivion didn't have a CrossFire profile at launch, so they should have defaulted to SuperTiling.
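For illustration, a minimal sketch of the SuperTiling idea: the screen is carved into small tiles handed to the two GPUs in a checkerboard pattern (the 32-pixel tile size here is an assumption based on contemporary descriptions, not confirmed by this article):

```python
# Hypothetical SuperTiling sketch: the frame is divided into small
# square tiles assigned to the GPUs in a checkerboard pattern, so the
# pixel load averages out across both cards.

TILE = 32  # assumed tile size in pixels

def gpu_for_tile(tile_x, tile_y):
    # Alternate GPUs in a checkerboard.
    return (tile_x + tile_y) % 2

def gpu_for_pixel(x, y):
    # Which GPU owns the tile containing a given pixel?
    return gpu_for_tile(x // TILE, y // TILE)

print(gpu_for_pixel(0, 0))    # 0 -> first GPU
print(gpu_for_pixel(40, 0))   # 1 -> second GPU
print(gpu_for_pixel(40, 40))  # 0 -> first GPU again
```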

However, Radeon X1900 CrossFire originally resulted in poor performance - slower than running with CrossFire disabled - because there was no game profile to enable ATI's preferred rendering mode, which is known as Alternate Frame Rendering. This was fixed with a driver update, but the fix wasn't immediately available to consumers when the game launched.

The second mode is Alternate Frame Rendering - this appears to be ATI's preferred CrossFire rendering mode. It does exactly what it says on the tin: the workload is split between the two GPUs, with each GPU rendering alternate frames. Frames rendered by the CrossFire-ready video card are then sent to the CrossFire Edition card in readiness for display on screen - it is the compositing engine's job to make sure that the frames are output in the right order.

Along with SuperTiling and AFR, there is scissor mode. This mode splits the image in two, with each GPU rendering a portion of the scene - this process is load balanced to ensure that both GPUs are fully utilised during the process. It seems that this mode is not favoured, due to high driver overheads, but it is used in some games where Alternate Frame Rendering doesn't provide optimal performance benefits.

It turns out that ATI has chopped and changed its stance on Scissor mode. At first we were told that ATI dynamically load balances Scissor mode, but in further conversation with company representatives we learned that the Catalyst team found it was achieving better results with a static 50/50 split. As a result, ATI has switched from a dynamic load balancer to a static 50/50 screen split for Scissor mode.

The final rendering mode used by ATI's CrossFire technology is known as SuperAA. This is generally used in games that are CPU-limited and show no benefits from enabling CrossFire. SuperAA unlocks more anti-aliasing modes when CrossFire is enabled - all the way up to 14xAA. For those who want to know more about how this rendering mode works, we covered it in some detail here.

The Limitations:
Having used ATI's CrossFire technology extensively since its inception in September 2005, we have a number of reservations. Our biggest is the messy way in which CrossFire has to be enabled - there is no consistent, industry-wide standard for enabling CrossFire, and ATI itself seems unsure at the moment. Some configurations require a CrossFire Edition 'master' card, while others work with a pair of identical cards. This is set to change when ATI introduces its next-generation hardware - when that happens, we hope that things become a little more consistent across the product line.

Our second reservation is also related to installation. There are many CrossFire-ready motherboards out there, but - depending on whether the manufacturer followed ATI's reference design or not - the position of the primary PCI-Express x16 slot can change. For instance, on both Sapphire's and Abit's socket 939 RD580 implementations the lower PCI-Express x16 slot is the primary one, while other boards we've seen - like ASUS' A8R32-MVP Deluxe - use the top slot as the primary video card interconnect. This only affects people considering a high-end CrossFire setup, where a CrossFire Edition and a CrossFire-ready card are required.

ATI's dongle is another worry of ours. It's cumbersome and a pain to install - compared to NVIDIA's implementation, the difference is night and day. It also serves as another point of failure, especially since it sticks out of the back of the case. Having used a CrossFire platform in my home system for a couple of months, I found myself trying to push the case right up to the wall under my desk - the dongle limits how far back you can push your case, and it adds more mess to the collection of cables already gathered there.

The final reservation we have is the time it takes for ATI to roll out drivers with CrossFire profiles for new games - to be completely fair, this applies to NVIDIA as well. Don't get me wrong, ATI's support programme is very good, with consistent monthly driver updates, but its commitment to delivering drivers exactly once a month can hurt it in the multi-GPU arena. The rigidity of the Catalyst programme means it can take some time for drivers with CrossFire profiles for new games to come out. There is a workaround whereby you rename the game's .EXE file to afr-friendlyd3d.exe. Ideally, though, there shouldn't have to be one - certainly not if ATI wants its multi-GPU technology to be ready for the mainstream.
 
According to HardInfo, NVIDIA still wins with SLI:

Looking back on the many hours of benchmarking, we examined low-end, midrange and high-end platforms. NVIDIA seems to be the winner in the majority of tests with quality settings of 4xAA and 8xAF.

These flaws were most noticeable while benchmarking the Radeon X1900XTX CrossFire using the latest Catalyst 6.6 driver. The faults seem to appear at resolutions of 2048x1536 and above in some game titles. The system either reboots, halts or drops out of the benchmark, which might be caused by either our motherboard or a driver issue. Only time will tell...
 
How NVIDIA's SLI Technology works:

[Image: sli.jpg]


NVIDIA launched its multi-GPU technology back at the end of June 2004, before rolling the tech out later in the year when NVIDIA had a chipset good enough to complement the video cards. This came in the form of nForce4 SLI, which was an incredibly popular chipset for NVIDIA because of the upgrade opportunities presented by it - all of a sudden it was possible to install more than one video card into your system.

Many were incredibly sceptical about NVIDIA ever pulling the whole multi-GPU thang off. However, given that we're now part of this series of articles dedicated to multi-GPU platforms and performance, we think it is safe to say that NVIDIA has succeeded in raising awareness of multi-GPU technologies. Indeed, products like the recently released GeForce 7950 GX2 show that NVIDIA is dedicated to moving things forward, even if it does mean getting things wrong from time to time.

Some Early Caveats:
NVIDIA's SLI Technology was built from the ground up and was finally introduced with NVIDIA's PCI-Express based GeForce 6800-series video cards, after over three years of development. Initially, SLI was incredibly finicky - you could only use a matched pair of video cards with exactly the same BIOS on each card. However, around eight months after SLI first shipped, drivers were released to correct these shortcomings, allowing users to pair any video cards, so long as they were from the same product family (i.e. you could pair a GeForce 7800 GTX with another GeForce 7800 GTX, but not with a GeForce 7800 GT).

We suspect that this was in response to ATI's announcement, which picked up on a lot of NVIDIA's shortcomings at the time. Another early caveat was the fact that you had to reboot every time you enabled or disabled SLI. This was a massive pain for end users with two monitors, because they had to enable SLI and reboot before they could start playing their chosen game. Once they'd finished gaming, they would have to disable SLI and reboot if they wanted use of both monitors again.

Thankfully, NVIDIA fixed this shortcoming before ATI's solution had come to market, and there is no longer a need to reboot when enabling or disabling SLI. However, the multi-monitor issue is still apparent, though this isn't limited to NVIDIA hardware, because ATI has the same problems too. To be fair to both sides, the fact that you don't have to reboot in between enabling and disabling multi-GPU is enough, but when striving for perfect products, we'd like to think that it will be possible to auto-switch between multi-monitor and multi-GPU modes in the future.
 
The Hardware:
Now that the early compatibility problems have been worked out, NVIDIA's SLI platform is easy to install. All you need to ensure is that you have a pair of SLI-ready video cards and an SLI-approved motherboard based on one of NVIDIA's SLI-enabled chipsets.

Once you have installed the pair of SLI-ready video cards, you need to connect the over-the-top bridge that connects the cards together. The bridge is used to send frame data, and it is also used for NVIDIA's enhanced anti-aliasing modes - known collectively as SLI-AA, which we will come to shortly - on GeForce 7900 and 7600-series cards.

Both ATI and NVIDIA are working towards certifying other hardware, too. This is designed to improve the end-user experience - if end users buy a selection of SLI-certified products, they can expect those products to work together and deliver a great experience. NVIDIA currently certifies motherboards, power supplies and memory modules. These are all tested by NVIDIA and then given a seal of approval, stating that the products will work together as part of an SLI platform without issue.

ATI does the same, but we feel that it isn't quite as far down the certification road as NVIDIA at the moment. However, the company has acknowledged that it needs to have a certification programme if its multi-GPU platform is to succeed.

Because NVIDIA uses a pair of identical video cards to achieve SLI, there is no need to worry about which PCI-Express x16 slot on the motherboard is the primary and which is the secondary; in any case, NVIDIA appears to have standardised this, as we have yet to see an SLI-ready motherboard with the primary PCI-Express slot farthest from the CPU socket.

Rendering Modes:
If we forget about Quad SLI for the time being, NVIDIA uses three specific rendering modes in its SLI technology - these are known as Alternate Frame Rendering, Split Frame Rendering and SLI-AA. The first two are performance-combining modes, while the third is an eye candy-combining mode similar to ATI's SuperAA.

Alternate Frame Rendering does exactly what it says on the tin, and works in much the same way as ATI's own AFR method. The two cards render alternate frames, and the secondary video card sends its frame data to the primary card in readiness for displaying on screen. The primary card is responsible for making sure that the frames are displayed in the right order.

Split Frame Rendering works the same way as ATI's Scissor mode: the frame is split in two and load balanced across the two GPUs. This is the mode NVIDIA focused its multi-GPU push on initially; however, NVIDIA seems to have found that Alternate Frame Rendering is the better performer in most SLI-enabled titles. NVIDIA has an option to show load balancing in the older driver control panel - for those who are interested, it's quite cool to watch the drivers dynamically load balance a game across two GPUs. We discussed this in a bit more detail here: http://www.bit-tech.net/hardware/2005/05/17/nvidia_sli_pt2/3.html

The final rendering mode that NVIDIA uses is SLI-AA. With a pair of GPUs it is possible to enable up to 16x SLI-AA, with both GPUs using the standard 8xS AA mode with a subsample offset applied when the final image is combined on the primary GPU. This means that each GPU samples from a slightly different position, giving the overall impression of a much smoother image.
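A toy Python sketch of the idea: each GPU renders the same frame from a slightly offset sample position, and averaging the two images smooths edges (the one-dimensional "scene" and the offsets here are invented for illustration; real drivers use 2D subpixel patterns):

```python
# Hypothetical SLI-AA sketch: two GPUs sample the same scene with a
# subsample offset; the primary GPU averages the results, doubling the
# effective sample count along edges.

def render(scene, width, offset_x):
    # 'scene' maps a horizontal position to an intensity (0..1).
    return [scene(x + offset_x) for x in range(width)]

def combine(img_a, img_b):
    # Averaging the two offset images blends edge pixels.
    return [(a + b) / 2 for a, b in zip(img_a, img_b)]

edge = lambda x: 1.0 if x >= 3.5 else 0.0  # a hard edge between pixels 3 and 4

gpu0 = render(edge, 8, offset_x=0.0)   # samples at pixel centres
gpu1 = render(edge, 8, offset_x=0.5)   # samples offset by half a pixel
print(combine(gpu0, gpu1))             # edge pixel becomes 0.5: smoother
```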


The Limitations:
As with ATI, NVIDIA's multi-GPU platform is held back by its driver team. We don't mean this in a bad way, but ultimately, when a new game is released, NVIDIA also needs to release a new driver with an optimised multi-GPU performance profile. Much like ATI, NVIDIA has a workaround for getting SLI working in games that aren't SLI-optimised: using NVIDIA's Coolbits registry hack, you can create your own profiles for games that don't have them.

While this method offers the user a lot more control than ATI's app-rename method, it can also be a downside. The end user can try the various rendering modes, and NVIDIA provides a guide on how to create your own SLI profiles over on its SLI Zone portal. The more important games get the new-driver treatment, but if you're a niche gamer, you may be left on your own for a long time.
 
For the battle, three systems were defined - as always, HIGH, MID and LOW - based on price and single-card performance.

HIGH:
2x 7900 GTX vs 2x X1900 XTX

MID:
1x 7950 GX2 vs 2x X1900 XT

LOW:
2x 7600 GS vs 2x X1600 Pro
 
FIGHT! :evil:

Some benchmarks already run by Neoseeker:

HIGH SYSTEM:
 

[Attachment: high.JPG]

From what I can see, ATI didn't win in a single game?
 
Neoseeker's conclusion:

Conclusion

While the overall conclusion is up to Guru3D to decide, the leader in most of the tests we were responsible for was NVIDIA. Though not a massive delta by any means, there tends to be a relatively predictable difference between the ATI and NVIDIA numbers. This is reversed in some instances, but the majority of tests are in fact led by NVIDIA, however small that lead often works out to be in the scope of things.

Although this article series leans heavily towards comparing ATI performance versus NVIDIA, our particular portion of the series also gives us an opportunity to remark on the overall value and relevance of multi-GPU. We're looking at the most popular games being played, and many of these games simply don't stress the GPU enough for a very expensive setup to be of much gain. While other articles will and have covered games that put more pressure on the GPU, the only title we tested that really stresses the GPU is Oblivion. While it's well accepted now that most gamers buy new hardware to play newer games at higher detail, these results show that a lot of the games people are still playing don't require a new top-end SLI or CrossFire setup.

Also of note is the fact that LCD monitors are locked to their native resolution, which in many cases is 1280x1024 or 1600x1200. With response times no longer a problem, many gamers have moved over to LCDs, making this another area of consideration. With the exception of Oblivion, all of the games we tested ran at maximum detail levels just fine at 1280x1024 across the entire tested hardware spectrum. Increasingly, gamers looking to spend big money on a new multi-GPU setup ought to be looking at a new, larger display that supports resolutions that will truly tax the hardware (assuming they only wish to buy an LCD), or else those same gamers really need to re-evaluate whether they need all that horsepower if they tend to focus on games like those we covered.

SLI and CrossFire have been out for a while and can be considered relatively mature technologies, but as far as issues and problems go, neither company has an entirely clean record in this test series. It is in the nature of dual-card setups to misbehave and output puzzling results in benchmarks, and the platforms we tested here are no different. Some benchmarks (including some of our own) show that in some instances a single-card solution is faster when the GPU is not the limiting factor: the extra data that has to be shuffled across the bus can slow things down compared to a single-GPU setup. I think you'll see more of the same as this series continues and more and more benchmarks are put to the test. Remember, in the following parts of the Multi-GPU World Tour, the other sites will be showcasing results that represent the first time some of those games have ever been official benchmarks in SLI and CrossFire testing of this scope.

The only major issue we ran into was the melting of our Thermaltake Toughpower 550 Watt power supply in the face of an X1900 XT CrossFire setup. Admittedly, it was my own fault for not realising that this particular supply was CrossFire-certified only for up to two X1800 XTs. Still, this just proves that high-end multi-GPU setups require solid power (though NVIDIA's power requirements are in fact lower), and right now ATI CrossFire is notorious for its pickiness as far as power goes. In fact, we recently switched the scope of our PSU testing to cover CrossFire X1900-certified PSUs because we noticed instability with X1900 setups. With the trend moving towards heavier power draw in the next generation of GPUs, the power supply is most definitely as key a component as any other in a high-end setup.

We will be looking to rectify our Battlefield 2 situation as soon as possible, and we will update our benchmarks accordingly once we figure out where the issue is.

We'd like to take this opportunity to thank the other sites in the Multi-GPU World Tour, and also the sponsors of this series. This collaboration has been an interesting project, and we hope that you, the reader, enjoy this unprecedented article series. Please look for the next installment from NVNews, as they will be covering the first in the series of "uncommon benchmarks" - meaning testing with games that you pretty much never see used in hardware reviews!

As a recap of the series, the full article-series "index" included in the introduction of this article lists both released articles and upcoming installments (see the site list at the top of this thread).

Look for our contribution to the Tour coming on Friday.
 
I hope this has helped clear up some doubts; I'll also update the thread when the remaining tests come out.

Cheers, everyone.
 
Hey, nice!

Wow, man :lol: :lol: :lol: this topic is awesome... I gave it a read and really enjoyed it!!!!

ATI loses badly against SLI; I think it's because the technology was built from the very beginning - well, NVIDIA bought it from 3dfx, right?!

But CF will take off in the future, if only because I think it's more advantageous not in performance but in other applications... especially with that AGEIA-style physics thing from ATI, right...

Well, let's see what happens with the X1950 and the DX10 cards!!!

Later!
 
From what I can see, ATI didn't win in a single game?

There's no way it could... the test pits 2x X1900 XT against 2x 7900 GTX... since unfortunately there is no X1900 CrossFire Edition with XTX clocks, it would be nice if Blade or someone found a site with tests of an X1900 CrossFire Edition overclocked to X1900 XTX clocks, so we could get an idea of what an X1900 XTX CrossFire would do and see whether it comes out ahead of the 7900 GTX SLI.

As for one of the tests there, the X1900 GT CrossFire did slightly better than the 7950 GX2 at 2048 resolutions, so you can already get some idea that an X1900 XT or X1900 XTX CrossFire should beat the 7950 GX2. Of course, the 7950 GX2 works out cheaper than an X1900 XT CrossFire... but in performance...
 
As for one of the tests there, the X1900 GT CrossFire did slightly better than the 7950 GX2 at 2048 resolutions, so you can already get some idea that an X1900 XT or X1900 XTX CrossFire should beat the 7950 GX2. Of course, the 7950 GX2 works out cheaper than an X1900 XT CrossFire... but in performance...

But the CrossFire motherboard has to have PCI-E x16 on both slots; the 7950 GX2 uses only one x16 slot, so in theory it splits into x8 for each of its boards.

And as for that "idea": the X1900 XT CrossFire lost to the 7950 GX2 in Oblivion and CoD2 at 1600x1200, but won in CoD at 2048.
 
There's no way it could... the test pits 2x X1900 XT against 2x 7900 GTX... since unfortunately there is no X1900 CrossFire Edition with XTX clocks, it would be nice if Blade or someone found a site with tests of an X1900 CrossFire Edition overclocked to X1900 XTX clocks, so we could get an idea of what an X1900 XTX CrossFire would do and see whether it comes out ahead of the 7900 GTX SLI.


Here we have X1900 XTX CF vs 7900 GTX SLI:
http://www.hardinfo.com/show.asp?page=6743
 
That's a photo of the box, right :p

Very pretty, by the way... the card itself may not be much to look at, but the performance makes up for it hehe

I'm anxious for the tests of your VGA and Blade's :p
 
The tests comparing the 7900 GTX / X1900 XTX:
 

[Attachments: HARDHIGH.JPG, HARDHIGH1.JPG, HARDHIGH2.JPG, HARDHIGH3.JPG]

BF2, 4x AA / 16x AF

I need to reinstall the game and take a screenshot of a more "action-packed" scene to give an idea of how the card behaves in that kind of situation. I can tell you in advance that it does well...

Later,
 

[Attachment: BF2 2006-07-17 21-10-21-78.jpg]

The GX2's performance against an X1900 GT CF:
 

[Attachments: HARDMID.JPG, HARDMID1.JPG, HARDMID2.JPG, HARDMID3.JPG]

Wow... from the tests so far it looks like SLI is much more solid.
Just one little thing: I read that SLI was bought from 3dfx, which invented this multi-GPU solution - is that true?
 
