
Admins please note: This thread is intended to settle somewhere between 'GPU' and 'Benchmarking'

 

...Ah, the joys of mGPU (multi-GPU), such as Nvidia SLI/NVLink and AMD CrossFire/Quadfire...and why CFR (checkerboard / tile-based) mGPU vs AFR (traditional Alternate Frame Rendering) matters...maybe not now, but certainly in the not-too-distant future.

 

Yes, yes - there is the chorus that SLI/mGPU "is dead"...while not entirely true, it certainly is the case that a single GPU will usually be far less painful to optimize for a given game, or will simply be the only option for others. That said, I recently switched from four Quad-SLI / Quad-Fire systems to 'only' dual NVLink (2x 2080 Ti), and while I do not play as many games on this HEDT system (which also does 'work') as others do, I have relatively few problems with NVLink in the games I do play, such as various NFS titles and Metro Exodus, never mind 3DMark and other benches.

 

Yet this thread is NOT intended to convert anyone to mGPU. Instead, it points out that there seems to be a movement afoot among GPU makers (Nvidia, Intel and likely AMD) to reintroduce 'mGPU' in future generations of their products. Think AMD's 7nm (and soon smaller) Ryzen 'chiplets' vs Intel's difficulties with giant, complex 10nm 'all-in-one' dies in the CPU realm. The next gen of GPUs will likely still be single-die, but sooner or later it will be mGPU for mid-range and high-end graphics cards - for which you need extremely well-performing mGPU drivers :cool:

 

Fittingly, Nvidia rather quietly added CFR support to their drivers in late 2019 - for RTX cards only. CFR is actually not new; it was supplanted by the easier-for-developers AFR. Yet for future mGPU generations, CFR looks like a far more capable ticket than AFR...
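
For anyone unfamiliar with the two modes, here is a minimal sketch of the scheduling idea behind each - purely illustrative, not Nvidia's actual driver code:

```python
# Purely illustrative sketch of how the two modes split work
# between two GPUs - not Nvidia's actual driver implementation.

def afr_gpu_for_frame(frame_index: int, gpu_count: int = 2) -> int:
    """AFR: each GPU renders whole frames in rotation, so GPU 0
    gets frames 0, 2, 4... and GPU 1 gets frames 1, 3, 5..."""
    return frame_index % gpu_count

def cfr_gpu_for_tile(tile_x: int, tile_y: int, gpu_count: int = 2) -> int:
    """CFR: every frame is cut into a checkerboard of tiles, with
    adjacent tiles going to different GPUs - both GPUs work on the
    *same* frame, which is why pacing tends to be more even."""
    return (tile_x + tile_y) % gpu_count

# One frame as a 4x4 tile grid under CFR - the checkerboard pattern:
for y in range(4):
    print([cfr_gpu_for_tile(x, y) for x in range(4)])
# [0, 1, 0, 1]
# [1, 0, 1, 0]
# [0, 1, 0, 1]
# [1, 0, 1, 0]
```

The practical difference: AFR only helps throughput (and adds a frame of latency plus micro-stutter risk), while CFR spreads each individual frame across the GPUs.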

 


Below are some early CFR vs AFR comparisons with the current gen of RTX GPUs. I have already run benchmarks of my own, such as 8K Unigine Superposition with CFR vs AFR, but much more (tedious mod and setup) work is needed. I will update this post as I get more results of my own, time permitting...

 

In the meantime, I will note that CFR is not always faster than AFR in outright FPS, but it seems to have better frame times...and below are some YouTube vids someone else ran for DX12 (Titan RTX) and Tom Clancy's The Division 2 (DX11)...have fun and plan your next mGPU (oops :o)
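
To illustrate the frame-time point with deliberately made-up numbers (not my bench data), compare a stuttery AFR-style capture against an evenly paced CFR-style one:

```python
# Toy frame times in milliseconds, made up for illustration only.
afr_frame_ms = [10, 10, 24, 10, 10, 24, 10, 10]  # periodic stutter
cfr_frame_ms = [14, 14, 14, 14, 14, 14, 14, 14]  # even pacing

def avg_fps(times_ms):
    return 1000 * len(times_ms) / sum(times_ms)

def worst_frame_fps(times_ms):
    return 1000 / max(times_ms)

print(f"AFR: {avg_fps(afr_frame_ms):.0f} avg FPS, worst frame {worst_frame_fps(afr_frame_ms):.0f} FPS")
print(f"CFR: {avg_fps(cfr_frame_ms):.0f} avg FPS, worst frame {worst_frame_fps(cfr_frame_ms):.0f} FPS")
# AFR: 74 avg FPS, worst frame 42 FPS
# CFR: 71 avg FPS, worst frame 71 FPS
```

The AFR run 'wins' on average FPS, but the CFR run never dips - exactly the kind of difference an FPS counter hides and your eyes notice.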

[embedded videos: CFR vs AFR runs in DX12 (Titan RTX) and Tom Clancy's The Division 2 (DX11)]


That is interesting, though I wonder why Nvidia has sat on it? Possibly because A) it is not ready, or B) it was not what they hoped for. I have been through the whole single-GPU versus multi-GPU journey. Back in the day, when I moved from a single GPU to CrossFire, I was in awe of the additional performance - it was fantastic, and the same when I moved over to Nvidia with SLI. Even up to the 2080 Tis I remained with SLI, or in this case NVLink. The issue for me was that 2080 Ti multi-GPU was more of a "maybe things will get better" type deal. We have all seen the decline in multi-GPU titles, or the poor scaling where it was supported.

 

I remember during the 2080 Ti launch Nvidia were stating that they would be working hard with devs and the like to bring back the multi-GPU platform (for gaming), but I am yet to see it. Assuming the rumors about the 3080 Ti are semi-accurate, that will be the route I go, but this time a single-GPU build.

 

I think the API fragmentation across DX, Vulkan and OpenGL has not helped matters either. It is great to have the competition, for sure, but multi-GPU has been even more hit and miss across these different APIs. All in all, multi-GPU for gaming is not dead, but it really is a niche now due to the lack of support and the fact that you will see little return on your investment.

 

You never know, Nvidia may pull something out of the hat, but whatever it is, it would need to make life so easy for devs as to be worthwhile, or to incentivize devs in some other way.


 

 

...good point that "the API fragmentation across DX, Vulkan and OpenGL has not helped matters either. It is great to have the competition, for sure, but multi-GPU has been even more hit and miss across these different APIs."

 

What I'm thinking, based on some rather preliminary and speculative pieces, is that future mGPU will not so much be separate cards (like current NVLink/SLI), but more like multiple (i.e. 5nm) chiplets on a single card - thus the renewed need for solid mGPU drivers...and when properly implemented, CFR has some advantages over AFR. But as with anything else, 'forecasting is difficult, especially when applied to the future'...:)

 


Well, they could go that route with a chiplet-like design; however, I am wondering if that makes it any easier for the devs. Unless Nvidia or AMD make it game-agnostic, so that the GPU/drivers decide how best to process the game? I think the only way multi-GPU will return, in this or another guise, is if the workload is taken away from the dev.
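
Something like the hypothetical driver-side scheduler below is what 'taking the workload away from the dev' could look like - just a sketch of the idea, not anything a vendor has announced:

```python
import heapq

def balance_tiles(tiles, chiplet_count):
    """Greedy, game-agnostic load balancing: always hand the next
    tile to the chiplet with the least accumulated estimated work."""
    heap = [(0.0, chip) for chip in range(chiplet_count)]  # (load, id)
    heapq.heapify(heap)
    assignment = {}
    for tile, est_cost_ms in tiles:
        load, chip = heapq.heappop(heap)
        assignment[tile] = chip
        heapq.heappush(heap, (load + est_cost_ms, chip))
    return assignment

# Four tiles with uneven estimated costs, two chiplets:
print(balance_tiles([("t0", 4.0), ("t1", 1.0), ("t2", 1.0), ("t3", 1.0)], 2))
# {'t0': 0, 't1': 1, 't2': 1, 't3': 1} - the heavy tile gets a
# chiplet to itself while the three light ones share the other.
```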


 

 

...along the lines of what happened with DLSS 1.0 > DLSS 2.0?

Per Nvidia's site: "The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games."

 



 

I forgot about DLSS! Yeah, a technology akin to that would be the best way to go, even with non-chiplet multi-GPU designs.

 

Just imagine if they achieved that: multi-GPU would then come back at the enthusiast level rather than remaining a small niche.


 


 

My personal opinion is that if they expect to see multi-GPU usage in rigs by peeps like me, they need to lower the price of the cards that can be run together. When I bought my RTX 2070, it was $600 after taxes and didn't support SLI (the Super refresh added support back in), while the RTX 2080 from EVGA is on sale for $800 ($50 off). My total build cost was roughly $1,600 after taxes and a new OS. A pair of RTX 2080s costs the same as my whole rig, and God forbid I were to go with anything faster in SLI. lol. Maybe a pair of Titans in SLI @ $2,490 a piece? :rolleyes: Unless people are using the comp for work too, or can easily afford it, I don't see SLI making gains. My non-Super 2070 meets the recommended specs for the 2020 game releases I'm interested in - it even meets the Half-Life: Alyx requirements. :thumbs_thumbsupup:
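
To put rough numbers on the value argument using the prices above (the SLI scaling figure below is an assumption for illustration, not measured data):

```python
# Back-of-envelope cost-per-performance math from the prices above.
single_2080_usd = 800        # the EVGA sale price mentioned
pair_2080_usd = 2 * 800
assumed_sli_scaling = 1.70   # generous guess for a well-supported title

print(f"Single: ${single_2080_usd / 1.0:.0f} per unit of performance")
print(f"Pair:   ${pair_2080_usd / assumed_sli_scaling:.0f} per unit of performance")
# Single: $800 per unit
# Pair:   $941 per unit - worse value even in this best case,
# and far worse in the many titles that don't scale at all.
```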

 


Cost is a massive factor. SLI used to be good if you wanted to add in another card later down the line for a performance boost - obviously by then the second GPU would have come down in price. However, that is still a moot point until SLI comes out of the niche it's stuck in.


A bit more on mGPU CFR, AFR/2...


- Rumour has it (rumour = add salt shaker!) that Nvidia very quietly added CFR for developers working on 'Hopper' (the architecture after Ampere). With GPU dies still being huge even after the node shrink to 7nm, mGPU is expected to gain some wind in its sails, as vendors *might* have to focus on multi-die, single-socket GPUs (a la the AMD Ryzen CPU pattern).

 

- Below are 2 x 2 benches (Valley, Neon Noir) comparing the CFR and AFR NVLink modes. For now, just at 1080p with no GPU OC etc., which is also held back a bit by the TR 2950X at that resolution (unlike 4K, but that monitor is in use on another test bench). Then again, the same handicap applied to all tests, and other factors such as ambient temps were the same. Still, I look forward to adding full-OC 4K results.

 

- CFR seems to work in Valley, Heaven, Superposition and the CryEngine Neon Noir bench...and work very well. I noticed not only slightly higher scores but also somewhat lower frame times.

 

Valley 1080p AFR

 

[screenshot]

 

 

Valley 1080p CFR

 

[screenshot]

 

 

Neon Noir 1080p (Ultra) AFR2 (better than AFR for this bench)

[screenshot]

 

 

Neon Noir 1080p (Ultra) CFR

 

[screenshot]

 

 

 

...and for comparison, 1080p AFR 'all stock' vs 4K full-OC AFR (incl. GPUs + CPU) with the same quality settings (but a slightly older driver)...

 

[screenshot]

 

 

Well, some select 4K results now for Crytek's Neon Noir...

 

First, in case you're wondering why NVLink SLI 'CFR' is also called checkerboard :D ...not everything works as it should if you 'force' CFR...here is some Firefox browser fun:

 

[screenshot: Firefox rendering glitches with CFR forced]

 

 

...but Crytek's Neon Noir bench works very well with CFR! Not only better scores, but apparently also lower and more consistent frame times.

 

4K results. Please note that there was no overclock (GPU core or VRAM) in order to keep it all comparable between runs and also between resolutions. Typically, a full-tilt OC with this bench and setup adds about 10-15% to the scores. Also, this might still be the older Crytek Neon Noir engine (I thought I updated it, but every time I log in, it wants to update again :mad: - still, the same engine was used for all the runs below).

 

Settings: 4K, Ultra, Fullscreen, stock clocks

 

7073 - Forcing CFR

6326 - Forcing AFR2

4282 - Forcing AFR (the default dual-card setting)

4733 - Single/Auto-select GPU

 
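For perspective, here is the scaling each forced mode achieves over the single-GPU baseline, computed straight from the scores above:

```python
# Scaling vs the single/auto-select GPU run (4733) as baseline.
baseline = 4733
forced_modes = {"CFR": 7073, "AFR2": 6326, "AFR": 4282}

for mode, score in forced_modes.items():
    print(f"{mode}: {score / baseline:.2f}x vs single GPU")
# CFR: 1.49x vs single GPU
# AFR2: 1.34x vs single GPU
# AFR: 0.90x vs single GPU  <- forced AFR actually loses to one card
```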

[screenshots of the four runs above]


 

Looks like CFR will be the future; it will be interesting to see if Nvidia keeps working on it, or if they announce the enhancements more publicly.


...for now they'll probably keep it more in the shadows, with support still spotty depending on the app...but per the above posts, this is probably aimed more at SW developers for some future-gen GPU architecture (such as Hopper) when mGPU / multi-chiplet designs come out. Still, some apps already work great - and yes, "it can play Crysis" :)

 

 

EDIT: Not so much about CFR vs AFR/2, but about SLI/NVLink in general vs a single card:

Here's an interesting YouTube vid (by BFG / Benchmarks For Gamers) about 2x 2080 Ti in some modern game titles. Looking at the 1% low FPS, especially when gaming at 4K (rather than 1080p), the results are pretty impressive.
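
For reference, '1% low' figures like those in the video are typically derived from the slowest 1% of frames in a capture - a minimal sketch:

```python
# Rough sketch of how a '1% low FPS' number is usually computed
# from a frame-time capture (times in milliseconds).
def one_percent_low_fps(frame_times_ms):
    slowest = sorted(frame_times_ms, reverse=True)
    count = max(1, len(slowest) // 100)   # the slowest 1% of frames
    avg_slow_ms = sum(slowest[:count]) / count
    return 1000 / avg_slow_ms

# e.g. a steady 16.7 ms (60 FPS) capture with two 40 ms hitches:
capture = [16.7] * 198 + [40.0, 40.0]
print(f"1% low: {one_percent_low_fps(capture):.0f} FPS")  # 1% low: 25 FPS
```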

 


SLI certainly still has a place, as per the video - I just wish it was a bigger place. Still going single-GPU this coming gen, as the ROI on a second card is still really poor :(...oh, but while it can run Crysis, will it be able to run the new Crysis remaster!?


 

 

...yeah, SLI/NVLink is definitely still very much a niche market, as is 4K (according to Steam user data). Perhaps from Hopper onward it *might* change, for the aforementioned reasons.

 

As to running remastered Crysis: possibly, if it uses the CryEngine updates shown in Neon Noir.


 

 


 

Yeah, I was only messing with the Crysis thing. With the advancement in game engine technology, you can get games with the fidelity of Crysis without the performance hit there once was, plus drivers and APIs have come such a long way in efficiency. That is not to say Crytek cannot make something that will give our machines a good run for their money. Hopper may spur on SLI/NVLink, but as per our prior conversation, Nvidia really will need to take the work away from the developer, as I do not see anything changing otherwise - fingers crossed.


 

...exactly - IF (!) Hopper turns out to be a chiplet/mGPU-based design, then Nvidia will do the development in-house, a la DLSS 2.0 (3.0).

 

 


This Website may place and access certain Cookies on your computer. ExtremeHW uses Cookies to improve your experience of using the Website and to improve our range of products and services. ExtremeHW has carefully chosen these Cookies and has taken steps to ensure that your privacy is protected and respected at all times. All Cookies used by this Website are used in accordance with current UK and EU Cookie Law. For more information please see our Privacy Policy