
Nvidia RTX Discussion


Comments

  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    I would say that Far Cry 5 graph is possibly skewed by the fact that the only Vega 64 in the chart is the Strix, which has issues with thermal throttling due to piss poor VRM cooling and aggressively high clocks (very strange from Asus on a Strix model). There's a thread about it on OCUK.

    Techspot, Guru3D, PC Gamer, etc all found the standard Vega 64 outpaced the standard 1080 at 1440p Ultra.

    Their results are all pretty consistent as well, whereas the GamersNexus Strix figures are a good chunk lower.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Those articles are all 6-7 months old as well. The GamersNexus one is the most up to date, being only a week old. Latest drivers and patches.

    Those old articles are also only comparing stock coolers and clocks. That's cherry-picking as well.

    The EVGA FTW 1080 is not exactly the best 1080 model either.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    I get what you're saying but I doubt the Vega 64 has strongly disimproved in Far Cry 5 in that time.

    If you look at the GTX 1080 results in that benchmark, they're pretty much bang-on with their earlier results across the board. Only the Vega 64 is different in that GamersNexus chart, and it's hardly a coincidence that their only Vega 64 model is one known to have major throttling issues.

    Even Asus have acknowledged the issue, which is due to a crappy cooling design - GamersNexus themselves have a teardown on YouTube which also calls out the insufficient cooling on the Strix. I'm guessing that's the same card he's using to bench in that article, actually.


  • Registered Users Posts: 1,305 ✭✭✭Joshua J


    I've seen about a dozen 1070s sold for €250-270 recently. A lot of it was mining stock, obviously.

    And you're right, for some reason people seem to think 980 Tis are magically worth more.....but a quick look on Adverts and you'll find tons of them for sale at that price, and zero offers on them!

    Same with most of the GTX 970s and 980s. People are expecting to get way over what they're worth.

    The odd great bargain does come up. There was a reliable seller in Galway the other day selling a decent model GTX 970 for €110 - and a lad selling EVGA 1070s with plenty of warranty for €250.
    That lad can't shift his 970 for €110 lol. I've an RX 480 that I was offered €240 for 8 months ago and turned it down. I'd be lucky to get half that for it now. The card market has died. I've been looking for a cheap 980 Ti but, as you say, they're way overpriced.


  • Registered Users Posts: 17,434 ✭✭✭✭Blazer


    So still no games released with RTX support.
    BFV is coming soon....apparently.

    Plus the 2xxx series could be extremely short-lived...ouch!!!


    https://www.extremetech.com/gaming/280322-battlefield-v-wont-support-nvidias-dxr-at-launch


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Blazer wrote: »
    BFV is coming soon....apparently.

    Correct, BFV is coming very soon, finally, after the long delay.
    I figured RTX support would be MIA at launch. RTX features are not for this generation of cards, despite them having tensor cores.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    Windows Update for DXR is still MIA, I wonder if that's causing delays.

    It's funny that everything is delayed but people still think 7nm will be on time. I can't see 7nm in the next 18 months.


  • Registered Users Posts: 655 ✭✭✭L


    btkm8unsl0w5r4 wrote: »
    Windows Update for DXR is still MIA, I wonder if that's causing delays.

    It's funny that everything is delayed but people still think 7nm will be on time. I can't see 7nm in the next 18 months.

    Well, let's wait and see.

    The Radeon Instinct MI60 is due to launch on the 18th of November, so that'll give us a bit more of an indication of how viable consumer 7nm will look.


  • Moderators, Computer Games Moderators Posts: 14,707 Mod ✭✭✭✭Dcully


    Well, DX12 in BFV is a disaster, so it's not looking good for ray tracing, with this title at least.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    They will have to focus on fixing DX12 before even thinking about DXR. I'm not sure Shadow of the Tomb Raider will even bother at this stage, unless Nvidia has some agreement with them where they have to.
    The new Metro game in February may be our first glimpse of it in-game, or perhaps something that's released in the summer of next year.
    I think DXR has support in developer mode for Windows 1803, so at least developers should be getting some benefit from the tensor cores in the 20xx series.
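    For anyone curious what the developer side of that looks like, here's a minimal sketch of the runtime check an engine can do for DXR support (purely illustrative - the helper name is made up, and it assumes the D3D12 headers from the Windows 10 October 2018 SDK or later):

        #include <windows.h>
        #include <d3d12.h>

        // Hypothetical helper: returns true if the device, driver and OS
        // expose DXR tier 1.0 or better through the finalised D3D12 API.
        bool SupportsDXR(ID3D12Device* device)
        {
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
            if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                   &options5, sizeof(options5))))
                return false; // older runtime: the options struct isn't recognised
            return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
        }

    As far as I know, on 1803 the finalised API above isn't there at all and raytracing was only reachable through the experimental developer path, which is why everything RTX-side still felt developer-only until the 1809 update and matching drivers rolled out.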


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    I am getting annoyed at this stage, at least give us something, Nvidia. I don't care if it runs like crap, I just want to see it.


  • Registered Users Posts: 655 ✭✭✭L


    Apparently FFXV has cancelled its RTX features (this seems to be tied in with Hajime Tabata leaving, though, so it's hard to draw an inference from it).

    It might be worth putting together a list of the games that were supposed to have RTX features implemented, and what their current status is.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    L wrote: »
    Apparently FFXV has cancelled its RTX features (this seems to be tied in with Hajime Tabata leaving, though, so it's hard to draw an inference from it).

    It might be worth putting together a list of the games that were supposed to have RTX features implemented, and what their current status is.

    FFXV is totally cancelled for PC, not just the DLSS features, despite the fact they were shown working.


  • Registered Users Posts: 655 ✭✭✭L


    btkm8unsl0w5r4 wrote: »
    FFXV is totally cancelled for PC, not just the DLSS features, despite the fact they were shown working.

    I'm pretty sure it came out in March Fitz - it's the future DLC and features that have been cancelled.


  • Registered Users Posts: 13,984 ✭✭✭✭Cuddlesworth


    btkm8unsl0w5r4 wrote: »
    I am getting annoyed at this stage, at least give us something, Nvidia. I don't care if it runs like crap, I just want to see it.

    That's kind of the thing though: a brand new tech, using a brand new API, in a brand new GPU architecture, brought into engines that were never designed for it. It's going to run like crap at the start regardless. Most companies don't want to bother with that **** in production releases outside of some tech demos. And again, even if you invest the time and resources into it, it can only be used by a fraction of the market.

    I'd say Metro Exodus in Jan is the best bet for any working RTX features in a game.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    L wrote: »
    I'm pretty sure it came out in March Fitz - it's the future DLC and features that have been cancelled.

    I know, but it's not just DLSS, it's the entire PC support for the game. https://www.guru3d.com/news-story/final-fantasy-xv-pc-cancelled-and-yes-that-includes-rtx-features.html


  • Registered Users Posts: 655 ✭✭✭L



    Read the first line of the article, not the headline, Fitz. That article is saying what I said :p


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    L wrote: »
    Read the first line of the article, not the headline, Fitz. That article is saying what I said :p

    Yes, I can read. My point is that some would say it's the RTX features (it never supported RTX as such, it was DLSS it had) that were cancelled, when actually the entire PC version of the game is no longer in production. So thinking that they couldn't implement this feature is incorrect.


  • Registered Users Posts: 655 ✭✭✭L


    btkm8unsl0w5r4 wrote: »
    Yes, I can read. My point is that some would say it's the RTX features (it never supported RTX as such, it was DLSS it had) that were cancelled, when actually the entire PC version of the game is no longer in production. So thinking that they couldn't implement this feature is incorrect.

    Easy there - I'm not saying you can't read. I specifically called out that it was because the producer was leaving, and RTX is the card, not shorthand for raytracing.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    btkm8unsl0w5r4 wrote: »
    I am getting annoyed at this stage, at least give us something, Nvidia. I don't care if it runs like crap, I just want to see it.

    You have the fastest gaming GPU money can buy. What more did you expect when you pre-ordered the 2080 Ti? You can't blame Nvidia for not wanting to release something that runs at 30 fps @ 1080p on their flagship cards.
    You can, however, blame them for releasing RTX too soon.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    tuxy wrote: »
    You have the fastest gaming GPU money can buy. What more did you expect when you pre-ordered the 2080 Ti? You can't blame Nvidia for not wanting to release something that runs at 30 fps @ 1080p on their flagship cards.
    You can, however, blame them for releasing RTX too soon.

    I just want something to test the RT or tensor cores with - release the demos the press got, release the 3DMark benchmark... The reviewers at the launch saw Tomb Raider running between 40 and 60 fps, so I am not too worried about that aspect, and the Star Wars demo runs at fine FPS, they just won't release it... The card rasterising fast is fine, but it's not why it was purchased.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Releasing such things prematurely is of no benefit to developers who are still getting their heads around the new technology. I think it's fantastic that the 20xx series has given these developers an early look at how RTX may be used in years to come. I think releasing it on some kind of Titan card or something, to show that it was aimed at developers, would have been more honest, but this way they get to secure more funding.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    I think it would have been better as a separate card if that was possible.

    Have some sort of optional transitional phase before they integrate them onto the main cards.

    This would allow the devs a few years to work on it then bring it onto the cards at 7nm at a more affordable price with higher spec.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    But this way Nvidia get to recoup much of the cost of producing experimental tensor cores.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    BloodBath wrote: »
    I think it would have been better as a separate card if that was possible.

    Have some sort of optional transitional phase before they integrate them onto the main cards.

    This would allow the devs a few years to work on it then bring it onto the cards at 7nm at a more affordable price with higher spec.

    E.g. PhysX.

    Remember CellFactor? :)


  • Registered Users Posts: 13,984 ✭✭✭✭Cuddlesworth


    BloodBath wrote: »
    I think it would have been better as a separate card if that was possible.

    Have some sort of optional transitional phase before they integrate them onto the main cards.

    This would allow the devs a few years to work on it then bring it onto the cards at 7nm at a more affordable price with higher spec.

    The RTX (compute) part of the 2xxx series was created for the enterprise market, to try to lock down automated driving with ray tracing, and deep learning/AI with the tensor cores. It's too costly to make a separate card and chip, so to sell cards while adoption takes off, it's been sold to gamers as a feature. And it will be one, in a couple of years.

    Separating it would have cost way too much end to end to bother.


  • Registered Users Posts: 655 ✭✭✭L


    Cuddlesworth wrote: »
    Separating it would have cost way too much end to end to bother.

    I'm not convinced of that - the sheer die size of TU102 and TU104 makes them massively costly chips to make. Disproportionately so to any benefit that could be accrued from standardized lineups.

    Chiplet design is what I'd expect to see these days for reuse of niche technology, and that would lend itself quite well to lower-cost add-on cards (low-end product? Fewer chiplets. High-end product? More chiplets. Add-on card? Just that type of chiplet, but stacked way up).


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    The October update of Windows has now been re-released! Staggered auto-update as usual, but you can do a manual update and bask in all the raytracing glory!


  • Registered Users Posts: 2,259 ✭✭✭Shlippery


    tuxy wrote: »
    The October update of Windows has now been re-released! Staggered auto-update as usual, but you can do a manual update and bask in all the raytracing glory!

    and I suspect the BFV early RTX patch was just released this morning as well...


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    L wrote: »
    I'm not convinced of that - the sheer die size of TU102 and TU104 makes them massively costly chips to make. Disproportionately so to any benefit that could be accrued from standardized lineups.

    Chiplet design is what I'd expect to see these days for reuse of niche technology, and that would lend itself quite well to lower-cost add-on cards (low-end product? Fewer chiplets. High-end product? More chiplets. Add-on card? Just that type of chiplet, but stacked way up).

    I think raytracing is very much integral to the entire lighting process, and the data it needs to shift between memory and the main core might be too much for PCIe x16. The PCIe pathway is also quite long and would degrade performance. SLI works well because, certainly with AFR, most of the time only the finished frame needs to be transferred to the primary card.
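    Just to put rough numbers on that (taking PCIe 3.0 and the reference 2080 Ti memory config purely as an illustration):

        PCIe 3.0 x16:  16 lanes x ~0.985 GB/s    = ~15.8 GB/s each way
        2080 Ti GDDR6: 14 Gbps x 352-bit bus / 8 = ~616 GB/s

    So local VRAM bandwidth is roughly 40 times what the slot can move, which is why anything needing scene-wide traffic between cards wouldn't split anywhere near as cleanly as AFR's one-finished-frame-per-swap.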

    What seems better to me would be handing off some of the processing to the spare CPU cores, seeing as we seem to be heading into a core-count war at the moment with very little to do with them except run Cinebench.

    On a side note, I've been playing the new Battlefield V (got the Origin subscription for €100 and there are some good games there - is this the way PC gaming is going to go.... Netflix for games?). Graphics without ray-tracing are very good, but the DX12 is stuttery as all hell... on a 5 GHz 8700 and a 2080 Ti... so some optimisation is needed in that area. I would say the DX12 update will be along in the next couple of months, but it's EA so I am not expecting a well optimised experience. The game itself is the usual fare, same as BF1 but re-skinned. Serious SJW content.
    Shlippery wrote: »
    and I suspect the BFV early RTX patch was just released this morning as well...

    What? Links please?

