
NVIDIA Kepler Information


Comments

  • Registered Users Posts: 2,720 ✭✭✭dan_ep82


    Tea_Bag wrote: »
    I'll play tonight and monitor my VRAM usage to see what it uses. I remember reading that BF spikes to 1200 MB in some cases.

    Thanks, I think the 1GB is holding me back in this and future games :(


  • Registered Users Posts: 22,929 ✭✭✭✭ShadowHearth


    dan_ep82 wrote: »
    Thanks, I think the 1GB is holding me back in this and future games :(

    If you've got Crossfire, won't the system use the space from both GPUs? Like, will 2x 6870 make 2GB? Or will it use only one GPU's VRAM?


  • Moderators, Technology & Internet Moderators Posts: 17,134 Mod ✭✭✭✭cherryghost


    Aren't they saying that it will be 20% more than the 7970, not the 580, m8?

    The 580 and 7970 are closely matched in some benchmarks. My 570 is pretty much on par with stock 580 performance. So I want it more than 20% faster than my 570 :P


  • Registered Users Posts: 22,929 ✭✭✭✭ShadowHearth


    The 580 and 7970 are closely matched in some benchmarks. My 570 is pretty much on par with stock 580 performance. So I want it more than 20% faster than my 570 :P

    Ya cheeky bugger :)


  • Registered Users Posts: 7,692 ✭✭✭Dublin_Gunner


    The best thing about Kepler supposedly performing so well is that it brings current high-end parts down to price levels where value can be had.

    Nvidia are really just screwing with their potential customers if they truly decide to release GK104 as a high-end part.

    Up until very recently, it was going to be a mid-range part, with performance similar to the 7950/70 - but at mid-range prices.

    If they kept with this model, they would sell shedloads of not only the GK104, but also the GK110 when it releases - plus no need to artificially blind people with a 2-in-1 generation.


    And no, it's not like the GTX 480 / 580, as that was a re-spin of the silicon on a new manufacturing node, with some improvements. GK104/110 is the exact same architecture (just with more execution units in the 110, obviously).

    The 4xx series should not have been released as early as it was - Nvidia needed to get something out the door, though.


  • Registered Users Posts: 2,720 ✭✭✭dan_ep82


    If you've got Crossfire, won't the system use the space from both GPUs? Like, will 2x 6870 make 2GB? Or will it use only one GPU's VRAM?

    Doesn't work that way unfortunately, it only uses 1GB of VRAM.


  • Registered Users Posts: 4,983 ✭✭✭Tea_Bag


    If you've got Crossfire, won't the system use the space from both GPUs? Like, will 2x 6870 make 2GB? Or will it use only one GPU's VRAM?
    Both cards mirror the same data before compute, meaning that the 'effective' RAM is still only 1GB.

    If you CF or SLI two different cards, they'll have the effective RAM of the lesser card, as well as operate at the speed of the slowest card.

    I actually get annoyed when reviewers call the 6990 a '4GB' monster. They should know better, and it only confuses people.
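The rule described above (each card holds a full mirror of the data, so the usable framebuffer is the smallest card's, not the sum) can be sketched as a quick back-of-the-envelope calculation; the card sizes here are just illustrative:

```python
# Effective VRAM of an SLI/Crossfire setup: every GPU keeps a full copy of
# the working set, so the usable framebuffer is the minimum across the cards,
# not the total that marketing material sometimes quotes.
def effective_vram_mb(cards):
    """cards: list of per-card VRAM sizes in MB."""
    return min(cards)

# Two 1 GB HD 6870s: still only 1 GB effective, not 2 GB.
print(effective_vram_mb([1024, 1024]))  # 1024

# A mixed pair (e.g. a 1 GB card with a 512 MB card): limited by the lesser card.
print(effective_vram_mb([1024, 512]))   # 512
```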


  • Registered Users Posts: 7,692 ✭✭✭Dublin_Gunner


    Only the 'main card' in xfire or SLI uses its framebuffer for output, therefore only the framebuffer of the main card can really be counted as effective gfx RAM.

    So yes, only the RAM from one card is actually used, in the traditional sense, as the framebuffer output of the xfired / SLI'd cards.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Tea Bag's right; the data's mirrored across both cards, not stored on one. Equates to the same thing, but I'm just being pedantic. :)

    If you use two different cards though, they'll actually operate at their respective speeds, they won't downclock to the lower. Common misconception. Not sure on framebuffer sizes though.


  • Moderators, Computer Games Moderators Posts: 4,281 Mod ✭✭✭✭deconduo


    Serephucus wrote: »
    If you use two different cards though, they'll actually operate at their respective speeds, they won't downclock to the lower. Common misconception. Not sure on framebuffer sizes though.

    Really? So the performance of a 5770+5750 would be somewhere between 2x5750 and 2x5770? That's interesting to know.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Sorry, I was talking only of SLI there. I'm not sure exactly how CF goes.


  • Registered Users Posts: 4,983 ✭✭✭Tea_Bag


    Serephucus wrote: »
    Sorry, I was talking only of SLI there. I'm not sure exactly how CF goes.
    Are you sure? Wouldn't that severely increase tearing? Each GPU does an alternate line (or in some games one does the top half of the screen and the other does the bottom), and if one line is rendered faster than the one below it, it'll just be awful.

    It might be a misconception, but I don't see how a 5770 + 5750 will perform better than a 5750 + 5750.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Well it wouldn't work very well with AFR, but with SFR I don't see why it wouldn't work. The slower GPU would just render less of the frame than it otherwise would have.

    I'm not 100% sure, but I remember seeing a post by an Admin on the NVIDIA forums a while back (480-ish time) that said as much.
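The SFR idea described above (the slower GPU simply renders a smaller slice of the frame) can be sketched like this; the relative-speed numbers are made up purely for illustration:

```python
# Split-frame rendering (SFR) sketch: divide a frame's scanlines between GPUs
# in proportion to their relative throughput, so both finish at roughly the
# same time. AFR (alternate-frame rendering) instead hands whole frames to
# each GPU in turn, where a speed mismatch causes uneven frame pacing.
def sfr_split(frame_height, speeds):
    """speeds: relative throughput per GPU (arbitrary units)."""
    total = sum(speeds)
    rows = [frame_height * s // total for s in speeds]
    rows[-1] += frame_height - sum(rows)  # hand any rounding remainder to the last GPU
    return rows

# A 1080-line frame across a faster and a slower GPU (e.g. 5770 + 5750):
print(sfr_split(1080, [10, 8]))  # [600, 480]
```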


  • Registered Users Posts: 712 ✭✭✭deejer


    Some "independent" reviews done on the performance difference between the 580/6970/680 here.

    http://www.techpowerup.com/162498/GTX-680-Generally-Faster-Than-HD-7970-New-Benchmarks.html


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    You jerk. :P I found this literally three minutes ago.

    That's interesting alright. About a 20% improvement over my 580, and about a 40% improvement over a stock 580. Might have to pick up one of these!
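Percentage comparisons like the ones in the post above are just ratios of benchmark scores; a minimal sketch with hypothetical numbers (the scores below are made up, not from the linked article):

```python
# Relative improvement between two benchmark scores, as a percentage.
def improvement_pct(new_score, old_score):
    return (new_score / old_score - 1) * 100

# Hypothetical 3DMark-style scores: a new card vs a stock and an overclocked 580.
stock_580, oc_580, new_card = 5000, 5800, 7000
print(round(improvement_pct(new_card, stock_580)))  # 40
print(round(improvement_pct(new_card, oc_580)))     # 21
```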


  • Registered Users Posts: 712 ✭✭✭deejer


    Serephucus wrote: »
    You jerk. :P I found this literally three minutes ago.

    That's interesting alright. About a 20% improvement over my 580, and about a 40% improvement over a stock 580. Might have to pick up one of these!

    I think you should ;)

    I spend a lot of time on that site when I am at work. Helps break up the day :D


  • Registered Users Posts: 618 ✭✭✭pandaboy


    Did that cooler come from Iron Man's chest?


  • Registered Users Posts: 4,983 ✭✭✭Tea_Bag


    deejer wrote: »
    Some "independent" reviews done on the performance difference between the 580/6970/680 here.

    http://www.techpowerup.com/162498/GTX-680-Generally-Faster-Than-HD-7970-New-Benchmarks.html
    fishy.

    "an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency)"



  • Registered Users Posts: 7,692 ✭✭✭Dublin_Gunner


    Tea_Bag wrote: »
    fishy.

    "an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency)"

    lol so true.

    Let's waste lots of LN2 running it at stock, for no reason whatsoever.

    Maybe they literally had no other CPU cooler in their lab?? :D:D

    **The screenshot just below in that thread actually shows a 3960X @ 5GHz, and it's only 300 points higher than the supposed 'stock' result in 3DM11. Complete BS.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Holy crap, finally! Take that eyefinity!


  • Registered Users Posts: 2,315 ✭✭✭deceit


    Serephucus wrote: »
    Holy crap, finally! Take that eyefinity!

    Nvidia said that a few months back?
    It's the only reason I've been contemplating buying one. I thought this was common knowledge now?
    Must not have been widely publicised back then.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Didn't hear anything about it. Mind you at that point I wasn't really paying attention to industry news.


  • Registered Users Posts: 425 ✭✭Surfn


    It looks like TTL has a Kepler on his test bench.


  • Registered Users Posts: 4,983 ✭✭✭Tea_Bag


    Surfn wrote: »
    It looks like TTL has a Kepler on his test bench.
    who?


  • Registered Users Posts: 2,315 ✭✭✭deceit


    Tea_Bag wrote: »
    who?
    Tinytomlogan, makes video reviews for Overclock3D on YouTube. Mostly he's a really good reviewer.


  • Registered Users Posts: 425 ✭✭Surfn


    Tea_Bag wrote: »
    who?

    YouTube, timetolivecustoms. This guy rocks when it comes to hardware. You can't see the card, but he's letting people know in his own kind of way that it's hidden behind the bag on his test bench.


  • Registered Users Posts: 2,315 ✭✭✭deceit


    Surfn wrote: »
    YouTube, timetolivecustoms. This guy rocks when it comes to hardware. You can't see the card, but he's letting people know in his own kind of way that it's hidden behind the bag on his test bench.

    Yeah, he's really good, but in about 1 in 10 videos he just becomes an asshole :P


  • Registered Users Posts: 4,983 ✭✭✭Tea_Bag


    deceit wrote: »
    Tinytomlogan, makes video reviews for Overclock3D on YouTube. Mostly he's a really good reviewer.
    All these abbreviations you cool guys use - I haven't a clue about them.


    He's a good reviewer, but I can't listen to him for long, because his voice annoys me :o


  • Registered Users Posts: 1,600 ✭✭✭Eboggles


    His videos are a bit long - he drags them out longer than they need to be; most could be done in 25-30 minutes. Saying that, though, he is a good reviewer. I really hope Kepler (the 680 in particular) does well, as the prices of the Radeons should drop accordingly, and I will soon be in the market for something with 7970-style grunt :D.


  • Registered Users Posts: 22,929 ✭✭✭✭ShadowHearth


    Damn. Since I discovered him, he is the only reviewer on YouTube I watch. His reviews are great and he is pretty damn fun at times. He looks at products from an average person's point of view - how well an item is priced for its performance, etc.

    I really hope they will let reviews out before the 22nd of March though :(

