
Nvidia: DX11 will not sell new cards.


Comments

  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    AMD confirms DirectX 11 games: Battleforge, Stalker: Call of Pripyat, Dirt 2 and Alien vs. Predator

    AMD offers some insight into DirectX 11-powered games in a recent blog post on their website. According to AMD, Battleforge, Stalker: Call of Pripyat and Dirt 2 will feature DX11, giving their Radeon 5000 series serious advantages this year.
    During the launch of ATI Eyefinity, AMD also talked about games coming out in the next few months powered by DirectX 11. There were a lot of rumours surrounding the DX11 games list, and until now only Dirt 2 was a serious contender. Now AMD has lifted the curtain on:
    • Battleforge (EA) - around September/October 2009
    • Stalker: Call of Pripyat (GSC) - November 2009
    • Dirt 2 (Codemasters) - December 2009

    Also, it's rather obvious that the fourth game in the DX11 series will be Alien vs. Predator (beginning of 2010) as the Rebellion developers make an appearance in a DX11 video from AMD.

    Link


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Yeah, and Blizzard confirmed a DX10-ready StarCraft 2.... in 2007. It's still without a release date.

    You'd be retarded to jump on DX11 early. I'm a damn fool for being insistent on DX10. I don't even use it. It brings the system to its knees, even though my hardware supports it.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Overheal wrote: »
    You'd be retarded to jump on DX11 early. I'm a damn fool for being insistent on DX10. I don't even use it. It brings the system to its knees, even though my hardware supports it.

    I've decided I am. I'm buying a 5870, but not for DirectX 11; I'm buying it because I want to go back to a single-card solution from my X2, as the main game I play (Arma II) doesn't support Crossfire well. It's not an expensive upgrade for me anyway; it could be free, as I'll probably get near the value of the 5870 for the X2.

    I wouldn't call you an enthusiast :p, loads of people on here can max Crysis in DirectX 10.


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Staying ahead of the hardware curve is expensive and bothersome. Have fun.


  • Registered Users Posts: 1,864 ✭✭✭uberpixie


    Overheal wrote: »
    Staying ahead of the hardware curve is expensive and bothersome. Have fun.

    That it is.... Still, if anything it makes the current gen cheaper; I doubt I'll be replacing my poor 4870 for another year yet, unless the new ATI cards give some sort of magic 200% increase in performance :pac:.

    Still, Nvidia spouting that DX11 won't drive sales sounds like sour grapes, and like they don't have it working yet.....

    If they had a DX11 solution ready, they would be singing about DX11 support along with their CUDA, GPGPU processing, PhysX, etc....


  • Registered Users Posts: 5,560 ✭✭✭Slutmonkey57b


    Overheal wrote: »
    Yeah, and Blizzard confirmed a DX10-ready StarCraft 2.... in 2007. It's still without a release date.

    Blizzard's games never have a release date, n00b. ;)


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    AMD respond to NVIDIA's tough Radeon HD 5800 questions
    Why is AMD focusing so much on DX11 when most games are on DX9 console ports?

    Today and over the life of the Radeon HD 5000 series, dozens of triple-A titles will leverage DirectX 11 features. If NV were able to produce a DirectX 11 card today, they'd be delivering a much different narrative on DirectX 11 to the press. If NV really believes that DirectX 11 doesn't matter, then we challenge them to say that publicly, on the record.


    Aren't they punishing PC gamers by pushing out the schedule of PC titles such as Dirt 2 in order to support DX11?

    Proprietary standards punish gamers, not industry standards like DirectX 11. Why is NVIDIA punishing gamers by pushing proprietary, closed standards like PhysX into games?


    When are GPU-accelerated Havok titles going to ship? Do they have a list of games that will support Havok?

    PhysX has been around for years and years, but today, GPU-accelerated PhysX titles are still in the single digits. The physics experiences that many of those titles delivered have disappointed gamers and were widely panned by the press worldwide. GPU accelerated game physics will only be accepted in the marketplace when industry standards are embraced.


  • Registered Users Posts: 2,888 ✭✭✭Rsaeire


    Nvidia simply can't match AMD's latest graphics offering, so they've resorted to acting like a petulant child, basically saying that they don't want to play and giving a lame reason why.

    I'll be interested to see what graphics cards Nvidia eventually release, and I sincerely hope it's not another batch of rebadged cards sold as new technology, as if they won't get found out anyway.


  • Registered Users Posts: 5,560 ✭✭✭Slutmonkey57b


    Since they've already displayed a faked card, it's not looking good for them.


  • Registered Users Posts: 2,888 ✭✭✭Rsaeire


    Since they've already displayed a faked card, it's not looking good for them.

    I missed that this week actually; just read the story now. Like I said, I was wondering what they'd come out with, but nothing prepared me for their dummy card!

    Nvidia should spend less time having a war of words with Intel and more time doing what they've done best in the past: making good consumer graphics cards that make money.


  • Registered Users Posts: 5,560 ✭✭✭Slutmonkey57b


    That's their problem: the GT200 is essentially being sold at cost or at a loss, yields aren't good enough even on die-shrunk G92 parts, the chipset business is ****ed (just as well for consumers, as their chipsets have been unreliable, untrustworthy junk for years), and GT300 is very late, doesn't work, can't be manufactured, and will sell at a massive loss assuming they can even get cards out the door. That's the reason for the focus on "supercomputing": Tesla cards featuring the same chip can be sold for $4,000 instead of $400.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    That's the reason for the focus on "supercomputing": Tesla cards featuring the same chip can be sold for $4,000 instead of $400.

    I'd argue it's the other way around. I've worked with CUDA and PhysX, and found them to be good technologies, accessible, and documented far better than those 'industry standards' MS always come up with. They made the mistake of putting all their eggs into GPGPU, which hasn't kicked off quite like they wanted it to, due to developers' failure to adapt to parallel programming practices (much like the Cell). All the petulance is simply fear at their own lack of ideas.

    'Tis a pity though. I've always preferred Nvidia's offerings, but it looks like my GTX285 will be the last Nvidia card I buy for a while. I bought it so I could develop for CUDA, but frankly I found the i7 much easier to work with, so I hardly go near CUDA any more. Seeing as CUDA's the only thing in their current line-up that justifies the price, it'll be ATI for me till they sort themselves out.

    Funny thing is, Larrabee could kill both ATI and Nvidia stone dead if Intel get it right. The GPU war is getting all the more interesting.


  • Registered Users Posts: 2,888 ✭✭✭Rsaeire


    I'd argue it's the other way around. I've worked with CUDA and PhysX, and found them to be good technologies, accessible, and documented far better than those 'industry standards' MS always come up with. They made the mistake of putting all their eggs into GPGPU, which hasn't kicked off quite like they wanted it to, due to developers' failure to adapt to parallel programming practices (much like the Cell). All the petulance is simply fear at their own lack of ideas.

    I agree with your comment regarding the Cell. It appears that Toshiba is utilising the processor more than any other company, with their new range of 4K LCD TVs. The Cell really is capable of so much, and it's sad to see it being underutilised.
    Funny thing is, Larrabee could kill both ATI and Nvidia stone dead if Intel get it right. The GPU war is getting all the more interesting.

    If Intel get it right, it will mean a better marketplace for us as consumers; all anyone need do is look at the constant one-upmanship between ATI and Nvidia to see how we benefit.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    Was this posted already?
    NVIDIA IS KILLING the GTX260, GTX275, and GTX285, with the GTX295 almost assured to follow, as it (Nvidia: NVDA) abandons the high- and mid-range graphics card market. Due to a massive series of engineering failures, nearly all of the company's product line is financially under water, and mismanagement seems to be killing the company.

    Not even an hour after we laid out the financial woes surrounding the Nvidia GTX275 and GTX260, word reached us that they are dead. Normally, this would be an update to the original article, but this news has enough dire implications that it needs its own story. Nvidia is in desperate shape, whoop-ass has turned to ash, and the wagons can't be circled any tighter.

    Word from sources deep in the bowels of 2701 San Tomas Expressway tells us that the OEMs have been notified that the GTX285 is EOL'd, the GTX260 will be EOL'd in November or December depending on a few extraneous issues, and the GTX275 will be EOL'd within two weeks. I would expect this to happen around the time ATI launches its Juniper-based boards, so before October 22.

    The lone survivor, maybe, is the GTX295, available only as a complete board from Nvidia. This is said to be almost impossible to get, likely for the reason we went into in the earlier financial article: cost. AIBs do not expect this product to last very long, likely only until current stock is depleted.

    Basically, engineering failure after engineering failure has left Nvidia without a high end part. It is left waving shells while trying desperately to convince the loyal press that it is real. While that is a problem for the future, the current concern is that it has nothing that can compete with ATI's Evergreen line, HD5870, HD5850, and the upcoming Junipers.

    The G200b based parts can compete on performance, but not at a profit, so they are going to die. Nvidia was booted out of the high end market, and is now abandoning the mid range in a humbling retreat. Expect a similar backpedaling from the rest in January when Cedar and Redwood come out.

    There are no half or quarter Fermi derivatives taped out yet, so at a bare minimum, Nvidia has nothing for two more quarters. To make matters worse, due to the obscene 530+ mm^2 die size on TSMC's 40nm process, Fermi is almost twice the size of its competitor, Cypress/HD5870/HD5850. A cut-down half version would cost less but still be barely competitive with Juniper. That chip would once again be vastly larger and more expensive than the ATI equivalents, and that is before board costs are examined. As the product stack waterfalls down, the ratios remain the same; Nvidia cannot be cost competitive for the Evergreen vs Fermi generation, period.

    Massive engineering failures and cover-ups, starting with Bumpgate, have defined the company for the last two years. More recently, this includes the G212 failure, the G214 fiasco and failure, now morphed into the G215, which is three quarters late so far, if it can ever be made profitably, and the G216 and G218 with the broken GDDR5 controllers. One or two failures are understandable; this many is flat-out mismanagement.

    As we have been saying all along, there is no savior chip, no plan B, they all failed. Nvidia can make chips and sell them at a loss, or retreat from those markets and lose less money. The only question now is whether or not it can fix its engineering problems and get competitive parts out before it runs out of cash. Given that the earliest that this can happen is next summer, it will be very touch and go.

    Nvidia has alienated anyone who could be its friend, spawned needless lawsuits that very likely drive the company to a net negative value, and failed to sell the company while it still had a perception of worth. If you ask it, Nvidia will tell you that it is boldly turning itself into a vibrant GPU compute and cell phone chip giant. Should you want to remain in its good graces, this is not to be perceived as an exit strategy.

    With the cancellation of the GTX285, GTX275, and GTX260, possibly the GTX295 too, Nvidia is abandoning the entire high end and mid-range graphics market. Expect a reprise in January on the low end. The company is badly mismanaged and hated by the very partners it needs to throw it a life preserver.

    The only thing that can save Nvidia now is a wholesale replacement of top management. Sadly, the only people with enough shareholder leverage to do so are those very managers that need to go, so that is very unlikely. Unless there is a white knight or buyer in the wings, it is game over. At $1 per year, Jen-Hsun is overpaid.

    Seems they are in a much worse position than anyone realised, although the accuracy of this article is questionable.


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Good riddance. Nvidia had its day, and there are more than two GPU manufacturers out in the wild.


  • Closed Accounts Posts: 12,401 ✭✭✭✭Anti


    Overheal wrote: »
    Good riddance. Nvidia had its day, and there are more than two GPU manufacturers out in the wild.


    There are lots, but only ATI and Nvidia put out cards people want, ATI more now than ever. Look back over the last... oh, let's go back to 2002/2003. Back then Nvidia reigned supreme with its Ti 4200 and Ti 4600. ATI was nowhere to be seen until 2004 with its 9500 and 9800 Pro, and then the ATI/Nvidia fanboi **** started with the launch of the FX 5800 (anyone apart from me own this feckin hoover of a thing?)

    Ever since then it's been a tug of war over cards: X850 XT PE.... 6800 GTX. ATI X1900, Nvidia 7800/7900, ATI 2900, and a 2900 X2. Nvidia came back with the 7950 GX2, and then we got the 8800 GTX, which obliterated all for a long time, and now we have the 5870 and 5850: cooler, quieter and a lot faster.

    It's always been like this, even as far back as the Matrox Mystique and Matrox Millennium in the '90s, although Matrox did the clever thing and bowed out gracefully. Then Nvidia came in with the NV1, ATI with the Rage and S3 with its ViRGE. Then came the Voodoo Rush and Voodoo2 chipsets...

    Wait... where was I going with this?


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    This whole thing gets even crazier. Now Nvidia's PR guys are telling bare-faced lies on TweakTown:
    We are confident that when the Fermi-based GeForce products ship we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next-gen games like Batman: Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards.

    Link

    ..........Cough..........

    [attached image: 20101.png]


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Keyword: with physics.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Overheal wrote: »
    Keyword: with physics.

    What physics? There's no GPU physics with ATI, so it wins. These effects could easily have been done on the CPU; I've a quad core, and 2-3 of my cores just sit around with nothing to do during most games. Give them something to do instead of putting pressure on my GPU. Look at Crysis: the physics effects, demolishable buildings and explosions in that were unreal. Much better than a few papers & shít blowing around in Batman AA. Nothing wrong with physics on the CPU; in fact that's where we should be going, with future CPUs having more cores.
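
    To make the idea concrete, here is a minimal sketch (nothing to do with Batman AA's actual code; the Particle struct, numbers and function names are made up for illustration) of farming a simple per-frame particle update out to a couple of otherwise idle CPU cores with std::thread:

        #include <functional>
        #include <thread>
        #include <vector>

        struct Particle { float x, y, z, vx, vy, vz; };

        // Integrate one contiguous slice of the particle array on a single core.
        static void update_slice(std::vector<Particle>& p, std::size_t begin,
                                 std::size_t end, float dt) {
            for (std::size_t i = begin; i < end; ++i) {
                p[i].vy -= 9.81f * dt;      // gravity
                p[i].x  += p[i].vx * dt;
                p[i].y  += p[i].vy * dt;
                p[i].z  += p[i].vz * dt;
            }
        }

        // Split the per-frame physics work across the spare cores.
        static void update_particles(std::vector<Particle>& p, float dt,
                                     unsigned spare_cores) {
            std::vector<std::thread> workers;
            const std::size_t chunk = p.size() / spare_cores;
            for (unsigned w = 0; w < spare_cores; ++w) {
                std::size_t begin = w * chunk;
                std::size_t end   = (w + 1 == spare_cores) ? p.size() : begin + chunk;
                workers.emplace_back(update_slice, std::ref(p), begin, end, dt);
            }
            for (auto& t : workers) t.join();
        }

        int main() {
            std::vector<Particle> debris(50000);        // e.g. papers blowing around
            update_particles(debris, 1.0f / 60.0f, 3);  // 3 cores that would otherwise idle
        }

    Each core gets its own slice, so there is no locking; that only works because these example particles don't interact with each other.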


  • Registered Users Posts: 5,560 ✭✭✭Slutmonkey57b


    What that link doesn't say is that part of Batman's core graphics and physics engine code was done solely by Nvidia. Not by the developer or the publisher. Not "technical support". Nvidia went in and actually coded it for the developer, and openly borked it on ATI cards: disabled support for physics if ATI cards were detected, switched off AA support on ATI cards.... The developer should be flamed to the ground for this: taking money off NV so that NV can deliberately break a game for 50% of consumers, for no reason other than to make their own unsellable, soon-to-be-discontinued parts look better against the competition. Flagrant abuse.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    PogMoThoin wrote: »
    These effects could easily have been done on the CPU;

    It'd be very difficult to match PhysX on the CPU, even on the likes of an i7. Basically, GPGPU is much, much more efficient in terms of thread creation, so you can literally create thousands of independent threads at a time. That's what cloth and fluid simulations depend on: a single lightweight thread per particle. You simply cannot get the same efficiency on a CPU for those kinds of effects.

    The other thing is, the way CUDA operates, the running threads are constantly switched out to hide memory latency. Hyper-Threading on CPUs does this to a degree, but it's nowhere near as efficient. A CPU parallelisation strategy is only going to beat GPGPU if you're reading and writing the same small set of shared memory repeatedly, so that the data can sit in the CPU cache. The GPU is much more effective for operating on independent data sets to produce a single result, e.g. particle physics.

    I'm not exaggerating btw: for certain highly parallel algorithms, like the Fast Fourier Transform, it's possible to achieve near-100x speedups using the GPU as opposed to the CPU.
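
    For comparison, a minimal sketch of the one-lightweight-thread-per-particle approach described above, written as a CUDA kernel (the Particle struct, kernel name and launch sizes are illustrative only, not taken from PhysX or any shipping game):

        #include <cuda_runtime.h>

        struct Particle { float x, y, z, vx, vy, vz; };

        // One GPU thread per particle; the hardware keeps thousands of these in
        // flight at once and swaps warps while they wait on memory.
        __global__ void update_particles(Particle* p, int n, float dt) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            p[i].vy -= 9.81f * dt;   // gravity
            p[i].x  += p[i].vx * dt;
            p[i].y  += p[i].vy * dt;
            p[i].z  += p[i].vz * dt;
        }

        int main() {
            const int n = 1 << 20;                      // ~1 million particles
            Particle* d_p;
            cudaMalloc(&d_p, n * sizeof(Particle));
            cudaMemset(d_p, 0, n * sizeof(Particle));

            const int threads = 256;
            const int blocks  = (n + threads - 1) / threads;
            update_particles<<<blocks, threads>>>(d_p, n, 1.0f / 60.0f);
            cudaDeviceSynchronize();

            cudaFree(d_p);
            return 0;
        }

    The per-particle maths is the same as a CPU version would use; the difference is purely how many of those updates can be in flight at once.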


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Admittedly, high-end physics formulas are quite math-heavy, and you can't beat having a dedicated pipeline geared specifically for that.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    Overheal wrote: »
    Admittedly, high-end physics formulas are quite math-heavy, and you can't beat having a dedicated pipeline geared specifically for that.

    It's not simply the fact that it's math-heavy though; it's the nature of the math. It's just a happy coincidence that most of the math behind interactive applications can be readily and easily parallelised. We're only just beginning to exploit this.

    It won't be long before we start seeing crowd simulations and other multi-agent AI applications shoved onto the GPU too, possibly even physically based audio synthesis, if it ever gets off the ground.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    It'd be very difficult to match PhysX on the CPU, even on the likes of an i7. Basically, GPGPU is much, much more efficient in terms of thread creation, so you can literally create thousands of independent threads at a time. That's what cloth and fluid simulations depend on: a single lightweight thread per particle. You simply cannot get the same efficiency on a CPU for those kinds of effects.

    The other thing is, the way CUDA operates, the running threads are constantly switched out to hide memory latency. Hyper-Threading on CPUs does this to a degree, but it's nowhere near as efficient. A CPU parallelisation strategy is only going to beat GPGPU if you're reading and writing the same small set of shared memory repeatedly, so that the data can sit in the CPU cache. The GPU is much more effective for operating on independent data sets to produce a single result, e.g. particle physics.

    I'm not exaggerating btw: for certain highly parallel algorithms, like the Fast Fourier Transform, it's possible to achieve near-100x speedups using the GPU as opposed to the CPU.

    Do we really need closed-source APIs like CUDA and PhysX? Nvidia have manipulated and cheated (and even lied) with Batman AA: no PhysX if an ATI card is present as the primary adapter, and even people with an older Ageia PPU and an ATI GPU are blocked from PhysX, not to mention completely manipulating the game to run better on Nvidia cards.
    From what I've seen, the PhysX effects in Batman AA are not all that amazing compared to what's already available through Havok, and ATI have said that some of this can be GPU-accelerated. Havok is hardware-agnostic and can run on every PC:
    http://www.havok.com/index.php?page=showcase



    The time may be here for a dedicated PPU, but I feel it need not be tied to one or other of the main market players.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    PogMoThoin wrote: »
    Do we really need closed-source APIs like CUDA and PhysX?

    No, we don't. It's just the reality that Nvidia have been pushing GPGPU for a long time and are ahead of ATI in this regard. Even their architecture is more geared towards GPGPU than ATI's, because of the R&D they've been putting in.

    Happily though, DirectX 11 introduces DirectCompute, which is the first GPGPU API not locked to a particular hardware set. Hopefully DirectCompute (which is basically an MS knockoff of CUDA) will become the widespread GPGPU API for games and relegate CUDA to supercomputing only.
    PogMoThoin wrote: »
    The time may be here for a dedicated PPU, but I feel it need not be tied to one or other of the main market players.

    I agree with all you've said there. I was just pointing out that those Batman effects simply could not be performed as efficiently on the CPU. (Notice there's no cloth or fluid simulation in that Havok demo.)


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    So, do you think game developers are going to embrace DirectX 11? Most of them are stuck on DirectX 9 with no plans to move.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    PogMoThoin wrote: »
    So, do you think game developers are going to embrace DirectX 11? Most of them are stuck on DirectX 9 with no plans to move.

    Hehe, that's always the question, isn't it? The technology is there, the results have been shown. Meh...


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    The next six months of PC sales will be a determining factor, thanks in large part to the Christmas season, post-recession spending habits, and Windows 7 - with more people jumping from XP onto DX10- and DX11-capable OSes.


  • Closed Accounts Posts: 8,983 ✭✭✭leninbenjamin


    Overheal wrote: »
    The next six months of PC sales will be a determining factor, thanks in large part to the Christmas season, post-recession spending habits, and Windows 7 - with more people jumping from XP onto DX10- and DX11-capable OSes.

    Heh, Nvidia made the mistake of thinking it's determined by PC sales. At the end of the day, PogMoThoin's on the right track: it's determined more by developers' openness to and acceptance of the technologies. Sure, the only consistent determinant of a platform's success is the games line-up.


  • Registered Users Posts: 82,405 ✭✭✭✭Overheal


    Mmm. I am forgetting the PS3 fiasco. Nobody wanted to touch that thing. Some still don't.

