Just wondering, what would AMD need to do to at least MATCH Nvidia's offering in AI/DLSS/ray tracing tech?
FSR 3! They need to try hard to get ray tracing into their GPUs.
They can’t. Nvidia is too big. If AMD invested as much as Nvidia does and it didn’t pay off within a few years, they would go bankrupt.
Do people think they can’t catch up? Remember Ryzen?
They can if they start taking their GPU division seriously, in terms of R&D and units produced, which they are not.
Yes, I remember Ryzen. Tell me more about how it took, what, 4 generations to best Intel’s Skylake++++++++ in gaming?
Ryzen caught up first and foremost because Intel stalled, and Intel’s stall was against TSMC’s process more than it was against AMD.
The opposite is also true. In terms of CPU design itself, which is what AMD actually does, they have been able to match Intel’s offering ever since. It could’ve been a flop if TSMC hadn’t delivered, but the Ryzen architecture (which is what we’re talking about in this thread: design) was up to the level of their competitor after lagging behind for about half a decade.
So, I insist: with enough R&D they’d be able to do something similar on the GPU side of things.
No. At 4K, FSR Quality maybe looks only as good as DLSS Performance. That effectively makes a 7900 XTX only as fast as a 3080.
And they need an answer to DLSS.
It will take a while, but ray tracing will become as low-cost and standard as anti-aliasing; probably by around 2040 they will both have indistinguishable ray tracing performance. By that time games will be completely AI-designed and generated using things like Gaussian splatting, and AMD will completely miss the Gaussian splatting development wave because it’s a generalist company and its graphics department is subservient and less well managed.
Of course, they could. Their hardware isn’t that bad; they are closer than anybody else. Their software stack is another story. AMD has been promising to do a better job at that for more than a decade. I don’t really trust their commitment to their software stack anymore. Actually, Intel might overtake them in that regard.
Intel is actually closer than AMD. Apparently so is Apple.
Competition drives innovation, so AMD catching up to Nvidia would only make both better. I’m hopeful they rise to the challenge for the benefit of us all.
No, because Nvidia has software companies on their side, and that proprietary software is simply more developed than AMD’s.
Yes, because what Nvidia is doing isn’t super special. Of course AMD will have an equivalent or better solution, so the question really should be “how many years behind will AMD be?”
They closed the gap significantly in raster perf. Power efficiency is pretty close, and so is area efficiency. AI is mostly a software problem, and AMD aren’t blind to this; they are very clearly investing a ton more into software to close this gap. (They just bought Nod.ai and absorbed all their talent.)
The hardware is arguably better in many aspects. MI200 and MI250 are HPC monsters, and MI300 is a chiplet packaging masterpiece that has HPC performance on lockdown.
There’s a reason that no new HPC supercomputers are being announced with Nvidia GPUs.
Nvidia has the lead in AI, AMD has the lead in HPC. Nvidia has the lead in area efficiency, AMD has the lead in packaging expertise (which means they can throw a ton more area at the problem at the same cost as Nvidia).
The same could be said of AMD vs. Intel in 2015, and yet here we are.
Ain’t that the truth. I remember reading a bunch of obituaries for AMD around that timeframe. Dr. Su is a very smart person, and AMD has proven that they have the technical chops to change the game. Anyone who loves computer hardware should be rooting for whoever is losing to win, because it drives down prices and pushes competitors to innovate. Look at Intel’s pricing since Ryzen dropped. If AMD could manage to beat Nvidia outright for one generation in GPUs, it would benefit all consumers.
People really underestimate how big Nvidia is.
Can they? Yes!
Will they? Probably not. Nvidia has more software developers, more industry support, an already established technology, more money… I don’t see it coming. How long did it take for AMD to catch up to Nvidia with tessellation? My Google search pulls up threads from 8 years ago (2015) discussing whether AMD had caught up. The first game to use it was Messiah from 2000. Tessellation personally caught my attention in 2011 with Batman: Arkham City.
With pressure from Sony/MS, I could possibly see AMD trying to ramp up RT performance or risk losing two lucrative partners.
I think it highly depends on what patents Nvidia holds. Patents hold technology back, and one of the major obstacles for companies is finding ways around the patents other companies hold.