

Posted
7 hours ago, davidwhangchoi said:

MicroCenter lists Radeon RX 9070 series: RX 9070 XT starting at $699, RX 9070 at $649

Quote

Once again, let's remind everyone that AMD has not yet confirmed pricing to the media. All we have for now is speculation and second-hand information from board partners. Naturally, some will provide early info to retailers, and larger chains like Micro Center may well have final pricing before launch. However, even the example above shows the disparity between models.

AMD (and Nvidia, before RTX) have been known to alter prices, sometimes right up to the day before their announcements, so let's not jump on this yet.

Posted
9 hours ago, azrael said:

AMD (and Nvidia, before RTX) have been known to alter prices, sometimes right up to the day before their announcements, so let's not jump on this yet.

AMD hasn't set a suggested price yet. GN put up an "old man yells at cloud" video, inspired partially by people in the Radeon division asking journalists such as him what price they should be setting, because they don't have one and their homework is due on the 28th.

 

This is weirder than nVidia's thing, where the board manufacturers learn what suggested retail is at the same time we do. nVidia knows but doesn't tell anyone, AMD just doesn't know.

Posted

So the 9070 and 9070 XT are actually going to be $549 and $599 (assuming you can get them at MSRP, though it does sound like AMD is working to have a lot more stock than Nvidia). Still concerned about AMD's ray tracing performance and FSR lagging behind Nvidia and DLSS, but I'm eagerly awaiting real benchmarks and proper reviews.

Posted
1 hour ago, mikeszekely said:

So the 9070 and 9070 XT are actually going to be $549 and $599 (assuming you can get them at MSRP, though it does sound like AMD is working to have a lot more stock than Nvidia). Still concerned about AMD's ray tracing performance and FSR lagging behind Nvidia and DLSS, but I'm eagerly awaiting real benchmarks and proper reviews.

It's great news that AMD is going in the direction of normal GPU pricing. I'm excited about getting a PC again.

Hopefully AMD's ray tracing for these cards is a sufficient gain over last gen.

Posted
2 hours ago, mikeszekely said:

So the 9070 and 9070 XT are actually going to be $549 and $599 (assuming you can get them at MSRP, though it does sound like AMD is working to have a lot more stock than Nvidia). Still concerned about AMD's ray tracing performance and FSR lagging behind Nvidia and DLSS, but I'm eagerly awaiting real benchmarks and proper reviews.

AMD will never top Nvidia's RT and DLSS performance. Will it be better than AMD's previous generation? Hopefully.

As for pricing...🤞The hope is to get equal or better performance at that price vs the -70-class cards. It's possible the pricing at Micro Center is true, since AIBs will add a $50-100 charge on top of MSRP. Unfortunately, we can only wait and see.

Posted (edited)
20 minutes ago, azrael said:

AMD will never top Nvidia's RT and DLSS performance. Will it be better than AMD's previous generation? Hopefully.

As for pricing...🤞The hope is to get equal or better performance at that price vs the -70-class cards. It's possible the pricing at Micro Center is true, since AIBs will add a $50-100 charge on top of MSRP. Unfortunately, we can only wait and see.

Based on what I've seen, the performance of my 7900 XT, and the price point, my expectation is that we'll be pleasantly surprised.

You're not going to see bleeding edge performance, but for resolutions up to 4k, in most cases, it'll be good enough for the mainstream gamer.

Edited by Test_Pilot_2
Posted
4 hours ago, azrael said:

AMD will never top Nvidia's RT and DLSS performance. Will it be better than AMD's previous generation? Hopefully.

Yeah, I don't expect them to even match DLSS 3, let alone DLSS 4. I'm just hoping they've improved enough that I don't look at AMD's RT and FSR and immediately say, "Eww, I'll pay extra for an RTX 5070 Ti, thanks anyway."

Posted (edited)

https://videocardz.com/newz/amd-confirms-radeon-rx-9070-xt-is-42-faster-than-rx-7900-gre-at-4k

AMD shares official performance estimates. 

As we wait for third-party independent testing, we can at least get a range from their claims.

They have Ultra settings and Max settings listed:

RX 9070 XT at 4K Max settings: 26% faster than the RTX 3090

RX 9070 at 4K Max settings: 26% faster than the RTX 3080

 

No mention of FSR or ray tracing being on or off in their comparisons, but the fact that they aren't even comparing consistently (edit: bouncing between Max and Ultra settings), and are comparing against older 3000-series Nvidia cards, is an indicator of so-so performance.

 

 

Edited by davidwhangchoi
Posted (edited)

They have a slide comparing 9070 XT to 5070 Ti, for a simpler comparison.

For whatever reason, it seems to have flown under the radar despite showing up in the official presentation. As usual, take first-party claims with a grain of salt (video timestamped, but 27:00 if not):

  • -2% 4K performance
  • -23% price, +23% "performance per dollar"
  • +2% 4K performance if OCed
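The price and perf-per-dollar figures on that slide can be roughly sanity-checked. This is a sketch assuming launch MSRPs of $599 (9070 XT) and $749 (5070 Ti), which come from launch coverage, not from the slide itself:

```python
# Sanity check of AMD's "+23% performance per dollar" claim.
# Assumed MSRPs (not from the slide): $599 for the 9070 XT, $749 for the 5070 Ti.
xt_price, ti_price = 599.0, 749.0
xt_perf = 0.98  # slide claims -2% 4K performance vs the 5070 Ti (normalized to 1.00)

# Performance per dollar, relative to the 5070 Ti
perf_per_dollar_ratio = (xt_perf / xt_price) / (1.0 / ti_price)
print(f"{(perf_per_dollar_ratio - 1) * 100:.1f}%")  # ≈ +22.5%, close to the claimed +23%
```

At those assumed prices the numbers hang together: roughly equal performance at a 20% lower price works out to about +23% performance per dollar.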
Edited by kajnrig
Posted
10 minutes ago, kajnrig said:

They have a slide comparing 9070 XT to 5070 Ti, for a simpler comparison.

For whatever reason, it seems to have flown under the radar despite showing up in the official presentation. As usual, take first-party claims with a grain of salt (video timestamped, but 27:00 if not):

  • -2% 4K performance
  • -23% price, +23% "performance per dollar"
  • +2% 4K performance if OCed

-2% 4K performance
+2% 4K performance if OCed

🤨That's practically margin-of-error numbers there....

Posted
1 minute ago, azrael said:

-2% 4K performance
+2% 4K performance if OCed

🤨That's practically margin-of-error numbers there....

Yep. The specific graphs do show a disparity between raster and RT, so that's something to keep in mind, but essentially it's a 5070 Ti for $150 less... assuming both at MSRP.

If 5070 Ti prices don't drop in response, and 9070 XT prices stay anywhere near MSRP for any significant length of time, the difference will only be that much starker. What's the average price of the 5070 Ti right now, $900? $1000?

Posted
20 minutes ago, kajnrig said:

If 5070 Ti prices don't drop in response,


I'm pretty sure Nvidia doesn't care about lowering prices.

Posted
53 minutes ago, azrael said:

-2% 4K performance
+2% 4K performance if OCed

🤨That's practically margin-of-error numbers there....

Pr'y much. nVidia said even -5% was nothing to worry about; it didn't even warrant telling manufacturers they'd sent out defective chips.

Posted

Let's say, for the sake of argument, AMD isn't BS-ing and the 9070 XT is the same as the RTX 5070 Ti at rendering frames at 1440p with no upscaling or ray tracing. That's good and all... but upscaling and RT matter to me. So I still kind of need to know: how much worse is the 9070 XT at ray tracing? How much worse does the latest version of FSR (FSR 4) look compared to DLSS? What kind of framerates are both cards getting with RT on and upscaling set to Quality? Ultimately, to me, the RTX 5070 Ti might be worth $150 more (assuming it's eventually possible to get one at or near MSRP later in the year).

Posted
2 hours ago, mikeszekely said:

Let's say, for the sake of argument, AMD isn't BS-ing and the 9070 XT is the same as the RTX 5070 Ti at rendering frames at 1440p with no upscaling or ray tracing. That's good and all... but upscaling and RT matter to me. So I still kind of need to know: how much worse is the 9070 XT at ray tracing? How much worse does the latest version of FSR (FSR 4) look compared to DLSS? What kind of framerates are both cards getting with RT on and upscaling set to Quality? Ultimately, to me, the RTX 5070 Ti might be worth $150 more (assuming it's eventually possible to get one at or near MSRP later in the year).

FSR will be significantly inferior to DLSS 4, and ray tracing will be weaker than a 5070 Ti, but hopefully it's decent in performance standing on its own.

I think I'll get a 9070 XT if I want to pay around $600; if I want to pay closer to $1,000, I'll scale up and go hunt for a 5080 and pay $1,100+. I had some 5080s at $999-1,079 sitting in my cart for 30 minutes on launch day but passed.

Based on history, don't expect the 5070 Ti to come down until a Super variant is released, and that's if it's even at the same price point. Nvidia is not in a hurry to make more and will let the AIBs continue the pricing they have.

I do believe Nvidia will begin to push out the 5060 and 5070 cards in May/June, and there will be ample supply of them. It's their ploy to get those who missed the 5070 Ti to grab a 5070.

Posted
6 hours ago, mikeszekely said:

...but upscaling and RT matter to me. So I still kind of need to know: how much worse is the 9070 XT at ray tracing? How much worse does the latest version of FSR (FSR 4) look compared to DLSS? What kind of framerates are both cards getting with RT on and upscaling set to Quality? Ultimately, to me, the RTX 5070 Ti might be worth $150 more (assuming it's eventually possible to get one at or near MSRP later in the year).

With more AAA games requiring RT for even minor scene rendering, this is where I'm in agreement with you: paying the Nvidia tax for a 5070 Ti is probably going to be worth it. 🤷‍♂️ Even if the RX 9070s are price/performance competitive, AMD is still playing catch-up to Nvidia's RT, upscaling, and frame-gen performance.

4 hours ago, davidwhangchoi said:

Based on history, don't expect the 5070 Ti to come down until a Super variant is released, and that's if it's even at the same price point.

Nvidia learned their lesson when the 40-series came out. They will stop producing the previous version when an announcement is coming, causing prices to go up as supply dries up on the previous versions.

4 hours ago, davidwhangchoi said:

I do believe Nvidia will begin to push out the 5060 and 5070 cards in May/June, and there will be ample supply of them. It's their ploy to get those who missed the 5070 Ti to grab a 5070.

The 5070 is due out on March 5th....OOOOOh you mean when there will be actual stock in stores and not a paper launch.....May/June is a reasonable timeframe. The 5060 will probably show up this summer. I mean actual stock in stores. 😝 Assuming they corrected the defective GPUs at the fab already.

Posted
1 hour ago, azrael said:

The 5070 is due out on March 5th....OOOOOh you mean when there will be actual stock in stores and not a paper launch.....May/June is a reasonable timeframe. The 5060 will probably show up this summer. I mean actual stock in stores. 😝 Assuming they corrected the defective GPUs at the fab already.

:lol: yeah, 3/5 is way too soon...

 https://videocardz.com/newz/nvidia-geforce-rtx-5070-and-rtx-5060-mass-production-faces-delays

The 5070 is still a crap card... I'll see on pre-order day if it's easy to obtain, just for kicks.

I'm thinking by the time May/June comes around, the frenzy will die down and people will realize 12GB in 2025 is not really a good deal. I still think the 5080 at MSRP ($999) is not a buy (why I didn't bite). It's just artificially desirable due to the shortage and scalpers. I'd still get a 5090 though, just to resell. lol

 

Posted
7 hours ago, azrael said:

With more AAA games requiring RT for even minor scene rendering, this is where I'm in agreement with you: paying the Nvidia tax for a 5070 Ti is probably going to be worth it. 🤷‍♂️ Even if the RX 9070s are price/performance competitive, AMD is still playing catch-up to Nvidia's RT, upscaling, and frame-gen performance.

Nvidia learned their lesson when the 40-series came out. They will stop producing the previous version when an announcement is coming, causing prices to go up as supply dries up on the previous versions.

The 5070 is due out on March 5th....OOOOOh you mean when there will be actual stock in stores and not a paper launch.....May/June is a reasonable timeframe. The 5060 will probably show up this summer. I mean actual stock in stores. 😝 Assuming they corrected the defective GPUs at the fab already.

On the RT point... my 7900 XT does well enough.

In Avowed with ray tracing on, it runs fine at 4K Ultra and naturally better at 1440p. Just finished it - great game.

If you're chasing absolute framerates, yeah go ham... Thunderdome for a 5090... I'm sure someone will figure out how to SLI them and achieve singularity.

I think for practical purposes the 9070 XT is more than capable. I do think that for AI-based frame-gen handheld PCs, Nvidia is the better choice. They're doing pretty amazing things there, too.

Posted (edited)
19 hours ago, mikeszekely said:

Let's say, for the sake of argument, AMD isn't bs-ing and the 9070 XT is the same as the RTX 5070ti at rendering frames at 1440p with no upscaling or ray tracing. That's good and all... but upscaling and RT matter to me. So I still kind of need to know, how much worse is the 9070 XT at ray tracing? How much worse does the latest Zen 4 version of FSR look compared to DLSS? What kind of framerates both cards getting with RT on and upscaling set to Quality? Ultimately, to me, the RTX 5070ti might be worth $150 more (assuming it's eventually possible to get one at or near  MSRP later in the year).

We'll have answers to all these questions and more once the review embargoes lift (allegedly on the fifth).

 

The only "notable" thing about AMD's benchmarks is they were actual performance, not upscaled from a lower resolution and with frame interpolation turned on. That this is notable is a sad statement on the modern tech market.

Edited by JB0
Posted
12 hours ago, Test_Pilot_2 said:

In Avowed with raytracing on it runs fine at 4k ultra and naturally better at 1440p. Just finished it - great game.

Already?  I thought I was playing it pretty hardcore, I'm just reaching Thirdborn.

7 hours ago, JB0 said:

We'll have answers to all these questions and more once the review embargoes lift (allegedly on the fifth).

Ayup.

7 hours ago, JB0 said:

The only "notable" thing about AMD's benchmarks is they were actual performance, not upscaled from a lower resolution and with frame interpolation turned on. That this is notable is a sad statement on the modern tech market.

I mean, they have to go on actual performance because they're lagging on things like ray tracing and AI upscaling.  But what's sad?  Moore's Law really is dead... we're getting close to the limit of what's physically possible with how chips are manufactured.  Can't keep shrinking the dies or packing in transistors.  You can pump more power in, but an RTX 5090 can already draw nearly 600 watts; you're going to need a dedicated circuit from your breaker and an industrial cooler to draw much more.  Like it or not, if you want more frames, AI upscaling is the future, and Nvidia's just plain better at it.  They've gotten to the point where, unless I'm actively looking for it, I don't notice a difference in quality 99% of the time.

(Not a fan of frame gen, though).

Posted
5 hours ago, mikeszekely said:

Already?  I thought I was playing it pretty hardcore, I'm just reaching Thirdborn.

Ayup.

I mean, they have to go on actual performance because they're lagging on things like ray tracing and AI upscaling.  But what's sad?  Moore's Law really is dead... we're getting close to the limit of what's physically possible with how chips are manufactured.  Can't keep shrinking the dies or packing in transistors.  You can pump more power in, but an RTX 5090 can already draw nearly 600 watts; you're going to need a dedicated circuit from your breaker and an industrial cooler to draw much more.  Like it or not, if you want more frames, AI upscaling is the future, and Nvidia's just plain better at it.  They've gotten to the point where, unless I'm actively looking for it, I don't notice a difference in quality 99% of the time.

(Not a fan of frame gen, though).

I don't consider upscaling or frame interpolation to be worth counting. I only acknowledge them for the sake of mocking them. 

 

While I think the amount of raytracing that a top-end nVidia card can handle counts as a toy and a gimmick, I do consider it actual performance.

If they can ever bring enough power to bear to raytrace a complete scene in realtime, it WILL be a game-changer, but we have decades of rendering techniques that were all developed specifically so we don't have to pay the massive performance cost of raytracing. 

(I actually think nVidia started the raytracing thing specifically to put AMD on the back foot. It came out of nowhere and was something nVidia could do that AMD couldn't.)

 

 

My usage is such that raytracing doesn't matter to me at all, but I do acknowledge it.

My main interest in more power is VR, and it continues to irk me that multi-GPU setups were killed off right as rendering to two screens at once became a thing anyone was doing. 

Posted
7 hours ago, JB0 said:

I don't consider upscaling or frame interpolation to be worth counting. I only acknowledge them for the sake of mocking them. 

The whole idea behind these technologies is to reduce the GPU load spent on rendering the image, especially at higher resolutions. Rendering 8.3 million pixels per frame (i.e., 4K resolution), dozens of times per second, is quite a lot of work for a GPU. Offloading part of that work to dedicated processing units means the GPU's shader cores are free to get on with rendering the next frame.
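The arithmetic behind the "lot of work" point can be spelled out. Note the 8.3 million figure is per frame, not per second; the per-second load depends on framerate, and the 60 fps target below is an illustrative assumption, not something from the post:

```python
# Rough pixel-throughput arithmetic for native 4K rendering.
width, height, fps = 3840, 2160, 60  # 4K UHD; 60 fps is an assumed target

pixels_per_frame = width * height           # 8,294,400 -> the ~8.3 million figure
pixels_per_second = pixels_per_frame * fps  # ~497.7 million pixels shaded per second
print(pixels_per_frame, pixels_per_second)
```

So at a 60 fps target the GPU is shading on the order of half a billion pixels every second, which is why cutting the internally rendered resolution is such an attractive lever.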

8 hours ago, JB0 said:

If they can ever bring enough power to bear to raytrace a complete scene in realtime, it WILL be a game-changer,

Isn't that the point of RT cores? To ray-trace in real time? That's kind of the point of all this: rendering in real time. We're not there yet, but we're getting closer. Rasterization is faster and less taxing, but it's not as accurate or as true to life as ray tracing.

8 hours ago, JB0 said:

(I actually think nVidia started the raytracing thing specifically to put AMD on the back foot. It came out of nowhere and was something nVidia could do that AMD couldn't.)

The road not taken....

8 hours ago, JB0 said:

My usage is such that raytracing doesn't matter to me at all, but I do acknowledge it.

My main interest in more power is VR, and it continues to irk me that multi-GPU setups were killed off right as rendering to two screens at once became a thing anyone was doing. 

There aren't many applications that actually use ray tracing, but it's slowly starting to be used more.

Outside of commercial/datacenter applications, multi-GPU was a pain to support in the consumer space. The time spent trying to make it work exceeded the benefit, so the ROI just wasn't there as GPUs got more and more powerful (and more expensive).

Posted (edited)

^agree with everything above.

Posting this since there's interest in upgrading to an RTX 5080.

 

Dell Alienware - Core Ultra 7 265F RTX 5080 16GB 1TB $2299

Dell Technologies has Alienware Aurora R16 Gaming Desktop on sale for $2299.99 or as low as $2069 for customers whose accounts are eligible to apply the extra 10% off email offers sign-up coupon (see details below). Shipping is free.

 

details and instructions at the link:

https://slickdeals.net/f/18145978-dell-alienware-aurora-desktop-ultra-7-265f-rtx-5080-16gb-ddr5-1tb-ssd-2300-or-less-free-shipping?v=1&src=frontpage

(Just get your own RAM upgrade for that low 16GB.)

Edit: the Amex Premium Business Cash card has 13% off. If somebody can combine that with the 10% off, this is a killer deal.

10% off, 13% Amex, 12% CB

With cashback from Rakuten plus the 10% off Dell coupon, you can get this for under $1,800, which is the best deal around. (Please read the site thread for details on cashback.)

Edited by davidwhangchoi
Posted
4 hours ago, azrael said:

Isn't that the point of RT cores? To ray-trace in real time? That's kind of the point to all this; to render in real-time. We're not there yet, but we're getting closer.


The point is to render a handful of rays for gimmick effects. All evidence suggests that raytracing a scene with dedicated raytracing hardware would require far more power than traditional rasterization, and we already "can't" bring enough power to bear for rasterization.

 

4 hours ago, azrael said:

The whole idea behind these technologies is to reduce the GPU load spent on rendering the image, especially at greater resolutions. 8.3 million pixels (i.e. 4k resolution) to render every second is quite a lot of work for a GPU. Offloading that to a secondary processing unit means the GPU is free to render instead of waiting for the app to tell the GPU what to render next.

Ooooor... you can admit you're only running the game at 640x480 and not use upscaling and frame interpolation to pretend you aren't. It's what we used to do when we couldn't render a game at our monitor's max resolution. 
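To put a number on how much shading work the upscalers actually skip: a quick sketch, assuming the common "Quality" preset scale factor of 1.5x per axis (both DLSS and FSR Quality render a 4K output internally at 1440p; the 1.5x factor is an assumption based on the vendors' published presets, not something from this thread):

```python
# Internal render resolution for "Quality" upscaling to 4K,
# assuming the usual 1.5x-per-axis scale factor.
target_w, target_h = 3840, 2160
scale = 1.5
internal_w, internal_h = int(target_w / scale), int(target_h / scale)  # 2560x1440

native_pixels = target_w * target_h
internal_pixels = internal_w * internal_h
print(f"{internal_pixels / native_pixels:.0%} of native pixels shaded")  # ~44%
```

Under that assumption the GPU shades only about 44% of the native pixel count, which is the whole appeal, and also exactly the "you're really running at a lower resolution" point being mocked above.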

I also believe poorly optimized games are a large portion of the problem.
