Help me build my PC 2017 Catch All

Rykin wrote:

It would make any games she wants to play on multiple screens run worse probably.

There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...

Vargen wrote:
Rykin wrote:

It would make any games she wants to play on multiple screens run worse probably.

There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...

Lol. Nintendoes what PCdon't? Except at insanely low resolutions.

Vargen wrote:

... should I be turning off the Intel integrated video? I didn't, because that's not the change I was there to make. But I am wondering.

Generally, there's no reason to.

Conversely, is there any advantage to be gained from plugging monitor 2 into the integrated video while the nvidia card handles the 3d game on monitor 1?

No.

AMD just announced the Radeon 7 card, coming out February 7th for $699.

WizKid wrote:

AMD just announced the Radeon 7 card, coming out February 7th for $699.

https://www.hardocp.com/news/2019/01...

Looks to be competing with the RTX 2080 in terms of power and price. We'll see once real-world benchmarks are released. I'm cautiously optimistic.. it's not earth-shattering enough in terms of price/performance to really impact Nvidia at all. I would have been more excited if AMD announced this card at $499

Vargen wrote:
Rykin wrote:

It would make any games she wants to play on multiple screens run worse probably.

There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...

The only ones I know of are racing sims or flight sims, and even then it may be that they are just stretching windowed mode across multiple screens.
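For what it's worth, the map-on-monitor-2 idea doesn't strictly need engine support; anything that can open a second OS window can park it on another display. Here's a minimal sketch of that, assuming the glfw Python bindings (pip install glfw) and at least two connected displays; the window names and layout are made up for illustration, not from any real game or engine.

# Minimal two-window sketch (illustrative only): a "game" window plus a second
# window parked on monitor 2 as a stand-in for a map/inventory panel.
# Falls back to a single window if only one display is connected.
import glfw

def main():
    if not glfw.init():
        raise RuntimeError("GLFW failed to initialise")

    game = glfw.create_window(1280, 720, "Game (monitor 1)", None, None)

    panel = None
    monitors = glfw.get_monitors()
    if len(monitors) > 1:
        # Place a second windowed surface near the top-left of monitor 2.
        mon_x, mon_y = glfw.get_monitor_pos(monitors[1])
        panel = glfw.create_window(800, 600, "Map / inventory (monitor 2)", None, None)
        glfw.set_window_pos(panel, mon_x + 50, mon_y + 50)

    while not glfw.window_should_close(game):
        glfw.make_context_current(game)
        # ... render the 3D scene here ...
        glfw.swap_buffers(game)

        if panel is not None:
            glfw.make_context_current(panel)
            # ... render the map / inventory UI here ...
            glfw.swap_buffers(panel)

        glfw.poll_events()

    glfw.terminate()

if __name__ == "__main__":
    main()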

I would have been more excited if AMD announced this card at $499

It makes you wonder if AMD has learned from the past. Perhaps they have priced it higher than what would actually be competitive so that when Nvidia undercuts them, they can pivot and match with a price cut of their own.

I think it's a good sign that AMD is competing with the 2080, whereas previously with Polaris they were topping out at competing with the 1060. I assume AMD will have a line of cards coming out beneath this card which fills out the lower price points.

I also have to assume the pricing is set in expectation of reacting to moves from NVIDIA. They've set this guy at exactly $100 below the MSRP of the 2080 it competes against.

Wasn't the MSRP for the 2080 $699 too, while $799 was for the "Founders" thing?
Sounds like it is basically a beefed up current-generation AMD card and not their (supposedly) upcoming one. Maybe more like a Titan card "shortly" before the next generation.

TheGameguru wrote:

I would have been more excited if AMD announced this card at $499

See dragon comic on previous page.

Well new PC purchased. I forgot how scary the numbers can be.

That seems like a very unexciting announcement. The 2080 and 2080Ti are over- and massively over-priced, and not really that much faster than the 1080 and 1080Ti. Their big draw, supposedly, is all that ray tracing silicon. Doesn't interest me, in this generation, but that's the headline reason to buy those cards.

So this new AMD card is almost as expensive, doesn't offer ray tracing at all, and apparently will burn a crapload of power; Ars is saying maybe 300 watts. And this is going from a 14nm to a 7nm process, so this card really should be offering much better performance, but isn't.

I'd call that a major letdown. They should have a huge relative silicon budget, because they're not doing raytracing. These chips should be stomping NVidia at conventional rendering, and they're not even quite comparable, and possibly burning substantially more power (and heat) to reach those lower performance levels. And the pricing is very high.

AMD is not doing well on the GPU front. From what I can see, their major competitive argument is "We're not NVidia", which has some appeal, but that's a very poor tactical position. Not being the other company isn't exactly a high bar to clear, you know?

I'd call that a major letdown. They should have a huge relative silicon budget, because they're not doing raytracing. These chips should be stomping NVidia at conventional rendering, and they're not even quite comparable, and possibly burning substantially more power (and heat) to reach those lower performance levels. And the pricing is very high.

While I agree with most of your points.. it feels like this card should be at most $599 (and let's face it, within a year it will be that if not lower), but I'm not sure how you've already figured out its performance given no trusted review site has had a shipping version to run through benchmarks.

I read an article on the train this morning that AMD's new card kind of helps prove Nvidia's point-- the market still skews heavily towards 1080p, with 1440p taking the 2nd place spot in Steam's hardware surveys at 3% of their users. Looking from that perspective, there just isn't much need for massive leaps in processing power beyond the GTX 10 series and AMD's Vega stuff to push higher settings at common resolutions. So Nvidia was sort of right to try to shift the conversation towards things other than graphics power, like solving other major real-time problems such as ray tracing.

That said, I agree that the prices are too high, given the relatively incremental performance bumps.

WipEout wrote:

I read an article on the train this morning that AMD's new card kind of helps prove Nvidia's point-- the market still skews heavily towards 1080p, with 1440p taking the 2nd place spot in Steam's hardware surveys at 3% of their users. Looking from that perspective, there just isn't much need for massive leaps in processing power beyond the GTX 10 series and AMD's Vega stuff to push higher settings at common resolutions. So Nvidia was sort of right to try to shift the conversation towards things other than graphics power, like solving other major real-time problems such as ray tracing.

That said, I agree that the prices are too high, given the relatively incremental performance bumps.

First I'm hearing that ray tracing solves a real problem.. I mean the ray tracing Nvidia is solving for in this generation barely looks different when I see the videos.

Yeah the "RTX Off / RTX On" has become a meme for a reason. It's cool that it's in development, but it's at least another generation away from being remotely useful.

*Legion* wrote:

Yeah the "RTX Off / RTX On" has become a meme for a reason. It's cool that it's in development, but it's at least another generation away from being remotely useful.

Yeah AMD was smart to avoid anything beyond a straightforward performance bump... I just wish they realized they still need to be significantly cheaper than Nvidia to move the needle.. The Radeon 7 needs to be $599 max.

These benchmarks at 4K are promising if true..

https://hardforum.com/threads/radeon...

TheGameguru wrote:
WipEout wrote:

I read an article on the train this morning that AMD's new card kind of helps prove Nvidia's point-- the market still skews heavily towards 1080p, with 1440p taking the 2nd place spot in Steam's hardware surveys at 3% of their users. Looking from that perspective, there just isn't much need for massive leaps in processing power beyond the GTX 10 series and AMD's Vega stuff to push higher settings at common resolutions. So Nvidia was sort of right to try to shift the conversation towards things other than graphics power, like solving other major real-time problems such as ray tracing.

That said, I agree that the prices are too high, given the relatively incremental performance bumps.

First I'm hearing that ray tracing solves a real problem.. I mean the ray tracing Nvidia is solving for in this generation barely looks different when I see the videos.

I'd argue a difference between a "real" and a "real-time" problem. One being a big hurdle that inhibits a major factor or functionality of your product, the other being a problem that occurs in a real-time engine when trying to do certain things. I didn't say raytracing was a real problem that games need to get past in order to advance the medium, but raytracing in real-time has been virtually impossible until this past year, and is going to change a lot about how we 3D artists and designers create the pretty pictures for players to enjoy in the future.

That said, it's a testament to Nvidia, Microsoft, and their partners, and the tools they've developed (in DirectX) to fake real-time reflections and lighting, that you guys don't see a big difference. Does it mean it's not a problem, though? Not at all. It's not a problem for you, as gamers, sure. It doesn't really affect your games in any big way (yet), but it will. It's a wall that we're going to run into as developers (a wall that my studio has already run into and worked around on mobile), so it's not like there was no need to solve real-time raytracing in game engines.

And yeah, it's not really a marketable graphics improvement for consumers yet, but it makes sense that given how far graphics have pushed the market and where those performance needs have settled for a majority of gamers, Nvidia is looking to push other boundaries. They're basically going down the checklist while there's still some downtime.

All THAT said, I still agree that those cards cost too damn much and the steps forward in solving real-time raytracing aren't worth it to the consumers just yet.

But you and I both know that many times in 3D graphics it is better to fake it than to compute the real thing. Just look at biased vs. unbiased renderers, or things like fluid simulation where you take just enough samples to be accurate and use algorithms to blend the rest. That's because many things in computing hit diminishing returns, where the last 10% takes 20x longer than the prior 90%.

It is also important to note that the existing infrastructure is set up for faking it, and is more sophisticated. I am not saying things won't change, but there is no magic-bullet app that is going to change things overnight.
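To put a rough number on that "last 10%" point: with Monte Carlo sampling, the thing unbiased renderers lean on, noise only shrinks with the square root of the sample count, so halving the error costs roughly four times the work. A throwaway sketch, nothing to do with any particular renderer:

# Diminishing returns in sampling: estimate pi by throwing random points at a
# quarter circle. Error shrinks roughly as 1/sqrt(N), so each extra digit of
# accuracy costs about 100x more samples.
import math
import random

random.seed(1)  # fixed seed so the run is repeatable

def estimate_pi(samples):
    # Fraction of random points that land inside the unit quarter-circle.
    hits = sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

for n in (1_000, 4_000, 16_000, 64_000, 256_000):
    estimate = estimate_pi(n)
    print(f"{n:>7} samples -> {estimate:.4f} (error {abs(estimate - math.pi):.4f})")

Quadrupling the samples only roughly halves the error, which is exactly why renderers cut sampling off early and let a blending or denoising pass fake the rest.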

Sure, but my point is that as graphics continue to edge towards realism and physical accuracy, our industry is shifting towards procedural content generation, which means more power is necessary for realistic simulations, often in-engine. Look at how much, and how often, Houdini is used not just for exporting point-cached cloth or fluid sims like we used to do in Maya, but also for real-time, in-engine dev tools for characters and environments; Substance for procedurally-based material generation; physically-based rendering; etc. Hell, you can run real-time fluid sims in Unity on mobile phones now.

Also no one here is talking about a magic bullet app (though if there is one, I bet it'd be Houdini). I'm simply saying that Nvidia seems to be right to start tackling other major real-time graphics hurdles given the current state of the industry and the direction that content creation is headed.
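For anyone curious what "procedural" means at the smallest possible scale, here's my own toy, nothing to do with Houdini or Substance internals: content generated from a seed and a function instead of hand-authored data.

# Tiny value-noise "terrain": deterministic lattice values, bilinearly
# interpolated, rendered as ASCII shades. Same seed -> same world every run.
import math
import random

def value_noise(x, y, seed=0):
    def lattice(ix, iy):
        # Hash the lattice coordinates into a repeatable pseudo-random value.
        return random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed).random()
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    top = lattice(ix, iy) * (1 - fx) + lattice(ix + 1, iy) * fx
    bottom = lattice(ix, iy + 1) * (1 - fx) + lattice(ix + 1, iy + 1) * fx
    return top * (1 - fy) + bottom * fy

SHADES = " .:-=+*#%@"
for j in range(20):
    print("".join(SHADES[int(value_noise(i * 0.15, j * 0.15, seed=42) * (len(SHADES) - 1))]
                  for i in range(60)))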

TheGameguru wrote:
I'd call that a major letdown. They should have a huge relative silicon budget, because they're not doing raytracing. These chips should be stomping NVidia at conventional rendering, and they're not even quite comparable, and possibly burning substantially more power (and heat) to reach those lower performance levels. And the pricing is very high.

While I agree with most of your points.. feels like this card should be at most $599 (and lets face it within a year it will be that if not lower) but I'm not sure how you have already figured out its performance given no trusted review site has had a shipping version to run through benchmarks.

AMD is claiming that it occasionally beats the 2080. That's the best they can say about it: "we win in some games." If they had something really good, we'd know.

My present assumption is that it will be roughly comparable to a 2080, burn a ton more power, cost a little less, and not have raytracing. With the extra transistors from not raytracing, they should be doing much better than that.

edit: and note that I'm making no argument about raytracing itself. I don't care about it, and don't think it's directly important in this generation. Someday it may matter a lot. Right now, it doesn't.

What I am arguing about, however, is transistor budgets. The RTX stuff is not cheap to make. That means AMD should be able to devote a ton of die space to conventional rendering, space NVidia simply doesn't have available. But even with all that extra die space, they're only getting to about speed parity, and burning more power, possibly a ton more power.

If the cards were then much cheaper, that would be okay, but they're charging for all those transistors and then not using them very efficiently. NVidia costs only a little more, and does have the raytracing to play around with. Even if games don't directly take much advantage, it might be a rather interesting thing to mess around with yourself, either programming it directly, or using a raytracing program/tool that will talk to the hardware.

Yeah, for me ray tracing in its current form is useless. The performance hit isn't worth enabling it, so making that some defining difference this generation is a complete waste for me.

AMD priced it competitively because that's what you are supposed to do. If it competes with a 2080 then in theory it should be priced the same. I don't like it but I can understand why they did it. Ultimately though I don't believe gamers will buy this card at $699. Nvidia's mind share and their super-fanatical fan base will continue to disproportionately influence the GPU market to the detriment of the overall GPU space.

I may get one, or a more expensive "Ti" one, if AMD supports VRR over HDMI with the new LG OLEDs.

Well, if they want to price it that way, they should have delivered much better performance. It certainly appears that you're getting a lot more transistors (or more efficient ones) for your dollar with NVidia, even though you can't use most of them yet.

It just doesn't feel like they're really trying, that they don't want to actually compete. If they were on their game, they'd be making NVidia look weak and underpowered in this generation, because they are. They blew a ton of transistors on a useless feature, made themselves vulnerable, and AMD isn't going after them.

Which implies that they can't, that they just don't have the expertise anymore to make leading-edge graphics cards. Looks like NVidia wins their gamble pretty decisively.

I think raytracing is currently at the point that early 3D graphics cards were at before games came out that really pushed the hardware and showed what it could do and why it was needed. I'm also not sure the current line of RTX cards is powerful enough to support a game like that, since my understanding is that the ray tracing is only used on a very limited basis, for a few very specific things in the scene, with the traditional rendering pipeline doing most of the work.

I also think that when the Radeon VII gets out in the wild and benchmarks are done people are going to be just as disappointed in it as they were in the 2080.
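To make that "limited basis" hybrid idea concrete, here's a toy CPU sketch of my own (not how RTX actually partitions the work): primary visibility is computed analytically as a stand-in for the rasterizer, and ray tracing is spent on exactly one effect, a single shadow ray per pixel toward a point light.

# Toy "hybrid rendering" frame: a sphere on a floor plane, primary hits found
# analytically (stand-in for the rasterizer), plus ONE ray-traced effect:
# a single shadow ray per pixel toward a point light. Prints ASCII output.
import math

WIDTH, HEIGHT = 72, 24
EYE = (0.0, 1.0, 0.0)
SPHERE_C, SPHERE_R = (0.0, 1.0, 3.0), 1.0     # sphere resting on the y=0 floor
LIGHT = (2.0, 4.0, 1.0)

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    for t in ((-b - math.sqrt(disc)) / 2.0, (-b + math.sqrt(disc)) / 2.0):
        if t > 1e-4:
            return t
    return None

for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # "Rasterized" primary visibility: one ray per pixel, solved analytically.
        d = ((i / WIDTH - 0.5) * 2.0, (0.5 - j / HEIGHT) * 1.2, 1.0)
        norm = math.sqrt(sum(x * x for x in d))
        d = tuple(x / norm for x in d)

        t_sphere = ray_sphere(EYE, d, SPHERE_C, SPHERE_R)
        t_floor = (0.0 - EYE[1]) / d[1] if d[1] < -1e-6 else None
        hits = [t for t in (t_sphere, t_floor) if t is not None]
        if not hits:
            row += " "
            continue
        t_hit = min(hits)
        p = tuple(e + t_hit * x for e, x in zip(EYE, d))

        # The "RT core" part: a single shadow ray from the hit point to the light.
        to_light = tuple(l - q for l, q in zip(LIGHT, p))
        dist = math.sqrt(sum(x * x for x in to_light))
        to_light = tuple(x / dist for x in to_light)
        in_shadow = ray_sphere(p, to_light, SPHERE_C, SPHERE_R) is not None

        row += "." if in_shadow else ("O" if t_hit == t_sphere else "#")
    print(row)

Everything in the frame except that one shadow test is ordinary non-RT work, which is roughly the shape of the current hybrid games: rasterize almost everything, trace rays for one or two specific effects.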

Finally found the article I was initially referencing (I wish Android's news feed had a searchable history). I thought it was an interesting business perspective, and certainly not one I immediately took as a gamer (or a game developer). It definitely puts the performance levels the new generations are (not) delivering into perspective.

Either way, from this perspective it makes more sense to me to push GPUs into avenues that aren't necessarily MOAR POWAH, but solving other problems in the real-time graphics world.

Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?

WipEout wrote:

Finally found the article I was initially referencing (I wish Android's news feed had a searchable history). I thought it was an interesting business perspective, and certainly not one I immediately took as a gamer (or a game developer). It definitely puts the performance levels the new generations are (not) delivering into perspective.

Either way, from this perspective it makes more sense to me to push GPUs into avenues that aren't necessarily MOAR POWAH, but solving other problems in the real-time graphics world.

I'm all for more and better ways to improve real-time graphics, but I'm also for doing things in a logical and cost-effective way. Why overspend for ray tracing that barely works today? By the time it's relevant, the cards of today will be effectively worthless. Would you spend $25K on an 8K TV? There's barely any content worth watching in 8K, so the answer is probably no. In time, would an 8K TV mean something? Maybe.. I don't know.. we are still waiting for broadcast to catch up to 4K and that doesn't seem to be moving particularly fast.

4K gaming seems to be a real thing though.. especially as monitors get larger and HDTVs and consoles get cheaper and more powerful. Maybe Nvidia is worried and sought to shift the focus away from raw 4K power and onto buzzword technologies.

McChuck wrote:

Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?

Sounds dead to me, but the fan may not be a clear indicator. The 970 runs pretty cool; I think mine keeps the fans off unless I'm gaming.

McChuck wrote:

Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?

Guru's got a couple cards up for sale if you're interested

https://www.gamerswithjobs.com/node/...

Did you try a different video cable, McChuck? Just to be thorough? (My instinct says it's shuffled off this mortal coil...)