It would make any games she wants to play on multiple screens run worse probably.
There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...
Rykin wrote:It would make any games she wants to play on multiple screens run worse probably.
There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...
Lol. Nintendoes what PCdon't? Except at insanely low resolutions.
... should I be turning off the Intel integrated video? I didn't, because that's not the change I was there to make. But I am wondering.
Generally, there's no reason to.
Conversely, is there any advantage to be gained from plugging monitor 2 into the integrated video while the nvidia card handles the 3d game on monitor 1?
No.
AMD just announced the Radeon 7 card to be coming out February 7th for $699.
Rykin wrote:It would make any games she wants to play on multiple screens run worse probably.
There are games that support multiple screens? I guess there’s the trick where you span the 3D scene across multiple monitors. First thing that came to mind though was how much I’d like to be able to have my map and inventory on monitor 2 in a lot of games...
The only ones I know of are racing sims or flight sims, and even then it may be that they are just stretching windowed mode across multiple screens.
I would have been more excited if AMD announced this card at $499
It makes you wonder if AMD has learned from the past. Perhaps they have priced it higher than what would be more competitive so that when Nvidia undercuts them, they can pivot to match with a price cut of their own.
I think it's a good sign that AMD is competing with the 2080, whereas previously with Polaris they were topping out at competing with the 1060. I assume AMD will have a line of cards coming out beneath this card which fills out the lower price points.
I also have to assume the pricing is set in expectation of reacting to moves from NVIDIA. They've set this guy at exactly $100 below the MSRP of the 2080 it competes against.
Wasn't the MSRP for the 2080 $699 too? The $799 was for the "Founders Edition" thing.
Sounds like it is basically a beefed up current-generation AMD card and not their (supposedly) upcoming one. Maybe more like a Titan card "shortly" before the next generation.
I would have been more excited if AMD announced this card at $499
See dragon comic on previous page.
Well new PC purchased. I forgot how scary the numbers can be.
That seems like a very unexciting announcement. The 2080 and 2080Ti are over- and massively over-priced, and not really that much faster than the 1080 and 1080Ti. Their big draw, supposedly, is all that ray tracing silicon. Doesn't interest me, in this generation, but that's the headline reason to buy those cards.
So this new AMD card is almost as expensive, doesn't offer ray tracing at all, and apparently will burn a crapload of power; Ars is saying maybe 300 watts. And this is going from a 14nm to a 7nm process, so this card really should be offering much better performance, but isn't.
I'd call that a major letdown. They should have a huge relative silicon budget, because they're not doing raytracing. These chips should be stomping NVidia at conventional rendering, and they're not even quite comparable, and possibly burning substantially more power (and heat) to reach those lower performance levels. And the pricing is very high.
AMD is not doing well on the GPU front. From what I can see, their major competitive argument is "We're not NVidia", which has some appeal, but that's a very poor tactical position. Not being the other company isn't exactly a high bar to clear, you know?
I read an article on the train this morning that AMD's new card kind of helps prove Nvidia's point-- the market still skews heavily towards 1080p, with 1440p taking the 2nd place spot in Steam's hardware surveys at 3% of their users. Looking from that perspective, there just isn't much need for massive leaps in processing power beyond the GTX 10 series and AMD's Vega stuff to push higher settings at common resolutions. So Nvidia was sort of right to try to shift the conversation towards things other than graphics power, like solving other major real-time problems like Raytracing.
That said, I agree that the prices are too high, given the relatively incremental performance bumps.
Yeah the "RTX Off / RTX On" has become a meme for a reason. It's cool that it's in development, but it's at least another generation away from being remotely useful.
WipEout wrote:I read an article on the train this morning that AMD's new card kind of helps prove Nvidia's point-- the market still skews heavily towards 1080p, with 1440p taking the 2nd place spot in Steam's hardware surveys at 3% of their users. Looking from that perspective, there just isn't much need for massive leaps in processing power beyond the GTX 10 series and AMD's Vega stuff to push higher settings at common resolutions. So Nvidia was sort of right to try to shift the conversation towards things other than graphics power, like solving other major real-time problems like Raytracing.
That said, I agree that the prices are too high, given the relatively incremental performance bumps.
First I've heard that ray tracing solves a real problem... I mean, the ray tracing Nvidia is solving for in this generation barely looks different in the videos I've seen.
I'd argue a difference between a "real" and a "real-time" problem. One being a big hurdle that inhibits a major factor or functionality of your product, the other being a problem that occurs in a real-time engine when trying to do certain things. I didn't say raytracing was a real problem that games need to get past in order to advance the medium, but raytracing in real-time has been virtually impossible until this past year, and is going to change a lot about how we 3D artists and designers create the pretty pictures for players to enjoy in the future.
That said, it's a testament to Nvidia, Microsoft, and their partners in developing tools (in DirectX) to fake real-time reflections and lighting to such an extent that you guys don't see a big difference. Does that mean it's not a problem, though? Not at all. It's not a problem for you as gamers, sure. It doesn't really affect your games in any big way (yet), but it will. It's a wall that we're going to run into as developers (a wall that my studio has already run into and worked around on mobile), so it's not like there was no need to solve real-time raytracing in game engines.
And yeah, it's not really a marketable graphics improvement for consumers yet, but it makes sense that given how far graphics have pushed the market and where those performance needs have settled for a majority of gamers, Nvidia is looking to push other boundaries. They're basically going down the checklist while there's still some downtime.
All THAT said, I still agree that those cards cost too damn much and the steps forward in solving real-time raytracing aren't worth it to the consumers just yet.
But you and I both know that in 3D graphics it is often better to fake it than to compute the real thing. Just look at biased vs. unbiased renderers, or things like fluid simulation, where you take just enough samples to be accurate and use algorithms to blend the rest. Many things in computing run into diminishing returns, where the last 10% takes 20x longer than the prior 90%.
It is also important to note that the existing infrastructure is set up for faking it and is more sophisticated. I am not saying things won't change, but there is no magic-bullet app that is going to change things overnight.
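The "just enough samples, then blend the rest" idea is easy to demo with a toy sketch (the functions and numbers here are made up for illustration, not from any renderer): a cheap 4-sample Monte Carlo estimate per pixel is noisy, but a simple box blur gets it most of the way to an expensive 1024-sample reference.

```python
import math
import random

random.seed(0)

def true_shade(x):
    """Smooth 'ground truth' brightness for pixel x (hypothetical signal)."""
    return 0.5 + 0.4 * math.sin(x / 10.0)

def estimate(x, n_samples, noise=0.2):
    """Unbiased Monte Carlo estimate: average n noisy samples."""
    return sum(true_shade(x) + random.gauss(0, noise)
               for _ in range(n_samples)) / n_samples

pixels = list(range(64))
reference = [estimate(x, 1024) for x in pixels]  # expensive, near-converged
cheap = [estimate(x, 4) for x in pixels]         # fast but noisy

def box_blur(img):
    """3-tap box filter: the 'blend the rest' step."""
    return [(img[max(i - 1, 0)] + img[i] + img[min(i + 1, len(img) - 1)]) / 3
            for i in range(len(img))]

smoothed = box_blur(cheap)

def mse(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) / len(a)

print(f"noisy 4-sample MSE:   {mse(cheap, reference):.5f}")
print(f"blurred 4-sample MSE: {mse(smoothed, reference):.5f}")  # lower
```

The blur trades a tiny bit of bias for a big drop in noise variance, which is the same trade real denoisers (and biased renderers) make at much larger scale.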
Sure, but my point is that as graphics continue to edge towards realism and physical accuracy, our industry is shifting towards procedural content generation, which means more power is necessary for realistic simulations, often in-engine. Look at how much, and how often, Houdini is used not just for exporting point-cached cloth or fluid sims like we used to do in Maya, but also as a real-time, in-engine dev tool for characters and environments; or Substance for procedural material generation, physically-based rendering, etc. Hell, you can run real-time fluid sims in Unity on mobile phones now.
Also no one here is talking about a magic bullet app (though if there is one, I bet it'd be Houdini). I'm simply saying that Nvidia seems to be right to start tackling other major real-time graphics hurdles given the current state of the industry and the direction that content creation is headed.
I'd call that a major letdown. They should have a huge relative silicon budget, because they're not doing raytracing. These chips should be stomping NVidia at conventional rendering, and they're not even quite comparable, and possibly burning substantially more power (and heat) to reach those lower performance levels. And the pricing is very high.
While I agree with most of your points... it feels like this card should be at most $599 (and let's face it, within a year it will be that, if not lower), but I'm not sure how you have already figured out its performance, given no trusted review site has had a shipping version to run through benchmarks.
AMD is claiming that it occasionally beats the 2080. That's the best they can say about it: "we win in some games." If they had something really good, we'd know.
My present assumption is that it will be roughly comparable to a 2080, burn a ton more power, cost a little less, and not have raytracing. With the extra transistors from not raytracing, they should be doing much better than that.
edit: and note that I'm making no argument about raytracing itself. I don't care about it, and don't think it's directly important in this generation. Someday it may matter a lot. Right now, it doesn't.
What I am arguing about, however, is transistor budgets. The RTX stuff is not cheap to make. That means AMD should be able to devote a ton of die space to conventional rendering, space NVidia simply doesn't have available. But even with all that extra die space, they're only getting to about speed parity, and burning more power, possibly a ton more power.
If the cards were then much cheaper, that would be okay, but they're charging for all those transistors and then not using them very efficiently. NVidia costs only a little more, and does have the raytracing to play around with. Even if games don't directly take much advantage, it might be a rather interesting thing to mess around with yourself, either programming it directly, or using a raytracing program/tool that will talk to the hardware.
Well, if they want to price it that way, they should have delivered much better performance. It certainly appears that you're getting a lot more transistors (or more efficient ones) for your dollar with NVidia, even though you can't use most of them yet.
It just doesn't feel like they're really trying, that they don't want to actually compete. If they were on their game, they'd be making NVidia look weak and underpowered in this generation, because they are. They blew a ton of transistors on a useless feature, made themselves vulnerable, and AMD isn't going after them.
Which implies that they can't, that they just don't have the expertise anymore to make leading-edge graphic cards. Looks like NVidia wins their gamble pretty decisively.
I think raytracing is currently at the point that early 3D graphics cards were at before games came out that really pushed the hardware to show what it could do and why it was needed. I'm also not sure that the current line of RTX cards are powerful enough to support a game like that since it is my understanding that they are working on a very limited basis for a few very specific things in the scene with the traditional rendering pipeline doing most of the work.
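The "very limited basis" point holds up to napkin math: even a single primary ray per pixel at 1080p/60 (the bare minimum, before any bounces or shadow rays) is already a nine-figure per-second ray budget, which is why current cards ration raytracing to a few effects and leave the rest to rasterization.

```python
# Napkin math: rays per second for bare-minimum real-time ray tracing.
width, height, fps = 1920, 1080, 60

# One primary ray per pixel per frame, no secondary rays at all.
primary_rays_per_sec = width * height * fps
print(f"{primary_rays_per_sec:,} primary rays/s")  # 124,416,000

# A modest hybrid setup (say 2 bounces plus 1 shadow ray per hit,
# an assumed illustrative figure) quadruples the budget; full path
# tracing needs far more still.
hybrid_rays_per_sec = primary_rays_per_sec * (1 + 2 + 1)
print(f"{hybrid_rays_per_sec:,} rays/s")  # 497,664,000
```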
I also think that when the Radeon VII gets out in the wild and benchmarks are done people are going to be just as disappointed in it as they were in the 2080.
Finally found the article I was initially referencing (I wish Android's news feed had a searchable history). I thought it was an interesting business perspective, and certainly not one I immediately took as a gamer (or a game developer). It definitely puts the performance levels the new generations are (not) generating into perspective.
Either way, from this perspective it makes more sense to me to push GPUs into avenues that aren't necessarily MOAR POWAH, but solving other problems in the real-time graphics world.
Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?
Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?
Sounds dead to me, but the fan may not be a clear indicator. The 970 runs pretty cool, I think mine keeps the fans off unless I'm gaming.
Pretty sure my 970 is dead but thought I'd pop in here first to see if anyone had any ideas for fixes. It stopped outputting video this weekend. I just swapped it to a different PC. Same thing. I shoved my phone into the machine and recorded a video to see if it was even running. The fans barely do a full spin every 10 seconds. Dead, right?
Guru's got a couple cards up for sale if you're interested
Did you try a different video cable, McChuck? Just to be thorough? (My instinct says it's shuffled off this mortal coil...)