FreeSync, G-SYNC, and Raw Horsepower Showdown

What say you on the merits of G-SYNC monitors, FreeSync monitors, vs just raw frame-throwing horsepower to prevent tearing?

I've had my eye on one of the 34" curved monitors recently (specifically, the LG 34UC89G-B) but I've been staggered by the cost premium commanded by the 89G-B model (G-SYNC) over the 79G-B model (FreeSync), which runs to something like an extra $500 for the privilege of having G-SYNC.

I have a 980ti in my system at the moment, so FreeSync is no good to me. On the other hand, I don't particularly feel like paying nearly double the price for a monitor that is essentially identical except for the G-SYNC chip. Given that I've never had G-SYNC before my current monitor (Acer XB270HU), I'm wondering if the FreeSync monitor would be sufficient, as long as I throw enough frames out of my video card to prevent egregious tearing.

Is anyone running a FreeSync monitor on an nVidia card? What's the experience like?

AFAIK, if you run a FreeSync monitor on an NVidia card, the adaptive sync simply doesn't engage. It just behaves like a standard fixed-refresh screen.

I don't own either technology, as I've got a very nice 30" monitor that I don't want to replace, but I've gotten the impression that, most of the time, G-Sync works better. NVidia seems to control the process more closely than AMD does. If you can believe online reputation (which is much less true than it was ten years ago, as many companies actively manipulate posts and votes in their favor), a G-Sync monitor should work perfectly, where a FreeSync monitor on an AMD card can sometimes be quite troublesome.

At the moment, putting FreeSync on an NVidia card is only useful if you're planning to transition to AMD later on. It'll work, but no differently than any other monitor would.

Again, give this opinion little weight, because I could have been manipulated/advertised into saying this. Direct experience from established forumgoers here (e.g., Thin_J, GameGuru, Legion) will be enormously better.

Yeah. In short, NVidia introduced G-Sync, which requires a proprietary hardware module in the monitor, and then AMD followed with FreeSync, which builds on the royalty-free VESA Adaptive-Sync standard and works only with AMD cards. So the monitor has to match the card for it to work at all. From what I've heard (I have an NVidia card), FreeSync doesn't work quite as well as G-Sync, but it does work.

Personally, I've found G-Sync to be wonderful in terms of reducing the likelihood of motion sickness from games, but the visual effect is subtle. If you aren't really particular about screen-tearing and motion sickness isn't a problem, I don't think it's worth the added cost.

I have an AOC G2460PG, which was a pretty reasonable price for G-Sync, and it's a great monitor in general. But it's certainly not a 34" curved monitor. I have a 970 and like to have really high framerates, and once you go above 1080p you need some serious horsepower to get things to a level I find acceptable.

Yeah, I think I had a little sticker-shock at just *how much* G-Sync added to the cost. I mean, if it were an extra $100 over the FreeSync version, I don't think I would have blinked. But I find it difficult to believe that the G-Sync chip costs more than the CPU that powers my entire system. It's literally in the region of $400-$500 extra for the functionality. That's the same price as a pretty damned nice video card!

Yeah, forget that. As you can see, the G-Sync monitor I own cost $350, and it isn't more than $100 more expensive than a comparable non-G-Sync monitor. I guess once you get into the real high end, they feel they can upcharge based on a percentage rather than anything to do with actual parts cost.

You're probably right. I have a pretty nice G-Sync monitor (Acer XB270HU) that I got on sale, paired with a couple of smaller non-G-Sync monitors for a wrap-around display, but I was hoping to drop to a single wide monitor. Since I don't see myself going in AMD's direction anytime soon, I guess I'll wait for a price drop.

Coldstream wrote:

Since I don't see myself going in AMD's direction anytime soon, I guess I'll wait for a price drop.

Unfortunately the G-Sync Ultrawides have been painfully static on pricing. One of them is like two years old now and is still over $1000.

It's rough.

Do most games or media even support ultrawide though? It's a very particular resolution so I'd expect black bars or image stretching in most cases.
ETA: I have a 4K Acer with G-Sync, which for this model was maybe $50 more than not having it. I do wonder sometimes about HDR, but the cost increase for that at the time of my purchase was just too absurd.

The ultrawide support usually isn't in the resolution settings. It tends to be in FOV and UI. If a game doesn't support ultrawide, it will still render correctly at the chosen aspect ratio (i.e., no stretching or black bars), but the FOV may not adjust accordingly, and UI elements might not snap to the far corners the way they usually do at 16:9; or conversely, UI elements might lock to the far corners where the game would otherwise benefit from a different layout. Either way, it's an indication that the devs weren't able to accommodate the irregular aspect ratio.
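For what it's worth, the FOV side of this is just trigonometry. A quick sketch of the common "Hor+" behavior (vertical FOV stays fixed and the horizontal FOV widens with aspect ratio) — the function name and the 60-degree figure are just for illustration:

```python
import math

def horizontal_fov(vertical_fov_deg, aspect_w, aspect_h):
    """Horizontal FOV implied by a fixed vertical FOV under Hor+ scaling."""
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * (aspect_w / aspect_h))
    return math.degrees(h)

# Same 60-degree vertical FOV on both aspect ratios:
print(f"16:9 -> {horizontal_fov(60, 16, 9):.1f} deg")  # ~91.5
print(f"21:9 -> {horizontal_fov(60, 21, 9):.1f} deg")  # ~106.8
```

So a game that handles ultrawide properly gives you roughly 15 extra degrees of peripheral view at the same vertical FOV; one that doesn't will either crop or stretch to fake it.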

As for other media, ultrawide's 21:9 aspect ratio was originally intended to approximate CinemaScope's 2.35:1 and anamorphic 2.39:1.
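The numbers line up closely but not exactly. A quick back-of-the-envelope comparison (the two panel resolutions are just the common ultrawide ones, not specific to any monitor in this thread):

```python
# Decimal aspect ratios: "21:9" marketing vs. actual panels vs. cinema formats.
ratios = {
    "16:9":             16 / 9,        # ~1.778
    "21:9 (marketing)": 21 / 9,        # ~2.333
    "2560x1080 panel":  2560 / 1080,   # ~2.370
    "3440x1440 panel":  3440 / 1440,   # ~2.389
    "CinemaScope":      2.35,
    "Anamorphic":       2.39,
}
for name, ratio in ratios.items():
    print(f"{name:>16}: {ratio:.3f}")
```

So 2.39:1 film can still leave hairline bars on a 2.37:1 panel, but they're a tiny fraction of the letterboxing you'd get on a 16:9 screen.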

Most developers do support Ultrawide these days, and it's glorious! Except Bethesda for some reason...