Is it likely there'll be 'retina-only' software on MacBooks?

We're buying a MacBook. The high-end non-Retina model suits us best: more storage, upgradable RAM, and a bigger, replaceable HDD.

My only concern is that a couple of years from now we'll end up with 'Retina-only' software being sold. We want to use this laptop for several years.

What do you think? Could we scupper ourselves by going normal res?

Edit

Here's the one we're going for. I can only link to the config page. We waited for Ivy Bridge to come out.

http://store.apple.com/us/configure/...?

I think it will be a very long time before there are enough Retina MacBooks in the world to make producing Retina-required software a worthwhile return on investment. Lots of software will benefit from it, but I doubt much of anything will require it that isn't very niche.

That seems extremely unlikely to me. It's hard to even access a Retina display at its native resolution... the OS is doing some really bizarre stuff with those things. I think the chance of any Retina-required software coming out before your laptop dies of old age is quite close to nil.

Now, I'm sure there will be plenty of programs that will be enhanced by a Retina display, but require it? Not for a long while. And because of the weird way that the Retina display is used, it's quite possible that the OS may just automatically downscale anything to whatever the screen can display. It may not even be possible to write apps that won't run on the lower-res screens.

It's a non-issue.

There are far too many existing non-Retina Macs, not to mention current non-Retina Mac product lines, for Retina-only apps to be a thing anytime soon.

Furthermore, it's pretty much unnecessary. App developers will migrate to vector formats like SVG instead of pixel graphic assets. Fonts already scale, and with vector formats, art assets can scale too. There hasn't been much reason to use vector graphics instead of pixel graphics for desktop development, since desktop screen resolutions just haven't diverged much. Now they have, a little, but it's an easily solvable problem.
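For illustration, here's a minimal sketch in Swift/AppKit of the vector approach (the icon, its size, and the drawing are all made up). The drawing handler is re-run at whatever backing scale the display uses, so there's no fixed pixel grid to outgrow:

```swift
import AppKit

// Hypothetical vector-drawn asset: instead of shipping a fixed-size
// icon.png, describe the artwork as paths. AppKit invokes the handler
// at the screen's backing scale, so the result is crisp on both
// standard and Retina displays from the same code.
let icon = NSImage(size: NSSize(width: 32, height: 32), flipped: false) { rect in
    NSColor.systemBlue.setFill()
    NSBezierPath(ovalIn: rect.insetBy(dx: 2, dy: 2)).fill()
    return true
}
```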

The way the APIs work, there is basically no incentive to be retina-only. All of the calls work like this:

command: "get me an image named foo.png"

When you get back an image named "foo.png", it is an object that says:

- I am an image named foo.png
- I have a resolution of x [points] by y [points]
- If you are on a non-retina display, 1 [point] = 1 pixel and I will display it that way

On a retina device, it says "well I have a non-retina image here, with 1 pixel-per-point, so I will just pixel-double that PNG file and render it."

IF there is also an image named "foo@2x.png" AND you are on a retina device, it will return that image instead. In that case, 1 point represents a 2x2 pixel grid, and internally it renders everything with the higher-resolution image at native resolution, but the APIs still measure everything in [points], which you can think of as non-retina pixels.

It's basically a way to have the code double everything for you if you provide high-resolution imagery, while letting you write everything as if the retina resolution doesn't exist. I suspect it will be a long time before anything treats the retina resolution as a "real thing" at the API level, so you are safe. FOR NOW.

(Does that mean nobody will try to detect retina-ness and refuse to run without it? There will probably be a few. But it won't be common...)
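A rough Swift/AppKit sketch of that lookup (the asset names are illustrative, but `NSImage(named:)` really does prefer a `foo@2x.png` variant on a Retina display, and the reported size stays in points either way):

```swift
import AppKit

// One image object, sized in points; the pixel data behind it depends
// on which representation (foo.png or foo@2x.png) the bundle provides.
if let image = NSImage(named: "foo") {
    print("size in points: \(image.size)")  // same on every display
    for rep in image.representations {
        // e.g. 100x100 pixels for foo.png, 200x200 for a foo@2x.png variant
        print("bitmap: \(rep.pixelsWide) x \(rep.pixelsHigh) pixels")
    }
}

// The "detect retina-ness" check mentioned above boils down to this:
if let scale = NSScreen.main?.backingScaleFactor {
    print("backing scale factor: \(scale)")  // 1.0 normal, 2.0 Retina
}
```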

Ranger Rick wrote:

The way the APIs work, there is basically no incentive to be retina-only. [...]

You sound drunk!

FedoraMcQuaid wrote:
Ranger Rick wrote:

The way the APIs work, there is basically no incentive to be retina-only. [...]

You sound drunk!

damaati!

Thanks all.

The Retina displays are 2880x1800, but the default screen resolution is the half-res 1440x900, which is the same resolution as the current 15" MacBook Pro. So screen coordinates haven't changed, but the text rendering engine uses the full screen resolution to draw really sharp text, for example. In short, I wouldn't expect to see any Retina-only apps any time soon. In fact, I don't think you can even up-res the screen to the full native Retina resolution, probably because there isn't a mobile video card made today that could handle it.

If you'll notice, the graphics card in the normal 15" MacBook Pro is the same as in the Retina model. I'd be quite interested in seeing game performance comparisons between the two, as I suspect the Retina laptop will be slower.
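A small Swift/AppKit sketch of that split between coordinates and pixels (assuming a Mac with at least one display attached):

```swift
import AppKit

// On a Retina 15" MacBook Pro this prints 1440 x 900 points but
// 2880 x 1800 pixels: the coordinate space matches the non-Retina
// model, while the backing store carries four times the pixels.
for screen in NSScreen.screens {
    let points = screen.frame.size
    let scale = screen.backingScaleFactor
    print("\(points.width) x \(points.height) points, " +
          "\(points.width * scale) x \(points.height * scale) pixels")
}
```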

EDIT: Here it is. It runs better than I thought.