Brad Wardell's views about technology, politics, religion, world affairs, and all sorts of politically incorrect topics.
Fun with high resolution
Published on January 22, 2012 By Draginol In Personal Computing

I have Mac OS X Lion running on a 30-inch Cinema Display.  If you have the latest Xcode, you can use Quartz Debug to enable HiDPI mode.  What it does is let you run the screen as a sort of Retina display. In my case, it turns my 2560x1440 into 1280x720, but at very high DPI. It looks absolutely gorgeous and is exactly what I think most people have wanted out of their high resolution displays all along.
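To make the scaling concrete, here is a tiny Python sketch of the point/pixel arithmetic (assuming the 2x scale factor used here; hidpi_layout is a hypothetical helper, not any real API):

    # Minimal sketch of HiDPI point/pixel arithmetic, assuming a 2x
    # scale factor. Layout happens in points; each point is backed
    # by scale x scale device pixels.
    def hidpi_layout(pixel_w, pixel_h, scale=2):
        return pixel_w // scale, pixel_h // scale

    print(hidpi_layout(2560, 1440))  # (1280, 720): same-size UI, 4x the pixels per element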

Apparently, new Macs coming out this year are going to have screens with crazy high resolutions (2880x2400, for instance) so that users can run them as if they were 1440x1200, except with everything super sharp.

I have to say, it is a really great experience to see text and the UI this clear. In fact, it's going to be hard to go back to any other type of display.


Comments (Page 2)
on Jan 23, 2012

JcRabbit
3 x 30" displays.

Hey Jorge, what monitors are those, what's the resolution and what graphics cards are you using? I think I may have asked you those questions before, but I have a memory like a sieve these days.

on Jan 24, 2012

Frogboy
Well obviously it doesn't increase the real DPI of the monitor.

Well, some rendering engines use subpixel rendering, which can somehow triple the horizontal resolution (a rough sketch of the packing idea follows the image captions below)...

[Images removed; only their captions survive: (1) white line on a grayscale monitor; (2) white line on a grayscale monitor with antialiasing; (3) white line on a color monitor; (4) white line on a color monitor with chroma subpixel rendering; (5) the same as 1 to 4 at lower zoom.]
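To illustrate the tripling claim, a minimal Python sketch (my own, assuming numpy; the bare packing idea only, not how any particular renderer is implemented) that maps a triple-width grayscale image onto the R, G, B subpixels of a color image:

    import numpy as np

    def subpixel_pack(gray):
        # Pack a grayscale image whose width is 3x the target width
        # into the R, G, B subpixels of one color pixel per triple.
        # Real renderers (ClearType etc.) add filtering on top of
        # this to tame the color fringes.
        h, w3 = gray.shape
        assert w3 % 3 == 0
        return gray.reshape(h, w3 // 3, 3)  # consecutive samples -> R, G, B

    # a 1-subpixel-wide white line at triple resolution
    hi = np.zeros((2, 12), dtype=np.uint8)
    hi[:, 6] = 255
    print(subpixel_pack(hi)[0])  # the line lands on a single subpixel (the R of pixel 2)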

 

If I brought up this subpixel thing, it is because it is somehow used by SubLCD ( http://www.oyhus.no/SubLCD.html )... and the SubLCD method looks very similar to this new Apple HiDPI method:

The picture goes through SubLCD first, which shrinks it to half the size, but double the resolution.

Well, nothing wrong there... it seems Apple invented the method 25 years ago, Microsoft "stole" it (1998) and patented it until 2019... SubLCD (2007) found a way to get a similar result without patent conflict... Apple makes HiDPI... it seems like a full circle...
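As a toy illustration of "half the size, double the resolution", here is a Python sketch of subpixel-aware horizontal downscaling (my own guess at the general idea, assuming numpy; not SubLCD's actual filter):

    import numpy as np

    def subpixel_halve(gray):
        # Halve the width of a grayscale image, but give each of the
        # three RGB subpixels of an output pixel its own sampling
        # position, so the half-width image keeps most of the original
        # horizontal detail. A guess at the general idea behind
        # SubLCD-like methods, not the exact filter they use.
        h, w = gray.shape
        out_w = w // 2
        # subpixel centers expressed in source-pixel coordinates
        xs = (np.arange(out_w * 3) + 0.5) * (w / (out_w * 3)) - 0.5
        rows = np.stack([np.interp(xs, np.arange(w), row) for row in gray.astype(float)])
        return rows.reshape(h, out_w, 3)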

That is how I think HiDPI works... I may be wrong, since so far I have not found any article that explains how it really works at the technical level...

It would not surprise me if it was implemented in the new Mac OS in anticipation of future desktop Retina screens with 300 dpi or more... when those screens hit the market, this HiDPI thing will reach its full potential...


on Jan 24, 2012

Starcandy
Hey Jorge, what monitors are those, what's the resolution and what graphics cards are you using? I think I may have asked you those questions before, but I have a memory like a sieve these days.

Sorry, don't want to hijack the thread, but here goes:

Two LG W3000H (DVI-D input only) and one HP ZR30w (DVI-D + native DisplayPort). Resolution of each monitor is 2560x1600. Graphics cards are one Radeon HD 5970 (dual GPU) and one 5870, in (tri-)CrossFire.

The original idea was to run Eyefinity across the 3x30" monitors, but I found out that pushing 7680x1600 pixels in Crysis at high settings was too much even for those cards (the CrossFire + Eyefinity issue is said to come from the CrossFire bridge not having the bandwidth to carry such high resolutions; also, the cards only have 1 GB of memory per GPU, which hurts things like 2xAA).
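For a sense of scale, some back-of-the-envelope framebuffer math in Python (a rough sketch only; real drivers allocate more, and MSAA costs vary by GPU and game):

    # Rough framebuffer math for 7680x1600; real allocations are larger.
    w, h = 7680, 1600
    color = w * h * 4              # one 32-bit RGBA color buffer, in bytes
    depth = w * h * 4              # 32-bit depth/stencil buffer
    msaa2x = (color + depth) * 2   # 2xAA roughly doubles those buffers
    print(color / 2**20)           # ~46.9 MB per color buffer
    print(msaa2x / 2**20)          # ~187.5 MB for color+depth at 2xAA
    # Add double/triple buffering plus textures, and 1 GB per GPU gets tight.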

Note that we are talking about 3D games at very high resolutions here; for normal Windows desktop usage a single 5870 is more than enough, and anything beyond that is totally unnecessary.

Now I'm waiting for Intel's Ivy Bridge and the Radeon 7990 (dual GPU) to get a new system. Having a single dual-GPU card, *in my experience*, is much less troublesome than having two or more single-GPU cards in CrossFire. And from the benchmarks of the 7970 I have seen so far, the 7990 should put my 5970+5870 tri-CrossFire system to shame.

on Jan 24, 2012

Cheers Jorge!

on Jan 26, 2012

TobiWahn_Kenobi

Quoting Thoumsin, reply 13:
Instead of rendering a high quality frame at 2560x1440, HiDPI renders a lower quality frame at 1280x720 which is interpolated to 2560x1440 by software...

An original frame at 2560x1440 will always be sharper and have more detail than one rendered at 1280x720 and interpolated to 2560x1440...

I didn't completely understand what you meant there. Apple's HiDPI is just the GUI images remade from scratch at double the resolution (as with the iPhone Retina), to be rendered at double the pixel density.
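A toy Python illustration of why that distinction matters (my own sketch, assuming numpy): 2x upscaling of a finished 1280-wide frame cannot recreate single-pixel detail that a native 2560-wide render gets for free:

    import numpy as np

    # A finished low-res row, upscaled 2x by nearest neighbour:
    lo = np.array([0, 255, 0, 255])
    upscaled = np.repeat(lo, 2)        # [0 0 255 255 0 0 255 255]
    # What a native high-res render could draw (1-pixel checkerboard):
    native = np.tile([0, 255], 4)      # [0 255 0 255 0 255 0 255]
    # No 1280-wide frame, however interpolated, reproduces 'native';
    # hence HiDPI redraws with 2x assets instead of stretching frames.
    print(upscaled, native, sep="\n")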

