
IBM's 5.2M Pixel Flat Panel

An anonymous reader writes "A current prototype of the Roentgen monitor offers a resolution of 200ppi (pixels per inch), with a total of 5.2 million full-color pixels, laid out in a 2,560 by 2,048 grid. Once the production version of the monitor is released, Greier said it will be able to display two full-sized 8.5-inch by 11-inch documents side by side. The article also notes that the monitor needs a four-head Matrox graphics board to drive it." That's ungodly. Sign me up.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by evilpete ( 26941 ) on Thursday July 13, 2000 @05:22AM (#937313) Homepage

    The advantage of a much higher dpi is that you can get crisp large fonts without the need for anti-aliasing.

    You've got to remember to up the font size when you up the resolution on a monitor, otherwise you do end up squinting at tiny text - though sometimes (scanning large web docs, editing html etc.) it is helpful to fit a large body of text on screen at once.

    At the moment it is much easier to read printed rather than on-screen type. Hopefully higher res monitors will fix this pretty soon - or my eyes are going to be dead by the time I'm thirty.

  • I'd prefer it if the monitor was more like 100ppi, or even down to 72ppi, like most "normal" monitors. You'd be able to get 8 "pieces of paper" on the screen at once, at a resolution the human eye can actually cope with.

    And the windows icons wouldn't look like specks of dust....

  • for monitor. Mmmmmmmmmm.
  • I think the disturbing part of all this is how excited we all are (I'm drooling over it, for one) over a 200ppi display. Admittedly, it's far better than current displays, but even a cheap inkjet printer can hit 300dpi without a problem. And I've got a laser here at my office that can do 1200dpi. Does anyone know what resolution the human eye can resolve? IOW, when are we going to get documents on screen that look half as decent as their own printouts?
  • Professional digital photography users are already dealing with resolutions this large. A mid-range 35mm film scanner will produce 6 megapixels. Medium format scans produce even more pixels.

    A professional who is looking at digital backs for $10,000 or so could also be quite interested in this device.
  • Actually, what I have found is that shadow mask monitors tend to converge better over the long run than the aperture grille ones. The third party Sony FD based monitors tend to be the worst in my experience. There is something about that flat screen technology that the OEM's can't seem to get right because the convergence seems to drift over very short time periods (a few hours).
  • Monitors aren't supposed to give the user a big radiation dose. Ya, I know that I'm sitting at the dirty end of a particle accelerator right now.

    Naming their monitor technology Roentgen worries me a bit.

  • 200ppi sounds nice. It's certainly more than the 110ppi I get with my SGI 1600SW flatpanel. Still, I think I'll stick with my SGI. First of all, you can pick up one for around $3000, more or less. Secondly, the "sexiness" of this monitor is second-to-none. SGI did a beautiful job styling this flatpanel. The lines are just as nice when viewed from behind as they are from the front. I've also noticed that, unlike the Sony GDM-F500R monitor (which is Sony's top-end) that sits next to it on my desk, the SGI flatpanel is polarized. I noticed this when I wore my polarized shades into my office one day and saw the screen turn black when I cocked my head 45 degrees to either side. I'm wondering why Sony isn't doing this with their monitors. I think the polarization is what gives the SGI flatpanel its extraordinary anti-glare ability. My office has several large windows behind the desk and in the early afternoon (like right now), it's almost impossible to see the Sony monitor without the blinds closed. Even with the blinds open, the SGI flatpanel is as bright and contrasty as always.

    For a while, the big let-down about this flatpanel was SGI's use of the (now defunct) #9 Revolution IV graphics card and the so-called "OpenLDI" digital interface. Basically, it meant that one card and one card only worked with this monitor. But recently, SGI has released a VGA-to-LDI adapter that lets you hook any video card up to the flatpanel. The question I have is: what video card (besides the #9) supports the SGI's funky (yet wonderful) 1600x1024 resolution?

    Yeah, the IBM flatpanel sounds nice, but I think I'd take three SGI flatpanels on my desk instead.

  • ....

    that I actually want to sit in front of that thing... Roentgen discovered X-rays.... And even though I would like to leave an "imprint" on the world, it doesn't have to be as a shadow of myself on the back wall of my office :)

  • The outside dimensions of the monitor are 21"x16.5". The displayable image is 12.8" x 10.24".
  • It's interesting that a blurry (compared to a monitor) tv can be just as effective at smoothing blocky low resolution stuff as the best, most intensive anti-aliasing effects.

    Actually anti-aliasing on images often happens after the image has been rendered (for best results it should be applied before), and at that point there is nothing else to do but apply a blur filter... so your observation is probably correct :-)

  • Cool as it would be to have the biggest, baddest display on the block, I think it still makes more practical sense to have several smaller ones. (Besides, what could be cooler than four flat panels arrayed in front of you? You can pretend you're ground control for the space shuttle.) Take a look at this website [] for some ideas, as well as this slashdot article. []
  • Did anybody look at the IBM website for this thing? 3 transistors per pixel, 1.64 miles of wiring for the LCD alone... No wonder it's gonna be priced outrageously... In case you can't find the link, click here [] to go there.
  • I'm thirty and my eyes are still okay, so don't worry yourself unnecessarily.

  • by Forge ( 2456 ) <> on Thursday July 13, 2000 @05:27AM (#937328) Homepage Journal
    I think we should file a class action suit against Slashdot for the loss of our keyboards. How do these people sleep at night knowing they have caused so many gallons of drool to clog, short circuit and rust the keyboards of nerds?

    The loss of earnings is staggering, and the sheer human trauma of being unable to use your computer is just mind boggling.

    Rumor has it that they have signed a deal with the guys making the "Happy Hacking Keyboard" to increase sales.

  • by Pope ( 17780 ) on Thursday July 13, 2000 @05:27AM (#937329)
    Roentgen was the scientist who discovered X-rays, which were called "Roentgen Rays" for many years. What the hell is in that thing?! :)


    Freedom is Slavery! Ignorance is Strength! Monopolies offer Choice!
  • What's the point of running such a big monitor for games if the frame rate sucks? Me... I'll stick to Wolfenstein 3D on my 1GHz Athlon...
  • Does that compute? Right now I've got 96ppi/1600x1200 and not quite 11x17 display size.

    I would think you'd have to have quite a bit more than 2560x2048 at that ppi to get a screen that size. Or am I doing the math wrong/missing some calculation?

    In addition, the screen ratio they give is 1.25, as opposed to the 1.33 of most other resolutions.
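The numbers being questioned here can be checked directly. A minimal sketch in Python, using only the figures from the article (2560x2048 grid at 200ppi); the result matches the 12.8" x 10.24" displayable area quoted elsewhere in the thread:

```python
import math

def panel_size(px_w, px_h, ppi):
    """Physical width and height (inches) of a panel from its pixel grid and density."""
    return px_w / ppi, px_h / ppi

w, h = panel_size(2560, 2048, 200)  # 12.8 x 10.24 inches
diag = math.hypot(w, h)             # ~16.4 inch viewable diagonal
```

So at 200ppi the 2560x2048 grid is nowhere near 11x17 inches; it is the *production* model, with more pixels, that is supposed to show two full pages.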
  • As much as I would like to have better screens, I don't think merely higher resolution will solve the problem of viewing a hi-res image all at once without losing resolution. In fact, your eye cannot see full detail on all of a 17" screen at once anyway; you have to change gaze (lots of times) to see all the details.

    GIMP 1.1 has a very nice feature that makes panning of large images a very pleasant experience :-). It gives you a small pop-up showing the entire image at lower resolution, and lets you move around the region of interest in the miniature with the mouse. If you have enough RAM it is actually quite quick, even on large images.

    BTW I also work with large images: aerial photograph databases which I probably never will be able to see on screen at full detail at once :-)

  • IMHO, this is one of the biggest advantages of having a Mac. My PowerBook can plug into a second monitor and use its 8MB of VRAM to drive both at decent resolutions.
    Windows 98 added this feature (a full decade after the Mac did - in 1988) but from what I've heard, it's spotty and unstable. Multiple-monitor support often requires slight modification of the way graphics code is written (all drawing commands have to be sent to both cards), and because Windows apps generally do funky nonstandard stuff more often, M$ instability seems to make sense (though I'm not a Windows programmer). Multiple-monitor support was added at the same time as color support on the Mac.

    Ramble on!
    foo = bar/*myPtr;
  • I have a 14" TFT screen on my laptop + Rage LT Pro 3D card. Quake I, II and III are sharp and clear. I get far less eye strain from my LCD than I do from nearly all CRT screens.
  • I use an HP-FX70 LCD and it is pretty nice. It will run at 85Hz, although I run mine at 75Hz. It has an auto setup that nearly eliminates all of the phase variation (the analog clock lining up with the LCD stripes) inherent in analog LCD panels. A little manual tweaking gives me a 1024x768 display that is every bit as crisp as the 14.1" on my VAIO. It also has a digital interface, but I have yet to find a reasonable card with that connector for an Intel box. After using my VAIO on the road for a month, it was too painful to use a regular old CRT.
  • The European greens have managed to get laws passed affecting the use and disposal of lead, taking effect in 2004.

    Good. No more conventional war in Europe after 2004.

    (Next time we'll have to use bismuth.)
  • controlling a beowulf cluster with one of these?

    (hey, no one else had said it yet.)
  • Multimon support on Win9x and NT is mostly app independent. Usually the worst thing that happens is dialog boxes appear on the wrong screen. Annoying but not fatal and increasingly rare.

    Still, you do need to make sure you have compatible video cards.

  • Still, you do need to make sure you have compatible video cards.

    Some manufacturers (Matrox [], for example) are putting out cards that will drive multiple monitors from a single board. I think that makes the most sense. The thought of trying to persuade boards from different manufacturers to harmoniously coexist gives me acid flashbacks to the early days of the PC. Been there, done that, ain't goin' back.
  • quake III @ 4 frames/second with 4 matroxes driving it.


  • I imagine anyone working in the graphics realm of things would kill for one of these. 200dpi is getting pretty close to laser printer quality.
    It's kind of ironic that nearly everything the Art Department prints out is only 800x600, and sometimes they even scale it up from that, getting something like 60dpi on their printouts. All those lost dots per inch. *sniff* Makes me sad.
  • > I'm starting to see anti-aliased text as *fuzzy* rather than smoother. Go see an optician; you need (stronger?) glasses. But I agree, new screens and high resolutions don't always display crisp images. I still prefer my 5-year-old 15" ADI at home to the Compaq TFT450 I have to use at work :-( Refresh rates and sharpness of LCDs and so on aren't what they should be :-(
  • Oh, yeah, baby... I want it BAD.

    Ever programmed with a dual-head display? Code editor/IDE up on one, references on the other, execution on one, debugger on the other... I miss those projects....

    And the market for this will be HUGE. Once people realize what they've been settling for, how will we be able to take pride in our little .22 dot pitch 1600x1280s? Even the Trinitron doesn't come close. Price'll be a pain, but there are enough different high-fidelity applications for this kind of display (how many will Lucas order to edit SWIII on?). Not just CGI, or IBM's favorite market, CAD - artists, architects, medical folk (like the article mentioned), the defense simulation folks (I know some tank simulators that could use this upgrade).

    Of course, I'll have to sell stock to be able to afford one. :-( It still doesn't qualify as my dream workstation, but it's an improvement. (remember Stellar Cartography from Star Trek:Generations? Now THAT's a workstation!)

  • This is the type of thing that takes ten years to get into the marketplace affordably. It's not a cheap thing - most of the cost is in the hardware itself, and that doesn't come down quickly!
  • If you read the link, it is called Roentgen because it can be used to view X-ray pictures on screen.
  • If you get the chance, take a look at the 1400x1050 screen on Dell's Inspirons. I use a 17" Trinitron at work and the LCD at home. The LCD is easier on my eyes over a period of time. YMMV

  • by Mike Hicks ( 244 )
    Something you may not know about monitors is that having a really big (CRT) screen doesn't necessarily give you the best experience. I was looking at some big Sonys (I probably won't buy one.. I don't think I'll pull in enough cash this summer) and I noticed that the huge 21" displays run at around 96dpi (even at 2048x1536 resolution). However, the monitor I have right now (15", probably about 14" viewable) runs at 1320x992 and has a dpi of about 118. This means that my display can give me crisper text (though I can't fit as much of it on the same size display).

    200+ dpi would rock.
    Stop the MPAA []
  • With technology like ClearType, there is no reason that a page can't be displayed at just as clearly to the human eye at 100ppi as at 200ppi

    Sorry, but this is certainly wrong. Say we have a 10"x10" 200ppi display (2000x2000 pixels) and we have 500 lines of text on it. That leaves 4 pixels for each line of text - with good antialiasing it might be barely readable. With 100ppi you have only 2 pixels per line - there is no way the text could be recognizable. Now if we had a 400ppi display, still 10"x10", we would have 8 pixels per line, which could be readable even without antialiasing (not looking great, though). So a display with more ppi is obviously better. Why do you think a 1200dpi printer gives better results than a 300dpi printer?

    About physical size, I would only comment that it only makes a difference if you cannot choose the distance you view your display from. You may use a 1280x1024 head mounted display and it feels equally big (if not bigger) as a 19" display on your desktop.

    IMHO 10"x10" at 200ppi is equal to 20"x20" at 100ppi when it comes to what you can use it for. You just need to use it from a different distance.

    And a bigger monitor with the same resolution (rather than the same ppi) will be more expensive. Reason: you cannot decrease the amount of light you create per unit area (the display would look darker otherwise). Because all known methods need more energy for brighter light when using the same technology, it will cost more to produce a bigger display (for example, in the case of a normal CRT you need to shoot many more electrons).

    This isn't to say that a smaller display is cheaper, because there is a limit where it becomes too hard to create a smaller device. I think I would be happy with a 300ppi 19" monitor.
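To put rough numbers on the legibility argument, here's a minimal sketch in plain Python; the 10-inch display and 500-lines-of-text figures come straight from the comment:

```python
def pixels_per_text_line(display_inches, ppi, lines_of_text):
    """How many vertical pixels are available for each line of text."""
    return display_inches * ppi / lines_of_text

# The comment's own scenario: 500 lines of text on a 10" tall display.
at_100ppi = pixels_per_text_line(10, 100, 500)  # 2 px/line: unrecognizable
at_200ppi = pixels_per_text_line(10, 200, 500)  # 4 px/line: barely readable
at_400ppi = pixels_per_text_line(10, 400, 500)  # 8 px/line: readable
```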

  • LCD projectors make this easy. I was looking at getting an apartment with a vaulted ceiling. I could have had 1024x768 at 10' square, or TV at the same size... The apartment was too much, though...

    Done this at work a few times, in conference rooms. It's fun!

  • I worked at a medical imaging company and some of the advantages of online X-rays include the ability to bring up file images quickly, view them anywhere/anywhen (including remotely -- Units were installed in places like Bosnia for remote diagnostics). It's also possible to do things like false-color images.

    There are also monetary benefits like film costs and manpower spent hunting and transporting film. And storage: consider how many images you can fit on a raid array, then calculate how much space that would take up in plastic and paper files.

  • >Ever programmed with a dual-head display?
    >Code editor/IDE up on one, references on the
    >other, execution on one, debugger on the
    >other... I miss those projects....

    What's wrong with multiple virtual screens? Does anybody really run without that nowadays? I don't even run Windows without a virtual window manager. Sure I'd never say no if the boss were to offer me the Roentgen monitor, but I can make do with my 21" trinitron and virtual screens for quite a while. $10K is quite a hefty price, and I manage to make the space for a CRT still.
  • As much as I would like to have better screens, I don't think merely higher resolution will solve the problem of viewing a hi-res image all at once without losing resolution. In fact, your eye cannot see full detail on all of a 17" screen at once anyway; you have to change gaze (lots of times) to see all the details.

    Actually, the problem you are describing is due to the size of the image, not the resolution. If the pixels were drawn on the screen at a higher density than usual, then you would be able to see the whole image at once, without having to move your eyes around. And that extra resolution would not be wasted, since your eyes are plenty capable of resolving details at greater than the standard 72dpi or 100 dpi of a monitor (at a distance of about 18" to 20").

  • The article mentioned that the production model would be able to display two 8.5x11 pages. Maybe the production model will be more than 2560x2048 pixels.
  • Don't know if it's still true, since I've been away from the medical imaging business since leaving Sun (they *own* the OEM medical imaging market), but most of the med imaging vendors were using the X Inside (now Xi Graphics) X server to do this sort of thing. It has support for all kinds of high-end imaging hardware. I'd suspect they're still doing this, since it's not exactly the easiest thing for the uninitiated to jump into - there are some really arcane things to know for performance and fidelity. (And radiologists are to displays what the snobbiest audiophile is to stereo gear - they have very well-calibrated eyeballs...)

    BTW: Imaging is very different from graphics. This was one of the revolutionary things about Sun's UPA/VIS architecture in the mid-90's: it was the first affordable graphics subsystem that did a pretty respectable job at both. Previously serious users had to choose which they wanted and select their hardware accordingly.
  • Raises a point of how big/hi-res is enough?

    I'm sitting at a 26-inch monitor at 1280x1024, which is a fair bit of 'bandwidth to the eyes'. But a lot of the time, I'm designing the site I'm working on using the wall behind it - a 'screen' of 2 x 4 metres, with enough 'resolution' to fit 20 closely typed pages across its width.

    I think it doesn't top out for a long time yet...
  • Duh! Instead of developing large LCD-screen technology, I'm working on small LCD-screen technology. So far, I already have a display that is able to display one monochrome pixel. I call this display 'LED'. Wanna try it?
  • I'm sorry, but considering all of the concerns about monitor radiation emission, is it really prudent to name a display after the X-ray?

    "How many six year olds does it take to design software?"

  • How do you get that number?

    The pixel grid is 2560*2048 colored pixels.
    The display size is 21 inches * 16.5 inches.

    That makes about 120 colored pixels per inch.

    Are they speaking about 200 mono-colored (red or green or blue) subpixels per inch, or am I missing something?

    Still, these displays must really be impressive to see; the sad point is that they won't become affordable anytime soon.. Bah!
  • I'm trying to decide why they need four display adapters. The G400 can drive 2048x1536 at 32-bit on one head, so 2048x2560 should be a job for two heads, right?
  • is a single pixel? Why is that cool? The dots on my 'i's are a single pixel right now. They should have said something like "The dot on this 'i' here is 17 pixels!" That would indicate a higher resolution, or at least a bigger font.
  • Good question. I was wondering the same thing - the math doesn't work.

    In any case, this is still *much* less than what will be required if we're ever to get usable interfaces. Even at 2 A-size sheets at a decent resolution, it's still tiny: a quick look around the stuff on my desk reveals 10 roughly A-size documents "open" and the corners of several others peeking out. The surrounding work area and walls have several more pages available for reference.

    So for $10,000, you can get a tiny fraction of the bandwidth of my standard-issue IBM desk. Killing trees isn't going to slow down anytime soon, not until computer desktop bandwidth approaches or exceeds that of the physical desktop. Until then, I'll keep printing out the things I'm working on.

    Really, though, this is a real problem - computers simply can't be really useful until they have big screens so we can stop trying to drive the freeway while looking at the world through a knothole. This is the sort of thing we should put all those extra CPU cycles to. Thank you, Gordon Moore.
  • Lotsa dots means never having to say "anti-aliased text". I used to use a 3-pixel-wide font on my C64 because I was dialling into systems expecting more resolution. I modemed for years using a CGA display. (My employer had leased lines to the big city, but only one modem.)

    I first looked at anti-aliased text on an ATI card my boss was using in 1992. I hated it. When my eye couldn't reliably pick up the edges of the characters, I tired of reading in a few minutes. Meanwhile, back at my own desk, I could read for hours. I was in my boss's office because he wanted my opinion on whether his monitor needed to be serviced. We turned off the ATI feature and kept the monitor.

    I feel the same about the demo of Gibson's anti-aliasing product. The after-image is soft and fuzzy and halfway unreadable.

    My eye will learn to ignore aliasing in a font within a few hours. I never learn to see edges of characters that have been deliberately hidden, and without edges, the font becomes unreadable.

  • With really high resolutions, you also run into a lot of other problems where fixed-size bitmaps were used: it becomes practically impossible to distinguish between all the tiny 16x16 toolbar icons in Windows applications, for example.

    When I worked at PARC I had one of the prototype 7 megapixel displays which had a resolution up near 300dpi (282dpi if I remember right). It was 4-bit grayscale only, not colour, but text looked REALLY nice. A lot of Web sites really sucked, though, because they used frames or tables whose sizes were specified in terms of an absolute number of pixels, which usually meant that I'd see a column containing about 2-3 words per line, since each character on my display was 3x wider (in pixels) than they'd been expecting.

    Antialiased text on this display was just beautiful.
  • The only reason I haven't switched to LCD for my desktop is that I don't know of any quality digital switches, so all my computers can share it.

    Here's one: []
    Since it's a digital interface, quality is not a huge issue, like it is with analog video switchboxes/cables.

  • Arrrrgggggggg! I will beat the dead horse one more time.

    If you have a larger monitor, you can display more PAGES at ACTUAL size. This is what I have been saying all along. If you have a smaller monitor, you can not. At one point in the original article they mentioned that this is very useful in some situations. I agree. I have done newspaper layout before... I want to see a bunch of pages at the same time, at ACTUAL size. Not half size. 100ppi has always been good enough to do this... now give me a bigger monitor.

    A monitor at 100ppi has plenty of clarity to display text on an ordinary printed page (about 12pt) at actual size. I know, I have one. I never said that it would be great when fonts rendered 4 pixels high at actual size... of course that's true, but who puts text like this on a PRINTED PAGE? Even with your fancy 1200 dpi printer, you can't read it.

    As far as the cost goes... we are talking about a LCD display here, not a CRT. They've been doing approx. 100ppi transistors for a long time now... I know, I have them in my laptop's screen. Making them 1/4 the size of that makes each one more expensive, naturally. Yes, they may be more energy efficient, but that's not the point. By making something using 4000000 cheapo 100ppi transistors, you've just saved money, instead of making something out of 4000000 expensive 200ppi ones.

    ClearType is only icing on the cake after that.

    I hope I have made it clear that time. :)

  • > Since it's a digital interface, quality is not a
    > huge issue, like it is with analog video
    > switchboxes/cables.

    You're absolutely right; I guess what I meant to say is "I can't find a digital switch". Good call! It doesn't have quite the PC support I'd hoped for, but it looks pretty good.

    With one of these I wouldn't have to upgrade all my video cards: ne/multilink.html

    Now all I need is $5000... ;)
  • Look at the processor speed race. Soon we'll see a 1.4GHz Pentium IV processor, but apart from being able to finish a SETI work unit in 2 hours (I'm guessing)

    2 hours? pphhhtt!
    Look at the platform stats - there are Alpha machines (at DEC/Compaq?) that finish a unit in around an hour. I remember one used to be around 56 minutes, but I can't see it there. And these are probably 600Mhz (650?) 21264 processors. Intel still has a long way to go :)

    ObSlashdot: Imagine a beowulf cluster of those!
    (actually, we don't have to imagine, do we?)

  • Sure didn't sound like it, because ClearType is a valid technology that could possibly have a purpose here, aside from the fact that text doesn't carry tone that well :)
  • I would like to see a paperback-book-sized screen with the same 200dpi technology. Then a portable device for reading e-texts might suddenly be a killer app. As it is, reading from any current screen technology for a length of time causes me far more eyestrain than paper does. Put this density of screen in one of those Transmeta prototype web pads and you have a killer combination, in my opinion.

  • Slashdot Article [] - December 13th
    IBM Fact Sheet [] linked in Slashdot Article
  • LCDs are subject to ghosting of images due to the relatively slow response of liquid crystal. This is similar to slow phosphors on an old TV, but in the case of TV this actually helps persistence of vision. With LCD you are running a 60Hz refresh, and even turning your head will catch the update. This also happens with slower CRT monitors.

    Also, keep in mind that this display is for viewing X-rays and other mostly still data. It is still a somewhat poor substitute for the original, but the ability to write on the X-ray without damage, plus the instant development of X-ray streams, would be an added benefit. I think the latter already exists in some lab, but I'd also wager it is quite expensive. The former is great for keeping layers of notes.

    Just don't expect any "TekWar" video-tables anytime soon :(

  • SQRT(21^2 + 16.5^2) ≈ 26.7 inches.
    That makes roughly a 68cm diagonal (a 26.7-inch monitor).
    This sounds good for playing Civilization (is the refresh quick enough for Quake?), but it's just a little slow for watching DVDs.
    What about its power consumption, especially compared to previous laptops?
    Does IBM intend to make big (I mean tall, not necessarily revolutionary) laptops using these?
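The diagonal arithmetic above can be checked in a couple of lines of Python, using the 21" x 16.5" outside dimensions given earlier in the thread (note this is the enclosure, not the 16.3" viewable area):

```python
import math

diag_in = math.hypot(21, 16.5)  # enclosure diagonal, ~26.7 inches
diag_cm = diag_in * 2.54        # ~67.8 cm
```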
  • Have a look at some of this stuff [] on very small (still quite high resolution) and very fast refreshing FLCOS displays. They have a 1024x768 display which is only 12.3x9.2 mm in size!!

    Rather than trying to have complicated pixels, from what I can make of it, they build up colours by simply flashing the primary colours at you in different proportions, and with frame rates in the kHz bracket it looks very interesting.
  • Not unless you want to be hauling around a car battery with your now huge, 25 pound laptop....

    Roentgen features:

    200 ppi 16.3 inch Active Matrix Liquid Crystal Display
    diagonal viewing area
    2560x2048 pixels (5,242,880 full color pixels)
    Subpixels are 42 x 126 microns
    15,728,640 transistors
    1.64 miles of thin film wiring on the display
    Aperture ratio of 27.3%
    Backlight power of 44 Watts
    The smallest feature is 5 microns
    The prototype is 21 inches high and 16.5 inches wide, the total depth (including base) is 9.5 inches,
    the thickness of the display is 2.5 inches
    The weight is approximately 20 pounds
    The power dissipated by the new display is similar to the power used by an 18-inch CRT display.

    Not quite ready for mobile applications, apparently (even if they used a TransMeta proc) ;-)

    #include "disclaim.h"
    "All the best people in life seem to like LINUX." - Steve Wozniak
  • I can speak from experience that LCDs aren't too bad. I use one exclusively on my PowerBook, and Quake is not a blur of pixels. Also, my system considers my LCD as having no refresh rate. My display suffers from some color shifting (the worst part of any LCD display), but other than that, I love it. I just wish I had a Cinema Display I could plug it into when I came home every night (well, actually a Magma CardBus -> PCI bridge containing a DVI-output-enabled video card, but who's counting?)
  • For instance, how was the previous IBM prototype [] the same 200ppi/2560x2048 with a diagonal viewing area of only 16.3 inches?

    Pythagoras says we need a hypotenuse of 20.2" to get an 11x17 viewing area.
  • I want to feed the troll
  • Hear hear, though it's especially true for the crap Compaq ships. Even when they take good underlying technology, like Trinitron, they make it the dimmest monitor ever built (P75), or they ship a 19" (S900) with an alleged .26mm dot pitch that looks more like .38.

    I do like my big Viewsonic, though. It all boils down to quality: if you buy a cheap $200 17" or 19", you should expect it to ruin your eyes. Period.
    Change is inevitable.

  • I believe we'll start to see more announcements like these as research into visualization techniques progresses. Look at the processor speed race. Soon we'll see a 1.4GHz Pentium IV processor, but apart from being able to finish a SETI work unit in 2 hours (I'm guessing), it doesn't bring anything new to the table. No matter how fast you make the processor, it's still just pushing around 1's and 0's very quickly.

    Not so with monitors. The field is wide open--and overripe, if sci-fi movie special effects have anything to say about it--for a revolutionary change in the way we view data. Whether it's a 50" flat-screen [] or a CAVE [] environment or a holographic projection [], I think things are going to start changing. And it will start changing the way we see things.

  • Not that a vector-based format for photos would work, though. There are too many things that cannot be accurately described by vectors, and even if they could be, the amount of data required to describe them would be enormous... exponentially more than a similarly detailed pixel format. Still, 10 megs of pixel data for an image on a monitor with half the pixel size will look 1/4 the size it previously did, which means that 4 times the data is required to take advantage of the added resolution. If you blow up the 10 meg file 4 times, you don't take advantage of the higher resolution anyway, unless you take anti-aliasing into account. Lots to mull over. In the meantime, whatever happened to the Photoshop-Illustrator hybrid that Adobe was talking about? It might be a step in the right direction.
  • by ChrisDolan ( 24101 ) <chris+slashdot.chrisdolan@net> on Thursday July 13, 2000 @06:36AM (#937381) Homepage
    This is something I've been looking forward to for a while. In my astronomy research, I usually work with images from digital cameras with 2048x2048 pixel resolution. Even with my 1280x1024 monitor, I either have to shrink the image (losing detail) or do a lot of panning to see the whole thing. A monitor more closely matched to the image size would help.

    As consumer digital cameras approach 2048x2048 resolution, I'm sure graphic artists will start to want high-end monitors like this one, too.

    However current top-end astronomy CCDs are using chips of up to 4096x4096 pixels and new cameras are using arrays of 2-16 of these large format chips. This spring I worked on some data from an 8192x8192 mosaic imager and, boy, was it hard to work with images shrunk by a factor of 8x8 to make them fit on my current-generation screen!
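    Just to illustrate the fit problem in that comment (a hedged sketch; only the image and screen dimensions quoted above are real):

```python
import math

def shrink_factor(img_w, img_h, screen_w, screen_h):
    """Smallest integer binning factor that fits the image on screen."""
    return max(math.ceil(img_w / screen_w), math.ceil(img_h / screen_h))

# 8192x8192 mosaic on a 1280x1024 display vs. on the 2560x2048 Roentgen
print(shrink_factor(8192, 8192, 1280, 1024))  # 8
print(shrink_factor(8192, 8192, 2560, 2048))  # 4
```

    So the new panel would still shrink the mosaic, but by 4x instead of 8x, which is 4 times as many surviving pixels.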
  • I gave away a P200, a P201, two P72's and a G72. The display on my notebook is fine. Half my office is filled with the three glass monitors, the P70 and G70, that are left. It's nice that they're 17+" each and I'm sure the people @ home would want one but...
  • can you really put a price on such a godly monitor? I mean yeah it's $300K, but jesus christ would be drooling over that.
  • Your calculations are essentially correct, but unnecessarily complicated.

    We already know from IBM [] that the diagonal of the viewable display area is 16.3". All we need to do is calculate the number of pixels on this diagonal (sqrt(2560^2 + 2048^2) = ~3278.4), and divide that by 16.3", to get 201.1 ppi.
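    That arithmetic checks out; a quick sketch using only the numbers already quoted:

```python
import math

# Pixels along the diagonal, divided by the viewable diagonal in inches
pixel_diag = math.hypot(2560, 2048)  # ~3278.4 pixels
ppi = pixel_diag / 16.3              # 16.3" viewable diagonal per IBM
print(round(ppi, 1))  # 201.1
```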

  • I remember reading that a big problem with displays with this sort of pixels-per-inch was rendering old programs that used pixels as their main unit of measurement: everything is rendered too small to read. Obviously there are workarounds, but it seems like OS X for the Mac will have a big advantage. I don't know all the specs, but I believe its graphics engine, Quartz, is vector based, with built-in scalability at no loss of image quality.
  • The question I have is: what video card (besides the #9) supports the SGI's funky (yet wonderful) 1600x1024 resolution?

    The first crop of SGI Intel-based workstations, the 320 and 540 series, supported the 1600SW out of the box with their Cobalt chipset. The SGI O2 could also drive it with a special adapter.

    It's really kind of sad that the industry went with the other digital signalling technology, which is encumbered by patents and limited in resolution. Check out SGI's whitepapers [] on the subject. I think the new MultiLink Adapter is way overpriced. They should include it with the monitor IMHO.

  • Don't forget that a printer dot can only be one colour. Consequently, you need to use multiple dots together to create shades of grey or colours. Thus, the effective resolution of a printer is lower than you would think for certain applications. Of course, having crisp 1200 dpi black-on-white text is very nice :-)
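    The standard rule of thumb behind that point: an n-by-n halftone cell trades resolution for tones, giving roughly n^2 + 1 levels at dpi/n effective resolution. A toy sketch (the cell size below is a made-up example, not a spec from any printer):

```python
def halftone_tradeoff(printer_dpi, cell):
    """Effective line resolution and tone count for an n-by-n halftone cell."""
    return printer_dpi // cell, cell * cell + 1

# A 1200 dpi laser rendering grayscale with 8x8 halftone cells:
lpi, levels = halftone_tradeoff(1200, 8)
print(lpi, levels)  # 150 lines/inch, 65 gray levels
```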
  • Hear hear, though it's especially true for the crap Compaq ships.
    It all boils down to quality: if you buy a cheap $200 17" or 19", you should expect it to ruin your eyes.

    You should re-read your reply in this context. I have a Compaq P110 21" at work and it is an excellent monitor. I'll probably never spend that much on a monitor for home, but it's a damn nice monitor. If a consumer or business spends $200 for a 17" or 19" Compaq monitor, I imagine most of the people who use the monitor will assume all Compaq monitors are crap.

    Then again, if I were Compaq and I were selling crappy monitors cheap, I'd probably not put my brand name in an obvious spot on them.

  • It was sarcasm...
  • Well... I don't know if the bank will approve me for a loan THAT big...

  • Thinking about the same thing, I did some simple calculation on bandwidth requirements... 2560x2048 is (as stated) 5Mpixels (using M to mean 1<<20, as in MB). At 32 bits per pixel, that's 20 MB per frame. If we want to display that at 60 Hz, that's a rather hefty 1.2 GB per second bandwidth requirement. One way to ease that is to split the frame buffer across multiple cards, since each frame buffer then only needs to deal with a fraction (here, a fourth, or 300 MBps) of the bandwidth. Reservation: 60 Hz might be more than an LCD uses, so the above figures could be off by a factor of 2 or so. Still, I think there's a problem here.

    If you want to do full-screen 3D graphics (which seems to be high on everybody's wish list, judging from the number of drooling references to Q3 among the posts here :), you do not want your display to "steal" 1.2 GBps of the available bandwidth. You want to use that bandwidth to blast pixels to the screen, read textures and Z-buffer values, etc. It's unclear what the solution to this might be. You could go the route of Bitboys [], and embed the frame buffer in the display core, thus giving wide busses and huge (they mention 12 GBps) bandwidth. However, embedded DRAM currently limits the amount of memory in the frame buffer rather strictly. Bitboys talk about 9 MB, which in the case of the Roentgen wouldn't even allow double buffering the entire screen (remember, each screenful requires 5 MB)... My point, then? Um, I don't know. Memory bandwidth is a hairy thing, or something. Good luck to all engineers involved! ;^)
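    The bandwidth figures in the first paragraph are easy to reproduce (same assumptions as above: 32 bpp, 60 Hz, bandwidth split evenly across 4 heads):

```python
MiB = 1 << 20

def frame_bandwidth(w, h, bytes_per_pixel, hz):
    """Raw scan-out bandwidth in bytes per second."""
    return w * h * bytes_per_pixel * hz

per_frame = 2560 * 2048 * 4          # one 32-bit frame
total = frame_bandwidth(2560, 2048, 4, 60)
print(per_frame // MiB)              # 20 (MiB per frame)
print(round(total / (1 << 30), 2))   # 1.17 -- the "1.2 GB per second" above
print(total // 4 // MiB)             # 300 (MiB/s per head across 4 cards)
```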
  • by Paul Carver ( 4555 ) on Thursday July 13, 2000 @05:53AM (#937395)
    200 ppi

    2560 x 2048
    21" x 16.5"
    two 8.5" x 11" side by side

    2560/21 = 121.9 ppi
    2048/16.5 = 124.1 ppi

    two 8.5" x 11" side by side = 11" x 17" portrait or 8.5" by 22" landscape

    21" x 16.5" is slightly less than four 8.5" x 11" pages in a 2x2 grid.

    So what are the real specs on this monitor?
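    For what it's worth, the 21" x 16.5" figure looks like the odd one out. Taking the 200 ppi and 2560 x 2048 figures at face value gives a much smaller panel (a quick sketch, nothing assumed beyond those two numbers):

```python
import math

# Physical size implied by 200 ppi at 2560 x 2048 pixels
w, h = 2560 / 200, 2048 / 200  # 12.8" x 10.24", not 21" x 16.5"
diag = math.hypot(w, h)
print(round(diag, 1))  # 16.4 -- within rounding of IBM's 16.3" viewable diagonal
```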
  • by CIHMaster ( 208218 ) on Thursday July 13, 2000 @05:54AM (#937397)
    This is why the issue was brought up that what is needed is a vector based GUI. A vector based GUI would behave much like a 3D game would, in that regardless of the pixel count on screen, the objects remain the same size visually. So even if your monitor had 2500 x 2500 pixels, you could have a 1280x1024 (or higher/lower) equivalent resolution with photographic clarity. That sounds REALLY good to me.
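    A hypothetical sketch of the idea: a resolution-independent GUI specifies geometry in physical units (say points, 1/72") and converts to device pixels only at draw time, so objects stay the same physical size on any panel. The function and values below are illustrative, not from any real toolkit:

```python
def to_pixels(points, ppi):
    """Convert a length in points (1/72 inch) to device pixels."""
    return round(points * ppi / 72)

# A 12-point glyph stays 1/6" tall on any screen:
print(to_pixels(12, 100))  # 17 pixels on a ~100 ppi desktop monitor
print(to_pixels(12, 200))  # 33 pixels on the 200 ppi Roentgen
```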
  • No - that's just nice - this [] is godly!
    Email address is real.
  • They had a prototype of this thing at IBM almost a year ago. It took them a full year to announce it as a product, i.e., to put it into industrial production. Somehow I think it unlikely that they will halve the price after putting in a full year of production development and who knows how many years of research.

  • by FascDot Killed My Pr ( 24021 ) on Thursday July 13, 2000 @05:09AM (#937409)
    What does a "godly" monitor do?
  • IBM outta team up with these guys [] - Can you imagine a monster-res 3d monitor? Kinda leaves this whole "real world" thing in the dust!
  • by Junks Jerzey ( 54586 ) on Thursday July 13, 2000 @05:10AM (#937411)
    I don't know if it comes from using PDAs and emulators for 8-bit home computers, but I'm actually starting to prefer lowish resolutions on small monitors. Maybe it's just the realization that I'm usually staring at a small window in the center of a large, expensive, EMF-emitting monitor. Along the same lines, I'm starting to see anti-aliased text as *fuzzy* rather than smoother. I was using an Atari 800 emulator the other day and, believe it or not, I really got into the sharp, chunky feel of the text.
  • by georgeha ( 43752 ) on Thursday July 13, 2000 @05:10AM (#937412) Homepage
    They still seem slow to me, especially when dragging a window around.

    I've had the chance to play with a Sun Enterprise rackmount server with a flat panel LCD; it sure is nifty looking, but the slow refresh rate is too distracting.

    I imagine playing Quake or Doom on this would be lackluster, just a bunch of smeared pixels.

    Are they ever going to make the refresh rate better?

  • by 11223 ( 201561 ) on Thursday July 13, 2000 @05:10AM (#937413)
    Do you really think that you could afford one of these babies? I don't think that VA is paying you that much...

    That much said, expect around a decade before this technology works its way down to a price point where you can buy it cheaply. Right now it's mainly for kick-ass CAD, which IBM has been targeting very heavily with its workstations recently.

    Personally, I think the best part of this is the fact that Matrox gets attention out of it - they never seem to get as much attention as they should!

  • Imagine a Beowulf cluster of... wait, it practically NEEDS a Beowulf cluster to RUN it.

  • The cost of the materials that go into one monitor is less of a factor than the number of monitors that come off the production line flawed and need to be discarded. We're dealing with the fabrication of tens of millions of microscopic components here, and if even a handful of them are botched in production, the resulting monitor panel will be unsaleable. When someone pays $10,000 for one of these flat panel displays, they're paying not only for the display they got, but for the X number of displays that (on average) came off the line too flawed to sell.
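    A toy illustration of that yield math, with a purely made-up per-subpixel survival probability (real fab numbers are proprietary): if each of N subpixels independently survives fabrication with probability p, the whole panel is flawless with probability p**N, which collapses fast as N grows.

```python
def panel_yield(pixels, subpixel_ok=0.9999999):
    """Probability a panel has zero defects, assuming independent subpixels."""
    subpixels = pixels * 3  # R, G, B per pixel
    return subpixel_ok ** subpixels

y = panel_yield(2560 * 2048)
print(round(y, 3))  # chance a 5.2M-pixel panel comes out flawless
```

    Even a one-in-ten-million defect rate per subpixel would scrap most panels at this pixel count, which is roughly why the price tag is what it is.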

    -- WhiskeyJack

  • by Tom7 ( 102298 ) on Thursday July 13, 2000 @07:24AM (#937426) Homepage Journal
    I've played Quake on my Vaio's LCD, and it's the best picture I've ever had (better than my Trinitron, yes). Even at 10"! It's also a lot, lot more comfortable to stare at all day. There are bad LCDs, yes, but try out some new ones...

    The only reason I haven't switched to LCD for my desktop is that I don't know of any quality digital switches, so all my computers can share it.
  • I must say I am a little skeptical. About every 2 years, for the past 10 years or so, there has been an announcement from some lab concerning a breakthrough in display technology (the last one I heard about was from TI and was going to provide 300dpi on a 20' screen). None of these technologies has ever come to market; we are still pretty much stuck with big tubes or expensive (and somewhat slow) LCD's.

    Now, I would love to see a breakthrough in technology and to have a 1200 dpi display device (other than paper that is ;-) ). But forgive me if I don't believe it until I see it at Circuit City at a price of less than 1000 USD.

  • Here is one godly [] version of this monitor.

  • by Watts ( 3033 ) on Thursday July 13, 2000 @05:12AM (#937435)
    It's sort of interesting that they're using the four-head Matrox boards to power these things.
    While consumers are now seeing boards that have output for two monitors from Matrox, according to a friend of mine, Matrox makes a lot of specialty boards like the one mentioned. Some of the four screen models are used in financial institutions or somesuch.

    As for the technology driving it, it's a massive board (or combination of boards) powered by the G200 chipset. Matrox may be making these based on the G400 (or even G450) by now, but I'm not sure.

    IBM must be using some sort of tiling scheme to display the stuff. Xinerama in hardware? :)
  • by CalmCoolCollected ( 210423 ) on Thursday July 13, 2000 @05:17AM (#937439)
    Roentgen []
  • This is Yet Another Big Overpriced Flat Panel. There are lots of those. Really Big Displays with Really Big Price Tags have been around for a while now. Visit any trade show.

    When somebody can get the price of this thing down to around $2000, that's news.

  • How will this influence pricing of conventional LCD screens by different manufacturers? And will IBM licence this technology to other companies so we will see for example Toshiba notebooks with ultra-high res screens? How about viewing angle of these displays? Is it the same as in ordinary TFT?

    OK. That's more than one thing I'd like to know.
  • I'm drooling already, even though I won't be able to afford such a thing for at least a year. But in the way of everything electronic, today $10,000, next year $4000, year after $1000, and in three years, they're giving them away ('cause the things are obsolete).

  • by Raleel ( 30913 ) on Thursday July 13, 2000 @05:20AM (#937453)
    I did a search and couldn't come up with it, but this was mentioned before. I know I did once as well, in a report on Supercomputing 99 as a comment to a Comdex/Las Vegas 99 report. I played with it a little. The resolution is insane... they showed a map of a 20 mile x 20 mile area of New York as part of the demo. Every single street was displayed. They pointed out the dot on an i of one word and said that was a single pixel. It is really truly nuts, but the graphics head to go along with it is mighty pricy ;)
  • Hopefully they can use that new Microsoft ClearType technology to make the text look better.

"If you lived today as if it were your last, you'd buy up a box of rockets and fire them all off, wouldn't you?" -- Garrison Keillor