IBM Ships First 22" 200dpi Displays

wonko writes: "IBM has begun shipping new monitors that are as much as 12 times sharper than current displays, and 4.5 times sharper than HDTV. These new 22-inch active matrix liquid crystal displays use aluminum-based technology and have over 9 million pixels. IBM will soon be licensing the technology to other display makers, so you could soon see these screens in laptops, PDAs, cellphones, etc. Pardon me while I wipe the drool off my keyboard ..." This is the same high-definition display you read about here earlier. They are not yet in CompUSA, to put it lightly -- first examples are going to Lawrence Livermore -- but the trickle-down effect in a couple of years is promising.
  • by SpinyNorman ( 33776 ) on Saturday November 11, 2000 @06:50AM (#630343)
    These new 22-inch active matrix liquid crystal displays use aluminum-based technology and have over 9 million pixels. IBM will soon be licensing the technology to other display makers, so you could soon see these screens in laptops, PDAs, cellphones, etc.

    22-inch display on a cellphone? Damn!
  • by Anonymous Coward

    Actually, for photographs it is newspaper/magazine quality. According to agfaphoto [agfaphoto.com] and Kodak, magazine quality is 150-175 lpi, and for art-quality books/magazines it is 175-250 lpi. [kodak.com]

  • Unfortunately, they are going for $30K apiece and are only making a few (10!) per year.

    I would be very, very surprised if these were as cheap as $30K each. I expect these things are far more expensive. They're shipping them to Lawrence Livermore, for Christ's sake -- not some little $20 million dot com where a VC might blanch at the bill.

    Of course, since they're being used with ASCI White, I'd have to imagine that however many millions of dollars these 10 displays are going for, they make up only a small part of the total rental and service contract bill for any given month. Heck, IBM may even have just tossed 'em in as a promotional item, like the toy in a Cracker Jack box. "FREE! With every $100,000,000.00 purchase, a 22" super display! Offer available only for US government (and Batman)."
  • True, but you should compare the new technology to the best that is available now, not to the average.

    Why?
  • by pb ( 1020 )
    Well, I used to have a program that would send a "high-res" fax, at 300dpi. (or 300x100? 300x150? I'm confused...)

    But yeah, I mentioned the anti-aliasing; I think that would make up for most of it. Not true color printing at 600dpi or greater, but on a lit screen I'm sure it would look amazing, much like HDTVs do. :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • It's about time to do that. I remember back in the 80's when Scottie made Windows with Aluminum. He sat down at an old Pee Cee and commanded it to draw up plans. When the old machine failed to respond, Scottie sighed and took up the keyboard. After some typing, he had Aluminum Windows, which was a vast improvement at the time. Strangely enough, all I've ever seen on my Pee Cee has been MS Windows and X. I suppose MS is what happened to Al and X is an improvement that escaped by accident.

    It's been nearly 15 years and it's time to move on. Copper should not be impossible now that we have been shown the way. Copper Windows should be heavier but flexible, stable, and lasting. Let's do it!

  • Windows 9x hasn't used bitmapped widgets since day one. You can change the size of the font on the title bar and it will automatically change the size of the widgets on the titlebar.

    The only thing that remains bitmapped about the Windows GUI is the icons, and I'm betting that 32x32 would make a droolingly good small icon on this display.

    Must.. get.. a loan..
  • I have both a DELL i500e notebook (1600x1200 15") and a Sony CPD-M151 TFT monitor (1024x768 15"), and the colour on the Sony is definitely much richer. I'm no expert, but I definitely think the colour quality of the Sony is right up there with a CRT.

    Once you switch to TFTs you can never go back :-)

  • Or just use OS X.

    That's strange, I thought you were a "be fan."

    "Extraordinary claims require extraordinary evidence."
  • Your criticism against X is simply not true. I have a serious eye problem, too. I must move my own monitor to work. I got a high-end 21-inch monitor, and with a virtual desktop of 1200x1600 the resolution is configured all the way down to 300x400.

    To give you an idea: right now I'm holding up a ruler, and your post in Netscape measures a bit more than 1/4 inch for the caps, a little more than 0.7 cm. This is acceptable to most people. When I zoom in to the lowest resolution, each capital character measures 0.7 inch.

    X is much better than Windows for people with eye problems. XFree lets you change resolution with one keystroke; Windows takes at least half a minute to change resolution. This lets you navigate at slightly higher resolutions and zoom in when you need to read something.

    If you've got eye problems, you simply cannot use any low-end display, and you must change monitors every 3 years. Don't be a cheap bastard; make sacrifices on your hard drive and CPU, and buy the best monitor/video card you can afford.


    And geez! If you've got an eye problem, maybe you should stop playing computer games and go out some more. You need your eyes to do something more productive on the screen.



    What you need is a good XF86Config file, fine-tuned, and a very fast pointing device (mine is a Logitech TrackMan FX) that moves super fast across the desktop. Or put in a second 21-inch monitor for a dual head. If the monitor becomes a little blurred, THROW IT AWAY, or get the manufacturer to replace it like I do.

    And don't whine about desktop publishing, either. I used to be a graphic designer. Graphic designers do not demand as crisp a monitor as programmers and CAD operators do. They don't need to look that hard.

  • by dawg of the south ( 238923 ) on Saturday November 11, 2000 @07:03AM (#630353)
    Hummmm...... Aluminum, imagine what they will do when they switch to copper...

    You know like the CPU manufacturers did. Coppermine liquid crystal displays will rock.


    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  • Let's hope this thing doesn't cost as much as an HDTV, yikes!

    "The good thing about Alzheimer's is that you can hide your own Easter eggs."

  • Speaking of the need for vector based interfaces... Berlin [berlin-consortium.org] anyone? :-)

    -----
    "People who bite the hand that feeds them usually lick the boot that kicks them"
  • All of a sudden those 64 MB video cards don't seem to be that much. At 9 million pixels, this display would require a 36 MB frame buffer.

    You don't want to be pushing that through the system at 60 fps. So none of that "but can you play quake on it" stuff.
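    A rough back-of-the-envelope check (a sketch in Python; the 32-bit colour depth and 60 Hz refresh below are my assumptions, not anything from IBM's spec):

    # Rough framebuffer and refresh-bandwidth estimate for a 3840x2400 panel.
    # Assumes 32-bit (4-byte) colour and a 60 Hz refresh; both are guesses.
    width, height = 3840, 2400
    bytes_per_pixel = 4

    framebuffer_mb = width * height * bytes_per_pixel / 2**20
    refresh_gb_per_s = width * height * bytes_per_pixel * 60 / 2**30

    print(f"framebuffer: {framebuffer_mb:.0f} MB")        # ~35 MB
    print(f"60 Hz refresh: {refresh_gb_per_s:.2f} GB/s")  # ~2 GB/s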

  • I used to do tech support. A customer told me that he had bought a flat panel display, back when they were really expensive. I asked him why he needed one, and he answered that he had no more space on his desk. I told him that for the price of the monitor he could buy a new desk, or break down a wall and rebuild it a little further out.
  • Copper is a pretty troublesome metal to work with at such scales. It doesn't etch very nicely. The copper in CPUs is all in the interconnects between layers. The layers themselves are still aluminum.

    So I guess I wonder if there is much application for copper in these things. Not sure.
  • And don't forget that about half of the dimensions in stock HTML are specified in pixels. Time to start weeding those out, too.
  • by xjesus ( 231140 ) on Saturday November 11, 2000 @07:30AM (#630360)
    I wonder how many bad pixels they are allowing on this thing. With 9+ million pixels they have to let a few bad pixels pass in order to get the thing out the door. I wonder if they will let that number increase, compared to other lower-resolution flat panel displays, because of the increased pixel count. Then again, each bad pixel will be smaller, so it wouldn't be as noticeable. Here's an excerpt from the spec of one of their current mass-produced high-resolution displays:
    The LCD monitor has about three million sub-pixel transistors laid over an area of about 328 square inches. Cost effective manufacturing processes produce a small number of defective sub-pixel transistors. The monitor may have up to 10 bright or 15 dark sub-pixels with no adjacent pair of dark sub-pixels, not more than four clusters of two bright sub-pixels and not more than three bright or 15 dark sub-pixels in any circle of 10mm in diameter.

    If they use the same standards as for the T86D [ibm.com], you would have up to 68 bright pixels or 102 dark pixels... ouch! I hope they improve their manufacturing yields by the time this thing hits consumers.
  • I spent a lot of money on my 19" and 20" dual monitors. Now I need to go and buy two 22"? Damn.

    I thought that these monitors could be a long-term (5 yrs) investment. Unlike a processor that must be upgraded AT LEAST every 2-3 years.

    At least my Palm will have a better screen someday...

  • >I've seen the monitor, it cries out 'Radiation
    >EMITTED STRONGLY'.

    I've never seen an LCD with an emissions sticker on it.

    -LjM
  • I want one, now!
  • From the MSNBC article (which is really just a blurb):

    • but it won't be available in retail stores for another five years, according to IBM officials

    Shipping, sure, but to various famous scientific laboratories only.

  • /etc/X11/fs/config

    default-point-size
  • It wasn't enough that my CPU, memory size, and battery life were becoming obsolete in my laptop (purchased less than a year ago). Now my screen is becoming obsolete too....
  • Why can't they use this kind of display on my TV? I'd really like to get a nice TV that'd be that sharp. I just wasted 800 bucks on a TV that has a "3-line digital comb filter" and it's hardly sharp. Why waste that kind of resolution on a computer, where any program that fully utilizes the monitor would fill the hard drive in a heartbeat? I'd rather use that kind of technology with a streaming display.
  • by Anonymous Coward on Saturday November 11, 2000 @06:12AM (#630368)
    >From the my-laptop-will-be-huge! dept.

    Yes! 22 inch laptops that can finally cover the average lap of the sysadmin!

    Tim, you spot the trend before everyone else....
  • Why? Hm... I guess it makes sense to compare new technologies to the best that is available, and not to the average, because there are already plenty of things that are better than average.

    Suppose there were already monitors with 10 million pixels; then this news would not have been newsworthy at all.

  • by pb ( 1020 )
    This sounds great; we're up to fax-quality dpi, but in full color. Not quite up to the "virtual paper" level yet, but probably really close, especially with a little anti-aliasing.

    And even though these aren't available to the public yet... How much do they cost, and when can we expect to see these in the home? The answer had better not be "2010" still. :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • Actually, I think I remember reading (this was a long time ago now) that the Earth-Moon system is more a binary planet than a planet-satellite system. Meaning that the Earth and Moon actually rotate around a point in space (closer to Earth due to its larger mass).

  • Yeah, Windows and X is all you've ever seen on your PC. But Scottie in ST4 was using a Mac!! ;-)
  • All I can think about is how great Half-Life would look at that kind of resolution.

    +++++++++++++++++++++
  • Comment removed based on user account deletion
  • 9 million pixels x 4 bytes/pixel = 36MB framebuffer! Does anyone know what kind of graphics cards they have on those supercomputer terminals?
  • If it doesn't make my X-11 Netscape Fonts look any better, then I don't need it.
  • You could NOT break down a wall and rebuild it farther for $3000. Also, a lot of people don't like a big monitor hogging their desk. If you're working in a cozy study or in the corner of the living room, then a flat panel display is ideal. Additionally, if you're a minimalist, then a standard CRT probably doesn't go with the "look" of your house. Not to mention he might just be tired of frying his brain with cancer causing emissions.
  • And what kind of pull does he have to rate being first on the list for the first decent monitor?
  • Actually, you wouldn't be pushing 36MB x 60fps to do Quake. Remember, 3D is vector graphics. Aside from textures, the data rate should be the same whether you're rendering at 1600x1200 or 16,000x12,000. Also, 50fps video at that res would require about 1.8GB/sec of bandwidth. While that's not within the realm of AGP 4X, it is within the 3-5GB/sec of bandwidth that even mid-range SGIs have.
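    A quick sanity check of that bandwidth figure (a sketch; the 4 bytes/pixel is my assumption, not stated above):

    # Uncompressed video bandwidth at the panel's native resolution.
    pixels = 3840 * 2400
    bytes_per_frame = pixels * 4              # assumed 32-bit colour
    gb_per_s = bytes_per_frame * 50 / 1e9
    print(f"{gb_per_s:.2f} GB/s at 50 fps")   # ~1.84 GB/s, close to the 1.8 GB/s quoted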
  • Do you realize how few people know that 200dpi==40,000 pixels per square inch? You're talking about a country where 1/5th of people don't know whether the earth goes around the moon or vice versa. (Which, BTW compares favorably to the 1/4-1/3 in parts of Europe.)
  • Actually those are pretty loose guidelines. Some companies (like Micron) have much tighter standards for their displays.
  • Good for you. It's rare for a person to be so willing to demonstrate both arrogance and ignorance...oh wait, that's not true.
  • by be-fan ( 61476 ) on Saturday November 11, 2000 @07:50AM (#630383)
    Or just use OS X. Quartz should be perfect for scalable GUIs (when, of course, we get HW acceleration for Postscript primitives.) I dream of a day when the user can specify a scaling size for their desktop, and program designers can work in a virtual coordinate space without having to worry about resolution.

  • Quake may be vector graphics, but your video card isn't. It would take 5 Geforce2 GTS Ultras (each with more than 7 GB/sec memory bandwidth) to draw enough pixels for 60fps @3840x2400.
  • Nope - as they are bitmaps - they will be smaller.
    :)

    --
  • All I can think is how great Lemmings would look at that kind of resolution. I could see the whole map in a corner of the screen, and if I buy a microscope I can even see the lemmings! :)

    --
  • The amount of unjustified X Window System bashing is amazing: while some design issues are starting to emerge now (13 years later), resolution independence is not one of them.

    • The toolkits applications are developed with (Athena, Motif, GTK+, Qt) compute the dimensions of widgets (buttons, textfields, etc.) from the dimensions of the text they contain.
    • The X server knows (or can be told) the DPI resolution of the display.
    • One of the 14 fields of a font name is used to choose the font size in tenths of typographic points: this is combined with the known DPI resolution of the monitor to obtain an actual size in pixels.
    • Toolkits derived from Xt (Intrinsics) have a mechanism for the user to specify fonts for every class of widget via X resources (i.e. your ~/.Xdefaults file). In short, it works for every Athena or Motif application.
    • GTK+, which is not derived from Xt, has a nice ~/.gtkrc in which you can do the same thing (and you can do it from your Gnome control panel).
    • Qt, which is not derived from Xt, has a similar mechanism to specify font sizes.
    • Many complain that the Helvetica and Times fonts supplied with the X Window System are bitmapped. So what? Use Adobe Type 1 fonts (like Utopia), or TrueType fonts (Arial and Times New Roman), which are outline (vector) fonts.

    So, unless the GUI developer was a moron and specified absolute dimensions for widgets, there is resolution independence. Just specify the size of your fonts in tenths of typographic points (the 8th field): the toolkit's layout manager will then make your app equally usable at 640x400 and 1600x1200. A small example follows.
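    A minimal sketch of that point-size-to-pixels interpolation (the 12-point size and DPI values below are only illustrative):

    # How an XLFD point size (field 8, in tenths of a point) combines with the
    # server's DPI to give a pixel size: same physical size, different pixel counts.
    def point_size_to_pixels(tenths_of_points, dpi):
        points = tenths_of_points / 10      # e.g. 120 -> 12 pt
        return points / 72 * dpi            # 1 pt = 1/72 inch

    print(point_size_to_pixels(120, 75))    # ~12.5 px on a typical CRT
    print(point_size_to_pixels(120, 200))   # ~33.3 px on the IBM panel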

    The real problem here is only with bitmaps: they have an inherent size in pixels. Buttons containing only a bitmap won't scale (unless the developer arranges things specifically, i.e. scaling the bitmap to be n rows of text high).

  • It's called DLP (Digital Light Processing), and it's from Texas Instruments. It's used in digital projectors, and I believe it's the technology used in the new theaters with a digital picture.
  • Ohh, then don't complain; I've got the 7000 with a 14" screen... :(
  • The classic Photoshop rule of thumb was to have an image DPI 3x to 5x the LPI of the press.

    Actually, that will just give your imagesetter a headache from the extraneous data you're downloading. 2xLPI is almost always an adequate resolution, and 2.2 is the absolute maximum you'll ever need. See http://www.adobe.com/support/techdocs/c29e.htm [adobe.com] or Ch. 3 of the Photoshop manual.

    Danny

  • Hrm, is someone moderating their own submissions?

    Methinks yes.
  • by Enahs ( 1606 )
    PDF is PostScript
  • Even if it were at CompUSA, I wouldn't buy it. I worked there as a part-timer for close to 4 months, and the corruption involved there -- everything from selling TAP (warranties) under penalty of being fired, to kicking people out when they wanted customer service -- was deplorable.

    Ever since then, when a friend wants a computer part I go to another store or get it from an online source.

  • 1.64 miles is 2.64 kilometers.

    Whatever happened to metric?

    Didn't we learn from that NASA flub last year? The future is not in miles or feet or inches.
  • by Anonymous Coward
    Well, the current high-end cards come with 64MB onboard (don't know if this can all be used for the framebuffer, or if for 3D-use only...). So in a couple years, this should be affordable enough for mainline desktops.
  • Where do they get these numbers?

    The resolution is something like 3500x2500. The best commercially available displays have something like 2000x1500. 3 or 4 times sharper is more like it.

  • That was said in a /. discussion about a newly found small 'planet' with a mass less than Pluto's.
  • As for scalable graphics, this will be really interesting to see. One of Windows greater failings (IMHO) has always been its lack of geometry management. Most Windows apps basically nail things to specific X,Y positions in a dialog, rather than having a fluid layout which specifies relative attachments. (This is one area where Motif does something better than Windows).

    I haven't done any Windows GUI programming so I can't really compare, but GTK+ still has many distances measured in pixels. Getting resolution dependence out of our applications will take quite a long time.

    The other question I have is whether scalable graphics for the GUI is really feasible on existing 72 dpi displays with all the aliasing effects that implies. Does the new Mac interface really use vector graphics for *all* its icons and such? If we've all got 22" 200dpi displays, sure, but that's not going to happen for a long time yet.

  • By the time it had reached the PC, it had evolved.
    Here's a snippet from a DCA/Intel spec for an early API:

    "
    1 1 Transfer type:
    0 - 200x200 dpi, fax mode.
    1 - 100x200 dpi, fax mode.
    2 - File transfer mode.
    "

    which included the new (at the time) high resolution fax mode.
    Incidentally, it wasn't even 100*200, the original ITU spec (T.4 I think) specified 196*98 dpi.

    However, you're right, after that it evolved even further.

    Best bit of code I ever wrote was a T6 (group 4 Fax) decoder.

    FP
  • That 5 years to market might also be a safety measure. Why? Because no video card on the market right now can handle that many pixels in 2D yet. At roughly 9 megapixels, that means 36MB per screen page at full resolution. Any game will therefore require at least 72MB for double-buffering... more for 3D. Lastly, can you imagine the abusive CPU load required to ferry that many pixels from system RAM to video RAM? Next thing you know, Internet Explorer will be using offscreen video RAM for its browser cache, since it will surely require at least 128MB onboard. Just plain nuts!
  • Think printing. Right now your monitor has a resolution of 75 dpi, and bitmaps are made for that resolution. But you can still print them on your 600 dpi laser printer. Although you don't take 100% advantage of the 600 dpi, it'll still look better than when it's printed on a 300 dpi printer.

    I don't know how dpi settings work in X, but I can imagine that X could be made to use 3x3 pixels for a single pixel from a 75 dpi bitmap (200 dpi is almost 3 times as high as 75 dpi). A rough sketch of that idea follows.
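    A minimal sketch of that pixel-replication idea (plain Python, nothing X-specific; the 3x factor comes from 200/75):

    # Naive pixel replication: blow a 75 dpi bitmap up by an integer factor so
    # it keeps roughly the same physical size on a ~200 dpi panel.
    def replicate(bitmap, factor=3):
        out = []
        for row in bitmap:                                   # each row is a list of pixels
            wide_row = [px for px in row for _ in range(factor)]
            out.extend([wide_row[:] for _ in range(factor)])
        return out

    icon = [[1, 0],
            [0, 1]]
    scaled = replicate(icon)
    print(len(scaled), "x", len(scaled[0]))                  # 6 x 6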

    So, all you have to do to be ready for such a display is alter X. At a later stage you could rebuild/enhance your wm to use high-res bitmaps. Or you could scale them, etc..

    Thimo
    --
  • "...going for $30K a peice and are only making a few (10!) per year."

    Wouldn't 10! a year be 3,628,800 per year?

    Yes folks, that was a joke.
    ------------------------------------------- --------------------
  • All bodies orbit around a common point in space, the common center of mass. The Sun and Earth do it too. If you look at the Sun over a long period of time, it will look like it's wobbling, because it is orbiting around a point that's not exactly at its own center (though, given the mass of the Sun, it's pretty damn close). The common point that the Moon and Earth orbit around is about a thousand miles under the surface of the Earth, so you can still say that the Moon orbits around the Earth. If it were outside the Earth, then it might make more sense to call it a binary system.
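    For the curious, the arithmetic behind that "thousand miles" figure (a sketch using rough textbook values):

    # Earth-Moon barycentre: how far below the Earth's surface it sits.
    earth_mass = 5.97e24        # kg
    moon_mass = 7.35e22         # kg
    separation_km = 384_400     # mean Earth-Moon distance
    earth_radius_km = 6_371

    barycentre_km = separation_km * moon_mass / (earth_mass + moon_mass)
    depth_miles = (earth_radius_km - barycentre_km) / 1.609

    print(f"{barycentre_km:.0f} km from Earth's centre")    # ~4700 km
    print(f"~{depth_miles:.0f} miles below the surface")    # ~1050 miles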
  • by MrBogus ( 173033 ) on Saturday November 11, 2000 @08:01AM (#630404)
    Printing presses are fundamentally different from computer displays. Look closely at a magazine, and you'll notice that the "dots" are 1) arranged diagonally, and 2) of variable size.

    The classic Photoshop rule of thumb was to have an image DPI 3x to 5x the LPI of the press.
  • Not the only error. Current displays have something like 2000x1500 resolution. How is 3800x2500 12 times sharper?

    Not even 4 times sharper as a matter of fact!

    People who write these articles need to take some remedial math classes.

  • Ok, for accelerated games you wouldn't have to pass that much data over the AGP bus. The data rate may remain the same, but you still have to draw many more pixels per frame. If you get 60 fps on a 1600x1200 (1.92 Mpixel) display, then on a 9 Mpixel display you will probably end up with 13 fps.

    On the other hand, not everything that appears on the screen is a game. Any non-accelerated fullscreen app will most likely suffer.
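    Spelling out the fill-rate scaling above (a sketch that assumes the card is purely fill-rate limited):

    # If a game is fill-rate bound, frame rate scales inversely with pixel count.
    old_pixels = 1600 * 1200              # 1.92 Mpixels
    new_pixels = 3840 * 2400              # ~9.2 Mpixels
    new_fps = 60 * old_pixels / new_pixels
    print(f"{new_fps:.1f} fps")           # ~12.5 fps, i.e. the ~13 fps mentioned above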

  • by Anonymous Coward
    Named, fairly sure, for Ernest O. Lawrence, inventor of the cyclotron; also located in Livermore, Calif., fairly sure. Very interesting stuff goes on there. Full name is Lawrence Livermore National Labs, also fairly sure. (llnl.gov?)
  • However, 2000x1500 isn't exactly a common resolution today. Many more people have 1024x768.

    3840 x 2400 = 9,216,000

    1024 x 768 = 786,432

    9,216,000 / 786,432 = 11.7
  • Who are you to tell a customer what they can and can't do? There's too much interruption from assistants and support these days. A consumer should be allowed to make irrational choices.


    DEW YEW KEEP A TROSHIN
  • Finally, my eyes may get a rest. I have considered display technology to be one of the major hangups to further progress towards the mythical, so-called paperless society. Just like the flicker of Saturday morning cartoons (accompanied by the requisite overdose of simple carbohydrates), my old CRT makes me EVIL!!! #-( They may say five years, but I bet the process will accelerate. Hope to be not-tanning in front of one of these in a couple of years.
  • As a visually impaired person, I have embraced the low-resolution displays of computers in the 1980s. I loved the 40-column displays. And when I got an Amiga, I saw no problem with using 80-column mode on my TV, even though it seemed blurry to everyone else. Windows at 640x480, however, pushed it to what I thought was pretty much the limit of my comfort.

    Then, I went to a LAN party, and saw all the 20/20 people doing Windows at 1600x1200, on 15-inch monitors, and complaining that "It starts to get a little blurry on my monitor when I try that..." Then I tried installing Linux and X-Windows on my own machine, and found that X-Windows was meant to never ever NEVER run in 640x480, because all the applications I found seemed to be designed for 1024x768 -- even though they had 7-pixel-high fonts.

    This new era of high-resolution displays struck fear into my heart, that in ten years all computer applications will run at 3000x2000 resolution, with 10-pixel-high fonts. And do you seriously believe that people won't design web pages to be "best viewed at 5000x4000"? Or that they aren't already?

    But, in the short term, while a 640x480 or 800x600 large-fonts display is still a realistic option, a display like that might actually be a good thing. See, most LCD screens only work at a certain resolution -- 800x600, 1024x768, etc. If you try to decrease the resolution, you get either a big black border of wasted space, or you get random patterns of thick and thin pixel rows and columns. Either way, it's ugly. But if you start at 3000x2000, it becomes less ugly, because you're not alternating single rows and double rows of pixels anymore; you're alternating quadruple and quintuple rows of pixels. This would be good, not just for me, but for gamers who might want to play different games at different resolutions. Starcraft, for example, still plays only at 640x480 if I'm not mistaken.

    Of course, the best option would be if people designed everything to be actually scalable for a change. MS Windows has some support for scalability; you can set 800x600 for "Small Fonts" or "Large Fonts" and it works fine with most, but not all, apps. Other objects change size too, such as icons. Bitmaps, however, will always be bitmaps, and that affects web pages. Have you ever played Sissyfight [sissyfight.com]? A 200-pixel-high window, but 6-pixel-high fonts abound. Or Pixeltime [pixeltime.com] -- only usable because the pics can be zoomed and the text is largely inconsequential. Hopefully, when people think in inches instead of pixels, we'll see fewer sites like those. I just hope the backlash doesn't create pages that say "Optimized for a 22-inch display," though such a thing would better expose the inherent arrogance of such a design choice.

    Now, I imagine some of you are drooling over this display for the reason my friends always give for their insanely-high resolution: "Just think of how many more windows I can have open at once!" Of course, after a certain point, it would be easier on the eyes and wallet to just use two displays. Break that down into cost-per-pixel, cost-per-square-inch, etc. Perhaps dual displays might even have organizational advantages: "The 17-inch display is for code, the 15-inch display is for man pages and instant messages."

    Of course, none of this applies to desktop publishing, where the situation demands something as close to paper as humanly possible. Or video production, in which having a pixel-perfect HDTV display window would be very useful. But for mortals, well, we'll just see whether we use this power for good or evil.

  • by lowy ( 91366 ) on Saturday November 11, 2000 @08:11AM (#630412) Homepage
    A friend of mine who works at IBM sent me a screenshot. Looks great:

    <IMG src="22inch.png">
    <DISPLAY WARNING> If you can read this message your monitor is not high enough resolution to view this picture.
    </DISPLAY WARNING>
    </IMG>
  • In CompUSA, when it says ".28", it doesn't mean DPI, it means dot pitch. .28 means that each dot is .28mm, measured diagonally.
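    Roughly converting that dot pitch to dots per inch (a sketch that ignores the diagonal measurement, so it is only an estimate):

    # Approximate dpi of a .28 mm dot-pitch CRT.
    mm_per_inch = 25.4
    dot_pitch_mm = 0.28
    print(f"~{mm_per_inch / dot_pitch_mm:.0f} dpi")   # ~91 dpi, versus 200 dpi on the IBM panel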
  • by Anonymous Coward
    >14 DPI -- Think Apple II in lowres mode. Breakout ...

    Nah, the Apple II lowres mode was more like 5 or 6 dpi. You're thinking of double-lowres. :D
  • by Sax Maniac ( 88550 ) on Saturday November 11, 2000 @09:27AM (#630415) Homepage Journal
    I saw this in person at the SC2000 IBM booth a few days ago... the guy hawking it called it "Big Bertha". It was amazing. Seriously, no story can do it justice as you have to see it in person.

    Unfortunately, they are going for $30K apiece and are only making a few (10!) per year. The seller seemed confident that the price/availability would be going down/up very soon.

    pr0n jokes aside, I know more than a few graphic artists who would rip out the liver of their best friends for one of these.

    As for scalable graphics, this will be really interesting to see. One of Windows' greater failings (IMHO) has always been its lack of geometry management. Most Windows apps basically nail things to specific X,Y positions in a dialog, rather than having a fluid layout that specifies relative attachments. (This is one area where Motif does something better than Windows.) Geometry management scales with resolution or font size, where absolute positioning doesn't.

  • Yeah, that was actually a typo - I meant x2. Apologies for being so incorrectly "informative".

    There were some cases when we had to go up to x4+ in the old days to fix certain output issues. Too many years and too many jobs ago to remember why exactly.
  • a) Well, that's because the article was written by marketing people whose job it is to make things sound good to the drones.

    b) but there aren't, so it is.
  • You can change the size of the font on the title bar and it will automatically change the size of the widgets on the titlebar.

    I don't have a Windows machine anymore but as I recall this just expanded the bitmaps it did use for the widgets, so you get blocky widgets.

    Things may have changed in the three years since I stopped using it.

    TWW

  • Although you don't take 100% advantage of the 600 dpi, it'll still look better than when it's printed on a 300 dpi printer.

    I do not agree; I think this just looks crap. I assume you mean that you would make each pixel a 2x2 square. Uggh!

    At a later stage you could rebuild/enhance your wm to use high-res bitmaps.

    This is just a kludge, the real issue is trying to come up with a solution which is portable to less well-endowed displays. Scaling is the only way to go.

    TWW

  • Actually, even the 204 figure itself is incorrect. 3840x2400 pixels on a 16"x10" display comes out to exactly 240dpi -- not 204dpi.

    240dpi! Holy moly! Now I know what I want for Christmas (2001?).
    --
  • Just how fucking imperceptive are you?

    ---------------
  • Have a hangover?
  • Oops, looks like I misread the article. I mistakenly took "aspect ratio of 16 to 10" to mean that the actual dimensions are 16" x 10". Since the diagonal is 22" rather than 18.87", this is obviously incorrect.

    Assuming that the diagonal is exactly 22", the actual dimensions of the display are 18.66" x 11.66". This works out to 205.8dpi, which is more or less consistent with the article.
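    A quick check of those dimensions (just restating the parent's numbers in Python):

    # Panel width/height from a 22" diagonal at a 16:10 aspect ratio, then the dpi.
    from math import hypot

    diagonal_in = 22
    aspect_w, aspect_h = 16, 10

    unit = diagonal_in / hypot(aspect_w, aspect_h)    # inches per aspect unit
    width_in, height_in = aspect_w * unit, aspect_h * unit
    dpi = 3840 / width_in

    print(f'{width_in:.2f}" x {height_in:.2f}"')      # ~18.66" x 11.66"
    print(f"{dpi:.1f} dpi")                           # ~205.8 dpi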
    --
  • You're comparing the linear dimensions. They're comparing the total number of pixels.
  • I saw something like this that IBM was demonstrating at their booth at CHI2000 in The Hague last April. It was called "Easy on the Eyes". They had a 23" monitor that (if memory serves) had 300dpi resolution. It was simply incredible. They had divided the screen up into quadrants, and each quadrant had a 16MB video card running its quarter of the screen.

    As an example they were showing X rays. If you got your face right up next to the screen, you still couldn't identify individual pixels. In fact, it looked just like a piece of paper. The difference between this setup and a normal hi-res monitor is simply indescribable.

  • We don't have resolution-independent operating systems/applications

    Under Windows 9x, right-click on the desktop (unless you are using a web page with Active Desktop and a JavaScript [right-click trap] [everything2.com]) and choose Properties. In the Settings tab, click Advanced. In the General tab, choose Font Size: Other... and crank the res up to 192 dpi.

  • by Anonymous Coward
    porn 4.5x sharper than it is on hd, shweet.
  • I just acquired an 18" LCD at work, a Philips. It's impressive at 1280x1024 so the IBM must be awesome. However, with all the emphasis on high resolution, what do we know about the IBM's ability to display colours? I ask because my LCD seems not to do greens very well, they look a little washed-out.

  • by stu_coates ( 156061 ) on Saturday November 11, 2000 @06:21AM (#630442)

    The Register [theregister.co.uk] has a little article [theregister.co.uk] about this as well. They say that it will be used alongside ASCI White... so the world's fastest computer gets the world's best display... droool... Dear Mr. Bank Manager...

  • 200dpi screen is actually much better than a fax, and probably better than a 300dpi laser printer.

    Anti-aliasing is partly to credit (especially with the 600dpi sub-pixel arrangement), but more important is that a 200dpi pixel is 1/200 inch in size. A 300 dpi laser printer has dots significantly larger than 1/300 inch, so the display will be much crisper. I think. I don't know why I know this, but I am pretty sure about it.
  • Could you please give a 3-second description of the correction process? I had understood that the argument was that as long as the subpixels were contiguous with the color-carrying full pixel, no fringing would occur. I had also attributed the Apple II's fringing to the fact that the pixels were quite large.
  • by Pudding ( 9094 ) on Saturday November 11, 2000 @08:52AM (#630453) Homepage
    Limited number of programs?

    OS X runs all classic MacOS software, runs Carbon/Cocoa-based apps, and can also compile most Unix apps, including X Windows packages (I have XFree running in OS X, for instance, and run Apache/PHP/PostgreSQL/sshd on the same box).

    That's hardly a limited number of programs.
  • True, it does depend on the reference system. However, in astronomy the reference point is the common center of mass. Since that center of mass is about a thousand miles under the surface of the earth, you can in all truth say the moon orbits about the earth.
  • Another reason why I support GUIs switching to a fully accelerated model. Back in 1993, I saw a Matrox card that did HW-accelerated TrueType. Given that we have 100 times as many transistors to work with these days, why can't they make a card that accelerates all of the PostScript primitives? I'd buy one, and the QuarkXPress guys would kill for one.
  • Firstly, fax is 200x100 dpi

    So this is not just twice that, but with 3 independent colours, you can use sub-pixel antialiasing and it appears substantially better than fax resolution-wise. Even ordinary greyscale antialiasing makes it appear higher quality than a fax.

    I can wait 5 years. But everything already looks a blur to me anyway!

    FatPhil
  • by joshv ( 13017 ) on Saturday November 11, 2000 @06:32AM (#630464)
    We don't have resolution-independent operating systems/applications. Thus, all that will happen on these displays is that everything on your Windows or Linux desktop will just look smaller, not crisper and sharper.

    I might be wrong though; I think OS X with its Display PDF engine could actually make very good use of these displays.

    -josh
  • Just imagine pr0n on one of these! The boobs would be more real than in real life!

    Ok, maybe not MORE real.. most of them have implants.

    Not that I know, or anything!


    ------------
    CitizenC
  • by nagora ( 177841 ) on Saturday November 11, 2000 @06:36AM (#630467)
    It's time now to start getting rid of bitmap based window managers. All those nice open and close widgets and the icons on your desktop for applications, drives, printers etc are going to be tiny on these things.

    Making them into bigger bitmaps will just make them non-portable to older/cheaper machines.

    Time to get scalable icons working, whether you're Windows or X. There should be just enough time before these start hitting the market in bulk, although print and design houses will want these displays sooner than most and will pay for them.

    TWW

  • I recently set up a Mac for a gentleman who can only read text that is about 1.5 inches high. I set up his 19" monitor to run at 1024x768 and then jacked up the font size. He and I tested various modes, and we found that the higher resolution was best. The reason was that the useless windowing crap was small enough that it did not take up huge amounts of screen space, but the important text information was still readable.

    Not all was perfect, though. While MacOS was very good at scaling the fonts and icons, some of the applications were not. Word in particular is very bad at scaling fonts. The zoom feature seems to be pixel based. If the document is zoomed to 400%, the fonts look awful, because they are very blocky. Instead of taking a 12pt font and scaling it to 400%, the pixels of the 12pt font are expanded 400%. The hack was simple: two macro keys were programmed to switch the font between 12pt and 60pt (or 48pt, can't remember).

    The problem is not the high-resolution monitors, but rather the software that does not scale its fonts. I am very disappointed with how most software treats fonts. The user should be able to control the size of all fonts. If software were designed properly, it would not care what font size was used.

    Web pages are a whole other matter. There is no need for a web page to dictate what font and font size is used. Web designers who need to do that generally make ugly, hard-to-read pages that don't have much content.

  • by bellings ( 137948 ) on Saturday November 11, 2000 @01:09PM (#630476)
    I imagine some of you are drooling over this display for the reason my friends always give for their insanely-high resolution: "Just think of how many more windows I can have open at once!"

    I'm afraid I'm thinking of this in almost exactly the opposite way you are. My vision is reasonably good -- slightly better than 20/20 without glasses, even better with glasses. But text is difficult to read from a standard computer display for me, too. Guess what -- it's difficult for anyone to read. Why? In part, because standard displays have awful, awful, awful resolution. And with the standard, antiquated software that comes on nearly every computer made, the size of the text on the display is dependent on the resolution. The better I make the resolution, the smaller the text gets -- it's unbelievable to me that I'm still using software shitty enough to demand that. But hey, what can you do? It's not like it's a new millennium yet (wait another month and a half for that).

    I guarantee that as high resolution displays become available, the idea that the size of the text on the monitor is somehow tied to the resolution of the monitor will go away. Think, for example, of printers -- imagine if someone said to you today, "I only buy the lowest resolution printers I can find. In fact, I prefer the old 120 dpi bubble jets. That way, the text looks bigger when I print, and it's easier to read." You'd look at them as if they had a huge screw loose inside their head. "Why," you'd think, "would anyone on earth believe the resolution of the printer would affect the size of the text? The text is always scaled to be the same size -- the lower the resolution, the blockier the letters get. Lower resolution makes it harder to read -- not easier."

    With any luck at all, in 10 years resolution independent display drivers will exist, and the idea that higher resolution is somehow "harder" to read will go away. Unless, of course, you're still using X windows. Bleh.
  • If it really were "204 pixels per square inch" that would be one of the worst resolutions ever made since it would only be around 14 dpi.
  • nautilus already supports SVG icons... (this is similar to the PDF in MacOS X, except it's open and free)
