Slashback

Slashback: Memory, Constancy, Triumph

Tonight's Slashback brings news of how you can help rebuild the foundations of the Internet (at least a small corner of them), more on slimming down the old cathode ray tube, a new compiler which costs a bit more than GCC, and more.

Why not put 'em on Freenet while you're at it ... Imran Ghory writes: "Google has put out an appeal to get NetNews CDs (produced by Sterling Software and CD Publishing Corporation) which archived Usenet from 1992 to 1995. Looks like Google is reviving Deja's idea of a total Usenet archive."

This sounds like a worthy objective, worth rooting around for -- maybe they'll even give you a credit somewhere.

They know that of which they speak. Hot on the heels of the inexorable GCC project's 3.0.1 release, zealot (and a number of other people) wrote with the news that "Intel will release its latest compilers (the ones that optimize for P4 and can do some auto-vectorization of code) for Linux this Thursday. I'd love to see some performance numbers for compiled code on a P4 if anyone gets their hands on this ... maybe the autovectorization could help some gimp plugins speed up."
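
For those wondering what auto-vectorization actually does: it rewrites loops whose iterations are independent into SIMD instructions (SSE2 on the P4) that process several elements at once. A minimal, hypothetical sketch of the kind of loop such a compiler targets (the function and the restrict hints are our illustration, not Intel's code):

    /* The classic vectorizable loop: the same multiply-add applied
     * independently to every element. "restrict" (C99) promises the
     * compiler the arrays don't overlap, which helps it vectorize. */
    void saxpy(float *restrict y, const float *restrict x,
               float a, int n)
    {
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];   /* -> one SSE2 op per 4 floats */
    }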

You cannot stop the chess updates Álvaro Begué writes: "Junior is the new World Micro Computer Chess Champion, Shredder won in the single processor category (five years in a row) and Goliath won the blitz tournament. Congratulations to all of them. Check out the official website."

Maybe the durned things will stick around forever. In addition to the IBM research on making ultra-slim CRT monitors, an Anonymous Coward points to another article on the future of CRTs: "This is a new technology that can integrate into existing production lines and can halve the depth of a CRT type tube. A TV normally 22 inches deep would be only 11 inches."

  • I've always felt CRTs were easier to look at. Maybe 30" monitors won't be out of the question with this new technology!
    • by Anonymous Coward
      Seriously. My current CRT takes up way too much room on my desk. I'm looking forward to having larger screens using less deskspace.

      LCDs are nice, but you miss out on the flood of radiation pouring out the front.
      • I know what you mean. Years of computer work have altered my biological structure to the point that if I'm away from a CRT for more than an hour or two I begin to feel weak.
    • Really? Since using a 13.3" TFT LCD (on a Toshiba 2805) I've hated switching back to CRTs. In fact, I use my laptop at work rather than a company-provided desktop w/ CRT. (And until Ricochet died [ricochet.com] I would often not plug in to the company network, but that's another story.)
    • by BigBlockMopar ( 191202 ) on Thursday August 23, 2001 @10:32PM (#2211852) Homepage

      This is a new technology that can integrate into existing production lines and can halve the depth of a CRT type tube. A TV normally 22 inches deep would be only 11 inches

      This is nothing really new, just an incremental improvement. I'd like some technical info before I can decide whether this is a genuine advance or just a marketing stunt.

      When TV sets first came out in the 1940s, their CRTs more resembled oscilloscopes. They were long, with small screens. Their deflection angles were about 25 degrees.

      As the early 1950s dawned, TV sets started to feature electromagnetic deflection. New horizontal and vertical output tubes were suddenly able to support the current requirements of deflecting the beam 45 degrees towards a new big-screen 17" display.

      The 1960s saw the beginning of the embrace of color television. As there are three electron beams in color TV sets, the neck was bigger than in monochrome sets. More deflection current was required to drive a 17" color set than a 17" black and white. High-tech new beam power amplifier tubes were developed to deal with the loads - compactron tubes like the 6LU8 and 21GY5 replaced the venerable 6BQ6. The spillover was that the mass-produced new high-power deflection tubes could also be used to make tighter deflection angles on black and white sets; the 19DUP4 was a Philco B&W picture tube released in 1965. It had a whopping 110 degree deflection angle, making for a TV set that had a 19" display but was only a foot deep.

      Solid state TV sets using high-power MOSFET transistors have been able to handle the bigger current to drive new tight-deflection 110 degree color tubes. So far, it's been incremental.

      But there remains a problem. A TV set's deflection yoke has to be driven with a sawtooth wave. There's a slow ramp up in voltage, then it quickly snaps down to off. Then another slow ramp and another quick snap. This corresponds to the beam sweeping sideways across the screen and then resetting to the left hand side very quickly.

      Because the output amplifiers are neither fully on nor fully off, they're running in linear mode. All the energy not actually used to drive the yoke during the ramp is simply wasted as heat. But that energy isn't free... won't these things have to meet Energy Star and other certifications? Tighter deflection means more deflection current, which means more wasted power in the amplifiers... and even if the EPA buckles and defines a looser guideline for the thin monitors these purport to be, they'll still be competing with LCD monitors.
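
      A back-of-envelope sketch in C of that claim (every number below is assumed purely for illustration, not the spec of any real set): integrate the power dropped across a linear output device while it ramps the current through the yoke.

          #include <stdio.h>

          int main(void)
          {
              const double Vs = 120.0;    /* supply voltage (assumed) */
              const double L  = 1.0e-3;   /* yoke inductance, 1 mH (assumed) */
              const double R  = 2.0;      /* yoke resistance, ohms (assumed) */
              const double Ip = 4.0;      /* peak sweep current, A (assumed) */
              const double T  = 63.5e-6;  /* one NTSC scan line, seconds */
              const int    N  = 1000;     /* integration steps */

              double heat = 0.0;
              for (int k = 0; k < N; k++) {
                  double i = Ip * k / N;            /* sawtooth ramp, 0..Ip */
                  double v = L * (Ip / T) + i * R;  /* voltage across the yoke */
                  heat += (Vs - v) * i * (T / N);   /* the rest heats the amp */
              }
              printf("wasted per line: %g J (avg %g W)\n", heat, heat / T);
              return 0;
          }

      Push Ip up, as tighter deflection angles require, and the waste grows with it.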

      LCD will win.

      The CRT will always be with us, but its time in the mainstream is coming to an end. This sounds too much like a marketing ploy, and goes too far against physics to be anything else.

      • "Because the output amplifiers are neither fully on nor fully off, they're running in linear mode. All the energy not actually used to drive the yoke during the ramp is simply wasted as heat. But that energy isn't free... won't these things be meant to deal with Energy Star and other certifications? Tighter deflection means more deflection current means more wasted power in the amplifiers... and if the EPA buckles by defining a new guideline for thin monitors like these will purport to be, they'll be in competition with LCD monitors."

        Suppose we drive the yoke with the filtered output of a d/a converter? Instead of a big honkin' MOSFET being driven in linear mode, we use... let's say... a 16-bit sawtooth generator. Now, I can't think of a DAC IC that would be beefy enough to do the job, and amplifying its output puts us in the boat we are trying to avoid. However, 16 somewhat stout transistors could be used to drive the yoke directly through an appropriate LC network. Now, 16 transistors sounds bad, but maybe it could be done with 8 or 12. Instead of a big hot power waster being driven in linear mode, we have an array of transistors being snapped on and off in digital mode. I don't design CRTs, but I can't imagine why this hasn't occurred to real engineers. There might even be power ICs for this purpose.
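
        To make that concrete, a minimal sketch (component values assumed, purely illustrative) of the stepped sawtooth an N-bit counter would produce if each bit gated a binary-weighted current source fully on or off:

            #include <stdio.h>

            #define BITS 8  /* the post suggests 8-16 switched stages */

            int main(void)
            {
                const double lsb = 0.016;  /* LSB source current, A (assumed) */
                for (unsigned step = 0; step < (1u << BITS); step++) {
                    double i = 0.0;
                    for (int b = 0; b < BITS; b++)
                        if (step & (1u << b))      /* bit b gates source b */
                            i += lsb * (1u << b);  /* binary-weighted sum */
                    printf("%u %.3f\n", step, i);  /* staircase ramp, snaps to 0 at wrap */
                }
                return 0;
            }

        The LC network would then have to smooth those 2^BITS steps into something the yoke sees as a clean ramp.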


        • Instead of a big hot power waster being driven in linear mode, we have an array of transistors being snapped on and off in digital mode

          Okay. How do you make the big array of transistors output the linear voltage that you need? A big ladder of resistors...

          So, instead of having the cost of one transistor, the manufacturer has that much more to deal with. The price rises and the reliability drops.

          Remember, these things are gonna be sold to idiot consumers, who can't understand technical benefits of anything.

      • Because the output amplifiers are neither fully on nor fully off, they're running in linear mode.

        This isn't true, at least for horizontal deflection (which requires the most energy). The output amplifier is basically running in switching mode; the sawtooth is generated by the energy stored in and released from the yoke's inductance. The dI/dt energy released can be stored elsewhere for the next cycle (in another inductor or in a capacitor) or just dissipated -- but not in the amplifier.

        You're absolutely correct that wider deflection angles require more drive energy (for a given beam energy). Unless they've found a way to do more deflection before the beam is fully accelerated (which would reduce deflection energy requirements while making focusing more difficult), these units are going to suck massive amounts of power.
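
        For scale, the energy that has to be stored and recycled each sweep is just the yoke's inductive energy, E = L*I^2/2. A tiny C sketch with assumed (made-up) values:

            #include <stdio.h>

            int main(void)
            {
                const double L = 1.0e-3;   /* yoke inductance, H (assumed) */
                const double I = 4.0;      /* peak sweep current, A (assumed) */
                const double f = 15734.0;  /* NTSC horizontal rate, Hz */
                const double E = 0.5 * L * I * I;  /* joules per sweep */
                printf("per sweep: %g J\n", E);
                printf("if dumped instead of recycled: %g W\n", E * f);
                return 0;
            }

        With those numbers it's ~8 mJ per sweep - well over 100 W if it were simply dissipated - which is why the flyback recycling matters.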

        -Ed

        • This isn't true, at least for horizontal deflection (which requires the most energy). The output amplifier is basically running in switching mode; the sawtooth is generated by the energy stored in and released from the yoke's inductance. The dI/dt energy released can be stored elsewhere for the next cycle (in another inductor or in a capacitor) or just dissipated -- but not in the amplifier.

          It's nice to finally hear from someone else on Slashdot who apparently has some clue of electronics! :)

          But I beg to differ. Maybe not in more modern TV sets and monitors, but on most stuff right up to the mid-80s, you could clearly pull the sawtooth off the plate of the horizontal oscillator or vertical oscillator using an oscilloscope.

          Resonance is what keeps the TV set efficient enough to be practical, but it's not what makes the sawtooth. It's far too fundamental to be trusted simply to the resonance of the yoke.

      • The CRT will always be with us, but its time in the mainstream is coming to an end. This sounds too much like a marketing ploy, and goes too far against physics to be anything else.

        Speaking as someone who has almost entirely converted to LCD (I run dual SGI 1600SW LCDs at home in Xinerama mode. Mmmm - nice!), I have to admit it falls down badly when it comes to photo work. The colors really aren't real, and the viewing angle makes monitor calibration all but impossible.

        I do all my code development on LCDs. But when I need to do photo retouching, it will ALWAYS be done [in the end] on a CRT. Sad but true.

  • by Chairboy ( 88841 ) on Thursday August 23, 2001 @08:03PM (#2211410) Homepage
    Crikey! Usenet archives going back even FURTHER? Great, now people will be able to trace me back to my great Usenet roots....

    "My name is Dave Rhodes. In September 1988 my car was reposessed and the bill collectors were hounding me like you wouldn't believe"....
    • I said some foolish things on USENET, but fortunately that seems to have been just before the Google archive begins. I'm really hoping that none of my postings will be discovered...

      I guess I can still be held accountable for my youth.
      • Great googly moogly, I also hope so. I used to be a conservative back then, arguing with liberals on the internet since 1988. Well, they convinced me, and I don't really want to be reminded of just how dumb I was when I was 20.
    • Just think of it as a way of proving to newbies that you were on Usenet way before it was cool.

      "My name is Dave Rhodes. In September 1988 my car was reposessed and the bill collectors were hounding me like you wouldn't believe"....

      MAKE ENEMIES FAST!!!!

    • That's ridiculous - everybody knows that you make lots of money in real estate - you buy the houses with someone else's money! No risk! Even twin midgets can do it!

      (P.S. No offense to twin midgets)
  • "This is a new technology that can integrate into existing production lines and can halve the depth of a CRT type tube. A TV normally 22 inches deep would be only 11 inches."

    All thanks to those changing laws of physics! [slashdot.org]

  • by CptnKirk ( 109622 ) on Thursday August 23, 2001 @08:14PM (#2211458)
    The topic is Slashback: Memory, Constancy, Triumph, yet there isn't any mention of memory. Maybe they forgot. :)
  • Compiler costs (Score:1, Insightful)

    by Red Moose ( 31712 )
    Jesus Christ! $399-499 for a goddamn compiler! Surely that sort of thing can't fly in the current tech meltdown. Wouldn't a more sensible price (say, under $100) be better? I would expect fewer companies to invest in this; with all the belt-tightening going on, aren't free things more attractive now?

    Also, are they doing those mods in compliance with the GPL? And someone give me a goddamn reason why glibc 2.2.4 should not be compiled with GCC 3.0.1. I did it and it works without any problem (then again, I don't know jack about the real reason).

    • I guess you haven't priced compiler licenses lately...

      GPL has nothing to do with this compiler. And the reason is, it generates faster code.
    • Jesus christ! $399-499 for a goddamn compiler!

      That's not very much at all for a company. Especially if it will optimize enough to give a measurable performance increase. If you get a 10-20% increase for free (ok $500), it's well worth it. Compare this cost to what it would cost you to pay an engineer to optimize his code.


      • Compare this cost to what it would cost you to pay an engineer to optimize his code.

        The optimizations that an engineer would make would have a much more dramatic effect than tickling some opcodes.

        • Sometimes, yes. In some cases though, the compiler has more tricks up its sleeve. The issue is that a mature compiler has been programmed using the combined optimisation strategies from many _really_ good engineers; unless you've learnt every trick that all those engineers know, the compiler may be able to out-perform you, given the same piece of code to implement.

          Of course, the compiler can only work with the C that it's given by the coder. There are things you can do, like making structures an even power-of-two size, which will speed the code up; this is a trade-off against memory usage which only the coder can make. But after that, it's up to the compiler to make it as efficient as possible. For instance, on some processors a compiler may implement an integer multiply by 9 as "shift-by-3, add original value", which is often faster than a single multiply instruction, and most engineers wouldn't write their C this way.
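
          In C, a minimal sketch of both tricks (names hypothetical, purely illustrative):

              #include <stdint.h>

              /* What a compiler may emit for x * 9: a shift and an add,
               * often cheaper than a single multiply instruction. */
              int times9(int x)
              {
                  return (x << 3) + x;
              }

              /* The source-level trade-off only the coder can make: pad
               * the struct to a power-of-two size (16 bytes here) so that
               * indexing an array of them is a shift, not a multiply. */
              struct record {
                  uint32_t id;
                  uint16_t flags;
                  uint8_t  pad[10];   /* 4 + 2 + 10 = 16 bytes */
              };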

          Grab.

    • Well, by using Linux those companies are saving tons of money compared to other systems. Hence, they have more money to spend, and many would probably prefer to spend it on a compiler that builds the tightest possible runtime assembler code - especially if the company makes products that can't leave anything to chance by using something like gcc, or that require that level of optimization. Look at other compilers (like MS VC++); those cost around that much too, so I see the cost as no surprise.
    • Ah I see. In my quick page-scanning substitute for actually reading, I thought Intel were releasing a modified version of GCC, but after proper reading I see the comment was just on "hot on the heels of GCC".


      /me slaps himself across the face

    • Re:Compiler costs (Score:4, Interesting)

      by kurt555gs ( 309278 ) <kurt555gs&ovi,com> on Thursday August 23, 2001 @08:47PM (#2211594) Homepage
      The new corporate America. You would think Intel would be giving this away, seeing how AMD is kicking their butts, and without this optimization the P4 is a slug.

      Now they want to charge to make their dog chip work right?

      And no one else sees this?
    • Re:Compiler costs (Score:2, Insightful)

      by mattis_f ( 517228 )
      Since Intel is mainly a hardware producer, you'd think they would give away their compiler and even open up the source for it - and thereby boost their chip sales. Apparently, a program sold today is worth more to Intel than ten P4s in a week. Of course, they will have to offer support and maintain the compiler as well - which will cost money in the future. They're making a lot of strange decisions over in Santa Clara these days.
    • I'm pretty sure Jesus didn't set the price, although I'm sure he could produce a compiler that would justify it!
      • by Anonymous Coward
        No blasphemy is allowed here. If you take the good Lord's name in vain again I *will* call the FBI and you will be imprisoned for life.
    • It *would* make more sense for Intel to release the compiler either free or Free, if only to foster Pentium 4 acceptance.
  • Thanks, MSNBC (Score:1, Interesting)

    by 4thAce ( 456825 )

    S-Cubed works by bending beams of electrons in a way that allows the electron gun -- which shoots out the beams -- to be moved closer to the screen.

    This, to me, is like saying "S-Cubed works by making CRTs smaller." With what, hyperspace? Gee, do you think you could be a little more specific?

    Would appreciate it if someone could find a relevant patent application.

  • ...do they really work? If so, why doesn't AMD develop similar software for Athlons? Are they just too small... or is the Intel stuff a bunch of marketing phooey for PHBs to swallow? Are there any more questions?
    • They do, and it works well. IIRC, they submitted benchmark results to SPEC where a Pentium chip (not sure which one) smoked several others in many benchmarks. SPEC rejected those benchmarks because Intel used a special proprietary compiler for the tests and not a normal compiler a developer would use.

      Hence, Intel has compilers of their own that work very well; why they weren't made public before now, like this Linux one is, I wish I knew, as they could undermine MS-VC in terms of compiled-code performance.
      • Intel has sold a compiler for Windows for a long time. It can also be integrated into VC++. Though it also costs $399, and most people who buy VC++ for $99 aren't going to pop for another 400 to marginally improve the runtimes of most executables. (Though games/video/encryption and other code that can make use of SSE2 would benefit greatly.)
        • Once again, if you're using the cheapo "standard" version, you aren't using the real VC++ compiler. The optimizing compiler is only present in the professional and enterprise versions, which respectively cost around $500 and $1000, IIRC.

          The standard compiler is great for fucking around, but you really want the real one for production systems. At my office the Windows weenies have an MSDN Universal subscription, so they have all the cool toys anyway. If you aren't familiar with the wacky world of Windows, the MSDN Universal subscription is about $3000 per year and includes monthly (!!) shipments of the latest patched Microsoft OSs (all of them... Win2k Pro and Server, Me, et cetera), Visual Studio Enterprise (which includes VC++, InterDev and a whole bunch of other shit), plus beta releases of upcoming products. If you're an MS shop it's pretty sweet.

          It may come with other toys; I'm not really sure, I'm not in the Windows group (I'm in the "web" group, we run AIX) and just use their VC++ install media on my NT workstation.

          Come on, NT Server licenses cost $600-800 a piece. You think they're going to practically give away their fast compiler?

      • SPEC rejected those benchmarks because Intel used a special proprietary compiler for the tests and not a normal compiler a developer would use.

        Do you have evidence of this? I see plenty of SPEC CPU benchmarks using Intel compilers.
        • I found out about it during my college CPU architecture course. It was mentioned in the book for the class, which has since been sold back to the bookstore. The book was: David Patterson and John Hennessy, Computer Organization and Design: The Hardware/Software Interface, Second Edition, Morgan Kaufmann Publishers, 1997.

          I don't recall which Pentium chip this happened with or when it happened; all I remember is that it did.

          I've also been corrected by others on Intel making Windows compilers available for purchase. They do offer compilers and optimizers for purchase. I am assuming it is these compilers that the SPEC data you are viewing was generated from.
    • Basically it has some built-in optimizations that try to make your code run as fast as possible on the P4. In fact, to get decent performance on a P4 you pretty much HAVE to use a P4-optimized compiler, or assemble it yourself (assuming you know the ins and outs of the P4).

      Intel probably has a document somewhere that will at least enumerate exactly how you should write your code for the P4 (at the machine level), the Intel compiler just follows that standard.
    • I've never used them, but Intel does provide high-performance math libraries [intel.com]. So, their compilers probably have real technical optimizations as well (not just marketing fluff).
    • My dev server is Intel (hey, give me a break, the AMD760MP wasn't available three months ago), and I've considered buying the Intel compiler. But I'm wondering, would it run on my Athlon workstation? Obviously I wouldn't be able to use the Intel extensions -- or the AMD extensions, for that matter, ha ha -- but would it run at all? Knowing Intel, I wouldn't be surprised if the compiler refused to run on an Athlon.

      I imagine that Intel's compiler market is rather small. Almost every Windows shop runs either Microsoft's or Borland's compilers, most GNU/Linux shops run GCC, and UNIX shops run either GCC or a vendor's compiler (but most UNIX shops don't run x86 anyway, heh).

  • by KidSock ( 150684 ) on Thursday August 23, 2001 @08:19PM (#2211487)

    Would you be surprised if Intel's compiler produced faster code than GCC? I believe Linus has stated that GCC is a bit "bloated". I wonder if you can compile the Linux kernel with it (minus assembly of course). That might be interesting, particularly for P4. Linux could get an instant speed boost. And such a radical switch in compiler might expose flaws in the code. Definitely a worthwhile exercise if nothing else. And even though the average user isn't going to buy it to compile their kernel, the distros might for their precompiled kernels (err, wonder how that would work ;-/).
    • The Linux kernel is really designed specifically for gcc. Aside from using GNU extensions to C, in many places the code is designed specifically to get good object code out of gcc.
    • I'm not exactly sure what compiler bloat is supposed to mean, since what matters is the assembly the compiler generates and not how many lines of code the compiler was written in. Secondly, it is very likely that a compiler written by Intel engineers for an Intel chipset will perform better on Intel chipsets than a general-purpose compiler written by volunteers. Finally, there are many who would argue that the Intel compiler has been of higher quality than gcc for quite some time, especially with regards to C++.

      PS: The fact that a post as empty as yours is at +4 is a sure sign that all the good posters have either left Slashdot or no longer actively participate. Sad. :(
      • Re:Bloated Compiler? (Score:2, Interesting)

        by kurowski ( 11243 )
        I'm not exactly sure what compiler bloat is supposed to mean, since what matters is the assembly the compiler generates

        Yeah, just like I don't know what word processor bloat is, since what matters is what the document looks like [Word]. Or what text editor bloat is, since what matters is the text generated [Emacs] (/me ducks). And what is language bloat [C++], since what matters is the implementation of the compiler? [g++] Oh hey, no wonder they've had such a hard time producing a good compiler... wonder if it's bloated like some of the languages it compiles?

        Point is, I'm betting that a compiler written for a specific chip and specific language (i.e. Intel's compiler) will perform better (i.e. produce better code) than a "compiler collection" with multiple pluggable front- and back-ends, all other things being equal. (Not that all other things necessarily are equal in this case (Go GNU!).)

        P.S. I don't think your trolling will help to improve the quality of the posts on Slashdot.

        • Point is, I'm betting that a compiler written for a specific chip and specific language (i.e. Intel's compiler) will perform better (i.e. produce better code) than a "compiler collection" with multiple pluggable front- and back-ends, all other things being equal. (Not that all other things necessarily are equal in this case (Go GNU!).)

          This is an illogical statement. Apache and IIS support using multiple languages to develop apps while my homemade webserver only supports C++. Does this mean my webserver is of higher quality than Apache or IIS? gcc is written in a modular manner and the different language compilers are written by different people, so talking about compiler bloat (whatever that means) is moot.

          The important point is that Intel engineers with access to all sorts of internal Intel resources wrote a compiler that optimizes specifically for Intel chipsets, while the gcc folk wrote a compiler that optimizes for x86 as well as other chipsets. The fact that the Intel guys spent 100% of their effort on Intel chipsets while the gcc guys didn't is more likely to be the reason that Intel's compiler will outperform gcc - not some nebulous concept like compiler bloat.
    • Re:GCC vs. Intel (Score:5, Interesting)

      by wfmcwalter ( 124904 ) on Thursday August 23, 2001 @10:38PM (#2211865) Homepage
      Would you be surprised if Intel's compiler produced faster code than GCC?

      Not really. As the GCC folks readily admit, GCC is presently suboptimal at generating code for highly superscalar instruction sets. This isn't too much of a problem for P1->P4, which aren't very rich in that regard (though it gets progressively stickier), but it becomes a significant issue for LIW and VLIW architectures (including IA64).

      This isn't a bad reflection on GCC or its developers, however - writing such a compiler (in particular, an instruction scheduler that keeps the various pipelines efficiently filled) is very hard, and this hitherto hasn't been an issue for the mainstream architectures at which GCC is targeted.

      I remember reading somewhere that Philips spent more writing the compiler for its TriMedia VLIW chip (which is 5x5, as I recall) than they did actually designing the chip itself.

    • Compaq's Alpha compilers produce better code for Alphas than gcc does. From what I understand, this is almost entirely because Compaq's compilers are tuned for the Alpha's memory hierarchy (i.e. cache and stuff), whereas the gcc folks are more generic in this area.

      All in all, the gcc folks have made a very good tradeoff. Their portability and generality allow them to quickly move to the latest, greatest architecture, giving them a nontrivial across-the-board performance increase. Compaq's Alpha compiler will become completely useless when there are no more Alphas.

      The same goes for Intel's latest tuning of an x86 core. Let's see how good their P4 compiler does on the Athlon or Merced. Not that anything could help Itanic's performance...

      -Paul Komarek
    • I wonder if you can compile the Linux kernel with it (minus assembly of course). That might be interesting, particularly for P4. Linux could get an instant speed boost.


      Possibly, but you'd have to ship the compiler with the dist, since you would be hard pressed to link dissimilar code together dynamically. I'm no expert on this, but I do remember woes with mixing a Sun compiler on Solaris x86 with the gcc compiler. This failed miserably with Apache / mod_perl and with simple Perl + CPAN libraries. We had to go with the Solaris compiler all the way, which was a royal pain, let me tell you.

      Unless someone has some info to the contrary, you'd have to forgo most any precompiled Linux binaries, which will definitely get in your hair, as I've found.

      And such a radical switch in compiler might expose flaws in the code. Definitely a worthwhile exercise if nothing else.


      I'm not completely sure, but doesn't gcc extend C with various types of proprietary compiler attributes? I believe it's possible that the configuration stage can nullify them, and it's been a while since I've looked through Linux source, but I do remember those attributes hanging around.
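
      For instance, a few fragments in the style of those GNU C extensions (illustrative, not actual kernel code) - inline asm with operand constraints, function attributes, and the statement expressions the kernel's macros lean on:

          /* inline assembly with operand constraints (i386 syntax) */
          static inline unsigned long read_flags(void)
          {
              unsigned long f;
              __asm__ __volatile__("pushfl; popl %0" : "=r"(f));
              return f;
          }

          /* function attributes */
          void boot_panic(const char *msg) __attribute__((noreturn));

          /* statement expressions and typeof, heavily used in macros */
          #define max_of(a, b) ({ typeof(a) _a = (a); \
                                  typeof(b) _b = (b); \
                                  _a > _b ? _a : _b; })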

      Still, I'm sure it's possible, and I'd be curious to learn of anyone's success..

      On the masochistic side, has anyone compiled Linux with a Microsoft compiler? :)

      -Michael
  • It's disappointing that there's no more info on the CRT depth reduction in the writeup other than:

    S-Cubed works by bending beams of electrons in a way that allows the electron gun -- which shoots out the beams -- to be moved closer to the screen.

    A quick check of Sarnoff's website [sarnoff.com] doesn't reveal much either - their last press release was in late July. Pretty slick company, though - nothing wrong with flexible plastic LCDs [sarnoff.com] (again, light on the details). You'd think they'd be a little more forthcoming with details, but I guess in the world of patents you can't risk anything.
    • See this comment [slashdot.org]
    • by Ldir ( 411548 )
      US Patent #5719476 looks like a likely candidate. It mentions reducing the depth of the CRT as a benefit.

      Links:

      IBM/Delphion [delphion.com]

      US Patent Office [uspto.gov]

      • That's clever. It's more of a cost reduction scheme for CRTs with very tiny spot sizes and wide deflection angles. As is pointed out in the patent, there was already a way to get very tight beam focusing, but it took extra sets of deflection coils that had to be driven separately. This scheme accomplishes the same result somewhat more cheaply.

        The description reads like one of those analog devices that takes way too many alignment adjustments. But some of that can be automated, and components are stable enough now that many of the values can be fixed at the factory.

        My guess is that the new scheme has some of the same elements of this one, and involves multiple correction coils to fix the beam distortions introduced when you deflect an electron beam through huge angles.

  • I didn't see any mention in http://news.cnet.com/news/0-1003-200-6947172.html of when a free P4 compiler would be available. How long would it take for the GCC folks to have a working P4 compiler?
  • Ask the NSA (Score:1, Informative)

    by __aadkms7016 ( 29860 )
    The NSA probably has a complete Usenet archive; there may also be independently-kept archives at other agencies.
  • by Anonymous Coward
    Amiga

    1. Hold Left-Shift, Left-Alt, Right-Shift and Right-Alt.
    2. Press any of the F keys and get a message!
    3. To get a message aimed at Commodore, hold down the same keys as in step 1 plus an F key.
    4. Insert a disk and you get the message "We made the Amiga..."
    5. Take the disk out and you get "...And Commodore f**ked it up!"

  • (OT) quickies? (Score:3, Offtopic)

    by casret ( 64258 ) on Thursday August 23, 2001 @08:41PM (#2211575)
    How come we haven't seen quickies in a long time?
  • In this CNET article [cnet.com] about the release of Intel's Linux compilers, they quoted the purchase price as $399 for a download, $499 for a CD. Somebody should tell them that blank CDs are a lot cheaper than they used to be...

    (I know, I know. The boxed version probably also comes with some printed documentation, supposedly justifying the higher price. It still seemed funny to me..)

  • Flattest CRT (Score:4, Interesting)

    by computechnica ( 171054 ) <PCGURUNO@SPAMCOMPUTECHNICA.com> on Thursday August 23, 2001 @09:05PM (#2211657) Homepage Journal
    Candescent Technologies [candescent.com] has been working on this technology since 1991, and it looks like it's about ready to go prime time with it. It has the same brightness, contrast, refresh time, and viewing angle that normal CRTs have, but uses less power than LCDs in the same size package. Can't wait to hang one of these on the wall.

  • CRT (Score:5, Insightful)

    by Mike Schiraldi ( 18296 ) on Thursday August 23, 2001 @09:07PM (#2211660) Homepage Journal
    Funny how everyone wants what they don't have:

    "I hate this stupid CRT. I wish i had an LCD monitor. Cheapskate boss."

    "I can't wait 'till i get this laptop back to the office so i can plug it into a CRT instead of having to squint at a stupid LCD."
  • by big.ears ( 136789 ) on Thursday August 23, 2001 @09:30PM (#2211714) Homepage
    NOOOOOOO! As a young, stupid college freshman in 1992, I discovered usenet and made a fool out of myself several times. I have been resting peacefully at night for the last decade, thinking that my past was safely hidden from the present, believing that nobody would be able to hold me responsible for the misdeeds of my youth. I guess I'm going to have to change my name now.
    • I have a friend who runs his own encryption company now, and he's lamented to me that he wishes he could excise some of his posts from earlier years from Bugtraq and Usenet archives, because he now receives several emails a day from script kiddies asking him to teach them how to steal AOL passwords and hack into Hotmail.
    • Cheer up, man. My first posting was entitled "n".

    • Thank God most of my BBS posts are dead. ;)
    • Oh good lord, you have no idea. Trying to gross out people on alt.tasteless in the very early 90s granted me the legacy of some exceedingly sick shit, much of which I posted in my real name. I stumbled on them a short while back while searching for my name and hometown. (I foolishly had mention of it in my sig.)

      Why does youth always have to coincide with incredible stupidity?

      Maybe Google is going to milk the cash cow of charging for selective deletions. I'd pay $50 for each of certain posts to go away permanently.
  • KDE C++ question (Score:1, Interesting)

    by Anonymous Coward
    Objprelink produces code that loads quicker - but is it at the expense of slower code, with every virtual function call becoming a jmp to a jmp to the real virtual function?
  • An LCD, at 30 Watts, is a substantial cost savings to use, especially when you have lots of screens.

    Yes, they cost more, but what are you really paying for?

    I'd also be curious about recycling potential. There is much less material in an LCD; how about pollution from disposal? How much of it can be reused and recycled? How does that compare to a CRT?

    Bob-

    • To add to the usability of flat-screen LCD displays:

      Here at work we have a Customer Care command center, with about 20 LCD monitors in one little room. It allows for easily locating and monitoring those departments and individuals that are getting too many customer calls. The LCDs are hung three monitors high on two walls. This room would have to be 2-3 times its current size of 12' x 12' to fit as many CRT monitors (even if they were half the size of current CRTs). And this is just for your average customer call center, nothing all that special. I'm sure running 20 LCDs 24 hours a day costs much less than powering, and cooling, an equivalently sized room full of 20 CRT displays.
  • How is there any "may" about this? IBM would have to be nuts not to license this technology to a mass-producer or two; they'll rake in the dough from licensing fees!

    Every freelance graphic designer who has up until now had to surrender a big chunk of their living space to a hulking 19" or 21" CRT (because of finances or because of LCD color issues) will be flinging wads of money at the makers of slim CRT monitors. Not to mention the regular joes who just want a 17" or 18" LCD, but can't justify spending ~$1000 on a display.

    Hell, I'd pony up for two of the things, just to replace what I have now and get my desk to stop bowing in the middle from the weight of my old-school 17" and 14".

    ~Philly
  • There are some issues with post-2.95 gcc on newer CPUs, especially Athlons.

    Those are explained here [utk.edu]. Cache handling seems to be the big problem.

  • Anyone know what the prizes were? The website is devoid of details in this matter. But, it did have a neat interface.
  • by Mr_Person ( 162211 ) <mr_person@@@mrperson...org> on Thursday August 23, 2001 @10:21PM (#2211829) Journal
    All of the people complaining about Google posting their Usenet posts that they'd rather not have made public need to go here [google.com] and look at number 16.
    Google will honor requests to remove messages that you have posted yourself. In Usenet parlance, this is known as nuking a post. If you would like to remove one or more posts from our archive, please send an email to groups-support@google.com (And follow their other directions)
    • BUT - if you've used spamguard (munged your from: or reply-to: addr), there's no way to nuke your posts.

      to nuke, you have to be the 'email owner' and that means they send a confirmation to the from: addr and you reply back saying you agree with the nuke request. you obviously can't do that if you've spamguarded your posts.

      another area this doesn't address is when you've nuked your own post, but some "helpful soul" has done a followup and copied the bulk of your text in their reply. in that case, you'd have to go around getting all those other folks to submit nuke requests (good luck...).

      in short, just assume that once you post, it's "out there" and generally can't be taken back.

      • to nuke, you have to be the 'email owner' and that means they send a confirmation to the from: addr and you reply back saying you agree with the nuke request.

        The problem being, for example, that email addresses might not still be around. I was using an academic account in the early to mid '90s which either no longer exists, or has been recycled.

        People say things on Usenet, as they do on IRC, with the expectation that it will not be around forever. That's one of the reasons both those channels are often used to discuss controversial subjects. If you say something in haste, or play devil's advocate, it doesn't matter, because it was expected to evaporate.

        Now there is scope for massive out-of-context abuse of the system: what if you're 30 years old and you don't get that job because the interviewer searched Google and found out that as an 18-year-old freshman you were an anarchist?

        Google should honor all reasonable requests to delete postings from the archive (here is a list of email addresses I have used, for example). IANAL, but it might be better for them to do so now, rather than waiting for the first lawsuit.
  • by Lumpish Scholar ( 17107 ) on Thursday August 23, 2001 @10:28PM (#2211841) Homepage Journal
    ... why don't they try searching for it [google.com]?-)

    (I'd love to see JMS's preproduction Netnews postings about Babylon 5, myself.)
  • by Laplace ( 143876 ) on Thursday August 23, 2001 @10:52PM (#2211898)
    I've spent the last two days downloading, installing, and trying to compile with the Intel C++ compiler for Linux. The compiler is installed now, but I can't compile with it. My first program had one line that printed out "Hello, world."

    The compiler crashed and burned. Their technical support site (which you get to by clicking through a creepy NDA) didn't contain much information. The links that did look interesting were broken.

    Eventually I found a document that contained a list of known bugs. One of them was " was not included in the distribution. This will be fixed in the next update." Fantastic!

    Has anyone out there successfully installed this compiler? My employers are very interested in using it (we want fast code for our Intel machines), and I am very interested in trying it out.

  • That's what I saw here [unimaas.nl], on every human's face while they waited for their computers to figure out what to do next.

    You'd think they'd at least have had a foosball or ping pong table or something. If I ever get into something like that, I'll remember to bring a copy of War and Peace [promo.net].
  • backup of usenet: (Score:2, Insightful)

    by mgblst ( 80109 )
    I have a backup of the first few years. I have posted it here for posterity, but removed the header:

    Test

    This is a Test

    TEST

    Test!

    Any more I have left out??
  • Another "uncommercial" move by Google. Hard to see how they can make any money putting old Usenet content online. Then again, it's hard to see how they make any money running such a huge server application (one billion web pages indexed and archived) without selling banner ads, never mind those obnoxious popups and embedded animations. No "portal services" or other attempts to divert people to their own content. No fancy "co-branding" deals. They license their tecnology. And they sell "ad words", which appear discreetly in a corner of the page and are easily ignored. Oh, and a few tchatchkas [googlestore.com]. That's it. No other revenue streams.

    Yet they are in the black. Meanwhile, ambitious efforts like Infoseek, Lycos, Yahoo, and NBCi are floundering or defunct. Perhaps there is a lesson in that. I certainly hope so.
