Sun Microsystems

New Processor Design from Sun Microsystems 92

IQ writes "This NYTimes article discusses Sun's latest chip, known as Microprocessor Architecture for Java Computing, or MAJC. Looks like a huge, fast multi-die package with a lotta chips." Fits in well with Sun's continuing attempt to route around Intel -- these chips look like they are philosophically aligned with Jini. More specs will be coming out later this month. (Free login required @ NYT.)
  • This could be useful...
    1. Just like PC chips understand x86 instructions... as a new standard, just like the PIII and whatnot
    2. This type of technology can no doubt be moved over to other architectures easily. If a lot of the Java code gets interpreted inside the chip, other architectures can take on this quality.
    3. Turn it into a PCI-mounted board and make it portable for those with welded-in CPUs, or just plain old stuff...
    Then again...
    1. Hopefully this won't require some sort of software support like the Cyrix CPUs did with their drivers.
    2. Sun won't make a big deal over outsourcing this
    Just some ideas to spout out.
  • I think this has been done already. My understanding is that VM Labs has made a chip, and has already begun making partnerships with other companies like Motorola. Here's my source of info:

    Here's an article from Wired magazine interviewing and profiling VM Labs [wired.com]

    Here's VM Labs's url: [vmlabs.com]

    Is it just me, or is the VM Labs chip pretty much the same thing?

    Does anyone have current news of VM Labs' progress in getting its chip into devices?

    B.

  • Thanks for weighing in with a detailed, thoughtful comment on the subject. Like you, I too feel that it would be interesting to see specific-language chips and see how well they DO work rather than just speculating on it. -- Michael Chermside
  • s/William/Bill A , but then I realized that he is God.
  • What about Scheme? How does that compare to Common Lisp?
  • As long as we're piling on... "you're" should have been "your".

    -- Eric
  • Don't think that Java magically turns people into great programmers. I've seen code like this:

    try {
        someFunction();
    } catch (Exception e) { }

    (An evil coding practice, if I've ever seen it -- a safer version is sketched below.)

    Java removes a certain group of bugs, but plenty remain for QA to find.

    rbb (Has anyone compiled a list of Java coding conventions for Meta level issues?)
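
    A minimal sketch of the safer idiom, where someFunction is a hypothetical stand-in for whatever might throw:

    class ExceptionDemo {
        static void safer() {
            try {
                someFunction();
            } catch (Exception e) {
                // At minimum, surface the failure instead of silently
                // swallowing it; real code would log it or rethrow a
                // wrapped exception.
                e.printStackTrace();
            }
        }

        static void someFunction() throws Exception {
            // stand-in for real work that may fail
        }
    }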

  • by DonkPunch ( 30957 ) on Monday August 02, 1999 @07:47AM (#1770241) Homepage Journal
    I'm seeing a lot of posts saying that chips designed for a specific language are a Bad Thing. I tend to agree.

    BUT I think this argument overlooks something important. A Java chip would not interpret Java at the brace-and-semicolon level; it would read Java bytecodes. Java bytecodes are basically machine language for a microprocessor that exists only in software. It is only logical to make such a chip in hardware eventually. (A short disassembly sketch follows this comment.)

    Furthermore, if the specs for a "Java chip" are open, what is to keep compiler writers from implementing back-ends which write Java bytecodes? I'm not a compiler writer, but it seems like it would be quite possible to implement, for example, a C or C++ compiler which writes Java bytecodes instead of x86/68000/Alpha/Sparc/whatever machine code. Such a compiler would make the "Java chip" usable by people who don't like writing Java.

    I seem to recall seeing at least one compiler that takes a non-Java language (Perl, I think) and compiles it to Java bytecodes. Also, I know there is one regular slashdot reader who is doing Java programming at the assembly level -- any comments? If a Java chip sees widespread use, anything-to-bytecode compilers would seem inevitable.
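
    For the curious, "machine language for a microprocessor that exists only in software" looks roughly like this; the disassembly in the comments approximates javap -c output for a trivial method:

    class AddDemo {
        int add(int a, int b) {
            return a + b;
        }
        // javap -c AddDemo shows the stack-machine code a Java chip would run:
        //   iload_1    // push local variable 1 (a) onto the operand stack
        //   iload_2    // push local variable 2 (b)
        //   iadd       // pop both, push their sum
        //   ireturn    // return the top of the stack
    }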
  • It's been done. I think a master's student at Cornell wrote a C compiler that generates Java bytecodes for his thesis.
  • JPython -- Python to Java bytecodes. Not Perl. Many apologies.
  • In the current issue of Scientific American [sciam.com] (August 1999), the Oxygen Project is explored. It describes an approach to making a chip, along with the accompanying programming and devices, more efficient and faster by using logic gates and compiling the wiring automatically on the processor -- basically customizing the wiring for each application.

    The chip is called Raw. It is covered in the 4th part of the article, Raw Computation.

    'til dawn...

  • by Anonymous Coward
    Yes, Byteheads, it's now time for GNU Eiffel. [loria.fr]
  • I don't see a need for chips optimized to run an interpreted language.

    You don't see a need for a processor with instructions like

    aload someString
    invokevirtual SomeClass::setProperty()V

    but have no problem with processors with instructions like

    mov eax, someString
    mov cx, itsLength
    jsr setProperty

    What is the difference you perceive, other than that the Java machine is stack-based? In what way are the x86 instructions not "interpreted"? (Ordinary Java source for the first fragment is sketched below.)
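
    For reference, the JVM fragment above corresponds to ordinary source along these lines (SomeClass, setProperty, and caller are hypothetical names carried over from the fragment):

    class SomeClass {
        void setProperty(String s) { /* ... */ }

        static void caller(SomeClass obj, String someString) {
            // In a static method, obj is local variable 0 and someString is
            // local variable 1, so the body compiles to roughly:
            //   aload_0
            //   aload_1
            //   invokevirtual SomeClass.setProperty:(Ljava/lang/String;)V
            obj.setProperty(someString);
        }
    }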

  • The EE Times article suggests that MAJC won't interpret Java bytecodes; bytecodes would have to be interpreted, or compiled into native machine code, as on most other platforms. (It also suggests that other languages could be compiled into its native instruction set as well.)
  • ...suggests that MAJC
    won't interpret Java bytecodes; bytecodes would have to be interpreted...

    Make that

    ...suggests that MAJC
    won't directly interpret bytecodes; bytecodes would have to be interpreted by software...
  • I'm currently porting GCC to the JVM. If you have any info on this, could you forward it to me?

    trent@csee.uq.edu.au
  • Sun have had processors around for a while now which have been designed to execute Java bytecodes directly...

    ...such as picoJava.

    The interesting stuff is the VLIW aspect...

    ...but with an allegedly-VLIW (assuming VLIW isn't just being used as a marketing-speak alias for "buy this, it's c00l", as e.g. RISC appears sometimes to be used) instruction set, it appears that this chip isn't designed to "execute Java bytecodes directly".

    ...lending itself to bytecode environments in general (not just Java)

    In what fashion does an underlying VLIWish instruction set lend itself to bytecode environments better than does a non-VLIWish instruction set?

    Basically it looks like they're making a stab at a new *style* chip architecture, not just overclocking some knackered design a la Intel.

    Well, to be fair, Intel are also working on what they consider a new style of instruction-set architecture, even if it appears that many of the basic ideas for it came from HP.

  • Scheme is to Common Lisp as C is to C++...
  • Copyright © 1999 Dow Jones & Company, Inc.

    I'd be particularly alert for bias in articles put out by this company, and would love to know who owns them. ;^)=

    Unfortunately, the corporate fact sheet page on the Dow Jones Web site [dowjones.com] doesn't seem to say anything about ownership of their shares by other corporations, although the shares are publicly traded on the New York Stock Exchange.

    (I.e., if you were just jumping on the "MS" part of "MSNBC", and inferring that this was some Evil Microsoft FUD Plot, note that the article looks as if it might be a re"print" of a Wall Street Journal article, not something put out directly by MSNBC.)

  • Common Lisp, like Scheme and most other modern Lisp variants, uses static scoping by default. (Though dynamic scoping is available too.) Dynamic scoping is useful in a few cases, but usually just causes problems. More important is that objects have indefinite extent. Features like function closures, continuations, etc., are what make Lisp Lisp. That and the ()'s...

  • Okay, I'm not up on my Vague-Tech-Reporting lingo. Does this mean we'll be seeing a CPU that runs Java bytecodes natively? If that is the case, kudos to Sun! I'm really impressed with Java as a language, and I would like to see Java programs run at something resembling native binary speeds.

    Of course, if this isn't what Sun is proposing, could someone tell me what this means?

  • by ChrisRijk ( 1818 ) on Monday August 02, 1999 @05:11AM (#1770261)
    Sun conjures Java CPU for media apps [eet.com] at the EE Times.

    The EE Times also has an article about next-gen server technology [eet.com] from IBM (via Sequent) and some info about Sun's next-gen stuff. As usual, Sun are saying very little. From what I've heard separately, though, Sun are working on both a form of NUMA and something else called COMA (Cache-Only Memory Architecture). They might be doing both (on the same machine) for their next-gen server, Project Serengeti, because NUMA is good for some types of applications while COMA is good for others; by doing both, end-users can choose which memory architecture best suits their needs.

  • by Anonymous Coward
    Essentially Sun wants to compensate for Java performance deficiencies by hardwiring aspects of the JVM as well as creating Jini-enabled devices in hardware.

    Thankfully our market system has developed good reflexes to this "one company does everything" solution as it leads to monopolies and abuse.
  • This is just part of Sun's newest rollout of the Java platform. They're also rolling out new editions, including a "Micro" edition for Palm Pilots. I say nice...

    ---
    Spammed? Click here [sputum.com] for free slack on how to fight it!
  • Java has its origins rooted in a project called Oak (pun unintended). It sought to provide the environment for set-top-box-like devices, so Java is really coming full circle.


    Chris Wareham
  • by Anonymous Coward
    Wasn't the goal of Java to provide a robust cross-platform environment for running application software? If Java performance requires specialized chips, it would seem to defeat the purpose. Sun has no idea of what to do with Java. They have not made it very easy to port Java to different architectures. For some platforms, the porting work has gone on for years with no end in sight. It is the classic ``solution in search of a problem''. It's time for Sun to get Java back to basics and finish what was initially promised.
  • I was just reading an article about this over at msnbc [msnbc.com]. It was more in-depth than the nytimes article. There were a few things that really caught my eye at msnbc:

    ...Sun officials already audaciously refer to MAJC as the most important semiconductor architecture of the next 20 years. In part, that's because the chip is particularly well-suited, they say, to handling the enormous streams of visual and audio data expected in the multimedia age. In addition, MAJC should yield a family of microprocessors that are easy to program using Sun's Java language, that can be used in everything from cheap consumer devices to Internet server computers, and that over time will grow even more powerful, and more quickly, than rival chips...


    ...Sun, for instance, claims that within several years, it should be possible to generate an interactive computer-animated movie like Toy Story in real time using a single MAJC chip...
  • See this JavaWorld article [javaworld.com] from a former Sun engineer. This is about Java from the embedded point of view.
  • There are more articles (that don't require annoying registrations) at The Register [theregister.co.uk], News.com [news.com] and Techweb [techweb.com].

    Solaris Central will also be covering additional news and updates to the processor. It should be interesting to see what unfolds...
  • It's too bad the article on MS-NBC is very biased; look at all the words loaded with negative connotations:

    "Thanks to an unusual design..." Unusual according to who? The author? Computing industry experts? This must be considered biased opinion unless a source is given.

    "Sun figures it can sell cheap versions of the chip for use in inexpensive consumer-electronics" Cheap chips, inexpensive electronics? Cheap has strong negative conotations, while inexpensive is generally considered a Good Thing(tm). It's interesting to note that it wasn't "inexpensive versions of the chip for cheap consumer electronics"

    "It's still a risky bet, though..." According to who? This persons' stock broker? To be fair, it backs up this statement with examples of past failures, but printing it as a statement of fact is bad form.

    "Intel, which has also moved aggressively into communications-related chips, will also pose a competitive threat." ...and so will TI, and ~Transmeta~ (?) and all the other companies not aligned with MS, and not mentioned by the reporter. This statement points to Intel in a positive light, but fails to mention competitors. Another example of possible bias.

    "Sun officials already audaciously refer to MAJC as "the most important semiconductor architecture of the next 20 years."" ...and Microsoft audaciously assumes that every computer sold will have Windows on it. Audacious is a loaded word, and should not be used in a news story, unless someone else is quoted as saying it.

    "Analysts are reserving judgment on such claims until Sun formally discloses the details of the architecture on Aug. 16." ...but MS-NBC isn't. It's jumping right out of the gate with lots of unsubstantiated claims, and un-sourced opinions.

    To be fair, there is some positive stuff in the article too-

    MAJC chips should be able to display complex graphics and handle digital-communications tasks at extremely high speeds -- far faster than a general-purpose Intel chip, for instance.

    the chip is particularly well-suited, they say, to handling the enormous streams of visual and audio data expected in the multimedia age.

    Copyright © 1999 Dow Jones & Company, Inc.

    I'd be particularly alert for bias in articles put out by this company, and would love to know who owns them. ;^)=

  • thanks :)

    didn't realize this one.

    shows my ignorance :)

    cheers!

    Peter
  • The main problem with specialized hardware is that the people building it tend to get run over by the Silicon steamroller.

    "After two years of development, we're proud to announce the new HyperAccel 3000 CPU with hardware support for Snobol. It runs at 100MHz and provides a 4x speedup over general purpose processors for Snobol applications. What's that you say? Intel makes 500MHz CPUs now? #@!$"

    I'm not saying that the above scenario will happen in this case. It's usually the small outfits who can't afford to keep up that get burned, and Sun isn't that small. Further, in some situations the speed gain is so large that even if you use previous-generation fabrication, you can still win: witness the 3D graphics market. But you are competing against parts that have enormous sales volumes and all that implies. If it comes down to a chip with Java support that gives a 1.5x speed-up vs. a commodity CPU, I'd bet on the commodity CPU. It will probably be available at 1.5x higher speeds at comparable cost.

    "MyGarage Software just announced a JIT compiler that is 1.5x faster than any earlier Java compilers for x86 CPUs? #$@!"
  • Certainly any programming language can be compiled into JVM bytecodes; however, the instruction set of the JVM has been designed around Java's object model. Other languages, such as C++ or Eiffel, that use a different object model are thus going to be at a disadvantage. These languages can be (and in the case of Eiffel, are) compiled into JVM bytecodes for those cases where such is necessary, but now that we still have a choice (that is, now that Sun hasn't completely bowled over the marketplace with Java), we should demand a virtual machine that is not predisposed toward just one programming language.

  • Unless I've managed to get this all mixed up, Lisp has dynamic binding, but not dynamic scope. That is, a procedure invocation is always evaluated in the environment in which the procedure was defined, not the environment in which the invocation occurs. Where it makes a difference is when the procedure refers to non-local variables. So, e.g. (this is Scheme, not Lisp):

    (define foo (let ((a 1)) (lambda (x) (+ x a))))
    (let ((a 2)) (foo 5))

    would return 6, not 7, because the invocation of "foo" sees the "a" bound in the first line's "let", not the second, since that's the environment in which the "lambda" was evaluated. Once I was writing a Scheme interpreter (in Java, by the way) and I noticed that with a one-word change I could select between dynamic and lexical scope, by changing which environment to extend when binding the arguments for an application. (A Java sketch of this follows at the end of this comment.)

    That said, I agree that dynamic binding (which I assume is what you meant) makes Lisp incredibly powerful. In fact, it makes nearly all other languages (including Java) seem downright primitive. I mean, imagine actually having to recompile a program each time you want to test a change! In Lisp, you don't even always have to stop the application to apply a patch, let alone rebuild it. Just re-evaluate the definition of the procedure that is changed and code that calls it will seamlessly see the new version. Since symbols are bound dynamically, there's nothing to re-link.

    The major argument against Lisp has always been performance, but with modern hardware that's less of an issue -- to be fair, compare it to Java, not C. Besides, with modern compiler technology, the difference is not as great: I've actually seen a piece of Lisp code run significantly faster than the exactly-equivalent C code.

    Now consider the fact that things like maintainability and availability are becoming more important than raw performance. I would think that the ability to apply a patch to, say, an e-commerce server without having to bring the system down, even for a minute, would be of great interest to the people running those systems.

    Lisp was ahead of its time -- its time is coming now.

    David Gould
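
    A minimal, runnable Java sketch of the one-word change described above (all names hypothetical): which environment a closure extends when binding its arguments is what selects lexical vs. dynamic scope.

    import java.util.HashMap;
    import java.util.Map;

    public class ScopeDemo {
        static class Env {
            final Map<String, Integer> vars = new HashMap<String, Integer>();
            final Env parent;
            Env(Env parent) { this.parent = parent; }
            int lookup(String name) {
                for (Env e = this; e != null; e = e.parent)
                    if (e.vars.containsKey(name)) return e.vars.get(name);
                throw new RuntimeException("unbound: " + name);
            }
        }

        interface Body { int eval(Env env); }

        // A closure remembers the environment where it was defined.
        static class Closure {
            final String param; final Body body; final Env definitionEnv;
            Closure(String param, Body body, Env definitionEnv) {
                this.param = param; this.body = body; this.definitionEnv = definitionEnv;
            }
            int apply(int arg, Env callerEnv) {
                // Lexical scope: extend the environment from definition time.
                Env frame = new Env(definitionEnv);
                // Dynamic scope would be the one-word change:
                //   Env frame = new Env(callerEnv);
                frame.vars.put(param, arg);
                return body.eval(frame);
            }
        }

        public static void main(String[] args) {
            Env global = new Env(null);

            Env defSite = new Env(global);      // (let ((a 1)) ...)
            defSite.vars.put("a", 1);
            Closure foo = new Closure("x",
                    env -> env.lookup("x") + env.lookup("a"), defSite);

            Env callSite = new Env(global);     // (let ((a 2)) (foo 5))
            callSite.vars.put("a", 2);
            System.out.println(foo.apply(5, callSite));   // prints 6, not 7
        }
    }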
  • by __aasmho4525 ( 13306 ) on Monday August 02, 1999 @05:38AM (#1770294)
    For quite some time now, we've all watched the worldwide criticism of specialized hardware that implements a more abstract instruction set: lisp, java, smalltalk (not sure if the latter actually was turned into hardware), etc. Why is the criticism so harsh? I've not yet really seen anyone GIVE IT A CHANCE before discarding it as a toy.

    First, a disclaimer: I happen to think java is the best language (ok, toolkit, platform, etc.) that's come around in a long time for general-purpose programming (NOT for operating systems, but hear me out here...). Like it or not, the vast majority (i'd venture a guess at 90%) of software written is NOT (and need not be) of operating-system calibre in terms of robustness, quality, performance, maintainability, etc. In many cases, the life expectancy of the software is far too short, because needs, requirements, etc. change very quickly, to warrant the additional time spent in development.

    Now, remember, i'm a purist at heart, but i do have a pragmatic side to me too. Occasionally, the costs just don't justify the benefits. Again, like it or not, i have worked with a great many people who are under too much pressure, lack the skills, or simply don't care enough about the quality of their work to do a good enough job with an "easy" language, let alone one that lets them shoot their foot even more effectively...

    Anyone who truly thinks that java is "too slow" on modern hardware with modern dynamic compilation technologies really does need to do a bit more experimentation on their own. There are few problems that i've needed to solve in the last few years that i couldn't *easily* solve with Java, and never did i think that the quality or performance suffered (especially now with heuristic compilers). Remember now, I wasn't building 30,000-user systems, maybe 3,000.

    Is it the best tool for *every* job? Hell no. Does it solve some things VERY effectively? Absolutely. Would i still write any software requiring the utmost performance in c or c++? Hell yes. As history has taught me, profiling my code shows that 90% of my time is spent in 5% of the software. Again, what percentage of the software I've written has requirements demanding utmost performance? Less than 5 percent.

    Now, i'm biased, that's clear. But, seeing that in the first 3 posts not one constructive thing could be said, i felt it my duty to *try* and present a more pragmatic opinion. i, personally, would LOVE to take a shot at using a higher-level-of-abstraction instruction set, just to see for myself whether or not they're of utility. i don't have the experience with them to either condemn or praise them. i wish the same humility were infectious.

    as always, my opinions are mine alone, i speak for only myself, and i apologize if i offend.

    Peter
  • ack. my apologies for losing the formatting :( gosh that's as ugly as anything i've ever seen :) sorry bout that... Peter
  • Building a hardware Java machine isn't necessarily a monopoly just because Sun is doing it. If Sun doesn't allow anyone ELSE to do it, THEN that'd be a monopoly.

  • I'm still waiting for OO to become fashionable. The dominant paradigm still seems to be to write an application that wraps itself around some data in a file somewhere and uses it to configure the app somewhat, using a slightly different app for each type of data. So much for data-driven.

    When I receive my data as an object that I can query for its fields, because the app that generated it created it that way, then I'll be impressed. Till then, well, how many of you are writing 20 different scripts to parse syslog 20 different ways?
  • By the time a single MAJC chip is powerful enough to render Toy Story in real time, computer animation will be so advanced that Toy Story will look like a relic of the 50s.

    Sorry, but that is just -too- much hype.


  • login: slashdotid
    password: slashdot

    also

    login:cypherpunks
    password:cypherpunks

    any others?
  • Java performance is already quite good on the Solaris SPARC. Nearly as good as native. It still hurts pretty bad on the PC, tho...

    IMHO - the best application for a java chip would be in the handheld market.
  • you bring up many good points. i need to go and *really* learn lisp.

    i only learned as much lisp as i *needed* to do my job (specifically, writing some modes in emacs).

    if there's one thing that really intrigued me about it, it was that it is dynamically scoped, opposing just about every other language in common use throughout the world. this, unfortunately, is so confusing to the masses, while being WILDLY useful to those who know how to harness its power... ;)

    is it that feature that you can triangulate down to when you think about what *really* stands out in lisp?

    again, it comes down to:

    **I** believe i can learn any computer language in the world and be productive. It's my hobby. Functional languages are my toy right now.

    on the other hand, the folks who are not "into" learning languages & the science of computing don't have the persistence i seem to. it's not that i don't *wish* they would, i just must pragmatically accept that they will not.
    we have different priorities, and that's a *good thing*.

    so, seeing as software maintenance is so incredibly important to me (and plays a significant majority-role in software lifecycles), can i expect most software engineers to quickly acquaint themselves with the paradigms behind java? i feel fairly confident in saying yes, because the language is not *that* different from what the masses are accustomed to. i'm not certain i can say this about lisp, as much as it intrigues me.

    regardless, **I** need to go learn more lisp ;)

    Peter
  • Ada, too.
  • Actually, Jini is open source... anyone can create a Jini device to use or sell. Java is a programming language... anyone can write a program in Java. It's like C++ or Delphi or any of the other programming languages. These are not "one company does everything" technologies.

    By hardwiring aspects of the JVM, Java programs will run faster. This is nothing new; CISC and RISC chips all have various functions hardwired in.
  • It's more like a general purpose CPU with some bits to help make it easier to speed up Java, and other languages.
  • until it is running BSD?
  • Sun have had processors around for a while now which have been designed to execute Java bytecodes directly, so this aspect of the new design isn't a big deal (calm down, Javaphobics).

    The interesting stuff is the VLIW aspect, lending itself to bytecode environments in general (not just Java) as well as hardware-optimised multimedia stuff.

    Basically it looks like they're making a stab at a new *style* chip architecture, not just overclocking some knackered design a la Intel.

    I'm guessing the Java-related angle is at least in part due to the marketing guys wanting a hook on it.

    As for the guy who said 'whatever happened to SPARCs?', well really, pay attention ;)

