Actually, the PDP-11 instruction set *IS* the basis of 80386 CISC - even though PDP-11 style is not a basis of the 8086 or 80186 CISC. Check it out & you'll notice the 80386 designers flipped head over heels every which way they could to give the 386 the programmer-visible architecture of the PDP-11. The instruction word structure is not as beautiful - but they tried like hell to achieve an assembly level presentation that matched.
If you compare PDP-11 assembly with 80386 assembly, nothing special - because with the 80386 Intel *FINALLY* got over to the IBM 360-style orthogonal instruction set that the PDP-11, VAX-11, and Motorola 68000 had implemented for years. If you look at the Intel 8080, 8086, 80186, 80286 instruction sets, you'll see a mucked up bucket of junk. That goes for lots of other micros and minis from the era, too. Back then it was easy to sacrifice the instruction set to save registers and control logic - there was an historical thread of minimal logic CPUs with crappy instruction sets. But there was also a thread of beautiful orthogonal instruction sets coming out of the late 1950's and the 1960's, exemplified finally by the IBM 360. The PDP series contributed a lot of orthogonality in this period, too, but the 360 was 32 bits while the other orthogonal PDPs were 18 and 36 bits. The PDP-11 was a power-of-2 bit width, half the 360's, and came out looking a heck of a lot like the 360.
DEC made a sample of the mucked-up variety as well - check out the PDP-8.
PS - yes, I know the PDP-11 is 16 bit, while the 360 and 68000 are 32 bit - I meant to refer to the "orthogonality" of the instruction sets, and structure of the instruction words, rather than the bit widths when I made the comparisons above...
Note, too, that the IBM 360 instruction set is 32 bit and highly orthogonal, very much like the PDP-11's, and later the Motorola 68000's - in fact the 360 instruction set pre-dates the PDP-11 by several years. Both DEC and IBM were heading in the same direction over some of the same years that way. It's hard to really claim that DEC (Gordon Bell) copied IBM there, but it's also really hard to claim he didn't.
This article did not discuss the reorganization plans. Instead it whined and complained about Microsoft's poor sales performance.
Article by a professor who took the course along with a small group:
1) RMS himself, in clarifying his intent in developing a (free as in freedom, free as in beer) OS & tools for everyone, differentiates between widely empowering technology like OSes, compilers, and printer drivers and specialized applications with few users. He points out that if the ecosystem is small, then proprietary relationships may be necessary and therefore appropriate. (Sorry, no time to dig out the quote, but it's in his stuff on the FSF site.) The question is what will be better for the common good, so consider the size of the user community, business models, etc. A kind but proprietary business with good practices that survives -> is better than an over-idealistic business that fails -> is better than a mean business with selfish intentions and bad practices that enlists and then controls customers.
2) Are the benefits of going public and free worthwhile against the loss of proprietary value? If your company will make larger revenue because your competitors have adopted your software, then go for it. That means the driver of your revenue has more to do with your business activities like selling, integrating, servicing, and designing solutions - for example, if being able to integrate your equipment easily with your competitors' means you make more money. But if you rely on the performance/capabilities of your software to drive revenue, then keep it closed until your business has grown up to become more service oriented.
3) Don't expect your competitors to play fair with the free software they pick up. They're not going to contribute back as they should. They might not admit they are using the software.
4) You don't need to go public with your free software yourself. Your question was w/respect to the community, so maybe this point is not relevant. Customers should be looking for free software in case a) you fold & no longer service their maintenance needs; b) they wish to take development on a different tack, they should be able to start with your product as a basis; c) they want remarketing rights, etc. But just because you sell them free software doesn't mean they intend to remarket or even give it out to anybody else, though they have the right. As to these customer needs, you may be able to come to an informal understanding that is mutually beneficial, or you may provide for the specific rights they wish in a specific license for them instead of making the software fully free.
On the other hand, if other posters in this thread are correct and the image is only a spectrogram, then certainly the word "photo" does not apply. A photo should be a record of spatial data.
What is your definition of 'photo'? Does it involve 'photons'? Can the photons have frequencies in the microwave range? If I had microwave-sensitive eyes, would my photos have microwave photons? What if I had a microwave-sensitive camera?
But I think the articles are making quite a fuss about spatial resolution - are you sure the image doesn't contain some spatial elements as well as just time & frequency?
From your Doppler shift explanation, can we conclude, since the profile of the image has some width, that the object is rotating? If it were not rotating, then the image would simply be a vertical line?
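If that reading is right, you can even put rough numbers on it: the approaching and receding limbs of a spinning body shift the echo in opposite directions, so the echo smears over a Doppler bandwidth proportional to the spin rate, and zero spin collapses it to a line. A quick sketch of the scaling - every number here is hypothetical, just for illustration, not taken from the article:

```python
import math

# Hypothetical numbers, purely illustrative (not from the article):
radar_freq = 8.56e9       # Hz - assumed X-band planetary radar carrier
c = 3.0e8                 # m/s - speed of light
radius = 500.0            # m - made-up asteroid radius
period = 4.0 * 3600.0     # s - made-up 4-hour rotation period

# Equatorial speed of the spinning body.
v_eq = 2 * math.pi * radius / period

# Each limb shifts the echo by the two-way Doppler shift (2*v/c)*f;
# the limb-to-limb bandwidth is twice that.
bandwidth = 4 * v_eq * radar_freq / c

print(f"equatorial speed: {v_eq:.2f} m/s")
print(f"limb-to-limb Doppler spread: {bandwidth:.1f} Hz")
# As period -> infinity, v_eq -> 0 and the spread collapses to a line.
```

So a wider profile means a faster spin (or a bigger body, or a higher radar frequency) - which is why the width of the image carries real information.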
Still a bit confusing...
Imagine how I felt when that signal finally made it through my slow outer layers!
If this is a "radar" image, where the telescope sent a pulse and got an image from the reflection, why in the picture does it look like the illumination is coming from above the object? Shouldn't the whole visible face be illuminated? I would like to see all the detail received by the radar. If this is artificial illumination of a solid model built from the facing radar data, I wish the illuminator position were near my point of view. If this is the actual radar image, then I am confused about the presentation.
a group of interconnected clusters, with the ability to route between them. The idea of interconnecting clusters is the core idea of the original Internet in the first place... And if that was Day 1 of Internet Genesis, then Day 2 was hosting multiple application spaces - like Gopher versus WWW versus FTP versus email. Nobody has ever claimed it all was one big homogeneous lump!
Has it ever been the desktop??
At least since about 1985 almost all computers have been embedded. Embedded systems became multi-tasking/multi-processor quickly, so we've even been able to put all our "operating system 101" college learning to good use. A lot of embedded systems have involved networking and databases as well. Not to mention signal processing, and on and on.
Desktops have been a small slice of the pie for a long time.
Of course some of us were born before embedded systems (ahem..), but back then the desktop only had a dumb terminal anyway...