Want to kill off the desktops? Find something with better display and user input.
How about the same display and user input? Have you not heard of USB and HDMI? A laptop can easily be connected to an external keyboard, mouse, and monitor. I do this all the time! Since even a relatively low-end computer is more than good enough for most tasks, there is really little downside to this approach. The extra expense is justified because you can carry it with you.
Hmmm... I remember the Atari 1040ST was sold as the first computer ever to be under $1 per kilobyte. It is true that $0.50/gigabyte is nothing magical from a tech standpoint, but this is not about tech, it is about psychology. Human beings are not entirely logical, and emotions play a large part in decisions.
Bingo. Laptop users. Laptops are on the way up, desktops are dying. And since the higher-end laptops (ultrabooks) are even ditching optical drives to save size and weight, what do you think the odds are that they will make space for a 2nd drive? In fact, I would not be surprised if the 2.5" drive bays went away entirely in the next three years, to be replaced by slots (probably PCIe or something similar). Unless you are going for a larger device -- a gaming or workstation laptop -- you are not going to have the luxury of two drive bays.
Well, I can see a use for this. If you HAVE an existing FPGA, you could throw a processor on there for free. Some FPGAs have a CPU built in (such as an ARM), but those parts cost more. So if you need some kind of processor, this is not a bad choice. You could go for something like an 8051, but more options are nice to have. This also apparently has a nice toolchain (compilers, interpreters, etc.).
If you really need a well-supported embedded soft processor, your choices are OpenRISC, 8051, Z80, 6502, or this (off the top of my head; let me know if I missed something). Xilinx makes MicroBlaze, but they charge money to unlock it.
A few years ago we finally started to use VHDL '93. At the moment we expect to be able to use VHDL-2008 in 2028, and this is not a joke; that date is realistic based on the historically glacial movement of the hardware industry.
Seriously? You have my deepest sympathy for using VHDL. But why the glacial pace? SystemVerilog is supported by all major simulator vendors (at least to the extent needed to support UVM). Even synthesis tools are starting to support the SystemVerilog constructs that make sense in hardware (structs, unions, etc.).
Really, unless you are stuck using some specific tools that you can't upgrade or update, there is no reason that you can't switch to SystemVerilog today! Although, I admit that SV does not bring nearly as much to synthesis as it does to simulation.
BTW: I come from the custom silicon world. I don't really use FPGAs much, so SystemVerilog may be beyond the capabilities of the free tools.
You could say that offering all options at a discount costs them nothing. You could also argue that it does deprive them of revenues. There are arguments both ways.
It is sort of like Windows 7 Home vs. Windows 7 Pro vs. Windows Server. They all pretty much share the same code base (maybe less so for the Server version). The only difference is a switch or two.
If you argue that turning on the FFT and serial protocols costs them nothing, you are right! Once the scope is in your hands, it costs Agilent and Tek next to nothing to enable that feature. For Agilent, it is an unlock code. For Tek, it is a module that costs them only a buck or two to make.
On the other hand, it actually DID cost something to include those features. A lot of the serial decode stuff is done in both hardware and software. The software costs a lot of money to develop and test. The hardware part adds some cost to every single unit sold, plus the cost to develop and test it. So, imagine that all of these extra features (FFT, serial decode, etc.) were included standard with every scope. The price would have to be raised to cover all of the NRE costs, so the price of the scope rises for everybody. For those that need the extra features, they are getting a great bargain. Everybody else is paying more for something that they don't need.
So, by locking features that need to be unlocked, you piss off the people who feel like the features are already there, and they are being artificially prevented from doing something that they ought to be able to do. If you unlock everything, you raise the price for the very budget-conscious customers. There is no perfect answer.
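The bundling trade-off described above can be sketched with a toy calculation. All of the numbers here (unit volume, NRE figure, hardware cost, margin) are made up purely for illustration; the point is only the shape of the arithmetic:

```python
# Toy model of feature pricing.  Every number below is hypothetical;
# the point is how amortized NRE moves the price of every unit.

def unit_price(base_cost, nre, volume, margin=1.5):
    """Price needed to recover per-unit cost plus NRE spread over volume."""
    return (base_cost + nre / volume) * margin

# Hypothetical scope line: 10,000 units sold, $2M to develop serial
# decode, $2 of extra hardware per unit to support it.
base_only = unit_price(base_cost=1000, nre=0, volume=10_000)
bundled   = unit_price(base_cost=1002, nre=2_000_000, volume=10_000)

print(f"decode sold as an option: base model costs ${base_only:,.0f}")
print(f"decode bundled into all:  every unit costs ${bundled:,.0f}")
```

With these made-up numbers, bundling raises every buyer's price by a few hundred dollars, which is exactly the "budget customers subsidize feature users" problem.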
In all fairness (and as a former Agilent employee), you would not believe the amount of work that goes into those things that you don't get with cheap PC-based scopes and low-end stand-alone scopes. They do a LOT of work making sure that the front end (the analog stuff between the BNC and the A/D converters) is correct. There is also a lot of DSP-ish stuff right after the A/D. I am a digital designer, and I worked on some of the oscilloscope chips, and I don't even understand a lot of that stuff.
For a hobbyist working with bandwidth-limited signals where everything is 5 V or less, the cheaper brands are probably fine. However, how do you tell if your scope is lying to you? Do you know aliasing when you see it? I have seen some PC-based scopes do the voltage offset (where you twist the little knob to move the waveform up and down) entirely in software, and seen the clipping in the A/D -- nasty stuff. You really need to do that in the analog front end. There is also the question of how many waveforms per second the scope can display. If you have a glitch that happens only rarely and you are capturing only 30 or 100 waveforms per second, you might not see it. On the other hand, if your scope is capturing 50,000 waveforms/second, you stand a MUCH greater chance of seeing it.
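A back-of-envelope model (my own simplification; real trigger and dead-time behavior is more complicated) shows why the waveform update rate matters so much for rare glitches:

```python
# Simplified model: a scope capturing N waveforms/second, each covering
# a window of t seconds, only "looks at" a fraction N*t of real time.
# A glitch landing at a random instant is seen with that probability.

def p_catch(waveforms_per_sec, window_sec, glitches):
    """Probability of catching at least one of `glitches` random glitches."""
    duty = min(waveforms_per_sec * window_sec, 1.0)  # fraction of time observed
    return 1.0 - (1.0 - duty) ** glitches

# Assume a 1 us capture window and 60 glitch events (e.g. one per
# second for a minute) -- both numbers are hypothetical.
slow = p_catch(100, 1e-6, 60)       # ~100 wfms/s
fast = p_catch(50_000, 1e-6, 60)    # 50,000 wfms/s

print(f"100 wfms/s:    {slow:.1%} chance of seeing a glitch")
print(f"50,000 wfms/s: {fast:.1%} chance of seeing a glitch")
```

Under these assumptions the slow scope catches the glitch well under 1% of the time, while the fast one catches it with better than 90% probability over the same minute.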
I do admit that scopes are a pricey purchase, and part of that is due to the low volumes involved and the high amount of R&D. But, if you need something that you can trust (you make your living off design work and are not just a hobbyist), you really need to get something professional from a reputable company.
Can you please explain? The "G" in "GCC" stands for "GNU," so isn't that "The Bazaar" by definition?
Streaming media is by far the largest consumer of bandwidth.
This has nothing to do with their network infrastructure, and everything to do with the fact that they would like you to pay out of pocket to stream media on their network. With a 10 GB monthly limit on my 4-user plan, if I go away on a trip and watch 3-4 Netflix movies in HD, I've used up my entire monthly allowance, and then streaming becomes pay-per-view at $10+ per movie.
They are annoyed that they have customers who still have an "unlimited" plan, and they are effectively converting those users to a usable 5 GB plan.
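The arithmetic behind that complaint is quick to check. The 10 GB cap and $10+ per movie come from the comment above; the ~3 GB per HD movie is my own assumption (actual usage varies with bitrate):

```python
# Quick check on the data-cap complaint.  The cap and overage figures
# come from the comment; 3 GB per HD movie is an assumed stream size.

cap_gb = 10            # monthly shared cap
gb_per_hd_movie = 3    # assumed size of one HD movie stream

movies_within_cap = cap_gb // gb_per_hd_movie
print(f"{movies_within_cap} HD movies exhaust a {cap_gb} GB cap;")
print("after that, each movie effectively costs $10+ in overage")
```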
Of course there would be. That's not the point, and not at all what I was talking about. General education ensures that a person at level 1 with aptitude for level 5 under perfect conditions, can get as close as possible to level 5. Or are you having trouble reading?