Your comments give me such a thrill,
But your comments don't pay my bills!
The MSR reactor is the best stove I've used. Fuel efficient and fast, if a bit pricey. Sometimes I consider using it in my kitchen instead of the range.
Think of it this way: Digital is math, Analog is Physics.
As mentioned before, the world is analog. Obvious things like audio and video interfaces need analog circuits and always will. Our ears, voices, and eyes are all analog. However, in today's circuits the analog content is growing, not shrinking. Phones have batteries that need to be carefully managed. The digital circuits need many power supplies that require sophisticated regulation. These are all analog circuits. The Wi-Fi, Bluetooth, cellular, NFC, and other radios are all very analog-intensive circuits. There are a multitude of monitoring and control circuits on any modern piece of electronics, all analog. The USB, HDMI, FireWire (old, I know), Thunderbolt, etc. interfaces are all high-speed circuits that need analog drivers, receivers, and clocks. The CPUs, DSPs, etc. all need clean clocks generated by PLLs, which are analog even if they contain digital elements. Even the digital logic gates themselves are analog. The voltage levels that define 0s and 1s are very analog. Their accuracy is quantified and specified to work with supply noise, clock jitter, and timing errors. All analog stuff. The digital math only works when these analog problems are reliably solved.
So please, keep saying that the world is digital and analog is going to die. It only provides more job security. We can't hire good analog designers fast enough.
A lot of the posts here seem to be some version of "This just shows that women aren't cut out for CS." My experience makes me think this may be an American issue.
I have been working in electrical engineering for 16 years. (I know, not CS, but chip design is a tech field with commonalities.) It is a majority-male field with approximately 15%-20% women. However, NONE of the women I see were raised in America. Most of the women I work with completed undergraduate degrees in China or India and got a graduate degree in the US. There must be some reason why American women are repelled from engineering. Or is it that Chinese and Indian women are just fundamentally better at tech than American women? (Before anyone brings up the H-1B visa issue, that's not what is going on here. We have a hard time finding qualified candidates regardless of where they are from.)
It would seem that the fact that we (in the US) socialize girls from a very early age to stay away from things like tech is relevant here. If you are curious, go into any major toy store (e.g. Toys R Us; the independent stores are actually a lot better about this). There are vanishingly few toys that are just toys anymore. There are only "girls'" and "boys'" toys. The girls' toys are all pink and predominantly princess-themed. Now, you may say that this just confirms that boys and girls are different, but you would be using circular logic to justify a preconceived notion.
Children are very sensitive to societal norms, both boys and girls. When high school girls feel that being too smart will make them less feminine and threaten boys, they will have a tendency to conform to these expectations. Sure, there will be some who don't but they will be fighting the system to some degree.
Saying that the low percentage of women in CS is proof that women don't like or aren't good at CS is simply pointing to the current state of affairs as a justification that it is the only possible way things can be. The reality is more nuanced.
The new semiconductor technology angle in the article seems highly fishy to me. Apart from the fact that the statement felt like it might as well have said "In 10 years we will all be living in colonies on the moon," III-V materials have been losing market share to silicon for decades.
The article mentions the great electron mobility of the III-V materials, which is true, but forgets to mention that they have poor hole mobility. Now, I am not a process expert, so maybe there are new techniques to address this. However, over the past 20 years or so this has meant that you couldn't make very good CMOS logic and had to use NMOS-only architectures. This, along with poor scaling, has kept the III-Vs away from large-scale integrated logic chips.
The III-V devices were used in RF circuits, but they were replaced by SiGe, and now many RF circuits use regular silicon processes. The III-Vs are still useful for optics.
The truth is that silicon has many problems that may prevent the industry from continuing to scale circuits to smaller geometries and the available workarounds are generally painful. But, the other options are worse.
Maybe in 10 years we will all be using cell phones that use carbon nanotubes... in our colonies on the moon.
In reality, sound is all analog. Those vibrating strings on that guitar... analog. The vocal cords in the singer's throat... analog. The vibrating membrane on those drums... you get the idea. The challenge comes in when you want to store that information so that you can play it back later (by creating vibrations in someone's eardrum, most likely). In studio recordings, the limits on the noise floor, distortion, and frequency response are set by the analog circuits unless it is a really crappy system.
Before digital computers were available, the only options were to create static variations in physical media, i.e., wax cylinders, vinyl records, magnetic tape, etc. The variations were analogous to the sound waves in the air (hence calling them analog).
Digital sound samples the sound in time and quantizes the samples so that they can be represented numerically. The beauty here is that the physical medium no longer matters. Once you have the numbers, you could store them on spinning magnetic disks or marbles in shot glasses. The difference is cost and practicality.
There is a lot of information theory to cover here, but the relevant basics are that the quality of the stored digital data (talking about PCM here; compression is another layer entirely) is determined by how finely you quantize each sample (bit depth) and how often you take samples (sample rate). In a well-designed digital audio system, these factors will not be the limiting factor of your performance. This was true even in the early days of CD audio: the dynamic range of the ADCs and DACs was less than what 16-bit quantization could achieve. Also, the analog anti-aliasing filters of the day could not handle the 44.1kHz sample rate well, as they had to have very steep rolloff.
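To make those two knobs concrete, here is a minimal sketch (in Python, numbers chosen to match CD audio; the `quantize` helper is purely illustrative) of what sample rate and bit depth each control:

```python
# Toy PCM sketch: sample rate fixes the representable bandwidth (Nyquist),
# bit depth fixes how finely each sample's amplitude is quantized.
SAMPLE_RATE = 44_100   # Hz, CD audio
BIT_DEPTH = 16         # bits per sample, CD audio

nyquist = SAMPLE_RATE / 2    # highest representable frequency: 22,050 Hz
levels = 2 ** BIT_DEPTH      # 65,536 discrete amplitude levels

def quantize(x, bits=BIT_DEPTH):
    """Snap a sample in [-1.0, 1.0) to the nearest of 2**bits levels."""
    step = 2.0 / (2 ** bits)
    return round(x / step) * step
```

The worst-case error of `quantize` is half a step, which is what puts a floor under the achievable dynamic range for a given bit depth.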
Nowadays, the studio ADCs are capable of greater than 120dB dynamic range (the best datasheet I've seen is 127dB) and oversampling techniques like delta-sigma modulation have made the analog filters much simpler. 24-bit resolution is more than enough to handle this. Higher sample rates were initially to help with the analog filtering, but that does not matter today since almost all audio DACs actually run at several MHz internally and use digital interpolation filters to generate the oversampled data.
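A rough back-of-the-envelope on why oversampling helps (the 128x modulator rate below is an illustrative figure, not from any particular part; real delta-sigma converters gain far more than this via noise shaping, which this sketch deliberately ignores):

```python
import math

F_AUDIO = 44_100     # Hz, base audio sample rate
F_MOD = 5_644_800    # Hz, illustrative 128x modulator rate (assumed)

osr = F_MOD / F_AUDIO                 # oversampling ratio: 128x
# Plain oversampling spreads the fixed quantization noise power over a
# wider band, so the in-band share drops by 10*log10(OSR) dB.
plain_gain_db = 10 * math.log10(osr)  # ~21 dB before any noise shaping
```

Even this crude 21 dB figure shows why running the converter fast lets the analog reconstruction filter be gentle: the first image sits megahertz away instead of just above 22 kHz.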
So, the theoretical 144dB dynamic range of 24-bit audio is not achievable today and will likely not be for the foreseeable future. Going to 32-bit only makes sense if you already have 32-bit hardware and you don't save any resources by going to 24-bit. There is a slim case to make if you are doing lots of processing, but the advantage over 24-bit is just a practical one in most cases.
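The bit-depth figures above all come from the same rule of thumb, roughly 6 dB of dynamic range per bit (I'm using the plain 6.02*N form here; add ~1.76 dB if you want the full-scale-sine SNR version):

```python
def ideal_dynamic_range_db(bits):
    """Rule-of-thumb quantization dynamic range: ~6.02 dB per bit."""
    return 6.02 * bits

cd_range = ideal_dynamic_range_db(16)      # ~96 dB for CD audio
studio_range = ideal_dynamic_range_db(24)  # ~144 dB, the figure above
```

Comparing the 24-bit number against the ~120-127 dB that the best real ADCs manage shows the gap: the analog front end, not the quantization, is the limit.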
This kind of turned into a rant, but there seemed to be a lot of analog vs digital comments and I wanted to try to provide some perspective.
From "Bart the Fink" in season 7...
Krusty (to Bart): Bah. What good is respect without the moolah to back it up. Everywhere I go I see teachers driving Ferraris, research scientists drinking champagne.
This is a research idea that MAY be useful; the demise of CMOS silicon has been greatly exaggerated.
From the summary:
"an inverter, which was able to switch on and off 500,000 times per second" -> 500kHz is not so great
"however, began to break down after 2 billion cycles" or about 1 second at current processor speeds. That increases to 4000 seconds at 500kHz, or a little more than an hour.
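The arithmetic behind those lifetime figures, spelled out (2 GHz is my stand-in for "current processor speeds"):

```python
CYCLES_TO_FAILURE = 2_000_000_000   # ~2 billion cycles, per the summary

def lifetime_seconds(freq_hz):
    """How long the device survives if toggled continuously at freq_hz."""
    return CYCLES_TO_FAILURE / freq_hz

at_cpu_speed = lifetime_seconds(2e9)    # ~1 second at 2 GHz
at_demo_speed = lifetime_seconds(500e3) # 4000 seconds at 500 kHz, ~67 minutes
```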
Also, we can put billions of error free transistors on a chip for a few dollars. THAT is the real hurdle that nothing else has been able to clear yet. We will likely be with silicon for a while after it stops shrinking for this reason.
not sure about "eFUSE" specifically, but the fuses on chips are used to write permanent data to a chip after manufacturing, typically during testing. If this is being done in the field and can be undone, then it is most likely some non-volatile memory that they are calling "eFUSE". I'd bet that the firmware makes it look like a fuse except under certain circumstances.
Actual fuses in chips are thin pieces of metal wire that connect to the power supply voltage. When you read the voltage, that connection gives you a logic "1". To blow the fuse, you use a higher supply voltage than normal and run enough current through the thin wire that it melts and breaks the connection. Then, when you try to read back that bit, it will read a logic "0" forever. This is very handy for encryption, calibration data, and manufacturing information such as lot# and chip location. Many years back, IIRC, Intel tried to put serial numbers on their CPUs that could be read back by software. They backed off after some public outcry. This used the fuses described above.
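As a toy software model of that one-time-programmable behavior (names and class entirely illustrative, not any vendor's API):

```python
class FuseBank:
    """Toy model of on-chip metal fuses: intact reads 1, blown reads 0 forever."""

    def __init__(self, n):
        self._blown = [False] * n   # all fuses start intact (conducting)

    def read(self, i):
        # An intact link to the supply reads as logic 1; a melted link reads 0.
        return 0 if self._blown[i] else 1

    def blow(self, i):
        # Models the high programming current melting the wire: irreversible.
        self._blown[i] = True

bank = FuseBank(8)
bank.blow(3)        # once blown, bit 3 reads 0 from now on
```

The key property is that there is no `unblow`; that one-way transition is exactly why a field-reversible "eFUSE" smells like non-volatile memory instead.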
Take a look at your microwave door sometime. It uses the same idea. There is nothing special about the glass; it just has a piece of metal with small holes in it. The EM (microwave) frequency is about 2.5 GHz IIRC. Those little holes are much smaller than that wavelength, so nothing gets through.
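Quick sanity check on the scale mismatch (using 2.45 GHz, the usual oven frequency, as my assumed figure):

```python
C = 3.0e8             # m/s, speed of light
F_MICROWAVE = 2.45e9  # Hz, typical microwave oven frequency (assumed)

wavelength_m = C / F_MICROWAVE   # ~0.122 m, i.e. about 12 cm
```

The mesh holes are a millimeter or two across, roughly a hundredth of that 12 cm wavelength, which is why the screen looks transparent to your eye (visible light is sub-micron) but opaque to the microwaves.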