throwing transistors at a nebulous problem does not make that problem go away. We don't have anything approaching consensus on what intelligence is, let alone how it might be replicated, even when we can put trillions of transistors on a die. The canvas has gotten bigger and the features smaller, and we still struggle just to compile a viable application, with no AI in sight.
That 10-line PID control loop requires carefully chosen, manually tuned constants to maintain loop stability. Those constants are determined through empirical evaluation of the system being controlled and its performance envelope. To dynamically adjust those PID constants would require many more lines of code and a persistent dataset used to evaluate how effective any adjustments to those constants were over time. That is not AI. Not. Even. Close.
At best it would be an adaptive control system, and those are easily fooled by subsystem failures, aging sensors, and other issues that make the approach unsuitable for many applications. An adaptive control system that makes these kinds of changes to that 10-line PID loop can easily paint itself into an unsafe regime.
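For the curious, here's roughly what that 10-line loop looks like, sketched in Python against a toy first-order plant. The gains and the plant model here are invented for illustration; real constants come from tuning against the actual hardware, exactly as described above:

```python
# Minimal PID loop sketch. The gains below are hand-tuned for this
# toy first-order plant only; on real hardware they must be found
# empirically against the actual system's response.
KP, KI, KD = 2.0, 0.5, 0.1    # hand-tuned constants
DT = 0.01                     # control period, seconds

def run_pid(setpoint, steps=3000):
    y, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * DT
        deriv = (err - prev_err) / DT
        u = KP * err + KI * integral + KD * deriv
        prev_err = err
        y += (u - y) * DT     # toy first-order plant response
    return y

print(run_pid(1.0))           # settles near the setpoint
```

Change any one of those three constants by a factor of a few and the loop rings or crawls; that sensitivity is the whole point about tuning.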
What is a Motorola 360? I have never seen one in use, nor a Samsung Gear or a Google Glass for that matter. I guarantee that when Apple sells 10 million iWatches in the first year, we will all see them everywhere. And yes, I know what a Moto 360 is; I'm just proving a point. Also, nobody knows what the iWatch will look like.
I have no idea how successful the iWatch will be. What I do know is that it is already a long way from being perceived as first. It is not walking into a market that still needs years of FRAND patent groundwork. It is walking into a market where large companies (Sony, Samsung, Google) already have products (some on their second generation) and patents. Whatever the iWatch looks like, the game has already been changed... and it is costing them now. Oh, and I like the look of the Motorola 360 too, so it's looking pretty good for an unlaunched product.
Take a wander back to the first generation iPod, or the first iPhone. Apple did not invent those device classes.... they innovated them. They refined them.
Small innovative MP3 players had been around for years before the iPod came out... iRiver had some of the best ones available at the time, and they went far beyond what the iPod started out as. But the innovation was embedding a mass storage device, and a well thought out user interface, and having a relatively seamless process to load media onto the device, and creating a media market place to reduce friction in media sales, and really great marketing and advertising....
Same thing with the iPhone. It wasn't the first smart phone.... it was the first rational smart phone. The first with a sensible UX story. The first with seamless media integration.
All of this evolved EVERYONE'S expectations of what a smartphone could be and ultimately SHOULD be.
As others have pointed out, Apple did not invent the GUI; even before Xerox there were bold steps in that direction that simply fell flat... Even the Lisa, the first Apple attempt at what became the Macintosh, was a bloated piece of crap. Apple didn't invent the laptop either. Their first attempt was the Macintosh Portable... IBM Selectrics weigh less and take up less desk space... forget about using it on your lap; your legs would go to sleep before you got the thing to boot. Their second attempt in each case was full of win... the Mac 128K took off... and in its time so did the PowerBook 100, the first practical laptop by any manufacturer.
But the founding and initial success of Apple would not have happened without Wozniak.
And you never, ever would have heard of Wozniak without Jobs.
Wozniak needed what Jobs brought to the table as much as Jobs needed what Woz brought. Each, without the other, would have been nothing.
Jobs was the Jelly to WOZ's peanut butter. Without both, you don't get to a PB&J, crusts or no.
I was under the impression that while humans mostly cannot hear ultrasonic sounds, their presence can be perceived as a kind of "texture" to other sounds that we can hear. Removing these frequencies altogether from all sound sources can make things sound more artificial.
Nope, it's 100% bullshit. Audiophiles cling to it as justification for spending money on 96 or 192 kHz shit.
When recording a physical sound, the sum total of all frequency components interfering with each other will be recorded by the microphone. A microphone does not record individual frequency components, it records a physical pressure wave. Your ear picks up the effects of frequency components outside of its range interfering with frequency components inside its range. A microphone does the exact same thing.
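A quick way to check that claim numerically, sketched in Python with NumPy (the 15 kHz and 30 kHz tones are arbitrary choices for illustration): as long as the mixing is linear, which it is in air and in a working microphone, an ultrasonic tone adds no energy whatsoever to the audible band.

```python
import numpy as np

FS = 192_000                     # sample fast enough to represent both tones
t = np.arange(FS) / FS           # exactly one second, so tones land on FFT bins
audible    = np.sin(2 * np.pi * 15_000 * t)   # 15 kHz, within hearing range
ultrasonic = np.sin(2 * np.pi * 30_000 * t)   # 30 kHz, above hearing range

spectrum = np.abs(np.fft.rfft(audible + ultrasonic))
freqs = np.fft.rfftfreq(len(t), 1 / FS)

# Energy below 20 kHz, excluding the 15 kHz tone's own bin:
mask = (freqs < 20_000) & (np.abs(freqs - 15_000) > 100)
print(spectrum[mask].max())      # effectively zero: nothing audible was added
```

Any audible "texture" would have to come from nonlinearity somewhere in the chain, and that distortion product, being an actual pressure wave, is exactly what the microphone would also record.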
96 kHz and 192 kHz sample rates with 24-bit or even 32-bit float sample widths have nothing to do with audiophile gear. They have to do with digital audio processing. Processing at higher sample rates during mixing and editing reduces the losses and aliasing errors that creep into the audible portion of the signal from effects, filters, and summing. During final mastering the result is downconverted back to 44.1 kHz, 16-bit as the last step. If you do all the post-processing at 44.1 kHz, 16-bit, your effective SNR goes to hell in a bucket with even just a few digital filters in the signal chain. Sure, you can start with a 44.1 kHz source and upconvert it using interpolation, but that is not as accurate as sampling the live source at 96 or 192 kHz. Starting with really clean, high-resolution sources means the final result has a much better SNR than is possible otherwise.
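The noise accumulation is easy to sketch (Python/NumPy; the chain of gain stages and the gain values are invented for illustration): re-quantizing to 16 bits after every processing stage stacks up quantization noise that a float pipeline avoids.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.uniform(-0.5, 0.5, 100_000)   # toy "audio" signal
gains = [0.7, 1.3, 0.9, 1.1, 0.8, 1.25]    # an invented chain of gain stages

def q16(x):
    """Round to the nearest 16-bit step, as a 16-bit pipeline must."""
    return np.round(x * 32767) / 32767

ref = signal.copy()
x16 = q16(signal)
for g in gains:
    ref = ref * g          # float pipeline: negligible added error
    x16 = q16(x16 * g)     # 16-bit pipeline: re-quantize after every stage

err = x16 - ref
snr_db = 10 * np.log10(np.mean(ref**2) / np.mean(err**2))
print(f"SNR after {len(gains)} stages: {snr_db:.1f} dB")
```

Each re-quantization adds roughly q²/12 of noise power, so a long mixing/effects chain at 16 bits eats tens of dB of headroom that working in float (then dithering down once at the end) preserves.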
RPi is an excellent machine, but the GPIO cannot handle realtime apps. What it really needs is a realtime I/O controller. Maybe something like an XMOS controller and an FPGA.
This is exactly what I do with an Olimex PIC32 T-795H. It breaks the PIC32 I/O out to breadboard-compatible pins and comes with an open-source version of MMBASIC installed. It is easy to upgrade to one of the later closed versions of MMBASIC, which is more VB-like and has better performance. Performance is not too bad: it processes about 1 BASIC token per microsecond. MMBASIC even supports treating the unused portion (192K) of flash as a file system, can autostart a BASIC app, and supports app chaining. You can literally plug this thing into a breadboard, plug it into a USB port, open a VT100 terminal, and start writing code. On a Mac you can use screen, but you'll need to modify the function key mappings to get the VT100 function keys the MMBASIC editor supports to work correctly. A 32-bit, 80 MHz BASIC machine that is ready to rock.
Watch The Mill videos.... It is all spelled out.
You cannot win the ops/W/$ race by spending resources on out-of-order execution. You have to be smart about the whole chain. Functional units are cheap; handling the optimal cases is expensive under the current regimes. So change the rules, and suddenly difficult problems get easier... Intel, AMD, and the other incumbents are scared shitless of changing their aging ISAs and programming models. And with good reason... they would be forcing every customer to recompile on version 0.1 releases of new compilers and to work with new hardware spins....
Does anyone still program the 6800, 6502, or Z-80 on commodity-level hardware? No! Because those machines are done. They were designed by people who had no concept of where things were going. Even Intel and AMD are pretty much clueless. They fear change because it has huge costs. So instead they keep flogging an ISA so decrepit that it farts zombies!
Funny you should mention that. Lots of mainframe families were doing that decades ago. APP-LOADER: "Oh! Hey! This is a new code module, I don't have a local copy of the binary I need for this. It appears to also include a generic machine code block for my family of processor cores. No cached binary for me, it was built on a different serial numbered machine.... hold on a sec while I post-compile-schedule the instructions to my local object language and re-link it..... There ya go! This will run like oiled snot on me now! Oh! And I have automagically replaced the old incompatible object-binary with the new binary I just built! Have a nice day!"
I'd like nice toys like that on the desktop. Closest we get now from any vendor or OS zealot(s) is.... "Can I re-link this for you? It may run faster."
The 6502 didn't have a HCF instruction, but it did have a key 6800 feature that later RISC machines lack: almost all operations have an implied target, the A register (or, on the 6800, the A/B registers). The Mill seems to make better use of the implied-target concept by putting all results on the Belt, reducing the classic register juggling that happens in complex code on all RISC machines. I think this architecture has some legs. I hope they get some silicon taped out and get LLVM hammered into shape to deal with a machine that doesn't explicitly have globally named registers, or state for that matter.
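As a conceptual sketch only (Python, nothing like the real machine model): the Belt can be pictured as a fixed-length queue where every result lands at position zero and operands are named by relative position instead of by register name.

```python
from collections import deque

# Toy model of the Mill's Belt concept: every functional-unit result
# is pushed onto a fixed-length belt; operands are addressed by
# relative position (b0 = newest), never by a named register.
class Belt:
    def __init__(self, length=8):
        self._slots = deque(maxlen=length)  # oldest results fall off the end

    def push(self, value):
        self._slots.appendleft(value)       # new results always land at b0

    def __getitem__(self, pos):
        return self._slots[pos]             # b0 is newest, b1 next, ...

belt = Belt(length=4)
belt.push(3)                      # b0 = 3
belt.push(4)                      # b0 = 4, the 3 slides down to b1
belt.push(belt[0] + belt[1])      # "add b0, b1" -> new b0 = 7
print(belt[0], belt[1], belt[2])  # 7 4 3 -- no register was ever named
```

The compiler's job becomes tracking where on the belt each live value currently sits, which is exactly the LLVM wrangling mentioned above.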
USB 3.0 and 3.1 can supply 5 V @ 2 A, 12 V @ 3 A, or 20 V @ 5 A.
I think with 100W available melting a little plastic should be no problem.
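The arithmetic behind those figures:

```python
# Power available at each quoted USB voltage/current pairing (P = V * I)
profiles = {"5 V @ 2 A": 5 * 2, "12 V @ 3 A": 12 * 3, "20 V @ 5 A": 20 * 5}
for name, watts in profiles.items():
    print(f"{name} = {watts} W")   # 10 W, 36 W, 100 W
```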
So if they can legally access the data stored on the Irish servers, they must produce that information.
This is really simple:
One cannot refuse a court-ordered demand for documents just because those documents are stored in a safe deposit box in Switzerland. If the person in possession of the deposit box key is before a US court, they must produce the documents stored in that Swiss bank, as ordered, come hell or high water.
(IANAL) but this is just common sense. Is M$ Legal really this stupid?
Motorsports are considered sports, and their primary attribute isn't physical exertion (besides the extremely long race forms), it's knowledge and skill.
You clearly have never driven a vehicle set up for racing on a track for any length of time. It requires top-notch driving skills, mental endurance, and a very large commitment to physical training. Race cars (of any format) beat the shit out of the driver, even without a collision. The steering has almost no power assist, and the pedals are worked orders of magnitude more often than on the daily commute. Add to that the lateral Gs that keep trying to rip hands off the steering wheel, push legs and feet away from the pedals, and shove the driver's head out of the position where the mirrors can be read accurately. While it is true the racing harness keeps the torso pretty well strapped to the seat, it takes a lot of strength and endurance to stay out of oxygen debt when the torso is being pounded around in that seat.
It is even more profound for motorcycle racing.
Golf is a game (like chess, not a sport, like baseball) you play while walking around in a clear-cut that has been reseeded with grass.
Now get off of my lawn with your damn crooked sticks, dimpled balls, and silly looking shoes.