
Comment: Re:What about Oregon and Washington? (Score 1) 368

by metaforest (#47668451) Attached to: Comcast Drops Spurious Fees When Customer Reveals Recording

In WA it is sufficient to inform the party that a recording may be made. I know this because I was Director of Operations for an inbound call center.
The call flow announced at the beginning of every call, "This is Blah Blah Blah, your call is important to us. All agents are busy at this time; your wait time will be x minutes. This call may be recorded for Quality Assurance Purposes." Every caller heard the entire greeting even if agents were available when the call was answered. I don't have a link, but this was vetted by the staff attorney as best practice.

Also, the agent who answers the inbound call, or places an outbound call, knows that all calls are recorded. It is part of their training. I don't think there is any way that an entity like Comcast or CenturyLink would have a leg to stand on if the customer records the call without announcement. However, IANAL.. YMMV.

Comment: Re:Idiot drill makers (Score 1) 101

by metaforest (#47596295) Attached to: Fixing a 7,000-Ton Drill

Can't make your shit out of tungsten so when you hit a teensy 8-inch pipe you don't fuck the drill head up?

They should be asking for a refund on their drill head. I've blown apart 8-inch pipes with 10-inch coring bits and did NOTHING to the bit, which itself was about 1/8th the thickness of the pipe's inner walls.

Mohs hardness scale, do you even, motherfuckers?

The cutting discs (tungsten) and the head face (hardened steel) are hardened materials.
The cutting face was not damaged at all, AIUI. It was the head support and bearing assemblies that got creamed by the head chewing through an 8-inch steel well casing, which is much thicker than el-cheapo galvanized pipe.

The damaged assemblies were designed to support a much lower torque/vibration load while cutting through relatively soft, water-saturated landfill slurry.

Comment: Re:Looks ok to me (Score 1) 229

by metaforest (#47491787) Attached to: Chicago Red Light Cameras Issue Thousands of Bogus Tickets

In Seattle I challenged a parking violation with photographic evidence. The magistrate accepted the evidence into the record. After some back and forth over the interpretation of the evidence, she proffered a deal.
Magistrate: "I'll reduce the fine. $2.00."
Me: "The evidence clearly shows I was not in violation."
Magistrate: "You are more than welcome to take the matter up with a judge."
Me: "So this is really about getting a conviction, not about justice."
Magistrate: "$2.00. Take it, or leave it."
Me: *fuming* "$2.00 it is. Have a good day, Your Honor." *bowing*

Comment: Re:It's a question of mass production (Score 1) 564

Throwing transistors at a nebulous problem does not make that problem go away. We don't have anything approaching consensus on what intelligence is, let alone how it might be replicated once we can put trillions of transistors on a die. The canvas has gotten bigger and the features smaller, and we still struggle just to compile a viable application, with no AI in sight.

Comment: Re:Warp Drive (Score 1) 564

This.

That 10-line PID control loop requires carefully chosen, manually tuned constants to maintain loop stability. Those constants are determined through empirical evaluation of the system being controlled and its performance envelope. Dynamically adjusting those PID constants would require many more lines of code and a persistent dataset used to evaluate how effective any adjustments were over time. That is not AI. Not. Even. Close.

At best it would be an adaptive control system, and those are easily fooled by subsystem failures, aging sensors, and other issues that make the approach unsuitable for many applications. An adaptive control system that makes these kinds of changes to that 10-line PID loop can easily paint itself into an unsafe regime.
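For the curious, here is a minimal sketch of what such a hand-tuned loop looks like (Python; the gains and the toy plant model are mine, chosen purely for illustration):

```python
# Toy PID loop with hand-tuned, frozen gains. The values below were picked
# empirically for this one toy plant; nothing in the loop learns anything.
KP, KI, KD = 0.8, 0.2, 0.05

def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt          # accumulate error over time
        deriv = (err - state["prev_err"]) / dt # rate of change of error
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Drive a crude first-order plant (y' = u - y) toward a setpoint of 1.0.
pid = make_pid(KP, KI, KD, dt=0.1)
y = 0.0
for _ in range(400):
    u = pid(1.0, y)
    y += 0.1 * (u - y)
print(round(y, 3))   # settles near 1.0
```

Swap that plant for a real motor or heater and these constants are wrong again: they encode one specific system's dynamics, which is exactly why tuning them is empirical grunt work rather than intelligence.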

Comment: Re:First; best or cheapest (Score 1) 209

by metaforest (#47251649) Attached to: How Tim Cook Is Filling Steve Jobs's Shoes

What is a Motorola 360? I have never ever seen one in use, nor a Sammy Gear or a Google Glass for that matter. I guarantee that when Apple sells 10 million iWatches the first year, we will all see them everywhere. And yes, I know what a Moto 360 is, I'm just proving a point. Also, nobody knows what the iWatch will look like.

I have no idea how successful the iWatch will be; what I do know is that it is already a long way from being perceived as first. It is walking into a market where large companies (Sony, Samsung, Google) already have products, some on their second generation, plus years of necessary FRAND patents. Whatever the iWatch looks like, the game has already changed... and it is costing Apple now. Oh, and I like the look of the Motorola 360 too, so it's looking pretty good for an unlaunched product.

Take a wander back to the first generation iPod, or the first iPhone. Apple did not invent those device classes.... they innovated them. They refined them.

Small innovative MP3 players had been around for years before the iPod came out... iRiver had some of the best ones available at the time, and they went far beyond what the iPod started out as. But the innovation was embedding a mass storage device, a well-thought-out user interface, a relatively seamless process for loading media onto the device, a media marketplace to reduce friction in media sales, and really great marketing and advertising....

Same thing with the iPhone. It wasn't the first smart phone.... it was the first rational smart phone. The first with a sensible UX story. The first with seamless media integration.

All of this evolved EVERYONE'S expectations of what a smart phone could be and ultimately SHOULD be.

As others have pointed out, Apple did not invent the GUI; even before Xerox there were bold steps in that direction that simply fell flat... Even the Lisa, the first Apple attempt at what became the Macintosh, was a bloated piece of crap. Apple didn't invent the laptop either. Their first attempt was the Macintosh Portable... IBM Selectrics weighed less and took up less desk space.... forget about using it on your lap; your legs would go to sleep before you got the thing to boot. Their second attempt in each case was full of win.... the Mac 128K took off.... and in its time so did the PowerBook 100... the first practical laptop by any manufacturer.

Comment: Re:Creativity (Score 1) 209

by metaforest (#47251359) Attached to: How Tim Cook Is Filling Steve Jobs's Shoes

But the founding and initial success of Apple would not have happened without Wozniak.

And you never, ever would have heard of Wozniak without Jobs.

Ever.

Wozniak needed what Jobs brought to the table as much as Jobs needed what Woz brought. Each, without the other, would have been nothing.

^^ THIS!

Jobs was the Jelly to WOZ's peanut butter. Without both, you don't get to a PB&J, crusts or no.

Comment: Re:Hardware sampling rates (Score 1) 121

by metaforest (#47230979) Attached to: The Computer Security Threat From Ultrasonic Networks

I was under the impression that while humans mostly cannot hear ultrasonic sounds, their presence can be perceived as a kind of "texture" in sounds that we can hear. Removing these frequencies altogether from all sound sources can make things sound more artificial.

Nope, it's 100% bullshit. Audiophiles cling to it as justification for spending money on 96 or 192 kHz shit.
When recording a physical sound, the sum total of all frequency components interfering with each other will be recorded by the microphone. A microphone does not record individual frequency components, it records a physical pressure wave. Your ear picks up the effects of frequency components outside of its range interfering with frequency components inside its range. A microphone does the exact same thing.

96K and 192K sample rates with 24- or even 32-bit float sample widths have nothing to do with audiophile gear. They have to do with digital audio processing. Processing at higher sample rates during mixing and editing reduces losses and aliasing errors that creep into the audible portion of the signal from effects, filters, and summing. During final mastering the sample rate is converted back down to 44.1 kHz / 16 bit as the last step. If you do all the post-processing at 44.1 kHz, 16 bit, your effective SNR goes to hell in a bucket with even just a few digital filters in the signal chain. Sure, you can start with a 44.1 kHz source and upconvert it using interpolation, but that is not as accurate as sampling the live source at 96 or 192 kHz. Starting with really clean high-resolution sources means that the final result has much better SNR than is possible otherwise.
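A quick way to see the aliasing problem (a numpy sketch of my own; the tone frequency is arbitrary): any component above the 22.05 kHz Nyquist limit, such as a harmonic generated by a distortion effect, folds straight back into the audible band if the signal only exists at 44.1 kHz.

```python
import numpy as np

fs = 44100                         # CD sample rate
t = np.arange(fs) / fs             # one second of samples
f_ultra = 30000.0                  # component above Nyquist (22050 Hz)
x = np.sin(2 * np.pi * f_ultra * t)

# Find where the energy actually landed after sampling.
spec = np.abs(np.fft.rfft(x))
peak_hz = np.argmax(spec) * fs / len(x)
print(peak_hz)   # 14100.0 -> folded down to fs - f_ultra, squarely in the audible band
```

Processing at 96 or 192 kHz keeps those products above the audible band while you work, and the final decimation filter removes them cleanly instead of letting them fold down as in-band garbage.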

Comment: Re:Is like an AppleII with 32 bits. Add BASIC and (Score 1) 138

by metaforest (#46969873) Attached to: A 32-bit Development System For $2

This is exactly what I do with an Olimex PIC32 T-795H. It breaks the PIC32 IO out to breadboard-compatible pins and comes with an open source version of MMBASIC installed. It is easy to upgrade to one of the later closed versions of MMBASIC, which is more VB-like and has better performance. Performance is not too bad; it processes about 1 BASIC token per microsecond. MMBASIC even supports treating the unused portion (192K) of flash as a file system, can autostart a BASIC app, and supports app chaining. You can literally plug this thing into a breadboard, plug it into a USB port, open a VT100 terminal, and start writing code. On a Mac you can use screen, but you'll need to modify the function key mappings to get the VT100 function keys the MMBASIC editor supports to work correctly. A 32-bit, 80 MHz BASIC machine that is ready to rock.

Comment: Oh FFS (Score 1) 181

by metaforest (#46947013) Attached to: AMD Designing All-New CPU Cores For ARMv8, X86

Watch The Mill videos.... It is all spelled out.

You cannot win the Ops/W/$ race by spending resources on O-O-O. You have to be smart about the whole chain. FUs are cheap. Handling optimal cases is expensive in the current regimes. So change the rules. Suddenly difficult problems get easier... Intel and AMD and other incumbents are scared shitless of changing their aging ISAs, and Programming Models. And with good reason... they would be forcing every customer to recompile on version 0.1 releases of new compilers, and working with new hardware spins....

Does anyone still program the 6800, 6502, or Z-80 for commodity-level hardware? No! Because those machines are done. They were invented by people who had no concept of where things were going. Even Intel/AMD are pretty much clueless. They fear change because it has huge costs. So instead they keep flogging an ISA that is so decrepit it farts zombies!

Comment: Re:Couldn't one core... (Score 1) 181

by metaforest (#46946973) Attached to: AMD Designing All-New CPU Cores For ARMv8, X86

Funny you should mention that. Lots of mainframe families were doing that decades ago. APP-LOADER: "Oh! Hey! This is a new code module, I don't have a local copy of the binary I need for this. It appears to also include a generic machine code block for my family of processor cores. No cached binary for me, it was built on a different serial numbered machine.... hold on a sec while I post-compile-schedule the instructions to my local object language and re-link it..... There ya go! This will run like oiled snot on me now! Oh! And I have automagically replaced the old incompatible object-binary with the new binary I just built! Have a nice day!"

I'd like nice toys like that on the desktop. Closest we get now from any vendor or OS zealot(s) is.... "Can I re-link this for you? It may run faster."

Comment: Re:Right, because that worked so well (Score 1) 181

by metaforest (#46946899) Attached to: AMD Designing All-New CPU Cores For ARMv8, X86

The 6502 didn't have a HCF instruction, but it did have a key 6800 feature that later RISC machines lack: almost all operations have an implied target, the A register (or in the 6800 the A/B registers). The Mill seems to make better use of the implied-target concept by putting all results on The Belt, thus reducing the classic register juggling that happens in complex code on all RISC machines. I think this architecture has some legs. I hope they get some silicon taped out and get LLVM hammered into shape to deal with a machine that doesn't explicitly have globally named registers, or state for that matter.
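A toy model of the belt idea in Python (entirely my own sketch of the concept, not the actual Mill ISA): every result is dropped onto the front of a fixed-length queue and operands are addressed by position, so no instruction ever names a destination register.

```python
from collections import deque

class Belt:
    """Fixed-length belt: new results push in at position 0, old ones fall off."""
    def __init__(self, length=8):
        self.slots = deque(maxlen=length)  # oldest values silently drop off the end

    def push(self, value):
        self.slots.appendleft(value)       # every op's result lands at b0

    def __getitem__(self, pos):            # operands are named by belt position
        return self.slots[pos]

belt = Belt()
belt.push(5)                   # e.g. a load result -> b0
belt.push(7)                   # another load -> new b0; the 5 is now b1
belt.push(belt[0] + belt[1])   # "add b0, b1" -> result becomes the new b0
print(belt[0])                 # 12, and no named register was ever written
```

The compiler's job becomes tracking where on the belt each live value currently sits, instead of allocating and spilling a global register file.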
