
Comment: Answers. (Score 1) 588

by DrYak (#49827031) Attached to: How Tesla Batteries Will Force Home Wiring To Go Low Voltage

I do wish Slashdot would let you edit posts, then I wouldn't have to reply three times!

I'll group the answers.

Then why doesn't it happen more often?

Well, you need to stick fairly big needles deep into the body to get good contact (the probes mentioned in this Darwin Award), and apply a sufficient voltage to them for long enough. That's a convoluted setup that doesn't occur in everyday life.
(I can hardly see how it could happen except deliberately, as in the example.)

Actually a healthy heart will regain rhythm easily.

Generally speaking, yes, I agree. A healthy heart should restart.
That's in fact the principle used by defibrillators:
- Fibrillation is a big electrical mess where the cells are completely desynchronised and fire mostly at random, each triggered by the equally random firing of its neighbours. Electrically, the heart produces a signal that looks like white noise. Mechanically, the heart isn't beating in a coordinated manner; instead its surface is more or less "vibrating", with tons of small uncoordinated local micro-contractions (that's what fibrillation means).
- The defibrillator fires a charge at the heart.
- The charge causes all the muscle cells (and the specialised muscle cells that serve as the heart's equivalent of nerves) to contract at the same time and stay contracted for the short duration of the charge.
- After the shock, most of the cells are at roughly the same position in the cycle (and thus none will start misfiring in response to nearby misfires). They are more or less in a "waiting state".
- The natural rhythm generator fires its impulse as usual, and now all the cells should follow the same impulse travelling along the heart (and its nerve-like specialised fibres).
- The heart should contract in a coordinated manner and beat as it should.
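The resync sequence above can be caricatured in a few lines of code. This is a toy model of my own invention (each cell reduced to a phase counter — nothing physiological about it), just to show why one strong synchronous "reset" works where the cells can't resynchronise on their own:

```python
import random

def simulate(cells=100, steps=50, period=10, shock_at=None):
    """Toy model: each cell has a phase in its contraction cycle.
    Fibrillation = random phases; a shock resets every cell to the
    same phase, after which advancing in lockstep keeps them synced."""
    random.seed(1)
    # Desynchronised start: every cell at a random point of its cycle.
    phases = [random.randrange(period) for _ in range(cells)]
    for t in range(steps):
        if t == shock_at:
            phases = [0] * cells  # the shock: all cells contract together
        else:
            phases = [(p + 1) % period for p in phases]  # cycle advances
    return phases

def synchrony(phases):
    """Fraction of cells sharing the most common phase (1.0 = in sync)."""
    return max(phases.count(p) for p in set(phases)) / len(phases)

before = simulate(shock_at=None)   # never shocked: stays desynchronised
after = simulate(shock_at=25)      # shocked mid-run: ends fully in sync
print(synchrony(before), synchrony(after))
```

Without the shock the cells never converge on their own; after the single reset, synchrony is exactly 1.0.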

BUT....
In the Darwin Awards example, the current is constant, which doesn't cause a "resync" the way a defibrillator's single pulse does. Also, given the low resistance of the salty-water medium, the current is probably quite high, which is dangerous (relatively speaking, I mean).
There's a much higher risk of the heart going into fibrillation in this case.

Of course, adding some heart disease would increase the likelihood of dying.
But the absence of disease isn't a definite guarantee that you won't die from such shocks.

I've had quite a few jolts from 240 volt mains from one hand to the other. Explain why I'm not dead.

Basically: you got lucky.

Probably the shocks were short, or by luck the current's path didn't happen to reach the heart (I once had a lightning-struck patient who survived exactly because of that: the heart wasn't touched).
The fact that you survived previous shocks, and the fact that you don't have a heart disease, doesn't make you immortal and doesn't guarantee that you won't die next time.

Comment: Skin is the insulator (Score 1) 588

by DrYak (#49826647) Attached to: How Tesla Batteries Will Force Home Wiring To Go Low Voltage

That could very well happen.

The voltage and the current from a test meter are both insignificant.

The reason low voltage usually isn't dangerous is that the skin is a damn good insulator, requiring voltages above roughly 100 V to break down (one of the arguments invoked by countries using 100 V mains, whereas most of the rest of the world uses 220 V).

The Darwin Award example stuck the needle-like pointy ends of the probes *through* the skin. The skin's high insulation/resistance was no longer there to shield against the "insignificant voltage". Blood serum isn't distilled water: it is full of electrolytes — a quite conductive mix — and it runs through the heart. The rest of the fluids inside a body are rich in electrolytes too. That means the *inside* of a body conducts electricity quite well, and the heart can easily end up in the current's path (especially if you put the two electrode poles on opposite sides).
(That's one of the reasons it's not a bright idea to swim during a storm: the inside of your body is a *better* conductor than the water around you in the pool, and the skin is the only thing blocking the electricity.)

The actual delta-V needed to make a muscle cell or nerve react is quite low (a few dozen millivolts are enough to rise above the threshold and cause a contraction or impulse propagation). So with the skin barrier removed, it's quite likely that the remaining salty fluids (mostly blood, but also extracellular fluid) can carry enough current to jolt the heart and disrupt its normal rhythm.
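To put rough numbers on it (the resistance values are ballpark figures of my own choosing, purely illustrative — not medical data), Ohm's law alone shows why bypassing the skin changes everything:

```python
# Illustrative Ohm's-law comparison: same small voltage, with and
# without the skin's resistance in the current path.
def current_mA(volts, ohms):
    """Current in milliamps for a given voltage and resistance."""
    return volts / ohms * 1000

v = 9.0            # e.g. a small battery / test-meter scale voltage
skin = 100_000     # dry skin: on the order of 100 kOhm (ballpark)
internal = 500     # internal electrolyte-rich path: ~500 Ohm (ballpark)

print(current_mA(v, skin + internal))  # through intact skin: ~0.09 mA
print(current_mA(v, internal))         # needles bypass the skin: 18 mA
```

Two hundred times more current from the same "insignificant" voltage — and with needles, it's delivered deep inside, close to the heart.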

Comment: Chromium and Netflix (Score 1) 81

by DrYak (#49826253) Attached to: Emulator Now Runs x86 Apps On All Raspberry Pi Models

Only Chromium

You can google around and find several tutorials explaining how to compile Chromium with support for Widevine turned on (that's the DRM module used by Google Chrome to play Netflix's HTML5 EME video streams).
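For the curious, the build tweak in those tutorials boils down to a couple of GN arguments plus a rebuild. (Sketch from memory — flag names change between Chromium versions, so treat these as placeholders and check whatever tutorial you follow:)

```sh
# Hypothetical sketch: enable proprietary codecs and Widevine support
# when generating the Chromium build configuration, then rebuild.
gn gen out/widevine --args='proprietary_codecs=true enable_widevine=true'
ninja -C out/widevine chrome
```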

Now the question is:
- are there Widevine binaries available for ARM? (Not sure; I seem to remember reading about some.)
- or, alternatively, can a JIT emulator like TFA's run the x86 plugin at sufficient speed while leaving enough processing power to handle the rest of the video playback? (Luckily there's some hardware acceleration on the Pi, so maybe it's achievable.)

You could do the same with a Firefox build that supports CDM plugins, using Adobe's CDM plugin for Firefox.
(With the same limitation: either wait until Adobe ships an ARM version for the various mobile incarnations of Firefox, or hope the plugin can be emulated fast enough.)

Comment: Energy storage (Score 1) 81

by DrYak (#49823639) Attached to: Mercedes-Benz Copies Tesla, Plans To Offer Home Energy Storage

wouldn't power companies be doing it?

Around here, power companies DO INDEED do it.
It's called a hydroelectric dam.

- You let it fill when the power isn't needed (and thus store the energy as gravitational potential energy). Or you can even actively pump water up into it, charging it with electricity as the input.
- You start emptying it through the power station to supplement other energy sources when demand exceeds power capacity (as might happen with some forms of renewable energy).

On a much smaller scale, that has also always been the case with off-grid solar panels. When you're too remote to be connected to a power grid, instead of feeding the excess electricity into the grid and drawing from the grid later, you store the excess in batteries and retrieve it when needed.

(And in a way, if you think about it, lots of on-demand power plants — e.g. nuclear reactors — also store energy. Except that the form varies (uranium/thorium don't store energy as chemical states the way lithium does) and usually can't be recharged directly with electricity.)

So yes, power companies DO store energy. But given the scale at which they operate, they tend to choose denser storage (nuclear fuel) or much bigger quantities (the lake behind a dam) rather than a few kWh worth of lithium batteries.
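A quick back-of-envelope calculation (idealised numbers of my own, ignoring pump and turbine losses) shows the scale difference between a home battery and a reservoir:

```python
# Gravitational potential energy of pumped water, expressed in kWh.
G = 9.81  # m/s^2

def kwh_stored(volume_m3, head_m, rho=1000.0):
    """Energy of `volume_m3` of water raised by `head_m` metres."""
    joules = volume_m3 * rho * G * head_m
    return joules / 3.6e6  # 1 kWh = 3.6 MJ

battery_kwh = 10.0                  # one home battery pack, roughly
per_m3 = kwh_stored(1, 100)         # 1 m^3 at a 100 m head: ~0.27 kWh
print(per_m3)
print(battery_kwh / per_m3)         # ~37 m^3 of water matches the pack
print(kwh_stored(1e6, 100))         # a modest 10^6 m^3 reservoir: ~270 MWh
```

So one home battery is worth a few dozen cubic metres of high water, while even a modest reservoir holds the equivalent of tens of thousands of packs — which is why utilities build dams, not battery walls.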

Comment: Not only Linux (Score 1) 110

by DrYak (#49778385) Attached to: Linux/Moose Worm Targets Routers, Modems, and Embedded Systems

Which raises the question, why is this even news? Is it more Linux/open-source bashing by the commercial OS crowd?

In fact, not all of the listed devices even run Linux. AFAIK, Zyxel uses its own proprietary OS, called ZyNOS (Zyxel Network Operating System).
The fact that they are listed here shows that the worm doesn't rely on a Linux vulnerability.

If Windows Embedded had made any significant inroads as a router OS (haha...), it would probably be among the vulnerable targets too.

Comment: Siri and Apple (Score 1) 65

by DrYak (#49778223) Attached to: Microsoft Bringing Cortana To iOS, Android

I'm surprised they aren't all cross-platform.

Well, Apple is still mostly a company selling hardware. So keeping Siri restricted to iOS makes sense: it's an extra bullet point to sell their hardware (which is where their profit comes from).
Opening Siri to Android would reduce the perceived advantage of iDevices and bring Apple less money.

Microsoft — in the phone arena at least — is currently selling an *OS* and *software*. Keeping Cortana restricted to Windows would have been a selling point for Windows (and thus would have sold more licences to makers of Windows-based phones)... but Windows has completely and utterly failed to attract any significant interest on phones.
iOS and Android are the two big players, and there's little room for a third(*), especially given the network effect. There's no point in keeping Cortana restricted; Windows on phones is a lost cause anyway.

So, having failed there, of course Microsoft moves to the next possibility: there is indeed massive value in mining that information. Siri and Cortana don't really run locally; the apps are just thin clients that record voice commands and send them to a data centre for interpretation.
There's high potential to monetise that, so Microsoft has a strong incentive to push Cortana onto as many devices as possible.

---

(*) a third *incompatible* one, that is — one using yet another different standard for apps.
There are plenty of small niches for different OSes (mainly full-blown Linux) that still retain compatibility with Android apps,
e.g. BlackBerry 10, Sailfish OS, etc.

Comment: Trolls ahead (Score 1) 206

Don't pay attention. It's basically just trolls pretending to be outraged because Mozilla decided to give end users the option of using a third-party binary plug-in to handle DRM decryption in HTML5 video.
(You know, the same way Flash, Java, Silverlight, etc. have always been plug-ins too — except that the DRM module is much more restricted in what it can do, as it runs in a sandbox that only allows it to act as a decryption filter.)

Comment: Actually (Score 1) 175

Well, actually the *Chinese* backdoor is the one embedded in the hardware of the chip that runs LiteOS.
The 9 KB you're looking at is the *Russian* backdoor that they managed to sneak in without anybody noticing.
(The remaining 1 KB was written by a coordinated effort of European spy agencies... hey, not everyone has the resources of the big players; some need to pool together.)

The US, you ask? They're busy introducing a new law that will make eavesdropping access mandatory on all IoT gizmos.

Comment: Standards (Score 1) 77

by DrYak (#49758031) Attached to: New Chrome Extension Uses Sound To Share URLs Between Devices

So, you're saying the problem is that there are currently too many messaging apps, and no agreed upon standard? And the solution to that problem is to create yet another messaging app?

Well, technically there is one agreed-upon standard: XMPP/Jabber.

But besides Google (who helped push it forward back then, but would now rather you forgot they support it) and Facebook (who were more or less forced to slap a gateway onto their proprietary system as an afterthought, and would like to discontinue it and force you to install their app), no other big player uses it.

Still, it's very popular among lots of small-scale services (which usually federate with each other), and also in the corporate world (Cisco, as a random example, provides in-company communication solutions that use Jabber under the hood).

But the current big players in the consumer field (WhatsApp, Skype) follow no such standard.
(And WhatsApp is very active in trying to shut unauthorized clients out.)
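For the curious, the "standard" part of XMPP is literally just XML on the wire: a chat message is a `<message/>` stanza. A minimal sketch, built here with Python's stdlib for clarity (a real client would use a proper XMPP library, plus a stream, authentication, TLS, ...); the addresses are made up:

```python
# Build an XMPP-style chat message stanza with the standard library.
import xml.etree.ElementTree as ET

msg = ET.Element("message", {
    "from": "alice@example.org/home",   # sender JID (made-up address)
    "to": "bob@example.com",            # recipient JID on another server
    "type": "chat",
})
ET.SubElement(msg, "body").text = "Hello over a federated, open standard"

print(ET.tostring(msg, encoding="unicode"))
```

Any federated server can route that stanza to bob's server — that interoperability is exactly what the proprietary apps gave up.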

Comment: Clear code: Cultural background (Score 1) 414

by DrYak (#49754219) Attached to: The Reason For Java's Staying Power: It's Easy To Read

if you took someone who had never read or written code before and showed them 100-line, idiomatic programs in Java, Javascript, Python, Ruby, PHP, Perl, Lisp, Haskell, C, Fortran, COBOL, Basic, and a few other languages, Java would not top the list for readability. My guess is that the winners would be Basic, COBOL, and Python.

It depends. My bet is that it depends entirely on the background of the "someone" you've taken.
- An English speaker? Mostly used to literature and philosophical logic? Yes, maybe in the order you list them.

- A background in mathematics? The order will probably be reversed, with Haskell, C and Fortran near the top — and probably APL topping them all, with the person complaining that most of them still lack support for the Greek alphabet.

Some people are used to seeing things written down in plain text; others are more used to seeing them written with symbols.

Plain text has the advantage of being a little clearer for a person who happens to be fluent in the natural language the programming language was built around (say hello to the dialects of Logo and Excel macros translated into various languages). Otherwise it's completely useless (most of the languages you mention are based around English — useless to non-English speakers; when I was a kid, I started learning to code in BASIC before I knew English).

Symbolic notation has the advantage of being more compact (less typing, quicker to read);
cf. the well-known geeky joke of "ADD 1 TO COBOL GIVING COBOL" vs "C++".

And Perl — let's forget about Perl. It's a write-only language.
The only language in which your cat can write legal code just by walking across the keyboard. :-D

(Disclaimer: I used to code a lot in BASIC as a young kid, started C a bit later, and learned English around that time. I also code regularly in Perl, C++, awk, PHP, 386 assembler, etc. I know bits of R, JavaScript, Python, FORTRAN, did some Logo in French at school as a kid, etc.)
