They had a scale in the first article. You can probably make your own using the tool.
Just in case you were worried about Windows updates, the defective patches are for Office 2007 and Office 2013. From the article:
KB 2817630 is not a security patch, it's a gratuitously delivered functionality patch for Office 2013, and man has it had an impact on functionality. I've seen dozens of reports that installing this patch, possibly in conjunction with the KB 2810009 patch that is part of MS13-074, causes the folder pane in Outlook 2013 to disappear. An anonymous poster on the SANS Internet Storm Center offers this picture of the effect.
KB 2760411, KB 2760588, and KB 2760583 are parts of the MS13-072 and MS13-073 security patches for Office 2007. There are many reports of the patches being offered and re-offered and re-re-
... you get the idea
Thank you very much for the expert insight!
Interesting! I didn't know the safety limit was so low. As I recall, the big danger of standing in front of a microwave oven is supposed to be cornea damage, so now I'm wondering about risks to eyesight from charging the phone while you're talking on it.
I dug up what looks to be the main patent for the technology from 2008:
The microwave energy is focused onto a device to be charged by a power transmitter having one or more adaptively-phased microwave array emitters. Rectennas within the device to be charged receive and rectify the microwave energy and use it for battery charging and/or for primary power. A communications channel is opened between the wireless power source and the device to be charged. The device to be charged reports to the power source via the channel a received beam signal strength at the rectennas. This information is used by the system to adjust the transmitting phases of the microwave array emitters until a maximum microwave energy is reported by the device to be charged. Backscatter is minimized by physically configuring the microwave array emitters in a substantially non-uniform, non-coplanar manner.
I don't know enough about antennas and E&M to evaluate that. Any help here? According to the articles, it gets ~10% efficiency at 10 feet and receives (?) 1 watt at 30 feet.
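For what it's worth, the feedback loop the patent describes (receiver reports signal strength, transmitter tweaks emitter phases until the reported strength peaks) can be sketched as a toy coordinate search. Everything here is made up for illustration: 8 emitters instead of 200, unknown path delays modeled as random phases, and 16 candidate phase settings per emitter.

```python
import cmath
import random

random.seed(42)

N = 8  # emitters; the real demo unit supposedly has 200+
# Unknown per-emitter propagation phases: the thing the system must compensate for.
path_phases = [random.uniform(0, 2 * cmath.pi) for _ in range(N)]
tx_phases = [0.0] * N  # the transmitter's adjustable phase offsets

def received_amplitude(tx):
    """Magnitude of the coherent sum of all emitter fields at the receiver."""
    return abs(sum(cmath.exp(1j * (t + p)) for t, p in zip(tx, path_phases)))

# Per-emitter search: try a handful of candidate phases, keep whichever one the
# "receiver" reports as strongest. This mimics "adjust the transmitting phases
# until a maximum energy is reported by the device to be charged."
candidates = [k * 2 * cmath.pi / 16 for k in range(16)]
for i in range(N):
    tx_phases[i] = max(
        candidates,
        key=lambda c: received_amplitude(tx_phases[:i] + [c] + tx_phases[i + 1:]),
    )

# With the phases aligned, the sum approaches the fully coherent maximum of N.
print(received_amplitude(tx_phases))
```

This only shows why per-emitter feedback can converge without the transmitter knowing the geometry; it says nothing about the efficiency or safety claims.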
On to the possible crank warning signs:
* According to his LinkedIn profile, he's spent his whole career being a CEO and/or (later) doing software testing at Microsoft.
* He's identified as a physicist, but all he has to show for it is a bachelor's in physics from the University of Manchester (where he also "studied
* Twenty years after he gets his degree, having done nothing but software, he's suddenly producing miraculous hardware based on cutting-edge physics?
* Charger is hidden behind a curtain during a demo.
* Charger is six feet tall, but they're going to consumerize it to the size of a desktop PC in two years, when it will cost ~$100.
* Replacing all their off-the-shelf hardware with custom-built optimized hardware? No problem!
* Current fridge-sized charger has 200 transmitters, but when consumerized will have "20,000 transmitters in an 18-inch cube".
* The only public demo makes an iPhone declare itself to be charging. No electrical test equipment or data shown. No real evidence that it does anything.
* Claims the power goes through walls just like Wi-Fi, even though Wi-Fi signal strength can drop by orders of magnitude when it goes through walls.
* Charger only gets 10% efficiency from 10 feet away in open air, but this is never mentioned as an obstacle. Come to think of it, no technical obstacles are mentioned at all.
“In wave theory and electromagnetic systems, you don’t get linearities everywhere,” he added, describing the science behind Cota. “There are situations where double could mean far more, like double could mean square, or 3 plus 3 apples could result in a net total of 9 apples, so to speak. When you move from the linear version to the power version, things happen that were quite surprising.”
I don't know, maybe I'm being too hard on the guy. Maybe he's been doing physics and electronics as hobbies all this time, actually did come up with a workable idea, and used his management experience to drive the development of a real product. Maybe they really will have a commercialized version ready in a couple months and I'll have to eat crow. I just can't help but feel skeptical of people who announce their world-changing new product before it actually is a product.
Why the fuck is he talking about "image quality"? Until we get 4K displays, the quality differences are non-existent.
Resolution is far from the only thing that matters for image quality. Contrast, black levels, ghosting, viewing angle, color reproduction, and even input lag (for lip sync) can make a big difference. For an extreme example, compare LCD vs. plasma at the same resolution.
Try InfiniBand, maybe? Those prices make Monster Cable look cheap, though...
Hmm... okay, so 2160p with 48-bit color at 60fps would be ~24 Gb/sec. Double that for 120fps and you're well into DDR2 and multi-lane PCIe territory.
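The back-of-the-envelope math is just pixels × bits per pixel × frames per second; a tiny sketch (48 bpp = 16 bits per channel, as in the comment above):

```python
# Raw (uncompressed) video bandwidth in gigabits per second.
def raw_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

print(raw_gbps(3840, 2160, 48, 60))   # ~23.9 Gb/s
print(raw_gbps(3840, 2160, 48, 120))  # ~47.8 Gb/s
```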
Interesting! Thanks for the details.
Yes, but I'd like it to do 3,840 x 2,160 resolution video at 120 or 240fps.
You realize that's 24 gigabits/second *minimum* just for 4K 120fps raw video, right? (With 4K's better color, it might be 32 Gbps, I'm not sure.) That is not a trivial challenge.
or just 2160p as it should be called
Movies come in different aspect ratios. At 1.78:1 you get the full 1080p or 2160p, but at the also-popular 2.35:1 a "1080p" film only fills ~817 lines, and "720p" likewise becomes ~544. Vertical line counts aren't really helpful for comparison, since that 817-line film isn't lower resolution than 1080p content. Only the horizontal resolution stays constant, so it actually makes sense to use it. Quoting vertical resolution is a holdover from analog TV, where scan lines were the only discrete dimension; horizontal resolution was continuous.
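The active-line numbers fall straight out of dividing the frame width by the film's aspect ratio:

```python
# Vertical lines actually used when a film of the given aspect ratio is
# letterboxed into a frame of the given width.
def active_lines(width, aspect_ratio):
    return width / aspect_ratio

print(active_lines(1920, 2.35))  # ~817
print(active_lines(1280, 2.35))  # ~544.7
```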
(I'm sure the marketing folks were salivating over it anyway.)
Also, while I haven't watched your hour-long video (summary?), I'm not sure why anyone would target 4096 pixels wide, which would make upscaling existing HD very painful. Doubling the resolution is much simpler, and I very much doubt that 4K was ever a spec as opposed to a marketing term.
Have you seen the price of gold recently?
We're talking microns of gold plating on the surface of another metal. If you're paying more than a few dollars extra for that, it's not the gold that's driving up the price.
That being said, I agree that digital signals and error correction along with electrical and mechanical standards make cable quality almost irrelevant.
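The microns-of-gold point is easy to back-of-envelope. All the numbers below are assumptions (plated area, thickness, and a rough 2013 spot price), but even generous ones put the gold itself at pocket change:

```python
# Rough cost of the gold on a gold-plated connector. All inputs are assumed.
AREA_CM2 = 2.0         # plated contact area (generous)
THICKNESS_UM = 1.0     # typical flash plating is well under a micron
DENSITY_G_CM3 = 19.3   # density of gold
PRICE_PER_GRAM = 45.0  # USD, roughly the 2013 spot price

volume_cm3 = AREA_CM2 * (THICKNESS_UM * 1e-4)  # 1 um = 1e-4 cm
grams = volume_cm3 * DENSITY_G_CM3
cost_usd = grams * PRICE_PER_GRAM
print(cost_usd)  # well under a dollar
```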
Every jurisdiction effectively picks and chooses which laws it's going to enforce and when. It's called "prioritizing". And sure enough, that's what the feds are doing:
The memo directs federal prosecutors to focus their resources on eight specific areas of enforcement, rather than targeting individual marijuana users, which even President Obama has acknowledged is not the best use of federal manpower.
The moral and legal value of prioritization is in the results (i.e. who gets targeted and who gets ignored), not the act itself.
Both caused a similar amount of destruction, but Katrina was only SS Cat 3 at landfall, whereas Andrew was SS Cat 5.
Hurricane Ike produced a similar situation a few years ago. It hit Texas as a very large Category 2, causing far more damage than one would expect from the wind speed.
Accuracy measures how close the frequency is to the target, on average. Stability measures how the frequency drifts over time (and temperature, etc.). Accuracy is more of an absolute measurement while stability is more of a relative measurement. From the article:
The ticks of any atomic clock must be averaged for some period to provide the best results. One key benefit of the very high stability of the ytterbium clocks is that precise results can be achieved very quickly. For example, the current U.S. civilian time standard, the NIST-F1 cesium fountain clock, must be averaged for about 400,000 seconds (about five days) to achieve its best performance. The new ytterbium clocks achieve that same result in about one second of averaging time.
[U.S. civilian standard cesium reference clock] NIST-F1's performance is described in terms of accuracy, which refers to how closely the clock realizes the cesium atom's known frequency, or natural vibration rate. Accuracy is crucial for time measurements that must be traced to a primary standard. NIST scientists plan to measure the accuracy of the ytterbium clocks in the near future, and the accuracy of other high performance optical atomic clocks is under study at NIST and JILA.
So it sounds like accuracy is defined in terms of how well the clock reproduces the ideal frequency of the physical process it's based on. Hopefully there's a physicist or two around who can give us the exciting details.
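The 400,000-second vs. 1-second comparison falls out of a simple averaging model: for white frequency noise, instability shrinks like 1/sqrt(tau), so reaching a given precision takes averaging time proportional to (sigma_1 / target)^2. The numbers below are hypothetical, chosen only to reproduce the article's ratio (the Yb clock's 1-second stability would need to be sqrt(400,000) ≈ 630x better than the cesium fountain's):

```python
import math

# For white frequency noise, sigma(tau) = sigma_1 / sqrt(tau), where sigma_1
# is the fractional-frequency instability after 1 s of averaging.
# Solving sigma(tau) = target for tau gives the required averaging time.
def averaging_time_needed(sigma_1, target):
    return (sigma_1 / target) ** 2

target = 1e-16                              # desired fractional precision (assumed)
yb_sigma_1 = 1e-16                          # hypothetical Yb 1-s stability
cs_sigma_1 = yb_sigma_1 * math.sqrt(400_000)  # implied cesium-fountain 1-s stability

print(averaging_time_needed(yb_sigma_1, target))  # 1 second
print(averaging_time_needed(cs_sigma_1, target))  # ~400,000 seconds
```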
A bug in the code is worth two in the documentation.