It is coming, finally. In 2010, 0.1% of the connections to Google's services were native IPv6, and about the same fraction used 6to4. Now about 6% of connections are native IPv6, while 6to4 is almost completely gone. 6% is enough that it's actually starting to matter. The fraction currently seems to be growing by 2.5 percentage points per year, though it might still be accelerating. So perhaps we will finally be free from the curse of NAT in a few more decades.
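A quick linear extrapolation of those numbers (assuming the growth rate stays at 2.5 percentage points per year, which is the pessimistic case if it keeps accelerating):

```python
# Rough linear extrapolation of native IPv6 adoption, assuming the
# ~2.5 percentage-points-per-year growth rate simply holds steady.
current_share = 6.0   # percent of connections to Google over native IPv6
growth_rate = 2.5     # percentage points per year

years_to_full = (100 - current_share) / growth_rate
print(f"~{years_to_full:.0f} years to 100% at the current rate")
```

That comes out to roughly four decades, which is where the "few more decades" above comes from.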
At least you *can* turn off all labels in Bing maps. The new version of Google maps doesn't let you do that. I enjoy looking at satellite images without labels, and that's the main reason why I've stuck with the old version of Google maps until now. Perhaps I'll switch to Bing instead.
Current international limits are 50 W/m^2. The sun at noon is 1000 W/m^2, so by that standard the rectenna is going to be very large indeed.
50 W/m^2 is absurd! One of the biggest problems with solar power is how much space it takes. Restricting yourself to 50 W/m^2 means that, all things being equal (which they wouldn't be, but still), you would be doing 20 times worse than normal solar power. For there to be any point in solar power satellites, the flux in the beam must be much higher than the Sun's flux.
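To put that factor of 20 in terms of land area, here's a rough calculation for a hypothetical 1 GW of delivered flux (the 1 GW figure is just for scale, and conversion losses are ignored):

```python
# Land area needed to receive 1 GW of flux under a 50 W/m^2 beam
# limit, versus noon sunlight at 1000 W/m^2.
beam_limit = 50     # W/m^2, the international exposure limit
solar_noon = 1000   # W/m^2, full sunlight at noon
power = 1e9         # W, hypothetical delivered flux

area_beam = power / beam_limit   # 2.0e7 m^2 = 20 km^2
area_sun = power / solar_noon    # 1.0e6 m^2 = 1 km^2
print(area_beam / area_sun)      # the factor of 20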
Are you sure 50 W/m^2 isn't just for some pathfinder experiments? It seems silly to have the same limits for radiation inside a power beam as everywhere else. It's a bit like having the same air quality standard inside a fireplace as in a city.
>Or rectennas. You recall that SPSS's have a downlink portion, right?
The necessary size of the rectenna is set by the size of the microwave beam as it hits the earth, isn't it? Wouldn't that make its size not grow with the size of the array of solar panels in space? In fact, if all the sending antennas work as a single phased array, wouldn't you expect the beam to become smaller as you make the space array bigger?
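Yes, that's what diffraction predicts. A sketch of the order-of-magnitude spot size, assuming a 2.45 GHz beam (a commonly proposed SPS frequency), transmission from GEO, and that the whole space array acts as one phased aperture of diameter D:

```python
# Diffraction-limited ground spot from GEO: spot ~ wavelength * d / D.
c = 3e8              # m/s, speed of light
f = 2.45e9           # Hz, assumed beam frequency
wavelength = c / f   # ~0.12 m
d = 3.6e7            # m, GEO altitude

for D in (500, 1000, 2000):      # aperture diameter in metres
    spot = wavelength * d / D    # ground spot diameter in metres
    print(f"D = {D:4d} m -> spot ~ {spot / 1000:.1f} km")
```

So doubling the diameter of the space array halves the beam spot on the ground, exactly as you'd expect: a bigger phased array means a smaller rectenna, not a bigger one.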
I would also point out that selling the content in other territories around the world has been an important source of revenue for the BBC for many decades.
I just checked this, and I'm surprised by how much money they get from this: One quarter of their income is from commercial BBC Worldwide sales.
Without it the license fee would have to be much higher to support the content that is produced.
I wouldn't say "much higher". It would be 36% higher. Definitely noticeable, but not dramatic. Or they would have to produce or buy somewhat less expensive programs. Still, it's much higher than the handful of percent that I had imagined.
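For anyone checking the arithmetic: if commercial sales are a fraction of total income, the rest of the income has to grow by commercial / (1 - commercial) to replace them. With a round one quarter that gives 33%; the 36% presumably comes from the exact accounts rather than the rounded quarter.

```python
# Required license-fee rise if commercial income disappeared, assuming
# commercial sales are a round quarter of total income.
commercial_share = 0.25
rise = commercial_share / (1 - commercial_share)
print(f"{rise:.0%}")   # 33% with the rounded figure
```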
>If I pay a license fee to have BBC content, then I don't want others receiving it for free.
Why not? That sounds pretty petty. It's already been paid for, and others viewing it doesn't take away its value. If I paid to have something produced, I would want as many people as possible to enjoy it. It's people enjoying a service that makes it worth paying for, and the more people watch the BBC, the more worthwhile paying for it becomes. If something you pay for has a large international audience, then that's all the better in my opinion.
After the tax-payers have already paid for a program, it doesn't hurt them if that program also benefits the rest of the world. In fact, if I paid for some television program I would want it to reach as large an audience as possible.
The problem is with all the stuff the BBC doesn't produce itself, but instead licenses from others. Those license agreements are usually much more restrictive than they were when television was simply broadcast to whoever could pick it up. Those radio waves didn't care about national borders, but current licence contracts do.
Hopefully multiple broadcasters in Europe will be able to share the costs of a licence for broadcasting across Europe (or ideally the whole world), so that the total cost for each broadcaster doesn't actually get any higher. Or of course one could just pass legislation specifying what the cost should be, though that probably wouldn't be free market enough for the EU.
From the first post of that thread:
The video linked below was made by a Russian Model S owner. He was traveling to Barnaul, an industrial city in the Altai Mountains in Siberia, and found himself with 70 miles of range left, but 90 miles away from the destination (and presumably charging facilities).
The owner negotiated with a trucker to tow him for 20km in order to get some additional range via regeneration
As shown in the video he "charged" at near 60kW - a rate which, as the owner notes in the video, is 20 times faster than charging from a 16A, 220V outlet.
Here is the actual video.
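The "20 times faster" claim roughly checks out, by the way:

```python
# Regeneration at ~60 kW while being towed, versus charging from a
# 16 A, 220 V outlet (the comparison the owner makes in the video).
outlet_kw = 16 * 220 / 1000   # 3.52 kW from the wall outlet
regen_kw = 60                 # observed regeneration power

print(regen_kw / outlet_kw)   # ~17x, close enough to "20 times"
```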
Replying to undo incorrect mod. But your first sentence doesn't make much sense.
Weren't "my baby dancing to $somesong" videos fair use though? If the baby only dances to music, and the dancing can't be fully appreciated without the music as context, then that should be a good fair use case as far as I understand it. Though I guess some companies consider fair use to be a problem in itself.
I've had this happen to one or two of my own videos of competitive videogame speedrunning. You can't show that without showing parts of the game, and the videos certainly don't act as competition to the game company itself (rather the opposite), so those should also be fair use.
But to have Google remove those ads, one needs to go through a scary form where one basically declares oneself to be ready for a full DMCA process and a potential court battle over the fair use. Before doing so, I should at least consult a lawyer, but finding and consulting a lawyer is a bit much for a simple youtube video, so almost nobody does this. Overall, the result is that fair use doesn't really exist on youtube.
Youtubers with popular videos get an offer from Google to "monetize" them. If they accept, Google inserts ads into their videos and pays them a share of what it earns from advertisers. If they don't, there will normally be no ads in their videos (though there may be ads elsewhere on the page, I guess - I use adblocking, so I'm not sure).
So a standard youtube video is non-commercial, since the person who created and/or uploaded the video gets no compensation for doing so. A subset of "monetized" videos are commercial because of the income from the inserted ads. I can see the FAA being against the latter kind of youtube drone video, but not the former kind.
Wow, that's a bit underwhelming. That works out to an average of 5 W/m^2, doesn't it? Where is all the energy lost? Starting from 1000 W/m^2, which already takes atmospheric losses into account, we divide by 3 for the day/night cycle, by 2 for bad weather, and by 10 for solar cell efficiency (10% seems to be mid-range commercial cell efficiency), and we still get 16 W/m^2. That's pretty bad, but it's still 3 times better than Topaz.
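The chain of factors above, spelled out:

```python
# Back-of-envelope average power density for a solar farm: start from
# noon insolation and apply the day/night, weather and efficiency factors.
insolation = 1000    # W/m^2 at noon, after atmospheric losses
day_night = 1 / 3    # fraction of noon output averaged over the day
weather = 1 / 2      # penalty for clouds and bad weather
efficiency = 0.10    # mid-range commercial cell efficiency

expected = insolation * day_night * weather * efficiency
print(expected)      # ~16.7 W/m^2, versus ~5 W/m^2 for Topaz
```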
The number I used for solar previously corresponds to an efficiency of 33% with perfect weather at the equator, which is doable with research cells, I think. But I'm far from an expert on this, as you can see.
Do you know why the topaz solar farm gets such poor efficiency?
Yes, solar power doesn't have the highest power density of our power sources. 1000 W/m^2 isn't that bad, though, if one could actually sustain it. For comparison, the world's highest-output coal power plant at Taichung has a surface area of 1.3e6 m^2 and an installed capacity of 5.5 GW, which gives 4300 W/m^2.
But the coal needs to be mined too. Taichung uses about 15 million tonnes of coal per year. Assuming the Haerwusu coal mine is representative, and that it can pump out coal forever, then with its yearly production of 20 million tonnes it can supply 1.3 Taichungs continuously. So the effective surface area of Taichung's power production is 67 square km / 1.3 = 50 square km. That gives Taichung a power density of 110 W/m^2. Huh, that's surprisingly low. If one had installed solar panels over that area instead (though of course, solar panels aren't free), then even with a yearly average efficiency factor of 10% they would rival Taichung.
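Here's that calculation in one place, using the Taichung and Haerwusu figures from above:

```python
# Power density of the Taichung coal plant: plant site alone, then
# including the share of mine area needed to keep it fed.
capacity_w = 5.5e9    # W, installed capacity
plant_area = 1.3e6    # m^2, plant site

coal_use = 15e6       # tonnes/year burned at Taichung
mine_output = 20e6    # tonnes/year from Haerwusu
mine_area = 67e6      # m^2, Haerwusu mine area

plant_only = capacity_w / plant_area                   # ~4200 W/m^2
effective_area = mine_area * (coal_use / mine_output)  # ~50 km^2
with_mining = capacity_w / effective_area              # ~110 W/m^2
print(plant_only, with_mining)
```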
That's just to say that currently popular power sources can be quite area-demanding too.
I just looked through Nerval's Lobster's last 15 contributions. All were article submissions; Nerval's Lobster doesn't appear to comment on anything. Here's the list:
- Was Linus Torvalds Right About C++ Being So Wrong? [Dice]
- Do Tech Companies Ask For Way Too Much From Job Candidates? [Dice] [Hiring]
- 'Chappie': What It Takes to Render a Robot [Dice]
- Demand for Linux Skills Rising This Year [Dice] [Hiring]
- Who's Afraid of Android Fragmentation? [Dice]
- H-1B Visas Proving Lucrative for Engineers, Dev Leads [Dice*] [Hiring]
- In Space, a Laptop Doubles As a VR Headset [Dice*]
- What Does It Mean to Be a Data Scientist? [Dice]
- Which Freelance Developer Sites Are Worth Your Time? [Dice*] [Hiring]
- Building a Good Engineering Team In a Competitive Market [Dice*] [Hiring]
- What Makes a Great Software Developer? [Dice*] [Hiring]
- The Highest-Paying States for Technology Professionals [Dice] [Hiring]
- What Will Google Glass 2.0 Need to Actually Succeed? [Dice*]
Every single one of them is from Dice, though only a few actually make that explicit (the non-explicit ones are marked [Dice*]). A large fraction of them are related to human resources and hiring people, which I've marked [Hiring]. So it's like Nerval's Lobster is using Slashdot as an advertising and recruitment channel for Dice.
The average quality of these submissions was very low in my opinion - lots of vacuous pointy-haired-boss buzzword stuff. Very un-nerdy. How did these get through submission moderation? Were they even subjected to it?
Here's how you arrive at that number: 100 billion neurons (correct), each firing at 200 Hz (a big overestimate; all the neurons are never firing at their maximum rate, and a few Hz would be more typical), each sending signals to 1000 other neurons (an underestimate, I think; the average number of synapses per neuron is about 7000). You then multiply those numbers together and call the product the total number of calculations. That's how you get 20 million billion.
Let's do the same for a CPU. A modern Intel i7 has 1.4 billion transistors, each cycling at about 4 GHz. Say each transistor is connected to three others. So we get a total of 20 billion billion "calculations" per second.
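Both headline numbers, reproduced from the figures quoted above:

```python
# The brain number: neurons x firing rate x connections per neuron.
neurons = 100e9
firing_hz = 200
connections = 1000
brain = neurons * firing_hz * connections
print(f"{brain:.0e}")   # 2e+16 = 20 million billion

# The same bogus multiplication for a CPU.
transistors = 1.4e9
clock_hz = 4e9
fanout = 3
cpu = transistors * clock_hz * fanout
print(f"{cpu:.0e}")     # 2e+19 = roughly 20 billion billion
```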
Wow, that's a lot! But it's also nonsense. A single transistor sending a signal to another transistor isn't a useful calculation, and a single neuron firing at another neuron isn't a useful calculation either. For one thing, each neuron fires based on the total firing rate it receives, and a series of pulses is needed to stimulate it to fire itself. For another, lots of neurons firing together are needed to achieve even very basic things. The "20 million billion" number for the brain is probably overestimated by at least as much as the "20 billion billion" number for the CPU.
So why is the brain's output so much more impressive than a CPU's output? Probably for the same reason that a 1 MHz computer running quicksort performs better than a 1 GHz computer running bogosort. Algorithms matter.
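The quicksort/bogosort analogy, with rough expected operation counts (n * log2(n) comparisons for quicksort on average, and about n! shuffles in expectation for bogosort):

```python
import math

# Sorting just 15 items on wildly mismatched hardware.
n = 15
quicksort_ops = n * math.log2(n)   # ~59 comparisons
bogosort_ops = math.factorial(n)   # ~1.3e12 expected shuffles

slow_cpu_hz = 1e6   # 1 MHz machine running quicksort
fast_cpu_hz = 1e9   # 1 GHz machine running bogosort
print(quicksort_ops / slow_cpu_hz)   # well under a millisecond
print(bogosort_ops / fast_cpu_hz)    # over 20 minutes
```

The 1 MHz machine finishes in a fraction of a millisecond while the 1 GHz machine grinds for over 20 minutes, and the gap only explodes further as n grows. Algorithms matter.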