Falcon Heavy will have significantly smaller payload capacity than NASA's SLS. Perhaps you mean Falcon X (or maybe they call it Falcon X Heavy; hard to keep the names straight), which is planned to have similar capacity to SLS. NASA is way ahead of SpaceX in development, but it has all the baggage of being NASA, so we'll see who actually fields a vehicle with that capability first. Don't bet against Musk.
What an absolutely lame response. Those systems exist only as experiments, not as systems that are actually safe to drive on the streets.
Pumping a liquid around at a constant elevation doesn't have a very high energy cost. I imagine you'd lose far more energy through the round-trip chemical reactions than you would through pumping the liquid around.
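To put rough numbers on that intuition, here's a back-of-envelope sketch for a flow-battery-style system. Every figure in it (flow rate, friction head, pump efficiency, round-trip efficiency, output power) is an illustrative assumption, not data from any real installation:

```python
# Back-of-envelope: pumping losses vs. round-trip chemical losses
# for a hypothetical flow-battery-style system. All numbers are
# illustrative assumptions, not measured values.

RHO = 1000.0         # kg/m^3, density of a water-like electrolyte
G = 9.81             # m/s^2

flow_rate = 0.01     # m^3/s of electrolyte circulated (assumed)
friction_head = 2.0  # m of equivalent head lost to pipe friction (assumed)
pump_eff = 0.7       # pump efficiency (assumed)

# At constant elevation, the pump only fights pipe friction:
pump_power = RHO * G * flow_rate * friction_head / pump_eff  # watts

power_delivered = 50_000.0  # W electrical output while discharging (assumed)
round_trip_eff = 0.75       # chemical round-trip efficiency (assumed)
chem_loss = power_delivered * (1 / round_trip_eff - 1)  # W lost to chemistry

print(f"pumping loss:  {pump_power:8.0f} W")
print(f"chemical loss: {chem_loss:8.0f} W")
```

Under these assumptions the pump burns a couple hundred watts while the chemistry throws away tens of kilowatts, i.e. pumping is a rounding error next to the round-trip losses.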
Storage density is only a problem for portable systems. For fixed storage installations, the important question is "what does it cost per kilowatt-hour of storage?" Inefficient storage that is cheap can beat highly efficient storage that is expensive.
Of course, to correctly calculate costs one needs to include things that are the result of storage density, like land acquisition and construction of holding tanks. But if the storage medium is cheap, it could come out ahead of some higher density system that has a more expensive storage medium.
Even conversion losses become less of an issue if the storage is cheap enough.
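To make that concrete, here's a crude levelized-cost comparison. The capital costs, cycle counts, round-trip efficiencies, and input electricity price are all invented for illustration:

```python
# Illustrative comparison: cheap-but-inefficient storage vs.
# expensive-but-efficient storage. All figures are assumptions.

def cost_per_delivered_kwh(capital_per_kwh, cycles, round_trip_eff,
                           input_cost_per_kwh=0.05):
    """Rough levelized cost of each kWh delivered from storage."""
    # Capital spread over every kWh the system ever delivers:
    capital = capital_per_kwh / (cycles * round_trip_eff)
    # Input electricity bought, inflated by round-trip losses:
    energy = input_cost_per_kwh / round_trip_eff
    return capital + energy

cheap = cost_per_delivered_kwh(capital_per_kwh=50, cycles=3000,
                               round_trip_eff=0.60)
fancy = cost_per_delivered_kwh(capital_per_kwh=300, cycles=3000,
                               round_trip_eff=0.90)

print(f"cheap/inefficient:   ${cheap:.3f} per delivered kWh")
print(f"expensive/efficient: ${fancy:.3f} per delivered kWh")
```

With these made-up numbers the cheap 60%-efficient system delivers energy at roughly two-thirds the cost of the expensive 90%-efficient one, despite wasting far more of what goes in.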
In an emergency brake, you [i]do[/i] slam on the brakes 100%. And communication doesn't solve it, for two reasons. First, communication can fail. There's no such thing as a 100% failsafe wireless link, so you should not be driving in a configuration that requires perfect communication to be safe. Second, the system you describe requires that the lead car(s) not brake as hard as possible. So you could easily end up with a situation in which the lead car could have stopped in time to avoid a fatality, but couldn't because of the choice to operate in a train. That's a decidedly idiotic result.
Furthermore, braking isn't the only potential scenario. If the lead car swerves, and the car behind is two feet behind, it may not have enough time to swerve. And, similarly to braking, wireless communication cannot be relied on as a solution to that scenario.
The situation you describe is simply unsafe. No amount of futurist handwaving will change that.
This is a nonsense dream of self-driving car aficionados. It will always be dangerous to drive that close, even if the computer is doing the driving. Different cars have different stopping distances; even two cars of the same make/model/year will vary simply because of variable tire/brake wear. In an emergency stop situation, the "couple feet" distance between cars is simply unsafe.
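A tiny simulation makes the point: two cars at 70 mph, two feet apart, both slamming on the brakes at the same instant, with the follower's braking assumed slightly weaker (8 vs. 9 m/s²) to stand in for tire/brake wear. The braking figures are assumptions, and zero reaction/communication delay is deliberately granted as the platoon's best case:

```python
# Two cars brake simultaneously from 70 mph with a 2 ft gap.
# Only difference: the follower brakes slightly worse (assumed
# 8 vs. 9 m/s^2, standing in for tire/brake wear).

MPS_PER_MPH = 0.44704
FT_TO_M = 0.3048

v0 = 70 * MPS_PER_MPH   # both start at 70 mph
gap = 2 * FT_TO_M       # 2-foot following distance
a_lead = 9.0            # m/s^2 lead car braking (assumed, ~0.9 g)
a_follow = 8.0          # m/s^2 follower braking (assumed, slightly worn)

dt = 0.001
x_lead, x_follow = gap, 0.0
v_lead, v_follow = v0, v0
impact_speed = None     # closing speed at contact, in mph
while v_lead > 0 or v_follow > 0:
    v_lead = max(0.0, v_lead - a_lead * dt)
    v_follow = max(0.0, v_follow - a_follow * dt)
    x_lead += v_lead * dt
    x_follow += v_follow * dt
    if x_follow >= x_lead:
        impact_speed = (v_follow - v_lead) / MPS_PER_MPH
        break

if impact_speed is not None:
    print(f"collision, closing speed {impact_speed:.1f} mph")
else:
    print("no collision")
```

Even with zero latency and only a ~10% braking difference, contact is unavoidable (here at a low closing speed); add communication delay, a larger wear gap, or a lead car that hits an obstacle, and it gets much worse.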
Assuming we're talking about the 2006-2013 C6 Corvette, it has a drag coefficient of 0.28. Lots of production cars are in that range. Hell, the new CLA is 0.23. The Prius is 0.26. The Corvette's drag coefficient is respectable, but hardly the best of any production car.
Frankly, I'm suspicious of the claim. Note that he was comparing an actual mpg measurement to the rated measurement. Those are not always apples-to-apples. Or the rating speed could happen to be an inefficient speed (given the Corvette's gear ratios). In other words, there will be a speed around 55 mph where a Corvette gets better mileage than it does at 75 mph -- it just might not be exactly 55 mph (or whatever speed they use for "highway") where that speed occurs.
LOL... No. That doesn't happen. You can stand in the sun all day in 25C temps. You might get a bad sunburn. You might get dehydrated. But your brain doesn't heat up and fry in your head.
Hell, where I grew up it was regularly 45C+ during the day. You had to worry about staying hydrated, and about getting sunburned if you didn't have a good tan (or use sunblock), but otherwise you could be out in the sun for hours without your brain overheating and killing you.
The point is that more people aren't getting these maladies. At least not in any way that would correlate with the dramatic increase in exposure to cell phone and WiFi signals.
But I do agree that we can almost certainly show that signals of these types do show up as at least minor changes in cell activity. But is that leading to brain cancer? If it is, then where's the brain cancer spike we should see related to the spike in exposure to these signals?
Depends on the use case and exactly what one means by "replacing". Within six months of the iPad's release, none of the senior execs at my company carried their laptop out of the office anymore. They still have laptops, though. So Dell still gets to sell them a new laptop every few years. But the requirements of that laptop have declined. It no longer needs a DVD drive to play movies on long flights. They no longer ask for the most cutting-edge thin/light model laptop, since they rarely carry it around.
Personally, though, I find that the tablet is a personal accessory, not a device to do real work on. I use my tablet for reading, light web surfing, games, movies. I still need a keyboard and mouse/trackpad to really do work (anything more than reading email and making short replies just doesn't work on a tablet for me). Even if I really need to do some research on the web (like car shopping) where I want to be able to have lots of pages open and shift between them quickly, I do that on my laptop.
I would guess, therefore, that tablets don't crowd out laptops very much, but they might change what laptop people buy, and maybe even how often they replace them. Maybe you keep your existing laptop longer. Maybe you don't buy the thinnest/lightest new laptop, but instead buy the slightly bulkier, less expensive model. So I think it does affect laptop manufacturers, but it is unlikely to show up as a lot of users who once owned laptops but now do not.
For the record, the sun's heating and radio wave heating would work differently. The sun heats the surface. The sun wouldn't do a particularly good job of heating the brain. The scalp would heat up, but then blood does a pretty good job of distributing that heat around, and the skull would be a decent insulator. Radio waves would penetrate into the brain and heat it directly.
Furthermore, there is at least one study showing that glucose metabolism in the brain increases in the presence of cell phone radiation.
Having said all of that, there's pretty much no way that either cell phones or WiFi are causing brain cancer. We've been engaged in a natural experiment of the effect of these forms of radiation. Both WiFi and cell phone usage have gone from "doesn't exist" to "ubiquitous" in the course of the last couple decades. We're not seeing an increase in any cancer rate that would show a correlation (let alone causation) with the rather dramatic increase in exposure to such radiation.
These parents want someone/something to blame for their child's death. It's very much that simple.
Self-discharge on a 60kWh battery shouldn't be more than, say, 10 Watts, I believe.
Tesla batteries shouldn't be self-discharging faster than, say, 10% per month. That's about 0.3% per day. Plus, I think this guy had a 60kWh car, so your 1.3% is too low.
A car battery holds about 1 kWh of energy, so this kind of draw would drain a car battery in one day. You could probably leave a car parked for a month or so without worrying about the battery, so figure the Tesla is using power about 30x faster than a normal car. That further implies a normal car draws about 1.5 Watts (which sounds about right for a computer running in low-power mode and occasionally checking for things like a nearby key fob for keyless entry).
Of course, you'd expect a 60kWh lithium-ion battery to self-discharge at a rate of about 5-10 Watts. Add the 1.5 Watts the car's computer can be expected to use, and the Tesla should be drawing about 6.5-11.5 Watts when parked. I can't tell you where the other 33.5-38.5 Watts is going.
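Putting the thread's numbers together (the ~45 W parked draw is inferred from the discussion; the self-discharge range and electronics draw are the assumptions already stated):

```python
# Reconstructing the parked-drain arithmetic. The ~45 W observed
# draw is inferred from the discussion; the rest are assumptions.

observed_draw = 45.0   # W, the Tesla's reported parked consumption

# A 12 V lead-acid starter battery holds roughly 1 kWh:
starter_kwh = 1.0
hours_to_drain_starter = starter_kwh * 1000 / observed_draw  # ~22 h, i.e. a day

# Expected draw: 5-10 W pack self-discharge (assumed range)
# plus ~1.5 W of always-on electronics (assumed):
self_discharge = (5.0, 10.0)
electronics = 1.5
expected = tuple(sd + electronics for sd in self_discharge)          # (6.5, 11.5)
unexplained = tuple(observed_draw - e for e in reversed(expected))   # (33.5, 38.5)

print(f"drains a starter battery in ~{hours_to_drain_starter:.0f} h")
print(f"expected parked draw: {expected[0]}-{expected[1]} W")
print(f"unexplained:          {unexplained[0]}-{unexplained[1]} W")
```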
'Netflix might say, "I'll pay in order to make sure that my subscriber might receive the best possible transmission of this movie."'
Isn't that exactly what net neutrality people are worried about? Because it's hardly a big jump from that to "pay us or your subscriber will get the worst possible transmission of a movie".
My position has always been "I am the ISP's customer. I am not the thing they sell to Netflix." If it's more expensive for the ISP to deliver me video than email, that should be a negotiation between my ISP and me. It shouldn't be a negotiation between my ISP and Netflix that I end up paying for anyway. Or even worse, that negotiation goes bad, and Netflix just sucks for me with no way for me to improve it... and my ISP tells me "but Hulu works fine... you should just switch to Hulu... trust us."