Comment How can openness lead to closeness? (Score 3, Insightful) 250

Because the number one thing openness generates is chaos and multiple competing claims about reality. Say, many Linux distributions, each claiming to be great, and in fact many variants of Linux distributions, often with many versions and many wrinkles, and many variations of packages, libraries, and so on.

If you want to build or customize things, openness is great. If you just want to pick something up, use it, and move on, a huge amount of confusion, overhead, and pain is involved in trying to pick the "right" version (particularly if you're unfamiliar with openness and wrongheadedly looking for the "real" version, as many early Linux dabblers were) and get it to work quickly and easily.

There is thus a huge amount of value added by anyone who quells the chaos (even in a tiny sphere or product) and who can quickly, clearly, and succinctly explain to users just what their version does, without ambiguity either within itself as an instance or over time. The nature of the beast (this value is the result of "closing the openness," if you will) means that it can't be opened, or the value will be lost.

End users want operating systems and devices that are not open systems with unclear edges that bleed into the ecosystem, but rather a single, coherent object or product that they can acquire, use in predictable and stable ways, and then lay down once again. They want systems and devices about which books can be written (and bought, and referred to months down the road) without quickly becoming obsolete, and with minimal risk that this book or that add-on they purchase will fail to work because they'd misconstrued the incredibly subtle differences and variations in product naming, versioning, and so on.

In short, massive openness is incredibly generative and creative, but leaves in place a systems/software/hardware version of the "last mile problem" for computing. Having a fabulous network is one thing, but consumers just want one wire coming into the house; they want it to work, they want it to be predictable and compatible with what they have, and they want to know just where it is and what its properties, limits, and costs are. They are not interested in becoming engineers; the technology they use is only useful to them as a single, tiny, and manageable facet of the larger ecosystem that is their life.

This "last mile problem" cannot be solved with openness in hardware or software any more than the last mile problem for wired providers can be solved by opening up all of urban geography to all comers and saying "lay all the cable you want, anywhere you want, to and from any building or system!" First, it would result in a mess of wires (not unlike what we see across much of free software's development space), and second, most consumers wouldn't be able to make heads or tails of it, much less make a choice, and they'd probably resent the complexity in their backyard and try to do away with it.

Openness leads to closedness because, to the extent that openness dominates the development and engineering space, closedness becomes the critical means of carrying whatever is developed to the average consumer, and it grows in precisely the same measure.

Comment Re:Futile search? (Score 1) 208

I agree with most of what you wrote. But I have the most interest in sample returns because we have such vastly greater analysis capabilities here on Earth than we could ever send on a mission - especially a lower budget mission. And by leaving off surface science hardware, you save development costs and a significant amount of spacecraft mass.

Also, when capturing samples you don't have to land to get a low impact velocity. If you reach Saturn via ion propulsion, then you could at little cost enter a Molniya-like orbit over the plumes, so that the spacecraft would be nearly stationary relative to the particles during collection. Enceladus orbits are slow to begin with due to the low gravity (0.114 m/s² versus Earth's 9.81 m/s²), and by positioning a high apogee or near-apogee over the plumes it might even be possible to collect jet material at a lower impact velocity than one could from the ground. Enceladus's gravity would contribute to decelerating the particles and, if desired, one could have the probe's ascent phase over the plumes (rather than the apogee) for further relative velocity reduction. Impact velocity would be not much more than the random variation between the particles' individual trajectories, and some would impact with near-zero velocity. Combined with a carbon aerogel collector (much less dense than the silica aerogel used by Stardust), I seriously doubt you'd do any damage at all to what's collected - most particles shouldn't even melt.
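To put rough numbers on that, here's a sketch using the textbook vis-viva equation and an approximate value for Enceladus's mass; the orbit radii below are made up for illustration, not mission figures:

```python
import math

# Approximate constants (illustrative, not mission data)
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_ENCELADUS = 1.08e20    # Enceladus mass, kg
MU = G * M_ENCELADUS     # standard gravitational parameter, ~7.2e9 m^3/s^2

def vis_viva(r, a, mu=MU):
    """Orbital speed at radius r for an orbit with semi-major axis a."""
    return math.sqrt(mu * (2.0 / r - 1.0 / a))

# Hypothetical eccentric orbit: periapsis radius 300 km (surface radius is
# ~252 km), apoapsis radius 600 km positioned over the plumes.
r_peri = 300e3
r_apo = 600e3
a = (r_peri + r_apo) / 2

v_circular = vis_viva(r_peri, r_peri)   # low circular orbit: ~155 m/s
v_at_apoapsis = vis_viva(r_apo, a)      # speed at the high point: ~90 m/s
```

Even a low circular orbit is only ~155 m/s, and lofting the apoapsis over the plumes drops the spacecraft's speed further still - walking-pace-of-a-car numbers, which is why gentle aerogel capture looks plausible.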

Every added system is added mass and development cost; landers don't usually come cheap, even on a low-gravity body like Enceladus. And dropping a lander near potentially unpredictable fissure geysers carries a risk. So I personally tend to favor spaceborne collection. That said, one would probably learn more from the surface, and you'd be able to sample surface ices as well, not just plumes.

Comment Re:certified materials (Score 3) 220

You think a part designed to handle five times the load it actually experienced wasn't designed "with sufficient margin"? How much margin do you want them to build in, 100x?

RTFA. They were doing statistical-sampling quality control testing of struts. The problem was that most of them were just fine, but there were a very small number which were totally defective and broke at a tiny fraction of their rated value. And no, SpaceX did not make the parts, it was an outside supplier. And yes, SpaceX A) will now be testing 100% of them, and B) is ditching the supplier.

Comment Re:Transparency (Score 1) 220

It's not just about the cost of a failed launch, there's also a huge cost to a company's reputation if a rocket fails. And to their schedule.

Out of curiosity, is there any lightweight way to sense how close a part is to failure *in use*? I mean, finding defects on the ground is great, no question. But what if something would doom a mission not due to a part having a manufacturing defect, but due to an oversight somewhere in the rocket design process, or assembly, or transportation, or launch setup, or unexpected weather conditions, or whatnot? It seems to me it could be a massive boost to launch reliability if one knew that a part was about to fail - for example, in this case, the computers could automatically have throttled back the rocket to reduce stresses, at the cost of expending more propellant, and possibly been able to salvage the mission. And then the problem could be remedied for future missions, without having to have a launch failure first.

To pick a random example: would there potentially be a change in resistance or capacitance or other electrical properties when a strut nears its breaking point?
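As it happens, bonded foil strain gauges already exploit exactly this effect: their resistance shifts in proportion to mechanical strain. A back-of-envelope sketch of the standard relation (the gauge factor and nominal resistance below are typical textbook values, not data from any specific flight part):

```python
# Metal-foil strain gauge: delta_R / R = GF * strain
GAUGE_FACTOR = 2.0      # typical gauge factor for metal-foil gauges
R_NOMINAL = 350.0       # ohms, a common nominal gauge resistance

def gauge_resistance(strain, r0=R_NOMINAL, gf=GAUGE_FACTOR):
    """Resistance of the gauge at a given mechanical strain (dL/L)."""
    return r0 * (1.0 + gf * strain)

# Steel typically yields at around 0.2% strain, so a strut approaching
# failure would show a resistance shift of roughly 2 * 0.002 = 0.4%.
delta = gauge_resistance(0.002) - R_NOMINAL   # ~1.4 ohms on a 350-ohm gauge
```

A ~1.4-ohm shift is easily measurable with a Wheatstone bridge, so the sensing itself is cheap; the real question is whether instrumenting every structural member is worth the mass, wiring, and telemetry budget.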

Obviously, though, if adding sensing hardware would add a high weight or cost penalty, that would be unrealistic.

Comment Re:Futile search? (Score 1) 208

Funny ;) But the main point is that its surface is high-radiation and very oxidizing, and as far as we know there are no liquids anywhere on Mars except for possible transients or extremely perchlorate-rich brines (i.e., something you'd use to sterilize a rock of life).

On the other hand, subsurface water oceans are common elsewhere in the solar system, and colder bodies are known and/or theorized to have a wide range of alternative liquids.

Comment Re:Holy Jebus (Score 5, Interesting) 220

Maybe it's just because I've never worked in that industry, and maybe it's common practice in rocketry, but is anyone else impressed by the use of sound triangulation to figure out which part broke? I've never heard of that being done before.

It's sad that the Falcon Heavy won't be launched until next spring; I've been really looking forward to that. Oh well...

Comment Re:Holy Jebus (Score 4, Insightful) 220

Elon is surely really fuming about this one, as I know from past interviews with him that he really doesn't like having to source hardware from outside suppliers. He has the old "robber baron" mindset of wanting to get the whole production chain start-to-finish in house, and it's one of the things that really frustrated him when he started Tesla: at the time of the last interview I read on the subject (something like 3 or 4 years ago), he had gotten SpaceX up to 80% in-house, but Tesla was only up to 20% in-house. Car manufacture has long been all about sourcing parts from a wide range of outside suppliers.

But even at 80% in-house at SpaceX, it looks like that remaining 20% still bit them. Seriously, failing at 1/5th the rated value? The vendor might as well have given them a cardboard cutout with the word "strut" written on it in Sharpie.

Comment Re:Futile search? (Score 5, Interesting) 208

The speed of light also comes into play in the Fermi Paradox. It's quite possible that for a billion years there's been a vast galactic-scale civilization in the universe emitting copious amounts of readily identifiable radiation. But if that galaxy is more than a billion light years away, it would be physically impossible for us to see them.

There are lots of things about the universe that would make it hard for advanced lifeforms to spot each other unless they're close.

And I fully agree about our own solar system (although I personally think Mars is a terrible place to look and Europa is overrated). There are so many "worlds" in our solar system with fluids (including water, although I wouldn't be so bold as to say that it's a requirement for all life) and energy sources to harness. Organic chemicals seem very common too, even complex ones.

Of all of the bodies in the solar system, I think Enceladus has the best potential payoff in terms of "dollars vs. chance of finding evidence of life". Namely because you don't even have to land on it to do a sample return (but if you do want to land on it for better sample collection, it takes little energy to take off again). And because it emits its internal sea straight up into space. And its internal sea has interesting properties - namely, it's a hyperbasic sea caused by serpentinization of its rocky core, which is a process that also releases hydrogen, giving a potential fuel source to hydrogen-metabolizing life.

That said, my dream mission is still a Titan sample collection/return mission using an RTG-powered rotary nacelle craft to fly in hops all across the moon over the course of a year, recharging its flight batteries overnight on the surface and taking small samples from every potential terrain - dune fields, rivers, the various seas, cryovolcanoes, etc. It would then re-dock with its ascent stage (a single solid stage similar to a small Pegasus stage), lift the ascent stage out of the atmosphere (to reduce drag), climbing as fast as possible until it has drained its flight batteries (which would happen quickly with the added load), ditch all unneeded weight, and fire the ascent stage to re-dock with the ion-powered orbiter that got it there. The orbiter, having spent the past year skimming the outer layers of Titan's atmosphere for return propellant that doubles as an atmospheric sample return, would then return to Earth, possibly skimming Enceladus's plumes and Saturn's atmosphere on the way for more sample returns.

No question that would be a flagship mission, though, requiring two RTGs and three stages. An Enceladus-only return could probably be done on Discovery or New Frontiers budget (probably the latter).

Comment Re:100 million quest to waste 100 million (Score 5, Interesting) 208

It's a serious point. Our own radio signals are probably indistinguishable from background noise from Alpha Centauri, and they're actually decreasing with time, not increasing.

Rather than looking for "stray radio communication" (do you really think an advanced society is going to lose lots of energy to stray communications?), we should be striving for extreme optical/UV resolution (a satellite-based interferometer telescope) so that we can spatially resolve surface spectra of extrasolar planets in our area and look for signs of life, and in general looking for signs of energy release that might be associated with interstellar travel, such as antimatter annihilation, directed thrust, solar sail reflection, etc.
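For a sense of scale on the interferometer idea, here's a back-of-envelope diffraction-limit estimate (θ ≈ λ/B); the target feature size and distance are made-up illustrative numbers:

```python
import math

LIGHT_YEAR = 9.461e15   # meters
WAVELENGTH = 500e-9     # meters, middle of the visible band

def required_baseline(feature_size_m, distance_m, lam=WAVELENGTH):
    """Interferometer baseline needed to resolve a feature of the given
    size at the given distance, using the diffraction limit theta ~ lam/B."""
    theta = feature_size_m / distance_m   # angular size of the feature, radians
    return lam / theta                    # baseline B, meters

# Hypothetical target: resolve 1000 km surface patches (continent-scale
# spectra) on a planet 10 light years away.
B = required_baseline(1_000e3, 10 * LIGHT_YEAR)   # ~47 km
```

A baseline of roughly 47 km is wildly beyond any single mirror but entirely thinkable for free-flying spacecraft holding formation, which is why the satellite-based interferometer is the natural route.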

IMHO.

Comment eDiversity (Score 4, Funny) 398

About Us:

eDiversity was founded in 2015 by Ayotunde Okonjo, a self-taught Pakistani refugee of African descent. After spending her teenage years in Ecuador facing discrimination as a lesbian of colour, Ayotunde overcame the challenges of her muscular dystrophy and moved to Silicon Valley, where she met Kiri Chey, a survivor of the Cambodian genocide, and Heba Mohammad, a Yemen-born teacher of the Chemehuevi Uto-Aztecan language; their shared interest in underground Soviet-era outsider art and Haitian folk dancing brought them together to form eDiversity.

At eDiversity, we utilize crowdsourced design and 3D printing to provide innovative solutions to underprivileged children as a solution to the global energy crisis. In addition to our LEED platinum-certified central office, we operate five international branches in Kiribati, Nepal, Sri Lanka, Uganda, and the South Sandwich Islands, the latter of which also qualifies as an internationally recognized penguin reserve.

We seek $5.5m in seed funding for 2.5% of the company.

Comment Re:Before and after (Score 4, Interesting) 132

That's controlled for by the randomness of the counties involved - the study looks at changes both before and after drilling, and uses no-drilling areas in the same region as controls (the control county had a drilling ban because it was in the Delaware River watershed). The admissions were largely not due to accidents - cardiology admissions showed the strongest correlation. However, the authors don't identify the particular causative factors. They speculate, for example, that diesel exhaust from all of the work vehicles could be a causative agent. Another speculation is that the development of the industry has changed the demographics of drilling areas.

We really shouldn't be surprised that living next to industry in general isn't good for one's health, just from these sorts of factors alone. Exhaust from heavy work vehicles, noise, dust, etc. aren't exactly conducive to good health. Even living next to a busy road is correlated with negative health effects.

A real problem with the study is, as they write, that "our modeling approach cannot account for within zip code demographic changes over the study period". Curiously, while there were positive correlations between wells and health problems in most fields, there were negative correlations in gynecology and orthopedics. They remark: "However, within the medical categories of gynecology and orthopedics, inpatient prevalence rates are expected to decrease each year by around 13–14% and 3–4%, respectively. Despite this surprising result, it is unclear why gynecology and orthopedics inpatient prevalence rates are decreasing each year. It is unlikely that these decreasing rates are related to the increased hydro-fracking activity." I'm surprised that they were allowed to get away with this - you shouldn't be allowed to credit increases to an industrial effect while just dismissing data (quite significant data) that doesn't match your hypothesis. There could actually be very useful information about the validity of their overall study and their conclusions in the reason why gynecological inpatient cases are declining. For example, perhaps the demographics are shifting to a lower percentage of women due to the arrival of the drilling industry. Men have shorter average lifespans and, in particular, a higher rate of cardiovascular disease.

To me, this is a really big hole in their study, and again I'm surprised it passed peer review with it there. But apart from that, I see no problem with the study, so long as people don't overinterpret the results. It's a very broad, generalized study focused entirely on correlation and not causation.

Comment Re:Fundamentally flawed (Score 1) 188

I find it amazing how much people obsess over the cost of production and disposal of a couple hundred pounds of the mass of an EV, and ignore the environmental cost of production and disposal of the rest of the bloody vehicle, both in the case of gasoline cars and EVs. Really, you think that an ICE just popped out of the ground preformed? You think mining platinum for a catalytic converter or lead for a lead-acid starter battery is a harmless process? Lead is far more toxic than lithium.

Comment Re:as always no mention of lithium mining (Score 1) 188

1) Most lithium isn't "mined". It's produced from playas where you have a salt crust with briny water underneath. Evaporation ponds are set up on the surface (where, it should be added, no life more complicated than extremophile bacteria lives, and whose surface is identical over vast stretches of land). The brine is pumped into the evaporation ponds to concentrate it, and then the lithium salts are selectively crystallized out. The playas are seasonally flooded, so there's no year-to-year water loss, and on some the entire top surface gets flooded out, resurfacing it. If you took down your hardware one year, all signs that you were ever there would be gone the next.

2) Lithium salts are relatively nontoxic. Some places actually bottle natural lithium-rich mineral waters and sell them as a health drink. The symptoms of consumption of lithium at below a toxic (high) level are feelings of calm and a reduced risk of suicide. Long-term consumption of lithium-rich water has been linked in one study to longer lifespan.

3) Contrary to popular myth, there are many places on Earth to get lithium. Afghanistan is not a major player, and is not likely to become a major player for a long, long time.

4) Contrary to popular myth, lithium salts are not expensive. They're so cheap that among the biggest consumers of lithium are glassware/glazing and greases.

5) Contrary to the name, lithium is not the largest, nor most expensive, component of lithium-ion batteries.

6) That "it's better not to junk an old guzzler" claim is - you guessed it - also a myth. Which you should be able to figure out just from some extremely rudimentary analysis. The average US driver drives over 12k miles per year. If your car gets 24 mpg, then that's 500 gallons of gasoline, or about 1400 kg per year. Forget that most of a car's mass gets recycled at end of life, forget about the consequences of all of the oil leaks and the like caused by old decrepit cars - you burn your car's weight in gasoline every year. And the average car on the road is about 10 years old, meaning an average lifespan of 20 years.
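The arithmetic is easy to check yourself (a sketch assuming roughly 2.8 kg per gallon of gasoline, i.e. ~0.74 kg/L times 3.785 L/gal):

```python
# Annual fuel mass burned by a typical US driver
MILES_PER_YEAR = 12_000
MPG = 24.0
GASOLINE_KG_PER_GALLON = 2.8   # ~0.74 kg/L * 3.785 L/gal, approximate

gallons = MILES_PER_YEAR / MPG                    # 500 gallons per year
fuel_mass_kg = gallons * GASOLINE_KG_PER_GALLON   # ~1400 kg per year
```

That's on the order of a midsize car's curb weight, burned every single year - which is the point.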
