For definitions of 'very strong' equal to 'weaker and more crack-prone than conventionally cast parts, much weaker than parts machined out of forged billets, much, much weaker than forged, rough-machined, heat-treated, final-machined'.
Nope; better than cast, equivalent to wrought static properties (final machining is required to get nice surface finishes and tight tolerances, as it would be with casting). "Sintering" is a misnomer; "selective laser melting" would be more accurate. The parts are fully dense.
Unless I was already 80 they aren't getting any implants into my joints made out of anything short of Titanium.
Want Titanium for that hip, old man? Try i.materialise 3d-print service...
Repeat after me: '3d printers are not able to make full strength metal parts. It is extremely unlikely they ever will be able to. 3d printers are not able to make full strength plastic parts. It is extremely unlikely they ever will be able to.'
Why would he want to repeat that? Fully dense metal parts with equivalent-to-wrought properties out of the machine and with a bit of stress-relief heat-treatment: http://www.eos.info/en/products/systems-equipment/metal-laser-sintering-systems.html http://production3dprinters.com/slm/spro125-direct-metal-slm-production-printer
On the 3D printing front, gimme one that prints steel, aluminium alloys, etc. with the structural integrity of their conventionally produced equivalents (i.e. not sintered) and I'll start to take this discussion seriously.
EOS, 3DSystems and Arcam make systems that make steel, aluminum, Inconel and titanium parts with mechanical properties equivalent to (and in some cases surpassing) wrought parts. The "sintering" in the trade names is a misnomer. The process is actually micro-welding with lasers or electron beams. Straight out of the machine the parts are fully dense and useful.
In one week I can do experiments that 5 years ago would have taken 10 people a full year to perform. With such throughput it isn't necessary even to formulate a hypothesis. You just test every possible variation and let the data speak for itself. Machines are more consistent than people, don't get tired, and if they make mistakes, the mistakes are systematic and easy to troubleshoot.
I have to admit, the high-throughput stuff is pretty neat though. I think the great part about it is that it frees up the human to think more about test design and hypothesis formulation and worry less about the mechanics of putting drop A into beaker B.
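The "test every possible variation" workflow described above is basically an exhaustive parameter sweep. A toy sketch (my own made-up factors and scoring function, not anyone's actual lab setup):

```python
# Exhaustively sweep a small parameter grid instead of picking one
# hypothesis up front, then let the results rank themselves.
from itertools import product

temperatures = [20, 37, 42]          # hypothetical factor levels
ph_values = [6.5, 7.0, 7.4]
concentrations = [0.1, 1.0, 10.0]

def run_assay(temp, ph, conc):
    # Stand-in for the robotic experiment; here just a made-up score
    # that peaks at (37, 7.0, 1.0).
    return -abs(temp - 37) - abs(ph - 7.0) - abs(conc - 1.0)

results = sorted(
    (run_assay(t, p, c), (t, p, c))
    for t, p, c in product(temperatures, ph_values, concentrations)
)
best_score, best_params = results[-1]   # highest score wins
```

With robots running the assays, the grid can be orders of magnitude larger than any human team could cover by hand, which is the whole point of the comment above.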
The annotated ITAR indicates (121.1 Category XV (c)(2), pp50 in that pdf) that there is a speed and altitude restriction: "Designed for producing navigation results above 60,000 feet altitude and at 1,000 knots velocity or greater".
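Note that as quoted, the restriction uses "and", not "or": both thresholds must apply. A minimal sketch of that reading (my own function and names, not any receiver's actual firmware, some of which disables output on either limit alone):

```python
# Thresholds as quoted from ITAR 121.1 Category XV (c)(2) above.
FT_LIMIT = 60_000     # feet
KNOT_LIMIT = 1_000    # knots

def itar_restricted(altitude_ft: float, speed_knots: float) -> bool:
    """True if a navigation result at this altitude AND speed falls
    under the quoted restriction (both conditions, per the "and")."""
    return altitude_ft > FT_LIMIT and speed_knots >= KNOT_LIMIT

high_but_slow = itar_restricted(70_000, 500)     # balloon-like: allowed
high_and_fast = itar_restricted(100_000, 1_200)  # rocket boost: restricted
```

Under this reading a receiver could legally report fixes at 100k feet as long as the vehicle is subsonic, which matters for exactly the kind of flight discussed here.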
Hopefully they will get credit because their receivers worked at low-speed and low-altitude (on the way down), and they've already integrated their accelerometer data to get very reasonable velocity and position estimates.
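Integrating accelerometer data as mentioned above is straightforward in principle. A minimal sketch (assumed, not their actual code): 1-D vertical acceleration, already corrected for gravity, sampled at a fixed rate, trapezoid-integrated once for velocity and again for position:

```python
def integrate(samples, dt):
    """Trapezoidal integration of accel -> velocity, then a simple
    rectangle step velocity -> position. samples: accel in m/s^2."""
    v, x = 0.0, 0.0
    vel, pos = [v], [x]
    for a0, a1 in zip(samples, samples[1:]):
        v += 0.5 * (a0 + a1) * dt   # trapezoid rule for velocity
        x += v * dt                  # rectangle step for position
        vel.append(v)
        pos.append(x)
    return vel, pos

# Constant 10 m/s^2 for 1 s at 100 Hz -> ~10 m/s final velocity
vel, pos = integrate([10.0] * 101, 0.01)
```

In practice accelerometer bias and noise make the position estimate drift quadratically with time, which is why a GPS fix somewhere in the flight is so valuable as a cross-check.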
"one of the requirements is GPS data over 100k. Even with four separate GPS systems, we were not able to get a high altitude fix." With no tangible record of the rocket's soaring ascent, it's unlikely that Deville and his friends will score the cash. Amateur Qu8k Rocket...
One of the significant hurdles for Carmack's prize is the ITAR speed/altitude restrictions on most GPS receivers. It will be interesting to see what sort of receiver they used; hopefully at least one of the four was an unrestricted receiver.
Griffin was counting on it being too big to fail, so thought he'd get additional funding to cover the overruns.
Absolutely right, he tried to do a Pentagon-style program without a significant Iron Triangle backing him up, so instead of being too big to fail, it was just too big.
I still don't understand how the orbiter would have ever made sense though, every pound of air-frame is a pound of payload given up. Paying to get it up in the first place is expensive, you should bring back as little as possible (teeny-tiny re-entry capsule, separate cargo / orbital lab / what-have-you). Just because the shuttle was initially part of a larger program doesn't make it smart or modular.
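The "every pound of airframe is a pound of payload" point falls straight out of the rocket equation. A back-of-envelope sketch (my numbers, purely illustrative): for a fixed delta-v and propellant load, mass added to the vehicle's dry structure comes one-for-one out of deliverable payload:

```python
import math

def payload_for(dv, isp, m_propellant, m_dry_structure):
    """Solve payload mass from dv = isp * g0 * ln(m0 / mf), where
    m0 = structure + payload + propellant and mf = structure + payload."""
    g0 = 9.81
    ratio = math.exp(dv / (isp * g0))     # required m0 / mf
    mf = m_propellant / (ratio - 1.0)     # total final mass allowed
    return mf - m_dry_structure           # what's left for payload

# Same delta-v (~LEO) and propellant; only the returning structure differs.
light = payload_for(7800, 350, 500_000, 10_000)
heavy = payload_for(7800, 350, 500_000, 20_000)
```

Every kilogram of wings, tile and airframe that rides to orbit and back is a kilogram of cargo that didn't, which is the argument for a teeny-tiny re-entry capsule.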
The core tenet of design for a long, long time has been modularity and leverageability.
Does long, long time mean 'since the total cluster that was the shuttle design'? Hauling all that tile and structure was a really horrible design decision, Zubrin's criticism of it was spot on.
Not surprised Griffin is trying this. He's always had some agenda.
Griffin is / was such a douche bag. I do not understand why everyone kissed his ass so much, 'oooh, he wrote teh book!!1elventyone!!!'. Yeah, guess what? He spent all that time getting degrees and writing text books and not building or flying hardware!! When the boss draws a rocket on a napkin (Ares I) and ram-jams it down the organization's throat, he is a total and unrepentant jerk-off and should never be trusted with any position of authority ever again. Unfortunately this is exactly the kind of well-credentialed, but worthless asshole that gets promoted in the government, and NASA happens to be particularly flagrant with this sort of buffoonery.
Ah, ranting feels good, especially on the internet where it lasts forever...
Funding NASA helps fund the research and development that allows for the possibility of creating that infrastructure we so desperately need up in space in order to do any of it.
I'd argue funding NASA prevents the creation of infrastructure (gun / laser launch, systems of tethers / rotovators) because NASA can afford one-off rocket shots which result in no residual infrastructure, whereas private industry would have to be smarter (to be affordable).
"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein