Stephenson starts slow, really cranks it up, then ends abruptly.
Yeah, I think you are correct. Tablet sales are down, PCs are up. When tablet sales were rising, PC sales went down.
PCs have some really nice specs these days, and 8.1 is usable.
Luckier than me. I do not remember what my father sounded like.
No one tacked "-gate" onto the name of the complex, which is the previous poster's point. Now, everything gets "-gate" added to the end.
Why should the crime be treated differently? And what means are used to detect drivers who are high on pot?
We have laws on the books for reckless driving, speeding, etc., etc. We don't have legal limits established for all sorts of drugs that people can legally obtain (including amphetamines and opioids). The current metric is how safely a person is handling the vehicle, for pretty much every single thing a person can ingest. Why would marijuana have a different standard than Vicodin? Than Percocet?
Generally, if you are the type of organization that needs to monitor 1TB of filespace, you're the kind of company that can fill that 1TB of filespace.
Where I currently work, it is not unusual to fill 1TB in about the same amount of time it took to fill a 100MB drive back in the day.
Honestly, I think FDM is not the future of 3D printing.
Stepping back though, I am thinking of the hundreds of computer companies that have come and gone. There were some very big names that stepped into the PC business at the time, as well as others who were big in other areas and moved into this field trying to position themselves. They ran on the potential of the market, not on how to make it happen or on what it would look like. The ones who moved on were the ones who saw the growth market. The Altair, SYM-1, and AIM-65 were for hobbyists. The TRS-80 was one of the few that had it right, but it was early to its window. (I'd argue that Radio Shack was a much, much bigger name then than Dremel is now for the kind of stuff you're talking about.)
FDM-produced parts on this class of hardware are, to be frank, rather crappy. There are over 200 printers listed on 3ders.org, and I seriously doubt more than 10-15 will survive the next 3 years; that does match the weed-out of the first generation of PC manufacturers. (Granted, there were a lot fewer then, as the barrier to entry was much higher, but almost none had the engineering and market position to move into the PC clone market. It was not IBM taking the business market that killed most of the first generation so much as the fact that they created what amounted to an industry standard that very few were positioned to exploit or had viable alternatives for income. The standard that will need to be met going forward into the second generation of the current 3D printer wave is appliance-like behavior with good part quality.)*
The main difference here between then, and now, is that major players in the 3D industry are not sitting back. They are very active and have a huge backlog of patents to draw on. HP is already out there with its business class 3D printer. Dremel priced their printer at exactly the same price as the 3D Systems Cube 3 and has almost the exact same specs.
*A discussion for another day. I would be very interested to see if there is any correlation on who survived and who could run Lotus.
You might want to step back for a second.
Your first list is items that deal with engineering issues and, as you say, can be engineered around. Two of those four items do not apply to the Dremel 3D printer.
The second part.. has absolutely nothing to do with running a 3D printer and everything to do with part design. You could send me your CAD files, and I should be able to run them through Slic3r and print them sight unseen. Part design requires a lot of skill. Printing out a part, not so much. And with so few variables, g-code conversion is a relatively simple procedure.
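To make the "relatively simple" claim concrete, here's a toy sketch (in Python, and emphatically not Slic3r's actual code) of what emitting g-code for one layer's perimeter involves once the slicer has already worked out the geometry. The filament diameter, layer height, and extrusion width are illustrative assumptions:

```python
import math

# Illustrative slicing parameters (assumptions, not Slic3r defaults):
FILAMENT_DIA = 1.75   # mm, a common filament size
LAYER_H = 0.2         # mm, layer height
EXTRUSION_W = 0.4     # mm, extrusion width (~nozzle diameter)

def extrusion_length(path_mm):
    """Filament (mm) to push for a path of given length: volume of the
    deposited bead divided by the filament's cross-sectional area."""
    bead_area = LAYER_H * EXTRUSION_W
    filament_area = math.pi * (FILAMENT_DIA / 2) ** 2
    return path_mm * bead_area / filament_area

def layer_gcode(points, z, feed=1800):
    """Emit G0/G1 moves tracing a closed polygon at height z."""
    lines = [f"G1 Z{z:.3f} F{feed}"]
    e = 0.0
    x0, y0 = points[0]
    lines.append(f"G0 X{x0:.3f} Y{y0:.3f}")   # travel to the start point
    for x, y in points[1:] + [points[0]]:
        e += extrusion_length(math.hypot(x - x0, y - y0))
        lines.append(f"G1 X{x:.3f} Y{y:.3f} E{e:.5f}")
        x0, y0 = x, y
    return lines

# A 20 mm square perimeter on the first layer:
square = [(0, 0), (20, 0), (20, 20), (0, 20)]
for line in layer_gcode(square, z=0.2):
    print(line)
```

The hard geometry (turning the STL into perimeters and infill) happens upstream in the slicer; once you have the paths, generating the moves is bookkeeping like this, which is exactly why the end user never has to touch it.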
I build CNC machines
I build 3D printers.
I am guessing you have never used a CNC milling machine. Let's look closer:
Some variables for CNC milling (Not exhaustive):
type of bit (material and shape - probably 20 base shapes in a beginner shop. dozens of bit materials)
geometry of bit (literally thousands of options here)
new or worn, and what is the wear pattern (variable every time. Usually not an issue unless you are doing very precise work, in which case, you need to mike the wear and enter it into the tool table)
number of flutes/teeth
roughing or finishing
step over percent
surface cutting speed
is spindle speed variable
feed per tooth
depth of cut
conventional or climb milling
material being machined
coolant feed enable
coolant feed type
tool number in tool table
homing and limit switches
All of these variables play off each other. Change one, and it can easily cascade into changing 4 or 5 others. Get many of the variables above wrong, and you can destroy the bit, the machine, or the part, or injure yourself.
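As a concrete illustration of that cascade, the standard machinist formulas tie several of these variables together: change just the tool diameter, and the spindle speed and feed rate both have to move with it. A small Python sketch of the textbook formulas, with illustrative numbers:

```python
import math

# Standard machinist formulas; the specific numbers below
# (surface speed, flutes, chip load) are illustrative assumptions.

def spindle_rpm(surface_speed_m_min, tool_dia_mm):
    """RPM needed to hit a target surface cutting speed for a given bit."""
    return (surface_speed_m_min * 1000) / (math.pi * tool_dia_mm)

def feed_rate(rpm, flutes, chip_load_mm):
    """Table feed (mm/min) = RPM x number of flutes x feed per tooth."""
    return rpm * flutes * chip_load_mm

# Change one variable -- the tool diameter -- and watch it cascade:
for dia in (6.0, 12.0):
    rpm = spindle_rpm(100, dia)  # 100 m/min surface speed (assumed)
    f = feed_rate(rpm, flutes=2, chip_load_mm=0.05)
    print(f"{dia} mm bit -> {rpm:.0f} RPM, {f:.0f} mm/min feed")
```

And this only covers speeds and feeds; material, depth of cut, and coolant then feed back into what chip load is safe in the first place, which is why none of it can be hidden from the operator the way a slicer hides g-code.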
The fact of the matter is, I can take yoda.stl, run it through Slic3r, stick it in a 3D printer, and not worry much about it. Someone along the workflow needs to know g-code, but realistically, in this example that someone is the coder of Slic3r, and it is automated. If the machine is calibrated, it will print. If I run a milling operation through CAM software, it needs a test cut to verify it won't damage anything. Just not inserting the milling bit all the way can damage the machine.
Now, look at it from an appliance situation. Do I know as the machine designer, what material or bits will be used? Do I know what sort of shape they are going to try to machine? I would have to lock down that machine to a ridiculous degree to get it to behave like an appliance, and even then, I can't be sure it won't damage anything. The Dremel 3D printer looks to be locked down with very few variables. It is designed for people to just load a file and hit "run". From a marketing and legal point of view, which is a better product to market?
"Dremel 3D pre-sale starts Sept. 18, 2014, on homedepot.com and amazon.com, with in-store availability at select The Home Depot® stores in early November."
That's a WOW right there.
I've been through the PC boom in the late '70s and the Internet boom in the '90s. The claim that "no one points at 3D printers" is no more true than when it was said about PCs in 1979 or the Internet in 1994. (I heard that exact sentiment expressed in those years.)
This is what a boom looks like right before it goes off.
From personal experience..
Trying to design and build a CNC machine to function as an appliance is very, very difficult. There are simply too many factors that impact how well the machine would work. A person who writes g-code for a milling machine has to understand how it will work: balancing the motors, speeds and feeds, materials, and the working head. A 3D printer requires very little, if any, skill on the part of the person using the machine. They can just load pre-packaged items if they feel like it. It is a much more consumer-friendly product with a huge upside.
I've seen dozens of printers listed with better specs, but most of them are dummy specs. You couldn't run most of those machines anywhere near the specs they list. How many 3D printers out there actually achieve the speeds they claim, or the print area?
Honestly, if they can deliver a machine that works at those specs out of the box, without tinkering or having to recalibrate, it just might be worth that amount. It looks reasonably solid and rigid and, from an outside view, well designed. (No idea where the spool feeds from, based on the pics.)
Because the article was about device support in the kernel and systemd..
There are already a lot of server centric distributions. Ubuntu is just not a good choice for server side. That says nothing about the ecosystem in general. It doesn't even really say anything about Debian, which is Ubuntu's base.
Funny thing.. Back when I was a day-to-day administrator for Solaris (2.4-2.8), the kernel was optimized for the desktop. AIX, at roughly the same time, was likewise optimized for the desktop.
And shoot.. Even now that AIX is a "server" only operating system, tuning the kernel is still a requirement. Whatever your settings are is kind of irrelevant in the grand scheme of things since no one can tailor a kernel that is perfect for everyone. The first day you roll out your great distro, people will be complaining about the idiotic choices that were made.
As to the "many of the packages required for desktop use not only don't apply to me" statement.. so what? Don't install them. That isn't a reason to get rid of systemd and fork the kernel.
Here's a challenge.. post what you think are the great tunings you think your distro needs, then we'll see if 10% of the people who read the specs agree.
Last week, the complaint was that systemd was making Linux look like windows. This week, the plan is to adopt the Windows server/workstation design philosophy as a fix to the problem..
I saw a lot of assertions in the article, but none seemed to actually have any data behind them. Nor is it really apparent how a fork would leave either branch with the critical number of developers needed to handle it. That is aside from the fact that the two kernels would have about 95% overlap in code base, yet each would need to maintain its own build environment and development path.
Let us look at one of the assertions:
"However, they're also demanding better performance for desktop-centric workloads, specifically in the graphics department and in singular application processing workloads with limited disk and network I/O, rather than the high-I/O, highly threaded workloads you find with servers. If Linux on the desktop has any real chance of gaining more than this limited share, those demands will need to be met and exceeded on a consistent basis."
How would a kernel fork address this? If the need is there now, in what way is the current environment stopping the developers from releasing code to address these issues?