I agree with everything you just said. Before I moved jobs I was working at a place where I was making $30-40K less than everyone in my shop. I knew the systems better than anyone in there because I had worked my way up the company from lowly technician to a senior engineering position. When I went to my boss and brought up that I needed a raise to stay, he agreed, but HR said that because I didn't have a degree they weren't going to give me one. This despite being top ranked in every metric within the company, from performance reviews to actually getting shit done. So I went out, interviewed for a few jobs, and brought back an offer letter that was more reasonable. HR said, "OK, we'll give you a 5% raise." My response was to tell them to fuck off, and I left, forcing them to hire someone at market rate who didn't know the systems inside and out. Where I'm at now I'm being paid a little above market average, but my co-workers are excellent, my boss is cool, and climate science is a lot of fun. Honestly, I'd work here for a little under market average, but my boss is pushing for a serious promotion/pay upgrade.
If you have weeks-long running jobs on your desktop, you're doing it wrong. There's a reason servers exist in datacenters. I work in scientific computing, and people running jobs on their desktops is a huge problem: they spend ridiculous amounts of money on something like a Mac Pro to run this stuff when they should be buying actual servers instead. Then they complain when their desktop is running like shit or their job fails because the building took an intermittent power hit. You can even put GPU compute in servers and worry a lot less about your systems going down.
I fail to see the problem here. My kids have dozens of those branded Lego kits, and you'd be hard pressed to find a single one of them intact. The tie-ins make the kids interested in the sets themselves, which is fine because they immediately tear them apart and make new things with them, which is a good thing. So if marketing sells a toy that interests a kid, and that toy then sets the kid's imagination free, I fail to see a problem with it.
I'll tell you this: it's way more difficult and far more expensive/time-consuming than you might imagine. You should also be very clear about your reasons for doing this; keeping the kid at home is not one of those reasons. I know several homeschooled kids like that and they're a bit stunted, so you have to make sure to get out and be active a lot more as well. Look around in your area for enrichment programs; for instance, our kids go to public school one day per week. It's fantastic, and they make a lot of friends plus get that more structured school environment. Make sure that you're doing this for the right reasons. Finally, it takes an unbelievable amount of discipline from you both as parents. My wife and I are up late hours every week making sure that we have lessons ready, making tests, basically doing the things normal teachers do. If you're still reading and serious about this, it's also very rewarding. Seriously, if you do this right your kid will be light years ahead. Do it wrong and you'll really fuck up your kid's future.
I have a CC3200 and it works pretty well. I'm using it with the Energia IDE and for the most part it's not too bad. I agree that they should do more to embrace the community, but at the moment I don't find developing for these boards that much more onerous than Arduino.
Are you upset? You seem upset.
Working in climate science, I can tell you that tape isn't going anywhere for us. Our investment in it gets larger every year, at least monetarily and capacity-wise. Several of our groups have storage growth curves that scale linearly with the output of the supercomputers, and since that output keeps climbing with every new machine, our growth is almost exponential. Most of this data is static and doesn't really change once it's been produced, but it does need to be read from time to time. There's no other solution out there that takes little to no power to store, needs no cooling, and can keep the data for years with minimal loss of integrity. We have data that goes back to the 1940s that we have to keep almost forever; this historical data is hugely important in how we create the models and cannot be lost. So we have to have somewhere to store all that data for the long haul, and LTO is the medium of choice because it's vendor agnostic, fast enough, cheap, and large enough to handle what we need it to handle.
It says right in the summary. Docker is one of them. Here are a few others: Doozer (Heroku), Dropbox backend services, CloudFlare, SoundCloud, the BBC (which uses it pretty extensively), etc.
I actually really like the redesign. One reason I didn't buy the previous Model A was that I already had a Model B in the same form factor. This one is nicely squared, will fit in a project box nicely, and is definitely fast enough for some of the projects I have in my head. Plug in a cheap wifi dongle and you've got a great IoT platform to play with. It's on par price-wise with the CC3200 from TI, and while it doesn't have PWM it'll still do quite a lot and is significantly faster than the CC3200.
We have a lot of Centrify in our organization. It's a real pain in the ass to manage.
I wish there were more people like you. My problem is that on paper I don't look perfect, but at my last job I created the entire monitoring and configuration management system from the ground up. I mean literally: I wrote it from scratch (DoD, restrictions on open source, no money to buy anything). I bought a book and started writing, and I didn't stop until I'd written a complete product, one that could do everything we wanted, from verifying configs to monitoring the entire system for problems. It was even cross-platform, with Windows, IRIX, Solaris, and Linux support. I go into interviews and people immediately judge me based on my youth and the fact that I'd only been programming for three years without a degree, so they offer me peanuts compared to what I was making. I finally found a job at a national laboratory, but as a senior sysadmin instead of the programmer role I actually wanted.
What we're trying to move towards where I work is RHEL on the server side with heavy use of Docker. The plan is to put a more user-friendly OS on the desktop so our users aren't endlessly frustrated by the desktop being shit, and let the developers use Docker to build their application stacks. Once those go through the testing and vetting process, we just push the containers up to the production RHEL servers. This serves two purposes. The people who actually have to interface with the desktop can have something that looks nice, like Ubuntu (I get it, you don't like Unity; grow up and realize it's not the horrible end of the world), Elementary, or some other desktop-oriented distribution that supports Docker. On the other side, we get all the excellence that is RHEL on the server with a nice, clean, seamless integration for our developers. It also keeps our developers from needing root or even sudo access, because they can do whatever the hell they want inside the Docker containers; once they're vetted for stability and security, I honestly don't give a damn how they handle them. I'm very interested in the Atomic Host project Red Hat is exploring for this very reason.
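To give a feel for the developer-side handoff, here's a minimal Dockerfile sketch; the base image, app name, and port are made-up placeholders, not our actual stack:

```dockerfile
# Hypothetical app container a developer would build, test, and hand off
# for vetting. Everything installs inside the container, so the developer
# never needs root on the host.
FROM centos:7

# Pull in the app's runtime dependency and clean up the package cache
# to keep the image small.
RUN yum install -y python && yum clean all

# Copy the application into the image and declare how it runs.
COPY app/ /opt/myapp/
EXPOSE 8080
CMD ["python", "/opt/myapp/server.py"]
```

Once a container like this passes vetting, pushing it to the production RHEL boxes is just a registry push/pull; nobody touches the host OS.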
It's not what it's capable of, it's what's been tested and is supported. You can certainly go bigger if you want, but don't expect support.
Where it gets even more interesting is when you have things like GPU passthrough to a VM. That's something I'm working on right now: virtualizing Windows and passing it a GPU. I have the VM bridged to the network so it has a native IP address, and I assign it whatever resources I think it needs to play games. This lets me run one pretty beefy server that hosts Windows in a VM as well as handling all the other server tasks I ask of it: file serving, Plex, a VM for web development. All in one machine. My desktops/laptops are relatively low-powered and let the server do all the work. What was old is new again.
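For anyone curious what the passthrough part looks like under libvirt, it's basically just handing the guest the GPU's host PCI address (the bus/slot here is an example; use lspci to find yours, and you'll need VT-d/AMD-Vi plus the vfio-pci driver bound to the card):

```xml
<!-- Excerpt from the guest's libvirt domain XML: pass through the GPU
     at host PCI address 01:00.0. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

If the card has an HDMI audio function (usually 01:00.1), add it as a second hostdev so sound works in the guest.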
It has the ability to do RBD snapshots; it's not a perfect solution, but it does work. We're actually in the process of building a Ceph system for testing of climate modeling data. https://ceph.com/community/blo...