*I* Just shit my pants...
Since it's, you know, those guys, I'm sure their IT solution for NASA involves Citrix somehow.
Wow, way to over-think things.
What I am saying is that my experience with a Mac was wholly underwhelming. You may have been able to make things work. Bravo for you. My experience with a Mac has been trying to get a square peg into a round hole.
Some of it involves Photoshop, some involves Maya and/or Lightwave. Some involves PowerPoint, and some involves Excel.
Everyone can do this. It's not hard. Hell my *phone* can do it. Altering text files to configure two disparate systems to speak with each other? Fantastic.
But what can your Mac talk to as far as equipment is concerned? Industrial automation? Sensor-to-shooter systems? I2C? PWM? High-traffic network servers?
You and everyone else just made my point about consumer-level 'work'.
I am talking about interfacing with real hardware. Big industrial one-off stuff.
Go back to your Photoshop, Maya, scanners, drones, etc. Playthings for adults.
Sorry to burst your bubble.
imho, Macs are great at giving people the impression they are getting shit done, when in reality a lot of the time it's really only consumer-level word processing that's going on.
Hell, you can barely game on a Mac. I gave one a real try when I was a DoD contractor. There wasn't squat I could do with it but use it for the web, email, and making documents.
When you want to connect external peripherals to interact with the rest of the world, you aren't going to see Macs. You're going to see PCs running Linux. There are no professional Mac embedded systems that are mature, robust, or widely known.
I can build a new PC and buy a used car for the price of a Mac. I have no need for the 'hipster status symbol' that Mac ownership is these days.
It's going to take another World War to make that happen. To either wake them up, or wipe them out.
I've found that design review boards are becoming increasingly hostile toward singletons, too. There was a narrow window where they'd at least consider one, back when people started talking about design patterns. These days it's next to impossible to get one approved, even if there's pretty good justification for it. You can always design around the need for a singleton, and usually the system design will be better without them.
Anywhoo, back in the '90s I worked for a company that was getting a B2 certification for its operating system. My job basically consisted of reading the entire AT&T C standard library code, finding potential security flaws, writing tests for those flaws, and then writing a report with the tests, which would be delivered to the NSA. I found the remote buffer overflow in the AT&T telnet daemon a couple of years before the same overflow was discovered in the Linux telnet daemon. So the NSA basically outsourced the hard work of finding all those exploits to the companies that were trying to get security certifications. It took three or four guys just a few months to go through all the stuff we had to look at. I'm sure we missed a bit, but I was much more confident in the security of their OS at the end of all that. Too bad they eventually went out of business, were acquired by IBM, and their products were killed. You know, progress!
I agree with this sentiment, and hope that Trump shares the same views as you.
Eliot would be proud.
Then Linux has won. :)
Patterns have a habit of repeating themselves. Assuming anything less is not a sustainable pattern.
It looks as though MS has finally accepted the inevitable.
Though I shudder to think what hell may follow with MS getting its fingers in the FOSS pie.
"We don't care. We don't have to. We're the Phone Company."