As a sysadmin/network architect (the latter is apparently my official title), I feel somewhat offended. Good unix sysadmins will know at least enough of a systems language, and about computer architecture in general, to understand the systems they are maintaining, on top of other languages for automation.
I have sort of been dragged into the land of J2EE application servers, kicking and screaming. Turns out that in order to understand J2EE app servers, you have to understand the entire gargantuan, Cthulhu-esque stack, and to do that, you have to understand Java to a surprisingly deep level. And to do that, you have to understand at least a decent subset of software engineering.
Turns out that, five years later, I fully grok things like the visitor pattern, much to my astonishment. Had you told me 10 years ago that I would come to understand the difference between pass by reference and pass by value, I would not have believed you.
I have also contributed a fair bit of code to the application I am struggling to run decently on a production GlassFish cluster, during the process of tracking down performance problems and other random issues that were being blamed on the app server. Not to mention the XSS and SQL injection vulnerabilities I unearthed and fixed.
Of course this experience may not reflect the "real world" accurately, seeing how I also ended up being the gatekeeper who packages and deploys builds. This stemmed from managing the VCS server, the CI and collab servers, etc. This eventually resulted in me pretty much shaping up the entire development process to the best of my abilities. I picked up a lot of skills I may not have had otherwise.
Now you may feel "this guy is just playing programmer, everything must be shoddy as shit", but rest assured, on a personal note, the last thing I ever wish to do is do something badly. If I don't feel 100% comfortable doing something, I usually abstain from doing it altogether.
In relation to the original topic, I often find myself knowing a lot about something and being incredibly disgruntled at the fact that I possess that knowledge. On the flip side, it just makes my resume impressive rather than unfitting. It's all a matter of how you present it, i.e. "I am a sysadmin/networking guy who also happens to be well versed in this stuff."
The other issue you're not addressing is that most companies don't seem to understand the difference between a CS major in software engineering and a tech. To them, it's perfectly fine to ask for 3 years of experience in software engineering with skills ranging from C++ to databases, and also ask you to maintain the website, do tech support, work the servers, and help with Active Directory.
The blur is not from the side of the fine people pursuing these jobs, it's from the management.
If all you want is a decently modern bash and unix userland on Windows, you might want to investigate MinGW/MSYS. It still uses a translation layer underneath, but MSYS is lighter than Cygwin, and the binaries the toolchain produces are 100% native and link against MSVCRT.
You would enjoy tremendously better performance. At least I know I did. On cygwin bash takes approximately 6 whole seconds to parse my
I still think you're not fully getting the point I was making. The behavior of cat doesn't change, even in theory. What changes is the behavior of the terminal when confronted with a bunch of garbage matching the mimetype of a PNG: it presents it in a useful form. It is exactly like ANSI escape codes for color output, no more, no less. I still have no idea why you'd want to see the garbage in the first place, because it is just that, garbage. It's not even the bytes in the file; it's the bytes mistakenly interpreted by your terminal, which results in bleeps and borps and other terrible fuckups until you reset it. If you pipe or redirect the output stream to something else, you're still working with the data, because the view isn't involved.
It's simply a different view on the same data. This is what I meant by separation of view from data, which *is*, in general, a good idea. Again, I am not saying I would love to see this in my shell. I'm just saying it isn't necessarily broken.
In short, it basically makes your terminal, the view layer, handle the garbage gracefully and display the PNG. It doesn't mean that cat can now render PNGs. If that were the case, it would indeed be broken, because I don't expect cat to do that. I expect cat to con(cat)enate files and toss the output to stdout.
It is a subtle difference, but it is quite significant, and indeed the difference between broken and just arguably useful.
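To make the view/data split concrete, here's a minimal shell sketch. It uses ANSI color codes rather than the article's PNG handling, since the mechanism is the same: the terminal (the view) interprets the bytes, while a pipe sees only raw data.

```shell
# Same bytes, two views. When stdout is a terminal, the terminal (the
# view) interprets the escape sequence and renders "red" in color.
printf '\033[31mred\033[0m\n'

# Redirected into a pipe, the view is out of the picture: od shows the
# raw escape bytes (033 [ 3 1 m ...) exactly as printf/cat emitted them.
printf '\033[31mred\033[0m\n' | od -c | head -n 1
```

Neither printf nor cat changed behavior between the two lines; only the consumer of the byte stream did.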
Just playing devil's advocate here, but did you actually read the article? Only the view interprets the bytes, which makes sense. If you pass it over a pipe, raw bytes are sent. I think that is what you meant, because no one actually likes cat'ing a PNG in their terminal to see a bunch of garbage that may even break the terminal until you reset it. Or maybe that is a very obscure use case of yours?
He separates the view from the data. Which is okay. Not something I particularly like or find useful, being a unix greybeard at heart, but it doesn't break cat as you suggest. I suggest you read it over in detail again and don't stop at the first sentence.
The only "security" iOS has is that you have to shell out $100/year to be a developer. Gives great protection against hobbyist programmers, does absolutely nothing against the Russian mafia.
Oh god, are you trying to tell me the billion fart apps, soundboards, and shitty glorified Flash applets from the early 2000s are written by professional programmers? Or that hobbyists don't have $100 a year to spare for their hobby? Say it ain't so!
The other day I downloaded an app from the market that says you have to root your phone for it to work. Another one says you need something called launcher pro and other apps. Ridiculous.
A platform has a feature that is not present by default unless the user manually enables it. An application requires or targets that feature, and it makes no sense to run it on devices that don't have it.
Then another app depends on a second application because it makes no sense outside that context, and thus states so. (On a side note, in this case the application will often just ask you to download the other one if you wish to use the feature.)
How is this ridiculous again? Are you implying the applications could, or should have worked for the phones that somehow don't have the feature? Was that just feature envy or...?
Using the same logic: The other day I downloaded an app from the App Store that says you have to have a camera on your device for it to work. Another one says you need something called iOS 3.0 or later. Ridiculous.
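For what it's worth, this is exactly the mechanism Android uses: apps declare their requirements in the manifest, and the Market filters them out for incompatible devices. A sketch of such a manifest fragment (element names are real; the specific values are illustrative):

```xml
<!-- Hypothetical AndroidManifest.xml fragment: devices lacking a
     camera, or running an older platform version, never see the app. -->
<uses-feature android:name="android.hardware.camera" android:required="true" />
<uses-sdk android:minSdkVersion="4" />
```

The "ridiculous" requirements are just these declarations surfaced to the user instead of being silently filtered.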
It sounds like you're implicitly demanding something unreasonable. Or am I really missing something?
Also, for the record, I just switched my iPhone 3G for a Samsung Galaxy S Captivate. It suits me very well, and most of the apps I had from the App Store were also available on the Android Market. I don't miss anything from the iPhone. At all. I have the same apps, and much, much more interesting ones.
I mean, I'm not trying to be argumentative here, because it all boils down to personal preference, but honestly, let's face it -- the App Store may have thousands of apps, but most of them are useless toys, so that figure sounds very inflated to anyone who has actually owned iDevices. Certainly the core group of actual applications is much smaller. I'm not specifically talking about games here, because I still don't think that's the main use of the device, just a nice added bonus.
However, I do agree that the iPad is a much better device than the alternate offerings. I am still debating the actually