...you definitely want
:O WHAT?! Ok, it's pretty good, but still... textmate deserves a mention as well.
And honestly, who cares what led to its discovery?
Me me me me me me me!!!! And anyone else who is interested in making more discoveries.
No, that's how discovery works. People are mucking about doing something and notice something else happening. "Hmm, that's interesting". The science part is often narrowing down a) what's happening, b) what's causing it, c) how to get it to happen by itself.
True, but I think the point is that it could work differently. If we had a more thorough understanding of neurology, we could pinpoint the precise electro-chemical reactions that need to take place. Then we could follow these pathways and determine precisely what was preventing them. Then we would search for the type of chemical/drug/gene that would have the desired effect, AND we would have a pretty damned good idea what other side effects that treatment should have (or how to prevent them).
I'm not saying biology is there yet. Just that I hope to see that in my lifetime.
So what is your point? That there are still 'Mac OS X' server OSes around? Or do you really want to claim that there are morons using Mac OS X Server editions to run CGI bash scripts from an Apache web server?
What's YOUR point? Yes, there ARE OS X servers running publicly accessible, vulnerable software. I am not only claiming it but stating I personally know this to be true. And no, I'm sure as hell not going to name names or give you more details than that.
Running shell scripts from a web server as CGI scripts is simply idiotic, regardless of what flaws the shell might have.
I already said the developers (and companies still using OS X Server) were being stupid. That is irrelevant.
So picking on Apple because a fix is a day later than the hot debian or ubuntu distro is just brain dead.
Apple was too slow. Being days late matters. This isn't kindergarten, and nobody is "picking on Apple". We do need to be honest and critical. And anyone with half a brain should interpret this as one more piece of evidence that Apple is lackadaisical about servers.
Thank you for the well-thought reply, and sorry for the slow one on my end. I was afraid I wouldn't get to this before commenting was closed (again).
I am half playing devil's advocate, half serious. I am not entirely opposed to prioritizing protocols (say UDP over TCP), provided it's done fairly and in a reasonable, objective manner.
However, this still seems to shift the responsibility and open numerous vectors for abuse. If my neighbor decides to run a call center from home, and use 50mbps of VoIP, and my cable provider oversubscribes their node, is all of my traffic constantly throttled? If my ISP also offers TV streaming over RTP, but a competitor uses UDP, the ISP now has an excuse to "prioritize" their own service and harm competitors.
On a sidenote, I don't particularly want my ISP or any of their intermediaries deciding my skype call or streaming video is more important than a deliverable I need to upload over SFTP by 10pm. I'm not a network engineer, but it seems like it would be pretty easy for them to give me 5mbps, 15ms latency, etc. to the appropriate peering. If peering/backbones/whatever are that congested that often... maybe we can address that instead?
I think we both agree, at least, that ISPs have conflicts of interest and should not be trusted.
It was ad after ad for movies from ten years ago.
The worst are the ads telling you not to pirate movies. Since you're seeing the ad, I think it'd be safe to assume you didn't pirate it. Because if you did pirate the movie, you certainly wouldn't be seeing that useless crap.
The stupidity just boggles the mind sometimes.
It's actually kind of brilliant. They want their remaining paying customers to be afraid to pirate. To think it's difficult, immoral, and dangerous. To believe they made the right choice. Bonus points: make them feel superior to those who do pirate.
They should probably include a short video of an unattractive geek working really hard to hack something, followed by an image of a SWAT team kicking down a door and killing his puppy before arresting him.
You and I know somewhat what a REASONABLE set of rules might be, but GP is right as to the draft language. It basically said every packet has to be treated the same. As to company A and company B, if company A is a hospital and company B is a Nigerian prince, that's a difficult situation to write legislation for. Is it okay to deprioritize email from known spammers and allow the email from a search and rescue team to go through first? That's not allowed if the rule is "all users must be treated the same."
I don't see how "the ISP should treat every packet the same" is unreasonable. The ISP should guarantee latency, throughput, jitter, availability, etc. per their SLAs. The end user can do their own QoS and decide whether they want netflix or remote robotic surgeries to take priority. If the user needs a stronger guarantee, they should get a better connection with a better SLA. None of this is illegal or unreasonable.
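The idea of user-side QoS above can be sketched as a toy scheduler: the user, not the ISP, ranks their own traffic classes. The class names and priority values here are made-up illustrations, not any real protocol or tool.

```python
import heapq

# Hypothetical user-chosen priorities: lower number = sent first.
PRIORITY = {"surgery": 0, "sftp": 1, "netflix": 2}

def schedule(packets):
    """Return packets in the order a user-controlled scheduler would send them.

    Ties are broken by arrival order (the enumerate index), so traffic of
    equal priority stays first-come, first-served.
    """
    heap = [(PRIORITY[app], i, app) for i, app in enumerate(packets)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

print(schedule(["netflix", "sftp", "surgery", "netflix"]))
# -> ['surgery', 'sftp', 'netflix', 'netflix']
```

In practice this role is played by something like Linux `tc` on the user's router; the point is only that the prioritization decision can live at the edge rather than inside the ISP.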
How about ads? On a slow wireless link, is it okay to deliver the text of a web page before the ads from DoubleClick ? They are both http web traffic.
Data should be delivered as determined by the client and the server, not the ISP. I'm not a web developer, but I suspect any real browser will load title, layout, text, then images.
Administrators making case-by-case decisions can make reasonable decisions in most cases. Coming up with simple rules deciding what admins must do in all cases for the next 20 years is much trickier, especially for bureaucrats who don't know the tech as well.
Thanks for proving my point. Local administrators should be allowed to prioritize their own networks. ISPs coming up with simple rules that override admins is much trickier, especially for large media companies with conflicts of interest. I mean, bureaucrats.
A lot of us use OS X for server work. I get a real terminal (though I really just need ssh and scp), can use nearly every tool I can on Linux, and I'm not stuck with the *cough* horrendous Linux desktop experience.
Plus, I get the added bonus of being able to ARD mac systems, test AFP shares from servers that use them, and run Win and Linux VMs. The only way to run all three without wasting a lot of time is on a Mac.
That is not going to happen to any private Mac user who isn't running Apache etc., hasn't activated CGI scripts, and doesn't have a router configured to route port 80 traffic to their Mac.
In other words, the thousands of businesses and people using xserves or OS X Server to host various sites/apps with the OS-included software?
Sorry, this "Apple is late" mantra is simply bullshit.
Apple is late. Stupid though it may be, many people are using OS X for servers. Apple once sold these servers, catered to this market, and offered enterprise support. Apple didn't even bother to release a patch for 10.6, even though it is still in use on most of these servers.
Apple completely dropped the ball.
Thanks! Just ordered a set...
Are you talking about the War in Iraq, which Obama boasted continuously about ending, despite loud criticism at the time that he was creating the conditions for what's going on right now with ISIS?
I wouldn't be boasting about that anymore, his related words are now one of those things his opponents publish on Twitter so as to illustrate how incompetent he is.
So you're telling me we wouldn't be at war now if only we hadn't ended the war? It's not enough that my friends did 5-10 tours? How many more did you want us to do?
Read the other comments in this article that point out all the pros. I love md and lvm, but they are little league compared to ZFS.
Hell, just the snapshotting alone. User accessible previous copies of files!
It works if and only if the target system is also using LSI RAID controllers.
In the business world where you don't change the underlying OS on a critical system just because you feel like it, it's pretty easy to make sure the target hardware meets the spec.
In the business world, if you don't have the scale and expertise to build your own cluster, you use real enterprise gear in redundant configurations. Whether NetApp/EMC or ZFS on qualified hardware.
If availability isn't important to you, and you can afford to keep spare controllers on hand so you don't have to wait days to source a compatible controller 5 years from now... fine, use LSI. But don't pretend it's somehow smarter to use HW RAID on a critical system.
Your examples strike me as extraordinarily simple. Are these the things you're actually filtering out applicants on? I figure we're talking junior admin work here, but still..
I am fairly secure in my current position, but I occasionally contemplate becoming a fulltime Unix admin to make my life easier. However, all the postings I see have high listed requirements (e.g. 3-5 years experience in an environment with over 1,000 servers). I figured it was really that competitive these days, even for junior positions.
Or is there something else at play here? Are those posted requirements generally bullshit? Are you going for undermarket salaries? Is your organization somehow unique?
Could you please point out the benefit for US American programmers of a job they don't get hired for being in the US compared to a job they can't get hired for abroad?
A US job that they don't get hired for still:
1) reduces competition for other jobs
2) increases wage competition for skilled workers
Both of which benefit the person who did not get the local job.
I suspect you're right about price fixing. However, the fact that someone in the economy has to pay a large sum of real money is irrelevant in determining cost-benefit.
Yes, it's real money. But so are labor costs. And, in theory, those labor costs represent [a portion of] the real value that person is adding to the economy. So anything that makes the employee able to add value more efficiently is overall good.
In general, an employee would not be earning $200/hr on a $7,000 workstation if they weren't adding more than $200/hr of value to the economy in some way. So making them more efficient either allows them to add more value, or gives them more free time to do other things (which tend to benefit the economy and society as a whole).
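The break-even math behind that claim is simple enough to sketch. All of the numbers below are assumptions for illustration (the $200/hr and $7,000 figures from above, plus a guessed speedup), not data from any real pricing:

```python
# Back-of-the-envelope break-even for a workstation premium.
hourly_value = 200.0   # assumed value the employee adds per hour ($)
premium = 7000.0       # assumed extra cost of the faster workstation ($)
speedup = 0.05         # assumed fraction of working time saved (5%)

value_gained_per_hour = hourly_value * speedup      # $10/hr
break_even_hours = premium / value_gained_per_hour

print(break_even_hours)
# -> 700.0, i.e. roughly four months of full-time (40 hr/week) work
```

Under these assumptions, even a modest 5% efficiency gain pays off the premium within a year, which is why people in this position will happily pay for minor speed increases.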
So maybe there is collusion, price gouging, artificial shortages, or something going on... but I know people who would gladly pay a huge premium for minor speed increases. And that really drives development, which should ultimately benefit the home user.
Serving coffee on aircraft causes turbulence.