I disagree that it's "more readable" in all cases. If I'm chaining together a couple of commands, piping the output from one to the next, in *nix I can see them all in my terminal window, while with long options from PowerShell the earlier parts of the line are no longer visible. And no, that's not a corner case, that's a common use case.
It is more readable for scripting though.
The truncate option seems the worst of all possible worlds, on the other hand. It breaks scripts when new functionality is added, so -Rec is no longer unambiguous. Plus I do -R, you do -Rec, someone else does -Recurs, someone else types it out the whole way. Ugly to maintain over time.
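To make the breakage concrete, here's a minimal Python sketch of prefix matching in the style PowerShell uses to resolve abbreviated parameters (the parameter names are made up; `-Recount` stands in for whatever new option a future release might add):

```python
def resolve(prefix, params):
    """Resolve an abbreviated option against a cmdlet's parameter list,
    prefix-matching style: exactly one match wins, otherwise it's an error."""
    matches = [p for p in params if p.lower().startswith(prefix.lower())]
    if len(matches) == 1:
        return matches[0]
    raise ValueError(f"-{prefix} is ambiguous or unknown, candidates: {matches}")

# Today a script using -Rec works fine...
v1 = ["Recurse", "Path", "Force"]
print(resolve("Rec", v1))  # Recurse

# ...until the cmdlet grows a hypothetical -Recount parameter,
# and every script that abbreviated to -Rec breaks.
v2 = ["Recurse", "Recount", "Path", "Force"]
try:
    resolve("Rec", v2)
except ValueError as e:
    print(e)
```

The fragility is structural: an abbreviation is only valid relative to today's parameter list, so adding a parameter is a silent breaking change for every script that truncated.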
I think I agree and disagree with what you say.
In the "traditional Unix" department, I'm strongest with AIX. It's IBM's flavor, and it only runs on their hardware. They do a LOT to make the HW and SW work hand in hand. They have a lot of options to prevent problems or reduce them to non-disruptive (for the end user). They have a lot of admin options that don't require reboots. For virtualized systems the default setup has "dual VIOS" (Virtual I/O Servers), so you can muck with them, including rebooting, while keeping your virtual machines ("LPARs") purring and happy. If you need big boxes, especially if you are scaling UP instead of OUT, they'll give you a level of uptime that's better than Linux can, because of (a) fewer administrative disruptions, (b) fewer unplanned outages causing disruptions, and (c) running only on dedicated hardware, which avoids all the extra issues that come with generic gear.
However, if you've got an application that scales OUT, Linux can give you better total uptime simply because between hardware and OS licensing you can afford to have more of them. If I'm looking at 50 IBM POWER systems running AIX vs. 120 Linux blades or pizza boxes, and taking any one of them out still leaves your apps available with just a little hiccup for the users who were on that host, that's hard to beat.
These examples both assume that you know what you're doing. Anyone can muck up a machine, being more expensive doesn't protect against that.
It took me a few years to discipline myself to include testing and bug fixes in any estimate I made to managers. Whenever I would say, "I'll finish coding by X," they would always assume that it would be in release condition by then.
Just give them a breakout like that in the first place.
Technical Documentation: Z1
User Documentation (or working with a tech writer): Z2
Training (train-the-trainer hopefully): +W
All of them broken down. You look professional, and they have no reason not to include those steps.
I know I suck at doing development estimates.
A struggle is getting people to even agree on what a development estimate is:
Me: "That will take 2 months of development work."
[two months of interruptions, putting out fires and "prioritization" later]
Other: "Why is this not done? You suck at development estimates."
Then make sure you're not surprising them at the end of 2 months. If at the end of week 1 you go to them with "I got two days against the project this calendar week; we still have 38 more to go," they get in the groove that project time and calendar time aren't the same. And if they want them to be, they need to stop you from getting interrupted.
Communication. Verrrrrrry important.
I want to estimate conservatively, but project schedules don't allow for that.
Then your boss/team lead/project manager is doing it wrong. They need accurate information first. They may come back at a later point because they need to change the timing, but not letting you give them accurate information from the beginning doesn't bode well.
There are lots of methods out there to lead projects and software development. Let me focus on classic project management. To (over-)simplify, the idea is to get an accurate idea of how long each piece of the puzzle takes, what resources (time and materials) it needs, and what dependencies there are. Once the project manager has that, they can chart out the critical path (the tasks where any delay adds time to the project). If that doesn't match what the stakeholders want, then something has to give. Commonly it could be more resources devoted to particular tasks, it could be lowering acceptable quality, it could be pushing out the timeline.
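The critical-path idea is just a longest-path computation over the task dependency graph. Here's a toy Python sketch (the task names and durations are invented for illustration):

```python
from functools import lru_cache

# Hypothetical project: task -> (duration in days, list of prerequisite tasks)
tasks = {
    "spec":    (5,  []),
    "code":    (20, ["spec"]),
    "docs":    (5,  ["spec"]),
    "test":    (10, ["code"]),
    "release": (2,  ["test", "docs"]),
}

@lru_cache(maxsize=None)
def finish(name):
    """Earliest possible finish day for a task: its duration plus the
    latest finish among its prerequisites (the longest chain wins)."""
    duration, deps = tasks[name]
    return duration + max((finish(d) for d in deps), default=0)

project_length = max(finish(t) for t in tasks)
print(project_length)  # 37: spec -> code -> test -> release is the critical path
```

Notice that "docs" finishes on day 10, well before "test" on day 35, so delaying docs a bit costs nothing, while any slip on the spec/code/test/release chain pushes the whole project out day for day. That's exactly the information the project manager needs before negotiating with stakeholders.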
Really, if they've told you how long you have before you estimate it, they aren't doing their job of managing the project and instead are pushing it off onto you, without giving you the authority to fix more than your part of it. It's a recipe for missed deadlines. And it's all-too-common.
So you should be able to estimate conservatively, and then it's their call if they need that part faster. If they do, they need to be willing to put more resources towards it or accept a reduction in quality to get it done on time. Or to revamp the requirements. Or jigger other parts of the schedule so you can start sooner.
Other methods have entirely different ways to estimate. Planning Poker http://en.wikipedia.org/wiki/Planning_poker has adherents, and a big point behind it is that after listening to the story for a piece of work, without discussion (and therefore influence), everyone puts out an estimation card and turns it over at the same time. This allows everyone's estimate to be heard. From there, the people with the high and low estimates talk about why they think it will run long or short, and then everyone goes again until there is consensus. It's not trying to match a project plan, but to come up with an accurate estimate in the first place.
One of the recurring issues I see with specs is differing assumptions. When someone knowledgeable about some operational part of your business talks about a program doing "X", there's a huge amount of context that goes with it, which may not be shared by the development team (or, in rarer cases, your QA team).
As a perhaps too-obvious example, in the US if you're dealing with shipping weights you may not consider that you need to specify a field for units and be able to do lb/kilo conversions. You assume you're going to get "a weight" and need to calculate something at a "rate per pound." If it's not specified that the weight could be in other units, a developer may never consider the possibility, while to someone who deals with imports all day it's equally obvious that weights could come in imperial or metric.
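A tiny Python sketch of what the spec gap costs (the rate and field names are invented; the point is that rejecting an unspecified unit loudly beats silently assuming pounds):

```python
LB_PER_KG = 2.20462  # pounds per kilogram

def shipping_cost(weight, unit, rate_per_lb):
    """Normalize the weight to pounds before applying a per-pound rate.
    An unknown unit raises: an unstated assumption in the spec should
    surface as an error, not as a silently wrong invoice."""
    if unit == "lb":
        pounds = weight
    elif unit == "kg":
        pounds = weight * LB_PER_KG
    else:
        raise ValueError(f"unknown weight unit: {unit!r}")
    return pounds * rate_per_lb

print(shipping_cost(100, "lb", 0.50))            # 50.0
print(round(shipping_cost(100, "kg", 0.50), 2))  # 110.23
```

The same 100-unit weight differs by a factor of 2.2 depending on the unit, which is exactly the kind of error that sails through development and QA if nobody wrote the assumption down.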
Are things like this someone's responsibility to detail and catch? Of course. Can you assume that they will all be caught? Only if you believe in 100% uptime for computers, too.
I haven't seen a "perfect, one-size-fits-all" answer to specs. I have seen a wide range, from "complete and unchanging" http://en.wikipedia.org/wiki/Waterfall_model specs that ended up not fitting needs, to very iterative processes that made it nigh impossible to determine costs (and therefore evaluate against ROI and make a business case to do anything). I can say that how I plan for things depends on whether we're trying to automate a known process, which is more static, or build new tools with new workflows, where as people see what it's capable of there will always be the next step of "oh, and if it can do THIS, could it do THAT as well?"
Hopefully it'll end up available on DVD eventually, for us poor GNU/Linux users who are not worthy enough for Netflix (or: to any Netflix engineers reading, make it work).
You know, I get Netflix on a bunch of embedded systems in my house: one TV, two refurbished Blu-ray players I got from Woot! I'd be surprised if none of them ran Linux. Oh, and my Kindle Fire gets it, and it's Android, so there's Linux underneath.
Really, you felt an article about a new show was improved with a "pity me, I'm persecuted because I use Linux" whine? It's like complaining that your electric toothbrush doesn't run Linux, but you hope Colgate will fix it. I love Linux; that doesn't mean I expect one tool to do everything. Well, Perl, but that's a different story. (And a quick search on CPAN shows WWW::Netflix::API. I'm scared.) Buck up, put out a good story, and if your editorializing doesn't add to it, leave it out.
Circling back around, I was just thinking about rewatching B5; it's been a couple of years. I'm looking forward to this. I wonder how long it will take to produce and when it will come out.
Wow, sounds like you've had a string of bad luck.
I'm a PC gamer too. I spend $600 every three or four years on my desktop machine. I'm not super-high-end by any means, but I'm better than the consoles in terms of graphics, CPU and memory. I usually buy an existing system to get a new OS license and then upgrade the video and memory. I used to build my own systems, but I don't have any good places nearby to get parts, and per-part shipping costs killed buying over the internet. Ah, I wish I lived near a Fry's. Occasionally I'll extend a purchase out longer with some minor upgrades during its lifespan.
I do have to patch Windows, though.
I can't speak to your dongle problems, but since you can't find anything on the internet about them, and we all know how people like to complain on the internet, I can't think it's widespread.
With the exception of the Wii and the Kinect, I find that mouse+keyboard gives me more than any console controller out there. I'm sure that experience varies for others. On the other hand, I wish my monitors were as big as my TV.
Where my use of consoles exceeded PCs was where I had a bunch of people together and we could all do something together.
In the datacenter, applications *shouldn't* care about one OS instance rebooting. In practice, many do, and companies find it cheaper to just fix it at the platform layer. I think this mindset is decreasing, however, so the datacenter uptime issue becomes less and less important.
The number of applications we host that actually allow farms and tolerate a single-server issue with low impact is dwarfed by the number of monolithic applications that either need to run on a single box, or have single points of failure even when partially spread across multiple boxes.
What you say should be the case, but so many applications don't support it or if they do, don't support it right. We're changing that at the operations level as much as we can, but for some applications there's no opportunity to run multiple active shared instances.
I'd spend say 80% of paperback price to buy an ebook - there's still editing, marketing, and other costs, not just printing and distribution.
But for most popular authors, I *can't* BUY an ebook. I can LICENSE an ebook.
When it's mine, meaning I can move it to other devices, give it away, lend it out, and otherwise have it completely safe from change or removal by the publisher or distributor, then I've bought it.
When comparing the value I get from purchasing a paperback (my preferred size, easy to hold and stick in a pocket) vs. licensing an ebook, an ebook license is worth $2 to me. Which is a crime, because a well-written, well-edited book should return more than that to the author and publisher, but that's how greatly the licensing baggage reduces the value.
I've been a long-time Kindle owner, but except for gifts I rarely buy books from Amazon. There are other places out there that are DRM-free. And I'll support Baen publishing because they supported me with the Baen Free Library for older books in series.
Dropbox, like any service that has servers based in the US, could have everything seized fairly easily. Or it could just turn everything over to a proper law enforcement request, as its TOS stipulates. Given the antagonistic relationship between Kim Dotcom and US law enforcement, Mega would probably resist much harder. That's got to appeal to some.
I would argue that if you don't want these techniques made open in court, then don't use these techniques to pursue someone who will have charges brought up in court. "National security investigation techniques" implies that this was a case of national security. Seems a bit petty to use those resources to defend Disney, pop music, and porn.
I'd love to see a judge say: "Look GCSB, if you can show that Dotcom was a known national security threat to NZ while you were doing this, we can talk about keeping some things off the table. But if you can't, then you weren't doing 'national security' operations and no 'national security' techniques should have been used, and I want to see every single bit that's even tangentially related to this and enter it into the public record."
What I'm interested in is how the collection of communication you have ties together. Say a call comes in to the receptionist, how does she transfer it to you? How do you transfer a call to a coworker, or conference them in on an existing call?
Do you have separate work and personal mobile phones, so when on vacation (or after hours for those that don't have to pick up anyway) you can leave the work phone behind? And of course set your voicemail message to direct callers to the appropriate coworkers who are covering.
That's what I like about a wealth of communication devices: they each have their own strengths. My company desk phone is VOIP, connects using our intranet to all of the other branches (including internationally), and has robust transfer, conference and directory features as well as a host of features my mobile does not. In the same way, my mobile has features my desk phone doesn't, and we have corporate IM and email as well, letting me use the best tool for the job.
Agreed. And the states change their tax laws fairly often when you consider all the states together. My wife used to work for a place where this was her primary responsibility. Just to clarify, we're looking at a startup having a dedicated person just to track and pay state sales tax. Or paying a third party to do it for them, which may be cheaper, though that includes both the costs of the third party, the costs of integrating them into your sales channels (say, to provide the correct sales tax on web pages), plus any time spent by members of the startup in finding, setting up, administering and reviewing this.
Yeah, big barrier to start-ups, because it's just as much work for one sale in a state as for one hundred thousand.
If they also standardized all state sales taxes it would be less of a barrier to entry. Same rate, same things they apply to, same payment schedule, no "tax reduced economic areas", etc. Of course, then it becomes cheaper in total to purchase from outside the US (hello, global economy), and good luck, Rhode Island, going after a web site elsewhere in the world for state sales tax. So we'd have to put a tariff on all international sales to balance it out. We'd still be left with US products more expensive for Americans than for those not living in the States, so maybe charge everyone else that amount too. Woo! Oh wait, no.
A rock store eventually closed down; they were taking too much for granite.