
Comment Challenge to other CEOs (Score 5, Interesting) 482

CEO salaries are getting ludicrous, and while his $1M is not that outrageous compared to others, it could still be translated into about 10 FTEs. Setting a minimum wage for his staff, and making sure they can all survive on what they make at their job, will translate into staff dedication that will be hard to put a price on.
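
(That "about 10 FTEs" is simple back-of-the-envelope math; the ~$100K fully-loaded cost per employee below is my assumption, not a figure from the story.)

    # Rough math behind "about 10 FTEs".
    # The fully-loaded cost per employee is an assumed round number.
    salary_given_up = 1_000_000
    cost_per_fte = 100_000
    print(salary_given_up / cost_per_fte)  # -> 10.0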

I think he's also thrown the gauntlet down to other CEOs, saying: "Dare you to join me!"

Comment Why is this a story? (Score 1) 94

AWS has been around long enough that this shouldn't be an issue. If a given architecture cannot survive the loss of a server, or of an entire availability zone, then the risk is no different than if the servers were in a locally-managed datacenter.

In short, if you don't take advantage of what the cloud has to offer in terms of redundancy, then don't expect zero downtime.
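
As a minimal sketch of what "taking advantage of the cloud" means here (boto3-style Python; the region, AMI ID, and instance type are placeholders, not recommendations):

    # Sketch: spread identical instances across availability zones so a
    # single-AZ outage doesn't take the whole service down.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    zones = [z["ZoneName"]
             for z in ec2.describe_availability_zones()["AvailabilityZones"]]

    for az in zones[:2]:  # one instance in each of two AZs
        ec2.run_instances(
            ImageId="ami-12345678",    # placeholder AMI
            InstanceType="m3.medium",  # placeholder type
            MinCount=1,
            MaxCount=1,
            Placement={"AvailabilityZone": az},
        )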

Comment Lock argument doesn't hold (Score 4, Insightful) 174

Let's face it: for all we know, the door lock manufacturers also have a master key to all our houses. The schematics and design of the lock are not publicly available, and most people lack the skills to judge whether the schematics they are looking at are secure. It's the same with an OS. And while I *could* take the lock apart and figure out how it works, I still wouldn't know whether my particular lock was secure, because I haven't seen enough locks to judge whether this one is any good.

Anytime this condition arises, we replace our own lack of knowledge with trust in experts. We have to defer the judgment of security worthiness to an expert we trust, in which case we are again disintermediated from knowing whether the lock is actually secure. We all trust *someone* with very specific knowledge to help us make decisions, whether that be medical, scientific, security or otherwise, and in each of those cases we can find examples of where the expert has let us down.

Comment This will not eliminate diagnosing issues... (Score 1) 39

As systems become more inter-connected and more dependent on standard components, they also become more difficult to diagnose. A problem in one seemingly benign part of the system can ripple through and render the whole thing unusable, and now those parts may be spread across virtual datacenters and servers. We need OS gurus now more than ever, but those gurus are also expected to know many different technologies (hence DevOps and other automation-oriented skills). It's the corollary to what's happening in development: one developer can build more software faster than ever before, but must also be knowledgeable in a wider range of technologies.

Comment Missing requirement for quality (Score 1) 349

OP mentions a few of the factors that help achieve better software (very good, motivated developers; an orientation toward quality; etc.). But the most important one was left out: customers willing to pay what it costs to get quality software, and able to spot high-quality software up front, during the sales cycle. Until that happens, quality will continue to be poor: as OP notes, the cost increase drives customers away from higher-quality products, and without the ability to recognize a better product before buying, customers will never pay the extra price with confidence that they'll actually get higher quality out of it.

Comment Not true (Score 2) 487

Yeah right, dude, Steve Jobs certainly knows nothing about marketing.

Let's face it, the best tech companies out there are run by tech guys.

Bob Lutz is dead-on. Several companies I've worked for were run by the sales guys, and they ran the businesses into the ground by focusing on short-term profits and allowing their products to languish and eventually become irrelevant. They failed to see that investing in good technology and having a vision would ultimately win an even larger market share. Just look at Apple and John Sculley: sure, he got the company back on solid ground, but he couldn't sustain it because he wasn't a visionary tech guy. When they brought Jobs back, we all got iPods, iPhones, MacBook Airs and iPads. The PC world has been transformed, and Apple's market cap exceeds Microsoft's.

Comment Customers don't demand quality (Score 1) 495

But the software only has to be 'good enough' for people to buy it, so developers have no ammunition to push for a better schedule.

I've reflected on this problem quite a bit, and I can't seem to get past "Customers are getting what they are demanding," which is to say "not much."

None of the contracts or RFPs I see demand performance measures, quality measures, detailed functional requirements, etc. Nor do I see customers diving into the product and really trying it out before they buy, or comparing the competing products hands-on. Our customers purchase based on limited demos run by the vendors themselves, and those making the purchasing decisions don't have enough experience to thoroughly evaluate a product, nor do they seek the expertise of those who can. In the end, purchase decisions are made with very weak product knowledge, and they create contracts that are weak on details (but strong on delivery dates).

This results in frustration all around. Vendors have to deliver what the customer wants, but customers can't figure out what they want until after they buy the product, try it, and then complain it doesn't work the way they want. Customers unwittingly purchase vaporware from vendors. The development teams are asked to deliver functionality with little or no guidance. Target dates for delivery are off by months. In my opinion, this occurs simply because the customer wasn't clear on what they wanted, didn't make those demands clear in their contract, and didn't make a knowledgeable purchase decision.

Which leads to no time for testing, compressed timelines, no ammunition to do it right, no clear budget or tracking of the costs, and on and on.

At least in my field.

Comment Learning BASIC led me to a CS degree (Score 2) 709

I learned to program in BASIC, on an Apple ][+, back in the early '80s when I was 10 or 11. I loved it, but I started wondering how programs like word processors could access a large document in RAM, work with files bigger than available memory, and other mysteries... which led me to learn C (with a classic Borland C compiler) at 15, and eventually to a CS degree.

In my case, BASIC (and I did LOGO too) didn't ruin me, it made me more curious and moved me into the more complex languages. When I got to college, data structures class was a piece of cake, as I'd already done linked lists and other structures while learning C, and I could easily deal with pointers and pointer arithmetic, multiple indirection, function pointers, and more. I feel a debt of gratitude to the humble BASIC language.

Just a couple weeks ago, I started teaching my son Apple BASIC from a web-based Apple BASIC emulator, hoping that he'll be as excited about programming as I was.

Education

Why Teach Programming With BASIC? 709

chromatic writes "To answer the perennial question 'How can we teach kids how to program?', we created a web-based programming environment. As we began to write lessons and examples, we surprised ourselves. Modern languages may be powerful and useful for writing real programs, but BASIC and Logo are great languages for demonstrating the joy of programming."

Submission + - Ask Slashdot: Where are my 16-core CPUs? (wikipedia.org)

t'mbert writes: We've been told that computer science needs to prepare for the multicore future, that the number of cores will roughly follow Moore's Law, and that we'll end up with 1024 cores before we know it. But that doesn't seem to be happening. Intel's Core 2 Quad was released in January 2007. Here we are in 2010, and we're just now starting to see 6-core systems. But Moore's Law should have us at 16 cores by now. What gives?
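
The arithmetic behind the question, assuming cores double every 18 months (one common reading of Moore's Law; the schedule below is my simplification):

    # Cores doubling every 18 months, starting from the
    # 4-core Core 2 Quad in January 2007 (a simplification).
    cores = 4
    for when in ("Jan 2007", "Jul 2008", "Jan 2010"):
        print(when, cores)
        cores *= 2
    # -> Jan 2007: 4 cores, Jul 2008: 8, Jan 2010: 16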

Comment Re:Lack of Open, Accessible Standards (Score 1) 660

I second this! This is exactly what's keeping most businesses I talk to from using it. Because of the lack of standards, your implementation either isn't compatible with everyone else's, or, if it is, it exposes a bunch of really complex options that only PhDs understand.

This also produces fear... IT management doesn't understand all those options or the implications of choosing one over another, and doesn't want to be held accountable for encryption that doesn't work.

Until there is a simple, uniform and free way to implement certificate authentication, it's just going to wallow.
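
To illustrate the "complex options" problem: even a minimal client-certificate setup with Python's standard ssl module forces several choices (verify mode, cert chain, which CAs to trust) that most IT shops can't confidently evaluate. The file paths below are placeholders:

    # Minimal server-side TLS context requiring client certificates.
    # Every line is a decision someone gets held accountable for.
    import ssl

    context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
    context.verify_mode = ssl.CERT_REQUIRED  # demand a client cert
    context.load_cert_chain(certfile="server.pem", keyfile="server.key")
    context.load_verify_locations(cafile="trusted-cas.pem")  # trusted CAs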

Comment Are you kidding? (Score 1) 596

This is what they were saying 2 years ago, that the megapixel wars were over, that the sweet spot was 8MP. HAH.

Remember when 22MP was what you bought in a $30,000 digital back camera? At the same time, the Canon 1D line was 10MP or so. Um, yeah that was only 5 years ago.

I say no, the megapixel wars are definitely NOT over yet.

The other thing mentioned here is a concentration on quality. That's been happening hand-in-hand with the megapixel wars: the latest in the Digic line of processors can handle more pixels, faster, and produce better-quality images at higher ISO, all at the same time. I would expect that to continue as well.

True, at some point physics will take over, as it was supposed to YEARS ago in the microprocessor market, but now we have 45-nanometer processes that nobody dreamed of. The same will happen for CCDs and so on. Who knows where it will end.
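
For a sense of where the physics eventually bites, here's rough pixel-pitch math (assuming a full-frame 36x24mm sensor; the megapixel counts are just illustrative):

    # Pixel pitch shrinks as megapixels rise on a fixed-size sensor;
    # visible light is ~0.4-0.7 microns, which is the hard floor.
    import math

    sensor_w_mm, sensor_h_mm = 36.0, 24.0  # assumed full-frame sensor
    for mp in (10, 22, 50):
        pitch_um = math.sqrt(sensor_w_mm * sensor_h_mm / (mp * 1e6)) * 1000
        print(f"{mp} MP -> ~{pitch_um:.1f} micron pitch")
    # 10 MP -> ~9.3, 22 MP -> ~6.3, 50 MP -> ~4.2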
