Comment Not true (Score 2) 487

Yeah right dude, Steve Jobs certainly knows nothing about marketing.

Let's face it, the best tech companies out there are run by tech guys.

Bob Lutz is dead-on. Several companies I've worked for were run by the sales guys, and they ran the businesses into the ground by focusing on short-term profits and letting their products languish until they became irrelevant. They failed to see that investing in good technology and having a vision would ultimately have won them an even larger market share. Just look at Apple under John Sculley: sure, he got the company back on solid ground, but he couldn't sustain it because he wasn't a visionary tech guy. When they brought Jobs back, we all got iPods, iPhones, MacBook Airs and iPads. The PC world has been transformed, and Apple's market cap now exceeds Microsoft's.

Comment Customers don't demand quality (Score 1) 495

But the software only has to be 'good enough' for people to buy it, so there's no ammunition for developers to use to get a better schedule.

I've reflected on this problem quite a bit, and I can't seem to get past "Customers are getting what they are demanding," which is to say "not much."

None of the contracts or RFPs I see demand performance measures, quality measures, detailed functional requirements, etc. Nor do I see customers diving into the product and really trying it out before they buy, or comparing the competing products hands-on. Our customers purchase based on limited demos run by the vendors themselves, and those making the purchasing decisions don't have enough experience to thoroughly evaluate a product, nor do they seek the expertise of those who can. In the end, purchase decisions are made with very weak product knowledge, and the resulting contracts are weak on details (but strong on delivery dates).

This results in frustration all around. Vendors have to deliver what the customer wants, but the customer can't figure out what that is until after they buy the product, try it, and complain it doesn't work the way they want. Customers unwittingly purchase vaporware from vendors. The development teams are asked to deliver functionality with little or no guidelines. Target dates for delivery are off by months. In my opinion, this happens simply because the customer wasn't clear on what they wanted, didn't make those demands explicit in the contract, and didn't make a knowledgeable purchase decision.

Which leads to no time for testing, compressed timelines, no ammunition to do it right, no clear budget or tracking of the costs, and on and on.

At least in my field.

Comment Learning BASIC led me to a CS degree (Score 2) 709

I learned to program in BASIC, on an Apple ][+, back in the early 80's when I was 10 or 11. I loved it, but I started wondering how programs like word processors could access a large document in RAM, and work with files bigger than available memory, and other mysteries...which led me to learn C (with a classic Borland C compiler) at 15, and eventually to a CS degree.
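
For illustration, here's a rough sketch in plain C of the "bigger than available memory" trick that puzzled me back then (the filename bigfile.txt is just a stand-in): process the file in fixed-size chunks so only a small window of it sits in RAM at any time. Real word processors use fancier structures (piece tables, memory-mapped files), but the core idea is the same.

/* Count lines in a file of arbitrary size using a small, fixed buffer.
 * Only buf[] is ever held in memory, no matter how large the file is. */
#include <stdio.h>

int main(void) {
    FILE *f = fopen("bigfile.txt", "rb");   /* stand-in filename */
    if (f == NULL) {
        perror("bigfile.txt");
        return 1;
    }

    char buf[4096];              /* the only part of the file held in RAM */
    size_t n;
    unsigned long lines = 0;

    while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
        for (size_t i = 0; i < n; i++)
            if (buf[i] == '\n')
                lines++;
    }

    printf("lines: %lu\n", lines);
    fclose(f);
    return 0;
}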

In my case, BASIC (and I did LOGO too) didn't ruin me, it made me more curious and moved me into the more complex languages. When I got to college, data structures class was a piece of cake, as I'd already done linked lists and other structures while learning C, and I could easily deal with pointers and pointer arithmetic, multiple indirection, function pointers, and more. I feel a debt of gratitude to the humble BASIC language.
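
As a sketch of the kind of C exercise I mean (not code from back then, just an illustration): a singly linked list built from malloc'd nodes, inserted through a pointer-to-pointer, and walked with a caller-supplied function pointer.

#include <stdio.h>
#include <stdlib.h>

struct node {
    int value;
    struct node *next;
};

/* Insert at the head via a pointer to the head pointer (multiple indirection). */
static void push(struct node **head, int value) {
    struct node *n = malloc(sizeof *n);
    n->value = value;
    n->next = *head;
    *head = n;
}

/* Walk the list, applying a caller-supplied function pointer to each value. */
static void for_each(struct node *head, void (*visit)(int)) {
    for (struct node *p = head; p != NULL; p = p->next)
        visit(p->value);
}

static void print_value(int v) { printf("%d\n", v); }

int main(void) {
    struct node *head = NULL;
    for (int i = 1; i <= 3; i++)
        push(&head, i);

    for_each(head, print_value);   /* prints 3, 2, 1 */

    while (head != NULL) {         /* free the nodes */
        struct node *next = head->next;
        free(head);
        head = next;
    }
    return 0;
}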

Just a couple weeks ago, I started teaching my son Apple BASIC from a web-based Apple BASIC emulator, hoping that he'll be as excited about programming as I was.

Education

Why Teach Programming With BASIC? 709

chromatic writes "To answer the perennial question 'How can we teach kids how to program?', we created a web-based programming environment. As we began to write lessons and examples, we surprised ourselves. Modern languages may be powerful and useful for writing real programs, but BASIC and Logo are great languages for demonstrating the joy of programming."

Submission + - Ask Slashdot: Where are my 16-core CPUs? (wikipedia.org)

t'mbert writes: We've been told that computer science needs to prepare for the multicore future, that the number of cores will roughly follow Moore's Law, and that we'll end up with 1024 cores before we know it. But that doesn't seem to be happening. Intel's Core 2 Quad was released in January 2007. Here we are in 2010, and we're just now starting to see 6-core systems. Moore's Law should have us at 16 cores by now. What gives?
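
As a back-of-the-envelope check on that claim (the doubling period is the assumption doing all the work), here's a quick projection starting from the quad-core parts of early 2007:

#include <stdio.h>
#include <math.h>

int main(void) {
    const double base_year = 2007.0;   /* Core 2 Quad launch */
    const int base_cores = 4;

    /* Try two assumed doubling periods: 18 months and 2 years. */
    for (double period = 1.5; period <= 2.0; period += 0.5) {
        double doublings = (2010.0 - base_year) / period;
        double cores = base_cores * pow(2.0, doublings);
        printf("doubling every %.1f years -> ~%.0f cores by 2010\n",
               period, cores);
    }
    return 0;
}

Doubling every 18 months lands right at 16 cores by 2010; doubling every two years gets you only to roughly 11, so the answer depends heavily on which cadence you assume.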

Comment Re:Lack of Open, Accessible Standards (Score 1) 660

I second this! This is exactly what's keeping most businesses I talk to from using it. Because of the lack of standards, either your implementation isn't compatible with everyone else's, or it is only because you expose a bunch of really complex options that only PhDs understand.

This also produces fear: IT management doesn't understand all those options or the implications of choosing one over another, and doesn't want to be held accountable for encryption that doesn't work.

Until there is a simple, uniform and free way to implement certificate authentication, it's just going to wallow.
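
To make the "complex options" point concrete, here is a minimal, hedged sketch of just the certificate-verification setup with OpenSSL in C (assumes OpenSSL 1.1 or later; ca-bundle.pem is a made-up filename). Even the minimal path forces choices about trust anchors, verification depth, and failure handling, which is exactly the complexity I'm describing.

#include <stdio.h>
#include <openssl/ssl.h>
#include <openssl/err.h>

int main(void) {
    SSL_CTX *ctx = SSL_CTX_new(TLS_client_method());
    if (ctx == NULL) {
        ERR_print_errors_fp(stderr);
        return 1;
    }

    /* Trust anchors: one of the many knobs an admin has to get right. */
    if (SSL_CTX_load_verify_locations(ctx, "ca-bundle.pem", NULL) != 1) {
        ERR_print_errors_fp(stderr);
        SSL_CTX_free(ctx);
        return 1;
    }

    /* Require the peer to present a certificate that chains to our CAs. */
    SSL_CTX_set_verify(ctx, SSL_VERIFY_PEER, NULL);
    SSL_CTX_set_verify_depth(ctx, 4);

    printf("context configured; connect with SSL_new()/SSL_connect()\n");
    SSL_CTX_free(ctx);
    return 0;
}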

Comment Are you kidding? (Score 1) 596

This is what they were saying 2 years ago, that the megapixel wars were over, that the sweet spot was 8MP. HAH.

Remember when 22MP was what you bought in a $30,000 digital back camera? At the same time, the Canon 1D line was 10MP or so. Um, yeah that was only 5 years ago.

I say no, the megapixel wars are definitely NOT over yet.

The other thing mentioned here is a focus on quality. That's been happening hand-in-hand with the megapixel wars: the latest processors in the Digic line can handle more pixels, faster, and produce better-quality images at higher ISO, all at the same time. I would expect that to continue as well.

True, at some point physics will take over, as it was supposed to years ago in the microprocessor market, yet now we have 45nm processes that nobody dreamed of back then. The same will happen for CCDs and so on. Who knows where it will end.

Comment Re:SOA (Score 2, Interesting) 219

Ah, so it's a way to sell more machines to run more infrastructure software (also sold) which companies think will increase their scalability, which they don't really need because most of them are never going to have the amount of business that would force them to scale, where simple client-server software would suffice while they're going down the tubes.

Guess you've never been on the other side of that. The other side is a set of applications that are good enough to win your company the business, but that don't work together at all.

You can say "you should have thought about that in the first place, good design would have cleared that up" but good, thorough design that attempts to make everything work together flawlessly results in long development cycles and lost business.

Our company won: we built the best products and got the market share, and now we've got a set of applications that our customers expect to work together, and we're struggling to deliver that.

SOA is one way we can help with this problem. We can add SOA interfaces to each application's back end, one at a time, prove each one works, and then build meta-applications that combine the results, constructing the single integrated product our customers want, piecemeal and in an orderly way, without rewriting all our apps.
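
To be clear about what I mean by "add a SOA interface to a back end", here's a deliberately toy sketch using nothing but POSIX sockets (get_order_count and the JSON shape are invented for illustration): the existing back-end function stays untouched, and a thin service endpoint exposes it so other applications can call it. In practice this would sit behind a proper ESB or web-service stack rather than hand-rolled HTTP.

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>

/* Stand-in for an existing application's back-end logic. */
static int get_order_count(void) { return 42; }

int main(void) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    if (srv < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(8080);

    if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0) {
        perror("bind");
        return 1;
    }
    listen(srv, 8);

    for (;;) {
        int client = accept(srv, NULL, NULL);
        if (client < 0) continue;

        char req[1024];
        (void)read(client, req, sizeof req);   /* ignore the request details */

        /* Wrap the back-end call in a small JSON-over-HTTP response. */
        char body[128], resp[256];
        snprintf(body, sizeof body, "{\"orderCount\": %d}", get_order_count());
        snprintf(resp, sizeof resp,
                 "HTTP/1.0 200 OK\r\nContent-Type: application/json\r\n"
                 "Content-Length: %zu\r\n\r\n%s",
                 strlen(body), body);
        (void)write(client, resp, strlen(resp));
        close(client);
    }
}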

We built much of the software to handle this ourselves. There are OSS options for most pieces of this architecture if we wanted to use an ESB engine (check out Mule for example), and with our VM environment we should not need significant investment in infrastructure. We just need time to build it, and hence corporate wherewithal.

SOA (and ESB and the like) in and of themselves will not solve enterprise integration, any more than the EAI engines of 10 years ago did, but at least they provide a common technology to build around so that other developers can tap into the functionality of our applications.

Comment Same problem as C (Score 1) 963

The same kind of arguments were made in support of C: hard-to-maintain spaghetti code isn't the language's fault, it's the developer's. But the bottom line is this: if the language doesn't enforce some basic good coding practices, the code won't be easy to maintain. It's difficult, if not impossible, to hire the kind of talent across an entire organization that can keep the code maintainable and readable, and it's just as hard to review all the code for compliance with your development standards and to train junior engineers to do it properly. So simply having Perl or C code to maintain almost guarantees you harder-to-maintain code that has to be handled by your more senior developers. Those are exactly the developers you need on new projects that demand experience and out-of-the-box thinking. If they're tied up cleaning up spaghetti that other engineers produced, you lose productivity across the organization, and your senior guys want to quit.

Java, .NET and other shiny new languages help significantly in making software easier to develop and maintain. Even though I can write clean C code and handle pointer arithmetic and memory allocation just fine, that doesn't mean I can find all the people I need to guarantee that same level of craftsmanship across all my products.
