
Comment Re:Yes, but... (Score 2) 149

The problem with software development is that unless you've done that exact same task before, you really have no idea what's involved. And if you HAD done that exact task before, you wouldn't need to do it again, as you could reuse most of your previous work. Unlike with, say, constructing a building, once software is well built, it doesn't have to be built a second time, at least within the same company, or if it's open source.

Management is also to blame on occasion. I put together a schedule for a videogame project for a major publisher, and the schedule was rejected as not detailed enough. They wanted finer-grained breakdowns of tasks, so instead of one- to two-week tasks, they wanted one- or two-day tasks. The only problem: the game wasn't even designed yet - we had only a rough idea of the genre and the licensed property we were using. So, someone (not me, thankfully) dutifully put together a bullshit schedule with fine-grained bullshit tasks, and as the due dates arrived, we simply checked off those tasks in our official project management software.

In the meantime, we had our own spreadsheet with our real tasks and timelines that we used internally, although we tried to match up major milestones as best we could. Since it was a hard deadline, we finished the core game systems as soon as possible, ruthlessly cut extraneous features, and still delivered on time. I'm sure the publisher's producers still think it was their detailed scheduling that kept everything on track.

Comment Re:(sigh) You people still think you're engineers (Score 1) 665

Sure, that's a good point. And like I said, if that works for you, I've got no beef with it. Me, I program videogames for a living, so I think the title "programmer" or even the more specific "videogame programmer" works better for me.

Language changes over time, whether people like it or not. "Hacking" now means computer crime to most people ("cracking" never caught on), "literally" now also means "not literally", and 99% of the population will use "begging the question" incorrectly, no matter how many times you-few-who-know-better correct them.

It's probably inevitable that "engineer" will come to encompass more than the traditional engineering professions. I look at the list, and I notice "Management Engineering". Who knew that was a thing?

Comment Re:(sigh) You people still think you're engineers (Score 1) 665

Not all of us incorrectly use that title. As a rule, I always call myself a "programmer" (or these days, I guess a "senior programmer" is more accurate) rather than a "software engineer". I think it's a more honest description of what I do.

I don't get my panties in a wad about what other programmers call themselves, but I can understand why certified/licensed engineers don't appreciate the watering down of a title they worked hard for. I guess it's the same sort of annoyance programmers feel when someone calls HTML a "programming language".

Comment Re:I couldn't get past "how do you write a game"? (Score 3, Insightful) 408

As a professional videogame programmer, I can assure you that I haven't heard functional programming discussed much at work or among other peers in the industry. Videogames are giant, data-intense state machines, with lots and lots of state to track and manage. My feeling is that it's really not a great fit for functional programming. Traditional object-oriented programming is *heavily* used (C++ is the de facto industry standard language), because that's a reasonable and proven way to encapsulate complex, independently operating entities into well-behaved packages of data + code.

It's the same reason that the industry doesn't extensively use unit testing for engines and game code (not broadly, at least), which may surprise some programmers in other fields. The reason is simple: many game engine tasks can't be boiled down into simple algorithmic data transformations that can be checked and verified by a simple function. For instance, how would one write a unit test to ensure audio is playing correctly, or a graphics shader is working as intended, or a physics engine "looks" correct in realtime interactions? It's not really practical. Thus, videogames tend to rely on integration testing by QA team members to spot anomalies.

In short, not every programming paradigm can be effectively applied to every problem. That's not to say you *couldn't* write a game purely in a functional language, of course. I just don't think you'd be working to FP strengths.

Comment Re:What's changed? (Score 5, Interesting) 312

Probably true, but even as someone who is likely your political polar opposite, I've always found your arguments to be consistent and well thought-out, even if I don't necessarily agree with all your positions or conclusions. For some reason, I think it's easier to remember a single negative moderation or hateful comment rather than a dozen encouraging responses or positive mods.

Unfortunately, many people use the relative anonymity as an excuse for venting their own frustration, intentionally lashing out at others with caustic remarks or outright trolling. I've found that viewing such people with pity rather than frustration makes dealing with them easier. What sort of person feels the need to lash out at others online? It's sort of pitiable, and I tend to think "how crappy is your life that online trolling is how you choose to interact with others?"

I'm not sure there's any solution, other than ignoring the trolls and trying to set a good example yourself.

Comment Re:It has its uses (Score 5, Interesting) 408

Few "in the OOP world" (whatever that means) promote inheritance as the be-all and end-all these days. I think that went out of style fifteen or twenty years ago. The notion of favoring composition over inheritance whenever possible has its own Wikipedia entry, and was described in detail in the famous Gang of Four "Design Patterns" book.

That being said, there are times when reality intrudes on "theoretically" clean designs or programming paradigms. Functional programming and unit testing are things you don't see widely used in the videogame development world, at least that I've seen. Not all paradigms and patterns apply to all types of problems. Ultimately, I think that's the most valuable thing I've learned over time: use the tools and techniques most appropriate to the problem at hand. Religious wars over programming techniques and methodologies are for pedantic fools.

Comment Re:Pascal, “not clean”??? (Score 1) 628

I agree with you about the cleanliness of Pascal. But just FYI, the current macOS (OS X) is mostly written in C, assembly, and Objective-C, since it's based on BSD. You're probably thinking of the original Mac OS, but from what I understand, it's not *quite* correct that it was Pascal either. Instead, they hand-translated some existing Pascal routines from the Apple Lisa into assembly to save on memory. So, it sounds like it was the Apple Lisa that largely used Pascal.

Also, it's wildly incorrect to say there were "no exploits". All modern operating systems have had LOTS of exploits over the years, because very early on, security wasn't even considered. Probably not quite as many as Windows, but certainly not zero.

Comment Re:Fortran (Score 1) 628

Same for me initially: my first language was AppleBASIC, which I taught myself thanks to a wonderful book that almost seemed written for kids (which I was at the time). I spent endless evenings at my parents' workplace using that computer, and eventually it became mine, since I was the only one really using it. Glorious days!!!

Fast forward a number of years, and I learned Pascal using the wonderful Turbo Pascal compiler from Borland, followed shortly after by C++, followed by C (an atypical learning order for most). I was at a community college, and took C++ as a self-study course, which I dove into with relish. Again, I more or less taught myself.

At university, I learned a bit of SML, which I didn't really care for, probably because it was so different from what I was used to, and thus a bit harder to pick up. Worse, it had a nearly non-existent library at the time, making the most trivial of problems extremely difficult to solve. Writing a recursive function just to loop over something still seems insanely over-complicated to this day, more suited to an academic exercise than to a real production environment.

As a videogame programmer, I mostly stuck with C++ for quite a few years, but eventually also learned Lua for scripting and C# for tools. Just in the last year I picked up Python for my contract work, a language I enjoy very much.

Comment Re:Time to switch (Score 2) 216

It feels like this is intended to spark some sort of frothing outrage, but it doesn't sound all that unreasonable. So... the free business OneDrive and Skype service that came with their product ends when mainstream support for that product also ends, three years from now? Yeah, well... okay? Pony up and pay for a service to store your documents online somewhere. It's really not all that expensive. And generally speaking, it's foolish to count on a "free" cloud service lasting forever. Hell, even *paid* services will disappear if they're not profitable enough.

It certainly doesn't affect me with my single license of Office. I don't even use OneDrive features, preferring to use AWS for backups, since they only charge you for the storage you actually use. Last month I paid 10 cents. S3 is stupidly cheap for storing documents and source code backups, since that takes up very little space.
