Slashdot is powered by your submissions, so send in your scoop

 




Comment Re:I couldn't get past "how do you write a game"? (Score 3, Insightful) 380

As a professional videogame programmer, I can assure you that I haven't heard functional programming discussed much at work or among peers in the industry. Videogames are giant, data-intensive state machines, with lots and lots of state to track and manage. My feeling is that that's really not a great fit for functional programming. Traditional object-oriented programming is *heavily* used (C++ is the de facto industry standard language), because it's a reasonable and proven way to encapsulate complex, independently-operating entities into well-behaved packages of data + code.
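To make "packages of data + code" concrete, here's a rough C++ sketch (all names hypothetical, not from any real engine) of a game entity that owns and mutates its own state each frame:

```cpp
// Hypothetical entity: encapsulates mutable state (position, velocity,
// health) behind a small interface, advanced once per frame -- the
// classic OOP pattern games lean on.
class Entity {
public:
    Entity(float x, float y) : x_(x), y_(y) {}

    // Advance the entity's state by dt seconds.
    void update(float dt) {
        x_ += vx_ * dt;
        y_ += vy_ * dt;
    }

    void setVelocity(float vx, float vy) { vx_ = vx; vy_ = vy; }
    void damage(int amount) { health_ -= amount; }

    float x() const { return x_; }
    float y() const { return y_; }
    int health() const { return health_; }

private:
    float x_, y_;
    float vx_ = 0.0f, vy_ = 0.0f;
    int health_ = 100;
};
```

The whole game then becomes a big loop calling update() on thousands of these, each mutating its own state in place; that mutation-heavy design is exactly what sits awkwardly with pure FP.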

It's the same reason the industry doesn't make extensive use of unit testing for engines and game code (not broadly, at least), which may surprise programmers in other fields. The reason is simple: many game engine tasks can't be boiled down to algorithmic data transformations that a small test function can check and verify. For instance, how would one write a unit test to ensure audio is playing correctly, or that a graphics shader is working as intended, or that a physics engine "looks" correct in realtime interactions? It's not really practical. Thus, videogames tend to rely on integration testing, with QA team members spotting anomalies.

In short, not every programming paradigm can be effectively applied to every problem. That's not to say you *couldn't* write a game purely in a functional language, of course. I just don't think you'd be working to FP strengths.

Comment Re:What's changed? (Score 5, Interesting) 279

Probably true, but even as someone who is likely your political polar opposite, I've always found your arguments to be consistent and well thought-out, even if I don't necessarily agree with all your positions or conclusions. For some reason, I think it's easier to remember a single negative moderation or hateful comment rather than a dozen encouraging responses or positive mods.

Unfortunately, many people use the relative anonymity as an excuse for venting their own frustration, intentionally lashing out at others with caustic remarks or outright trolling. I've found that viewing such people with pity rather than anger makes dealing with them easier. What sort of person feels the need to lash out at others online? It's sort of pitiable, and I tend to think, "how crappy is your life that online trolling is how you choose to interact with others?"

I'm not sure there's any solution, other than ignoring the trolls and trying to set a good example yourself.

Comment Re:It has its uses (Score 5, Interesting) 380

Few people "in the OOP world" (whatever that means) promote inheritance as the be-all and end-all these days. I think that went out of style fifteen or twenty years ago. The notion of eschewing inheritance whenever possible ("composition over inheritance") has its own Wikipedia entry, and was described in detail in the famous Gang of Four "Design Patterns" book.
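As a hedged illustration of what composition over inheritance looks like in practice, here's a rough C++ sketch (hypothetical names, essentially the GoF Strategy pattern): instead of a deep Monster-inherits-from-GameObject hierarchy, the entity *has* a behavior object.

```cpp
#include <string>

// Small behavior interface; concrete behaviors override it.
struct MoveBehavior {
    virtual ~MoveBehavior() = default;
    virtual std::string move() const { return "walk"; }
};

struct FlyBehavior : MoveBehavior {
    std::string move() const override { return "fly"; }
};

// The entity composes a behavior ("has-a") rather than inheriting
// a new subclass for every movement variant ("is-a").
class Monster {
public:
    explicit Monster(const MoveBehavior& mb) : move_(mb) {}
    std::string move() const { return move_.move(); }  // delegate
private:
    const MoveBehavior& move_;
};
```

Swapping a monster from walking to flying is then a matter of composing in a different behavior, with no change to the Monster class itself.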

That being said, there are times when reality intrudes on "theoretically" clean designs or programming paradigms. Functional programming and unit testing are things you don't see widely used in the videogame development world, at least that I've seen. Not all paradigms and patterns apply to all types of problems. Ultimately, I think that's the most valuable thing I've learned over time: use the tools and techniques most appropriate to the problem you're trying to solve. Religious wars over programming techniques and methodologies are for pedantic fools.

Comment Re:Pascal, “not clean”??? (Score 1) 617

I agree with you about the cleanliness of Pascal. But just FYI, the current macOS (OS X) is mostly written in C, assembly, and Objective-C, since it's based on BSD. You're probably thinking of the original Mac OS, but from what I understand, it's not *quite* correct that it was Pascal either. Instead, they hand-translated some existing Pascal routines from the Apple Lisa into assembly to save on memory. So it sounds like it was the Apple Lisa that largely used Pascal.

Also, it's wildly incorrect to say there were "no exploits". All modern operating systems have had LOTS of exploits over the years, because very early on, security wasn't even considered. Probably not quite as many as Windows, but certainly not zero.

Comment Re:Fortran (Score 1) 617

Same for me initially: my first language was AppleBASIC, which I taught myself thanks to a wonderful book that almost seemed written for kids (which I was at the time). I spent endless evenings at my parents' workplace using that computer, and eventually it became mine, since I was the only one really using it. Glorious days!

Fast forward a number of years, and I learned Pascal using the wonderful Turbo Pascal compiler from Borland, followed shortly after by C++, followed by C (an atypical learning order for most). I was at a community college, and took C++ as a self-study course, which I dove into with relish. Again, I more or less taught myself.

At university, I learned a bit of SML, which I didn't really care for, probably because it was so different from what I was used to, and thus a bit harder to pick up. Worse, it had a nearly non-existent standard library at the time, making even trivial problems difficult to solve. Writing a recursive function just to loop over something still seems insanely over-complicated to this day, more suited to an academic exercise than to a real production environment.
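For anyone who hasn't used an ML-family language, here's the kind of "recursion as your only loop" I mean, sketched in C++ for familiarity rather than SML:

```cpp
#include <cstddef>
#include <vector>

// The obvious imperative loop.
int sumLoop(const std::vector<int>& xs) {
    int total = 0;
    for (int x : xs) total += x;
    return total;
}

// The recursive equivalent a pure-functional style pushes you toward:
// peel off the head element and recurse on the rest.
int sumRec(const std::vector<int>& xs, std::size_t i = 0) {
    if (i == xs.size()) return 0;         // base case: nothing left
    return xs[i] + sumRec(xs, i + 1);     // head + sum of the tail
}
```

Both compute the same thing; the recursive version just demands you restate the iteration as a base case plus a self-call, which is the part that always felt like ceremony to me.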

As a videogame programmer, I mostly stuck with C++ for quite a few years, but eventually also learned Lua for scripting, C# for tools, and, just in the last year, Python for my contract work, a language I enjoy very much.

Comment Re:Time to switch (Score 2) 214

It feels like this is intended to spark some sort of frothing outrage, but it doesn't sound all that unreasonable: So... the free business OneDrive and Skype service that came with their product ends when mainstream support for that product also ends three years from now? Yeah, well... okay? Pony up and pay for a service to store your documents online somewhere. It's really not all that expensive. And generally speaking, it's stupid to count on a "free" cloud service lasting forever. Hell, even *paid* services will disappear if they're not profitable enough.

It certainly doesn't affect me with my single license of Office. I don't even use OneDrive features, preferring to use AWS for backups, since they only charge you for the storage you actually use. Last month I paid 10 cents. S3 is stupidly cheap for storing documents and source code backups, since that takes up very little space.

Comment Re:Attitudes (Score 1) 81

"The Cloud" is overhyped beyond belief these days, but it really does have a place for specific tasks.

For instance: backups and offsite storage. Yes, make a local backup, but you need offsite as well. It's just *stupid* not to use a service like Amazon S3 or Glacier for this. Or, if you need a turn-key solution for a bit more money, Carbonite, etc.

Scalable loads are another one. You can rent HUGE numbers of CPUs to crunch all sorts of data, or push all sorts of traffic to load-test systems. Owning all that hardware yourself would be ridiculously cost-prohibitive.

The "cloud" is just another tool for your use. No tool is perfect for everything, but nearly every tool is perfect for something.

Comment Re: Make America Great (Score 4, Insightful) 619

Only a few people are saying "scrap the H-1B program." My current programming lead is from Europe, and I'd guess he's here on an H-1B or something similar, but most of our people are from the US. We're glad to have him, and I don't feel he's necessarily displacing any qualified workers. We've had positions open for many months, and it's extremely difficult to find qualified people to fill them. There are plenty of tech workers in the US, but often you need people with very specific qualifications (no, real ones, not made-up shit).

But there's also no doubt that the program needs cleaning up to prevent some of the rampant abuse that's gone on. It's CLEARLY being abused by many corporations looking to save money by "outsourcing" without the downsides of the workers residing in another country.

Comment "Disruptive" (Score 5, Interesting) 56

Smelling some serious Silicon Valley marketing bullshit here. Damn, the whole summary is a buzzword-bingo paradise. Question: what AI *isn't* "headless"? He just means "non-customer-facing", right? Don't overload the term in confusing ways.

Am I just grumpy this morning, or is this summary as asinine as it seems to me right now? Not only is this schmuck cranking the AI hype-train whistle up to 11, he's not nearly as insightful as he thinks he is. Um, yeah, no kidding that AI (a fancy term for advanced data analysis, mostly) will be useful to businesses, as opposed to the stupid chat bots now being shown off. And yeah, it seems pretty obvious it's not going to put everyone out of work (at least not in the short term), but will just be another tool people use.

Maybe I'm just getting tired of the over-hyping of AI. Almost makes me miss the 3D printing fad.

Comment Apple ][+ (Score 3, Interesting) 857

48KB RAM + 16KB extended, two floppy disk drives, green monochrome CRT display, and a joystick. It was amazing at the time, at least to me.

My parents bought it for their business, but they never really used it, and it eventually became mine. I learned how to program on that computer using AppleBASIC. I also learned that line numbers suck for programming, and that they only went up to 32767 (a limit I hit on one of my bigger projects). I eventually learned why there was such a "strange" limit after I learned about binary numbers.
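That "strange" limit is 2^15 - 1, the largest value a signed 16-bit integer can hold, assuming the interpreter kept each line number in a single signed 16-bit word. A quick C++ sketch of the arithmetic:

```cpp
#include <cstdint>

// If a line number lives in one signed 16-bit word, one bit holds the
// sign and 15 bits hold the value, so the maximum is 2^15 - 1.
constexpr int kMaxLineNumber = (1 << 15) - 1;
static_assert(kMaxLineNumber == 32767, "signed 16-bit maximum");
static_assert(kMaxLineNumber == INT16_MAX, "matches <cstdint>'s limit");
```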

Favorite games: Choplifter, Wizardry, Karateka, Aztec, and a few adventure games I can't remember the names of.

Comment Re:He is an idiot... (Score 1) 305

I wonder when Congresscritters will learn to stop talking about the internet? Any time they do, with a rare few exceptions, it simply makes them look profoundly ignorant and/or out-of-touch.

I have no doubt that this Senator never touches the internet himself, not directly at least. It's nice to have a room full of government-paid staff members to use the internet for you. I mean, we all have that, right?

Comment Re:/. won't either (Score 5, Insightful) 448

It would be funny, but then you're just playing BK's marketing game. There would be headlines AGAIN about Google doing that, which just gives them more publicity. How many marketing campaigns end up with several Slashdot headlines (along with coverage from plenty of other big-name media outlets)?

The worst thing that could have happened to BK is for this story to be ignored. The way they figure it, the longer they can keep this in the news, the more successful the marketing campaign. The faux anger will dissipate in fairly short order, but we're all still thinking about BK's Whoppers in the meantime.

Comment Re:/. won't either (Score 5, Insightful) 448

Let's face it: from a marketing perspective, this is a huge success for BK. A relatively small number of people were *actually* negatively affected, and I'd bet very few regular BK customers will actually STOP going there as a result. But for a single commercial, a huge number of people are now talking about BK and Whoppers. Even better, some people are shifting blame to Google for the insecurity of those voice interfaces. It's highly unlikely any negative legal consequences will come from this, either.

Whichever sociopathic marketing asshole came up with this ploy is probably getting a big raise this year.
