
Comment Commitment to stability (Score 1) 149

So, is that like a Python or D commitment to stability, or a C/C++ level commitment to stability? Exactly how committed are they to preserving backwards compatibility through hell and high water? Because that's why people trust C/C++ - they know that the language committees are not going to suddenly "fix" the language by making billions of lines of code obsolete, simply because it was written fifteen years ago, before a bunch of shiny new features were added.

I think widespread adoption is going to remain limited until it becomes clear how Mozilla plans to shepherd and develop this language as well. Will it become standardized? Will mature cross-platform tools become available? What will the performance penalties be compared to optimized C or C++ code?

I wish Rust all the best. It's going to be difficult to unseat C/C++ simply due to inertia, as well as convincing programmers of the merits of an entirely new language, but I like the idea of a memory-safe language that doesn't require a runtime or use managed memory.

Comment Re:An Old Story (Score 1) 386

A lot of people besides you call C++'s backward compatibility "baggage", but its unwavering backwards compatibility, both with itself and (mostly) with C, is one of the cornerstones of its success and longevity, at least in my opinion. Maintaining strict compatibility, even when the result is ugly, is one of the smartest things the C++ committee has done.

When a language breaks compatibility, it tends to split the developer community (e.g. Python 2 vs Python 3), since people or organizations then have to decide whether a costly upgrade or rewrite is worthwhile. With C++, we can rest assured that even older C and C++ projects will continue to compile with modern compilers just fine, and we can move ahead with improved, modernized code at our leisure. This assurance of backwards compatibility gives confidence when investing many millions of dollars in a codebase.
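To make that concrete, here's a toy illustration (my own contrived example, not from any particular codebase) of what the guarantee buys: decades-old C-style code and modern C++ can coexist in the same translation unit, so legacy code keeps compiling unchanged while new code is written in the modern style.

```cpp
#include <cstdlib>
#include <cstring>
#include <string>

// 1990s-style code: C headers, malloc, manual string handling.
// A current C++ compiler still accepts this without complaint.
char* legacy_duplicate(const char* src) {
    char* dst = static_cast<char*>(std::malloc(std::strlen(src) + 1));
    std::strcpy(dst, src);
    return dst;
}

// The modernized equivalent, written today in the very same file:
// value semantics, no manual memory management, nothing to free.
std::string modern_duplicate(const std::string& src) {
    return src;
}
```

Both functions link into the same binary, which is exactly why a million-line project can modernize incrementally instead of all at once.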

Sure, backward compatibility definitely comes with downsides, such as the difficulty in creating effective tooling for C++ source code or slow compile/link speeds, but I have to laugh when people suggest breaking compatibility or switching languages as a serious alternative. When you're working with projects that have hundreds of thousands or even millions of lines of code, you don't just rewrite that stuff at the drop of a hat.

Comment Re:An Old Story (Score 2) 386

C++ isn't about productivity. It's really about performance. If your application isn't performance-critical, you probably don't have a good reason to be using C++. And actually, nowadays it's a lot easier to write much safer C++ than previously, especially with recent changes to the language. Overall, the language is a pretty good tradeoff between performance, productivity, and safety, with emphasis on the performance, of course. I'd imagine that's why game developers use C++ almost exclusively, at least for larger AAA games.
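As a rough sketch of what "much safer C++" means in practice (the names here are made up for illustration): smart pointers make ownership explicit so leaks and double-frees become structurally difficult, and range-based for loops eliminate manual index arithmetic, all without giving up the zero-overhead performance the language is used for.

```cpp
#include <memory>
#include <vector>

// A stand-in resource type; in a real engine this might wrap a GPU handle.
struct Texture { int id; };

// The caller receives a unique_ptr: ownership is explicit, and the resource
// cannot be leaked or double-freed by accident, unlike a raw `new` return.
std::unique_ptr<Texture> load_texture(int id) {
    return std::make_unique<Texture>(Texture{id});
}

// Range-for over const references: no index arithmetic to get wrong.
int sum_ids(const std::vector<std::unique_ptr<Texture>>& textures) {
    int total = 0;
    for (const auto& t : textures) total += t->id;
    return total;
}
```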

On the other hand, game development tools are often written in languages such as C# (plus a mix of various other languages), because productivity is typically more important than performance for those types of applications (after all, you can just buy faster machines for your developers).

So... why not write games / game engines in Rust, D, or some other arguably "better" language?

Simple answer: inertia. Because there's an entire industry's worth of game developers who know C++ extensively (like myself). Our libraries and game engines are all in C++, and are battle-hardened and well-optimized. Who wants to rewrite all of that? Sample code is in C++. Libraries are C or C++. Etc, etc. C++ is often derided for its flaws, but it's a language that actually gets *used*, and like it or not, being reasonably popular as a language is also a merit. It means that every major platform to which you might want to port a game has a modern C++ compiler and C libraries to link against, and mature development and debugging tools.

It's kind of a crappy chicken-and-egg problem for new languages to go up against, but that's the reality.

Comment Re:Swift is destroying Rust. (Score 1) 270

Really? Everything I dug up during research said you pretty much had to interop with Objective-C, since Swift classes are exported as Objective-C classes. I'm not doubting you, as I'm sure there's some horribly ugly way to do it, given that we're talking about C here.

Still, for my purposes, it doesn't really matter that much. I'd still prefer a "clean" interop, since I'll be doing a lot of that.

Comment Re:Swift is destroying Rust. (Score 1) 270

Speaking of C++... I'm currently learning Objective-C rather than Swift. Why? Because all I want is a thin interop layer between my cross-platform C++ code (the bulk of my game engine and game code) and the operating system APIs. Objective-C can interop with C or C++ fairly easily, while Swift can only interop with Objective-C.
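A minimal sketch of what such a thin interop layer might look like (every name here is hypothetical, not from any real engine): the C++ engine hides behind an extern "C" facade, which a plain Objective-C .m file can include and call directly, since Objective-C is a superset of C.

```cpp
// engine_bridge.h (hypothetical): a plain C facade over the C++ engine.
// An Objective-C file can include this header and call these functions
// directly; no Objective-C++ is required on the platform side.
#ifdef __cplusplus
extern "C" {
#endif
void engine_init(int width, int height);
void engine_tick(double dt);
int  engine_frame_count(void);
#ifdef __cplusplus
}
#endif

// engine_bridge.cpp (hypothetical): the C++ side of the facade. Internally
// it is free to use any C++ it likes; only the C surface is exposed.
namespace {
int g_frames = 0;  // toy stand-in for real engine state
}
void engine_init(int /*width*/, int /*height*/) { g_frames = 0; }
void engine_tick(double /*dt*/)                 { ++g_frames; }
int  engine_frame_count(void)                   { return g_frames; }
```

On the Objective-C side, the view controller or app delegate would simply include engine_bridge.h and call engine_tick() from its render loop, keeping the platform-specific layer as thin as possible.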

Frankly, I wish I could use Swift instead; Objective-C's syntax takes some real getting used to.

Comment Re:Short Answer (Score 1) 276

When I mentioned "entertainment platform", I was just talking about videogames or other forms of entertainment you'd find on a standalone PC, not a PC acting as a media center. I probably should have been more specific.

Honestly, I've never considered PCs a good fit in the living room. As far as I can tell, people who set up their own media or entertainment PCs connected to a TV have always been in the "enthusiast" camp, and probably are a fairly small minority. It's far easier nowadays to just subscribe to Netflix and stream to your TV or console or set top box than to set up a PC as a media device.

So, I totally agree with you there. The living room will likely remain the domain of purpose-built hardware for the foreseeable future.

Comment Re:Not for animals or locations (Score 1) 186

You know what's going to happen of course, right? The official name will get completely ignored, and 99% of the world will know it by its unique, catchy, culturally insensitive, and politically incorrect common name. Like it or not, someone will come up with a much catchier name for the disease in question, the media will pick up on it, and it will unofficially be known by that name forevermore.

Language is hard to corral by rules, as anyone who rages against slang words being added to the dictionary well knows.

Comment Re:Short Answer (Score 1) 276

I happen to make videogames for a living, so I'm keeping a pretty close watch, thank you. PCs are still very much a factor in gaming largely thanks to MMOs, MOBAs, some die-hard industry hold-outs, and a gazillion indie and retro titles. Steam, GoG, and Origin are still going strong as the industry shifts to downloadable titles en masse. The PC is no longer the dominant gaming platform across the board, of course, but the death of PC gaming has been predicted every year for the last decade or so. It's getting old.

Do you remember how everyone was predicting the death of the videogame console near the end of the last generation of consoles, right when smartphone games were seemingly all the rage? It was equally rubbish. We now know that smartphone gaming didn't replace an existing market, but instead opened up a new one. People can't seem to understand the simple concept that some form factors work better for some types of applications than others.

Comment Re:Not convinced (Score 1) 408

I just recently saw a tragic story in the local news about a father who accidentally ran over and killed his own toddler while backing out of his driveway. That's probably one life that would have been saved and a family spared a tragedy had the vehicle been autonomous - or even had autonomous emergency braking.

Whenever I see these ridiculously improbable scenarios about a computer having to choose x or y, I have to remind people that everyday sorts of accidents (which typically involve mistakes in human judgement or limited perception) are far more likely to occur than these hypothetical scenarios. For the first generation of autonomous vehicles, there's no reason to make them 100% autonomous anyhow. Just alert the driver to take control in the rare cases the computer can't figure out what to do, such as when being directed by a policeman.

Let's not let perfect be the enemy of good.

Comment Re:The version number is dead... (Score 1) 154

If Microsoft is releasing "backwards-incompatible" changes to Windows 10 (probably meaning new APIs and features are available), then presumably Windows 10 users will be able to upgrade and patch their systems. This isn't completely unprecedented. For many years, if software was compatible with Windows XP at all, it was generally "XP Service Pack 2 or later". Windows 10 will probably be the same way. For better or worse, if you want to run modern software, you'll likely need to keep your system up to date, which will ensure you have the latest patches and any new APIs (like new versions of .NET, etc).

I don't see a corporate user typically wanting to use a 2016 version of Windows 10 in 2017, as this would also imply their OS is a year behind in security patches. Why would they not want to patch up their system? Corporations would only do that if there were some breaking issue in a patch, and MS tends to fix those reasonably quickly on the rare occasion they happen. Moreover, if the software was written before the point when they froze the image, it's guaranteed to be compatible. New software written after that point would only be incompatible if it made use of new APIs released in 2017. Stuff like that doesn't generally happen accidentally.

I suppose the big assumption I'm making is that there aren't going to be any radical internal changes going forward (like the new driver model in Vista) that would cause incompatibility problems, or upgrades that would significantly increase the baseline hardware requirements, at least not while it's still called "Windows 10" - and that hasn't been the trend since Vista anyhow. If they DO make such changes, then they'll have to start identifying those release points in some other fashion.

Of course, this is all speculation until we get a clearer picture of what Microsoft's strategy is with Windows 10 and how it works in practice. Honestly, though, I can't think of how this would be really hard to manage in practice. I guess we'll have to see.

Comment Re:Short Answer (Score 1) 276

The desktop is going away in the home.

Partially correct. However, don't forget the viability of the PC as an entertainment platform. A PC has a form-factor that makes it optimal for many types of games that no other form factor can really match. And people who create any sort of digital content, whether it's professional or a hobby, will probably want a PC at home for the task. Plugging a tablet into a docking station doesn't necessarily make it a PC-like system suitable for anything but the simplest content creation tasks.

I think what's more appropriate to say is that a PC won't be *required* in the home for people to use e-mail, Facebook, and light consumption-based tasks. For casual users or non-tech types, a tablet will be just fine. However, I still think many people will choose to have a PC in the home for some time to come, simply because of their sheer versatility.

Comment Re:Linux rootkit (Score 1) 67

Is anyone actually confused that you might be talking about Android when you say "Linux"? I really doubt it. I'm not going to call Linux "GNU/Linux", and neither are most people, because it's just too damned awkward. There's already enough confusion among many non-tech people about the distinction between Linux the kernel and the myriad Linux-based distributions.

Anyhow, I tend to disagree with Stallman 9 times out of 10, but I don't have any problems giving him props for what he's done and what he believes in. He's certainly been consistent in his message at least.
