Comment: Re:Ah yes, for that we have D (Score 1) 93

by rdnetto (#47716021) Attached to: Interviews: Bjarne Stroustrup Answers Your Questions

I'm not the GP, but I thought I'd bite.
tl;dr: It's not perfect, but it's closer than you'd think.

D sounds like a neat language that I'll probably never be able to use. I'm a game developer, and C++ has a native compiler for every machine I would ever need my code to run on

DMD, LDC and GDC (the three most popular D compilers) work fine on x86 and x86_64.
LDC supports ARM and PowerPC with some issues; GDC apparently has better ARM support.

as well as an already mature ecosystem (engines, code libraries, sample code, all in C++).

D has very good interop support for C and C++ libraries, and there are a significant number of wrapper libraries in dub as well.
In general, C code can be used as is, while C++ libraries often need a wrapper to work around issues like templates.
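
To make the "wrapper" point concrete, here is a minimal sketch (with hypothetical names, not from any real library) of the usual approach: the C++ class is exposed through a flat C ABI, which D code can then declare with extern (C) and call directly.

#include <cstdio>
#include <string>

class Logger {                                   // the C++ library type being wrapped
public:
    explicit Logger(std::string name) : name_(std::move(name)) {}
    void log(const char* msg) { std::printf("[%s] %s\n", name_.c_str(), msg); }
private:
    std::string name_;
};

// Flat C ABI: D (or C) code can declare these and call them directly,
// treating Logger* as an opaque handle.
extern "C" {
    Logger* logger_create(const char* name)      { return new Logger(name); }
    void    logger_log(Logger* l, const char* m) { l->log(m); }
    void    logger_destroy(Logger* l)            { delete l; }
}

Templates are the part this doesn't cover: a template has no code until it's instantiated, so a wrapper generally has to pin down concrete instantiations behind plain functions like these.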

In fact, C/C++ is pretty much the only option I have if I want my code to be broadly portable.

Yes, C compilers exist for pretty much every architecture in existence, with C++ supported on most of them.
But this is a red herring, because the only instruction sets that really matter to someone in game development are x86, x86_64 and ARM. Whether or not you can compile your code for PIC is completely irrelevant. (That said, LDC uses LLVM for its backend, so it probably has the best chance of supporting unusual architectures.)

It's interesting how a lot of languages don't seem to worry too much about backward compatibility, because they want to focus on a clean and better language. Unfortunately, in the real world, there are always massive amounts of legacy code that need to continue to work alongside whatever new whizbang features are introduced, even at the expense of a cleaner or more elegant language.

If I had to give any one reason for C++'s success, it would be the standards committee's stubborn (and in hindsight, wise) refusal to "clean up" the language by removing crufty features and syntax, a lot of which were leftover from C. C++ code from 20 years ago still compiles today mostly unchanged, and that's incredibly important when trying to build up or maintain a large ecosystem. You can see what a huge split it causes in the community when a language breaks compatibility like Python did (2.x vs 3.x), and ultimately, I wonder if it's more damaging than C++'s more conservative approach. As a developer, I'd be hesitant to heavily invest in a language that is more likely to break compatibility and leave me stranded.

Backward compatibility is always an issue with any piece of software. That said, I think there's something to be said for handling breaking changes well as opposed to handling them poorly - anything which creates a rift in the community is obviously an example of the latter.
Python was an example of that, since it wasn't possible to combine code from old and new versions. While D had a breaking change with D2, there is only one person I am aware of who is still using D1. The standard library from D1 (Tango) was ported to D2, and the syntactic changes were fairly minor and easily remedied.
It's also worth noting that the D1 branch of DMD is still maintained, should you actually need to compile D1 code.

Pretty much every language is going to accumulate cruft over time. Even if D accumulates it at the same rate C++ did, its relative youth means that it will be much more pleasant to work with, since C++ will always have more. I think the only real way to completely remove all that cruft is to create an entirely new language - no one would have complained about Python 3 if it were marketed as a new language, rather than as a new version with breaking changes (Nimrod is an example of this). This is what D is to C++ - a language with equivalent power that wipes the slate clean.

Comment: Explanation & Thoughts (Score 1) 26

To put it simply, Kolab is a FOSS equivalent to Exchange. On the client side you can use Roundcube (a web UI), KDE Kontact, or anything supporting the IMAP-based protocol. It also supports ActiveSync for use with Android.

I set up Kolab 3.2 on a Debian server a while back because I wanted a centralised calendar, etc. that didn't require me to trust Google with my life. It's worked pretty well, apart from a few issues. Configuration is a little tricky, especially as SSL is not the default and there are three different places it needs to be enabled. There are some minor bugs and instabilities, though hopefully they have been fixed in 3.3. Synchronization between the Roundcube and IMAP clients can also be a little unreliable.

If anyone has any questions about it, I'll be happy to answer them.

Comment: Re:Surprise? (Score 1) 567

by rdnetto (#47710235) Attached to: Munich Reverses Course, May Ditch Linux For Microsoft

Which is the real issue in doing an office migration. That and replicating Outlook, I don't know about the whole kitchen sink but at least the whole mail/calendar/meeting bit. Somehow I'm amazed that in the last decade open source hasn't managed to pull it off, what the average office worker does is not rocket science. I guess it's just nobody's itch.

KDE Kontact is probably the best FOSS alternative to Outlook - it has email, calendar, contacts, todo lists, RSS feeds, newsgroups, etc.
There's a bunch of free alternatives to Exchange as well, though I'm less familiar with those.

Comment: Re:Is the complexity of C++ a practical joke? (Score 0) 425

by rdnetto (#47676183) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

A practical joke? Are you joking? C++ is not designed so that every feature must be learned and used. Its complexity derives from the fact that it supports OOP, functional programming, generic programming and I'm sure others that Bjarne would happily describe to you, along with the reasoning behind the features included in the language.

I disagree. D has feature parity with C++, but is significantly simpler and easier to learn. (This is supported by the fact that their eponymous books, The C++ Programming Language and The D Programming Language, are 1368 and 460 pages respectively.)
C++'s complexity arises from a combination of legacy features and inconsistency.
For example, it has two different kinds of enums - plain/C-style enums and enum classes. Enum classes were created because adding scoping to existing enums would have been a breaking change.
Inconsistency can be seen in things like the runtime initialization rules, where the contents of int x[10]; are uninitialized, but the contents of vector<int> y(10) are initialized to zero.
Essentially, poor decisions were made in the long history of C++, and while it's easy to say that hindsight is 20-20, the reality is that those bad decisions are constraining the direction the language is taking and increasing its complexity.
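
As a rough, self-contained illustration of both points (a toy example, not taken from any real codebase):

#include <vector>

enum Color      { Red, Green };     // C-style enum: enumerators leak into the enclosing scope
enum class Mode { Read, Write };    // scoped enum: must write Mode::Read, no implicit conversion to int

int main() {
    int x[10];                      // built-in array: contents are indeterminate (uninitialized)
    std::vector<int> y(10);         // vector: all ten elements are value-initialized to zero

    int c = Red;                    // fine: plain enums convert to int implicitly
    // int m = Mode::Read;          // error: scoped enums do not convert implicitly
    (void)x;

    return y[0] + c;                // y[0] is 0; reading x[0] here would be undefined behaviour
}

Two features doing essentially the same job with different scoping and conversion rules, and two containers with different default-initialization behaviour - that's the kind of case-by-case knowledge the language forces on you.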

There is an excellent talk by Scott Meyers discussing how inconsistent C++ is. These inconsistencies are responsible for the vast majority of the unnecessary complexity in the language.

Comment: Re:OK fine but give us a free CA (Score 1) 148

by rdnetto (#47632001) Attached to: Google Will Give a Search Edge To Websites That Use Encryption

The entire point of a CA is trust.

Agreed. But SSL is about encryption - authentication is merely an optional extra (if it weren't, self-signed certs wouldn't even be an option).
No intelligent person trusts the majority of websites, but they may still have valid reasons for not wanting their browsing habits eavesdropped upon.

Using a non-trusted CA would actually be a step backwards.

That depends on your priorities - on whether authentication or privacy is more important to you. Quite frankly, I find it hard to understand how encryption without authentication is worse than no encryption at all.

Even worse would be convincing people that manually installing a cert for a random website is a good idea.

Besides, I do believe that every single major browser now includes dire warnings if you go to a site with a cert from a non-trusted source.

Frankly, this is a usability problem. A user should not receive dire warnings for a self-signed cert; they should get some indication that it's inferior to a trusted cert, but that's it. (I like the red-yellow-green approach Chrome takes with the address bar.)
Dire warnings should be reserved for when a website's cert changes significantly, because that's the best indicator of malicious activity. Using them for self-signed certs just raises the false positive rate.

Certs are cheap. A quick Googling reveals a number of options for under $50/year

Cheap is relative. But more importantly, consider the implications of this. The web is slowly moving towards deprecating the use of unencrypted HTTP. Sure, it won't happen immediately, but it's going to happen sooner or later, especially given the way the IETF responded to the Snowden leaks. Meanwhile, CAs stand poised to charge an annual fee to anyone who doesn't want their site to be decorated by scary warnings. Stuff like this centralizes the internet and makes it more fragile and prone to interference by a single party. We need to be looking at more decentralized options, and making self-signed certs a viable choice is a good start to that.

Comment: Re:OK fine but give us a free CA (Score 1) 148

by rdnetto (#47628767) Attached to: Google Will Give a Search Edge To Websites That Use Encryption

So far all Google has said is that they will boost sites which use HTTPS - as far as I can tell, they haven't said anything about requiring the use of a trusted CA.
Self-signed certs are free, and just as effective as (if not more effective than) the paid ones if your goal is to prevent eavesdropping rather than to verify the identity of an unknown server. (Known servers can be reasonably expected to use the same certificate as last time, or at least the same CA.)

Given that the centralised CA model seems to have largely failed, not to mention how likely it is that this is being driven by the Snowden revelations, I wouldn't be surprised if this was the approach Google took.

Comment: Re:its why devs cringe. (Score 1) 180

by rdnetto (#47577967) Attached to: PHP Finally Getting a Formal Specification

Putting aside the whole whitespace debate(*), I'm pretty sure that python has its own list of issues.

My understanding is that Python's two biggest issues are a lack of static typing (justifiable, but annoying) and the fact that objects essentially behave as dictionaries. The latter causes significant issues when trying to optimize code, because something as simple as reading an attribute becomes a hashtable lookup.
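
A rough C++ analogy of the cost difference (a toy sketch of the idea, not how CPython is actually implemented): with static typing a member access compiles to a load at a fixed offset, whereas a dictionary-backed object has to hash the attribute name on every read.

#include <string>
#include <unordered_map>

// Static typing: the compiler knows the layout, so p.x is a single load at a known offset.
struct Point { double x, y; };
double static_read(const Point& p) { return p.x; }

// A Python-style object is (roughly) a string-keyed hash table,
// so every attribute read costs a hash plus a table lookup.
using DynamicObject = std::unordered_map<std::string, double>;
double dynamic_read(const DynamicObject& obj) { return obj.at("x"); }

int main() {
    Point p{1.0, 2.0};
    DynamicObject d{{"x", 1.0}, {"y", 2.0}};
    return static_cast<int>(static_read(p) + dynamic_read(d));
}

That per-read hashing is a large part of what JITs like PyPy spend their effort optimizing away.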

Comment: Re:Web = Garbage (Score 1) 315

by rdnetto (#47564391) Attached to: Programming Languages You'll Need Next Year (and Beyond)

It'll be interesting to see where C# is in 10 years. With Microsoft open sourcing it and the multiple compilers that recompile into different environments that are here and are being developed I'm wondering if it'll be everywhere or if it'll be fading away like VB is?

I don't believe C# will ever flourish outside of Windows, unless Microsoft starts supporting it on non-Windows platforms. Mono has had a lukewarm reception, and in my experience isn't as solid/reliable as .NET.

D, on the other hand, has similar syntax and is much more portable. I can easily see it overtaking C# if it gets some proper corporate support behind it...

Comment: Re:Finally! (Score 1) 474

by rdnetto (#47493323) Attached to: World Health Organization Calls For Decriminalization of Drug Use

As far as I can tell, the only issue with the workhouses was that they provided a better standard of care than the employed poor received. That should no longer be the case today.

The other issue was possibly that the profitability of running such a place was overestimated given how few "able bodied idlers" there were, but it seems obvious that such places would need to be state funded.

Comment: Re:The medium is the message (Score 1) 154

So surprise surprise VR goggles aren't turning out to be a screen you wear on your eyes but a whole new medium. I am willing to bet that there will be a genre that takes off on VR and that genre might not even really exist right now. Something really different.

I suspect they would work quite well for visual novels (or an evolution of them), since those are already set in the first person, but don't require moving around the way FPSs do. Not sure how the controls would work though...

Comment: Re:Fixed what seem like fundamental GUI bugs? (Score 1) 108

by rdnetto (#47472619) Attached to: KDE Releases Plasma 5

2. Can a single misbehaving plasmoid still cause the entire desktop to freeze? (This typically happens to me if the network connectivity is lost: poorly-written plasmoids that need network access can block and cause everything -- not just the plasmoid in question -- to freeze.)

I believe this is no longer the case. One of the big changes in Plasma 5 was rewriting the process model used for plasmoids. That said, I can't find a source to confirm this, and am too lazy to download and run one of the Project Neon ISOs.

Comment: Re:P2P helps movie buffs outside the US (Score 1) 214

by rdnetto (#47464307) Attached to: Economist: File Sharing's Impact On Movies Is Modest At Most

I'm no lawyer but I believe game translations are legal, as they are usually released as patches rather than redistributing the entire modified game.

From Wikipedia, citing the US Copyright Act:

A “derivative work” is a work based upon one or more preexisting works, such as a translation ... A work consisting of ... modifications which, as a whole, represent an original work of authorship, is a “derivative work”

Whether or not it's a patch is immaterial; as a derivative work, the subtitlers cannot legally distribute it. In practice, both the publishers (at least the smarter ones) and the subtitlers tend to ignore this.

The law was quite clearly written for the case where a translation of the original work would be a substitute for it, e.g. owning an English translation of a book negates the need to own the original. IMO, works which do not substitute for the original, such as subtitles or dubbed audio tracks, should not be considered derivative works. An alternative would be for translation to be recognized as fair use (since it is merely the analog equivalent of format shifting). Of course, the actual likelihood of US copyright becoming less draconian is quite remote...
