Given the number of security issues related to buffer over-runs, I wonder whether C/C++ should provide a safe buffer type that would help alleviate these issues. Sure, it might compromise performance slightly, but that may be acceptable when faced with the alternative: unexpected failures due to an unforeseen buffer overrun.
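Managed languages already ship exactly this behaviour; a minimal Java sketch (class and variable names are mine, for illustration) showing an out-of-range write being caught at runtime rather than silently corrupting memory:

```java
public class SafeBufferDemo {
    public static void main(String[] args) {
        int[] buffer = new int[8];   // every Java array carries its own length
        buffer[7] = 42;              // in-bounds write: fine

        try {
            buffer[8] = 99;          // one past the end: checked at runtime
        } catch (ArrayIndexOutOfBoundsException e) {
            // In C/C++ this write would be undefined behaviour; here it is a
            // well-defined, catchable error and the rest of memory is untouched.
            System.out.println("caught out-of-bounds write at index 8");
        }
        System.out.println("buffer[7] = " + buffer[7]);
    }
}
```

The bounds check is the "slight performance compromise" the comment mentions, though JITs commonly hoist or eliminate checks they can prove redundant.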
I haven't yet decided whether this is a programming language we actually needed, but I will be interested to see whether Apple releases the Swift support in LLVM as open source. The only thing I dislike more than a new programming language for its own sake is a single-platform language.
I didn't see much in the article, but from the following PDF there appear to be multiple technologies at play, one of them being 'channel bonding':
Better, faster ways to access inept content.
It's not the content that matters, but the bragging rights on how you access that content.
It does, but you should never under-estimate people's ability to not bother reading or paying attention to such details.
Apple does have a way to deactivate iMessage, but when you leave the Apple ecosystem, people don't realise that something they were taking for granted suddenly gets in the way.
BTW, here is the knowledge-base page for deactivating iMessage (never tried it): http://support.apple.com/kb/TS...
Well, this isn't any different from a friend who stops using Google Talk.
Network infrastructure. Despite the writing being on the wall, it was treated as a joke, and now the joke is on them. As usual, it's going to be a question of people panicking over something that could have been planned for.
They are by default, but there are the IPv6 privacy extensions (RFC 4941). Also, if you use DHCPv6 you can decide exactly which IP each host gets.
Civil expenditure vs military expenditure. It's sad that it takes a military budget to get things done, when a civilian space agency could do just as well.
The reality is that when asked the question 'why are you doing this?', the answer in one case will be a fuzzy 'important defence stuff' and people will stop asking questions, while in the other it will be 'researching technology for future manned space flight', and then people will question it even more and each want to be a stakeholder in the budget.
Not quite the same thing. Nature works at its own pace, but when you have geological evidence you should take heed of it. Geology can only help so much, because the exact timing is where things get fuzzy. On the flip side, there are geologists who were more cautious about announcements and then got put in jail (a case in Italy) - it's hard to win when everyone wants a scapegoat.
For me it's like buildings or bridges that were built badly. You know they will fail, but not when. You know when the failure happens it won't be a pretty sight.
I am not aware of the documentary indicated, but a quick search turned up this "60 Minutes" video, which also covers the subject: http://www.styleite.com/news/l...
I think the problem comes from living in a bubble. We all live in a bubble and think of the reality around us as being the reality for everyone else. It's not until you step outside the bubble that you realise the assumptions aren't necessarily true. What will often be the case is different people solving different problems with different languages. Sometimes it's down to the suitability of the language, sometimes it's down to the local skill set, and sometimes it's down to what's considered the latest trending language.
Learning a new language takes a time investment and a change in the way we approach coding problems.
As a Java developer I am still wrestling with whether Scala will end up supplanting Java or whether it will be a side language that simply influences the direction Java takes in the future.
For me, languages fall into three main categories: those that stay mainstream, those that influence the mainstream languages, and those that simply fade away because they have been replaced by something 'better'. The influencers sometimes stay in the background because, while innovative, they don't offer a compelling reason for such a radical change, and by the time they look like they may be gaining steam, they lose it because the mainstream languages have picked up their best features.
From what I see, game engines are still C/C++ but are scripted in languages like Python. At the same time, with the right APIs a lot of the heavy processing can be handed off to specialised hardware such as the GPU, whether for graphics or physics.
BTW, while JS is not generally thought of as a choice for high-performance games, this demo shows what may be a sign of the future:
Garbage collection is only as good as the algorithm in place, and the load it adds depends on the type of application. In most cases it hasn't really caused me much pain.
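The GC load an application generates can be observed from inside the JVM itself via the standard `GarbageCollectorMXBean` API. A small sketch (allocation sizes and counts are arbitrary, chosen only to create pressure):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcLoadDemo {
    // Volatile sink keeps the JIT from treating the allocations as dead code.
    static volatile long sink;

    public static void main(String[] args) {
        long before = totalCollections();

        // Create allocation pressure: roughly 1 GB of short-lived garbage.
        for (int i = 0; i < 1_000_000; i++) {
            byte[] garbage = new byte[1024];
            sink += garbage.length;
        }

        long after = totalCollections();
        System.out.println("GC cycles during the loop: " + (after - before));
    }

    // Sum collection counts across all registered collectors
    // (typically one for the young and one for the old generation).
    static long totalCollections() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            long count = gc.getCollectionCount();
            if (count > 0) total += count;
        }
        return total;
    }
}
```

Watching those counters (or GC logs) per workload is one concrete way to judge whether the collector's load actually matters for your application.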
There are cases where Java is actually more performant than C/C++, thanks to the JIT, but it can be dragged back down by the GC.
At work, a team that uses Java in a high-performance application presented to us ways of analysing program performance and of addressing the problems found. One thing they made clear was that the way you analyse performance can actually mask a performance issue, so you need to be careful how you measure your application.
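One classic way a measurement masks an issue on the JVM is timing code before the JIT has warmed up, so interpreted and compiled executions get mixed into one number. A hedged sketch of the pitfall (iteration counts are arbitrary; this is an illustration, not the team's actual method):

```java
public class WarmupDemo {
    // Volatile sink stops the JIT from eliminating the loop as dead code --
    // another way naive benchmarks mislead.
    static volatile long sink;

    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) sum += (long) i * i;
        return sum;
    }

    static long timeOnceNanos(int n) {
        long start = System.nanoTime();
        sink = work(n);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long cold = timeOnceNanos(n);   // includes interpreter + compilation time

        for (int i = 0; i < 50; i++) timeOnceNanos(n);   // warm-up rounds

        long warm = timeOnceNanos(n);   // mostly JIT-compiled code by now
        System.out.printf("cold: %d ns, warm: %d ns%n", cold, warm);
        // The warm figure is usually far lower; averaging cold and warm runs
        // together would hide the steady-state number a long-running service
        // actually sees.
    }
}
```

Harnesses like JMH exist precisely to handle warm-up, dead-code elimination, and statistical noise for you, rather than hand-rolling timing loops like this.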
One other thing I learnt from this presentation is about a JVM called Zing. It was amazing how much better it was than the HotSpot JVM in certain circumstances. From what I understand, the improvements are very much in the JVM itself. The only catch is cost: they know companies are willing to pay for the gains it gives, so you'll need to decide whether the project warrants the extra cost for the performance boost.
What a horrible waste. I hope they at least had the libraries open to the public as a well-publicized "everything's free bookstore" for a few weeks before hauling the leftovers to the dump.
I must admit I got the image of book burning, without the burning. The end result is pretty much the same, in the sense that it is destruction of knowledge and culture. Then again, I see a lot in common between Harper and a certain historic figure with a narrow moustache (not Charlie Chaplin).