Honestly, it's not the best language for learning OOP concepts. It's too complicated and makes too many compromises to stay compatible with C. But if you are sticking to Linux, it will be the only choice for significant development involving UI or system-level programming. On OS X, Objective-C would be much easier to learn coming from C, and on Android, Java would be a good choice. If you wanted to learn current best practices without regard for practicality, there are interesting newer languages like Scala.
Besides not being able to support real native apps and not being properly sandboxed, due to neglect by companies more interested in servers... sure. The Java plugin, Flash, Silverlight, and HTML5 are all intended to serve the same needs. It's mindshare that matters.
Extensions do nothing for non-developers besides lulling them into a false sense of security. An average user is not going to know that
Nope, the real answer is system-level protection to ensure that an app cannot do any more damage than a text file. Application sandboxing works pretty well on mobile. Yes, there is always a cat-and-mouse game with malware, but infected phones/tablets are much less common than desktops or laptops. Most "infections" are free games that run in the background and open ads in your browser, rather than keylogging credit card numbers. I think this should be the default experience on a consumer device. Freedom to tinker and develop is also very important, but it should involve explicit steps and visual reminders, to make sure no software or person can gain unrestricted access to your device without your knowledge and understanding.
That's been part of Android for years. Here is one article describing what needs to be done.
I would be all for killer robots with software designed not to kill when dumb weapons always would. Like a missile that can recognize children or other likely noncombatants near a target and abort the strike.
Drones that just fly for days and look for people to kill would be a problem, yes.
With the Pandora model, each play potentially introduces artists/albums that the listener has not heard before. When a song is playing, there are purchase links at the bottom of the screen. This is different from Spotify's on-demand access. Pandora is not able to charge its users the same rates (or get most people to sign up for a paid subscription in general), and it is helping artists get sales from other channels. I think some difference in rates is reasonable. It would make more sense to compare Pandora with iTunes Radio and other similar services.
Hmm... I see very little chance, and even limited desire, for "the rest of my life" being 70 years. Hopefully there is a small chance of it being just 10 years, but, being human, it's not out of the question. As it happens, I have some responsibilities on this planet I am not willing to step away from. But I fully understand people whose personal equations are different.
I am sure this is based on some analysis of failure data. Regardless, this is a bad move when people are already cooling off on discrete graphics, especially in laptops. Intel integrated graphics will now run many games adequately on small screens, and there are obvious cost/form factor/battery life advantages. If you don't cater to the hardcore gamer/technology enthusiast market that is most interested in overclocking, just who is going to buy your chips and cards?
The latest iMac sure looks nice, but I wonder if 4K at close distance would look any different. After all, it's only considered useful for pretty big TVs. Sounds like number-based marketing, like clock speed in the Pentium 4 days. What would the framerate be like if I tried to play a game on this thing?
Not every Smart TV has a microphone, a camera, or any of Samsung's spying ambitions. Most just have slow Netflix and Vudu apps and 3 HDMI ports. I would guess there is some profit-sharing arrangement with the included providers that makes the TV cheaper than it would be without the mediocre hardware that runs these apps. After all, there is a $40 Fire TV stick with much wider capabilities.
So relax, buy the TV, and don't use these apps. Personally, I find them handy as a last resort when I have moved other devices to another room or don't want to hunt for my cell phone or remote. But nobody is forcing you to even configure WiFi to enable them.
Maybe at that volume of crappy code! I would rather that person had written low tens of thousands of lines that were good.
Cleanup-for-the-sake-of-cleanup projects never work. The current code performs some function, and nobody can sustain enthusiasm for months of reading bad code just to have it perform the same function in the end.
Instead, you can gradually raise code quality by setting a high bar for new changes. For example, have each change reviewed by a couple of developers, other than the author, who are known for good style. If a new utility method is added, ensure that the code was searched for existing similar facilities. When a legacy mess has to be used, it should be wrapped in a clean interface. And so on.
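The "wrap the legacy mess in a clean interface" step might look like this Python sketch; the legacy function, its argument soup, and the field names are all hypothetical, just to illustrate the idea:

```python
from dataclasses import dataclass

# Hypothetical legacy function: cryptic name, confusing argument order,
# magic flag bits, and a positional-tuple return that callers must
# unpack by position.
def lgcy_usr_fetch(uid, flags, mode, b_cache, xtra=None):
    # ...imagine years of accumulated mess in here...
    return (uid, "alice", flags & 0x4)

# Clean wrapper: one well-named function, keyword arguments with sane
# defaults, and a small result type instead of a bare tuple.
@dataclass
class User:
    user_id: int
    name: str
    is_admin: bool

def fetch_user(user_id: int, use_cache: bool = True) -> User:
    """Fetch a user by id, hiding the legacy call behind a clean interface."""
    uid, name, admin_bits = lgcy_usr_fetch(user_id, 0, "r", use_cache)
    return User(user_id=uid, name=name, is_admin=bool(admin_bits))
```

New code is then written against fetch_user, so the legacy function can eventually be refactored or replaced without touching its callers.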
Rolling distros are great if you are a technology enthusiast and completely manage your own machine. If you are supporting a large number of users or servers, you want to test a fixed configuration and deploy it to everyone once a year. In general, the key to stability is to branch the code at some point and focus on bug fixes rather than new features/cleanup/refactoring.
Have you ever actually tried to fix an unbootable computer over "simple video chat" with a non-technical person? Hehe.
I would install a pre-shared key and not give it to "govt agencies and hackers". If they have a secret backdoor into TLS or Intel hardware, I am screwed anyway.
It will have a much better camera than a high-end commercial webcam, and, if there is no existing $0.99 app, you can easily write one to take photos/videos and upload them to a location of your choice. Bonus: it can work without another Internet connection, or even provide a WiFi hotspot to employees/customers while still functioning as a webcam.