This reminds me of IPython Notebook. It lets you run and re-run Python commands and display either text or graphics. You can also insert formatted comments, save a session, and share it. It's now reaching good maturity, and is becoming something of a killer app for scientists using Python.
As a side note, in addition to Python, it accepts shell commands when you prefix them with a !, so it could even replace a normal shell.
On this front, Python does handle interoperability pretty well (at least with C and C++). It might just have a few too many options:
* ctypes: call any C library directly (you just have to get the parameters right yourself, as there is no type checking)
* Python C extension: write a wrapper in C by hand
* SWIG: "automatically" generates the wrapper, based on interface definition files
* Cython: write C extensions using a Python-like syntax
Personally, I just use ctypes or Cython, and it's quite easy to interoperate with any software library I need.
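As a minimal sketch of the ctypes route (this assumes a Unix-like system where the C standard library can be found by name; Windows would need a different lookup):

```python
import ctypes
import ctypes.util

# Locate and load the C standard library (lookup is platform-dependent)
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# You declare the signature yourself -- ctypes cannot check it against the
# C header, which is exactly the "no check on parameters" caveat above.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

print(libc.abs(-42))  # prints 42
```

Get the argtypes/restype declaration wrong and the call may silently corrupt data or crash, so it pays to mirror the C prototype exactly.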
There seems to be an official answer from Samsung here: http://samsungtomorrow.com/4676
It's in Korean, but here is the translation, provided by sammobile.com:
"Under ordinary conditions, the Galaxy S4 has been designed to allow a maximum GPU frequency of 533MHz. However, the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode. Meanwhile, a maximum GPU frequency of 533MHz is applicable for running apps that are usually used in full-screen mode, such as the S Browser, Gallery, Camera, Video Player, and certain benchmarking apps, which also demand substantial performance.
The maximum GPU frequencies for the Galaxy S4 have been varied to provide optimal user experience for our customers, and were not intended to improve certain benchmark results.
We remain committed to providing our customers with the best possible user experience."
As many people have already written, it's not the push toward tablets and phones that is reducing the user community; it's the fact that Gnome has become so bad compared to other DEs that people have moved away. The main question is "why has Gnome become so bad?". I'd say it's mostly due to not listening enough to user feedback, and poor judgment about what is good for users.
Don't get me wrong. I loved Gnome, used it all the time, even used to send patches for the bugs that were annoying me (actually, I even had SVN commit rights at some point). But I stopped because Gnome 3 was worse than Unity _and_ LXDE, and because developers started to close all my bug reports as WONTFIX or, worse, because the patch would no longer apply... after 2 years of being ignored.
I'd suggest these changes to all the core Gnome developers:
* first fix bugs before adding a new feature (or a new app)
* review and merge as many patches as you get from outside people, as soon as possible (that's how you build a developer community)
* review the entire interface and especially the fixed/default values so that Gnome is _super_ comfortable to use right out of the box
* do not ever remove features, and never accept regressions
* make sure your interface can be used by power users too (yes, that means putting back _some_ configuration options), they are the (future) developers
* listen a bit to user feedback (that one is difficult because it's typically a very noisy channel, but it's necessary)
* pick a few known and powerful programming languages, and stick to them for all the core applications. Honestly, just drop Vala: as great as it could be, it's not up to a DE project to develop a new programming language, and almost no one outside the community knows it. If it were up to me, I'd just pick C, C++, and Python.
Keep this up for 3 years, and Gnome will be relevant again.
I'd also suggest picking 2 or 3 apps and focusing on them so much that they become the best tools for their task, ahead of any competitor. That way, people will have an incentive to use Gnome, and all the distributions will make sure these apps and all their dependencies are installed by default and working well. For instance, I'd pick Evince, Rhythmbox, and Aisleriot.
Indeed, the hardware specs are really weird. It even seems that the two Ubuntu set-ups used different CPU speeds (2.5GHz vs 2.9GHz).
So I wonder whether it was the same hardware reported differently, or really 3 different Mac minis...
Lego Mindstorms might be a nice approach. It's available in both Dutch and Danish, and uses a graphical language with a great graphical interface designed for kids. I use it to teach programming and robotics to kids (in Dutch), and it's amazingly easy for them to write and modify the software.
The main drawback is that, although the software is free, you need a 200€ Lego robot to make it useful. It also only has a Windows (and probably Mac) version. IMHO, the robot has the advantage of bringing additional interest to the kids: it makes programming much less abstract.
To try the software before buying, look for the Lego Mindstorms NXT 2.0 ISO on the Lego website (it's a bit hidden).
1. Go to OpenStreetMap.org
2. Find a place which is not yet fully mapped (i.e. anywhere but Europe). I usually pick a place I've just read about in the news, or the area of my next holiday.
3. Click on "Edit".
4. Draw a couple of roads.
5. Profit (and let the others profit as well).
Admittedly, quite a few people find it boring, but if, like me, you enjoy seeing the world from above and discovering new places, it's great. It really takes your mind off things, and you can stop at any moment.
On the website of a business that Alan seems to run separately from his job at Intel, he had already mentioned family illness (http://www.ultima-models.co.uk/news.html). I guess this is the "family reasons".
Alan Cox has already contributed enormously to Linux, but hopefully things will get better for him and his family, and he'll be able to contribute even further.
Lately he has been trying to clean up some of the mess that Intel made with the Poulsbo hardware (GMA500). As an owner of such hardware, I'm very grateful for this. So I now wish him and his family all the best in this hard time.
It's always great to learn a new (human) language. It will allow you to discover a new way of thinking, and let you see the world through a different point of view.
That said, let's be honest right away: if there is one area where it will bring you almost nothing, it's software development. 99% of online software communities operate in English. 99.9% of software comments and documentation are written in English. I happen to speak French, English, Dutch and Spanish (nothing special, I'm just European). I have been doing software development for more than 10 years, and I cannot recall ever using any language other than English, except when doing translation work. The only advantage is that you'll be able to understand a bit better why translators are mad at you when you write bad printf()'s.
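A concrete example of what makes translators mad (the strings here are made up for illustration): a format string that bakes English word order into the positions of its arguments. Named placeholders, like positional printf arguments in C, let a translated template reorder the pieces freely:

```python
# Rigid: a translator cannot swap the two %s without breaking the call site
msg_rigid = "%s has %s new messages"

# Flexible: the translated template can put the fields in any order
msg_flexible = "%(user)s has %(count)d new messages"
msg_dutch = "%(count)d nieuwe berichten voor %(user)s"  # hypothetical Dutch template

print(msg_flexible % {"user": "Ana", "count": 3})  # Ana has 3 new messages
print(msg_dutch % {"user": "Ana", "count": 3})     # 3 nieuwe berichten voor Ana
```

The same idea exists in C as `%1$s`-style positional specifiers, which is what translation tools like gettext expect you to allow.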
So go ahead, learn a new language, it's a great experience. I'd recommend one with a large number of speakers, like Spanish or Chinese (that one, I promise, will completely change your understanding of the concept of "language"). However, don't kid yourself: it's pointless with respect to software development.
I guess it's due to the exclamation mark in the middle of the word. If you look for "O!Play" (with the quotes), it works.
The summary links to Grotesque, but what they use in the article is "Square Grotesque", a modified version which is _really_ square and IMHO hard to read (and which is apparently quite appreciated by car manufacturers). Concluding that every Grotesque font is hard to read is definitely not what the research demonstrated.
The best thing is to have a look at the paper, which has good examples. A similar font can be found on Wikipedia here: http://en.wikipedia.org/wiki/Eurostile (though I find that one still slightly easier to read).
Don't start by adding big features to the project. That's the hardest part, and many people are doing it already. It's best to concentrate first on quality assurance:
* Look at bug reports, try to reproduce them, add your insights, and maybe even find a fix
* Write test cases; that's what is most missing in open-source projects nowadays. It's the best way to ensure that the library works correctly on every kind of hardware/software combination. Only someone like you, who knows both about programming and about the tool's domain (mathematics), can do it right!
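As a minimal sketch of what such a test case could look like (the `mean` function here is a hypothetical stand-in for whatever the library actually provides; pytest-style tests, but runnable directly too):

```python
import math

def mean(values):
    """Stand-in for a library function under test."""
    return sum(values) / len(values)

def test_integers():
    assert mean([1, 2, 3]) == 2

def test_floats():
    # Floating-point results need a tolerance, not exact equality --
    # this is the kind of portability detail tests catch across platforms.
    assert math.isclose(mean([0.1, 0.2]), 0.15)

if __name__ == "__main__":
    test_integers()
    test_floats()
    print("all tests passed")
```

Even a handful of tests like these make it possible to spot regressions automatically on hardware and OS combinations the maintainers never see.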
Elegance and truth are inversely related. -- Becker's Razor