That would escalate quickly.
There's tons of thorium right here on earth. Mining it on the moon sounds pretty impractical.
I'll believe their claims when I see some test results that back them up.
Not exactly what you asked for, but snes-sdk is a tinycc-based C compiler and SDK for the SNES. Pretty cool if you want to write your own games and don't want to write everything in assembly.
It had signed packages years before Debian
I don't know who got what first but as you said, Debian has this too.
And it's nice having it all in one file.
Same as Debian, you have a [package-source].orig.tar.gz, [package-description].dsc and [debian-specific-patches].diff.gz.
And rpm specifically forbids interactive install/update (although clueless commercial vendors sometimes ignore this), which is absolutely required for mass installs or unattended updates.
You can do unattended installs or updates in Debian with
DEBIAN_FRONTEND=noninteractive apt-get -q -y dist-upgrade
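A slightly fuller sketch of the same idea, assuming a Debian/Ubuntu box with apt-get (the function name and cron line are just illustrative):

```shell
#!/bin/sh
# Hedged sketch of an unattended-update wrapper for Debian/Ubuntu.
# DEBIAN_FRONTEND=noninteractive suppresses debconf prompts; -q -y
# handles the rest of the interactivity.
set -e
export DEBIAN_FRONTEND=noninteractive

# unattended_upgrade is a made-up helper name, not a real command.
unattended_upgrade() {
    apt-get -q -y update
    apt-get -q -y dist-upgrade
}

# Run it from cron for truly unattended updates, e.g.:
#   0 4 * * * /usr/local/sbin/unattended-upgrade.sh
```

The function is just a wrapper; the heavy lifting is the environment variable plus the -q -y flags.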
I think it's pretty safe to assume that the systems are equivalent nowadays.
The difference is that the Raspberry Pi is/will be immensely popular. There will be a huge community supplying software solutions and hardware modifications for it. It doesn't matter that some random Chinese device has twice the power and a touchscreen when you're stuck on Android 2.1 with a closed kernel and no updates in sight.
It's the people that matter, not so much the hardware.
I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a gaming card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).
But... They tested with 10 games, 1 raytracer and 0 synthetic benchmarks. I don't know what they usually do but this article was very focused on real world performance.
I can't find any reference in the article, but from the wording in the summary it sounds like the IT department set up new accounts with "12345" as the default password (without expiration), and then asked the users to change it.
If that's the case, it sounds like a terrible idea to me. Better to generate default passwords as complex random strings. Then it's in the users' interest to change their passwords, because they're hard to remember and type. And if they don't, even better!
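A minimal sketch of generating such a default password, assuming a POSIX shell with /dev/urandom (gen_password is a hypothetical helper name, not any real tool):

```shell
# Generate a 16-character alphanumeric default password from /dev/urandom.
# gen_password is a made-up name for illustration only.
gen_password() {
    LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 16
}

pw=$(gen_password)
echo "$pw"
```

Hand that out as the initial password and the user has every incentive to change it on first login.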
That's a weird point. It's not like you need to know what any of those things means to use an Eee Pad. Just like you don't have to know what iOS 4.3.5, Apple A5, PowerVR SGX543MP2, NEON SIMD, Objective-C, jailbreaking, or a capacitive touchscreen is to use an iPad.
My experience with closed source linux drivers is that they're usually very poorly integrated with the rest of the system. The companies usually like to solve everything their own way (tm), rather than using the frameworks all the open drivers use.
When AMD dropped support in fglrx for the Radeon X1300-based GPU in my laptop (yes, they drop support for hardware whenever they feel like it), I had to start using the radeon driver on my Ubuntu machine. Everything has worked much better since the switch. Suddenly I don't get some special AMD Catalyst Control Center thingie to change resolutions and set up external monitors etc. Instead, the normal standardized GNOME settings work like a charm. The system also sets the correct resolution for my screen once, right after the kernel has been loaded (i.e. before gdm/X).
If the radeon driver from TFA gets included in Ubuntu 11.10, I would definitely give it a shot for my desktop machine, which has a Radeon HD 6870 card. The fglrx support for this card is just terrible. Sure, performance-wise OpenGL works well in games, but the normal X11 2D acceleration is terrible. Here are some annoyances with it:
* Whenever gksudo is activated it throws random garbage on all my monitors for about a second before displaying the password dialog
* Random "holes" in windows at random times, i.e. squares where the desktop background suddenly becomes visible instead of the window contents. This won't go away until the window is redrawn.
* OpenGL and XV surfaces are always on top. So if I watch a video and put some window on top of the video surface, the video will be in front of the window regardless of Z-order.
If I could get a driver that plays nice with the rest of the OS, gets regular updates with the rest of the system, and doesn't have weird bugs in its 2D rendering I would gladly sacrifice 50% OpenGL performance. It's not like I utilize the GPU that much anyway.
While I agree with you regarding application programming, need, etc., I must clarify that I was talking about graphics/game applications that require the full hardware potential.
If you compare this new architecture with an arguably overcomplicated one like the PlayStation 3, I'd argue that writing software that utilizes the hardware to its full potential is indeed hard. And in this context, a more elegant, integrated GPU/CPU will make the lives of us poor indie game programmers a bit easier.
Integrating CPU, GPU and unifying the memory address space will probably make things easier for programmers. So hopefully it'll help programmers utilize the hardware better.
The real "duh" in the population is the people who dismiss "duh" science as obvious, though. More often than not, the outcome of a study is the expected result. When it's not, however, it challenges our preconceptions and we have to adjust to the new facts (or do another study).
Just because our intuition tells us that something works a certain way doesn't mean we can accept it as a scientific fact. This is a strength of the scientific method, not a weakness.
If you're building spacecraft in orbit, I guess it would be nice to mine most of the raw materials in space.
Just release what you legally can. If someone is interested they can replace the floating point parts.
The reward for working hard is more hard work.