I view H.264 as another tie-in to expensive tools that force you to pirate and not update your own PC just to be job-competitive. That is against the spirit of the web. No free tool can exist, because H.264 is licensed and proprietary.
The hell kind of reasoning is that? Have you ever actually tried creating a webpage? H.264 is not proprietary. The only thing that even touches H.264 is your video encoder. You probably already have one, and if not, there are plenty of good ones out there that you can use.
What is H.264 forcing you to pirate, exactly? How is H.264 preventing you from updating your PC? Why can no free tools exist? Have you read the actual license on MPEG-LA's website?
DRM has nothing to do with it. XP does not include a software H.264 decoder because it didn't exist at the time XP was released.
Can you even get an ARM chip with less than 1MB RAM anymore?
Quite easily. Cortex-M series microcontrollers often have 8KB of RAM or less. See the STM32 for an example. NXP also has some extremely small Cortex-M0 chips.
On an embedded system with just a few kbytes of memory, like say an ARM-powered gadget, self-modifying code is still relevant, even in 2012.
I work on embedded systems and can tell you that this is not the case. Generally, on a system that is very limited in RAM (in the range of 128 bytes to 128KB), the programs are stored on and executed from flash ROM and are never copied into RAM, which rules out self-modifying code. Even on a system that could support it, you never write self-modifying code: the tiny performance gains you might see just aren't worth the incredible pain and risk. The exception is JIT compilers, which usually only run on machines with plenty of RAM anyway.
No you're NOT! Its target is education and "third-world" countries
Right, because the most effective way to give students computers is to buy an HDMI monitor, USB keyboard, mouse, and power adapter for your $25 computer (oh wait, you want internet? $35)
Never mind that it's an even worse idea in third-world countries. As much flak as the OLPC gets, it solves far more problems than this board does: very low power consumption, a battery, mesh networking for internet access, a durable case, and a complete GUI software stack with Python and Logo built in.
It has potential to be "game-changing" because IT education in the UK is a joke - technical ability is shunned in favour of teaching Microsoft products instead.
Oh, right. It's for the "third-world country" that is the UK. No, actually, the UK's schools are already well equipped with computers. Why throw in another, less capable piece of hardware just to run open-source code? Why can't you do that on a Windows box? How about installing Visual Studio? It seems the real problem is that the classes aren't offered, not that the hardware is lacking.
The Pi project is an attempt to start a similar UK computer culture as seen in the 1980s.
An interesting proposition. By allowing the students full access to a computer, rather than a limited login environment, they can start hacking away at the hardware. However, why would students do that if they already have a fully working Linux kernel? In the 1980s, part of the attraction was working close to the bare metal. It's too bad you can't do that with the Raspberry Pi, due to its Broadcom chip. Broadcom chips are notorious for having zero documentation, and the one on the Pi is no exception. Aside from a GPIO reference document they released recently, most of the chip is shrouded in mystery and binary blobs. For example, the entire video subsection of the chip is undocumented - the only thing Broadcom provides is the area that it takes on the memory map.
At the current price - no, not really. A 700MHz CPU AND GPU with 256MB RAM, 2x USB and Ethernet for $25?
Actually that's the $35 model. The $25 model only has one USB port and no ethernet.
No, the stupid thing is that you're stupid for not checking your facts first.
The parent comment is retarded. How is it moderated insightful?
How do comments that start out with sentences like these get moderated insightful?
I'll briefly mention that the point about this being targeted at EEs and hobbyists is in fact somewhat true. Why else would it have a header for a bunch of GPIOs, I2C, etc., and why else would they pressure Broadcom into writing documentation for it? There is also a theory that Broadcom is subsidizing the chips (based on the total cost of components on the board), with the intention of it being a sort of evaluation board / demo board / PR combo. But the rest is speculation, so I'll leave it at that.
Note that compositing != GPU acceleration. Mac OS X has always used compositing, but originally it was done entirely in software. There are still good reasons to do so. I'll compare for you:
No compositing, one frontbuffer: you don't get your own pixmap to draw onto. You have to send drawing commands to the display server, which draws on your behalf so that you can't scribble wherever you want on the frontbuffer. Unfortunately, if you have something complicated to draw, the user gets to watch as the drawing happens. The usual algorithm is to draw the background first, then draw the objects in order from back to front. This means that whenever an object is updated, the user may see flicker, because its area briefly shows the background color before the object is redrawn on top. To work around this, most modern toolkits (Qt 4, GTK 3) render to a pixmap and then just tell X to draw that pixmap when they are done. This avoids the flicker but uses a bit more RAM.
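To make the flicker concrete, here's a toy Python sketch - lists standing in for framebuffers, nothing here is real X11 API. Drawing straight on the visible front buffer exposes the intermediate "background only" state; drawing into an offscreen pixmap and copying it over in one go never does.

```python
# Toy model: a "framebuffer" is a list of cells; BG is the background
# color, OBJ is a drawn object. None of this is real X11 API.
BG, OBJ = ".", "#"

def draw_scene(buf):
    # Painter's algorithm: clear to background, then paint objects
    # back to front. Each step is a state a user could see on screen.
    states = []
    for i in range(len(buf)):
        buf[i] = BG
    states.append(buf.copy())   # intermediate state: object briefly gone
    buf[2] = OBJ                # object redrawn on top
    states.append(buf.copy())
    return states

# 1) No offscreen buffer: draw directly on the visible front buffer.
front = [OBJ] * 5
visible_states = draw_scene(front)
# visible_states[0] shows the flicker: the object vanished for a moment.

# 2) Offscreen pixmap: draw into a private buffer, copy once when done.
front2 = [OBJ] * 5
pixmap = front2.copy()
draw_scene(pixmap)
front2[:] = pixmap              # single blit; only the final frame is shown

print(visible_states[0])        # the intermediate state the user saw in case 1
print(front2)                   # case 2: user only ever sees the final frame
```

The same idea scales up: the toolkit's pixmap plays the role of `pixmap` here, and the final copy is the one blit X performs.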
With a compositor, the application still draws to a pixmap, but instead of asking the X server to immediately draw that pixmap to the screen, it hands the server a handle to the pixmap, and the compositor can draw it whenever it likes. This makes a lot of things easier, like vertical sync and effects, as well as things like color format and color space conversion.
Drawing the pixmap on the screen involves the same operations whether compositing is on or off. And the API your compositor uses shouldn't matter too much either, if the underlying implementation is optimized: a highly optimized Gallium3D blitter is going to be just as good as the traditional X blitter, if not better. The only thing slowing it down in this case is that the OpenGL API is rather overkill for blitting, but hopefully the llvmpipe backend is optimized for this use case. And it's probably not worth making the compositor support two drawing APIs, like KDE does, as they both end up doing the same thing anyway.
What you mention, the pattern of generating pages using AJAX, is still fairly new. I use it a lot for data that is updated in real time, and some websites are using it for static data as well. I very much like it for cleanly separating the presentation from the server-side backend code, even though some NoScript-using purists hate it.
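As a rough sketch of that separation, using only the Python standard library for both halves (the `/api/articles` endpoint name and the article data are made up for illustration): the server side serves nothing but JSON, and `fetch_articles` stands in for the browser's XMLHttpRequest - in a real page, client-side JavaScript would fetch this JSON and build the DOM from it.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical backend data; a real app would pull this from a database.
ARTICLES = [{"title": "Hello", "body": "First post"}]

class DataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/articles":      # data endpoint: JSON only, no HTML
            payload = json.dumps(ARTICLES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_error(404)

    def log_message(self, *args):             # silence per-request logging
        pass

def fetch_articles(port):
    # Stand-in for the browser's XMLHttpRequest: fetch the JSON;
    # presentation (building the page from it) would happen client-side.
    with urlopen("http://127.0.0.1:%d/api/articles" % port) as resp:
        return json.loads(resp.read())

server = HTTPServer(("127.0.0.1", 0), DataHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
data = fetch_articles(server.server_address[1])
server.shutdown()
print(data[0]["title"])
```

The point is that the server never emits markup: swap the JavaScript front end for another client and the backend code doesn't change.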
Durability will be interesting. Solar cells are extremely fragile. However, there are a number of strong encapsulants out there, so it shouldn't be too much of a problem.