
Comment: Re:Turtles all the way down (Score 2) 325

by TD-Linux (#40045785) Attached to: 'Inexact' Chips Save Power By Fudging the Math
Large safety factors are bad engineering in a lot of fields. Maybe not for architecture and bridges, but for airplanes the safety factor is kept as close to 1 as possible (and there are certainly lives on the line); the weight savings are always worth it. In fact, in aerospace, safety factors down to 0.9 are common, meaning the part is expected to fail at some point, so it is inspected regularly for signs of fatigue.
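To put numbers on it, a safety factor is just the ratio of the load at which a part fails to the load it is designed to carry. A quick sketch of that arithmetic (all loads here are invented for illustration):

```python
# Safety factor = failure load / design load.
# All numbers below are made up for illustration.
design_load_n = 10_000  # worst-case design load, in newtons

for safety_factor in (2.0, 1.1, 0.9):
    failure_load_n = safety_factor * design_load_n
    note = "expected to fail in service; inspect for fatigue" if safety_factor < 1.0 else "ok"
    print(f"SF {safety_factor}: fails near {failure_load_n:,.0f} N ({note})")
```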

Comment: Re:Wasn't Chrome supposed to drop H264 support!? (Score 2, Insightful) 320

by TD-Linux (#39344905) Attached to: Mozilla Debates Supporting H.264 In Firefox Via System Codecs

I view H.264 as another tie-in to expensive tools that force you to pirate and not update your own PC just to stay job-competitive. That is against the spirit of the web. No free tool can exist because H.264 is licensed and proprietary.

What the hell kind of reasoning is that? Have you ever actually tried creating a webpage? H.264 is not proprietary. The only thing that even touches H.264 is your video encoder. You probably already have one, and if not, there are plenty of good ones out there that you can use.

What is H.264 forcing you to pirate, exactly? How is H.264 preventing you from updating your PC? Why can no free tools exist? Have you read the actual license on MPEG-LA's website?

Comment: Re:Wasn't Chrome supposed to drop H264 support!? (Score 1) 320

by TD-Linux (#39344309) Attached to: Mozilla Debates Supporting H.264 In Firefox Via System Codecs
You don't. The vast majority of systems today already include a decoder, so you don't need to ship one in the web browser. This actually makes a lot of sense: what business does the web browser have decoding the video? If you offload it to the system, it will often be done by dedicated hardware that's a lot faster and consumes a lot less power.
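As a rough sketch of what "offload it to the system" means in practice, here's one way an application can hand decoding to whatever the platform provides, by shelling out to ffmpeg (assumed to be installed; the input file name is hypothetical, and `-hwaccel auto` just asks for any available hardware decoder):

```python
# Sketch: decode H.264 with the platform's hardware decoder when one
# exists, instead of decoding in the application itself.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "auto",   # use dedicated decode hardware if present
    "-i", "input.mp4",    # hypothetical H.264 input file
    "-f", "null", "-",    # decode only, discard the frames
], check=True)
```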

Comment: Re:This isn't nearly as bad as the division bug (Score 1) 292

by TD-Linux (#39266657) Attached to: AMD Confirms CPU Bug Found By DragonFly BSD's Matt Dillon

On an embedded system with just a few kbytes of memory, like say an ARM-powered gadget, self-modifying code is still relevant, even in 2012.

I work on embedded systems and can tell you that this is not the case. On a system that is very limited in RAM (in the range of 128 bytes to 128KB), the program is generally stored in and executed directly from flash ROM and never copied into RAM, which rules out self-modifying code entirely. Even on a system that could support it, you never write self-modifying code: the tiny performance gains you might see just aren't worth the incredible pain and risk. The exception is JIT compilers, which usually only run on machines with plenty of RAM anyway.
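The JIT exception is worth a concrete illustration: a JIT is runtime code generation, which by definition needs somewhere writable to put the new code. A toy Python analogue of the pattern (not embedded code, just the idea):

```python
# Toy analogue of a JIT: build source at runtime, compile it, run it.
# This inherently requires writable memory for the generated code,
# which is exactly what an execute-in-place flash system lacks.
src = "def hot_loop(n):\n    return sum(i * i for i in range(n))\n"
namespace = {}
exec(compile(src, "<generated>", "exec"), namespace)
print(namespace["hot_loop"](1000))  # 332833500
```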

Comment: Re:Sick of pi - Retarded Comment (Score 1) 55

by TD-Linux (#39265657) Attached to: MIT App Inventor Back Online

No you're NOT! Its target is education and "third-world" countries

Right, because the most effective way to give students computers is to buy an HDMI monitor, USB keyboard, mouse, and power adapter for your $25 computer (oh wait, you want internet? $35).

Never mind that it's an even worse idea in third-world countries. As much flak as the OLPC gets, it solves far more problems than this board does: very low power consumption, a battery, mesh networking for internet, a durable case, and a complete GUI software stack with Python and Logo built in.

It has potential to be "game-changing" because IT education in the UK is a joke - technical ability is shunned in favour of teaching Microsoft products instead.

Oh, right. It's for the "third world country" that is the UK. No, actually, the UK's schools are already well equipped with computers. Why add another low-end piece of hardware just to run open source code? Why can't you do that on a Windows box? How about installing Visual Studio? It seems the real problem is that the classes aren't offered, not a lack of hardware.

The Pi project is an attempt to start a similar UK computer culture as seen in the 1980s.

An interesting proposition. Giving students full access to a computer, rather than a limited login environment, lets them start hacking away at the hardware. However, why would students do that if they already have a fully working Linux kernel? In the 1980s, part of the attraction was working close to the bare metal. It's too bad you can't do that with the Raspberry Pi, due to its Broadcom chip. Broadcom chips are notorious for having zero documentation, and the one on the Pi is no exception. Aside from a GPIO reference document released recently, most of the chip is shrouded in mystery and binary blobs. For example, the entire video subsystem of the chip is undocumented - the only thing Broadcom provides is the area it occupies in the memory map.

At the current price - no, not really. A 700MHz CPU AND GPU with 256MB RAM, 2x USB and Ethernet for $25?

Actually, that's the $35 model. The $25 model has only one USB port and no Ethernet.

No, the stupid thing is you're stupid for not checking your facts first.

...

The parent comment is retarded. How is it moderated insightful?

How do comments that start out with sentences like these get moderated insightful?

I'll briefly mention that the point about this being targeted at EEs and hobbyists is in fact somewhat true. Why else would it have a header with a bunch of GPIOs, I2C, etc., and why else would they pressure Broadcom into writing documentation for it? There is also a theory that Broadcom is subsidizing the chips (based on the total cost of components on the board), with the intention of it being a sort of evaluation board, demo board, and PR combo. But the rest is speculation, so I'll leave it at that.

Comment: Re:Software GPU Emulation (Score 5, Informative) 237

by TD-Linux (#37967876) Attached to: GNOME Shell No Longer Requires GPU Acceleration
Meh, the compositor has to draw the pixels one way or another. KDE has two backends, XRender and OpenGL. If acceleration isn't available, the XRender backend can still run in software, and it's pretty fast. KDE also supports disabling compositing entirely, but with software compositing available, that option is becoming irrelevant.

Note that compositing != GPU acceleration. Mac OS X has always used compositing, but originally it was done entirely in software. There are still good reasons to do so. I'll compare the two approaches:

No compositing, one frontbuffer: You don't get your own pixmap to draw onto. You have to send drawing commands to the display server to draw on your behalf, which prevents you from drawing wherever you want on the frontbuffer. Unfortunately, if you have something complicated to draw, the user gets to watch as the drawing happens. The usual algorithm is to draw the background first, then draw the objects in order from back to front, so whenever the screen is updated the user sees flicker as objects briefly revert to the background color. To work around this, most modern toolkits (Qt 4, GTK 3) render to a pixmap and then tell X to draw that pixmap when they are done, as sketched below. This avoids the flicker but uses a bit more RAM.
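Here's a toy sketch of that draw-offscreen-then-present pattern, using pygame for brevity (not how Qt or GTK actually implement it, just the same idea):

```python
# Back-to-front painting into a back buffer, then presenting the
# finished frame in one step, so the user never sees partial drawing.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    # Painter's algorithm: background first, then objects back to front.
    screen.fill((40, 40, 40))                                      # background
    pygame.draw.rect(screen, (90, 90, 200), (100, 100, 300, 200))  # rear object
    pygame.draw.rect(screen, (200, 90, 90), (180, 160, 300, 200))  # front object

    pygame.display.flip()  # show the completed frame all at once

pygame.quit()
```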

With a compositor, the application still draws to a pixmap, but instead of asking the X server to immediately draw that pixmap to the screen, it passes the server a handle to the pixmap, and the display server can draw it whenever it likes. This makes a lot of things easier: vertical sync, effects, and color format and color space conversions.

Drawing the pixmap on the screen is really the same operation whether compositing is on or off. And the API your compositor uses shouldn't matter much either, if the underlying implementation is optimized. The highly optimized Gallium3D blitter is going to be just as good as the traditional X blitter, if not better. The only thing slowing it down in this case is that the OpenGL API is rather overkill for blitting, but hopefully the llvmpipe backend is optimized for this use case. And it's probably not worth making the compositor support two drawing APIs, as KDE does, since they both end up doing the same thing anyway.

Comment: Re:Javascript as assembly (Score 1) 253

by TD-Linux (#37231098) Attached to: Announcing Opa: Making Web Programming Transparent
I think you missed what the parent was talking about. He didn't mean implementing more things in Javascript; he meant allowing developers to use non-Javascript languages to generate Javascript. It's so people don't have to use or learn Javascript, not the other way around.
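A toy sketch of the idea, generating JavaScript source from Python (nothing like a real compiler such as GWT, just to show writing in one language and shipping another):

```python
# Toy "compiler": describe functions in Python, emit JavaScript.
def emit_function(name, params, body_expr):
    return f"function {name}({', '.join(params)}) {{ return {body_expr}; }}"

js = "\n".join([
    emit_function("add", ["a", "b"], "a + b"),
    emit_function("square", ["x"], "x * x"),
])
print(js)  # drop the output into a <script> tag or a .js file
```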

What you mention, the pattern of generating pages using AJAX, is still fairly new. I use it a lot for data that is updated in real time, and some websites are using it for static data as well. I very much like it for cleanly separating the presentation from the backend server-side code, even though some NoScript-using purists hate it.
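For what it's worth, the backend half of that separation can be tiny. A minimal standard-library sketch of a JSON endpoint (the path and payload are invented):

```python
# Minimal JSON endpoint for an AJAX front end; the page's JavaScript
# fetches /api/status and handles all presentation client-side.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/status":
            body = json.dumps({"users_online": 42}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

HTTPServer(("localhost", 8000), ApiHandler).serve_forever()
```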
