Comment Re:It's so hard to cut spending... (Score 1) 128

I think the Rhineland model is a better bet. Once it's 50 years old, which isn't long, will you just keep increasing the number of years? I agree with Karl Marx that "capitalism will destroy itself", except I don't think that's a good thing, and I think it can be, and is, prevented by regulation. We can also use regulation to align the outcomes we want with the profit motive.

The gap was big and growing before the credit crunch. A good website on this, and on the effects of an unequal society, is: http://www.equalitytrust.org.uk/

Comment Deeper problem (Score 1) 128

There was a time when artists and programmers learnt in their bedrooms. This was great because when they came into industry it didn't take much to get them up to scratch: they had already been doing it at home, just with fewer tools and no access to others. Now, at the game company I work at, we struggle to find people who know what we need.

I wonder how much of the recruitment problem is down to the signal-to-noise ratio, because of this:

http://www.youtube.com/watch?v=lGar7KC6Wiw

These kids go and do "games" courses but aren't being taught what they need, because really they don't want to know, and the course is about bums on seats to make the education stats look good. I was on a "VR" course that was similar, but I dropped out and went into industry when our "professional 3D artist" didn't know what skinning or IK were and seemed to make everything out of spheres, and our programmers didn't know anything about real-time 3D. That was 10 years ago; I'm not sure it's got better since.

I also wonder how much of the problem is that no one is learning the "roll in the mud" C/C++ that is required. Those learning at uni, and even at home, seem to be learning only managed languages, so they don't really understand computers. They don't get memory, data and instructions, only objects and garbage collection. Even if you are going to use someone else's engine, that still puts you at a disadvantage. Though of course, as long as the tech is "good enough", it starts becoming about gameplay and artwork...
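
A trivial C sketch of the kind of thing I mean (purely illustrative, not from any particular course): data is just bytes at addresses, and memory is something you explicitly ask for and give back, not something a collector tidies up behind you.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* An int is just bytes at an address -- nothing magic. */
    int value = 0x12345678;
    unsigned char *bytes = (unsigned char *)&value;

    /* Prints the bytes in whatever order the machine stores them
     * (little-endian on x86: 78 56 34 12). */
    for (size_t i = 0; i < sizeof value; i++)
        printf("byte %zu at %p = 0x%02x\n", i, (void *)&bytes[i], bytes[i]);

    /* Memory is requested, used, and returned by hand. */
    char *buffer = malloc(64);
    if (buffer == NULL)
        return 1;
    strcpy(buffer, "no garbage collector is coming");
    puts(buffer);
    free(buffer); /* forget this and you leak; no GC will save you */

    return 0;
}
```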

I also wonder whether this is limited to the game industry, after last week's link to:
http://blog.expensify.com/2011/03/25/ceo-friday-why-we-dont-hire-net-programmers/

I think this kind of thing makes people angry because they know, deep down, there is at least an element of truth to it, but they don't want to take the ivory-tower blinkers off and see it. It's the same kind of people who shout that programmers should do GUIs for everything and that there should be no CLI. Tough. For real-time work you need to know what the computer is doing, even if you are using a virtual machine on top (in which case you need to know what that is doing too, so it's actually making things more complex for you). For advanced computer use, you need to learn the CLI.

Comment Re:Instruction bloat (Score 1) 235

I've not done any x86-64, but from a quick glance at the disassembler it doesn't look that different... ARM was designed to be written by hand; it was not designed for high-level compilers. Read or listen to what Sophie Wilson has to say on the subject. ARM came from Acorn, a platform I cut my teeth on. On the Acorn platform, pretty much everything important was written in ARM by hand, and what wasn't was BBC BASIC with ARM for the hot spots. Since then, new instructions have been added for things like processing multiple inputs with a single instruction, virtual memory, etc., but it's not like it suddenly has anything even close to the mess of x86.

Comment Re:It's just ARM heads (Score 3, Insightful) 235

My money on 'why' is Windows compatibility and closed source locking in the platform, more than chip design. The best design doesn't always win; in fact, it often doesn't. Here it was because Windows went critical mass, and x86 with it. So much money was poured into x86 that you just got more for your money with x86, and the more that was the case, the more it sold, and the more it stayed the case. That meant it came into the server market from the bottom, and then the same thing happened there. It's a good example of a bottom-up revolution. Now it looks like Wintel compatibility doesn't matter so much (web/free software), and something similar could happen with ARM, driven by it being "good enough", cheap and low power. That's why Intel are pooing their pants and MS are hedging their bets with Windows on ARM.

Comment Re:Sparc (Score 2) 235

ARM is more compact than a normal RISC architecture because most instructions are conditional and can include a bit shift or rotation too. This doesn't just mean fewer instructions for common tasks; it also means less branching, which isn't only faster but again leads to fewer instructions, as 'branches' can share instructions. In the old Acorn days I remember how much noise there was about how much more compact it was than x86. ARM isn't the RISC your daddy knew. Thumb makes it even more compact, and unlike x86's instruction decoders, it can be powered off when not used.
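
The textbook illustration (my rough recollection, not verified compiler output) is Euclid's GCD: on classic ARM, conditional execution lets the whole loop compile down to about four instructions with a single branch.

```c
#include <stdio.h>

/* GCD by repeated subtraction; assumes both inputs are positive.
 * On classic (pre-Thumb) ARM the loop comes out as roughly:
 *
 *   gcd: CMP   r0, r1      ; compare a with b
 *        SUBGT r0, r0, r1  ; if a > b then a -= b
 *        SUBLT r1, r1, r0  ; if a < b then b -= a
 *        BNE   gcd         ; loop until a == b
 *
 * The conditional SUBGT/SUBLT replace two branches entirely,
 * which is where the compactness comes from. */
static unsigned gcd(unsigned a, unsigned b)
{
    while (a != b) {
        if (a > b)
            a -= b;
        else
            b -= a;
    }
    return a;
}

int main(void)
{
    printf("gcd(48, 36) = %u\n", gcd(48, 36)); /* prints 12 */
    return 0;
}
```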

x86's real problem is legacy. It has to be backwards compatible; ARM doesn't bother. You might say that's terrible, and people do, but they aren't thinking open source, where the repository can just be compiled for a different target as needed. The user doesn't need to know or care; they just do apt-get regardless. Or, if you are that way inclined, you could argue that Java/.NET bytecode compiled at run time achieves the same thing. Either way, you free the chip design from legacy.

Comment I think I agree (Score 1) 323

I don't like graphics cards. It's like a whole separate computer, with all that processing power and memory, not properly accessible from the main computer. It would be better to bring it all back to the centre. RAM is the best example of this: the graphics card can have nearly as much RAM as the rest of the system, but day to day my graphics needs are tiny. So why can't that RAM be used to cache the disk, which would be much more useful to me? The processing power, OK, it's in the form of a bag of stream processors taking a different instruction set to the CPU, but I'm sure it can be used for more than graphics. Yes, there are APIs to do that already, and yes, we are clearly heading in this direction anyway.

This doesn't take away DirectX or OpenGL; it just means you don't have to use them. It also means other, new-breed APIs can come along that generically use all that unified power and don't care about the make or model of hardware like a graphics card. Like in the old days of software rendering. Back then, the screen was just an address, and what you did with that address was up to you. You could use an off-the-shelf renderer, but you were free to write some crazy thing of your own if you wanted. In the Linux world, Gallium is interesting because it's heading in this direction, making graphics API implementations hardware agnostic, but it doesn't have the game market to really go crazy with it. We could maybe hope for some crazy demos though. :-)
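
For flavour, a minimal C sketch of what "the screen was just an address" meant. The resolution is Acorn MODE 13-ish (320x256, one byte per pixel); the real screen base address varied by machine, so this simulates it with a plain allocation.

```c
#include <stdint.h>
#include <stdlib.h>

#define WIDTH  320
#define HEIGHT 256

int main(void)
{
    /* On the old machines this would be the fixed address of screen
     * memory; here we stand in for it with an allocation. */
    uint8_t *fb = malloc(WIDTH * HEIGHT);
    if (fb == NULL)
        return 1;

    /* "Rendering" is nothing but arithmetic on that address:
     * plot one pixel... */
    int x = 10, y = 20;
    fb[y * WIDTH + x] = 15; /* palette index 15 */

    /* ...or a whole scanline, however you fancy. */
    for (int i = 0; i < WIDTH; i++)
        fb[100 * WIDTH + i] = 15;

    free(fb);
    return 0;
}
```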

Comment Re:As always, XKCD seems apropos (Score 2) 159

Go on then, I'll bite.
That's not Linux's problem; it's Adobe's and the graphics card manufacturers'. Loads of reimplementing of closed stuff would need to happen for it to be Linux's fault. (That's Linux as a platform, not just as a kernel.) With Gallium/DRM/KMS/Wayland/etc. and HTML5, hopefully it will become Linux's problem and will all go away nicely. Having said all that, it works OK for me now with the closed Flash player and the closed NVidia drivers. It's just unpalatable (and you are left in the slow lane of X developments; oh, and booting is ugly and you can't switch virtual ttys safely or fast).
