
Comment Re:Bull! (Score 1) 223

Eclipse: Who, at the end of the day, decides what happens? I'm not as sure :), but an organization formed specifically for that codebase, arm's length from anyone else

I dunno, when I interacted with the Eclipse community it seemed like everyone who made decisions had an IBM email address. IBM employees decided which bugs to work on, which features to add, and which bug reports should be classified as enhancements. The Eclipse guys will jump through all kinds of semantic hoops to avoid closing a bug report as "valid but won't fix." I suspect that reality-bending, reality-denying mentality must come from the project's corporate nature. Maybe it is evidence that IBM cares enough about the Eclipse project to apply some dumbass management metrics to it.

Since plugins are OSGi bundles, the high-level aspects of plugin development are influenced by the industry consortium that controls OSGi, but 95% of what developers care about is decided in-house at IBM. Maybe the developers aren't technically making those decisions as IBM employees, but they know who pays their salary.

Comment Re:Same problem as movies. (Score 4, Insightful) 166

The kind of fear you're talking about is great, but I rarely get it from games. I treasure the memory of playing an "Aliens" mod for Doom. "Keep it tight, people." "Check those corners... check those corners!" You can't see anything, but you can hear the aliens breathing. Then SHRIEK they're in your face. It got to the point where my heart was going a mile a minute the whole time, and it took me a long time to calm down afterwards so I could sleep.

Unfortunately my success at finding good horror games (and horror movies, for that matter) is so low that it isn't even worth trying anymore -- one good game in ten means paying hundreds of bucks to find a good game, plus the frustration of sitting through so much garbage.

So, yeah, I do like the second kind of horror you're talking about, but I've kind of given up on it.

As for immersion, I agree completely. Few modern games are really immersive. Note to game designers: There's nothing realistic about looking at a bush and not being able to figure out whether you can step over it, or looking at a tree and not knowing whether there's an invisible corridor that will prevent you from ducking behind it. Nobody in World War II died while trying to take cover behind a pile of sandbags that were un-jump-overable for the sole reason that they marked the edge of the battlefield. Level designers need to stop putting visual verisimilitude over everything else and once again start considering the verisimilitude of the whole experience.

One aspect that all immersive games share is that you can almost always predict how objects and terrain will affect your movement. Gosh, just like in real life! That's actually more important to immersion than making the terrain look realistic. If you're examining every rock and bush as a game construct instead of perceiving it as a "real" object, then you're going to see the monster, scary noise, or eerie apparition as just another game construct. You think, "Well, here's a new game object. Looks like a werewolf. Let's figure out how to interact with it. I'll start by walking towards it and seeing what happens. I'll probably get killed, but I might as well be systematic if I want to figure out how it works." That's not an immersive experience :-)

"Hmmm, there's a toddler sitting on the floor of my living room finger-painting with blood."

Immersive game reaction: "What... the... FUCK is a mysterious toddler doing in my house? Whose blood is that!!?"

Non-immersive game reaction: "I wonder if I can jump over him?"

Comment Re:Same problem as movies. (Score 4, Insightful) 166

It also doesn't work if dying means you respawn one or two minutes earlier in the game. Call of Duty's single-player mode is like this. I realized this when I noticed that I never paused the game when I wanted to answer the door, talk on the phone, or get a drink from the fridge. I just left my guy standing there, and if he died, so what?

When save points are so frequent, dying doesn't even impede your progress through the game unless you do it five times in a row. As a result, to make the game challenging, there have to be individual segments that are INSANELY challenging, which just makes you angry. You only get scared if dying once is a big deal. If dying twelve times in a row is what it takes to get a gamer's attention, he doesn't get scared. He gets angry.

So, you go through the entire game never being scared. You're just bored, moderately engaged, or angry depending on whether the difficulty is too low, about right, or too high.

Games that let you choose your save points, like the original Doom and Doom II, were much scarier, because you would limit your saves out of pride, and you'd also get caught up in the game and forget to save and then GAAAH I'M ABOUT TO DIE AND THIS IS REALLY SERIOUS! You panic about dying because it took half an hour of good play to get where you are, and if you die, you lose it all.

If (like me) you were normally too proud to save in the middle of a level, it meant that there was a great buildup of suspense through the level, because you had more and more to lose the further you got. In checkpoint games, it doesn't matter where in the level you are, so there's no buildup and climax, no arc to the game at all except what they can build up artificially through tacked-on story elements.

Comment Re:Don't be a douche (Score 1) 551

If a status report contains anything interesting, it results in the manager walking over for a chat anyway. Plus, programmers hate to think about bureaucratic stuff. Producing a status report on a regular schedule means constantly thinking about it -- when's the next one due? Have I missed one? You can't imagine how angry and resentful it makes programmers if you make bureaucracy a constant presence in their minds. Bureaucracy is hateful knowledge and a hateful intrusion into their thinking. You might as well force the marketing guys to use Linux and create marketing brochures in Emacs using LaTeX.

On the other hand, if the manager takes responsibility for periodically soliciting status from developers, then everything is good. The manager is the one who needs the information, so he won't forget to ask for it. Developers are usually happy to chat about their work for a few minutes. Best of all, face-to-face two-way conversation means that a lot of potential misunderstandings get ironed out immediately.

Comment Re:Key Point # 1 (Score 3, Insightful) 551

There's an alternative theory of human social structure, in which men naturally organize themselves into a hunting band with one leader over a group of more-or-less-equals. The leader maintains his position because the other guys like him and trust that they will be successful under his leadership. The leader usually isn't even the roughest, toughest guy. The biggest sin in this kind of group is overvaluing yourself relative to your contribution to the group: arrogance and selfishness are punished.

That's quite different from the wolf pack model, where there's a hierarchy from the strongest at the top to the weakest at the bottom. The only sin in a wolf pack is weakness: weakness is punished ruthlessly.

In a wolf pack model, the manager would have to be the best coder, the strongest personality, or the toughest hombre. But in real life the manager is usually a poor (or washed up) coder who is allowed to play a "superior" role because the people under him believe the group will be successful under his leadership. Managers who believe they are better than their underlings face constant undermining and insubordination.

Comment Re:Don't be a douche (Score 1) 551

These are creative people, and will resist things like status reports and hard work schedules.

Well, to be more accurate, they'll work as hard as they want to, and they probably shouldn't work any harder. Don't try to make them work harder, or you'll be a douche.

And if you want status reports, ask. Don't nag them with, "Why haven't you updated me with your status?" You're the manager; take responsibility for keeping up to date.

Inevitably they will sometimes go over schedule or screw something up. That is when you have to be very careful not to make them bitter and resentful. The first time it happens, trust them completely. Just say, "Okay, keep me informed, and let me know if I can do anything to help." Then slowly escalate your involvement, but...

Don't get involved in helping them out technically. If a developer is drowning, assign another developer to help him out -- don't stick your nose in.

The best way to maintain acceptable technical quality is not for you to review their plans and designs but for them to talk things over with each other. If they don't do it naturally, assign projects to teams or pairs of developers rather than individuals. Tell each pair they can divvy up work between themselves however they like, but they're both responsible for knowing and vetting the overall design and alerting you to possible slippage.

Comment Re:Java? (Score 1) 962

My message might not have been entirely clear. I think Java is great as an entrenched and highly popular language. In most ways it beats the popular languages that it displaced. I usually prefer it over C++ even though it's a lot less expressive.

However, Java makes lots of compromises that kids will pay for but not benefit from. Like you said, kids don't care that every object is (notionally) a heap object with a built-in mutex to support "synchronized," but professionals do care, and so Java got non-Object primitive types -- and with them a dual type system where ints, floats, and booleans are fundamentally different from Integers, Floats, and Booleans. For beginners, that's just confusing. Integers, Floats, and Booleans are related because they're all Objects, but ints, floats, and booleans are not related at all.

The experienced programmers that Java was aimed at were already familiar with those two ways of organizing types, so it was reasonable to ask them to juggle both in Java. In fact, I bet most professionals who were hesitant about Java were actually reassured by the presence of primitives. It made Java seem less "weird" and less like Smalltalk.

Asking kids to learn two type systems that were shoehorned into the same language as a performance hack, though, is an entirely different matter. There's no reason to distract beginners with unnecessary complexity when they're still learning the basics.
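
To make that concrete, here's a minimal sketch of the dual type system in action (assuming Java 5+, where autoboxing papers over the split right up until it doesn't):

```java
import java.util.ArrayList;
import java.util.List;

public class TwoTypeSystems {
    public static void main(String[] args) {
        int a = 1000;       // primitive: not an Object
        Integer b = 1000;   // boxed: a full heap Object

        // Generics only accept Object types, so a "list of ints"
        // must really be a List<Integer>; List<int> won't compile.
        List<Integer> xs = new ArrayList<Integer>();
        xs.add(a);          // autoboxing silently converts int -> Integer

        // But == on boxed values compares object identity, not value.
        Integer c = 1000;
        System.out.println(b == c);       // false: two distinct objects
        System.out.println(b.equals(c));  // true: value comparison

        Integer d = 100, e = 100;
        System.out.println(d == e);       // true: small values are cached
    }
}
```

Now try explaining to a twelve-year-old why 100 == 100 but 1000 != 1000.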

Comment Re:Java? (Score 2, Informative) 962

Python and Ruby are pretty popular and far from weird. Logo was designed for kids, and many of us have fond memories of using it as kids, so it isn't odd that people recommend it.

Java is actually a really poor choice. Java started as a language for set-top boxes, but it hit the big time because it was a pretty successful attempt to address the concerns of large-scale commercial software development in the late 1990s:

  • Reliability, for which the language designers chose garbage collection, exceptions, and lack of direct memory access. Those choices are pretty good for beginners, though it's a drag that you have to learn about exceptions right away before you can write any real programs.
  • Easy, manageable deployment. Beginners don't lose anything here. But now it gets bad....
  • Easy no-brainer mixing of code from different sources. That means a literally global package namespace based on domain names: com.yourcompany.Foo instead of Foo. For professionals, using this system for a few weeks, especially with IDE support, is enough to convince you that every language should do it that way. It just works without thinking. But for kids and beginners? They get frustrated at having to type a bunch of lengthy imports. They don't need a literally global namespace, but they have to pay the keystroke-and-readability cost of it anyway.
  • Conceptual tractability of extremely large programs. That means no macros and no operator overloading. The lack of operator overloading is a real frustration for beginners. Beginners don't like special cases; special cases just slow them down and take up extra space in their brains. Why are operators and methods different? If operators don't belong to classes, why can't I define functions that don't belong to classes? Why can classes override toString() but not +?
  • A language that isn't bristling with features but which contains a fairly idiot-friendly implementation of class-based OO, the most widely understood form of OO in the industry. These two priorities conflict a bit. After adding objects, interfaces, and polymorphism, they started to feel like the language was complicated enough, so they left out everything else. I think that's why there are no namespaces and no free-standing functions -- they hit the limit of their conceptual budget and had to do some brutal elimination of features. Classes are used as namespaces and as homes for functions that should be free-standing -- which is a really confusing overloading of the class concept if you're a beginner (see the sketch after this list).
  • Good-enough performance for the compilers and VMs of the time. That's where we got the stupid difference between Objects and built-ins. It's obvious to professionals why it makes sense -- avoid the overhead of heap Objects with built-in mutexes! But beginners shouldn't have to be exposed to weird language inconsistencies that exist solely as performance hacks.
  • An easy transition, both intellectually and emotionally, for C++ programmers. That basically determined the entire syntax and (see above) determined that the OO system would be a simplification of C++ classes. Obviously beginners don't benefit from the fact that Java has a lot of superficial similarities to C++, and once again they pay the price without reaping the benefits.
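
Here's a minimal sketch of the namespace tax, the missing free functions, and the operator asymmetry all at once (Fraction is a hypothetical example class, not anything from the Java libraries):

```java
// File: com/example/math/FractionDemo.java
package com.example.math;  // globally unique, domain-based name: the namespace tax

public final class FractionDemo {

    // No free-standing functions, so even a simple value type and its
    // arithmetic have to be wrapped in a class.
    static final class Fraction {
        final int num, den;
        Fraction(int num, int den) { this.num = num; this.den = den; }

        // No operator overloading: built-in types get "a + b",
        // user-defined types get "a.plus(b)".
        Fraction plus(Fraction other) {
            return new Fraction(num * other.den + other.num * den,
                                den * other.den);
        }

        // toString() can be overridden; + cannot.
        @Override public String toString() { return num + "/" + den; }
    }

    public static void main(String[] args) {
        Fraction half = new Fraction(1, 2);
        Fraction third = new Fraction(1, 3);
        System.out.println(half.plus(third));  // prints 5/6 -- not half + third
    }
}
```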

Those are just the warts that come to mind right now. The worst thing about Java is that its warts and oddities can only be explained by saying, "You'll understand someday when you have to work on huge software systems," which to kids sounds the same as, "You'll understand when you're older," the classic parental cop-out.

I think for kids it's better to use a language based on a clean set of abstract concepts, because when a kid asks "Why?" you want to be able to give him an answer that he can grasp without having any experience of large-scale software engineering. Otherwise you're just teaching him that he'll never understand the reasons for things, and the most important decisions are made by people who know better than him. That's not the right lesson to teach a kid, unless you want him to grow up intellectually passive and easily controlled.

Comment Re:Python (Score 1) 962

I think Python would be great for a first language for kids. Pascal would be a great second language, or a first language for kids who have a desire to be hard-core.

It's easy to forget how fascinating something like Pascal or C++ is the first time you learn it. "Each piece of data lives in memory as a pattern of bits. Because memory is just strings of bits, the program must know what kind of data is encoded in the bits before the memory can be interpreted as a data value."

If you're a jaded C++ programmer, reading that makes you think, "Oh, hell, I see where this is going. I'm going to have to declare the type of every single &^*)&*ing variable. Why the hell do I still have to do this stupid stuff in 2008? Screw it; I've had my fill of shared_ptr<vector<shared_ptr<shared_ptr<Blob> > > > > > > > > > > > > > > and I don't waste my time on type declarations unless I get paid for it."

A kid, on the other hand, thinks "Oh. So that's how it works. I knew Python was too good to be true! The computer doesn't 'just know.' There's a way that it knows, and I'm going to understand it! I'm going to know the secret!"
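
The comment above is about Pascal and C++, but you can demonstrate the same secret even in Java, since Float.intBitsToFloat lets you reinterpret one 32-bit pattern two ways:

```java
public class BitsSecret {
    public static void main(String[] args) {
        // One 32-bit pattern, two interpretations: the declared type
        // tells the program how to decode the bits.
        int bits = 0x41200000;
        float asFloat = Float.intBitsToFloat(bits);

        System.out.println(bits);     // 1092616192 when read as an int
        System.out.println(asFloat);  // 10.0 when read as an IEEE 754 float
    }
}
```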

And then polymorphism, which is especially cool if it's presented to you from two directions, like a cheap paperback thriller bringing two strands of the plot together. Motivation... implementation... motivation... implementation... motivation... implementation... until suddenly you are blown away by the awesome power and simplicity of the vtable.

It gets a lot harder to appreciate things like that when you get old and jaded.

Comment Re:Logo, LISP, Scala, F#, Erlang, and Haskell (Score 1) 962

Scheme! There's something great about starting out with an interpreted language - instant results keep kids interested.

Scheme also has the advantage of two brilliant books for beginners, The Little Schemer and The Seasoned Schemer. I think eleven-year-olds would get a kick out of the fact that these books are written for adults, written in fact to be challenging and engaging for adults, yet are written in a very simple format with very simple language, exactly as if they were written for children. The message to adults -- set aside your burden of knowledge, become childlike, start from scratch with Scheme. The message to kids -- this is an area where you can approach serious stuff on equal terms with adults.

For a child, it is always a gratifying change to hear that adults would understand better if they were more childlike and less handicapped by experience. That alone is enough to fire a kid's enthusiasm for programming.

Education

Best Introduction To Programming For Bright 11-14-Year-Olds? 962

firthisaword writes "I will be teaching an enrichment programming course to 11-14 year old gifted children in the Spring. It is meant as an introduction to very basic programming paradigms (conditions, variables, loops, etc.), but the kids will invariably have a mix of experience in dealing with computers and programming. The question: Which programming language would be best for starting these kids off on? I am tempted by QBasic which I remember from my early days — it is straightforward and fast, if antiquated and barely supported under XP. Others have suggested Pascal which was conceived as an instructional pseudocode language. Does anyone have experience in that age range? Anything you would recommend? And as a P.S: Out of the innumerable little puzzles/programs/tasks that novice programmers get introduced to such as Fibonacci numbers, primes or binary calculators, which was the most fun and which one taught you the most?" A few years ago, a reader asked a similar but more general question, and several questions have focused on how to introduce kids to programming. Would you do anything different in teaching kids identified as academically advanced?

Comment Re:Don't write off the Java *platform* (Score 1) 997

I second this post. The Java platform has tons of tools, amenities, and libraries. All you need is a decent language: take your pick! Java the language is now the C of the Java world: it's the lowest common denominator, the systems programming language, and the application language of choice for conservative no-nonsense types.

Scala is the C++ of Java-land: a sophisticated multiparadigm programming language that does everything Java does and more. On the other hand, when you compare Scala against C++'s weaknesses, Scala is the anti-C++: terseness through type inference, excellent support for functional programming, a small and clean language core, and able to be described in a twenty-page white paper. Suck that, C++!

Programming

What Programming Language For Linux Development? 997

k33l0r writes "Recently I've been thinking about developing (or learning to develop) for Linux. I'm an IT university student but my degree program focuses almost exclusively on Microsoft tools (Visual Studio, C#, ASP.NET, etc.) which is why I would like to expand my repertoire on my own. Personally I'm quite comfortable in a Linux environment, but have never programmed for it. Over the years I've developed a healthy fear of everything Java and I'm not too sure of what I think of Python's use of indentation to delimit blocks. The question that remains is: what language and tools should I be using?"

Comment Perfect time to ditch the GIL (Score 1) 357

There are three things that hold python back in scientific computing as far as I am aware and they are iteration, recursion and multithreading.

Finally somebody mentioned threading! I came in here specifically to find out whether 3.0 has any threading improvements. The Python FAQ has long said that they can't get rid of the global interpreter lock for two reasons: It might make Python twice as slow, and it would break all C extensions. If Python 3000 is going to break the C extensions anyway, this would be a perfect time to ditch the GIL. (I don't think anyone would object to making single-threaded Python twice as slow, except maybe web developers.)

I don't see any mention of it anywhere, though, so I'm not optimistic :-(
