Comment Re:Who cares how? The better question is why the b (Score 2, Insightful) 973

Crap, I was not logged in.

That is NOT the raw video; that's the "MP4 they provided [that is] larger but is still blurry and obviously not the source video." The file is an MP4 (do helicopter cameras record in that? I doubt it). Are the timestamps clear? No. Is it still in a boxed frame in a lossy codec with titles? Yes. Is this file in the format they received it in? Maybe, but I'd still like to have it without any of the tampering they did to add titles, etc.

Comment Re:False Advertising? (Score 1) 739

You're oversimplifying just as much. There were many factors that led to the Dreamcast's demise, not the least of which was Sony spreading flat-out lies about how their upcoming PS2 could render Toy Story in real time, so you'd better hold off buying. Or how Sega was already skating on thin financial ice and their first batch of games had a high defect rate. Or yeah, how fast it was possible to pirate games on it. Pretty much anywhere with a college campus was a funnel of pirated software and games. Sega was just too weak to take hits from every direction.

Comment Re:Fonts are too small (Score 2, Informative) 198

I'm with you, man. Nobody else has problems with cut and paste of files, just the Mac. It breaks the metaphor, but so do dozens of other things they've done to add glitz over the years. (BTW: select, drag, hold down the Apple key, and release the mouse button to cut and paste across file systems.) I am used to the OS X user interface, but it took me YEARS to get used to it. There are still things I find annoying. I am not sure what to say on the Exposé thing. I can hit F9 over and over and the windows will NOT tile into the same place.

Here are some things I would love to have the option of on Mac OS X:

* Solid window borders
* No drop shadows or fading
* No transparency in menus
* Minimize with no effect
* Focus follows mouse

The first four are eminently doable, and there is no reason not to offer them as options other than the fact that Apple is a walled garden. The last one, I am told, is not possible to implement fully correctly on OS X because of assumptions the OS API makes about windows being in focus. Oh well.

In defense of Apple, many annoyances have been fixed since back when I first started using OS X 10.2. Adding alt-tab was great, Exposé is excellent, and so is Spaces. The combination of all three speeds up my workflow tremendously. It is my environment of choice, but I won't kiss Jobs' ass. Some things should be configurable and they just aren't, because the OS appearance is a marketing tool for them. I don't care much how stupid GNOME on Ubuntu supposedly looks, because I can change it to whatever I want and be happy. Not so on a Mac.

Comment Re:Yes and No (Score 1) 599

No, it's just different from everybody else's code because it ignores C coding standards, so it takes a huge mental context switch to work on one guy's old code. When I code in C, I code to C standards. When I code in Java, I code to Java standards. When I work for a place, I adjust my coding style to the company standard. I don't just do everything the same way I've been doing it forever. You might say this isn't a COBOL thing specifically, but I have heard similar complaints elsewhere.
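To make that concrete, here's a trivial, made-up example of what I mean by coding to C standards, not any official style guide: snake_case names, ALL_CAPS macros, K&R-ish braces. A Java shop would expect camelCase methods on a class with Javadoc instead; same logic, different idiom.

/* Conventional C idiom: snake_case functions, ALL_CAPS macros.
 * Everything here is invented purely for illustration. */
#include <stddef.h>

#define MAX_RECORDS 128

static size_t count_active_records(const int *flags, size_t count)
{
    size_t active = 0;

    for (size_t i = 0; i < count; i++) {
        if (flags[i] != 0)
            active++;
    }
    return active;
}

int main(void)
{
    int flags[4] = {1, 0, 1, 1};

    return (int)count_active_records(flags, 4); /* returns 3 */
}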

Comment Re:It is age discrimination (Score 1) 599

If this were a thread about H-1B, people would be screaming about how Americans won't work AT THAT PRICE POINT. Devil's advocate: why won't older workers work at that price point? One place I've worked at hired a number of older workers who had made career changes late in life and came in older but not really experienced. Looking at our hiring practices, it was obvious they were trying to get a diverse workforce (nothing inherently wrong with that), so it would be really hard to say we were being discriminatory.

It seems like in a bad economy the _very_ experienced software designers and architects (often older) get screwed, because nobody wants to pay that much and companies are cutting back on the really far-reaching products that need that level of talent. That's where I think we are right now.

Comment Re:And now for reality. (Score 1) 354

If you're involved in the game industry, then you should know that just because the technology is out there doesn't mean it's cheap to license or even available for a particular platform. The 1970s is pushing it way back. If that were true, it should have been no problem for the Sega Saturn or any other platform in the '90s that could use multiple CPUs, but it was an issue because there was no developer support, and people didn't want to program games in FORTRAN to take advantage of multiprogramming. OpenMP goes back to the 1990s, but by 1997 we had exactly one fully compliant C++ compiler, cross-platform C++ was a joke, and half the libraries either didn't compile here or there, weren't lean enough, or depended on features that weren't practical on embedded or game systems. Library support in general on game platforms, even in the devkits, sucked ass all the way through the PlayStation 2, particularly at launch.
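Just to be concrete about what OpenMP eventually bought you: a minimal sketch, not production game code, and the entity-update loop is invented for illustration. Compile with something like gcc -fopenmp. This is exactly the kind of data-parallel loop that was trivial to write on paper but useless on a single-CPU console.

/* Minimal OpenMP sketch; compile with gcc -fopenmp.
 * The positions/velocities arrays are made up for the example. */
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void)
{
    static float positions[N], velocities[N];
    const float dt = 1.0f / 60.0f;

    /* Each iteration is independent, so OpenMP can split the
     * range across however many cores exist at runtime. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        positions[i] += velocities[i] * dt;
    }

    printf("updated %d entities on up to %d threads\n",
           N, omp_get_max_threads());
    return 0;
}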

I remember using a set of macros that let you define blocks of C code that could run in separate threads, but it wasn't cross-platform at all. There were other products that were too bloated, or that we couldn't incorporate into a game without violating the licensing of either the library or one of the platforms. Nobody I knew was even using C++.
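I don't remember the exact product, but the idea was roughly what follows. This is a hypothetical reconstruction sketched over pthreads; the real thing wrapped platform-specific thread primitives, which is exactly why it wasn't portable. All names here are made up.

/* Hypothetical reconstruction of the macro style, over pthreads.
 * Link with -lpthread. Everything named here is invented. */
#include <pthread.h>
#include <stdio.h>

#define THREAD_BLOCK(name) static void *name(void *arg)
#define SPAWN(tid, name)   pthread_create(&(tid), NULL, (name), NULL)
#define JOIN(tid)          pthread_join((tid), NULL)

THREAD_BLOCK(update_audio)
{
    (void)arg; /* unused */
    puts("mixing audio in its own thread");
    return NULL;
}

int main(void)
{
    pthread_t audio_thread;

    SPAWN(audio_thread, update_audio);
    /* ... the main game loop would run here ... */
    JOIN(audio_thread);
    return 0;
}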

Basically, starting around 2006 as you said, when the PC platform was becoming multi-core, was when it all came together: you had all the libraries you might need, C++ was commonly used on every platform, and code was pretty portable. Even now it is totally realistic to license a game engine for your game that is still single-core because it originated before 2003, like Unreal Engine. What I am getting at is that now is the multiprogramming time, and it didn't really start becoming practical until we got to the convergence of C++ ubiquity, C++/library portability ubiquity, C++ feature ubiquity, multiprocessing library ubiquity, multiprocessing platform ubiquity, and multiprocessing engine ubiquity (still not there yet, but close).

I am not counting client-server setups or offloading things like network/sound to other threads/processes, because any platform that supported that had been capable of it for a long time, and that's not the stuff that is killing your performance. On older single-CPU systems, using a multithreaded engine was just stupid: even if you shunt something you need off to another thread, you're just going to wait around for the result, or drop frames if it takes too long. So why bother changing the model to something confusing and unnecessary? Just put everything in a single-threaded loop. That's the way everybody had been doing it since forever, and there was no reason to give a shit until recently, so why spend all the money and time? Well, as you said, the time is now, and things are changing. But saying it's been possible FOR THE GAMING INDUSTRY since the 1970s is just ridiculous. I can't see any reason why anybody should have cared until the last couple of years.
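For anyone who hasn't seen one, the single-threaded loop I mean above has roughly this shape. It's a generic sketch, not any particular engine, and the subsystem hooks are invented; every engine named these differently.

/* Generic sketch of the classic single-threaded game loop.
 * Subsystem functions are placeholders, not a real engine API;
 * exit handling is omitted for brevity. */
#include <stdbool.h>

static void read_input(void)       { /* poll the pads */ }
static void update_world(float dt) { (void)dt; /* run game logic */ }
static void render_frame(void)     { /* draw and flip */ }

int main(void)
{
    const float dt = 1.0f / 60.0f; /* locked to the display rate */
    bool running = true;

    while (running) {
        read_input();     /* 1. input    */
        update_world(dt); /* 2. simulate */
        render_frame();   /* 3. draw     */
        /* every step waits on every other step, by design */
    }
    return 0;
}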

Comment Re:Why redirect them? (Score 1) 512

Past initial development (coding to standards), sometimes up to another 30% is needed to get one of the sites I work on working correctly in IE without breaking all or some of the other major browsers. Those development costs are the fault of Microsoft. Microsoft is the parasite, not the companies that simply want us to support whatever browser has the most users, only for that browser to change on them in a few years. Imagine if the web development world could bill Microsoft directly for the extra costs of supporting their crap browsers.

Comment Re:one error will invalidate a computer program?!? (Score 1) 505

That statement was kind of breathless, but the study he was citing focused on bugs that specifically affected the accuracy of the output and found that they were a common occurrence. I agree with the author: if you are going to use a computer program to get results, you need to publish the code; otherwise your methods are packaged in a black box. A lot of people don't want to do this because scientific code is not usually written by people who are knowledgeable about writing reliable, verifiable code. It's usually a pieced-together means to an end. Not that there's anything wrong with that, IF it is available for verification. I HATE reading studies that, for example, constantly refer to a dataset and then never give you the dataset. I guess unlike many people I don't naturally trust the authors to be perfect.
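A made-up but typical example of the kind of bug that silently skews results without crashing anything, which is exactly why the code needs to be published:

/* Integer division truncates BEFORE the value reaches the double,
 * so the "average" is quietly wrong. A classic accuracy bug. */
#include <stdio.h>

int main(void)
{
    int sum = 7, n = 2;

    double wrong = sum / n;         /* 3.0 -- truncated first      */
    double right = (double)sum / n; /* 3.5 -- what was intended    */

    printf("wrong=%f right=%f\n", wrong, right);
    return 0;
}

The program runs fine either way; only a reader with the source can spot the difference.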
