Comment Oversimplification in the article (Score 4, Informative) 135

"As long as that SSD doesn't stall trying to pull blocks off the top of that queue, it really doesn't matter how deep it is. So if you have 10GB of free space on your partition, you only need to call wiper.sh / fstrim once every 10GB worth of file deletions."

This isn't necessarily true. Trimming earlier improves SSD performance because the drive knows about more free space, and more free space allows the drive to 1) pre-emptively erase flash, 2) coalesce fragmented blocks, 3) combine write blocks more efficiently, and 4) perform wear-levelling operations with less overhead.

Early trimming has an effect similar to the manufacturer increasing slack space, which improves performance on nearly all SSDs.
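For anyone curious what the trim call actually looks like, fstrim is essentially a thin wrapper around Linux's FITRIM ioctl. Here is a minimal sketch (my own illustration, assuming Linux and a filesystem with discard support mounted at the example path /mnt/data):

// Minimal sketch of what fstrim does: issue FITRIM against a mounted filesystem.
// Assumes Linux; "/mnt/data" is an example mount point, not from the article.
#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fs.h>   // FITRIM, struct fstrim_range

int main() {
    int fd = open("/mnt/data", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct fstrim_range range;
    std::memset(&range, 0, sizeof(range));
    range.start = 0;
    range.len = ~0ULL;   // ask the filesystem to trim everything it can
    range.minlen = 0;    // no minimum extent size

    if (ioctl(fd, FITRIM, &range) < 0) {
        perror("FITRIM"); // fails if the filesystem/device has no discard support
        close(fd);
        return 1;
    }
    std::printf("trimmed %llu bytes\n", (unsigned long long)range.len);
    close(fd);
    return 0;
}

The kernel reports how many bytes it actually discarded back in range.len, which is what the fstrim output shows.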

Comment Re:News? Stuff that matters? (Score 1) 93

Yeah but this "archaeological chemist" thinks that ancient Egypt was a desert, whereas most others have concluded that it was a lush rainforest, and that the people of that day were into farming on a large scale. That and there aren't many comments on this "nerd fair" article.

It wasn't a rain forest. The yearly flooding of the Nile provided fertile soil and water, which allowed for farming of the flood plain. Go a bit away from the Nile and you were back out in the desert rather quickly. This is why nearly all major ancient Egyptian sites are along the Nile River (whereas a rainforest would have allowed for a more geographically dispersed population).

The Nile no longer floods every year, though, due to the construction of the Aswan Dam.

Comment Re:They printed off assembler (Score 5, Insightful) 211

Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

That was how I wrote my first published game back in the '80s. I have no complaints. Everything was new back then, and even though the "wheel hadn't yet been invented," programming was still exciting; it was some of the most fun coding I have ever done.

Comment Re:How about benchmarking the binary? (Score 1) 196

We are targeting custom / closed-wall systems: a single binary EXE for user systems, with no user-level DLLs or external binary loading allowed (for security purposes). Incremental linking is not allowed on retail or profile builds; it's a linker-level hack that potentially adds a thunk per function (and at the very least adds one per function moved), and it significantly changes the code memory layout from build to build for the functions you are modifying. Additionally, on certain platforms we use link-pass optimizations (i.e. link-time inlining) on retail and profile builds, and those require incremental linking to be disabled.

In the best case on the builds mentioned above, we get about a 15-second compile time followed by the full link time.

But even without these limits, it is very easy to find programs where a single change in a common header causes a full recompile, and non-incremental link times on large projects (especially ones with a lot of redundant template instantiations per translation unit) can grow significantly.
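As a concrete illustration of that redundant-template cost (my own example, not from the parent): explicit instantiation with extern template lets you emit a heavily used instantiation once, instead of in every translation unit that the linker then has to deduplicate.

// container.h -- a hypothetical, widely-included header.
#pragma once
#include <cstddef>
#include <vector>

template <typename T>
class Pool {
public:
    void add(const T& v) { items_.push_back(v); }
    std::size_t size() const { return items_.size(); }
private:
    std::vector<T> items_;
};

// Every TU that includes this header is told NOT to instantiate Pool<int>
// itself; the object code for it is emitted exactly once, in container.cpp:
extern template class Pool<int>;

// container.cpp -- the single explicit instantiation:
//     #include "container.h"
//     template class Pool<int>;

It doesn't help with the header-change problem itself, but it does cut the per-TU compile and link work for template-heavy code.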

Comment Re:Compile time is irrelevant. (Score 1) 196

Faster compile times make for faster iteration... which lets you more easily test global changes (for example, which optimizations actually work). Not to mention that better iteration on a program usually produces a superior product.

Also, as a developer, I find faster compile times make my life a little less frustrating, so I'm less likely to pull out all my hair while waiting on the computer.

Comment Re:How about benchmarking the binary? (Score 1) 196

You obviously don't work on large projects where build times can be 30 minutes and link times can be 5-10 minutes on top of that. In the past we have tried just about everything possible to make our compiles faster, because it allows more iteration and less time waiting on code to build. This includes minimizing include dependencies and looking at dependency graphs, benchmarking distributed build systems (IncrediBuild), working with pre-compiled headers, examining unity builds / unified builds (think one CPP that includes many other CPPs in the same system), etc. We also buy fast hardware: 8-core/16-thread CPUs, 32 GB of memory, and fast SSDs. All because minimizing build time means more productive time for developers.
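For anyone who hasn't seen a unity / unified build, it really is that blunt an instrument; a rough sketch (the file names are invented for illustration):

// physics_unified.cpp -- a hypothetical "unity" translation unit.
// Instead of compiling each .cpp separately (re-parsing the same headers
// every time), one TU includes all of them, so shared headers are parsed
// once and the optimizer sees the whole subsystem at once.
#include "broadphase.cpp"
#include "narrowphase.cpp"
#include "solver.cpp"
#include "ragdoll.cpp"

// Trade-offs: file-local symbols (statics, anonymous namespaces) can now
// collide across the included files, and touching any one of them rebuilds
// the entire unified TU.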

Comment Re:Citation needed that most Jag games ran on 68K (Score 2) 41

The fast co-processors (Tom and Jerry) didn't have instruction caches (as you would think of them today, anyhow). They did have a small amount (4K) of directly mapped local memory. They were originally designed to run programs either in this memory or in normal memory. However, due to bugs in the chip, you could only reliably run code from the 4K internal memory. Since this was directly mapped, that meant all your code had to run in 4K. If you wanted to run larger programs, you needed a small amount of resident code that swapped functions or chunks of larger code into memory, did fixups on them, and then ran them on the GPU. Most developers didn't have the expertise to do this themselves, so indeed, a lot of game code ran on the 68K, with certain heavy-lifting functions (graphics transforms or blitter programming) happening on Tom and usually just audio/DSP (software mixing) on Jerry.
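To make that "resident code that swapped chunks in and did fixups" concrete, here is a very rough, generic sketch of the overlay technique (my own illustration in ordinary C++, not actual Jaguar code; the names, the fixup format, and the plain static buffer standing in for on-chip RAM are all assumptions):

// Generic overlay-loader sketch: copy a code blob into a small fixed region
// (standing in for 4K of fast local RAM), patch its absolute references,
// then call it. On real hardware you would also deal with cache flushes,
// alignment, and making the region executable; all omitted here.
#include <cstddef>
#include <cstdint>
#include <cstring>

constexpr std::size_t kLocalRamSize = 4 * 1024;          // the 4K limit
alignas(16) static uint8_t g_local_ram[kLocalRamSize];   // stand-in for on-chip RAM

struct Overlay {
    const uint8_t*  code;        // code blob as linked
    std::size_t     size;        // must be <= kLocalRamSize
    const uint32_t* fixups;      // offsets of absolute 32-bit words to relocate
    std::size_t     fixup_count;
    uint32_t        link_base;   // address the blob was originally linked at
};

using OverlayEntry = void (*)();

// Resident routine: load one overlay into the local region and hand back its entry point.
OverlayEntry load_overlay(const Overlay& ov) {
    std::memcpy(g_local_ram, ov.code, ov.size);
    const std::uintptr_t delta =
        reinterpret_cast<std::uintptr_t>(g_local_ram) - ov.link_base;
    for (std::size_t i = 0; i < ov.fixup_count; ++i) {
        uint32_t* word = reinterpret_cast<uint32_t*>(g_local_ram + ov.fixups[i]);
        *word = static_cast<uint32_t>(*word + delta);    // crude absolute fixup
    }
    return reinterpret_cast<OverlayEntry>(g_local_ram);
}

The game's resident code would keep a table of these overlay records and swap them in on demand, which is exactly the kind of machinery most teams didn't want to write themselves.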

FWIW, the hardware was quite buggy as well. I think I averaged finding around one undocumented hardware bug per week in the various coprocessing chips while working on the system.

Comment Tempest X3 - Playstation (Score 1) 41

It's worth noting you didn't have to be "one of the 5 people with an Atari Jaguar" to play the original game. High Voltage Software did a port of the game to the PlayStation titled Tempest X3. I even did a very tiny amount of work on that project, although I don't remember whether I received a formal credit.

Comment Re:Wow. (Score 1) 333

What I want to know is who they had to waterboard to get insurance companies to provide information about their policies written at a 6th-grade level...

One benefit of Obamacare is standardizing what insurance policies will cover, eliminating many fine-print items (like pre-existing-condition exclusions and age restrictions) and setting standard limits for copays and out-of-pocket expenses. The only major differences within an insurance class on the exchanges are deductibles, premiums, and doctor networks. This makes it much easier to make apples-to-apples comparisons and actually makes the free market of the exchanges work better for consumers.
