
Comment Re:*happy campers* (Score 1) 121

I played ET for days trying to figure out WTF was going on. I still don't know. But I did, oddly, enjoy playing it and trying to figure it out.

Actually, that's something I liked about SC2 as well -- I lost the first time I played, after many hours of game-play. By the time you figure out you've lost in SC2, any salvageable save-game is so old as to be basically useless, since you've forgotten all the places you hadn't visited and what you had and hadn't yet done.

FWIW, I actually wondered if someone would mention ET when I made that post. Thanks. :-)

Comment *happy campers* (Score 1) 121

In complete agreement -- Star Control II was the best game ever. I normally don't fan-spam on /. but dagnabbit I just had to chime in.

Of course, someone should take odds on whether or not a reboot can come close to doing as well as the original (the original #2, that is... StarCon was a fine but simplistic game, and StarCon 3 did not exist. IT DID NOT EXIST, I TELL YOU). Still, I'll play a sequel just on the chance it comes close.

Total Annihilation was one of my faves as well... along with absolutely everything Atari did in the 80s. How the mighty have fallen.

Comment Re:Sigh (Score 2) 445

Of course it's pollution. The first Googled definition is: "The presence in or introduction into the environment of a substance or thing that has harmful or poisonous effects." (Wikipedia's entry explicitly calls out light as a pollutant.)

First, light is clearly a thing, and we've added it to an environment in which it would not otherwise have been. Second, there are lots of studies showing that bright, constant lighting at all hours is harmful to otherwise indigenous or natural ecosystems: light pollution has been linked to changes in melatonin production, problems with bird migration, disrupted sleep cycles in nocturnal animals, and a reduced ability of vulnerable animals to hide at night during normal foraging times. Here are a few links:

http://www.plosone.org/article/info:doi/10.1371/journal.pone.0056563
http://physics.fau.edu/observatory/lightpol-environ.html
http://en.wikipedia.org/wiki/Ecological_light_pollution

There are many, many more. Sure, some human benefits of illumination, such as safety, may outweigh these, but with more options becoming available (more efficient, dimmer, more focused lights), those benefits can be had with a lower polluting impact. It's not just a problem for astronomers, although I would like to see the stars a bit better!

Comment I'd recommend LED Strips (Score 1) 445

You can get LED lighting fairly simply these days, and I think it's a lot better for outdoor use. Basically, think Christmas tree lights, but more subtle. You can get tubes or flatter strips that you can place pretty tastefully wherever you actually need to see. Consider lining walkways with dim LED strips rather than blasting everything with one obnoxiously bright light; it's easy to attach them to deck rails or gutter lines. On a dark night they're enough to see what you're doing and where you're going, and on a well-moonlit night, well, you shouldn't need them. :-) You can light up a pergola well enough to sit and hold conversations quite comfortably... to me the softer light feels more natural than a single bright beacon on a pole.

They also have the advantage of being long-lived and cheap to run (typically, since they're much lower wattage overall than huge floods).
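
To put rough numbers on the cost claim, here's a quick back-of-envelope comparison in Python. The wattages, hours, and electricity rate are my assumptions for illustration, not measurements:

```python
# Back-of-envelope energy cost; all figures below are assumed, not measured.
FLOOD_W = 150          # assumed: one halogen flood
STRIP_W = 10           # assumed: a few meters of LED strip
HOURS_PER_NIGHT = 6
RATE_PER_KWH = 0.12    # assumed $/kWh

def annual_cost(watts: float) -> float:
    """Annual electricity cost in dollars for a fixture of the given wattage."""
    kwh = watts / 1000 * HOURS_PER_NIGHT * 365
    return kwh * RATE_PER_KWH

print(f"flood:  ${annual_cost(FLOOD_W):6.2f}/yr")   # about $39/yr
print(f"strips: ${annual_cost(STRIP_W):6.2f}/yr")   # about $2.60/yr
```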

Search Amazon for "rope light" or "led strip light". Pre-strung ropes with plugs are the simplest, but you can also get long strips of light that you can daisy-chain, which require separate power supplies (AC adapters).

Comment Local vs. Hosted (Score 1) 196

The question doesn't really specify much about how the VMs are used. I use a laptop and a desktop that boot straight into Windows and Ubuntu for a variety of things when I want a dedicated machine (normally either for video games or something CPU/GPU-intense like image/video processing), but I host local VMs on those machines when I'm multi-tasking (which normally means programming or data manipulation on one hand and Netflix and web browsing on the other).

BUT, I only use VMs on my "servers"... boxes with more consistent usage patterns, anyway. Mostly at this point I use Amazon EC2, which is virtual but kind of cheating, since it's easy to treat the machines like standalone, remotely hosted entities. Still, technically that's all-VM, all-the-time.
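
For what it's worth, here's a minimal sketch of what "treating EC2 machines like standalone hosts" looks like with the boto3 library. The AMI id is a placeholder, and it assumes you have AWS credentials configured:

```python
# Launch one small instance and print its public DNS name, then treat it
# like any other remotely hosted box. Assumes boto3 is installed and AWS
# credentials are configured; the AMI id is a fake placeholder.
import boto3

ec2 = boto3.resource("ec2")
instance = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)[0]
instance.wait_until_running()
instance.reload()
print(instance.public_dns_name)
```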

Comment Re:"I have done nothing wrong" (Score 1) 447

Oddly, I think you are correct, but for the wrong reasons. The courts seem to hold that Treason can only involve "enemies" when war has been levied. Since the US is not at war with Russia or China (or the UK, for that matter), Snowden is probably in the clear on Treason. If we were, however, providing classified information to them could qualify as giving "aid". And since Snowden continually seems to consider himself an American, "adhering" is probably not going to stick either.

Still, the U.S. has not formally been at war since WWII, I believe, although Congress has authorized military action since then, and it's unclear whether those actions would qualify someone as an "enemy" for purposes of Treason. The only plausible path I can see to a Treason charge would be showing that Al-Qaeda (for example, or anyone with whom we're in military conflict, or anyone who has declared war on the US even if we did not reciprocate formally) received this information and made use of it, and I suspect the courts would eventually deny even that. "Lesser" (than treasonous) criminal charges, on the other hand, are far more likely to succeed.

So, yes, I think he is in the clear, legally, on Treason.

WSJ recently had a good article on it: http://online.wsj.com/article/SB10001424127887324688404578543410828226862.html

Comment Re:To quote Einstein (Score 5, Insightful) 381

I think you're confusing feature-creep with a comment that was meant to be about edge scenarios. Allowing someone to configure parameters that were never spec'ed to be configurable is feature-creep (gold plating, extra coding, call it what you will), and I agree it should be avoided, since it adds unnecessary (or not obviously necessary) "complexity".

Handling an edge criterion that was implied but not explicit in a specification is what is typically meant by "corner case", and that is not the same thing you described. Think of recognizing that the customer asked for something logically impossible (they want two data sets to reconcile, but the sets are at unexpectedly incompatible cardinalities), or for something that, upon investigation while building the app, wasn't precise enough (they asked for their standard green, but their standard list only includes red and blue).
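
To make that first case concrete, here's a minimal sketch (the function and the 1:1 assumption are invented purely for illustration): the spec implied the two data sets match one-to-one, so the code checks that explicitly instead of silently producing garbage:

```python
# Hypothetical reconciliation with an explicit corner-case guard.
def reconcile(ledger_a: dict, ledger_b: dict) -> list:
    """Return per-key differences, assuming the spec implied a 1:1 match."""
    if set(ledger_a) != set(ledger_b):
        unmatched = sorted(set(ledger_a) ^ set(ledger_b))
        raise ValueError(f"data sets are not 1:1; unmatched keys: {unmatched}")
    return [(key, ledger_a[key] - ledger_b[key]) for key in sorted(ledger_a)]
```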

It's nearly impossible to specify all of those prior to coding, which is why typical "waterfall" development techniques have fallen out of vogue. You're always going to learn things while coding, and this is one of the main contributors to apparently unnecessary complexity. If I design version 1 of a program perfectly and customers have new requirements for version 2, it's unlikely that the "simplest" implementation of version 1 will be the one most conducive to an upgrade. You end up with a choice between refactoring completely or sacrificing some efficiency and simplicity to graft the new features onto an otherwise good version 1.

I think Dr. Dobb's is nitpicking, though. There are definitely many ways to address, measure, or understand simplicity, and I agree that it should not be THE goal in and of itself. But the idea of making code easy to read, easy to understand in both the micro and macro sense, and just generally "simpler" has many merits.

Comment Re:Who is "we"? (Score 1) 601

"We" is a tricky word, but ultimately I do find it ironic that it seems like more people are bent out of shape over the NSA/Snowden mess than were over the Patriot Act (although I have no numbers, I may be remembering history incorrectly). Many of "us" were against the explicit and well-publicized expanse of government surveillance powers and the erosion of privacy more than a decade ago that clearly precipitated the sorts of things the NSA has been doing since. Has been doing, quite possibly, LEGALLY, since, and because of things like the Patriot Act.

The fact that the NSA leaks brought more public discourse is probably a good thing, although I'm not sure I agree the leak itself was justified. But I think it was predictable given the post-9/11 tradeoff of privacy for (ostensibly) safety, and I think "we" had a strong opportunity to tell the government not to go down this path long before Snowden's leaks came to light.

Comment Re:Bogus argument (Score 4, Insightful) 311

If you're worried about the lineage of a binary, then you need to be able to build it yourself, or at least have it built by a trusted source... if you can't, then either there IS a problem with the source code you have, or you need to decide whether the possible risk is worth the effort. If you can't get and review (or even rewrite) all the libraries and dependencies, then those components are always going to be black boxes. Everyone has to decide if that's worth the risk or cost, and we could all benefit from an increase in transparency and a reduction in that risk -- I think that was the poster's original point.
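
The cheap approximation of "built by a trusted source" is verifying a published checksum before you run anything. A minimal sketch (the file names are hypothetical):

```python
# Verify a downloaded artifact against a checksum published by a source
# you already trust. File names here are hypothetical.
import hashlib

def sha256_of(path: str) -> str:
    """Stream the file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

published = open("app.tar.gz.sha256").read().split()[0]  # from the trusted site
if sha256_of("app.tar.gz") != published:
    raise SystemExit("checksum mismatch: do not trust this binary")
print("checksum OK")
```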

The real problem is that there's quite a bit of recursion... can you trust the binaries even if you compiled them yourself, if you used a compiler that came from a binary (or from Microsoft)? Very few people have access to the complete ground-up builds required to be fully clean... you'd have to hand-write assembly "compilers" to bootstrap tools until you had truly useful compilers, then build all your software from those, using sources you can audit. Even then, you need to ensure the firmware and hardware are "trusted" in some way, and unless you're actually producing the hardware, none of these are likely options.

You COULD write a reverse compiler that's aware of the logic of the base compiler, and ensure your code is written in such a way that you can compile it, then reverse it, and get something comparable in and out, but the headache there would be enormous. And there are so many other ways to earn trust or force compliance -- network and data guards, backups, cross-validation, double-entry, or a myriad of other things depending on your needs.

It's a balance between paranoia and trust, or risk and reward. Given the number of people using software X with no real issue, a binary from a semi-trusted source is normally enough for me.

Comment OpenVMS and PDP's relationship... (Score 1) 336

"Not sure about the OpenVMS vs PDP comparison"... Since one is software and one is hardware, the confusion makes sense, but the comparison was basically valid:

The PDP-11 was 16-bit and gave way to the 32-bit VAX-11 (except, apparently, in nuclear power plants). The operating system developed to run the VAX-11 was VAX-11/VMS (later just VMS). As the OS matured and Digital Equipment Corporation (DEC) hardware moved to the Alpha CPUs, VMS matured along with it (picking up the name OpenVMS along the way), and HP eventually ported it to Itanium.

While all of those hardware platforms ran multiple operating systems (early UNIX variants even ran on the PDP-11), VAX and VMS were tightly intertwined in the minds of anyone who worked with them, and the PDP/VAX lineage was well established. In that light, comparing the lifecycle of the PDP-11 to that of OpenVMS makes a lot of sense (and, honestly, I'm very surprised by which lasted longer!).

Comment Re:So what is it? (Score 2) 166

It's a news aggregator... the progression of RSS/Atom readers, if you will. I like them -- I get most of my news through random other places (like Slashdot), but for things that might slip by, like rarely updated must-read blogs, I liked Google Reader. It's pretty easy to live without, but it did combine the ten or so sites I regularly hop around down to one link, which was helpful.
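
For anyone wondering what an aggregator actually does under the hood, here's a toy stdlib-only sketch that folds several RSS 2.0 feeds into one headline list. The feed URLs are placeholders, not real feeds:

```python
# Toy aggregator: pull <item> headlines from several RSS 2.0 feeds.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.com/blog/rss.xml",  # placeholder URLs
    "https://example.org/news/rss.xml",
]

headlines = []
for url in FEEDS:
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    for item in root.iter("item"):       # RSS 2.0 <item> elements
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        headlines.append((title, link))

for title, link in headlines:            # one combined list, one "link"
    print(f"{title} -- {link}")
```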

Comment Problem Solving Is Why (Score 1) 656

Parent got it right; it's about problem solving. I have dual math and CS degrees, and while most of the actual math escaped me decades ago (I couldn't solve half the diff-EQs or integrals now that I could in college), the practices and thought processes have (IMnsHO) made me a better programmer. Programming is about efficiency as much as or more than it is about knowing any specific language or being able to execute a particular task. Most important, I think, is the ability to have faith that your code is correct and complete... proofs in linear algebra and number theory were immensely helpful for that. Testing edge cases and knowing that your loops will terminate properly flex the same muscles as proofs by induction. I think of Pollard's rho more while doing database programming than I ever did in math classes, but I'm glad someone pointed it out to me there.
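
Since I brought up Pollard's rho: here's a minimal sketch of the factoring version, from memory, just to show how little code the idea needs (illustrative, not production number theory):

```python
# Pollard's rho factorization via Floyd cycle detection.
import math
import random

def pollard_rho(n: int) -> int:
    """Return a nontrivial factor of a composite n."""
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n           # tortoise: one step
            y = (y * y + c) % n           # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                        # the cycle yielded a proper factor
            return d

print(pollard_rho(8051))  # prints 83 or 97
```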

Math can also be directly applicable, depending on what you're going into. Visual and game programming are full of geometry and trigonometry. Artificial intelligence, big data, and data mining all require statistics, hashing algorithms, efficient tree traversal, and all sorts of things that span the boundary between CS and math. In the end, though, all of programming is just implementing algorithms, and all algorithms are just math problems. The two complement each other brilliantly.
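
As a tiny example of the geometry/trig point, here's the 2D rotation every game programmer ends up writing at some point (just the standard rotation formula):

```python
import math

def rotate(x: float, y: float, theta: float) -> tuple:
    """Rotate the point (x, y) about the origin by theta radians."""
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

print(rotate(1.0, 0.0, math.pi / 2))  # approximately (0.0, 1.0)
```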

Comment Commodore "COMPUTE!'s Gazette" Magazines (Score 1) 623

My dad brought home a Commodore, and I subscribed to COMPUTE!'s Gazette (I think that's what it was called) -- a magazine with a lot of Commodore stuff in it. One thing they ran was pages and pages of machine-code listings that you had to type in, with no debugger or syntax to speak of. I learned a LOT from that, and from the BASIC built into the OS. The first thing I really remember programming to completeness was a Julia and Mandelbrot set generator... in Commodore BASIC. It was not fast; I could see the program drawing pretty much every pixel. Good times.
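
For flavor, here's roughly what that program boiled down to, sketched in Python instead of Commodore BASIC (the resolution and iteration count are arbitrary):

```python
# Minimal ASCII Mandelbrot renderer, the spirit of the C64 original.
for row in range(24):
    line = ""
    for col in range(72):
        # Map the character grid onto the complex plane.
        c = complex(-2.5 + col * 3.5 / 72, -1.25 + row * 2.5 / 24)
        z = 0j
        for _ in range(30):
            z = z * z + c
            if abs(z) > 2:                # escaped: outside the set
                break
        line += " " if abs(z) > 2 else "*"
    print(line)
```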

I ended up with a degree in computer science, but I'd say that was more an opportunity to practice than it really was how I "learned" to program. The algorithms and operating-systems classes had some concepts I hadn't run across, and every class was an opportunity to learn or find new snippets of knowledge. But the formal things I learned, like Bresenham's circle algorithm and topological sorts, or anything from the Dragon Book, the volumes of Knuth (ah), or the numerical programming books, were important conceptually... probably good for making efficient code, and great ways not to reinvent the wheel. But the only way to "learn" to code is to code.
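
Since I name-dropped Bresenham's circle algorithm, here's a sketch of the integer-only midpoint version as I remember it (treat it as illustrative rather than textbook-verified):

```python
def circle_points(cx: int, cy: int, r: int) -> list:
    """Midpoint (Bresenham-style) circle rasterization: integer math only,
    walking one octant and mirroring it eight ways."""
    x, y, d = r, 0, 1 - r
    points = []
    while x >= y:
        points += [
            (cx + x, cy + y), (cx - x, cy + y),
            (cx + x, cy - y), (cx - x, cy - y),
            (cx + y, cy + x), (cx - y, cy + x),
            (cx + y, cy - x), (cx - y, cy - x),
        ]
        y += 1
        if d < 0:
            d += 2 * y + 1                # midpoint inside: keep x
        else:
            x -= 1
            d += 2 * (y - x) + 1          # midpoint outside: step x inward
    return points

print(sorted(set(circle_points(0, 0, 3))))  # the 16 pixels of a radius-3 circle
```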
