
Comment Re:It's good if they don't code like 90s C++ devs (Score 2) 298

Techno-machismo teens playing games trying to get their code into the least number of characters and the least amount of memory.

Given how many programs I've used over the past decade that demand unreasonably large amounts of RAM, and/or are later releases of software whose earlier iterations solved the same problem in a tenth of the memory, I wouldn't mind a handful of these guys getting their desks back. Adobe is a great place to start.

I've had to fix or test so much of this junk and it's still just plain stupid.

I'm by no means a programmer, but from the handful of times I've seen what you're talking about, I'll say this: if it's possible to save 10% of RAM by shifting the readability burden to the comments instead of the actual code, then as an end user, I'm 100% in favor of commenting the everloving hell out of difficult-to-read code that saves RAM when users are running it.

You don't have to save memory! Memory is there to make your code readable. Use it!

Wrong. Use as much as necessary, but no more. I might have 12GB of RAM in my laptop, but it's not all for you. It's for EVERYTHING I do, and when I start adding VMs or large After Effects projects to what I'm doing, 12GB starts to get pretty cramped. Some RAM usage will be inevitable, but wasteful RAM usage is wasteful.

I most certainly have plenty of admiration for the Demoscene.

Comment Flashback to 2011 (Score 1) 160

http://tech.slashdot.org/story...

Seriously guys, when Microsoft 1) had the idea years ago, 2) has the investment capital to give it a viable shot, and 3) with Azure, has an immediate and marketable need for a set of servers that can be dynamically powered up and down...and THEY haven't managed to make it viable...I sincerely doubt that a startup in the Netherlands will have greater success.

To be fair, though, one would imagine that the Netherlands is colder for more of the year than the majority of the continental US. Still, servers that come up and down with the thermostat don't seem like a solid enough idea to be of real use.

Comment Re:Simple (Score 1) 166

Due to the laws of nature messages can safely be assumed to have been transmitted no later than the time of reception.

I wonder if I should patent this...

Yes, patent it. However, be sufficiently ambiguous in your patent that it will somehow apply to time machines. If the patent passes muster, you'll obviously receive a visit from either your future self or a beta-testing team that needs you to not file that patent. Either way, you'll likely end up very rich or very dead.

Comment Re:The sky is falling... or maybe not. (Score 1) 362

Tell me why I don't want secure boot and an OS signed by Microsoft or one of the mainstream Linux distributions.

Acronis True Image/Clonezilla/Ghost/Veeam.
Hiren's Boot CD/Active@ Boot CD/UBCD4Win.
GParted/Parted Magic.
FreeNAS/Nas4Free/OpenFiler/UnRAID.
Endian/Smoothwall/Untangle/IPFire.
FreePBX/PiaF/Trixbox.
Univention/Zentyal/ClearOS.

Sure, desktops for office use will never have a problem. However, the inability to use desktop hardware for any number of other dedicated services is most decidedly a bad thing, and the inability to boot into some form of recovery environment is even worse.

If you're buying the desktops for internal corporate use, leave secure boot on and password the BIOS; everyone's happy.

Comment Re:Like Voyager's golden record? (Score 1) 169

I can already not play my 16bit DOS games on a modern computer without fancy emulation and that is less than 20 years.

Machine code may get a bit dicey, but keep in mind that a lot of development back then happened at a much lower level (frequently tied to CPU clock cycles, with nothing like multicore to worry about). Around that era, WordPerfect 5.1 ruled the roost of word processing, and I'd bet you could open one of those documents in 15 minutes or less today.

NASA themselves have had issues with recovering data from moon landings and other missions that have been kept in storage.

...because they taped over them. That's a smidge different than being unable to read their own tapes.

Then take into account proprietary codecs. Just think back 15 years ago where everything online was Realmedia

Not only was RealMedia much more likely to be streamed than downloaded, but RealPlayer is still around, as is Real Alternative, and VLC can read several flavors as well. "Uncommon" and "unable to be used" are two different things.

what makes you think that MP3s will be around in 100 years?

Ubiquity. Portable MP3 players have been around for nearly 15 years. The format itself is about 25 years old. You can play back an MP3 on almost literally every OS ever written, there are a dozen FOSS applications to do so, and aside from iTunes going all AAC by default, MP3 is what everyone else uses. Admittedly, media formats are difficult to extrapolate for 100 years, but I'd bet that if one digital format was going to make it to its 100th birthday, MP3 would be the most likely contender (depending on whether ASCII text would be considered a "media format"; JPEG would be my second guess).

What about Quicktime or the heavily patent-encumbered h.264 which every man and their dog are working to replace?

They're likely to die.

At the very least we would need to store with it a complete description of the coded format and the algorithm to decode it.

Then there's the hardware issues. What kind of hardware has a really long term data storage capability? Flash memory doesn't and there's evidence that some forms of flash memory need constant data refresh to prevent bit-rot. CDs don't and even if they do it's unlikely that the CD drive will be operational 100 years later. Harddrives suffer the same mechanical problems but even if they didn't how do we interface them? SATA? USB-C? Thunderbolt?

100 years is a VERY long time.

You're correct in this. One possible method would be to store the same thing on 100 different flash drives, each of a different make and model; odds are decent that the data would survive on at least one of them. In the case of video, the answer might be "include a handful of DVD players," specifically ones with analog outputs. From there, you could include two flavors of disc: one with a garden-variety DVD video (in the event it's still possible to feed a composite signal into something in 100 years), and the other formatted specifically to be viewable on an oscilloscope; odds are good that those will still be around in 100 years, and analog video would be at least somewhat visible that way. One other nice thing about DVD is that the logo is fairly distinctive: even if the people who dig it up have to hit an antique shop or an outright landfill to find a player, it at least offers a recognizable glimmer of hope.
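
Quick back-of-the-envelope on the 100-drives idea (a minimal sketch in Python; the per-drive survival probabilities are pure guesses, and it assumes failures are independent - which is exactly why you'd want different makes and models):

    # Odds that at least one of n independent flash drives survives a century.
    # The per-drive survival probabilities tried below are made-up guesses.
    def at_least_one_survives(p, n=100):
        """P(at least one survives) = 1 - P(all n fail) = 1 - (1 - p)**n."""
        return 1 - (1 - p) ** n

    for p in (0.01, 0.05, 0.10):
        print(f"per-drive survival {p:.0%} -> {at_least_one_survives(p):.3%}")
    # 1% -> ~63.4%, 5% -> ~99.41%, 10% -> ~99.997%

Even with grim per-drive odds, the batch as a whole holds up surprisingly well.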

Here's yet another question to ask: what do we say, and how do we say it? Language itself is likely to morph over that much time. Time capsules are interesting for exactly this reason - they give a glimpse into what a society thought was important at the time, and into what the society of yesterday guessed would still be relevant upon reopening.

Comment only one reason why uTorrent is still popular (Score 3, Informative) 275

...because it's popular.

Older versions could fit on a floppy disk and didn't require an InstallShield wizard. It's not at Vuze levels of bloat (though Vuze marches to the beat of a different drum and has a pretty nice "content store" for Creative Commons content and the like), but it's gotten big and annoying. Transmission works on Windows (...and OS X...and *nix...and plenty of routers and NASes...) and is nice if you don't need RSS feeds. qBittorrent does RSS and is simple to use. Deluge, while a bit awkward, does a good job. If you're into a super-configurable ecosystem, rTorrent has 101 plugins and browser-based frontends, but can also run exclusively from the CLI if that's your thing. The list goes on and on, but uTorrent seems to be coasting on inertia, nothing more, nothing less.

The interesting thing is that a similar "we'll borrow some unused CPU cycles" revenue model caused a huge mess for Digsby, an IM client that was great and had a pretty good following until that point. Then again, with most technical folks opting for one of the plentiful alternatives to uTorrent, I don't see this having a major impact.

Comment Re:"Conservatives" hating neutrality baffles me (Score 5, Informative) 550

I can't believe the bullshit I see from some of the "conservatives" I know who treat this like some kind of commie takeover of the Internet.

The foundational problem we're dealing with here is that the majority of the public doesn't understand how the internet works. The Slashdot crowd has long since learned to deal with that at a micro level. However, we hear different things than the rest of society. Net Neutrality to us means "the bandwidth and throughput of internet traffic won't be artificially limited based on its source or destination." To them, it means "the government will tell me what I can and can't post on my Tumblr blog," because they have no concept of IP routing, peering, or the Comcast vs. Netflix dispute that brought Net Neutrality into the common vernacular.

Whether this is because "understanding how the internet works and what net neutrality does and doesn't impact" is a genuinely complicated topic, or because the Kardashians have killed far too many American neurons, is a separate topic entirely. To be fair though, if the government was indeed regulating what we could and couldn't post, could and couldn't say, or how we were allowed to say it...we'd be up in arms, too.

Comment Re:Law and Order: Bankruptcy Court (Score 1) 145

Let's face it, people: Hacking is boring to watch.

I'd concur with that - for most people, IT work is both boring and difficult to grasp. Part of it is laziness and stupidity, but it'd be unfair to place all of it under that umbrella - much of what we do requires some understanding of a dozen other concepts that aren't immediately obvious.

I just watched the episode (spoiler warning). For the reasons stated above, I'll cut them slack for having the malware code glow red in their visualization - malware isn't always obvious, and you need to show it somehow. However, I won't cut them slack for saying that game consoles have unique identifiers that let the console companies track pedophiles, while simultaneously showing a bad guy who is smart enough to hack baby monitor firmware yet dumb enough to keep using a system that could easily trace him that way, especially AFTER he becomes aware that the FBI is after him. I'll cut them slack for the concept of the baby monitor hack - it's got its own list of messes, but y'need a story somewhere. I won't cut 'em slack for having the password tattooed on one of the guys - they're running that kind of operation, and they're never going to change the password? Not even partially obfuscate it by adding zeroes to single digits, where everyone knows you don't type the zeroes? On a more practical note, is that guy guaranteed to be around every time they need to reference the password? Bonus round: the guy showed a holographic representation of a cadaver...because that was really necessary and couldn't have been done with a garden-variety photograph...

At least for me, the general list of things I'm willing to overlook:
-UI mockups. CLI output only makes sense if you know what you're looking at, and the last thing anyone wants is more expository dialogue that doesn't advance the plot.
-Simplifying of IP addresses and their "tracing". I've seen enough Google Maps dots on machines without GPS to know that it's at least "close enough", unfortunately. "inadmissible in court" doesn't necessarily mean "useless", and again, we need a plot device.
-Character tropes. I don't look like a stereotypical nerd (no beard, not overweight, no glasses, don't live in the basement, don't have a game console), but Hollywood's got its rack of stock characters: if there's a white male around 50 who isn't the father of another character, he's the bad guy. If there's a good-looking white girl, she's probably someone's love interest. If there's a mother, her role is, generally, "mother," unlikely to do anything to truly advance the plot outside her maternal context. Hispanic guy on a motorcycle = gang member. Dad: clueless and aloof, though occasionally good for a single pearl of wisdom. The list goes on, and though I'm not a fan of that being the case, it's not "computer techs at the expense of everyone else", and we've had generally positive, well-rounded characters like Skye, Chloe O'Brien, and Jake Foley.

Things I won't give a pass on:
-logic fails, especially if I'm cutting slack for a part of one.
-unreasonable expectations of technology.
-unreasonable expectations of people (i.e. "make the situation dire enough, and time will never be necessary").
-simplistic love triangles (much as I love Fitz and Simmons, the "because they're both science" reason is annoying).
-nonexistent database relations - "show me a list of 40 year old females in Spokane, who are of Irish descent and whose great grandparents came through Ellis Island in 1899 that have bought P90X and are allergic to gluten."

Comment Re:The internet of crap ... (Score 1) 130

Stop rewarding them with your money for some shiny baubles which are doing nothing but spying on you and monetizing everything you do.

I do, actually, wonder if anyone has run this through Excel. Consider the following scenario:
Acme TVs makes a "Smart TV", intending to monetize the information gleaned from viewing habits and the like. Suppose said TV retails for $400. Does that sticker price reflect a subsidy from the marketing data they expect to collect? If so - say it costs them $425 to build and sell a set they price at $400, counting on the data to make up the difference - then the best thing we can do is buy these TVs and never connect them to the internet. That way, they've spent $425 to sell me a TV for $400. Either TV prices go up, or they sell 10,000 TVs with only 5,000 reporting data back...and if that proportion drops low enough, is it possible that, paradoxically, voting with our wallets could mean buying things we don't want, so that making them stops being profitable?
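
A minimal sketch of that math in Python - every number here is an invented assumption, purely to show where the hypothetical break-even point sits:

    # Hypothetical figures: $425 all-in cost per TV, $400 sticker price, and
    # $50 of expected marketing-data value per TV that actually phones home.
    UNIT_COST = 425.0
    RETAIL_PRICE = 400.0
    DATA_VALUE = 50.0

    def profit_per_tv(reporting_fraction):
        """Average profit per TV, given the fraction that report data back."""
        return RETAIL_PRICE - UNIT_COST + DATA_VALUE * reporting_fraction

    # Fraction of TVs that must phone home just to cover the subsidy.
    break_even = (UNIT_COST - RETAIL_PRICE) / DATA_VALUE
    print(break_even)          # 0.5
    print(profit_per_tv(0.5))  # 0.0 - exactly break-even

With those invented numbers, the 10,000-sold/5,000-reporting split above is exactly the break-even point; every unplugged TV past that turns the subsidy into a loss.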

Comment monoculture again? (Score -1) 296

Just thinking out loud here: the IE6 monoculture was terrible, and we all hated it...and justifiably so. However, with Chrome, Safari, and Opera all based on WebKit now, have we simply embraced a different monoculture? Admittedly, the main difference here is that WebKit is far more open than Trident, and the days of ActiveX and Java applets are more behind us than not...but is keeping an alternative rendering engine like Firefox's Gecko around a better situation, or just redundant coding?
