Comment Re:It's good if they don't code like 90s C++ devs (Score 2) 298

Techno-machismo teens playing games, trying to get their code into the fewest characters and the least amount of memory.

Given how many programs I've used over the past decade that required unreasonably large amounts of RAM - or were later releases of software whose earlier iterations solved the same problem in a tenth of the memory - I wouldn't mind a handful of these guys getting their desks back. Adobe is a great place to start.

I've had to fix or test so much of this junk and it's still just plain stupid.

I'm by no means a programmer, but from the handful of times I've seen what you're talking about, I'll say this: if it's possible to save 10% of RAM by shifting the ease-of-use burden to the comments instead of the actual code, then as an end user, I'm 100% in favor of commenting the everloving hell out of difficult-to-read code that saves RAM when users are running it.

You don't have to save memory! Memory is there to make your code readable. Use it!

Wrong. Use as much as necessary, but no more. I might have 12GB of RAM in my laptop, but it's not all for you. It's for EVERYTHING I do, and when I start adding VMs or large After Effects projects to what I'm doing, 12GB starts to get pretty cramped. Some RAM usage will be inevitable, but wasteful RAM usage is wasteful.
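To make that concrete, here's a hypothetical sketch (nobody's actual code, Python just for illustration): the first version buys readability with RAM, the second buys RAM savings with comments.

    def count_lines_readable(path):
        # Readable but wasteful: pulls the ENTIRE file into RAM just to count lines.
        return len(open(path).read().splitlines())

    def count_lines_lean(path):
        # Memory-lean: streams one line at a time, so peak RAM stays tiny no matter
        # how big the file is. Less obvious at a glance - which is exactly why the
        # comments, not the code, carry the ease-of-use burden here.
        count = 0
        with open(path) as f:
            for _ in f:
                count += 1
        return count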

I most certainly have plenty of admiration for the Demoscene.

Comment Flashback to 2011 (Score 1) 160

http://tech.slashdot.org/story...

Seriously guys, when Microsoft 1.) had the idea years ago, 2.) has the investment capital to give this a viable shot, and 3.) with Azure, has an immediately viable and marketable need for a set of servers that can be dynamically powered up and down...and THEY haven't managed to make it work...I sincerely doubt that a startup in the Netherlands will have greater success.

To be fair, though, one would imagine that the Netherlands is colder, for more of the year, than the majority of the continental US. Still, servers coming up and down with the thermostat does not seem dependable enough to be of real use.

Comment Re:Simple (Score 1) 166

Due to the laws of nature, messages can safely be assumed to have been transmitted no later than the time of reception.

I wonder if I should patent this...

Yes, patent it. However, be sufficiently ambiguous in your patent that it will somehow apply to time machines. If the patent passes muster, you'll obviously receive a visit from either your future self, or a beta testing team that needs you to not file that patent. Either way, you'll likely end up either very rich, or very dead.

Comment Re:The sky is falling... or maybe not. (Score 1) 362

Tell me why I don't want secure boot and an OS signed by Microsoft or one of the mainstream Linux distributions.

Acronis True Image/Clonezilla/Ghost/Veeam.
Hiren's Boot CD/Active@ Boot CD/UBCD4Win.
GParted/Parted Magic.
FreeNAS/NAS4Free/OpenFiler/UnRAID.
Endian/Smoothwall/Untangle/IPFire.
FreePBX/PiaF/Trixbox.
Univention/Zentyal/ClearOS.

Sure, desktops for office use will never have a problem. However, the inability to use desktop hardware for any number of other dedicated services is most decidedly a bad thing, and the inability to boot into some form of recovery environment is even worse.

If you're buying desktops for internal corporate use, leave Secure Boot on and password-protect the BIOS - everyone's happy.
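(Side note for the curious: on a Linux box you can check the firmware's Secure Boot state yourself. A minimal Python sketch, assuming a UEFI system with efivarfs mounted at its usual path:)

    import glob

    # Assumes Linux + UEFI with efivarfs mounted. The SecureBoot variable is
    # exposed as a small file: 4 bytes of attribute flags, then a 1-byte value
    # (1 = enabled, 0 = disabled).
    paths = glob.glob("/sys/firmware/efi/efivars/SecureBoot-*")
    if not paths:
        print("No SecureBoot variable found (legacy BIOS boot, or efivarfs not mounted)")
    else:
        data = open(paths[0], "rb").read()
        print("Secure Boot is", "enabled" if data[4] == 1 else "disabled")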

Comment Re:Like Voyager's golden record? (Score 1) 169

I already can't play my 16-bit DOS games on a modern computer without fancy emulation, and that is less than 20 years.

Machine code may get a bit dicey, but also keep in mind that a lot of development back then happened at a much lower level (frequently timed to CPU cycles, with no notion of multicore, etc.). Around that era, WordPerfect 5.1 ruled the roost of word processing, and I bet you could open one of those documents in 15 minutes or less today.

NASA themselves have had issues with recovering data from moon landings and other missions that have been kept in storage.

...because they taped over them. That's a smidge different than being unable to read their own tapes.

Then take into account proprietary codecs. Just think back 15 years ago, when everything online was RealMedia

Not only was RealMedia much more likely to be streamed than downloaded, but RealPlayer is still around, as is Real Alternative, and VLC can read several flavors as well. "Uncommon" and "unable to be used" are two different things.

what makes you think that MP3s will be around in 100 years?

Ubiquity. Portable MP3 players have been around for nearly 15 years. The format itself is more than two decades old. You can play back an MP3 on almost literally every OS ever written, there are a dozen FOSS applications to do so, and aside from iTunes going all-AAC by default, MP3 is what everyone else uses. Admittedly, media formats are difficult to extrapolate over 100 years, but I'd bet that if one digital format were going to make it to its 100th birthday, MP3 would be the most likely contender (depending on whether ASCII text counts as a "media format"; JPEG would be my second guess).

What about Quicktime, or the heavily patent-encumbered H.264, which every man and his dog are working to replace?

They're likely to die.

At the very least we would need to store with it a complete description of the coded format and the algorithm to decode it.

Then there are the hardware issues. What kind of hardware has really long-term data storage capability? Flash memory doesn't, and there's evidence that some forms of flash memory need constant data refresh to prevent bit-rot. CDs don't, and even if they did, it's unlikely that the CD drive will be operational 100 years later. Hard drives suffer the same mechanical problems, but even if they didn't, how do we interface with them? SATA? USB-C? Thunderbolt?

100 years is a VERY long time.

You're correct in this. One possible method would be to store the same thing on 100 different flash drives, each of a different make and model - odds are decent that the data would survive on at least one of them. In the case of video, the answer might be "include a handful of DVD players" - specifically, ones with analog outputs. From there, you could include two flavors of disc: one with a garden-variety DVD video (in the event it's possible to feed a composite signal into something in 100 years), and the other specifically formatted to be viewable on an oscilloscope - odds are good that those will still be around in 100 years, and analog video would be somewhat visible that way. One other nice thing about DVD is that the logo is fairly distinctive - even if the people who dig it up need to go to an antique shop or an outright landfill, it'd at least give them a glimmer of hope.
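To make the "lots of redundant copies" idea concrete, a minimal Python sketch (hypothetical paths and manifest; assumes each drive carries the same payload plus a record of its expected SHA-256 digest) for checking which copies survived:

    import hashlib

    def sha256(path, chunk=1 << 20):
        # Hash the file in 1MB chunks so even huge payloads don't need much RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def surviving_copies(copies, expected_digest):
        # Return the copies whose contents still match the manifest digest.
        good = []
        for path in copies:
            try:
                if sha256(path) == expected_digest:
                    good.append(path)
            except OSError:
                pass  # unreadable drive/file: exactly the failure mode we planned for
        return good

    # e.g. surviving_copies(["/mnt/driveA/payload.bin", "/mnt/driveB/payload.bin"], digest)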

Here's yet another question to ask - what do we say, and how do we say it? Language itself is likely to morph over that time. Time capsules are interesting for this reason - they give a glimpse into what the society of yesterday thought was important, and what it thought would still be relevant upon reopening.

Comment only one reason why uTorrent is still popular (Score 3, Informative) 275

...because it's popular.

Older versions could fit on a floppy disk and didn't require an InstallShield wizard. Now, it's not at Vuze levels of bloat (though Vuze marches to a different drum and has a pretty nice "content store" for Creative Commons content and similar), but it's gotten big and annoying. Transmission works on Windows (...and OS X...and *nix...and plenty of routers and NASes...) and is nice if you don't need RSS feeds. qBittorrent does RSS and is simple to use. Deluge, while a bit awkward, does a good job. If you're into a super-configurable ecosystem, rTorrent has 101 plugins and browser-based frontends, but can also run exclusively from the CLI if that's your thing. The list goes on and on, but uTorrent seems to be coasting on inertia, nothing more, nothing less.

The interesting thing is that a similar "we'll borrow some unused CPU cycles" method of revenue generation caused a huge mess for Digsby, an IM client that was great and had a pretty good following until that point. Then again, with most technical folks opting for one of the plentiful alternatives to uTorrent, I don't see this having a major impact.

Comment Re:"Conservatives" hating neutrality baffles me (Score 5, Informative) 550

I can't believe the bullshit I see from some of the "conservatives" I know who treat this like some kind of commie takeover of the Internet.

The foundational problem we're dealing with here is that the majority of the public doesn't understand how the internet works. The Slashdot crowd has long since learned to deal with that at a micro level. However, we hear different things than the rest of society does. Net Neutrality to us means "the bandwidth and throughput of internet traffic won't be artificially limited based on its source or destination." To them, it means "the government will tell me what I can and can't post on my Tumblr blog" - they have no concept of IP routing, peering, or the Comcast vs. Netflix dispute that brought Net Neutrality into the common vernacular.

Whether this is because "understanding how the internet works and what net neutrality does and doesn't impact" is a genuinely complicated topic, or because the Kardashians have killed far too many American neurons, is a separate question entirely. To be fair, though, if the government were indeed regulating what we could and couldn't post, could and couldn't say, or how we were allowed to say it...we'd be up in arms, too.

Comment Re:Law and Order: Bankruptcy Court (Score 1) 145

Let's face it, people: Hacking is boring to watch.

I'd concur with that - for most people, IT work is both boring and difficult to grasp. Part of it is laziness and stupidity, but it'd be unfair to place all of it under that umbrella - lots of what we do involves having some understanding of a dozen different other concepts that aren't immediately obvious.

I just watched the episode (spoiler warnings). For the reasons stated above, I'll cut them slack for having the malware code glow red in their visualization - malware isn't always easy to spot. However, I won't cut them slack for claiming that game consoles have unique identifiers that let console companies track pedophiles, while simultaneously showing the bad guy being caught by exactly that system - a bad guy who is somehow smart enough to hack the firmware of baby monitors, yet dumb enough to keep using a system that could easily trace him, especially AFTER he becomes aware that the FBI is after him. I'll cut them slack for the concept of the baby monitor hack - it's got its own list of messes, but y'need a story somewhere. I won't cut 'em slack for having the password tattooed on one of the guys - they're running that kind of operation, and they're never going to change the password? Not even partially obfuscate it by adding zeroes to single digits, where everyone knows you don't type the zeroes? On a more practical note, is that guy guaranteed to be there all the time so they can reference the password? Bonus round: the guy showed a holographic representation of a cadaver...because that was really necessary and couldn't have been done with a garden variety photograph...

At least for me, the general list of things I'm willing to overlook:
-UI mockups. CLI output only makes sense if you know what you're looking at, and the last thing anyone wants is more expository dialogue that doesn't advance the plot.
-Simplifying of IP addresses and their "tracing". I've seen enough Google Maps dots on machines without GPS to know that it's at least "close enough", unfortunately. "inadmissible in court" doesn't necessarily mean "useless", and again, we need a plot device.
-Character tropes. I don't look like a stereotypical nerd (no beard, not overweight, no glasses, don't live in the basement, don't have a game console), but Hollywood's got their rack of characters: if there's a white male, approximately 50 years old, who isn't the father of another male character, he's the bad guy. If there's a good-looking white girl, she's probably someone's love interest. If there's a mother, her role is, generally, "mother", unlikely to do anything to truly advance the plot independent of her maternal context. Hispanic guy on a motorcycle = gang member. Dad: clueless and aloof, though sometimes good for a single pearl of wisdom. The list goes on, and though I'm not a fan of that being the case, it's not "computer techs at the expense of everyone else", and we've got characters like Skye, Chloe O'Brian, and Jake Foley who were generally positive, rounded characters.

Things I won't give a pass on:
-logic fails, especially if I'm cutting slack for a part of one.
-unreasonable expectations of technology.
-unreasonable expectations of people (i.e. "make the situation dire enough, and any task takes no time at all").
-simplistic love triangles (much as I love Fitz and Simmons, the "because they're both science" reason is annoying).
-nonexistent database relations - "show me a list of 40 year old females in Spokane, who are of Irish descent, whose great grandparents came through Ellis Island in 1899, who have bought P90X, and who are allergic to gluten." (A sketch of why that query can't exist follows below.)
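To spell that last one out, here's a hypothetical Python sketch (invented tables and columns) of what the TV query pretends exists. Every JOIN below crosses an organizational boundary - census data, immigration archives, a retailer, a medical office - that shares no common key in real life:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE residents (id INTEGER, sex TEXT, age INTEGER, city TEXT);
    CREATE TABLE ancestry  (resident_id INTEGER, origin TEXT, ellis_island_year INTEGER);
    CREATE TABLE purchases (resident_id INTEGER, product TEXT);
    CREATE TABLE allergies (resident_id INTEGER, allergen TEXT);
    """)
    rows = conn.execute("""
    SELECT r.id FROM residents r
    JOIN ancestry  a ON a.resident_id = r.id
    JOIN purchases p ON p.resident_id = r.id
    JOIN allergies g ON g.resident_id = r.id
    WHERE r.sex = 'F' AND r.age = 40 AND r.city = 'Spokane'
      AND a.origin = 'Irish' AND a.ellis_island_year = 1899
      AND p.product = 'P90X' AND g.allergen = 'gluten'
    """).fetchall()
    print(rows)  # empty here - and in reality, those joins don't exist at all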

Comment Re:The internet of crap ... (Score 1) 130

Stop rewarding them with your money for some shiny baubles which are doing nothing but spying on you and monetizing everything you do.

I do, actually, wonder if anyone has run this through Excel. Consider the following scenario:
Acme TVs makes a "Smart TV", intending to monetize the data gained from viewing habits and the like. Suppose said TV retails for $400 - does that sticker price reflect a subsidy from the marketing data they're expecting to collect? If so, the best thing we can do is buy these TVs and then never connect them to the internet. Say it costs them $425 to build and sell me a TV for $400: either TV prices go up, or they sell 10,000 TVs with only 5,000 reporting data back. If that proportion drops low enough, is it possible that, paradoxically, voting with our wallets could mean buying things we don't want so that making them stops being profitable?
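Back-of-the-envelope version of that break-even point (the $50-per-connected-TV data revenue figure is purely an assumed number for illustration):

    # All numbers hypothetical, per the scenario above.
    unit_cost = 425.0      # what it costs Acme to build and sell one TV
    sticker_price = 400.0  # subsidized retail price
    data_revenue = 50.0    # ASSUMED lifetime marketing-data revenue per CONNECTED TV

    subsidy = unit_cost - sticker_price  # $25 lost up front on every unit
    break_even = subsidy / data_revenue  # fraction of TVs that must phone home
    print(f"Break-even: {break_even:.0%} of TVs must be connected")
    # Below 50% connected, the subsidy loses money - the "buy it and keep it
    # offline" paradox from the comment.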

Comment monoculture again? (Score -1) 296

Just thinking out loud here: the IE6 monoculture was terrible, and we all hated it...and justifiably so. However, with Chrome, Safari, and Opera all based on WebKit now (leaving Firefox's Gecko as the lone major holdout), have we simply embraced a different monoculture? Admittedly, the main difference here is that WebKit is more open than Trident, and the days of ActiveX and Java are more behind us than not...but is having an alternative rendering engine a better situation, or just redundant coding?

Comment Re:I guess it depends on where you live (Score 1) 101

I wonder if you live near where I live. It's actually pretty funny. My new-ish employer and I go back and forth a bit; he's on Verizon and I'm on T-Mo. He laughed at me when I told him that I really liked T-Mo and that their coverage was great. As I go from place to place, I've never once been without coverage, no matter where. In the office, I have solid LTE...and they have a range extender (i.e. paying Verizon to carry their coverage over your own internet connection). They've listed basements where they can't get signal, and I'm like, "wait...you don't have service there?" Every. Time. I was surprised at one site where he said that service was sketchy - I had full coverage, and got 6.9 MBytes (yes, bytes, not bits) per second downstream.

Now, in fairness, the last time I took an Amtrak ride (last year), there were PLENTY of dead spots, but Verizon had noticeably fewer of them. If "middle of nowhere" coverage is important, then Verizon is still probably the better bet. I sure won't be switching, though.

Comment Re:The policy is pretty clear on windows.com. (Score 1) 570

Blows the subscription model idea out of the water.

tl;dr: "supported lifetime" seems to be the easy cop-out here, especially since it's a term that doesn't apply to desktops in the same way it does to mobile devices, and trying to force desktops to adhere to it isn't a good thing...

True, but it raises other questions instead. "Supported lifetime of the device" is a highly suspect phrase here. We see Apple giving phones and tablets approximately three years of "supported lifetime". Apple can do that for a few reasons, but the customer base generally tolerates it because Apple releases new devices roughly annually, so getting three years of both hardware and software improvements is usually a worthwhile investment for consumers - the "supported lifetime" is acceptable because of the trade-up cycle.

Desktops are a completely different animal. I suspect that a significant minority (if not a majority) of iPad users have a desktop or laptop in active use that is older than their iPad, and I'd similarly suspect that most of them, given the choice of springing for only one or the other, would rather upgrade their iPad this year than their desktop. What we've done with desktops is create two functional tiers of existence: in-warranty (where the OEM generally updates and fixes things) and out-of-warranty (where 'the computer guy' handles this stuff). Mobile devices are more disposable, in no small part because repairing or upgrading them is often expensive, impractical, or both. This is the area where traditional desktop/laptop computing has always had an edge.

Moreover, it is highly irregular for OEMs to provide OS upgrades for their computers. I had an HP laptop that I bought with XP after Vista's announcement (but before its release), so I registered the machine and HP sent me a Vista disk and key. That was an unusual exception, and it had to do with my purchase date, not my warranty length. I bought my Origin laptop with a three-year, soup-to-nuts warranty in early 2011. That ran out last year, long after Windows 8's release. No Windows 8 disc in sight, and while admittedly I didn't ask, I sincerely doubt Origin would have sent one, despite the laptop being within the "supported lifetime of the device" by most practical definitions. The machine does, however, get regular security patches for the version of Windows that it /does/ run. Mobile devices' concept of "updates" involves both security patches and OS version upgrades, whereas desktops' does not.

So how does all this tie into Windows being a subscription or not? Well, both "OS updates" and "supported lifetime" are more clearly defined terms for mobile devices, and bringing that paradigm to the desktop yields yet another issue: the "broom problem". If you're a Doctor Who fan, there was an episode this season where The Doctor points out that if you take a broom and replace the head when it wears out, then replace the handle when it breaks, then replace the head again when it wears out, it's not the same broom anymore.

With desktops, we have the same issue: the hard disk dies, we replace the disk. We upgrade a 4GB RAM module to an 8GB module, we add a video card to play games and a bigger PSU to power it, then we want to SLI, so we replace the motherboard, which mandates a different processor, and we swap the DVD burner for a Blu-ray burner... If all these upgrades and replacements happen over three years, we end up with a completely different computer than the one we started with - at what point does the Windows license no longer apply? Microsoft has arbitrarily pointed to "the motherboard", which in fairness is about the closest thing one can get to a reasonable component definition (it's got the most hardware-as-detected-by-Windows of any single physical component), but even motherboards die or develop issues. What if the CPU is supported, but the motherboard is not? I sincerely doubt Gigabyte is supporting this board anymore, but it is, in theory, compatible with a CPU I could purchase today from Newegg. Conversely, if I put an old IDE DVD-ROM into a computer that's otherwise fully supported, does it fall outside the "supported lifetime of the device" because one component is no longer serviced?

I don't like it no matter how you slice it. Either Microsoft gets super-duper strict with component swaps in order to uphold the definition of "supported lifetime", or Microsoft lets basically anything qualify in the hopes that revenue is made elsewhere (tracking, mobile sales, and "apps"). The third option is that MS goes "subscription", which raises the obvious question of exactly how crippled the computer gets if the subscription isn't paid - a line so difficult for Microsoft to walk that the closest present analog, a copy of Windows that's failed activation, relies almost solely on nag screens and a lack of wallpaper.

Finally, it's not entirely inconceivable that Microsoft is trying a new twist on their old standby - Embrace (new technology, a "services first" focus), Extend (market share, by making it possible for every user to get the new version, which is tightly integrated with the new services), and Extinguish (make "free Windows" progressively less useful, with the shiny new "subscription" option a click away).

It will be interesting...and I'm most certainly hanging onto my plastic disc editions, just in case.
