Comment Re:Life has taught me (Score 1) 165

Cost : "between NAND and DRAM."

Even if it were cheaper to fab right now than NAND, they wouldn't admit it, because it would undercut their ability to charge a premium for it. I'm betting that since it has a higher density than NAND and a simpler construction, it will probably end up cheaper than NAND in the longer run.

And DRAM is horribly expensive to fab. So "cheaper than DRAM" leaves a large window.

Right now they are pitching it at the enterprise storage market, but that's just smart business - while they ramp up production capacity, they get the highest price for it they can.

Comment Re:How? (Score 2) 356

You see it a lot more if you are searching for other types of content through less than legal means.

If you want to torrent something, you'll get pop-ups of webcam girls, porn sites, etc, that you didn't ask for and weren't in the market for. I imagine for the youth crowd, that's probably the main way they get exposed to it - they want to torrent the latest Iron Man movie, and they get pop-ups for Iron Dick.

Comment Re:How? (Score 5, Interesting) 356

Option C: Get a subscription with a newsgroup service for a fraction of the money a porn site will cost you, download as much as you like over a securely encrypted connection, have plausible deniability as to what the content was.

Option D: Get one of those P2P thingamajiganibobs

There are so many ways to get porn on the internet other than the vanilla website-and-a-subscription method.

And porn has the ultimate "Long Tail". There already exists enough digital porn for virtually anyone with a normal-ish kink spectrum to whack off to something new twice a day for the rest of their life. Even if you destroy the porn industry (which this won't, because not every jurisdiction is stupid), people will still trade and use porn, with impunity.

Comment Re:How? (Score 1) 356

In this country it used to be that under-18s could only get a Solo card, which a lot of places used as a basic age check.

But I read that our banks don't issue them any more: they weren't as widely accepted, precisely because they didn't let you spend funds you didn't actually have.

Comment Re:GPL is a valid option, but overrated (Score 1) 250

This is why so many projects require contributors to sign over their copyright when they commit code. They want to retain that control over the licensing centrally.

Copyright assignment is a speedbump that deters contributions, though. Many people are uncomfortable with the notion that their freely contributed work can be taken into a closed project. Many more are simply put off by the paperwork involved.

You only have to look at what happened to OpenOffice when it forked. OpenOffice kept the copyright assignment clause. Now the project is dying in a garden shed in the Apache server farm. LibreOffice ditched copyright assignment and behold, it thrives.

Comment Re:Creative might have been one (Score 1) 250

I can confirm that's true. Digging out the driver disk every time you reinstalled was a total PITA. A lot of the secret sauce on the card was in software; without the DRM controls, their earlier hardware could probably have been pushed to offer most of the features of their later models.

The Linux drivers just work though. I remember booting Linux on that hardware the first time and seeing a colourful SBLive banner in the bootup messages and thinking "Huh. It works!"

Comment Re:Have they fixed it so 2 devs can work together? (Score 4, Insightful) 132

You do need special consideration for XML files, though - there are several solutions.

The weakest solution is to rely on the target user's ability to spot diffs and merge XML files correctly by hand - and never to use automatic merging, because the structure of XML means that conflicting changes may not occur on adjacent lines.


The next (and inadequate) solution is to order the XML consistently - you can do this in your diff tool, or you can write your tools to produce a reliably ordered file in the first place.

Many tools that work on XML files exhibit what I call "juggling": elements and attributes change order whenever you change their values or those of their siblings, because the software manipulates the file directly through the DOM - creating new objects and removing the old ones from the collection. This is a real PITA for text-based diff tools, because not all of the changes will even conflict with each other (element sequences are often spread across multiple lines, more so if you put each attribute on its own line to improve mergeability).

So, you can either write your code to write a consistent order - usually by serializing a fresh XML stream from a model when you write the file.

Or you can add a layer that re-orders the document when you diff it - many of the available diff tools will let you do this. For some files, I used to write an XSLT sheet to re-order elements consistently. For attributes, I wrote an extra option for Tidy that sorts them - doing that, plus laying them out on separate lines, is sufficient for many files. I've gone as far as writing custom tools that unpack HTML stored in an attribute (with all the escape sequences that entails) into a CDATA section for clarity, run it through Tidy, and then repack everything when you're done.
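For the consistent-ordering approach, Canonical XML (C14N) already guarantees a deterministic attribute order, so two semantically identical files serialize - and therefore diff - identically. A minimal sketch using only Python's standard library (the element and attribute names here are invented for the example):

```python
import xml.etree.ElementTree as ET

# Two semantically identical documents whose attributes have been
# "juggled" into different orders by an editing tool.
a = '<cfg debug="true" name="app"><item id="1" /></cfg>'
b = '<cfg name="app" debug="true"><item id="1" /></cfg>'

# Canonical XML (C14N) sorts attributes deterministically, so the
# canonical serializations of the two documents are byte-identical.
print(ET.canonicalize(a) == ET.canonicalize(b))  # True
```

Note that C14N only normalizes attribute order and in-tag whitespace - it never reorders child elements, because sibling order is semantically significant in XML; fixing element order is exactly the part that needs knowledge of your model.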

Intermediate: I've thought of taking this a step further and converting the XML to a directory tree of text files designed to merge well, principally to make things clearer for end users who currently have the diff-tool-plus-converter setup described above but still occasionally make merge errors.

The next step is to write tools that specifically diff your model. This is probably a bridge too far for most developers, because we have the kind of brain that can abstract a text representation of the model and map it to the actual model that will be created. For end users, though, it may well be advisable.
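As a rough sketch of what the directory-tree idea could look like - this is my own invented layout, not any particular tool's format - each element becomes a directory and each attribute becomes a tiny text file, so two people editing different attributes can never produce a textual conflict:

```python
import os
import xml.etree.ElementTree as ET

def explode(elem, path):
    """Write one directory per element and one small text file per
    attribute, so a line-based merge tool sees independent files."""
    os.makedirs(path, exist_ok=True)
    for name, value in sorted(elem.attrib.items()):
        with open(os.path.join(path, name + ".txt"), "w") as f:
            f.write(value + "\n")
    for i, child in enumerate(elem):
        # Prefix with an index so sibling order survives the round trip.
        explode(child, os.path.join(path, "%03d-%s" % (i, child.tag)))

root = ET.fromstring('<cfg name="app"><db host="localhost" port="5432"/></cfg>')
explode(root, "cfg_tree")
# Produces cfg_tree/name.txt, cfg_tree/000-db/host.txt, cfg_tree/000-db/port.txt
```

The inverse (reassembling the XML from the tree) is the other half of the tool, omitted here for brevity.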

Diff/merge tools are a field that needs more work - currently the main users are developers, who can cope with the tools being a bit immature. But we will increasingly see collaborative tools built on the kinds of version control we take for granted, and normal users will need to be able to do this stuff too.

Comment Re:im sure the meeting was interesting (Score 4, Insightful) 132

> no more free development for .NET 4.6 with visual studio

Community supports the vast majority of useful features... and really, what's the problem with it costing money once you have more than 5 developers or a $1M+ turnover? You're still allowed to use Community for classroom learning, academic research, or open-source development.

If you're working for a company that presumably makes money from writing software (in one way or another), is it really so bad to give some of that money to a company whose product helped you do it? If you hire a developer, their salary is far more than the $1,119 it will cost you for VS Pro with MSDN; do you really want to waste their time by making them write code in a text editor and build it with just the .NET SDK tools?

I usually prefer SharpDevelop for my .NET dev, but I've not done any in a long time - I'd be inclined to give Visual Studio a go, even if I've found its prior iterations far too handholding and paternalistic.

Comment Re:Holy Jebus (Score 1) 220

And the most depressing thing about this?

It basically excludes new players from the market. Only the big firms have the resources to get through all that compliance paperwork. Which means the only players left are the lying, cheating scum who caused the problem in the first place.

This has only fostered a risk-averse mentality that chokes every aspect of government and big business. No wonder it's the small firms that have the reputation for innovation - they still have their innocence, and aren't wasting 95% of their energy looking over their shoulders and covering their asses.

Comment it's a bit of a no-brainer really. (Score 1) 203

They already have by far the best remote-desktop service. [1]

A screen-recording tool just needs to be able to serialize the stream that RDP uses to disk - very efficient, very conservative of space. I can't believe that the Linux RDP tools don't already do this.

[1] "Best" as in - works, works well, has low latency and the tools are easy to get hold of and set up.

NX, or whatever it has evolved into, was very good in terms of performance; raw X11 blows over a network (which is ironic, given that network transparency is part of its design brief, and one of the principal things holding back the Linux desktop environment for so long). Neither is easy to set up or use.

Comment Re:The NSA has done several things to help securit (Score 5, Informative) 105

Stronger for everyone except them, perhaps.

They did something similar with Dual_EC_DRBG: they put a couple of specific constants into the random number generator. It was later shown that these amounted to a skeleton key - if you knew the numbers used to derive the constants, you could predict the future output of a given RNG instance from only a small amount of sample output. So any encryption based on Dual_EC_DRBG could be considered broken by the NSA (somewhat conveniently, in a way that only the NSA could actually prove).
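The skeleton-key structure can be sketched roughly as follows (simplified - the generator's two standard points are P and Q, x(.) is a point's x-coordinate, and the truncation of the output is omitted):

```latex
\text{State update and output at step } i:\quad
  s_{i+1} = x(s_i P), \qquad r_i = x(s_i Q)

\text{If whoever chose the constants knows a secret } d \text{ with } P = dQ,
\text{ then from a curve point } R \text{ with } x(R) = r_i:\quad
  x(dR) = x(d\,s_i Q) = x(s_i P) = s_{i+1}
```

In the real algorithm the output is truncated by 16 bits, so an attacker brute-forces the missing bits when reconstructing R - which is why only "a small amount of sample data" is needed to pin down the next internal state, and with it all future output.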

Despite the algorithm's poor performance, which led most implementers to ignore it, it managed to end up as the default in a product from one of the most trusted vendors, RSA. There was speculation that the NSA bribed them to make this design choice. [1]

Unsurprisingly, it was withdrawn from the standard in 2014.

[1] The only comment on that story makes the same point: that the NSA had, in the past, reinforced DES against weaknesses. In light of the later evidence about Dual_EC_DRBG, that may bear further examination - if the change was a tweaking of constants, it's entirely possible that it reinforced the standard for everyone but the NSA.

The wages of sin are unreported.