Comment Re:Yes, it works fine. (Score 1) 91

Yes it does.

Well, yeah, without sound and at 1fps or less. Meanwhile Steam can pump a game at 1080p@60 over the network without much problem, sound included.

Even ignoring the slowness with fast-moving content, it's missing a lot of fundamental features, such as the ability to move apps between devices, or screen sharing. You have to stop an app and restart it to move it to another device. That you have to pipe the protocol through SSH if you want a bit of security also makes it more complicated to use than it needs to be.
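
For reference, a minimal sketch of that SSH step, written in Python just for illustration; "remotehost" is a placeholder and xclock is a standard X11 demo app:

    # Launch a remote X11 app over SSH with X forwarding (ssh -X); this
    # is the "pipe the protocol through SSH" step described above.
    import subprocess

    subprocess.run(["ssh", "-X", "remotehost", "xclock"])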

Comment Re:This is a pattern. It happens to everything. (Score 1) 91

Unix is built with a one-system:many-users mindset. That was a great idea 30 years ago. Today's reality, however, is the other way around, many-systems:single-user: everybody has a tablet, a smartphone and a PC, sometimes multiples of each. Unix provides nothing to deal with that. You can try to export your home directory via NFS, but that falls apart the moment you have the same app running on two different computers and both want to access the same file, as the sketch below illustrates. Some programs solve the problem at the application level, e.g. bookmark syncing in Chrome, but that doesn't scale to the rest of the system.
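
A toy sketch of that failure mode, assuming a config file in an NFS-mounted home directory and a typical app that rewrites the whole file on save (path and keys are made up):

    # Two machines run the same app against the same NFS-mounted file.
    # Each save rewrites the whole file with no locking and no merging,
    # so whichever machine saves last silently clobbers the other.
    import os

    CONFIG = os.path.expanduser("~/.someapp/config")  # hypothetical path

    def save_config(settings):
        with open(CONFIG, "w") as f:  # whole-file rewrite, no lock
            for key, value in settings.items():
                f.write(f"{key}={value}\n")

    # Machine A: save_config({"theme": "dark"})
    # Machine B: save_config({"font": "mono"})  # A's change is now gone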

Once upon a time X11's network transparency was its claim to fame, and it did provide a bit of a solution to the many-systems:single-user problem. But today it's close to useless. Ever tried to stream a video over X11's network connection? It doesn't work; the protocol just can't deal with today's workloads. Proprietary alternatives exist that handle this much better.

In terms of security, the whole idea of giving every app that a user runs access to everything the user can access is also foolish. But at least there is a bit of hope of fixing that. Android solves it by running each app under a different user; Ubuntu tries to solve it by sandboxing. It's all still far from being the default way you run apps on a desktop Linux, but at least people have recognized the problem.
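
A minimal sketch of the per-app-user idea, assuming a Linux system and a launcher running as root; the UID and binary name are made up:

    # Android-style isolation in miniature: a privileged launcher forks,
    # drops to a dedicated unprivileged UID, then execs the app, so the
    # app can only touch files owned by that UID, not the whole account.
    import os

    APP_UID = 10042  # hypothetical per-app user id

    pid = os.fork()
    if pid == 0:
        os.setgid(APP_UID)  # drop group first, while still privileged
        os.setuid(APP_UID)  # then drop user; this is irreversible
        os.execvp("some-app", ["some-app"])  # placeholder binary
    else:
        os.waitpid(pid, 0)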

Comment Re:You can't have it both ways. (Score 3, Interesting) 91

The big issue I have with HTML is that it's useless for publishing longer content, like books or even just multi-page articles. Thanks to hyperlinks it is of course possible to add some Next/Prev buttons to a webpage to represent such content, but those links are just hacks, not markup. eReaders have developed their own formats (.ePub, .mobi, iBook, etc.) for this task, but while those are often little more than a .zip of .html files, none of them is a proper part of the Web, and your regular web browser won't read them.

HTML has had <link rel="next/prev"...> markup going back to HTML 2.0, but it was never properly supported by any browser, nor developed into something powerful enough to replace .ePub and Co. This to me is one of the big failures of the Web that nobody really talks about. The Web should be the place where you publish content, it should be the replacement for paper, but instead people are forced to use .ePub or .PDF for that task, as plain old HTML isn't doing the job.

The other elephant in the room is of course the hyperlinks themselves. The Web still lacks any kind of content-addressability; it's all location-based, so when a server goes down or its URL layout changes, all your hyperlinks break. Basic tasks like linking to a specific paragraph of another article are also not possible with HTML. Project Xanadu never got much traction, but it's really time for the Web to learn a thing or two from what it tried to accomplish back then.
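
As a rough illustration of what content-addressability would mean, here is a sketch assuming the simplest possible scheme, where the link is a hash of the document itself (git already works this way internally):

    # A content-addressed link names the bytes, not a location, so it
    # can't break when a server moves and can't silently change target.
    import hashlib

    def content_address(data):
        return "sha256:" + hashlib.sha256(data).hexdigest()

    page = b"<html>...</html>"  # stand-in for a published document
    print(content_address(page))
    # -> "sha256:" plus 64 hex digits; any host holding the same bytes
    #    could serve the document under that name.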

Comment Re:GPL (Score 1) 176

I don't get why you think taking a portion of a GPL-licensed work means that portion wouldn't itself be GPL-licensed. Of course it is, like the whole work is.

The stance of the FSF is that game assets are not software, and thus having a GPLed engine bundled with non-free assets is OK. The assets and the engine don't form a single work that as a whole has to be GPL-licensed. Going by that interpretation, taking the assets from a GPL game and including them in a proprietary work would be OK, as it's just the reverse of what is already common practice (see Doom, Quake, DOSBox, ScummVM). The GPL would apply to the art, but not to the engine.

The license follows the graphics no matter how you get them.

That's not the issue; the graphics will obviously stay GPL. The issue is whether they force other parts of the program to fall under the GPL. In a statically linked C project the situation is clear, because all the used libraries become part of the resulting executable. In a lot of other contexts the coupling between the GPL code and the proprietary code is not so tight. Using a GPLed grep in a proprietary shell script is, for example, considered OK, but using a GPLed grep() function loaded from a dynamic library would be considered a violation going by the common interpretation, even though it's essentially the same thing. The proprietary code wouldn't need to contain any GPLed pieces; it would just call them. In the case of graphics, the only thing coupling the code to the assets is a filename.
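
To make the contrast concrete, a sketch in Python terms; gpl_grep is a hypothetical GPL-licensed module, not a real package:

    # Invoking a GPL tool as a separate process (the moral equivalent of
    # calling grep from a proprietary shell script), commonly seen as OK:
    import subprocess

    out = subprocess.run(["grep", "-c", "error", "app.log"],
                         capture_output=True, text=True)

    # Importing a GPL library into the same process, which the common
    # interpretation treats as forming a combined work:
    #
    #   import gpl_grep                        # hypothetical GPL module
    #   count = gpl_grep.count("error", "app.log")
    #
    # Functionally the two are nearly identical; only the coupling differs.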

Things get even more complicated when you are dealing with indirect couplings created by third parties, not by the distributor of the proprietary application. For example, assume a proprietary application supports themes and a user builds a theme from GPLed assets. Is that a violation or not? What about plugins or scripts?

This whole situation really isn't clear, and even the official GPL FAQ is a little vague when it comes to plugins and basically leaves the user with an "it depends".

Comment Re:GPL (Score 1) 176

If you use GPL code, you take on that license. It's really that simple.

The only area where the GPL is clear is statically linked C code; everything else is very open to interpretation. Take for example a GPLed game. Now somebody comes along, takes all the graphics and releases a proprietary game with them. Is that a violation or not? What if they don't actually include the graphics, but load them from the web, or straight out of your git repository, or their fork of your git repository? What if it's not graphics, but interpreted code? At what point do the assets require the rest of the code to be released under the GPL? You can go with "never", "always" or "up to a judge's interpretation".

Even the FSF can't manage to get a conflict-free interpretation out of it. On one side, if you use one of their GPLed libraries, your main app shall be released as GPL as well. But on the other side they want the right to clone proprietary APIs without adhering to the license. So which is it? Shall APIs be covered by copyright or not? The FSF wants it both ways.

Comment Re:Why does this matter? (Score 4, Interesting) 246

Did Youtube demand money from her in exchange for fire, theft, and kneecap insurance?

Yes, that's one of the main points of her open letter. Youtube has a system in place, Content ID, to stop piracy, and it works quite well. The crux is that they only allow its use to musicians who have agreed to license their content to them, or at least that's the assumption, as they don't publish any rules. Everybody else gets left in the dust, isn't allowed into Content ID, and thus their content can be shared on Youtube without permission. Which, according to her argument, violates the requirements for "Safe Harbor" protection and makes Youtube guilty of mass copyright infringement, as the "Safe Harbor" law requires technical measures to be made available to everybody.

Comment Re:A Change in Society (Score 2) 157

I think before we see any large social changes we will see a lot of laws and regulations put into action against such technology. Social networks, for example, can lock access to people's photos behind a login, and when logins are only given out to people who have verified their identity, spidering becomes much harder. That's not even far future: many services already require a mobile phone number as ID, and some countries don't give you a mobile phone number anonymously. Collecting the data will still happen, but it will be much more troublesome and could be made illegal on top. So whenever somebody offered a public search service, they could be shut down relatively fast (unless Bitcoin and Tor make it commercially viable to put in the effort).

Given the advances in computer graphics I could also imagine that social networks will get flooded with lots of fake data. Face-swap yourself into all kinds of photos and it becomes much harder to find out who you really are, since much of the information about you online will be fake. So maybe we end up with a general distrust of data we find online about a person.

Comment Re:C for insecurity (Score 3, Interesting) 104

I'd blame the OS instead. Giving each process full access to the system just isn't a good way to do things and constantly leads to problems like this. Python can stop some of those problems, but it by no means provides a secure sandbox: if you access the filesystem from Python, you still have full access to the filesystem. In cases such as this the process should be limited to exactly the data it needs to get the job done, meaning an input image, an output location and a bunch of configuration parameters.
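
A sketch of that least-privilege shape, assuming the parent opens the files itself and hands the worker nothing but file descriptors and parameters; "thumbnailer" is a made-up worker program:

    # The parent opens exactly the input image and the output location,
    # then passes only those descriptors to the worker, so the worker
    # never needs general filesystem access. A real deployment would
    # additionally confine it with something like seccomp or namespaces.
    import os
    import subprocess

    inp = os.open("photo.jpg", os.O_RDONLY)
    out = os.open("thumb.jpg", os.O_WRONLY | os.O_CREAT, 0o644)

    subprocess.run(["thumbnailer", "--size", "128"],  # hypothetical tool
                   stdin=inp, stdout=out)

    os.close(inp)
    os.close(out)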

Comment Re:Why? (Score 1) 127

The lack of a packaging format for third-party apps has been one of the biggest and most persistent problems in the Linux landscape for ages. I have no idea about the quality of the work Ubuntu is doing here, and it does seem to duplicate the work going on with xdg-app, but I really wouldn't mind getting rid of tarballs and self-extracting shell scripts. The monolithic dependency trees that package managers require at the moment just don't scale and never provided a good way for third-party apps to plug into them (just adding random third-party repos is inherently fragile and insecure).

Comment Re:Stop complaning already, remove the tinfoil hat (Score 1) 515

I have done that, but the update didn't carry over the applications from Windows 7 for some reason; it presented a mostly blank install. The only two applications it did carry over were some old copies of Word and Excel, both of which failed to run in Windows 10. The rollback to Windows 7, however, seems to have worked. Given how aggressively Microsoft is pushing the Windows 10 update, I would have expected a better-tested upgrade routine.

Comment Re:You consented to the install... well sorta (Score 4, Interesting) 515

Yep, that is my experience as well. When you click the "download and install later" option, that's it: the update will now be carried out and you have no way to cancel it. The dialog box presented to you before the final update has no cancel button, no close button, nor any other means of not carrying out the installation. You can delay the installation by some days, but you have to set a date for the install; there is no "ask me later".

Comment Modern games are dumb (Score 1, Interesting) 45

I don't think modern games would be a good choice for AI training, as most modern games are extremely simplistic and built in such a way that the player can hardly fail at all. You have endless respawns, navigation markers and all that stuff to help you. They often also have level-up mechanics that could be exploited by an AI. Old games like Doom and Quake seem to be a much better fit, as in those you have to actually navigate on your own instead of just following a magic quest marker. Those games also tend to have direct player control instead of the fly-by-wire you get in Assassin's Creed, where the character walks on his own and player input is just a loose suggestion for where he should go.

Comment Re:Will it ever get cheap again? (Score 1) 107

Price per GB has resumed dropping [cbsistatic.com] since the effect of the Thailand flooding and HDD consolidation in 2011-2012.

The most frustrating part isn't so much the price per gigabyte, as that is back to pre-flood levels, but that you only get that good GB/$ rate in the $100 price range, while you used to get it in the $50 range. If you only need 1 or 2TB you are still paying quite a bit more than five years ago; it's only with 3TB-8TB drives that you get a slightly better rate than pre-flood.
