For someone who just needs a torrent every 3 months or so, this cat-and-mouse game is quite annoying. How about making a Tor hidden service for things like thepiratebay, just like the Silk Road? ( https://www.torproject.org/docs/tor-hidden-service.html.en ). I am wary of suggesting it, because it will turn the powerful media lobby against Tor, but someone is going to have a fit about Tor sooner or later anyway. In fact, Tor is quite extreme in that it allows hosting of *anything* without any possibility of censorship. Most people (excluding me, of course) would want to be able to censor some kinds of (more or less extreme) information, be it porn, exploits, national secrets or copyrighted material.
Dropbox uses LAN sync when available. Granted, people don't usually stick gigabytes of stuff on Dropbox and expect it to sync immediately, but it's still a bonus when it's fast.
Maybe it's a bandwidth issue. At 480 Mbit/s you can push 1366x768 24-bit colour at about 20 Hz. There could be some compression, but it can't be too fancy if you have to drive it from USB power (if you allow an external adapter, then just use DVI or DisplayPort anyway). Now with USB 3 there is 10x as much bandwidth and more power, but that only equates to a factor of sqrt(10)≈3 in linear resolution, since bandwidth scales with the square of the linear resolution. Still, a factor of 3 on 1366x768 is amazing resolution, so USB 3 would be the way to go.
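A quick sanity check of those numbers (a sketch using nominal signalling rates; real USB throughput is lower):

```python
# Uncompressed 24-bit frames over nominal USB signalling rates.
width, height, bpp = 1366, 768, 24
usb2, usb3 = 480e6, 5e9  # bits/s

frame_bits = width * height * bpp           # ~25.2 Mbit per frame
print(usb2 / frame_bits)                    # ~19 Hz over USB 2.0
print(usb3 / frame_bits)                    # ~199 Hz over USB 3.0

# At the same refresh rate, ~10x the bandwidth buys ~10x the pixels,
# i.e. sqrt(10) ~ 3.2x in each dimension:
scale = (usb3 / usb2) ** 0.5
print(round(width * scale), round(height * scale))  # ~4409 x 2479
```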
Isn't Google's music service DRM'ed, even if you "buy" the music? And things like Spotify are of course DRM'ed. It seems that even music is reverting to DRM. We appear to be moving to a different model, where people don't "own" copies of entertainment / cultural works. That's not inherently a problem: if the majority prefers this model, then people and business will be happy. There is one problem, though: things can be more easily censored and modified by government and business. (I'm against DRM and won't buy DRM'ed things to keep; I record some over-the-air TV and buy CDs [CDs have better quality than MP3s].)
Correction: it's a MOOC, not MOOCS.
I should add a disclaimer: I won't be buying this for a while, because I only use Linux. I will keep it in mind, though. And it's very cool that companies (hopefully) listen to their users and make better (or at least different) products.
I hope the Slashdot crowd puts their money where their mouth is then. It's a good idea; VPNs are always a hassle to set up and tune, so this would be welcome. I wonder, though, whether "normal" people will try this out... On the other hand, if you went the cloud route, you'd be the ten-thousandth or so VPN provider, with only performance to differentiate the product. And you might even lose out on performance, despite the channel bonding, if the competitors had servers all over the world.
I think there is hope for both business plans. The personal VPN server market hasn't been cracked yet. There was Hamachi, but it was bought by some company and not much happened. OpenVPN is as hard to set up as ever, and NAT and firewalls mean that you need layers of fallback for reliable operation. I would suggest making a Linux version with low system requirements, in addition to the "Enterprise" Linux version, because Linux users will be overrepresented among people who run always-on systems at home, and it could also run on VPSs.

The enterprise VPN market is quite crowded; I can't say how that will go. The hosted VPN market is equally crowded, but there is also huge demand, partly because of inane geo-IP restrictions on various services. You'd have to sell it on speed, and speed is very much key for things like video on demand.

I'm not sure about the value of channel bonding for personal use: for many people, their home connection or even the courtesy wi-fi at coffee shops is significantly faster than the mobile connection, so switching to wi-fi when possible gives good speed at lower monetary cost. This feature would be brilliant for enterprise systems, though.
SuperGenPass has a lot of limitations due to its design, but its simplicity makes up for that IMO. It is not a password manager, just a hasher: it hashes the domain name and the master password into a unique 10-character alphanumeric password. Only one site I've used has complained about this, and that was eBay, which required punctuation as well. It doesn't handle forced password changes well (though you can append something like "2012" or "2013" to the master password). It is great that the passwords are stored nowhere, so there is no need for synchronisation or backup.
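The core idea fits in a few lines. A minimal sketch of the scheme, not SuperGenPass's actual algorithm (which uses repeated base64-encoded MD5 plus extra character rules):

```python
import hashlib

def site_password(master: str, domain: str, length: int = 10) -> str:
    # Deterministically derive a per-site password; hex output is
    # alphanumeric by construction ([0-9a-f]).
    digest = hashlib.sha256(f"{master}:{domain}".encode()).hexdigest()
    return digest[:length]

# Same inputs always yield the same password, so nothing is stored:
pw = site_password("my master password", "ebay.com")

# The rotation workaround: bake the year into the master password.
pw_2013 = site_password("my master password2013", "ebay.com")
```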
Password managers and SuperGenPass are a good solution, but too complicated for most people to use. The system suggested in the article doesn't work either: when a password DB is compromised, there will be no entry in the audit hook. The hook will only raise an alert too late, when the attackers actually use the password.
There are much better options for improving authentication. It's not easy, though, to do it without relying on a third party while still allowing logins from various new computers with little effort.
I hadn't heard of it, but you can sort of work it out from the name. I assume it means requiring vendors to label food that is genetically modified or contains such ingredients. I'm also in an area where it's not debated, but there are strict rules for what can be called, for example, cheese, so I think GMO labelling would be a logical extension of existing laws. I'm quite indifferent to genetic modification as a technology, and I certainly see the benefit of producing more food. I would avoid it just to protest the asshole business practices of the companies producing the modified organisms, though.
The first missing idea from my previous post is the output format. There is no reason for documents to be a stack of pages when they are displayed on a screen. It is absolutely boneheaded. There are solutions for producing HTML from TeX source; this was the first search result: http://hutchinson.belmont.ma.us/tth/ . I don't know why academics keep ignoring this and keep making PDFs, which are only good for printing and for displaying on large monitors. There are many small devices which are better suited for reading (e.g. on the train), and PDF papers look like crap on them ( http://ask.slashdot.org/story/12/12/01/214255/ask-slashdot-tablets-for-papers-are-we-there-yet ). The problem with HTML is that it can't be saved locally and passed around easily. Maybe EPUB can help; the page I linked has a section on how to make EPUBs. So my suggestion is to have a prominent option to output to EPUB. Strike the collaboration features; we can handle using git or SVN for a few more years.
The number one missing feature in TeX land is collaboration. It's not horrible -- you can split the doc into files for different sections (I don't know if you can do this in LyX) and use source control or Dropbox -- but it's not particularly elegant (a sketch of the split-file approach is below). Just having seamless integration with source control would be great: some kind of interactive conflict handling and easy committing of all dependent resources. It could also be useful for single-user projects to have revision tracking. Perhaps the LyX project could be a git repository by default, but I would of course prefer it to support SVN and anything else that comes along too. Something like the SVN integration in Eclipse would be cool, but it wouldn't have to be that comprehensive. LyX would of course still have to support stand-alone files without all the VCS mumbo jumbo.
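For reference, the split-file approach in plain LaTeX is just \input directives, with each section file tracked separately in the VCS (the file names here are made up):

```latex
% main.tex -- top-level file; each section lives in its own file,
% so concurrent edits to different sections rarely conflict in git/SVN.
\documentclass{article}
\begin{document}
\input{introduction}  % introduction.tex
\input{methods}       % methods.tex
\input{results}       % results.tex
\end{document}
```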
Bah, who cares about a few gigabytes on real computers (netbooks included). Maybe sysadmins with hundreds of diskless clients care, but installing TeX on a shared mount solves that. And who worries about updates anymore when there's apt, yum and hundreds of hacked-together solutions on Windows? Maybe sysadmins who have hundreds of clients that need updates but don't have unlimited bandwidth.
Remote compilation is interesting at first glance, though, because it can take ten or more seconds to compile a large LaTeX file on a slow computer, and compilation is single-threaded, so having a really fast server for this could be beneficial. Most other text processing jobs don't require much juice, with the unfortunate exception of *displaying* PDFs. After compilation, the resulting PDF will be on the order of a few MB, so there will be practically no transmission delay on a LAN, and a few seconds over the internet. The problem is uploading the content to the server, including all graphical content. No problem on a LAN, but it would be a nightmare for home users, because upload speed is typically 10% of download speed.
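To put rough numbers on that (a sketch with made-up but plausible sizes and link speeds):

```python
Mbit = 1e6  # bits

pdf = 3 * 8 * Mbit        # ~3 MB compiled PDF, in bits
figures = 30 * 8 * Mbit   # ~30 MB of source + figures to upload

lan, down, up = 1000 * Mbit, 20 * Mbit, 2 * Mbit  # assumed link speeds

print(pdf / lan)          # ~0.02 s: fetching the PDF over a LAN
print(pdf / down)         # ~1.2 s:  fetching the PDF at home
print(figures / up)       # ~120 s:  uploading the figures at home
```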
The CLSI does allow for caching, but it requires a URL for the cached content, so you'd need another server just to hold a second, cached copy of the files. It would be an interesting challenge for developers to write code to manage the uploads, with correct queueing and error handling (a sketch below). In the end, I think the time saved by fast compilation is going to be negligible (except on a LAN, but there the sysadmins would have to set up 1) an upload server and 2) a compilation server, and that is probably too much, except possibly at huge universities and NASA and CERN). It seems more interesting to have a purely remote system *including an editor* on the web (no, X11 forwarding with LyX doesn't cut it; too slow). That way one could work on documents without having to install anything, for example when one has to borrow a computer. This wouldn't be a LyX project though.
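A hypothetical sketch of such an upload manager: a retry queue with backoff. upload_file() stands in for whatever HTTP call the real client would make; none of this comes from the CLSI spec.

```python
import queue
import time

def upload_with_retries(paths, upload_file, max_attempts=3):
    """Upload each file, retrying failures with exponential backoff."""
    pending = queue.Queue()
    for path in paths:
        pending.put((path, 0))
    failed = []
    while not pending.empty():
        path, attempts = pending.get()
        try:
            upload_file(path)              # assumed to raise OSError on failure
        except OSError:
            if attempts + 1 < max_attempts:
                time.sleep(2 ** attempts)  # back off before retrying
                pending.put((path, attempts + 1))
            else:
                failed.append(path)        # give up; report to the user
    return failed
```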
It seems performance is a big concern for the submitter, in which case LXC is a great idea.
My primary disk is a 5-disk array (with 2 SSD cache devices!). About 4 TB free. I had a RAID 10 setup with four disks before, but it was getting close to full. 8 TB is overkill, but it was just a matter of adding a single drive. My backup disk is only 1 TB, so much of what I have are files that can be re-generated, or which have copies in other places. Also TV recordings. My *secondary* active disk is a 500 GB hybrid disk in a laptop, and it's above 80%. My tertiary active disk is the app storage on the Android phone, I suppose, and that's almost full too.
It's not *just* out of generosity. It is in Google's interest to have users submit as much data as possible, and users are more likely to do so if the government doesn't have easy access. It's a great thing nonetheless; it just happens that there is a positive correlation between the interests of Google and those of the users.