
Comment Not impossible (Score 1) 58

100 words per minute might be a stretch, but it doesn't sound all that impossible given that the speed record was set with hunt&peck typing by moving a cursor across the screen. Some fancy machine learning that could guess whole words at a time, or something along those lines, should have no problem beating that by quite a margin. It wouldn't even need to be perfect, just close enough to give a drastic speedup (much like Tab-completion).
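To make the Tab-completion analogy concrete, here is a minimal, purely hypothetical Python sketch of prefix-based word guessing (the vocabulary, frequencies and function name are made up for illustration). A real brain-typing system would run a proper language model over decoded neural signals rather than typed prefixes, but the speedup mechanism is the same: a couple of noisy inputs select a whole word.

    # Toy vocabulary with made-up frequencies, purely for illustration.
    WORD_FREQS = {"the": 100, "they": 40, "there": 35, "brain": 5, "braille": 1}

    def complete(prefix: str) -> str:
        # Return the most frequent known word starting with the prefix,
        # or the prefix itself if nothing matches.
        candidates = [w for w in WORD_FREQS if w.startswith(prefix)]
        return max(candidates, key=WORD_FREQS.get) if candidates else prefix

    print(complete("th"))   # "the"   - two inputs produce a whole word
    print(complete("bra"))  # "brain"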

See this earlier work that guessed video sequences from brain activity. Getting information out of the brain in a more 'holistic' form, instead of just as cursor movement, seems plausible.

Comment Re:I miss software that works. (Score 2) 467

Any old 8 or 16-bit software from decades past, if we have any of that software around today, it still works.

You are kind of ignoring the gap that existed back then between computer architectures. None of your C64 programs would work on an Amiga. You couldn't read the data you saved on 5.25" floppies in your 3.5" drive either. Each new computer generation essentially meant starting all your computing from scratch; neither programs nor data could be carried over. Easy data transfer via USB or the Internet just didn't exist back then.

It took decades before we had working emulation, along with data formats and media that could be made to work across different computer architectures.

Also, if you just stick to the same hardware and OS, your software will keep running even in the modern day. Windows XP might no longer be supported, but it runs just as well as an old Workbench on an Amiga. There is some software that wants to be activated online, but patches for that exist in most cases, and it's not that big of a deal unless you try to play an MMORPG where most of the computation happens server-side.

Comment Re:Read my post again (Score 1) 311

This is the kind of circular argument that's impossible to refute. "Show me the numbers!" "Here are the numbers!" "No, those don't support my predetermined conclusion, so they must be fake." It's exactly like the Trump administration's attitude towards climate science, to pick one recent example. But whatever, the same strategy worked out so well for the fossil fuel interests that I guess I shouldn't be surprised that other ideologues have latched onto it.

Comment Re:No it won't (Score 1) 311

This is common knowledge to anyone who has worked in the field - it's like asking for a citation for the claim that eating too much junk food leads to obesity. But here are two data points:

http://blogs.sciencemag.org/pi...
https://en.wikipedia.org/wiki/...

So less than 20% of approved drugs are discovered in academia to begin with. Academic labs aren't large-scale operations - a single-investigator R01 grant from the NIH might be $5 million over 5 years, and most investigators won't have more than a handful of these. For the really big superstar labs, let's assume a very generous upper bound of $10 million per year (not all of which is necessarily from the government). If it's a big multi-investigator project, maybe double that. Except for a handful of big centers (like the NIH itself, or genome sequencing centers), academia just doesn't operate at a large scale - a typical university research department is just an aggregation of many smaller units that are largely autonomous. The hidden advantage of these organizational limitations is that failed projects usually fail before anyone spends too much money on them. So let's hypothesize that, at the extreme, academics spend no more than $50 million per drug candidate. Compare that to the numbers in the Wikipedia article.

Now, you could of course argue that because drug development is informed by the public-domain knowledge generated by taxpayer-funded researchers, drug companies are leeching off the public in that way too. I guess that's technically true (albeit difficult-to-impossible to quantify), but you might as well argue that because the government invented digital computers, companies like IBM and Intel should have been nationalized. (Note that the difference in salary between academia and big pharma is relatively large - to shift more drug development to academia, you'll need to raise salaries, or find a lot of scientists willing to work for academic pay while doing grunt work on massive projects that will most likely fail.)

To pick a more specific example, the NIH spends approximately $1.2 billion per year on aging-related research (including but not limited to Alzheimer's):

https://www.nia.nih.gov/about/...

Most of that will be single-investigator grants, and as anyone who has worked in basic research can tell you, the majority of funded grants won't lead to any immediate treatments, although they may provide useful information in the long term. In contrast, here is an estimate that puts the total cost per Alzheimer's drug at $5.7 billion (including failures, and keep in mind the overwhelming bulk of that is spent by drug companies):

https://alzres.biomedcentral.c...
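Purely as a back-of-envelope comparison of the two figures cited above (illustrative Python, nothing more):

    nih_aging_budget_per_year = 1.2e9  # ~$1.2B/year on aging-related research (NIA figure above)
    cost_per_alzheimers_drug = 5.7e9   # ~$5.7B estimated total cost per approved drug, failures included
    print(cost_per_alzheimers_drug / nih_aging_budget_per_year)  # ~4.75

In other words, the estimated cost of getting a single Alzheimer's drug to market is roughly five years of the entire NIH aging-research budget.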

This isn't to argue that taxpayer funding of basic research isn't valuable - it's absolutely essential IMHO. But most of what it produces isn't going to lead directly to new drugs or treatments.

Obligatory disclaimer: I do not work for a drug company, but I did receive funding from them as a government scientist, and receive a small bonus from IP licensing fees every year. Frankly it was far more trouble than it was worth; drug companies are kind of a pain in the ass to deal with, even if you only talk to the scientists.

Comment Re:No it won't (Score 1) 311

They do a few clinical trials after the government has done the really expensive stuff (what's called "Basic Research", IIRC).

This is simply wrong. The development process (which includes a lot more than just clinical trials) is far more expensive than the basic research component - and that's without even counting how many projects simply fail without anything to show for it.

Comment Re:here's where the road goes... (Score 3, Insightful) 126

The loss of control over personal computing, and web browsing especially, is completely self-inflicted in the name of ease-of-use. If Mozilla wanted, they could have built a Freedom browser, but instead they built a crappy Chrome clone, and to get a little bit of freedom back you have to install all kinds of third-party add-ons (even for basic things like saving videos). It's mind-boggling how featureless modern browsers are by default.

As for the DRM, it's tricky. If there is no standard, you either get no video or you get a proprietary plugin. The lack of a standard doesn't make DRM go away, and companies have no problem breaking standards to squeeze DRM in there. I don't like DRM being a standard, but I don't think it will make things worse than they already are. On the plus side, if there is a standard, it might be easier to crack.

Comment Re:What's the point of 140 characters to begin wit (Score 2) 77

People already use Twitter for long messages all the time, but due to the character limit they have to do it by posting images of text or other ugly workarounds. If Twitter wants to give priority to short messages, that's fine, but forcing people into workarounds is just stupid. They already went through the same thing with photos: people used TwitPic and other workarounds for a long while until Twitter finally started to allow images. It hasn't killed Twitter; quite the opposite, it made it more versatile and useful. I would expect much the same to happen with long text support. Not supporting more than 140 characters isn't a feature, it's a leftover from its SMS beginnings.

Comment Not dead yet (Score 3, Informative) 230

Flash isn't dead yet. While most mobile webpages no longer use it, on the desktop you still see it pretty frequently. As for its impending death, that has been a long while coming:

* using Flash to design a whole website became mostly unnecessary due to HTML/CSS becoming more powerful

* using Flash for vector animation was replaced by regular video and YouTube

* using Flash as a video player became unnecessary once HTML gained a <video> tag

I am not quite sure what happened with Flash and gaming; Newgrounds.com is still around, but you rarely hear about it anymore. Doing games in HTML with <canvas> and WebGL is now possible as well, but I don't really see those very often. I assume Unity and mobile gaming mostly took over what was once done in Flash.

However, what really killed Flash was Adobe no longer supporting it. When software is full of security and performance issues, it's no surprise that people move away from it. Flash got popular in the first place because it did things that your browser couldn't do on its own. But while browsers got more powerful, Flash just sat there and didn't really improve much at all.

Comment Re:Yes, it works fine. (Score 1) 91

Yes it does.

Well, yeah, without sound and at 1 fps or less. Meanwhile Steam can pump a game at 1080p@60 over the network without much problem, sound included.

Even ignoring the slowness when it comes to fast-moving content, it's missing a lot of fundamental features, such as the ability to move apps between devices, or screen sharing. You have to stop an app and restart it to move it to another device. That you have to pipe the protocol through SSH if you want a bit of security also makes it more complicated to use than it needs to be.

Comment Re:This is a pattern. It happens to everything. (Score 1) 91

Unix is built with a one-system:many-users mindset. That was a great idea 30 years ago. Today's reality, however, is the other way around: many-systems:single-user. Everybody has a tablet, a smartphone and a PC, sometimes multiple of each. Unix provides nothing to deal with that. You can try to export your home directory via NFS, but that falls apart the moment you have the same app running on two different computers and both want to access the same file. Some programs solve the problem at the application level, e.g. bookmark syncing in Chrome, but that doesn't scale to the rest of the system.

Once upon a time, X11's network transparency was its claim to fame, and it did provide a bit of a solution to the many-systems:single-user problem. But today it's close to useless. Ever tried to stream a video over X11's network connection? It doesn't work. The protocol just can't deal with today's workloads. Proprietary alternatives exist that handle this much better.

In terms of security, the whole idea of giving every app that a user runs access to everything the user can access is also foolish. But at least there is a bit of hope of fixing that. Android solves it by running each app under a different user; Ubuntu tries to solve it with sandboxing. It's all still far away from being the default way you run apps on a desktop Linux, but at least people have recognized the problem.

Comment Re:You can't have it both ways. (Score 3, Interesting) 91

The big issue I have with HTML is that it's useless for publishing larger content, like books or even just multi-page articles. Thanks to hyperlinks it is of course possible to add some Next/Prev buttons to a webpage to represent such content, but those links are just hacks, not markup. eReaders have developed their own formats (.ePub, .mobi, iBook, etc.) to accomplish this task, but while they are often little more than a .zip of .html files, none of them is a proper part of the Web, and your regular web browser won't read them.

HTML has had <link rel="next/prev" ...> markup going back to HTML 2.0, but it was never properly supported by any browser or developed into something powerful enough to replace .ePub and Co. This, to me, is one of the big failures of the Web that nobody really talks about. The Web should be the place where you publish content, it should be the replacement for paper, but instead people are forced to use .ePub or .PDF for that task, as plain old HTML isn't doing the job.

The other elephant in the room is, of course, hyperlinks themselves. The Web still lacks any kind of content-addressability; everything is location-based, so when a server goes down or its URL layout changes, all your hyperlinks break. Basic tasks like linking to a specific paragraph of another article are also not possible with HTML. Project Xanadu never got much traction, but it's really time for the Web to learn a thing or two from what it tried to accomplish back then.
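Just to make the content-addressability point concrete, here is a purely illustrative Python sketch (not any actual Web standard): a link derived from a hash of the content itself rather than from a server location, so it doesn't break when the hosting URL changes and it can point at a single paragraph rather than a whole page.

    import hashlib

    def content_address(text: str) -> str:
        # Address a piece of content by a digest of its bytes,
        # not by where it happens to be hosted.
        return "content:" + hashlib.sha256(text.encode("utf-8")).hexdigest()

    paragraph = "Some specific paragraph of some article."
    print(content_address(paragraph))
    # Any host storing this exact paragraph could serve it; the "link"
    # stays valid as long as the text itself is unchanged.

That is roughly the direction content-addressed systems take, and it's the kind of thing plain HTML hyperlinks can't express.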
