Comment Pity there isn't a -1 ; Conspiracy Theory mod (Score 5, Insightful) 246

Slashdot needs one. Seriously, for a community that claims to hate FUD, the OSS types sure like spreading it when it's about the "right" groups. If you actually care about what the telemetry sends back at its various settings, the information is all out there for you. No, SSH data isn't among it. However, I'm going to guess you don't actually care, and this is just crap you want to fling at "the bad guys" because you can.

Also, a thought for you: your OS, by definition, has access to anything any program on the system is doing. What would stop it from looking in on any third-party SSH server you ran, if you think it does that sort of thing?
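To make that point concrete, here is a small sketch (Linux-only, illustrative scenario) showing that the kernel exposes every process's state without that process's cooperation: we spawn an unrelated process and read its command line and open file descriptors straight out of /proc. A real OS could obviously inspect far more (memory, sockets, syscalls); this is just the cheapest demonstration.

```python
# Sketch: the OS sees everything a process does. We spawn "sleep 5" and,
# as an outside observer, read its argv and open fds from /proc.
import os
import subprocess

proc = subprocess.Popen(["sleep", "5"])
try:
    # /proc/<pid>/cmdline holds the exact argv, NUL-separated.
    with open(f"/proc/{proc.pid}/cmdline", "rb") as f:
        argv = f.read().split(b"\0")[:-1]
    # /proc/<pid>/fd lists every file/socket the process has open.
    fds = os.listdir(f"/proc/{proc.pid}/fd")
    print(argv)        # [b'sleep', b'5']
    print(len(fds))    # at least 3: stdin, stdout, stderr
finally:
    proc.kill()
```

If the kernel can do this to a sleeping `sleep`, it can do it to your SSH daemon; the only real question is whether it chooses to, which is what the telemetry documentation answers.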

Comment Not going anywhere in data centers (Score 1) 406

I have bought some Dell R430 and R730 servers, which are the latest generation (Haswell-based Xeons, DDR4 RAM), and guess what their one and only video output is? That's right, a VGA port. No DVI, no DP, just VGA. No surprise either: go have a look at high-end networked KVMs. They are all VGA. It works, so it is staying around in that space (same deal as serial, for that matter).

It is certainly a standard in decline; digital transmission makes more sense, particularly since our displays are digital these days. But dead? Not hardly. I'm sure it'll be around many years from now, just in more niche areas.

Comment Re:Not to undermine the enthusiasm but... (Score 2) 180

Apache does not require release of source, and it does not have the advertising clause of the "older" BSD licenses, so I'm not sure what about this project might be violating the Apache license. Overall, Apache is pretty permissive; it's hard to violate except by providing source code while claiming it as your own (e.g. removing copyright notices and replacing them). Strangely, releasing a binary-only Apache component with no attribution (e.g. every Android device on the market except the Nexus devices) is more legal than releasing source for the component with all the original author credit removed. I do recall the Android-x86 guys were pretty unhappy about RemixOS being effectively a for-profit kang, but the nature of Apache was such that there wasn't much they could actually do about it.

GPL is a whole other story. If they are failing to provide sources in compliance with the GPL, then they can burn. (Technically they have to provide the sources themselves, but I'll let someone slide who says "we used unmodified upstream sources, which can be found at X" when there is no evidence to the contrary.)

Comment Re:Another good idea that will get shut down (Score 1) 180

Yeah, as much as I like Android, it is not suitable for desktop systems except for a few special niches.

I can see it making sense for someone to do an HTPC build that was Android-based in order to run Android games. But to be honest, a SHIELD Android TV would be an easier/better/likely cheaper solution for Android games, especially since many of them ship only ARM native components and take a severe performance hit on x86.

It makes no sense as a desktop/educational OS right now, which is why the Pixel C has been slammed by so many for having an OS inappropriate to the type of product.

Comment It does when they buy it for work (Score 2, Insightful) 80

The reason I'm so anti-Apple is that our professors, particularly the younger ones, decide they need Apple computers, phones, and tablets to be hip. So they get them, against recommendations. Never mind that these cost a lot more than we'd spend on equivalent hardware; then the support issues start. It turns out that Macs don't just magically work, and they have problems with things (accessing our central storage is something Macs have been particularly problematic with), and the professors whine to us despite having promised that they understood and would support the machines themselves.

Apple wants to pretend to be good for the enterprise, but their enterprise features are garbage. So people get them, want them to integrate, they don't, and then they cry about it.

Comment Seems obvious to me. (Score 1) 311

Simply make any use of the well-known logical fallacy types a crime against sanity, then see which methods are used in court. I think exile to Mexico would be a good punishment for those found guilty. Appeal to CowboyNeal would be categorized as "heresy and witchcraft".

Comment They also can be useful in lower end apps (Score 1) 71

They're useful if you want something that uses less power. It is as true today as ever that you can do more with less juice in an ASIC than in software. Sure, throw a big CPU at something and it can often do the trick. But maybe you don't want a big CPU and its associated support hardware; maybe you have a reason to want something lower power. That's where dedicated hardware comes in.

Also, I think many people who dis hardware firewalls have never seen really difficult networks. It isn't so much the traffic volume that causes trouble as the number and randomness of connections. I work on a university campus, and we were buying firewalls back in the early days of dedicated appliances. On paper, our network was easy: we only had something like an OC-3 (155 Mbps) to the Internet, and you could get 1 Gbps firewalls no problem... yeah, those fell over the moment they were turned on. They could not handle the nature of our traffic. We ended up getting some of Cisco's very first hardware firewalls, and those worked well.
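The failure mode above can be sketched in a few lines. A stateful firewall keeps an entry per flow in a fixed-size table; the toy model below (all numbers invented for illustration, not any real appliance's specs) shows how a handful of big transfers cost almost no state, while a campus-style flood of small, distinct connections blows the table regardless of how much raw bandwidth the box is rated for.

```python
# Toy model: state-table exhaustion in a stateful firewall.
# Bandwidth never enters the picture; only the flow count matters.
class StatefulFirewall:
    def __init__(self, table_size):
        self.table_size = table_size
        self.flows = set()
        self.dropped = 0

    def open_flow(self, flow_id):
        if flow_id in self.flows:
            return True                 # existing flow, no new state
        if len(self.flows) >= self.table_size:
            self.dropped += 1           # table full: connection refused
            return False
        self.flows.add(flow_id)
        return True

fw = StatefulFirewall(table_size=10_000)

# A few big transfers: high bandwidth, trivial state.
for i in range(20):
    fw.open_flow(("bulk", i))
print(fw.dropped)    # 0

# Campus-style load: tens of thousands of small, distinct flows.
for i in range(50_000):
    fw.open_flow(("user", i))
print(fw.dropped)    # 40020 - table filled at 10,000 entries
```

This is why a "1 Gbps" firewall can fall over on a 155 Mbps link: the spec sheet measured packets through a few long-lived flows, not ten thousand students each opening dozens of short connections.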

Comment Re:A difficult subject (Score 1) 308

Absolutely, on all points. I'd perhaps add one other: we've got potentially good diagnostics, but they're not used for this, they're rare, and they're horribly expensive. (Problem is, my point is longer and less clear than yours.)

An example. Hospital MRI scanners are around 2 to 2.5 T, which gives enough resolution to see severe injuries and malformations, but not much more. Medical scanners can go up to 7.3 T, and research scanners in active use go up to 9.1 T. At this upper end, blurred sections of the brain become almost crystal clear. You can see not quite to the neuron level, but fairly close. Subtle issues can be detected. It's more than good enough to find out if there's a problem with mirror cells, bandwidth issues (too much or too little), and similar fine-scale deformities. The best scanner that can be built that can take a human head is around 13 T. It's unclear what this would show; I've not been able to find any info on it.

I wouldn't ask psychiatrists and neurosurgeons to keep an underground bunker with dozens of such devices staffed with top technicians at the ready, although if one of them is the sole winner of the US Powerball at its current 1.2-billion-dollar level, it would be nice if some of it were spent on such things. However, MRI as a diagnostic tool is strongly discouraged, apparently, which seems to defeat its value as a means of rapidly identifying and classifying evidence you can't otherwise get at.

I counted the total number of scanning technologies (excluding minor variants) and came up with 33 different diagnostic tools that could be used on the brain. Of those, I have known only two to be used in practice (EEG and MRI), and never even remotely close to the levels of sensitivity needed to analyze the problem unless, as I said, there's a problem at the grossest of levels. EEG, for example, is performed with as few leads as possible, and the digital outputs I've seen look like the ADC is cheap and low resolution. Nor have I ever been impressed by the shielding used in the rooms (the brain is not a strong source, so external signals matter a lot). I've read papers where MEG is used, but it seems to be almost exclusively research, with very, very few hospitals actually using it.
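Why ADC resolution matters here can be roughed out with back-of-envelope arithmetic. The figures below are assumptions chosen for illustration (a ±5 mV input-referred range and scalp signals on the order of tens of microvolts, both plausible but not taken from any specific EEG system): the converter's step size (LSB) decides how much fine structure survives digitization.

```python
# Back-of-envelope: ADC step size (LSB) for an EEG front end.
# Assumed figures: +/-5 mV input range, i.e. 10,000 uV full scale.
def lsb_microvolts(full_range_uv, bits):
    """Smallest distinguishable voltage step for the given ADC."""
    return full_range_uv / (2 ** bits)

FULL_RANGE_UV = 10_000.0   # assumed +/-5 mV input-referred range

cheap = lsb_microvolts(FULL_RANGE_UV, 12)   # ~2.44 uV per step
good = lsb_microvolts(FULL_RANGE_UV, 24)    # ~0.0006 uV per step
print(f"12-bit LSB: {cheap:.2f} uV")
print(f"24-bit LSB: {good:.6f} uV")
```

Under these assumptions, a 10 µV feature spans only about four quantization steps on the 12-bit converter, so any subtle structure is rounded away; the 24-bit converter leaves resolution to spare, which is why cheap digitization is a real ceiling on what clinical EEG can show.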

This doesn't contradict your statement that there are no good diagnostic tools, partly because nobody has the faintest idea whether these tools would be any good in diagnostics (trying is forbidden by the great overlords), and partly because nobody knows how you'd read the data: if a tool isn't actually used, then nobody can understand its output at all, and if it's used but never for diagnosing mental illness, then there's no means of understanding what the output means in this context.

That's just the bog-standard medical gear, though. While imaging should be useful (your experience shapes your brain, your brain shapes your experience, and this recursion should mean you can identify traits of one from the other), there are other tests. There are hundreds of questions that make up the official test for autism spectrum disorders, but I've only heard of one doctor (and then second-hand) actually running through them. Most glance at the DSM (which is worse than useless; its criteria are largely rejected both by the checklist and by those definitely in this category) and that's it. The checklist is probably not optimal and is probably incorrect much of the time, since autism has a very wide range of causes (both known and suspected), and congealed categories of unrelated conditions won't fit any single checklist. Researchers hotly dispute even when it can be diagnosed and at what age it first appears. That's clearly not very helpful.

But that's positively enlightened compared to something like "Borderline Personality Disorder", a label given to anyone who doesn't fit any billable category and is generally considered not worth wasting time on by the medical and psychiatric professions. Here there really isn't an actual diagnosis as such, just an identification that there's a problem and that it's not something insurance will pay for.

We need a good, solid ontology of the mechanisms of mental illness, one that forgets tradition and costing, whether or not there's any lawful or technological way to detect them yet, because mechanisms can be measured (even if only in principle) consistently and reliably. It's not enough, by a long way, but at least there would then be a clear indication of where the gaps are: what precisely we lack diagnostic tools for (and perhaps even theory for).

This would also be the starting point from which medicines or therapies could be developed. You're right about side effects. About half my current medications exist simply to counteract the side effects of other medications. One I was put on, temporarily, shut down my colour vision. The problem? Doctors had to experiment on me. They had no idea what would happen until they tried me on something. I really do not like being used as a lab rat when the long-term effects of even short-term exposure are unknown, but it is known that the short-term effects include death even at the lowest end of the therapeutic range, with no understanding of why or to whom this will happen. It strikes me as... all a bit vague.
