
Comment Re:He should seek legal advice. (Score 1) 368

Those are the statutory maximums. However, there is a get-out clause: the data holder only needs to provide a copy of the data that can be accessed without "disproportionate effort".

In other words, your name might have been mentioned a couple of times in an e-mail conversation, and those mail spools have now been purged under retention policies. There might still be a great-great-grandfather backup tape with a snapshot of the Exchange server on it, and that might contain e-mails referring to you. However, the effort involved in creating a new Exchange environment, restoring the snapshot to it, and running a search is not reasonable for a generic information request (though retrieval might have been appropriate in the context of court-ordered discovery).

If the data holder advises that the costs of data copying are disproportionately large, they can refuse to provide you with a copy. If you insist, then they are entitled to charge you their legitimate costs in making the copy.

Comment Re:What a fuckup (Score 4, Informative) 368

The old system may not have been phased out completely - only phased out for new data. In fact, this is typically what happened with the older systems. Data was stored on MO discs, which were kept on yards and yards of shelves. Although the data on the discs is in an open and standard format, the discs themselves are an obscure and obsolete medium.

When a new system was installed (which, after about 2000, would have been networked, with data stored on a large server rather than on individual discs/tapes), it would have been too labour-intensive to convert the format - and indeed, the existing equipment may not have supported it, or if it did, it may have required expensive configuration on both the image acquisition device and the server side. Setting up a connection from e.g. a CT scanner to an image server is an expensive process: configuring the server's IP address in the "image destination" config on the scanner is typically a manufacturer service call-out ($4k+), and there must be a matching entry on the server with the scanner's IP address - again, a software-vendor-only setup plus a new image-source IP address licence ($5k+).

So, even though the old system has been decommissioned for new use, the discs may still be available and the workstation still functional, so that the discs can be read and the study examined by a doctor who needs it. However, there may be no way to transfer the data to a new format - e.g. the workstations may never have been fitted with a CD writer, just the MO drive.

This means that there is no way for the hospital to get the data off an MO disc and onto a contemporary format (like CD or DVD). The only way to do it would be to acquire an old external SCSI CD writer compatible with the old workstation (which may be something obscure like a SPARCstation or an SGI Indigo 2) from a specialist IT supplier - or to acquire an MO drive which can be connected to a modern workstation with a CD writer or network access. In fact, even that isn't the end of the story: the old equipment may have been Unix/Linux based, which means the MO discs might be formatted in ext2, and an MO drive on a Windows workstation won't help with that. It is entirely plausible that this is the first request they have had for the data to be migrated to a new format, and that the equipment and configuration needed would have been expensive.

Comment Concentrated solar is less efficient (Score 4, Informative) 237

Unfortunately, this is a concentrated-light solution, which means the efficiency figures quoted apply only in direct sunlight. Direct sunlight provides only a proportion of the energy generated by PV modules, so the real-world energy yield of concentrated solar solutions is worse, relative to the nameplate efficiency, than that of unconcentrated modules.

The reason is diffuse sunlight - light that has been scattered by the atmosphere or by clouds. This typically accounts for about 10% of module illumination in direct sunlight, and a much higher proportion in the presence of atmospheric haze or cloud; even in lightly overcast conditions, you can expect unconcentrated PV to yield approximately 10-15% of its direct-illumination output because of the diffuse illuminance.

Diffuse light cannot be concentrated by optics, thus concentrated solar PV modules cannot utilise the diffuse light (more precisely, they can utilise it, but not concentrate it - thus if the system uses a 10:1 concentration, then the energy yield from diffuse illumination falls from 10-15% to 1-1.5%).

A boost from 30 to 33% efficiency by switching to concentrating modules could be completely wiped out by the loss of diffuse yield, even in direct sunlight. In non-direct sunlight, hazy or cloudy conditions, the yield can be reduced much more severely; resulting in a net reduction in productivity, despite the higher nameplate efficiency.
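A quick back-of-the-envelope calculation (mine, not from the original post, using assumed illustrative irradiance figures) shows how the loss of diffuse yield can cancel out the efficiency gain:

# Sketch with assumed numbers: a 30%-efficient flat module vs a
# 33%-efficient module behind 10:1 concentrating optics. The concentrator
# only delivers ~1/10 of the diffuse light per unit of aperture area,
# because diffuse light cannot be focused.

def flat_output(direct, diffuse, eff=0.30):
    return eff * (direct + diffuse)

def concentrated_output(direct, diffuse, eff=0.33, concentration=10):
    return eff * (direct + diffuse / concentration)

# Clear day: ~1000 W/m^2 direct, ~100 W/m^2 diffuse (assumed values)
print(flat_output(1000, 100))          # ~330 W/m^2
print(concentrated_output(1000, 100))  # ~333 W/m^2 - the headline gain all but vanishes

# Lightly overcast: essentially no direct beam, ~120 W/m^2 diffuse (assumed)
print(flat_output(0, 120))             # ~36 W/m^2
print(concentrated_output(0, 120))     # ~4 W/m^2 - an order of magnitude worse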

This technology is most suited to areas with the most intense direct illumination; e.g. dry areas, at low latitudes (where the role of diffuse light is diminished in proportion).

Comment Re:Any other variables..? (Score 3, Informative) 206

The original research cites a large number of studies with large numbers of children (hundreds or thousands). One of the major studies cited looks at different "types" of neglect, which it calls "global neglect" and "chaotic neglect". These mean multi-modal and single-modal sensory deprivation respectively; e.g. no exposure to speech, no exposure to physical experiences (for example, not being allowed out of bed), no exposure to cognitive stimuli, etc.

The research showed that for "chaotic neglect" (i.e. one aspect of stimulus missing), brain scans were usually normal, or only slightly abnormal (e.g. brain volume reduced). However, for "global neglect" (multiple aspects of stimulus missing), nearly half the brain scans were abnormal, showing severely reduced brain volume.

Of course, there are other aspects to neglect, not just sensory and intellectual deprivation; but that is not what the image, or the description in the text, was about. This review looked purely (as far as is possible in an observational study) at the differences between partial and severe sensory/intellectual deprivation.

Comment Re:Meh... (Score 5, Informative) 234

You're right about the network architecture, but things rapidly get complex.

Let's take the example of MRI/CT. How much data is in a CT or MRI study, or even an X-ray study? A single X-ray image (e.g. a chest X-ray) taken with a modern digital machine is about 60 MB (30-megapixel image, 16 bits per pixel).

My new CT scanner, if I prescribe a "full neuro" protocol, generates 16,000 files of 500 kB each. The reason I'm prescribing a "full neuro" is that minutes count. I need that data set sent not just to a PACS (image repository and viewing software), but also to a PC with 3rd-party software (which has the complex software capable of analysing the data), and I have to have it ready within 5 minutes. Not only do I need to have it in my office in 5 minutes, the doctor who is dealing with the patient in the ER needs to have (some of) it in the ER within 5 minutes. Then, after everything is said and done, I need to send the data to my office at the university, so that I can run it through my research software.
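To put rough numbers on that (my own arithmetic from the figures above, with an assumed count of three destinations), the dataset and the 5-minute deadline translate into a serious sustained-throughput requirement:

# Back-of-the-envelope sizing: 16,000 files of 500 kB delivered to
# several destinations within a 5-minute window.
files = 16_000
file_size_bytes = 500 * 1_000            # 500 kB per file
dataset_bytes = files * file_size_bytes  # ~8 GB per study

deadline_s = 5 * 60
destinations = 3                         # PACS, analysis PC, ER viewer (assumed)

per_dest_mbit = dataset_bytes * 8 / deadline_s / 1e6
print(f"{dataset_bytes / 1e9:.1f} GB per study")            # ~8.0 GB
print(f"{per_dest_mbit:.0f} Mbit/s per destination")        # ~213 Mbit/s
print(f"{per_dest_mbit * destinations:.0f} Mbit/s total")   # ~640 Mbit/s sustained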

If it was just PACS - no problem. You put the scanners and the PACS incoming-data server on a restricted VLAN. Have the incoming PACS server communicate with the main PACS application and data-store servers over a private VLAN, and have the PACS app servers face the hospital clients on the main hospital VLAN (or individual departmental VLANs).

However, at my hospital we also get several hundred CTs/MRIs sent in from outside per day that need to get onto the PACS. Many come on CD/DVD. Some come via VPN tunnels. Some come via 3rd-party proprietary transfer services. (The DICOM protocol used to transfer medical images doesn't support encryption, so it must be tunnelled in some way.) Now you have to somehow connect all these incoming points to your restricted VLAN (or you open your wallet to your PACS vendor for another software licence, at a cost that makes Oracle Enterprise look like chump change).

What if your PACS vendor has you by the balls on your SAN contract, so that you are paying $10 per GB plus $2 per GB per year? Do you really want to send that 8 GB dataset to PACS (which can't actually do anything useful with it - and remember, as a medical-grade archiving device, you can't delete)? Or do you now need to start putting PCs with 3rd-party software on your restricted VLAN so they can talk to the scanners?

Comment This is extremely common. (Score 5, Informative) 234

The term medical device has a broad definition; it includes obvious things such as laboratory analysers, X-ray equipment, etc., but it also includes PCs running specific types of software, such as medical records software. Most of these things run general purpose OSs - some embedded; some desktop.

E.g. Windows XP is a common platform for things like ultrasound scanners, MRI scanners, etc. XP Embedded is quite common on things like laboratory equipment. Variants of Linux are also in widespread use - albeit often old ones; e.g. I work with an MRI scanner that runs a 2.2 kernel.

Now, things like analysers and scanners should be on their own VLAN, with connections only to their application servers and with the servers heavily firewalled from the general-purpose VLANs. However, this often isn't the case, and I've seen a number of installations where you can just sit down at a random PC and SSH into an MRI scanner (these things usually have generic root passwords which are written in the service manual - once you know what the passwords are, you can get into any device of that make and model).

The biggest problem, however, is that these machines never get updated. The manufacturers often won't support any updates to the OS, or even permit hotfix installation, never mind a 3rd-party security package (for the more general-purpose devices). For example, earlier this year one hospital upgraded their PACS system (software for storing and displaying X-ray/MRI/CT images) and bought a new set of dedicated workstations (quad-core Xeon E5, 8GB RAM, dual Quadro), but because the PACS client software had to interface with a number of other client software packages, and those vendors had strict requirements, these machines ended up being loaded with XP SP1 32-bit and Java 1.4. Unsurprisingly, these aren't regularly patched, and more importantly, they can no longer update their anti-virus software, as the current version of their chosen AV software won't run on this configuration (so they're stuck using an obsolete, unsupported version).

I saw an extreme example of this a few years ago when the Conficker worm hit. There was a group of hospitals in a major city which shared the same infrastructure, and they had a very large PACS system. The worm got onto the PACS VLAN and essentially killed the servers. The system was completely down for days, because as soon as the servers were rebooted or re-imaged, the worm killed them again. The vendor stubbornly refused to apply the hotfix and refused permission to install the hospital's antivirus system on the servers/workstations. The only thing that got it moving was when the CEO of the hospitals got on a conference call with the hospitals' lawyers and the CEO of the PACS vendor, telling them that they were going to f**k them so hard with the SLA stick that they wouldn't be able to sit down for a month. After that call, the vendor agreed to install the hotfix, and the system came back online.

Comment Re:...Why? (Score 2) 328

No, it isn't. It's just that unlike GPS, the precise part is open to the general public.

Correct. The L5 intermediate-precision Galileo signal will be freely available to the public. However, this signal is not as precise as the encrypted GPS "precision" P(Y) code (aka the military signal).

The freely available L1 signal has essentially the same format as the GPS "coarse acquisition" (C/A) signal, and is therefore expected to offer broadly similar performance.

The advantage of having two distinct frequencies available to the public is that it is possible to correct for atmospheric dispersion. Within the ionosphere, the propagation velocity of the signals from the satellites can be altered by prevailing "space weather" conditions; this is actually the major source of error in GPS. As different frequencies are affected differently, a multi-frequency receiver can measure this dispersion directly. Single-frequency receivers must either use a model based upon satellite elevation, or use a correction obtained from another source (e.g. DGPS or SBAS).
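As a sketch of why two frequencies help (my illustration, not from the original post): first-order ionospheric delay scales as 1/f^2, so two pseudoranges at different frequencies can be combined to cancel it - the standard "ionosphere-free" combination.

# Sketch: dual-frequency ionosphere-free combination.
# Pseudorange p_i ~= true range + k / f_i^2, so combining two
# measurements eliminates the first-order ionospheric term.

F1 = 1575.42e6   # L1/E1 carrier frequency, Hz
F5 = 1176.45e6   # L5/E5a carrier frequency, Hz

def ionosphere_free(p1, p5, f1=F1, f5=F5):
    """Combine two pseudoranges (metres) to remove first-order ionospheric delay."""
    return (f1**2 * p1 - f5**2 * p5) / (f1**2 - f5**2)

# Example with a made-up 3 m delay on L1 (the L5 delay is larger by (F1/F5)^2):
true_range = 22_000_000.0
k = 3.0 * F1**2
p1 = true_range + k / F1**2
p5 = true_range + k / F5**2
print(ionosphere_free(p1, p5))  # ~22,000,000.0 - the ionospheric delay cancels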

The Galileo service plans to offer a paid-for premium service, with another signal technically equivalent to the GPS "precision" signal and encrypted using commercial cryptography. This service is also expected to provide a high-bandwidth downlink for rapid acquisition of satellite ephemeris and differential-correction data (if conventional terrestrial assisted-GPS techniques are not available), as well as a guaranteed level of service, with guaranteed over-the-air alerts if a satellite malfunction is detected, allowing a receiver to drop a bad satellite immediately rather than attempt to solve the location problem with bad data. As this signal will be transmitted on a 3rd frequency, triple-frequency receivers that could perform an even more precise measurement of atmospheric signal dispersion are possible.

Comment Re:Thoughts (Score 3, Informative) 163

It's a neutron emitter. Alpha particles interact with beryllium nuclei to produce neutrons. By encapsulating a mixture of americium-241 and beryllium, the alpha radiation (and gamma radiation) can be contained, but the neutrons allowed out, where they can be used for chemical analysis (in this case, for analysing the composition of the rocks around the well bore).
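For reference (my addition, not the original poster's), the reactions involved are the alpha decay of americium-241 followed by the (alpha,n) reaction on beryllium:

\[ {}^{241}\mathrm{Am} \;\rightarrow\; {}^{237}\mathrm{Np} + \alpha, \qquad {}^{9}\mathrm{Be} + \alpha \;\rightarrow\; {}^{12}\mathrm{C} + n \]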

Quite apart from the fact that the source is dangerous in its own right, the neutrons it emits are an ionising radiation and a particular nuisance, because they can leave radioactivity behind by activating the nuclei of nearby materials (metals are particularly troublesome).

Comment Re:Already Broken (Score 3, Informative) 183

I haven't tried IE or Firefox, but Magnifier doesn't work on Chrome windows. The magnified view just shows an empty page.

I'm guessing that whatever Chrome is doing - OpenGL, or whatever it uses to composite the pages - bypasses whatever layer Magnifier hooks into.

Similarly, the McAfee tool probably works by using graphics hardware overlays, rendering the image directly into the graphics buffer and then using hardware compositing. This works quite well to defeat low-end screen capture software. The better software, such as FRAPS, is capable of capturing the overlays and then re-compositing the final image in software.

Comment Re:Not about ATA, about enterprise data storage (Score 5, Informative) 192

The "Turn off Windows write-cache buffer flushing on the device" option activates an ancient windows bug, and should never be used.

When Windows 3.11 was released, MS accidentally introduced a bug whereby a call to "sync" (or whatever the Windows equivalent was called) would usually be silently dropped. At the time, a few programmers noticed that their file I/O appeared to have improved, and attributed this to MS's much-marketed new 32-bit I/O layer. What a lot of naive developers didn't notice was that the reason their I/O appeared to be faster was that the OS was handling file streams in an aggressive write-back mode, and that calls to "sync" were being ignored by the OS.

Because of this, there was a profusion of office software, in particular accounting software, which would "sync" frequently - some packages would call "sync" on every keypress, or every time Enter was pressed or the cursor moved to the next data-entry field. Because this call was effectively a no-op on 3.11, a lot of packages made it onto client machines, and because it was fast, no one noticed.

With Win95, MS fixed the bug. Suddenly, corporate offices around the world had their accounting software reduced to glacial speed, and tech support departments at software vendors rapidly went into panic mode. Customers were blaming MS, Win95 was getting slated, lawyers were starting to drool, etc. Developers were calling senators and planning anti-trust actions. The whole thing was getting totally out of hand.

In the end, MS decided the only way to deal with this bad PR was to put an option into Windows by which the bug could be reproduced for software which depended upon it. The option to activate the bug was hidden away reasonably well, in order to stop most people from turning it on and running their file system in a grossly unstable mode. However, from Win95 through Vista it had a rather cryptic name, "Advanced performance", which meant that a lot of hardware enthusiasts would switch it on to improve performance without any clear idea of what it did. At least in Win7 it now has a clear name, even though it still doesn't make clear that it should only be used when running defective software that depends on the old behaviour.
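As an illustration (mine, with a hypothetical file name), this is the kind of "sync after every entry" pattern those accounting packages relied on. os.fsync() is the modern equivalent of that "sync" call; with write-cache buffer flushing turned off, Windows effectively drops the request, which is why the software looks fast but the data is left unsafe.

import os

# Hypothetical sketch of a flush-on-every-entry I/O pattern.
def record_entries(path, entries):
    with open(path, "a") as f:
        for entry in entries:
            f.write(entry + "\n")
            f.flush()              # push library buffers to the OS
            os.fsync(f.fileno())   # ask the OS to flush to stable storage
                                   # (silently ignored when the buggy option is on)

record_entries("ledger.txt", ["1234.56", "789.00"])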

Comment This type of law isn't unique. (Score 3, Informative) 119

It's not much different in a number of other countries, notably the UK.

If a crime is committed over your internet connection, you are liable - unless you can provide proof of identity of the perpetrator. For a commercial ISP, this isn't too hard - they can tie a communication to an account, and the name of the account holder is good enough.

If you are offering wi-fi as part of a business (e.g. a coffee shop), then unless you keep some form of record of customer IDs which allows you to match a communication to a customer, you are on shaky ground. A common business practice is to outsource wi-fi provision to an ISP, where the customer has to provide their account credentials for that ISP, or otherwise provide some evidence of their identity (e.g. by providing valid credit card details, or, less invasively, by receiving an SMS containing an activation code at a phone number provided by the customer).

An alternative, and increasingly common, approach is to heavily filter wi-fi traffic - it's now common to see free wi-fi locked down like a corporate network, with all manner of block lists and, more and more often, blocked ports (I've come across a few public wi-fi services where only ports 80 and 443 are available and every other port is blocked - such networks severely disrupt smartphones, as they break e-mail, iMessage/FaceTime, etc. connectivity).

Comment What happens if there is gross negligence? (Score 3, Interesting) 550

Bugs and security vulns are almost unavoidable - but some are due to gross negligence, and gross negligence should always be open to litigation. To follow on from Microsoft's analogy: suppose a door manufacturer was grossly negligent (let's assume the door includes the lock and hinges, which isn't normally the case) and sold a high-security door system, but had accidentally keyed all the doors to a single grand-master key. If you were then burgled because a burglar happened to find out about this grand-master key, you would potentially have a claim.

I don't see why it should be much different in software development. A software vendor needs to bear some responsibility for good programming practice.

Bad software is everywhere; some is so bad, that it does border on grossly negligent.

As an example, I recently reverse engineered an "electronic patient record" system that was installed at a local hospital. This had a number of interesting design features:
1. Password security was via encryption rather than hashing. The encryption was a home-brew modified Vigenere cipher.
2. The database connection string was stored in the clear in a conf file in the user's home directory. Interestingly, the database connection used the "sa" user.
3. Presumably for performance reasons, certain database tables (notably "users") would be cached in plaintext to the user's home directory. This way, an SQL join could be avoided, and the joins could be done client side.
4. The software ran an auto-updater that would automatically connect to a specified web site and download and run patches as admin - without any kind of signature verification.
5. All SQL queries were dynamically generated strings - no parameters, prepared statements or stored procedures - and not every user input was properly escaped. Entering a patient name with an apostrophe in it would cause very peculiar behavior. In the end, regular memos had to go round to staff telling them under no circumstances to use apostrophes in patient names, and to avoid, wherever possible, the use of apostrophes in the plain-text entries (a sketch of the safer, parameterised approach follows this list).
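For illustration (a hypothetical minimal example, not the actual hospital code), this is the difference between building SQL by string concatenation and using a parameterised query:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")

name = "O'Brien"

# Fragile/injectable: the apostrophe terminates the string literal,
# causing a syntax error at best and injected SQL at worst.
# conn.execute("INSERT INTO patients (name) VALUES ('" + name + "')")

# Safe: the driver handles quoting, so apostrophes are just data.
conn.execute("INSERT INTO patients (name) VALUES (?)", (name,))
print(conn.execute("SELECT name FROM patients").fetchall())  # [("O'Brien",)]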

This is by no means all the security problems this software had, never mind the bugs - e.g. a race condition when synchronising with a second application, which would result in the two components opening different patients' charts.

Amazingly, there weren't any security breaches or significant medical errors as a result of this software - but I can't really conclude that this software production was anything other than grossly negligent.

Comment Re:That Poster... (Score 4, Informative) 439

The lead is likely very effective at reducing recorded exposure - probably cutting it by 75-90%. Most of the radiation in a typical fission-product incident is beta radiation, which will be substantially attenuated by 1 mm of lead (the beta particles won't get through, though perhaps 1-2% of their energy may get through as bremsstrahlung X-rays). Gamma rays will also be attenuated, but only by a few percent (high-energy direct photons won't be significantly affected, but photons scattered from concrete, etc. will be of much lower energy, so will tend to be heavily attenuated).

There are plenty of radiation suits that offer 0.1 or 0.2 mm lead-equivalent protection (they don't usually contain lead for environmental reasons; bismuth is usually used instead). These are quite useful for protection against beta radiation, even if they do nothing for gamma. However, the sheer weight of even a 0.2 mm lead-equivalent suit makes it only barely practical (though I understand the US military have bought a lot of them).

However, lead boots are a sensible precaution - most of the radiation in a Fukushima type incident is in the form of water soluble or suspended particles, which pool on the floor in puddles. Severe radiation injury to the feet from beta emitters is possible - 1mm lead equivalent rubber boots are tolerable to wear, and would offer substantial protection to the feet.

Comment Re:Remember the Kernel Backdoor (Score 3, Interesting) 194

I don't think Gibson found a kernel backdoor.

He did shout very loudly about what he claimed was an intentional backdoor in the Windows Metafile image handler, which would start executing native code when a callback command was included in the script. He made a large number of spurious arguments as to why this was clearly intentional, since the vuln could apparently only be triggered in very exceptional circumstances.

He was completely wrong about almost everything he said. The vuln was trivial to trigger, except when the callback was the last instruction in the script (which was the only configuration Gibson tested). From the fact that he had great difficulty triggering it, requiring multiple parameters to be set to nonsense values, he concluded that it was clearly a deliberate backdoor.

It later came out from a number of MS insiders (incl. Mark Russinovich) that metafiles were a feature of Win 3, intended to be fully-trusted OS components for rapid image drawing, and therefore had privileged access to a variety of internal system calls - notably the ability to set callbacks. The functionality was greatly increased in Win95 and later, with the original hand-written x86 assembly being ported directly rather than rewritten. In the mists of time, the assumption of full trust got lost.
