Comment Re:What did they expect (Score 2) 28

I was speaking to the CIO at a different NHS trust and this topic came up in conversation. He claimed that the hack so far appeared "purely opportunistic, unsophisticated and easily preventable".

Part of the issue is likely that IT is severely under-resourced in the NHS. Many NHS organisations are still organised on the assumption that IT is a cost centre, rather than a core business process. The result is that IT doesn't get representation at executive level - so rather than a CTO or CIO explaining matters to the board and having some control over finance, IT may be treated as part of another department such as finance, or even buildings/facilities. Many sites are severely under-resourced in staff numbers and staff skills, as well as in hardware.

As an example, one major hospital was upgrading their campus network earlier this year. Large parts of the campus were still fast ethernet with a gigabit backbone, and this was a source of much frustration due to poor performance for Teams video calls, VoIP telephony, medical imaging studies in the multi-GB range, as well as increasing use of electronic records. However, they hit a snag: many of the installed PCs, phones, and other appliances were so old that there was some sort of compatibility issue with the new gigabit ethernet, and the network roll-out had to be halted until new PCs, etc. could be procured, resulting in a huge delay and massive cost overrun.

At the same time, software tools such as electronic medical records, electronic prescribing and medicines records, medical imaging, etc. are often not treated as IT procurements, and are procured and managed by individual clinical departments. For example, medical imaging software (picture archiving and communication system - PACS) is typically purchased by radiology departments, but it actually requires substantial infrastructure, due to the large quantities of data and the need for high data integrity and availability - think multi-PB tiered, redundant storage with high-availability clustering at both application and storage level, and SAN replication. Because the skills to administer such a system are often not available locally, these systems tend to be procured in a manner akin to SaaS, but with the hardware hosted on-site. This also means that architecture features often end up being specified by the vendor, which can lead to some aspects being deficient. For example, I recently visited one multi-hospital trust, and they had a roughly 1 PB storage array for their images. It was "backed up in real-time to a different SAN in a different rack using replication".

The NHS is also very good at trying to get away with paying the bare minimum for staff. You basically can't hire someone on a salary that attracts the better applicants. I recently saw an advert for one NHS hospital looking for a software developer to take charge of their in-house electronic medical record. The existing maintainer had left and the system was now live and unsupported. Essentially, they were looking for a lead developer with experience developing safety-critical software, an understanding of the various regulations (e.g. medical device regulations), and experience with interfacing via HL7 and various other frameworks. The salary being offered was £26k.

That said, management are starting to see the light. I've seen several adverts this year for NHS board-level CIO or CDIO (chief digital and information officer) posts, in an attempt to get people with a detailed understanding of IT into senior positions, and to move IT closer towards a core competency which needs to be managed as such. How quickly, and how much, effect this will have is yet to be seen.

Comment Re:How deep is "deep" for access? (Score 1) 57

This is beyond kernel mode. This is even beyond hypervisor mode. This exploit permits arbitrary code to run invisibly to the kernel and hypervisors, and that code has full access to system RAM, meaning it can invisibly read and change kernel and hypervisor memory. In effect, it is a rootkit running above the hypervisor, so it could directly interfere with any VM on the system, or even directly interfere with security infrastructure in the hypervisor itself (recovering security keys, etc.).

This rootkit, after being injected by e.g. a compromised driver, could potentially become persistent by installing itself into the system firmware, thereby executing and staying resident even before the normal boot sequence starts.

Comment Re:they can go forever (Score 3, Insightful) 122

That's correct. However, the rate of embrittlement is well known and can be verified by periodic inspection of test specimens (made from the same batch of steel as the reactor pressure vessel, at the same time) which are placed in high-flux regions of the reactor core. As a result, these experience a higher dose of radiation and therefore provide a reference point for future embrittlement of the reactor vessel itself. One thing that has been fairly consistently observed is that as the cumulative dose increases, the rate of embrittlement decreases, and initial estimates of practical lifetime turned out to be significantly conservative. RPVs which had been assumed to have a 30-year life limit have already been extended to 60 years, and relicensing for 80 years is underway at a number of sites.

There are, however, additional things which can be done. It is possible to rearrange the fuel, so that more active fuel is placed at the centre, and less active fuel at the periphery. This greatly reduces neutron leakage from the core into the RPV, and this can greatly extend the operating life.

There are other options to reduce stresses on the RPV. For example, the limiting event is a core quenching and reflooding following a loss of coolant, where very large volumes of cold water are poured onto an overheated core and RPV. It has generally been assumed that this reflooding would be performed with cold water, and therefore the ductile-brittle transition temperature must lie below the lowest possible cold water temperature. However, it is possible to reclaim some margin by redesigning the flooding process. For example, it is not necessary that the water be cold, just that it be water. Holding emergency water tanks at 60-80 C for reflooding at a higher temperature can recover substantial margin, both by increasing resistance to the brittle transition and by reducing the magnitude of the thermal shock.

The other thing which can be done is to heat treat the reactor vessel, which can recover most of the ductility lost to neutron embrittlement. This has been performed and validated by the Russians. While not validated and licensed in the West, it is expected to work and to be technically feasible; there has simply been no commercial need to develop the technique to a licensable, production-ready level.

Comment Re:Observation (Score 2) 218

The main difference is that this is a single large, well-conducted, multi-center study by an experienced research group following best research practice. The current research is interesting, but in isolation it is insufficient to be used as a treatment recommendation. The other fundamental difference is that, unlike some other potential treatment options, it has not (yet) been widely politicised, hyped and exaggerated, or generally become the subject of conspiracy theories.

The same trial (the TOGETHER trial) which has just published its fluvoxamine results also had an arm which investigated ivermectin, constructed to similar quality standards (and it found a negative result).

If you look at the situation with ivermectin, the majority of the studies have been published as non-peer-reviewed pre-prints, often very small (many with fewer than 100 participants, some uncontrolled). Even among peer-reviewed journal publications, methodological errors and poor practice are widely seen. For example, few trials actually registered their trial in advance of performing it - pre-stating your aims and methods means that late changes to methods or data analysis, such as p-hacking, become apparent, and because of this increase in transparency, pre-registration is one of the basic minimum quality indicators which a clinical trial is expected to meet. There has also been enormous heterogeneity in treatment strategy (prophylaxis, home treatment, hospital treatment) and dosage, which makes interpretation of the literature as a whole difficult. On top of that, there is evidence of scientific fraud (directly fabricated, duplicated or tampered data) in some of the most important "positive" trials, leading to multiple retractions by journals where this has come to light.
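
As a rough illustration of why pre-registration matters (pure chance, no real drug effect assumed - just the arithmetic of testing many endpoints and reporting whichever one "works"):

    # If a trial is free to analyse many endpoints/subgroups after the fact
    # and report whichever one "works", the chance of at least one nominally
    # significant (p < 0.05) finding grows quickly even when the drug does
    # nothing at all. (Assumes independent endpoints for simplicity.)
    alpha = 0.05
    for endpoints in (1, 3, 5, 10, 20):
        p_any_hit = 1 - (1 - alpha) ** endpoints
        print(f"{endpoints:2d} endpoints tested -> "
              f"{p_any_hit:.0%} chance of at least one 'positive' result by chance")

Pre-registration doesn't stop this outright, but it makes the switch from the declared primary endpoint to a more flattering one visible to everyone.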

Despite the flaws in the literature, which at best is inconclusive, there has been considerable hype on social media and from politicians and other activists. There are even misleading websites (e.g. ivnmeta.com) specifically targeting scientists, presenting flashy "live" meta-analyses online, with compelling tables and figures (but with multiple concealed methodological problems which seriously harm their credibility).

It is wrong to call ivermectin a fraud. There has been legitimate reason to investigate it, and while the negative result of the TOGETHER trial is disappointing for proponents, TOGETHER concentrated on more serious disease, so it may have missed a benefit in early treatment of mild disease. The problem is that the evidence base is heterogeneous and of generally poor quality, with the better-quality trials tending to show negative results. The best assessment is that it is not clear whether there is a benefit or not, but it does not appear to be seriously harmful, and therefore further trials to better investigate certain uses could be ethical.

Comment Re:Not surprising (Score 2) 85

The basic buttons and mouse functions work.

The extra buttons, resolution settings, and various other config (e.g. button macros, illumination) need an account. Not only that, but even if you do have an account, the software spams you with pop-ups, needs regular updates, etc.

The release of Synapse basically ruined their products for me. I didn't have a super fancy mouse and didn't use the advanced features, so Synapse got promptly uninstalled and blocked. However, I won't be buying any more Razer products because the software is so bloated and awful.

Comment Re: Wow such a surprise. . . not (Score 1) 296

One of the big problems with the ivermectin research was that the literature was severely biased by a single early study (Elgazzar et al.) which showed a strong beneficial effect, and also scored highly on quality measures. When using pooled analysis techniques, such as meta-analysis, the results from this one study overpowered the effects of the multiple smaller and lower quality studies performed around the same time. For quite some time, performing a formal meta-analysis of the entire literature showed a beneficial effect of ivermectin, and it has only been recently that enough negative studies have been published to overpower the early positive results.
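
As a rough sketch of how that pooling works (a fixed-effect, inverse-variance meta-analysis; the effect sizes and standard errors below are invented for illustration and are not taken from the actual ivermectin literature):

    import math

    # Toy fixed-effect meta-analysis on log risk ratios, showing how one
    # large, precise, "too good to be true" study can dominate the pooled
    # estimate. All numbers are made up for illustration.
    studies = {
        # name: (log risk ratio, standard error)
        "big early positive study": (math.log(0.25), 0.15),  # strong apparent benefit, very precise
        "small trial A":            (math.log(1.10), 0.45),
        "small trial B":            (math.log(0.95), 0.50),
        "small trial C":            (math.log(1.05), 0.40),
    }

    def pooled_rr(subset):
        # Inverse-variance weights: precise studies count for more.
        weights = {k: 1.0 / se ** 2 for k, (_, se) in subset.items()}
        log_rr = sum(weights[k] * subset[k][0] for k in subset) / sum(weights.values())
        return math.exp(log_rr)

    print("pooled RR, all studies:          ", round(pooled_rr(studies), 2))
    without = {k: v for k, v in studies.items() if k != "big early positive study"}
    print("pooled RR, dominant one excluded:", round(pooled_rr(without), 2))

With these invented numbers the pooled risk ratio swings from a dramatic apparent benefit to essentially no effect once the single dominant study is excluded - which is the mechanism described above.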

As time has gone by, larger and better-quality studies have started to be published, which have tended to show less effect than earlier studies. This is a common pattern when an ineffective new medical treatment is investigated - there is a tendency for novelty to trump quality in early publications, hence potentially interesting positive results get published despite poor quality or inadequate sample size. However, as time goes by, researchers and journal editors expect quality to become more important, and resources and funding can be obtained to perform these more expensive and resource-intensive studies.

However, the killer blow for ivermectin has been a forensic analysis of the Elgazzar study, which has demonstrated the data to have been fabricated. The article had never been published after peer review, but had been released on pre-print servers, and even the pre-print servers have now retracted it. Repeating the meta-analyses with the Elgazzar study excluded shows convincingly that ivermectin has not been of benefit.

Comment Re:Open source because it's useless? (Score 1) 99

The schematic of the project is fairly simple. A GPS receiver with PPS output and the output of the rubidium oscillator are connected to an FPGA which is configured as a digital PLL/NCO. The FPGA generates a conditioned PPS signal which is brought out to a PPS out port. This PPS signal needs to be routed to the PPS in port on a PTP server NIC.
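
To make the FPGA's job concrete, here is a minimal sketch of a digital PLL/NCO steering loop in Python (the loop gains, oscillator error and 1 s update rate are made-up illustrative numbers, not anything from the actual project's gateware):

    # Minimal digital PLL / NCO sketch (illustrative only). Once per second
    # the loop compares the NCO's output edge against the reference PPS edge
    # and steers the NCO frequency word with a PI controller.
    kp, ki = 0.5, 0.05            # PI loop gains (loop settles over a few seconds)
    freq_error_ppb = 120.0        # assumed error of the free-running oscillator
    nco_correction_ppb = 0.0      # frequency correction currently applied to the NCO
    integrator = 0.0
    phase_offset_ns = 250.0       # initial misalignment between NCO edge and reference PPS

    for second in range(20):
        # Residual frequency error of the disciplined NCO, in ppb.
        effective_error_ppb = freq_error_ppb + nco_correction_ppb
        # 1 ppb sustained for 1 s accumulates 1 ns of phase error.
        phase_offset_ns += effective_error_ppb
        # Phase detector output feeds the PI controller, which updates the NCO.
        integrator += ki * phase_offset_ns
        nco_correction_ppb = -(kp * phase_offset_ns + integrator)
        print(f"t={second:2d}s  phase={phase_offset_ns:8.1f} ns  "
              f"correction={nco_correction_ppb:8.1f} ppb")

The conditioned PPS the FPGA emits is effectively the NCO roll-over after this kind of steering has removed the oscillator's phase and frequency error relative to GPS.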

The software stack running on the server consists of standard Linux PTP and NTP daemons. The PTP server is configured so that the PTP hardware clock on the NIC is operated in PPS-disciplined mode and serves as a primary reference time source. The NTP server is configured to use the NIC's PTP hardware clock as a stratum 1 time source and will slew the system clock to match it.

Having thought about it some more, the FPGA and PCIe card seem rather excessive. One of the convenient things about the Microchip MAC (miniature atomic clock) compared to older atomic clocks is that its firmware has an integrated DPLL, so it can be disciplined to an external PPS signal. Just connect a GPS receiver to the MAC's PPS-in port and turn on GPSDO mode.

It may well be that the project was intended to be a generic GPSDO, and that they happened to build the prototype with an atomic clock - one which just so happened to include its own disciplining facility. The process for getting the time signal into the PC appears to be a standard server NIC with PTP hardware and a standard PTP software stack.

Comment Re:Open source because it's useless? (Score 1) 99

However, you will be using the PPS output to phase-lock a digital PLL. The resulting phase error of the PLL depends on the control loop time constant and the phase noise of the oscillator being slewed.

There have been some pathological cases where this type of hardware has performed surprisingly badly - for example, some Raspberry Pi revisions had such a poor local oscillator, far worse than the control loops in ntpd were tuned for, that even when connected to a GPS PPS source they could not achieve precision better than a few milliseconds (I mean milliseconds, not microseconds).

Better-quality oscillators permit longer control loop time constants (which better filter the phase noise in the GPS PPS signal) as well as longer holdover periods (e.g. during a GPS outage).
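
To put rough numbers on holdover (a back-of-the-envelope sketch that assumes the oscillator simply sits at a constant frequency offset for the whole outage; the stability figures are typical class values I'm assuming for illustration, with 20 ppb matching the NIC oscillator mentioned below):

    # Time error accumulated during a GPS outage if the local oscillator sits
    # at a constant fractional frequency offset. Real oscillators also drift
    # and wander, so treat these as lower bounds.
    def holdover_error_us(offset_ppb, outage_seconds):
        return offset_ppb * 1e-9 * outage_seconds * 1e6   # microseconds

    for name, ppb in [("TCXO ~1 ppm", 1000.0),
                      ("OCXO ~20 ppb", 20.0),
                      ("rubidium ~0.5 ppb", 0.5)]:
        print(f"{name:>18}: {holdover_error_us(ppb, 86400):10.1f} us over 24 h")

A commodity crystal wanders off by tens of milliseconds per day, while the atomic clock stays within tens of microseconds - which is the whole argument for the expensive oscillator.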

I confess I don't really understand the choice of PCIe form factor here. The GPS receiver and expensive oscillator could easily have been built into a 1U rack or desktop enclosure, and this would make connection to the PC/PCs/server easier - just connect the PPS to a GPIO card or a dedicated PPS input as found on some NICs intended for PTP servers.

Some NICs already include high-performance oscillators, specifically to facilitate operation as a PTP server, and provide a PPS driver for the OS - e.g. the Intel XXV710-DA2 card, which features a 20 ppb stability oscillator (good for stratum 3E) and a PPS input port. Just connect a GPS receiver and you're good to go. Sure, this card isn't cheap, but neither are PCIe FPGA dev boards. Or if you want stratum 2 timing stability, use the same Microchip MAC and its GPSDO function. Put it in an external box and connect its USB port if you want to monitor it.

Comment Re:better envelope is more than insulation (Score 5, Informative) 80

The primary purpose was insulation. Under UK legislation, any renovation or upgrade to a building must also enhance the building's energy efficiency to meet modern standards. So, for example, if you build an extension to your 100-year-old house, then not only must the extended part meet modern energy efficiency standards, the rest of the house must be upgraded too. So, insulation was a mandatory legal requirement.

Aesthetics are also an important criterion for getting planning permission. Councils these days generally do not permit eyesores - hence it is necessary to convince planning inspectors that the aesthetics are suitable for the area.

However, the insulation was relevant here. One of the targets of the council was to demonstrate "aspirational" energy efficiency. They specifically listed thermal specifications significantly in excess of the legal requirement as one of the core specifications that the contractors had to achieve.

The architect, when he got the thermal specifications, wasn't sure that they were possible - so he contacted all the insulation companies he had used before and asked their technical support departments for details of fire-resistant insulation products capable of meeting these specifications. The replies came back that the specification was unachievable. The architect went back to the main contractor and the council to inform them that the thermal specification in the contract was unachievable and had to be changed. There then followed an increasingly belligerent exchange of e-mails between the customer and prime contractor, and the architect and insulation suppliers - with the prime contractor insisting that it was possible and the council insisting that the spec was non-negotiable and would not be changed under any circumstances. After being unable to find suitable fire-resistant materials, the architect begged the prime contractor for advice to break the stalemate. The contractor suggested that he look at alternative materials, and suggested a type of PIR insulation which "they had used all over London, and never had a problem with building control".

Eventually, an order went in for the PIR material. The supplier found out it was for a residential tower and refused to supply it, stating that it was too unsafe. They substituted a fire-resistant version, which had a fire test certificate - however, the certificate was only valid when the insulation was paired with cement boards, not with any other type of combustible material, and certainly not with the polyethylene outer cladding used.

It later turned out that the test the certificate applied to had been manipulated: the first test of the material at a laboratory failed spectacularly. Subsequently, the manufacturer of the material submitted it for a retest, but had secretly agreed with a technician at the test laboratory to insulate the thermocouples with ceramic.

That said, the insulation was only a part of the issue - the main issue was the flammability of the outer polyethylene panels. The architect and planning inspectors had initially wanted solid metal cladding, and had agreed on brushed zinc. However, the council decided that they had to find cost savings and demanded that the architect/contractor cut the budget by 500k. The expensive solid zinc tiles would have to go. In fact, things were worse: the prime contractor had mis-quoted when they signed the contract, and had under-estimated the cost of the project by approx. UKP 1 million. The council would not reconsider the budget - the cost reduction was non-negotiable. The prime contractor was already in financial trouble and could not afford to eat a UKP 1 million loss.

At this point nothing mattered any more. For all the extensive effort and discussions about which shade of brushed zinc with brass accents, the only materials being considered were the absolute bottom-of-the-barrel materials which would meet the budget requirements. Aluminium/polyethylene composite was the cheapest material available - and nothing else would fit within the budget.

Comment Re:GNSS (Score 1) 29

Anti-spoofing was a core feature of the first-generation GPS system. The technique was to encrypt the spreading code using a symmetric algorithm.

A spoofed transmission with the wrong key would appear as uncorrelated broadband noise to a receiver with the correct key. At the same time, the anti-spoofing process prevented unauthorised use of the signal, as a receiver without the key would receive nothing but noise. This functionality was only enabled for the "precise" (typically reserved for military and government internal use) signal. The "coarse" signal was transmitted at a much lower chip rate, and the low bit rate left little capacity for meaningful security payloads.
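
As a toy illustration of the correlation point (a short random code stands in for the real, much longer, encrypted sequence - this is a sketch of the principle, not of the actual GPS signal structure):

    import numpy as np

    # Direct-sequence spreading demo: a receiver correlating against the
    # correct spreading code sees a large peak; against the wrong code it
    # sees only noise-level output.
    rng = np.random.default_rng(0)
    N = 10_230                                   # chips per data bit (illustrative)
    correct_code = rng.choice([-1, 1], size=N)   # the "keyed" spreading sequence
    wrong_code = rng.choice([-1, 1], size=N)     # what a spoofer without the key sends

    data_bit = +1
    rx_genuine = data_bit * correct_code         # genuine transmission
    rx_spoofed = data_bit * wrong_code           # spoofed transmission

    # The receiver despreads with the code it derives from the key.
    print("genuine:", np.dot(rx_genuine, correct_code) / N)   # ~1.0 (full correlation)
    print("spoofed:", np.dot(rx_spoofed, correct_code) / N)   # ~0 +/- 1/sqrt(N)

The spoofed signal's correlation output is indistinguishable from the noise floor, which is exactly the "uncorrelated broadband noise" behaviour described above.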

At the time GPS was released, elliptic curves were not in widespread use, and digital signatures therefore meant RSA - and 1024 or 2048 bits would be needed for meaningful security. Considering that the entire GPS navigation data payload is only 1500 bits transmitted at 50 bps, the cryptographic overhead would have been severe, for relatively little benefit.
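
Some quick airtime arithmetic on why signatures would have been so expensive, using the figures above:

    # Airtime needed to transmit a digital signature at the legacy GPS data rate.
    DATA_RATE_BPS = 50        # navigation message bit rate
    FRAME_BITS = 1500         # one full navigation frame (30 s of airtime)

    for sig_bits in (1024, 2048):
        print(f"RSA-{sig_bits} signature: {sig_bits / DATA_RATE_BPS:.0f} s of airtime "
              f"({100 * sig_bits / FRAME_BITS:.0f}% of a whole frame)")

A single RSA-1024 signature would eat roughly two thirds of a navigation frame, and RSA-2048 more than an entire frame.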

Comment Re:There is no way this was not botched (Score 2) 173

It's not so much that the study is botched, but this type of retrospective observational study is extremely prone to bias, and the biases can be surprisingly large. This is why the gold standard for an intervention trial is a randomised controlled trial, preferably double-blinded.

For example, the different treatments were chosen on an ad hoc basis - while there may not have been a formal protocol for choosing a treatment, the prescribing doctors may have been going on "gut instinct": this person is higher risk, better give them the double treatment. Alternatively, other aspects of treatment may have changed over the study period - better anti-coagulation to prevent blood clots, treatment with non-invasive ventilation and high-flow oxygen in preference to invasive ventilation, etc. - coinciding with changes in prescribing practice. Different prescribing policies might also have been in effect at different hospitals, where other treatment policies differed (although the paper does state that protocols were the same, it doesn't explain how the different treatments were allocated - for example, no rationale is given as to why AZM should be given alone, and apparently the only reason for not giving HCQ was heart disease showing on EKG).

The second issue is that there were numerous test groups and only one small control group - in fact, the test groups were much larger than the control, with great variation in group sizes; for example, the HCQ group was 3x the size of the control, and in total the control was only around 20% of the study. This is a problem because any biases which turned up in the control group (due to sampling error, selection, etc.) will affect the analysis of every test group. This could be significant in this case - both drugs tested are significantly cardiotoxic, their use by people with known heart disease is not advisable, and heart disease is a major risk factor for poor covid-19 outcome. On top of that, because there were multiple small groups, the confidence intervals of the results are wide (+/- 5% for the control group, up to +/- 8% for test groups) - so a comparison of 26.4% +/-5% to 20.1% +/-3% for HCQ+AZM is not as great a result as suggested by the headline numbers alone (26.4 vs 20.1).
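
As a sanity check on those intervals (the group sizes below are my assumptions, chosen only to roughly reproduce the quoted +/- margins - they are not the study's actual numbers):

    import math

    # Normal-approximation 95% confidence interval for a proportion:
    # p +/- 1.96 * sqrt(p * (1 - p) / n). Group sizes are illustrative
    # assumptions, not figures from the paper.
    def ci_halfwidth(p, n):
        return 1.96 * math.sqrt(p * (1 - p) / n)

    groups = [("control (assumed n=300)", 0.264, 300),
              ("HCQ+AZM (assumed n=900)", 0.201, 900)]
    for name, p, n in groups:
        hw = ci_halfwidth(p, n)
        print(f"{name}: {p:.1%} +/- {hw:.1%} -> [{p - hw:.1%}, {p + hw:.1%}]")

With these assumed sizes the two intervals overlap substantially, which is the point being made above about the headline numbers.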

A third issue is the significant difference in treatments and in severity of disease on admission between the groups. Disease severity on admission was significantly higher in the control group than in the HCQ-only group. Similarly, far fewer of the control group ever got to ICU or onto a ventilator - in other words, patients in the HCQ/AZM groups got much more aggressive treatment not received by the control group. Additionally, the HCQ patients were much more likely to have been treated with steroids (a treatment shown to improve outcomes in randomised controlled trials).

Having said that, a statistical regression was done to look for independent factors, and HCQ treatment was identified as a risk-lowering factor. However, the results of the regression are wildly different from what would be expected, which can be a sign of excessive noise or of an unrecognised confounding effect which was not tested for. For example, this study found that obesity has a protective effect, and that conditions such as heart disease, hypertension, diabetes, asthma, COPD, and cancer had no effect. The researchers also performed an alternative analysis restricted to test/control patients with matching "propensity scores". This is a reasonable methodological approach - but it suffers from the same issue that it can only correct for known confounders, and it also assumes that the "propensity score" correctly captures the effects of all the different variables.

While this does appear to be a well carried out retrospective study, retrospective studies, even when well performed, are known to be unreliable. Their main role is hypothesis generation; although they are often used to guide treatment, this has sometimes come at the cost of promoting the wrong treatment.

There have been good-quality prospective randomised controlled trials performed. For example, the RECOVERY trial in the UK randomised patients to one of a variety of treatments as soon as possible after hospital admission. The RECOVERY trial is, in fact, much larger than this most recent study and uses a better allocation protocol (larger control group). Hence, while this study is interesting, it needs to be interpreted in the context that it is a weakly powered, moderate-quality study giving a result contradictory to multiple higher-powered, high-quality studies.

Comment Re:Some confusion about the plan on the expert's p (Score 1) 241

Accuracy in canyons is limited by location of the satellite in the sky, not the height of its orbit.

QZSS is in high orbit because it is a local system, and in order to provide continuous local coverage, a synchronous orbit is required, and these are high altitude. It's the same with the European EGNOS and US WAAS systems. Those systems piggyback the navigation payload onto commercial comms satellites, so they are in geostationary orbit over the equator, which has the limitation that at high latitudes the satellite elevation is low. QZSS is a dedicated system, and a synchronous orbit where the satellites are near zenith over Japan was chosen.

The Galileo orbits are at a higher inclination than GPS, and this was done to improve the elevation of the satellites at high latitudes - where GPS has particularly poor performance, because the satellites are too low in the sky and easily obstructed.

An LEO system will necessarily require more satellites than one in medium earth orbit (like GPS/Galileo), but it has the advantage of much stronger signals (several orders of magnitude stronger - and signal strength is a limiting factor in satellite navigation performance) and higher redundancy. Additionally, the faster ground-track speed mitigates the canyon issue, because reflected signals (which give false distance readings) are no longer quasi-static, meaning that they are more easily distinguished at the receiver from direct signals.
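
Rough free-space numbers behind the "orders of magnitude" claim (assuming roughly 550 km for a LEO constellation against the ~20,200 km GPS altitude, equal transmit power and antenna gains, and an overhead pass with slant range ignored):

    import math

    # Free-space received power scales with 1/distance^2, so for equal
    # transmit power and antenna gain the power ratio is (d_meo / d_leo)^2.
    # Altitudes are ballpark assumptions.
    d_leo_km = 550.0      # typical LEO constellation altitude (assumed)
    d_meo_km = 20_200.0   # GPS orbital altitude

    power_ratio = (d_meo_km / d_leo_km) ** 2
    print(f"received power ratio: ~{power_ratio:,.0f}x "
          f"({10 * math.log10(power_ratio):.1f} dB stronger from LEO)")

That works out to roughly three orders of magnitude (about 30 dB) before any differences in transmit power or antenna design are considered.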

Full global coverage from LEO requires more than 80 satellites, but it should be possible to design a set of orbits that degrades performance in certain areas (e.g. equatorial regions) in order to improve performance at higher latitudes.

Comment Re:Lacking button for "can't tell" (Score 1) 34

You will have had such a test if you have ever had your eyes tested for anything, especially if being tested for prescription lenses. Whenever you have your eyes tested for lenses, you get a full eye health check, including acuity, intra-ocular pressure, examination/photography of the retina, visual fields/blind spot measurement.

The normal acuity test is just to read letters off a Snellen chart https://en.wikipedia.org/wiki/... - the result is given by the lowest row which you can read accurately.

Comment Re:Lacking button for "can't tell" (Score 2) 34

It measures acuity - which is the medical term for resolution. This is the starting point for measuring eye health, in that acuity is degraded both by refractive error and by structural problems with the eye. Assessment of refractive error requires optical instruments - which can be as simple as a selection of lenses, or sophisticated automated instruments which measure the shape of the eye and its refractive power using optical and imaging techniques. Typically a proper eye examination would include measurement of acuity before and after correction of any refractive error.

The score is not out of "20". The medical way in which acuity is reported is "the distance at which a person with average acuity would have to stand to have the same accuracy". So, for example, 6/6 indicates that your vision at 6 meters is as accurate as the average person's at 6 meters; if you have slightly below-average acuity, your acuity might be reported as 6/10, which means that you had the same accuracy at 6 meters as an average person at 10 meters. An acuity of 6/60 is a significant impairment (approx. 1/10 the resolution of average) and someone with this level of acuity would typically be registered as partially sighted.

As the site is a US site, it is more common to use feet as the unit - in which case, 6/6 would be reported as 20/20, and 6/10 would be reported as 20/33.

If you prefer a more physical interpretation: 6/6 vision is approximately equal to an angular resolution of 1 arcmin.
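
Both the notation conversion and the arcminute figure are easy to check with a little arithmetic:

    import math

    # Convert a metric Snellen score (6/x) to the US convention (20/y), and
    # work out what a 1 arcminute feature corresponds to at the 6 m test distance.
    def snellen_6_to_20(denominator_6m):
        return 20 * denominator_6m / 6

    for d in (6, 10, 60):
        print(f"6/{d} == 20/{snellen_6_to_20(d):.0f}")

    one_arcmin_rad = (1 / 60) * math.pi / 180
    print(f"1 arcmin at 6 m subtends ~{6000 * math.tan(one_arcmin_rad):.2f} mm")

So 6/10 comes out as 20/33, 6/60 as 20/200, and a 1 arcmin feature at the 6 m test distance is a detail a little under 2 mm across.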
