Comment Re:Virtualisation dates from the 1960's ! (Score 1) 180

The first large-scale availability of virtualisation was with the IBM System/370 series, dating from June 30, 1970, but it had been available on some other machines in the 1960s.

So the idea that "newer machines have support for virtualisation" is a bit old.

This point has been made since the first virtualization software on microcomputers was being experimented with. Those who don't know history are doomed to repeat it (or something similar, depending on how diligent your citation tracking is).

I'm still waiting for someone to tell us that IBM discovered perceptually acceptable lossy compression, such as JPEG, MP3, and MPEG, back in the mid-1960s mainframe era, to generate images and videos for punchcard distribution.

And Xerox PARC had a portable MP3 player prototype with a seamless white case and a steering-wheel-style interface, locked in its vaults of time.

Comment Re:Anyone else concerned? (Score 1) 164

but doctors act a lot more like technicians than scientists or researchers.

Doctors are much more like technicians. You don't want doctors "experimenting" on you unless you really, really need that.

To clarify: the doctors or physicians you are referring to are medical practitioners, in medical parlance. There are two additional medical "communities," which are linked: the teaching and research specialties, though these two tend to be more intertwined. In many cases they share hospitals, labs, and institutions.

Physicians are typically not brought up in a 'science' environment (questioning assumptions, learning how to research a topic, critical thinking). Doctors are brought up in 'cram mode': dump a lot of information down your throat, and you're expected to believe it. They are increasingly taught to 'follow the protocol,' which, amazingly, is what technicians do.

That is a gross over-generalization. A good physician is trained to be scientifically minded: to take careful observations (utilizing medical testing), to check assumptions for faults and spurious correlations, and to be critical in what they do. They are expected to learn and memorize a large body of knowledge that they will likely need to do their job on a daily basis, and theirs was the first profession, AFAIK, to have formal continuing-education requirements for keeping a medical license in many jurisdictions. All biochemical scientists follow a protocol so that they have a consistent and reliable testing methodology, to reduce mistakes, to be as objective as possible, and to produce comparable results.

Yes, there are 'physician scientists' but they aren't treating the majority of patients and you don't want them to be ('hey that looks interesting, what happens when I tug on it?').

If you are being treated by a medical researcher, then either there is no known effective or reliable treatment, or they are testing a new, hopefully better, treatment. It means you are the test subject: normally not an ideal situation.

This case is interesting as the husband of the patient kicked the docs out of 'technician' mode. And, of course, used a 3D printer.

ALWAYS ask your doc questions about stuff you don't understand.

Interesting, yes, but it bugs me more in that I fear the deniers of vaccine safety, and those who want to consumerize their medical experience ("the customer is always right" is a horrible mantra for any legitimate medical practice), will use it as evidence to vindicate their positions. Most of the medical drama was in fact about miscommunication, inconsistent practice, and the need to be your own advocate for medical treatment.

From working with physical scientists, I know that visualizing three- and higher-dimensional data is still often hard to interpret, even with advanced computer visualization techniques. The results can sometimes be made to look pretty, but that has little correlation with how quickly and easily the visualization can be interpreted to extract the relevant information.

Comment Re:Chrony (Score 1) 79

So it's fair to say that Chrony isn't suitable for running on stratum 1 servers, of which there are a few hundred, maybe up to at most a few thousand publicly available in the world[1]. For the millions of Linux servers, laptops and desktops that aren't and will never be stratum 1 NTP servers Chrony should be just fine, shouldn't it?

Yes, I think Chrony is fine for most typical unauthenticated leaf-node (client-only) usage, but I still don't recommend it for the thousands of public stratum 2 or higher (see pool.ntp.org, most are stratum 2 or 3) servers, or the thousands of corporate and organizational NTP servers. For usage as a server, with a full-time network connection, I don't know of any compelling reason to use Chrony over NTPD or OpenNTPD.

Personally, I can't see any reason to believe Chrony is more secure than either NTPD or OpenNTPD. Being new, or even saying that Chrony is secure and programmed really, really carefully, doesn't make it so.

Comment Re:Mathematics (Score 2) 79

Chrony is a complete working implementation of the NTP protocol.

You mean complete except for broadcast/multicast mode, or authentication based on public-key cryptography? So basically it's a good client and an unauthenticated / (network-)inefficient server.

It also makes some pretty misleading claims; "Chrony can usually synchronise the system clock faster and with better time accuracy," except it never explains how it can possibly achieve better time accuracy than NTPd.

Chrony does handle a number of client usage scenarios better than NTPD (namely non-permanent network connection, and laptop-like environments) as far as I know, but it does not achieve better accuracy for the usage scenarios NTPD was primarily designed for (e.g. network connected servers).
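For context on where accuracy comes from: both daemons start from the same on-wire arithmetic defined by the NTP spec (RFC 5905); the differences lie in the filtering and clock discipline applied afterwards. A minimal sketch of the standard offset/delay calculation, using made-up timestamps:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP on-wire calculation (RFC 5905).
    t1: client transmit, t2: server receive,
    t3: server transmit, t4: client receive (all in seconds)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0  # estimated clock offset
    delay = (t4 - t1) - (t3 - t2)           # round-trip network delay
    return offset, delay

# Made-up example: client clock 0.100 s behind the server,
# symmetric 20 ms one-way network paths:
offset, delay = ntp_offset_delay(10.000, 10.120, 10.125, 10.045)
# offset ~ +0.100 s, delay ~ 0.040 s
```

The calculation assumes symmetric path delays; asymmetry shows up directly as offset error, and how each daemon filters and disciplines around that is where their behavior diverges.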

NTPD gets its knickers in a twist at the slightest excuse and sometimes ends up stepping the time even though it has perfectly good Internet connectivity and a reasonably good internal clock.

Yet Chrony can't detect rogue time servers or fix broken ones. Beyond possibly having better handling for clients with dynamic clock frequencies (i.e. SpeedStep and the various other power-saving features that modify one or more of the several frequency oscillators in a computer). I say possibly because I am not certain of the state of affairs in the current NTPD code base; I know it was lacking when dynamic clock frequencies originally appeared in systems, but I am not sure whether it is still naive about that.

Chrony keeps steady time even if Internet access is intermittent. It never gets confused and picks a falseticker pretending to be stratum one instead of a stratum 3 with correct time, unlike NTPD.

While Chrony does appear to have improved greatly from the simple SNTP client for intermittent network connectivity that it was when I first heard about it, that is still its forté, and it is likely the best client for many end-users' cases. Still, it is not a robust general-purpose replacement for NTPD.

It even has interfaces to GPS clocks or other hardware clocks, so you can run your stratum 1 server on Chrony if you want.

And YouTube is full of people doing stupid, reckless, and/or unwise things too. That's perhaps too harsh, but those "features" are quite incomplete.

Having optional PPS (Pulse Per Second) support is a good start, but it is not a comprehensive solution to running a quality stratum 1 server. I expect a stratum 1 server to have improved, or at least quantified, oscillator ("clock") parameters, ideally a TCXO (Temperature-Compensated Crystal Oscillator) or OCXO (Oven-Controlled Crystal Oscillator) for the stratum 1 system's timekeeping. For commercial systems I would suggest looking at a professional NTP server network appliance; there are several vendors, including Spectracom, Symmetricom, Meinberg, and others.

Comment Re:Not surprising... (Score 3, Informative) 278

Of course, its audio quality compared to a CD is debatable [...]

No, it isn't debatable. Due to the physical limitations of cutting a groove in the record surface, and of reading it with a needle during playback, vinyl recordings ("LP" or other form factors such as 7-inch 45's) are physically constrained, preventing the recording of some low-frequency sounds and effects. Such sounds and effects are/were featured in electronic music ("techno", "dance", etc.). This was the reason behind the RIAA equalization curve used to de-emphasize the bass frequencies: it allowed closer spacing of the grooves (which lengthened play time, the major justification / selling point of the LP format). There are also pre- and post-echoes of loud passages when preceded / followed by a very quiet section. Vinyl is an analog recording using techniques developed in the 1950s, and suffers from the numerous physical limitations of the medium, with no inherent noise reduction or error correction possible, so the vinyl format has absolutely no objective superiority in accurate sound reproduction.
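For the curious, the RIAA playback (de-emphasis) curve is defined by three standard time constants (3180 µs, 318 µs, 75 µs); a small sketch computing the playback gain relative to the 1 kHz reference:

```python
import math

def riaa_playback_db(f, ref=1000.0):
    """RIAA playback (de-emphasis) gain in dB relative to 1 kHz,
    from the standard time constants: poles at 3180 us and 75 us,
    zero at 318 us."""
    t1, t2, t3 = 3180e-6, 318e-6, 75e-6

    def mag(freq):
        w = 2 * math.pi * freq
        return math.sqrt(1 + (w * t2) ** 2) / (
            math.sqrt(1 + (w * t1) ** 2) * math.sqrt(1 + (w * t3) ** 2))

    return 20 * math.log10(mag(f) / mag(ref))

# On playback, bass is boosted ~19 dB (it was cut that much when
# cutting the groove, keeping excursion small and grooves close):
low = riaa_playback_db(20)      # ~ +19.27 dB
high = riaa_playback_db(20000)  # ~ -19.62 dB
```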

There is one complicating factor, which is not inherent in the vinyl format itself. Modern ("revival") LPs do excel in that they often use a better-quality final mix with a wider dynamic range, whereas final mixes for CDs and digital formats are typically highly or over-compressed (because "louder" is intuitively perceived as "better", the basis of the "Loudness Wars") before being transferred for commercial duplication.

Some well-mastered CDs and digital recordings (retaining the full dynamic range between quiet and loud passages) do exist, but sadly too many studios still over-compress their recordings.
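One rough, illustrative proxy for this kind of over-compression is crest factor (the peak-to-RMS ratio): heavy limiting and clipping push it down toward 0 dB. A toy sketch, not a mastering-grade measurement:

```python
import math

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB; a crude proxy for dynamic range.
    Heavily limited 'loudness war' masters have a low crest factor."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pure sine has a crest factor of sqrt(2), about 3.01 dB;
# hard-clipping it toward a square wave lowers the crest factor
# ("louder" average level, flatter waveform).
sine = [math.sin(2 * math.pi * i / 100) for i in range(10000)]
clipped = [max(-0.5, min(0.5, s)) for s in sine]

cf_sine = crest_factor_db(sine)      # ~ 3.01 dB
cf_clipped = crest_factor_db(clipped)  # lower: "louder" but flatter
```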

There was the comical case of Guitar Hero, where the digital recordings shipped with the game were of better (dynamic range) quality than the versions available on CD or as discrete, available-for-purchase digital formats (MP3, AAC, etc.).

Comment Re:Cameras only a partial solution (Score 1) 368

Or....... not carrying guns at all.

This is highly effective in several countries around the world, but it does have one key criterion: the availability of firearms to criminals and/or the general public has to be low initially for this to be an effective policy.

And I believe that in nearly all countries where regular police / peace officers do not carry a firearm, there are special units that can be activated in the rare event of (suspected) firearm / deadly-weapon usage, widespread violence, or mass rioting.

In my youth I was told by a police officer, during a tour of police facilities, that they were trained to draw their weapon only to fire it or to clean it. To the best of my knowledge, based on my own very limited experience, the majority of officers I have seen still operate under that basic premise. A firearm is a means of lethal force to be used only as a last resort. It is not perfect, but I do believe it has led to far more lives being saved on both sides than the alternative of officers drawing their weapons sooner as a method of deterrence or preparation.

I support law enforcement officers' goal of always making it home alive, but I also value their efforts in not escalating scenarios and in respecting the lives of others.

Comment Re:Not the holder's money (Score 1) 98

If the University is fining them instead of blocking their access and is failing to prevent the copyright violations that it is benefiting financially from

Some universities already have copyright clearance agreements in place, due to concerns about copyrighted material being duplicated in libraries. These agreements may allow the university or library to generate income as a means of recovering the expenses of administering the program, and as an incentive for enforcement.

Since approximately 1% (or less than $1000 total, divided amongst all the Top of the Pops artists for the past 6 years) of the proceeds would likely end up being paid to the artists, songwriters, and/or performers, what difference does it actually make?

Comment Re:Adblock plus is free (Score 1) 319

Think of it less as a way to avoid ads, more of a way for your favourite sites to stay in business.

Unfortunately, I'm not certain how many of the IT / technology websites are worth subscribing to. Too many of them are already hollow shills, with writers and "editors" who lack technical skills, literary skills, or both. The scarcity of journalism, professionalism, and ethics makes me wonder whether they would just continue to produce more "sponsored content", which is merely advertising sugar-coated as content: new product info, amazingly uncritical glowing reviews, verbatim reprinting of marketing material. I would, in effect, be paying to read/view advertisements. That's pretty much why people stopped paying for cable as soon as they could: why pay to get content filled with marketing? Let alone the growth of product placement / endorsement in prime-time television.

For websites where the user community or user base is the actual value, then how much money should an active user who already contributes their time and knowledge and/or creativity, be expected to pay on top of their donated time and effort?

And I'm not a cheapskate; I have repeatedly paid for print subscriptions to magazines that were 100% reader-funded (i.e. no advertisements). A small number succeed, and others devolve into, or get absorbed by, what they were trying to avoid (e.g. Christopher Schwarz's Woodworking Magazine, Citizen Science Quarterly, etc.). One thing that is certain is that they have some of the least biased product reviews.

Comment Re: The Cause (Score 2) 111

BTW, it would be kind of awesome if the computer hardware testing sites incorporated sound tests into their general testing of stuff.

You mean like this:
  Tom's Hardware: Sapphire's Vapor-X R9 290X 8GB - Temperature, Noise And Power.

Actually, I continually get frustrated by "enthusiast" computer site reviews that seem entirely lacking in technical knowledge when it comes to anything beyond quoting the manufacturer's press material. Half of them might as well have a companion site reviewing shoes and fashion trends, given their display of technical ignorance.

Comment Re:The Cause (Score 1) 111

Naïve question maybe, but couldn't some sort of lacquer be applied on wires to prevent them from physically moving?

For small-gauge magnet wire, it often is used on better components, but it is not perfect.

Better components tend to cost more, and for commodity-priced* products like video cards, saving a few cents can be considered worthwhile.

*) A lower bill-of-materials cost can be used to pass some or all of the savings on to the consumer. For price-sensitive consumers, the company with the cheapest of a set of seemingly similar products (e.g. video cards with chip Y9000) can end up selling 30-400% more units, thereby increasing net profits even with lower profit margins per unit.
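To make the footnote's arithmetic concrete, a toy example with entirely hypothetical prices, costs, and volumes:

```python
def net_profit(price, unit_cost, units):
    """Total profit: per-unit margin times units sold."""
    return (price - unit_cost) * units

# Hypothetical: keeping a $20/unit margin at modest volume...
premium = net_profit(price=200, unit_cost=180, units=10_000)

# ...versus shaving the BOM by $2, cutting the price by $5
# ($17/unit margin), and doubling volume by being the cheapest
# seemingly-similar product on the shelf.
cheapest = net_profit(price=195, unit_cost=178, units=20_000)
```

With these made-up numbers the cheaper product nets more total profit despite the thinner per-unit margin, which is the whole incentive the footnote describes.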

Comment Writing in natural language(s) (Score 1) 223

I agree with the idea of studying mathematics as a useful exercise; it would in many cases benefit programmers by giving them a good mental workout, and hopefully reinforce, if not expand, their understanding of mathematics, logic, and reasoning.

Beyond that I would argue for the study of writing, in a natural (human-oriented) language of your choice.

Programming, as a profession and as an art, is about the meaningful expression of ideas in a detailed, unambiguous manner that can be processed by a computer. Programming languages are tiny, simplistic, and restrictive in their ability to express ideas and the execution of those ideas. Writing in a natural language is much more complex, particularly when you strive to remove undesired ambiguity*. The other issue is that, as a professional, programming is not done in isolation. Even if you are an independent contractor, you must be able to communicate effectively with clients and users.

*) Ambiguity can be desirable in humor and poetry.

I think that any programmer can benefit from the ability to make logically sound, comprehensible arguments in a written document, and that this ability will make them better at understanding, and being understood by, users, customers, and colleagues.

The argument has been made in the past by Steven C. McConnell in Code Complete, by Andrew Hunt and David Thomas in The Pragmatic Programmer, by Jeff Atwood on Coding Horror, and by Joel Spolsky (of Joel on Software) in his introduction to The Best Software Writing I and in College Advice. And by tons of other software developers, and their managers, repeatedly.

You see, communication is the only really important aspect of software development that people really have trouble with. The rest are details and small bugs; for really big screw-ups you need miscommunication (or greed).

Comment S/A vs. C/A & P codes (Score 1) 236

It didn't take long, though, for commercial providers of GPS services to start complaining about the system's "selective availability" which reserved access to the best, most precise signals for the U.S. military.

Actually, the most precise signals (the Precision (P) code) are still restricted, even though selective availability (which basically introduced jitter) was turned off for the Coarse/Acquisition (C/A) code.
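Part of why the P code is inherently more precise is simple arithmetic: it chips ten times faster than the C/A code (10.23 Mchip/s vs. 1.023 Mchip/s), so each chip spans a tenth of the distance. A quick sketch:

```python
# Each spreading-code chip travels at the speed of light, so the
# chip "length" in meters is c divided by the chipping rate.
C = 299_792_458      # speed of light, m/s
CA_RATE = 1.023e6    # C/A code chipping rate, chips/s
P_RATE = 10.23e6     # P code chipping rate, chips/s

ca_chip_m = C / CA_RATE  # ~293 m per C/A chip
p_chip_m = C / P_RATE    # ~29.3 m per P chip
```

Receivers resolve range to a small fraction of a chip, but the shorter P-code chip gives the code-tracking loop a proportionally finer ruler to start from.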

Comment Re:To infinity and beyond! (Score 3, Informative) 132

What leaks? [...] I think the memory leak meme has outlived reality...

That just means it's gone gold, as far as Internet memes are concerned. If an Internet "meme" can remain in use past the natural lifespan or relevance of its subject, some people mistakenly think that makes it funny.

grumble, grumble Al Gore invented the Internet @(&*@) The Internet is ... a series of tubes *&^^$%^)*#@ 640k[i]B is enough memory for anyone #$@#$@*& BSD is dying !$%#@#)

Comment Re:And the U.S. falls further behind (Score 1) 125

And no source for his (Cliff Mass's) claim of performance. As far as I know, the US National Weather Service (NWS) in fact operates multiple clusters; I don't think they have any classic singular "supercomputers," but then again, neither does anyone else anymore, since the original Cray supercomputer heyday.

The various models are run on several clusters, AFAIK. I believe the North American Mesoscale (NAM) and Global Forecast System (GFS) models may run on the primary operational cluster, but I was under the impression that other models like the Rapid Refresh and High-Resolution Rapid Refresh (RAP/HRRR) were run on a different cluster. I believe climate models are run on separate ("non-operational forecast") clusters, as they don't have the same timeliness constraints. I'm unsure about the oceanographic (wave, sea-surface temperature) models. See their Environmental Modeling Center.
