
Comment: Re:90 days may be a little short (Score 1) 261

by plcurechax (#48831413) Attached to: Google Releases More Windows Bugs

but in principle I agree with what Google is doing. In effect they are trying to destroy the market for zero-day exploits and forcing the companies involved not to sit on their hands and hope that nobody, such as cybercriminals or the various three-letter agencies, uses them.

From the article:

In the bug tracker for the impersonation vulnerability, Google said it had queried Microsoft on Wednesday, asking when the flaw would be patched and reminding its rival that the 90 days were about to expire.

"Microsoft informed us that a fix was planned for the January patches but [had] to be pulled due to compatibility issues," the bug tracker stated. "Therefore the fix is now expected in the February patches."

The next Patch Tuesday is scheduled for Feb. 10.

So 90 days is an appropriate time to wait but not 106 days?

Here is what Google used to say (circa 2010), from most of the same people who make up the Project Zero team (Chris Evans, Michal Zalewski, and others), AFAIK.

Rebooting Responsible Disclosure: a focus on protecting end users:

Update September 10, 2010: We'd like to clarify a few of the points above about how we approach the issue of vulnerability disclosure. While we believe vendors have an obligation to be responsive, the 60 day period before public notification about critical bugs is not intended to be a punishment for unresponsive vendors. We understand that not all bugs can be fixed in 60 days, although many can and should be. Rather, we thought of 60 days when considering how large the window of exposure for a critical vulnerability should be permitted to grow before users are best served by hearing enough details to make a decision about implementing possible mitigations, such as disabling a service, restricting access, setting a killbit, or contacting the vendor for more information. In most cases, we don't feel it's in people's best interest to be kept in the dark about critical vulnerabilities affecting their software for any longer period.

Somewhere along the way they appear to have lost their senses, enshrining 90 days as some written-in-stone deadline that makes no sense and runs counter to their stated objectives.

Announcing Project Zero

... Our objective is to significantly reduce the number of people harmed by targeted attacks. ...We will only report bugs to the software’s vendor—and no third parties. Once the bug report becomes public (typically once a patch is available), you’ll be able to monitor vendor time-to-fix performance, see any discussion about exploitability, and view historical exploits and crash traces.

Comment: Don't miss next week's episode... (Score 1) 119

Where the FBI submits a sworn affidavit to the New Zealand courts that Kim DotCom is the Dread Pirate Roberts, in a bid to further his extradition to the US, because surely those sheep-loving Kiwis can't possibly resist the War on Drugs(tm) as a legitimate reason to let the MPAA/RIAA go after Kim DotCom for digital piracy[1].

If he wasn't under so much financial pressure (freezing of assets) I'd expect him to make a press release suggesting it himself.

But the conspiracy theorists will posit that John McAfee is the real Dread Pirate Roberts. I mean, he was found in Belize of all places. What do you think he was really doing there? Creating his second, pseudonymous fortune, this time without the IRS insisting on payments. Hell, half of the software multimillionaires who have been in tiffs with the IRS themselves would likely support his venture on the down-low.

[1] Okay, infringement of intellectual property doesn't have the same sense of dire urgency, does it?

Comment: Re:Very disturbed by tag "writeorexecute" (Score 1) 84

by plcurechax (#48813525) Attached to: OpenBSD's Kernel Gets W^X Treatment On Amd64

Well, you're right from a formal logic perspective. In spoken languages, though, there's often an implicit 'either' attached to the 'or', causing 'or' to essentially mean 'xor'.

Yes, everyone should be expected to go read Principia Mathematica before posting to Slashdot, far better than any captcha in use today.
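A tiny Python sketch of the distinction, including the pedantic wrinkle that W^X, read strictly, is a NAND rather than an XOR (a page with neither permission also satisfies the policy):

```python
def wx_allowed(writable: bool, executable: bool) -> bool:
    """OpenBSD's W^X policy: a page may be writable or executable, but
    never both. Strictly this is NAND, since a page with neither
    permission is also fine."""
    return not (writable and executable)

def xor(a: bool, b: bool) -> bool:
    """Exclusive or: true when exactly one operand is true; the sense
    the spoken 'either ... or' usually carries."""
    return a != b
```

Note the difference on the all-false case: `xor(False, False)` is False, while `wx_allowed(False, False)` is True, so the tag "writeorexecute" is loose no matter which "or" you read into it.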

Comment: Re:Virtualisation dates from the 1960's ! (Score 1) 180

by plcurechax (#48813467) Attached to: The Legacy of CPU Features Since 1980s

The first large scale availability of virtualisation was with the IBM 370 series, dating from June 30, 1970, but it had been available on some other machines in the 1960's.

So the idea that "newer machines have support for virtualisation" is a bit old.

This point has been made since the first virtualization software on microcomputers was being experimented with. Those who don't know history are doomed to repeat it (or something similar, depending on how diligent your citation tracking is).

I'm still waiting for someone to tell us that IBM discovered perceptually acceptable lossy compression, such as JPEG, MP3, and MPEG, back in the mid-1960s mainframe era to generate images and videos for punchcard distribution.

And Xerox PARC labs had a portable MP3 player prototype with a seamless white case with a steering-wheel styled interface, locked in its vaults of time.

Comment: Re:Anyone else concerned? (Score 1) 164

by plcurechax (#48813277) Attached to: Man Saves Wife's Sight By 3D Printing Her Tumor

but doctors act a lot more like technicians than scientists or researchers.

Doctors are much more like technicians. You don't want doctors "experimenting" on you unless you really, really need that.

To clarify: the doctors or physicians you are referring to are medical practitioners, in medical parlance. There are two additional medical "communities," which are linked: the medical teaching and research specialties, though these two tend to be more intertwined. In many cases they share hospitals, labs, and institutions.

Physicians are typically not brought up in a 'science' environment (questioning assumptions, learning how to research a topic, critical thinking). Doctors are brought up in 'cram mode': dump a lot of info down your throat, and you're expected to believe it. They are increasingly taught to 'follow the protocol', which, amazingly, is what technicians do.

That is a gross over-generalization. A good physician is trained to be scientifically minded: to take careful observations (utilizing medical testing), to question faulty assumptions and correlations, and to be critical in what they do. They are expected to learn and memorize a large body of knowledge that they will likely need on the job daily, and medicine was, AFAIK, the first profession to have formal continuing-education requirements for keeping a license in many jurisdictions. All biomedical scientists follow a protocol so that they have a consistent and reliable testing methodology, to reduce mistakes, to be as objective as possible, and to make results comparable.

Yes, there are 'physician scientists' but they aren't treating the majority of patients and you don't want them to be ('hey that looks interesting, what happens when I tug on it?').

If you are being treated by a medical researcher, then either there is no known effective or reliable treatment, or they are testing a new, hopefully better, treatment. It means you are the test subject; normally not an ideal situation.

This case is interesting as the husband of the patient kicked the docs out of 'technician' mode. And, of course, used a 3D printer.

ALWAYS ask your doc questions about stuff you don't understand.

Interesting, yes, but it bugs me in that I fear vaccine-safety deniers, and those who want to consumerize their medical experience ("the customer is always right" is a horrible mantra for any legitimate medical practice), will use it as evidence to vindicate their positions. Most of the medical drama was in fact about miscommunication, inconsistent practice, and the need to be your own advocate for medical treatment.

From working with physical scientists, I know that data in three and more dimensions is still often hard to interpret, even with advanced computer visualization techniques. While the results can sometimes be made to look pretty, that has little correlation with how quickly and easily the relevant information can be extracted from the visualization.

Comment: Re:Chrony (Score 1) 79

by plcurechax (#48790357) Attached to: OpenBSD Releases a Portable Version of OpenNTPD

So it's fair to say that Chrony isn't suitable for running on stratum 1 servers, of which there are a few hundred, maybe at most a few thousand, publicly available in the world[1]. For the millions of Linux servers, laptops and desktops that aren't and will never be stratum 1 NTP servers, Chrony should be just fine, shouldn't it?

Yes, I think Chrony is fine for most typical unauthenticated leaf-node (client-only) usage, but I still don't recommend it for the thousands of public stratum 2 or higher servers (yes, most are stratum 2 or 3), or the thousands of corporate and organizational NTP servers. For use as a server with a full-time network connection, I don't know of any compelling reason to choose Chrony over NTPD or OpenNTPD.

Personally I can't see any reason to believe Chrony is more secure than either NTPD or OpenNTPD. Being newer, claiming to be secure, and programming really, really carefully don't make it so.

Comment: Re:Mathematics (Score 2) 79

by plcurechax (#48775381) Attached to: OpenBSD Releases a Portable Version of OpenNTPD

Chrony is a complete working implementation of the NTP protocol.

You mean complete except for broadcast/multicast mode and authentication based on public-key cryptography. So basically it's a good client and an unauthenticated, (network-)inefficient server.

It also makes some pretty misleading claims, e.g. that "chrony can usually synchronise the system clock faster and with better time accuracy," except it never explains how it can possibly achieve better time accuracy than NTPD.

Chrony does handle a number of client usage scenarios better than NTPD (namely non-permanent network connection, and laptop-like environments) as far as I know, but it does not achieve better accuracy for the usage scenarios NTPD was primarily designed for (e.g. network connected servers).
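For a sense of how little the client side of the protocol actually requires, here is a sketch of the SNTP subset (RFC 4330); the packet layout and epoch constant are standard, but this omits all of full NTP's filtering, clock discipline, and authentication:

```python
import struct

# Seconds from the NTP epoch (1900-01-01) to the Unix epoch (1970-01-01).
NTP_EPOCH_OFFSET = 2208988800

def build_sntp_request() -> bytes:
    """Build the minimal 48-byte client request: LI=0, VN=3, Mode=3
    packed into the first byte; every other field may be zero."""
    packet = bytearray(48)
    packet[0] = 0x1B  # 0b00_011_011: leap 0, version 3, mode 3 (client)
    return bytes(packet)

def parse_transmit_time(reply: bytes) -> float:
    """Extract the server's 64-bit transmit timestamp (bytes 40-47 of
    the reply) and convert it to Unix time as a float."""
    secs, frac = struct.unpack("!II", reply[40:48])
    return secs - NTP_EPOCH_OFFSET + frac / 2**32
```

Sending `build_sntp_request()` over UDP to port 123 of any NTP server and feeding the 48-byte reply to `parse_transmit_time()` yields a usable wall-clock reading, which is essentially all an SNTP-grade client does.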

NTPD gets its knickers in a twist at the slightest excuse and sometimes ends up stepping the time even though it has perfectly good Internet connectivity and a reasonably good internal clock.

Yet Chrony can't detect rogue time servers or fix broken ones, beyond possibly having better handling of dynamic clock frequencies (i.e. SpeedStep and the various other power-saving features that modify one or more of a computer's several frequency oscillators). I say possibly because I am not certain of the state of the current NTPD code base; I know it was lacking when dynamic clock frequencies originally appeared in systems, but I am not sure whether it is still naive about that.

Chrony keeps steady time even if Internet access is intermittent. It never gets confused and picks a falseticker pretending to be stratum one instead of a stratum 3 with correct time, unlike NTPD.

While it does appear Chrony has improved greatly from the simple SNTP client for intermittent network connectivity that it was when I first heard of it, that is still its forté, and it is likely the best client for many end-users' cases. Still, it is not a robust general-purpose replacement for NTPD.
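For context, both NTPD and Chrony estimate a client's clock offset and the network delay from the same standard four-timestamp exchange (RFC 5905); a minimal sketch, with made-up numbers:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Offset and round-trip delay from one request/response exchange:
    t1 = client send, t2 = server receive, t3 = server send,
    t4 = client receive (all in seconds). Assumes symmetric path
    delays, which is why asymmetric routes degrade accuracy."""
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Made-up exchange: client clock 0.5 s slow, 0.1 s of network each way.
offset, delay = ntp_offset_delay(t1=10.0, t2=10.6, t3=10.6, t4=10.2)
```

The daemons differ in what they do with many such samples (filtering, falseticker detection, disciplining the clock frequency), not in this core computation.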

It even has interfaces to GPS clocks or other hardware clocks, so you can run your stratum 1 server on Chrony if you want.

And YouTube is full of people doing stupid, reckless, and/or unwise things too. That's perhaps too harsh, but those "features" are quite incomplete.

Having optional PPS (Pulse Per Second) support is a good start, but it is not a comprehensive solution for running a quality stratum 1 server. I expect a stratum 1 server to have improved, or at least quantified, oscillator ("clock") parameters; ideally a TCXO (Temperature-Compensated Crystal Oscillator) or OCXO (Oven-Controlled Crystal Oscillator) for the stratum 1 system's timekeeping. For commercial systems I would suggest looking at a professional NTP server network appliance; there are several vendors, including Spectracom, Symmetricom, Meinberg, and others.
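The oscillator grade matters because frequency error accumulates into time error; converting parts-per-million into drift per day is simple arithmetic (the ppm figures below are rough illustrative grades, not vendor specifications):

```python
def drift_per_day(ppm: float) -> float:
    """Worst-case accumulated time error, in seconds per day, for a
    free-running oscillator with the given frequency error in ppm."""
    return ppm * 1e-6 * 86400  # 86400 seconds in a day

# Illustrative oscillator classes (actual figures vary by vendor):
for name, ppm in [("cheap PC crystal", 50), ("TCXO", 1), ("OCXO", 0.01)]:
    print(f"{name}: ~{drift_per_day(ppm) * 1000:.1f} ms/day")
```

Even 1 ppm is roughly 86 ms of drift per day if left undisciplined, which is why a stratum 1 box wants a quantified, stable oscillator between PPS corrections.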

Comment: Re:Not surprising... (Score 3, Informative) 278

by plcurechax (#48720105) Attached to: Vinyl's Revival Is Now a Phenomenon On Both Sides of the Atlantic

Of course, its audio quality compared to a CD is debatable [...]

No, it isn't debatable. Due to the physical limitations of cutting a groove into the record surface and reading it back with a stylus, vinyl recordings ("LP" or other form factors such as 7-inch 45s) are physically constrained, preventing the recording of some low-frequency sounds and effects. Such sounds and effects are/were featured in electronic music ("techno", "dance", etc.). This was the reason behind the RIAA equalization curve used to de-emphasize the bass frequencies: it allowed closer spacing of the grooves, which lengthened playing time, the major justification / selling point of the LP format. There are also pre- and post-echoes of loud passages when preceded / followed by a very quiet section. Vinyl is an analog recording medium using techniques developed in the 1950s, and it suffers from numerous physical limitations, with no inherent noise reduction or error correction possible, so the vinyl format has absolutely no objective superiority in accurate sound reproduction.
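For the curious, the RIAA playback curve can be sketched numerically from its standard time constants (poles at 3180 µs and 75 µs, zero at 318 µs); the roughly ±19-20 dB swings at the band edges show just how much bass had to be cut when cutting the groove:

```python
import math

# Standard RIAA playback (de-emphasis) time constants. The recording
# pre-emphasis applied when cutting the disc is the inverse curve.
T1, T2, T3 = 3180e-6, 318e-6, 75e-6  # pole, zero, pole (seconds)

def riaa_playback_db(f: float) -> float:
    """Playback gain in dB at frequency f, normalized to 0 dB at 1 kHz."""
    def mag(freq):
        w = 2 * math.pi * freq
        # |1 + jwT2| / (|1 + jwT1| * |1 + jwT3|)
        return math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3))
    return 20 * math.log10(mag(f) / mag(1000.0))

print(round(riaa_playback_db(20), 1))     # ~ +19.3 dB: bass restored on playback
print(round(riaa_playback_db(20000), 1))  # ~ -19.6 dB: treble cut on playback
```

The bass was attenuated by that much before cutting (keeping groove excursions small) and boosted back on playback, which is exactly the groove-spacing trade-off described above.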

There is one complicating factor, which is not inherent in the vinyl format itself. Modern ("revival") LPs do excel in that they often use a better-quality final mix with a wider dynamic range, whereas final mixes for CDs and digital formats are typically highly compressed or over-compressed (because "louder" is intuited as "better" by auditory perception, the basis of the "Loudness Wars") before being transferred for commercial duplication.

Some well-mastered CDs and digital recordings (retaining the full dynamic range between quiet and loud passages) do exist, but sadly too many studios still over-compress their recordings.
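One rough, illustrative way to quantify that compression is the crest factor, the peak-to-RMS ratio of the signal; it is only a proxy for formal dynamic-range measurements, but it captures the idea:

```python
import math

def crest_factor_db(samples) -> float:
    """Peak-to-RMS ratio in dB: a rough proxy for how compressed a
    master is. Hard-limited 'loud' masters sit low; dynamic masters
    sit much higher."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pure sine has a crest factor of ~3.01 dB; clipping it flat into a
# square wave (the limiting case of brick-wall limiting) drives it to 0 dB.
sine = [math.sin(2 * math.pi * n / 100) for n in range(100)]
square = [1.0 if s >= 0 else -1.0 for s in sine]
```

Heavy limiting pushes every passage toward the peak level, so the crest factor collapses even though the meter reads "louder".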

There was the comical case of Guitar Hero, where the digital recordings shipped with the game were of better (dynamic-range) quality than those available on CD or as separately purchasable digital downloads (MP3, AAC, etc.).

Comment: Re:Cameras only a partial solution (Score 1) 368

by plcurechax (#48668513) Attached to: Study: Police Body-Cams Reduce Unacceptable Use of Force

Or....... not carrying guns at all.

This is highly effective in several countries around the world, but it does have one key criterion: the availability of firearms to criminals and/or the general public has to be low initially for this to be an effective policy.

And I believe that in nearly all countries where regular police / peace officers do not carry a firearm, there are special units that can be activated in the rare event of (suspected) firearm / deadly-weapon usage, or of widespread violence or mass rioting.

In my youth I was told by a police officer, during a tour of police facilities, that they were trained to draw their weapon only to fire it or to clean it. To the best of my knowledge, based on my own very limited experience, the majority of officers I have seen still operate under that basic premise: a firearm is a means of lethal force to be used only as a last resort. It is not perfect, but I do believe it has led to far more lives being saved on both sides than the alternative of officers drawing their weapons sooner as a method of deterrence or preparation.

I support law enforcement officers' goal of always making it home alive, but I also value their efforts in not escalating scenarios and in respecting the lives of others.

Comment: Re:Not the holder's money (Score 1) 98

by plcurechax (#48436207) Attached to: UNSW Has Collected an Estimated $100,000 In Piracy Fines Since 2008

If the University is fining them instead of blocking their access and is failing to prevent the copyright violations that it is benefiting financially from

Some universities already have copyright clearance agreements in place due to concerns about copyrighted material being duplicated in libraries. These agreements may allow the university or library to generate income as a means of recovering the costs of administering the program, and as an incentive for enforcement.

Since approximately 1% of the proceeds (less than $1000 total, divided amongst all the Top of the Pops artists of the past 6 years) would likely end up being paid to the artists, songwriters, and/or performers, what difference does it actually make?

Comment: Re:Adblock plus is free (Score 1) 319

by plcurechax (#48435617) Attached to: Google Launches Service To Replace Web Ads With Subscriptions

Think of it less as a way to avoid ads, more of a way for your favourite sites to stay in business.

Unfortunately I'm not certain how many of the IT / technology websites are worth subscribing to. Too many of them are already hollow shills, with writers and "editors" who lack technical or literary skills, if not both. The scarcity of journalism, professionalism, and ethics makes me wonder whether they would just continue to produce more "sponsored content": advertising sugar-coated as content, whether new-product info, amazingly uncritical glowing reviews, or verbatim reprinting of marketing material. In effect I would be paying to read/view advertisements. That's pretty much why people stopped paying for cable as soon as they could: why pay to get content filled with marketing? Let alone the growth of product placement / endorsement in prime-time television.

For websites where the user community or user base is the actual value, how much money should an active user, who already contributes their time, knowledge, and/or creativity, be expected to pay on top of their donated time and effort?

And I'm not a cheapskate; I have repeatedly subscribed to print magazines that were 100% reader-paid (i.e. no advertisements). A small number succeed, and others devolve or get absorbed into what they were trying to avoid (e.g. Christopher Schwarz's Woodworking Magazine, Citizen Science Quarterly, etc.). One thing that is certain is that they have some of the least biased product reviews.

Comment: Re: The Cause (Score 2) 111

by plcurechax (#48389927) Attached to: An Applied Investigation Into Graphics Card Coil Whine

BTW, it would be kind of awesome if the computer hardware testing sites incorporated sound tests into their general testing of stuff.

You mean like this:
  Tom's Hardware: Sapphire's Vapor-X R9 290X 8GB - Temperature, Noise And Power.

Actually I continually get frustrated by "enthusiast" computer site reviews that seem to be entirely lacking in technical knowledge when it comes to anything beyond quoting the manufacturer's press material. Half of them might as well have a companion site reviewing shoes and fashion trends, given their display of technical ignorance.

Comment: Re:The Cause (Score 1) 111

by plcurechax (#48389895) Attached to: An Applied Investigation Into Graphics Card Coil Whine

Naïve question maybe, but couldn't some sort of lacquer be applied on wires to prevent them from physically moving?

For small-gauge magnet wire, lacquer often is used on the better components, but it is not perfect.

Better components tend to cost more, which for commodity priced* products like video cards, saving a few cents can be considered worthwhile.

*) A lower bill-of-materials cost can be used to pass some or all of the savings on to the consumer. For price-sensitive consumers, the company with the cheapest of a set of seemingly similar products (e.g. video cards with chip Y9000) can end up selling potentially 30-400% more units, thereby increasing net profits even with lower profit margins per unit.
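With hypothetical numbers, the arithmetic looks like this:

```python
def net_profit(units: int, price: float, unit_cost: float) -> float:
    """Total profit: volume times per-unit margin."""
    return units * (price - unit_cost)

# Hypothetical figures: trimming the BOM by $2 and the street price by $3
# cuts the per-unit margin from $10 to $9, but if being the cheapest of
# the seemingly identical cards lifts volume by 50%, total profit still rises.
base  = net_profit(units=100_000, price=150, unit_cost=140)  # $1,000,000
cheap = net_profit(units=150_000, price=147, unit_cost=138)  # $1,350,000
```

So a few cents saved on a coil's lacquer can look rational on a spreadsheet even if it ships a whinier card.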

Comment: Writing in natural language(s) (Score 1) 223

by plcurechax (#48389863) Attached to: Ask Slashdot: Programming Education Resources For a Year Offline?

I agree with the idea to study mathematics, as a useful exercise, that would in many cases benefit programmers by giving them a good mental workout, and hopefully reinforce if not expand their understanding of mathematics, logic, and reasoning.

Beyond that I would argue for the study of writing, in a natural (human-oriented) language of your choice.

Programming as a profession, and as an art, is about the meaningful expression of ideas in a detailed, unambiguous manner that can be processed by a computer. Programming languages are tiny, simplistic, and restrictive in their ability to express ideas and the execution of those ideas. Writing in a natural language is much more complex, particularly when you strive to remove undesired ambiguity*. The other issue is that, as a professional, programming is not done in isolation. Even if you are an independent contractor, you must be able to communicate effectively with clients and users.

*) Ambiguity can be desirable in humor and poetry.

I think that any programmer can benefit from the ability to make logically sound, comprehensible arguments in a written document, and that this ability will make them better able to understand, and be understood by, users, customers, and colleagues.

The argument has been made in the past by Steve McConnell in Code Complete, by Andrew Hunt and David Thomas in The Pragmatic Programmer, by Jeff Atwood on Coding Horror, and by Joel Spolsky (of Joel on Software) in his introduction to The Best Software Writing I and in College Advice. And by tons of other software developers and their managers, repeatedly.

You see, communication is the only really important aspect of software development that people really have trouble with. The rest are details and small bugs; for really big screw-ups you need miscommunication (or greed).

Comment: S/A vs. C/A & P codes (Score 1) 236

by plcurechax (#48313615) Attached to: The Plane Crash That Gave Us GPS

It didn't take long, though, for commercial providers of GPS services to start complaining about the system's "selective availability" which reserved access to the best, most precise signals for the U.S. military.

Actually, the most precise signals (the Precision (P) code) are still restricted, even though selective availability (which basically introduced jitter) was turned off for the Coarse/Acquisition (C/A) code.
