Every laptop has one built in.
Harmony in music is based almost directly on the simplicity of the ratios between the frequencies of the notes in a chord.
Octave = 1/2
Fifth = 2/3
Fourth = 3/4
Major Third = 4/5
Minor Third = 5/6
and so on.
There are certain cultural anomalies. For example, our preference for three notes in a simple chord (first, third and fifth) means that fourths are generally considered slightly more disharmonious than thirds, due to their relationship to the third and the fifth.
Also, the intervals in most instruments are fudged slightly to make them work in any key (equal temperament). This practice was popularized around Bach's time, I believe.
The point, of course, is that it is not that surprising that harmony is more universal than human culture: the mathematics that underlies harmony is.
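The poster writes each interval as the lower note's share of the upper note's frequency; the sketch below uses the equivalent frequency multipliers (a fifth multiplies frequency by 3/2) and compares them with the equal-tempered "fudged" versions, where every semitone is the same factor of 2^(1/12). The A4 = 440 Hz reference pitch is my own assumption, not from the comment.

```python
# Compare just-intonation intervals (simple frequency ratios) with
# 12-tone equal temperament, where each semitone multiplies frequency
# by 2**(1/12). BASE is an assumed reference pitch (A4 = 440 Hz).

BASE = 440.0  # Hz

just_ratios = {
    "octave": 2 / 1,
    "fifth": 3 / 2,
    "fourth": 4 / 3,
    "major third": 5 / 4,
    "minor third": 6 / 5,
}

# Number of equal-tempered semitones spanning each interval.
semitones = {"octave": 12, "fifth": 7, "fourth": 5,
             "major third": 4, "minor third": 3}

for name, ratio in just_ratios.items():
    et_ratio = 2 ** (semitones[name] / 12)
    print(f"{name:12s} just={BASE * ratio:8.2f} Hz  "
          f"equal-tempered={BASE * et_ratio:8.2f} Hz")
```

The equal-tempered fifth (about 1.4983) is audibly close to the just 3/2, which is why the fudge works: every key is slightly and equally out of tune, rather than some keys being badly so.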
Author: Jack Freund and Jack Jones
Reviewer: Ben Rothke
Summary: A superb overview of the powerful FAIR risk management methodology
It's hard to go a day without encountering some sort of data about information security and risk. Research from firms like Gartner is accepted without question, even though its results can come from untrusted and unvetted sources.
The current panic around Ebola shows how ill-informed people are about risk. While obsessing over Ebola, the media is oblivious to true public health threats like obesity, heart disease, drunk driving, diabetes, and the like.
When it comes to information security, the situation is not much better. With myriad statistics, surveys, data breach reports, cost-of-data-breach analyses and the like, there is an overabundance of data, and an underabundance of meaningful data.
In Measuring and Managing Information Risk: A FAIR Approach, authors Jack Freund and Jack Jones have written a magnificent book that will change the way (for the better) you think about and deal with IT risk.
The book details the factor analysis of information risk (FAIR) methodology, which is a proven and credible framework for understanding, measuring, and analyzing information risk of any size or complexity.
An Open Group standard, FAIR is a methodology and a highly effective quantitative analysis tool.
The power of FAIR is immense: it enables the risk practitioner to make well-informed decisions based on meaningful measurements. While that seems obvious, in practicality, it is a challenging endeavor.
FAIR is invaluable in that it helps the risk professional understand the language that the corporate board and senior executives speak. Understanding that and communicating in their language can make it much easier for information security to be perceived as a valued asset, as opposed to using Chicken Little statistics.
FAIR takes the risk professional out of the realm of dealing with risk via checklists, which only serve to produce meaningless measurements, and into the world of quantitative, defendable results.
For those that are looking for a tool to create pretty executive summary charts with lots of colors, FAIR will sorely disappoint. For those that are looking for a method to understand how to calculate quantitative risk to support a formal enterprise risk management program, they won't find a better guide than this book.
The book is an incredibly good reference that will force you to look again at how you view risk management.
Jones writes in the preface that the book is not about checklists and formulas, but about critical thinking.
The authors note that information security and operational risk has operated for far too long as an art, with not enough science. This is the gap that FAIR attempts to fill.
The authors write that risk decision making quality boils down to the quality of information decision makers are operating from, and the decision makers themselves. The book does a remarkable job of showing how a person can become a much better decision maker.
A subtle but important point the book makes early on is that many risk professionals confuse risk possibilities with risk probabilities. The FAIR method forces you to focus on probabilities and not to obsess over Ebola-like possibilities. Such a quantitative analysis approach is what makes FAIR so beneficial.
The book spends a few chapters going through the FAIR risk ontology and terminology. Inconsistent and poorly defined terminology is one of the most significant challenges the information security and operational risk profession faces. Having a consistent set of logical terms and definitions that make up the FAIR framework significantly improves the quality of risk-related communications within an organization.
The value of having a consistent set of logical terms and definitions is significant. For example, the book notes that many people use the term threat. In the context of risk analysis, it might not be a real threat if there is no resulting loss. In that case, it would be considered a vulnerability event.
The challenge of FAIR is acclimating to its dialect. But once done, it creates an extremely powerful methodology for risk communication and management. And therein lies its power. Setting up a common framework for risk management becomes an invaluable tool for presenting risk ideas. In addition, it makes the findings much more objective and defendable.
In chapter 5, the authors address the biggest objections to quantitative risk management: that risk can't be measured, or is simply unknowable. They agree that risk can't be measured at the micro level, but it can be measured effectively enough to reduce management's uncertainty about risk.
They also importantly note that risk is a forward-looking statement about what may or may not come to pass in the future. With that, perfect accuracy is impossible, but effective quantitative risk management is very possible.
The power of FAIR is that it helps add clarity to ambiguous risk situations by giving you the tools to add data points to a situation that is purported to be unknowable.
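FAIR decomposes risk into loss event frequency and loss magnitude. As a rough sketch of the kind of quantitative, probability-focused analysis the book advocates (the triangular distributions and dollar figures below are invented for illustration and are not from the book):

```python
import random

random.seed(42)  # reproducible illustration

def simulate_annual_loss(trials=100_000):
    """Monte Carlo sketch of annualized loss exposure:
    loss event frequency x loss magnitude per event.
    All parameter values are invented for illustration."""
    losses = []
    for _ in range(trials):
        # Loss event frequency: events per year, from a simple
        # min/most-likely/max (triangular) estimate.
        events = round(random.triangular(0, 4, 1))
        # Loss magnitude per event (dollars), also triangular.
        total = sum(random.triangular(10_000, 500_000, 50_000)
                    for _ in range(events))
        losses.append(total)
    losses.sort()
    return {
        "mean": sum(losses) / trials,
        "median": losses[trials // 2],
        "p90": losses[int(trials * 0.9)],  # 90th-percentile annual loss
    }

print(simulate_annual_loss())
```

The output is a distribution rather than a single number, which is exactly the shift the book argues for: "we expect annual losses around $X, and exceed $Y only one year in ten" is a defendable statement in a way that a red/yellow/green checklist rating is not.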
Chapter 8 is an extremely enlightening chapter in that it provides 11 risk analysis examples. The examples do a great job of reinforcing the key FAIR concepts and methods.
In chapter 10, the authors write that the hardest part of learning FAIR is having to overcome bad habits. For most people, FAIR represents a recalibration of your mental model about what risk is and how it works. The chapter deals with common mistakes and stumbling blocks when performing a FAIR analysis. The 5 high-level categories of mistakes the chapter notes are: checking results, scoping, data, variable confusion and vulnerability analysis.
FAIR is a powerful methodology that can revolutionize risk management. The challenge is that it takes a village to make such a change. Management may be reticent to invest in what is perceived as yet another risk management framework.
But once you start using the language of FAIR and validate your findings, astute management will likely catch on. Over time, FAIR can indeed be a risk management game changer.
The book is flawless in its execution and description of the subject. The only critique is that the authors should have been a bit more transparent in the text, especially in chapter 8, when mentioning the FAIR software, since it is their own firm that makes it.
For those that are willing to put in the time to understand FAIR, this book will make their jobs much easier. It will help them earn the trust of senior management, and make them much better risk management professionals in the process.
Reviewed by Ben Rothke
Using a law designed to catch drug traffickers, racketeers and terrorists by tracking their cash, the government has gone after run-of-the-mill business owners and wage earners without so much as an allegation that they have committed serious crimes. The government can take the money without ever filing a criminal complaint, and the owners are left to prove they are innocent. Many give up and settle the case for a portion of their money. “They’re going after people who are really not criminals,” said David Smith, a former federal prosecutor who is now a forfeiture expert and lawyer in Virginia. “They’re middle-class citizens who have never had any trouble with the law.”
The article describes several specific cases, all of which are beyond egregious and are in fact entirely unconstitutional. The Bill of Rights is very clear about this: The federal government cannot take private property without just compensation.
The analogy actually fails because the original poster didn't RTFA. Or even the Slashdot summary in this case.
The cheeseburger only has one contract. (The implicit one in the purchase.) Microsoft requires the purchaser of a PC to agree to a second contract with them AFTER the sale is completed and the goods received from a distinct vendor. (The shop that sold you the computer.)
These private companies are being restricted from their work by a court order. That's an example of regulation; it has nothing to do with "the invisible hand of the free market".
Can you cite any events or references at all to back up that incredibly vague statement?
This could be said of the whole of Europe, the Near East and North Africa. There were two world wars in the previous century alone.
According to the Smithsonian, the region had existed as three separate, stable vilayets within the Ottoman Empire for nearly 400 years. I'm not sure where you're getting your history from.
I see so many posts here using IQ and intelligence as if they were interchangeable synonyms. They are not.
IQ tests have no basis in science. IQ tests have never been benchmarked against anything except earlier IQ tests.
IQ tests cannot be proven to exclude cultural bias.
IQ tests cannot be said to measure intelligence in any precise way, unless you define intelligence as the ability to do IQ tests.
If you demonstrate that different races perform differently in IQ tests, you haven't proven anything about race and intelligence. You have only proven something about race and IQ tests.
... one in every four women actually will be raped in their life
... The industry needs fewer people like you, and more young girls.
Are you sure it's not just you that needs more young girls bud?
And this is news for nerds how?
As does this Mazda 2 prototype with 0.33 litre rotary engine. http://www.autonews.com/articl...
I wonder, was that sample of people taken from a single city/state/country, whatever?
Generalising this to a study of, "People" might be more than a little misleading...
The title (of both the slashdot post and the original article) is misleading.
The article cites one Eugene Spafford, who observes that "software makers churn out products riddled with vulnerabilities." That's not the security industry's fault.
He goes on to tell us that law enforcement is inadequately equipped and that criminals protect themselves by bribing government officials. That's not the security industry's fault either.
Of the tools the security industry does use regularly, he says: "We’re using all these tools on a regular basis because the underlying software isn’t trustworthy." Again, that's not the security industry's fault.
And the solution?
"... an investment in computer programming education and a major move by software manufacturers to embed software security concepts early into the development process."
Sounds reasonable to me. It also sounds like a task for the software development community generally, NOT just those specialising in security.