Comment: Re:Ideals and reality (Score 1) 438

by FallLine (#33703930) Attached to: You Are Not Mark Zuckerberg, So Stay In School

What you are talking about in terms of running a small business is risky. It is risky because when Best Buy's Geek Squad, or any large local operator, sees you as a threat, it won't be long before you are eliminated one way or another. Sometimes it's possible to operate "under the radar" but the fact that anyone would need to do so is a clear indication that one already knows that there are risks of being destroyed by larger, predatory businesses.

I beg to differ. No doubt being an entrepreneur is risky and requires a lot of hard work (in most cases), but the overwhelming majority of those that fail do so for reasons that have little to do with any kind of direct response from the competition (let alone anything unethical), e.g., insufficient capitalization, poor financial management, ill-conceived products/services, bad execution/implementation, etc.

Big companies have a lot more capital and resources to throw at problems, but the reality is that they are often slow to respond to anything and, when they do "respond", their response is based on group-think and a fundamental lack of understanding of the marketplaces in which they operate. These big companies remain successful because they have inertia and a lot of capital to acquire younger companies, not because they are effective at killing the competition or finding new areas for growth.

I have seen this time and time again, firsthand as an entrepreneur and coming from a family of entrepreneurs, many of whom have gone head to head against some of the largest companies in this country and come to be the dominant player in their chosen market, or at least carved out a strong niche for themselves, allowing the many stakeholders to profit handsomely (including employees), without any of the sorts of shady behavior you imply.

I have known at least a few instances where the founders actually cheered when their primary competition was acquired by a multi-billion dollar corporation, since they had a strong conviction that these big companies, even though known to play a little dirty at times and to have more capital to spend on things like marketing, actually lack the discipline to be as effective a competitor as the relatively smaller organization was before the acquisition.

To reiterate, most companies fail for reasons of their own making. To the extent competition is an issue at all, it's more that their product or service fails to give customers a sufficiently compelling reason to switch from existing products/services, or isn't different enough to attract wholly new customers.

Finally, I'll point out that it's not a zero-sum game. Many markets have a lot of potential room for growth. Sometimes new competition is the best thing that can happen for all (or at least most) market players because it brings in new ideas/vigour and creates vital competition where before little existed -- which spurs all the companies to invest in R&D -- which leads to growth for all as the products become substantially more attractive.

Comment: Re:The problem is politics (Score 1) 892

by FallLine (#32380180) Attached to: The "Scientific Impotence" Excuse

One huge problem with these sorts of debates is that people seem to think that actionable policy descends automatically from scientific conclusions as if it is simply a matter of logic. This is simply not the case because personal values, risk preference, economic principles, and more play a necessary and important role in decision making.

For instance, although the science is fairly certain that the earth has warmed over the last 150 years and that CO2 acts as a greenhouse gas in and of itself, it does not automatically follow that we should abandon anything that produces CO2 or other GHGs. Even if we were absolutely certain that X tons of additional CO2 today result in actual damages Y tomorrow (which we are nowhere close to), rational people can still choose other courses of action because they believe that those future damages, appropriately discounted, amount to less than the cost of the proposed solutions today. Unfortunately many scientists have unnecessarily politicized this area by conflating their scientific findings with their personal preferences on policy. They are NOT the same thing.
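
To put that trade-off in concrete terms, here is one stylized way to write it down (a sketch only; the damage estimates and the discount rate are exactly the value-laden inputs that the physical science by itself cannot supply):

$$ \text{mitigate today only if} \quad C_{\text{mitigation}} \;<\; \sum_{t=1}^{T} \frac{D_t^{\text{avoided}}}{(1+r)^{t}} $$

Two people can accept the same physical science and still disagree about the discount rate r, the damage terms D_t, and how to weigh low-probability outcomes, and therefore rationally reach different policy conclusions.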

Both science and policy outcomes are likely to be better when society appreciates the difference and proceeds with their debate accordingly.

Comment: Re:I don't quite agree (Score 1) 291

by FallLine (#32106306) Attached to: Microsoft Office 2010, Dissected

To be very clear about this: I have argued from the start that most small to medium sized businesses would be better off outsourcing typically generic services like email because outsource companies enjoy systematic advantages (scale, specialization, expertise, etc). I was also specifically challenging your apparently sweeping arguments that they should be categorically rejected "simply because of liability and privacy" or because they are "not tightly bound by laws regarding retention and usage". You may have intended this to refer only to Google Mail/Apps specifically, but I think it was fair to interpret it as a broader attack on outsourcing IT services generally. Regardless, that is the point I took issue with.

However, all of this is academic and is entirely dependent upon process, policy, hardware, and software. It's like stating that it's less risky to let someone else drive than it is to drive your own car. In some cases yes, in my case, I would argue no; it all depends on the details -- and again, I wasn't making generalizations, I was stating my opinion about why I don't use it and the opinion of those in similar positions to myself whose opinion I am aware of.

I agree there can be unique circumstances that make outsourcing a poor choice, but I think you overstate your case. A typical outsourced email operation enjoys substantial structural advantages that an in-house operation rarely has, given reasonably standard requirements. A better analogy would be asking whether you're better off flying in a single-engine plane with a private weekend pilot or flying commercial airlines in the US on a wide-body jet. OK, that's perhaps an overstatement with respect to comparative risks, but nevertheless....

But why would a 50 person company need 24-7 support for office applications and e-mail?

Some medical device businesses, for instance, require this kind of support (my last company did) since they support patients in a clinical capacity nationwide 24-7. Likewise, some of my current clients require support at 10PM or later since they're sending multi-million dollar proposals at the last minute (a lot of money at stake). In any event, my point is that a 24-7 operation has an easier time doing maintenance (since it can easily schedule it after hours) and is more likely to be able to attack a problem as soon as it is detected (which may well be earlier too, since their operations are often more professional/proactive). Perhaps you can have your admins stay till 5AM to fix a problem or do a routine upgrade, but they're probably going to make more mistakes because they're tired and will be of little use when/if something blows up the following morning.

I'm sorry but we're an ISV and we have some pretty sharp people here, but none of our 'non techies' would easily be able to fill the role of a squashed Google Apps admin or manage mail difficulties, or be able to convey those issues intelligently to whatever support mechanism Google has in place.

I bet they can handle most of the routine stuff (e.g., add/drop/change accounts) long enough to comfortably locate a replacement (which, btw, probably is NOT a full-time IT person) -- certainly far better than they could rebuild a RAID array or deal with a complex AD replication issue.

What downtime? Are you envisioning some scenario where there's a private company with an understaffed, overworked IT group and someone burns down a file server someplace right before a big sales presentation? While certainly what I would term 'uncommon' it is a very possible scenario; however, it is simple to argue that a small company could at least remedy the situation themselves whereas Google losing your mail (as has happened), or Google Apps not being reachable (happened several times that I'm aware of for long periods of time) is something you can do absolutely nothing about. That's ignoring the obvious problems with "Hey Bob, where's that spreadsheet you had ready for the board meeting today?" "Uh, well, I can't show it from my laptop because when I went to synch it this morning, we were having ISP difficulties, so I couldn't get it off the cloud..." In all instances I can think of, the worst case scenarios are all better on the 'no Google Apps' side of the river.

(Again, I do NOT think Google Docs is a replacement for MS Office generally speaking). Where I think you go wrong is in confusing the availability of Google Mail, practically the lowest possible cost option for an actual company, with the significantly higher cost hosted Exchange options that use more stable software and established technology. I would challenge you to try to offer a better email/contacts/calendar experience for all of your 50 users for a mere $2500 a year (including server acquisition, IT overhead, data center, bandwidth, licensing, etc), which is what Google is charging. I bet you spend at least twice this much in IT manpower alone to deal with backups/maintenance/issues/extra time necessary to handle it as-is (without more exotic/expensive systems to ensure higher levels of availability).
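
As a rough back-of-the-envelope sketch of that arithmetic (the in-house figures below are purely illustrative assumptions of mine, not anyone's actual budget):

```python
# Illustrative comparison: hosted email vs. in-house Exchange for 50 users.
# All in-house figures are assumptions for the sake of the example.

USERS = 50

# Hosted: roughly $50/user/year, i.e., the ~$2,500/year figure above.
hosted_total = USERS * 50

# In-house (assumed): amortized server hardware, licensing, and a slice of
# admin time for backups, patching, and troubleshooting.
server_amortized  = 4000 / 3          # server spread over ~3 years
licensing         = 2000              # Exchange + CALs, rough annual figure
admin_hours_year  = 100               # a couple of hours per week
admin_hourly_cost = 60                # loaded cost of IT labor
in_house_total = server_amortized + licensing + admin_hours_year * admin_hourly_cost

print(f"Hosted:   ${hosted_total:,.0f}/year")
print(f"In-house: ${in_house_total:,.0f}/year under these assumptions")
```

Even under these fairly conservative assumptions the labor line alone exceeds the entire hosted bill, which is the point.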

Why? Adding 50 people to your staff is much more a problem relating to hardware, imaging, OS infrastructure than it is to e-mail, file server space, or office productivity software. Adding 50 people to your staff and giving them an approach to office collaboration they've likely never used, seen, or heard of, would be a much bigger problem. Again, I can imagine scenarios where what you're saying is absolutely correct, but I certainly don't think it applies in the general case and certainly not in my case(s)

(Again, not talking about Google Docs!) Consider the costs to step up servers, licensing, IT staff, data center size, bandwidth, etc as you grow or add services. If you buy a server hoping it will last for 2 years as a rapidly growing company, you will probably be forced to buy a lot more capacity than you currently need if you expect to reach, say, 100 users in the next year and/or as mailbox sizes expand (if you allow this). What about your data center when your company needs to move to a new office space? It's kind of hard to hire 0.2 sysadmins. Ok, maybe you can find a decent part-timer where you are, but you lose something in the process (imho). These are just a few examples of stepping issues.

I don't understand why using Google Apps or Google Mail would keep a company, who needed data centers to accomplish their goals, from needing data centers anyhow? Are you saying that these data centers exist simply to support something like Microsoft Office and Exchange?

Actually, for this $60M (250+ FTE and rapidly growing) client of mine I've kept them out of needing to invest in a data center entirely with judicious use of outsourced services, and I am running their entire IT operation with around 5-10 hours a week from my company (a bit pricey on an hourly basis, but they get a lot more value too). I don't need to be on-site to deal with any critical servers. With the servers that I do directly manage at Rackspace, I can be confident that if a hard drive or power supply fails, it will be replaced in less than an hour--even if it happens at 2AM and I'm under some bus tire :-). This frees me up to focus on the tasks that really offer value and means that I can bring on a relatively junior full-time IT guy to manage the day-to-day issues when I go off-premises in a few months, after I've delivered a major piece of software, and that this will be more than enough even with increased demand. Likewise, I know that when this client outgrows their current suite in 10-12 months I will be able to manage the move painlessly and with minimal investment of time.

I think it would be very simple to argue the exact opposite. Google is quite obviously a giant, virtually irresistible target to hackers. Given the hacking of their e-mail systems most recently, I would think this would actually be an argument against using Google mail for things you want to be private. The more and more people who use Google Apps, the more likely Google Apps will be penetrated (I know, that's obvious.)

My point here was that if you are a tenant in some kind of shared facility and you're a relatively obscure 50 person company in a relatively unsexy area, your data is not going to be of much interest to most would-be hackers or snooping employees. Google itself is obviously a huge-name target for hackers; on the other hand, Exchange as an application is at least as large a target, with highly exposed code (albeit in machine code), and it's difficult for Microsoft and its customers to test and deploy patches across the many different configurations and millions of installations.

Having also been acquired by large companies (both Microsoft and then at another company by SIEMENS) and 'integrated' (love that term which should be worded 'ugh, you change now, do things same way we do'),

Agreed :-) The acquirer of my last company sent in a "SWAT" team from IBM, directed by their usual mid-level yes-men (whom they treat like dirt), and some other consultants to do the remaining integration that I resisted after I gave notice (they didn't understand wtf they were talking about), and it was an epic fail by all accounts. Sigh. They're still trying to shoehorn in their ERP system, several years later, at the cost of several million in customization (and that's just the estimate!), and it's going to be a massive clusterf*ck if they pull the trigger. I am half tempted to just deliver a clean, functional (fully buzzword compliant) re-write and offer to sell it to them for half the cost (a mere $2M), but they'd never go for it if it doesn't come from SAP or one of their other established business partners. Then again, I think I'd probably commit seppuku if I had to deal with their C-level operating company people again (or, worse, the wanna-bes) for any extended period of time.

Comment: Re:I don't quite agree (Score 1) 291

by FallLine (#32102394) Attached to: Microsoft Office 2010, Dissected

I am simply making a general case for the pros and cons of outsourcing email and related services (albeit a bit more pro in the general sense). I am not trying to argue for Google Apps specifically and, in fact, would generally not recommend the product today to most businesses unless their needs were very minimal and/or cost was the overriding concern.

I don't believe that I'm giving short shrift to any "more likely risks."

I only meant this insofar as your actual argument on /. goes, not that you are making an inappropriate decision for your environment.

You may consider the failure rate of industry standard backup hardware to be riskier than storing all of your contracts, designs, patent documents, and corporate e-mail on a publicly traded U.S. company's servers, but I do not.

In my experience and in the experience of many others, there is a high failure rate when actual restores are required -- particularly in smaller organizations that do not systematically test their backups on a regular basis -- and it has less to do with hardware failure than with misconfiguration, small backup windows, occasional media failure, poorly understood expectations/training (e.g., time, granularity, etc), improper storage of tapes, etc. Furthermore, the shortcomings of doing it in-house extend beyond just the ability to do successful, timely restores. In the organizations I have managed, for instance, a few hours of downtime can easily cost a million dollars in lost sales, not to mention potential hassles/penalties from regulatory agencies, customer relations issues, etc.

I do not see what bearing the outsource company's publicly traded status has on this debate. Regardless, it is difficult for me to imagine a situation at the various reputable service providers today wherein senior management would decide to systematically rifle through their clients' files--particularly not if they are contractually obligated not to do things like this and will likely lose customers (at the very least) if they are ever caught. However, I am willing to acknowledge it as a potential risk (however remote). I might be a bit more sympathetic to this argument if said company were a potential competitor of yours or might want to acquire you at some point. The more realistic concern, imho, is whether or not they have good security procedures in place and whether they actually execute on them to prevent misdeeds by their own employees or penetration by some outside hacker (likewise for in-house operations!).

Furthermore, if you really have this much truly sensitive material online that could threaten the very survival of your organization if leaked/stolen (i.e., you have competitors that could actually steal a substantial portion of your IP and get away with it), I would argue that putting it all online in a readily accessible manner is probably a mistake no matter where you host it, and email is generally not treated this way. If you're not behind a secure firewall/VPN, with mandatory two-factor authentication, minimal granular access levels, auditing, segregation of duties, etc in place.... the concern of the outsource company's management stealing secrets seems a comparatively remote risk.

How quickly can we respond to a problem after hours? What does this have to do with Google Apps? Or is this now about all 3rd party services?

My point here is that most good service providers make strong commitments to high availability and have the resources in place to actually deliver on it well. Most 50 person companies simply cannot afford to staff even one highly qualified sysadmin 24-7, never mind all the other resources necessary to deliver. Without this kind of staffing, it's harder to do necessary maintenance and respond quickly to problems as they happen, i.e., before they impact primary business hours.

What happens if the admin gets hit by a bus (I use this analogy at the office a lot as well ;))? The same thing that happens if the person administering our usage of Google Apps gets hit by a bus.

A company can share credentials with a number of reasonably intelligent people to fill this role quite easily because it doesn't require much technical know-how or knowledge of the environment. The situation is very different with even a moderately complex in-house IT environment, because that requires a lot more skill to operate reliably, not to mention specific knowledge of the environment as it is configured.

If you need me to make a generalization I would be happy to state that I find it difficult (but not impossible) to imagine scenarios where private companies should use Google Apps.

I would argue pretty much exactly the opposite (albeit not specific to Google Apps).

A private company typically:

1) has fewer resources to survive lost sales or opportunities because of downtime;
2) has less scale to make high availability cost effective;
3) is far more cash-flow sensitive (survival);
4) faces larger stepping problems with rapid growth/movement/strategic repositioning;
5) can generally use extra cash flow far more productively elsewhere in the organization, i.e., it is better to hire an engineer or a good salesman than to buy IT hardware or hire an additional IT employee;
6) is more likely to need to relocate and thus re-invest in its data centers (I've been in that situation several times);
7) is probably a bit less likely to be a target of some hacker.

Having managed IT very much in-house from 10 to 800+ employees (very IT intensive), from private to publicly traded, to being subsequently acquired and having to integrate into a large (Fortune 50) company, and having consulted for a variety of SMBs (especially start-ups) outside of this... I can say with confidence that most would be better advised to outsource their email and other similar services unless they have very clear reasons for not doing so or already have a large investment in IT for whatever reason that cannot be more efficiently utilized elsewhere.

Comment: Re:I don't quite agree (Score 1) 291

by FallLine (#32098712) Attached to: Microsoft Office 2010, Dissected

I am not claiming that all organizations that reject it do not have valid reasons. There are valid reasons for companies not to use it today -- even for some smaller companies. However, many IT organizations make these sorts of decisions in an ignorant and/or self-serving fashion.

I have known many people in IT who think they will promote their own careers by maximizing their budget, the head count under them, their experience with the latest/sexiest technology, etc without really considering the company's needs. I have known others who simply lack the ability to think critically about the issues (e.g., they use an arbitrary requirement to rule out alternatives without considering cost vs benefit). For instance, they look at the per-mailbox cost but fail to really take into consideration just how much overhead managing said technology in-house incurs, or the cost of the risk they take by using shortcuts. Others are simply very conservative by nature (in the sense of personal career risk) and, until outsourcing becomes the norm in their line of business or management essentially insists, they will not go with the program.

I do not know your organization. However, I wonder whether your preoccupation with trust in your IT organization is informed and rational. Yes, it is possible that the outsource company might do something in more of a top-down fashion that is not aligned with your company's interests. However, those businesses that actually profit directly and substantially from their users (i.e., unlike free Gmail) have a strong incentive not to do something that might cause their clients to lose confidence in them. A company that abuses its clients will ultimately hurt itself.

In the meantime, I think you are giving short shrift to other more likely risks that correlate strongly with how these services are actually provisioned and managed. In other words: data loss, extended downtime, poor security, malfeasance of IT/operations folk, etc. An outsource company that serves one thousand times as many users with similar needs is far more likely to be able to do that job better and more cost effectively because it specializes in it and has the scale to do it efficiently (though, as I said, the business has only come into maturity in the past few years and I believe it will take more time to be able to accommodate a wider range of requirements).

Consider:

How often do you test your backups?
How quickly can you respond to problems after hours?
What happens if you or key admin(s) gets hit by the proverbial bus?
How much expertise do your admins really have with your mail servers?
How many people have admin level access?
How many vendors or non-IT people have access to your data center(s) for various reasons?
Do you have offsite DR -- how good is it really?
What controls do you have against internal malfeasance?
How good are your data centers really?

My only point here is that you can't just look at "trust" in an outsource company in isolation from the other risks that hinge on decisions you make in this context. Your environment might be tighter and more cost effective given the stuff you really need, but I haven't seen you articulate why this might be the case. Just some food for thought.

Comment: I don't quite agree (Score 3, Informative) 291

by FallLine (#32097100) Attached to: Microsoft Office 2010, Dissected

As a former CIO, I disagree with your diagnosis of the issues. Many companies, both large and small, outsource services to companies with access to all manner of sensitive materials (e.g., document destruction, electronic reading rooms, business continuity services, AR, etc). The difference is how those services are implemented and the trust in the organizations, not so much the laws that specifically regulate their offerings or even the ability to sue them.

In my opinion, the problem with Google Apps is that they:

1) don't make many important explicit commitments (e.g., availability, security, retention policies, restoration times, etc)
2) provide very little visibility into their implementation
3) run a low-cost service model that leaves little room for day-to-day customer service (e.g., mailbox restores) or for the confidence that you can rapidly escalate a problem should one arise (not to mention offline backup)

I list these because they imply the issue is not inherent to outsourcing email in principle. The outsource service model is the future for generally commoditized services like email. There are several offerings today that I believe are generally superior to in-house for most SMBs that want Exchange functionality and need good availability. I have recommended Rackspace's Hosted Exchange to a $60M (revenues) client of mine and a few others. I am generally quite pleased with it, though there are a few shortcomings that will prevent others from adopting it today (especially larger organizations).

The biggest issues with the various Hosted Exchange offerings (those I'm familiar with at least):

#1: Authentication cannot be readily shared with other services, i.e., the employees need to juggle yet one more set of credentials.
#2: Limited ability to use 3rd party software (e.g., VM, Fax, two-factor authentication systems, etc) unless it exclusively uses exposed interfaces (RPC/HTTP, IMAP, etc).
#3: Won't scale well with large companies (with multiple subsidiaries/operating companies) that need/want to use more advanced AD features.

That said, these companies will figure most of this stuff out gradually, until all but the most conservative big companies concede that they are better off outsourcing it, i.e., that an outside company has the scale and expertise to do a better job at less cost and in a more capital-friendly way. When real customization is required, in-house makes sense, but the reality is that many of these issues are fairly widely felt and can be addressed with more generalized solutions.


Comment: Re:Extraordinary claims... (Score 1) 822

by FallLine (#30255816) Attached to: Engaging With Climate Skeptics

If I can do anything for the cause of science, it's to repeat this: Scientists get famous by ripping the shit out of other scientists' work. The famous scientists you've heard of got famous by demolishing the work of others. As scientists, we know that. And we're always looking for some schmuck to use as a stepping stone. I know if I do bad science, I'll be a stepping stone. I know if I find bad science, I can use IT as a stepping stone. That keeps most scientists pretty damn honest.

These CRU emails pretty much prove that you put way too much stock in this. There were several emails in which these so-called scientists expressed major reservations about the quality of research being conducted by Mann, Briffa, and other researchers, and yet none of this made its way into the literature and these same researchers continued to support each other's work in public.

What you fail to acknowledge is:

1) The conclusions are not binary. Relative warming can be exaggerated in a climate reconstruction, while another substantially different conclusion will generally not be acknowledged as falsifying it.

2) The incestuous nature of the research community, combined with the fact that they rely on fundamentally similar and weak evidence, means that it is not in their interest to try to directly discredit their peers' research.

3) The vast majority of this research is NOT reproducible because complete data, code, and methods are generally NOT shared even within the so-called community. This makes it extremely difficult to prove bad methods were used or otherwise falsify the conclusions. When material is shared it is often incomplete and only shared with friendly peers, which creates an incentive not to criticize, for fear of getting cut off in the future, and a feeling of indebtedness.

4) There are probably no silver bullets that can easily overturn these models or reconstructions empirically. Unless someone has a silver bullet that undeniably falsifies all prior recent work in the field, all of their incentives tell them to go with the party line, because otherwise they are unlikely to get published and will have scorn heaped upon them.

Comment: Are you kidding me? (Score 5, Insightful) 822

by FallLine (#30250496) Attached to: Engaging With Climate Skeptics

What possible motivation would the climate scientists have to do so? What do they gain from over hyping the possible scenarios? To promote renewable energy? Again, what do they gain from this?

Here are just a few reasons:

1) Further their own careers. Big (positive) claims about AGW are important if you want to get published in the high impact journals.

2) To get grant money to keep publishing and stay employed.

3) Face time with the media

4) Genuine belief in AGW--even if not well supported by the actual evidence.

5) Insider politics -- why criticize a peer's research that largely agrees with your own? The incentives are reversed.

6) Other environmental motives, e.g., "even if AGW is wrong, reducing pollution, sprawl, cars, oil dependency, etc is good" (I have heard this argument a lot)

7) (Mistaken) belief in the precautionary principle, i.e., AGW is a risk and refusal to see it in cost vs benefit terms.

Comment: Re:Scientists are not Politicians (Score 1) 822

by FallLine (#30250292) Attached to: Engaging With Climate Skeptics

Scientists have always been egotistical, with their own pet theories and human idiosyncrasies. The saving grace of science has never been the scientists, but the method in which science is conducted. Peer review, vigorous debate, and cat-fights. What we believe scientists should be and what scientists are are two very different things. The problem here is the outside influences. You and me.

I think you have managed to miss the point and (sort of) contradict yourself. We will never remove the human element -- inside OR outside -- these negative influences have always existed in science and will never be banished.

The method is what matters most in ultimately arriving at the best possible science. Although we can never systematically remove all bias, bad faith, bad incentives, or what have you, we can sure as hell demand integrity and openness in science. The more politicized and the more important the science, the more important it is that science stick to its core principle of openness. All of this empirical research MUST be fully replicable. In other words, all data, methods, and code should be reasonably obtainable by any qualified individual. Whether research is replicable by a qualified individual is truly an objective fact that can be agreed upon by reasonable people on any side of an issue.

There is no excuse, in this day and age of the internet and cheap storage, why nearly all of this cannot be archived as a condition of publication. We certainly should not be making multi-trillion dollar decisions based on science without this critical step. Without replicability it's extremely difficult to falsify research and very easy to publish false or bad research when the prominent journals are all managed by like-minded individuals who have little incentive to find flaws in your research and a lot of incentive to keep you friendly.

What the CRU emails have done is demonstrate, amongst other things, a conspiracy to refuse reasonable requests (for essential information necessary for replication) for its own sake, despite: the long tradition of such sharing in science, the policy of several journals (albeit poorly enforced), the charter of these government agencies, and FOI requests. This behavior is fundamentally abhorrent to good science.

These CRU emails have given some people a view into the naked politics (against science), bias, attempts to manipulate publication, suppression of legitimate internal criticism (for political reasons), etc... Taken together, they make a compelling case for the corruptibility of science and the need for outside scrutiny. You do not need to be a climate scientist to point out fundamental statistical errors in so-called climate reconstructions or to observe that data is being misrepresented on graphs ("hide the decline")... The knowledge that outsiders can actually check research for errors at any point would do a lot to keep research honest -- before and after it is published. We may never be able to ensure that good skeptical research receives a fair chance to be published, but by preventing bad research from entering or remaining in the literature, science can correct itself far more quickly.

Comment: Re:Nonsense (Score 1) 746

by FallLine (#30206970) Attached to: New Research Forecasts Global 6C Increase By End of Century

Yes, clearly this is something that is better handled by lawyers, business people and their assorted paid off goons.

I said nothing of the sort. I merely suggest that the academic process is wholly inadequate given the cost of the measures that are being proposed today and the alleged risks. We should not be making multi-trillion dollar decisions based on the output of a handful of academics following the usual insular peer review process which was never designed to do this sort of thing.

We can fund and construct a far more open and rigorous process if the science is truly "settled".

Want more reliable proxy records? Gather a team of experts to determine the best proxies to examine, clearly state the assumptions and reasons for choices, then fund this research directly so that it can be used freely by all. This data should be as complete as possible, i.e., do not do what Briffa did with his Yamal paper by just including 13 trees to represent the last 100 years while using many many more in prior years.

Want to produce reconstructions that will convince a lot more people? Surely these scientists can collectively decide on the best, most defensible way to produce a limited number of reconstructions; then they can document their exact methods and explain their rationale for outside review and commentary. Demand thorough independent analysis by one or more 3rd parties; don't just wave your hand at it and say it roughly agrees with your beliefs about AGW and looks OK on paper. Allow people with expertise in relevant fields to produce relevant criticisms, e.g., statisticians, dendros, etc. Do not allow artificial hurdles to be constructed like those which exist in academic journals (e.g., commentary can only be made X days after publication, can only be Y words long, etc).

Want to produce a climate model that will attempt to predict future temperatures? Share the f'n code. Document it well. Explicitly state your assumptions and defend them. Allow people to see how it fares against the instrumental record in detail, both past and future. It's not as if the only meaningful output is what happens with average temperature. These models make all kinds of assumptions about various phenomena, particularly with respect to positive and negative feedback effects (esp. cloud cover)... well, the outputs can be documented to show how well they hold up. If CO2 output deviates substantially from projections, fine, then run it with the actuals and see how it changes...

All of this data can be archived in one place for all to see at the click of a mouse. The White House spent $18,000,000 to re-design recovery.gov. Certainly we can spend money to have one definitive source for OPEN research on the settled science of AGW. I would gladly see my tax dollars used for the government to set aside, say, 10 billion dollars to assemble a panel of leading scientists from a variety of areas of expertise to manage this process, direct research funds, etc. If the science is truly settled, it shouldn't take very long to make a compelling case for AGW and dispel mainstream concerns of bias and manipulation.

Comment: Re:Nonsense (Score 1) 746

by FallLine (#30195610) Attached to: New Research Forecasts Global 6C Increase By End of Century

That is BS. If you bothered to read the refutations, the divergences are themselves a subject of many publications, and this has been out in the open forever.

My point is that this study and several others have deliberately used this 'trick' in their presentations and that this behavior is dishonest and anti-scientific. There is no excuse for it, period. Your excuse in the prior post that the actual temperature record reflects an increase is misleading and amounts to sophistry.

I do agree that access to the raw data could be better, and even that some of the statistical methods etc have been applied poorly (or even incorrectly). You might even find, somewhere in the stack of tens of thousands of climate science publications, some that misrepresent the data, perhaps even deliberately. Not all scientists are as expert as they should be in statistics, and scientists are human and have human frailties (although that doesn't excuse anything). But this does not appear to be one of those cases. You are reading far too much into one email, and you clearly are not aware of the context.

There are only a handful of published reconstructions that show 'hockey stick' shapes stretching over ~2K years and most of these rely on a small set of similar raw data, similar statistical methods, and are authored by a small circle of scientists at even fewer institutions. Many of these studies have already been largely discredited. Furthermore, expecting that every study should be immediately shot down if it is bad is unrealistic given the biases of the climate science community and their complete lack of openness on critical issues.

You need to have a great deal of faith in their particular peer review process to believe this issue is anywhere near settled given how reluctant they have been to allow any sort of dissent by barring access to data, methods, black-balling people from journals, etc. Many reasonable people look at the facts and believe that there is a need for skepticism.

Given:

1) Emails like this (clear conspiracy to mislead, hide embarrassing facts, hide & delete data, etc).

2) The small size and incentives within the climate science community

3) Numerous fundamental documented errors in many of these studies (bad statistics, poor quality data collection, etc)

I simply have little faith in the quality of the peer review going on.

If the climate science community wants to convince more people of the need to act, to spend many many trillions of dollars in the near term, then they MUST be far more open and honest at the very least. There is no excuse in this day and age not to share data and methods freely given the fact that almost all of these people work for publicly funded institutions and are collectively asking society to turn itself upside down to fix the alleged problem.

If all of the science of global climate change depended on a single set of proxy data, then you would have a point. But it doesn't, and you don't.

I never claimed that the entire science of global climate change relies on a single set of proxy data. Even if the science is correct and the model forecasts are 100% accurate, this behavior is still bad and inexcusable, i.e., my point still stands. That said, even though I acknowledge that the earth has warmed over the last century and that CO2 does act as a greenhouse gas, there is a great deal of unsettled science in: exactly how unprecedented today's temperatures are; exactly how sensitive the climate is to CO2; the behavior and extent of feedback effects; what will happen in the future given certain CO2 levels; not to mention the 2nd-, 3rd-, 4th-, and further-order impacts of warming (if it happens as predicted).

This issue is too politicized and too important to leave to the typical academic process.

Comment: Nonsense (Score 3, Insightful) 746

by FallLine (#30194232) Attached to: New Research Forecasts Global 6C Increase By End of Century

Try reading that again: "adding in the real temps [...] to hide the decline."

So, it is some kind of proxy for measuring the historical temperatures (in this case, tree rings), and this proxy data, for some completely different reason (pollution affecting the tree growth, for example??), shows a decline in the last couple of decades.

The real temperatures (i.e., the ones that are actually measured, like with a thermometer) show an increase, so use the real measurements for the final 20 years of the data.

There would be more of a problem if this wasn't disclosed somewhere. But even then, it is an argument about how the proxy data is presented. The real temperature data doesn't show a decline.

Virtually everyone admits that temperatures have increased substantially over the last ~100 years. The entire point of these reconstructions is to demonstrate that this rise is unprecedented over the past ~2K years and follows a certain pattern. If the same methods on the same species of tree in the same area in the same study not only fail to accurately replicate the thermometer record over the last several decades but also actually diverge substantially, this calls into question the entire pursuit.

In other words, if your methodology suggests that it couldn't have been warmer from 0 BC to 1900 because tree rings were not statistically larger, but the rings actually fail to increase as predicted in recent history when we know it has warmed, then this strongly indicates that we also cannot rely on warmer past temperatures to be accurately reflected in increased tree ring size either. Of course you can speculate that pollution may be playing a role, but it is still just speculation and there are better documented conclusions one could draw from this, e.g., that tree rings do not correlate linearly with temperature, that changes in moisture content, sunshine, CO2, etc play an equally large role, etc.

Good non-politicized science should: pick a methodology; show how it correlates with the actual thermometer record; then document it clearly for better or worse over the entire course, i.e., actually show the divergence (and make the data and methods available for all for review). These so-called "scientists" actually went to the other extreme by trying to hide the divergence and present a view that was not supported by their actual research. Many of these same scientists have gone further still by refusing reasonable requests for the raw data and further information on their methods.
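
To illustrate what that would look like in practice, here is a toy sketch (entirely synthetic data, purely for illustration -- it does not reproduce any actual reconstruction): calibrate the proxy against part of the instrumental record, then verify it on a held-out window and report the divergence instead of splicing it away.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: annual temperature anomalies and a tree-ring "proxy"
# that tracks temperature early on but diverges in recent decades.
years = np.arange(1900, 2000)
temp  = 0.01 * (years - 1900) + rng.normal(0, 0.05, years.size)
proxy = 2.0 * temp + rng.normal(0, 0.1, years.size)
proxy[years >= 1960] -= 0.02 * (years[years >= 1960] - 1960)   # the divergence

calib  = years < 1960      # calibration window
verify = ~calib            # held-out verification window

# Fit proxy -> temperature on the calibration window only.
slope, intercept = np.polyfit(proxy[calib], temp[calib], 1)
reconstructed = slope * proxy + intercept

# Report the fit honestly on BOTH windows instead of hiding the mismatch.
for name, mask in (("calibration", calib), ("verification", verify)):
    r = np.corrcoef(reconstructed[mask], temp[mask])[0, 1]
    err = np.mean(reconstructed[mask] - temp[mask])
    print(f"{name:12s}  r = {r:+.2f}   mean error = {err:+.3f} C")
```

If the verification-window correlation collapses or the error grows, that belongs in the paper and on the graph; whether the proxy can then be trusted further back in time is exactly the question the divergence raises.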

This is politicized "science" at its very worst.

Comment: Re:Fundamentally ignorant of the business (Score 1) 159

by FallLine (#29889257) Attached to: Should a New Technology Change the Patent System?

Assuming this is true, which has to be assumed to give any credence because you give no backing material or references here whatsoever.

I have multiple and varied connections to the pharmaceutical, biotech, and medical devices industries (inside & outside, academic, research, financial, executive, etc), so I have a feel for the numbers. However, you can find confirmation of this in academic literature if you look. DiMasi asserts in a 2003 paper that the fully capitalized cost of clinical trials is 70% of R&D (and this number has certainly grown according to all trends), see: "The price of innovation: new estimates of drug development costs".

then I think such government regulation should be funded.

You are not presenting any concrete proposal, nor any support for your vague notions, but the idea is a bad one in general. Government generally does a poor job at allocating capital (see: central planning) and has a serious problem with efficiency (see: motivating its employees, agency costs, interest group politics, etc). Furthermore, if you mean that industry would still retain a role, such an arrangement could create a massive agency problem whereby industry could submit multiple compounds for clinical studies without having to pay the costs, thereby making the system far more expensive for society as a whole. There is also real expertise needed to design and manage the trials, which, again, government tends not to do well.

For example, I think "No Child Left Behind" was a dismal failure because it tested for poor school environments but did nothing to fund those requirements, or to catch schools up to standards.

I know it is deeply unpopular with the teachers' unions, but this does not prove anything. Virtually every test score has risen or stayed the same. Do you have any objective facts to prove that on the whole it was a net loss? Furthermore, with respect to the unfunded mandate stuff, that is a weak argument. It is not mandating an act--it's demanding baseline performance (and generally weakly at that). Given the dismal performance of US public school education vs most of the developed world and its comparatively high cost, it's difficult to argue persuasively that waste could not be removed from the system by refocusing it on a more effective curriculum and reducing the vast amounts of administrative overhead. If anything, it has failed to the extent that it allowed the states to control the tests and generally weaken the standards.

what's the alternative -- beta testing?

Fewer, smaller, and shorter clinical trials (that recognize that RCTs have huge limitations) combined with increased post-marketing (approval) surveillance. Namely, the cost per patient is so high and the availability of volunteers is so limited (this is a HUGE problem now) that it is impossible to conduct these trials with sufficiently large numbers of patients (sample size) to detect relatively rare and/or delayed-onset problems. With current phase III trials (averaging 3-4K patients), only adverse drug reactions (ADRs) that occur more frequently than 1:1,000 can be detected. To have a 95% chance of detecting ADRs that occur in 1 or 2 in 10,000, you would need to enroll 600,000 patients. This is a number that is completely impractical, yet it would still expose half those patients to risk while the other half would just be getting placebos (a problem in and of itself). Furthermore (besides raw statistics), these patients do not tend to represent real-world conditions very well (far higher compliance, generally healthier, fewer co-morbidities, increased observation, etc).
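
For a minimal sketch of the statistics behind those orders of magnitude (this only computes the chance of observing at least one event; actually attributing an ADR against the background rate, which is what drives figures like 600,000, requires far larger samples still):

```python
import math

def patients_needed(incidence: float, confidence: float = 0.95) -> int:
    """Smallest n such that P(at least one event among n patients) >= confidence,
    assuming each patient independently experiences the ADR with prob. `incidence`."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - incidence))

print(patients_needed(1 / 1_000))    # ~3,000  -> roughly a typical phase III trial
print(patients_needed(1 / 10_000))   # ~30,000 -> already far beyond typical trials
```

Even this simplest bound puts a 1-in-10,000 event out of reach of a 3-4K patient trial, before you account for placebo arms, attribution, and real-world compliance.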

In other words, there is no substitute for actual real world use. In the meantime the FDA, which has some understanding of this problem, has every incentive to delay and restrict products from market and almost none to allow market entry (it's CYA). This problem has gotten increasingly worse over the past several decades and shows no sign of changing.

Nobody should do any such thing. I actually don't think software is dependent at all on such monopoly grants. Software patents, which are essentially patents on Boolean Math, are inherently counter-productive, and arguably a limitation on free speech rights. I think currently proprietary software isn't even all that dependent on copyright. Proprietary software is dependent on Trade Secret, since they only release binaries and hide the source. Even when they release the source it tends to be under NDA style contract provisions. Open Source also has better business models than Copyright entails.

The software industry is almost entirely dependent on various forms of IP and related forms of IP support. Copyright is most critical, but so too are trademark, patents, trade secret law, and various other protections (even government enforcement of contracts can be a form of this). There are almost no software sales (not to mention development) in countries that have weak support for IP, even in relatively industrialized ones like China and India that are starting to move in this direction. The vast majority of paying customers just care about being able to use the actual object code as it exists, not to understand how it works or be able to make their own improvements. It is trivial to copy software and break any protection schemes (see: the internet, China, etc). IP-free regimes leave software companies very little ability to recoup their R&D and are no guarantee of open source code either (not to mention that most open source business models would be equally dead).

Furthermore, you are entirely mistaken: software patents are typically not on math or even basic algorithms as you imagine them (I am actually a named inventor on at least one). You might argue that all software is ultimately built on math, but that is normally not contained in the claims of most patents, and the argument is equivalent to claiming that a patent on, say, an engine design is a patent on atoms (or that copyrighting a book is copyrighting individual words). In other words, it's a nonsensical argument. That is not to say there are no problems here, just that your argument is a very bad one.

Why would I pose the idea of Democratic government taking over an industry if I thought it was inherently bad? You're making that claim, not me. And you have the burden of proof here -- the need for big pharma has never been proven, in that it has never been proven that similar or better treatments wouldn't arise without giant drug companies and monopolistic patents.

Actually you have the burden of proof, since you are arguing against the status quo (not to mention against common sense, decades of basic economic research, empirical comparisons with various socialist regimes, etc). Furthermore, you need to present an actual proposal, not just half-baked ideas in order to even begin to offer an argument. In addition, I have presented many arguments against your vague ideas and demonstrated flaws in your thinking.

I do not accept your arguments, and the burden of proof is with those that say patents are needed, or that the same or better results cannot be achieved through less exclusionary means. That is what has not been proven.

Again, the burden is on you, since you are arguing against the status quo. However, if you are arguing that shorter patent lengths would work better, then you need to address the many problems I presented at the start of this thread (e.g., the actual effective patent life today (post-marketing approval), the R&D cost, risk, costs of capital, S&M, etc).

Comment: Re:Fundamentally ignorant of the business (Score 1) 159

by FallLine (#29879867) Attached to: Should a New Technology Change the Patent System?

I thank you for enumerating all the ways that big pharma would fail without government protection. Any business that depends on so much government protection should simply be made part of the government, and thus subject to Democratic decision making instead of private profiteering. Short of that, they should be allowed to fail just like any similarly poor business model would on the free market.

Funny you should say that. The overwhelming majority of the R&D costs are imposed by government demanding very expensive and lengthy randomized controlled trials (RCTs), combined with the FDA's excessive conservatism, to an extent that is generally not socially optimal. These increased costs and delays affect not just the price of the approved drugs, but the labeling of them, i.e., the manner in which companies can market them, and the ability to sell some good medications at all (some good drugs are barred from the market due to the limitations of RCTs).

In other words, if you are arguing this from a libertarian perspective you should at least recognize the government's role on the other side of the equation. You should also extend this same argument to software and most other technology-oriented businesses, since they too depend on government-granted monopolies.

If you are not arguing that government action is inherently bad, then you should at least present a coherent argument for why I am wrong.

If you accept my arguments, then you really need to explain why strong patent protection is problematic since those drugs would not be developed without it and the "problem" would be entirely academic.
