The suit, filed this week in federal court in Washington, D.C., also names Roger Vinson, the judge who signed the Verizon order, as a defendant, along with Attorney General Eric Holder and NSA Director Keith Alexander. The plaintiffs say that the NSA’s surveillance program violates the Constitution and unfairly and unnecessarily infringes on citizens’ privacy. The classified order directs Verizon to hand over all of the so-called metadata for calls on its network to the NSA. The metadata includes the originating and terminating phone numbers along with details of the call, but not the contents of the call.
“The order, issued and signed by Judge Roger Vinson, violates the U.S. Constitution and also federal laws, including, but not limited to, the outrageous breach of privacy, freedom of speech, freedom of association, and the due process rights of American citizens.”
So let me get this straight: deflation is bad and inflation is good?

If I hold a currency that is worth less tomorrow than it is today, that is a good thing?

Sorry, I like the idea of a deflationary currency. It encourages saving, thrift, and money management, whereas an inflationary currency encourages spending and waste.
The whole idea that inflationary currency is good for the economy rests on the notion that it artificially forces spending, since no one wants to hold onto it. Believe it or not, that is a new idea; for thousands of years the world ran on deflationary currency. It was good then and it is still good now.
You can buy gold with it, then sell the gold and pay your taxes on the profit.
So you can pay taxes with and on bitcoin; you just have to buy and sell the right items.
Give the people who need to work 10x what everyone else gets, and I doubt they'll complain.
The world is changing; letting a handful of people control 90% of the wealth is a bad idea.
So, if I give 10x the amount to those who work, compared to those who do not, then 99% of the population will bitch that the top 1% makes too much and controls too much of the wealth.
As is obvious from your above two statements.
You are not too far off.
This planet just happens to be several light-years off the main trade routes.
Fewer than 1 in 20 stars have a planet in the habitable zone.
Fewer than 1 in 200 stars have a planet that supports life.
Only 1 in 20,000 has evolved any intelligent life.
So there are a lot of places out there that are off the beaten path and not visited often. Most intelligent species are not noticed for many years after they become spacefaring and start to explore. This is just a consequence of space being so big and there being so many places where there is no life.

About the only people who make it this far off the beaten path and come across this little planet are the ones hiding from something or the ones who get lost.
After reading the comments, it appears that most don't know how to estimate.

If you are giving a time estimate in days/weeks/months/years, stop! That is wrong!
Always give the estimate in hours of work. The reason multiplying by pi works is that to management 1 day == 8 hours, and 8 * 3 == 24 hours: what you called one day is really three working days.

I have found that when developers, admins, and other technical people are asked to give the estimate in hours, you normally get a good estimate. Asking for the number of hours forces them to stop and think about the answer at a more granular level.
To top it off, you can play with management when using hours. A work week is not 40 hours; it is 37.5 once you deduct federally mandated breaks and lunch. A year for one employee is then 1,950 hours, and holidays plus 2 weeks of vacation drop it to 1,837.5 hours. You can even argue that the first and last hour of the day are non-productive (time to spin up, read email, spin down, etc.) and deduct another 10 hours a week, making a work week 27.5 hours of working time, or 1,317.5 hours a year. Then start deducting weekly meetings, project meetings, etc. I bet when all is done you are looking at less than 10 hours a week of actual coding time. (Therefore your estimate of 3 days was right, because that is 18 hours, and it has taken 2 weeks to get those 18 hours in.)
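The arithmetic above can be sketched in a few lines of Python. Note that the 37.5-hour week, the 3 weeks of holidays plus vacation, and the 10 hours/week of overhead are this comment's assumptions, not universal constants:

```python
# Back-of-envelope "real hours in a work year" arithmetic.
# All deductions are assumptions from the comment above, not labor-law facts.

week = 40 - 2.5        # 37.5 h/week after mandated breaks and lunch
year = week * 52       # 1950.0 h in a nominal work year
year -= 3 * week       # 1837.5 h after holidays + 2 weeks vacation
year -= 10 * 52        # 1317.5 h after 10 h/week of spin-up/email/spin-down

print(year)  # 1317.5
```

Deducting meetings on top of that quickly pushes the number toward the "less than 10 hours a week of actual coding" figure.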
Managers and MBAs think that (maximising hours) == (maximising output), knowing nothing about how productivity tails off once hours worked in a week exceed ~40 or so.
I fixed it for you. The above is true of 99.9% of the companies I have worked for.
First, I did not say it was useless; I said "I don't think the average couch potato will ever get it."
Assuming you are an average couch potato, I would suggest that you did not break even. You simply did not understand the math.
Doing some rough math, I come up with the following.

Checking my local electric rates, I am paying $0.06 per kWh, and if I feed electricity back to the grid they pay me $0.03 per kWh.
Checking current prices and using the optimum output from the system, a $28,000 system will produce 16,755 kWh a year. (The average American uses 11,280 kWh a year.)
If you take the cost of the system and divide it by the maximum output multiplied by 10 years (the life expectancy of the system), you come up with a cost per kWh of $0.16.
Then I subtracted the output above average American usage, a net gain of 5,475 kWh a year, multiplied by the $0.03 per kWh the electric company pays you for it, times 10 years (a total of $1,642), to be deducted from the overall cost of the system.
This brings the electric cost to $0.15 per kWh on the system. (Remember, it costs me $0.06 per kWh from the grid.)
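As a sanity check, the cost math above can be reproduced in a short Python sketch. The prices, rated output, and 10-year life are the figures assumed in this comment, not measured data:

```python
# Reproduce the comment's solar cost-per-kWh estimate.
# All inputs are this comment's assumptions, not vendor or utility data.

system_cost = 28_000.0    # $ up-front system price
rated_output = 16_755.0   # kWh/year at optimum output
avg_usage = 11_280.0      # kWh/year, average American household
buyback = 0.03            # $/kWh the utility pays for exported power
lifetime = 10             # years, assumed system life

lifetime_kwh = rated_output * lifetime            # 167,550 kWh over the life
raw_cost = system_cost / lifetime_kwh             # cost per kWh, no credits

surplus = rated_output - avg_usage                # 5,475 kWh/year exported
credit = surplus * buyback * lifetime             # buyback credit over 10 years
net_cost = (system_cost - credit) / lifetime_kwh  # cost per kWh after credit

print(f"raw ${raw_cost:.3f}/kWh, credit ${credit:.0f}, net ${net_cost:.3f}/kWh")
```

Even before derating for real-world output, the net figure is well above the $0.06/kWh grid rate quoted above, which is the point of the comparison.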
The sales brochures will often extend the life of the system out to 15 years in order to reduce the TCO and show you making a small net gain.
All of the above is based on the system working at peak efficiency. The truth is you will average 8,000 to 9,000 kWh a year from the system, not 16,755 kWh, as the real world never gives peak efficiency. Add to that the chance of the system lasting 10 years without damage and costly repairs is slim; one hail storm (we have them here every 5 years or so) will total a system and require replacement of the panels.
Now, the reason I did not say it was useless is that there are a lot of uses for solar. I have researched it, and the cost of solar is well worth it if, and only if, you use it in a manner that gets the best bang for the buck, so to speak. A small hunting cabin in the woods is a great example. It is not used as a daily place to live, you can design the cabin to be extremely electricity-efficient, and because you are not there all the time, the solar can take days/weeks/months to charge the batteries while you are away, so that when you are there, you have electricity on demand. Add to that the cost being far less than paying to have electric lines run miles out to your cabin, and you have major net gains using it in that instance.
You may want to give up. I don't think the average couch potato will ever get it; they all think solar is the way of the future. Most seem to believe we will someday simply put a 2 m^2 panel on the roof and get all our power needs from it.
You know what, I take that back. Most could not tell you how big two square meters is. They are expecting one of them blue panel thingies will someday power the house.
I hate to say it, but there is not a good one. I have been in the IT field for 20+ years and I personally hate AD; however, there is no real alternative. I have watched the open source solutions for years, and they tend to be way too complex for a junior admin and nowhere near as easy to get going as AD.

A note to open source developers: come up with a replacement that can work in AD's place for Windows, Linux, Mac, etc. and is just as easy to set up. With that you could get a foothold in the directory marketplace. There is no real competition for AD, and it is needed badly.
Innovation does not come from a company; it comes from competition!

The issue here is that Microsoft has killed the competition; no longer does innovation flow through competition.
Back before Windows 95 we had Windows 3.1 and DOS. DOS was produced by Microsoft (MS-DOS), IBM (IBM DOS), and Digital Research (DR DOS).
As one would come up with an innovative feature and gain some market share, the others would follow and add new features of their own, each trying to regain the lost share and expand their market. When Microsoft combined Windows and DOS to create Windows 95, they killed the other DOS manufacturers, thus cementing their market dominance. From that point on they continued to flounder, with few major innovations and more and more redesigns of the GUI, or features that no one wanted or used.
The money they have, along with the "really smart developers and engineers," does not matter; they have no real competition. Linux is the closest thing they have had to competition in years, and it has never really grabbed enough market share on the desktop to spur the innovation and product life cycles that Microsoft would need to keep going. Don't get me wrong, Linux is stellar and I run it everywhere I can, but without the pressure there is no market force driving innovation.
On the server side, you can see Linux forcing innovation in Microsoft's announcement that admins should learn the command line because the Windows Server GUI will be going away, as well as in the many server advancements that Linux has and Microsoft is now implementing.
Do I think Microsoft desktops will survive? No. I see a slow erosion into obscurity. What replaces them may be Linux, Mac, or something completely new designed around the emerging technologies. I do, however, see Microsoft continuing for many years, struggling with the desktop and pushing more and more toward servers and the cloud.