How can a list of "the seven major tech hubs" not include Seattle, which is home to some of the biggest tech companies in the world, but include cities like Atlanta? That is a strangely biased list, so I wonder what the criteria were for "tech hub".
The vast majority of businesses have no significant branding to speak of, publish no significant media, and do no significant R&D. To put some perspective on it, there are more companies in the US than there are individual engineers and very few companies produce anything where copyright is central to the business.
Most businesses are built almost entirely on individual customer relationships. Restaurants, contractors, specialty manufacturers, agriculture, etc. In all of these areas there are a few outliers that develop a substantial brand that they trademark but more often than not the "brand" is the individuals that work there or run the business so trademarks simply are not that important to their success.
I think this is surprising to people only because most of the largest and most visible American companies do have substantial investment in IP, so it is an availability bias. It overlooks the myriad smaller companies that have little investment in IP. It would probably be fair to say that IP is important to a significant percentage of the American economy but only because it tends to be concentrated in many of the largest and most successful companies. It is not evenly distributed across all companies.
Your understanding is incorrect. There is nothing particularly special about anti-ship missiles and there are no anti-ship missiles that cannot be intercepted by the myriad active defense systems that currently protect the US Navy. The cost is also asymmetric: intercepting an anti-ship missile is much cheaper than the missile itself, so it is difficult to make it up in volume.
There is nothing that flies through the air that cannot be hit by modern active intercept technologies. The trend away from heavy armor is due in part to this technical development; you don't need armor if it is inexpensive to ensure no one can hit you.
Obamacare slightly reduces the cost of insurance for older people (like me) but materially increases the cost for young males, among other effects in practice. Ever look at the demographics of a tech startup beyond the founder? At my startup, we pay for good insurance for our employees, and while my individual insurance may be slightly cheaper, that is apparently buried in the noise floor of the increasing costs for the total employee pool. And the small difference in individual cost for older individuals does not materially alter the risk calculus in terms of whether they'll start a tech company.
It would be nice to see a little honesty that the law as written will be terrible for a lot of people. Including, empirically, tech startups. The percentage increases per employee are not small at all going forward, and I know a lot of tech startups that are trying to figure out if and how they can bury those new costs. I'm sure there are many policies that would reduce the direct costs for startups, but this wasn't one of them, and predictably so. Perhaps media spin artists can contrive politically palatable scenarios where it reduces some startup's costs slightly, but out here in the real world there has been a substantial increase in the cost of providing health insurance at tech startups.
Consequently, the idea that this reality will fuel a tech startup boom is some pretty strained reasoning. It may have some benefits but this won't be one of them. Obamacare might have helped some people but tech startups do not seem to be among them.
"All software is, by definition, math. And all math, by definition, is not patentable."
The problem with this argument is that the same reasoning that defines software as mathematics also defines *all* patentable subject matter as mathematics. If you can describe it, it is literally a finite algorithm. If a software expression of an algorithm can be excluded on the basis that it is an "algorithm" then the argument can be applied to all subject matter. (see: algorithmic information theory)
What is not patentable are mathematical concepts, not specific processes that implement those concepts. You cannot patent the idea of "sorting" but you can patent a sorting algorithm. This is an important distinction: there are an unbounded number of sorting algorithms that can be invented that express the mathematical concept of sorting so inventing one particular expression does not preclude anyone else from inventing their own expression.
This does not speak to the "on a computer" type patents (which are silliness) but it is the reason that computer algorithm patents are generally accepted in most countries (yes, even Europe). A consistent policy that banned computer algorithm patents would ban most other types of patents as well.
Once upon a time, people generated most of their value with their muscles. When machines replaced muscles, people could still generate value with their brains because machines could not replace brains. So the original Luddite scenario never materialized.
Now that machines are starting to replace brains, a growing portion of the population has a rapidly dwindling ability to generate significant economic value relative to the machines. As time passes, machines can effectively replace both the muscles and brains of more of the population.
This is also why forcing people to work fewer hours will not help. The problem is not the number of jobs available; it is the number of people who can generate more positive value in that position relative to a machine. Eventually we will all be in the position of no longer being able to be a productive member of a modern economy; everyone believes their contribution to be indispensable until the technology catches up and it isn't.
Actually, it is a relatively cheap computation on modern computing hardware. The specification for many modern tactical intercept systems is that the complete decision cycle has an upper bound of 20-50 milliseconds. You can do an amazing amount of computation on sensor data in that amount of time.
Remember, sophisticated multi-target tracking and engagement systems were built in the 1970s and 1980s with much less processing power than your cell phone has today.
And in fact, if you look at the chipsets used in state-of-the-art terminal guidance packages for hypersonic kinetic intercept of agile targets, they are embedded systems chips that would have been obsolete as desktop CPUs even a decade ago. Think MIPS R3000 or R4000 class CPUs and a modest DSP.
Basically, CPUs can drive computation at a much higher rate than material physics allows targets to change their behavior. We passed the threshold where computation is the bottleneck decades ago.
The SIG716 is not an "assault rifle" and you won't be "mowing" anything down with it. It is a conventional semi-automatic rifle that can be legally owned just about everywhere. Also, it is in a large caliber that makes it better suited for hunting than for rapid fire.
If the guy had been shipped a functionally equivalent hunting rifle with a classic wood stock there would not be as many ninnies getting the vapors over it. Unless Amazon has never made a shipping error before, this is a non-story.
I need to find myself some of these Power BBQ Chips mentioned in the summary. Fast and tangy without the downside of Cheeto fingers.
Polymer and composite firearm construction was pioneered by Heckler & Koch in the 1970s and they have continuously produced lightweight composite firearms since that time. Glock is popular because it offered good price performance, not because it was particularly innovative. The construction, action, etc. were copied from older firearm designs.
It is possible to build non-metallic firearms but the manufacturing would be exotic and extremely expensive.
Perhaps those drinking 3 cups a day are more likely to be in jobs where they are virtually chained to a desk, so they rarely see the sun and thus get less skin cancer.
Prior studies on animal models have produced similar reductions in skin cancer associated with caffeine; the result in the article is not surprising. For example, here is a skin cancer study done with caffeine and mice:
No mice were chained to a desk for this study. I recall other studies done based on topical application of caffeine (rather than ingestion) with good results but I am too lazy to google them.
There is no such thing as a "software patent". There are a few entirely unrelated classes of patent that are lazily bundled together under the rubric of "software patent". A business method patent or "...on the Internet" or "...on a computer" patent are very different beasts from fundamental machinery or algorithm patents. From a patent policy and theory standpoint, these different cases are essentially unrelated.
The lack of precision is part of the reason there is not a cohesive movement to reform patents in this area. A reasonable solution for one class may not be a reasonable solution for another, yet many reform proponents are throwing them all into the same bucket. To people with legitimate patent interests who are amenable to reform, the scorched-earth position of deeming everything remotely associated with computing "non-patentable" seems unreasonable, whether people realize it or not, and gets in the way of getting everyone to agree on reasonable reforms to the individual patent classes. Discussions of patent reform policy require more nuance than many people are willing to give them.
My electricity costs $0.0476 per kWh (I have the bill in front of me) and I live in a major urban area. So a lot less than $0.08.
Not to take away from the basic argument, but the assumed cost of electricity is considerably higher than the actual cost in parts of the country.
There is definitely movement away from Java and toward C/C++ for some types of software. Applications bottlenecked by memory performance, like databases and high-performance codes, are often faster in C/C++ than in a language like Java by integer factors. When people assert that Java is about as fast as C/C++ they are talking about code like tight, CPU-bound loops. However, Java is wasteful of memory and CPU cache lines in a way that C/C++ is not under normal circumstances, which has a significant adverse impact on the performance of some codes.
On recent processors, memory performance is a bigger bottleneck than CPU performance for performance-sensitive codes. The throughput of CPUs has grown faster than our ability to keep those CPUs fed from memory. In the supercomputing world this started to become evident years ago; memory benchmarks like STREAM became more closely correlated with real-world performance than CPU benchmarks like LINPACK for a great many algorithms. The resurgence of C/C++ is partly driven by this reality since it makes memory optimization relatively straightforward and you can receive large gains relative to Java for modest effort.
A smaller but also important driver away from Java is the GC. The increasing focus on "real-time" and predictable latency for applications like analytics and database engines is complicated when Java's garbage collector is inserted in the middle. This is a chronic point of pain for some applications.
I developed in Java for years, but my latest project (a real-time analytical database engine) is being written in C++ for the above reasons, among others. Writing high-performance applications of this type is actually pretty painful in Java because you end up doing unnatural things in the language to even approach the efficiency of conventional C++. Anecdotally, many of our C++ developers were doing Java until recently, so the statistic does not surprise me.
"It's the US after all with the largest stockpile of nuclear weapons..."
I see this stated often but it is factually incorrect. Russia has had the largest stockpile of nuclear weapons for decades, often by a large margin.