The Digital Trends article claims that unmitigated Spectre is not a problem for consumer hardware. Yet we have demonstrations of Spectre attacks in JavaScript. So unless you run all your browsers with NoScript, I believe this is still an issue.
The article doesn't mention ZombieLoad, so we can assume that you can keep your keys secret or you can use hyperthreading, but not both.
Steady or declining CO2 emissions is only good news if we're in a steady state situation.
We're not.
Simply keeping anthropogenic CO2 and CH4 emissions steady is woefully insufficient. The "well below 2 degrees warming" goal of the Paris Agreement is itself based on the assumption that we will be able to actively remove CO2 from the atmosphere, a technology we have yet to make feasible at scale.
We cannot afford to burn our currently known reserves of fossil fuels. We have to decarbonise our energy production as quickly as is humanly possible. That countries such as Australia are still granting fossil fuel exploration permits is, frankly, insane.
Excitingly, elevated levels of CO2 — such as those found in poorly ventilated rooms, or, as atmospheric CO2 levels continue to climb, in our environment at large — also diminish our cognitive ability. See, e.g., this paper, among others.
If we don't stop burning coal and hydrocarbons soon, we'll be too stupid to ameliorate the consequences.
Reading this was almost painful: Cook, like Trump, appears to be aiming at primary schoolers when it comes to complexity of expression.
Sentences have on average fewer than nine words each and the majority are extremely simple in structure — the text has a Flesch–Kincaid grade level of only 4.5. He uses the word 'thing' instead of a more specific word or phrase on five different occasions.
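For anyone wanting to check such numbers: the Flesch–Kincaid grade level is just a formula over word, sentence and syllable counts. A rough sketch (the syllable counter is a crude vowel-group heuristic, so scores are approximate):

```python
import re

def syllables(word):
    """Crude heuristic: one syllable per run of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59"""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syls = sum(syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syls / len(words))
            - 15.59)
```

Short, monosyllabic sentences score around the early primary grades; long, polysyllabic ones score far higher.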
As with Trump, this lack of nuance and basic level of language seems at odds with what we would expect from the role. Is this really appropriate from a CEO of Apple?
My only disappointment with the paper was that it read more as a survey than something offering new conclusions or new methodologies.
The claim that 'ice is just ice' is both tautological and missing the point. Glaciers obviously play a role in societies that cannot be captured purely by a description of their iciness. It shouldn't be surprising that analyses of the impact of (for example) climate change on glacier retreat that take into account only a certain subset of their role in a social, human context will give a distorted picture. Such selective views can indeed lead to policies that exacerbate existing power differentials.
Words such as discourse, colonialism and marginalization, the use of which is mocked without further argument by Soave, do have specialised meanings in critical studies and sociology. One might mount arguments about the relevance or quality of scholarship, but to criticise without any appreciation of the academic context is lazy and contributes nothing but noise to the discussion.
If I want to understand how ice melts, I will use the language and methods of physics. If I want to understand what it means to a community when the ice melts, then I will want to use a different set of tools.
The central claim appears to be that humans individually are bad at formulating plans to respond to distant crises, and consequently we are failing to tackle climate change in a meaningful way as a population. But the article is all over the place.
The author states that we should frame the problem with "an ends-justify-the-means approach", based on a quote from a study that states "[...] whereas harm originating from impersonal moral violations, like those produced by climate impacts, prompts consequentialist moral reasoning." On the contrary, the quoted statement indicates that, precisely because the harm is impersonal, we already employ consequentialist reasoning; it describes what we do, rather than prescribing a framing.
Inasmuch as this holds among our population, the conclusion isn't that we are bad at dealing with these sorts of crises, but rather that some of us — in particular, I imagine, the members of our oligarchies — are incapable of or disinclined to engage in moral reasoning. In short, they are broadly psychopaths or evil.
Oh, and "[...] the slowly unfurling nuclear crises that may or may not eventually wipe out whole metropolises and military bases" — the what now?
Consider if the media did do this — detail the (on average) 90 people killed in the US on the roads in one day. And then did it again the next day. And the next. And the next.
Perhaps then the population would demand a proportionate response. Or at least would place the current risk from terrorism in context.
Once that's done, we could move on to cancer.
The erosion of middle-button paste functionality is a continual frustration.
There are cultural differences between the Windows and Macintosh personal computing worlds and that of X11 on Unix workstations. While always allowing customisation, we should hold on to the good ideas of the past rather than dismissing them merely because they are unfamiliar to the personal computer user.
What irks me especially is that the same forces that are driving us towards a Windows-like experience on the Linux desktop are also removing the ability to easily customise our environment, if only to retain the functionality that is being deprecated or dismissed. (I'm looking at you, GNOME.)
The Logitech Anywhere Mouse MX is a wireless mouse, so it may not suit. It does nonetheless have a middle button distinct from the scroll wheel, and it does not have a weird 'ergonomic' shape.
I use it with the laptop, but at work I'm on the sadly long-discontinued Logitech Marble Mouse, with middle-button emulation. (I see that there is now the Trackman Marble, so perhaps I will still have somewhere to go if my venerable trackball ever dies!)
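For anyone wanting to set up the same: with the libinput driver, middle-button emulation can be toggled as an xinput property. The device name below is an assumption; substitute whatever the list command reports for your trackball:

```shell
# Find the exact device name first:
xinput list --name-only

# Enable middle-button emulation (simultaneous left+right = middle click).
# "Logitech USB Trackball" is an assumed name; use yours from the list above.
xinput set-prop "Logitech USB Trackball" "libinput Middle Emulation Enabled" 1
```

Note this xinput setting applies to the running X session only; to make it persistent you would put the equivalent option in an xorg.conf.d snippet.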
It's a shame that statistics has such a reputation for being boring, and a shame that it is so rarely taught in an engaging way.
Statistics is the first artificial intelligence. It formalises what we know when we 'know'. It is fundamental.
It's also fairly hard to do right. But many worthwhile things are hard.
Even setting aside bugs in the compiler's optimizer (and we all know there have been many), the idea that "if the entire code absolutely must stay fully intact, it shouldn't be optimized" is dangerous in itself.
A compiler conforming to its documentation or standard isn't going to change semantics that are guaranteed by that document. Those guarantees, though, are all you have: even without explicit optimization options, a compiler has a lot of freedom in how it implements those semantics. Relying on a naïve translation from a line of source code to a particular, non-guaranteed assembly representation is a very brittle practice.
Graduate-level CS encompasses a lot of ground!
Knuth is of course a valuable addition to the bookshelf — as others have pointed out, it's a superb source for chasing up information, details and citations for the algorithms and data structures one needs to justify or investigate, if nothing else.
Okasaki's Purely Functional Data Structures has also already been mentioned, and I'd add my endorsement!
I would like to recommend two further texts, one on convex optimisation and one on probabilistic graphical models, but frankly I don't know of a single book on either topic that I could whole-heartedly recommend. Any suggestions?
"Experience has proved that some people indeed know everything." -- Russell Baker