Comment Re:disingenuous (Score 1) 365
The insurance industry already prices at the human rate of accidents for premiums. That's what the professional actuaries do. One would expect that they'd make it far cheaper to "auto drive" a car instead.
You do, of course, then open up the concept of what is evil. What we consider highly unethical these days and quite unthinkable to do was a matter of course a few hundred years ago.
What needs to happen is an understanding of the person in the context of their day, and a solid appraisal of the good things they did that weren't standard back then, but perhaps have been adopted more widely today.
History is a delicate and nuanced beast, though today people expect to be able to understand it as an entirely univariate entity, which it definitely isn't. Alas, shouting loudly and attacking anyone who disagrees is what passes for discussion on many topics today. Which goes back to your very valid point that much of today's discourse tends to be infantilised.
Very glad you don't run the company.
The bulk of the market these days doesn't much care about discrete graphics cards anymore. General office machines just don't need that kind of power, especially when the entire use case of the office boxes is spreadsheets, word processing and general applications.
Building a cut-down GPU into the base CPU that can handle basic (for a modern system) acceleration of all the requirements, at only a fractionally higher cost, cuts down on the overall cost of the machines shipped by bulk vendors (Dell, HP etc.). And the power consumption is lower too, so it's a double saving.
Now, my own home workstation doesn't fit this profile, so I have a discrete GPU and a CPU without an integrated one. My laptop DOES have an integrated GPU, and it works just fine for everything I use the laptop for.
Employees have known for a long time that their traffic is passing through a gateway that allows or denies their connections (the occasional "This site has been blocked" message conveys that very well). This is different to surveillance in most cases as nobody really has time to trawl through the bulk of the logs where things just work.
What the "man in the middle" warnings do is scare lots of users into suddenly calling the service desks, which get overloaded and push a lot of diagnostic work onto the back-end teams (of which there are not infinite numbers). Rather than condition people to the approach of "Oh look, we have a serious, scary notification that we've been compromised, but we'll continue because it's just the browser sending a spurious warning", they did the eminently sensible thing: "we'll just not bother using a tool that causes us endless work we don't need, when one of the competing products does the job just as well with only a fraction of the hassle".
In an enterprise environment, that's the only sane approach really.
No, it means they've standardised on something else and don't have time to support everything, especially when it doesn't adhere to standards that their supported product does.
That's just a pure scale resourcing issue.
What you're advocating is that a team spends time to go and work out any and all issues with a product that they don't support, when there are other products that they do support that work just fine.
If all the unsupported apps were taken into account and the core groups were tasked with making sure that everything worked (core supported or not), then nothing of worth would ever get done.
This is just another "It's too much time and effort to sort out something we don't have to because we use something else that does the same job perfectly well". That's standardisation for you.
That the area seems to be working fine suggests to me that they already have Chrome (or some other browser they support) configured just how they want it.
That subset (from an outsider's view) is the proven legal citizens. What subset are you talking about?
Every government restricts to a subset of the population of a country. There are various legal statuses in many places that are associated with serious criminal activity that remove voting privilege.
Anyhow, you successfully dodged making much of a response by resorting to whataboutism, which is a logical fallacy and essentially debating in bad faith.
I missed a whole year of schooling (medical issue). I chose to stay in my year and cover the syllabus off my own bat, by lots of reading whenever I was healthy enough to do so, with a copy of what the syllabus was meant to teach me.
I stayed in the same groupings I'd previously had (good at most subjects, terrible at Geography and History simply because I have a terrible memory for dates and place names; I could still give precise sequences of events and find root causes, but that's not what those subjects test at that level).
I'm all for giving extra tuition to those that need and want it. I'm also for streaming, as it keeps each group at a comparable level of achievement (and these days, it's simpler than ever to extend your own learning online). The perfect model would be one teacher to one student, or very few students, as that way you get the best out of each small group. However, the resources aren't there to do that, so you get big classes, where nobody gets the most out of their education.
The reason I hate one-size-fits-all is that it's been done before and it's terrible. My primary school had that policy. I read Lord of the Rings when I was 5, yet at school I was forced to read the children's beginner books. I couldn't run through them at my own pace; I had to choose one book for the hour or so (which I read in about two minutes) and then sit there bored. No advancement beyond the pace the slower members of the class could achieve.
Same with maths. I picked it up early, but was never allowed to progress at any pace other than the one the class moved at as an entity. Now, it wasn't as easy to practice and learn maths back in the '70s as it is now, and certainly not as easy as it was to practice reading. So this ended up stunting and holding back my maths ability, to the point that it damaged my scholarship chances when I went to secondary school.
I was bored out of my mind by things I'd progressed well beyond, and _wanted_ to be challenged, but the school and teachers simply wouldn't let me.
It took far more effort than I'd have otherwise needed to expend to catch up with people who _had_ been allowed to be challenged (and I still think I'd be better than I am if I'd learned earlier, rather than having to fight to stay with the groups that'd been challenged to excel earlier).
That's direct experience. Now, I work in a field that does use math a lot (it's pretty foundational) and to a reasonably high level. All the ones that struggled and were the bulk of the class do not work in fields that need anything other than basic arithmetic. People advance to their strengths as they grow.
The counterpoint to this is that I love music and thoroughly enjoy playing. I'm a competent musician; however, no matter how much I practice or push, I'll _never_ be as good as the people who have a real talent for it. The same goes for a horde of other fields. I may grasp the basics and essentials, enough to get by, but that's where the end point of my education in those fields should be projected to land.

Aiming to improve the overall average is essentially saying "We aim for everyone to be mediocre", which to me is insanity. Challenge people. Get them to be as good as they can be, where they can be. If someone is challenged by the basics, that's not necessarily bad; they'll just be likely to grasp only the basics for quite a long time. If they're a prodigy, they should be taken as far as teaching can take them.

That way, the prodigies have a better chance of actually improving the overall system in their time, allowing everyone to eventually benefit from their excellence, rather than being mired in the mud, with nothing progressing except at the rate the average person could progress it. That path actually leads to a retreat from knowledge: once things move at the pace of the average, what it takes the exceptional to understand today will be forever beyond their reach, the bar for "exceptional" drops, and so on.
Nobody does. They simply put people into those categories based on whether they can do math or not.
Interestingly, that's an out-of-date viewpoint. The Guardian _used_ to be center left (and I used to read it alongside a few other rags across the political and geographic spectrum).
However, I've increasingly found them subject to huge amounts of misinformation. There's a tendency to roll out "experts" on their political cause of the day, but when you chase the credentials, some of their real headline items haven't come from experts in the actual field at all; they've come from lawyers associated with it, offering an opinion rather than a researched expert one, yet they're published as "speaking for the organisation as an expert".
They're pretty much leading the charge on the far left into the mainstream these days.
An example of their reporting comes from recent riots and protests in the UK that landed quite a few police officers in hospital (some seriously injured) after "protestors/rioters" threw large items at them from a height.
The Conservatives (center right, effectively where the Democrat Moderates sit most of the time), the Lib Dems (mainstream Democrat) and Labour (center to left Democrat) all condemned the violence.
Almost every news outlet condemned the violence. The Guardian instead reported on the poor protesters, and how important it was that they be able to do whatever they wanted so long as they called it a protest, with any restriction being inhumane.
With you about the American right being pretty out of kilter, though. As is the American left. For some reason, American culture seems to love asserting that there must be one way, and that resistance to compromise is a virtue, leading to increasingly polarised stances and the eradication of the middle ground.
I quite like my middle ground, and I happily irritate people who try and dislodge me from it.
Yep, that's the reason I didn't move to the USA when I had the chance.
I've seen some very worrying developments in US-based thought that directly parallel Ash'arism a thousand years ago, when Arabic countries were at the heart of scientific (or at least rational) progress in the world.
Essentially, Ash'arism was the rise of an anti-rationalist movement which subverted and finally took precedence over the rational disciplines, so that eventually all 'science' was subject to political/religious approval from this sect, which shut down anything it didn't like to hear.
This is closely analogous to the Cancel Culture that's currently prevalent. The Religious Right would like to do the same, but never really gained mass traction, unlike the media-supported far-left ideology that's become worryingly mainstream; effectively, the USA now has a firmly entrenched Ash'arite equivalent that really would like to unseat the rational. The endpoint, should it continue to thrive, is well documented historically (just look at the state of Middle Eastern countries still ruled by anti-rationalist leaders, and the living standards of their citizens).
An interesting read that covers this (and that I found to be a very enjoyable and quite educational text) is: https://www.thenewatlantis.com...
Old programmers never die, they just hit account block limit.