Comment Re:Demo Effect (Score 2) 67
Apparently reality distortion field beats demo effect.
Ish.
I would not trust C++ for safety-critical work, as MISRA can only limit features; it can't add support for contracts.
There have been other dialects of C++ - Aspect-Oriented C++ and Feature-Oriented C++ being the two that I monitored closely. You can't really do either by using subsetting, regardless of mechanism.
IMHO, it might be easier to reverse the problem. Instead of having specific subsets for specific tasks, where you drill down to the subset you want, have specific subsets for specific mechanisms where you build up to the feature set you need.
the poster didn't say anything about federal funding. the republican party exists and operates at the state level too, you know
Oh, absolutely. These days, I spend so much time checking the output from computers, it would normally have been quicker to do searches by hand. This is... not useful.
I can fully understand that.
The knowledge is free.
The skilled professionals who can persuade a pupil to absorb it, even a pupil whose civil rights include the right to refuse to learn, are not.
You can lock a kid in a library but you can't make her think. When ignorance is virtue we have lost.
Large houses in hot climates don't have enough roof space to accommodate the number of panels you would need to displace grid power. Using obsolete panels that generated less power when they were new is only going to make the problem worse.
Leaning into unreliable power sources when demand is increasing is just going to make any supply issues worse.
Someone posted what they pay for electricity in California and what their buy back rate is and their rates are insane.
Basing a person's credit score on credit usage rather than payment history is part of the system's bullshit.
Sort of, but lenders aren't looking for people who have good financial sense.
Lenders are looking for people who will borrow a lot of money, then continue paying on that loan. Which is not the same thing as having financial sense.
Which is pants on head retarded.
Both on a moral level, keeping people in debt to ensure they're indentured to someone, and from a business perspective, as it's an imperial shitton (about 1.8 metric fuckloads) of risk. In case you don't remember the last time the banks over-leveraged on risky customers: it was only 2008, and there was a massive collapse we're still feeling the effects of.
I'm glad my country doesn't have anything of the sort. Experian are trying to "sell" their credit score system, but no-one is buying. When you apply for credit here (i.e. a home or car loan), your existing credit obligations, including potential ones (like a credit card or phone contract), are listed as risks and detract from the amount you can borrow.
Compare the market caps though, Nvidia and Intel are not on the same order of magnitude.
I don't think there is much of a frenemy relationship really to speak of.
My guess is this is about two things:
1) Nvidia ensuring they have, or could get, access to an x86 license in case AMD somehow manages both a great leap in the MIMD compute space and, at the same time, an integration advantage by pairing that with traditional compute and memory architecture in EPYC parts.
2) Being sure they have access to some kind of FAB capacity in the event the excrement hits the fan around TSMC, and with a "partner" to whom they could dictate terms.
I think everyone is overthinking this. Nvidia need to do something with the money they're drowning in, and there's only so much that can be syphoned off to tax havens before people start asking questions. If they kept it, they'd have to pay tax on it, and we can't have that.
AMD has been outselling Intel in the DC for what, a year now or more?
No, AMD still trails Intel in the data center, both in terms of revenue and unit sales. AMD's server share was stagnating at 20% for many years despite an uptick in reputation and positive press. It's only been recently this year that AMD hit around 40% in server market share, and that's based on revenue. In terms of units, AMD's market share is lower at around 32%.
Of course, upward momentum is still with AMD, so it wouldn't be surprising to see AMD claim a majority of server market share in the near future.
What's surprised me in the processor market is AMD making inroads in the laptop space that Intel owned for decades, even when AMD were dominating on the desktop and had very strong server offerings (the Athlon 64/Opteron days, for those whose memories stretch back that far). 15% of laptops are now AMD.
So? 99.9% of laptop users will never upgrade their RAM anyways. This is a budget laptop. It doesn't need to be upgraded. It's made to do the basics, like a Chromebook. For someone on a tech website, you're certainly completely out of touch with technology needs.
Upgrading budget laptops is how you save.
I have a cheap Asus gaming laptop for travel. It cost me £550, and most of that was for the RTX 3050. I was able to pop the back off, add another 8 GB of RAM (doubling the original), and replace the 512 GB SSD with a faster 1 TB one, all for less than £100. The next model up with the same amount of RAM was over £200 more expensive (though, to be fair, it did have a 3060).
A lot of budget laptops are deliberately under-spec'd to push people who are afraid to upgrade them into buying a far more expensive laptop.
All of this is a moot point anyway: Apple have been hostile to upgrades and repairs for decades; they already spend extra to solder in the SSDs and RAM modules. And did anyone not predict that they'd kill off the Mac line in favour of phone-based ones? Is anyone still denying that? Mac OS being killed off is next.
It's "so complicated" because with almost any other laptop manufacturer you can add RAM after the fact, and not have to upgrade everything else in the system that you don't need upgraded just to get more RAM.
The people who upgrade ram in their laptops are probably like 1% of the total market. Most are bought by corps or individuals who treat them as an appliance.
That's still 3 million people out of the est. 300 million laptops that get sold each year.
And I'm willing to bet a lot of people are like me: they buy a lower-spec'd model where the RAM and SSD can be upgraded on the cheap, saving a hundred quid off the more expensive one.
However, that's beside the point: Apple have been openly hostile to people who want to upgrade (and to consumers in general) for years. Plenty of other manufacturers to choose from, though.
"Smart" devices were ALWAYS a vector for advertising. First it was a vector for data collection to better target ads, now it's just blatant advertising all the time. I've shied away from smart devices whenever possible, and if I'm forced to buy a smart device to replace a defective device I don't enable the network. "But you're missing out on features, man!" Yeah, I'm missing out on yet another advertising and propaganda channel. Boo hoo.
Smart devices were *NEVER* about adding features for the end user. It was always, and always will be, about advertising. Don't fall for it. Don't support it.
As a side effect, they're going to become a huge vector for malware.
A fridge will last a decade or more, but they'll stop doing software updates after 7 years at most, maybe as few as 2. In fact, some may never receive a software update at all.
You had mail. Paul read it, so ask him what it said.