Re: 6,000 people, in Hong Kong?
It was inside Kowloon walled city. The bomb went off. Nobody noticed a difference.
As long as the old USB-C power bricks can be used
It's more likely that USB-D power bricks will be compatible with old -C devices. Just support the -C power handshake. Which will make for a lot of orphaned -C bricks, destined for the dump.
Or the APOE4 variant mice realized that they were giving their lives for a poorly designed experiment. They just figured, "Screw it" and stood out where they figured an owl would end it all.
We don't need another civil war, MAGAs. [FTFY]
King George III? The civil war?* Actually, characterizing our revolution as a "civil war" isn't entirely incorrect. But then we went off and had another one about a hundred years later. Over some woke GOP agenda. Perhaps it's time for another. To keep the Left Coast from falling under foreign influence.
*Almost as funny as Belushi's remark about the Germans bombing Pearl Harbor. You probably had the same history professor.
True. So true.
I guess this is why they say the left can't meme.
The ghost of Enron.
Facilitator agents will now sit in on Teams meetings, creating agendas,
... switching the company to Linux.
Basing a person's credit score on credit usage rather than payment history is part of the system's bullshit.
Sort of, but lenders aren't looking for people who have good financial sense.
Lenders are looking for people who will borrow a lot of money, then continue paying on that loan. Which is not the same thing as having financial sense.
Which is pants on head retarded.
Both on a moral level, keeping people in debt to ensure they're indentured to someone, and from a business perspective, as it's an imperial shitton (about 1.8 metric fuckloads) of risk. In case you don't remember the last time the banks over-leveraged on risky customers, it was only 2008, and there was a massive collapse we're still feeling the effects of.
I'm glad my country doesn't have anything of the sort. Experian are trying to "sell" their credit score system but no-one is buying. When you apply for credit here (i.e. a home or car loan), your existing credit obligations, including potential ones (like a credit card or phone contract), are listed as risks and detract from the amount you can borrow.
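To make that concrete, here's a rough Python sketch of that kind of affordability check. Everything in it (the function name, the 3% card figure, the assessment rate, the income numbers) is invented for illustration; it's not any lender's actual formula.

def borrowing_capacity(net_monthly_income, living_expenses,
                       existing_repayments, credit_card_limits,
                       assessment_rate=0.06, term_months=360):
    """Toy serviceability check: obligations, including unused card limits,
    reduce the monthly surplus and therefore the maximum loan."""
    # Treat a credit card as if it could be drawn to its full limit;
    # assume ~3% of that limit as a committed monthly repayment.
    potential_card_repayments = credit_card_limits * 0.03
    surplus = (net_monthly_income - living_expenses
               - existing_repayments - potential_card_repayments)
    if surplus <= 0:
        return 0.0
    # Present value of an annuity: the debt that surplus can service
    # at the assessment rate over the loan term.
    r = assessment_rate / 12
    return surplus * (1 - (1 + r) ** -term_months) / r

# An unused 10,000 card limit still knocks roughly 50k off the maximum loan.
print(round(borrowing_capacity(5000, 2500, 300, 0)))       # no card limit
print(round(borrowing_capacity(5000, 2500, 300, 10000)))   # same income, with card limit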
Compare the market caps though, Nvidia and Intel are not on the same order of magnitude.
I don't think there is much of a frenemy relationship really to speak of.
My guess is this about two things:
1) Nvidia ensuring they have, or could get, some access to an x86 license in case AMD somehow manages both a great leap in the MIMD compute space and, at the same time, some kind of integration advantage by pairing it with traditional compute and memory architecture in EPYC parts.
2) Being sure they have access to some kind of FAB capacity in the event the excrement hits the fan around TSMC, and with a "partner" to whom they could dictate terms.
I think everyone is overthinking this. Nvidia need to do something with that money they're drowning in, and there's only so much that can be syphoned off to tax havens before everyone starts asking questions. If they kept it, they'd need to pay tax on it, and we can't have that.
AMD has been outselling Intel in the DC for what, a year now or more?
No, AMD still trails Intel in the data center, both in revenue and in unit sales. AMD's server share stagnated at around 20% for many years despite an uptick in reputation and positive press. Only recently this year did AMD hit around 40% server market share, and that's measured by revenue. In terms of units, AMD's share is lower, at around 32% (the toy numbers below show how that gap arises).
Of course, upward momentum is still with AMD, so it wouldn't be surprising to see AMD claim a majority of server market share in the near future.
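Rough toy numbers to show how a 32% unit share can still be a 40% revenue share: it only takes a higher average selling price. Every figure here except the 40%/32% shares quoted above is made up.

# 32% of units but 40% of revenue just requires AMD's average selling
# price (ASP) to be roughly 1.4x Intel's. Unit counts and ASPs are invented.
amd_units, intel_units = 320, 680      # 32% / 68% unit share
amd_asp, intel_asp = 1417, 1000        # hypothetical ASPs

amd_rev = amd_units * amd_asp
intel_rev = intel_units * intel_asp

unit_share = amd_units / (amd_units + intel_units)
rev_share = amd_rev / (amd_rev + intel_rev)
print(f"unit share {unit_share:.0%}, revenue share {rev_share:.0%}")
# unit share 32%, revenue share 40%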
What's surprised me in the processor market is AMD making inroads in the laptop space that Intel owned for decades, even when AMD were dominating on the desktop and had very strong server offerings (the Athlon 64/Opteron days, for those whose memories stretch back that far). 15% of laptops are now AMD.
So? 99.9% of laptop users will never upgrade their RAM anyways. This is a budget laptop. It doesn't need to be upgraded. It's made to do the basics, like a Chromebook. For someone on a tech website, you're certainly completely out of touch with technology needs.
Upgrading budget laptops is how you save.
I have a cheap Asus gaming laptop for travel. It cost me £550 and most of that was for the RTX 3050. I was able to pop the back off, add another 8 GB of RAM (doubling the original), and replace the 512 GB SSD with a faster 1 TB drive for less than £100. The next model up with the same amount of RAM was over £200 more expensive (though to be fair it did have a 3060).
A lot of budget laptops are deliberately underspec'd to push people who are afraid to upgrade into buying a far more expensive laptop.
All of this is a moot point anyway: Apple have been hostile to upgrades and repairs for decades, and they already spend extra to solder in the hard drives and RAM modules. Did anyone really fail to predict that they'd be killing off the Mac line in favour of phone-based ones? Anyone still denying it? macOS being killed off is next.
It's "so complicated" because with almost any other laptop manufacturer you can add RAM after the fact, and not have to upgrade everything else in the system that you don't need upgraded just to get more RAM.
The people who upgrade ram in their laptops are probably like 1% of the total market. Most are bought by corps or individuals who treat them as an appliance.
That's still 3 million people out of the est. 300 million laptops sold each year.
And I'm willing to bet a lot of people are like me: buy a lower-spec'd model so I can upgrade the RAM and SSD on the cheap, saving a hundred quid over the more expensive one.
However, that's beside the point: Apple have been openly hostile to people who want to upgrade (and to consumers in general) for years. Plenty of other manufacturers to choose from, though.
"Smart" devices were ALWAYS a vector for advertising. First it was a vector for data collection to better target ads, now it's just blatant advertising all the time. I've shied away from smart devices whenever possible, and if I'm forced to buy a smart device to replace a defective device I don't enable the network. "But you're missing out on features, man!" Yeah, I'm missing out on yet another advertising and propaganda channel. Boo hoo.
Smart devices were *NEVER* about adding features for the end user. It was always, and always will be, about advertising. Don't fall for it. Don't support it.
As a side effect, they're going to become a huge vector for malware.
A fridge will last a decade or more, but they'll stop doing software updates after 7 years at most, maybe as few as 2. In fact, some may never receive a software update at all.