Comment Re:This is an abysmal sales story for AMD (Score 3, Interesting) 21

They had to pay several billion dollars many years later, which was maybe 1% of the money they made during the years AMD had the Athlon/Athlon 64 (and the equivalent Opterons) while Intel had utter crap (the Pentium 4).

Dell was an instrumental part of the illegal activity: they were paid by Intel. And beyond simply not offering AMD themselves, from personal experience it went further than that. I was evaluating servers for the new computer lab at my university back in 2004-2005. The Opteron-powered HP ProLiant servers were offered to us at around $2k per dual-CPU system (academic price); they were about 60% faster on 32-bit Linux (running our own software) and roughly 2.5x faster when we compiled our computational biology etc. stuff for 64-bit Linux. It was a clear win. Suddenly, Dell, with their Xeons (Prescott-based, I think), came back with a ridiculous price of $800 per dual-CPU server! Surely below cost (if you discount Intel's "rebates", of course). The heads of our department were ecstatic: now we could afford twice the servers! I warned them that we would need more than twice the power and cooling and would end up with roughly the same actual performance. They, of course, did not listen, and I graduated. It took them three years to sort out the power and A/C requirements needed to install the full cluster, so they had a "new" cluster of P4-era Xeons in 2008...

Comment Not meaningful world records... (Score 4, Insightful) 40

If you take a modern, heavily parallel benchmark, it is completely meaningless to claim an "8-core record". The whole point is that we can scale much better by adding cores than by adding frequency. The only meaningful constrained record would be for power: Cinebench score per watt.
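To make the score-per-watt idea concrete, here is a minimal sketch. All the scores and package-power figures below are invented for illustration, not real measurements:

```python
# Toy efficiency comparison: Cinebench-style score per watt.
# Every number here is made up purely to illustrate the metric.

systems = {
    "overclocked 8-core (LN2)": {"score": 16000, "watts": 400},
    "stock 16-core desktop":    {"score": 30000, "watts": 200},
    "64-core server":           {"score": 90000, "watts": 360},
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['watts']:.1f} points/W")
```

With numbers like these, the frequency-chasing LN2 setup would come in last by a wide margin, which is exactly why nobody advertises that metric.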

In the meantime, in the real world, the fastest servers you can easily get (e.g. from the large cloud providers) are EPYC Genoa, which are cheaper and much faster than Intel's Sapphire Rapids (and you'll be waiting quite a while for anything Raptor Lake). In fact, I can finally write about my experience with Google's new Genoa VMs: they were in private preview this summer, and I found I could break many of the OpenBenchmarking.org records using the 90-core (180-thread) single- and dual-processor VMs. Yes, I know these are server CPUs, but Xeon vs. EPYC is what is actually relevant to "world record" benchmarking.

Obviously I welcome competition. We were on Xeons for the Haswell/Broadwell/Skylake generations; AMD only caught up with Rome, and we switched everything to Milan when it became widely available. We'll switch back to Intel when they catch up and surpass AMD, but that doesn't seem likely to happen very soon (I wouldn't be surprised if ARM solutions caught up faster than Intel does: Graviton3 is fast). Liquid-nitrogen benchmarks claiming world records by limiting themselves to a few high-frequency cores are little more than PR, despite being cool in both the literal and the metaphorical sense ;)

Comment Re:Forgive me if I seem skeptical (Score 1) 98

I was like that until high school, although I used it mostly for entertainment. I had an ongoing story that I would continue every night: I would start thinking about it lying in bed and transition into lucid dreaming. It was the usual heroic stuff, starring myself, of course. Not sure how I lost the ability...

Comment Re:Forgive me if I seem skeptical (Score 1) 98

If ibuprofen doesn't work for your migraines, try sumatriptan or rizatriptan.

Other than that, you are right, this ultrasonic lucid dreaming sounds like complete BS. I was a lucid dreamer as a teen; I can no longer do it, and I do miss the ability. But no, I am pretty sure this is not the way for me to start having it again.

Comment Re:OK Boomer... (Score 2, Interesting) 234

GTFO, a 'generation of boomers' did NOT go to college for free! ... A LOT OF THEM SERVED IN FRIGGIN WORLD WAR 2.

Their PARENTS might have served in "friggin WW2", but boomers themselves (i.e., people born AFTER the end of WW2, from the late 40s through the early 60s) did not have the time-travel technology required for that. They were toddlers during the Korean War as well; boomers could only have served in Vietnam. And less than 10% of boomers served in the military, according to a quick search, so that's a small minority you are talking about.

Comment I love Excel (Score 1) 83

I love Excel; it's Microsoft's best software ever, by far (and yes, I've tried the alternatives, they are not up to par). But the auto-formatting for detecting dates specifically is ridiculous. It is so permissive that it "works" with slashes, hyphens, etc. Even if you only have two numbers with a hyphen (say, a range used as a column title), it will find some way to interpret them as a date: a number up to 12 can be a month, otherwise it can be a year (no worries, any two-digit number will do, Excel will happily add 1900 or 2000 to make it "right"). And if you dare use part of a month name, you don't even need a delimiter, because you OBVIOUSLY write your dates run together like that...

I know you can turn it off, but you shouldn't have to. Perhaps it should also work more restrictively: there are cases where it is unambiguous that you are writing dates. I end up using ="" a lot of the time. It would even be fine broken as it is, if it were a separate action like Word's auto-formatting. Now, don't get me wrong, Word's auto-formatting is crazy and the main reason I avoid Word, but at least the moment something happens, a Ctrl+Z will revert it.
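To show what "too permissive" looks like, here's a toy heuristic of my own. This is NOT Excel's actual algorithm, just a sketch of how a detector that accepts any plausible reading will swallow values that were never meant to be dates:

```python
# Toy sketch (not Excel's real logic) of permissive date detection:
# anything that can *possibly* be read as a date, is.
import re
from datetime import date

MONTHS = {m: i + 1 for i, m in enumerate(
    ["JAN", "FEB", "MAR", "APR", "MAY", "JUN",
     "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"])}

def permissive_date(text, year=2023):
    """Return a date if the cell text can possibly be one, else None."""
    t = text.strip().upper()
    # Two numbers joined by a hyphen or slash: try month-day, then day-month.
    m = re.fullmatch(r"(\d{1,2})[-/](\d{1,2})", t)
    if m:
        a, b = int(m.group(1)), int(m.group(2))
        if 1 <= a <= 12 and 1 <= b <= 31:
            return date(year, a, b)      # "1-2" becomes January 2nd
        if 1 <= b <= 12 and 1 <= a <= 31:
            return date(year, b, a)      # "13-2" becomes February 13th
    # A month-name prefix fused with a number: "MARCH1", "SEPT2", ...
    m = re.fullmatch(r"([A-Z]{3,})\.?\s*(\d{1,2})", t)
    if m and m.group(1)[:3] in MONTHS:
        return date(year, MONTHS[m.group(1)[:3]], int(m.group(2)))
    return None

print(permissive_date("1-2"))      # a range header, eaten as a date
print(permissive_date("MARCH1"))   # a gene name, eaten as a date
print(permissive_date("hello"))    # None - at least this survives
```

The infamous real-world victim of this behavior is gene names like SEPT2 and MARCH1, which got silently converted in so many published spreadsheets that the genes were eventually renamed.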

Aaand, that's my rant!

Comment Re: USB-C is great, but (Score 1) 191

120W charging is the next big feature; with USB-C cables they can now do it. I got a Xiaomi 12T Pro last year and it completely changed the way I use my charger and removed any battery stress. I usually put it on charge during the 10-15 minutes of my shower, and that's good for the full day. Even if I only remember to charge at the last minute before going somewhere, I just give it 5 minutes and it will last half a day. I haven't used my battery bank since I got this phone, so I no longer carry one. And it's not like it's putting the battery in danger: inside it has 2x2500mAh batteries and charges each at 60W. Oh, and if I go on any sort of trip, I only take my phone charger; it works for both my work MacBook Pro and my personal ThinkPad.
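The dual-cell trick is just arithmetic. A rough back-of-envelope sketch (the nominal cell voltage and the constant-power assumption are my simplifications; real charging tapers off near full):

```python
# Back-of-envelope charge-time estimate for a dual-cell 120 W phone:
# 2 x 2500 mAh cells, each charged at 60 W in parallel.
# Nominal Li-ion voltage and constant power are simplifying assumptions.

capacity_mah = 2500          # per cell
nominal_v = 3.85             # typical Li-ion nominal voltage (assumption)
power_w = 60.0               # per-cell charging power

energy_wh = capacity_mah / 1000 * nominal_v    # ~9.6 Wh per cell
minutes = energy_wh / power_w * 60             # cells charge in parallel
print(f"~{minutes:.0f} min for a full charge (ideal, no taper)")
```

Even after you account for taper and conversion losses, it lands in the 15-20 minute range, which matches the "charge during a shower" experience.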

Comment Re:nVidia at it again... (Score 1) 9

I did say they are not the only ones. But they were pioneers of sorts :) You are talking about 2001, but as I said, Nvidia had been at it since the 90s, when reviewers had not yet wised up to checking image quality along with performance (apart from the obvious 16-bit vs. 24-bit color and shading). Nvidia had either the Riva 128 or the TNT smashing frame-rate records, and ATI (with the Rage Pro at the time, I think) could not match it, according to all the reviews I read in magazines... except one review, if I remember correctly, in the French magazine Joystick (I was trying to keep up my French). They had noticed that the menus in a couple of games (and possibly some text in Final Reality) were not readable. Other magazines probably assumed it was a glitch, but they figured out that the text was unreadable precisely in those games where text (e.g. menus) was actually drawn from textures. They then compared screenshots closely, and you could see the textures on Nvidia were heavily compressed; while the overall quality did not look bad next to the 16-bit 3dfx output, the other 24/32-bit cards were all using high-quality textures. The Internet was in its infancy back then and graphics card reviews were a new "art", so it never became a big story, but the RIVA cards were faster because they heavily compressed textures and thus did not have to move the same amount of data. Nvidia has also been caught cheating in 3DMark on numerous occasions, so, yeah, ATI was in good company :D

But I do think Intel was probably the worst: no idea how much money they threw around, but for a few years (until the Core came out) they managed to keep alive the tale that the Pentium 4 was on par with or better than the competition. Their compiler cheating was also quite despicable.

Comment nVidia at it again... (Score 2) 9

Nvidia declined to discuss the cost of its chip. On Friday Nvidia said it planned to soon roll out a software upgrade that would double the performance from its showing in the MLPerf benchmark.

Ugh, how do they think that's a good announcement? If a software release doubles your hardware's performance in a benchmark, you are "optimizing" for just that benchmark. And I put the term in quotes because Nvidia has been cheating on benchmarks since the 90s (they are obviously not the only ones; Intel has possibly done it more) and we know it...
