And according to Tom's, that's exactly what the last BIOS update was all about: taking the tacho feedback better into account.
Yeah, and that's the main exchange where I trade mine.
A few other things:
- Much smaller scale, but Cryptsy has almost the whole zoo.
- Lots of traders are playing around and speculating with minor coins.
- That means there's a lot of exchange traffic between major and minor alt-coins.
- That means it's easier to exchange whatever you mined for whatever is more useful for your transactions.
- LTC is starting to grow big enough to get some independence from BTC.
Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice in that it removes the excuses for why a port to PCs can't happen, but both consoles are pretty sad compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the performance gap will only widen.
...all this is done with an integrated GPU. That's the key point. This performance comes out of just an embedded chip that draws a minimal amount of power.
Now scale this thing up, move one generation ahead (to GCN 2.x), and the discrete cards we'll be getting next year from AMD are going to be quite interesting.
I'm talking about all the above being GCN 1.x chips.
The GCN 2.x chips (which were initially going to be inside the HD 8000 series, before they delayed everything) will be here early next year.
Only because the user isn't technically part of the PC.
Also the user isn't under warranty.
(that might explain why most of them are crap)
Stupid planned obsolescence. We should complain to the manufacturer.
or whether review cards come from the initial run, probably the one where AMD's people are most tightly and nervously observing the process, rather than the potentially more variable just-another-day-slapping-chips-on-cards production that follows.
I would indeed agree with your first post and this part. To me, a big conspiracy to manipulate results is far less likely than simple sloppiness in a mass-produced good where speed of production counts, in order to quickly meet demand.
To quote a variant of Hanlon's Razor (often attributed without sources to Napoleon Bonaparte):
"Never ascribe to malice what can adequately be explained by incompetence."
Mere variation is only inconvenient, and may well mean that the usual 3rd-party overkill heatsinks actually help significantly.
Yup, very probably. Especially with modern cards that try to go as fast as they can while still staying within the target thermal envelope.
Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 series.
The problem is that AMD got too busy doing legitimate improvements under contract for the coming generation of consoles (Xbox One, PlayStation 4, and some of the upcoming Steam Boxes all run the same combo of AMD CPUs and GPUs).
With that work, there was going to be some delay for their PC Radeon HD 8xxx series.
So it was either:
- have absolutely nothing to sell.
- do some small upgrades to the older boards (R9 270/280 are simply older boards slightly upgraded) and older chips (R9 290(X) are GCN 1.1 chips, slightly newer than GCN 1.0), and have something to sell until the real new generation comes, while still taking advantage of the time to add some improvements.
Note that this was pretty much announced that way, and well known to people in the field.
Now look at the bright side of things:
Even if they are coming with some delay, that means that next year you're finally going to see the newer GCN 2.0 chip-based cards, which have taken advantage of all the R&D done for the consoles to improve the performance and quality of Radeons.
A little bit later, but with a little more R&D money poured into the steps leading to them (especially the latest step).
due to the fact that the software/driver adjusts the frequency on its own instead of holding a static clock speed (something they should have disclosed to the reviewers).
It's well known that these cards operate at a fixed temperature target and push the clock and voltage as high as they can within that thermal limit.
It's so well known among professionals that some, like Tom's, are giving advice about BIOS replacement (newer ones have better and more consistent throttling and fan control across all the varied parts) or thermal paste replacement (to improve cooling and thus performance).
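To illustrate the idea, here's a minimal Python sketch of that kind of temperature-target boost loop; the target temperature, clock range, and step size are made up for illustration, not AMD's actual firmware logic:

    # Minimal sketch of a temperature-target boost loop (illustrative numbers only).
    TARGET_TEMP_C = 95                      # the card tries to sit at this temperature
    CLOCK_MIN_MHZ, CLOCK_MAX_MHZ = 700, 1000
    CLOCK_STEP_MHZ = 10

    def adjust_clock(clock_mhz: int, temp_c: float) -> int:
        """Push the clock up while under the thermal target, back off when over it."""
        if temp_c < TARGET_TEMP_C:
            return min(clock_mhz + CLOCK_STEP_MHZ, CLOCK_MAX_MHZ)
        return max(clock_mhz - CLOCK_STEP_MHZ, CLOCK_MIN_MHZ)

A card with better cooling stabilizes at a higher clock under this kind of scheme, which is exactly why cooling variance shows up directly as performance variance.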
But if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.
Now add into the mix that some parts, like fans, might be sourced from several different manufacturers (and according to sources, the BIOS wasn't that good at operating them until the latest update), add also that there might be variation in quality between the different batches of thermal paste (which very probably got sourced from several producers), and the output variation is clearly expected.
But it's also fixable (flash the newest BIOS to compensate for the fans with better throttling, manually replace the thermal paste, and the cards now work as well as the benchmarking sample, or even better).
If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst-performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD.
Not that BIOS. As others have pointed out in the thread, the variation in performance is more or less linked to variance in thermal management.
Not all PWM fans behave the same. There's a *newer* BIOS version (not as in "use the one that came with the sample" but as in "download the latest version that was made available on the manufacturer's website between when you bought it and now").
This BIOS version is better at computing what signal it should send to the fan to get better cooling.
And once the cooling is improved, the card will automatically scale up its speed.
Also, there can be differences in thermal grease, etc.
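As a rough illustration of what "computing what signal to send to the fan" means, here's a made-up fan-curve sketch in Python; the breakpoints and the linear interpolation are assumptions for the example, not the logic of any real BIOS:

    # Made-up fan curve: map GPU temperature to a PWM duty cycle (illustrative only).
    FAN_CURVE = [(40, 20), (60, 35), (80, 60), (95, 100)]   # (temp in deg C, duty in %)

    def fan_duty(temp_c: float) -> float:
        """Linearly interpolate the duty cycle between curve breakpoints."""
        if temp_c <= FAN_CURVE[0][0]:
            return float(FAN_CURVE[0][1])
        for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
            if temp_c <= t1:
                return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
        return float(FAN_CURVE[-1][1])

If a fan from a different supplier moves less air at the same duty cycle, the very same curve gives worse cooling; a newer BIOS can compensate with a more aggressive curve or by actually using the tacho feedback.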
At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.
Or, instead of cherry-picking, maybe there's some build-quality difference between the first engineering samples sent by AMD and the mass-produced cards from a NONAME Asian manufacturer? (Or even mass-produced cards from very popular brands that have to fulfill lots of orders?)
Differences in the quality of the fans (NONAME will pick whatever is currently cheapest, and even with popular big names there's going to be some variance, depending on where the current batch was sourced).
Differences in the quality of thermal conduction at the interface. Differences in the quality of the thermal grease (NONAME will pick the cheapest; big names might have variation between batches, especially if they source batches from several manufacturers to keep up with the pace). Differences in the quality of work (NONAME might even do a sloppy job of applying the thermal medium to the heatsink).
You end up with exactly the same chip produced by AMD but vastly different thermal conditions, all with a firmware and a driver that aren't yet the best at fan throttling, and you end up with some measurable differences in output.
Pick Nvidia cards, and you're going to see exactly the same effect.
Either cards that vary in their performance, or cards that show big variations in temperature, depending on how the current firmware throttles the card.
You're joking, but that's more or less what's happening.
There are lots of alt-coins in addition to Bitcoin.
Some are easier to mine on a GPU than others.
You need to pick a coin that plays well with it.
And another (underhanded) jab at Apple in TFA:
"will also be tailored to work well with emerging product designs and will scale for future USB bus performance", said the group
Unlike Apple's Lightning port, current and future USB connections have decent bandwidth, and thus can drive an HDMI-out USB chip to display HD (when they're not speaking 'MHL' instead of USB, directly to the display over a dumb cable).
Unlike Lightning-to-30-pin adaptors, which are basically tiny protocol droids translating between the two.
And the Lightning port actually isn't so much a connector as a direct internal bus. Whatever the iPhone speaks to the outside world is handled by a small chip in the cable or dongle.
In theory, the "pros" given by Apple are:
- Evolutive (future-proof). If a new form of connection arrives (say, for example, USB 3.2 over these new reversible connectors), there's no need for any change in the iPhone itself; just get a different dongle with a USB 3.2 controller inside and a new-gen connector.
In practice, there are tons of "cons":
- Non-standard (of course, that's Apple).
- Mandatory chip (controlled and licensed by Apple) in every single thing that you connect. There can't be a dumb cable; you need a controller.
- Awful bandwidth. Lightning's bandwidth sucks; it can't be used to drive a display. Current HD-video-out dongles have been found to be a sort of wired "AirPlay": the video stream is (destructively) compressed using the accelerated hardware inside the iPhone, the compressed stream is output through the Lightning port, and the HD-video-out dongle contains a full-blown ARM SoC with integrated graphics which decompresses the video stream and outputs it on its own HDMI connector (see the sketch after this list).
(Cue problems with generational compression, overly expensive dongle hardware, etc.)
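To make that data path concrete, here's a tiny Python sketch of the pipeline as described above; every function name is a hypothetical stand-in (not a real API), and the "compression" is a toy placeholder for the phone's hardware encoder:

    # Toy sketch of the Lightning HD-video-out path described above.
    # All names are illustrative stand-ins, not real APIs.

    def lossy_encode(frame: bytes) -> bytes:
        return frame[::2]                       # pretend hardware encoder: drop half the data

    def lossy_decode(data: bytes) -> bytes:
        return bytes(b for byte in data for b in (byte, byte))   # crude reconstruction

    def phone_side(raw_frame: bytes) -> bytes:
        return lossy_encode(raw_frame)          # iPhone compresses with its hardware encoder

    def dongle_side(stream: bytes) -> bytes:
        return lossy_decode(stream)             # ARM SoC in the dongle decodes, then drives HDMI

    frame = bytes(range(16))
    print(dongle_side(phone_side(frame)) != frame)   # True: the display never sees the original frame

What reaches the screen has been through a lossy encode/decode round trip, which is where the generational-compression complaint comes from.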
Man, the future of FOREX is going to make the Linux DE holy wars look like minor doctrinal differences...
I think the parallel with Linux is valid on a lot of points.
Not only has recent history seen an explosion of variants
(there are many alt-coins just as there are many Linux distributions),
but in the long term it will probably resolve itself in the same way:
A couple of widespread mainstream variants (like Debian, Red Hat, Ubuntu, openSUSE) (same in the crypto-coin world: Bitcoin and Litecoin are apparently here to stay, and happy with their position).
A few others for more specialized uses (like Gentoo, Knoppix, SystemRescueCD) (probably in the crypto world some *actually anonymous* coin will emerge).
And then a whole bunch of entries that nobody has ever heard of and are almost not used.
But there's a small difference:
- Low-popularity Linux distros usually end up abandoned,
- whereas low-use coins end up being the playground of troll-traders.
This aspect of trust is no different to existing banks, and it is the reason why banks started to be regulated and insured.
With one small difference:
- With banks, your only choice is trying to find one that you can trust.
- With crypto-currencies, you can also run your own node and, in a way, be your own bank.
In that case, you can trust that you're not going to scam yourself.
And you also know more or less what security measures you're using.
Bitcoin's concept of a wallet simply adds the user as a possible element in the equation.