
Comment Re:Market (Score 2) 474

That's not true; GPUs basically always use the latest process technology available, just like CPUs. Recently, there have been some degenerate cases where a new process is (at least initially) slower and more expensive than the previous one; but in general, they always move to the latest and greatest process, once that process is capable of making a better product.

As for die size, the big GPUs are way bigger than CPUs. A 22-core Xeon Broadwell E5 from 2016 is 7.2 billion transistors, and 456 mm^2. The NVIDIA GP100 chip (also 2016) is 15 billion transistors, and 600 mm^2. The AMD Ryzen (2017) info I can find says it's (probably up to) 4.8 billion transistors.

I have no idea what you mean by "tolerances". Maybe you mean "process variation", which is a natural part of any semiconductor manufacturing - and is controlled by the fab (TSMC, GlobalFoundries, Samsung, Intel), not the chip designers (Apple, NVIDIA, AMD, ummm Intel again). The design houses ship off the chip they want - and the fab produces it, with some chips a little hotter/faster than others. Over time, they can tighten up the process so it has less variation and higher yields, but nobody is "running wild" with anything.

It's complicated too, because the node names are really just marketing hype. Just as "Kaby Lake" is a name that Intel gave to a collection of optimizations put in a single chip, or "Pascal" is a name that NVIDIA gave, or "Ryzen" is a name that AMD gave – 14 nm is a name that some fab gives to their latest collection of optimizations. There's no one measurement that corresponds to the marketing name any more, like there was until the early 2000s. [citation] The upshot of this is that Intel's 14 nm isn't the same as TSMC's 14 nm or GloFo's 14 nm, so you can't necessarily compare them. Intel does generally have an advantage in this space, however. That said, everybody pretty much uses the latest, greatest process technology available to them from the fab they have chosen. And it is often the case that a GPU is one of the first things manufactured in a new process at a fab, so they aren't benefiting from anybody else's prior work - especially not work done at a different fab, because the fabs don't share their secrets, or even the same set of features (as noted previously).

Also, with a brand new process, yields can be very low, so a given company may choose to reduce their risk by making their first chip on a new process either a die shrink of a previous chip, a minor revision to an existing architecture (Intel's "tick"), or a small low-performance chip. Once the kinks have been ironed out on one of those "easy" options, they can shift the bigger, higher-performance chips to the new process. But in some cases, if they started out on the big chips, the yield would be 0% - or if not 0%, the cost of an individual chip would be so high that no consumer would ever pay for it.

And while I will grant you that GPUs have *less* cache, they do still have some caches and other memories. A GP100, for example, has 14 MB of register files, 4 MB of L2 cache, 3.5 MB of shared memory, and 1.3 MB of L1 cache. That's still well shy of the 22-core Xeon I mentioned earlier, which can have up to 55 MB of LLC, but it's a pretty good amount all the same.

The real reason that GPUs have always outpaced CPUs is because they are inherently parallel. In addition to all the architectural optimizations that are made every year, they also add more cores every year; while most of us are still using something in the vicinity of quad-core CPUs, just like we were 5 years ago. Also, the parallelism of GPUs means that they have more freedom for architectural changes to yield throughput enhancements. A CPU is largely targeted at single-thread performance, so most of the optimizations they make will enhance that. A GPU architect can make similar optimizations to enhance a single thread's performance, but they can also make changes that only help parallel computation.
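The throughput argument can be put in rough numbers. This is a toy model with made-up growth rates (the 10% per-core gain and 25% core-count growth are illustrative assumptions, not vendor data), just to show why adding cores every year compounds on top of per-core improvements:

```python
# Toy model (growth rates are illustrative assumptions, not vendor data):
# CPUs improve mostly via per-core speed; GPUs improve via per-core
# speed AND core count, so their aggregate throughput compounds faster.
def throughput(per_core_gain, core_growth, years, base_cores=1):
    cores = base_cores * core_growth ** years
    return (per_core_gain ** years) * cores

cpu = throughput(per_core_gain=1.10, core_growth=1.00, years=5, base_cores=4)
gpu = throughput(per_core_gain=1.10, core_growth=1.25, years=5, base_cores=4)

# With identical per-core gains, the GPU pulls ahead purely on parallelism:
print(gpu / cpu)  # 1.25^5, a bit over 3x
```

The point isn't the specific numbers; it's that the core-count term multiplies everything else, which a single-thread-focused CPU doesn't get to exploit.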

So GPUs are arguably more advanced than CPUs, or at the very least on par with them - and they will continue to outpace CPU development for the foreseeable future as well.

Comment Re:Low Hanging Fruit (Score 1) 2

I'd also venture that it's occasionally possible to make a huge improvement somewhere, but it takes many years of development to get there. By the time your 8-year research project has given you a 16x speedup and is ready to be released to the world, all the incremental changes everybody else has developed have caught up and are just as fast.
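A quick back-of-the-envelope using the numbers above shows why this happens:

```python
# A 16x speedup delivered all at once after 8 years is equivalent to
# steady incremental gains of about 41% per year.
big_bang_speedup = 16.0
years = 8
equivalent_annual_gain = big_bang_speedup ** (1 / years)  # sqrt(2) ~= 1.414

# A field compounding ~40% yearly improvements lands in nearly the
# same place at the same time.
incremental = 1.40 ** years
print(round(equivalent_annual_gain, 3))  # 1.414
print(round(incremental, 1))             # 14.8
```

So the 8-year moonshot only wins if the field's background rate of improvement is well below ~41% per year.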

Comment Re:What's the big problem? (Score 1) 675

As a Canadian who recently moved to the US, I find the system here utterly ridiculous and broken.

No argument here.

I never know when I should swipe vs insert the chip, I have never been asked for a pin, sometimes I have to sign and sometimes I don't (there doesn't seem to be a clear limit), and there's no tap-to-pay.

You can always ask. The monkey behind the counter probably knows. And most terminals I've seen have some sort of indication - for example, a light by both the swipe slot and the dip slot (or whatever you call them). If only the swipe light is lit, you swipe. If both lights are lit, you dip (assuming you have a chip card). Or the POS will say either "Please swipe card" or "Please swipe or insert card".

If you want to use a PIN in the US, you can always get a debit card. But otherwise, you will never be asked for a PIN on a credit card.

The no-signature rules vary by credit card issuer, merchant type, merchant implementation, and purchase amount. For certain types of purchases, certain types of merchants, or merchants who have not implemented it, you will always be asked for a signature. Otherwise, there is generally a $25 or $50 limit, with the higher one usually applied to grocery stores and big box retailers.
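If you wanted to sketch how those rules combine, it might look like this (the limits and merchant categories here are my generalizations from the paragraph above, not any card network's actual published matrix):

```python
# Hedged sketch of the no-signature rules described above -- the $25/$50
# limits and the merchant categories are generalizations, not any card
# network's real rules.
def signature_required(amount, merchant_type, merchant_supports_nsr=True):
    if not merchant_supports_nsr:
        # merchant hasn't implemented no-signature; always sign
        return True
    limit = 50 if merchant_type in ("grocery", "big_box") else 25
    return amount > limit

print(signature_required(30, "grocery"))     # False: under the $50 limit
print(signature_required(30, "restaurant"))  # True: over the $25 limit
```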

Tap-to-pay support is still quite limited, that's true. But one of the nice things about the Chip and Signature transition is that all these merchants are having to buy new POS terminals, most of which include tap-to-pay. It will just take a while for them to update their software and turn the feature on - plus some of them are still hoping that if they don't let you use Apple Pay, et al., they can force you onto their own alternative, CurrentC. (I don't actually know the current status of it, but it seems... questionable at best. But all the same, the stores that supported it are at least a year or two behind everybody else in implementing tap-to-pay – and everybody else is a year or two behind the wide release of phones that support it.)

Comment Re:Strong enough for a man, made for a woman (Score 1) 858

Sex and the City still rated around 6 among men, so they did like it, just not as much as women. Yet the headline was still that men sabotaged the votes.

It occurred to me that there is a relevant quote in the article regarding this:

Distilling any work into a single number strips out a substantial amount of meaning. ... To understand the whole picture, you need to dive into the data.

Except I would word it thus:
Distilling any work into a single headline strips out a substantial amount of meaning. ... To understand the whole picture, you need to dive into the text.

Yeah, the headline is a bit inflammatory. Welcome to 2016. The point is really that, whether intentionally (as the word 'sabotage' would imply) or not, men drag down ratings of certain shows that are more liked by women.

Comment Re:Strong enough for a man, made for a woman (Score 1) 858

If you don't like those shows why not just communicate that to your partner? If she cares about you she won't force you to do things you hate doing, and will find something mutually enjoyable for "quality time". If she doesn't care and insists on torturing you, maybe it's time to reconsider your relationship.

My wife doesn't insist that I watch any particular show with her, but I usually join her anyway. And sometimes, we really agree on liking a show, which is great. But if I'm watching a show she doesn't like, she's sometimes quite vocal about it - often understandably so. If you're not interested in a TV show but all you can hear is the sound of skulls getting bashed together, I can understand how that would be disturbing, annoying, and/or distracting. On the other hand, if she's watching a show I don't like, I'm rarely so offended by it that I raise a stink. (There's no "Dear, there is too much empathy on the TV, can you turn it off?" or "I can't stand the predictable jokes and laugh track!" – on the other hand, if she thinks there's too much violence or sex or whatever, I'm likely to hear about it.) When I want to watch my TV, I wait until she's asleep or away from home.

The net effect is that I end up watching more of what she wants than she does of what I want. Most of the time, I watch her shows anyway, even if I'm not a fan (i.e. I'd give them middling ratings); she does not do the same on my shows. So, if the average man is like me, I could understand how the data would skew that way. Although I'm pretty sure the average man isn't like me, and there are some vindictive pricks out there too who just like crapping on other people's nice things. So I can *still* understand how the data would skew that way.

I think the conclusions at the end of the article are right on point:

Distilling any work into a single number strips out a substantial amount of meaning. ... To understand the whole picture, you need to dive into the data.

Comment Re:Brakes? Tires? (Score 1) 555

Coming up in next month's newsletter:
Eating bacon can extend your life by several years!
In a controlled study, one group of people ate only bacon for all meals, and lived several years longer than the control group! The control group consumed the exact same diet, but without the bacon. While the control group mostly died within a month or two, the bacon-eaters survived several years before succumbing to normal diseases like heart attack, arteriosclerosis, and uncontrollable joy from eating bacon.

Comment Re:Brakes? Tires? (Score 5, Insightful) 555

Articles like this are almost as popular with news sites as "chocolate/beer/wine/cheese/bacon cures cancer!". From what I can tell, the publication was written by a summer intern, roughly a junior in college, who reviewed other publications and made some guesses from the data contained therein. It's a good thought piece, i.e. "Hey guys, there's a lot of stuff that we haven't really done much to improve yet, maybe we should look into that." The publication doesn't argue that "electric cars are evil." It doesn't even have any real data of its own. And well over half of the particulate matter that they attribute to EVs is just stuff that was lying on the ground and got kicked up into the air by the cars; because they claim an EV is 24% heavier, they assume it will kick up 24% more PM in its wake, which is probably not true. I'd be willing to bet that even if EVs average 24% heavier, they are probably not also 24% larger and 24% less aerodynamic; and the size and shape of the vehicle matter at least as much as the weight in creating a wake, if not more.
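To make the assumption explicit (this is my reading of the report's reasoning, not their actual model):

```python
# Sketch of the report's apparent assumption (my reading, not their model):
# resuspended particulate matter scales linearly with vehicle weight.
def resuspended_pm(baseline_pm, weight_ratio):
    return baseline_pm * weight_ratio

ice_pm = 1.0                          # normalized baseline for a gas car
ev_pm = resuspended_pm(ice_pm, 1.24)  # "24% heavier -> 24% more PM"
print(ev_pm)  # 1.24
# The linear-in-weight assumption ignores frontal area and aerodynamics,
# which matter at least as much for the size of the wake.
```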

On top of that, I don't know that reduction of particulate matter has ever been a huge concern for the EV market. Generally, the concerns are more along the lines of reducing CO2 (/CO/NOx/HCHO/NMOG/NMHC) emissions, oil consumption, monetary support to unfriendly OPEC nations, required maintenance, or fuel costs; or increasing support of new technology, renewable energy, etc. But, PM is certainly a health concern, so maybe the article's best use is just to point out that, as long as we're making a lot of other changes in our transportation system, maybe we should consider how we can change it to reduce PM emissions as well.

TL;DR: Science reporting fails again.

Comment Re:And the problem is? (Score 1) 268

The problem is that a lot of companies (and, at their behest, some of the regulators too) are going for a slow takeover of driving by computers. Today they can do a little bit of driving mostly on the highway. Next year, they'll handle some city driving too. The year after that, they'll handle areas without good lane markings, the next year get a little better still, etc. But they still need a person there, because what if the car encounters a woman in an electric wheelchair chasing a duck around an intersection with a broom and doesn't know what to do?

This is one reason why Google's approach is better - build the car to handle everything, even things it has never seen before. Otherwise, you end up with a human who hasn't been paying attention for the past 15 minutes and is suddenly expected to come up to speed (or get his girlfriend's pubic hair out of his face) and take over driving in the next second in order to avoid an accident.

Comment Re:Misleading (Score 4, Insightful) 382

Attributing long lines to TSA pre-check is false; attributing long lines to mismanagement would be more accurate.

Yes, this times 1000. And, FWIW, the article isn't slanted this way, only the summary is. The article is much more straightforward, although they don't explicitly call out mismanagement.

Honestly, I think we'd be better off just getting used to the fact that sometimes bad people will get on planes, and security doesn't need to keep the casualty rate to zero; just discouraging most of the bad guys is good enough. We don't require that cars protect you from every possible way you could die in an accident - we just require them to be pretty good at protecting you most of the time. That's what I'd rather have the TSA's replacement tasked with.

Comment Re:Forget PreCheck if you fly international (Score 1) 382

If I traveled internationally more than once every 2-ish years, I'd consider it. But in my case, I had this choice:
1. Get fingerprinted at a nearby (5 miles) H&R Block office for PreCheck tomorrow.
2. Get fingerprinted and interviewed at the nearest major international airport (40 miles) two weeks from now.

I chose option #1. It was even completed fast enough that I was registered to get PreCheck for my next flight later the same month. Option #2 would have likely taken more of my time than I'd spend just waiting in customs lines over the next 5 years. But yes, for many people, Global Entry might be a better option.

Comment Re:Arleady problematic now (Score 1) 602

My car (about 6 years old now) already has those two features. Newer models certainly do a better job than mine, but I have a few comments...

For the Lane Departure Warning, my car won't even turn it on unless you're over 25 mph. It's really meant for highway driving, not city driving. They're applying this treatment only to roads with a speed limit of 30 mph or less.

For Collision Avoidance (my car calls it a pre-collision system, because it won't actually avoid an accident, only reduce its severity): it does need to know whether a stopped object is in your lane or not, but it's not using the camera and lane departure warning system to figure that out. The LDW can be very unreliable (a sudden shadow like an overpass will confuse it, as will a break in lane lines from a merge or exit). Instead, it uses the steering angle sensor to figure out what direction you're going. I also think it's a little more skeptical of stopped objects; it assumes that if something is already stopped, you probably saw it well in advance, so it brakes only when you're indubitably going to smash right into it. This is how they avoid braking every time you pass a parked car at the start of a curve. On the other hand, if there's a fast-moving object in front of you and it suddenly starts a rapid deceleration, then it's a safer bet that it's on the road with you and not just a random object on the side of the road - and thus it will brake for you.
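A rough sketch of that decision logic might look like this (the thresholds and the lateral-offset framing are purely hypothetical, not any vendor's actual algorithm):

```python
# Hypothetical sketch of the braking logic described above -- not any
# vendor's real algorithm. A stopped object must be squarely in the
# predicted path (from the steering angle) before braking; a moving
# object that brakes hard triggers braking earlier.
def should_brake(obj_speed_mps, obj_decel_mps2, lateral_offset_m,
                 predicted_path_halfwidth_m=1.0):
    in_path = abs(lateral_offset_m) <= predicted_path_halfwidth_m
    if not in_path:
        return False
    if obj_speed_mps < 0.5:                  # effectively stopped
        # assume the driver saw it; brake only if it's dead ahead
        return abs(lateral_offset_m) <= 0.3
    # a moving object braking hard is likely a car in our lane
    return obj_decel_mps2 > 4.0

print(should_brake(0.0, 0.0, 0.2))   # True: stopped, dead ahead
print(should_brake(0.0, 0.0, 0.8))   # False: stopped, off-center (parked car)
print(should_brake(20.0, 6.0, 0.0))  # True: car ahead braking hard
```

The asymmetry between the stopped-object and moving-object branches is exactly how the parked-car-on-a-curve false positive gets avoided.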

Comment Re:More nation-wrecking idiocy (Score 1) 602

I would expect that sort of thing to be focused in certain areas where there are higher accident risks due to inappropriate driver behavior. In general, all the things you listed are done to encourage drivers to drive at a safe speed.

In my area of California, there are a fair number of places that are getting "road diets" where they remove a travel lane and re-stripe. A common change is to take a 4-lane road, reduce it to a 2-lane road, and then add a bike lane in each direction and a center left turn lane. I'm pretty sure I've seen parking both added and removed in various reconfigurations. And most of these 4-lane roads were overbuilt in the first place during a suburban boom; they really weren't meant (or needed) for commuter traffic. There's usually a better road nearby that the traffic engineers are trying to encourage people to use. When these overbuilt roads are available, what tends to happen is some hotshot in a hurry decides to drive 50 mph through a residential area because he can save 10 seconds. Instead, these changes encourage people to stay on the main road at 40 mph.

In general, the total capacity of the road isn't reduced by a road diet; the center left turn lane makes sure that nobody has to wait for left turns, and the bike lane gets bikes and right turns out of the right lane. The end result is that even though there's only one traffic lane, it is more free-flowing than a lane would be in the 4-lane configuration. This is true until you get over 20,000 vehicles on the road in a day; after that point, you do indeed need the extra lanes.

Adding more protected bike lanes can also get more people to bike (and thus fewer people slowing your car down), although that's not something that will happen in any measurable amount by adding one bike lane - that's something you get when you make the whole city bikeable.
