
Comment Re:Cell (Score 1) 338

I'm speaking more to characteristics of the end result than to lineage. At the broad level, the original Atom and the PPU in the Xbox 360 and Cell are all dual-issue, in-order processors that use thread-level parallelism (hyperthreading) to keep the CPU's execution units fed. Neither the PPU nor Atom (much like the original Pentium before them) supported instruction re-ordering, register renaming, or speculative execution -- the only instruction-level parallelism any of these designs can extract is that two adjacent instructions can be issued together if there's no dependency between them. These design choices are very different from the fast single cores (what I call 'fat' cores) that have been typical of mainstream, high-performance CPUs from that day through today, like the Pentium 4, Athlon, POWER4, or Intel's current i-series.
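
To make that concrete, here's a tiny illustrative C++ sketch (mine, not from any real codebase) of the one kind of ILP such a core can find:

    // Illustrative only: how a dual-issue, in-order core extracts (or fails
    // to extract) instruction-level parallelism.

    // No dependency between the two adds, so an in-order dual-issue core
    // can issue them together in one cycle.
    int independent_ops(int x, int y) {
        int a = x + 1;  // issue slot 0
        int b = y + 1;  // issue slot 1 -- independent, can pair
        return a + b;
    }

    // Here the second op consumes the first's result, so the core must
    // serialize them -- and with no re-ordering or speculation, it stalls
    // rather than finding other work to do.
    int dependent_ops(int x) {
        int a = x + 1;  // must produce 'a' first
        int b = a * 3;  // depends on 'a' -- cannot pair
        return b;
    }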

However, one thing I had forgotten about the PPU in the Xbox 360 is that it wasn't just that the SIMD AltiVec units were given a larger register file and some tweaked instructions -- there are actually two full, independent SIMD units per core, one for each thread. The PPU in Cell only had one SIMD unit, and its register file wasn't extended, AFAIK.

Comment Re:clock speed is not the right comparison (Score 2) 338

Unified memory isn't strictly bad, so long as there's sufficient bandwidth and sharing the resource isn't overly contended. The PS4 has around 176GB/s of bandwidth to its GDDR5. The Xbox One has 68GB/s to its DDR3, but also has 32MB of ESRAM that's effectively a software-controlled level-4 cache shared between the CPU and GPU -- it provides another 120GB/s, so the Xbox One actually has a bit more bandwidth to go around than the PS4 if the software is good about using the ESRAM. There's less apparent bandwidth than a PC with a higher-end discrete GPU, but consoles with unified memory make up the difference by not having to copy things around so much -- a simple example of PC inefficiency here is that a CPU resource typically has to be copied somewhere before the GPU can touch it, even on shared-memory systems with integrated GPUs; on a console, passing over some minor details, you just pass a pointer. Note that Mantle, D3D12, and OpenGL Next are all bringing some of these console-style efficiencies to the PC. Neither platform is bandwidth-starved, unless you were to go out of your way to avoid the Xbox's ESRAM.
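
To illustrate the copy-vs-pointer point, a minimal sketch -- all names here are hypothetical stand-ins, not any real graphics API:

    #include <cstddef>
    #include <cstring>
    #include <vector>

    // Hypothetical stand-in for a GPU-visible allocation.
    struct GpuBuffer { std::vector<unsigned char> storage; };

    // Discrete-GPU PC path: CPU-side data is copied into GPU-visible
    // memory before the GPU may touch it (the memcpy stands in for a
    // PCIe upload).
    GpuBuffer upload_for_gpu(const unsigned char* data, std::size_t bytes) {
        GpuBuffer buf;
        buf.storage.resize(bytes);
        std::memcpy(buf.storage.data(), data, bytes);  // the extra copy
        return buf;
    }

    // Unified-memory console path: CPU and GPU share one physical memory,
    // so (glossing over cache and residency details) the GPU can simply
    // be handed the CPU's pointer.
    const unsigned char* pass_to_gpu(const unsigned char* data) {
        return data;  // "you just pass a pointer"
    }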

The trade-off of GDDR5 vs DDR3 is that GDDR5 has higher bandwidth, but it also has higher latency. For graphics, and for accessing large swaths of data linearly, GDDR5 amortizes that latency effectively, but the latency becomes apparent under more random access patterns, complex data structures, and indirection (as in general application code and in game control logic). I believe this is why Microsoft went with DDR3 for main memory: it makes it easier for non-game developers to bring apps to the Xbox One.
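
The access-pattern difference looks like this in code -- an illustrative sketch, not a benchmark:

    #include <vector>

    // Streaming access: each load's address is known far in advance, so
    // hardware prefetching hides DRAM latency -- the pattern GDDR5
    // amortizes well.
    float sum_linear(const std::vector<float>& v) {
        float s = 0.0f;
        for (float x : v) s += x;
        return s;
    }

    // Pointer chasing: each load's address comes from the previous load,
    // so the full memory latency is exposed on every hop -- the pattern
    // where DDR3's lower latency pays off (think game logic walking
    // linked structures).
    struct Node { const Node* next; float value; };
    float sum_chain(const Node* n) {
        float s = 0.0f;
        for (; n != nullptr; n = n->next) s += n->value;
        return s;
    }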

Comment Re:clock speed is not the right comparison (Score 2) 338

Clock speed really isn't a good comparison. The PPC chips in the last-gen consoles were the PowerPC equivalent of Intel's first-generation Atom processors. They were narrower-issue than Jaguar, and didn't have fancy branch predictors or instruction re-ordering; like Atom, they used hyperthreading to hide when one thread stalled, but that only goes so far. Jaguar in the current gen has a nice branch predictor, modest instruction re-ordering, can issue more instructions per cycle, and also supports AVX. And there are eight of them instead of just three -- all in all, the current gen has around 4x the CPU power of the last gen, even accounting for clock speeds.

All that said, when I was guessing at what this generation's specs would be about 5 years ago (I work in games, so it wasn't merely a 'what if' game), I was right about GPU size and performance, right about the size of main memory, right about unified memory architectures, and right about price, all within a relatively small margin. The CPU I was split on -- I figured we'd see 8-16 "thin" cores, or 4-6 "fat" cores, or maybe a heterogeneous system with 2 fat cores and 4-8 thin cores. We got 8 "thin" cores running slower than the 2.4GHz I'd have guessed, but my money was on 4 fat cores, precisely because fat cores do much better with AI, general logic, and other branchy or latency-sensitive code. The choice of CPU this generation surprised me.

Comment Re:Cell (Score 4, Insightful) 338

Cell -- at least the part of Cell that provided the computational grunt -- was not a CPU. The CPU in Cell was a single dual-threaded PowerPC core -- the exact same PowerPC core as the three in the Xbox 360, save the extended vector instruction set and registers that Microsoft had added to its implementation. That core was basically the Intel Atom of PowerPC architectures. The better part of Cell was the DSP units (the SPEs), which you can basically think of as SIMD (like SSE) units that are able to run autonomously out of a small pool of local memory. The PowerPC core's job is to load those units with work (data and code) and coordinate the results -- they work for the CPU, and aren't capable of much on their own.

In the PS4 you don't have Cell; instead you have 4 GPU cores dedicated to compute tasks (they don't have the back end necessary to perform rendering, although they can participate in compute tasks that aid the rendering pipeline). Like Cell's SPEs, these cores work for the CPU, have generally the same programming model (load them up with code and data and set them running), and also have the exact same theoretical throughput as Cell had.
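
The shape of that programming model, as a hedged sketch -- every name here is hypothetical, not any real console SDK:

    #include <cstddef>

    // What the host stages for an autonomous worker (an SPE, or a
    // compute-dedicated GPU core).
    struct Workload {
        const float* input;   // data the host loads in
        float*       output;  // where results land
        std::size_t  count;
    };

    // Stand-in for a worker kernel: once loaded with code and data, it
    // runs on its own out of its local memory pool.
    void worker_kernel(const Workload& w) {
        for (std::size_t i = 0; i < w.count; ++i)
            w.output[i] = w.input[i] * 2.0f;  // some SIMD-friendly transform
    }

    // The host CPU's whole job: stage the work, kick the worker, collect
    // the results. A real system would DMA the data across and signal the
    // worker; here we just call it synchronously to show the shape.
    void host_dispatch(Workload& w) {
        worker_kernel(w);
    }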

Variety and competition are great, but Cell was nothing special in the sense that what was good and unique about it has been subsumed by GPU compute -- it was ahead of its time, but it hasn't aged nearly as well. Game consoles are a commodity business, though; it's hard to justify custom or obscure hardware unless it's the only way to get the architecture you want, and then you have to teach everyone to use it effectively.

Comment Re:ha! Inuit diet. Hazda diet. (Score 1) 281

That's not fully informed -- in general, the average life expectancy of these peoples is dragged down by unusually high infant and child mortality, and to a lesser extent by early unnatural mortality in adults due to lifestyle hazards, both issues exacerbated by limited or no access to modern medicine. When individuals survive those additional hazards, their life expectancy is similar to that of people living a more "Western" lifestyle.

But life expectancy is not really the point here -- health and quality of life are far more apropos -- and in that measure, these people "outlive" the average Westerner in spades, regardless of how old they are when they kick the bucket.

Comment Re:Why do we need Auto? (Score 2, Informative) 193

Auto has other benefits. Firstly, in nearly any case where you would use it, you use it not to avoid stating a type altogether, but to avoid repeating type information that is already stated explicitly or is immediately apparent from the initializing expression (e.g. int x = (int)some_float;). Secondly, in generic code, auto makes it easy to let types adapt to different combinations of template type parameters when that's exactly what you want to do; the alternative has been to maintain those relationships yourself, through what is usually a non-trivial arithmetic of types (including their const-ness and reference qualifiers), sometimes involving multiple levels of intermediate typedefs. Thirdly, some types in C++ cannot be stated explicitly at all (for example, the type of a closure generated from a lambda expression), and auto is the only option if you wish to store or reference one.
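
A minimal sketch of all three cases (names are mine, purely illustrative):

    #include <map>
    #include <string>
    #include <vector>

    // 2) Generic code: let the result type track the template parameters
    // instead of hand-maintaining the type arithmetic yourself.
    template <typename A, typename B>
    auto add(const A& a, const B& b) -> decltype(a + b) {
        return a + b;
    }

    void examples() {
        // 1) Avoiding repetition: the type is already apparent on the right.
        std::map<std::string, std::vector<int>> table;
        auto it = table.begin();  // vs. spelling out the full iterator type

        // 3) Some types can't be named at all: each lambda has a unique,
        // unutterable closure type, so auto is the only way to store one.
        auto closure = [&table](const std::string& k) { return table[k].size(); };
        closure("key");
        (void)it;  // silence unused-variable warnings in this sketch
    }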

In all cases, the value is strongly typed -- the type of a value is set in stone the moment the program is compiled -- although multi-type values are supported through library implementations (e.g. boost::variant, I think, is the name of one).

Comment Re: You're doing it wrong. (Score 4, Interesting) 199

Having a well-thought-out, consistent, orthogonal, and (to the extent possible) obvious UI can go a long way toward improving the user experience, bringing relevant information nearer to the user, and even making the documentation easier to write -- but even having achieved that ideal, UI/UX cannot and will not substitute for documentation.

At the end of the day, your users have a business goal, and you've sold them on the idea that your software package will help them achieve it better and more easily than other solutions. You sell solutions and solution components, but you also sell 'better' and 'more easily'. Documentation is necessary; no amount of UI will take you from splash screen to solution whilst navigating a large set of outcomes and a series of interdependent choices.

DO provide a UI reference, but scenario-driven documentation is your users' greatest need.
DO automate common, simple tasks to the extent possible.
DO make doing the right thing easy, and wrong or dangerous things hard.
DO bring the most relevant information into the app in abbreviated form (apply the 90/10 rule).
DO link the UI to relevant documentation.
DON'T get hung up on covering every possible scenario (again, the 90/10 rule).
DON'T believe that a perfect UI avoids the need for documentation.
DON'T try to bring all the documentation into the UI.
DON'T rely on your own intuition about what's common or difficult for users; ask them, or collect the data.

Comment Shuttle DS437! (Score 5, Informative) 183

Finally, an Ask Slashdot I can answer with personal experience and some authority!

Do yourself a favor and order a Shuttle DS437. I bought one myself and cannot think of a better little box for playing with embedded systems. Here's why:
  • It's small -- about the size of a 5.25" disk drive.
  • It's low-power -- not as low as you'd like, but less than 20 watts under load for the whole system -- and it's passively cooled.
  • It takes a 12V barrel plug from a standard 65-watt laptop power adapter (included) -- easy to replace anywhere in the world. Also good if the impetus for your low-power requirement is something exotic, like wanting to run the system from a battery or solar.
  • It's relatively inexpensive -- about $200 from Amazon.com, and it qualifies for Prime shipping. You'll need to add storage and RAM, but maybe you have some DDR3 SO-DIMMs and a spare 2.5" drive kicking around from an old laptop.
  • It's got two DB9 serial ports, right on the front. Handy!
  • It's a modern system: 64-bit, dual-core Ivy Bridge, SSE 4.2, and support for up to 16GB of RAM.
  • Connectivity: VGA/HDMI, USB 3.0, USB 2.0, dual gigabit NICs, and Wireless-N WiFi.
  • Storage options: you've got one mSATA slot and one 2.5" SATA bay. I've got a 128GB SSD in the mSATA slot and a 500GB magnetic drive installed.
  • It took Ubuntu 14.04 without any significant fuss; most things worked out of the box. I'm not a Linux super-expert, but I got the rest working within an hour or so.

It's "only" 1.8Ghz, but we're talking Ivy Bridge here, not some wimpy Atom or ARM core. Plus, in my experience you really want x86 for your host machine. Not every compiler or tool you might want to use is going to be supported on, say, a lower-powered ARM system.

I considered a lot of exotic ARM boards as my development host, including the BeagleBone, the Jetson TK1, and a handful of others. I think the DS437 leads by a wide margin, but for what it's worth, I considered the Jetson TK1 a distant runner-up.

Comment Re:Cryptographically signed elections? (Score 3, Insightful) 266

And therein lies the rub. Is non-anonymity really better, especially where despots reign? Does it matter whether despots are continually re-elected through fraud or through fear of repercussions if the result is the same?

I'm not one to roll over to this sort of fraud myself, but I have little faith that attaching identity wouldn't simply shift the solution to the 'problem' of the people's will in a different, and likely violent, direction.

Comment Re:Proud? (Score 1) 1233

I don't think it's simply the geographical size of the government's influence that is the problem; if anything, the problem is one of spanning cultural geographies (which happen to concentrate in physical geographies).

But I think there's another way that doesn't necessitate full-on dissolution. It's not just that government interests are spread so widely, it's that there are *so many* (sometimes localized) interests spread far and wide. The government is like four bakers stretching a pizza dough overhead at arm's length from each other, then insisting that the pizza *needs* 50lbs of toppings to make everyone happy. One could give up and have the four bakers each make their own pizza for a different demographic. Or, one might simply not put 50lbs of shit on one big-ass pizza.

A federal government with a much lighter footprint than what we have today could serve the *actual* common interests of the 50 states well. That was the original idea, after all, and it worked reasonably well for approaching 200 years. Then the feds started seizing more power from the states, and the states did nothing about it because they're dependent on big-daddy government's pocketbook. I agree wholeheartedly that a return of power to the states, with a corresponding shrink in the size of the federal government, is needed. I think if that happens, it would actually solve the problems we see today, obviating the need to dissolve the federation, and leave the federation of states (remember, people, that's what "federal government" means: a federation of the states!) to perform the functions it was meant to.

Comment Net Neutrality: It's about content, not capacity. (Score 1) 555

For me, the key thrust of net neutrality is that the network provider shouldn't be able to block or degrade service based on the content being transferred and on the provider's own preferences. Net neutrality doesn't really come into it with regard to the amount of traffic I'm moving through the pipe I paid for -- that seems to be the domain of the terms attached to the plan I signed up for.

In my mind, it would be evil for Google to tell me I can't serve up or consume certain kinds of (legal) content, or to degrade my service while I do, but it's not evil for them to not want me serving up 75TB/mo on a residential-class fiber connection that costs me $39.99/mo. Granted, if they sell me a package that's billed as "unlimited", then that's on them and they can stick it; but if they offer a limited, cheaper service for the masses and a more expensive, less restrictive plan for those who want to pay for it, then it's reasonable for them to want to get paid for it.

Offer unlimited downstream bandwidth and a reasonable, loosely-enforced upstream cap that won't raise a flag for normal usage. When a user consistently goes over, call them up and find out what's what; then either raise their cap, because they actually are just doing a lot of something reasonable, or bump their cap for a fee if what they're doing needs to be done under a different plan. Problem solved.

Comment This is your congress (Score 4, Insightful) 650

Wake up and smell the roses -- just a day after your congress failed to amend a bill with an article to de-fund the domestic spy net exposed by Edward Snowden, they made sure to unanimously amend another bill with a different article to sanction an entire country -- sight unseen -- for harboring him from prosecution for what is essentially whistle-blowing. They are already applying historic pressure, having called on allies to ground a diplomat's plane he was rumored to be aboard.

Anyone who doubts the authenticity of Snowden's information, or the level of access he had in his position, need only look at the effort this government is expending to reel him in; that alone casts all doubt aside.

I would at least applaud them for being internally consistent, if it weren't for the fact that they're only consistent in working against the ideals this country is supposed to hold dear.

Comment An answer for every question. (Score 1, Insightful) 273

If Mother Theresa or[sic] Ghandi had access to 3D printing what would they print?

That's easy!

Mother Teresa would 3D print destitute people suffering from horrible diseases, so that she could lock them away in 'hospice', where they would be denied medical care, pain management, and visitors -- even their 3D printed family.

Gandhi would print naked, pre-pubescent girls to sleep with, so that he could 'prove his piety'.

Come on Slashdot, what's with the softball questions?
