Comment Give me one giant touchscreen (Score 1) 60

Honestly, the laptop concept I want to see come to be is one giant, flexible screen with a hinge that will lie perfectly flat and fold a full 360 degrees, probably something segmented like the Surface Book. Fold it all the way over to get a more traditional tablet form factor, use tent mode to present or screen-share, lay it flat on a table to write on or share in a game, or flatten it out and prop it up on a built-in kickstand for something like a 24" display.

For my uses, touchscreen typing is fine for casual tasks like email, chat, and commenting in forums -- even for occasional 'real' work (coding, etc.) in cramped spaces like a train or during a layover. I wouldn't mind carrying a separate keyboard for real work, and laptop keys are getting progressively worse for the most part anyway. Sign me the hell up if I get an actually productive screen size to boot.

IIRC, Lenovo has said they have something like this in mind for the 3rd-gen Yoga Books. They can gladly have my money if they nail the concept.

Comment And what if our simulation is a child's plaything? (Score 1) 951

This kind of argument has been around for a while, and it's almost religious in that you can't ever really prove whether it's true from the inside -- which I suppose is comforting, since it also doesn't matter whether it's true so long as we can't know. If we did break through, talk about an existential crisis!

An even more interesting conclusion, to me, is that if you assume an advanced technological society has pulled this all off for the first time, it follows that eventually the capability would become commonplace. You'd assume at first blush that our simulation *must* be running in some grand institute of science, but you'd be foolish to do so. If you accept the argument's premise, it's much more likely that our simulation is running on a mass-produced plaything, or is their version of a college freshman's D+ work.

You can't even assume they think our simulation is anything special. There's no guarantee of a body that holds the ethical duty of keeping our simulation running, no five-nines uptime, not even proper backups. We'd never be able to observe it from the inside anyway, but it also follows that, eventually, simulations will run so fast and be such a commodity that this original society thinks no more of creating and destroying them than we think of turning our televisions on and off.

Or what if our simulation is a virus? They may curse us and actively seek to wipe us out.

And who's to say our creators are the origin? Maybe our existence is just an experiment into their own. It's turtles all the way down until it's not.

Comment Re:An idea. (Score 1) 106

It's just cost-prohibitive. For downloadable titles, the cost to pump bits to the end user is pennies per install, even for 50GB+ titles. Producing a BD-ROM, packaging, and all the physical manufacturing costs more, but still on the order of maybe a dollar or so -- in fact, the marketing spend (which is what retailers rely on most heavily when deciding whether, and how much, of their valuable shelf space to allocate to a title) surely dwarfs the cost of producing the tangible good.

Technologically, sure, we could ship games on the same type of flash technology used in SSDs. Heck, we could even put the SSD controller inside the machines and make it configurable, so that each game could choose the number of flash chips (channels are part of how SSDs attain the speeds they do) and the capacity it needs. But even if your performance/capacity needs are small (say, small games that don't stream content in), you're talking several dollars just for flash, maybe another dollar for circuit board/shell/packaging, and, because you're now physical-only, lots of marketing spend to gain a small amount of time on a small amount of shelf space -- oh, and you've also just raised the pricing floor beneath which you can't profitably sell.

Selling new games at retail is already a thin-margin endeavor (a retail outlet makes perhaps a couple bucks per unit) -- that's why GameStop pushes you to buy used copies of even the newest games: their profit margin on those is much bigger, even at 5-10 dollars of savings for the consumer. Big retailers like Walmart only really sell them as a sort of attraction, in the same way that gas stations sell milk -- when you need milk and can get it at a convenience store, you might also make some other, higher-margin purchases. In both cases, that's why the milk and the games are always in the back of the store.

There's just no room to grow the production cost of the tangible good from pennies to dollars. Would the market adapt to a higher price? Sure, mostly. But from the perspective of the publishers and studios, all they'll have succeeded at is convincing their consumers to pay more for something the publishers see none of the profit from -- leaving consumers less money to spend on the next game, which the publishers would see some profit from.
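
If it helps, here's the back-of-envelope math in code form, using only the rough per-unit figures from the paragraphs above -- so treat every number as illustrative, not industry data:

```cpp
// Napkin unit economics from the figures cited above; all values are
// the rough illustrative numbers in the comment, not real data.
#include <cstdio>

int main() {
    const double digital_delivery = 0.05;  // "pennies per install"
    const double bdrom_unit       = 1.00;  // disc + packaging, ~a dollar
    const double flash_cart_unit  = 6.00;  // several $ flash + ~$1 board/shell
    std::printf("cost delta, disc -> cart:    $%.2f per unit\n",
                flash_cart_unit - bdrom_unit);
    std::printf("cost delta, digital -> cart: $%.2f per unit\n",
                flash_cart_unit - digital_delivery);
    // At a couple dollars of retail margin per new unit, that delta wipes
    // out the margin entirely unless the sticker price rises.
    return 0;
}
```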

Comment Something to see behind the curtain? (Score 5, Insightful) 302

I don't know the details of what Canada is doing, but when it comes to IP-enforcement changes around the world, the United States' IP lobbies are often somewhere in the shadows. It doesn't take much arm-twisting -- just a 'gentle' reminder that future trade negotiations will look unfavorably on those who don't uphold IP the way the US does.

Comment Re:Or a simple solution. (Score 1) 95

It's not nearly that simple -- LXC and the Windows container technology put applications into their own private namespaces: they can't even see other applications, or any resources the underlying OS hasn't given the container access to. This isn't just about isolating software dependencies; it lets you do things like run two apps with mutually exclusive dependencies on the same host OS, or run one instance of an app against a newer version of a library where, in the past, you might not have been able to install two versions of that library side by side.

Containers are like virtual machines, except that rather than each application needing its own individual VM (and the resource usage that brings with it), applications share a host OS without giving up the isolation benefits, and without the redundant resource use of running many operating systems. This matters because one way admins reduce resource usage is to combine applications inside a single traditional VM -- and then you start losing all those isolation benefits, and the thing becomes brittle and difficult to manage. Containers mean you can isolate individual apps and their specific environments with very little overhead, which encourages the good practice -- though you're free to install multiple apps inside a single container if you choose (I've seen people set up entire desktop environments and remote into them). For running a web service, smaller and more isolated is better.
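
For the curious, here's a minimal sketch of the kernel primitive LXC builds on: Linux namespaces. It assumes a Linux box, a C++ compiler, and root privileges (creating UTS/PID namespaces needs CAP_SYS_ADMIN); real container runtimes layer mount, network, and user namespaces plus cgroups on top of this.

```cpp
// Minimal namespace demo: the child gets its own hostname (UTS) and PID
// namespaces, so changes it makes are invisible to the parent -- the same
// isolation primitive LXC/Docker build on. Run as root on Linux.
#ifndef _GNU_SOURCE
#define _GNU_SOURCE
#endif
#include <sched.h>
#include <sys/wait.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

static char child_stack[1024 * 1024];  // clone() needs a stack for the child

static int child_main(void*) {
    (void)sethostname("containerized", strlen("containerized"));  // our UTS ns only
    char name[64] = {};
    gethostname(name, sizeof(name) - 1);
    std::printf("child:  hostname=%s pid=%d\n", name, (int)getpid());  // pid is 1
    return 0;
}

int main() {
    pid_t pid = clone(child_main, child_stack + sizeof(child_stack),
                      CLONE_NEWUTS | CLONE_NEWPID | SIGCHLD, nullptr);
    if (pid < 0) { std::perror("clone (are you root?)"); return 1; }
    waitpid(pid, nullptr, 0);
    char name[64] = {};
    gethostname(name, sizeof(name) - 1);
    std::printf("parent: hostname=%s (unchanged)\n", name);
    return 0;
}
```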

Comment What do you mean "Let them"? (Score 2) 734

What do you mean, "let them"? This is a choice that will primarily affect *them*, for the duration of their adult lives: military draft, taxation, future residency plans, plans for college or post-secondary -- adult things.

My suggestion: your role here is to compile a list of facts, pros, and cons of US citizenship, and to prepare for a frank and earnest conversation with your kids about it when they turn 16 or so -- then tell them you'll help them begin the process at 17 if they choose to go through with it. Your role here is facilitator, not dictator.

Comment Model: Radeon R9 280X or 7970 (Score 2) 110

As for a particular model: if double-precision performance is important, go with a 7970 or 280X on the AMD side (or a 7990 if you need dual-GPU in one slot). They do double-precision at 1/4 their single-precision rate, which is the best you're going to find at consumer-grade pricing -- even more modern or more powerful cards have backed off on double-precision. Something like a 290X has almost 50% more shader ALUs than a 280X and will perform better on single-precision workloads, but it only does double-precision at a 1/8 rate, so it's actually slower on purely double-precision workloads. All of NVIDIA's consumer cards are in the ballpark of 1/8 to 1/16 rate too, except the GTX Titan Black, which does 1/3 rate -- but at $1500, that's nearly Quadro pricing anyway.

If money is no object, the AMD FirePro W9100 is the workstation version of the 290X; it does double-precision at 1/2 the single-precision rate and is the current best of both worlds -- and will probably remain so for the rest of the year -- but it carries a price tag of around three grand.
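
Here's the napkin math behind that 280X-vs-290X claim. The shader counts are the published figures for those cards; clocks are rounded to roughly 1GHz for illustration, so the absolute numbers are approximate:

```cpp
// Theoretical throughput: ALUs x 2 FLOPs/cycle (fused multiply-add)
// x clock, then scaled by each card's double-precision rate.
#include <cstdio>

int main() {
    struct Card { const char* name; double alus, ghz, dp_rate; };
    const Card cards[] = {
        { "R9 280X (1/4 DP)", 2048, 1.0, 1.0 / 4 },
        { "R9 290X (1/8 DP)", 2816, 1.0, 1.0 / 8 },
    };
    for (const Card& c : cards) {
        double sp = c.alus * 2 * c.ghz;  // single-precision GFLOPS
        double dp = sp * c.dp_rate;      // double-precision GFLOPS
        std::printf("%s: %.0f GFLOPS SP, %.0f GFLOPS DP\n", c.name, sp, dp);
    }
    return 0;
}
```

Even with ~40% more ALUs, the 290X's 1/8 rate leaves it behind the 280X (704 vs 1024 GFLOPS) for pure double-precision work.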

Comment Re:Cell (Score 1) 338

I'm speaking more to characteristics of the end result rather than lineage. At the broad level, both the original Atom and the PPU in the Xbox 360 and Cell are dual-issue, in-order processors that use thread-level parallelism (hyperthreading) to keep the CPU's execution units fed. Neither the PPU nor Atom (nor the original Pentium, for that matter) supported instruction reordering, register renaming, or speculative execution -- the only instruction-level parallelism any of these designs can extract is that two adjacent instructions can be issued together if there's no dependency between them. These design choices are very different from the fast single cores (what I call 'fat' cores) typical of mainstream, high-performance CPUs from that day through today, like the Pentium 4, Athlon, POWER4, or Intel's current i-series.
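
To make the dependency rule concrete, a purely pedagogical sketch (no particular chip's code):

```cpp
// On a dual-issue, in-order core, adjacent instructions pair up only
// when the second doesn't consume the first's result.
int pairs(int b, int c, int e, int f) {
    int a = b + c;  // independent of the next line...
    int d = e + f;  // ...so both adds can issue in the same cycle
    return a + d;
}

int stalls(int b, int c, int f) {
    int a = b + c;  // the next instruction reads 'a'...
    int d = a + f;  // ...so it can't pair; the core waits a cycle
    return d;       // (an out-of-order core could reorder around this)
}

int main() { return pairs(1, 2, 3, 4) - stalls(7, 2, 1); }  // 10 - 10 = 0
```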

However, one thing I had forgotten about the PPU in the Xbox 360 is that it wasn't just that the AltiVec SIMD units were given a larger register file and some tweaked instructions -- there are actually two full, independent SIMD units per core, one for each thread. The PPU in Cell only had one SIMD unit, and its register file wasn't extended, AFAIK.

Comment Re:clock speed is not the right comparison (Score 2) 338

Unified memory isn't strictly bad, so long as there's sufficient bandwidth and sharing the resource isn't overly contentious. The PS4 has around 176GB/s of bandwidth to its GDDR5. The Xbox One has 68GB/s to its DDR3, but also has 32MB of ESRAM that's effectively a software-controlled level-4 cache shared between the CPU and GPU -- it provides another 120GB/s, so the Xbox One actually has a bit more bandwidth to go around than the PS4 if the software is good about using the ESRAM. There's less apparent bandwidth than a PC with a higher-end discrete GPU, but consoles with unified memory make the difference back by not having to copy things around so much. A simple example of PC inefficiency here: a CPU resource on the PC typically has to be copied somewhere before the GPU can touch it, even on shared-memory systems with integrated GPUs; on a console, glossing over some minor details, you just pass a pointer. Note that Mantle, D3D12, and OpenGL Next are all bringing some of these console-style efficiencies to the PC. Neither platform is bandwidth-starved, unless you were to go out of your way to avoid the Xbox's ESRAM.
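
A conceptual sketch of that copy-vs-pointer difference -- none of this is a real graphics API; every type and function name here is made up for illustration:

```cpp
#include <cstddef>
#include <cstring>
#include <vector>

// What the GPU ends up reading from, in both models (hypothetical).
struct GpuView { const float* data; std::size_t count; };

// Discrete-GPU style: stage the CPU data into GPU-visible memory first.
GpuView upload_discrete(const std::vector<float>& cpu_data,
                        std::vector<float>& gpu_visible) {
    gpu_visible.resize(cpu_data.size());
    std::memcpy(gpu_visible.data(), cpu_data.data(),
                cpu_data.size() * sizeof(float));      // the extra copy
    return { gpu_visible.data(), gpu_visible.size() };
}

// Unified-memory console style: CPU and GPU share the address space,
// so (minor details aside) you just hand over the pointer.
GpuView share_unified(const std::vector<float>& cpu_data) {
    return { cpu_data.data(), cpu_data.size() };       // no copy
}

int main() {
    std::vector<float> verts = { 1, 2, 3 };
    std::vector<float> staging;
    GpuView a = upload_discrete(verts, staging);
    GpuView b = share_unified(verts);
    return (a.count == b.count && b.data == verts.data()) ? 0 : 1;
}
```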

The trade-off of GDDR5 vs DDR3 is that GDDR5 has higher bandwidth but also higher latency. For graphics, and for accessing large swaths of data linearly, GDDR amortizes that latency effectively, but the latency becomes apparent with more random access patterns, complex data structures, and indirection (as in general application code and in game control logic). I believe this is why Microsoft went with DDR3 as main memory: it makes it easier for non-game developers to bring apps to Xbox One.

Comment Re:clock speed is not the right comparison (Score 2) 338

Clock speed really isn't a good comparison. The PPC chips in the last-gen consoles were the PowerPC equivalent of Intel's first-generation Atom processors. They issued more narrowly than Jaguar and didn't have fancy branch predictors or instruction reordering; like Atom, they used hyperthreading to hide when one thread stalled, but that only goes so far. Jaguar in the current gen has a nice branch predictor, modest instruction reordering, can issue more instructions per cycle, and supports AVX. And there are 8 of them instead of just 3 -- all in all, the current gen has around 4x the CPU power of last-gen, even accounting for clock speeds.

All that said, when I was guessing at this generation's specs about 5 years ago (I work in games, so it wasn't merely a 'what if' game), I was right about GPU size and performance, right about the size of main memory, right about unified memory architectures, and right about price, all within a relatively small margin. The CPU I was split on -- I figured we'd see 8-16 "thin" cores, or 4-6 "fat" cores, or maybe a heterogeneous system with 2 fat cores and 4-8 thin cores. We got 8 thin cores running slower than the 2.4GHz I'd have guessed, but my money was on 4 fat cores, precisely because fat cores do much better with AI, general logic, and other branchy or latency-sensitive code. The choice of CPU this generation surprised me.

Comment Re:Cell (Score 4, Insightful) 338

Cell -- at least the part of Cell that provided the computational grunt -- was not a CPU. Cell's CPU proper was a single-core, dual-threaded PowerPC core -- the exact same PowerPC core as the three in the Xbox 360, save the extended vector instruction set and registers that Microsoft had added to its implementation. That core was basically the Intel Atom of PowerPC architectures. The better part of Cell was its DSP-like SPE units, which you can basically think of as SIMD (like SSE) units able to run autonomously out of a small pool of local memory. The PowerPC core's job is to load the SPEs with work (data and code) and coordinate the results -- they work for the CPU, and aren't capable of much on their own.

In the PS4 you don't have Cell; instead you have 4 GPU cores dedicated to compute tasks (they don't have the back end necessary to perform rendering, although they can participate in compute tasks that aid the rendering pipeline). Like Cell's SPEs, these cores work for the CPU, have generally the same programming model (load them up with code and data and set them running), and have the exact same theoretical throughput that Cell had.
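
If it helps, here's that shared programming model caricatured in plain C++, with a worker thread standing in for an SPE or compute unit -- purely illustrative, no console API implied:

```cpp
// The offload pattern: the "CPU" packages data plus a kernel, hands it
// off to a coprocessor stand-in, goes about its business, then collects
// and coordinates the result.
#include <future>
#include <numeric>
#include <vector>

int main() {
    std::vector<float> data(1 << 20, 1.0f);
    // "Load it up with code and data and set it running":
    auto task = std::async(std::launch::async, [d = std::move(data)] {
        return std::accumulate(d.begin(), d.end(), 0.0f);  // the "kernel"
    });
    float result = task.get();  // CPU collects/coordinates the result
    return result == (1 << 20) ? 0 : 1;
}
```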

Variety and competition are great, but Cell was nothing special in the sense that what was good and unique about it has been subsumed by GPU compute -- it was ahead of its time, but it hasn't aged well. Game consoles are a commodity business, though; it's hard to justify custom or obscure hardware unless it's the only way to get the architecture you want, and then you have to teach everyone to use it effectively.

Comment Re:ha! Inuit diet. Hazda diet. (Score 1) 281

That's not fully informed -- in general, the average life expectancy of these peoples is dragged down by unusually high infant and child mortality, and to a lesser extent by unnaturally early adult mortality due to lifestyle hazards, with both issues exacerbated by limited or no access to modern medicine. When individuals survive those additional hazards, their life expectancy is similar to that of people living a more "Western" lifestyle.

But life expectancy is not really the point here -- health and quality of life are far more apropos. By that measure, these people "outlive" the average Westerner in spades, regardless of how old they are when they kick the bucket.

Comment Re:Why do we need Auto? (Score 2, Informative) 193

Auto has other benefits. Firstly, in nearly any case where you would use it, you use it not to avoid stating a type altogether, but to avoid repeating type information that's already stated explicitly or is immediately apparent from the initializing expression (e.g. int x = (int)some_float;). Secondly, in generic code, auto makes it easy to adapt types to different combinations of template type parameters when that's exactly what you want to do; the alternative has been to maintain those relationships yourself, through what is usually a non-trivial arithmetic of types (including their const-ness and reference-ness), sometimes involving multiple levels of intermediate typedefs. Thirdly, some types in C++ actually cannot be stated explicitly (for example, the type of the closure generated from a lambda expression), and auto is the only option if you wish to store or reference one.

In all cases, the value is strongly typed. The type of a value is set in stone the moment the program is compiled, although multi-type values are available through library implementations (e.g. boost::variant, I think is the name of one).
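
A quick sketch of all three uses in plain C++11/14 (nothing here is project-specific):

```cpp
#include <map>
#include <string>
#include <vector>

// Case 2: the result type depends on both template parameters; a trailing
// return type with decltype tracks it instead of hand-rolled typedefs.
template <typename T, typename U>
auto multiply(T a, U b) -> decltype(a * b) { return a * b; }

int main() {
    // Case 1: don't repeat a type that's already spelled out on the right.
    std::map<std::string, std::vector<int>> scores;
    auto it = scores.find("alice");  // vs. std::map<std::string,
                                     //     std::vector<int>>::iterator

    // Case 2 in action: int * double deduces to double.
    auto product = multiply(3, 2.5);

    // Case 3: a closure's type cannot be written out; auto is the only way
    // to store one directly (std::function would work, but adds erasure).
    auto doubler = [](int n) { return n * 2; };

    // Everything above is still statically typed at compile time.
    return (it == scores.end() && doubler(2) == 4 && product == 7.5) ? 0 : 1;
}
```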

Comment Re: You're doing it wrong. (Score 4, Interesting) 199

Having a well-thought-out, consistent, orthogonal, and (to the extent possible) obvious UI can go a long way toward a good user experience, bring relevant information nearer to the user, and even make the documentation easier to write -- but even having achieved that ideal, UI/UX cannot and will not substitute for documentation.

At the end of the day, your users have a business goal, and you've sold them on the idea that your software package will help them achieve it better and more easily than other solutions. You sell solutions and solution components, but you also sell 'better' and 'more easily'. Documentation is necessary; no amount of UI will take you from splash screen to solution while navigating a large set of outcomes and a series of interdependent choices.

DO provide a UI reference, but scenario-driven documentation is your users' greatest need.
DO automate common, simple tasks to the extent possible.
DO make doing the right thing easy, and wrong or dangerous things hard.
DO bring the most relevant information into the app in abbreviated form (apply the 90/10 rule).
DO link the UI to relevant documentation.
DON'T get hung up on covering every possible scenario (again, the 90/10 rule).
DON'T believe that a perfect UI avoids the need for documentation.
DON'T try to bring all the documentation into the UI.
DON'T rely on your own intuition about what's common or difficult for users; ask them, or collect the data.
