
Comment Re:Fighting Game Players Going to Be Bummed (Score 1) 202

I'm not a fighting gamer but I am sensitive to input lag, and my 50" Panasonic plasma was the answer to my prayers. Not only is the lag low, but there are real blacks and no ghosting. I might have to get another one before they shut down production. The downsides are high power consumption (it noticeably heats up my room) and the fact that they only come in large sizes, mostly greater than 50".

Comment Re:The paper gives examples (Score 1) 470

It's not so much a violation as assuming behaviour is defined when it is not.

Yes, that's a better way to say it.

The passing of parameters and return values falls under the purview of calling conventions and ABIs. These are discussed in the compiler manual (yes, yours has one), but usually ignored on PCs. (Or really anything with an OS.) In embedded programming, that stuff is much more useful, since you're more often interfacing C and assembly functions. It's also helpful for debugging, since the debugger is often confused by optimization. Put a breakpoint on the branch instruction and the parameters will be in the registers or on the stack.
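
As a concrete (if trivial) sketch -- assuming an x86-64 Linux target, where the System V AMD64 calling convention applies -- the names here are just for illustration:

/* add.c: a toy example for watching the calling convention in action.
   Under the System V AMD64 ABI, argument a arrives in %edi, b in %esi,
   and the result comes back in %eax. Break on the call instruction in
   main and the arguments are sitting in those registers, even when
   optimization has scrambled the source-level view in the debugger. */
int add(int a, int b)
{
  return a + b;
}

int main(void)
{
  return add(2, 3);
}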

Someone posted a link to Deep C elsewhere in the comments, which goes over some of these details.

Comment The paper gives examples (Score 4, Informative) 470

The article doesn't summarize this very well, but the paper (second link) provides a couple of examples. First up:

char *buf = ...;
char *buf_end = ...;
unsigned int len = ...;
if (buf + len >= buf_end)
  return; /* len too large */
if (buf + len < buf)
  return; /* overflow, buf+len wrapped around */
/* write to buf[0..len-1] */

To understand unstable code, consider the pointer overflow check buf + len < buf shown [above], where buf is a pointer and len is a positive integer. The programmer's intention is to catch the case when len is so large that buf + len wraps around and bypasses the first check ... We have found similar checks in a number of systems, including the Chromium browser, the Linux kernel, and the Python interpreter.

While this check appears to work on a flat address space, it fails on a segmented architecture. Therefore, the C standard states that an overflowed pointer is undefined, which allows gcc to simply assume that no pointer overflow ever occurs on any architecture. Under this assumption, buf + len must be larger than buf, and thus the "overflow" check always evaluates to false. Consequently, gcc removes the check, paving the way for an attack on the system.
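
One well-defined rewrite (my sketch, though the paper discusses the same idea) compares lengths instead of forming the possibly-overflowed pointer:

char *buf = ...;
char *buf_end = ...;
unsigned int len = ...;
/* Assuming buf <= buf_end and both point into the same array,
   buf_end - buf is well-defined and no out-of-bounds pointer is ever
   formed. This one test covers both "too large" and "wrapped around". */
if (len >= (size_t)(buf_end - buf))
  return; /* len too large */
/* write to buf[0..len-1] */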

They then give another example, this time from the Linux kernel:

struct tun_struct *tun = ...;
struct sock *sk = tun->sk;
if (!tun)
return POLLERR;
/* write to address based on tun */

In addition to introducing new vulnerabilities, unstable code can amplify existing weaknesses in the system. [The above] shows a mild defect in the Linux kernel, where the programmer incorrectly placed the dereference tun->sk before the null pointer check !tun. Normally, the kernel forbids access to page zero; a null tun pointing to page zero causes a kernel oops at tun->sk and terminates the current process. Even if page zero is made accessible (e.g., via mmap or some other exploits), the check !tun would catch a null tun and prevent any further exploits. In either case, an adversary should not be able to go beyond the null pointer check.

Unfortunately, unstable code can turn this simple bug into an exploitable vulnerability. For example, when gcc first sees the dereference tun->sk, it concludes that the pointer tun must be non-null, because the C standard states that dereferencing a null pointer is undefined. Since tun is non-null, gcc further determines that the null pointer check is unnecessary and eliminates the check, making a privilege escalation exploit possible that would not otherwise be.
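
The fix is as boring as you'd expect -- roughly what the actual kernel patch did -- just don't touch the pointer until after the check:

struct tun_struct *tun = ...;
struct sock *sk;
if (!tun)
  return POLLERR;
sk = tun->sk; /* dereference only after the null check */
/* write to address based on tun */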

The basic issue here is that optimizers are making aggressive inferences from the code based on the assumption of standards-compliance. Programmers, meanwhile, are writing code that sometimes violates the C standard, particularly in corner cases. Many of these seem to be attempts at machine-specific optimization, such as this "clever" trick from Postgres for checking whether an integer is the most negative number possible:

int64_t arg1 = ...;
if (arg1 != 0 && ((-arg1 < 0) == (arg1 < 0)))
  ereport(ERROR, ...);
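
A portable version (my sketch, not Postgres's actual code) skips the negation entirely and compares against the limit from stdint.h:

#include <stdint.h> /* for INT64_MIN */

int64_t arg1 = ...;
/* -arg1 is undefined behavior when arg1 == INT64_MIN (signed overflow),
   so test for the one problematic value directly. */
if (arg1 == INT64_MIN)
  ereport(ERROR, ...);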

The remainder of the paper goes into the gory Comp Sci details and discusses their model for detecting unstable code, which they implemented in LLVM. Of particular interest is the table on page 9, which lists the number of unstable code fragments found in a variety of software packages, including exciting ones like Kerberos.

Comment Re:i wonder.. (Score 1) 530

You (like most people with a poor grasp of this physics, thanks to some shitty analogy someone used to explain it to them) are making the common mistake of thinking the person with the flashlight in hand would see the light traveling at 1c away from him (a total of 1.5c), but they wouldn't see any such thing.

No, the GP is correct. Both observers see the light moving at 1c relative to them, even though they're moving relative to each other.
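
For the skeptical: velocities don't simply add in special relativity; they combine through the standard velocity-addition formula,

w = (u + v) / (1 + uv/c^2)

Plug in v = c for the light beam and any ship speed u, and you get w = (u + c)/(1 + u/c) = c. The beam recedes at exactly c in every frame; the "total of 1.5c" never shows up in anyone's measurement.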

Comment Re:This should be a good thing (Score 1) 754

I hardly think it's a bad thing that people expect you to take part in the effort to take part in the spoils. Imagine you had a farmer who did all the work of plowing the field, sowing seeds, clearing weeds, fertilizing, harvesting, grinding to flour and baking the bread, then along comes this guy and says I'm hungry, feed me.

I didn't say (or mean) that expecting people to contribute something is bad, especially not today. But who contributes and what counts as a real contribution are fairly arbitrary. Children in industrialized societies are usually not expected to contribute to their household income today, even though that was the norm for millennia. Likewise, opinions vary wildly over whether being a stay-at-home parent counts as a "job", and often depend on unrelated things like the race and marital status of the mother. Even normal types of wage labor have been, and sometimes still are, thought to be unsuitable for various groups of people. (Women can't be doctors! etc.)

In a scarcity-driven society, your example makes sense:

Imagine you had a farmer who did all the work of plowing the field, sowing seeds, clearing weeds, fertilizing, harvesting, grinding to flour and baking the bread, then along comes this guy and says I'm hungry, feed me. Maybe you're a good farmer who works long and hard and actually has more bread than you need, but I'd still tell you to pick up a shovel and help out if you want any. Then it's all tractors and machinery, so the guy says he doesn't know how to use one; well, you still say then learn and help out. Then it's semi-autonomous agriculture drones, so the guy says he doesn't know how to maintain one; well, you still say then learn and help out. Or do any other work you have.

But the direction it goes in is one where less and less work is required. It takes a lot of people to manually run an entire farm. What if it just doesn't take as many to maintain the drones? Do we shrink the population until there's ~1 person per available job? Or do we do something else, like rotate the work? Imagine only having to work for, say, five years before retirement. Or maybe some people would volunteer to maintain the drones because they find it fulfilling, or maybe drone maintenance comes with extra prestige or privileges beyond a basic standard of living. Or we could wildly expand our definition of what counts as work, so that people writing fanfiction or playing in garage bands are seen as contributing something worthwhile to society. You get the idea.

Comment This should be a good thing (Score 4, Insightful) 754

Reducing the amount of work it takes to keep the human race alive, fed, and housed should be unambiguously good. The reason it's not is that we've structured our society around the idea that all adults must be employed in full-time jobs (or be married to someone who is) to qualify for a decent life. We have this idea, particularly in America, that (employment) work is a virtue in and of itself. Unemployed people are shamed and villainized.

If we all lived on isolated family farms, it would be obvious that reducing the total workload is better for everyone -- less work = more free time. But instead, we live in a complex, interconnected industrial society. It's going to take a lot of large cultural changes before we can handle the idea that some people might not work at all, or only work a few hours a week. For perspective, we still don't have a consensus on whether something as difficult and time-consuming as being a stay-at-home mom counts as a job.

Comment Re:Use Slashcode. FFS. (Score 1) 281

What kind of idiot could make a statement like that with a straight face?

"The origin of climate change is mistakenly up for grabs"???

I can't help but notice that you left off the first part of the quoted sentence. You know -- the part about the *other* major subject of right-wing science denial.

Sure, there are rude comments and insane comments, and that's what moderation is for. But what I'm betting is really the problem here are the intelligent and reasoned critics who raise points the editors can't address without losing their ivory tower air of authority (at best), or at worst, just looking ignorant and stupid.

Funny how the evolutionary biologists seem to have the exact same problem.

(Not everyone is practiced in publicly debating bogus talking points.)

Comment Re:I think they plan to compete on the premium end (Score 1) 348

I agree with your overall point, but...

"Does Nintendo really think they can compete with Atari, Magnavox, Intellivision, and Coleco with their upcoming 'NES'? Can they really elbow their way into this crowded market full of entrenched and experienced companies?"

It wasn't a crowded market. The North American video game industry had collapsed in 1983, two years before the NES arrived.

"Does Sony really think they can compete with Sega, Nintendo, NEC and Neo*Geo with their upcoming 'Playstation'? Can they really elbow their way into this crowded market full of entrenched and experienced companies?"

Sony entered the market at a time when Nintendo was both unpopular with developers (due to restrictive policies) and behind on hardware (due to sticking with cartridges). Sega was trying to come back from a series of hardware blunders when Sony undercut them by $100. The Neo Geo was never a serious competitor, and the TurboGrafx-16 only sold well in Japan. Sony is also a hardware company, and spent quite a lot of money ($1 billion?) to design custom hardware for the PlayStation.

"Does Microsoft really think they can compete with Sony, Nintendo, Sega, 3D0 and Atari with their upcoming 'Xbox'? Can they really elbow their way into this crowded market full of entrenched and experienced companies?"

3DO was from the previous decade. Not sure where you're getting Atari from. Microsoft basically threw monopoly money at the XBox, even buying Bungie so they could have a decent launch title. There were also lots of existing developers used to making games for a Microsoft platform (DirectX). Even then, the sales were pitiful next to the PS2, especially in Japan (which mattered more back then). The GameCube did about the same, and was the start of Nintendo's journey towards low-cost party game consoles. The XBox 360 was where Microsoft really got going.

Yeah, I think history says it can be done.

Under the right circumstances, with the right company, yes. And I think Valve has a serious shot at this. But it's not trivial. Half the companies you listed were utter failures in the market.

On the other hand, the market is different today. From the NES to the PS2, there was always a clear winner in sales and third-party support in each generation. But the latest generation was closer to a tie between the PS3 and X360 (please don't start talking about the Wii). We're now in a world where cross-platform games are the norm and two consoles can co-exist on equal footing. Hopefully there's enough room for Valve to push the industry in a different direction from where Sony and Microsoft want to take us.

Comment Re:This actually looks really unusable (Score 1) 317

It looks like the main buttons are on the top and back. I see the usual two sets of triggers, but there are also two big buttons going down the middle. I guess you'd press them by squeezing your middle and ring fingers, which I don't think has been done before. It's a neat idea; trying to put four fingers on the triggers always makes me feel like I'm going to drop the controller.

Comment Re:Today's Slashvertisement brought to you by... (Score 1) 317

Second... no, it isn't interesting new technology. It's technology that's been around for the past two decades at least, wrapped up in a slightly different package.

To be fair, that is what basically all new products are.

For gamers, a new type of video game controller is a big deal. Compare a DualShock 4 gamepad (2013) to an SNES gamepad (1990). They're still remarkably similar. The basic concept of two analog sticks, a D-pad, start/select, four face buttons, and some shoulder buttons has been the standard for well over a decade. The exceptions are some niche attempts at motion control that haven't worked so well for actual gaming. Using trackpads to replace the analog sticks on a gamepad is a new idea. It sounds pretty clever, assuming they can optimize the design.

I personally don't find vaporware advertisements interesting -- when they have an actual product, that I can hold, or buy, or at least get a fucking diagram to build a prototype of it, then it's interesting. Because in my world, interesting is defined as "shit I can use", not "shit someone in marketing dreamed up."

Part of the announcement is a request for beta testers. Beta hardware will be shipped in the next few months with plans for release early next year. That's not vaporware. As someone who's eligible for the beta, I appreciated the heads-up.

Comment Re:It was a casual game (Score 1) 374

Well, I didn't say it was *justified* sneering. But I wouldn't call it 100% artificial either. Big-name casual games are usually not the most sophisticated of their genre. (Fans of tower defense games complain bitterly about the simplicity of Plants vs. Zombies, for example.) There are legitimate criticisms of any work of art.

Myst players weren't all part of the gaming community. Many (most?) never played other games, except for stuff like Solitaire. And Myst sold 3-6 times as many copies as the most popular hardcore games of that year (e.g. Doom, Secret of Mana, Mega Man X). It might have been niche in an absolute sense, but it didn't seem that way at the time.

Comment Re:This isn't the history I remember. (Score 1) 374

I do not perceive that there are any more than there were in the late 70s/early 80s (remember Dallas, Dynasty, etc?).

Well, I was born in '82, so no. I turned 18 the year Survivor launched, so my view is probably colored somewhat. I fled to anime for a few years, and when I came back to American TV it looked very different.
