
Comment Re:From time to time? (Score 1) 163

We put a man on the moon, and I still get tangled in my phone cord! Gosh! (What does that have to do with the issue?)

People go to jail for traffic misdemeanors and worse, yet there's little to no punishment for losing a highly radioactive source. These things have to be reported and monitored, and the subsequent search for the object consumes piles of taxpayer money. If the government represents the people's collective wealth, and the people don't want their money wasted, then shouldn't the people say, hey, penalize these guys for wasting our money?

Comment Re:Monster cable (Score 1) 282

In other words, playing 480p Wii content over HDMI will still look nicer.

A digital signal on a digital panel is going to be measurably sharper. "Nicer" is subjective, though. Cheaper 6-bit-per-channel panels don't look nicer to me, which is why I still have a CRT: large digital panels with truer color reproduction remain rather expensive. A 24" with 99+% sRGB coverage is still $400 or more.
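To illustrate the bit-depth point, here's a tiny sketch (my own illustration, nothing panel-specific): truncating an 8-bit grey ramp to 6 bits per channel leaves only a quarter of the grey levels, which is what shows up as banding in gradients on cheap panels (many of which dither to hide it).

import numpy as np

# Illustrative only: an 8-bit grey ramp, then the same ramp truncated to
# 6 bits per channel, roughly what a cheap TN panel can natively show.
ramp_8bit = np.arange(256, dtype=np.uint8)
ramp_6bit = (ramp_8bit >> 2) << 2          # drop the two low bits

print(np.unique(ramp_8bit).size)  # 256 distinct grey levels
print(np.unique(ramp_6bit).size)  # 64 levels -> visible banding in gradients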

I don't want my Wii content upscaled so much as presented to my TV as larger, square pixels. I have no idea whether the Wii U will output 1080p all the time, scaling 480p Wii content to 960p with black bars, or switch display modes to 480p and leave the rest up to the TV.

That would be really cool. I'd be happy with that.
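For what it's worth, the "larger square pixels" idea is just integer nearest-neighbour scaling plus letterboxing. A minimal numpy sketch of the concept (nothing Wii-specific; the 640x480 frame and 1080p canvas are just placeholder numbers):

import numpy as np

def integer_upscale(frame, factor=2):
    # Nearest-neighbour pixel doubling: every source pixel becomes a
    # factor x factor block, so the image gets bigger without any blur.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def letterbox(frame, out_h=1080, out_w=1920):
    # Centre the scaled frame on a black canvas (bars fill the rest).
    h, w = frame.shape[:2]
    top, left = (out_h - h) // 2, (out_w - w) // 2
    canvas = np.zeros((out_h, out_w) + frame.shape[2:], dtype=frame.dtype)
    canvas[top:top + h, left:left + w] = frame
    return canvas

frame_480p = np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder frame
frame_1080 = letterbox(integer_upscale(frame_480p, 2)) # 960p image, black bars
print(frame_1080.shape)  # (1080, 1920, 3)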

HDMI 1.3 has more than the NTSC color range in its gamut. Not sure whether the Wii U will use deep color, but it certainly wouldn't be used for existing Wii games. There are also issues with saturation levels and overall brightness in the NTSC spec (enforced to prevent phosphor bleed) that affect component but don't affect HDMI.

HDMI can pass signals like that, but whether you actually get one depends on the whole chain: what the video adapter is given to output, whether the adapter can output it, whether intermediate devices like receivers will pass it through, and whether your display can show it.

Games are probably (reasonable assumption alert) going to be coded with one code path, rather than multiple graphics-engine code paths and texture sets to support larger color gamuts; it gets really expensive to have someone color-correct a full texture set. Indie games might make it happen first, and I'll be happy to see it.

Usually you wind up with Y'CbCr color (the digital equivalent of the YPbPr used by component video), and HDTV's color-space specs are in fact based on modern phosphor characteristics. Manufacturers are having real trouble producing affordable digital panels that even come close to the color reproduction of phosphor-based screens, so staying under that limit is still necessary to avoid color blowout on cheap displays. Personally I wish xvYCC were more prevalent, but Blu-ray doesn't even allow it. I know the PS3 can output it, and plenty of higher-end video cards can too, but I've never seen it demonstrated.
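For reference, the Y'CbCr encoding I mentioned is just a weighted luma sum plus two scaled color-difference channels. A rough sketch using the Rec. 709 coefficients and studio-range scaling (my own illustration, not any console's actual pipeline):

def rgb_to_ycbcr709(rgb):
    # rgb: floats in [0, 1]. Rec. 709 luma weights, then scaling into the
    # limited "video" range: Y' in [16, 235], Cb/Cr centred on 128.
    r, g, b = rgb
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return (16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr)

print(rgb_to_ycbcr709((1.0, 1.0, 1.0)))  # white -> (235.0, 128.0, 128.0)
print(rgb_to_ycbcr709((0.0, 0.0, 0.0)))  # black -> (16.0, 128.0, 128.0)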

Comment Re:Monster cable (Score 1) 282

For the Wii U, there is.

I was talking about the Wii, not the Wii U.

What you're saying, probably, is that you can't tell the difference between VGA (roughly equivalent to component, if not better) or DVI/HDMI on a digital flat panel. That's fine for you, but not for me. Eye fatigue from trying to refocus the slight blur of the VGA signal is gone with a digital connection for me. And again, black levels are an issue with analog sources.

Hang on, that's not what I said at all, and I resent the assumption. Why assume things like that? My eyes are plenty good enough to see the difference when the system is capable of reproducing it; my CRT just isn't, for reasons explained below. What I'm saying is that upscaling a non-native resolution to the panel's native resolution still isn't as sharp as it ought to be on a flat panel. I think we could agree on that.

Analog signals are never as clean as digital ones, since an analog channel can only approximate a square wave (a hard edge), and hardware capable of better approximations costs more than a consumer is going to pay. When an analog signal's loss is equal to or less than the loss inherent in an analog CRT, you'll never notice the signal quality because the CRT's loss dominates, which is probably the case in my system.

Hooking a digital panel to a digital signal is the ideal situation: discrete pixels that don't bleed into each other. (Inter-pixel light bleed is a different story, but it's mostly negligible here since CRTs have the same problem.) Hooking a digital panel with poor analog sampling, i.e. a consumer-grade TV or monitor, to an analog signal will immediately look blurry compared to a decent CRT. It will keep looking blurry until you spend thousands of dollars on scaling equipment, which nobody in their right mind is going to do unless they're doing professional video archiving, restoration, or feed-switching work. There's room to improve the video ADCs in consumer and prosumer equipment, but nobody will because of the poor cost/benefit ratio. So you're always going to see it.
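The square-wave point can be made concrete: a band-limited analog channel only carries so many harmonics of a hard edge, and the fewer it carries, the further the reconstructed edge sits from a true square wave. A quick numpy sketch (illustrative numbers only):

import numpy as np

t = np.linspace(0, 1, 1000)
target = np.sign(np.sin(2 * np.pi * t))   # an ideal hard-edged square wave

def square_approx(t, n_harmonics):
    # Fourier series of a square wave: odd harmonics only, 1/k amplitudes.
    k = np.arange(1, 2 * n_harmonics, 2)  # 1, 3, 5, ...
    return (4 / np.pi) * np.sum(np.sin(2 * np.pi * np.outer(k, t)) / k[:, None], axis=0)

for n in (3, 25):
    blur = np.mean(np.abs(square_approx(t, n) - target))
    print(f"{n} odd harmonics: mean deviation from a hard edge = {blur:.3f}")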

Black levels are defined by the specs, so it's up to you to calibrate the output device using the input source. Blu-ray, DVD, and broadcast DTV (see Rec. 709) have a digital range of 16-235 per channel to accommodate filter overshoots and the like, so you never get true black or true white in the first place. Computer video devices, digital and analog, have no such restriction. Calibrate your screen to a computer and you'll have black-level complaints when you hook up a Blu-ray player or cable box; calibrate it to the other system and you'll have the opposite complaint.
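As a concrete example of why the mismatch happens, here's the standard mapping a player or display uses to expand video levels to PC levels (a sketch of the arithmetic, not any particular device's firmware):

def limited_to_full(v):
    # Map video black (16) to PC black (0) and video white (235) to 255.
    # Out-of-range values ("blacker than black", overshoots) are clipped.
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(limited_to_full(16))   # 0   -> true black on a PC-levels display
print(limited_to_full(235))  # 255 -> full white
print(limited_to_full(20))   # 5   -> shadow detail that survives the mapping

Skip that mapping, or apply it twice somewhere in the chain, and you get exactly the crushed or washed-out blacks described above.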

Well, hopefully someone finds some of this information useful.

Comment Re:Monster cable (Score 1) 282

Is there even an official HDMI cable for the Wii? I haven't seen one. That said, I only observed dot crawl with composite and, to a lesser extent, S-Video; component cables seem to eliminate the issue entirely on my 1080i CRT TV, and it looks decent to me. Upscaling is still pretty bad on flat panels, but panel makers don't have much reason to make it a strong selling point, do they? Maybe the problem needs to be addressed from both ends instead of exclusively blaming one side or the other.

Comment Re:DRM worked out then.. (Score 5, Informative) 464

I just wanted to play Raving Rabbids. Yeah, imagine my embarrassment when I had to tell my girlfriend's family, who gave not-financially-well-off me the gift card for Christmas, that I'd bought a game I couldn't play and basically their money was wasted. Telling her was bad enough. I couldn't even return the game since it was already open. Ubisoft wouldn't help, the store wouldn't help. So they don't get any more of my money and I'm happy to tell the story.

Comment Re:DRM worked out then.. (Score 5, Informative) 464

Yup, their DRM makes their games unplayable on my computer: a standard Windows PC whose only optical drive is a DVD burner. You know, one of the standard choices available on most PCs. Their customer support people got angry that I kept pressing the issue and told me to read the box more carefully next time I buy a game, even though the box said nothing about the game not working if a burner was present. Guess what, I will do that: I will skip anything that says Ubisoft on the box.

Comment Re:Yes (Score 3, Insightful) 1086

Also, you'd never be able to verify that your algorithm is working by manually processing sample inputs, and that's a tremendously useful ability to have. See the following thought process (a small worked example follows the list):

>> "See if I give it A, it should give B, but instead it gives C"

>> "Let me try it by hand"

>> "My algorithm is wrong" or "My implementation of the algorithm is wrong" or "I'm using the wrong algorithm to solve this problem" (knowing the difference saves you notable amounts of time)

>> "I now have an understanding of the actual problem and can solve it"

Comment Re:Yes I do, thanks for asking (Score 1) 660

Having larger than average hands myself, I definitely find that the larger phones fit better in my hands. I'm with you on that one.

I haven't really seen anything powerful in a small-person-friendly form factor. Hardware power seems to fall off much faster than screen size does, and I think that's the general gripe behind the post. Maybe people want to run modern apps and games on a smaller screen, but there's a limit to what they can use because they get underpowered devices? Maybe they want to run multiple apps at once, or not be running out of memory all the time. I dunno; it seems fair to want a more powerful small-screen phone.

By the way... Does anybody remember how big mobile phones were in the 1980s and early 1990s? :)
