Comment Re:Just refuse the new gear (Score 1) 224
DMZ is not as clean as true bridge mode. And in many cases, there are protocols that just don't work. And then the modem/router is processing the packets and you're more bound by its (crappy) CPU.
You beat me to it. Thankfully, I don't have one. But if you do and you're stuck on AT&T, there is finally a good workaround: it gets you root via telnet, and from there you can enable true bridge mode.
http://www.ron-berman.com/2011...
Artificially inflating the price of Internet and then giving a bundling discount is the same as forcing you to pay for the cable package. Don't look at the $0 in the column on the right. It's a dummy number.
It was broadband that killed that. The software (those distros, anyway) was already free unless you wanted support. As soon as it was easier and cheaper to download at home, that's what people did.
So they've succeeded - and matched Microsoft's incompetence. Isn't that what it takes to go mainstream?
Distinction - 2014 is the year of the Linux Post-Desktop Era...
But even if the maintenance eats up the savings, that's fine as long as it doesn't take longer overall. You spent the same amount of time working, but didn't have to do anything dull or repetitive.
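Back-of-the-envelope, in Python (every number below is made up, just to show the shape of the trade-off):

    # All figures invented for illustration; the point is the trade, not the totals.
    manual_minutes_per_run = 10      # the dull, repetitive task
    runs_per_month = 20
    months = 12

    time_to_automate = 8 * 60        # minutes spent writing the script
    maintenance_per_month = 40       # minutes of ongoing upkeep

    manual_total = manual_minutes_per_run * runs_per_month * months
    automated_total = time_to_automate + maintenance_per_month * months

    print(manual_total, automated_total)   # 2400 vs 960 minutes

Even if the maintenance pushed the second number all the way up to the first, you've swapped repetitive minutes for engineering minutes, which is the point.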
I didn't forget. He tripled productivity! I was going to post a reference myself, but I decided to crowdsource it. Does that count as automation?
Welcome to Slashdot.
But running software on a mainframe requires you to have "power...in an organization." Anyone can automate tasks from their own workstation without any authority needed.
But who automates the automaters?
Way to miss the point. They are essentially two frames being displayed at once. The signal resolution was not what was being discussed.
The above is for anything that wasn't shot 24p, of course.
Yes - the TVs all have de-interlacers. A still frame on 1080i is one 1920x540 field upscaled to 1920x1080 plus another 1920x540 field upscaled to 1920x1080. It's not necessarily as sharp, depending on the content, because you're still interpolating to get beyond 540p.
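Roughly, in Python/numpy terms (a naive "bob" de-interlace sketch of my own; the function name and sizes are just for illustration, not what any actual TV ships):

    import numpy as np

    def bob_deinterlace(frame):
        # Split one interlaced frame into its two 540-line fields, then
        # stretch each back to full height by linear interpolation.
        top = frame[0::2, :].astype(float)     # even lines, 540 x 1920
        bottom = frame[1::2, :].astype(float)  # odd lines,  540 x 1920

        def upscale(field, out_height):
            src = np.arange(field.shape[0])
            dst = np.linspace(0, field.shape[0] - 1, out_height)
            out = np.empty((out_height, field.shape[1]))
            for col in range(field.shape[1]):
                out[:, col] = np.interp(dst, src, field[:, col])
            return out

        h = frame.shape[0]
        return upscale(top, h), upscale(bottom, h)

    frame = np.random.randint(0, 256, (1080, 1920)).astype(np.uint8)
    first, second = bob_deinterlace(frame)
    print(first.shape, second.shape)   # (1080, 1920) (1080, 1920)

Each output picture only ever contains 540 lines of real detail, which is why it can't be as sharp as native 1080p on detailed content.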
I realize you meant slow-moving, and not a still frame, but the point sort of still stands. I don't know if temporal resolution gains really even out with the eye because of how the LCD changes from one frame to the other. Some newer TVs are blanking the screen with black between frames (at 600Hz) trying to trick the eye into seeing it as smooth as phosphor.
If the fields are upscaled (height doubled)
If you think that's all a good de-interlacer does, you're really misinformed. The in-between lines are interpolated pretty well, and they do not leave any visible oscillation. If you're talking about sending interlaced video to a TV from a computer using a progressive display resolution, you're doing it wrong. Either you send the display an interlaced video mode or you de-interlace it before sending it out.
Well, yes, that's precisely what CRTs did. They skipped the width of one scanline (a little more, or less, if not properly calibrated) between each scanline while rendering one field, then rendered the following field in between. This is still the only *proper* way to de-interlace interlaced video; anything else is just compensating for not knowing whether the content it's displaying is (properly) interlaced or not.
Again, it's not the only proper way. Modern de-interlacing algorithms aren't just in media player software - they're in any LCD TV. And they certainly go beyond simply weaving fields. De-interlaced video looks better than an interlaced display even on high-resolution phosphor. The television doesn't compensate for not knowing - the television is told by the signal it's getting whether the content is interlaced or not.
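As a toy sketch of the idea (my own Python/numpy illustration; the function, the threshold, and the "motion" test are invented and far cruder than what real de-interlacers do): weave where the two fields agree, interpolate where they don't.

    import numpy as np

    def motion_adaptive_deinterlace(curr_field, other_field, threshold=12.0):
        # curr_field supplies the even output lines; other_field is the
        # temporally adjacent field that would fill the odd lines if we
        # simply wove the two together.
        curr = curr_field.astype(float)
        other = other_field.astype(float)
        h, w = curr.shape
        out = np.zeros((2 * h, w))
        out[0::2] = curr                      # real lines pass through

        # Spatial candidate: average of the real lines above and below.
        below = np.vstack([curr[1:], curr[-1:]])
        spatial = (curr + below) / 2.0

        # Crude "motion" test: where the woven line disagrees with its
        # spatial neighbourhood, prefer interpolation over weaving.
        motion = np.abs(other - spatial) > threshold
        out[1::2] = np.where(motion, spatial, other)
        return out

    top = np.random.randint(0, 256, (540, 1920)).astype(float)
    bottom = np.random.randint(0, 256, (540, 1920)).astype(float)
    print(motion_adaptive_deinterlace(top, bottom).shape)  # (1080, 1920)

Static areas keep full vertical detail (weave), moving areas avoid combing (interpolate) - which is roughly why "just weave the fields" isn't the whole story.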