
Comment Re:nah it's a dead cat bounce (Score 1) 269

>They are compared to smartphones + 120GB SDCards.

Yeah, we get it. The iPod was a hard drive device. Flash is more expensive. Having said that, it appears that at today's prices a smartphone and a 128GB card will set you back way less than the iPod did.

>A typical smartphone with it's cellular, 3G and WiFi turned on will drain the battery in about a day and a bit assuming that the device is completely in standby which it won't be while playing music.

[citation needed]. I haven't noticed any gains by turning off the antenna compared to just saving power by not using it so heavily.

>Of course it can. Those people aren't the ones who still own 160GB iPods on which they carry their entire music collection.

Just because they exist as a minority doesn't mean it's a good idea for Apple to invest a product line in them.

>What an carry around another device? This is moving the goalposts a bit isn't it?

Not at all. It was *you* who was extolling the virtues of having a separate device in the form of an iPod. The way I see it, cellphone+iPod is no worse than smartphone+powerbank.

>I don't think I've ever seen an iPhone with 40 hours of standby time let alone time doing something useful like playing music.

I gave a source for that info, so if you have any source to the contrary you can present it. As I mentioned earlier I felt the battery life of the iPod to be inflated too, as it referred only to sequential playback without using the screen or menus whatsoever, and that's not how most people use their iPods.

>And if you start disabling things like Facebook, email, and all the other applications that do useful stuff on the phone, why not just buy an iPod, it'll be cheaper.

As we have just established, this is certainly not the case.

>I'm not saying that playing music takes a lot of power, it doesn't, but playing music prevents the CPU going to sleep, and THAT uses a lot of power.

This explanation certainly sounds sensible, but you haven't produced any evidence that my claim about smartphones playing audio for as long as iPods is wrong.

>But hey you said it yourself your radio uses the most power, but at the top you told me I can't turn it off and said something about a strawman.

The radio uses a lot of power if you browse the web and watch streaming video all the time. If you use it like an iPod and only play music offline, then GSM standby and the occasional SMS won't exactly be major energy hogs. No "gimping" involved, just make an effort to cut back on power-heavy usage.

>This is all semantics really because in this connected world I doubt people find themselves without a phone charger for more than a few hours at a time anyway. Even when going camping we have solar panels to charge various devices.

Finding the charger and getting the phone out can be a pain, though. But like I say, the powerbank has been a real saviour for me, because then I can go off somewhere and have enough charge for many days of GPS, music, videos and whatever else.

>But the point is if you want to play music then the iPod is a hell of a device to beat, and certainly can't be fully displaced by a smartphone for many scenarios.

Personally, I found the iPod's lack of support for various audio formats, lack of Bluetooth audio, limited equalizer and iTunes' terrible tagging system to be quite the dealbreaker for music.

Comment Re:Too small to be of any benefit. (Score 1) 179

Any particular examples? Personally I find all fonts look perfectly fine and as intended on my screen, and anti-aliasing enhances the appearance. But then again, I do usually scale text up and keep a distance of about 3 feet. Most Windows users keep the teeny-weeny default fonts and just sit closer to the screen.

Comment Re:Too small to be of any benefit. (Score 1) 179

By the way, the way I tested this was to open a paint program and create a 3x3 black square on a white background with one pixel missing, like this:

xxx
xx
xxx

At about 3 feet from my 24" 1080p screen I wasn't able to tell which pixel was missing after randomly rotating it.
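
If you'd rather script the test image than fiddle with a paint program, here's a minimal sketch using Pillow (the library choice and the canvas size are just my assumptions; any paint program works just as well):

# Draws a 3x3 black square at native 1:1 pixel scale with one border pixel
# missing, chosen at random so you can't cheat. Requires Pillow (pip install pillow).
import random
from PIL import Image

CANVAS = 201                                  # white canvas, arbitrary size
img = Image.new("L", (CANVAS, CANVAS), 255)   # greyscale, white background

cx, cy = CANVAS // 2, CANVAS // 2
border = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
missing = random.choice(border)               # which border pixel to leave out

for dx in (-1, 0, 1):
    for dy in (-1, 0, 1):
        if (dx, dy) != missing:
            img.putpixel((cx + dx, cy + dy), 0)   # black pixel of the square

img.save("pixel_test.png")   # view at 100% zoom, step back ~3 feet, guess the gap
print("missing pixel offset was:", missing)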

Maybe you want to try this yourself and report back. I'd be interested in your results.

Comment Re:Too small to be of any benefit. (Score 1) 179

>Take fonts, you can 'smooth' them, to me that makes them look blurry in an ugly way.

That's true for low resolutions, but my point was that at high resolutions there's a domain where you can still make out jaggies on a high-contrast edge, but can't tell the difference between higher resolutions if you use a filter.

>The font I am typing with right now is constructed of lines only 1 pixel wide, leaving negligible room for font style.

Pixel-level design has to account for the natural and perceptual blurring anyway. Plus, modern fonts are vector-based and designed to scale arbitrarily. If anything, the smoothing filters are even *more* important when you get down to very small font sizes.

>Text would clearly benefit from 4k as opposed to 1080p.

It would benefit in that you could get up really close to the screen and admire the smoother edges, yes. But for the scenarios we were talking about, sitting at fixed distances and using the smoothing filters every modern OS relies on, no, it wouldn't.
In practical terms it's like this: A pixel-wide black line on a 4K monitor is clearly more detailed than one on a 2K monitor, but once you stand back far enough you wouldn't be able to tell the difference between the black line on the 4K monitor and a gray line on a 2K monitor.

The chart might be wrong for some people, and I can't argue with that. But we need to be specific about what it's referring to. They're not saying "we guarantee you won't ever see aliasing on this setup" but rather "on this setup you can't tell the difference between this and a higher resolution, provided that the pixels are smoothed naturally."
For the record, I can spot aliasing about twice as far away as the chart references, but can't make out any detail whatsoever on pixel-level objects right about where the chart says I would get the full benefit.
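
For what it's worth, the arithmetic behind the 3-foot/24-inch figure is easy to check. A rough sketch; the 16:9 aspect ratio and the ~1 arcminute acuity figure are my assumptions, not something taken from the chart:

# Back-of-envelope check of the "3 feet from a 24-inch 1080p panel" claim.
import math

diag_in     = 24.0
h_pixels    = 1920
distance_in = 36.0                                 # roughly 3 feet

width_in   = diag_in * 16 / math.hypot(16, 9)      # panel width from the diagonal
pixel_in   = width_in / h_pixels                   # pixel pitch in inches
pixel_rad  = math.atan2(pixel_in, distance_in)     # angle one pixel subtends
pixel_arcmin = math.degrees(pixel_rad) * 60

print(f"pixel pitch: {pixel_in * 25.4:.3f} mm")
print(f"angular size at {distance_in:.0f} in: {pixel_arcmin:.2f} arcmin")
# ~1.04 arcmin, right at the commonly quoted ~1 arcminute limit of normal vision,
# which is consistent with not being able to pick out the missing pixel above.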

Comment Re:nah it's a dead cat bounce (Score 1) 269

>So what you are saying is that we should buy expensive smartphones

iPods weren't cheap.

>disable all cellular connections, remove updates from a device connected to the internet, effectively dim the screen as dark as possible, and then stick in the largest SD card we can find and then we can match the performance of a cheap iPod for a cool $700? Sign me up!

Nice straw man. You don't need to do any of that. The point is that if you don't install tons of background apps and don't watch YouTube all day, you should be able to listen to music for a very long time. Many smartphone users actively avoid updates because of the problems they sometimes cause.

>In all seriousness though if there is a suggestion for matching a device's functionality which involves so heavily gimping a smartphone then it really is a great scenario where having a separate device would be ideal

The point I was making is that if 90% of people's needs can be met with one device, the remaining niche probably won't justify holding on to an entirely separate product line. And it's not as if there's anything the iPod does that modern gadgets absolutely can't.

>not to mention that I don't like a scenario which potentially leaves me stuck because either my phone doesn't work because I've been using it for non-stop music listening.

I would really recommend an external power bank. I've completely stopped worrying about battery issues since I got one. They can hold an incredible amount of charge, so you can run even the most demanding tasks all day. You can charge your "phone functionality" as well as your "music functionality", not to mention practically everything else (like my camera battery), as they're universal. They weigh hardly any more than an iPod and are much, much cheaper.

>I'll also happily challenge the fact that a smartphone can match the iPod's battery performance.

Fine. I haven't got an iPhone, so I can't test it myself. I was able to find this, which claims 40 hours of music playback. The last generation of iPod apparently could play for 36 hours. But I think it should be pointed out that the most popular models people remember had far less than that.

>Most of the battery saving features involve switching the CPU into an ultra low power state, something that can't be done while playing music.

My battery monitor app suggests that by far the most power is consumed by the screen and radio adapters. This seemed to apply to my old iPod nano too, which was rated as lasting 12 hours, and it would, so long as you didn't touch anything. But change the song now and then or browse through your collection for a few minutes and you'll be lucky to get half that.

Comment Re:What's state of the art in UI scaling? (Score 1) 179

I find it leaves much to be desired, though to be fair that is also due to the state of Windows applications. Metro apps probably work well, but Metro isn't how most people use their Windows PCs (File Explorer still isn't Metro-compatible). So what most people are left with is badly-scaling apps in the desktop environment. The two fixes are to scale fonts, which makes the menus look shit, or to scale the apps, which makes them look blurry. There is no way to scale them selectively, either, since the settings are global. The only workaround is to enable scaling and then selectively disable it for the apps that scale well, by manually changing the properties on the .exe file, which you have to dig out of the depths of the file system (a rough sketch of that per-exe tweak is below).
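
For reference, here is roughly what that per-exe tweak amounts to. This assumes the compatibility checkbox just writes a HIGHDPIAWARE flag under AppCompatFlags\Layers, which is my understanding of the mechanism rather than anything documented here; double-check before trusting it, or just use the Properties > Compatibility dialog on the .exe.

# Sketch of scripting the "disable display scaling on high DPI" override for one app.
# Windows-only; uses the standard-library winreg module. The path is hypothetical.
import winreg

exe_path = r"C:\Program Files\SomeApp\SomeApp.exe"   # hypothetical example path

key_path = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    # Value name is the full path to the executable; data is the compat flag string.
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "HIGHDPIAWARE")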

Comment Re:Too small to be of any benefit. (Score 1) 179

High-contrast objects are more discernible. Jaggies are an extreme case, as they often have very high contrast at the smallest of subdivisions, and they draw a lot of attention because they're right on the border of two regions.

This issue only really affects certain types of computer-generated images. It could be blurred out with a filter and *still* stay under the resolution people can normally detect. "Naturally" recorded images will have a certain blur to begin with.
I suppose it's an issue of diminishing returns, where you'd have to invest an awful lot more resources to "fix" a relatively minor issue for CGI sources, one that could be dealt with more cost-effectively by simply using filters.

Comment Re:"Content" is an obnoxious red herring.. (Score 1) 179

Modern TVs enhance the framerate with interpolation, but that's a recent thing and not present on the low end. The rest are basic colour calibration settings, which many computer monitors support too. TVs often have more scaling options, but they're not as good as doing it on the GPU.
In fact, one of the biggest complaints about TVs is that their default settings change the image to look good in the store (enhanced sharpness and contrast filters, excessive saturation), so videophiles usually have to turn off a lot of the processing to get their video to look how it should.

Comment Re:nah it's a dead cat bounce (Score 1) 269

Smartphone updates are a pain, but you can turn off updates if you need rock-solid reliability. If you use simple battery-saving methods, an iPhone or Android will play music for longer than any iPod ever did. And as for AV output, modern devices are more capable than ever, but composite video has gone out of fashion (though adapters were available for a while).

I guess my point is that apart from the physical buttons, most of the functionality is easy to replicate, and the remaining niches like composite video and storage capacity can be addressed with more capable devices.
There sure are practical things about old technology, like how fast a DOS machine would boot, the contrast of a CRT, replacing the batteries on a GameBoy, or simply pressing "record" on a VCR. But getting too worked up about these things is nostalgia, if you ask me.

Comment Re:Why? (Score 1) 120

The predictive algorithms for the scene would be deterministic, so you may as well perform them on the local machine. I still don't see how you propose this would work, so maybe I should ask this question specifically: What would be the criteria upon which the local client would select one of the pre-constructed scenes?

Comment Re:Why? (Score 1) 120

But this is specifically about streamed games. Locally rendered games (as with WebGL) can render the scene immediately in response to the player's input, so they don't have this problem. The lag is still there, but you only have to deal with the physics side of it, rather than dealing with the physics *and* the audio and video.

The scenario you describe doesn't make any sense because the idea is that you choose the frame (or scene) based on the inputs on the local machine. So if you're rendering it anyway then you may as well just have the local client respond itself to the inputs and do away with the overhead.
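
To make that overhead concrete, here's a toy comparison of input-to-photon delay for the two approaches; the frame time and round-trip figures are made-up illustrative numbers, not measurements:

# Toy illustration: with a streamed game, every frame that reflects your input
# arrives roughly one network round trip late; a locally rendered game responds
# on the very next frame.
FRAME_MS = 16.7          # ~60 fps frame time
RTT_MS   = 60.0          # assumed round trip to the streaming server

def streamed_input_to_photon(rtt_ms: float, frame_ms: float) -> float:
    # input -> server (rtt/2) -> render one frame -> encode/stream back (rtt/2)
    return rtt_ms + frame_ms

def local_input_to_photon(frame_ms: float) -> float:
    # input is consumed at the start of the next locally rendered frame
    return frame_ms

print(f"streamed: ~{streamed_input_to_photon(RTT_MS, FRAME_MS):.0f} ms to see your input")
print(f"local   : ~{local_input_to_photon(FRAME_MS):.0f} ms to see your input")
# If the client already has to read the input to pick a "pre-constructed" frame,
# it is doing the hard part locally anyway, which is the point made above.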

Comment Re:Microsoft did something like this once before (Score 1) 120

>TV is. Music generally is. Why are games so special?

TV and music are non-responsive, non-interactive recordings.

>It's not like it is a physics issue, just cheap-ass network operators not laying lines from this CENTURY, hell, technically even last, some seriously still use aluminium.

From a physics perspective, the age and material of the lines are irrelevant; signals travel at nearly the same speed either way. The latency comes from the physical limitations of the hardware and the routing infrastructure. It's mitigated by placing the server closer to the client, which obviously costs the game provider much more.
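
Some rough numbers, purely for illustration (the ~2/3 c propagation speed is a typical textbook figure for real lines, not a measurement of anyone's network):

# Distance alone sets a latency floor, before any routing or processing delay.
SPEED_OF_LIGHT_KM_S = 300_000
PROPAGATION_FACTOR  = 0.66            # signal speed as a fraction of c in a real line

def min_round_trip_ms(distance_km: float) -> float:
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * PROPAGATION_FACTOR)
    return 2 * one_way_s * 1000

for km in (100, 1000, 5000):
    print(f"{km:>5} km to the server: >= {min_round_trip_ms(km):.1f} ms round trip")
# ~1 ms at 100 km, ~10 ms at 1000 km, ~50 ms at 5000 km, which is why putting
# servers closer to the player is the only real fix.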

>Or just ads at loading screens, again, as long as it was instant (which it would be since it is based their servers)?

Actually, loading screens are one thing that cloud computing might make it economical to seriously reduce.
