Comment What's up: Sciuridae! (Score 4, Insightful) 222

They aren't doing this to improve the user experience with the software. They're doing it to address the perception that "new and shiny" is what people want -- not functionality per se. They're aiming at the user experience of getting something new.

You know that marketing slogan, "sell by showing what problem you solve"? The "problem" that marketers have identified is the public's indifference to anything that isn't new and shiny -- and lately, thin.

In my view, incompatibility is a sign of poor vision, poor support, and a lack of respect for those people who have come to you for what you offer. Speaking as a developer, if I come up with new functionality that is incompatible with the old, I add the new functionality without breaking the old. There are almost always many ways that can be done. I never did find a worthy excuse not to do it, either.
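
For what it's worth, here's a minimal sketch of the kind of thing I mean, in Python (the function and its behaviour are made up purely for illustration): the new functionality rides in behind a defaulted parameter, so every existing caller keeps working exactly as before.

```python
# Hypothetical example of adding functionality without breaking old callers.

def area(width, height=None):
    """Originally area(side) computed the area of a square.

    The optional height parameter adds rectangle support. When it is
    omitted, the function behaves exactly as the old version did, so
    existing call sites keep working unchanged.
    """
    if height is None:
        height = width  # old single-argument behaviour: a square
    return width * height

assert area(4) == 16     # old callers: untouched
assert area(4, 3) == 12  # new callers: opt in to the new behaviour
```

The same principle scales up: new options defaulting to the old behaviour, new v2 entry points delegating to the old ones, and so on.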

It isn't Google, or Apple, or whatever vendor that needs to learn a lesson. It's the public. I don't think it can be taught to them, either.

Squirrel!

Comment Re:As long as you don't mind spoilers for 11+ mont (Score 1) 148

I know, but charging me X for the box set just after it finished wouldn't cost them anything compared to charging me X nearly a year later. In fact, it would benefit them a little in terms of cash flow and probably very slightly due to inflation. And obviously it would benefit them compared to me being fed up with the spoilers and consequently not bothering to buy the next season on disc at all. I enjoy the show, but I enjoy plenty of other shows too, and I could just as easily spend similar money on 20+ episodes of one of them instead of 10 episodes of GoT next time I'm on Amazon.

Comment Re:As long as you don't mind spoilers for 11+ mont (Score 1) 148

So what you're saying is that there *are* legal ways for you to get the show earlier and avoid being spoiled?

Reportedly, but as far as I know, I can't use any of them without spending many times the cost of the box set on one kind of equipment or another, and then another significant multiple of the box set's cost on the subscription/streaming/whatever for the show itself. So as long as I don't mind a 1000-2000% mark-up, sure, I can probably avoid being spoiled (unless you count the other inferior aspects I mentioned as spoiling the show in another sense, of course).

Comment Re:As long as you don't mind spoilers for 11+ mont (Score 1) 148

They just launched HBO Now, so your complaint is moot.

I know, but as I'm in the UK, my complaint remains perfectly valid.

Here your legal options are basically limited to either getting Sky or relying on one of the very limited number of on-line options. All of these require dedicated equipment and/or work out absurdly expensive if GoT is the only exclusive show on the service that you're interested in watching. As I understand it, you're also still likely to get interrupted by ad breaks and logos/banners spammed all over the screen -- an insultingly inferior experience to just playing a disc and enjoying the show, and you're paying a premium for the "privilege".

Personally, all I'd need to avoid the disappointment is a simple and reasonably priced pay-per-view option to watch in sync with everyone else. With no real effort at all they could at least release the box set of discs as soon as the season has finished like every other show ever. In practice that would probably still avoid the worst of the spoilers, because usually people are pretty good about not assuming everyone saw this show live. The biggest spoilers I had for season 4, which I just finished watching, were all the trailers and promos for season 5, which obviously only start happening nearly a year after season 4 finished its first run.

Comment Re:Not my type of show either.... (Score 2) 148

There's an old saying: if it's fantasy, the women are dressed in fur bikinis; if it's science fiction, they're wearing metallic bikinis.

Funnily enough, I think this is one of the things that gives GoT its edge over a lot of on-screen sword and sorcery fantasy. You get women wearing realistic clothes, like expensive formal outfits at court or actually useful armour for combat. You get women wearing effectively no clothes at all. However, you rarely get much in between, and in particular you don't get women going into situations with random skimpy clothing for no apparent reason beyond the ratings. Also, while there has been (with some justification) criticism of the gratuitous nudity on the show, the same basic all-or-nothing-but-plausible divide has been true of the male characters as well.

Comment Re:Good guy HBO (Score 1) 148

Because once watched, I probably won't ever watch it again, at least not for another decade.

I can see this for some shows, but GoT is one I almost always watch at least twice. The cast is so huge that if I don't review key parts of the last season before the next one starts, I forget minor details, like who got married and brought 17 new characters from their family into House Evilempire, or who arrived at/left locations X and Y, or who died in a spectacular betrayal by their formerly loyal henchman/sibling/dog.

Comment Re:Keeping spoilers close to the chest??? (Score 2) 148

anyone can know what's coming by RTFB.

Until next year. Given that GRRM has shown no interest in accelerating his writing and it must be 5+ years since the last book came out, the TV show will likely overtake the paper version within the next season or so. Reportedly, the showrunners' plan is to get an advance outline of the rest of the story and work from that instead, which makes sense given that recent seasons have only followed the books' general storyline rather than adapting them 1:1 anyway.

Comment As long as you don't mind spoilers for 11+ months (Score 4, Informative) 148

I still think that HBO has met me half way in providing their content in a reasonable, fair manner.

I've bought legal copies of the previous seasons on Blu-Ray, lacking better options for seeing them. HBO's insistence on not releasing each season on disc until just before the next one (with the inevitable resulting spoilers in between) really annoys me.

When I've paid full price -- and it's an expensive price for a show with only 10 episodes per season -- for something that from my point of view was only just released, I don't appreciate seeing trailers and promos for the new season that reveal whether the person in supposedly mortal jeopardy at the end of the episode I just watched is going to make it/not make it/turn into an angel and fly away. For more than a month this has been happening even between old shows I'm rewatching on second-rate Freeview channels (advertising the new GoT season coming up on an expensive premium channel that isn't conveniently available where I am). They even had two principal characters on the front covers of TV magazines at the store last week.

I'm generally anti-piracy, but this is a show that depends on the big plot twists and no-one-is-safe surprises, and the spoilers are far more likely than anything else to make me give up and just grab it on-line as so many others do. Or give up watching at all, because why bother when the story has already been ruined anyway?

Comment Perhaps I was too hasty there (Score 1) 263

Yes, I'm a developer as well. Let me re-phrase that, as I was going off an assumption that for all I know is no longer true, now that I look directly at it:

I have no use for graphics solutions that consume memory bandwidth that would otherwise be available to the CPU core(s).

Having said that: as far as I was aware, memory bandwidth remains nowhere near the point of "always there when the CPU needs it," and integrated solutions always share memory with the CPU, particularly when data is being passed between CPU and GPU. So it strikes me that integrated probably -- not certainly -- remains a reliable proxy for "makes things slower."

It's also a given that the more monitors the thing is driving, the more memory bandwidth it will need. If that memory is on the same bus as the rest of the memory in the machine, then adding monitors again reduces the memory bandwidth available to the CPU -- and remember that the monitor has first priority, since system designs can't have the monitor going blank because the CPU wants memory. Doing both -- running graphics-intensive tasks on multiple monitors -- is quite demanding. Hence my preference for non-integrated graphics. When the graphics subsystem has its own memory, CPU performance has, at least in my experience, been considerably higher in general.
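
As a rough back-of-envelope -- my figures here are assumptions for illustration, not measurements of any particular chipset -- the scanout cost alone is easy to estimate:

```python
# Back-of-envelope scanout bandwidth estimate. All figures are assumptions
# for illustration; real display engines add overheads (blanking intervals,
# memory controller inefficiency) that make the true cost somewhat higher.

def scanout_gb_per_s(width, height, bytes_per_pixel=4, refresh_hz=60):
    """Bandwidth the display engine must read just to refresh one monitor."""
    return width * height * bytes_per_pixel * refresh_hz / 1e9

monitors = [(2560, 1440)] * 6       # say, six QHD monitors on one desktop
total = sum(scanout_gb_per_s(w, h) for w, h in monitors)
bus = 25.6                          # GB/s, nominal dual-channel DDR3-1600

print(f"Scanout alone: {total:.1f} GB/s "
      f"({100 * total / bus:.0f}% of a {bus} GB/s shared bus)")
```

On those assumptions, six QHD panels eat roughly a fifth of a dual-channel DDR3 bus before the GPU or CPU does any actual work -- and that read stream gets priority, for exactly the reason above.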

I have six monitors on one desktop setup, and two on the other. My lady has two as well. There are times when at least two of my monitors are continuously busy for long periods (hours) while there is also a heavy CPU load (at least one core constantly at 100%, with others variously hitting hard as well).

Now that solid state drives are around, my machine spends a lot more time computing and a lot less waiting on disk I/O, too.

Anyone who definitively knows modern integrated chipset performance, by all means, stick an oar in.

Comment Don't care (Score 2, Interesting) 263

If the new model has a larger screen, 5K would definitely be insufficient.

I'm a photographer and a constant user/developer of image manipulation software. I edit every shot. I don't need 5K in a monitor; if I need a full-image overview, I can have that in zero perceptible time. If I need to look at pixels, same thing. Or anywhere in between. I do *not* need to be squinting at a monitor in order to resolve detail. I value my vision too highly. And at these resolutions, if you don't squint, you can't see it. And I have extremely high visual acuity.

Higher (and higher) resolution makes sense in data acquisition. Once you have the data, you can do damned near anything with it. Even if you exceed the MTF of the lens, you get the advantage that while the edges are smoother, they now start in a more accurate place, geometrically speaking. Think of it like old TV luma: the bandwidth is limited, so the rate of change has a proportionally limited slew rate, but the phosphor on an old B&W monitor is continuous, and you can start a waveform anywhere (horizontally), to any accuracy within the timing of the display, which can be pretty darned high. So things tend to look very, very good, as opposed to what you might expect from naively considering nothing but the bandwidth. It's not like a modern color display, where the phosphor/pixel groups sub-sample the signal no matter how you feed it in. But that advantage goes away when the subtleties exceed your eye's ability to perceive them. Or when you have to strain/hurt yourself to do it.
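
Here's a quick toy demo of that luma point, in Python/numpy (my own illustration of the idea, not a rigorous model of any display): bandwidth fixes how steep an edge can be, but the edge's mid-point can still land at any fractional position.

```python
# Toy demo of sub-sample edge placement in a band-limited signal.
# My own illustration of the luma analogy, not a rigorous display model.
import numpy as np

def bandlimited_edge(n_samples, edge_pos, cutoff=0.1):
    """A step edge at fractional position edge_pos (in samples), smoothed
    by a finite-bandwidth channel. The cutoff fixes the slew rate, but the
    50% crossing lands wherever edge_pos says, to any precision."""
    x = np.arange(n_samples)
    # A tanh ramp: bandwidth limits the slope, not the position.
    return 0.5 * (1 + np.tanh(cutoff * (x - edge_pos) * np.pi))

for pos in (31.0, 31.25, 31.5):
    sig = bandlimited_edge(64, pos)
    # Locate the 50% crossing by linear interpolation between samples.
    i = np.argmax(sig >= 0.5)
    frac = i - 1 + (0.5 - sig[i - 1]) / (sig[i] - sig[i - 1])
    print(f"requested edge at {pos:5.2f} -> recovered at {frac:5.2f}")
```

The slope of every ramp is identical -- that's the bandwidth limit -- but the 50% crossing tracks the requested sub-sample position, which is the "edges start in a more accurate place" effect.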

So anyway... any single one or combination of these three things would motivate me to buy more new Apple hardware. Nothing else:

o A Mac Pro that is self-contained -- installable, replaceable drives, lots of memory, replaceable display cards. The "trashcan" Mac Pro is an obscenity. All it did was send me to eBay to buy used prior-model Mac Pros. The trashcan isn't so much a wrong turn as it is a faceplant.

o A Mac mid-tower that can have hard drives installed and replaced, and at least 16 GB of RAM; 32 GB would be better. Doesn't have to be that fast. Real gfx. I know, mythical, not probable. Still want it, though. Actually, I want several. :/

o A multicore Mac mini with a real graphics card, 8 GB or more of RAM, and network, USB, HDMI and audio ports.

I have uses for all of those. Failing that -- and more fail is in fact my expectation -- I'm done with them. And I have no use whatever for "integrated" graphics.

What's annoying is that just about when they finally managed to get a stable OS with most of the features I like and want (and the ability to get around stupid features like "App Nap"), they totally borked the hardware side. I just can't win with Apple. Sigh.

Comment NTMP (Never Too Many Pixels) (Score 1) 263

Sheeeeeit. These are NOTHING compared to the 16K displays that'll be out in the spring. I hear that's when they're going to add the mandatory "oil cooling hotness" to the Mac Pro, too. Of course, if you wait till fall, those 32K displays are on the way!

[Looks sadly at N(ever)T(wice)S(ame)C(color) security monitor...]

As Cheech and Chong might have put it, "Even gets AM!" Well, ok, old school TV that isn't broadcast any longer. But you know what I meant.

Or not. I'm old.

GET OFF MY NURSING HOME'S LAWN!
