Comment Re:That's nice (Score 1) 142

I wouldn't hold MS up as some sort of example of good support, as none of the last four releases got more than six years of mainstream support:

  • Vista: RTM 8 November 2006, EOL Mainstream 10 April 2012
  • 7: RTM 22 October 2009, EOL Mainstream 13 January 2015
  • 8: RTM 1 August 2012, EOL Mainstream 12 January 2016
  • 8.1: RTM 27 August 2013, EOL Mainstream 9 January 2018

Yes, you can pay for extended support that runs past these dates, but that's a different animal altogether.
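
For what it's worth, the arithmetic holds up; a quick sketch in Swift using the dates listed above:

    import Foundation

    // Mainstream-support spans from the list above (RTM -> EOL Mainstream).
    let spans = [
        ("Vista", "2006-11-08", "2012-04-10"),
        ("7",     "2009-10-22", "2015-01-13"),
        ("8",     "2012-08-01", "2016-01-12"),
        ("8.1",   "2013-08-27", "2018-01-09"),
    ]

    let fmt = DateFormatter()
    fmt.dateFormat = "yyyy-MM-dd"
    fmt.timeZone = TimeZone(identifier: "UTC")

    for (name, rtm, eol) in spans {
        let seconds = fmt.date(from: eol)!.timeIntervalSince(fmt.date(from: rtm)!)
        let years = seconds / (365.25 * 24 * 3600)
        // Prints roughly: Vista 5.4, 7 5.2, 8 3.4, 8.1 4.4 years, all under six
        print(name, String(format: "%.1f years", years))
    }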

Comment Re:Not the real thing? (Score 1) 365

even purer when man-made?

Diamonds have been made synthetically for a long time, but mostly for industrial cutting and drilling (i.e., diamond-tipped blades). The crystal sizes were small and not structurally perfect, as they would need to be for a gem. Now that larger and purer diamonds can be made, they are often too perfect, since natural diamonds have minor flaws. I see this development as being like the pearl market. For a long time natural pearls were considered "better", but they are often irregularly shaped. Cultured pearls are near-perfect spheres, which makes them more aesthetically pleasing. These days natural pearls are still rarer and more valuable, but they sell less frequently because demand has shifted to cultured ones.

Comment Re:I wonder when we'll see A-series MacBooks? (Score 1) 136

Of course they could. My point is that they would end up needing to modify the cores themselves significantly to ramp up the core count, and that those sorts of changes would, I suspect, be too significant to pay off when you're talking about a laptop.

Well, ramping up the core count is one way of boosting performance; however, I don't expect an Ax MacBook to be a powerhouse. Again, I expect Apple would replace the MacBook Air with it if they did it.

And no, the reason they design their own chips is that they can blow the doors off of what the other companies achieve in terms of power consumption by hand-optimizing the heck out of the core designs. On cell phones, that makes a big difference, and in quantities of tens of millions, the R&D cost per unit is small. On laptops, that makes a much smaller difference (because the batteries are huge by comparison) and the R&D cost per unit is relatively large.

But Apple is not doing this from scratch. They have a design already. Whether it is enough to power a laptop is a different question.

And yet every time I open up my Xcode project at work, Xcode sits there with a single CPU core pegged at 100% for a couple of minutes just to load the project, and several more minutes before it stops SPODing long enough to be usable, and basically the CPU is pegged at 100% for about an hour before indexing finishes. Real-world code doesn't always parallelize easily.

I'm not understanding your argument. First you say it's hard to do multicore CPUs and get them right, but then you also say multicore usage in the real world does not work well. Not all problems can benefit from multicore; some can.

To replace existing machines, the GPU would need to be at least as fast as what they're shipping now...

Why? If you are looking at the high end, like gaming laptops, that assertion would be true. But if Apple were to design an Ax laptop, it would probably replace the MacBook Air, not the Pro, and the Air does not use the most powerful CPU, much less GPU. Look at the most recent MacBook Air (2015): it used Intel HD 6000 graphics, which loses to even the lowest-end Nvidia mobile GPU of the time (the 920M). While the current-generation Intel Iris beats that older chip, how much horsepower do you really need for such a laptop?

Comment Re:Nope, not bashing (Score 1) 224

Or that Trump cares more about trying to control his image on social media than Clinton does. It doesn't surprise me that Trump's priorities lean more toward social media than toward the campaign itself. For example, I can imagine Trump devoting lots of time to making sure the domains were owned, while Clinton was more concerned about staffing offices in battleground states.

Whatever your opinion of Clinton, it is beyond doubt that she has spent a great deal of her life and career under public criticism. From being the wife of a governor to her own career, public scrutiny comes with the job. At some point, trying to control that scrutiny is pointless.

Comment Re:I wonder when we'll see A-series MacBooks? (Score 1) 136

Yeah, but adding cores isn't free. The more cores you add, the more challenging it is to keep their caches in sync. Two cores are relatively easy. Four cores are considerably harder. Six or more cores to match the multicore performance of modern Intel chips are harder still. Obviously it can be done (because it has been done many times), but the point is that cranking up the core count is a non-trivial piece of engineering.

Which is a problem for all multi-core CPUs, not just Apple's. I would argue, though, that optimizing two different sets of cores (2+2) might be harder than four of the same core. My point remains that Apple has optimized the number of cores for each device, sometimes removing a core. Apple could design a quad-core Ax laptop CPU if it wanted to.

The reason it makes sense for Apple to build their own chips for cell phones is because they turn around and build ten million of each model. It makes a lot less sense to spread higher R&D expenses (for a much more complex chip) across a tenth as many devices (or less).

Not really. They design their own chips so that the requirements of each device can be optimized for, as opposed to accepting whatever design Samsung or even Qualcomm made to work across a vast array of different devices and customers. There's no reason they could not extend that to laptops. For example, they tweak each Ax chip for different purposes, whether iPad or iPhone, such as adding more GPU cores for higher resolutions.

And that's still ignoring the elephant in the room, which is that the per-core speed would need to be half again higher than it currently is, and that's before you consider that initially 100% of the software you run would be emulated. If they could double the per-core speed, it might be practical, but I would be surprised if Apple could reliably clock an A10-derived CPU at 4+ GHz. I mean, you never know, but it doesn't seem very realistic.

Why would the software be emulated, and why would the clock speed need to be half again higher? Apple has done a lot of work on the software side of the OS with Grand Central Dispatch to ensure that the OS and the programming languages know how to use and optimize for multiple cores.
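
To make that concrete, here's a minimal sketch of the kind of thing GCD gives you; the buffer and workload are made up purely for illustration:

    import Dispatch

    // Hypothetical workload: fill a large buffer in parallel.
    // concurrentPerform schedules iterations across however many
    // cores the machine has, and blocks until they all finish.
    var results = [Double](repeating: 0, count: 1_000_000)
    results.withUnsafeMutableBufferPointer { buffer in
        DispatchQueue.concurrentPerform(iterations: buffer.count) { i in
            buffer[i] = Double(i).squareRoot()  // each index written exactly once
        }
    }

The point is that the runtime, not the code, decides how many cores the work spreads across; the same binary scales from two cores to eight.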

Although you can generally scale GPUs to massive levels of parallelism much more easily than CPUs, the question becomes whether you're really saving power or expense when you scale up an underpowered GPU by throwing more cores at the problem. And discrete GPUs have been a constant source of heat-induced hardware failures. A scaled-up PowerVR GPU would probably have the same issues unless they specifically designed it to disperse heat evenly. My gut says that a scaled-up design would end up being a ground-up redesign rather than a bunch of existing cores on a single die. I could be wrong, though.

Well, looking at how Apple has used multi-core GPUs in the past, it isn't using them to run parallel copies of the same computation. They are split so that different GPU cores take different parts of the screen, which is the same principle as Nvidia's SLI and AMD's CrossFire, in a way. Also, if you are buying an Ax laptop, I would guess that gaming and bitcoin mining would not be the primary reasons to get one. It would be more of a replacement for the MacBook Air than for the MacBook Pro.
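
As a toy sketch of that split-frame idea (nothing here is a real GPU API; it just shows how the screen gets partitioned):

    // Toy split-frame partitioning: each "GPU" takes a horizontal band.
    let screenHeight = 2_160
    let gpuCount = 2
    let bandHeight = screenHeight / gpuCount

    for gpu in 0..<gpuCount {
        let top = gpu * bandHeight
        print("GPU \(gpu) renders rows \(top)..<\(top + bandHeight)")
    }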

As for heat issues with discrete GPUs, remember again that Apple would not be aiming this laptop at high-end gaming. They can pick a low-end GPU with just enough power to handle the display.

Comment Re:I wonder when we'll see A-series MacBooks? (Score 1) 136

The biggest problem is that the A10 is still a two-core chip.

It is 2+2 because it was designed for phones. It does not have to be 2+2 if designed for a laptop; it could be a true quad-core. Remember, Apple ships variants of its Ax processors all the time. The Apple TV used a single-core A5 that was only ever made for the Apple TV. Still, I don't see it being as powerful as an Intel chip.

The second biggest problem is that the GPU in the A10 is designed to drive a 1920×1080 screen. Even the iPad Pro's GPU is designed to drive only a 2732 x 2048 screen. That's less than half the pixels on an iMac screen, and the iMac's GPU has to routinely drive up to two 3840x2160 UHD screens on top of that. So basically, the GPU performance would probably need to go up by an order of magnitude to replace what's there now, or else they would have to use an external GPU.

The six-core GPU drives a 1920x1080 display. Again, in a laptop there would be more room to squeeze in additional cores to handle a bigger display. Also, there is no reason Apple has to use a PowerVR GPU; they could rely on discrete GPUs as they currently do in their high-end laptops.
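
For reference, the raw pixel arithmetic behind the parent's numbers, assuming the 5K Retina iMac's 5120x2880 panel, works out like this:

    // Pixel budgets: iPad Pro panel vs. an iMac 5K plus two UHD externals.
    let iPadPro = 2_732 * 2_048            // 5,595,136 px
    let iMac5K  = 5_120 * 2_880            // 14,745,600 px
    let uhd     = 3_840 * 2_160            // 8,294,400 px each

    print(Double(iPadPro) / Double(iMac5K))          // ~0.38, i.e. less than half
    print(Double(iMac5K + 2 * uhd) / Double(iPadPro)) // ~5.6x in the worst case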

Comment Re: Text (Score 1) 136

These are Pros, not Airs for consumers.

[sarcasm]And pros never buy during the Christmas season. And consumers never buy the MacBook Pro.[/sarcasm]

Yes, the newer GPUs use Samsung's 14 nm process compared to the 28 nm of the previous generation. Big boost in performance.

[Citation Needed]. My information says that the Radeon Pro 455, which is the Radeon Arctic Islands architecture, is manufactured on TSMC's 16nm FinFET process, not a 28nm process.

These are supposedly for professionals. Not capable of any real work dealing with VMware Fusion, Adobe Premiere, compiling code, or anything else a professional would use.

Let's look at this argument. You are saying that pros would benefit greatly from using Kaby Lake over Skylake. The fastest mobile Kaby Lake is the Core i7-7500U (CPU score: 5381), versus the Core i7-6700T (CPU score: 8971) that Apple used in the MacBook Pro instead. The CPU Apple used clobbers the best mobile Kaby Lake by roughly two-thirds. Want to guess why? Because Intel hasn't released a quad-core mobile Kaby Lake CPU yet. Intel isn't expected to release the quad-core versions until December, which means Apple might have them ready for consumers in April at the earliest.

At best, pros would have to wait six to nine months for the next MacBook Pro. Or Apple could release a MacBook Pro with the processors it has now.

Comment Re: Text (Score 2) 136

First of all, the components are not "out of date". We're not talking about Core 2 Duo CPUs and Radeon 6000 GPUs; the components simply are not the leading edge of the newest generation. Second, it's not like Apple and MS targeting new products at the holiday season is a big surprise to anyone, including Intel, AMD, and Nvidia. According to you, they should delay the release of a product that works fine and forgo the busiest retail season for the vast majority of consumers just so that a few geeks can boast they have the latest and greatest processors. Frankly, most consumers won't know the difference between the processors well enough to give a damn.
