Comment Re:Fsck x86 (Score 3, Informative) 230

x86 is hardly any less proprietary than PowerPC or SPARC. You've got Intel and AMD at the helm. VIA walked the plank ages ago.
Apple ditched PowerPC because Apple's market share was so fucking low that the only company compiling for PowerPC was Adobe. The decision to drop PowerPC had to do with market share and cost, not the architecture itself.

Yes? No? I think this is a misunderstanding of the motivations behind Apple's PowerPC switch. (Source: I wrote PowerPC Mac apps at the time and was in the room at WWDC when Apple announced the switch.)

The PowerPC market was a bit wider than that. Microsoft had Office on PowerPC, Adobe had their suite, and there was a smattering of other apps.

At the time, the future of PowerPC looked pretty bright. Microsoft's Xbox 360, Sony's PS3, and Nintendo's Wii were all switching to PowerPC. Within a span of several months, the community was looking at a majority of gaming hardware being PowerPC based. PowerPC was going to be in very high demand, which would mean great things for the Mac PowerPC platform. Far from "the only company compiling for PowerPC was Adobe", Microsoft was buying Power Mac G5 boxes for their dev kits and porting Windows to PowerPC for the Xbox 360. And in the end, Microsoft, Sony, and Nintendo combined shipped several hundred million units based on the PowerPC (with Nintendo still shipping the PowerPC-based Wii U today).

So why did Apple leave the PowerPC?

At the time, laptop sales were on the rise, but Apple's laptop CPUs were not designed by IBM; they were designed by Motorola. IBM's PowerPC G5 was suitable for the Xbox 360 and desktop machines, but it ran far too hot to go into laptops. That left Motorola with their G4 CPU. And let me tell you, Motorola probably had very smart people working for them, but their execution was incompetent. The G4 had a 133 MHz system bus (which was slow even for the time), ran very hot (though still cooler than the G5), and, worst of all, was much slower than Intel's Pentium M.

Meanwhile, the Pentium M was doing very well. It was faster than the G4, more power efficient, and it actually had a modern chipset and bus. Switching to the Pentium M was a no-brainer.

There was speculation that Apple was trying to get IBM to make a mobile G5, but they were never able to get the power consumption down. When Microsoft and Sony entered the scene, IBM's interest shifted to getting the PowerPC into larger form factors, and Apple just didn't ship enough laptop units to balance out the R&D demand that Microsoft and Sony created.

Motorola, meanwhile, just kept sucking with the G4. A modernized successor to the G4 did eventually ship, but by then Apple was done with PowerPC.

Intel provided a stability that the AIM (Apple, IBM, Motorola) alliance just didn't, along with a quality chip. PowerPC did end up scaling, but there simply wasn't enough demand for PowerPC machines at the time to make it scale in the direction Apple needed.

So were people not actually writing code for PowerPC? No, lots of people were. I'd actually guess that after Apple left PowerPC, the number of PowerPC developers continued to rise. And with the Xbox 360, Sony PS3, and the Nintendo Wii/Wii U continuing to get new games, there are still a lot of PowerPC developers out there.

Comment Re:Somebody post a SWIFT example PLEASE! (Score 3, Insightful) 636

Ok, you guys are too slow, I RTFA and downloaded the iBook. So far, I am very much liking the SYNTAX, especially OBJECTS and FUNCTIONS, they even brought the LET keyword in from BASIC. SWIFT will make programming Apple products much easier for the C loving syntax crowd, from what I can see. Ahhh... what a breath of fresh air. Code snippet below of creating an object and exercising it. I feel bad for those that suffered through Objective-C.

To be honest, while this snippet is a few lines shorter, it's arguably more complicated than the corresponding Obj-C. It drops returning self in the init, and drops a few lines that would have had to go into the class definition, but you gain a few unsightly keywords like "override", have to add the keyword "func" to every function, and you pick up some more syntactic mess like "->".

It's not horrible, but I'm not sure this sample is more readable than Obj-C. As others have noted, Swift has a habit of taking the important parts of a declaration (like what a function is named and what it returns, or what a class is named and what it subclasses) and shoving them off to entirely different sides of the declaration.
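
For anyone who hasn't seen the syntax yet, here is a rough toy example of my own (not the parent's snippet, and written against current Swift rather than the WWDC beta) that puts the keywords under discussion in one place: "let", "func", "->", "override", and an init that doesn't return self.

    class Shape {
        let name: String                      // "let" declares a constant

        init(name: String) {
            self.name = name                  // no "return self" as in an Obj-C init
        }

        func area() -> Double {               // "func" on every function, "->" before the return type
            return 0.0
        }
    }

    class Square: Shape {
        let side: Double

        init(side: Double) {
            self.side = side
            super.init(name: "square")
        }

        override func area() -> Double {      // "override" is required when redefining a method
            return side * side
        }
    }

    let square = Square(side: 3.0)
    print("\(square.name): \(square.area())")    // prints "square: 9.0"

Whether that reads better or worse than the equivalent @interface/@implementation pair is exactly the judgment call being argued here.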

Comment Re:None of the baggage of C? (Score 2) 636

From what I can tell (I just got out of WWDC and am reading through the docs), C can be bridged to, but not directly called. You can directly call Obj-C methods through the bridge, but not C functions. You'd have to bridge to Obj-C methods which then call the C functions.

I don't know what happens when that Obj-C method calls malloc and returns some memory for leak-tastic behavior. I still haven't read if or how Swift handles raw memory buffers.
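
For what it's worth, here's a hedged sketch of that workaround as I read it, assuming a hypothetical Obj-C class called BufferHelper that you write yourself and expose through the bridging header (it is not an Apple API); the actual bridging behavior may differ once the docs settle.

    // Hypothetical Obj-C wrapper, declared on the Obj-C side and imported
    // via the bridging header (not an Apple API):
    //
    //   @interface BufferHelper : NSObject
    //   - (void *)allocateBuffer:(NSUInteger)size;   // calls malloc() under the hood
    //   - (void)freeBuffer:(void *)buffer;           // calls free()
    //   @end
    //
    // From Swift you call the Obj-C wrapper rather than malloc() directly:

    let helper = BufferHelper()
    let buffer = helper.allocateBuffer(1024)   // allowed: it's an Obj-C method, not a C function
    // ... hand the raw buffer to whatever needs it ...
    helper.freeBuffer(buffer)                  // forget this and you get the leak-tastic
                                               // behavior mentioned above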

Comment Re:Blown out of proportion (Score 2) 406

I think this whole situation has been blown out of proportion.
How will this code, which allows loading a 3rd-party DRM plugin, be conceptually different from the bit of code that allows loading other closed-source plugins (Flash, Silverlight, etc.)?

It won't be.

I raised this point months ago when this whole DRM thing started, and no one had a good explanation. The best explanation was "Yes, but it's encouraging it more." Not that I understand how an arbitrary plugin architecture encourages DRM any less, because that's what we have today.

Comment Re:No Threat To Thunderbolt (Score 5, Informative) 355

What PCIe cards are you plugging in again? Graphics cards? You have yet to demonstrate that it is not a novelty. I have never seen a CAD setup like that. Nor have I heard of a gaming rig that uses a laptop CPU but has an external graphics box. Maybe you're right and it will be all the rage in CAD houses.

What devices are these? Still graphics cards?

http://www.red.com/store/produ...
http://www.blackmagicdesign.co...
http://www.nvidia.com/object/q...
http://eshop.macsales.com/item...
http://www.amazon.com/Apple-Du...

I could go on, but really the answer is "every single PCI-E card that exists", or "every single PCI-E card that is important to professional users, which doesn't stop existing just because you don't know about it."

Comment Re:No Threat To Thunderbolt (Score 4, Interesting) 355

Is there a real use case for connecting a PCI-E card to a system via an external port? The link you showed was basically an enthusiast/hobbyist novelty. If I actually need that sort of graphics power (gamers or CAD), I'm probably using a gaming rig or a workstation, which both have PCI-E slots in the case. I can't imagine what other sort of PCI-E cards I'd be carrying around with my laptop.

The point isn't to make PCI-E cards portable. It's to make it so you only need one machine. Why buy a desktop when you can simply plug the PCI-E cards straight into your laptop? You COULD buy a desktop with a bunch of PCI-E slots, but you don't need to now. Why buy a redundant CPU with a redundant motherboard just to drive a few PCI-E cards?

And if you're a pro with a desktop, and you run out of PCI-E slots, do you simply buy a whole new machine? Thunderbolt can drive six PCI-E devices per bus (http://www.macworld.com/article/2146360/lab-tested-the-mac-pro-daisy-chain-challenge.html). Most desktops don't have six PCI-E slots total.

A lot of pros are adopting Thunderbolt because it allows them to use the devices that used to require a desktop quickly and easily with a laptop, and they can reduce their machine count by one. Thunderbolt doesn't need to displace USB because it has a niche that USB effectively can't replace.

Comment No Threat To Thunderbolt (Score 5, Informative) 355

Thunderbolt isn't going to replace USB in all cases, but then Thunderbolt was never about the speed. It's about the protocol. Thunderbolt is basically PCI-E over a wire. Can you connect a GTX 780 Ti (http://techreport.com/news/26426/thunderbolt-box-mates-macbook-pro-with-geforce-gtx-780-ti) over USB 3.1? No? Not really a replacement then. The same goes for any other device that has traditionally been a PCI-E card. Or, you know, you can get an adaptor (http://www.sonnettech.com/product/echoexpressiii.html) and directly connect a PCI-E card.

Speed-wise, Thunderbolt is evolving too. At this rate there isn't much of a chance of USB 3.1 catching Thunderbolt. As the OP mentioned, Thunderbolt is still ahead of USB 3.1, and 40 Gbps Thunderbolt is coming soon (http://www.extremetech.com/computing/181099-next-gen-thunderbolt-details-40gbps-pcie-3-0-hdmi-2-0-and-100w-power-delivery-for-single-cable-pcs). But again, even if USB catches Thunderbolt, or both become fast enough, the protocols and designs of the connections make them entirely unsuitable for each other's uses (you wouldn't connect a mouse and keyboard directly to your PCI-E bus via Thunderbolt).

Comment Re:Linux/WIndows, or Mac too? (Score 1) 158

The article seems to mention Windows/Linux (or Linux/Windows). What about OpenGL/GLES drivers on other platforms, such as Mac OS X, Android, or iOS?

On OS X and iOS, the drivers work, I believe, but can be slow. The reason is, well, Apple pretty much wrote the drivers for AMD, nVidia, Intel, and Imagination Technologies. There probably was a lot of cooperation with the respective companies, but Apple pretty much wrote them itself, as the others do not have the time, money, or resources to write drivers for Apple.

Apple is not writing the drivers for AMD and nVidia. I'm not sure about Intel. At one time Apple wrote the Nvidia drivers (over a decade ago), but they never wrote the AMD drivers. AMD and nVidia definitely have internal teams writing their drivers these days.

Apple is responsible for the OpenGL stack and driver ABI, which is where they work closely with the GPU vendors. But they're taking drops of the drivers and pre-bundling them with the OS. That can make submitting bugs a problem: Apple is the one supplying the drivers, so you file bugs with Apple, but they're just forwarding the bugs on to teams at Nvidia or AMD.

There is a lot of finger pointing over the slowness issues. Sometimes it's clearly Nvidia or AMD; sometimes it's clearly Apple. Because Apple controls the OpenGL ABI and public interface, lagging OpenGL version updates are definitely Apple's problem, and could be the cause of some of the performance issues.

A lot of the issues come down to Apple's history. Apple was big into games and gaming performance back around 2000; there was usually an OpenGL game tech demo at every conference. A big driver of this was Apple's support of Bungie. When Microsoft bought Bungie, I think Jobs held a bit of a grudge against the gaming community. Apple tried to counter Microsoft's offer but came in too late. Ever since then, Apple's interest in games has gone away.

So a lot of the slowness issues are commonly thought to be Apple optimizing their drivers towards pro applications like Final Cut, and not spending much time optimizing for games.

Comment Re:Do you have the time? (Score 1) 309

Egad, what terrible advice. Yes, freshman year is the lightest workload if you came from a good HS, but it can be hard for people who come from crappy school systems.

But there is something more important, and that's having fun. College is the last real time in your life you can goof off and have a good time without severe repercussions. Studies need to be important and good grades a must, but with the lighter workload freshman year you should be having fun. That means making friends, dating, and having a good time. Once you graduate you are looking at almost 50 years of continuous 40+ hour workweeks with 2 weeks of time off a year.

Enjoy college, it's your last chance to act like a kid.

Well, I didn't mention dating or having fun, but that's not bad advice either. :)

Seriously OP, this is going to be one of the best dating pools you will have in your whole life.

If the OP is looking for things to do with his/her time, I was kind of assuming the whole social thing had been considered and rejected, but if your school is being paid for and you've got the time, it is one of the best times in your life to live a little.

Comment Do you have the time? (Score 4, Insightful) 309

Keep in mind: freshman year you're going to have the most free time of any year. By senior year your workload is going to be doubled or tripled.

With that in mind: I'd focus on your studies. If you have spare time, focus on getting other classes out of the way so you won't have to take them later. Or take other classes that could develop your degree and help you learn things you didn't know before. Take a network security class, or a graphics class. Something outside your wheelhouse.

If you're already at 18 credits and finding yourself bored: work on your own outside project, contribute to an open source project, etc. Whatever you do, do not commit yourself to a regular job with expected hours.

For reference: I worked while I was getting my degree (had to, I paid my own way) and it delayed my graduation about a year to a year and a half. So I'd only recommend doing it if you need the money.
