
Comment Re:Even Apple is abandoning Objective-C (Score 1) 407

http://ask.slashdot.org/story/14/11/28/2148254/ask-slashdot-objective-c-vs-swift-for-a-new-ios-developer

http://ask.slashdot.org/story/14/09/26/2247211/ask-slashdot-swift-or-objective-c-as-new-ios-developers-1st-language

From http://www.raywenderlich.com/74138/swift-language-faq:

So you can still use Objective-C. However, Apple seems to be encouraging you to use Swift for any new development, while not expecting you to go back and re-write all of your Objective-C code.

From www.archer-soft.com/en/blog/technology-swift-vs-acobjective-c-pros-and-cons:

Swift was created in order to replace Objective-C, however, Swift is capable of working alongside Objective-C while using Cocoa and Cocoa Touch frameworks.

None of which are first-party sources. Please cite one.

Here, I'll help you out. Here is a quote from one of your own sources:
"To quote Apple, “Objective-C is not going away, both Swift and Objective-C are first class citizens for doing Cocoa and Cocoa Touch development.”"

You're purposely dancing around Apple's own statements that Swift and Objective-C are first class citizens.

Wait, first class citizens? That sounds a lot like what I said in an earlier reply. Hmmmmmm.

Comment Re:Even Apple is abandoning Objective-C (Score 3) 407

Of course Apple hasn't said Objective-C is a dead end. There would be a revolt and a mass fleeing from the platform if they did that.

...from their own developers? Again, out of the whole community, they have the largest Obj-C source base. If they abandoned Obj-C within the next 10 years, they wouldn't be able to ship anything. It would take at least a decade to rewrite everything, if that were even their goal, during which time they'd ship no features. Apple can't abandon Obj-C because they need to use Obj-C. If they abandon Obj-C, they abandon Mac OS X and iOS, and they'll be done in the market. And given that new APIs are still being written in Obj-C, that's a process they haven't even started yet. In April they're shipping a brand new hardware platform that still runs on Obj-C.

For a past data point, Microsoft said they were going to rewrite Windows in C# for Vista. How did that go? Rewriting an entire code base in a new language is simply not realistic. If you're an engineer, you should know that.

Furthermore, once you weigh the pain and suffering of moving everything to Swift, maintaining Obj-C looks like a far easier and more desirable course. And that's what Apple is doing.

But they sure as hell aren't encouraging people to stay on Objective-C instead of moving to Swift.

I've talked with engineers on the Swift team who've said that's not the intention (and who are wondering why the public thinks it is). But please. Do go on.

I've read several articles and summaries (including on Slashdot) that have made it clear Apple wants people to use Swift.

"I've read articles from other people who think they know what they are talking about, and they wrote something that they think is right. Look at me! I'm such an expert!"

Oh, wait. Never mind. Apple person. You don't deal with the same reality as the rest of the world.

I... deal with the realities on the Apple platform?

Comment Re:Even Apple is abandoning Objective-C (Score 4, Interesting) 407

Apple has made it clear their development future lies in Swift, not Objective-C.

That means you're choosing between a popular, well supported language and a dead end.

The choice should be obvious.

They've done no such thing. The biggest writer and maintainer of Obj-C code is Apple. They're sitting on a huge source base they'll continue developing. Please link me to where Apple has said Swift is replacing Obj-C. Because they haven't. And they've said the opposite many times. Everything I've read and heard is that Obj-C will continue to be a first class language on iOS and Mac, with Swift and Obj-C both treated as first class citizens. You can have more than one language on a single platform. Shocking, I know.

Not to mention, for such a dead end, Apple's still writing a lot of new Obj-C. The iWatch OS (what runs on the watch itself) is Obj-C. Apple has not shipped a single API on Mac or iOS written in Swift. Not one. So it makes zero sense that Apple would consider Obj-C a dead language, and yet continue to write source they'll have to maintain for years in it. And if you think Apple is going to rewrite the millions of lines of Obj-C in Mac OS X and iOS in Swift, you really don't understand software engineering very well.

Another problem is that Swift is missing basic language features. Obj-C can link directly against C++ code (via Objective-C++). Swift? Nope. That alone means Swift can't replace Obj-C. Everyone has C++ code they need to link to. Apple has C++ they need to link to in their own APIs. So does Adobe. So does Microsoft. They'll probably fix it in the future, but you can't suggest with a straight face that Swift is going to replace Obj-C until it's fixed.
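For what it's worth, the usual workaround today is to hide the C++ behind a plain C interface that Swift can import through a bridging header, while an Obj-C file just gets renamed to .mm and uses the C++ classes directly. A minimal sketch of such a shim header, with entirely made-up names (EngineHandle, engine_compute, etc.):

    /* geometry_bridge.h -- hypothetical C shim around a C++ engine.
       Swift can't import C++ directly, so the C++ gets wrapped behind
       plain C functions like these, which Swift can call via a bridging
       header. Obj-C skips all of this: rename the file to .mm and it
       can use the C++ classes directly. */
    #ifndef GEOMETRY_BRIDGE_H
    #define GEOMETRY_BRIDGE_H

    #ifdef __cplusplus
    extern "C" {
    #endif

    typedef struct EngineHandle EngineHandle;   /* opaque handle to the C++ object */

    EngineHandle *engine_create(void);                        /* wraps `new Engine()` in the .cpp */
    double        engine_compute(EngineHandle *e, double x);  /* forwards to a C++ member function */
    void          engine_destroy(EngineHandle *e);            /* wraps `delete` */

    #ifdef __cplusplus
    }  /* extern "C" */
    #endif

    #endif /* GEOMETRY_BRIDGE_H */

It works, but writing and maintaining a shim like this for every C++ dependency is exactly the kind of busywork Obj-C never asked of anyone.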

Now look, I'm not trying to argue against Swift here. It's a valuable language to use and learn. This isn't a desperate "Obj-C forever!" post. But if you think Obj-C is going anywhere in the next decade or two... It can't. Apple will continue upgrading it, and continue supporting it, or else they're going to end up putting themselves in a corner where they can't even maintain their own software. That's not opinion, that's realism. It's knowing when a tool is right for a problem. And we're nowhere near Swift even being able to entirely replace Obj-C in usage.

Heck, the last Xcode beta even shipped with some upgrades to Obj-C, so I don't even need to argue that point. It's not a question of whether Apple will keep advancing Obj-C. They already are.

Comment Re: The future is not UHD (Score 4, Interesting) 332

Any TV you can buy today can do 60 fps over HDMI. The frame rate push has been possible for years; the content just never showed up.

It's also arguable whether that's the future. Everyone seems pretty happy with film's current frame rates, and the 48 fps Hobbit wasn't well received.

Comment Re:Too Late? (Score 1) 165

Isn't Microsoft announcing a new web browser intended to replace Internet Explorer today? Maybe it'll be open source. Maybe it'll even be based on Webkit.

I sure hope not. We need competing browser engines to keep things honest. The competition between them is the only way we ever get standards compliance.

Spoken by someone who wasn't around for the web browser wars of the 90s...

Multiple browsers led to less compliance, not more. Both Netscape and IE were in a rush to add their own non-standard HTML elements to "outdate" the other. ActiveX didn't come along at a time when IE owned the market. ActiveX came along when IE was in fierce competition with Netscape and needed to BREAK the standard to push Netscape out of the market.

Having lived through that, I've never understood the logic of "we need multiple browsers to maintain standards." That's never actually happened in practice. It's like free market philosophy run amok. Even today, we still see a bit of it, with draft or pre-draft features getting added to browsers outside the standards process. Stuff like NaCl is not part of any web spec and is entirely proprietary to Google, but hey, even with all the competition that's supposed to stop that, it still exists. Because competition encourages vendors to build their own proprietary stuff to beat the other browsers with.

Comment Re:So not Python, but VB? (Score 1) 648

I also disagree about C being "incredibly complex for a beginner". I found C to be very easy to grasp and very good at exposing what the computer is actually doing under the hood. I would agree that programming C well is complex (and also time-consuming), but that is because it is simple, not because it is complex.

As someone who went down this path and is now a professional software developer many years later... C sucks as an intro language.

Yes, it's very important for the work I do now. It's clean, simple, and easy to understand. It's also totally useless to a beginner.

I used BASIC (and HyperCard) way back when I was a kid because I could actually do things in it. Want to code a 3D game? I could do that. How about something that could play movies? Yep, easy. A basic text editor? I could do that too. With C? Lots of hello world and "add these numbers together." C's minimalistic nature may be a strength in actual practice, but for a kid who wants to create things quickly, it sucks. I'd argue early education is about getting kids interested in coding, with less emphasis on the practices they might use in a job 10 years down the road. Just get them in the door first.

If C were my only intro to programming, I would have lost my mind. Try to get more kids interested in development with only a command line interface and see how that goes. (Yes, there are libraries like SDL, but that takes you far outside the realm of novice student development.)

Comment Re:How parallel does a Word Processor need to be? (Score 2) 449

Or a spreadsheet? (Sure, a small fraction of people will have monster multi-tab sheets, but they're idiots.)
Email programs?
Chat?
Web browsers get a big win from multi-processing, but not parallel algorithms.

Linus is right: most of what we do has limited need for massive parallelization, and the work that does benefit from parallelization has been parallelized.

This is kind of silly. Rendering, indexing, and searching get pretty easy boosts from parallelization. That applies to all three cases you've listed above. Web browsers especially love tiled parallel rendering (very rarely these days does your web browser output get rendered into one giant buffer), and that can apply to spreadsheets too.
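To make "tiled parallel rendering" concrete, here's a minimal sketch using POSIX threads. The names and numbers (render_pixel, the strip count, the buffer size) are made up for illustration: the framebuffer is cut into horizontal strips and each strip is rendered independently, which is roughly how renderers spread this kind of work across cores.

    /* Minimal sketch of tiled parallel rendering with POSIX threads.
       Each worker fills one horizontal strip of the framebuffer, so
       the strips can be rendered independently and in parallel. */
    #include <pthread.h>
    #include <stdint.h>

    #define WIDTH  1920
    #define HEIGHT 1080
    #define NUM_THREADS 4

    static uint32_t framebuffer[WIDTH * HEIGHT];

    /* Placeholder per-pixel work; a real renderer rasterizes content here. */
    static uint32_t render_pixel(int x, int y) {
        return (uint32_t)(x ^ y);
    }

    typedef struct { int y_start, y_end; } Strip;

    static void *render_strip(void *arg) {
        Strip *s = arg;
        for (int y = s->y_start; y < s->y_end; y++)
            for (int x = 0; x < WIDTH; x++)
                framebuffer[y * WIDTH + x] = render_pixel(x, y);
        return NULL;
    }

    int main(void) {
        pthread_t threads[NUM_THREADS];
        Strip strips[NUM_THREADS];
        int rows = HEIGHT / NUM_THREADS;

        for (int i = 0; i < NUM_THREADS; i++) {
            strips[i].y_start = i * rows;
            strips[i].y_end   = (i == NUM_THREADS - 1) ? HEIGHT : (i + 1) * rows;
            pthread_create(&threads[i], NULL, render_strip, &strips[i]);
        }
        for (int i = 0; i < NUM_THREADS; i++)
            pthread_join(threads[i], NULL);
        return 0;
    }

The same strip-splitting idea applies to indexing and searching: partition the data, work the partitions in parallel, merge at the end.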

A better question is how much parallelization the average user needs. While the software algorithms should scale nicely to any reasonable processor/thread count, on the hardware side you do have to ask how many cores we really need, especially since a lot of users are happy right now. But targeting these sorts of operations at a single thread is also the entirely wrong approach. It's not power efficient for mobile users, and it drastically limits the gains your code will see on new hardware while competing code bases pass you by.

Comment Re:Considering how few boys graduate at ALL (Score 1) 355

To be fair, teacher pay sucks. We all know it. There isn't a debate there.

So you don't exactly make a solid point by saying "Hey look! Women dominate in all the crappy low paying jobs! How are they oppressed?"

Do women dominate in teaching because they choose to go into teaching, or because society is corralling them into teaching because it's pretty much on the bottom end of the career ladder?

I'll tell you, any time I look at teaching (which I'd be interested in), I go "hell no" when I see the pay scales, and go back to my normal engineering job.

If you think discrimination is not a thing, perhaps you're deluded enough to think that women, as a large group, are all collectively choosing to put themselves into low-paying, low-recognition work.

For bonus points, break out the gender ratios in education by pay scale. You'll find that as the level of academia and the pay increase, the proportion of women declines. Gee, that's funny.

Comment Re:all this info for what? (Score 1) 278

3: Other country's laws. People don't realize it in the US that Thailand's lese majeste laws apply here? Well, they do, and an American can get shipped over there for breaking them, due to extradition treaties. Same with Turkey and the Kingdom of Saudi Arabia. In theory, someone handing out events for their pagan festival or church bulletins can be shipped over there to be executed, due to violating Islamic sharia laws. Privacy is important, since it isn't just domestic LEOs, but LEOs of foreign countries who can press charges and have US citizens answer for them. Right now, it tends not to be enforced, but the laws are on the books, and the pastor who was televised burning a Koran might find himself in Riyadh facing an imam and a crowd with rocks and a can of gasoline.

Errrr, no, that's totally wrong. Where did you learn this stuff?

If you commit an illegal act in Thailand and then enter the United States, there is a chance the US could return you to Thailand. If you do something in the United States that is illegal in Thailand but not illegal in the United States, then it does not matter at all. Only US law applies to acts committed in the US.

I don't know where you learned your understanding of extradition law, but this is so far out in left field. Maybe you should lay off the internet conspiracy crack for a while.

Seriously, learn a few things about extradition. It only applies to crimes committed in the country that's trying to get its hands on the person.

Comment Re:Hasn't this been known? (Score 4, Insightful) 163

Well, now I'm reading specs on USB 3.0 controllers. Ugh. There's a lot on mapping a bus address to a memory address for DMA, but nothing on the security implications of doing so, or on what devices are allowed to do; just broad hints, like the buffer having to exist in a DMA-able part of memory, without saying whether that's a security constraint or a hardware one.

It would be nice to see a follow-up article on whether and how USB 3.0 protects against these things, because I'm not a kernel USB developer sort of guy; while I know DMA is there, I don't feel like I'd be able to dissect these implementation specs.

Comment Re:Hasn't this been known? (Score 1) 163

same thing as a pci-e / pci / cardbus / express card with a boot ROM or flash. They load pre boot at least on non mac systems you can go to bios and trun off option roms / set it to EFI only mode.

Apple exposes a bunch of pre-boot firmware options on the command line, but I'm not sure whether you can disable pre-boot EFI drivers from there.

Comment Re:Hasn't this been known? (Score 4, Interesting) 163

I'm pretty sure in the case of USB 3 that DMA is a function of the host controller. A device by itself cannot inject into arbitrary memory. This thunderbolt "vulnerability" is the equivalent of the windows autorun on insertion function that was disabled years ago. Only this functions above the level of the current user (aka much worse).

I'm looking up DMA for USB 3. Although there are some ways to secure DMA (like a whitelist of addresses/sizes that are safe to write to), all of the advertised functionality of USB 3, such as the sustained data rates, would be very hard to achieve without direct access to memory. That's why FireWire ruled live data streaming for so long: DMA made its rates reliable, whereas USB's dependence on the controller and CPU for memory transfers made the throughput flakier.
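As a rough illustration of the "whitelist of addresses/sizes" idea, and not any real controller's or IOMMU's API (the region table and function names are invented), the check boils down to something like this in C:

    /* Sketch: before a device-requested DMA transfer is allowed, the
       target range is checked against regions the host has explicitly
       marked safe. Purely illustrative; real enforcement lives in
       hardware, not a software loop like this. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    typedef struct {
        uint64_t base;
        uint64_t length;
    } DmaRegion;

    /* Regions the host has pinned and approved for device access (made-up values). */
    static const DmaRegion allowed[] = {
        { 0x100000000ULL, 0x200000 },
        { 0x180000000ULL, 0x100000 },
    };
    static const size_t allowed_count = sizeof(allowed) / sizeof(allowed[0]);

    /* Returns true only if the requested range lies entirely inside one
       approved region; empty or wrapping ranges are rejected outright. */
    bool dma_request_allowed(uint64_t addr, uint64_t len) {
        if (len == 0 || addr + len < addr)
            return false;
        for (size_t i = 0; i < allowed_count; i++) {
            if (addr >= allowed[i].base &&
                addr + len <= allowed[i].base + allowed[i].length)
                return true;
        }
        return false;
    }

In real systems this kind of gatekeeping is done in hardware by an IOMMU (Intel's VT-d, for example), which is exactly the sort of protection the article never says whether these controllers use.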

Comment Re:uh - by design? (Score 3, Informative) 163

Thunderbolt is more like USB to the user - it's a thing you use to connect untrusted devices to your system. You wouldn't expect that plugging in a USB thumbdrive would magically own your system (well, maybe you should, because it's happened in the past, but I think it's fair to say that it shouldn't). You'd think that plugging in a random Thunderbolt device would be designed to be safe. Apparently not: apparently Thunderbolt is unsafe by design.

USB 3.0 has this exact same feature (DMA), so yes, yes you should expect a USB thumb drive to be able to do this.
