Meet Siri's Little Brother, Trapit 183

waderoush writes "Virtually overnight, Siri, the personal assistant technology in Apple's new iPhone 4S, has brought state-of-the-art AI to the consumer mainstream. Well, it turns out there's more where that came from. Trapit, a second spinoff of SRI International's groundbreaking CALO project (Cognitive Assistant that Learns and Organizes), is preparing for a public beta launch this fall. The Web-based news aggregator lets users set up persistent 'traps' or filters on specific topics. Over time, the traps learn to include more articles that match users' interests and exclude those that don't. Philosophically, it's the exact opposite of social-curation news apps like Flipboard or Pulse, since it uses adaptive learning and sense-making technologies to learn what users like, not what their friends like. 'Just as Siri is revolutionizing the human-computer interaction on the mobile device, Trapit will revolutionize Web search as we know it today,' the company asserts."

Comment Re:Not Reasons Unknown! (Score 2) 155

Native apps (including those running UIWebView) already use native ARM machine instructions as they wish (you can set the compiler to emit pure native ARM instructions, or write ARM assembly code if you want; Apple only controls which system APIs developers can access, which it can do for JS->machine compiled code equally well). So that "explanation" doesn't make much sense. It is more likely that they simply rushed the iOS upgrade out before their programmers had finished porting to UIWebView.

Comment Re:No surprises here (Score 4, Interesting) 273

It's trivial to block this -- just add a batch file, nofb.bat, that replaces your hosts file with one that redirects facebook to 127.0.0.1. If you use fb and wish to actually go there, you can have another batch file, gofb.bat, which switches back to the hosts file with the facebook entry commented out (the batch file may call a little executable that flushes the local DNS cache on your machine by resolving the affected domain name). In the general case, if you wish to do this selectively for n tracking sites, with n>1, you will need one batch file that blocks all of them and one per site that leaves just that one site unblocked, hence n+1 batch files. Also, going to any of the tracking sites to use their services will cost you an extra click in and out.

Note that google, digg and many others are doing the same kind of tracking, whether you subscribe to their sites or not. You get an ID on their servers attached to your cookies, tracking your visits anywhere their tracking bug is placed. That way they can sell to some site A you are visiting now the fact that you have also visited sites B, C, D, ... earlier (when and how many times each, what kind of content you used there, etc.). Of course, if the tracking servers know who you are, they can also sell that info to sites A, B, C..., at a higher price.

Comment Re:Dinosaur language (Score 2, Interesting) 351

Dr. Brad Cox -- and he had one main goal in mind: Be a strict superset of C

That's as worthy a goal as inventing a car that is a strict superset of the horse carriage, or a plane that is a strict superset of a hot air balloon. Plain C is not suitable for OOP due to its lack of overloading and of namespace creation/management.

This is by design. It allows dynamic messaging. You can even, for example, send a message to nil and everything is fine.

You shouldn't have to execute hundreds of CPU instructions to make a function call that checks whether the function pointer (or its parent pointer) is null. Three CPU instructions can easily handle it (e.g. OR EAX,EAX; JZ skip; CALL EAX). In fact, this dubious "feature" has probably cost me more debugging time than any other "feature" -- an uninitialized/released object quietly returns 0, which breaks something else many steps later and then has to be backtracked to the source of the problem. I would rather the code crashed right there at the call, so I can find the bug on the first crash, rather than reconstructing it from some subtle malfunction much later on.
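The three-instruction idea above can be sketched in plain C -- this is purely illustrative (the function and type names are invented for this sketch, not from any real runtime): an explicit null test before an indirect call, mimicking "message to nil returns zero" at the cost of one compare and branch rather than a hash-table lookup:

```c
#include <assert.h>
#include <stddef.h>

typedef int (*method_fn)(int);

/* A stand-in "method" implementation. */
static int double_it(int x) { return 2 * x; }

/* Dispatch through a function pointer with an explicit null check --
   roughly: OR EAX,EAX; JZ skip; CALL EAX.
   Returns 0 when the target is null, like Obj-C messaging nil. */
static int call_or_zero(method_fn fn, int arg) {
    if (fn == NULL)     /* test + branch */
        return 0;
    return fn(arg);     /* indirect call */
}
```

Dropping the `if` entirely gives the crash-at-the-call behavior the comment argues for: an uninitialized pointer faults immediately instead of silently producing zeros downstream.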

Again, only something you need in a statically linked object-inheritance style language like C++.

There is absolutely no gain (other than saving effort for the compiler writer at the expense of programmers and end users' CPU/battery) in hashing method names and searching a hash table on every invocation of a class method or property access, compared to storing the target function addresses in an array of pointers (which can be fixed up/relocated if needed by the loader) and calling them via function pointers retrieved in a single instruction through a compiler-generated index. The array of pointers, with an extern/export giving the app access to it, has dynamic flexibility fully equal to anything provided by hashed method names, while using hundreds of times fewer runtime CPU instructions (with more complex compiler code instead). Single-step once through the assembly code of a method call or property access, and recall that all it is doing is one or two instructions' worth of actual work.
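The dispatch scheme described above can be sketched in C -- a hypothetical illustration with invented class and selector names, not any actual compiler's output: a per-class table of function pointers indexed by compiler-generated selector indices, so each call is one indexed load plus an indirect call:

```c
#include <assert.h>

/* Indices a compiler could generate for each method name. */
enum { SEL_AREA, SEL_PERIMETER, SEL_COUNT };

typedef int (*method_fn)(int w, int h);

static int rect_area(int w, int h)      { return w * h; }
static int rect_perimeter(int w, int h) { return 2 * (w + h); }

/* The "vtable": relocatable by the loader, and still swappable at
   runtime for dynamic behavior, but dispatched without any name hashing. */
static method_fn rect_methods[SEL_COUNT] = { rect_area, rect_perimeter };

/* One array load + one indirect call per dispatch. */
static int dispatch(method_fn *table, int sel, int w, int h) {
    return table[sel](w, h);
}
```

Exporting `rect_methods` would let other code patch entries at load time or runtime, which is the kind of dynamic flexibility the paragraph claims the table retains.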

Again, by design. Named arguments makes Objective-C one of the best languages for code readability. You don't have to wonder what the arguments are!

I don't find [obj string:string count:count] any more readable than obj->string(count) or *obj(string,count). It's needless clutter that saved the compiler writer the trouble of implementing name mangling and overloading by shoving that part of the compiler's job onto the brains of future programmers. The compiler designer saved himself a few weeks at the expense of a few hundred million weeks of work for programmers. Great deal.

I agree that the Cocoa library objects / methods are verbose, but this is a GOOD thing.

It's a good thing if you are a manager who hasn't programmed since college and wishes to peek at what some code is doing without learning the language. Cocoa names are like having the manual for the class rewritten over and over in each statement.

But if you are trying to follow the pattern specific to the task of your code, the vast volume of the Cocoa names smothers it, making even the most trivial algorithm look like rocket science. Dragging along with each name its whole ancestry is exactly the opposite of the objective of abstraction, which is the key tool for conquering the complexities of programming. All aspects that are not strictly specific to the narrow task some function/method performs should be out of sight, just as one of the most valuable abstraction tools in computer languages, the function, hides all the variation of the caller's context and purposes from the implementation of the function. The function knows only the aspect of the world defined by its parameters and return values, and need not worry about whether, say, the rectangle it is operating on is a screen rectangle or a room floor...

With its lack of overloading, its named args, its poor namespace partitioning,... Obj-C is completely contrary to the objectives of abstraction, and Cocoa merely wraps around that fundamental defect, amplifying it as it layers its functionality on top. Further, the people who are attracted to such an anti-abstraction language are naturally anti-abstraction (concrete, pro-busywork) types, hence that type of mindset forms a self-reinforcing anti-abstraction loop with the system they are evolving, making it even worse.

Comment Re:Dinosaur language (Score 1) 351

You clearly don't know what you are talking about, which is probably why you were shut down on the Apple forum.

I wasn't shut down on the forum. Just one thread was frozen by some big honcho after they couldn't defend Obj-C in a fair debate.

Using things like KVO/Bindings and distributed objects REQUIRES that the functions be addressed by name, not by address.

On iPhone/iPad there are no distributed objects or distributed execution. Even on distributed execution systems, why would one use malloc-ed ASCII string objects for simple scalar parameters (integers, small enums, even single bits) to a service? I have written quite a bit of networking code (mostly in CFNetwork and BSD sockets whenever possible), with complex distributed state machines and execution, and all the protocols and state machines are pure binary, without needless back-and-forth conversions to/from ASCII, let alone malloc-ing a memory block for an ASCII string just to pass a true/false parameter to a service. If processing does need to be distributed, any de/serializing should be wrapped and hidden from the API user. The obsession with ASCII string objects and name-value dictionaries for simple scalar values is pervasive in Cocoa APIs, almost like some religious rite. Unless that's all you grew up with and don't know any better, it is extremely annoying because it is completely needless.
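The "pure binary" protocol style described above can be illustrated with a minimal C sketch -- the message layout, opcodes and field names here are invented for the example, not taken from any actual protocol: a fixed-layout header packed into a byte buffer, with no string conversion and no per-message allocation:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical wire header: scalars stay scalars, flags stay bits. */
typedef struct {
    uint8_t  opcode;   /* e.g. 1 = open, 2 = read, ...        */
    uint8_t  flags;    /* single-bit options packed here       */
    uint16_t length;   /* payload length in bytes              */
} msg_header;

/* Serialize into a caller-provided buffer, big-endian on the wire. */
static size_t pack_header(uint8_t *buf, const msg_header *h) {
    buf[0] = h->opcode;
    buf[1] = h->flags;
    buf[2] = (uint8_t)(h->length >> 8);
    buf[3] = (uint8_t)(h->length & 0xFF);
    return 4;
}

static msg_header unpack_header(const uint8_t *buf) {
    msg_header h;
    h.opcode = buf[0];
    h.flags  = buf[1];
    h.length = (uint16_t)((buf[2] << 8) | buf[3]);
    return h;
}
```

The point of the sketch: a boolean option costs one bit in `flags`, not a heap-allocated string object that must be parsed back on the other side.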

Comment Re:Dinosaur language (Score 1) 351

And i must say i also like categories and delegates, which allow surprisingly well organized code.

Categories are fine as far as they go (in alleviating the needless COBOL-like block rigidity of class interface/implementation, which gets in the way for larger classes; without that rigidity they wouldn't buy you much). Unfortunately, without a category being able to have its own variables (ideally in its own namespace), you still end up bunching all the variables a category needs back into the header of the main class interface, imported by others (making it more difficult to fully isolate autonomous parts of the category's functionality).

As to delegates, the fixed names of the delegate callbacks make it difficult to have multiple handlers for different callback services of the same type in the same class, e.g. returns from alerts or action sheets for the several popups a class may use.

Comment Re:Dinosaur language (Score 1) 351

I have tried multiple times to get into Objective-C and Cocoa. I just can't do it and Objective-C is why.

About a week after installing the SDK and playing with the samples, I spent a couple of weeks writing macros and wrappers (which grew over time to 5-6 thousand macros+functions) for much of the Obj-C/Cocoa that I needed, so the worst stuff is out of sight in most of my code. I also ported my string library (Cocoa's NSString is about the worst string library you will ever encounter), linked lists, hash tables, small-block sub-allocator... and avoid NSString, NSDictionary, NSArray... for anything other than the Cocoa APIs that require them. While the resulting code still isn't as clean and elegant as one can get with C++ or Java, or even Javascript with good libraries, it was manageable enough to complete several complex apps (full remote desktop control, remote file access/manager, remote cams viewer, and some fun ones, like a golf range finder and break reader, which worked quite well despite the flimsy iPhone accelerometer).

Comment Re:Dinosaur language (Score 2, Interesting) 351

There is nothing there that a table of function pointers, along with compiler-generated indices for methods/properties, linker/loader address binding, and on rare occasions OS APIs providing access to those function pointer tables (Windows has such capabilities), cannot accomplish 300-500 times faster (yep, I have single-stepped through their method calls, getters/setters, and KVO/KVC processing, and watched tasks that any decent compiler turns into one or two instructions explode into many hundreds of instructions through their runtime interpreter every time you invoke the "feature").

I have debated Apple guys on the SDK forum (until some big guy, an evangelist, got annoyed with the heresy and put a stop to the discussion), asking them to show what exactly can't be done hundreds of times faster (with a properly integrated compiler/linker/loader) than their runtime hash tables are doing it, and I couldn't get one example that could stand after closer analysis of the task. Basically, the guy who created Obj-C was toying with OOP concepts (even he doesn't use it any more); he didn't get the key point of OOP (tools for building layers of abstraction, via flexible namespaces and overloading), and somehow he impressed Steve Jobs, who knew even less about compilers and programming, in his NeXT phase, and everyone is now stuck with a half-baked language and hyper-verbose, wasteful Cocoa, which leave a good chunk of the compiler/linker/loader's job to programmers and end users' CPUs.

Comment Re:Dinosaur language (Score 2, Interesting) 351

Have fun re-implementing something like KVC/KVO in C++.

Now that you mention it, KVC/KVO is another of the ridiculous and wasteful (of the end user's CPU and the programmer's time) "features" of the Cocoa API. Passing simple numeric arguments of time-critical functions (such as animation control) as ASCII string objects (not just ASCII strings, but malloc-ed strings, which need to be parsed and converted into a binary integer/float, then freed) is utter idiocy. If you wish to get file properties, they return a malloc-ed dictionary of ASCII name-value pairs for size, modification date,... (all pairs that need to be parsed back into the binary values your code needs). It's beyond stupid. Similarly, passing 4-byte IP addresses to their CFNetwork APIs (their wrapper around BSD sockets) as malloc-ed objects is just mind-boggling wastefulness. And all that wastefulness on a mobile device, burning away the battery.
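A small C sketch of the round trip being criticized -- purely illustrative, with invented helper names: a numeric value formatted into an ASCII buffer and parsed back, versus simply passing the binary value through:

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* The "value as string" path: format the number into ASCII, then
   parse it back to binary on the receiving side. (The Cocoa APIs
   described above add a heap allocation on top of this.) */
static double via_string(double value) {
    char buf[32];
    snprintf(buf, sizeof buf, "%.6f", value);  /* binary -> ASCII */
    return strtod(buf, NULL);                  /* ASCII -> binary */
}

/* The binary path: nothing to do. */
static double via_binary(double value) {
    return value;
}
```

Both paths produce the same number for simple values, but one costs a format, a parse, and (in the APIs the comment describes) a malloc/free pair per argument.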

As with everything Apple, where sparks of genius are smothered by cult-like rigidity and idiocy, they are going to squander again the gigantic edge they had with the iPhone (which is a great device), just as they did with the Apple II after the IBM PC took over, and then, after pioneering the PC GUI with the Mac, lost the market to Windows, which was ten years behind, despite the latter being fueled by the famous Microsoft creativity and excellence. They are now repeating their old pattern with the iPhone (and, among other things, their religious devotion to the clunky, wasteful Obj-C/Cocoa). In a couple of years Android and Mobile Windows, or maybe something better, will leave Apple with their usual 5% niche of the market.

Comment Dinosaur language (Score 5, Informative) 351

After about two years programming Obj-C/Cocoa for iPhone apps, I can't believe that this ancient experiment in OOP by an amateur compiler writer is still around. Even though it is nominally a compiled language, all calls to methods as well as accesses to class properties are interpreted -- the name of the method and its args (args have names) is looked up in a hash table by the runtime to find the address, then to push the args and call it, every time you invoke it or access a property. The Obj-C creator basically didn't know how to code linker/loader address binding, so he just left that part for the runtime to decode on millions of end users' CPUs from there on. He also didn't know about name mangling, and left that part of his job for future programmers to do manually (methods and args are explicitly named, so you end up calling methods with named args, like [obj method:arg1 count:count]). For adding properties to a class you have to enter the same information in triplicate (variable declaration, property declaration, getter/setter declaration), so there is lots of cut and paste, doing by hand the job that the compiler should have been doing. The syntax is very clunky, inelegant, and uneconomical with the programmer's time, e.g. requiring lots of jumping back and forth to match/complete nested square brackets, again simplifying the compiler writer's job at the expense of busywork for countless programmers from there on.

In addition to performance and narrow technical issues, the worst fundamental flaw of Obj-C is that its creator didn't understand the value of namespace partitioning in OOP (the key tool for building layers of abstraction), so much of that is left to programmers, which in Cocoa (an API, like Win32) resulted in mind-numbing hyper-verbosity, with class and method names dragging huge repetitive prefixes, each name spelling out explicitly its whole ancestry back to the stone age. While the Xcode editor makes heroic efforts to guess what you meant and auto-complete the names as you type, that addresses the lesser half of the problem (you still end up doing lots of cut and paste of the Cocoa names). The main drawback comes when trying to read or modify the code later -- even the simplest algorithm looks complex, and any pattern specific to the task at hand is drowned in the mind-numbing sea of repetitive Cocoa verbiage.

In short, a horrible language and API framework. Only someone who grew up with this and never knew anything better could love it. Of course, as with everything Apple, buried under the idiotic Cocoa+Obj-C layer there are gems of genius, especially the extremely well-thought-out functionality and the high-performance graphics and animation engines.

Comment Re:Furry overlords notwithstanding... (Score 0, Flamebait) 215

The genes are more like global variables, while the auto variables and the actual code are in the "junk" DNA. Our scientific priesthoods (in any field) behave exactly like the ancient Egyptian priesthoods who, after figuring out a bit of the pattern behind calendar seasons, star constellations and Nile cycles, declared themselves all-knowing about everything there is, on earth and in heaven, before and after death. Among other things, they declared the brain an unimportant organ used only for cooling.

The worst offenders nowadays are public health "scientists" who pick out of vast patient databases (using statistical software that is a mysterious magic wand to most of them) correlations on self-selected subjects, which could mean anything, then hand-pick one possible explanation as the "real" one (coincidentally, always the one most financially benefiting their sponsors, the pharmaceutical industry), pompously declare "science has spoken, the debate is over," and then use it to drive policies, regulations and medical expenditures. Their "science" invariably ends up meaning more control over your life, worse health for you and your family (e.g. the recent rapid rise of autism, asthma, allergies, obesity, diabetes, dialysis...) and more money going from your pocket to theirs and their sponsors'.
