Comment: Re:Knowing when not to (Score 1) 345

...and a good metric is if the feature simplifies the code you are writing or if it makes it more complex.

The more orderly a system is, the more smoothly it behaves, with fewer unintended consequences; the easier it is to maintain, and the less time it takes to understand. Simplicity truly is the essence of all that is good in a computer.

What is easily overlooked is that order requires effort to maintain. At zero effort, complexity will increase, and looking at any system (or codebase), it becomes obvious where effort is lacking just from its complexity.

Another key intuition that is easily overlooked is that this principle holds true at every level, from user interface to technical performance optimizations to language features. At any given moment, if we find what we are doing is complicated, we must question whether it can be done more simply. Windows 8 failed at the interface level to simplify the experience. HTML and CSS failed at the language level for most of their early versions, with browser incompatibilities and quirks.

A programmer's main job beyond implementation is to simplify the implementation. And anyone not actively maintaining order is inevitably going to be part of the problem.

Comment: actually a good idea, sort of (Score 1) 1067

System-wide default error handling would actually be useful. It is completely up to the programmer to make sure nothing unexpected happens, but that is true even without the feature. And it isn't as if there is no default already... it just isn't customizable on a system-wide or, better yet, per-scope basis (in most languages).

This isn't about invalidating any logical tenets. It is still up to the code to know how to handle errors. Easier methods of specifying default behavior cannot be a bad thing.
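For what it's worth, Python already exposes both halves of this idea; here is a minimal sketch (all names like `on_error` are made up for illustration) of a system-wide default via `sys.excepthook` and a per-scope default built from a context manager:

```python
import sys
from contextlib import contextmanager

def default_handler(exc_type, exc_value, tb):
    # The system-wide policy when nothing else caught the error:
    # log it, show a friendly message, exit gracefully, etc.
    print(f"unhandled: {exc_type.__name__}: {exc_value}")

sys.excepthook = default_handler  # replaces the interpreter's default

@contextmanager
def on_error(handler):
    # A scope-based default: any exception escaping the `with` block
    # falls through to this scope's handler instead of propagating.
    try:
        yield
    except Exception as exc:
        handler(exc)

with on_error(lambda exc: print(f"scoped default: {exc}")):
    int("not a number")  # raises ValueError; absorbed by the scoped default
```

The point is that neither mechanism forces the code to stop handling its own errors; they only make the fallback behavior something you choose rather than something baked in.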

Comment: overriding user intentions (Score 5, Insightful) 424

This is a perfect example of the flawed interface design philosophy many tech giants fall prey to, and it boils down to "we know what you want better than you do".

To their credit, companies like Google and Microsoft and Facebook put their best minds behind these problems and come up with technically ingenious solutions. That's part of the problem. It must be correct and it must be better, because we worked so hard on it using proven methods. But people who know what they want find these products difficult to use, difficult to control, and even vaguely insulting.

The Facebook news feed is a triumph in machine learning, as is/was Microsoft's ribbon interface in UI, and Google's search in contextualized search... They're based on solid research, mass user polling, hard big data, and the ambitious technical goals of competent engineers. Yet they can't get it right, because they continue to look at the problem while ignoring the people, often condescendingly so.

It takes understanding for users to have clear intentions. As others have said, if the user doesn't know anything about what they are searching for, Google does a good job of turning their guesses into educated ones. And to their credit, these companies are successfully serving the inept majority. But anyone who continues to use their products will inevitably develop clearer intentions, because with use, we naturally get smarter. That is why the more we use these tools, the more reasons we have to hate them. The more things we find we wish to do with these tools, the less accommodating we find them.

The technical solution is rather simple. Interfaces are intention driven, and if they're not driven by the intentions of the user, they are driven by the intentions of the developers. Hence, each feature can be tested for the intentions it serves, and those that serve the user must be added and made more prominent. An existing example on Facebook is the "don't show me posts from ___" feature. Others that don't exist yet would be listing entries in strict chronological order, or listing entries unfiltered. They could be simple checkboxes, and implementation would be simple (boring, almost).
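To show just how boring, here is a hypothetical sketch where the "checkboxes" are plain booleans driving how a feed is listed (the post fields and the `feed` function are invented for illustration, not anything Facebook exposes):

```python
# A toy feed: each post has an author, a timestamp, and a promoted flag.
posts = [
    {"author": "alice", "time": 3, "promoted": True},
    {"author": "bob",   "time": 1, "promoted": False},
    {"author": "carol", "time": 2, "promoted": False},
]

def feed(posts, chronological=False, unfiltered=False, hidden=()):
    # Checkbox 1: "unfiltered" shows everything, otherwise drop promoted posts.
    shown = posts if unfiltered else [p for p in posts if not p["promoted"]]
    # The existing "don't show me posts from ___" feature.
    shown = [p for p in shown if p["author"] not in hidden]
    # Checkbox 2: strict chronological order, newest first.
    if chronological:
        shown = sorted(shown, key=lambda p: p["time"], reverse=True)
    return shown

# Strict chronological, no promoted posts, nothing from bob:
print([p["author"] for p in feed(posts, chronological=True, hidden={"bob"})])
# → ['carol']
```

Each user intention maps to one parameter; there is no model to train and nothing to A/B test, which is exactly the point.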

The technical solution is far easier than what really needs to happen, and that is a change in the attitude and philosophy of the people building these products. They need to be more accommodating of user behavior and less insistent on dictating it. They need to stop thinking they know better. They need to stop judging their own solutions by their technical prowess. People who know what they want need to be able to choose, and for the most part, intentions are simple. Simple intentions garner simple selectable features. If this is too boring, maybe they need to stop using users as guinea pigs, quit their insanely high-paying jobs, and go back to academia where they could do some really interesting work.

Comment: true value of science is in controlling the future (Score 1) 364

Science is the axiomatization of falsifiable statements as either true or false through reference to real events and experiments. Experiments are not necessary to generate falsifiable theories, and there are plenty of theories that are impossible or nearly impossible to test. Testing for the Higgs boson took 50 years and billions of dollars. The experiments don't necessarily come first.

Our past is an ample source of real evidence. What happened was real, and is just as useful as any event intentionally produced in a lab. But the true value of science is in controlling and predicting future events and consequences. And for this, experimental validation is not asking for much. If we can't reproduce it in a lab, it will never make it to an iPhone. Hence irreproducible events are worthless to the pragmatic scientist and to every engineer. In most cases they're either not what they seemed or beyond our current scope anyway. We'll either get to a point where we understand it later, or we'll find out the scientist was lying -- the latter has turned out to be quite common.

All this theorizing and hypothesizing is simply part of the initial process. What has no consequence will be unobservable and untestable and unusable anyway. But imagination is already a consequence which at minimum has great entertainment value. The next step -- and quite an important step -- is seeing if any of it will make it out of our imagination.

Comment: People are on but not on. (Score 4, Interesting) 394

The youth are not embracing facebook. Facebook is a brand, and it is hated by too many taste makers. Facebook doesn't taste good. Any employer that likes facebook is already behind the curve, pun intended.

Most people on facebook are not on facebook. They have inactive profiles. They may check in to peep at those who are active, but beyond that, there is very little utility or upside for those who quit caring. And this is always a simple function of time; everyone quits caring eventually. Facebook will continue to insist these peepers are "active", but this bluff was already tried by Google+ and it won't fool anyone. Those looking for a job might clean up their profile just in case, but this doesn't mean they're on or using facebook.

Facebook will become the next myspace. That's why Facebook, being run by people who know this well, is buying what could be to facebook what facebook was to myspace. That's why Instagram and WhatsApp needed to be purchased.

Facebook is moving beyond being a platform. Social media to them is now about real estate. You can move from facebook to Instagram like one would move from Santa Monica to Venice. But your landlord is still facebook.

Here is one concrete example of why Instagram is amazing and Facebook sucks. When a brand posts something on Instagram, there is no "promoting" their post, no "mining impressions", and no "paying for likes". There is no machine-learning-optimized feed. Instagram pushes a photo to everyone instantly, and the response is also unencumbered and immediate. And it has no ads. Unlike facebook, Instagram does not stand in between you and your followers. Facebook pours all that effort into what it thinks it should be doing on facebook, yet the answer was to not be there at all. The presence of the "host" is not welcome in any social setting, online or offline. We don't need the waiter or waitress to feed us at the restaurant while reading us ads. That's facebook.

Seriously, facebook sucks. Its future is dead. Even just for the reason that my mom has twice as many friends as I do and all her peers love it. She just turned 70.

Comment: Not a choice (Score 3, Interesting) 397

If we're choosing, we're already losing sight of what it truly means to be intelligent, capable, and human.

Art is expression, science is technology, and philosophy is intuition. The tools of an artist are made with technology. A scientist's imagination is powered by expression. And without art or science, what would a philosopher spend their days thinking about?

These divides are mostly for the convenience of being able to hire a specialist and for splitting students into classrooms. At the end of the day, there are no downsides in being proficient in all three.

Here's one way to look at it. Hypothetically, given three candidates, if you need a philosopher, pick whoever scores highest in philosophy. But if their philosophy scores are equal, whoever has the highest combined score will be the better philosopher or scientist or artist. None of these takes away from any other, and more often than not, it's where they overlap that things are the most interesting, relevant, and progressive.

Expression, technology, and intuition can be applied to anything, not just to one another. Take an iPhone. It's built with technology, it's a piece of art, and it was made with a philosophy. Take Barack Obama. He is a master at expressing himself, his political decisions are guided by his intuitions, and technology was key in winning his elections. Take Michael Jordan. His style was all his own, he had awesome sneakers, and his intuitions helped him win his championships -- from when to shoot, when to pass, when to quit, and when to come back.

If you look at anyone who got far in life, it rarely matters where they start, but by the time they get anywhere, you'll see traces of all three.

Comment: Dumb first. (Score 1) 294

When Elon warns of the risk of 'something seriously dangerous happening' as a result of machines with artificial intelligence, he is not referring to sentience. He is referring to dumb AIs not working as intended. Maybe an auto-piloted car running over a baby, or an AI trading program accidentally crashing the market... one of which has already happened.

And even with regard to the singularity or whatever, we know the thing is going to be dumb first. We were all dumb once. Kids are cruel and irrational and love to play. If AI were anything like us, it will be childish before it surpasses its parents. No one seems to go there.

But the real threat is us wishing for daemons, not by accident, but on purpose. The open letter warns 'our AI systems must do what we want them to do.' As long as there are military interests, AI will be made into weapons first. Enlightened robotic butlers aren't going to kill us. Robotic soldiers will do it better, and do it first, and they will be obeying our orders.

Comment: don't censor. mock. (Score 3, Interesting) 216

The Japanese Twitter meme contest was a far better counterattack than censorship or war.

Terror is a feeling and humor is the antidote. Just as the Scary Movie franchise ruined classic horror, once it's mocked and funny, those giggling are no longer scared. They are empowered and immune to that pattern of fear. The Daily Show is also founded on this, as is/was Charlie Hebdo. France agrees with Charlie, but still fails to understand the guiding principles.

ref: http://www.nbcnews.com/storyli...

Comment: The Old Apple (Score 3, Interesting) 86

Nintendo then had a lot in common with Apple now. Games were simple enough that indie developers could make hit titles that Nintendo would then publish and distribute as cartridges, which is basically what the app ecosystem is, minus the hardware. But Nintendo was Apple, not an app developer... and to stoop to that level is seppuku, harakiri, suicide.

What if Nintendo made an official NES emulator app and published every NES game ever made? Add a gamepad accessory built to legacy standards, and the NES graveyard just became an NES fan's utopia. Do this for the SNES, Gameboy, N64... whatever an iPhone can handle. Keywords: every game ever, identical, fingertips. This wouldn't be just another app or just another game. It would be Nintendo via my phone! Can't wait!

Comment: Re:There might not be Proper English (Score 1) 667

Exactly this. Proper is in the context, and there is proper English for every occasion.

Kids using broken grammar and butchered words is proper for their audience. But when they need to speak to impress their teachers, their parents, their employers, their investors, their readers, their students... they will speak proper English.

The issue is whether one can communicate their philosophy, their science, their intentions at the highest level. This is the skill that is lacking in public education in China and Japan, and even in the US at lower levels. Learning how to express sophisticated thoughts proficiently requires a higher education even for native speakers, and the Ivy League schools do set the international standard for proper, intellectual, universal English.

Comment: We're getting better, not worse. (Score 1) 320

It takes a better scientist to correct a scientist. For all these mistakes to come to light is a sign that we are getting smarter, that research is becoming more open, and that science is accelerating. A lot of it is thanks to the internet and the speed at which information can travel. Catching our mistakes is progress. Any scientist knows this.

> that puts the very basis of our reliance on scientific research results at risk

Utter nonsense. Science is about applying our findings and building new technology. Results that cannot be reproduced are completely useless. The faster we weed them out, the better.

Our reliance on scientific research is permanent. Our reliance on useless research is what needs to go.

Comment: Not if we hate it. (Score 1) 209

The prediction here is made by extending the present, but the future is never that predictable. Look at Snapchat and Google allowing deletion of entries. The demand for ephemeral data is growing, and this directly contradicts the premise. What this doesn't take into account is the people NOT wanting this, who will invent ways to serve everyone else who doesn't want it either... and when that market surpasses the Timeline reseller market, this prophecy will not be fulfilled.

Timeline is a technology that is already here and it already has a market. In the future, we will be able to own our timeline, and we would not want others to own our timeline. The government will try, but we will fight them, like always. And once there are better alternatives, we will get off facebook and google and all the timeline reselling monopolies....

Timelines aren't just for people though... Phones, toasters, forks... anything could have a timeline, and this is where non-right-violating timeline technology has a huge upside. But I'd be wary of any company banking on the timelines of people, especially those that disregard basic user rights and user voices, such as facebook and google.
