Comment Re: isn't x86 RISC by now? (Score 2) 161

You make a lot of good points. But any time you build a processor with cache-miss prediction, branch prediction, and execution parallelization, the instruction set has almost no effect at all.

You're suggesting that instruction-set interpretation is a separate unit from the out-of-order execution core. In reality, the first stage of any highly optimized modern CPU that minimizes pipeline stalls must effectively "recompile the code" (an oversimplification) in hardware before the actual execution units, and any intelligent designer would simplify the operations into much wider RISC-style micro-operations in the pipeline(s).

A purely in-order processor would certainly suffer from using a variable length instruction set.

What many people fail to realize is that ARM is very much a CISC ISA as well, just with a wider fixed-size word. So instead of incrementing instruction lengths in single-byte units, ARM uses two-byte units in Thumb mode and four-byte increments in ARM mode. Either way, it's nothing like MIPS (which is a dinosaur), which is a pure RISC architecture.
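To make the decode-stage point concrete, here is a toy sketch (the opcodes and lengths are made up, not real x86 encodings) of why variable-length decode is inherently sequential while fixed-width decode parallelizes trivially:

```python
# Toy illustration only: hypothetical opcode -> instruction length table.
VAR_LENGTHS = {0x90: 1, 0xB8: 5, 0xE8: 5, 0x0F: 2}

def split_variable(stream):
    """Walk the stream byte by byte; each boundary depends on the previous opcode."""
    out, i = [], 0
    while i < len(stream):
        n = VAR_LENGTHS.get(stream[i], 1)
        out.append(stream[i:i + n])
        i += n
    return out

def split_fixed(stream, width=4):
    """Fixed-width ISA: every boundary is known up front, so decode parallelizes."""
    return [stream[i:i + width] for i in range(0, len(stream), width)]

code = bytes([0x90, 0xB8, 1, 0, 0, 0, 0x90])
print(len(split_variable(code)))  # 3 "instructions"
```

The real pre-decode hardware is vastly more sophisticated, but the dependency structure is the same: with variable lengths, you can't know where instruction N+1 starts until you've looked at instruction N.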

With the exception of a rare set of VLIW DSP cores (TI), it doesn't really matter. The major performance boost DSPs have is that DSP code generally consists of very small programs with extremely long pre-optimized pipelines which are not intended to run in a task-switching environment. Those processors don't need translation because the code is completely pre-optimized. They also don't tend to suffer memory fragmentation and almost never have MMUs.

General purpose CPUs run code all ass-backwards because the code is unpredictable. The ISA will almost always benefit most from having as many versions of instructions as possible for performance per watt.

The trick will be detecting when specific units aren't needed and powering them down when possible. Compilers and the ISA will have no impact on this... unless instructions are added to power units on and off explicitly based on the code entering the pipeline.

Comment It's nothing more than laser mics (Score 1) 142

All they did was remove the laser, making the eavesdropper unobservable. The tech is useless unless the camera has insanely fine pitch resolution or the speech is loud enough to cause large vibrations.

Otherwise, they're just sampling motion at 2000-6000 fps. It seems ridiculously processor-intensive for something which could be better achieved using a high-performance light meter.
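The "sampling motion" step can be sketched in a few lines. This is a synthetic illustration, not the paper's method: I assume the per-frame displacement has already been extracted from the video, fake it with a 440 Hz vibration plus noise, and recover the tone with a single-bin DFT:

```python
import cmath
import math
import random

random.seed(0)
fps = 6000  # assumed high-speed capture rate
# Synthetic per-frame displacement: a 440 Hz vibration buried in sensor noise.
motion = [1e-3 * math.sin(2 * math.pi * 440 * i / fps) + 1e-4 * random.gauss(0, 1)
          for i in range(fps)]

def magnitude_at(freq_hz):
    """Single-bin DFT magnitude of the motion signal at freq_hz."""
    return abs(sum(x * cmath.exp(-2j * math.pi * freq_hz * i / fps)
                   for i, x in enumerate(motion)))

candidates = [220, 330, 440, 550, 660]
peak = max(candidates, key=magnitude_at)
print(peak)  # 440
```

Even with noise an order of magnitude below the signal, the vibration frequency pops right out; the hard (and expensive) part in the real system is extracting that sub-pixel motion from video in the first place.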

Comment Tools just get worse. (Score 3, Funny) 240

For 15 years I coded with Qt professionally and came to love and adore its quality and simplicity. I even wrote an operating system which was a microkernel with everything else based on Qt Core.

I wanted to make a simple project the other day and Qt failed me. It seems that code goes in without unit tests these days and shipments are made without any quality control.

I used Qt because I'm not the typical loser Linux developer who wants to spend 50% of their time just fixing their tools or reporting bugs to package maintainers.

Comment Lame... Electric (Score 1) 122

Electric is SOooooo 2012!

I was looking at a Tesla or another electric, and then Toyota said they'll ship hydrogen fuel cell cars next year. So I will drive my current car another year.

I have a Prius now and am driving it into the ground. Tesla was the next intelligent step... while waiting for fuel cell. But now fuel cell seems to be here, so why would I buy the Tesla?

Comment Complexity (Score 4, Informative) 213

ACM is a great resource and I regularly borrow journals from friends.

My issues are simple.

1) I'm self-educated. ACM discriminates against people like me. It doesn't matter that I have 20 years of experience in protocol and codec design, or that I've designed algorithms which they have published articles analyzing.

2) Price. ACM is too expensive for individuals, and programmers who are actual scientists and engineers (as opposed to Python-coding web-site developers) have a hard enough time getting bosses to pay for RAM upgrades. Things like "club memberships" are generally out of the question.

3) Too many journals to choose from, and each one costs more. Professional programmers probably want 3-5 different journals. I haven't checked in a while, but I would want the journals on compilers, machine vision, signal processing and probably AI (if those are all categories), yet I wouldn't want to pay for all four. A downloadable, printable version of the actual journals, or at least an ebook, would be welcome. Last I checked, they only offered article-by-article downloads.

Finally, I never see ACM articles linked from Google. You'd imagine searches for things like "reduction of inter-block artifacts in discrete wavelet transforms" would nail five ACM articles on the first page. Instead, I see mailing lists.

Comment Re:Disengenous (Score 5, Informative) 306

While I feel your argument was probably not thought through well enough, I believe there is merit to it.

Here in Norway, we tend to suffer a great deal as consumers because of the publisher/distributor relationship. The pricing model for books is highly predatory, and the rights for Norwegian translations also give the local publisher the rights to the original language within the country. This drives prices on both the original language and the translation through the roof, since the cost of translating is so high that unless it's a #1 best seller, all the profit has to be made on a few hundred... possibly a thousand... copies. What is worse, Norway has a higher English literacy level than either the U.S. or the U.K. We don't need these translations. They are being made for no apparent reason... and worse, as the availability of English books through Amazon and others increases, the Norwegian translation market shrinks and the quality of the translations shrinks too.

Another major issue: I am willing to pay large amounts for print-on-demand if I need a paper book. In fact, I try to avoid purchasing books which were mass printed only to look good on display cases and attract sales... then, when the book cools down, they throw the rest away and recycle them. This practice is so fantastically stupid that I can't imagine the people who want it to continue can even tie their shoelaces. I don't feel any personal need to help the printing business by printing documents which just don't need to be printed. Books should never be printed like that anymore. We have eBooks. I don't actually know anyone who prefers paper anymore... including wrinkle monsters.

I don't care what eBooks cost, but here's a simple rule: under no circumstance am I willing to pay for the printing of a book in my eBooks.

Meaning: if I assume the printing cost of one book to be $1, and that the idiot publisher is probably printing three copies for each one he sells... let's be fair (toss him a cookie) and say that to cover his costs he needs $2.50 per sale for printing. Then the eBook should never cost more than $2.50 less than what the printed book would cost on the shelf of a brick-and-mortar store, which will discount the book immediately. So if the MSRP is $20, a store would discount that book 10-25% (which is why we have MSRP... feels great to save that 25%, right?), so $15... now subtract $2.50 to cover printing costs... that's $12.50.

I'm willing to pay $12.50 for an eBook whose paper MSRP is $20.

You know what? I'm willing to pay $20 for the paper copy if it's printed on demand instead of just killing the planet for fun. Of course, I won't demand that paper copy unless I need it for reference.
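The back-of-envelope rule above can be written as a one-liner; the discount, printing cost, and MSRP figures are the comment's own rough assumptions, not publishing-industry data:

```python
def fair_ebook_price(msrp, shelf_discount=0.25, printing_cost=2.50):
    """Street price of the paper book, minus the printing overhead an eBook never incurs."""
    shelf_price = msrp * (1 - shelf_discount)
    return shelf_price - printing_cost

print(fair_ebook_price(20.00))  # 12.5
```

Which reproduces the $20 MSRP -> $15 shelf price -> $12.50 eBook chain from the argument.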

Comment Good move :) (Score 1) 383

First of all, Nokia was only worth the patents. Their tech was lame, and Microsoft can do it better themselves. Also, Nokia focuses too much on making too many phones. A good team focusing on one damn good phone at a time is far superior, and Nokia sucks at that.

Microsoft couldn't possibly just dump the Nokia dead weight outright. If they did, it would be a political disaster. So finding 5000 more workers to cut in that monstrous company shouldn't be hard; there is certainly a pile of dead weight.

Well done!

Comment Wow!!! This is most famous at least (Score 1) 285

How about Matthias Ettrich? He started the original KDE project and was a core developer of Qt for years.

Lars Knoll, who more or less single-handedly wrote the KHTML engine behind the Konqueror web browser (which later became WebKit) and then moved to Trolltech, where he became a core developer on Qt.

Warwick (can't remember his last name) at Trolltech, who more or less wrote all the cool stuff there, like the Qt/Embedded windowing system as a replacement for X11.

Karl Anders Øygard, who optimized the AMP codebase, making PC MP3 playback possible; AMP became the core of WinAmp. He also wrote major portions of the Opera web browser, including the original full EcmaScript implementation, and architected the browser's layout engine with reflow support, making it one of the fastest web browsers (often the fastest) for a decade.

Lars Thomas Hansen, who wrote the optimized EcmaScript engine for Opera and implemented some of the earliest bytecode compilers in browsers. Damn near reinvented garbage collection. Wrote his own Scheme compiler. Later worked at Adobe, boosting the performance and features of ActionScript like crazy. Now works at the Mozilla Foundation making yummy stuff there.

Ugh... can't remember his name... was it Christian-Jacq? Co-developer and maintainer of VLC.

Wim Taymans, the core developer of GStreamer... he wrote most of the cool stuff in there. Excellent sense of humor, generally a good person, and an amazing programmer.

That annoying/obnoxious guy from x264 who basically has been the project maintainer for years. Personally I want to choke him, but he's among the best programmers I know.

Kieran Kunaya (sp.), developer of the Open Broadcast Project. He basically made it possible for TV stations and broadcasters all over the world to use open source.

Ole Andre Vadnes (sp.), who wrote endless numbers of tools for reverse-engineering core components of programs, making it possible to use libraries from defunct programs that companies needed to keep running.

Lars Petter (something or other), who spent years counting clock cycles at Tandberg developing faster H.264 codecs.

Mark Russinovich, who single-handedly reverse-engineered and reimplemented the NTFS file system, documented it, wrote books about it, and is more or less responsible for many of us not losing all our stuff.

The guy who wrote IDA Pro, the most advanced disassembler/decompiler ever.

I have had the pleasure of meeting, and often working with, many of these guys for weeks or years. While I don't discount the names on the list... I would say that it's terribly naive. I could probably name another 100 developers who should be in the top 10.

Comment Cloud only applications are a disaster (Score 3, Insightful) 409

1) Cloud office suites store documents.... in the cloud
2) Cloud office suites make you 100% dependent on their apps. Sure... Google uses "open formats" but as they add features and other companies add features, they lose formatting compatibility.
3) Here kid, the first one is free. Using free cloud software is great while it's free. Where's the guarantee that it will always be free? When it's not free, how much will it cost? Will I actually be able to move?
4) Are you seriously asking me to trust Microsoft, Google or Apple more than the others? This is just laughable. They're all a bunch of crooks. The only difference is that, at least for now, Microsoft has governments around the world already treating them like crooks, so they at least have to try to be honest. Apple makes absolutely no pretense of being an honest player, and Google... they scare the shit out of me.

In the end, the best solution is a cloud player with a clear means of licensing their software and running it within your organization without the vendor being involved. So far as I know, Google doesn't even try for this. Microsoft does have a product, but it's not easy to get.

So for now, I'll use desktop and mobile apps and cloud storage. Thank you very much.

P.S. - It's scary that I'm not nearly as worried about government spying; I simply accept it as part of life. But Google really scares the shit out of me.

Comment Re:Having used both (Score 1) 314

QNX is one of the more enjoyable embedded OSes?

It's just a Unix-style microkernel operating system. What they're buying is the Qt platform, which is far more interesting than QNX... they'd have been equally well off with a Linux kernel and Qt on top.

There is no size footprint or CPU footprint benefits to be had here... the point is, if you're not using Windows, a Qt based platform with proper driver support is the way to go.

But... it will be in a Ford product which means they'll have to make it work like shit to belong there.

Comment The blind leading the blind (Score 2) 177

While the article justifiably blows the whistle on what could be an abuse of power, its premise is BS at best. It suggests that the tech could be used to maliciously snoop on people without their knowledge. The spec says nothing of the sort. It allows a user to make use of a proxy. In the case of TLS-only HTTP/2, this is needed. Without it, people like myself would have to set up VPNs to manage infrastructure. I can instead run a web-based authenticated proxy server which would let me manage servers and networks in a secure environment where end-to-end access is not possible.

An additional benefit of the tech will be outgoing load balancing for traffic, which adds further security.

How about protecting users' privacy by using this tech? If HTTP/2 is any good for security, deep packet inspection will not be possible, and as a result all endpoint security would have to live at the endpoint. Porn filters for kids? Anti-virus for corporations? Popup blockers?

How about letting users run technology like antivirus on their own local machine to improve their experience? How many people on Slashdot use popup blockers which work as proxies on the same machine?

This tech adds to their end-to-end security instead. After all, it allows users to explicitly define a man-in-the-middle: to explicitly trust applications and appliances in the middle that improve their experience.

What about technology like Opera Mini, which cuts phone bills drastically and improves performance by reducing page size in the middle?

Could the tech be used maliciously? To a limited extent... yes. But it is far more secure than not having such a standard while still using these features. By standardizing a means to explicitly define trusted proxy servers, it mitigates the threat of having to use untrusted ones.
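The key distinction here is an *explicitly configured* proxy versus transparent interception: the client knows about and chose the intermediary. A minimal sketch of what explicit configuration looks like on the client side, using Python's standard library (the proxy URL is hypothetical):

```python
import urllib.request

def build_explicit_proxy_opener(proxy_url):
    """Route http and https traffic through a proxy the user deliberately trusts.

    Unlike transparent interception, nothing here is hidden from the client:
    the proxy is named in its own configuration.
    """
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

# Hypothetical trusted proxy inside one's own network.
opener = build_explicit_proxy_opener("http://proxy.example.internal:3128")
```

A standardized "trusted proxy" mechanism would amount to negotiating this kind of relationship in-protocol, with the user's consent, instead of silently diverting traffic.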

Where does it become a problem? It'll be an issue when you buy a phone/device from a vendor who has pre-installed a trusted proxy on your behalf. It can also be an issue if the company you work for pushes out a trusted proxy via group policy that now is able to decrypt more than what it should.

I haven't read the spec entirely, but I would hope that banks and enterprises will be able to flag traffic as "do not proxy" explicitly so that endpoints will know to not trust proxies with that information.

Oh... and as for tracking, as the writer suggests... while we can't snoop the content, tools like WCCP, NetFlow and NBAR (all Cisco flavors), as well as transparent firewalls and more, can already log all URLs and usage patterns without needing to decrypt.

So... May I be so kind as to simply say "This person is full of shit" and move on from there?

Comment Why does the U.S. even teach second languages? (Score 1) 426

Honestly, I went to a high school with over 1000 other students. I believe about 800 took Spanish and the remaining 200 or so took French, Italian or German. The average student studied their second language for 3 years, some for as many as 5. Of those students, less than 10 percent can communicate on a vacation to a country where those languages are spoken. Probably less than 2% became fluent. And yet, most of them graduated with good grades in those languages.

Second languages should be optional and should be a major boost on college applications. To be fair, it's a waste of millions and maybe billions of dollars to educate in a subject in which fewer people can perform well than in mathematics.

Comment Thank you! (Score 2) 449

I've been looking for someone to finally make this point.

Let's also consider that the attack Magnus used on Bill was a classic speed-chess method. He sacrificed his front row, taking a small gamble that Bill would play regular chess and be protective of his own front row. As a result, Magnus came out fast and hard with his knights and queen. I saw this precise game played (move for move) many times growing up by the old Jewish men in the park in Brooklyn. In fact, I'm almost sure I played it against other people several times.

Speed chess is rarely about skill or beauty. It's about patterns. The difference between speed chess and an opening moves book is that the speed chess games build a start to finish tree of possibilities in 10 moves or less.

Want to talk about impressive? Bill was smart enough (just before moving his bishop in a "protective" measure) to recognize that he'd lost already. Watch the video and you'll see. My wife was laughing because I was narrating the game and how it would play out from the second Bill took Magnus's pawn.
