Comment: Re:Xilinx (Score 1) 60

by Svartalf (#49815133) Attached to: Intel To Buy Altera For $16.7 Billion
The biggest problem there would be that they'd have to open up the x86 kimono a bit more than they'd really want to, the way they do with NIOS.  I won't be surprised in one sense (your reading of the situation) if they do it, and surprised all the same, because they'd be giving out stuff that can be more readily reverse engineered through the tools, etc. that people would get as a result of that decision.

Comment: Re:Conflict of interest (Score 1) 60

by Svartalf (#49815101) Attached to: Intel To Buy Altera For $16.7 Billion
Not a conflict of interest.  Just that a competitor bought your supplier.  Big difference.  It's a problem that you need to find a new supplier.  The drawback with FPGAs is that your sole supplier is exactly that: sole.  You can't readily or easily swap out FPGAs the way you can SoCs in the ARM or MIPS space, or RAM or eMMCs; there's a bit of "standard" and "open" involved with things there.  I consider using them a necessary evil because they're not as open or "standardized" as the other stuff, but the moment someone wises up, even though it'll be a race to the bottom like the other plays, they will be the "king" there.

Comment: Re:So, what's the plan? (Score 2) 60

by Svartalf (#49815057) Attached to: Intel To Buy Altera For $16.7 Billion
They're big and slow compared to an ASIC, yes.  But the thing is, they're not big and slow overall: they're reconfigurable, and you can dynamically change the logic on the fly (witness Altera's OpenCL offering on their higher-end parts...  you don't offer that unless you're competitive with GPUs).  They have a place, and it's not always custom logic.  It's adaptable custom logic, which ASICs **CAN'T** do.  CPUs are slow and plodding in many of the tasks you're talking about in that space, and GPUs are cumbersome and painful to use by comparison.

Comment: There are reasons for 12-TET (Score 1) 106

by Sycraft-fu (#49799983) Attached to: Android M To Embrace USB Type-C and MIDI

It is a good balance between getting a good 3rd, 4th, and 5th and not getting too complex. You have to go to 29-TET before you get a better perfect 5th and 41-TET to get a better major 3rd. It gets a little complex musically to represent and deal with all that, not to mention designing instruments that can play it back.

So remember that ultimately music is all math, and as such some things do end up being "better" than others musically. I'm not saying we shouldn't have the capabilities to use other scales, I mean computers are more or less unbounded in their capabilities and samplers can microtune to any required setup, but 12-TET has a reason for its prevalence.
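The tradeoff is easy to check numerically. Here's a minimal sketch (standard library only; nothing here comes from the comment beyond the interval ratios) that measures how far the nearest n-TET step lands from the just-intonation perfect fifth (3/2) and major third (5/4), in cents:

```python
import math

def cents_error(n, ratio):
    """Signed error in cents of the closest n-TET step to a just ratio."""
    just_cents = 1200 * math.log2(ratio)      # e.g. 3/2 is ~701.96 cents
    step = round(just_cents * n / 1200)       # nearest scale step in n-TET
    return step * 1200 / n - just_cents

FIFTH, MAJOR_THIRD = 3 / 2, 5 / 4
for n in (12, 29, 41):
    print(n, round(cents_error(n, FIFTH), 2), round(cents_error(n, MAJOR_THIRD), 2))
```

Running it shows 12-TET's fifth is about 2 cents flat while 29-TET's is only about 1.5 cents sharp, and 41-TET's major third is closer to just than 12-TET's, consistent with the figures above.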

Comment: It is also still what computer music uses (Score 1) 106

by Sycraft-fu (#49794281) Attached to: Android M To Embrace USB Type-C and MIDI

If you compose music on a computer, you almost certainly still do it with MIDI. All the new highly advanced synths and samplers still use MIDI as their input for data. Everything from big dollar stuff like Native Instruments Komplete down to freeware. MIDI goes in, sound comes out.

In some ways it surprises me, since you'd think they would get around to improving it (MIDI leaves some things to be desired), but on the other hand it does work well for nearly everything, and there's something to be said for keeping things standard. You can literally take one of those old General MIDI songs and feed the data into a modern sampler. I do just that all the time to remake old game soundtracks, because I enjoy it.

Comment: All the time (Score 3, Insightful) 742

by Sycraft-fu (#49767733) Attached to: Greece Is Running Out of Money, Cannot Make June IMF Repayment

The US always pays its debts when they are due. I think perhaps the problem is you don't understand how US debt works, and why it is a bit special:

So the most important thing to understand is the US doesn't go and beg people to give it money, rather it auctions debt. People come and purchase the debt. You can do it yourself on their Treasury Direct site. The US sells debt instruments to interested buyers. They are bid on, and whoever bids the lowest interest rate wins. The upshot is the US sets the terms of the debt instruments sold. They have a variety, some are as short as 4 weeks, some as long as 30 years. When you buy something, the terms of repayment are stated up front: What it'll pay, and when. There is no provision to cash out early, and you don't get to dictate any terms, you just choose what note you want to buy (if they are available).
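The auction mechanics can be sketched as a toy model. This is a deliberate simplification (real Treasury auctions are single-price and also accept non-competitive tenders, and the numbers below are made up), but it captures the "lowest rate bid wins" idea:

```python
def toy_auction(amount_offered, bids):
    """Toy model of a debt auction: fill demand from the lowest-rate bids up.

    bids: list of (rate, amount) tuples. Returns the accepted (rate, amount)
    pairs. Hypothetical numbers; real auctions are single-price.
    """
    accepted, remaining = [], amount_offered
    for rate, amount in sorted(bids):          # lowest asked rates win first
        if remaining <= 0:
            break
        take = min(amount, remaining)          # partial fill at the margin
        accepted.append((rate, take))
        remaining -= take
    return accepted

bids = [(0.025, 40), (0.015, 30), (0.030, 50)]   # (rate, $ millions bid)
print(toy_auction(60, bids))                     # fills at 1.5%, then 2.5%
```

The issuer sells a fixed amount, and competition among buyers, not the issuer's begging, sets the rate.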

This is how public debt works in a lot of countries, but it isn't how things go when you are getting loans from the IMF.

The other important thing is that all US debt is denominated in US dollars. A US debt instrument specifies how many dollars it'll pay out, and that number is NOT inflation adjusted, except in a few very special cases. Well, the US government also controls the US Mint, which makes US dollars. So the US government can literally print money and inflate its way into payments. There are negatives to that, of course, but it is perfectly doable. The US controls its fiscal and monetary policy regarding its debt. Since all its debts are in US dollars, and since US dollars are the world's reserve currency, the US cannot face a crisis where it can't pay, unless such a crisis is internally generated (via the debt limit).

Not the case with Greek debt: it is in Euros, and Greece doesn't control the Euro.

Finally, there's the fact that the US has great credit. Doesn't matter if you disagree that it should, fact is it does. Investors are willing to loan the US money for extremely low interest rates because they see it as a very safe investment. 4 week T-Bills have been going for between 0%-0.015%. 30-year bonds have been going for 2.5%-3.75%. Investors bid the interest rates very low because they desire it as a safe investment.

Comment: Incorrect (Score 5, Interesting) 174

It is easier with something simpler, not something smaller. When you start doing extreme optimization for size, as in this case, you are going to do it at the expense of many things, checks being one of them. If you want to have good security, particularly for something that can be hit with completely arbitrary and hostile input like something on the network, you want to do good data checking and sanitization. Well guess what? That takes code, takes memory, takes cycles. You start stripping everything down to basics, stuff like that may go away.

What's more, with really tiny code sizes, particularly for complex items like an OS, what you are often doing is using assembly, or at best C, which means that you'd better be really careful, but there is a lot of room to fuck up. You mess up one pointer and you can have a major vulnerability. Now you go and use a managed language or the like and the size goes up drastically... but of course that management framework can deal with a lot of issues.
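As a concrete illustration of what those checks cost, here's a hedged sketch (hypothetical packet format, and in Python rather than the assembly or C a size-optimized OS would use) of parsing a length-prefixed network message. Every branch below is exactly the kind of code, memory, and cycles that gets stripped when you optimize purely for size:

```python
import struct

def parse_packet(data: bytes) -> bytes:
    """Parse a toy length-prefixed message with full input validation.

    Hypothetical format: 2-byte big-endian length, then the payload.
    The length field is attacker-controlled, so never trust it.
    """
    if len(data) < 2:
        raise ValueError("truncated header")
    (length,) = struct.unpack(">H", data[:2])
    if length > 1024:                       # reject absurd sizes up front
        raise ValueError("declared length exceeds limit")
    if len(data) - 2 < length:              # payload shorter than declared
        raise ValueError("payload shorter than declared length")
    return data[2 : 2 + length]

print(parse_packet(b"\x00\x05hello"))       # b'hello'
```

Drop any one of those checks to save bytes and you've recreated the classic buffer over-read: the code happily acts on a length the attacker chose.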

Comment: Well, perhaps you should look at features (Score 1) 174

And also other tradeoffs. It is fashionable for some geeks to cry about the amount of disk space that stuff takes, but it always seems devoid of context and consideration, as though you could have the exact same performance/setup in a tiny amount of space if only programmers "tried harder" or something. However, do some research and it turns out to all be tradeoffs, and oftentimes the tradeoff to use more system resources is a good one. Never mind just capabilities/features; there can be reasons to have abstractions, managed environments, and so on.

Comment: That's why they didn't do it (Score 1, Funny) 244

by Sycraft-fu (#49728713) Attached to: Why Apple Ditched Its Plan To Build a Television

Because they couldn't overcharge. I'm sure they researched the industry and discovered that it is highly price competitive, and that just putting an aluminium frame on it wouldn't justify a doubling or tripling in price. So they weren't interested. Apple only likes markets where they can overcharge to a massive degree. They don't want to just make money, they want to make stupid amounts of money.

Comment: A two factor device (Score 4, Informative) 88

by Sycraft-fu (#49727625) Attached to: Yubikey Neo Teardown and Durability Review

I know, only because where I work is using them. The idea is that it's a general two-factor token. It can be programmed by the end user or their org. In theory, a lot of companies could all use the platform and you'd have one two-factor device for everything, but in reality you use it for whatever your company does and nothing else.

Once programmed it acts like a HID class keyboard. You push the button, it spits out a string of characters, that being the two factor code for your account at the time.
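For context on that string of characters: in the classic Yubico OTP mode it is a 44-character string in "modhex" (an alphabet chosen so the keystrokes survive different keyboard layouts), with the first 12 characters acting as a public identity. A minimal sketch of splitting one, with none of the real cryptographic validation a Yubico server performs:

```python
MODHEX = set("cbdefghijklnrtuv")   # YubiKey's layout-safe character set

def split_otp(otp: str):
    """Split a Yubico OTP the way a validation front end might:
    first 12 modhex chars are the public ID, the remaining 32 are the
    encrypted one-time passcode. Format check only, no crypto.
    """
    if len(otp) != 44 or any(ch not in MODHEX for ch in otp):
        raise ValueError("not a well-formed Yubico OTP")
    return otp[:12], otp[12:]

sample = "cccccccccccc" + "b" * 32        # hypothetical, well-formed string
public_id, passcode = split_otp(sample)
```

Because the token types this string as an ordinary keyboard, it works anywhere a password field does, with no special driver.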

Comment: Oh come on (Score 2, Insightful) 66

I had never seen a black rectangle with rounded edges before the iPhone! ...well, unless you count the TV I had as a child. And the TV I have now. And probably half the electronics in my house.

The whole "trade dress" concept seems a bit silly to me in the first place, but it is beyond stupid when they can claim something as simple as their rounded rectangular design as "trade dress".

Comment: Re:Anecdotal evidence (Score 2, Informative) 241

by Sycraft-fu (#49710117) Attached to: How Windows 10 Performs On a 12-inch MacBook

Nothing rigorous that I've found. I've seen some things like a Mac user posting on a forum asking why Cubase was hitting harder on OS-X than Windows along with screenshots of the overall load meters that it has, but little in the way of details on methodology.

While I haven't done extensive looking, I haven't come across anything and it is something I'm interested in.

Sadly, there seems to be little interest in testing. People who own PCs can't really test it, outside of building a hackintosh, and Mac users are not very interested in testing, particularly since many of them have a real need to believe their money was well spent and do not wish to do something that might challenge that idea.

If someone gave me the hardware and software I'd love to try it, but I own only a PC, and the DAW I use (Sonar) is Windows only.

The only thing I can point to with some newer data is a Sonar benchmark, conducted by their lead programmer, showing improvements in Windows 8 vs Windows 7. They found basically an across-the-board improvement, with no code recompile: http://blog.cakewalk.com/windo... Now that says nothing of cross platform (as I noted, Sonar is Windows only anyhow), but it does indicate that MS continues to improve Windows' performance with regard to intensive, time-critical tasks like audio.

Comment: Re:Anecdotal evidence (Score 5, Insightful) 241

by Sycraft-fu (#49709489) Attached to: How Windows 10 Performs On a 12-inch MacBook

True, though there is some precedent. OS-X does not seem to be particularly zippy in the few cross platform app benchmarks that are to be found. A good example is DAW Bench's test on Cubase, Pro Tools, and Kontakt: http://dawbench.com/win7-v-osx.... What you see is that Cubase has a much more efficient engine than Pro Tools (no surprise), and that on Windows either one gets a lot more polyphony than on the Mac. At any given buffer size (lower buffers are harder to deal with), Windows did better.

Pretty good test too since you are dealing with tools that have long been cross platform. Kontakt has been cross platform for its entire life, Pro Tools was Mac only until version 5 (1998ish), since when it has been cross platform, and Cubase has been cross platform since back in the DOS and Atari ST days. All the software has long development histories on both platforms, yet Windows gives superior results.

None of this means OS-X is unusable or anything, but it doesn't appear to have the performance Windows does, when pushed.

Comment: Re:Really? (Score 1) 368

Isn't cheating allowed? Write the original OS in C, then compile, and then work on the resultant assembly code, first on optimizing, and then on adding features?

Writing C code and playing with the generated assembly might be OK for a day or two's work in learning assembly on a particular architecture. However, for any half-serious project it's probably a terrible way to get started.

Assembly programming is not about trying to out-code-generate a compiler. It's about thinking of implementations that are not restricted by the semantics of a particular high-level language, about being able to leverage knowledge that cannot be given to the compiler, about thinking in terms of the actual hardware architecture.
