
Submission + - Ask Slashdot: Do USB Audio Cards With HDMI Multichannel PCM Output Exist?

mr3038 writes: Use case: I have a high-resolution LCD monitor connected via HDMI to my computer. In addition, I have an AV receiver capable of accepting multichannel PCM audio on its HDMI input, but that input only supports resolutions up to 1920x1080p, so I don't want to route my display through the receiver. I would like to add a new HDMI output port to my computer with the intent of never outputting any video from that port, only multichannel PCM audio.

One would imagine that a USB device that presents itself as a multichannel USB audio card to the host and as an HDMI source device to the AV receiver would be pretty cheap to build. Technically, it would be a USB-audio-to-HDMI converter that does the media conversion but does not change the bit rate, bit depth or anything else.

Does such a product exist at all?

Comment Re:HDR? (Score 1) 287

It's possible to do HDR correctly, too. See e.g. http://www.flickr.com/photos/cmdrcord/4973996377/sizes/o/in/pool-89888984@N00/. You cannot take a shot like this without HDR + tone mapping, because the amount of light on the wall in direct sunlight is far too high compared to the shadows under the collapsed roof. I consider HDR similar to digital sharpening algorithms: used well, the technique can improve image quality, but more often than not beginners overdo it.

Comment Re:The problem is not the storing of SSN! (Score 1) 505

It just came to my mind that even some banks are stupid enough to use an identification number as authentication. In this particular case, the attacker was able to withdraw money from an account by knowing only the account number (the account identifier). If this happened to me, I'd sue my bank for giving out my money without authenticating my identity. It should be really simple (a rough code sketch follows the list):

  • 1. The account identifier (account number) identifies the account.
  • 2. The bank authenticates the identity of the person making the withdrawal.
  • 3. The bank checks that the authenticated person is authorized for the given account.
  • 4. If step 3 succeeds, withdraw the money from the account.
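
To make the distinction concrete, here is a minimal C sketch of that flow. It is only a toy: every name and value in it (the account number, the owner, the helper functions) is made up for illustration, not taken from any real banking system.

    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    /* Toy model of the identify/authenticate/authorize split. */
    struct account { const char *number; const char *owner; long balance; };

    static struct account demo = { "FI12 3456 7890", "Alice", 1000 };

    /* 1. Identification: the account number only selects the account. */
    static struct account *identify(const char *number) {
        return strcmp(number, demo.number) == 0 ? &demo : NULL;
    }

    /* 2. Authentication: prove who the person at the counter actually is
     *    (in real life: ID card, chip and PIN, strong e-authentication, ...). */
    static bool authenticate(const char *claimed_name, const char *secret) {
        return strcmp(claimed_name, "Alice") == 0 && strcmp(secret, "hunter2") == 0;
    }

    /* 3. Authorization: is this authenticated person allowed to use this account? */
    static bool authorized(const char *name, const struct account *acct) {
        return strcmp(name, acct->owner) == 0;
    }

    static int withdraw(const char *number, const char *name, const char *secret, long amount) {
        struct account *acct = identify(number);         /* step 1 */
        if (acct == NULL || !authenticate(name, secret)  /* step 2 */
                         || !authorized(name, acct)      /* step 3 */
                         || acct->balance < amount)
            return -1;
        acct->balance -= amount;                         /* step 4 */
        return 0;
    }

    int main(void) {
        /* Knowing only the account number is not enough; the secret must match too. */
        printf("with secret: %d\n", withdraw("FI12 3456 7890", "Alice", "hunter2", 100));
        printf("without:     %d\n", withdraw("FI12 3456 7890", "Mallory", "guess", 100));
        return 0;
    }

A bank doing only steps 1 and 4 is skipping exactly the two checks that matter.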

Any bank doing only

  • 1. the identifier identifies the account,
  • 2. withdraw the money from the account

deserves to be sued its ass off.

Comment The problem is not the storing of SSN! (Score 1) 505

It should not matter whether businesses store your SSN. Would you object to them storing your name, email address, phone number, postal address or any other publicly available information? The SSN should be no different.

I'm afraid that the real problem is that businesses (and possibly government officials) are using the SSN as an authentication token instead of an identification token.

We have exactly the same problem here in Finland with our SOTU/HETU/whatever-it's-called-today identifier string. It was originally designed to be an identifier for every citizen, but the latest law (Henkilötietolaki, 1999) says that this identifier should not be public... Or it can still be used for identifying persons for statistical purposes, for selling services on credit, renting, insurance and other miscellaneous stuff. However, it cannot be used as the personal identifier "only because it would be the easiest way to identify a person" (direct translation from the actual law)! How fucked up is that? A personal identification number that shall not be used as a personal identification number? To my knowledge this originates from using the identifier for authentication (surely you are the only person who can remember the last 4 characters of your identification number?)... After reading this discussion, it seems clear that the problem is the same in the USA. What I cannot understand is why they decided to codify this brain damage into law instead of simply saying that you cannot authenticate with an identifier.

How can we get businesses and government to recognize the difference between identification and authentication? An SSN or any other non-secret is not an authentication token and MUST NOT require any protection to keep it out of public view. One simple method would be to pass a new law that in practice says "the SSN cannot be used for authentication". As a result, anybody using the SSN for authentication would, according to the law, have no authentication at all. Hopefully that would be clear enough even for the dumber businesses.

Comment Re:What did open source software ever do for anybo (Score 1) 640

Mozilla can't implement h.264.

Why not? It's easily licensable, and Mozilla has a pretty decent income.

Because even though Mozilla has some money, it cannot license H.264 under GPL-compatible terms. They need a license that allows end users to modify and redistribute modified versions of Mozilla products (e.g. Firefox). Such a modified version could be a GPL-licensed H.264 codec with absolutely no browser code remaining. The patent pool licensor, MPEG LA, is not happy with such terms, because if it licensed H.264 to Mozilla that way, every free software project would effectively have a license. And even if it were willing to grant such a license, Mozilla is not rich enough to pay for it...

Why are software patents stupid? Because you say so? Do you think there should be a difference between software and non-software patents? Why?

I'm not the parent poster who made that claim, but here are my thoughts on this:

  • Software as an engineering field advances much more rapidly than, say, medical engineering or biology, so using similar expiration terms for all fields is insane. The 20-year monopoly granted for a new drug may make some sense if research takes 5 years and obtaining a national license to use the drug takes 10 years of field testing. A new video codec may require 2 years of research, manufacturing it takes zero years, and it will be deprecated 2 years later when a better codec becomes available. And still that codec gets the same 20-year monopoly as the new drug, which makes no sense to me. Notice that such a deprecated but patented video codec prevents further research using any of the patented methods as part of a new video codec (or any unrelated software).
  • Software patents make no sense because the patent does not disclose the invention. Look at any software patent you can find. Does it disclose enough information to implement the invention (the piece of software being patented)? I haven't seen such a software patent. In every case the patent has been obfuscated enough to be unhelpful for programming. In fact, in many cases it's practically impossible to even recognize the patented invention, even if you had written an infringing implementation yourself. See my older comment about software patents.

I'd be happy with software patents given the following further restrictions:

  • The patent MUST include reference source code (NOT in a pseudo-language)
  • Software patents always expire in a maximum of 5 years
  • If the patent owner does not distribute (or sell) software containing the invention, the patent automatically expires in one year (this prevents patent trolls).

Notice that the US patent system originally required a working implementation of the invention to be presented to the patent office. This requirement was later dropped because of the heavy costs (for the office or for the inventors, I don't know which). With software, the cost of copying the invention to the patent office is less than the cost of filing the patent, so there is really no reason not to require a reference implementation.

Comment Re:gnome changes too often (Score 2, Informative) 455

No. Fundamentally, what is a web browser? It's a program that sends out tcp/ip packets, waits for the response, and displays stuff on screen. While there have been many new features added to windows over the years, there isn't anything fundamental that has changed that would impede a web browser from running on an older version of the win32 api.

Basically true, but the devil is in the details. The latest Firefox version does things such as displaying downloaded fonts on web pages without installing those fonts in the system (requires a new API), scanning downloaded files for viruses (there are two APIs; Windows 2000 requires the old one, newer Windows versions require the newer one), and theming the browser (it could use the native uxtheme library API if it supported only Windows XP or newer); native Unicode support is better in newer versions, too.

For a combination of wget and cat, the OS version does not matter much; for OS-supported rendering and integration features, the OS version is very important. The Linux version of Firefox already requires fairly recent glibc and cairo libraries.

Comment Re:Surely it goes both ways? (Score 1) 335

What would stop Sun from merging any interesting development made on any of these forks back into their version?

The fact that Sun bought MySQL to acquire the rights to the source. That allows them to sell MySQL under licenses other than the GPLv2. If they merged code from any open source fork (they're all GPLv2, because that's the only choice the MySQL license allows a fork), then Sun would be forced to distribute under GPLv2 only. Clearly that is not what they want, because they paid $1,000,000,000 for the source. If all they wanted was the GPLv2 version, they already had it for free (as in beer!).

Comment Re:Dunno (Score 2, Insightful) 421

ext3 also delays writes. The bug is that ext4 does not delay renames until after the writes. Instead, renames happen immediately, and guess what, they spin your hard drive up, then you get to wait 60 seconds until the real data starts to be written. Oh, and if you lose power or crash during those 60 seconds, you lose all data - new and old. Oh, and your common desktop programs do that cycle several times a minute.

Excuse my language, but why the fuck are those "common desktop programs" writing and renaming files several times a minute? I understand that files are written if I change any settings, but this is something different. Perhaps there should be a special filesystem designed to freeze the whole system for one second for every write() any application does. Such a filesystem could be used for application testing: that way it would be immediately obvious if a program is writing too much stuff without a good reason.
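
It would not even have to be a filesystem. A rough per-process approximation of the same idea is an LD_PRELOAD shim that delays every write() by one second. The sketch below is my own illustration, not an existing tool, and it only slows the traced program rather than the whole system:

    /* slowwrite.c -- build: gcc -shared -fPIC -o slowwrite.so slowwrite.c -ldl
     * run:            LD_PRELOAD=./slowwrite.so some-desktop-program
     * Every write() the program makes now costs an extra second, which makes
     * gratuitous writing painfully obvious. */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <time.h>
    #include <unistd.h>

    ssize_t write(int fd, const void *buf, size_t count)
    {
        static ssize_t (*real_write)(int, const void *, size_t);
        if (real_write == NULL)
            real_write = (ssize_t (*)(int, const void *, size_t))dlsym(RTLD_NEXT, "write");

        struct timespec delay = { 1, 0 };   /* one full second per write() */
        nanosleep(&delay, NULL);

        return real_write(fd, buf, count);
    }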

EXT4 is doing exactly the right thing, because it's never actually writing any of those files to the disk. Since those files are constantly replaced with new versions, there's no point trying to save one unless the application asks for it. To do that, the application should call fsync(). Otherwise, the FS has no obligation to write anything to the disk, in any particular order, until the FS is unmounted. A high-performance FS with enough cache will not write anything to disk before fsync() unless the CPU and disk have nothing better to do (and even then, only because doing so probably improves the performance of a possible later fsync() or unmount).
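
For the programs in question (rewriting small config files that should survive a crash), the pattern looks roughly like the sketch below. The file names and the minimal error handling are mine; the point is simply that fsync() happens before the rename:

    /* Replace settings.conf atomically: write a temp file, fsync it,
     * then rename it over the old one. Sketch only. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int save_settings(const char *data)
    {
        int fd = open("settings.conf.tmp", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0)
            return -1;

        if (write(fd, data, strlen(data)) < 0 ||   /* write the new contents   */
            fsync(fd) < 0) {                       /* force them to disk first */
            close(fd);
            return -1;
        }
        close(fd);

        /* Only now is it safe to atomically replace the old file. */
        return rename("settings.conf.tmp", "settings.conf");
    }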

Comment The problem is/was in the EXT3 in the first place! (Score 2, Informative) 421

POSIX specifies that closing a file does not force it to permanent storage. To get that, you MUST call fsync() (and with stdio, fflush() the stream first so the data is actually in the kernel to be synced).

So the required code to write a new file safely is (a fuller C sketch follows the list):

  1. fp = fopen(...)
  2. fwrite(..., fp)
  3. fflush(fp)
  4. fsync(fileno(fp))
  5. fclose(fp)
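
Spelled out as compilable C, the sequence above looks roughly like this; the file name and the error handling are my own additions for the sake of a complete sketch:

    #include <stdio.h>
    #include <unistd.h>

    int write_file_safely(const char *data, size_t len)
    {
        FILE *fp = fopen("output.dat", "w");
        if (fp == NULL)
            return -1;

        if (fwrite(data, 1, len, fp) != len ||  /* write into stdio's buffer      */
            fflush(fp) != 0 ||                  /* push the buffer to the kernel  */
            fsync(fileno(fp)) != 0) {           /* force the kernel onto the disk */
            fclose(fp);
            return -1;
        }
        return fclose(fp);
    }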

There is no performance problem because fsync() syncs only the requested file. However, that's the theory... use EXT3 and you'll quickly learn that fsync() is only able to sync the whole filesystem - it doesn't matter which file you ask it to sync, it will always sync the whole filesystem! Obviously, that is going to be really slow.

Because of this, way too many software developers have dropped the fsync() call to make their software usable (that is, not too slow) on EXT3. The correct fix is to change all the broken software, even though in the process that will make EXT3 unusable because of its slow performance. After that, EXT3 will either be fixed or abandoned. An alternative is to use fdatasync() instead of fsync() when its guarantees are enough; if I've understood correctly, EXT3 can do fdatasync() with acceptable performance.

Any piece of software that writes to disk without using either fsync() or fdatasync() is basically telling the system: the file I'm writing is not important, store it only if you don't have better things to do.

Comment Hopefully displays (Score 1) 596

I want a megapixel war on displays (desktop monitors, EVFs, mobile phones, etc.). A 12 Mpixel CCD sensor does not matter over a 10 Mpixel one in a small device, but a 2 Mpixel display is much better than the 0.2 Mpixel displays we currently have in viewfinders and mobile phones. Even a 1920x1200 computer display is barely above 2 Mpixel (2.3 Mpixel, or roughly "7 Mpixels" if you count each subpixel).
Graphics

Silverlight On the Way To Linux 475

Afforess writes "For the past two years Microsoft and Novell have been working on the 'Moonlight' project. It is a runtime library for websites that run Silverlight. It should allow PCs running Linux to view sites that use Silverlight. Betanews reports 'In the next stage of what has turned out to be a more successful project than even its creators envisioned, the public beta of Moonlight — a runtime library for Linux supporting sites that expect Silverlight — is expected within days.' Moonlight 2.0 is already in the works."

Comment Re:SO much of it is wrong (Score 1) 452

A good designer and a bad coder create better output than a poor designer and a good coder.

A good coder will ignore a poor design made by a poor designer. There is no such thing as a good coder who is not also at least an average designer. [Granted, if you have a good design and no coder, you are closer to done (in process, not necessarily in calendar time) than with no design and a good coder. Yes, a poor design is the same as no design.]

Open Design and Open Specification are far more important than Open Source.

I agree in theory. However, in practice, most of the time the only specification that is exact enough is the implementation. Take hardware drivers, for example: there are always some specifics that are not documented anywhere but in the driver source. Granted, it would be better if all those specifics were defined in the hardware documentation, but more often than not the documentation is not detailed enough to cover all the cases. One could argue whether that is because the design is poor or because the design is not detailed enough, though.

All programming is, after all, just documenting the desired behavior in appropriate detail for a simple calculator to comprehend. Some software designers dream about a future where software can be dragged and dropped together from simple pieces. If you were trying to describe some process in great detail, would you rather use a language (like English, C++ or Java), or would you rather drag images around with a mouse? I'd prefer a language suitable for the problem. Do you believe that coders will not be required in the future?

I do both software design and coding.

Software

Ext3cow Versioning File System Released For 2.6 241

Zachary Peterson writes "Ext3cow, an open-source versioning file system based on ext3, has been released for the 2.6 Linux kernel. Ext3cow allows users to view their file system as it appeared at any point in time through a natural, time-shifting interface. This can be very useful for revision control, intrusion detection, preventing data loss, and meeting the requirements of data retention legislation. See the link for kernel patches and details."
