Comment Re:What I Want To Know... (Score 1) 658
OS X no longer ships Samba as of 10.7 (due to the switch upstream to GPLv3). So yes, if by broken you mean 'was replaced with an entirely new project', it will remain so.
but Apple already granted them legitimacy by signing an agreement with Lodsys.
I believe all of the large companies had rights to the patent as part of the Intellectual Ventures portfolio, and part of the deal of the sale of the patent to Lodsys included that companies who had license to the portfolio retained the right to use the patent.
I believe Apple has a head start on the technology, not an exclusive. Intel's stock parts do not include Thunderbolt, stock video cards do not support it, and I believe most PCs have HDMI or at best DisplayPort, not Mini DisplayPort.
Until PC vendors want thunderbolt enough to get it integrated into third party video cards, or until Intel ships Ivy Bridge which is supposed to include Thunderbolt for its integrated video, Apple will be the only vendor producing computers with Thunderbolt.
Even then, it looks like Sony is interested enough in the technology to try to create a proprietary version of Thunderbolt which uses additional contacts on a USB port, and most likely does not support the DisplayPort video channel.
FireWire 800 is slower; what about FireWire 3200?
Damn it, uid envy
Except those do not have access to the Google marketplace.
People seem to think that this was done for piracy, or done by extraordinarily clever hackers through a lot of time and pain.
That's all bunk. The whole reason people hack these master keys is to sell a butt-load of t-shirts.
I know in particular that the NIO system on Windows used to use evented sockets rather than IOCP, which among other things meant that I/O events were shoved into the Windows message pump, and only 32/64 sockets could be handled at a time.
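The difference can be sketched in a few lines of Python. This is a minimal readiness-based ("evented") loop for contrast with completion-based IOCP; the 32/64-socket cap described above comes from Windows wait primitives (e.g. the 64-handle limit of WaitForMultipleObjects), not from anything in this sketch.

```python
import select
import socket

# Readiness model: the OS tells us which sockets are ready, and the
# program then performs the read itself. Completion-based I/O (IOCP)
# instead queues the read up front and is notified when it finishes.

def ready_to_read(socks, timeout=0.1):
    """Return the subset of socks that currently has data to read."""
    readable, _, _ = select.select(socks, [], [], timeout)
    return readable

if __name__ == "__main__":
    a, b = socket.socketpair()
    b.send(b"ping")
    # only `a` has pending data, so only it is reported readable
    assert ready_to_read([a, b]) == [a]
    print(a.recv(4))  # -> b'ping'
```

With the readiness model, every socket you care about must be handed to each wait call, which is where per-call handle limits start to bite; under IOCP, completions arrive on a queue regardless of how many sockets are outstanding.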
It’s not just that the bars are not a standard unit of measurement, but they are not necessarily even measuring the right thing.
Just measuring the strength of the signal does not take into account the amount of noise in the signal, which is arguably more important. The signal strength also only measures one side of the equation, being the cell tower to the phone. Finally, a raw measure of signal strength ignores that different antenna designs and phones may have different lower bounds where they can still make and maintain a call.
I just accept bars for what they are - an indicator of how likely it is that you will be able to make and maintain a call. Trying to compare ‘bar’ measurements between phones may not even work between models by the same vendor.
Apple’s big problem with the iPhone 4 is that they had their scale really messed up, and that became really obvious because the new antenna design shows a greater degree of worst-case attenuation than their previous models. As a result, it was possible to go from five bars to one bar when you attenuated the antenna in an area with only moderate signal strength - because the phone was reporting a moderate signal as five bars.
Ironically, the new scale for the bars is way more useful for people because it has grown considerably wider - but Apple will probably have a drop in customer satisfaction because people will assume fewer displayed bars indicates the phone is lousy at picking up the signal.
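The scale problem above can be sketched with a toy mapping from signal strength (in dBm) to bars. The thresholds here are invented for illustration - Apple never published its actual mapping - but they show how a compressed scale reports a moderate signal as full strength, and how the same attenuation then produces a dramatic-looking drop.

```python
# dBm floor required to light up bars 1..5. These numbers are made up
# for illustration; they are not Apple's real thresholds.
OLD_SCALE   = [-107, -103, -99, -95, -91]   # compressed: 5 bars comes easily
WIDER_SCALE = [-107, -101, -95, -89, -83]   # wider: bars spread over more range

def bars(dbm, scale):
    """Number of bars displayed for a signal of `dbm` under `scale`."""
    return sum(dbm >= floor for floor in scale)

if __name__ == "__main__":
    moderate = -91                            # a merely moderate signal
    print(bars(moderate, OLD_SCALE))          # -> 5: looks like full strength
    print(bars(moderate, WIDER_SCALE))        # -> 3: a more honest reading
    print(bars(moderate - 15, OLD_SCALE))     # -> 1: the infamous 5-to-1 drop
```

Under the compressed scale, a 15 dB attenuation from "five bars" collapses to one bar; under the wider scale the same hit only drops a couple of bars, which is why the wider scale is more useful even though it displays fewer bars on average.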
It blows my mind that they didn't come up with a simple non-conductive coating for the exposed antenna to reduce this problem in the first place.
The very thing that makes the steel ‘stainless’ makes it hard to coat. But the natural oxide is an insulator, so in a sense it coats itself the longer you expose it to the air.
Why did they feel the need to inflate the bar readings in the first place? So the iPhone 4 could gain a reputation for having better signal in more places? That sounds a little squirrely to me.
From the Q&A, Jobs said that they wrote the software (including the signal-strength display algorithm) themselves, and it has been there since the first iPhone. Some people claimed that the 3G had a patch to increase signal strength that was actually this particular bar algorithm, something Jobs refuted.
How are Mac users with Mercury Extreme SSDs or the Mushkin etc. units doing?
Based on my research, the SandForce-based disks have a minimum reserve space of 7% (so say, 16GB of a 256GB disk). When you perform a write, the drive merges the old data and your changes to create a new block on the SSD, and then maps that into the computer’s view of the disk - the computer only sees new data, but the address now points to a different part of the physical disk. The ‘old’ block is cleaned by the drive whenever the drive becomes idle again, and added to the reserve pool, most probably to the end to promote more even wear.
If you did a large burst of writes (say, ~16GB of data on the 240GB-rated drives) you should see performance plummet, as the drive is kept busy and hasn’t been able to blank out the dirty pages in the reserve space. I imagine this is the purpose of higher performance, 200GB-rated drives; they just reserve a larger amount of space to deal with these sorts of usage bursts.
If you have TRIM support, it should be possible for the deletion of files to cause the reserve space to grow, as the TRIM command is issued by the filesystem layer to indicate to the drive that there is no longer relevant information on that section of disk.
TRIM is really important for drives without reserved space, as once you fill up the disk every write will first require a section of the flash to be wiped clean.
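The remap-on-write, idle cleanup, and TRIM behavior described above can be modeled with a toy simulation. This is a structural sketch only, not the actual SandForce controller algorithm; block sizes, pool ordering, and the class interface are all invented for illustration.

```python
class ToySSD:
    """Toy model of remap-on-write flash with a reserve pool."""

    def __init__(self, n_logical, n_reserve):
        # logical addresses start mapped 1:1; extra blocks form the clean reserve
        self.mapping = {lba: lba for lba in range(n_logical)}
        self.clean = list(range(n_logical, n_logical + n_reserve))
        self.dirty = []          # old copies awaiting erasure
        self.slow_writes = 0     # writes forced to erase in-line

    def write(self, lba):
        if self.clean:
            new = self.clean.pop(0)   # fast path: remap onto a pre-erased block
        else:
            new = self.dirty.pop(0)   # slow path: must erase before writing
            self.slow_writes += 1
        old = self.mapping.get(lba)
        if old is not None:
            self.dirty.append(old)    # stale copy cleaned later, when idle
        self.mapping[lba] = new

    def idle_gc(self):
        # drive went idle: erase dirty blocks back into the reserve pool
        self.clean.extend(self.dirty)
        self.dirty.clear()

    def trim(self, lba):
        # filesystem says the data is gone: reclaim the block immediately
        old = self.mapping.pop(lba, None)
        if old is not None:
            self.dirty.append(old)
```

A burst of writes larger than the clean pool forces the slow erase-before-write path, which is the performance cliff described above; an `idle_gc()` pass (or TRIM freeing blocks) restores the fast path.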
Making and publishing a developer- or project-specific branch is what distributed version control systems were designed for. If the constraint was that individual branches weren't being captured, the system was being used incorrectly.
All great discoveries are made by mistake. -- Young