Comment: The same public key can map to many private keys (Score 2) 44

by Simon Brooke (#47962839) Attached to: Researchers Propose a Revocable Identity-Based Encryption Scheme

A private key and a public key are the two factors in a single mathematical relationship.

So there can potentially be many (possibly infinitely many, I haven't tried to prove this) valid private keys for any given public key.

So, given the public key john@doe.com, I can see that there could potentially be many private keys. I can see how you could brute-force a private key that matches your public key, and I can see that, depending on how the brute-forcing is done, there would be no guarantee that an attacker also brute-forcing a private key from the same public key would come up with the same private key.

What I can't see is how, if you have a message which unlocks with the public key, you can tell whether it was locked with the 'authentic' private key or with an attacker's inauthentic one.
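
For what it's worth, here is a toy sketch of the "many private keys per public key" idea using textbook RSA. The paper's identity-based scheme is a different animal entirely, so treat this as an analogy rather than a description of it; the numbers are absurdly small, and the second exponent is just the first one shifted by lcm(p-1, q-1).

```python
# Toy RSA numbers, far too small to be secure -- purely illustrative.
p, q = 61, 53
n = p * q              # public modulus (3233)
e = 17                 # public exponent
lam = 780              # lcm(p-1, q-1) = lcm(60, 52)

d1 = pow(e, -1, lam)   # the "textbook" private exponent (413); needs Python 3.8+
d2 = d1 + lam          # a second exponent, congruent to d1 mod lcm(p-1, q-1)

m = 42                 # a message
c = pow(m, e, n)       # "lock" with the public key

# Both private exponents recover the message, so both are valid private
# keys for the same public key (n, e).
assert pow(c, d1, n) == m
assert pow(c, d2, n) == m
print(d1, d2)          # 413 1193
```

Both exponents undo anything locked with the public key (n, e), which is the sort of ambiguity I'm asking about above.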

Anyone?

Comment: Re: only manual lenses? (Score 1) 50

by dgatwood (#47962725) Attached to: Video Released, Crowdfunding Underway For Axiom Open Source Cinema Camera

BTW, 720p is just shy of a megapixel, not a third of one. You're probably thinking of the old 720x480 format used for widescreen standard def content. :-) Not that a megapixel is all that amazing, either, mind you.

Let me correct myself further. 720p is just shy of a million full-color pixels. On a Foveon sensor, depending on how you count megapixels, that might be the same number. On a Bayer-filtered sensor, it's more in the neighborhood of 3 MP, because each color channel has about a third the spatial resolution of the sensor as a whole.
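
Just to show the arithmetic behind those numbers (plain pixel counts, before any Bayer/Foveon considerations):

```python
# Quick pixel-count check for the resolutions mentioned above.
formats = {
    "720p (1280x720)":    (1280, 720),
    "480p DV (720x480)":  (720, 480),
    "1080p (1920x1080)":  (1920, 1080),
    "UHD 4K (3840x2160)": (3840, 2160),
}

for name, (w, h) in formats.items():
    mp = w * h / 1e6
    print(f"{name}: {w * h:>9,} pixels  (~{mp:.2f} MP)")

# 720p    -> ~0.92 MP (just shy of a megapixel)
# 720x480 -> ~0.35 MP (about a third of a megapixel)
```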

Comment: Re: only manual lenses? (Score 1) 50

by dgatwood (#47962675) Attached to: Video Released, Crowdfunding Underway For Axiom Open Source Cinema Camera

It will indeed be interesting to see if that happens. If I were to bet on it, I would say it's not going to happen in cinematography. Indeed, moving focus through the scene is one of the tools a cinematographer uses to achieve the desired artistic effect. It is hard to imagine that a computer algorithm could predict how fast or slowly we want to bring objects into and out of focus, and how much smoothness we want in those transitions.

That's an interesting question, and you're right that for that particular effect, you're probably better off doing it manually—preferably with a long-throw manual lens and a reasonably long stick attached. But that's likely to be an occasional thing, with either static focusing or traditional subject-following focusing used for probably 99% of your shots; if you're using focus to move from one subject to another for 99% of your shots, the viewers are likely to get nauseated rather quickly. :-D

Can autofocus beat the precision of a measuring tape?

Depends on how narrow the depth of field is. At very wide apertures (e.g. the Zeiss 50mm f/0.7 that Kubrick used), it can be done by hand, sure, but if you blow that up to where you can see pixels at 4K resolution, you're almost certainly going to notice the softness compared with what modern electronics could achieve, particularly if the subject is close to the lens and he or she decides to move a fraction of an inch. Mind you, that's a rather extreme case. :-) At more sane stops, it's not quite that bad. It's still a lot of work, though—work that's largely unnecessary with a decent, modern, subject-tracking AF mechanism (even without eye tracking to set the starting point). It's not that focus pulling can't be good enough, so much as that the extra work to make it good enough is significant, and it makes little sense to bother with that when a simple circuit can do at least as good a job (if not better) without all that effort. :-)

Lastly, depth of field, even though sometimes shallow, isn't nil in most circumstances, so small focusing errors might not have an adverse effect on the result.

That's true. With that said, the higher the resolution, the more visible that small effect becomes. At some point, you start to swear because the soft focus limits your effective resolution, and all those extra pixels are just taking up more space on disk without any real benefit. I'm not quite sure where that magic point is for manually focused movies—you'd have to ask somebody who regularly does film scanning and media ingestion for their take on it. Obviously it would depend on the f-stop, the distance to the subject, the film format, the focal length, and the skills of the person doing it. :-)
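
If you want to put rough numbers on that, here is a quick sketch using the standard hyperfocal-distance approximation. The circle-of-confusion value is an assumed full-frame-ish figure (0.030 mm); at 4K you'd arguably want it tighter, which only makes the focus tolerance narrower.

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.030):
    """Approximate depth of field using the standard hyperfocal formulas.

    coc_mm (circle of confusion) is an assumed value; ~0.030 mm is a common
    rule of thumb for full frame, and it shrinks as the detail you're
    willing to call "sharp" gets finer (i.e. at 4K and beyond).
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    if subject_mm >= hyperfocal:
        return near, float("inf")
    far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return near, far

# 50 mm lens, subject at 3 m
for f_number in (0.7, 2.8, 8.0):
    near, far = depth_of_field(50, f_number, 3000)
    print(f"f/{f_number}: in focus from {near/1000:.2f} m to {far/1000:.2f} m "
          f"({(far - near)/1000:.2f} m deep)")

# At f/0.7 the zone of acceptable focus at 3 m is only ~0.15 m deep,
# versus ~0.6 m at f/2.8 and ~1.8 m at f/8.
```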

Chances are you've got the experience here, while I certainly don't. However, my impression so far has been that what you're saying is true, but not to such an extreme extent. People do use old lenses, including ones much older than a decade, on modern still cameras, and the results they're getting certainly don't look as if they were shot at the roughly one-third of a megapixel that 720p video provides. I think the truth must be somewhere in between, and the old glass must still be a valuable tool. After all, if that glass were really that bad, why wouldn't its prices be nil today? Some of these lenses still command amounts of money that someone on a budget would think twice before spending.

I'm probably being a bit on the cynical side; I'm sure there are some older lenses that are usable at 4K. The point I was trying to make was that the newer lenses are breathtakingly better at high resolutions—maybe not at 4K, but long before you get to 8K. And their handling of bad lighting conditions (lens flare, for example) is just amazing compared with the older lenses. Unless, of course, you're into that whole lens flare thing. (Yes, I'm talking to you, Mr. Abrams.)

BTW, 720p is just shy of a megapixel, not a third of one. You're probably thinking of the old 720x480 format used for widescreen standard def content. :-) Not that a megapixel is all that amazing, either, mind you.

Comment: Re: Alright smart guy (Score 1) 378

by dgatwood (#47962209) Attached to: Ask Slashdot: Is iOS 8 a Pig?

I can't imagine why Texas Instruments' lack of support would be relevant in any way unless the phone vendor seriously lacked foresight. Most hardware manufacturers won't ship a closed binary blob that they don't build themselves. They may not be able to make the sources available, and it may be guarded under piles of NDAs so tall that the falling tower of paper would kill anyone who tried to leak it, but that doesn't mean they don't have the sources. I can't imagine that even Samsung would put themselves in such a vulnerable position.

Then again, if Samsung really doesn't care much about long-term support, maybe they would.

*shrugs*

Comment: Re:Alright smart guy (Score 2) 378

by dgatwood (#47962169) Attached to: Ask Slashdot: Is iOS 8 a Pig?

Backups make no difference. GP is correct. A backup of an iOS device includes only user data and apps, not the OS itself, because it is always more reliable to install the OS from a known-good source, and you wouldn't want those bits getting overwritten by corrupted versions from a backup.

And as I understand it, iTunes won't sign the firmware for your device unless Apple says it should. And Apple stops letting it do so shortly after the next OS comes out. Therefore, short of a jailbreak and some sort of forced downgrade from within iOS itself, it is not possible to reinstall a non-current version of iOS even if you've kept the old IPSW file (except on older devices where no newer version is available).

Comment: Re:Alright smart guy (Score 1) 378

by dgatwood (#47962117) Attached to: Ask Slashdot: Is iOS 8 a Pig?

Go take a look at which devices support "mavericks" and come back and post your findings.
HINT: Many Mac Pro users were unhappy with the "line" at "2008 and newer"

The list for Mavericks was identical to the list for Mountain Lion. They were mad a year earlier than that.

The way Apple determines support tends to be based upon hardware functionality. Mountain Lion dropped the 32-bit kernel, which means only systems with a 64-bit EFI could run it. Of course, for as long as I can remember, folks have been hacking OS X to run on older, unsupported machines—usually by hacking up the installer and replacing the missing drivers and platform experts, IIRC.

In the case of Mountain Lion and later, such a hack would also require writing a custom bootloader, because the 32-bit EFI can't load a 64-bit bootloader, and the Apple-provided 32-bit bootloader can't load a 64-bit kernel. It seems likely that the non-EFI kernels and bootloader used in the Hackintosh community would "just work" in that regard, but I've never tried it.

Either way, I'm pretty sure the discussion was about phone hardware, rather than computers.

Comment: Re: only manual lenses? (Score 1) 50

by dgatwood (#47956915) Attached to: Video Released, Crowdfunding Underway For Axiom Open Source Cinema Camera

...the connection between the manual focusing ring and the lens part is electronic rather than mechanical...

Just because a lens has electronic focus doesn't mean that it doesn't have mechanical manual focus. At least on the Canon side of things, focus-by-wire lenses are rare. Most of the focus-by-wire lenses are old, discontinued models like the 50mm f/1.0. The only current focus-by-wire lenses I'm aware of are their STM lenses (mostly low end) and the 85mm f/1.2L II. The rest of their L line is mechanical, including the 50mm f/1.2 L (popular for movie work), the 135 L II, their various zooms, etc.

The big advantage that fully manual lenses have over autofocus lenses when it comes to manual focusing is that most manual lenses have a longer throw, which makes it easier to focus precisely by hand. Autofocus lenses don't do that because a long throw would make autofocusing slower.

With that said, I think the industry's obsession with manual focus is badly misplaced. When you're dealing with 4K video, you want the focus to be right, not just close, and autofocus is a lot more precise than any human can possibly be, even with static subjects, with the best long-throw lenses, and with a separate person doing nothing but handling the focusing. The only thing holding back autofocus for video use was the slowness of contrast-based autofocus (and its tendency to seek). With the advent of on-die phase-detect autofocus capabilities, that limitation is rapidly disappearing. Add a bit of eye tracking into the mix, and I think you'll find that within the next ten years, nobody in their right minds will still be focusing manually, particularly when they're shooting 4K.

it is often better if the aperture can be set in a step-less fashion

AFAIK, that's fairly rare even in fully mechanical lenses unless they've been modified. Perhaps dedicated cinema lenses are different in that regard. I'm not sure. But even some of my old screw-mount lenses from back in the black-and-white TV days had mechanical stops, so I'm guessing stepless apertures aren't exactly common.

So I can conclude that, while having a powered mount is very much desirable on Axiom cameras (and so it will come just a bit later), it is also true that the old lenses are in fact more suitable to the task of shooting movies, and so the decision to deliver a fully manual Nikon F mount first is justified.

The problem with old lenses is that they're designed for a world where cameras had relatively poor spatial resolution, and for much less reflective sensor material (film). I enjoy playing with old lenses on a 6D, and they create an interesting artistic feel, but they don't even approach the level of flare resistance, sharpness, etc. that you'd want for a digital 4K cinema camera. So if you're limiting yourself to mostly old lenses, you might as well limit yourself to 720p as well, because you'll be lucky to out-resolve that with most lenses designed more than about a decade or so back.
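
As a rough sanity check on that claim, here is the Nyquist-style arithmetic, assuming a Super 35-sized sensor about 24.9 mm wide (an assumption; the Axiom's actual sensor dimensions may differ) and ignoring debayering and diffraction:

```python
# Rough lens resolution requirement: the Nyquist limit in line pairs per mm
# for a given horizontal pixel count on a given sensor width. Assumes a
# Super 35-ish sensor ~24.9 mm wide; real sensors and debayering change the
# numbers, but not the order of magnitude.
SENSOR_WIDTH_MM = 24.9  # assumption

for name, h_pixels in [("720p", 1280), ("1080p", 1920), ("4K", 4096)]:
    line_pairs = h_pixels / 2            # Nyquist: one line pair per two pixels
    lp_per_mm = line_pairs / SENSOR_WIDTH_MM
    print(f"{name}: lens needs roughly {lp_per_mm:.0f} lp/mm at the sensor")

# 720p -> ~26 lp/mm, easily within reach of old glass.
# 4K   -> ~82 lp/mm across the frame, which is where a lot of older designs
#         (especially wide open, away from the centre) start to struggle.
```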

And if you have the money for modern, full-manual cinema lenses, chances are you aren't in the market for anything less than a highly polished, turnkey camera system.

So I really think that they need to at least lay the groundwork (in hardware) by making the plastic plates in front of the sensor removable and by including USB and DC connectors near the back side of that plate so that the system will be readily extensible in the future. That small change shouldn't require a huge amount of effort, and it will future-proof the design in a way that nothing else will.

Or, if USB isn't feasible, a high-speed serial port capable of at least 230 kbps would probably be good enough.

Just my $0.02.

Comment: Re:"compared to consumer grade cameras" (Score 1) 50

by dgatwood (#47956721) Attached to: Video Released, Crowdfunding Underway For Axiom Open Source Cinema Camera

I think you missed my point. If they don't provide an electrical interface near the front of the hardware as part of the core design, there's no way that users can develop any electronic mount hardware, because there's no way to communicate with said mount hardware... or at least none that doesn't involve a box fastened to the back of the camera with a wire wrapped all the way around the camera to the front.

That said, so long as they provide a multipin connector with full-voltage DC and USB pins on the interior of the body, just inside the mount, that's good enough to make it possible to add electronic mount hardware by replacing the mount with a redesigned mount. That's the minimum that the core developers must do. If they don't, first-generation hardware users will be stuck with that limitation forever, and folks will try to work around the lack of that hardware with disgusting workarounds, which future hardware users will then get stuck with... probably forever. :-)

Comment: Re:"compared to consumer grade cameras" (Score 2) 50

by dgatwood (#47955691) Attached to: Video Released, Crowdfunding Underway For Axiom Open Source Cinema Camera

The biggest problem I see with this is that the lens mount system appears to be purely manual. This seriously limits the lenses you can use, because these days, 99% of lenses don't have mechanical aperture control. They really need to have some sort of adaptable lens electronics in this thing, so that people can design adapters that actually support modern lenses, similar to the Metabones adapters for NEX. The absolute minimum requirement for such things is a set of electronic contacts inside the lens mount that are controllable through software.

I think if I were designing a camera system to be extensible, I'd make the lens contacts speak USB 2.0, with appropriate short-circuit protection for when the lens is being attached to the mount. That way, the adapters could be very basic USB controllers that speak a particular lens protocol, rather than having to convert one arbitrary lens protocol to another (potentially incompatible) protocol.

There is one caveat to using USB, though. You'd need to also provide a 7.2VDC pin on the lens mount. Many camera lens systems require that much voltage to drive the focus motors, and it would suck to have to boost the voltage from a 5VDC USB supply in an adapter, particularly given that you probably already have the higher-voltage DC supply floating around inside the camera.
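
Purely as an illustration of how simple the host side of such an adapter could be, here is a sketch where the adapter shows up as a USB CDC-ACM serial device. Every opcode, the framing, and the device path are invented for the sake of the example; nothing here corresponds to an actual Axiom, Canon, or Metabones protocol.

```python
# Hypothetical host-side sketch: the camera body talks to a lens adapter
# that enumerates as a USB CDC-ACM serial device. All names, opcodes, and
# framing details are invented for illustration only.
import struct
import serial  # pyserial

OP_SET_APERTURE = 0x01   # hypothetical opcode: aperture offset in 1/8-stop units
OP_MOVE_FOCUS   = 0x02   # hypothetical opcode: signed focus step count

def send_command(port: serial.Serial, opcode: int, value: int) -> None:
    # Tiny fixed-size frame: opcode byte, signed 16-bit argument, XOR checksum.
    payload = struct.pack("<Bh", opcode, value)
    checksum = 0
    for b in payload:
        checksum ^= b
    port.write(payload + bytes([checksum]))

if __name__ == "__main__":
    # Baud rate chosen arbitrarily for the sketch.
    with serial.Serial("/dev/ttyACM0", 230400, timeout=1) as adapter:
        send_command(adapter, OP_SET_APERTURE, 24)   # e.g. stop down 3 stops, in 1/8-stop units
        send_command(adapter, OP_MOVE_FOCUS, -50)    # nudge focus 50 steps closer
```

The point of the framing being this dumb is that the adapter itself can then carry all the knowledge of whatever proprietary lens protocol it is translating to.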

Comment: Re:Edge routers are expensive (Score 1) 85

by dgatwood (#47948583) Attached to: Why Is It Taking So Long To Secure Internet Routing?

First, I'm not talking about adding any additional gear. There's no reason that what I'm talking about can't be handled entirely in the DSLAM or head end or whatever and in the existing CPE hardware that talks to it.

Second, I wasn't really talking about changing the CPE for business customers with fiber connections anyway. They're not (usually) the ones who are constantly on the phone with tech support saying "The Internet is down" when really, they just accidentally unplugged something. I'm talking about providing smarter, preconfigured cable modems and DSL modems for home use.

Comment: Re:Edge routers are expensive (Score 1) 85

by dgatwood (#47924071) Attached to: Why Is It Taking So Long To Secure Internet Routing?

I keep thinking that if an ISP really wanted to cut costs, they could proactively monitor their network for problems:

  • Provide the CPE preconfigured, at no additional cost to the customer. (Build the hardware cost into the price of service.)
  • Ensure that the CPE keeps a persistent capacitor-backed log across reboots. If the reboot was caused by anything other than the customer yanking the cord out of the wall or a power outage, send that failure info upstream. Upon multiple failures within a few weeks (sketched below), assume that the customer's CPE is failing, and call the customer with a robocall to tell them that you're mailing them new CPE to improve the quality of their service.
  • Detect frequent disconnects and reconnects, monitor the line for high error rates, etc. and when you see this happening, treat it the same way you treat a CPE failure.
  • If the new hardware behaves the same way, silently schedule a truck roll to fix the lines.

If done correctly (and if clearly advertised by the ISP so that users would know that they didn't need to call to report any outages), it would eliminate the need for all customer service except for billing, and a decent online billing system could significantly reduce the need for that as well.
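
Here is a sketch of the kind of heuristic the "multiple failures within a few weeks" rule could use. The thresholds, cause strings, and event format are all made up for illustration; a real ISP would tune these against their own failure data.

```python
# Hypothetical heuristic for the "CPE is probably failing" decision above.
# The thresholds and event structure are invented for illustration.
from datetime import datetime, timedelta

FAILURE_WINDOW = timedelta(weeks=3)
MAX_FAILURES = 2                       # more than this within the window -> replace

CUSTOMER_CAUSED = {"power_yanked", "power_outage"}

def should_replace_cpe(reboot_events):
    """reboot_events: list of (timestamp, cause) tuples pulled from the
    CPE's persistent, capacitor-backed reboot log."""
    unexplained = sorted(ts for ts, cause in reboot_events
                         if cause not in CUSTOMER_CAUSED)
    # Look for any cluster of unexplained reboots dense enough to trip the threshold.
    for i, start in enumerate(unexplained):
        in_window = [ts for ts in unexplained[i:] if ts - start <= FAILURE_WINDOW]
        if len(in_window) > MAX_FAILURES:
            return True
    return False

# Example: three unexplained reboots in ten days -> ship new CPE, robocall the customer.
now = datetime.now()
events = [(now - timedelta(days=10), "watchdog_reset"),
          (now - timedelta(days=6), "kernel_panic"),
          (now - timedelta(days=1), "watchdog_reset")]
print(should_replace_cpe(events))   # True
```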

Comment: Re:Article shows fundamental lack of understanding (Score 2) 182

by dgatwood (#47921615) Attached to: Why Apple Should Open-Source Swift -- But Won't

They won't see people switching to Swift uniformly. There are trillions of lines of code written in Objective-C, and programmers already know it and are comfortable with it. There are no tools for migrating code from Objective-C to Swift, much less the hodgepodge of mixed C, Objective-C, and sometimes C++ that quite frequently occurs in real-world apps, so for the foreseeable future, you'd end up just adding Swift to your existing apps, which means you now have three or four languages mixed in one app instead of two or three, and now one of them looks completely different than the others. I just don't see very many developers seriously considering adopting Swift without a robust translator tool in place.

I do, however, expect to see Swift become the language of choice for new programmers who are coming from scripting languages like Python and Ruby, because it is more like what they're used to. In the long term, they'll outnumber the Objective-C developers, but the big, expensive apps will still mostly be written in Objective-C, simply because most of them will be new versions of apps that already exist.

BTW, Apple never really treated Java like a first-class citizen; it was always a half-hearted bolt-on language. My gut says that they added Java support under the belief that more developers knew Java than Objective-C, so it would attract developers to the platform faster. In practice, however, almost nobody ever really adopted it, so it withered on the vine. Since then, they have shipped and subsequently dropped bridges for both Ruby and Python.

Any implication that Swift will supplant Objective-C like Objective-C supplanted Java requires revisionist history. Objective-C supplanted C, not Java. Java was never even in the running. And Objective-C still hasn't supplanted C. You'll still find tons of application code for OS X written in C even after nearly a decade and a half of Apple encouraging developers to move away from C and towards Objective-C. (Mind you, most of the UI code is in Objective-C at this point.) And that's when moving to a language that's close enough to C that you don't have to retrain all your programmers.

Compared with the C to Objective-C transition, any transition from Objective-C to Swift is likely to occur at a speed that can only be described as glacial. IMO, unless Apple miraculously makes the translation process nearly painless, they'll be lucky to be able to get rid of Objective-C significantly before the dawn of the next century. I just don't see it happening, for precisely the same reason that nine years after Rails, there are still a couple of orders of magnitude more websites built with PHP. Unless a language causes insane amounts of pain (as Perl did), people are reluctant to leave it and rewrite everything in another language just to obtain a marginal improvement in programmer comfort.
