
Comment Re:Alright smart guy (Score 2) 504

Go take a look at which devices support "mavericks" and come back and post your findings.
HINT: Many Mac Pro users were unhappy with the "line" at "2008 and newer"

The supported-hardware list for Mavericks was identical to the list for Mountain Lion, so if Mac Pro users were angry about that cutoff, they were angry a year earlier than that.

The way Apple determines support tends to be based upon hardware functionality. Mountain Lion dropped the 32-bit kernel, which means only systems with a 64-bit EFI could run it. Of course, for as long as I can remember, folks have been hacking OS X to run on older, unsupported machines—usually by hacking up the installer and replacing the missing drivers and platform experts, IIRC.

In the case of Mountain Lion and later, such a hack would also require writing a custom bootloader, because the 32-bit EFI can't load a 64-bit bootloader, and the Apple-provided 32-bit bootloader can't load a 64-bit kernel. It seems likely that the non-EFI kernels and bootloader used in the Hackintosh community would "just work" in that regard, but I've never tried it.
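
For anyone curious which camp their machine falls into, the firmware ABI is exposed in the I/O Registry on Intel Macs. Here's a minimal Python sketch that shells out to ioreg and reports it; the parsing is quick-and-dirty and the property only exists on EFI-based Intel machines, so treat it as a diagnostic toy rather than anything robust:

    import subprocess

    def efi_abi():
        """Report whether this Mac's firmware is 32-bit or 64-bit EFI.

        Looks for the firmware-abi property in the IODeviceTree plane,
        which is typically "EFI32" or "EFI64" on Intel Macs.
        """
        out = subprocess.run(
            ["ioreg", "-l", "-p", "IODeviceTree"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            if "firmware-abi" in line:
                return line.split('"')[-2]   # e.g. "EFI64"
        return "unknown (no firmware-abi property found)"

    print(efi_abi())

A machine that reports EFI32 is exactly the kind that would need the custom bootloader described above before a 64-bit-only release could boot.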

Either way, I'm pretty sure the discussion was about phone hardware, rather than computers.

Comment Re:Alright smart guy (Score 2) 504

IMO, a phone should be supported, with security updates at a minimum, until at least two years after the manufacturer last offered it for sale new. That way, anyone on a two-year contract can upgrade to a newer device before the security fixes stop.

Comment Stop with the SLASHVERTISEMENTS! (Score 3, Insightful) 101

I've been following this reactive programming "movement," and it all traces back to one guy who has a consultancy in "reactive programming." This is the fourth such reactive programming post that I'm aware of on /. Nowhere else have I seen "reactive programming," and this is the only guy I know of who is pushing it.

In addition, the /. comments are highly critical of this "movement."

I call on Slashdot to identify which articles are slashvertisements and/or are carried on special grounds.

Comment Re: only manual lenses? (Score 2) 52

...the connection between the manual focusing ring and the lens part is electronic rather than mechanical...

Just because a lens has electronic focus doesn't mean that it doesn't have mechanical manual focus. At least on the Canon side of things, focus-by-wire lenses are rare. Most of the focus-by-wire lenses are old, discontinued models like the 50mm f/1.0. The only current focus-by-wire lenses I'm aware of are their STM lenses (mostly low end) and the 85mm f/1.2L II. The rest of their L line is mechanical, including the 50mm f/1.2 L (popular for movie work), the 135 L II, their various zooms, etc.

The big advantage fully manual lenses have over autofocus lenses for manual focusing is that most of them have a longer focus throw, which makes it easier to set focus precisely by hand. Autofocus lenses avoid long throws because they would slow autofocus down.
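
To put rough numbers on the throw argument, here's a toy Python calculation; the throws, the linear focus-to-rotation mapping, and the focus range are illustrative assumptions rather than specs for any real lens:

    # Toy model: how much ring rotation you get per centimeter of focus shift.
    manual_throw_deg = 270.0   # assumed long-throw manual lens
    af_throw_deg = 90.0        # assumed short-throw autofocus lens
    focus_range_m = 1.5        # assumed focus travel mapped onto the ring

    for name, throw_deg in (("manual", manual_throw_deg), ("autofocus", af_throw_deg)):
        deg_per_cm = throw_deg / (focus_range_m * 100.0)
        print(f"{name}: {deg_per_cm:.1f} degrees of ring travel per cm of focus shift")

Under those assumptions, the long-throw lens gives you three times as much ring rotation for the same focus change, which is why a given wobble of your hand produces a smaller focus error.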

With that said, I think the industry's obsession with manual focus is badly misplaced. When you're dealing with 4K video, you want the focus to be right, not just close, and autofocus is a lot more precise than any human can possibly be, even with static subjects, with the best long-throw lenses, and with a separate person doing nothing but handling the focusing. The only thing holding back autofocus for video use was the slowness of contrast-based autofocus (and its tendency to seek). With the advent of on-die phase-detect autofocus capabilities, that limitation is rapidly disappearing. Add a bit of eye tracking into the mix, and I think you'll find that within the next ten years, nobody in their right minds will still be focusing manually, particularly when they're shooting 4K.

it is often better if the aperture can be set in a step-less fashion

AFAIK, that's fairly rare even in fully mechanical lenses unless they've been modified. Perhaps dedicated cinema lenses are different in that regard; I'm not sure. But even some of my old screw-mount lenses from back in the black-and-white TV days had click stops on the aperture ring, so I'm guessing stepless (de-clicked) apertures aren't exactly common.

So I can conclude that while having a powered mount is very much desirable on Axiom cameras (and so it will come just a bit later) it is also true that the old lenses are in fact more suitable to the task of shooting movies and so the decision to deliver a fully manual Nikon-F mount first is justified

The problem with old lenses is that they're designed for a world where cameras had relatively poor spatial resolution, and for much less reflective sensor material (film). I enjoy playing with old lenses on a 6D, and they create an interesting artistic feel, but they don't even approach the level of flare resistance, sharpness, etc. that you'd want for a digital 4K cinema camera. So if you're limiting yourself to mostly old lenses, you might as well limit yourself to 720p as well, because you'll be lucky to out-resolve that with most lenses designed more than about a decade or so back.
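
A quick Nyquist-style back-of-the-envelope calculation shows why; the Super35-ish sensor width here is an assumption, and real-world results depend heavily on contrast and the anti-aliasing filter:

    # Lens resolution needed (roughly) to out-resolve a given video format
    # on an approximately Super35-width sensor.
    sensor_width_mm = 24.9   # assumed sensor width

    for label, width_px in (("UHD 4K", 3840), ("720p", 1280)):
        lp_per_mm = (width_px / 2) / sensor_width_mm   # two pixels per line pair
        print(f"{label}: the lens needs to deliver roughly {lp_per_mm:.0f} lp/mm")

That works out to roughly 77 lp/mm for UHD versus roughly 26 lp/mm for 720p, and plenty of decades-old glass struggles to hold useful contrast at the former across the whole frame.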

And if you have the money for modern, full-manual cinema lenses, chances are you aren't in the market for anything less than a highly polished, turnkey camera system.

So I really think that they need to at least lay the groundwork (in hardware) by making the plastic plates in front of the sensor removable and by including USB and DC connectors near the back side of that plate so that the system will be readily extensible in the future. That small change shouldn't require a huge amount of effort, and it will future-proof the design in a way that nothing else will.

Or, if USB isn't feasible, a high-speed serial port capable of at least 230 kbps would probably be good enough.
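
As a sketch of how little software that would require, here's what talking to a hypothetical mount controller over such a serial port might look like in Python with pyserial; the device path, baud rate choice, and STX/ETX framing are all assumptions, not anything the Axiom project has specified:

    import serial  # pyserial

    # 230400 baud comfortably clears the 230 kbps floor suggested above.
    with serial.Serial("/dev/ttyUSB0", baudrate=230400, timeout=1) as port:
        port.write(b"\x02APERTURE?\x03")      # hypothetical STX/ETX-framed query
        reply = port.read_until(b"\x03")      # read until the terminating ETX
        print("mount controller replied:", reply)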

Just my $0.02.

Comment Re:"compared to consumer grade cameras" (Score 2) 52

I think you missed my point. If they don't provide an electrical interface near the front of the hardware as part of the core design, there's no way that users can develop any electronic mount hardware, because there's no way to communicate with said mount hardware... or at least none that doesn't involve a box fastened to the back of the camera with a wire wrapped all the way around the camera to the front.

That said, so long as they provide a multipin connector with full-voltage DC and USB pins on the interior of the body, just inside the mount, that's good enough to make it possible to add electronic mount hardware by replacing the mount with a redesigned mount. That's the minimum that the core developers must do. If they don't, first-generation hardware users will be stuck with that limitation forever, and folks will try to work around the lack of that hardware with disgusting workarounds, which future hardware users will then get stuck with... probably forever. :-)

Comment Re:"compared to consumer grade cameras" (Score 2) 52

The biggest problem I see with this is that the lens mount system appears to be purely manual. This seriously limits the lenses you can use, because these days, 99% of lenses don't have mechanical aperture control. They really need to have some sort of adaptable lens electronics in this thing, so that people can design adapters that actually support modern lenses, similar to the Metabones adapters for NEX. The absolute minimum requirement for such things is a set of electronic contacts inside the lens mount that are controllable through software.

I think if I were designing a camera system to be extensible, I'd make the lens contacts speak USB 2.0, with appropriate short-circuit protection for when the lens is being attached to the mount. That way, the adapters could be very basic USB controllers that speak a particular lens protocol, rather than having to convert one arbitrary lens protocol to another (potentially incompatible) protocol.

There is one caveat to using USB, though. You'd need to also provide a 7.2VDC pin on the lens mount. Many camera lens systems require that much voltage to drive the focus motors, and it would suck to have to boost the voltage from a 5VDC USB supply in an adapter, particularly given that you probably already have the higher-voltage DC supply floating around inside the camera.
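
Here's a rough Python/pyusb sketch of what the camera side of that scheme could look like: the camera sends generic commands, and the adapter translates them into whatever proprietary protocol the attached lens actually speaks. The vendor/product IDs, endpoint, opcodes, and parameter encoding are all invented for illustration:

    import struct
    import usb.core  # pyusb

    SET_APERTURE = 0x01   # hypothetical opcode
    DRIVE_FOCUS = 0x02    # hypothetical opcode

    def send_command(adapter, opcode, value):
        # opcode (1 byte), reserved (1 byte), 16-bit little-endian parameter
        adapter.write(0x01, struct.pack("<BBH", opcode, 0x00, value))

    adapter = usb.core.find(idVendor=0xF055, idProduct=0x0001)  # placeholder IDs
    if adapter is not None:
        adapter.set_configuration()
        send_command(adapter, SET_APERTURE, 560)  # e.g. f/5.6 encoded as 100 * f-number
        send_command(adapter, DRIVE_FOCUS, 120)   # nudge focus by an arbitrary step

The nice property is that the camera firmware only ever has to know this one generic command set; all the lens-specific weirdness lives in the adapter.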

Comment Re:Edge routers are expensive (Score 1) 85

First, I'm not talking about adding any additional gear. There's no reason that what I'm talking about can't be handled entirely in the DSLAM or head end or whatever and in the existing CPE hardware that talks to it.

Second, I wasn't really talking about changing the CPE for business customers with fiber connections anyway. They're not (usually) the ones who are constantly on the phone with tech support saying "The Internet is down" when really, they just accidentally unplugged something. I'm talking about providing smarter, preconfigured cable modems and DSL modems for home use.

Comment Re:Edge routers are expensive (Score 1) 85

I keep thinking that if an ISP really wanted to cut costs, they could proactively monitor their network for problems:

  • Provide the CPE preconfigured, at no additional cost to the customer. (Build the hardware cost into the price of service.)
  • Ensure that the CPE keeps a persistent capacitor-backed log across reboots. If the reboot was caused by anything other than the customer yanking the cord out of the wall or a power outage, send that failure info upstream. Upon multiple failures in less than a few weeks, assume that the customer's CPE is failing, and call the customer with a robocall to tell them that you're mailing them new CPE to improve the quality of their service.
  • Detect frequent disconnects and reconnects, monitor the line for high error rates, etc. and when you see this happening, treat it the same way you treat a CPE failure.
  • If the new hardware behaves the same way, silently schedule a truck roll to fix the lines.

If done correctly (and if clearly advertised by the ISP so that users would know that they didn't need to call to report any outages), it would eliminate the need for all customer service except for billing, and a decent online billing system could significantly reduce the need for that as well.
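
The decision logic in the list above is simple enough to fit in a few lines; here's a Python sketch with made-up thresholds and log fields, just to show the shape of it:

    from datetime import datetime, timedelta

    FAILURE_WINDOW = timedelta(weeks=3)   # assumed "a few weeks"
    MAX_FAILURES = 2                      # assumed threshold

    def is_benign(reboot_cause):
        # Assumed log field values; a real CPE log format would differ.
        return reboot_cause in ("power_cord_pulled", "power_outage")

    def next_action(failure_times, already_replaced_cpe):
        """Decide what the ISP should do after the latest failure report."""
        now = datetime.utcnow()
        recent = [t for t in failure_times if now - t < FAILURE_WINDOW]
        if len(recent) < MAX_FAILURES:
            return "keep monitoring"
        if already_replaced_cpe:
            # Same symptoms on fresh hardware point at the line, not the CPE.
            return "silently schedule a truck roll"
        return "robocall the customer and mail replacement CPE"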

Comment Re:Article shows fundamental lack of understanding (Score 2) 183

They won't see people switching to Swift uniformly. There is an enormous amount of existing code written in Objective-C, and programmers already know it and are comfortable with it. There are no tools for migrating code from Objective-C to Swift, much less the hodgepodge of mixed C, Objective-C, and sometimes C++ that quite frequently occurs in real-world apps, so for the foreseeable future you'd end up just adding Swift to your existing apps. That means you now have three or four languages mixed in one app instead of two or three, and one of them looks completely different from the others. I just don't see very many developers seriously considering adopting Swift without a robust translator tool in place.

I do, however, expect to see Swift become the language of choice for new programmers who are coming from scripting languages like Python and Ruby, because it is more like what they're used to. In the long term, they'll outnumber the Objective-C developers, but the big, expensive apps will still mostly be written in Objective-C, simply because most of them will be new versions of apps that already exist.

BTW, Apple never really treated Java like a first-class citizen; it was always a half-hearted bolt-on language. My gut says that they added Java support under the belief that more developers knew Java than Objective-C, so it would attract developers to the platform faster. In practice, however, almost nobody ever really adopted it, so it withered on the vine. Since then, they have shipped and subsequently dropped bridges for both Ruby and Python.

Any implication that Swift will supplant Objective-C like Objective-C supplanted Java requires revisionist history. Objective-C supplanted C, not Java. Java was never even in the running. And Objective-C still hasn't supplanted C. You'll still find tons of application code for OS X written in C even after nearly a decade and a half of Apple encouraging developers to move away from C and towards Objective-C. (Mind you, most of the UI code is in Objective-C at this point.) And that's when moving to a language that's close enough to C that you don't have to retrain all your programmers.

Compared with the C-to-Objective-C transition, any transition from Objective-C to Swift is likely to occur at a speed that can only be described as glacial. IMO, unless Apple miraculously makes the translation process nearly painless, they'll be lucky to get rid of Objective-C much before the dawn of the next century. I just don't see it happening, for precisely the same reason that nine years after Rails there are still a couple of orders of magnitude more websites built with PHP: if a language doesn't cause insane amounts of pain (e.g., Perl), people are reluctant to leave it and rewrite everything in another language just to obtain a marginal improvement in programmer comfort.

Comment Re: Apple not in my best interests either (Score 1) 183

No, they're saying Apple switched because GCC's core wasn't designed in a way that made it easy to extend the Objective-C support the way Apple wanted. And that could well be part of it; I'm not sure.

But I think a bigger reason was that Apple could use Clang to make Xcode better, whereas GCC's parsing libraries were A. pretty tightly coupled to GCC (making it technically difficult to reuse them) and B. licensed under a license that made linking them into non-open-source software problematic at best.

Comment Re:Will continue to be developed for other platfor (Score 2) 330

And you know what Mojang's opinion means at this point? Absolutely NOTHING. They can't tell their new owner to honor their intended promises, even if it were written into the deal. All they have to do is replace the boss with someone willing to change the company on Microsoft's behalf and POOF! It's happened with every other developer that's been bought out thus far that came out and said they were told/promised nothing would be changing.

Depends on how good their lawyers are. If they write a term into the contract saying that all rights revert to the original authors if the new owner breaks those promises, then yes, they can force the new owner to honor them.

Comment Re: +-2000 deaths? (Score 3, Interesting) 119

Ebola may not be easy to transmit, but it sure as heck isn't hard to transmit. It's not technically classified as airborne, but it is believed to spread via droplets (e.g., sneezes). There's a very, very fine line between the two.

And yes, I can provide citations if you'd like, but it's not like they're very hard to find with a Google search.
