Why does a game like this need to have access to all your contacts? I'll load it when I can block this kind of nonsense.
As someone who was part of the team that pioneered iris recognition in the late 80s, I can say that this is entirely the fault of the current software. We had various techniques implemented from the start that would prevent this kind of problem: controlling multiple IR LEDs to produce a changing specularity pattern, which guaranteed the eye was shaped as expected and rejected all flat copies; checking for the normal pulsation of the pupil, which rejected dead eyes; and various other checks, like verification of facial features (that there were two eyes, etc.) and checking for proper occlusion by the eyelids. Even with only a few captures, our testing never showed this kind of issue (and we did try perfect eye replication). I've heard this kind of claim from the beginning; nothing new here. Again, we implemented all of these features in our original work, but implementors felt they should not be included in their products.
You hit the nail on the head. If it weren't for a $200 "loyal customer" incentive I wouldn't have bought my Priv. That said, I'm not sorry I did. BB10 was pretty good, not nearly as synergistic as WebOS, but very usable and a far cry better than Android's jump-from-app-to-app approach. Unfortunately, there was a dearth of apps, and the Android emulator only partially filled the bill. The Hub implementation on Android is getting better with each release, but it is still a far cry from the experience on BB10.
Part of AT&T and Verizon's problem selling this unit is that they take way too long to update. The Priv's software was pretty immature when it was released but got better (smoother and faster) with each update. Non-carrier phones have had Marshmallow for months, and there is still no sign of it for the carrier-locked phones.
One of the things that may scare off people is the locked bootloader. If Blackberry stops supporting the Priv then there is no way to load an alternative OS. This is a far cry from my WebOS Pre 3, which has an active homebrew community making sure that as standards change (carrier APIs, Google APIs, etc.) the phone continues to be a winner. If it weren't for the lack of modern hardware support (LTE, etc.) I would keep rocking with WebOS.
I've been working at home on and off for 35 years (mostly on). I've been very successful at making my work-at-home experience both productive and pleasurable. When I start working I get "in the zone" and produce high-quality work in short order. Then I tried to put together a team, each member working at home, to do development, QA, and documentation for various projects. Documentation was the only piece that I could claim worked. I found that, left on their own, my development team became unproductive and the QA team drifted away from the goals I specified and documented. I ended up doing much more micro-managing than I imagined to keep the team productive and focused. My productivity dropped dramatically, and the quality of my work suffered from all the interruptions.
As for finding work to do at home, I ended up doing it by circumstance. The company I worked for shut their doors at a really bad economic time. I started a company to develop software products, but ended up mostly consulting and designing hardware and software under contract to keep bread on the table. I developed a reputation for quality work, so when a former client started up a new company, he didn't balk at my request to continue working at home from across the country.
The hardest part of working at home is training your family that you shouldn't be disturbed during work hours. I don't know if I would entertain someone working for me at home again unless I saw the same commitment that I have. The worst part of working at home is the isolation from your colleagues and co-workers. I think my company keeps my visits to a minimum because when I do come in, I try to make up for all the interaction I missed.
I hung onto my WebOS Pre 3 phone until I could no longer find any more Verizon devices on ebay (for the last one, I cannibalized two broken ones into one working one). For the last few years I was trying to find a suitable replacement. I tried iOS, Android, and Windows phones, but none could match the elegance of WebOS (yes, I know that they all stole many features from it over the years). I ended up with a BB10 device, which provides a well-thought-out design. It feels like it might be what WebOS would be if development hadn't stopped. Its "Just Type" implementation is superb. BB10 seems to be the same whipping boy in the media that helped kill WebOS.
Personally, I think this is a rumor due to the "Knox" security created by Blackberry for Samsung Android phones. How many times has the media falsely reported that Blackberry was about to be bought by Samsung, Microsoft, etc.? John Chen has shown vision and grace in dealing with the detractors, even when they take things he says out of context and make him look like he's giving up on Blackberry. He's not afraid to push the envelope on phone designs; just look at the Passport. I'm hoping that the slider keyboard phone that comes out this year will be a Pre 3 killer and available on Verizon. It may be enough for me to leave Verizon if their customer sales and support bad-mouth it like they always do for anything that isn't Android or Apple.
If you like your Android or Apple phone then fine, but don't bash something just because it's not mainstream. I can guarantee that Blackberry phone and BB10 features are already being copied as we speak.
Kurt Vonnegut wrote about just such an event in a short story in the book "Welcome to the Monkey House". In that story he suggested that families would be confined to living together in a single house, with pecking order dependent upon age ranking. The eldest got to pick what to watch, got to eat first, etc. In this story one of the family members decides to water down the elder's anti-aging medicine so he would age and die. It has a strange and interesting twist at the end so I won't spoil it.
Thanks for the information. I spent a little time reading about this. However, at some point the model predictions have to be matched with real-world results. This becomes difficult because climate models predict long-term trends, not actual short-term results. This opens them to criticism and denial claims. It's a tough sell to people without a technical math and science background.
Anyone who develops simulation models will tell you there are tradeoffs and unknowns that cause errors. The real issue is how significant the errors are. Weather models fall apart quickly as the predicted time frame gets large. Does anyone trust a weather prediction for several weeks away? Global warming predictions are in the same vein because there are too many possible variations to the system. What the models can tell us is that, given the things we know about, trends emerge.
What I find amazing is that people who don't understand the limitations of the models will take them as fact or fiction when they are somewhere in between. Because one weather model accurately predicted a hurricane when all the others missed it, everyone started using that model. The next hurricane was completely missed by that model. Whoops!
Personally, I believe that there is substantial empirical evidence of GW even if the model predictions are off.
I'm very confused. Wasn't Carly Fiorina an instrument of HP's down-slide with her involvement in the "pretexting" scandal, where private investigators were hired to spy on the other board members? How soon we forget. It was a similar situation with RCA's board near its end that pushed the decision to sell to GE.
After getting my BSEE I began my career designing integrated circuits. I soon started writing software to aid me in design and then migrated into the in-house design automation software group, working on projects such as gate-level simulation and circuit synthesis. I tried to get into the computer science program for my master's at the big local university. I was told flat out to forget it, as EEs didn't have the necessary background to get into the program. I then went to another school where I completed master's degrees in Biomedical Engineering, Electrical Engineering, and Computer Science. It's let me work productively with Physicists, Mathematicians, Engineers, and Computer Scientists. There is room for all to coexist and learn from each other, but experience has made me skeptical of pretty much anything my co-workers say until I do the research myself. That skepticism has served me well throughout my career.
I think that Bill's generalization should be taken with a grain of salt until actual data supports his suppositions.
It is possible. With LG getting out of the plasma market, I found a new 60" one for $400. My only complaint is that LG has always been stingy with their inputs. This one only has one HDMI input.
A similar story was told to me about Joe Weisbecker when he was working in the RCA research laboratories. He came to management with an idea for a general-purpose video game system. After rejection, he built it anyway in his garage and called it FRED (something like Fun, Recreational, Educational Device).
When microprocessors started taking off, management came back to Joe and turned FRED into its first microprocessor, the 1801, and RCA created its first video game system, the Studio. The 1800 family had a very intriguing architecture. It had 16 general-purpose registers. And by general purpose, I mean that you had to specify which one would be the program pointer, which one would be the stack pointer, and so on. You could change them at will in your program, so you could switch the program pointer register to make a subroutine call with virtually no overhead, as long as the last subroutine instruction put it back to the calling procedure's pointer. Putting a value in the accumulator automatically set the status flags. It took me many hours to make my first 8008 program work, since I was expecting the "zero" flag to be set when I loaded the accumulator with a zero value; silly me. It also had an instruction pipe, so almost every instruction took exactly 8 clock cycles (long ones took 12). This made it trivial to figure out how long your program would take (just count the instructions) or to write UART functionality. It was a perfect design for a microcontroller. The big drawback was cost, as it was fabricated in SOS CMOS so it could be radiation-resistant for satellite applications.
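With that fixed-cycle timing, estimating how long a straight-line routine takes really is just counting instructions. A minimal sketch using the 8- and 12-cycle figures quoted above (the clock frequency is an illustrative assumption, not a quoted spec):

```python
def run_time_us(n_short, n_long, clock_hz=1_000_000):
    """Run time (microseconds) of a straight-line 1800-series routine,
    assuming the fixed timing described above: 8 clock cycles per
    normal instruction, 12 per "long" instruction.  clock_hz is an
    illustrative assumption."""
    cycles = n_short * 8 + n_long * 12
    return cycles / clock_hz * 1e6


# A routine of 10 normal + 2 long instructions at a 1 MHz clock:
print(run_time_us(10, 2))  # -> 104.0 microseconds
```

The same arithmetic, run in reverse, is what let you bit-bang a UART: pick an instruction sequence whose cycle count matches one bit period at your baud rate.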
Joe was an interesting character. I have a book he wrote that describes how computers work by using pennies.
The term to google is "super-resolution". Unfortunately, I wrote this a long time ago, and most of the techniques were based on image pyramid processing, which my employer patented. They used this in a consumer product back in the early 90s from their spinoff company (now defunct) called VideoBrush. Besides super-resolution, it did real-time video stitching to capture everything from panoramas to high-resolution whiteboard captures using just a handheld camera (no tripod necessary). Pyramid processing allowed real-time, high-accuracy alignment of images with at least 10% overlap on a consumer PC.
That said, the technique is pretty straightforward:
* Capture a number of images that overlap the region of interest.
* Align the images using the appropriate degrees of freedom (Affine should work fine here) to sub-pixel accuracy.
* Merge the aligned images. Basically, at each upscaled pixel location, average the values from the aligned images.
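The steps above can be sketched as a simple "shift-and-add" merge. This is a deliberately simplified version: it assumes the alignment step has already recovered the sub-pixel offsets, models only pure translation (not the full affine alignment suggested above), and snaps each low-res sample to the nearest high-res cell instead of properly warping:

```python
import numpy as np

def shift_and_add_sr(images, offsets, scale):
    """Merge low-res images with known sub-pixel (dy, dx) offsets into
    one high-res grid: place each sample at its nearest upscaled cell,
    then average wherever samples accumulate."""
    h, w = images[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for img, (dy, dx) in zip(images, offsets):
        # Nearest high-res cell for each low-res sample position.
        ys = np.clip(np.arange(h) * scale + round(dy * scale), 0, h * scale - 1)
        xs = np.clip(np.arange(w) * scale + round(dx * scale), 0, w * scale - 1)
        acc[np.ix_(ys, xs)] += img
        cnt[np.ix_(ys, xs)] += 1
    cnt[cnt == 0] = 1  # uncovered cells stay 0; interpolate in practice
    return acc / cnt
```

With two 4x4 frames offset by half a pixel and `scale=2`, each frame lands on its own set of high-res cells, which is exactly where the extra resolution comes from. A real pipeline would also fill the uncovered cells by interpolation and use many more frames.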
If you pick up something with reasonable video resolution that can do I-frame-only capture, then you can use multiple images to produce a super-resolution still. The premise is easy: multiple images will not cover the exact same pixel positions (unless the drone is affixed to a stationary point). You can use this fact to merge multiple images into a single one with much higher resolution than any of the individual images. The more images you can overlay, the more resolution you can squeeze out.
The trick is to have good alignment and warping algorithms to do the overlays. I've done this for an employer in my previous life with impressive results.
For your particular scenario, iris recognition seems to be the most viable option. Iris recognition is very fast and accurate and will not require removing gloves, etc.
Iris scans are much more reliable than fingerprints. However, they don't come without issues. The capture algorithm must include:
* Dealing with occlusions. Either the top or bottom of the iris is usually occluded depending on racial origins.
* Dealing with spoofing. For this, a single snapshot is not enough; a sequence (video) is needed in order to check for the pupil pulsations that indicate a live eye. In addition, you need to do spherical-eye checks so you know you're not looking at a projection. The best system I worked on used random flashes of IR illumination to cause specularities on the surface of the eye. This also aided in locating and positioning the eye for these checks.
* Dealing with eye coverings. Glasses and shields are a problem since they can distort the iris and can reduce the effectiveness of spoofing detection.
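The pupil-pulsation liveness cue above can be sketched very simply: a live pupil oscillates continually in size ("hippus"), while a printed iris or a frozen replay gives a near-constant radius across frames. This is a toy illustration, not a production check; the frame count and amplitude threshold are illustrative assumptions, and a real system would combine this with the specularity and sphericity checks:

```python
import numpy as np

def looks_alive(pupil_radii_px, min_rel_amplitude=0.02):
    """Crude liveness cue from per-frame pupil radius estimates:
    require a minimum relative oscillation amplitude over the clip.
    Thresholds are illustrative assumptions only."""
    r = np.asarray(pupil_radii_px, dtype=float)
    if r.size < 10:
        return False  # too few frames to judge
    rel_amp = (r.max() - r.min()) / r.mean()
    return bool(rel_amp >= min_rel_amplitude)
```

In practice the pupil radius would come from the iris segmentation stage the recognizer already runs, so this check costs almost nothing extra.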
Wasn't there something about a Pascal programmer knowing the value of everything and the Wirth of nothing?