"tepples" - you get one star for being more pedantic than I am.
Boy, this Peeple app is fun!
"tepples" - you get one star for being more pedantic than I am.
Boy, this Peeple app is fun!
The difference there is that your driving record could be based on verifiable facts taken from the public record. "You had an accident in 2013 where at trial you were found 50% at fault." "In 2012 you pled guilty to driving 75 on a 55 road when you paid your traffic ticket." Peeple scores, by contrast, are just arbitrary digits, assigned out of spite, fear, hate, love, admiration, or whatever. Worse, they might be digits that are bought and paid for by the account owner (hire a sock puppet army to boost your score) or the result of an attack (hire a sock puppet army to slag someone because they cheated on your sister, or because they dress funny, or simply because you're a sociopathic troll.)
I wonder if they are going to allow anonymous reviews? The trolls will be out in force.
Even if they don't, the sock puppet tsunami that results will make even Slashdot's moderation system look honest by comparison.
The merchants here change readers every three years or so.
That's because the terminals are required to be more and more secure to protect the mag stripe data, and their older terminals were out of compliance with the standards. This has been a massive exercise in kicking the can down the road.
With chip cards, the game changes fundamentally when security moves into the chip. But until the whole ecosystem of cards, mag stripes, and web entry of account numbers gets fully converted to EMV, the data passed out of the chip can still be stolen and abused at some of the weakest links. Hopefully the "liability shift" will convince these weakest links they need to convert to chip readers before they get stung with crippling losses because they allowed their systems to be used for fraud.
The problem is that there are six million merchants out there with mag stripe readers, and nobody can force them all to change to EMV overnight. It took Europe four years to get even to 90% adoption rates. Until almost all retailers accept chip cards, the crappy mag stripes are required for backward compatibility. And if we say "this does nothing", that's wrong. It takes us one step further down a path we need to fully traverse.
They all communicate through NFC. The differences are in the back end payment systems. To the consumer, there is no real difference except in what cards are supported or how their particular device works. Apple made Apple Pay easy to use on their phones because they use biometrics (fingerprints), and easy on the Watch because you have to log in only once, when you put it on; it auto-locks when you take it off. We'll see how easy Samsung made it: do you have to enter a PIN every time, or do they have some other magic?
Having seen Waterfall inaction, I find that story sadly believable. The moment you start siloing what should be a single person's responsibility, the turf wars emerge to amplify the chaos. And a developer's responsibility should encompass everything from testing through coding to design.
Presumably you are using modern tools to compile and build your software, manage source code, and manage your project work. Many of these tools will either incorporate or integrate with bug tracking software and testing frameworks. If there's a native bug tracker available, select it. If there's a native test framework available, use it.
What you need is a least-friction option, where testers, analysts, and developers can all see the bugs, write up the bugs, test the bugs, and fix the bugs. You don't need "The Most Advanced Framework Available Today", you don't need "The Best Test Tracking and Reporting Software Ever Produced", you need a solution that works well for all the people involved. Having a third party tool where the developer has to stop working, log in to the bug tracker, read the bug details, switch back to the development environment, make some changes, switch back to the bug tracker, write up the findings, switch to the test framework, execute a test, and switch back yet again to record the results is the opposite of least friction, and tools that painful simply won't get used.
Here's the problem. Some organizations say "hey, let's evaluate and buy the bestest test software out there" without giving a thought to the developers. So the QA department runs off on their own, buys a tool, and starts building tests in it that the developers can't run. If the developers can't run the tests, they don't know if they're fixing the problems correctly, so they waste tons of time. Worse, if a developer makes a change that breaks some test, they won't know until that result is reported to them, possibly days, weeks, or even months later, depending on your QA cycles. During the intervening time, the developer continues to write code based on their original faulty change, creating technical dependencies on what may be a completely flawed base assumption. When the test finally reveals the flaw, the developer's choices are limited to: A) rewrite everything according to the better architecture uncovered by the flaw, or B) make a scabby patch so the test passes. If you choose A, the software's release will be expensively delayed. If you choose B once, you'll likely choose it again, you're incurring technical debt, all your software is likely to be crap, and no good developers will want to work for you. The correct answer is of course C) don't produce tests the developers can't run themselves on demand, or tests that aren't automated as a part of the build process.
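Option C can be made concrete with a minimal sketch (every name below is hypothetical, not from any real product): one small test file that a developer can run on demand at their desk, and that the build server can run automatically, so nobody waits weeks for QA to report a break.

```python
# test_discount.py -- runnable on demand by a developer ("python test_discount.py")
# or automatically by the build server. All names here are illustrative.

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def run_tests():
    # Happy path: 25% off 100.00 is 75.00.
    assert apply_discount(100.0, 25) == 75.0
    # Zero discount leaves the price unchanged.
    assert apply_discount(19.99, 0) == 19.99
    # Bad input is rejected instead of silently mispricing.
    try:
        apply_discount(50.0, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for percent > 100")

if __name__ == "__main__":
    run_tests()
    print("all tests passed")
```

The point isn't the framework; it's that the same command works identically for QA, for the developer mid-fix, and for the automated build, so a breaking change surfaces in minutes instead of months.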
Internally, their version numbers are still consistent, but the fact that their product names include numbers (sometimes related to the date, sometimes to the internal version number, and sometimes to the previous release) makes for a confusing mess. Windows 3.1, Windows 95, Windows 98, and Windows ME all ran on top of DOS, not the NT kernel. They were essentially just windowing systems, and none of them deserved to be called an operating system. Underneath, they ran MS-DOS, which was a more or less compatible version of PC-DOS, and they were all rooted in 16 bit code (95, 98, and ME bolted 32 bit pieces onto that base).
Windows NT 3.1 was Microsoft's first true preemptive-multitasking OS, followed by NT 3.5, NT 3.51, and NT 4.0, which was followed by Windows 2000 (NT 5.0). XP was Windows NT 5.1, and was when the Windows NT kernel finally went mainstream to the home users (and introduced most of them to a 32 bit OS.) Vista was NT 6.0. Windows 7 was NT 6.1. Windows 8 was NT 6.2, and Windows 8.1 was NT 6.3. Starting with Windows 10, they actually renumbered the internals, so it reports itself as NT 10.0.
On top of all of those, you can try to overlay their server versions. At least they're all named according to the year of their intended release.
As marketing is clearly in charge of OS naming at Microsoft, don't look for consistency in future versions. The only thing you can count on is the unpredictability of their naming schemes. Their next release is equally likely to be called Windows 7331, Windows Forever, Windows 64, Windows 11, or Windows 2018. Internally, it'll probably still just report itself as NT 10.1.
Digital movies can be watermarked all the way down to the exact screening. I remember seeing a movie that had a few flashes of birds, the number and pattern of the birds was a fingerprint identifying the theater and showing. But not all schemes are that ugly or visible. All they have to do is insert and remove a few frames from a few scenes, altering the durations slightly, and they have a non-viewer-disruptive unique fingerprint of the showing that survives anything -- including cam-rips.
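A toy sketch of that kind of scheme (purely illustrative; no studio's actual algorithm is described here): hash the showing ID into a handful of tiny per-scene frame insertions, then identify a leaked copy by measuring its scene durations against the reference cut.

```python
# Illustrative forensic-watermark sketch: encode a showing ID as small
# per-scene duration tweaks. All constants and IDs here are made up.
import hashlib

CARRIER_SCENES = 16    # scenes chosen to carry the watermark
MAX_EXTRA_FRAMES = 3   # 0-3 extra frames per scene: 2 bits each, 32 bits total

def fingerprint(showing_id):
    """Map a showing ID to a list of extra-frame counts, one per carrier scene."""
    digest = hashlib.sha256(showing_id.encode("utf-8")).digest()
    return [digest[i] % (MAX_EXTRA_FRAMES + 1) for i in range(CARRIER_SCENES)]

def identify(measured_tweaks, candidate_ids):
    """Match measured per-scene duration deltas (in frames) to a known showing."""
    for sid in candidate_ids:
        if fingerprint(sid) == measured_tweaks:
            return sid
    return None
```

At 24 fps, three extra frames is an eighth of a second, invisible to the audience but easy to measure in a leaked copy, and because the mark lives in scene durations rather than pixels, re-encoding or pointing a phone camera at the screen doesn't wash it out.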
Then you're going to the wrong theaters. I wouldn't give places like that a second glance, let alone cash. If you walk in and it's disgusting, go get your money back and leave.
I can't imagine that many people will eschew going to the movies for a smartphone camera recording. Maybe for screeners and Telecine rips but cam versions? Really?
That's because you are ignoring those who live below the poverty line. You're thinking of wealthy people who can afford to buy their own food; people who could afford to see the movie in the theaters. For them, $5 invested in a cam-ripped DVD will serve as the night's entertainment for quite a few people, and it will likely serve as a babysitter for a couple of weeks. Or it might be part of a social mask, hiding their finances from others so they can go to work the next day and say "yeah, I saw the new movie last night, let's talk about it at lunch. What, did you think I'm so poor I can't afford to go to the movies?"
You're right in that removing a single guy with a camera (even removing one per screen per showing) isn't worth the expense of the goggles. This is obviously just a ploy to exploit the deterrent effect. Use night vision goggles, bust a dozen people recording the movies on their cell phones, and fine them a crazy amount like a million pounds each. Spread the news of the arrests, and of the insane penalties. Every few days when the public needs reminding, publicise another bust.
It doesn't even matter if the fines are later overturned by a judge, and reduced dramatically. The hope is that this makes many fathers say to their sons "put that phone away you stupid git; we can't afford a million pound fine!"
That reminds me of a friend. He read so much about the harmful effects of smoking that he gave up reading.
Actually, why do they need a physical facility at all? It's not like they go out and interact with the hardware. All they need is to ship a fancy chair to the Netherlands or wherever, and train them virtually.
"You must have an IQ of at least half a million." -- Popeye