
Comment Re:What's the big problem? (Score 1) 675

You don't get it?

Old way: Swipe takes 1 second, and put back in wallet. New Way: Insert card for 10 to 15 seconds. Remove card, and insert back in wallet.

Myopic perception bias is myopic.

If you compare the time it takes to physically move the plastic card along the reader slot to the time between inserting a chip-enabled card and removing it, you may be technically correct, but then you're guilty of observation bias. You're only observing the actions taking place on the customer's side.

The entire transaction, end to end, takes about the same time either way. What you're conveniently omitting is the wait on the cashier's side after the card has been swiped (on average, about ten seconds).

The difference is that you, the biased observer, are pinning that extra wait (after swiping) on the cashier. It's not his/her fault that the system takes time to clear the transaction. The same wait exists with chip-enabled cards, where the de facto requirement is that the card remain connected until the transaction is approved.
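If you want the arithmetic spelled out, here's a back-of-the-envelope version (the durations are the rough figures from this thread, not measurements):

```python
# Back-of-the-envelope comparison using the rough figures from this thread.
# All durations are illustrative assumptions, not measured data.

SWIPE_ACTION = 1       # seconds the card spends moving through the reader
CLEARING_WAIT = 10     # average authorization wait after a swipe (cashier's side)
CHIP_DWELL = (10, 15)  # seconds a chipped card stays inserted, clearing included

swipe_total = SWIPE_ACTION + CLEARING_WAIT
chip_total = CHIP_DWELL  # authorization happens while the card is connected

print(f"Swipe, end to end: ~{swipe_total}s")
print(f"Chip, end to end:  ~{chip_total[0]}-{chip_total[1]}s")
# Both land in the same ballpark; only *who appears to be waiting* changes.
```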

With your observer's bias, you have not only scapegoated millions of well-mannered cashiers, but also declared, with sheer arrogance, that watching a small screen for a few seconds is beneath your dignity. That's exactly what cashiers have been doing for decades. You, coward, are part of the problem.

Comment Re:wtf is this article (Score 1) 264

Apparently it's some apologism for Windows 10...

Really!? You're going there. That's like saying the sun is dying because it doesn't look as bright behind the tinted windows inside your car.

But taking you at your word, let's also say that your remark is apparently apologia for the paranoia cabal that supports Mr. Crust (as the ZDNet author dubs him) and his oh-so-loosely termed "research" claims.

A guy installs Windows 10 on a VM slice under Linux, blocks all LAN traffic and records the result. The only thing conclusive about that is the interdependency between modern PC platforms and the Internet... and that's all; something TFA makes incontrovertibly clear. In other news, water is wet and the sun will rise in the morning.
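For perspective, the whole "experiment" boils down to something like this minimal sketch (assuming a Linux host with scapy installed; the VM address is hypothetical). Note what it cannot tell you: what the traffic contains or why it's sent, which is precisely the hole in the "research":

```python
# Count where a Windows 10 VM phones home. Assumes a Linux host with scapy
# installed; 192.168.122.10 is a hypothetical VM address. This tallies
# destinations only -- it says nothing about payload or purpose.
from collections import Counter
from scapy.all import sniff, IP

destinations = Counter()

def tally(packet):
    if IP in packet:
        destinations[packet[IP].dst] += 1

# Capture 60 seconds of traffic originating from the VM.
sniff(filter="src host 192.168.122.10", prn=tally, timeout=60)

for host, count in destinations.most_common(10):
    print(f"{host}: {count} packets")
```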

For anyone who took 10 minutes to read TFA, the truth is plain to see. The claims of Mr. Crust are, firstly, trumped-up; secondly, wholly presumptuous, being based on highly circumstantial and incomplete data; and lastly, hyped-up pseudoscience masquerading as research. That's not being an apologist; it's being a realist. The real "test" here is done on the audience: to find those among us gullible enough to believe such rabble-rousing.

FYI: Record-low prices on hat-making material; Wal-Mart has 50 sq ft available for under four bucks.

Comment Re:You must be new here (Score 1) 1839

If you disagree with a comment, Post. A. Reply.

Indeed! That's the spirit of /. for you. It's the spirit vs. the status quo, and that's the most important struggle of them all.

Hunt that wumpus, embrace the snark! We will not have feels stifled, and the glory of earnest discussion should shine through!

Humorous rejoinders and snarky rebuttals are a hallmark of 'dotters, but not necessarily of the truth. The real value of the comment system lies in the increasingly rare earnest, unmoderated debate threads. Those threads need to be unearthed from beneath all the snarky-wit "fertilizer" under which they are so often buried.

The mod system was, is, and ever should be a method for visitors to find the earnest debates despite the status-quo rabble.

My votes (sketched in code below):

  1. The mod system eliminates down-vote scores in favor of 'flagging'; posts with "enough" flags would be reviewed for appropriate handling.
  2. Upvotes expand into more positive-oriented categories; e.g., "FunnyGood", "FunnyBad", "ROFLcopter" and "NiceTry".
  3. The mod cap for a single comment should now be 11. (because it goes to eleven)

If anyone dares suggest it go over 9,000... I salute you, then you will be shot.
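And here's the promised sketch of those rules; the class shape, the extra category names and the flag threshold are my own inventions for illustration:

```python
# Toy sketch of the proposed moderation rules above. Categories beyond the
# four named in my votes, and the review threshold, are invented here.
from dataclasses import dataclass, field

UPVOTE_CATEGORIES = {"Insightful", "Informative", "Interesting",
                     "FunnyGood", "FunnyBad", "ROFLcopter", "NiceTry"}
SCORE_CAP = 11             # rule 3: because it goes to eleven
FLAG_REVIEW_THRESHOLD = 5  # "enough" flags -- a made-up number

@dataclass
class Comment:
    score: int = 1
    flags: int = 0
    mods: list = field(default_factory=list)

    def upvote(self, category: str) -> None:
        """Rule 2: upvotes only, drawn from positive-oriented categories."""
        if category not in UPVOTE_CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self.mods.append(category)
        self.score = min(self.score + 1, SCORE_CAP)

    def flag(self) -> bool:
        """Rule 1: no down-votes; enough flags queue a comment for review."""
        self.flags += 1
        return self.flags >= FLAG_REVIEW_THRESHOLD
```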

Comment Again!? (Score 1) 210

Despite the OP's attempt to make this seem like the first of its kind in 40 years, there have been numerous (tragic) attempts at making D&D into an entertainment franchise.

That said, none of them were terribly successful, nor could they ever be. The longest-lived adaptation (IIRC) was the Saturday-morning animated version, aired 1983-1985. It wasn't terribly good at depicting the game, or even the genre; instead it simply became another podium for morality tales in the 1980s TV nannyscape. Remember, D&D was just the framework, the universe in which stories took place. There were hundreds of different stories, but none of them were actually called "Dungeons & Dragons"; they had their own titles, y'know. (e.g., Ravenloft and The Keep on the Borderlands, et al.)

Since WotC took over the IP, I've had high hopes that they would curate it more responsibly. To wit: bringing some of the more popular dungeon tales to life, and portraying each under its own chosen title (vs. relying on D&D brand recognition), thereby doing honor to the authors of those stories. I have yet to see any adaptation that truly does justice to Gygax's genius for writing adventures.

In that light, let me say that I hope never to see another movie, game or program titled "Dungeons & Dragons", but rather an interesting title, borne of the story itself, followed by the tagline: a Dungeons & Dragons adventure tale.

Comment Zero cake is a lie (Score 1) 1067

Courtesy of the logic of GLaDOS...

1. You have a cake.

2. You have to divide the cake to serve it.

3. Count the number of humans that will get cake. (set to 0)

4. Divide the cake into equal slices for each human.

Conclusion: The cake is a lie. Q.E.D.

Div/0 will always be "a lie", because even if you substitute an infinite value (the closest "irrational" answer) and return a result, the function becomes useless. Infinity is symbolic; it doesn't actually exist, because there's no rational way to express it, let alone apply it to a process. It's an endpoint, not a step.

tl;dr -- Stop using such reckless floating-point math and improve your exception handlers. (maybe even validate input... I know; shocking.) It's not that difficult when (or if) you plan your projects in advance.

Either that, or write your own function for division that traps div/0 and returns a zero for you, then substitute it in for every '/' in your code. Good luck debugging that.
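For the morbidly curious, that non-solution looks something like this (a minimal Python sketch; it runs, but note the caveat at the end):

```python
# Sketch of the "write your own division" non-solution described above.
def safe_div(numerator: float, denominator: float) -> float:
    """Divide, trapping div/0 and returning 0.0 instead of raising."""
    try:
        return numerator / denominator
    except ZeroDivisionError:
        return 0.0

cake = 1.0
humans = 0
print(safe_div(cake, humans))  # 0.0 -- zero cake per human; the cake is a lie

# Caveat: silently mapping x/0 to 0 hides the real bug (an empty guest
# list), which is why validating input beats patching the operator.
```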

Comment Re:reasons (Score 1) 327

The reasons to use PowerPoint are many, but the reasons against using PowerPoint are also many.

The tool is not at fault for bad workmanship, and that's all PP really is: a visual information tool.

Do meetings with PowerPoint suck rocks? I'm sure you know the answer, but it would be the same answer as "does the same sort of meeting with an overhead projector and printed slides suck rocks?"

Used effectively, it can floor a room and blow minds by the score. Used poorly, it can suck the life out of an entire campus.

There's nothing about PowerPoint that instantly succeeds or fails; the results are measured by the experience and showmanship of the presenter. Sadly, many presenters are lacking in showmanship, public-speaking experience, or both. What's even more pathetic, there are those who believe conducting PP meetings or lectures counts as actual public-speaking experience. PP slides are not a "script" or a teleprompter. They are merely the backdrop for what you have to say.

Banning PowerPoint is like banning multi-tools. You can't effectively build a house with a multi-tool, but you can try. Banning the tool for its misuses is not just putting the cart before the horse; it's putting the horse in the cart, then blaming the horse for being ineffective.

Comment In a word... (Score 1) 447

To answer TFP's question, which many seem to be avoiding, I have one word: yes.

The answer should have been 'yes' 15 years ago, when micro-scale video recording became commercially feasible. Nowadays, with almost countless SoC and autonomous microcontroller options married to multiple GB (possibly TB) of solid-state storage, all within the size of a deck of playing cards, I would have to say that video monitoring is not only feasible, it's imperative.

When we get news of a flight disturbance, what do we get to see? That's right: blurry, handheld camera-phone footage with muffled audio. Of course, any footage from the cabin is at the discretion of passengers, not the airline, which militates against any airline-run in-cabin monitoring. But perhaps we should think about it a different way.

How many gadgets are out there for our car windscreens, to monitor other drivers? A dozen? A hundred? These dash cams essentially represent the solution for any warranted video monitoring. They have long-term (i.e., per-trip) recording functionality, as well as constant-loop recording for capturing the unexpected. And these devices are typically the size that radar detectors were two decades ago!

Now, does this necessarily mean that we have to see inside the cockpit? No, it doesn't. Take that privacy argument and stow it.

Video monitoring could mean many things, such as the cockpit door exterior. (IMHO, a much more compelling angle when considering hijackings) It could also mean hull-exterior views, which could be quite valuable for take-off/landing mishaps. Rather than relying on modeling to visualize the attitude, speed and point of impact, it could all be right there on a screen for you.

In an aviation scenario, we just start with the functionality of the classic black-box device and evolve it to include video, solid-state storage and an automated distress feature that attempts to upload the last 2 minutes of recorded data to satellite. (for extra credit, make a monitoring algorithm that senses flight-path and altitude deviations for real-time alerts and warranted monitoring)
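A rough sketch of that evolved recorder, for the sake of argument; the sample rate, deviation threshold and uplink stub are all invented numbers, not avionics spec:

```python
# Minimal sketch of the evolved recorder described above: a rolling
# two-minute buffer plus a crude altitude-deviation alarm. All constants
# are illustrative assumptions.
from collections import deque

FPS = 10                    # assumed video/telemetry sample rate
BUFFER_SECONDS = 120        # "the last 2 minutes"
MAX_ALT_DROP_FT_PER_S = 60  # made-up deviation threshold

class FlightRecorder:
    def __init__(self):
        # Constant-loop storage: old frames fall off the back automatically.
        self.frames = deque(maxlen=FPS * BUFFER_SECONDS)
        self.last_altitude = None

    def record(self, frame: bytes, altitude_ft: float) -> None:
        self.frames.append((frame, altitude_ft))
        if self.last_altitude is not None:
            drop_rate = (self.last_altitude - altitude_ft) * FPS  # ft/s
            if drop_rate > MAX_ALT_DROP_FT_PER_S:
                self.distress()
        self.last_altitude = altitude_ft

    def distress(self) -> None:
        # Stub: a real system would push self.frames to a satellite uplink.
        print(f"ALERT: uploading last {len(self.frames)} frames")
```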

True, there may be limits to durability, but packing the system into a compact physical space already increases its survivability. I bet it could even fit inside a contemporary black-box chassis without much effort. It's anybody's guess why there hasn't been any significant retrofit of the classic Flight Data Recorder design, now that technology is more compact and survivable than when the program began in 1967. The current debate over 'deployable' recorder systems just seems silly. With the profits that airlines are making lately, it's horrifying to consider that one's final message to the world would come from 40-year-old tech.

If this Germanwings incident reveals anything, it's that the safeguards against mishaps are still in human hands, including the reporting of essential data. There's no technological solution for suicidal pilots, because experienced pilots know every manual override. (and can wield a roll of duct tape) Let's at least take the next step (looking at you, FAA) and start mining these mishaps for the valuable lessons they could teach future avionics, international regulation and corporate norms. Put some bloody cameras on that flight!

Comment You mean, IF the revolution comes. (Score 1) 516

In the past MS used http://iconfactory.com/

They did not use internal staff.

But the managers that approve it are to go first.

At least the folks at Icon Factory know a thing or two about iconography, which is as much of an exact science as UI design ever was: part pixel art, part language. As other 'dotters here have provided links to the historical iconography of Microsoft and other platforms as well, you can see the evolution of aesthetic choices: the playful isometric simplicity of BeOS, the monochromatic elegance of NeXT, and the neo-realism of Gnome.

Saying that the flat colors are a throwback to the primitive computer era (8/16 bit) is rather ignorant, simply because the color-palette choice back then wasn't a matter of preference so much as necessity. The engineers were put in charge of defining the color gamut based on just 16 or 256 'slots', and naturally they approached this algorithmically rather than aesthetically. That's why it took us 30 years to come up with color rendering that could represent natural/earth/skin tones: there were all these mathematical gaps in the subtle spectra of blues, browns and greens.

In that sense, I suppose the selection of flat saturated colors is indeed ironic in the age of hyper-realistic imagery. I applaud an aesthetic choice for elegant iconography; however, the execution can be equally delightful or disastrous.
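To illustrate that algorithmic-palette point: here's what happens when you slice 256 slots into the classic uniform 6x6x6 RGB cube (216 colors). Saturated primaries land dead-on; subtle earth and skin tones miss by a wide margin. The sample tones are arbitrary picks for demonstration, not any platform's actual palette:

```python
# An engineer's "algorithmic" palette: evenly spaced RGB levels, chosen by
# arithmetic rather than aesthetics. Measures how far sample tones fall
# from their nearest palette entry (Euclidean distance in RGB).
from itertools import product

LEVELS = [0, 51, 102, 153, 204, 255]       # evenly spaced, not perceptual
palette = list(product(LEVELS, repeat=3))  # 6*6*6 = 216 uniform colors

def nearest_error(color):
    return min(
        sum((a - b) ** 2 for a, b in zip(color, p)) ** 0.5
        for p in palette
    )

for name, tone in [("skin tone", (224, 172, 105)),
                   ("earth brown", (139, 69, 19)),
                   ("pure red", (255, 0, 0))]:
    print(f"{name}: nearest palette color is {nearest_error(tone):.1f} units away")
# Pure red scores 0.0; the natural tones land ~28 units from any slot.
```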

While I agree in part with the dissent over the design choices, I don't agree that TFA is representative of any significant "majority". Let's be real here: the headline reads, "icons look like a bad joke." Do you really think the contributing readers would be unbiased? You might as well hang a big sign out front: "MS-bashing Trolls Welcome!" ...majority indeed.

But here's the catch. It's hard to have a serious discussion about UI choices even in this forum, one so inclined to conflate the design with every poor PR move, questionable politic and troubled chapter of the legacy platform, all of which makes it impossible to take a step back and appreciate the design choices for what they are. It's also important to add that UI choices aren't just about making things artful, but mostly about making them meaningful. These mini-pictures are purpose-made to fall into the background, rather than be their own eye-candy. (that's what custom icon sets are for)

So here, I'll take a stab at it. This icon gallery clearly perpetuates the traditional Windows-brand "manila folder" trope as a foundation. With flat colors and angled lines, it attempts a three-dimensional appearance, which arguably still looks very 'flat', with or without comparison to its predecessors. While folders do not make up 100% of the new icon set, they establish the overall paradigm and 'look' of the interface.

I'm not convinced that the non-folder icons are even complete, since most of them still resemble Aero's photo-realistic set of devices. The icons that notably reflect the new art style are "My Computer" and "Network", which get a simple line-art treatment. This is not consistent with the folder paradigm, not only because they don't resemble folders, but because these images use boundary lines to define shapes, rather than flat colors.

Overall, it's rather inelegant and poorly executed. The folders use subtle boundary lines, but inconsistently, and the line doesn't diminish on the smaller icons, making the left face of the folder look awkward, like a backwards "L" from a varsity jacket. Again, we see that the Redmond workshop has neglected the beauty of scale and centers its model on a single 'ideal' size, whatever size that may be, which also belies an underlying framework that is, yet again, bullishly ignorant of modern, precision rendering.

As I'm running Win10-TP myself, I can also see that File Explorer attempts to express folder contents as foreground icons over the open-folder trope as a background. It renders another closed-folder icon atop the background if a sub-folder is present; a poor choice, since the flat colors run together and make the background folder look strange. And when the contents throw the object-recognition algorithm for a loop, the second foreground icon is a square outline; a tell-tale sign that MS still hasn't learned its lesson about meaningful representations. If this was an in-house job, then I strongly recommend outsourcing it to an expert group once again, all the way down to the visual rendering engine. Aero was dumped, rather than evolved; another poor choice.

And there you have it: an attempt at a serious discussion about the merits and properties of the new icons that isn't just a splattered statement of subjectivity. I welcome anyone to contribute, and I hope (against hope) that the sincerity of this discussion may be preserved.

Comment Who defines 'patentability' anyway? (Score 1) 43

So far, it's the patent owners and warchest protectors who seem to be driving the definition of what can and cannot be patented in the digital realm. This should be reversed; there should be an international (or universal) standard definition that an applicant must fulfill before a filing can even be considered for legal protection.

Just spit-balling here, but maybe it should be a rule of threes: a project must demonstrate that it leverages the three parts of digital technology (the hardware, the software and the network). Within each of those, there must be three distinct techniques that separate it from common operations and, in each technique, three uncommon modules that can be considered proprietary in nature and therefore protected as "trade secrets". So, in total, we have a basis for patentability that covers the basic facets of digital products, requires applicants to define how they set themselves apart, and lastly requires that they specify what makes their work unique at the code and/or API level: nine points of uniqueness in each digital facet. No 'black box' definitions either; every patent must encompass and explain the concept that makes it... well, patentable.
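For clarity, here's the rule of threes as a toy validator; the data shape is my own invention, but the 3x3x3 logic is as described above:

```python
# Toy validator for the "rule of threes" sketched above. An application
# maps each digital facet to {technique_name: [proprietary_module, ...]}.
FACETS = ("hardware", "software", "network")

def is_patentable(application: dict) -> bool:
    for facet in FACETS:
        techniques = application.get(facet, {})
        if len(techniques) < 3:
            return False  # must show three distinct techniques per facet
        if any(len(mods) < 3 for mods in techniques.values()):
            return False  # each technique needs three proprietary modules
    return True  # nine points of uniqueness per facet, no black boxes
```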

This not only provides a structure for burden-of-proof arguments (currently non-existent, apart from the ruling described in the OP), but also creates the need to distinguish one's work from what platform developers and shared-library contributors can claim as prior art or common practice. More importantly, it eschews the petty bickering over single-factor patents; the "swipe to unlock" or "presentation as a square tile with 10% rounded corners" or "putting a virtual button in the corner of the screen to select a program" sort of nonsense.

Comment Re:Liability (Score 1) 474

Indeed. The Xfinity WiFi service is not like a public hotspot; it's app-enabled and otherwise walled off.

The app asks you to log in with Xfinity HSI credentials, connects to a geographic database and shows 'coverage' on a small map. When you want to connect to a hotspot, the app coordinates the security automatically, kinda like a push-button pairing feature on a router.

If you don't have the app, these hotspots look like any other secured private WAPs.

Despite all that, it's an arrogant and draconian move to just switch on customer equipment to provide a service. I believe Zordak made the point that the gateway/router devices are leased to customers but remain essentially Comcast property. To me, that means they can take control to provide enhanced services, like advanced port forwarding, traffic balancing and delivering QoS metrics back to their root network. All that makes sense, right?

What doesn't make sense is basically hijacking the device to provide a subscription-based service for other customers. If I have one of these routers, then I expect it to serve the purpose of fulfilling my service subscription, not someone else's. Providing such a service should be at the option of the subscriber, not the default stance with an opt-out procedure. Organizing a majority of subscribers to opt out of this service clause would surely pressure Xfinity to re-think the strategy, but good luck getting the attention of all 50,000 households. (or even half of them)

A responsible, progressive and fair-minded company would provide incentives for becoming part of its service infrastructure. Monthly service discounts would be a good start, and might even improve Xfinity's reputation in the process. Let's say: the more isolated your WAP is on the Xfinity map (thereby filling a wide gap in coverage), the bigger the homeowner's discount.
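Something like this, to make it concrete (the rates and the distance scale are pulled out of thin air for illustration):

```python
# Toy version of the incentive above: the farther your hotspot sits from
# its nearest neighbor on the coverage map, the bigger your discount.
import math

def nearest_gap_km(wap, others):
    """Distance (km) from this WAP to the nearest other hotspot."""
    return min(math.dist(wap, o) for o in others)

def monthly_discount(wap, others, rate_per_km=2.0, cap=25.0):
    """Discount in dollars, capped, proportional to the coverage gap filled."""
    return min(cap, rate_per_km * nearest_gap_km(wap, others))

# Example: a hotspot 6 km from its nearest neighbor earns $12 off per month.
print(monthly_discount((0.0, 0.0), [(6.0, 0.0), (8.0, 3.0)]))  # 12.0
```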

In this day and age, it takes a credible competitor to enact change in the marketplace; so we're looking at you, FIOS, DSL and Google Fiber. Do it better!

Comment The rest of the story (Score 1) 711

Has anyone thought of the percentage of iPhone adopters who switch to Android? Those numbers are conspicuously absent. I doubt they did any follow-up on iPhone "consumer corrections" to see how many later dropped the iPhone and went back.

And they say Microsoft "drinks the Kool-Aid" on its own products. It seems both camps have a strange brew now; however, in this respect, Apple has some serious catching up to do.

If the rumors are true, then we'll get to see who can make the better "geez I feel like I'm going to break this thing it's so thin" device for 2015.

Still waiting for the bluetooth, bio-powered, wetware interface cartilage implant accessory. (stereo, please)
