


Comment Again!? (Score 1) 210

Despite the OP's attempt to make this seem like the first of its kind in 40 years, there have been numerous (tragic) attempts at making D&D into an entertainment franchise.

That said, none of them were terribly successful, nor could they ever be. The longest-lived adaptation (IIRC) was the Saturday-morning animated version, which aired 1983-1985. It wasn't terribly good at depicting the game, or even the genre, but instead simply became another podium for tales of morality in the 1980s TV nannyscape. Remember, D&D was just the framework, the universe in which stories took place. There were hundreds of different stories, but none of them were actually called "Dungeons & Dragons"; they had their own titles, y'know. (e.g., Ravenloft and The Keep on the Borderlands, et al.)

Since WotC took over the IP, I've had high hopes that they would curate it with more responsibility. To wit, bringing some of the more popular dungeon tales to life, and portraying each under their own chosen title (vs. relying on D&D brand recognition) thereby doing honor to the authors of those stories. I have yet to see any adaptation that truly does justice to Gygax's true genius for writing adventures.

In that light, let me say that I hope to never see another movie, game or program titled "Dungeons & Dragons", but rather an interesting title, born of the story itself, followed by the tagline: a Dungeons & Dragons adventure tale.

Comment Zero cake is a lie (Score 1) 1067

Courtesy of the logic of GLaDOS...

1. You have a cake.

2. You have to divide the cake to serve it.

3. Count the number of humans that will get cake. (set to 0)

4. Divide the cake into equal slices for each human.

Conclusion: The cake is a lie. Q. E. D.

Div/0 will always be "a lie", because even if you do substitute an infinite value (the closest "irrational" answer) and return a result, the function becomes useless. Infinity is symbolic; it doesn't actually exist, because there's no rational way to express it, let alone apply it to a process. It's an endpoint, not a step.

tl;dr -- Stop using such reckless floating-point math and improve your exception handlers. (maybe even validate input... I know; shocking.) It's not that difficult when (or if) you plan your projects in advance.

Either that, or write your own function for division that traps div/0 and returns a zero for you, then substitute it in for every '/' in your code. Good luck debugging that.
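A minimal sketch of both approaches, in Python. The function names and the cake example are mine, not from any standard library; this is just to show why the validate-your-input route beats the trap-and-return-zero route:

```python
def safe_div(numerator, denominator, fallback=0.0):
    """Divide, but trap div/0 and return a fallback instead of raising.

    Substituting this for every '/' hides the error rather than fixing
    it -- which is exactly why debugging it later is so painful.
    """
    try:
        return numerator / denominator
    except ZeroDivisionError:
        return fallback

def slices_per_human(cake_slices, humans):
    """The better fix: validate input up front and fail loudly."""
    if humans <= 0:
        raise ValueError("no humans to serve; the cake is a lie")
    return cake_slices / humans
```

With `safe_div`, GLaDOS's zero-human cake division quietly returns 0 and the bug marches on; with `slices_per_human`, step 3 of her proof blows up immediately, which is what you want.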

Comment Re:reasons (Score 1) 327

The reasons to use PowerPoint are many, but the reasons against using PowerPoint are also many.

The tool is not at fault for bad workmanship, and that's all PP really is: a visual information tool.

Do meetings with PowerPoint suck rocks? I'm sure you know the answer, but it would be the same answer as "does the same sort of meeting with an overhead projector and printed slides suck rocks?"

Used effectively, it can floor a room and blow minds by the score. Used poorly, it can suck the life out of an entire campus.

There's nothing about PowerPoint that instantly succeeds or fails; the results are measured by the experience and showmanship of the presenter. Sadly, many are lacking in either showmanship or public-speaking experience. What's even more pathetic, there are those who believe conducting PP meetings or lectures counts as actual public-speaking experience. PP slides are not a 'script'. They are not a teleprompter. They are merely the backdrop for what you have to say.

Banning PowerPoint is like banning multi-tools. You can't effectively build a house with a multi-tool, but you can try. Banning the tool for its misuses is not just putting the cart before the horse. It's like putting the horse in the cart, then blaming the horse for being ineffective.

Comment In a word... (Score 1) 447

To answer TFP's question, which many seem to be avoiding, I have one word: yes.

The answer should have been 'yes' 15 years ago, when micro-scale video recording became commercially feasible. Nowadays, with almost countless SoC and autonomous micro-controller hardware married to multiple GB (possibly TB) of solid-state storage, all within the size of a deck of playing cards, I would have to say that video monitoring is not only feasible, it's an imperative.

When we get news of a flight disturbance, what do we get to see? That's right, just some blurry, hand-held camera-phone footage with muffled audio. Of course, any footage from the cabin is going to be at the discretion of passengers and not the airline, which complicates any official in-cabin monitoring. But perhaps we should think about it a different way.

How many gadgets are out there for our car windscreens, to monitor other drivers? A dozen? A hundred? These essentially represent the solution for any warranted video monitoring. They have long-term (i.e. per-trip) recording functionality, as well as constant-loop recording for capturing the unexpected. These devices are typically the same size that radar detectors were two decades ago!
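That constant-loop recording is just a ring buffer: keep the last N frames, let the oldest fall off. A toy sketch (the class name and frame payloads are invented for illustration):

```python
from collections import deque

class LoopRecorder:
    """Keep only the most recent `capacity` frames, dashcam-style."""

    def __init__(self, capacity):
        # deque with maxlen silently discards the oldest item when full
        self.frames = deque(maxlen=capacity)

    def record(self, frame):
        self.frames.append(frame)

    def dump(self):
        """Return the captured window, oldest first (e.g. after an incident)."""
        return list(self.frames)

rec = LoopRecorder(capacity=3)
for f in ["f1", "f2", "f3", "f4", "f5"]:
    rec.record(f)
# Only the last three frames survive the loop.
```

The per-trip mode is the same device simply flushing the buffer to long-term storage instead of overwriting it.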

Now, does this necessarily mean that we have to see inside the cockpit? No, it doesn't. Take that privacy argument and stow it.

Video monitoring could mean many things, such as the cockpit door exterior. (IMHO, a much more compelling angle when considering hijackings) It could also mean hull-exterior views, which could be quite valuable for take-off/landing mishaps. Rather than rely on modeling to visualize the attitude, speed and point of impact, it could be right there on a screen for you.

In an aviation scenario, we just start with the functionality of the classic black-box device and evolve it to include video, solid-state storage and an automated distress feature that attempts to upload the last 2 minutes of recorded data to satellite. (for extra credit, make a monitoring algorithm that senses flight-path and altitude deviations for real-time alerts and warranted monitoring)
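That deviation-sensing "extra credit" could start as simple as thresholding samples against the filed flight plan. A rough sketch; the thresholds, field layout and function name are all invented for illustration, not from any avionics standard:

```python
def deviation_alerts(samples, planned_altitude_ft, max_altitude_dev_ft=2000):
    """Flag recorded samples whose altitude strays too far from plan.

    `samples` is a list of (timestamp_s, altitude_ft) tuples. A real
    monitor would also check heading against the filed route, rate of
    descent, and trigger the satellite upload of the last 2 minutes.
    """
    alerts = []
    for ts, alt in samples:
        if abs(alt - planned_altitude_ft) > max_altitude_dev_ft:
            alerts.append((ts, alt))
    return alerts

# A track that leaves its 38,000 ft cruise and starts descending:
track = [(0, 38000), (60, 37500), (120, 30000), (180, 21000)]
```

Here the samples at t=120 and t=180 would trip the alert; everything before stays inside the 2,000 ft band.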

True, there may be limits to the durability, but being able to put such systems in a compact physical space already increases the survivability of such a system. I bet it could even fit inside a contemporary black-box chassis without much effort. It's anybody's guess why there hasn't been any significant retrofit of the classic Flight Data Recorder design, now that technology is more compact and survivable than when the program began in 1967. The current debate over 'deployable' recorder systems just seems silly. With the profits that airlines are making lately, it's horrifying to consider that one's final message to the world would be from 40-year-old tech.

If this Germanwings incident reveals anything, it's that the safeguards for mishaps are still in human hands, including the reporting of essential data. There's no technological solution for suicidal pilots, because experienced pilots know every manual override. (and can wield a roll of duct tape) Let's at least take the next step (looking at you, FAA) and start mining these mishaps for the valuable lessons they could teach to future avionics, international regulation and corporate norms. Put some bloody cameras on that flight!

Comment You mean, IF the revolution comes. (Score 1) 516

In the past MS used

They did not use internal staff.

But the managers that approve it are to go first.

At least the folks at Icon Factory know a thing or two about iconography, which is as much an exact science as UI design ever was: part pixel art, part language. As other 'dotters here have helpfully provided links to the historical iconography of Microsoft and other platforms as well, you can see the evolution of aesthetic choices: the playful isometric simplicity of BeOS, the monochromatic elegance of NeXT, and the neo-realism of Gnome.

Saying that the flat colors are a throwback to the primitive computer era (8/16-bit) is rather ignorant, simply because the color-palette choice wasn't a matter of preference so much as necessity. Back then, the engineers were put in charge of defining the color gamut with just 16 or 256 'slots' to use. Naturally, the engineers approached this algorithmically rather than aesthetically. That's why it took us 30 years to come up with color rendering that could represent natural earth and skin tones: there were all these mathematical gaps in the subtle spectra of blues, browns and greens. In that sense, I suppose the selection of flat saturated colors is indeed ironic in the age of hyper-realistic imagery. I applaud an aesthetic choice for elegant iconography; however, the execution can be equally delightful or disastrous.
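For the curious, that engineer's approach looked something like the classic 6x6x6 "web-safe" cube: march each RGB channel in equal steps and fill the palette slots, aesthetics be damned. A sketch (my own toy version, not any historical implementation):

```python
def color_cube(steps=6):
    """Build a uniform RGB palette the algorithmic way: equal steps per
    channel, with no regard for where skin or earth tones actually live."""
    levels = [round(i * 255 / (steps - 1)) for i in range(steps)]
    return [(r, g, b) for r in levels for g in levels for b in levels]

palette = color_cube()  # 6*6*6 = 216 colors, fits a 256-slot table
```

With only six levels per channel (0, 51, 102, ...), a subtle skin or earth tone can sit up to ~25 units from its nearest palette entry on every channel at once, which is exactly the "mathematical gaps" problem.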

While I agree in part with the dissent over the design choices, I don't agree that TFA is representative of any significant "majority". Let's be real here, the headline reads, "icons look like a bad joke." Do you really think that contributing readers would be unbiased? You might as well have a big sign out front, "MS-bashing Trolls Welcome!" ...majority indeed.

But here's the catch. It's hard to have a serious discussion about UI choices even in this forum, one that's so inclined to conflate the design with every poor PR move, questionable politics and troubled past of the legacy platform, all making it impossible to take a step back and appreciate the design choices for what they are. It's also important to add that UI choices aren't just about making it artful, but mostly, meaningful. These mini-pictures are purpose-made to fall into the background, rather than be their own eye-candy. (that's what custom icon sets are for)

So here, I'll take a stab at it. This icon gallery clearly perpetuates the traditional Windows-brand "manila folder" trope as a foundation. With flat colors and angled lines, it attempts a three-dimensional appearance, which arguably does look very 'flat', with or without comparison to its predecessors. While those do not make up 100% of the new icon set, the "folders" establish the overall paradigm and 'look' of the interface. I'm not convinced that the non-folder icons are even complete, since most of them still resemble Aero's photo-realistic set of devices.

The icons that notably reflect the new art style are "My Computer" and "Network", which get a simple line-art treatment. This is not consistent with the folder paradigm, not only because they don't resemble folders, but because these images use boundary lines to define shapes, rather than flat colors. Overall, it's rather inelegant and poorly executed. The folders use subtle boundary lines, but inconsistently, and the line doesn't diminish on the smaller icons, making the left face of the folder look awkward, like a backwards "L" from a varsity jacket. Again, we see that the Redmond workshop has neglected the beauty of scale and only centers its model on an 'ideal' size, whatever size that may be, which also belies an underlying framework that is, yet again, bullishly ignorant of modern, precision rendering.

As I'm running Win10-TP myself, I can also see that File Explorer attempts to express folder contents as foreground icons using the open-folder trope as a background. It renders another closed-folder icon atop the background if a sub-folder is present. A poor choice, since the flat colors run together and make the background folder look strange. We also see that, when the contents throw the object-recognition algorithm for a loop, the second foreground icon is a square outline; a tell-tale indication that MS still hasn't learned its lesson about meaningful representations.
If this was an in-house job, then I strongly recommend outsourcing it to an expert group once again, all the way down to the visual rendering engine. Aero was dumped, rather than evolved; another poor choice.

And there you have it; an attempt at a serious discussion about the merits and properties of the new icons that isn't just a splattered statement of subjectivity. I welcome anyone to contribute, and I hope (against hope) that the sincerity of this discussion may be preserved.

Comment Who defines 'patentability' anyway? (Score 1) 43

So far, it's the patent owners and warchest protectors that seem to be driving the definition of what can and cannot be patented in the digital realm. This should be reversed; there should be an international (or universal) standard definition that an applicant must fulfill before a claim can even be considered for legal protection.

Just spit-balling here, but maybe it should be a rule of threes; a project must demonstrate it leverages the three parts of digital technology: the hardware, the software and the network. Among each of those, there must be three distinct techniques being used to separate it from common operations and, in each technique, three uncommon modules that can be considered proprietary in nature and therefore be protected as "trade secrets". So, in total, we have a basis for patentability that covers the basic facets of digital products, requires them to define how they set themselves apart and lastly requires that the applicant specify what makes their work unique at the code and/or API level; requiring nine points of uniqueness in each digital facet. No 'black box' definitions either; all patents must encompass and explain the concept that makes the patent... well, patentable.

This not only provides a structure for burden-of-proof arguments, (currently non-existent, apart from the ruling described in OP) but also creates the need for distinguishing one's work to set it apart from what platform developers and shared-library contributors can claim as prior art or common practice. More importantly, it eschews the petty bickering of single-factor patents; things like "swipe to unlock" or "presentation as a square tile with 10% rounded corners" or "putting a virtual button in the corner of the screen to select a program" sort of nonsense.

Comment Re:finally (Score 1) 43

If we're to fully plagiarize Neil Armstrong, let's do it right.

That's one small click for man, one giant drag-and-drop for mankind.

I don't know if that's how we're going to celebrate the first crack in the shell of software patents, but yes, it is a step in the right direction.

Comment Re:Liability (Score 1) 474

Indeed. The Xfinity Wifi service is not like a public hotspot, it's app-enabled and otherwise walled off.

The app asks you to log in with Xfinity HSI credentials, connects to a geographic database and shows 'coverage' on a small map. When you want to connect to a hotspot, the app coordinates the security automatically, kinda like a pushbutton feature on a router.

If you don't have the app, these hotspots look like any other secured private WAPs.

Despite all that, it's an arrogant and draconian move to just switch on customer equipment to provide a service. I believe Zordak made the point that the gateway/router devices are leased to customers, but essentially Comcast property. To me, that means they can take control to provide enhanced services, like advanced port forwarding, traffic balancing and delivering QoS metrics back to their root network. All that makes sense, right?

What doesn't make sense is basically hijacking the device to provide a subscription-based service for other customers. If I have one of these routers, then I expect it to serve the purpose of fulfilling my service subscription, not someone else's. Providing such a service should be at the option of the subscriber, not the default stance with an opt-out procedure. Organizing the majority of subscribers to opt-out of this service clause will surely pressure Xfinity to re-think their strategy, but good luck getting the attention of all 50,000 households. (or even half of them)

A responsible, progressive and fair-minded company would provide incentives for becoming part of their service infrastructure. Monthly service discounts would be a good start, and might even improve Xfinity's reputation in the process. Let's say, the more isolated your WAP is on the Xfinity map (thereby filling in a wide gap in coverage) the more of a discount the homeowner gets.

In this day and age, it takes a level competitor to enact change in the marketplace; so we're looking at you FIOS, DSL and Google Fiber. Do it better!

Comment The rest of the story (Score 1) 711

Anyone think of the percentage of iPhone adopters that switch to Android? Those numbers are conspicuously absent. I doubt they did any follow-up for iPhone "consumer corrections" to see how many later dropped iPhone and went back.

And they say Microsoft "drinks the Kool-Aid" on their own products. Seems like both camps have a strange brew now. However, in this respect, Apple has some serious catching up to do.

If the rumors are true, then we'll get to see who can make the better "geez I feel like I'm going to break this thing it's so thin" device for 2015.

Still waiting for the bluetooth, bio-powered, wetware interface cartilage implant accessory. (stereo, please)

Comment Re:Who owns the pipes? (Score 1) 343

[...] (1) regulation, (2) competition, and (3) public ownership of pipes [...]

  • (1) What regulation? Lobbyists control legislators, and lobbyists are powered by corporations. Along with the recent chairman appointment (y'know... a former lobbyist), the FCC is as good as sold.
  • (2) What competition? The feeding frenzy of cable infrastructure in the '80s and '90s has already been divvied up. The alpha predators are just bloated giants now, looking to mate with, or destroy, the rest.
  • (3) See #1... or do you really think there's a budget, or even a motion, for that purchase? It would be political suicide, because it would be interpreted as "big government getting bigger." You'd have better luck going parcel-by-parcel with Kickstarter or Indiegogo, but then the big boys would just play the same game they did with smaller competitors; milking you dry until you end up selling it back. (and at a discount) It has to be all or nothing, and the sticker-shock on that could just about kill you.

This isn't a simple game, if it's a game at all. In fact, it's more like a quail hunt, where the hunters are doing so well that they're getting bored and shooting their friends in the face. (see what I did there)

Comment How a programmer views time (Score 1) 209

That's what this 'calendar' essentially says. Let's just call it what it is: a simple algorithm for a few celestial body movements. It's rail-minded development applied to the solar system, with only a nod to lunar timekeeping. (28-day months, or approximately one lunar cycle) Also, and let's be honest here, the whole "timemods" idea is just a gadget. It's not practical outside of the inner-workings model. I mean c'mon... calendars are supposed to work for everyone.

All that doesn't mean it's a bad idea.

On the contrary, it's a great start. But if it's to become a great system, worthy of usurping the Gregorian calendar, then it has to embrace the natural marks of celestial time frames... not just one solstice per year.

  • First improvement would be to include both solstices in measurements. This already doubles the accuracy of the system.
  • Take it one step further and include both equinoxes for additional reliability.
  • The previous two suggestions annihilate the 13th "mini month" idea (which BTW is horrible), so tack those on to the quarterly ends as 'meta months'. See? Quarters are now built-in!
  • The whole point of a standard calendar is to be predictable, so making corrections four times every year means the next cycle is always more reliable than the last. (though it will never be perfect, because entropy)
  • We also open up the possibility that sub-diurnal adjustments can now be quarterly, semi-annual or annual. Another leap-second in June, why not?

This system then retains the single greatest advantage of the Gregorian calendar: division by the most factors. (12 divides evenly by 1, 2, 3, 4, 6 and 12, vs. 13 by only 1 and 13) And now it has more frequent course corrections. Consider this programmatically with the above suggestions, and the system is still computationally simpler than our legacy Gregorian system. So there it is, an accessible system that everyone can use.
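Those divisor counts are easy to verify; a quick sketch (function name is mine):

```python
def divisors(n):
    """All positive divisors of n -- the 'even split' options a calendar
    of n months offers (halves, quarters, trimesters, and so on)."""
    return [d for d in range(1, n + 1) if n % d == 0]

twelve = divisors(12)    # [1, 2, 3, 4, 6, 12] -- six ways to split the year
thirteen = divisors(13)  # [1, 13] -- 13 is prime, so only the trivial splits
```

Six even splits versus two is the whole argument for keeping 12 months and bolting the corrections onto the quarters instead.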

Comment Re:If you regulate properly, we'll stop our busine (Score 1) 286

Seems to me like there must be something to it. If the ISPs are threatening to sit on their asses (believe me, they'll do it, those crazy bastards) then there's got to be something proper and fair about that bill.

The ensuing pity party will undoubtedly be called the thumb-up-the-ass-mageddon.

Either that, or face the rise of a Chart-warner-cox-cast abomination, sure to be renamed the Cable Operators Commission Kabal. The acronym should make it obvious.
