
Comment Re:OSS needs technical writers more than coders (Score 2) 244

You nailed it with your last sentence. In fact, I prefer to start with good technical (why/how) documentation and code from that, commenting usage (which eventually translates into user documentation) along the way. With the theoretical and technical documentation out of the way before coding even begins, and the bulk of the user documentation written alongside the code itself (still needing to be compiled into a coherent document and edited, of course), all that's really left is to provide some usage examples, which can be done during final editing of the documentation.

Because the theoretical and technical documentation does not need to follow the program (it happens the other way around in this case), it becomes much easier to write. Coding becomes easier, as well, because the theoretical and technical documentation defines everything (and if it doesn't, it gets kicked back to the author to be fixed while the developer takes a coffee break). The whole process just moves more smoothly and quickly at that point. Having the developer detail usage in their code comments means the person who best knows how to use the code is the one documenting how to use the code; of course, a technical writer should compile those comments into a coherent document and then edit it. As a final test of that documentation, the technical writer should, while editing the user documentation, write a series of usage examples and have the developer verify that the examples are correct. Any corrections to the examples should result in similar corrections to the documentation as well.

This seems to reduce the number of revision cycles while, at the same time, ensuring that the documentation and application are in sync.

I always tell my clients, if they're not willing to work with me to write the technical documentation before I begin development, my development quote doubles and only minimal documentation will be provided. I only have one client who chooses that option; they also prefer to pay me to maintain everything, so I guess it's a win-win.

Comment Re:OSS needs technical writers more than coders (Score 1) 244

Perhaps he isn't arrogant, but skilled in writing documentation? For him, then, it would not be hard, and never would have been. Ignorant, perhaps, of the fact that it might be hard for others, but not arrogant. The same can be said of coding, as well; I don't find it the least bit difficult to write good and maintainable code, provided I'm given the time to do so.

That said, in both cases it's actually almost impossible, because whoever you're writing the code and/or documentation for wants you on the next project tomorrow.

Comment Re:On iOS platforms. (Score 1) 270

I'm sorry you can't tell the difference between the larger (and not used much on anything more recent than a PS3 controller) USB-mini and the smaller (and used everywhere) USB-micro, but that's a you problem, not a USB problem. Remember, the opposite end of that lightning cable is USB, too; it used to crack me right the fuck up when I would hear someone claim that lightning was faster than USB.

Furthermore, the lightning port is no more or less sturdy than USB-micro. Personally, I've never broken either port, despite having owned more USB-micro devices than I can count over the past decade, and a handful of lightning (read: iOS) devices since that port was released. In fact, I don't know anyone who's broken a USB-micro port (though I did knock one partially loose by dropping something heavy on it as it sat hanging off the edge of a table, something that would have done the same, or worse, to a lightning connector), but I do have a friend who snapped the lightning connector right out of her iPhone. And that's despite the fact that, among the people I know, USB-micro devices far outnumber lightning devices.

I'll concede that you may have better access to lightning cables within your circle of friends than I do within mine (despite my wife and two best friends being iPhone users and myself and my other best friend being iPad users), but that does not change the fact that a USB-micro cable can be had for $1 (literally from the dollar store) while a licensed (i.e. one that won't make iOS 8 devices complain and possibly refuse to charge) lightning cable cannot. You surely could have purchased the correct cable while on your trip; even if you were broke, you could have found enough cash on the side of the road to buy the USB cable you needed, but I doubt you'd have been able to scrape together $5-10 that way for a lightning cable.

Regarding the 30-pin connector, there were plenty of unlicensed 30-pin connectors out there, which Apple didn't make a penny on; that's why the lightning connector includes an authentication chip, the absence of which makes iOS 8 devices complain when an unlicensed cable or device is plugged in. That connector actually was stronger than USB-micro, as well as more capable (carrying analog audio/video and control pins in addition to USB and FireWire). It was a proprietary connector that actually brought something meaningful and useful to the table.

Of course, then there's USB3, which neither the 30-pin nor the lightning connector supports. Plug either into a USB3 port and it falls back to USB2 speeds (another reason I laugh when I hear someone say lightning is faster than USB). A plain USB 2.0 micro plug doesn't support USB3 speeds either, but it will fit into a USB 3.0 micro port (the wider one with extra pins) and work with no issues. The design of the lightning connector prevents anything like that, which is why Apple recently adopted USB-C rather than try to make lightning work. Expect the next round of iDevices after the iPhone 6s (which will, of course, feature the lightning port) to ship with USB-C. You know, because lightning is just so much better.

Comment Re:On iOS platforms. (Score 1) 270

Both platforms are probably not equally easy to exploit

Consider that the vast majority of Android "malware" consists of games or toy apps that request every permission under the sun and the user has to agree to let that app have those permissions at install time. Now, consider that apps also request permissions on iOS but, rather than listing all the permissions they'll want at install time, they request them the first time they try to use them. It's a user decision in both cases.

both platforms probably do not provide equal returns

All you're likely to get from a phone, in any case, is a contact list, schedule (calendar), and some photos; you can grant full access to all of that on either platform. You can also send mail or text messages from an app on either platform, if the user grants those permissions. Android does allow full filesystem access (again, if granted by the user), but iOS also allows access to stored documents (including iCloud, so not just what's on your phone, which could potentially be worse). Neither platform allows system files, configuration, or application files to be overwritten unless rooted or jailbroken; and Android not only needs to be rooted, but also specifically configured to allow those behaviors, with the app explicitly granted root access (which is not the default). I'm not positive about iOS, but I don't recall seeing any way to manage root privs when I had my jailbroken 3GS, which would make jailbroken iOS much more vulnerable than rooted Android.

(Obligatory apple users easier to deceive yet wealthier comment here)

I don't know about wealthier; as an Android user with a job, I've bought my unemployed wife her last 3 iPhones. As for easier to deceive, well, I wasn't gonna go there but... One of the two platforms is marketed to people who just don't want to have to give a shit about security. That's not an inherent flaw in the platform so much as it is a flaw in the marketing, though, and it's only the user's fault insofar as they tend to place entirely too much trust in a corporation that places profit over people. Google is no better about putting profit over people, but at least they don't market their platform that way; Android users (generally) have no illusions that their devices are any more or less secure than any other computing device.

Android and iOS are both fine platforms. Neither is perfect, neither is as secure as we'd like, but they're the best we've got at this point. I do feel that iOS is somewhat hamstrung by Apple's policies (NFC finally comes to iPhone, but only for Apple Pay? A proprietary connector that brings nothing to the table but the ability to plug in both ways and a new licensing revenue stream for Apple? No, thank you). As a result, I will never own another iPhone; I've been using NFC for more than just payments for a few years now, and I think it's great that almost every rechargeable device I buy now uses the same micro-USB cable. Even iOS accessories. It's great, it really is, if I ever find myself needing a charge while I'm out and about; everyone has the cables, everywhere, and they're so cheap and plentiful that people don't mind lending them. That doesn't seem to be the case when my wife's iPhone runs out of juice. Mind you, I love my iPad, but it also doesn't leave the house all that often, so charging is rarely an issue, and none of the other restrictions that bother me when talking about the iPhone (an always-on-me device) seem to apply, either.

Comment Re:On iOS platforms. (Score 3, Interesting) 270

This oft-quoted line of complete bullshit is why people think iOS and Apple's walled garden is more secure than any other mobile platform. The reality is that Apple gets a binary to review, just like Google, Microsoft, or BlackBerry. Apple and Google do actually review those binaries, and Google even catches some nasties (I'm sure Apple does as well, but they aren't as transparent about it) and prevents them from entering the Play Store. Apple has pulled malware from the iOS App Store in the past, but has never made any official comment on it, unlike Google, which does leave one wondering... Assuming malware authors submit to both platforms at roughly the same rate, and knowing how often Google rejects or removes malware from its store (because Google publishes the statistics), one should expect Apple's numbers to be roughly the same, but are they? If anyone has a link to some official (i.e. from Apple) stats on that, I'd love to see them.

Comment Re:Battery life non-issue (Score 1) 113

It's a device with an accelerometer, heart rate monitor, and other sensors that would be immensely useful for sleep tracking. Unfortunately, despite being the best equipped for that task, sensor-wise, it's useless for the job simply due to its battery. Sure, the Pebble Time I ordered as an upgrade to my Pebble doesn't have a heart rate monitor on it, but at least I'll be able to wear it every night for sleep tracking and throw it on the charger when I go out to detail my car (or my wife's) on Sunday.

As an added insult to the Apple Watch: When offered an Apple Watch, my wife, an Apple fan who hates hand-me-downs, asked if she could have my old Pebble when the Pebble Time arrives.

Comment Re: No (Score 1) 161

This. If everything is done in the client, the application will lag every time anything processor-intensive is done. Likewise, if the client has to call back to the server every time it does anything, the client will lag on high-latency connections or when the server is overloaded. There's a balance there; the trouble is that most developers don't know how to find it.

I think part of the problem is that web app developers seem to fall into one of two camps: do everything locally for the best chance of availability, or do everything remotely for the best performance. The first camp is almost right: they do achieve better availability doing everything locally, right up to the point where their app becomes unusable due to processing lag (in other words, they're wrong). The second camp is also almost right: they do achieve the best performance in their local development environment, running in a VM on their workstation, where they're the only user; that falls apart the moment they add thousands of users and wildly variable latency between client and server (in other words, they're wrong).

What I prefer to do is provide all capabilities in the client (à la the first camp), then identify those that cause the application to lag and implement them on the server as well (à la the second camp). Once a function exists in both the client and the server, the client can run a job locally and on the server at the same time. If the local job finishes first, the client alerts the server and the server-side job is terminated; if the server returns its result first, the local job is terminated. I find that this structure provides the best performance as well as availability, since the faster resource is always the one whose result gets used; and if the server is unavailable, times out, or returns an error, the application still works. I find that most users are willing to accept occasional slowness during server outages and upgrades, especially when the application is generally very responsive under normal conditions.
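To make that concrete, here's a minimal sketch of the pattern in TypeScript, assuming a browser client with fetch, AbortController, and Promise.any available. The /api/compute endpoint, the request payload, and the computeLocally placeholder are made up for illustration; they aren't from any real project.

```typescript
// Minimal sketch: run the same job locally and on the server, keep whichever
// result arrives first, and cancel the loser.

type Job<T> = { result: Promise<T>; cancel: () => void };

// Placeholder standing in for the real client-side implementation.
async function computeLocally(input: string, signal: AbortSignal): Promise<string> {
  if (signal.aborted) throw new Error("local job cancelled");
  // ...actual in-browser computation would go here...
  return `local result for ${input}`;
}

function startLocal(input: string): Job<string> {
  const controller = new AbortController();
  return {
    result: computeLocally(input, controller.signal),
    cancel: () => controller.abort(),
  };
}

function startRemote(input: string): Job<string> {
  const controller = new AbortController();
  const result = fetch("/api/compute", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input }),
    signal: controller.signal,
  }).then((res) => {
    if (!res.ok) throw new Error(`server returned ${res.status}`);
    return res.text();
  });
  return { result, cancel: () => controller.abort() };
}

// Promise.any resolves with the first *fulfilled* result, so a server that is
// down, slow, or erroring never blocks the local job; the app keeps working
// during outages, just more slowly.
async function compute(input: string): Promise<string> {
  const local = startLocal(input);
  const remote = startRemote(input);
  try {
    return await Promise.any([local.result, remote.result]);
  } finally {
    // Cancelling a job that has already settled is a harmless no-op.
    local.cancel();
    remote.cancel();
  }
}
```

The nice property is that the decision about which side "wins" never has to be made explicitly; whichever result shows up first is the one that gets used, and the loser just gets aborted.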

That, I'm sure, could be built upon to predict (based on bandwidth, latency, and local vs. server load and performance) which job will finish first, and to only start the remote job when it will be the clear winner (still starting the local job just in case). That would give you the benefit of reduced server requirements without impacting application performance (unless you take it too far and don't keep any spare compute power online). I haven't run into an instance where this has been necessary or where the savings would be worth the effort (as evidenced by the number of cancelled jobs compared to completed jobs), but I'm sure such a case exists.
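For what it's worth, here's a speculative sketch of what that predictor might look like, in the same vein as the sketch above; the load factors, the RTT measurement, and the 1.5x margin are invented for illustration, not something I've actually built or measured.

```typescript
// Speculative sketch: estimate how long the job would take locally vs.
// remotely and only dispatch the remote job when it looks like the clear
// winner. All inputs and the margin are illustrative placeholders.

interface RaceEstimate {
  localMs: number;  // predicted time to finish the job in the browser
  remoteMs: number; // predicted network round trip plus server compute time
}

function estimateRace(jobCostMs: number, env: {
  rttMs: number;            // measured round-trip time to the server
  serverLoadFactor: number; // > 1 when the server is busy
  clientLoadFactor: number; // > 1 when the device is slow or busy
}): RaceEstimate {
  return {
    localMs: jobCostMs * env.clientLoadFactor,
    remoteMs: env.rttMs + jobCostMs * env.serverLoadFactor,
  };
}

// The local job always runs (just in case); the remote job is only started
// when it is predicted to win by a comfortable margin.
function shouldDispatchRemote(est: RaceEstimate, margin = 1.5): boolean {
  return est.remoteMs * margin < est.localMs;
}
```

Whether the server savings would ever justify even that much machinery is, as noted above, another question.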
