The vast size of the available labor pool greatly reduces any benefit China's factory managers might gain from better treatment of the workers they currently have. They can work them into a stupor and then kick them out for a fresh batch. The overhead to retrain is quite low for most positions.
A physical polling station prevents this by ensuring a) the voter is not documenting the vote and b) no one else is documenting the vote. Neither a) nor b) can be guaranteed with online voting. It is extremely hard to provide PROOF to someone you voted a certain way in a physical voting situation. It is easy to SAY you voted a certain way, but that doesn't have to be true.
A physical polling station does nothing to ensure that the voter is not documenting their own vote, nor was it designed for this purpose. It's trivial in the modern era to take out your phone and film yourself voting, from beginning to end, inside the booth. Whether you throw some tantrum and manage to get your vote changed, or edit the video footage later, is your own business of course. Your peers pressuring you into demanding "proof" is just as much a problem with paper voting as it is with any other form.
The more important point, though, is this: If you don't want anyone else to see you voting on your smartphone, you can go hide in a closet and vote. If you do, you can always register your vote "in public" and then change it later. If you think someone is going to hold you at gunpoint and stare directly at your phone for the entire duration of the voting period - which can be as long as a WHOLE MONTH, considering how vote-by-mail ballots already work in this country - then you have much, much bigger problems than your ability to vote being tampered with. You are the victim of a kidnapping and the police should be out looking for you.
If you're especially paranoid I suppose the voting software could implement a "no take-backs" feature where you can lock in your vote, so even if you're kidnapped near the end of the voting period, you can't be forced to change it. Then the kidnapper has to simultaneously abduct enough people to sway an entire election the SECOND the polls open, then have enough coercive power with them - threat of imminent death for example - so that they don't just refuse to vote altogether. Again, if you live in a city where this can happen, you have bigger problems.
Same deal with the hypothetical Texas church: If your church locks you in and compels you to vote a certain way on pain of excommunication or whatever, you have much bigger problems at hand. You should be videotaping that and going to the feds with it. Sadly, if you're a member of such a church, you probably think the feds are an agent of Satan anyway. Properly implemented encrypted online voting is not going to influence this, since this sort of ugly fraud is just as possible with absentee ballots and voting-by-mail already.
(Note that this scenario is pretty damn out-of-whack. In many towns, the church is trusted as non-political enough to double as an official polling place.)
Actually, from my point of view, it's you who is missing the point. UI does not exist in a vacuum, it exists in a context of history, because the end user is a moving target.
The biggest example that comes immediately to mind for me is in the web browser. Back in the '80s, no one expected to encounter a random place in a document with some stylistic emphasis that, when prodded with a mouse, would cause the computer to display a different document. That functionality was reserved for very clearly defined buttons, as it had to be to avoid confusion.
Nowadays if a user sees a word on a screen that is merely a different color - let alone underlined - they ASSUME it is functional, and furthermore, that the word itself, plus the context, gives the entire story about what the function is. Everything that isn't "normal" text is likely to be interactive, in fact, including other document elements like pictures and icons - with or without borders - divider panels, blank underlined sections (poke to fill them out) etc. That is a difference in the user, and it informs the direction for the design.
"Good" UI is a lot more than signal versus noise. It's about understanding the "signal" itself: What the user expects, and wants to do.
Yeah, well, that's just, like, your opinion, man.
A button is no longer a button because it was never a button in the first place.
UI design was mouse-focused for about 25 years, and the UI design of smartphones just used it as a starting point. I'm glad to see it move on.
When I poke at a word I am poking a word on a screen with my finger, not a button. Why should it be dressed up in the clown makeup of a button? Position, context, color, precedent, and the name of the thing itself are all strong indicators, and when I poke, I aim for the center of it, and the size of my finger intuitively defines the range of error around the target.
Even your meme-ready screenshot is actually proof of how much things have changed: Everything on the Windows 8 screenshot is a button - and we understand it intuitively as such because we're using a touchscreen - and so, there are no button borders. And, we understand everything is draggable, from any anchor point, so there is no need for title bars along the "windows" to provide that anchor point.
Back in the day, some jerk invented that UI by messing around in a workshop with a mouse and going with what felt right. Why in the world would we cling to it, now that mice are dying out?
Really freaking simple reason: Ability to sell, coerce or otherwise influence a vote.
Physical presence at a polling location makes it impossible to do these things, at least on a large enough scale to change an election. No one knows your vote so you can't sell it and no one can "check" to make sure you voted a certain way.
1. It is possible to design an electronic system where no one but you knows your vote. That is, where no one but you can uniquely verify that a given vote is yours, and that it is set the way you chose.
2. The ability to sell, coerce, or otherwise influence a vote is a complex problem, and could just as well be _decreased_ by electronic voting. In general, for every abuse you dream up on the electronic side, there is an equivalent abuse on the physical side. For example, nefarious vote organizers can close polling stations in areas they don't like - or, they can attempt to disrupt internet services to those areas. Want to make the system better? How about offering both?
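To put a little meat on point 1: the basic building block here is a cryptographic commitment. Below is a toy sketch in Python (all names illustrative) of the one property being claimed - that only the person holding the secret can link a published record to a vote. Note the caveat: in this naive form, revealing your nonce *would* let you prove your vote to a buyer, which is exactly why real systems layer receipt-freeness on top; this is a sketch of the verification direction only, not a voting protocol.

```python
import hashlib
import secrets

def commit(vote: str) -> tuple[str, str]:
    """Return (secret nonce, public commitment). Only the commitment is published."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest()
    return nonce, digest

def verify(nonce: str, vote: str, digest: str) -> bool:
    """Without the nonce, the digest reveals nothing practical about the vote."""
    return hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest() == digest

# The public bulletin board sees only the digest, never the vote or the nonce.
nonce, commitment = commit("candidate-a")
print(verify(nonce, "candidate-a", commitment))  # True: the voter can confirm their own vote
print(verify(nonce, "candidate-b", commitment))  # False: the commitment matches no other vote
```

Real proposals (mix-nets, homomorphic tallying) are far more involved, but the "no one but you can verify your vote" property is not hand-waving.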
Pure online voting could / would lead to massive fraud,
Care to provide a reference? Here are some working models you can investigate.
"voting parties" where peer pressure will rule, and otherwise socialize voting.
Have you ever been to a church in Texas?
It is one thing to tell someone who you voted for and an entirely different thing to be able to prove it. Just the ABILITY to provide proof will cause problems. Imagine a fraternity, church or other strong social institution. Do you think you are strong enough to say no to the group when they question your loyalty and demand proof of who you voted for? Will you give up your status/membership in that group to preserve your voting integrity? Most won't be strong enough.
You're pretty far behind the times if you think these are new problems - for paper OR electronic voting.
Run the numbers. How much would it cost to
1. convince a voting authority to accept UNENCRYPTED PDFS as a means of voting
2. covertly install functioning hacked firmware on the wireless routers of a significant percentage of the citizenry
Wouldn't the return-on-investment be far better just running a bunch of attack ads?
*shrug* Okay, we'll play the analogy game your way.
You wanna violate the warranty on your fridge and stuff it full of strange items, you go ahead.
The manufacturer is under no obligation to alter their design to facilitate your efforts, though. If you find it awkward to "mis-use" their fridge, go buy a different one.
If you build a mall on your property, you get a say in who sells there, and how they do it.
As soul-crushing as malls are, I don't think they should be illegal...
Speaking as someone who hand-crafted both a 3x5 font and a 7x7 font with italic and bold variants, for use on 320x200 screens, back in the day, your accusations of "hipster" sound kind of ironic to me.
The specific conditions of early lo-res computing included something unknown even in the ancient days of moveable type: An iron-clad fixed WIDTH, as well as an iron-clad requirement that mathematical expressions be unambiguous. So, no matter what your opinions of serifs, you added bars to your uppercase "i" to make the spacing look consistent, and a bar to your numeral "1" to distinguish it from the lowercase "L", and when you could, you put a slash through the zero. Then in the 7x7 font you put serifs all over the lowercase letters wherever you could - like on i, j, and f - again for the sake of spacing. That was what you did for maximum readability in those conditions.
Those conditions are gone. Dragging them into an argument over the readability of serifs is farcical.
... this anonymous comment is a counterexample to your argument.
Personally, I hate anonymous gripes!
* still doesn't show my network transfer speed. Occasionally I'm moving 5+ GB files between my computer and my NAS, and I would like to know if there's a bottleneck somewhere.
MenuMeters is free, quick to install and configure, and is absolutely essential for anyone who likes to keep an eye on vital stats without leaving a window hanging around.
I've installed it immediately on every system I've used over the last 8 years, and now it feels like I'm "working blind" when I use a system that doesn't have it!!
defaults write com.apple.finder AppleShowAllFiles TRUE
Command line-only setting to see hidden files in the GUI? Bad design.
If you like you can download a dumb little application to do it for you that one time. Or you can add a context-menu to do it on the fly. The vast majority of OS X users should not be seeing hidden files, it would only confuse their work. What's bad design is leaving that option around with all the others where they can discover and toggle it and then accidentally f*% up their systems.
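For reference, the whole dance is two commands each way (assuming, as is usually the case, that Finder needs a restart to pick up the change):

```shell
# Show hidden files in Finder (macOS), then restart Finder to apply:
defaults write com.apple.finder AppleShowAllFiles TRUE
killall Finder

# Revert when you're done poking at dotfiles:
defaults write com.apple.finder AppleShowAllFiles FALSE
killall Finder
```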
Are you saying you want a "pro version" of the GUI to be bundled with the OS? I suppose that would be nice for you, but it would suck for developers, who now have to support two completely different GUIs for the same platform.
The scrolling behaviour is designed to work with trackpads, because they're the primary analog interaction device on OS X. I'd strongly suggest you grab one.
Far less precise. Works for people playing, not people working.
Don't knock those trackpads until you've tried them. It's actually possible to be very precise indeed, because the surface even senses changes in the "center of contact" of your finger, i.e. if you roll your finger forward slightly, that registers as movement. Compare this to the mouse, where your accuracy is limited to how slowly you can push an object on a flat surface by flexing your hand.
The difference here is like playing the guitar with a pick, and playing the guitar with your fingers.
Correct, Exposé is the right tool for this job. You can also use cmd-` to cycle through windows within an application.
Extra work for a simple task, bad design
Not sure what you mean by "extra work". There are five different UI mechanisms for sorting windows on OS X.
1. command-tab plus command-~. Hold down "shift" to cycle up the stack instead of down. Mouse over an icon to select it in the stack.
2. Exposé, via extra mouse button, trackpad gesture, or key combination.
3. Flip between screens, with trackpad gesture, or key combination.
4. Raise the dock, click on an icon
5. Command-H to hide your foreground app and send it to the bottom of the stack, bringing the next app into context.
Or if you really love spotlight, command-spacebar, a few letters of the app you want foregrounded, and the return key.
And of course there's the old favorite: Move the freakin' mouse and drag the window into view.
And here's a tip: Most of these, along with exposing the desktop, work from the keyboard while you're dragging something with the mouse. You can do some really sneaky things this way!
As I said above, ctrl-a and ctrl-e. Also cmd-left arrow and cmd-right arrow.
Bad design. Home and End should work as in every other system, the other shortcuts should be the "Apple Custom" ones.
Given that "every other system" in this case probably refers to two other radically different platforms whose interfaces treat every other function key on the keyboard differently, it doesn't make sense to call this "bad design" just because it's different. 'Home' scrolls the window without moving the cursor, just like 'page up' right next to it, while 'command-up arrow' moves the cursor to the top, just like 'command-left' moves the cursor to the left side. From my subjective point of view, this is more consistent, more useful, and makes more sense.
As an aside, you may not realize it, but the days when Apple's OS design team needed to cater carefully to Windows 'switchers' are actually over.
So what you're saying is that on Linux you're willing to install the appropriate software to make the machine behave like you want it, but on Mac OS, having to install software is unreasonable?
No, you're misreading. In KDE, one sets options. In OS X, one hacks, if customization is possible at all.
Well, that's only if you consider "setting a built-in configuration option via a command line" as "hacking", and "installing any of your choice of 3rd-party GUI extensions" as an unreasonable burden for OS X, when it's practically the price-of-admission for KDE.
I'm assuming you're extremely familiar with doing things via the command line. If you, or I, need to be able to see a ".git" folder from a UI, or go poking into our own ".ssh" folder from a UI, we surely have the sk1llz to turn that option on without breaking a sweat.
... like that was the "good old days" of software development?
The exclusivity of the walled garden, the novelty of the device and platform, the deep pockets and enthusiasm of the userbase, this all created a gold rush environment for a number of years. (Remember that "I am rich" app that sold for $1000 a pop and did nothing but display a picture?)
Back In My Day, you only joined a small studio or became an independent developer if you had a REALLY INSANELY GOOD idea, were willing to work like hell for it (perhaps because you were tired of working for The Man), and were willing to evangelize it like hell, and even then, you were not guaranteed success, you were almost guaranteed to fail, but you did it anyway because you were deeply compelled. If you had to go slouching back to The Man in a few years, so be it.
The gold rush is over - and it's not a tragedy.
Programmers are as in-demand as they've ever been, and are paid fantastic money for labor that doesn't even involve, say, standing around in the hot sun, carrying a firearm, or constant exposure to hazardous waste. (Unless you count the exhaust from all those commute buses.)
I appreciate your desire to find common ground for all sides, but my take is different, as per the attitude in my above comment and the reason it was downmodded as a "troll".
Slashdot's active userbase has undergone an astounding contraction over about the last five years. A significantly larger proportion of it now consists of old-guard geeks looking either for validation, or for a fight, and in both cases they often find it because their fellow old-guard geeks are here looking for the same thing.
So when I declared the ad hominem of "luddite", I was being serious. A textbook luddite is one opposed to technological innovation beyond their own immediate purview. I've met enough folks like this in my work history to know they're around, and they're as annoying as hell. Take for example the entrenched sysadmin who consistently denies your request to integrate a modern tool because he mistrusts it on principle, and won't even do the research that might lead him to trust it, because what's Best for him is Best for you. His attitude affects the workflow of many other people. And then he comes here and proudly declares his luddite nature, with examples, as though it were a point of pride.
You're right - it's a different mental model. But around here, amongst the luddites, it also gets special treatment. And so, my post gets labeled "Troll", and the parent gets labeled "Insightful", and so it goes.