Congratulations! And I mean that sincerely. The next step in your transcendence is: Abandon Slashdot. It is the corner gas station.
A very _wealthy_ "niche group" of developers, too.
I noticed you completely failed to mention Java, which was hailed as the total cross-platform solution for a while until web browsers started crapping on it. Now it's synonymous with Android programming, and it would be decisively in the rear-view mirror of the tech industry by now if not for that. What's funny is, Google almost had to choose it by default. What were they going to use instead? C# from Microsoft? ObjC from Apple? What else are you going to implement an entire OS in that isn't 20 years old?
If you're placing bets that Swift will dry up and blow away because Apple is due - any day now - to do the same, you're probably a little TOO old-school. You know what will die before Swift dies? In terms of popularity and profitability? C#, because its fate is tied almost entirely to Microsoft. And that's not going to die for quite a while.
Text input via voice is garbage for anything you don't already do in direct, live conversation with another human. Instant example: mispronounce something, then try to correct it. What we need is a novel pointing device. My idea of the future tech: very, very f*%^ smart radar, bounced off your skull, that tracks the location of your tongue in your mouth.
For a long time, these things will need to NOT have an obvious camera on them. The cultural zeitgeist is against it. They'll just have to do augmented reality some other way.
They sold me a phone with the RAM SOLDERED IN?
Good grief, next you'll be telling me that I can't swap out the L2 cache in my CPUs any more...
The smartphone market is NOT simply a larger version of the desktop or laptop computing market. The priorities of the consumers making it up are quite different. You're comparing apples and
And seriously, if you think the iPhone 6 or the Macbook Air is "mid to low end tech"
Oh hang on,
This is worth knowing about. Google doesn't just know what users send to their search engine.
It knows most of the browsing history of the average user, in order, and in real time. (And you don't even need to use their search engine once, for them to assign you a UID.)
Of course, the average user doesn't do more than shop online at major outlets, watch videos, and poke social media. Not exactly high-risk information.
The vast size of the available labor pool greatly reduces the positive benefits that China's factory managers might gain from better treatment of the workers they currently have. They can work them into a stupor and then kick them out for a fresh new batch. The overhead to retrain is quite low in most positions.
A physical polling station prevents this by ensuring a) the voter is not documenting the vote and b) no one else is documenting the vote. Neither a) nor b) can be guaranteed with online voting. It is extremely hard to provide PROOF to someone that you voted a certain way in a physical voting situation. It is easy to SAY you voted a certain way, but that doesn't have to be true.
A physical polling station does nothing to ensure that the voter is not documenting their own vote, nor was it designed for this purpose. It's trivial in the modern era to take out your phone and film yourself voting, from beginning to end, inside the booth. Whether you throw some tantrum and manage to get your vote changed, or edit the video footage later, is your own business of course. Your peers pressuring you into demanding "proof" is just as much a problem with paper voting as it is with any other form.
The more important point, though, is this: If you don't want anyone else to see you voting on your smartphone, you can go hide in a closet and vote. If you do, you can always register your vote "in public" and then change it later. If you think someone is going to hold you at gunpoint and stare directly at your phone for the entire duration of the voting period - which can be as long as a WHOLE MONTH, considering how vote-by-mail ballots already work in this country - then you have much, much bigger problems than your ability to vote being tampered with. You are the victim of a kidnapping and the police should be out looking for you.
If you're especially paranoid I suppose the voting software could implement a "no take-backs" feature where you can lock in your vote, so even if you're kidnapped near the end of the voting period, you can't be forced to change it. Then the kidnapper has to simultaneously abduct enough people to sway an entire election the SECOND the polls open, then have enough coercive power with them - threat of imminent death for example - so that they don't just refuse to vote altogether. Again, if you live in a city where this can happen, you have bigger problems.
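The "no take-backs" idea above is just a one-way state flip on a re-votable ballot. A minimal sketch, with hypothetical names (no real voting system's API is implied):

```python
class Ballot:
    """Hypothetical re-votable ballot with an optional one-way lock.

    The voter may change their choice any number of times during the
    voting period, until they voluntarily lock it in. A locked ballot
    rejects all further changes; there is deliberately no unlock method.
    """

    def __init__(self):
        self.choice = None
        self.locked = False

    def cast(self, choice: str) -> bool:
        """Record a choice. Returns False if the ballot is already locked."""
        if self.locked:
            return False  # coercion after lock-in accomplishes nothing
        self.choice = choice
        return True

    def lock(self) -> None:
        """Irreversibly freeze the current choice."""
        self.locked = True
```

So a voter could cast early, re-cast freely if pressured in public, and lock in only when safe:

```python
b = Ballot()
b.cast("A")        # public, coerced vote
b.cast("B")        # later, in private
b.lock()
b.cast("C")        # returns False; the ballot stays "B"
```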
Same deal with the hypothetical Texas church: If your church locks you in and compels you to vote a certain way on pain of excommunication or whatever, you have much bigger problems at hand. You should be videotaping that and going to the feds with it. Sadly, if you're a member of such a church, you probably think the feds are an agent of Satan anyway. Properly implemented encrypted online voting is not going to influence this, since this sort of ugly fraud is just as possible with absentee ballots and voting-by-mail already.
(Note that this scenario is pretty damn out of whack. In many towns, the church is trusted as non-political enough to double as an official polling place.)
Actually, from my point of view, it's you who is missing the point. UI does not exist in a vacuum, it exists in a context of history, because the end user is a moving target.
The biggest example that comes immediately to mind for me is in the web browser. Back in the 80s, no one expected that a random place in a document with some stylistic emphasis, when prodded with a mouse, would cause the computer to display a different document. That functionality was reserved for very clearly defined buttons, as it had to be to avoid confusion.
Nowadays if a user sees a word on a screen that is merely a different color - let alone underlined - they ASSUME it is functional, and furthermore, that the word itself, plus the context, gives the entire story about what the function is. Everything that isn't "normal" text is likely to be interactive, in fact, including other document elements like pictures and icons - with or without borders - divider panels, blank underlined sections (poke to fill them out) etc. That is a difference in the user, and it informs the direction for the design.
"Good" UI is a lot more than signal versus noise. It's about understanding the "signal" itself: What the user expects, and wants to do.
Yeah, well, that's just, like, your opinion, man.
A button is no longer a button because it was never a button in the first place.
UI design was mouse-focused for about 25 years, and the UI design of smartphones just used it as a starting point. I'm glad to see it move on.
When I poke at a word I am poking a word on a screen with my finger, not a button. Why should it be dressed up in the clown makeup of a button? Position, context, color, precedent, and the name of the thing itself are all strong indicators, and when I poke, I aim for the center of it, and the size of my finger intuitively defines the range of error around the target.
Even your meme-ready screenshot is actually proof of how much things have changed: Everything on the Windows 8 screenshot is a button - and we understand it intuitively as such because we're using a touchscreen - and so, there are no button borders. And, we understand everything is draggable, from any anchor point, so there is no need for title bars along the "windows" to provide that anchor point.
Back in the day, some jerk invented that UI by messing around in a workshop with a mouse and going with what felt right. Why in the world would we cling to it, now that mice are dying out?
Really freaking simple reason: Ability to sell, coerce or otherwise influence a vote.
Physical presence at a polling location makes it impossible to do these things, at least on a large enough scale to change an election. No one knows your vote so you can't sell it and no one can "check" to make sure you voted a certain way.
1. It is possible to design an electronic system where no one but you knows your vote. That is, where no one but you can uniquely verify that a given vote is yours, and that it is set the way you chose.
2. The ability to sell, coerce, or otherwise influence a vote is a complex problem, and could just as well be _decreased_ by electronic voting. In general, for every abuse you dream up on the electronic side, there is an equivalent abuse on the physical side. For example, nefarious vote organizers can close polling stations in areas they don't like - or, they can attempt to disrupt internet services to those areas. Want to make the system better? How about offering both?
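Point 1 above - a vote that only the voter can verify - is the standard behavior of a cryptographic commitment. A minimal sketch using a salted hash (this is an illustration of the principle, not any deployed voting protocol; real systems layer much more on top):

```python
import hashlib
import secrets

def commit(vote: str) -> tuple[str, str]:
    """Return (nonce, commitment). The commitment can be published on a
    public bulletin board; without the nonce it reveals nothing about
    which vote it binds."""
    nonce = secrets.token_hex(32)  # voter keeps this secret
    digest = hashlib.sha256((nonce + ":" + vote).encode()).hexdigest()
    return nonce, digest

def verify(nonce: str, vote: str, commitment: str) -> bool:
    """Only someone holding the nonce can check which vote a published
    commitment corresponds to."""
    return hashlib.sha256((nonce + ":" + vote).encode()).hexdigest() == commitment

# The election board publishes only the digest; the voter keeps the nonce.
nonce, published = commit("candidate-A")
assert verify(nonce, "candidate-A", published)      # voter can confirm their own vote
assert not verify(nonce, "candidate-B", published)  # any other guess fails to match
```

The point is the asymmetry: everyone can see that your commitment exists and is counted, but uniquely tying it to a specific choice requires the secret only you hold.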
Pure online voting could / would lead to massive fraud,
Care to provide a reference? Here are some working models you can investigate.
"voting parties" where peer pressure will rule, and otherwise socialize voting.
Have you ever been to a church in Texas?
It is one thing to tell someone who you voted for and an entirely different thing to be able to prove it. Just the ABILITY to provide proof will cause problems. Imagine a fraternity, church or other strong social institution. Do you think you are strong enough to stay in the group when they question your loyalty and demand proof of who you voted for? Will you give up your status/membership in that group to preserve your voting integrity? Most won't be strong enough.
You're pretty far behind the times if you think these are new problems - for paper OR electronic voting.
Run the numbers. How much would it cost to
1. convince a voting authority to accept UNENCRYPTED PDFS as a means of voting
2. covertly install functioning hacked firmware on the wireless routers of a significant percentage of the citizenry
Wouldn't the return-on-investment be far better just running a bunch of attack ads?
*shrug* Okay, we'll play the analogy game your way.
You wanna violate the warranty on your fridge and stuff it full of strange items, you go ahead.
The manufacturer is under no obligation to alter their design to facilitate your efforts, though. If you find it awkward to "mis-use" their fridge, go buy a different one.
If you build a mall on your property, you get a say in who sells there, and how they do it.
As soul-crushing as malls are, I don't think they should be illegal...
Speaking as someone who hand-crafted both a 3x5 font and a 7x7 font, with italic and bold variants, for use on 320x200 screens back in the day, I find your accusations of "hipster" kind of ironic.
The specific conditions of early lo-res computing included something unknown even in the ancient days of moveable type: an iron-clad fixed WIDTH, as well as an iron-clad requirement that mathematical expressions be unambiguous. So, no matter what your opinions of serifs, you added bars to your uppercase "I" to make the spacing look consistent, and a bar to your numeral "1" to distinguish it from the lowercase "l", and when you could, you put a slash through the zero. Then in the 7x7 font you put serifs all over the lowercase letters wherever you could - like on i, j, and f - again for the sake of spacing. That was what you did for maximum readability in those conditions.
Those conditions are gone. Dragging them into an argument over the readability of serifs is farcical.
... this anonymous comment is a counterexample to your argument.