Link to Original Source
I've been part of Slashdot for a little longer than that. I for one can't wait for Slashdot Beta to go live this month! I'll save so much time in my day by not going to Slashdot anymore.
Beta is unusable. Once Beta goes live, I quit. I'm moving on.
Sad thing is, much of the behavior one sees out of federal contracts is due to taxpayer groups demanding anti-corruption measures. A great deal of the bureaucracy comes directly from people complaining about waste and demanding a complex audible process.
I'm going to assume that was an autocorrect error on your post.
Man, I hope they don't close Best Buy. That's where I try out stuff before I go buy it on Amazon.
Sure, Best Buy has been close on price for a few items, but I really don't like getting my bag searched on the way out. I'll try out the stuff that's there, but I won't spend my money there.
1. I have a 3G phone, but my service area only offers 2G. After loading your new site on slooooooooooooooooooow 2G, I'm not feeling very motivated to find a menu item to turn off images. I'll likely go to Google News - Technology section instead.
2. Your beta site is clearly detecting that I'm on a mobile phone, because it gives me a different top banner than my desktop browser gets. But the icon that you pointed us to does not exist on mobile.
It's 2013. Supporting mobile devices at the same time as high-resolution desktops should be a no-brainer. But the beta site looks pretty bad on mobile. On my phone, the new site design requires quite a lot of scrolling to the right, mostly thanks to a huuuuuuuuuuuuuuuuge ad at the top. Also, the site pops up a message box that disappears off the left side of my screen, rendering half the message unreadable.
This layout does not auto-adjust to the width of the browser. It scales down for smaller screens, but on large ones it wastes space.
Actually, Nautilus and the other GNOME applications listed do have a menu. In the top bar, in the left corner next to "Activities," is a little icon for the currently focused application. If you right-click on it, it brings up the normal menu that you're used to. It's not very intuitive at first...
That's an interesting UI decision. I would argue it fails the Obviousness criterion.
Here's an example: I use a laptop, with a 22" desktop flat-panel monitor as my second display. For me, it works well to run Chrome, GIMP, and other "large real estate" programs on the desktop monitor. (I run "small real estate" programs on the laptop display, such as Nautilus and Terminal.) GNOME presents the "Activities" action (hot-corner) on my laptop display.
So if my program is running on the 2nd display, there's no connection between the "menu" you describe and the program.
The reason for the inconsistencies you identify is very simple, and I know for a fact it has been explained to you *multiple* times before, so I conclude that you are acting in bad faith by posting as if you had no idea about it.
No, but I can only comment on the state of things today.
[...] for the sake of the rest of the audience, I'll explain it again: the GNOME applications are in the process of being revised to meet new design guidelines. This process is not complete yet; until it is, you'll see inconsistencies between apps which have been fully converted, apps which have not yet been fully converted, and apps which haven't been converted at all.
And I look forward to trying GNOME again when things are more consistent between all the applications. Until then, I consider Xfce to have much better usability than GNOME.
I've said it before, and I'll say it again: Fedora's GNOME has really lost me. I've been a longtime Fedora user, and I still like the distro, but I'm passing on GNOME in Fedora 19 and going back to Xfce.
Fedora 19 includes GNOME 3.8 as the graphical desktop, and I've previously noted that GNOME 3 has poor usability. The GNOME developers have continued this poor-usability trend in GNOME 3, which fails to meet two of the four themes of successful usability: Consistency and Menus. Where are the menus? There is no "File" menu that lets me do operations on files. There is no "Help" menu that I can use when I get stuck. The updated file manager (Nautilus) doesn't have a menu, but other programs in GNOME 3 do (Gedit has menus, and it is part of GNOME). Also: when you maximize a Nautilus window, either to the full screen or to half of the screen, the title bar disappears. I don't understand why. The programs do not act consistently.
I will give a positive comment that the updated GNOME file manager now makes it easier to connect to a remote server. This used to be an obvious action under the "File" menu, but in GNOME 3 it is an action directly inside the navigation area. So that's a step in the right direction.
The updated GNOME desktop environment seems to avoid familiar "desktop" conventions, tending towards a "tablet-like" interface. This further reduces the new desktop's obviousness and familiarity.
So it's not really that "Fedora has lost me," but the GNOME desktop. I consider Xfce to have much better usability than GNOME. While I haven't done a formal usability study of Xfce, my heuristic usability evaluation is that Xfce meets all four of the key themes: Familiarity, Consistency, Menus, and Obviousness. The menus are there, and everything is consistent. The default Xfce uses a theme that is familiar to most users, and actions are obvious. Sure, a few areas still need some polish (like the Applications menu, and some icons) but Xfce already seems better than GNOME.
Additionally, if you are technically capable, you can dramatically modify the appearance of Xfce to make it look and act according to your preferences. At home, I've modified my Xfce desktop to something similar to Google's Chromebook (see example and instructions). It works really well and I find it is even easier to use than the default Xfce desktop.
The OP is an absurdly long and wordy essay/rant, and I didn't read all of it. Don't expect me to. (At 3,730 words after the "Read on," that's just into 5 pages if I load it into LibreOffice at 10pt Times New Roman.) But I can give Bennett one example where the 5th Amendment right against self-incrimination is a good idea:
Q: "Where were you when you witnessed the murder?"
In this case, let's say the witness happened to be hotwiring a Mercedes-Benz on a dark and empty street - and saw one person rush out from an alley, shoot the victim right in the head, and run off. The whole thing happened less than 10 feet from where he was sitting inside the car. A perfectly valid witness, he definitely saw the murder (and murderer) even though no one saw him. After witnessing the murder, he called 911 from a nearby pay phone, then drove off in the now-stolen MB.
Without the 5th Amendment, the witness would be compelled to admit to committing an unrelated crime, incriminating himself.
In 1982, our dad bought a Franklin ACE 1000 computer (an Apple II clone) for us when we were in elementary school. My brother and I experimented with AppleSoft BASIC programming on that, and our parents also bought a book about BASIC. We mostly skipped the tutorials in the book, and jumped right to the reference section, figuring out things on our own by looking them up in the reference then trying them out on our own.
It didn't take very long to "get" programming. I think after a year of writing programs in AppleSoft BASIC, I was writing pretty advanced stuff. The SIN and COS functions were pretty hard to grasp until I had the math classes (years later) to understand them, but otherwise I "got" it right away. I remember writing several small programs, such as a number-guessing game ("too high" or "too low" until you got it right).
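For anyone who never wrote one, the guessing game described above is a tiny feedback loop. Here's a minimal sketch in Python rather than AppleSoft BASIC; the 1-100 range and the exact prompt wording are my assumptions, not from the original program:

```python
import random

def check_guess(guess, secret):
    # Classic feedback: tell the player "too high" or "too low"
    # until the guess matches the secret number.
    if guess > secret:
        return "too high"
    if guess < secret:
        return "too low"
    return "you got it"

def play(secret=None):
    # Pick a secret number, then keep asking until the player guesses it.
    if secret is None:
        secret = random.randint(1, 100)
    while True:
        result = check_guess(int(input("Guess a number from 1 to 100: ")), secret)
        print(result)
        if result == "you got it":
            break
```

Calling `play()` runs the loop interactively; the whole thing is maybe a dozen lines in BASIC, too.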
I figured it would be interesting to write computer programs that mimicked the computer displays from television and movies. So whenever I watched a movie or TV program that featured computers (and it was the 1980s, almost everything featured a computer) I tried to mock up a similar display on our computer at home. It didn't take long to realize that a special effects person was doing the same thing behind the scenes, rather than the movie or show using an actual program, but that didn't take the fun out of it.
In 1983, the movie War Games came out, and I decided I wanted to write the nuclear war simulator featured in the movie. It took me all summer, but I eventually wrote something that would draw maps of both the US and Russia, then let you select a few targets and launch missiles. The opposing side would return with a few missiles of their own. It even drew the missile tracks like in the movie. At the end, the program would tally the damage to determine the winner. (I think the "nuclear war is bad" message was lost on me at the time.) It was all in AppleSoft BASIC.
I used different versions of BASIC until college, when I learned my first compiled language. As physics students, we needed to write our own data acquisition and analysis programs, so we learned FORTRAN 77. It was pretty easy to pick up in class, since it wasn't worlds apart from BASIC. In the summer between my junior and senior year, I interned at a research lab. For half the summer, I took data. By the second half, they realized I was pretty good at FORTRAN, so they asked me to update a FORTRAN-IV program that analyzed ellipsometry data (precise optical measurements using a laser). I hadn't expected that during my internship. I enjoyed it a lot, but learned to hate FORTRAN's computed GOTO.
After FORTRAN, my brother (a computer science student) introduced me to C, and I took to that right away. It was more powerful than FORTRAN, but still easy to write code in. My brother taught me the basics of C, then I picked up the rest on my own through books, including Kernighan and Ritchie's book. And C really put me on the path to computing as a career.
In my experience, "Familiar" doesn't have to mean "Same." Using your example, iOS shares a lot of familiarity with Mac OS X. The two environments aren't the same, but they aren't worlds apart either.
I think those two points are somewhat linked. You can lose a little bit of obviousness if it looks like something that already exists (Familiarity).
In one of my usability tests, I observed typical Windows/Mac users with average knowledge quickly figure out how to use most of GNOME 3.4 (Fedora 17): GNOME 3.4 seemed familiar enough coming from Windows/Mac, programs acted consistently within GNOME 3.4, users could find actions in menus, and (most) application functions were obvious and had obvious effects.