Software Development

The key to making applications discoverable

In application development, discoverability is how easily a user can find a feature or function. Read tips on making applications discoverable, and learn when you may need to apply the 80/20 rule.

I recently had a conversation with a friend about the usability difficulties I had with Android phones, and something that came up is the matter of discoverability. Discoverability is how easily a user can find a feature or function. A key component of discoverability is consistency within an application and across the user's experience.

When people talk about usability, they often think an interface needs to be intuitive or that it can be sussed out in a few seconds. While it's great to shoot for that goal, the truth is, many applications require a sophisticated feature set that is too deep for users to "get" in a few minutes. Some examples of these applications are:

  • Image editors
  • IDEs
  • Statistical analysis packages
  • Geographic information applications
  • 3D modeling tools
  • Animation suites

The list is pretty long. Even if you are quite familiar with those particular kinds of work, it takes a good amount of effort to learn to use the tools. I have been writing software for more than 20 years, and using Visual Studio (a pretty advanced IDE as far as these things go) for nearly 10 years, but that does not mean that I can pick up Eclipse and just start working with it proficiently -- not by a long shot. Even within Visual Studio, there is functionality that I either stumble upon or re-discover. Visual Studio and Eclipse are so massive that it can take a long period of use to find each function, and then learn it well enough to incorporate it into your development workflow. I am not saying that Visual Studio is bad, but it is a great example of an extremely sophisticated application with a discoverability problem.

For application developers, the key to making your applications discoverable is to follow conventions. In the Windows ecosystem, Microsoft (often via Office) is generally the standard-setter. In Windows 95, Microsoft introduced the idea that a right-click should consistently bring up a list of less-used but still important functions. Office 2007 brought the Ribbon and contextual, floating toolbars to aid discoverability to varying degrees. In each case, application developers quickly imitated the Microsoft style in their own applications.

On the Web, we see a multi-layered set of conventions. First, there are the conventions within your application and site. A good example of this is the display style for hyperlinks; if they always appear in the same color on your site, users will spot them more easily. Next, you have the conventions that other sites have established, such as linking the logo in the top-left corner to the home page or putting a search box in the top-right corner of the screen. Beyond that, there are the conventions of the Web browser; the expectations around the "Back" button are a good example. Finally, the user's OS gives them a consistent set of behaviors (like copy/paste). If your website or application ignores or contradicts these conventions, it does so at the risk of damaging the user's ability to discover functionality.

In some cases, consistency is not necessarily a good thing. On many mobile phones, for example, there is a consistent idea that a "long tap," as distinct from a "short tap," should bring up a context menu. The problem is that the consistency in this case reinforces a horrible UI metaphor. The "long tap" is a bad UI idea because elements almost never have a way of indicating that they can be acted upon by a long tap; the discoverability is close to nil. The good news is that these metaphors are so underused, precisely because of their poor discoverability, that breaking consistency with them carries little penalty.

As an application developer, you need to be aware of the conventions your users will be accustomed to from the platforms they use, how commonly those conventions are actually exercised (for example, right-click is not used by many users, despite the number of applications that support it), the level of training users will receive, and other factors that play into the discoverability of your capabilities. If your features will not be very discoverable, you need to seriously consider whether they should be cut under the "80/20 rule," or whether there is another way to expose them in the interface so they can be used.
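One rough way to apply the 80/20 rule in practice is to rank features by how often they are actually used and find the smallest set that accounts for the bulk of activity; everything outside that core is a candidate for an "advanced" interface or for cutting. Here is a minimal sketch in Python, using entirely hypothetical feature names and usage counts (the kind of numbers you might pull from client telemetry):

```python
# Hypothetical feature-usage counts (e.g., gathered from client telemetry).
usage = {
    "open_file": 5200,
    "save": 4800,
    "copy_paste": 3900,
    "find_replace": 900,
    "macro_editor": 120,
    "regex_search": 80,
    "hex_view": 15,
}

def pareto_cut(counts, threshold=0.80):
    """Return the smallest set of features covering `threshold` of all use."""
    total = sum(counts.values())
    kept, covered = [], 0
    # Walk features from most- to least-used, accumulating usage share.
    for name, count in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        kept.append(name)
        covered += count
        if covered / total >= threshold:
            break
    return kept

core = pareto_cut(usage)
# Features outside `core` are the ones to reconsider: hide behind an
# "advanced" interface, or cut under the 80/20 rule.
```

With these made-up numbers, three features cover 80% of all recorded use, and the long tail (macros, regex search, hex view) becomes the set to either expose differently or drop.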

J.Ja

About

Justin James is the Lead Architect for Conigent.

28 comments
Htalk

I don't see that you've said what the key IS to making apps discoverable. Maybe you should rename the article "Try to make your application features more discoverable," because what is written is not about making them so. It's more about how many are not.

kismert

The counterpoint is that for most modern software products, there is no real training or documentation available. We've all sent an email to the 'support' department, only to get a semi-intelligible response from someone in another country who clearly has no interest in helping you. As for documentation, a typical help file will cover maybe 50% of the UI elements, usually in a way that does not help new users. Online help forums vary wildly: few are excellent, most are more or less a waste of time. So, the reality is, as I see it, that for most of the software you interact with on a daily basis, there is no support in the traditional sense. What you have is the program itself, and the UI that (supposedly) is designed to convey its functionality to you. Thus, discoverability is a critical part of successful program design, and a significant competitive advantage for those who do it right.

jim.lonero

Yes, large applications do not intentionally make their features undiscoverable. But there is only so much that can fit on one screen (or window). I have come across applications that are very difficult to get through, and the help is less than helpful: help that tells you how to do something without first explaining what it is you are looking at. What is this screen for, and what does each control do? To figure out how to do something, first you need to know how to word it. As far as "right-click" goes, how do you right-click on your iPhone or iPad? Use your right hand rather than your left to tap it?

CharlieSpencer

"When people talk about usability, they often think an interface needs to be intuitive or that it can be sussed out in a few seconds." This touches on something I've noticed about American culture over the past couple of decades. No one is willing to read instructions or ask for training anymore. It isn't just men lost in Metropolis and unwilling to ask for directions. We treat reading the manual almost like it's a self-inflicted insult; "Look at me; I'm too dumb to figure out a $20,000 GIS tool without help!" Back when desktop computers cost $3000 each, no one minded spending $200 on a training class. Now when the computer costs less than the class, companies conclude there's no economic value in training. After all, the HR VP's kid can "use" it.

apotheon

Usefulness is a part of usability, too. People who get too wrapped up in discoverability can sometimes pursue that ideal to the extent that they neglect usefulness. Actually effective help systems with search facilities go a long way toward aiding discoverability in sophisticated applications and environments. Alas, whatever work Microsoft has done on interface discoverability, it has utterly neglected the effectiveness of its help systems. Part of the problem, of course, is the simple fact that the usefulness of a help system depends to some degree on the design of the application, and Microsoft treats help systems solely as an afterthought in the vast majority of cases.

Another thing that can help is to avoid the "one application to rule them all" approach to software development. Make each tool the best it possibly can be at its primary purpose, and let any ancillary purposes fall by the wayside if they interfere with that even slightly. Provide ways to combine multiple applications in a single workflow, arbitrarily, according to the user's needs. This leads to the design concept of composable systems of discrete tools, as well demonstrated by the Unix shell approach of tools that each "do one thing well", with fairly standardized, generic means of interaction (text streams as input and output, pipes and redirects as tools for hooking different utilities together).

When doing one thing well requires providing access to more features than can be cleanly presented to the user immediately and at all times, providing an "advanced" feature interface such as a right-click is a great idea. You say "right-click is not used by many users," but that's sorta the point; the features may still need to be there, but not for many users, so you put those features behind an "advanced" feature interface. It's easy to get to them, but easy to ignore them as well, that way. Of course, this all depends on how many it takes to be "many".
What's your source for the statistics about how many people use right-click?

taurair

Thanks for shedding light on this. I'm sure to utilize it in my project.

Justin James

"For application developers, the key to making your applications be discoverable is to follow conventions." J.Ja

apotheon

Macs are designed to obviate the need for more than one mouse button. Apple doesn't want more power at your fingertips, after all. I'm sure the thinking about how to design the iFoo devices is something like "Whew, people probably won't bitch about the lack of right- and middle-click options now."

Justin James

I see this all over the place... I really don't think that the price is the reason... cars are expensive (and lethal) and people don't care to maintain them or even learn what maintenance is useful. Firearms are quite obviously lethal, and I frequently see them handled (even by folks who should know better) in a definitely frightening manner. What I *do* think changed is that back in the $3,000 computer days, the only people who bought them were the folks that HAD to have them, and they HAD to know how to use them. At the current prices, zillions of folks have them without a clear purpose or reason, so they never learn to actually use them. J.Ja

apotheon

As a result, of course, we end up with security issues running amok, lost documents due to behavior the professionals consider "stupid", and corrupt filesystems as a result of incautious behavior like unplugging a computer to move it from one desk to another without thinking to shut it down first. No, we don't need training! Computers are perfectly intuitive! Didn't Neanderthals and Australopithecines have them?

Justin James

I agree completely that application specialization helps a lot. That's something I really like about the mobile application ecosystem right now, the specialization. It's lacking the workflow aspect of it, though some OS's address this. Windows 8 with its "hub" and "share" ideas is a fascinating concept, though I wonder how well developers will cotton to it.

I do NOT think that hiding "advanced" features is sensible in many cases. For example, I do a right-click in this box, and I see features like "cut", "copy", and "paste" on there. For the user who doesn't right-click or know keyboard shortcuts, and who is working in an application that DOESN'T put those features on a toolbar (floating, contextually activated, or static) or menu, those commands don't exist, end of story. We're not talking about "convert to PDF" or "translate with Google" or similar rarely used features... we're talking about copy/paste, for crying out loud.

I do not have a source for right-click data (note that I didn't quote a number either), but anecdotally I can tell you that painfully few people use it, even people who should know better (like fellow developers). While trying to find a number just now, I did find this sadly representative Q/A: http://au.answers.yahoo.com/question/index?qid=20080703213851AAq5S1Z

Next, I checked Jakob Nielsen's site, since he's a usability expert AND a data wonk AND has decades of research under his belt. While I didn't go through every result on his page, I did find this: http://www.useit.com/alertbox/features.html The key points: "Two weeks ago, I observed dozens of average-skilled business users as they attempted common business tasks with two high-end applications. Even though these people were neither geeks nor experts in the software we tested, most of them frequently used right-click shortcuts."

I'd like to know what "most" (and "average skilled business users") is, but anything less than 100% of "average skilled business users" means that right-click isn't a hot idea. From the same article: "Right-click helps medium-skilled users because it's a consistent interaction technique that works the same everywhere. (Indeed, high-skilled users are often disappointed when an application doesn't support right-click -- for example, if it's implemented in Flash and brings up the Flash player menu instead of contextually-appropriate application commands.) Right-click also works because business professionals and other mid-level users typically depend on their PCs and are willing to learn a few techniques to use it better."

Again, it looks like low-skilled users are left in the lurch. While that may be OK in a purely business environment where training is going to occur or users are pre-screened by the hiring process for a basic level of computer literacy (yeah, right...), it's a big mistake in the public application market. So, if a significant portion of users of many apps can't even get to copy/paste, we're really failing. J.Ja

CharlieSpencer

"..business professionals and other mid-level users typically depend on their PCs and are willing to learn a few techniques to use it better." I don't know where Jakob has worked / consulted, but it wasn't here. The only thing most of my users want to learn is how to bypass the proxy server so they can get to Facebook.

apotheon

quote: I do NOT think that hiding "advanced" features is sensible in many cases. For example, I do a right-click in this box, and I see features like "cut", "copy", and "paste" on there. For the user who doesn't right click or know keyboard shortcuts, and working in an application that DOESN'T put those features on a toolbar (floating, contextually activated, or static) or menu, those commands don't exist, end of story.

Are you aware that you basically just said that users don't know there's a right-click button on the physical mouse sitting on the desk to the right (or, rarely, left) of the keyboard? Can you see how this does not strike me as a real concern, how it looks to me like you're really reaching to find a way to complain about interface design? If you are not aware of the fact there is a (useful) right-click button on the mouse, you need some very basic education about effective computer use -- the kind of education that should include things like explaining how to plug it in, which button controls power, and that things like menus and icons have useful purposes on the typical MS Windows display. If you can't figure out the notion of a right-click and a context menu, you are not competent to use a computer without doing about as much harm as good. Would you let a person drive a car without understanding the concept of the "reverse" gear?

When I referred to hiding "advanced" features, I was not saying right-click should be reserved for things like performing character encoding translations and bit-width comparisons in a text document. The basic functionality for a text document is stuff like "read" and "scroll so you can read more" and "close without saving accidental changes" and "get this thing out of the way so I can look up how to use it through the handy tool of a web browser because MS Windows help functionality is crap" (i.e. minimize). There are varying levels of "advanced" for features, and they need to be separated from each other only to the extent that there are so many such features that cramming them all into one access method will clutter things up too much. Then, of course, there's the fact that a given feature may be more advanced in one application than in another.

Consider, for instance, the fact that the actual intended purpose of PDFs makes copying and pasting very much an "advanced" feature; PDFs are meant to be distributed and read, but not edited or mined for content to put in other documents, in the vast majority of cases. They are not plain text files, and there is a very good reason for that fact. The need to support both plain text files and PDFs within a single computing environment, with a UI for accessing "advanced" features that is somewhat consistent across applications, argues for something simple like a right-click acting as gatekeeper for access to copy/paste operations. If a particular application should make copy/paste operations more accessible, it can certainly do this (via the toolbar, for instance), but should probably not eliminate the copy/paste options from the right-click context menu when it does so, due to the consistency of user experience that helps improve discoverability and more general usability. By suggesting that copy and paste features should not be in context menus for some applications, you are now essentially arguing against your own points about discoverability.

quote: I do not have a source for right-click data (note that I didn't quote a number either), but anecdotally I can tell you that painfully few people use it, even people who should know better (like fellow developers).

In my experience, there are only three types of people who do not use it as a rule:

1. people who have better, far more efficient ways of doing things
2. people who have different ways of doing things that are sometimes more efficient, sometimes less efficient, but generally consistent, which automates common activities somewhat so they can think ahead to the next task while performing those activities
3. people who do not know anything, really, about computers, and probably also have difficulty with left- and middle-click, for that matter

This does not mean that right-click is unused. In fact, people who have better, far more efficient ways of doing things than the right-click often treat the right-click as "the place to go for something I have not used regularly enough to have a better, more efficient way to do it". That includes me, for instance, when I'm using a system that makes my preferred approaches (e.g. easy, flexible command line access) to acting as gatekeeper to the lesser-used functionality of the system unavailable. Anecdotally speaking, I see people in general using right-click almost exactly the right amount for what it is intended to offer users; not regularly, but when needed.

quote: I'd like to know what "most" (and "average skilled business users") is, but anything less than 100% of "average skilled business users" means that right click isn't a hot idea.

I betcha he'd find that anything less than 100% is about the number of people who don't use Ctrl-C and Ctrl-V, who don't use Shift-Up to highlight a line, who fail to use "Save" instead of "Save As" when "Save" is the better option, who don't use PgDn or the spacebar to scroll down large chunks (and just use the arrow keys a whole lot), and so on. I guess there is no part of the interface of a computer system that is a good idea. Let's throw it all away, and just not use computers, then.

quote: So, if a significant portion of users of many apps can't even get to copy/paste, we're really failing.

In applications where copy/paste functionality is an expected common usage, key to the core functionality of the application, provide a couple of buttons at the top of the application window. Provide menu items as well. Great. We're cooking with gas now. Don't eliminate the right-click context menu, though. That way lies madness. There is no silver bullet, y'know.

CharlieSpencer

I'm defining it as upper and mid-upper management, and categories like sales, finance, engineering, legal, publishing, or secondary education professionals.

apotheon

He probably ends up doing his usability testing with people who are prone to end up involved in usability testing. There are limitations to such testing procedures, unfortunately, though they can still prove valuable. Just take "business professionals . . . are" to mean "business professionals . . . are, more often than some other demographics," and you may still get some value from the information. It's also possible he's defining "business professionals and other mid-level users" differently from how you define them.

apotheon

The fact that users must have a minimum level of competence is not my point; it's a supporting argument for my point. My point is that dumbing down interfaces to the point where only the incredibly dumb get any benefit from them is counterproductive. Sure, anyone can use a sufficiently dumbed-down interface, but at some point in the process of dumbing down the interface you discard enough usefulness that most people won't even want to use it, no matter how easy it is to use. It may seem self-evident that continually stupidifying interfaces increases the pool of potential users, but the fact of the matter is that there's more to usefulness than just ease of use -- and past a particular point in dumbing things down, people will rebel against that approach, choosing something else that is still capable of performing the tasks for which they choose their software, even if that means having to spend the fifteen minutes or so necessary to learn to use it. As for garages that hire mechanics who do not know their thumbs from cucumbers and lug wrenches, even when inserted where the sun doesn't shine, my solution for that problem is simple: I prefer to take my car somewhere else.

Justin James

... (out of town on business), but I wanted to respond to some of what you said here with a rather general statement. Your position is essentially, "users must achieve a certain base level of competency". Which I agree with in the long term, without a doubt! The level of general computer use knowledge is woefully inadequate for the tasks people are being required to do. My position boils down to, "if you want to make a product that can be used by as many people as possible, you need to target the lowest common denominator". I would *love* if the LCD here was right-clicks. I would love it if I could deploy an application to mechanics and their employers told them, "guess what, you need to learn how to really use this" and if needed put them through training. But the world does not work that way. When you write software, with rare occasions, you do not have a choice as to which package your potential users choose. For the time being, application users, all else being equal, prefer more streamlined, easier-to-use applications with simplified UI schemes. In some cases they choose the more complex choice for other reasons (Android is a good example; cost of devices is a primary motivator there for most... Quicken/QuickBooks is another good example... Microsoft Windows and Office also come to mind). As someone who does not develop applications with the luxury of a "you must use this" mandate, ease-of-use is a primary concern of mine. It sounds like the applications you work on allow you to afford a "take it or leave it" position, but that doesn't fly for the projects that I am on. J.Ja

apotheon

QUOTE: You say: "anything less than 100% of "average skilled business users" means that right click isn't a hot idea."

No, I don't say that. In my comment to which you replied, the only place where those words appear is a block of text quoting Justin James. In fact, I was making a point similar to the point you made here:

QUOTE: General ignorance of a very useful feature does not make it a bad idea - it makes it a good idea to spread the knowledge.

In other news . . .

QUOTE: You can cut, paste and delete with variations on Insert and Delete with shift keys, or Control-X, -C and -V (Unix convention?).

The Ctrl-X and other control-key shortcuts you mention do not come from Unix. I don't recall off-hand where they appeared first, though. In fact, on Unix systems, Ctrl-C is generally used to kill a process (like Alt-F4 on MS Windows) unless the application captures the shortcut and does something else with it, and Ctrl-V is generally how one tells a shell or editor to print a control character rather than respond to it (so that, for instance, you might get a special ^D printed to standard output rather than ending your shell session with an EOF control character).

HGunter

You say: "anything less than 100% of "average skilled business users" means that right click isn't a hot idea." I disagree vehemently. General ignorance of a very useful feature does not make it a bad idea - it makes it a good idea to spread the knowledge. As another example: an astounding number of people click with the mouse to move from field to field in a form. Many don't even know that the Tab key will do the same thing. This

CharlieSpencer

Just as long as we acknowledge the company bears some responsibility for training employees for tools it requires them to use, especially if the required skills were not part of the original job description.

apotheon

Let a corporation that can't be arsed to train its employees to do their jobs suffer the consequences of its own short-sighted penny pinching policies.

CharlieSpencer

There's probably a string of characters in that sentence that is tripping the spam sensors. We saw it last fall when the filter was set to block ads for 'U g g' boots (without the spaces) and wound up auto deleting posts with the word 'suggestion'. You might want to run the sentence by a PTB, or one of the newly-knighted moderators. Edited - I had to post this three times before I found a way to include the brand name without tripping the sensors.

CharlieSpencer

What about those corporate users who are expected to use a computer effectively, but are given no opportunity for training? There's a difference between not wanting to learn about reverse gear and not knowing it even exists. Home users are on their own. They made a voluntary purchase; let them buy a book or Google for 'How To' information.

apotheon

QUOTE: applications are not used only by business users who can (or should be) be fired for incompetence On the other hand, I don't see anyone designing refrigerators without doors because non-professional refrigerator users cannot operate refrigerator doors effectively. QUOTE: firing people for sheer incompetence is surprisingly difficult much of the time, and in many countries it is near impossible Yes -- and that's stupid beyond belief, just as designing refrigerators without doors because there are people who suck at closing refrigerator doors, or front doors for houses without deadbolts because some people are incomptent to use a separate interaction technique from the door handle, or smartphone UIs that are incapable of presenting context menus because some people never bother to learn how to do anything other than deliver a quick tap, is stupid. QUOTE: Many applications (such as a number that I deal with) are for use by consumers... or professionals for whom computer use isn't a "job skill". For example, I've been working on an application for mechanics to enter workorders. If you require them to enter work orders into a computer, computer use has become a job skill for them. In addition, if the trainer at a garage shows a new hire trainee how to use right-click three times to accomplish part of the work order process and the trainee is just too stupid to get it, I probably don't want that mechanic operating a timing gun when tuning up my car. We're talking about right clicking here, not computer programming. QUOTE: It's a goofy pattern that few folks learn or get used to. Everything is a goofy pattern that few folks learn or get used to until it becomes a norm. Let's look at your example of a better option, though. . . . 
QUOTE: WP7, for example, has a menu bar that can be triggered by selecting something with a tap, so the user has a chance to see the options; I note that objects that support this behavior typically do NOT have a function attached to a single tap; the idea is that you tap it, hoping to see something happen, and you are immediately presented with a list of items. That's friggin' awful. It creates circumstances where people get unexpected results, some of which might be harmful and unrecoverable. I can just imagine someone trying to get a context menu and ending up with activation of some functionality the person did not at all want to have happen. I've seen this happen in Ubuntu's Unity desktop, except that it was an actual right-click that acted like a left-click rather than the equivalent of a left-click that you expect to act like a right-click but ends up acting like a left-click after all (which is even worse than a right-click that acts like a left-click). QUOTE: At the end of the day, though, applications exist in a free market. You can sit back and demand that users gain a basic level of competence if you want... but given a choice of applications, users almost invariably choose the "wrong" way (as you perceive it) out of convenience It's not a free market, but I get your point. I'm not sitting back and demanding that users gain a basic level of competence. I'm saying that there comes a time when you have to say "Well, tough; you don't get to have everything your way," if you're incapable of learning to use a device effectively. If you don't just declare some incredibly stupid subset of users a dead loss and move on with your life, and instead increasingly cater to them, you're ultimately going to end up going out of business because the only people you serve are the people who can't use what you're selling anyway. Everyone else will have started using your competitors' offerings instead, because at least those have functionality. 
At this rate, you'll next be claiming that drag-and-drop is bad on smartphones because it's too similar to swip-to-scroll, or that swipe-to-scroll is the bad interaction technique because of that same similarity, or that both are bad just because they aren't single-tap operations, and they're "goofy" and take a little time to learn to use effectively. QUOTE: Android is the gateway drug for mobile, but iOS is what folks are really happy with. I keep encountering people who resisted smartphones because of the cost, but a cheap Android phone lured them in... and when they got tired of dealing with its issues, they moved to iOS and stayed there. I've seen what amounts to the opposite -- people fleeing iOS for Android, or who resisted getting a smartphone until Android appeared because until then pretty much the only choices were Blackberry, iPhone, or Nokia N-series running an actual Linux distribution (with no service carrier support at all). Meanwhile, Android has been climbing in popularity since its introduction on general availability smartphones, catching up to iOS, achieving market parity, then surpassing it and continuing to climb. If most people who used Android decided iOS on the iPhone was better, Android devices would likely never have even gotten close to catching up with the iPhone, and would have seen huge hits to market share with the introduction of Blackberry touchscreen devices and WP7. Instead, these things do not appear to really have any effect at all on Android market share; iPhone seems to be the loser where Blackberry and WP7 are concerned. 
Sure, some people pick up a cheap, piece-of-crap Android and (when they are disappointed with the hardware) switch to an iPhone and complain about "fragmentation" and "poor quality control" and so on for Android (utterly missing the point) -- but then, there's also a minority of people who try Linux for two hours, give up on it, declare it unusable with no applications, and swear off open source software forever (not realizing they're probably using open source software in their home routers, smartphones, and MS Windows computers). You've seen people try Android with the cheapest piece-of-crap device they can get (Free with a two-year contract!!!), then switch to the iPhone; I've seen people get iPhones because some Apple fanboy raved about them, then regret the purchase and being stuck with a two-year service contract. I think what you describe as some kind of trend is actually just selection bias.

QUOTE: Personally, I see using Gmail as the "wrong choice" and I do my best to explain why, but its ease and convenience overpowers things like privacy

This is completely irrelevant to the discussion at hand. Your complaint that something with a convenient UI is popular despite issues unrelated to the UI is not in any way analogous to my argument that a specific UI feature of Android is not the bad idea you claim it is.

QUOTE: You can be "right" until you are blue in the face, but it doesn't change the fact that the criteria by which you make decisions and the criteria that others use are totally different things.

You have absolutely failed to demonstrate that this is the case here. All you have demonstrated is that you think "everybody" wants one thing and I disagree, pointing out that a lot of people want a different thing, and that disappointing those other people will not automatically make your product a success.

QUOTE: Who am I to try to force others to comply with my vision of what products they should use?

That's a good question. I await your answer.
You seem unaware of the fact that this is exactly what you're trying to do here: mandating that people conform to your perception of good UI design, and that users adopt that perception as their own, when the numbers tell a different story. Even if some majority thought exactly like you expect them to think, though, three vendors serving 51% of the market will each do worse at capturing market share than one vendor serving the other 49% -- especially when the vendor serving the 49% does so by offering functionality, while the three serving the 51% are all offering a lack of functionality that may cause 20% of the market to eventually decide, "Y'know, I don't actually get any benefit out of owning this piece of crap -- it's just a cellphone that's harder to use," and stop buying the things in favor of a plain ol' touchscreen cellphone plus camera with customizable ringtones, Tetris, and maybe -- just maybe -- GPS. Apart from the touchscreen, my second-to-last cellphone fit that description, by the way -- and frankly, a touchscreen would have just gotten in the way.

QUOTE: Until schools and employers start doing a good job teaching computer literacy, that means designing apps for the computer illiterate.

The best way to serve the computer illiterate is to stop trying to convince them they need general purpose computers, thus my mention of "a plain ol' touchscreen cellphone . . ." above.

edit: It turns out that quoting the sentence of yours before the "goofy pattern" sentence was causing TR to eat my comment. WTF is up with that?

Justin James

I see where you are coming from... truthfully, a higher level of competence should indeed be required for holding a job that involves computers. Just as a mechanic would be expected to know how to use a wrench effectively, someone doing a job where the computer is an important tool should know the basics.

That said... applications are not used only by business users who can be (or should be) fired for incompetence (incidentally, firing people for sheer incompetence is surprisingly difficult much of the time, and in many countries it is near impossible... even in the US, a bad hire can haunt a cubicle for 6-12 months). Many applications (such as a number that I deal with) are for use by consumers... or professionals for whom computer use isn't a "job skill". For example, I've been working on an application for mechanics to enter work orders. Some of the mechanics don't even have email addresses, let alone computer skills. If someone made "computer literate" a job requirement for mechanics, they wouldn't have more than a couple of people working for them. So no, for the work I do, I *can't* just take the "you aren't qualified for your job" stance; part of *my* job is to make sure that the userbase can use the application!

Re: right-click... no one in the conversation said right-click should be the only path. That observation was drawn from the example of an input box within IE, where right-click *is* the only path to copy/paste functionality, other than the keyboard shortcuts! So there's a real-world example of an organization making the conscious decision that right-click is the only path to functionality, short of knowing the keyboard shortcuts. I am not arguing with *you*; I am arguing with a real-world example of a horrible decision.

In terms of smartphones, yes, I *am* saying to get rid of the idea of tap+hold to replicate a right-click. It's a goofy pattern that few folks learn or get used to.
The OS/app would need to go extraordinarily out of its way to highlight that a widget has dual behavior. Drag/drop on a widget that also supports a tap is another example. With reduced-capability UIs, users do not expect widgets to have more than one behavior.

There ARE alternatives, by the way. WP7, for example, has a menu bar that can be triggered by selecting something with a tap, so the user has a chance to see the options; I note that objects that support this behavior typically do NOT have a function attached to a single tap. The idea is that you tap it, hoping to see something happen, and you are immediately presented with a list of items. Furthermore, the menu starts collapsed, showing icons for the most common features, but it can be expanded to show the full list. While that adds a gesture to get to the most common item (select, then tap the icon), it is the most discoverable method I've seen in the phone UIs to do this sort of thing.

At the end of the day, though, applications exist in a free market. You can sit back and demand that users gain a basic level of competence if you want... but given a choice of applications, users almost invariably choose the "wrong" way (as you perceive it) out of convenience, and then tend to be happier for it. Android is the gateway drug for mobile, but iOS is what folks are really happy with. I keep encountering people who resisted smartphones because of the cost, but a cheap Android phone lured them in... and when they got tired of dealing with its issues, they moved to iOS and stayed there. As you know, the free market is a very powerful thing. Personally, I see using Gmail as the "wrong choice" and I do my best to explain why, but its ease and convenience overpowers things like privacy; indeed, that's how Google built its business. The relative market share of desktop Windows to everything else is another example.
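The disclosure pattern described above (a tap performs no hidden action; it reveals a collapsed bar of common actions that can be expanded into the full list) can be sketched as a toy model. This is an illustrative sketch only, not the actual WP7 API; all class and method names here are invented for the example.

```python
# Toy model of the "tap discloses options" pattern: tapping a widget has
# no hidden side effect -- it surfaces a collapsed action bar showing the
# most common items, which the user can expand to discover the full list.

class ActionBar:
    def __init__(self, common_actions, all_actions):
        self.common_actions = common_actions   # shown as icons when collapsed
        self.all_actions = all_actions         # shown only when expanded
        self.expanded = False

    def visible_actions(self):
        return self.all_actions if self.expanded else self.common_actions

    def expand(self):
        self.expanded = True


class Widget:
    """A widget whose single tap *discloses* options rather than acting."""
    def __init__(self, action_bar):
        self.action_bar = action_bar

    def tap(self):
        # No unexpected result: tapping just shows the available choices.
        return self.action_bar.visible_actions()


bar = ActionBar(common_actions=["copy", "paste"],
                all_actions=["copy", "paste", "select all", "share", "define"])
widget = Widget(bar)

print(widget.tap())   # collapsed: only the common items
bar.expand()
print(widget.tap())   # expanded: the full, discoverable list
```

The design choice worth noting is that discovery costs one extra gesture but never triggers an unwanted action, which is the trade-off Justin is endorsing.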
You can be "right" until you are blue in the face, but it doesn't change the fact that the criteria by which you make decisions and the criteria that others use are totally different things. Who am I to try to force others to comply with my vision of what products they should use? Instead, I need to deliver to them products that work the way they want, expect, and need them to. Until schools and employers start doing a good job teaching computer literacy, that means designing apps for the computer illiterate. J.Ja

apotheon

quote: [My overall point] that you aren't really addressing well, is that there is a significant portion of the computer-using population that has absolutely no clue what they are doing... yet they have to do it.

My point with regard to that specific point of yours is that we should not design the computing world for the people who, when sitting at their desks at work, cannot be arsed to learn that there's a "reverse" option in the car. Rather, they should just be fired so we can get on with our lives, and people who are aware they might have to use their brains from time to time should be hired in their places. Trying to design UIs for people who will never really care to learn any UI at all that allows more sophisticated work than occasionally pushing a single button without any discernment in the task will not help us get anything done. In fact, it will largely result in people who are competent and willing to do something useful -- whether for themselves or for others -- being hindered in that pursuit.

I refuse to support the idea of removing the reverse gear from cars just because some people are too stupid to figure out that there's something other than "drive" available. I think the difference between our two perspectives is that you look at someone who can't be bothered to figure out how to use "reverse" and say "We need to redesign the car around this idiot!" and I look at the same schmuck and say "I guess that guy's just too stupid to drive." There has to be a lower bar *somewhere*, or we will instead end up with an upper bar that is as low as the lower bar should otherwise be.

quote: I have simply dealt with too many situations where I asked, "did you right-click it?" and the answer was "no" to believe that right-click should be the only "easily discovered" path to access functionality that anyone who is not an "advanced user" would access.

I don't think anyone in the world has ever said that the right-click should be the "only" path here.
You're making that crap up to make a point that doesn't need making, because nobody disputes it.

quote: I'm not saying "get rid of right click contextual menus" of course

It actually looks like you are saying that, effectively, when you complain about attempts to incorporate a right-click-like feature into smartphone OS interfaces. You very specifically said that we don't need it, so it shouldn't be there. You are now talking about offering alternatives, but what you have previously argued is that the iOS approach -- just giving us less functionality, so there's nothing left to put in a right-click-like access path -- is the Right Approach, which is the approach of not offering alternatives, and not even offering the thing for which such alternatives would be "alternative".

CharlieSpencer

I have yet to find a good example or analogy for explaining right-click to the average users. At least daily, a help desk caller will repeatedly ask, "Right- or left-click?" if I use the phrase 'right-click' even once. We know it lists the possible actions, of which the default is assigned to the left-click ("If someone doesn't specifically say 'right-click', 'click' means 'left-click'."), but that's Greek to my users.
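The convention Charlie is trying to explain -- every object has a list of possible actions, a plain ("left") click runs the default one, and a right-click shows the whole list without doing anything -- can be sketched as a toy model. The class, the file name, and the action list below are all invented for illustration; this is not any real windowing API.

```python
# Toy sketch of the desktop right-click convention: left-click performs
# the default action; right-click lists every available action, which is
# exactly what makes it a discoverability mechanism.

class FileIcon:
    def __init__(self, name):
        self.name = name
        # The first action is the default; the rest live in the context menu.
        self.actions = ["open", "rename", "copy", "delete", "properties"]

    def left_click(self):
        """'Click' means left-click: perform the default action."""
        return f"{self.actions[0]} {self.name}"

    def right_click(self):
        """Right-click: list *all* possible actions without running any."""
        return self.actions


icon = FileIcon("report.txt")
print(icon.left_click())    # the default action fires: "open report.txt"
print(icon.right_click())   # the full menu of choices appears
```

The point of the model is the asymmetry: one gesture acts, the other only reveals, and users who never try the revealing gesture never learn the actions exist.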

Justin James

... that you aren't really addressing well, is that there is a significant portion of the computer-using population that has absolutely no clue what they are doing... yet they have to do it. You talk about drivers who don't know how "reverse" works, and that's a VERY apt analogy. I'd love to have a glimpse at Microsoft's telemetry data, but I'd guess, based on what I've seen in the Windows 8 Developer Preview, that they're seeing a very high percentage of users who essentially do not know what the reverse gear does.

I have simply dealt with too many situations where I asked, "did you right-click it?" and the answer was "no" to believe that right-click should be the only "easily discovered" path to access functionality that anyone who is not an "advanced user" would access. I think that the contextually appearing, floating toolbars (like in Office 2007 and 2010) are ideal. Putting things in a toolbar at the top is ONLY a winner if 1) there are few enough functions that what's there isn't easily lost (this is my primary beef with the Office "Ribbon"; it is far too crowded!) and 2) your application doesn't need the screen space (which makes it a loser for certain apps like Web browsers).

I'm not saying "get rid of right click contextual menus" of course... but what I'm saying is that there is a certain non-trivial percentage of the population who are not aware of them, and if they are the only path (or the only path that isn't totally buried) to pieces of functionality, you've created a bad situation. J.Ja