Enterprise Software

Enterprise software is about data -- not people

Justin James says that enterprise software is simply a system that consumes and/or exposes integration hooks and programmatic interfaces, while adding some custom workflow and business logic of its own along the way.

In the last week or so, a ton of industry commentators have been yammering about enterprise software's sexiness, cool factor, and so on. George Ou's post about enterprise software is pretty insightful (which doesn't surprise me because we've talked about this topic a zillion times). One area that I don't think he spends enough time discussing is that enterprise software and desktop/consumer/Web software tend to address wholly different needs.

A hallmark of enterprise software is integration and interfaces with other systems. In fact, I would say that enterprise software is simply a system that consumes and/or exposes integration hooks and programmatic interfaces, while adding some custom workflow and business logic of its own along the way. I can interpret many systems through this lens, from SAP to Exchange. Traditional desktop/productivity applications and consumer applications just do not fit this description; they are about communication, content creation, and content consumption. Web applications and dynamic Web sites are overwhelmingly about content consumption and are increasingly becoming "enterprise-y" on the backend.

This is why enterprise software costs so much and takes so long to install and configure. Popping the CD in and hitting "Next" is the least of the concerns. The headache begins when you need to get the system "talking" to everything else. To make matters worse, no vendor can predict or test every possible combination of OSs, other systems, hardware, and so on. The interface between Widget 5.0 on Solaris and Foobar 8.2 on Windows may work great, but who knows what will happen when Widget 5.0 gets updated to 5.1, and a dinky field in the XML schemas suddenly disallows the NULL characters that Foobar 8.2 used to send? Or what if Foobar 8.2 on Windows insists that the data that it receives from Widget 5.0 use the Windows newline character combination, and Widget 5.0 on Solaris won't send it?
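
To make that concrete, here is a minimal sketch (in Python) of the kind of defensive glue code that piles up around these interfaces. Widget, Foobar, and their quirks are the made-up examples from the paragraph above, not real products, and the function names are purely illustrative:

```python
import xml.etree.ElementTree as ET

def scrub_for_widget(foobar_payload: bytes) -> ET.Element:
    """Strip the NUL characters Foobar sends before Widget 5.1's stricter schema sees them."""
    text = foobar_payload.decode("utf-8", errors="replace").replace("\x00", "")
    return ET.fromstring(text)

def scrub_for_foobar(widget_payload: str) -> bytes:
    """Convert Widget's Unix line endings into the CRLF pairs Foobar insists on."""
    normalized = widget_payload.replace("\r\n", "\n").replace("\n", "\r\n")
    return normalized.encode("utf-8")
```

Multiply that little scrub routine by every pair of systems, every version bump, and every platform combination, and the integration bill starts to make sense.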

In enterprise software, what the user touches is a very small fraction of the data, workflow, or decision-making process. In comparison to other applications, the user has little insight into what is happening "under the hood" when they use an enterprise application. The transparency that desktop application users value ("that happened because I clicked the button") is flipped around ("I clicked the button and magic happened").

This is why enterprise applications tend to have such miserable interfaces that look like a dinky Access form with a toolbar slapped full of buttons that make little sense. It is grocery store functionality -- everything you can possibly imagine under one roof, mostly grouped in some fashion but without much logic to the grouping. In a way, it makes sense. If the usage of the software could be more linear, chances are, the system would not need the user in the first place. The user interface handles the relatively rare (measured by number of bytes handled) instances where human intervention is needed. The rest of the data manipulation happens automatically.
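
As a rough sketch of that split (the field names and validation rule here are hypothetical, not any particular product), the pattern looks something like this: the automated path touches every record, and only the exceptions ever land in front of a person.

```python
from queue import Queue

review_queue: Queue = Queue()  # the only part a user interface would ever display

def post_invoice(record: dict) -> None:
    """Automated path: runs for every record, no human involved."""
    if record.get("amount") is None or record["amount"] < 0:
        raise ValueError("amount missing or negative")
    # ...write to the ledger, fire off the downstream interfaces, and so on.

def process_batch(records: list[dict]) -> None:
    for record in records:
        try:
            post_invoice(record)
        except ValueError as problem:
            # The rare case (measured in records, let alone bytes) where a
            # person has to step in; everything else flows through untouched.
            review_queue.put({"record": record, "reason": str(problem)})
```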

Another hallmark of enterprise software is that it is always on. Even when you are not working, it is working. In the "Mad Max" future, SAP, PeopleSoft, and Exchange systems will still be swapping packets, while the rest of our civilization is a smoking crater. These systems are the cockroaches of code. Look at those old COBOL mainframe applications; they are the template from which all enterprise systems are designed. They will not (and probably cannot) ever die. They keep mutating in the environment, but the internals will remain mostly unchanged for decades.

These systems provide great opportunity for employment and profits for programmers -- someone needs to keep them running and make changes. Moreover, hooking them up is big business. It is a poorly hidden fact that integration costs often far outstrip the original price of the software. Imagine paying someone $1,000 to spend three weeks hooking up an inkjet printer, and you get an idea of the situation as it stands. I would never recommend extorting a customer to keep an aging system running or deliberately designing a system to sell cheap and integrate pricey. Nevertheless, this is a fact of our business: If you are interested in $150/hour to $300/hour consulting fees, learn to integrate one of these contraptions because these systems are only good for integrating to other systems. What the user touches is ignored, and the developers working on these applications would prefer that users never touch them.

I think it is a shame that the users are the ones who suffer for this, and that it is business owners and stockholders who foot the bill. Until businesses end their reliance on the data processing mindset (fat chance), the world of enterprise computing will remain the same in spirit. The platform might shift a bit -- for instance, maybe Web 2.0 will get some mindshare, and mainframes like AS/400s will lose a bit of market share -- but "it is what it is."

J.Ja

About

Justin James is the Lead Architect for Conigent.

47 comments
Absolutely

The reason for the aesthetic differences you described (accurately, I'll agree!) is that desktop application software is designed for users of client machines to click a button because they want to, and enterprise application software is designed for users of client machines to click a button because they are paid to. There is no reason those GUIs should be as "sexy" as what they use on their own machines (Robert Scoble). Frankly, I think the ability to change color schemes in Oracle is frivolous, and the comments that started this are even worse!

[i]A hallmark of enterprise software is integration and interfaces with other systems. In fact, I would say that enterprise software is simply a system that consumes and/or exposes integration hooks and programmatic interfaces, while adding some custom workflow and business logic of its own along the way. I can interpret many systems through this lens, from SAP to Exchange. Traditional desktop/productivity applications and consumer applications just do not fit this description; they are about communication, content creation, and content consumption. Web applications and dynamic Web sites are overwhelmingly about content consumption and are increasingly becoming "enterprise-y" on the backend.[/i]

That is very good support for your choice of the word "yammering"! Regardless of who is the user of the application, the purchaser of it decides the context in which it works. Desktop applications let people create whatever content they wish, share it as they wish, and in general make decisions of a type that would not be appropriate to leave up to them at work, ie, the enterprise. "Boo-hoo!"

[i]If the usage of the software could be more linear, chances are, the system would not need the user in the first place. The user interface handles the relatively rare (measured by number of bytes handled) instances where human intervention is needed. The rest of the data manipulation happens automatically.[/i]

In addition to bytes handled, we might look at dollars earned. Either way, these unfortunate users of imperfect interfaces don't have to use them. They can quit.

[i]I think it is a shame that the users are the ones who suffer for this, and it is business owners and stockholders who foot the bill.[/i]

I don't think it is a shame because although business owners and stockholders do foot the bill, I don't believe that the above differences in design do lead the bills that we foot to increase, and I don't believe the end users "suffer" as a result. The interfaces they use while they're being paid need not give them more flexibility than their employer wishes to give them. They should stop whining, and those other writers should stop encouraging them to whine.

[i]Until businesses end their reliance on the data processing mindset (fat chance), the world of enterprise computing will remain the same in spirit.[/i]

Good. That mindset is responsible for the parts of the dot-com boom that remain. The mindset that more so-called "visibility" + pretty interface = good software was responsible for the dot-com bust, and the ongoing malware & data security problems. "Content creation" does not necessarily equal value creation. Ironically, malware and data security problems are disproportionately experienced by users of the operating systems affiliated with the guy who George Ou cites as setting off this topic!

Let that guy have more influence, and there will be nothing left of IT but video game players that cannot connect to each other because all the servers will be too busy with all the botnets & spam.

Mark Miller

Paul Murphy over at ZDNet has written about this some. He's written a few posts about his view that the way IT is structured, as it's been since the early 20th century, is as an extension of bureaucracy. Bureaucracy is about policy and regulation. For that they need data. I am by no means bashing bureaucracy. The way I think of it is a take-off of what Alexander Hamilton said about debt: "A little bureaucracy is a blessing." Bureaucracy helps keep things organized, and used properly can help focus a team on a goal, and keep up quality standards. The problem occurs in large organizations that were formed in the era when top-down organization was considered the norm (perhaps it still is). Here, the bureaucracy takes on a life of its own. It lives to maintain itself, and the software reflects that. I've written before on my blog that a lot of times programmers are not really given tasks that use their computer science skills. Often they're tasked with plugging "this" into "that" in software, occasionally writing something for an edge case in the infrastructure. What you've written here helps fill this picture out a little more. The complaint I've heard about enterprise software is it tries to be this monolithic thing that's all things to all people, which ends up being a disaster. The alternative I've heard from some is a more bottom-up approach of addressing problems with computing as they arise. So initially computer solutions are fragmented, letting "a thousand flowers bloom", and eventually they get integrated. Even so, it's helpful to have some policy around this, because if employees pick up a bunch of different off-the-shelf software it may be impossible to integrate them later. This may be where the enthusiasm for open source/open standards software comes from. Maybe it's better at allowing this organic process, the result of which can be integrated down the road more easily.

LocoLobo

What is the definition of Enterprise Software? We're not just talking about the difference between Server "Standard" and "Enterprise" versions are we? It seems the articles are talking about more than that. Sorry if I'm slow.

Jalapeno Bob

Enterprise software is about data - true, but user interfaces are about USERS. The poor quality of enterprise software UIs costs companies millions - in lost productivity, training costs and employee dissatisfaction leading to turnover. As any retail software vendor knows, intuitive UIs lead to profits. For the business owner, intuitive UIs cut training costs, reduce employee mistakes, reduce employee frustration and allow the employee to complete more work. This ROI (return on investment) may not be measurable in one or even two fiscal quarters, but is measurable over a fiscal year. The problem is that many senior executives, especially those from sales or marketing backgrounds, just do not understand "production work" and what helps or hurts the workflow.

Professor8

Yep, "enterprise software" involves massive privacy violation: retaining information beyond the relationship, integrating, i.e. using information for purposes other than what a customer or employee would reasonably expect... They must be eradicated. Get out the herbicide, the pesticide, mix concrete with Round-Up and pour over these ERP annoyances. Give me liberty from enpire-building B-school bozos or give me death!

georgeou

It's the mindset of "if it isn't broke, don't fix it." IT organizations (especially when it comes to enterprise software and critical systems) are extremely conservative. Very little ever changes even if you can show that brand new hardware costs 1/10th the maintenance fees. Just trying to convince people that tapes are a bad way to back up will get you some evil eyes in certain circles.

Justin James

You are right that a major difference is the idea of employment. When using a piece of software is a requirement of the job, the employee has little vested interest in whether or not the software is any good, other than whether or not the inefficiency or overall suckiness of the application will affect the annual performance review, and whether or not it is so frustrating to use that job satisfaction is affected. I would hope, though, that a smart employer would choose software that allows their employees to be more efficient, and happier with their jobs, *all else being equal*.

There may indeed be some system that is so amazing on the backend, that a miserably bad UI is forgivable. I believe that the Oracle DB fits neatly in this category. So do systems like the AS/400. They are so good "behind the scenes" that a "put up with it or leave" mentality can work. But it is still not ideal nor optimal (two slightly different concepts).

At the end of the day, these products tend to be quite mature. When you look at the thousands of programmers who have been involved with developing these systems over the years, it is difficult to understand why no one budgeted for a usability expert, or a good designer. Heck, explain to me why Oracle still has Mickey Mouse administration tools... no one spends big bucks on a tool to admin SQL Server, MySQL, Sybase, DB2, but they'll spend money on TOAD to compensate for Oracle's lousy tools. At the end of the day, I think the customers are complacent.

Mark's point is valid too. I think that there is a time and a place for the data processing mindset, but even in that time and place, there is no legitimate reason why these systems should *completely* ignore the parts where people put their hands on the mouse & keyboard.

J.Ja

Mark Miller

[i]The reason for the aesthetic differences you described (accurately, I'll agree!) is that desktop application software is designed for users of client machines to click a button because they want to, and enterprise application software is designed for users of client machines to click a button because they are paid to. There is no reason those GUIs should be as "sexy" as what they use on their own machines (Robert Scoble). Frankly, I think the ability to change color schemes in Oracle is frivolous, and the comments that started this are even worse![/i]

I agree with you that color scheme changes are not really functional. They may make the workspace a little nicer to work with, but other than that they add no value. I think you give a false argument though, because you present a choice of either this, or the "paid to use it" interface. Your analysis leaves me cold. User interfaces should be an extension of the user's mind. That was the idea from the beginning. If you want to go down the road of "use it because you are paid to do it", go back to punch cards for crying out loud! I'm sure they're just as incomprehensible as the "paid to use it" interface. Why the hell are you using a GUI anyway, because it's fashionable?

[i]That is very good support for your choice of the word "yammering"! Regardless of who is the user of the application, the purchaser of it decides the context in which it works. Desktop applications let people create whatever content they wish, share it as they wish, and in general make decisions of a type that would not be appropriate to leave up to them at work, ie, the enterprise. "Boo-hoo!"[/i]

If you have people who either don't know what their job is and don't care, or who aren't appreciated for their talents, then I can see how you have to "lock them down" to get them to do their job right. The problem is this is not an efficient way to get work done. You don't get any creative problem solving from the majority of your workforce. That all happens at the top of the pyramid in their "infinite wisdom". Company culture is a very difficult thing to change. Cynicism like yours, though, is part of the problem. It's basically an attitude of "This is as good as it gets". Have you ever noticed that there are some large corporations who thrive and yet there are others who are going down the rathole right now? Ever wonder why?

[i]In addition to bytes handled, we might look at dollars earned. Either way, these unfortunate users of imperfect interfaces don't have to use them. They can quit.[/i]

Gee. Nice attitude. I bet that really attracts talented people...not. If the work just involves button-pushers, then yeah, go ahead and replace them with machines. You'll be doing them a favor.

[i][The data processing mindset] is responsible for the parts of the dot-com boom that remain. The mindset that more so-called "visibility" + pretty interface = good software was responsible for the dot-com bust, and the ongoing malware & data security problems.[/i]

I beg to disagree. I wouldn't lump Google in with "the data processing mindset", as just one example. It seems to me you're jaded by the "pop culture" that's grown up around the web. If so, I agree. I don't like it either. You're ignorant if you think, though, that the only choice is between the pop culture and the data processing mindset. The data processing mindset began in the early 20th century. Newer ideas came along in the latter third of the 20th century that bring out the power of computing, not just make it "pretty". Some of those ideas have since been adopted in badly designed platforms, which has led to the problems you've seen with it. I'm sorry this has discredited those ideas in your eyes. It's about time they were rediscovered and put to good use (in well designed platforms). Now is not the time to fall back on the "tried and true". Those ideas were good in the absence of something better.

A good part of the blame is us. I don't mean just us engineers. I mean us as a people, a workforce. What we need is a workforce that is ready and willing to learn, and believes in the human potential that is created in the presence of computer technology. Imposing IT policies from on high does not lead to efficiency and effectiveness. Instead what comes about is the result of a small group's myopic view of these traits. You'll find efficiency and effectiveness is not their primary goal, but rather "don't screw it up", which leads me to wonder "what's the point?" Get rid of the incomprehensible crap and work with something the workers can actually understand and use effectively. We can be better than this. If anyone's wondering why foreigners are kicking our @#$, this is it.

Justin James

The number one problem with E.S. is that all too often, the developers (and integrators) try to turn a collection of various vendors' unrelated products into an integrated whole. In other words, they are using a general purpose, non-company specific software mix (SAP, Siebel, Exchange, Oracle apps) with a DB of some sort (the customer's DB of choice, of course), a general purpose OS (*Nix or Windows), all running on general purpose, commodity hardware... and try to turn it into a cohesive whole like an AS/400, with a workflow and logic customized to their needs. The system ends up having so much of its data structure and logic in an overridable condition, it is ridiculous. Imagine a car designed so that if the customer prefers Ford motors, one can be put in, even though it is a Chevy, or if the customer likes RWD (or AWD) better than front-wheel drive, that is a "bolt on" part replacement. No way. Yet we demand precisely this from E.S. systems. And on top of that, the UI needs to somehow work no matter what the customer did underneath the hood. I really wonder sometimes how the mainframe/minicomputer era actually came to an end, because it is pretty clear to me that many customers actually want an AS/400, not a *Nix or Windows box. J.Ja
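
A minimal sketch of that "overridable condition", with hypothetical class and method names rather than any real vendor API: the business programs against one seam, and the vendor-specific plumbing bolts on underneath.

```python
from abc import ABC, abstractmethod

class OrderBackend(ABC):
    """The cohesive interface the business actually wants to program against."""

    @abstractmethod
    def submit_order(self, order: dict) -> str:
        ...

class SapBackend(OrderBackend):
    def submit_order(self, order: dict) -> str:
        # translate the order into whatever the SAP-side integration expects
        return f"SAP-{order['id']}"

class SiebelBackend(OrderBackend):
    def submit_order(self, order: dict) -> str:
        # same business operation, completely different plumbing underneath
        return f"SBL-{order['id']}"

def place_order(backend: OrderBackend, order: dict) -> str:
    # the "bolt on" seam: swap the backend and the rest is supposed to keep working
    return backend.submit_order(order)
```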

CharlieSpencer

"We're not just talking about the difference between Server "Standard" and "Enterprise" versions are we?" Nope. We're talking about application software designed for running large corporations. I haven't read the source article in a couple of days, but I recall it featured strong emphasis on enterprise or corporate resource planning (ERP / CRP; what use to be manufacturing resource planning (MRP)), SAP- or MAPICS-type apps although there are others. These server or mainframe apps often form the backbone of corporate data systems, usually require a staff of their own for care and feeding (often contracted or outsourced), and are major investments usually require a couple of years to implement and intended to run for over a decade. Round-the-clock availability is a given, as is network access across regional or often transcontinental networks. Compatibility with and between these apps is usually a requirement for other potential software purchases.

NickNielsen

The end user can be trained to use a non-intuitive interface (vi comes to mind), but nothing frustrates users more than a UI that actually hinders task completion. The two worst, in my experience, are the "back-to-home" and the "zig-zag." The back-to-home UI requires that the user drill down through two or more menu screens just to reach actual transaction screens and returns the user to the home screen after each transaction. The zig-zag UI advances the cursor out of sequence with field position on the screen (i.e. from Employee Number to Street Address to Shop to Name to City, etc.). These and other frustrating UIs point to a distinct unconcern with the actual user of the systems on the part of both management and the developers.

seanferd

Admission: I only have experience with enterprise systems in the form of AS/400 terminal use. It was a fulfillment-type inventory, client, and customer info database. Sure, sometimes this software can look like a pig to the user. Some things may not be implemented very well, but this probably isn't so much the case with software purchased from a major vendor. The UI can be a bit convoluted for the user, however.

What I wonder about is, are complaints about the UI more prevalent since the advent of widespread PC use at home or in the office? The simple point-and-click interface of Windows has led people to expect that anything with a keyboard and screen should be easy and intuitive. Even this expected ease-of-use is somewhat mythic, or there wouldn't be so many help websites and books for those who want to know: "How do I...", or "Why doesn't this work."

Enterprise software is probably much like other tools: one needs to learn how to use it. We could look at this in terms of some entirely different industry. What about an operation with a lot of heavy machinery? What would we say about an engineer who complains that he has to check gauges here, open valves over there, check parts for wear all over, and power up units individually, etc.? Unfortunately, this fellow doesn't live in the world of Star Trek where you can re-route this, re-configure that, or modify whatever bit of complex hardware by poking 3-4 spots on a touch-pad.

Yes, I'm sure that some companies could save time and money, and lower employee frustration by building more intuitive front-ends for their business apps, but the employees who use them must also realize that they get paid for a reason. Their interaction with the system is minuscule compared to the overall operation of same. I am sure there are valid complaints from the users of enterprise software, but I will bet the most complaining arises from sheer laziness. I've seen it. Perhaps some of these folks would rather hang iron 500 feet off the ground in freezing temperatures at wind speeds of 30 mph. Then they can talk about how hard it is.

Note: I am really not trying to be an a$$ here. If my comments don't apply to you, I'm not talking about you.

Justin James

I agree completely, and have written quite a bit about issues of usability. The reason why usability gets ignored in these things is because everyone is so focused on the data itself, they forget that a real person has to use it at some point. The "project sponsor" execs get tied up in making sure that they get the reports they want to see, and forget about the worker bee who puts that data into the system. J.Ja

CharlieSpencer

"'enterprise software' involves massive privacy violation: retaining information beyond the relationship, integrating, i.e. using information for purposes other than what a customer or employee would reasonably expect..." Care to justify any of those positions? Like any other technology, software can be used properly or misused. Just because one company misuses a technology is no reason to deprive those who use it ethically as an effective business tool. Should we ban automobiles because of drunk drivers? How about electricity? It can start fires, you know. Let's just climb back up into the trees tonight and huddle beneath the Moon Goddess.

ManiacMan

but they are so old school in their thinking that they simply won't go for the idea. I've even shown them how much time and money they would save by consolidating everything onto the newest generations of blade servers, but they're still resistant to adopt it. Some companies will never change because they're afraid of that which they don't know or understand.

Tony Hopkinson

or customised shelfware, "the interface is a bit crap" is not going to be a justification for a lot of expenditure. Data integrity, ie, a stock database that loses stuff, that you can justify. Most of the time you can piggyback some UI fixes on top when you are in the code anyway.

Absolutely

[i]I would hope, though, that a smart employer would choose software that allows their employees to be more efficient, and happier with their jobs, *all else being equal*. There may indeed be some system that is so amazing on the backend, that a miserably bad UI is forgivable. I believe that the Oracle DB fits neatly in this category. So do systems like the AS/400. They are so good "behind the scenes" that a "put up with it or leave" mentality can work.[/i]

That phrase "all else being equal" really got me thinking, still in my "data processing mindset," that what you're calling "behind the scenes" is where the essential aspect of the [i]productive[/i] purpose of the entirety of any business computer system occurs -- whether "enterprise"-scale or not. Some people forget, in discussions of employees especially, that businesses exist to profit. But that is the truth, and the purpose of the "end user," [b]in the context of this discussion[/b], is corporate profit, which is achieved by performing the processes "behind the scenes." Until other "behind the scenes" systems are uniformly as good as AS/400, or at least very close, those which invest instead in user interface-type improvements to essentially inferior systems -- defined in the terms of the "data processing mindset"! -- will disproportionately fail. Probably at great expense to the economy, as we saw in the bursting of the tech bubble, because people with the stupidest, lousiest, superficially insightful but essentially worthless ideas get the most practice making their worthless schemes seem "exciting."

[i]But it is still not ideal nor optimal (two slightly different concepts).[/i]

A prerequisite of being ideal or optimal is being good enough. The market is cluttered with products that are not, and [b]that[/b] is why the fine point of an optimal or ideal user interface cannot be produced -- the market is not sufficiently competitive that the investment of time necessary to create one can deliver any competitive advantage. Those that could realize marginal gains in efficiency are already using better "behind the scenes systems," while those most likely to adopt prettier user interfaces use crap "behind the scenes," and will inevitably fail.

[i]At the end of the day, these products tend to be quite mature.[/i]

It's an old story, really. "Slow and steady wins the race."

[i]When you look at the thousands of programmers who have been involved with developing these systems over the years, it is difficult to understand why no one budgeted for a usability expert, or a good designer. Heck, explain to me why Oracle still has Mickey Mouse administration tools... [/i]

I cannot rationalize that, belief in UFO abductions, or any other variant of insanity for you. Beware the man who can.

[i]At the end of the day, I think the customers are complacent.[/i]

Well, of course they are, they have all that morass of government regulation discouraging all but the most obsequious from ever entering the market to "compete" against them! You'd be complacent, too, if your life was that easy!

[i]Mark's point is valid too.[/i]

I'll grant that it's partly valid.

[i]I think that there is a time and a place for the data processing mindset, but even in that time and place, there is no legitimate reason why these systems should *completely* ignore the parts where people put their hands on the mouse & keyboard[/i]

I agree fully, there is no *legitimate* reason, but the reason that exists is not within the power of software designers to redress, profitably. Anybody who tries to tell you otherwise may as well be selling you beachfront real estate on the moon.

Absolutely

Mark Miller: [i]It seems to me you're jaded by the "pop culture" that's grown up around the web.[/i] absolutely: I might say "jaded" is a euphemism, but you're approximately right.

Mark Miller: [i]If so, I agree. I don't like it either.[/i] absolutely: Why not? What about it?

Mark Miller: [i]You're ignorant if you think, though, that the only choice is between the pop culture and the data processing mindset.[/i] absolutely: And you're projecting if you think that's what I said.

Mark Miller: [i]The data processing mindset began in the early 20th century.[/i] absolutely: It has proved its potential to do useful things.

Mark Miller: [i]Newer ideas came along in the latter third of the 20th century that bring out the power of computing, not just make it "pretty". Some of those ideas have since been adopted in badly designed platforms, which has led to the problems you've seen with it. I'm sorry this has discredited those ideas in your eyes.[/i] absolutely: "Newer" does not always mean "better." Let's try not to forget that some new ideas are good and implemented well enough to be recognized as such; others are good but implemented badly enough to seem like bad ideas; and unfortunately, some ideas are so lousy that the skill of those responsible for implementing them has no chance to help the idea seem good, or bad.

Mark Miller: [i]I agree with you that color scheme changes are not really functional. They may make the workspace a little nicer to work with, but other than that they add no value. I think you give a false argument though, because you present a choice of either this, or the "paid to use it" interface. Your analysis leaves me cold. User interfaces should be an extension of the user's mind.[/i] absolutely: Why should they? To accomplish the user's task, which is set by ... ? Different answers at home than at work. That's just a fact of life, and if that seems "cold" to you, so be it.

Mark Miller: [i]That was the idea from the beginning.[/i] absolutely: From the beginning of what? I think you place more emphasis than necessary on making the interface "easy." Where the user is capable of doing useful work, that user is capable of learning. Where it is necessary to make interfaces so easy that they are "an extension of the user's mind," you may as well get rid of the users and have programmers capable of making such an interface do [i]all[/i] the work.

Mark Miller: [i]If you want to go down the road of "use it because you are paid to do it", go back to punch cards for crying out loud! I'm sure they're just as incomprehensible as the "paid to use it" interface.[/i] absolutely: Punch cards were not incomprehensible to those who operated them. You're apparently conflating my argument with the antithesis of your position, to conclude that I prefer inconvenient user interfaces, [i]other things being equal[/i]?

Mark Miller: [i]Why the hell are you using a GUI anyway, because it's fashionable?[/i] absolutely: In a very real sense, yes. The market forces that have replaced command lines with GUI's have made GUI's the "only thing out there," for entry-level consumers, for some time. I've actually found that I can work more efficiently with a more elegant and stable code base, operating at a command line, than in the leading manufacturer's bloated, unstable, inefficient GUI. To the extent that a GUI is more efficient for a particular data task, I'll use it. I often use software with a GUI, even in Linux. However, the fact that a tool takes some time to learn to use properly does not constitute a failure on the part of that tool's designer. That does not imply that I mean that interfaces should be made deliberately more difficult than necessary for the end user. It does mean that I have accurately recognized the fact that at home, one's computer exists to serve the end user, but at the office, using a computer owned by somebody else, the same expectation is not reasonable.

Mark Miller: [i]I don't mean just us engineers. I mean us as a people, a workforce. What we need is a workforce that is ready and willing to learn...[/i] absolutely: Take note, you begin to contradict yourself here.

Mark Miller: [i]...and believes in the human potential that is [b]created[/b] in the presence of computer technology.[/i] absolutely: Human potential exists independently of computer technology. Computer technology did not create human potential; humans created computer technology, and other humans can certainly [u]learn[/u] computer technology without mollycoddling user interfaces, that are "an extension of the user's mind." Whatever effect computer technology has on workers, it can only build on our potential, not [b]create[/b] any.

Mark Miller: [i]Get rid of the incomprehensible crap and work with something the workers can actually understand and use effectively.[/i] absolutely: Where did your "workforce that is ready and willing to learn" just go? The way that competent workers "get rid of" what is "incomprehensible" to us is to learn, ie expand our knowledge, not to expect somebody else to redesign everything down to the level of our current knowledge.

Mark Miller: [i]We can be better than this. If anyone's wondering why foreigners are kicking our @#$, this is it.[/i] absolutely: True, but in the context of my statements more than yours. Our economy is soft because too many of "us" are whiny little lazy crybabies, and because they have enough enablers to get away with it, for now.

Tig2

Is magic. They want you to do magic and make it work the way they want. Of course that doesn't take into account if what they want is the right thing for EVERYBODY. There comes a place where we have to draw a line in the sand and simply tell business when it can't be done. Or when it CAN but is not feasible. Somewhere along the way we have gotten accustomed to tolerating scope creep until business can't get their needs met. Then we de-scope and declare victory. And no- most end users do not want to confront the pain of actually LEARNING the box on their desk. They want to be productive, which has nothing to do with actually learning something new.

Mark Miller

Perhaps this is the result of compromise. The "PC revolution" was perceived by the business community as a rebellion against the mainframe culture. As I understand it, mainframe systems were more centrally managed than the IT departments of today. Some people really chafed at this. They wanted computing time when they wanted it, where they wanted it. The "priests" of the mainframe system told you how it was going to be. Period. Secondly, computing was not distributed. It was centralized. So a large company might have only one or two mainframes, and everybody was competing for computing time on it.

The way the "PC revolution" turned out is people could buy small versions of minicomputers that were economical, and you had your own private computer you could do anything with, when you wanted to. No gatekeepers. There were the attendant problems of PCs when they first started, like system instability and software incompatibility. The "internet revolution" was a "netification" of this. Processing would be distributed, but on managed servers, not individuals' PCs. Ironically, the web resurrected the mainframe end-user interface to all this.

So I think what you're seeing is the result of the "success" (it's a mixed bag) of attempts at decentralizing big system processing. You have distributed processing on hardware, you have grassroots computer installation (like in the "PC revolution"), and you have the desire to unify all of this into a cohesive whole. It seems like what permeates this whole evolution is the desire to "democratize" computing, like it's some sort of cause. Democratizing is fine, but the result has been democracy with little understanding of what the end goal is. You have people making choices that are ill-considered. It answers their needs for their neck of the woods, but not the needs of the organization.

Like I was saying, maybe this is where OSS comes to the rescue. It takes some effort, or mucho cash, to pay someone who understands it, but at least the pieces can be adapted to work as a cohesive whole after the fact. Closed-source software complicates the situation, because integration is either on the vendor's terms or on terms of a third-party supplier that's been able to license access to the parts that need to be integrated. It's like the goals are mixed up: democracy with limited access to its tools.

Another problem, IMO, is cultural. There's not only been a "let a thousand flowers bloom" attitude towards IT development in some cases, but also among software developers. This isn't as bad for the situation you're talking about, but it leads to an excited attitude about "gap filling", instead of really taking software and system architecture into consideration. The result has been an unpleasant computing experience, which discredits the whole enterprise (of IT, that is). The sense I've gotten is that IT is largely viewed by business as a "necessary evil". I get the sense that this is what the post-er "absolutely" was reacting to. IMO the answer is not to go back to the past, but instead take the good ideas that inspired this ill-considered mess and apply them in a well architected system. This is easier said than done right now, because most people are not familiar with the true power of the newer ideas. There's less easy money there as well. The money is going into "gap filling": "Ooh! Look at THIS cool thing!".

LocoLobo

put in that context, some of the comments in the referenced articles make more sense. Not something I have to deal with much here. We're reasonably small (20-30 people, 60+ computers), but our parent organization is larger.

seanferd

The odd thing is, sometimes "bells and whistles"-type "improvements" are made to a UI at the behest of certain users, while really inefficient core issues go unresolved. I think you really hit the nail on the head as to what actually needs to be improved in the UI, whether it is green screen or a GUI. Related functions should always be available no matter which "screen" one is on. Any functions that apply across compartments (for the lack of a better term) should be available on-screen without backing out or drilling down. It becomes difficult to reference two or more different but related data sets when they can't be made available at the same time, much less having them 2 to 10 screens away from each other. Sure, there are printers and pencils, but these just add to waste of time and material.

The thing that always bugged me is what I stated in my first paragraph: That the folks who do drive changes to a UI frequently aren't bringing about real improvements. I do agree that better work flow should be built in from the ground up, after all, humans are expected to interact with these systems. Even when 99% of processing time is due to automation, if there are more than a fistful of users forced into time-wasting acrobatics, it is costing a company money, and possibly worker morale.

As to the zig-zag phenomenon, I have even seen the cursor disappear completely. This was on a green-screen, and what happened was that the cursor had parked itself in some solid green space at the bottom of the screen, becoming entirely camouflaged. Arrows or back-tabbing didn't always work, and most of the users in my department wouldn't even think of trying these solutions anyway.

I think that Nick here has narrowed in on what *should* be improved in an enterprise UI. I agree that 'blame' rests with the lack of concern of vendor-developers, in-house developers, and management. I think that it depends on the situation in deciding who actually is responsible in a particular case. When changes *are* made, frequently they seem to be in response to the wrong users.

There is much room for improvement in enterprise software UI. However, I still think that much of the complaining about this software is due to sheer laziness, the desire for eye-candy, or the belief that enterprise software should be just like the PC software that most users are used to using. I have even noticed the converse of this issue: That some older users find the modern PC utterly incomprehensible, whereas they have no problem with any mainframe or enterprise system they might use, or old DOS PCs for that matter. It's VAX over Vista for some of these folks. While this may simply be a disinclination to learn something new, the same holds true of some of the younger crowd, who don't want the bother of understanding how enterprise systems work. Heck, a lot of them can't be bothered with learning how to use their more favored PCs either. They just want everything to look and work like their one or two favorite programs.

>>> Happy New Year, folks. :)

ray

Yes, the complaints about UIs are more prevalent since the advent of the PC. As we users mature, we become more demanding because we have experienced and learned more - we have a more extensive pool to draw from.

The intent of a SW system is to simplify a work process - allow it to be accomplished safer, in less time, and with greater accuracy (fewer errors) - cheaper. Fundamentally, the problem is the people that buy the systems; they are not the users; they are not even the consumers of the results. They are too removed from the work process to understand the value of simplicity. This is a huge disconnect: We have the people that hold the seat, the people that justify the expenditure for that seat, the people that program the interface to that seat, and the marketers that sell the seat. This is a dispersed, disparate group spanning multiple companies. The industry is too immature to have this coordinated yet; but it will come in time. The advantage being that software is so mutable, we won't get locked in. The COBOL nugget that is so securely wrapped today will be displaced as the understanding of business needs progresses. Learn the tools? Change the tools and the rules change. The intent is to have the system perform as much of the work as possible. The user should just be handling exceptions.

Star Trek? It is here today. I design control systems for refineries and chemical plants. What we saw on Star Trek is here today - it is here where it is needed; when the right people get together. It is a shared experience; each member of a team brings their expertise. In a chemical plant, the string is long: Business leaders have markets to satisfy and maintain, production engineers have profit margins to stretch, operation engineers have to keep the system safe and working, operators handle exceptions. When a change is needed, they bring in an engineering, procurement, construction team to help. All these bring their expertise. A good project manager coordinates the requirements to deliver the business solution through technology - Star Trek in Beaumont, TX or Shedgum, Saudi Arabia.

This business model is more mature than the software systems industry. The network of requirements and solutions has been exercised extensively. But not as much as the steel industry where we have an accountant on each team. The key: Money. What is the most cost effective solution? More important - what is the problem?

Absolutely

End users of inventory systems did not always have a basis for comparison. As a result, they learned to use the tool their employer paid them to learn. Now, just because both are "computers," some get the expectation that work should be as easy as composing an e-mail. That is not realistic.

Absolutely

[i]"'enterprise software' involves massive privacy violation: retaining information beyond the relationship, integrating, i.e. using information for purposes other than what a customer or employee would reasonably expect..." Care to justify any of those positions?[/i] I take it back; it's also possible he's just trolling and has no clue whether what he says is accurate.

Mark Miller

Sorry about that. I've been busy lately so I may have missed my chance to continue this with you, but I was using some shorthand. What I meant was in terms of being sympathetic to the user, the GUI they created came closer to the way the mind works than the previous methods of interaction that had been tried. Not that the system was closer to becoming artificially intelligent, but rather it was an environment that supported what human minds tend to do already--namely try out ideas. One of the leaders in the development of the system was Alan Kay, who believed contrary to the "data processing mindset" that the computer should be an [i]extension[/i] of the user's mind (not a replacement for it). It would help a person think about phenomena and probe problems for solutions by providing an environment for doing that, which used cues the user could understand readily. A fair amount of cognitive science went into it. There was some computer literacy involved for the user, but not nearly as much as the more traditional IT systems of the time.

Absolutely

*Other things being equal* better software adds less difficulty to the end user's task; it cannot take away *all* the challenges of work, but should take away more of them than it introduces, otherwise it is a net loss.

Absolutely

[i]The idea for the GUI format that we're familiar with today, for example, originated with a team that developed the first version of it at Xerox PARC about 30 years ago. The intent behind the project was to make the computer come closer to working like the mind works, to facilitate people being able to create their own mental models in the computer and run them as simulations. You might say, "Okay. So what?" These ideas have been embodied in word processors and spreadsheets, tools which have led to workers being more productive than typing on typewriters and line editors, and "running the numbers" by hand on paper. The idea was rather than forcing the user to use a cryptic interface it came closer to working the way the user's mind works. Instead of forcing the user to plan what they're going to do before they do it to avoid headaches, allow them to try out solutions, see them succeed or fail in the simulator, and correct as necessary, with quick feedback. The idea, in short, was to help the user think through problems and come to solutions, which was in stark contrast to the way computers had operated up to that point--hence my reference to punch cards.[/i] I'm not sure that's a good idea; when there's already a human user, doesn't the computer become increasingly redundant the more closely its processes resemble human thought processes? Or do I have that backwards, in your opinion?

Mark Miller

[i]Me: It seems to me you're jaded by the "pop culture" that's grown up around the web. If so, I agree. I don't like it either. absolutely: Why not? What about it?[/i] Basically what you have are hucksters who come up with something that solves a problem, neglecting the larger whole. I don't believe they do this on purpose. I think they genuinely think they are doing something beneficial for themselves and their customers. Among them some are better at actually producing something helpful than others. They manage to get people excited about it, make a lot of money, and they call it a victory, which encourages others to do the same. One more chip on the pile, which contributes to the mess you've seen. [i]Me: You're ignorant if you think, though, that the only choice is between the pop culture and the data processing mindset. absolutely: And you're projecting if you think that's what I said.[/i] I apologize for the strong language. I should've said "you're mistaken", rather than "ignorant". I was trying to think of a better phrasing of that, but drew a blank at the time. Maybe I should've followed the "if you can't say something nice..." policy. Anyway, the intention was the same, and I imagine you still would've disagreed with me. All I saw you do in your post was rip the pop culture, and then present the "data processing mindset" as the solution. I didn't see any consideration for alternatives, which told me you think those are your only choices. What other conclusion would I draw? I can see how if you're only looking at the commercial options available one would come to your conclusion. Some of us are not content to only look at these options. Sometimes we like to write our own solutions. I wanted to inspire a little vision on your part. I obviously failed. Next topic: I brought up the point about "newer ideas" [i]absolutely: "Newer" does not always mean "better." Let's try not to forget that some new ideas are good and implemented well enough to be recognized as such; others are good but implemented badly enough to seem like bad ideas; and unfortunately, some ideas are so lousy that the skill of those responsible for implementing them has no chance to help the idea seem good, or bad.[/i] And/So??? Tell me something I don't know. I was being vague about the "newer ideas", because I've talked about them at length elsewhere and I tire of repeating them. I'll kill two birds with one stone here, since you questioned where the idea of the GUI being an "extension of the mind" came from. It helps to know some history... The idea for the GUI format that we're familiar with today, for example, originated with a team that developed the first version of it at Xerox PARC about 30 years ago. The intent behind the project was to make the computer come closer to working like the mind works, to facilitate people being able to create their own mental models in the computer and run them as simulations. You might say, "Okay. So what?" These ideas have been embodied in word processors and spreadsheets, tools which have led to workers being more productive than typing on typewriters and line editors, and "running the numbers" by hand on paper. The idea was rather than forcing the user to use a cryptic interface it came closer to working the way the user's mind works. Instead of forcing the user to plan what they're going to do before they do it to avoid headaches, allow them to try out solutions, see them succeed or fail in the simulator, and correct as necessary, with quick feedback. 
The idea, in short, was to help the user think through problems and come to solutions, which was in stark contrast to the way computers had operated up to that point--hence my reference to punch cards. Some of these ideas have been adopted in the genres I've mentioned. The "data processing mindset" does not favor this approach. Data is pretty static, and is only changed through policy-oriented processes. This approach is fine IMO for businesses where industry practice is well established. There's no real need for experimentation because the experiments have already been done, and the process has been worked out. The problem is this kind of process is adopted pretty much everywhere no matter whether it's appropriate to the IT situation or not. In my view, in the typical case learning the business is NOT the same as learning how the computer system works. There should be a nice mapping between the two, but more often than not in my experience the two diverge. In this context, "paid to use it" leads to decreased productivity. The business is to produce a product or service and serve customers and investors. The point is computers can help with that. They are not the business in and of themselves. The connotation in your post about an interface employees are "paid to use", rather than "want to use" is that they're doing their job in the way they are because they have to, because they are not considered intelligent enough to do it the way they want to without creating havoc. That's what left me cold. For the benefit of this discussion, I'll talk in the context of interfaces for doing jobs. [i]absolutely: Punch cards were not incomprehensible to those who operated them.[/i] Yes, but the punch card doesn't represent someone's work very well to a person. I like the idea of the interface being easy in the sense that it works with the worker's understanding of the problem domain (in this case, their job). It doesn't distract from it. It doesn't get them thinking "Huh?", or "How do I work this darn thing?" It doesn't get them to use its process, it uses [i]their[/i] process, assuming they're a professional who can formulate a reasonable one. [i]absolutely: In a very real sense, yes. The market forces that have replaced command lines with GUI's have made GUI's the "only thing out there," for entry-level consumers, for some time. I've actually found that I can work more efficiently with a more elegant and stable code base, operating at a command line, than in the leading manufacturer's bloated, unstable, inefficient GUI. To the extent that a GUI is more efficient for a particular data task, I'll use it. I often use software with a GUI, even in Linux.[/i] I see. So you are using the interface that works best for you for a given task. Why can't users at work do the same? [i]Human potential exists independently of computer technology. Computer technology did not create human potential; humans created computer technology, and other humans can certainly learn computer technology without mollycoddling user interfaces, that are "an extension of the user's mind." Whatever effect computer technology has on workers, it can only build on our potential, not create any.[/i] My definition of "potential" here is equivalent to "possibility", not certainty. I think that it creates potential in the sense that it can help organize thoughts, and simulate ideas, which enhances what we can think about and understand. This is assuming that we as people recognize that it can do/be this. Technology does not create progress. 
It can change habits, but by itself it doesn't create human advancement. People have to understand what it's good for first for that to happen. This does not mean learning how a particular computer system works. It means understanding how computers in general can help you do your job--you are the practitioner. It is your aide, and/or facilitator.

[i]Where did your "workforce that is ready and willing to learn" just go? The way that competent workers "get rid of" what is "incomprehensible" to us is to learn, ie expand our knowledge, not to expect somebody else to redesign everything down to the level of our current knowledge.[/i]

So you mean learn the computer's/software's system, not your own based on what your job actually is. I think the basic difference between your view and mine is that in my experience it's possible to have software that is billed as being appropriate for doing one's job, but in fact it works in a way that distracts from the job, forcing me to do its thing, rather than allowing me to focus on what I'm actually supposed to be doing. In your view the computer tells the employee how to do their job. That can work, but it's a shaky argument to make generalizations on. One of my peeves is bad software design. In my view, the better way to not distract the worker from doing their job is to work appropriately with how the worker expects to do their job, with the assumption that they know how to do their job from their experience of working within the organization, or a similar organization.

Absolutely

Pro or con, I'd like to see what you write about it.

Justin James

It's on my "to read" list. I've been diving into some odd corners lately, and I keep dancing *around* this book without actually reading it. It will happen, one day quite soon. :) I am familiar with many of its principles, though, because I keep reading about it. J.Ja

Absolutely

I have never worked on a project designed with its principles declared at the outset, or seen any deliberate effort to employ them, but it's a good read, and looks more logical than trying to be something as vague as "agile," outside of actual athletic competition.

Justin James

I just had to say, I've never heard that phrase before, but it describes how the majority of the projects I've been on finally ended. It's a result of the customers wanting a vendor who is using Agile-like methodologies, a vendor where the people who made the sale didn't consult the people who would be fulfilling the sale, and a contract set up for a Waterfall-type methodology. It never works. Thus, de-scope and declare victory. :) J.Ja

Absolutely

Frankly, I thought the various exclamations of Kay [i]changing your life[/i] were mere hyperbole, but these are really great ideas. "Instead, the heterogeneous mixture that the system is made from must simply obey message passing conventions in order to interoperate." Of course, Microsoft keeps throwing up crap like ActiveX to interfere with general HTML standards, but when that stops working on Joe MBA [Joe User and Joe Six-Pack never really had any say in the matter, let's be serious], Alan Kay's model, the only cohesive and sensible model of a computing system I've seen, seems the most likely to take hold whenever computers start to work the way they really should. Anyway, very enjoyable to read. Thanks again, Mark!

Tig2

There is much to what you say. Having been a programmer years ago and having stepped away from procedural thinking to an extent, I am willing to see just what I can accomplish with Squeak. Who knows? I may find that I enjoy it!

Absolutely

I just got dizzy! But seriously, folks, I don't think the Squeak model is truly less compatible with security; it just seems more like an "open door" at first. But if you're working with a computer as a black box, like a GUI whose workings you don't fathom, the sense that such a computer is a "closed system" is false, and so is much of the standard assumption that makes Squeak seem "insecure".

Justin James

Much of what I've been writing about in this space for the last year or so is directly influenced by stuff like this. I have been kicking and screaming against the existing paradigms for years, and when I saw the videos that Mark has linked to, I understood that I was not alone in thinking like this, and that there was a better way out there.

It is unfortunate that topics like security are not addressed, because they are fundamental to adoption. But that is not a flaw with the system per se; it has to do with the stated goals of the system designers. And that is OK. Security is the enemy of ease of use, and the mindset of control and centralization that is so important to security is the antithesis of what things like Squeak and Croquet are about. And that is OK too. I suspect that a more "blended" approach just would not work; the two concepts would beat each other to death.

At the end of the day, there is data processing (shuffling data from point A to point B with some minor "value add" along the way like validation, verification, aggregation, formatting, etc.). And there is something else (not quite sure what the term is!) that is *not* data centric, but is about allowing the computer to act as a lever for the user's creativity. This is the marketing image that the Macintosh presents, BTW. And it's what Alan Kay talks about. And it's what Squeak and such are about. But it takes a lot more than your average user and a lot more than your average programmer to get there; it more or less requires the user to be a programmer. J.Ja
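To put the "data processing" half of that in concrete terms, here is a minimal sketch in Python (the field names and rules are made up for illustration) of the kind of point-A-to-point-B shuffle with a little validation, aggregation, and formatting along the way:

[code]
records = [
    {"region": "East", "amount": "120.50"},
    {"region": "West", "amount": "not-a-number"},  # fails verification
    {"region": "East", "amount": "79.50"},
]

def valid(rec):
    # Validation/verification step: keep only records whose amount parses.
    try:
        rec["amount"] = float(rec["amount"])
        return True
    except ValueError:
        return False

clean = [r for r in records if valid(r)]

# Aggregation step: total amount per region.
totals = {}
for r in clean:
    totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]

# Formatting step: the small slice a human ever actually sees.
for region, total in sorted(totals.items()):
    print(f"{region}: {total:,.2f}")
[/code]

Everything above the final loop runs without a human in the loop; the formatted printout is the small slice a person ever touches.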

Mark Miller

BTW, I added a little more to my comment on Kay's work. I forgot something.

To TiggerTwo: I know it sounds dramatic to say this, but I saw that video in the fall of 2006 and it changed my life. I was blown away. I had never seen software work like that. I've since heard that Game Maker works similarly, but it doesn't allow you to change your program while it's running, which Squeak/EToys allows you to do. This is late-binding in action.

If you want to see more videos of Kay's talks, just do a search on "Alan Kay" on Google Video. There's a bunch. To get some more perspective you can also look up "Smalltalk" (for historical videos on Smalltalk-80), "Squeak" for videos on it (though there are some others that fit the category but have nothing to do with the "Squeak" I'm talking about here), and "Croquet".

Squeak is a very different system from what most programmers are used to. I've told people it's probably the most different, disorienting system you've ever seen. I don't mean to scare you at all. Just saying if you end up feeling lost very quickly, it's not your fault. The reason I say this is we programmers are usually taught to expect a certain system configuration, which systems like Unix/Linux, Windows, etc. conform with. We've been using it for so long we take it for granted. Squeak breaks that mold and it can throw a newbie for a loop.

As far as starter material on Squeak, I'd recommend Mark Guzdial's book called "Object Oriented Design with Multimedia Applications". It covers an older version of Squeak, but a lot of what it covers is still relevant. Another I'd recommend is "Squeak by Example" (you can download it at http://www.iam.unibe.ch/~scg/SBE/index.html), and Stephan Wessels's tutorial at http://squeak.preeminent.org/tut2007/html/000.html. Wessels doesn't cover much of the beginner basics, so I recommend "Example" or Guzdial's book first. You might be able to skip Guzdial's book and just use "Example", but I found that Guzdial's book provides some interesting insights, largely because of the goals he gives you.

Tig2

I watched all 58 minutes of the demo you linked to. And will bookmark it. I haven't written a line of code for a long time. It became boring, and frankly, infrastructure was more interesting. But I WANT to play with Squeak. It looks interesting enough to hold my attention. Can't tell you how much I appreciate seeing Mr Kay and the fascinating work he has done!

Mark Miller

Alan Kay is different from most technologists I've encountered. He makes strong arguments, but he's not set on just one approach to things. What I told you earlier was an approach he advocated for years ago. More recently he's been working on other stuff.

Back in the mid-1990s he worked on Squeak, which was released by Apple Computer. It was basically an updated version of the Smalltalk-80 system, which was the one I described in my last reply to you. It was originally created at Xerox PARC, and was worked on for about 8 years, during the '70s. Squeak is still out there, and continues to be updated.

There's an excellent video of Kay demonstrating it (specifically a major package in it called EToys--it's still a cool demo) and another project he worked on called Croquet at ETech 2003 at http://video.google.com/videoplay?docid=-9055536763288165825. Croquet is fundamentally similar to Second Life, but it's more P2P-based. It runs on PCs, and is network-aware. I don't think there's a central server involved. The ACM Queue article I referred to is at http://www.acmqueue.org/modules.php?name=Content&pa=showpage&pid=273. Kay is currently working on a project funded by the National Science Foundation. There's a write-up on it at http://irbseminars.intel-research.net/AlanKayNSF.pdf

I asked Kay about a month ago what his concept of programming was, hoping he'd elaborate more on his original principles of OOP (he invented the term). He said that he didn't have one, really. He doesn't even insist on objects. He said what needed to be emphasized was something that he was taught when he took CS in the 60s: We don't understand what computing is. This is a continuing process of research and discovery. It's up to you, the next generation, to advance the state of our knowledge. Somehow that message got lost not too long after that in the CS departments of universities. He said what's happened since then isn't very exciting, and frankly very little progress has been made.

He said that he thinks of programming as architecture. So the programming language, and the environment you work in, is part of that. He said that what gets lost so often is that we need not only to use architectures that are more appropriate to the computing problems at hand, but also to find new architectures, because the ones we have don't solve all problems. This gets more at a technical practitioner level of things, but that needs to be emphasized as well.

He personally believes that since we still haven't figured computing out yet, we need to use late-bound programming systems for software development. The whole edit, compile, link, test cycle is archaic, and does not promote good software development. For example, in Squeak, when you come upon an instruction that causes an exception, you can fix it right then and there, and continue execution. There's no such thing as a program that shuts down due to a fault. You can shut down a stream of execution, but you have to specifically ask for execution to stop. You can get Squeak at www.squeak.org.

I once did a little research on the idea of securing Squeak, since it functions like an operating system (though it can run in any OS: Windows, Mac, Linux, etc.), and the discussion was interesting. The conclusion seemed to be that securing Squeak was seen as a bad thing, because it would lock down the system design, and the Squeak community didn't want that. One of the goals Kay has had with it is to let programmers modify it, and turn it into something new. Adding security to it would make this more difficult. So basically one would have to create a kind of firewall around it in order to fully secure it, though it's possible to do things like not allow entry into Squeak without the proper password, etc. That's something that would have to be custom coded into it.

Good talking to you.

[b]Edit:[/b] Forgot to mention, the doctoral dissertation I mentioned is "Tracing the Dynabook: A Study of Technocultural Transformations", by John Maxwell. You can find it at http://thinkubator.ccsp.sfu.ca/Dynabook/dissertation
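To make the late-binding point above a little more concrete, here is a minimal sketch in Python rather than Smalltalk (the class and method names are invented for illustration) of fixing a failing method at runtime and carrying on, roughly in the spirit of the fix-and-continue workflow described for Squeak:

[code]
class Invoice:
    def __init__(self, amounts):
        self.amounts = amounts

    def total(self):
        # Deliberate bug: assumes the data never contains None.
        return sum(self.amounts)

inv = Invoice([10, None, 5])

try:
    print(inv.total())
except TypeError:
    # In a late-bound image you would edit the method in place and resume;
    # here we approximate that by rebinding it on the class at runtime.
    def fixed_total(self):
        return sum(a for a in self.amounts if a is not None)
    Invoice.total = fixed_total
    print(inv.total())  # same live object, new behavior: prints 15
[/code]

Real Squeak does this inside the debugger on the live image, without restarting anything; the sketch only approximates that by rebinding the method on the class.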

Absolutely

Not having read Kay's writing myself, I can say that your summary of the semantic web led me to the same conclusion you reached, that security does not seem to be a consideration. Add to that the blurring of the definitions of data and software, and it seems fairly obvious that the danger of finding unwanted code in downloaded data would be much greater. But, since it's that obvious, I have to wonder if Kay has taken that into account in some way that is less obvious.

Aside from the problems with securing data, the idea of storing system state on hard disk instead of discrete files seems like it would let any software bug crash an entire system, which already happens too often with the present "stovepipe" model. But redefining code into objects instead of applications sounds like Utopia.

I'm thinking of some hassle I had earlier today with file transfer protocol, and how nice it would be to just plop a different port number into my "Internet module" (specifying new iptables rules, maybe) and open a "local system" window next to the "remote system" window, rather than downloading and configuring a new, separate app. That takes up more storage space on my drive and, much more importantly, multiplies the probability of coding oversights which might allow uninvited system access, by multiplying the number of lines of code that do the same thing, or a similar enough thing that the difference could be described in a short .conf file.

If the various means of accessing each piece of hardware are coded once each and clearly labeled/documented, then the user is free to cobble together those modules and parameters to do more exactly what they want. That part sounds fantastic; I see no downside to "object-oriented computing" as you've described it. Putting the name "Alan Kay" on a sticky note on my monitor ... Thanks, Mark!

Mark Miller

[i]It turns out I agree with you much more than I had thought I did, Mark. . . . Giving any attention at all to the "user interface" was my reason for concluding that you support that kind of nonsense. Obviously, I assumed too much. "Ooh! Look at THIS cool thing!". Eww! It's so crappy at its purpose! How can anybody think that's "cool"? Clearly, you're not the kind of person who could rationalize that, or UFO abductions for me. Sorry I mistook you for somebody stupid. I misunderstood the context of your other remarks.[/i]

I assumed some things about you as well. Glad to hear we agree on a few things. :) Something I've woken up to is the fact that for a significant part of my life I've been caught up in a "pop culture" of computing. I first became aware of it through an interview I read with Alan Kay, done by [i]ACM Queue[/i].

Kay is credited as being the inventor of the personal computer concept back in the early 1970s (the "Dynabook"), though he's complained that what's come out since then is NOT what he had in mind. I was reading a doctoral dissertation recently which analyzed the differences between Kay's vision for the PC and what actually happened. It said, in short, that the hardware form factor we have today in laptops is basically what he predicted and called for more than 30 years ago, but the way we are using this technology has diverged greatly from what he said it was intended for. So the "letter" of his vision has been followed, but the spirit of it has been totally forgotten/misunderstood. In fact it's been just what you said earlier: people have attempted to recreate data processing systems using PCs rather than mainframes.

What I was arguing for was that we as technologists should take a close look and re-evaluate computing in the context of the PC, realizing that we should NOT try to recreate the old systems with it, but rather something better. The PC has been discredited in many people's eyes, but I say it's only because what you've seen is a botched version of the original vision, created by a computing culture that didn't have the perception to see that it represented something totally different from the mainframe architecture, and should be treated as a departure from it, not a "mainframe in miniature", and certainly not a distributed/democratized mainframe.

From what I remember, Kay's vision for the PC was a machine about the size of a "notebook" (as he put it) that could be used by both children and adults, for their own purposes. Children could use it for learning, exploring ideas, and including other kids in the learning activity via wireless networking (mind you, this was in the 1970s). Adults could use it for keeping notes, creating documents, creating creative/artistic designs, communicating ideas with others, exploring libraries of information, consummating transactions on the ARPANet (the internet), and hooking up to larger computer systems (though he doesn't describe them) to upload/download information (basically using the larger systems as databases). The latter example sounds like what we have today, though he didn't anticipate the web with its HTML, CSS, Javascript, etc.

One thing that Kay has a strong aversion to is the whole concept of an application. Applications to him are stovepipes of functionality and data, and they weaken the power of the computer as far as the user is concerned. What he envisioned was a computer that offered objects in a UI sense that the user could interact with on a screen, and which would do all that he described. The user could create combinations of these objects that would approximate customized "applications" that did something that s/he specifically wanted. He envisioned object-oriented computing--not just programming. As I've looked at his work, the impression I get is that this could be contrasted with process-oriented computing like we have now.

His idea, though I imagine it sounds pretty wild, is that all computing would take place through the interaction of objects. The purpose of this design is to tightly couple code and data. This increases the power of computing, because now all that a user has to worry about is what an object can do. Objects should be the interpreters of, and interactors with, data, not the user.

What I've just described may sound similar to OLE/ActiveX in Microsoft Windows. It is similar, but there are a couple of key differences. First, in his system most objects are "open source" (I put this in quotes because I mean this in the most generic sense, not in the whole "democracy", "freedom of expression", and "free software" sense). I mean that the source code is available for viewing and modification in the system. Secondly, the system is "live", meaning that there's no such thing as "just data" in the system. Everything that's "data" in a mainstream system would be a live, active object (code and data together) in his system, and could be interacted with at any time via message passing. It does not need to be loaded. It's there in memory. He went whole hog on it, whereas ActiveX only goes part way. With ActiveX, code and data are tightly bound for a brief time, but once the component is unloaded, the data becomes separated into a file or database. The code resides on a filesystem. In his system, if it needed to be shut down, it merely went into "hibernation", saving an image to disk, which could be revived later. Since code and data are tightly bound, the filesystem is mostly used to save system state, not store individual data files. IMO this changes the role of the database, because it means a database is basically a serialization, swapping, syncing/mutexing system for code and data that adds indexing. There have been some attempts at this, Gemstone being the most recent.

Ten years ago Kay was talking about the semantic web in much the same way. Rather than objects just interacting inside of a singular computer system, they should also communicate over the internet, working in concert with each other. Objects should be clients and servers. What's needed is a good way to discover remote objects and what they can do. There have been some attempts at this. I don't know if there have been successes.

One glaring omission I have seen in Kay's ideas and work so far is the role of security. Surely in a system that contains critical data, and is connected to a network, security of the system and the information it contains needs to be dealt with. I have not seen that addressed so far. This is one area where the "data processing mindset" has a leg up on the newer vision of things. Not to say that it's impossible to secure the system he envisioned, but I don't get the sense that a lot of thought was put into creating infrastructure for that.
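As a rough illustration of the model described above -- live objects that couple code and data, interact only through messages, and hibernate as an image rather than exporting passive files -- here is a minimal sketch in Python rather than Smalltalk; the Document class, its messages, and the image file name are all invented for illustration:

[code]
import pickle

class Document:
    def __init__(self, text):
        self.text = text

    def receive(self, message, *args):
        # The object, not the user, interprets its own data: it either
        # handles the message or politely says it doesn't understand.
        handler = getattr(self, message, None)
        if not callable(handler):
            return f"{type(self).__name__} does not understand '{message}'"
        return handler(*args)

    def word_count(self):
        return len(self.text.split())

    def append(self, more):
        self.text += more
        return self.text

doc = Document("objects all the way down")
print(doc.receive("word_count"))          # -> 5
print(doc.receive("append", ", always"))  # grows the live object
print(doc.receive("translate"))           # graceful "does not understand"

# "Hibernation": snapshot the live object and revive it later,
# instead of exporting its data to a separate passive file.
with open("image.pkl", "wb") as f:
    pickle.dump(doc, f)
with open("image.pkl", "rb") as f:
    revived = pickle.load(f)
print(revived.receive("word_count"))      # -> 6, state survived the snapshot
[/code]

The pickle snapshot only approximates a true image (the class definition still lives in the source file), but it gives the flavor of reviving live state instead of reloading data files.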

Absolutely

[i]The sense I've gotten is that IT is largely viewed by business as a "necessary evil". I get the sense that this is what the post-er "absolutely" was reacting to.[/i]

I don't actually have the first-hand impression "that IT is largely viewed by business as a 'necessary evil'" -- I don't have any first-hand impression, as a matter of fact, of the reason that so many businesses have implemented so much technology that is obviously technically inferior to its competition. A lower up-front cost only provides advantage to the extent that usable time is comparable. I won't elaborate; we all know exactly what I mean.

But it's reasonable for me to assume, hypothetically, that your evaluation of the general business attitude toward computing is accurate. Doing so, I notice that the thing that "the post-er absolutely was reacting to" would be a direct consequence of that implied attitude of neglect of IT and its suitability for its intended purpose. You're probably exactly right about that, too. But I'm not [b]absolutely[/b] certain, because I have too little first-hand knowledge of what those clods have in mind. I can only see that they're screwing up, badly & often.

Absolutely

[i]Secondly, computing was not distributed. It was centralized. So a large company might have only one or two mainframes, and everybody was [b]competing for computing time[/b] on it.[/i]

OK, not altogether; it took the competition for computational resources out of businesses, though, and provided a competitive advantage to bad ideas in large enough organizations: computers turbocharged their stupid ideas, which allowed them to continue being stupid, doing the same [i]ratio[/i] of things incorrectly but accelerating the rate at which they can cover their tracks. It turns out I agree with you much more than I had thought I did, Mark.

[i]It seems like what permeates this whole evolution is the desire to "democratize" computing, like it's some sort of cause. Democratizing is fine, but the result has been democracy with little understanding of what the end goal is. You have people making choices that are ill-considered. It answers their needs for their neck of the woods, but not the needs of the organization. Like I was saying, maybe this is where OSS comes to the rescue. It takes some effort, or mucho cash, to pay someone who understands it, but at least the pieces can be adapted to work as a cohesive whole after the fact.[/i]

Giving any attention at all to the "user interface" was my reason for concluding that you support that kind of nonsense. Obviously, I assumed too much.

[i]"Ooh! Look at THIS cool thing!".[/i]

Eww! It's so crappy at its purpose! How can anybody think that's "cool"? Clearly, you're not the kind of person who could rationalize that, or UFO abductions for me. Sorry I mistook you for somebody stupid. I misunderstood the context of your other remarks.

Justin James

You are not the first person I've seen bring up the comparison between software production and other forms of production or product development. The comparisons almost invariably "feel" like software development is being shown to be, at best, unprofessional. I also know that satisfaction with the end result is extremely low for software. I think the two are related. Why is it that the project management methods which seem to work so well for building bridges, developing new types of cars, and so on, seem to universally fail or be fought against on software projects? I have no idea. Sure, construction often has cost and time overruns, but at least they can often get pretty close... software development is still a complete guessing game as to schedule and budget. J.Ja

seanferd

You have brought to the fore, in my thinking, something that has been nagging at me a bit, but I have let slide, because of the general trend of this conversation (at least in my mind). That is this: What kind of enterprise software, in what environment, are we talking about?

A generic approach works well sometimes, but there are cases where I believe that the software, and the direction it is going in, is quite well suited to the task, and the UI is one of the most important aspects of the system. Mostly (once again, in my understanding) I had the idea that we were talking about data-processing-oriented software, where human input, regardless of how many humans are involved, is a very small fraction of what the enterprise system is doing. What is more important in these situations is that processing continues uninterrupted and without error, rather than the comfort level of someone interacting with the system who has issues with pressing F2 several times to get back to a main screen. Even here, I would agree that there is always room for improvement, but what is more important is that, for instance, your bank account information is in a stable system. Mind you, there are serious issues with the way some of these older systems are used currently, the data transfer systems used by the stock exchanges being one example.

So, I would agree that the production control systems in chemical plants or similar are very good examples of where the UI is one of the most important aspects of the software, both for input and display, where real-time can be very, very real indeed. Also, for more modern business models, where the base software design is much more recent, flexibility tends to be built in. Good. I may be wrong, but I am guessing that software in these cases does not have 40 years of inertia behind it, and is not designed to be 99% number-cruncher / 1% human interaction.

Maybe my point where I mentioned Star Trek would be better illustrated by the following: No one is going to hire a carpenter who can't swing a hammer. "Boo-hoo", crieth this carpenter, "I can't get the nailgun to work here." Now, if it is about the money, and efficiency, no construction company is going to have 50 kinds of nailgun, one for every conceivable situation, unless it is to the company's advantage -- that is, unless enough use is gotten from these nailguns that having and maintaining all of them is worth their while.

Thank you for making that point. It leads me once again to wonder what any individual is thinking of when he or she is thinking "enterprise software".

Justin James

You guys are right that the increasingly high penetration of the computer into people's lives makes a difference. Users ask, "Why can't this inventory system be as simple as Google Maps?" or whatever (baseless) comparison they can make. They are right that the systems are probably more difficult than they need to be, but they also have an unrealistic expectation of how much those systems could be improved. J.Ja
