
A skeptic's history of C++

Was C++ originally meant as a joke? Why did it beat Objective-C? Will anything replace it now?

The C++ programming language was designed by Bjarne Stroustrup as an improvement on C, incorporating a number of enhancements. Probably the most important is built-in support for object-oriented program design, as hinted at by the project's early working title: "C with Classes."

It is heavily used for software development where performance matters, particularly for very complex applications, where a little OOP goes a long way toward making that complexity manageable. Unfortunately, C++ itself is an incredibly complex beast of a language.

Originally, it promised to be a superset of the C language with facilities for object-oriented programming. Another language, developed around the same time (arriving a mere three years later) with the same goals, was Objective-C. To a significant degree, the differences between the languages can be attributed to two things:

  1. Influences on their design
  2. How well they achieved those early goals

C++ is listed on Wikipedia as having been influenced by a slew of languages, including C (of course), Ada 83, ALGOL 68, CLU, ML, and Simula. Most of its casual users would likely have a difficult time producing that list off the tops of their heads, and for the parts of the list that might occur to them, any random pair of casual C++ programmers would be unlikely to come up with the same short list. To some extent, the influences on C++ are obscured by how they were bent to fit the new language, and some might suggest absinthe as another important influence.

Objective-C, by contrast, has two influences that would come immediately to mind for even the most casual Objective-C programmer (possibly excluding people who are not aware the language is older than MacOS X): C and Smalltalk. The Wikipedia article about Objective-C offers those two, and only those two, influences on the language's design, reinforcing the obviousness of those influences.

As for achieving the early goals of providing an object-oriented superset of C, Objective-C appears to have succeeded in all essential details, while C++ looks in some respects more like the result of giving up on those goals mid-effort to pursue something shiny instead.

Just as the influences on Objective-C are simpler than those on C++, so is the design of the language itself. The reason Apple (and NeXT before it) chose Objective-C as its primary object-oriented application and system development language seems obvious: it offers simplicity and elegance of design -- at least as compared with the design of C++. Outside of Objective-C's eventual rise to prominence on Apple platforms such as MacOS X and iOS, however, C++ is the clear winner in terms of popularity and mindshare. Even now, so many years after its creation and without any major resurgence like the one Objective-C has enjoyed, C++ is in heavy use. For instance, it is the core implementation language of all the most popular Web browsers for non-Apple platforms; Chromium, Firefox, Internet Explorer, and Opera are all substantially built using C++. Even Apple's Safari browser is written primarily in C++, though other browsers for MacOS X have used Objective-C instead.

Perhaps it was the fact that C++ was "first to market" by about three years that accounts for the massive success of the language, and for Objective-C's relative failure to catch on. We can finally find books about programming in Objective-C on the shelves of every major bookstore, but they focus on development for MacOS X and Apple's iOS. General-purpose, platform-agnostic use of the language is apparently not a popular enough area of interest for anyone to peddle books on the subject.

Criticisms of C++ appear to substantially outnumber its praises. Meanwhile, the only people talking about Objective-C (for the most part) are developers for Apple platforms. Those who develop only for Apple platforms are generally regarded by many other programmers as untrustworthy in their pro-Apple biases, so the fact that they primarily sing the language's praises carries little weight amongst developers who avoid Apple platforms.

What are probably the two best criticisms of C++ have both been attributed to Stroustrup himself. The first is his statement that C makes it easy to shoot yourself in the foot, and C++ makes it harder -- but when you do shoot yourself in the foot with C++, you typically blow off your whole leg. The second is a purported interview for IEEE Computer magazine, supposedly shelved as unpublishable, in which he "admits" that C++ was all a joke from the very beginning and goes on to humorously extol its vices. Stroustrup has disclaimed the piece as a hoax, and said that he thinks it would have been funnier if he had written it himself.

By the time one is done reading that fictional interview, one might be forgiven for questioning for a moment whether C++ really was all a joke from the beginning. If so, the joke seems to have been made at the expense of Objective-C, which lingered in obscurity for almost a generation -- an eon in computer technology terms -- before finally finding its niche in the Apple ghetto. The fact of the matter is that, without Objective-C and Cocoa, Apple would almost certainly be in real trouble in its search for developers to support its platforms; the joy many find in developing with those tools helps developers who are the target of systematic mistreatment by Apple's legal and marketing teams overcome some of their misgivings.

One might think C++'s days are numbered now. Alternatives that seem significantly better suited to the same jobs litter the landscape, and the obvious direct competitor -- Objective-C -- is in some ways the least of them. Objective Caml is regularly held up as an example of a high-performance language: it frequently outperforms C++ by a significant margin in benchmark tests, encourages more succinct and better-organized source code, and provides developers with far cleaner and more interesting development models -- and it is not even derived from the same family of programming languages. D aims to compete in the same space, though its proprietary roots may hinder its widespread adoption. Google's Go language presents controversial trade-offs, but there is no doubt that its design offers huge advantages for certain types of software development, including concurrency.

Judging by the lessons of history, however, I am inclined to believe that C++ will enjoy a long, stable tenure in its niche. It has even been sneaking into operating system kernel development for years, as horrifying a thought as that might be for people who care about things like OS reliability. There is no doubt that it offers some advantages over C for certain types of performance-critical programming, and that its library support is extensive -- even legendary. Despite this, at least some of the strength of its hold on developers seems to be based on ignorance of the alternatives, and that is not a characteristic that will easily be pushed aside by a would-be competitor.

About

Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.

108 comments
waltertross

I would rather call it "a despiser's view of C++". Not much history. And not objective.

Jtempys

I personally prefer Ada 95. Generic packages, along with instantiation of an instance of the package, lend themselves very well to OOP concepts while working with a "procedural" language. Ada 95 also allows for dynamically allocating resources and garbage collection. Generic data structures are great -- doubly linked lists, anyone? The "record" data type is a ridiculously flexible one: a record is basically a container that can hold other variables and is referred to using a "." operator, as in myRecord.Attribute. That flexibility allows for many types of solutions, such as using a record to store "object attributes" in the "private" section of an instantiated class; likewise, the nodes in a linked list are basically record elements with pointers to the previous and next nodes, carrying any data payload at each node. Very cool stuff. It is also very strongly typed; the compilers don't take any guff, and compilers are free. Ada 95 is (or was) used primarily by the Department of Defense for embedded systems such as missile guidance. (The Patriot missile is actually written in Ada -- never mind the buffer overflow from not turning the machine off. /wink) It is a great overall language, and a BEAUTIFUL first language for computer science, as it does not allow for bad pointers and whatnot. Interpreted languages are fun too.
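
For anyone who hasn't used Ada, here is a rough C++ analogue (my own sketch, with hypothetical names) of the record idea: a plain aggregate whose members are reached with the dot operator, doubling as the node of a doubly linked list.

#include <string>

// The "record": a container holding other variables, accessed with ".".
struct Attributes {
    std::string name;
    int priority = 0;
};

// A doubly-linked-list node: a record with prev/next pointers plus a payload.
struct Node {
    Attributes payload;
    Node* prev = nullptr;
    Node* next = nullptr;
};

int main() {
    Node n;
    n.payload.name = "head";   // myRecord.Attribute, as described above
    n.payload.priority = 1;
}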

mikeflynn.psc

I'll take Python over C++ any day. When it comes to performance programming, C produces tighter, more efficient, faster code. All C++ has done is remove computer science students from what actually occurs inside a computer and what computer science is actually about.

verelse

This is a Coke-and-Pepsi article. No statements made herein are supported by anything beyond conjecture. I can't see granting the author the presumption of "expert status" based on his stated qualifications. Every newb who comes along wants to toss out everything he doesn't understand in favor of what he does. If I saw a Computer Science degree next to his name, I would be more inclined to accept some of his assertions. Not saying degrees are all that, but a couple of MS certs does NOT make an expert.

oldbaritone

We used C as the primary language for embedded process control systems. It was lean and fast, and compiled extremely efficient code. When compiled with processor-specific optimizations on, C's latency was comparable to ASM, but much easier to debug. Then C++ with the MS foundation classes bloated the overhead beyond belief.

Craig Dedo

Mike Page wrote: [Begin quote] The vilification of the capacity for multiple inheritance is quite silly. Having used C++ for 20 years, I have run into problems caused by it once every few years. In those cases there are generally rather simple ways to address the issue. The advantages gained by multiple inheritance can be huge. Yet I become really frustrated when I can't multiply inherit in other languages. Are they protecting me from myself, just in case I don't have the capacity to think my way around potential infrequent problems? [End quote] I was a member of J3, now PL22.5, the ISO/ANSI Fortran standards technical committee, at the time Fortran 2003 was developed. Fortran 2003 is fully object-oriented. The members of J3 explicitly considered and rejected multiple inheritance. The reason was not to protect programmers from themselves. Rather, multiple inheritance is by its nature extremely complex conceptually, and difficult and costly to implement correctly. The extra complexity, difficulty, and cost did not seem worth the apparently limited advantages. And, many members argued, if multiple inheritance offered real advantages, it could always be added later without compromising the language design. Therefore, Fortran 2003 is a single-inheritance OOP language.
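
For readers who have not run into it, here is a minimal sketch (my own illustration, not from the committee's discussion) of the classic "diamond" that makes multiple inheritance conceptually expensive in C++ -- note the virtual keyword the programmer has to know to reach for:

#include <iostream>

struct Device { int id = 0; };

// "virtual" ensures a single shared Device subobject in the diamond below.
struct Scanner : virtual Device { void scan()  { std::cout << "scan "  << id << "\n"; } };
struct Printer : virtual Device { void print() { std::cout << "print " << id << "\n"; } };

// Without virtual inheritance above, Copier would contain TWO Device
// subobjects, and "c.id" below would be ambiguous and fail to compile.
struct Copier : Scanner, Printer {};

int main() {
    Copier c;
    c.id = 42;    // unambiguous only because the inheritance is virtual
    c.scan();
    c.print();
}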

Robert Lerner

This article is more of a comparison of Apple and PC than it is about languages. Everybody knows Apple and their products are trash, and that they only succeed by convincing a pliable "green" society to believe their products are "hip" and "environmentally friendly." If you're going to compare the languages, then cite the differences between them, cite the problems, and give demonstrations; don't just focus on one being older than "MacOSX" or "Not rly gud 4 multi-platform," because at the end of the day, who uses an Apple anymore? They adopted the Intel chipset years ago, and anybody who garbage-picks an Apple these days is usually smart enough to run BootCamp.

PhilippeV

C++ has been a nightmare from its beginning (including its huge lack of interoperability across platforms, and even across vendors for the same target platform). The way C++ still makes extensive use of implicit typecasts (with a very complex resolution path when you mix them with multiple inheritance), together with a horrible syntax for explicit casts, is counter-productive. Really, no language needs multiple inheritance for classes (i.e. for their implementations). All that matters is multiple inheritance of interfaces, and a clear way to delimit it. C++ clearly lacks the separation between designs (interfaces) and implementations (classes). C++ has in fact failed at most of the obvious goals of OOP. For this reason, I have abandoned the language completely, except for interfacing with prebuilt third-party libraries. Developing libraries with it is simply a waste of time.

Such a language cannot be managed safely without another programming tool, notably a UML workshop that manages what C++ fails to do correctly and safely. The UML tool generates the skeleton, and we concentrate on something else: in this view, you no longer need multiple inheritance, the interfaces are generated for you, and you work in a safer sandbox in which the implementation black boxes are much easier to debug. These sandboxes don't even need to be written in C++, and many projects implement them in another language (such as C#, Java, or even VB, Javascript, PHP, or SQL, or Fortran for complex numerical computations, notably because Fortran is highly parallelizable and offers excellent performance, optimizations, and the best numerical and algorithmic recipes ever written). Some even use Cobol for these implementations.

With the development of massively parallel programming (of "kernels"), you no longer really program in C++ anyway: you write in the "kernel" language, or work with a much safer model than OOP, namely functional programming. That adds things still absent from C++: design by contract, contracts that can be partly implemented with the missing (still unimplemented) cases tracked directly by the compiler, which helps you locate the parts needing further development, and unit testing in a much more modular and collaborative way. C++ also has a very bad reputation for teamwork: C++ projects tend to depend heavily on a single superpower governing everything, and if that person leaves the team, there is a huge risk the project will be abandoned, because no one else will understand the logic of how it was written.

By contrast, I have never abandoned C. In fact, most of my implementations of classes are NOT written in C++ but in C (I let the UML workshop generate the C++ skeleton to interface with them). This has a considerable advantage: there is a much cleaner and simpler way to replace one dependent library with another. C++ projects tend to be fragile houses of cards, and changing the design at any point is always a very challenging problem: not only does the interface structure collapse, but the C++ implementations must also be rewritten, simply because there are too many dependencies between C++ classes, many of which rapidly become really obscure.

C++ still lacks the concept of packages (namespaces are not a replacement; they are just a naming scope used to avoid some collisions, and because of the way namespaces are used and imported, even the code using them does not clearly show its dependencies). What are the alternatives? Can you remember the merits of Modula, and even of Delphi (the most common extension of Pascal to OOP)? Can't you remember the merits of simple inheritance? Why do you need to import so many headers and declarations in C++, exposing almost everything from the inner implementation of the libraries you just want to interface with?

Let's build things with something better. Even C++ is becoming obsolete now that we have to work with deployable objects that must work over the network, in failsafe environments with redundancy. C++ does not naturally handle the transparent deployment of code in multi-tiered environments. We are moving to virtualized OSes deployed over the network, which can freely be relocated from one host to another, from one processor to another, or from one architecture to another. C++ also offers only a static optimization model. It will survive only as a language compiling for a VM (for example, C++ for .Net, or for the successor of Java). VMs are now everywhere, because optimization is better performed when installing the application on the effective target runtime system on which the code will run. I'm fed up with seeing code that was optimized only for some classes of Intel processors, and whose optimizations behave miserably on the next generation of processors. VMs solve these problems, and can do things that a classic C++ compiler will never be able to do: optimize code at runtime based on live profiling. VMs also offer natural sandboxing with a very strong security model for isolation. This is excellent for security, but also for forcing the designed interfaces not to use more hidden things than the implementations can use. So instead of generating native binary code, C++ compilers will generate VM binary code with much more metadata, which will help deploy the code much more easily and efficiently.

Now try programming with C++ in .Net: almost all of those quirky additions of C++ are simply not usable. You return to safer bases, but the syntax is still horrible. C++ for .Net was a good idea, but its default standard library (the CLI) was extremely badly designed, with many inconsistencies (including in naming conventions). So which choice remains? Java. I can already hear the C++ supporters claiming that Java is slow: it was slow in its early versions, but definitely not since v1.4 (after Microsoft was ejected from all its attempts to break it in a non-interoperable way). In fact, Microsoft has learned a lot from the Java experience, and realized that it needed a strong VM (that's why it bought the technology from a company that was already working on Java optimizers, just to restart a new VM design based on the kinds of runtime optimizations that this latter company had been able to do for Java and other VMs).

The work on VMs is not finished. Sooner or later, we'll have a unified VM that will be able to run all the legacy .Net and Java environments transparently (including support for introspection/reflection). VMs are damn fast today, and they offer the best optimizations for many more platforms with the same application codebase, instead of good performance and security tested only on the machine used by the initial developer.

VMs allow long-term reusability and maintenance of existing code, which can keep benefiting from all the improvements constantly made in architectures. VMs will soon be able to recompile any precompiled codebase at runtime, automatically, to make it massively parallelizable across a very heterogeneous network. Even the differences between Java and C++ for .Net will fade away, and in fact you'll be able to use many more languages than today on this unified VM (notably the scripting languages, with Javascript gaining a lot of adoption now). This will be transparent to the design, as you'll be able to use the most solid codebases available to do things simply, with less code: some Perl if you like, some C if you prefer, some Java elsewhere, all working together, with objects communicating through native interfaces or through implicitly and automatically built proxies across networks (including Web servers, via XML or JSON).

Who cares about C++? Not me! I don't want its various incompatible ABIs, or most of its unneeded syntax. A C++ subset (strongly restricted by the compiler, to make sure we won't use its worst aspects) will emerge as a successor, and the old legacy code will run only within isolated sandboxes enforced by the VM. This successor language, however, will have to support functional programming, with a very strong theorem verifier enforced by the VM to make sure that the code effectively honors its contracts and has no side effect that is not FULLY disclosed and exposed in its designed interface.
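
To make the interface-versus-implementation point concrete, here is a minimal sketch (my own illustration, with hypothetical names) of the discipline C++ permits but does not enforce: multiple inheritance of pure-virtual interfaces only, with a single implementation base.

#include <iostream>
#include <string>

// Interfaces: nothing but pure virtual functions, no state.
struct Serializable {
    virtual ~Serializable() = default;
    virtual std::string serialize() const = 0;
};

struct Drawable {
    virtual ~Drawable() = default;
    virtual void draw() const = 0;
};

// A single concrete base carries the shared implementation and state.
class Shape {
protected:
    double x = 0, y = 0;
};

class Circle : public Shape, public Serializable, public Drawable {
    double r;
public:
    explicit Circle(double radius) : r(radius) {}
    std::string serialize() const override { return "circle:" + std::to_string(r); }
    void draw() const override { std::cout << serialize() << "\n"; }
};

int main() {
    Circle c(2.0);
    const Drawable& d = c;   // clients see only the interface they need
    d.draw();
}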

Elmonk

The only point where I might have some doubt about the article is the suggestion that some absinthe had played a role -- I'm sure it must have been some stronger substance than the "green fairy". In my 40+ years of programming (I'm retired now) I refused to learn C or C++. Starting with various assembler flavours and using them for more than 10 years, I came to love Pascal (and Simula, and Portal, if that is at all known anywhere), had some contact with Ada and CHILL and -- sadly -- Fortran and Basic in various forms. The last 10 years of my professional career were mostly spent with Java, Javascript, and C#, and I was quite happy with all of them -- I never missed my lack of knowledge of C++ except with some books and articles whose examples were in C and/or C++. In general, it appears there is some truth in the saying that just because millions of flies go for manure, that does not mean manure is valuable.

tweetypie

C++ owes its longevity to habitual use...

mbyamukama

Every OS, every browser, every multimedia suite, every media player... has a BIG part of it written in C++!! Seriously, C++ is the only solution that lets you use OOP and interact natively with hardware (Objective-C, yes, but its warts are a lot worse!)

aleborn

I believe that the OP is a bit too young to present a clear viewpoint on this subject. There was a time when only Objective-C was available, while C++ was still struggling to get barely decent compilers. For many years, C++ was merely used as a C dialect, because its features were either poorly implemented or not yet present, whereas Objective-C was considered effective and much closer to the original OOP paradigm (which is absolutely true). Basically, C++ was taken at all seriously outside academic or C-with-classes projects only after the end of 1994, when the STL's basic concepts emerged and forced templates to become a bit more serious than glorified macros. Today it's a bit easy to see a conspiracy or ignorance at work to explain the success of C++ over Objective-C, but the truth is that Objective-C lost its battle for mindshare fair and square in the late 80s -- and that was with a serious lead in the installed base (high-end programming workstations) and in the quality of tools (compilers were much faster, and debuggers worked right off the shelf with native Objective-C code, which was impossible with the "front end" style of C++ compilers of that time), and against a crippled C++ (no generic programming at the time). I don't even think Objective-C failed against C++; it just faded away while C++ was still not firmly rooted. Presenting Objective-C as "the obvious competitor" of C++ just sounds weird to dinosaurs like me... Without Apple's policy, nobody would have heard of that language ever again. There are certainly much more qualified competitors around!

apotheon

I'm not sure that should actually be taken as an indicator of a C++ renaissance. The source doesn't provide a very strong case; it basically just says "I heard somewhere there might be some increased usage of C++ in Redmond. Maybe." What I find much more interesting, though, is the fact that the discussion in comments categorically rejects the ".NET everywhere" unified front many developers would like us to believe exists in the Microsoft world. I must admit that part of the reason I find that more interesting than the post at the top of the page is the fact that several people in comments come across as more thoughtful and articulate than the C++ Community PM.

Justin James

Interesting, thanks for posting it! There may indeed be a resurgence of interest in C++; I haven't seen it personally, but that doesn't mean it's not there. At the same time, there has been a massive surge of interest in functional programming languages... starting with the Erlang and Haskell crazes a few years ago, and F# and Scala recently. All of them have excellent merits, but of them, Scala is the only one that gained any kind of real traction. While FP is an interesting paradigm with some real utility, it wasn't worth a switch for most programmers. I suspect the uptick in C++ is like that. While C++ definitely is the best tool for the job in some scenarios, Java, C#, VB.NET, etc. are adequate in nearly all of those situations too, so most programmers feel little need to make a switch just to handle a few cases that are rare in their jobs. J.Ja

apotheon

1. I don't despise C++ entirely, but the fact I don't hate it enough is the least of my failings. 2. It has only the right amount of history in it. 3. It was never intended to be viewed as objective.

Tony Hopkinson

record or struct is a common concept, as are pointers to them. Dot notation also. There are a number of issues with the traditional ones, though: no initialisation, packing, casting pointers to them, variant/union records. Like any language feature, it does what it does, and if you try to bend it too much it breaks. Can you attach methods to a record definition? If you can't, it's not an object in any sense of the word. How do you deallocate the memory for them, i.e. what happens to the pointer when you do? Can you cast a pointer to one record as a pointer to another of a different definition? Is it an explicit or implicit cast? Do you have to explicitly initialise all the members? Garbage collection is extremely iffy if you start allowing direct access to pointers. Doubly linked list? Can't remember the last time I explicitly defined one of them, since the classroom. :p Have a look at generics in C#, then closures, lambdas, then LINQ, which they wrote all that stuff to get to. It will be a bit of an eye opener. Not knocking your choice; it's just that the best way to learn one language is to learn another one or three. The differences and the thinking behind them are more instructive than anything else.
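
To make the casting question concrete, here is a minimal C++ sketch (hypothetical types, my own illustration) of how one language answers it:

#include <cstdio>

struct Celsius    { double value; };
struct Fahrenheit { double value; };

int main() {
    Celsius c{100.0};
    // Fahrenheit* f = &c;  // rejected: no implicit conversion between unrelated struct pointers

    // The cast must be explicit -- and it is a reinterpretation of bytes,
    // not a conversion; reading through it is formally undefined behavior
    // under the strict aliasing rules.
    Fahrenheit* f = reinterpret_cast<Fahrenheit*>(&c);
    std::printf("%f\n", f->value);
}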

Tony Hopkinson

If I saw a CS degree I'd be surprised at his assertions, as most of them tended towards the practical..... And an expert is someone who knows nothing about anything else. Why don't you read it again, face forward; that way your blinkers might not get in the way. On top of that, if you are going to criticise someone for unsupported and unsupportable assertions, supporting yours will make you look less of a knob. A newby, :D :D :D

Tony Hopkinson

The designers of all these languages looked at the costs and benefits, and dropped MI. I put it in the same box as passing arguments with pointers: might be useful on occasion, perhaps even necessary, but in the long term it's a compromise that will hurt as the code changes.

seanferd

TR should have you write articles for the Programming and Development blog.

Charles Bundy

I really don't think C++ is unique in this regard :) Almost any higher level language will let you do inline assembler. If you can do that you can interact with hardware.
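
For example, here is a minimal sketch of inline assembler in C++ (GCC/Clang extended-asm syntax on x86-64; other compilers use different mechanisms), reading the CPU's timestamp counter directly from a high-level language:

#include <cstdint>
#include <iostream>

// Read the x86 timestamp counter via the rdtsc instruction.
static inline std::uint64_t read_tsc() {
    std::uint32_t lo, hi;
    __asm__ volatile("rdtsc" : "=a"(lo), "=d"(hi));
    return (static_cast<std::uint64_t>(hi) << 32) | lo;
}

int main() {
    std::uint64_t start = read_tsc();
    volatile int sink = 0;
    for (int i = 0; i < 1000; ++i) sink += i;   // something to time
    std::cout << "approx cycles: " << (read_tsc() - start) << "\n";
}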

Tony Hopkinson

If you are working in Windows, you interact with drivers -- likely to be written in C or C++, admittedly, but that's a historical choice now, not a technical one. You need to get out more: have a look at past and present alternatives that do all that and more, and in many areas better, at least in terms of describing what you want to do well.

Justin James

Delphi sticks out as a native code option that can directly interact with hardware and is OOP. If memory serves, you can still embed Assembly in Delphi like you could in Pascal, if needed, too. And there is certainly no *requirement* that you use C/C++ to write native code with direct hardware access. Some firmware environments natively run Java, as I understand things. That being said... C++ is the default choice and the choice of probably 98% of folks who want to write native code and use OO. It is a mainstream tool with tons of support, libraries, tools, etc. available for it. J.Ja

Mark Miller

Just to dovetail on your points, C++ was recognized within academic circles at least as early as 1992/93. I knew a couple of undergrad students in college who were using it then, because it was the language being used in their courses, or a professor let them use it for a project. My guess is the students were using GCC as their C++ compiler. I don't recall exactly. Whatever they used, it was a cfront-style compiler, because when they looked at the generated code in gdb, all the classes had been translated to C code. They were using classes in their projects, and focusing on the OOP paradigm, not just using it as a dialect of C. Templates had just come into existence as a feature of C++ a few years earlier, as I recall, but I don't remember anyone using them. Nobody I knew the whole time I was taking CS (1988-1993) had heard of Objective-C, and nobody was using it in coursework. I had heard of it because of the presence of NeXT workstations in the computer marketplace, but I didn't know a thing about it.

apotheon

Erlang and Haskell are actually gaining traction in a lot of places, steadily and quietly. Once the initial furor over each language died down a bit, they both started gradually picking up developers who use them in "the real world". In the case of Erlang, the growth is in places that explicitly need concurrency. In the case of Haskell, it is mostly a case of people choosing it where they will not get in trouble for picking whatever language suits their fancy. I don't know how long the growth of either will continue, but they are both steadily growing, under the industry news radar, from what I have seen. F# is growing rather more quickly in the .NET world, especially given the explicit choice to use F# for a lot of stuff within Microsoft itself -- which means growth in official support for the language. Outside of the .NET world, of course, OCaml and F# (even on Mono, which seems largely obsessed with C#) are not getting any growth worth discussing, from what I have seen; on the whole, this might mean that while F#'s current growth is more visible, it is no more substantial in actual numbers across the board than Haskell's or Erlang's. In fact, I suspect that Erlang adoption grows more quickly than F# plus OCaml adoption. Scala may become the JVM's equivalent of Haskell. Where Haskell is being adopted by programmers who have long been forced to use C++ and want to use something "better" (for their purposes, at least), Scala seems to be adopted largely by programmers who have long been forced to use Java and want something better as well. Haskell's inroads seem to be in places where the higher-ups only care about the fact that they're getting executable binaries, and Scala's inroads seem to be in places where the higher-ups only care about the fact that they're getting bytecode that runs on the JVM and leverages Java's extensive library support.

Justin James

... just incorrect. I was going to disagree with his comments regarding Chad having MS certs, and after I posted it I looked at the bio on the article and saw that he did indeed have MS certs. It surprised me quite a bit, to be honest, totally out of character for him. J.Ja

verelse

...have no idea what you're on about; maybe the mark is too close to home? I was suggesting that when one claims to be an expert and then makes baseless assertions, we the people have a right to call one out, yet I never claimed to be one myself. The article is an opinion piece, as fluffy as feathers, and smells of elderberries. What prompted your miniflame? I am uncertain and certainly uncaring. "Why don't you read it again, face forward; that way your blinkers might not get in the way" -- I think the word you are looking for is "blinders", but I am wearing none. I merely point to the author's own bio and claim of "expert". This article proves otherwise. "On top of that, if you are going to criticise someone for unsupported and unsupportable assertions, supporting yours will make you look less of a knob." I never look like a knob, and if I did it would still be better than a domeless wonderboy ;) And newb from my point of view means less than twenty years in the industry. Expertise comes with time, but does not require one to become an "expert" per your definition. "I do a lot of UI, some networking and a depressingly small amount of threading in my current role, so not choosing C++ vindicated by an expert." Who is the expert and where is the vindication? "Everyone's a hero, everyone's a Captain Kirk". Not me, just an engineer doing my job -- well.

apotheon

If it's that sensible -- maybe you can translate it for me. It looks to me like someone who skimmed, saw that C++ was being compared to Objective-C, and just assumed the rest of it was about Apple and Microsoft platforms. If you see more to it than that, please let me know.

apotheon

I suspect that much of the reason for the use of C++ in academic circles was bound up in two facts: 1. Some books were written about academic subjects with C++ as the example language. 2. C++ was similar enough to C while still being something that smelled a little like an object oriented language. Objective-C, in addition to being limited by its proprietary roots (as you pointed out -- thank you for that), failed the first of those two tests; it lacked academic texts using it to illustrate examples, despite its better illustration of object oriented development principles than C++ offered.

apotheon

I actually pursued educational opportunities that, in retrospect, were of some dubious value -- and that, in the end, made it easy to get the certs, so I got 'em. I got a lot out of the instruction environment, but not because it was necessarily the best learning environment. Rather, I got a lot out of it the same way I got a lot out of other schooling that sucked (including most of my public high school and college experience): by noticing interesting subjects that were not addressed very well, and pursuing them on my own time. I'm in a fairly constant state of considering going back to college, in part to get a degree since so many people put real stock in them even if they do not prove much of anything other than time spent outside the job market and the accessibility of funding, but also in part because it's nice sometimes having ideas about what knowledge to pursue on my own time fed to me by people who think they're experts. Also . . . I occasionally run across a professional instructor worthy of the title "teacher". That's always a joy to discover.

Justin James

... I was just stunned that you had certs in Microsoft technologies. :) I figured that you got them as a sop to clients who wanted to know if you were certified. J.Ja

apotheon

Me, a Microsoft shill. Good times, good times.

apotheon

I actually have quite a shocking lot of Microsoft platform expertise. I like to know enough about something to have good reasons for loathing it before I go around telling people it's loathsome. I've had those certs for about a decade now, give or take.

Tony Hopkinson

He criticised some Linux stuff and got accused of being a Microsoft shill. Anyway, you don't have a CS degree either, so what do you know? LMAO

apotheon

First off, I'd like to apologize for some errors in composition that appear to have crept into my immediately previous comment. I just re-read it for the first time and noticed those errors. Some of those errors were actually quoted by verelse, who appears to have done a good job of ignoring them in favor of trying to address the substance of my statements. verelse: unfortunately, there is still at least one failing in that regard that needs attention.

QUOTE: Difference being I did not post an article and claim that expertise

Please, if you cannot determine for yourself from the text of the article that I did not "claim that expertise", go back and re-read my preceding comment, where I explicitly stated as much. For instance, I said "I have no claimed expertise in anything in particular". Later, I pointed out that your assertions that I have claimed expertise are mere straw men, not representative of the reality of the situation -- again, basically saying "I do not claim the expertise you suggest I claim." . . . but you continue to say that I claimed that expertise. Why?

QUOTE: Why? Isn't the purpose of these languages and compilers the production of machine code?

Not really. That's the purpose of assembly language, and the assembler. The purpose of a higher-level language is to provide a language more easily understood, and supportive of more easily understood code, than assembly language.

QUOTE: Wouldn't an objective critique actually evaluate what the language produces?

Languages do not produce machine language code. Compilers do so -- or, rather, they usually produce assembly language code that is then assembled into machine language code by an assembler. Even interpreters and VMs do not produce machine language code; they just act as an intermediate "machine" whose primary purpose is to free the developer from having to read and write machine language code (or even assembly language code). I'm pretty sure, based on your statements thus far, that you know this stuff -- but you may not be considering it in the context of determining the actual relevancies of this discussion.

QUOTE: I am not suggesting that one cannot criticize these languages without a formal knowledge of assembler

One certainly requires some knowledge of assemblers and assembly language to critique certain aspects of these languages, but if your premises and goals do not relate overmuch to such low-level concerns, then no particular knowledge at that level is required to critique a given programming language of a higher level of abstraction.

QUOTE: the criticisms tended towards efficiency and effectiveness, and how can we exclude the actual *output* from the discussion?

What kind of efficiency do you mean, here? I don't recall software performance being a specific criticism of C++ anywhere in the article or the following discussion. If you think some particular bit of knowledge of extremely low-level implementation concerns is relevant to part of the discussion, the correct thing to do is bring that up. If you do so to dispute something I said, and it happens to make use of knowledge I do not have, I will either educate myself in a hurry to determine the value of your disputation, or simply concede the point, at least tacitly and/or conditionally, based on your apparently greater understanding of the topic. I will not, however, just refuse to comment on C++ because I do not spend a lot of time writing software in a hex editor.

QUOTE: Google "construct validity"...it is all about the scientific method.

Fair point. I had overlooked that particular use of the term. My error was in thinking you meant "hard" sciences here, since we were talking about matters relevant to computer science rather than, for instance, sociology. Of course, now I'm not sure how your reference to validity pertains to anything really relevant. Perhaps you could expound upon your previous statement so I know how it is meant to be relevant. Was it perhaps related to benchmarks?

QUOTE: Given that you are stating that the piece was an opinion only this now seems to me irrelevant except to say that the opinion is not based on a formal evaluation of the languages.

As long as you recognize that presenting the particulars of a formal evaluation is not necessary for the opinion in question to have some value for discussion and practical decision-making, I have no reason to disagree with that assessment.

QUOTE: This goes back to my use of the expression "coke and pepsi" article. I can see how that phrase could invoke a bit of hostility if it were used to describe one of my own articles.

I actually do not much care one way or the other that you used that phrase. I did not find it offensive in and of itself. A bit flippant, maybe, but there generally isn't anything wrong with a touch of flippancy from time to time.

QUOTE: I will rephrase it as "This article appears to express only an opinion and not a formal evaluation of the languages".

This is true. I would have thought the title was a pretty strong indicator of what the article provided, however.

QUOTE: The field of expertise to which I have been referring is computer architecture -- and it goes to the heart of what is or is not an effective and efficient programming language.

To some extent, that depends on your definitions of "effective" and "efficient". A lot of C++ developers seem to have the idea that all things must be evaluated first and foremost -- in some cases, even solely -- from the perspective of trying to get as close to bare metal as (reasonably?) possible. I take issue with that context for discussion, however. The real value of programming languages, and the most important measures of things like "efficiency" and "effectiveness" in such matters, is predicated on the understanding that we use computers to automate, and to provide advancement as the basis on which to build further advancement. If we were solely interested in the dominant computer architecture paradigm at its lowest levels, we would have to reject all languages that are not specifically tailored to that architecture. That would mean nobody would be able to use any descendant of LISP, for instance. Functional languages would, by definition, be excluded. Depending on how strict we wanted to be, we would have to start ripping functionality out of C++ itself, including support for recursion. That, or we'd have to abandon current architectures to facilitate the use of such programming languages, and revive the Lisp Machine. Luckily, however, most of us are not so zealous in our focus on bowing and scraping before the unvarnished architectural design as to refuse to use languages that offer useful abstractions.

QUOTE: All I can say to that is: if it seems that way, it was not intended to be such, and if you can point out how I can correct it, I will do so. Meanwhile, I offer my sincere apology and will make every effort to avoid anything of that nature in the future.

Start by either pointing out where I supposedly claimed the expertise to which you object, or retracting that statement and all arguments and statements that depend on it. I am willing to believe (conditionally) that you did not intend a character assassination. If that is the case, however, you have made some errors that created the form of such an attack, apparently without the intent.

QUOTE: The only advice I can give you here (not being an expert and not claiming to be) is to put on your thick skin when you enter the public arena as an author.

I have pretty thick skin. I weather claims of being a Microsoft shill pretty well, for instance, in discussions of articles less than a month after having weathered claims of being an unreasoning hater of all businesses -- most especially Microsoft -- in response to an article in which I addressed some of that company's software development failings. Having thick skin does not mean I should just ignore everybody who attacks me. I engage others in the hopes that:
* I can provide some perspective that helps other readers understand both sides of an issue (or that one "side" is in fact spurious nonsense, as it sometimes is).
* I can achieve some meeting of minds with someone who starts out pretty intent on tearing me a new one.

QUOTE: For my part, I will think a little longer before I post and put myself in the author's shoes to see how my post will sound to that person. I've not been posting enough and got out of my good habits.

I look forward to your planned response.

QUOTE: You certainly provoked discussion here, and that is the point of opinion articles, is it not?

It's my purpose in writing them. Discussion often leads to cogent analysis and useful insight, at least for some participants and readers -- hopefully including the author of the opinion piece that sparked the discussion in the first place.

QUOTE: Thanks for sticking with the discussion.

Back atcha.

====================

Mark Miller: I think I understand, and (obviously) sympathize with, your experience of the problem you have identified. Your account is well-stated and stands on its own; I don't really have anything of substance to add.

====================

edit: had to repost because TR ate the original

Mark Miller

An annoying thing I found a while ago is that even if I just use a couple of languages for comparison in an argument to make a larger point beyond the languages, I get people who hate me for it, because they think I'm disparaging their language of choice. I have to keep explaining, "No, I was making a larger point about software architecture," or something like that. I've gotten the same thing when I've used operating systems as examples for similar arguments.

Ever since I got into the technical aspects of computers at the age of 12 (1981), I have seen people attack someone else for using a technology, and others become defensive about their choice, and both sides spend LOTS of time talking about the merits or demerits of each. I was into that for years myself, until I finally realized that it was all about defending one's knowledge base about a particular technology or technology set, and the community that supports it, because community is so important to technology adoption. The more adopters, the better for you, because your technology preferences get more support. The problem I see with this setup is that technical merit seems to take a back seat to community dynamics.

Surely, each preference is based on some kind of technical merit. It's just that each person's sense of technical merit means something different. To a lot of programmers, performance is key -- though not "ultimate performance", or else everyone would be programming in assembly language, or at least in C compiled with an optimizing compiler that produces code as good as hand-coded assembly. A lot of the time, "acceptable performance" is good enough.

What gets to me are expressions of preference that boast of superiority because of popularity, among other attributes like performance. It's as if it is irrelevant whether a language adequately describes the computing process most compatible with the problem domain, or enables a uniquely well-suited problem-solving process for it. No, we prefer something that forces programmers to produce an unintelligible mess, creating more opportunities for bugs in the code base; forces programmers through the run-stop-edit-compile churn (for a good analogy, imagine if the only editor available to you were a line editor, instead of a really good code editor); and forces them to waste hours producing repetitive code (or, if the operation is really good, substitutes code generation tools using meta-languages), because we can't do without our static control structures and static types -- because it's popular and has the most library and tool support. Makes sense to me!

Tony Hopkinson

why he or I should not be skeptical about the value of C++. Doesn't have to be a treatise, a hint will do. If I haven't heard of it, I'll look it up. I can change my mind, being wrong until I get it right is what I do.

verelse

"I do, however, believe that when you say you are justified in criticizing without some formal standard of expertise but I am not, you engage in some pretty blatant exercise of a double standard. Perhaps you should consider applying your standards more evenly -- or even apply a strict standard to yourself than to others, I do to myself than to others." Difference being I did not post an article and claim that expertise :D "That's interesting, given that the article is not about this topic. As a result, your recognition of the potential value of such a degree seems somewhat irrelevant." Why? Isn't the purpose of these languages and compilers the production of machine code? Wouldn't an object critique actually evaluate what the language produces? I am not suggesting that one cannot criticize these languages without a formal knowledge of assembler, but the criticisms tended towards efficiency and effectiveness and how can we exclude the actual *output* from the discussion? "Scientific? Last I checked, "validity" is not a formal term of science. It is a term of predicate logic and related fields of philosophy." Google "Construct validity"...it is all about the scientific method. In this sense it means the degree to which one can legitimately infer a conclusion from the scientific test of a theory. Given that you are stating that the piece was an opinion only this now seems to me irrelevant except to say that the opinion is not based on a formal evaluation of the languages. This goes back to the use of my expression "coke and pepsi" article. I can see how that phrase could invoke a bit of hostility if it were used to describe one of my own articles. I will rephrase it as "This article appears to express only an opinion and not a formal evaluation of the languages". The field of expertise to which have been referring is computer architecture -- and it goes to the heart of what is or is not an effective and efficient programming language. All any language does is interact with the hardware--if it cannot do that it is not doing much. That is why I referred to the Computer Science degree as Computer Scientists are trained in this -- yet I know people who do work at this level without CSc degrees. Such is the nature of our field. "It does, in fact, look like a character assassination, where first you set me up as a straw man, then you knock me down using hand-wavy standards to which you seem to believe you have the key, and woe betide anyone who has a different (and perhaps more formal, rigorous, or academically respectable) definition. It looks especially bad when you claim I need such expertise to hold my opinion, but you do not need such expertise to hold yours." All I can say to that is if it seems that way it was not intended to be such and if you can point out how I can correct it I will do so. Meanwhile, I offer my sincere apology and make every effort to avoid anything of that nature in the future. I re-read the original article and your more fully written "C++ Skepticism, Not Hating" and see you are willing to defend your opinions strongly. The only advice I can give you here (not being an expert and not claiming to be) is to put on your thick skin when you enter the public as an author. You are going to get some flack...your best bet is to do what you did--defend your position. For my part I will think a little longer before I post and put myself in the author's shoes to see how my post will sound to that person. I've not been posting enough and got out of my good habits. 
You certainly provoked discussion here and that is the point of opinion articles, is it not? My responses were not intended to be personal attacks nor to claim any superiority of knowledge. Finally, I am not a C++ partisan and actually prefer Objective C--but most of my work is (very large) database work these days because it pays the bills. Thanks for sticking with the discussion.

apotheon

QUOTE: As I said, I do not claim the authority or mantle of "expert" status, but I can and do challenge an author who claims that status and then makes assertions.

We each clearly have our own definitions of "expert". Mine is closer to that of the Dreyfus model of skill acquisition. Yours is evidently closer to some HR bureaucrat's notion of what constitutes sufficient job experience to gain entry-level employment. As the author of the article in question, I have claimed no expertise in anything in particular, by either of those standards, and I think you probably are not reading very closely if you believe otherwise. I do, however, believe that when you say you are justified in criticizing without some formal standard of expertise but I am not, you engage in a pretty blatant exercise of a double standard. Perhaps you should consider applying your standards more evenly -- or even apply a stricter standard to yourself than to others, as I do to myself. I'd be satisfied with a recognition of your double standard, and an abandonment of it, however. It's not as if I will hold you to the same standards as those to which I hold myself.

QUOTE: I also said a CS degree is not "all that", but those who possess them tend to understand more about machine language and processor instruction sets than those without (note the word "tend").

That's interesting, given that the article is not about this topic. As a result, your recognition of the potential value of such a degree seems somewhat irrelevant.

QUOTE: The arguments made by the author were historical and preference-based, not grounded in any sort of science, testing, or any other construct that would have validity (in the scientific sense of the word).

Scientific? Last I checked, "validity" is not a formal term of science. It is a term of predicate logic and related fields of philosophy. In any case, the article is not a formal argument. It is an expression of skepticism. That is set forth pretty clearly in the title of the piece, I think.

QUOTE: But, as a reader, I am more inclined to listen to those whose credentials and "expertise" lend credence and credibility to their assertions when no evidence is presented.

Okay. Great.

QUOTE: Without that credibility (not gained by MS certs, of which many, myself included, possess many) the article must be considered at best opinion, at worst advocacy.

I'll make this clear for you: it's opinion, and was never meant to be taken as anything else. Again, see the title for where the tone was set. It's well-considered opinion, with some grounding in conditions of fact, but as a whole it is opinion. People do still have those, and other people are actually interested in hearing about them, believe it or not. I'd be interested in hearing about yours, if your opinion did not apparently consist of nothing more interesting than the notion that only people with doctorate degrees in computer science can have opinions.

QUOTE: As for "assassination of his 'expertise'", I did not challenge his bio, merely pointed to it, so I see no assassination. If he has more expertise than stated, I have no way to see it...he should post it.

You are suggesting two things:

1. that I claim "expertise" by your standard of expertise, for some reason
2. that I do not live up to that claim, with my TechRepublic bio as "proof" of that

It does, in fact, look like a character assassination, where first you set me up as a straw man, then you knock me down using hand-wavy standards to which you seem to believe you hold the key -- and woe betide anyone who has a different (and perhaps more formal, rigorous, or academically respectable) definition. It looks especially bad when you claim I need such expertise to hold my opinion, but you do not need such expertise to hold yours.

QUOTE: When one posts an article with such strong conclusions, it is reasonable to expect others to challenge it on fair ground. I did just that.

No, you did not. You implied I had untenable opinions without identifying them explicitly, stated that I have no business reaching them because I do not have the requisite degrees and length of professional experience in some as-yet-unnamed field of expertise, and claimed you do not have to live up to the same standards to have even stronger opinions without being willing to share much of their particulars with us.

QUOTE: Then again, when I post articles I am subjected to the same reviews.

I fully expected this kind of reaction from C++ partisans. That doesn't mean your "argument" carries much weight so far, however. I hinted at some more of my experience -- and my lack of clear expertise in certain areas, though my long experience in others -- in another venue. Maybe you will find more reason to hate what I have to say based on (edit: blogstrapping dot com essay "C++ Skepticism, Not Hating").

(edit: typo) (edit: fixed the way TR's changes in formatting rules broke this commentary significantly)

verelse

As I said, I do not claim authority or mantle of "expert" status, but I can and do challenge an author who claims that status then makes assertions. I also said a CS degree is not "all that", but those who possess one tend to understand more about machine language and processor instruction sets than those without (note the word "tend").

The arguments made by the author were historical and preference-based, not grounded in any sort of science, testing, or any other construct that would have validity (in the scientific sense of the word). When that is the case, the arguments *are* assertions. Whether or not they are assertions is not debatable. I welcome your correction of this viewpoint if you can cite those sections in the article; I won't hesitate to withdraw my claim and acknowledge my mistake.

But, as a reader, I am more inclined to listen to those whose credentials and "expertise" lend credence and credibility to their assertions when no evidence is presented. Without that credibility (not gained by MS Certs, of which many people, myself included, possess many), the article must be considered opinion at best, at worst advocacy. Hence the term "Coke and Pepsi". There is no argument for the merits of the author's positions other than more opinion; there are merely statements of preference and a citation of a discredited story, acknowledged by the author as bogus but later used to build his arguments. Why?

As for "assassination of his 'expertise'", I did not challenge his bio, merely pointed to it, so I see no assassination. If he has more expertise than stated, I have no way to see it...he should post it. When one posts an article with such strong conclusions, it is reasonable to expect others to challenge them on fair ground. I did just that. Perhaps I could have been more judicious in my language and considered the effect of my words, not just my intent. Then again, when I post articles, I am subjected to the same reviews.

Tony Hopkinson

ability to discuss the subject based on. No -- just that he never declared he had a degree in CS, that he "only" had an admin cert, and now that he has less than twenty years in. Not one word about the arguments he was making... I've got twenty-plus years in, and his arguments in the main made sense to me. Why you have a problem with them, other than the totally inaccurate attempted assassination of his "expertise", I have no idea, because you haven't bothered to say. As though we are supposed to take your unsupported assumption of the mantle of true expert unquestioningly....

apotheon

I thought you might be aiming for sarcasm, but I'm less prone than most to make assumptions without double-checking them.

seanferd

I often wonder when I might need to make sarcasm or some other twist more obvious (Poe's Law and all). I did choose to state I was joking in another post when I called Chip a "morphing sockpuppet", because I thought missing the joke would be more personally offensive. Sometimes, I choose incorrectly. My apologies if I gave you a dose of head-explodey. I hoped to be over-the-top obvious, but failed.

apotheon

It seemed like a sarcasm tag might be in order, but it wasn't a certainty based on your phrasing, so I had to ask.

seanferd

All I saw was the crazy. Been seeing quite a bit of that here, lately. I should have added a </sarcasm> tag. ;)

apotheon

Technically, the less-than and greater-than symbols are redirection operators. The vertical bar is the pipe.

Mark Miller

"Have you been on the recieving end of code where they used operator overloading but completely changed the sense of the operators. user < and > as a sort of redirection but + and - made sense" I worked on a C++/MFC project 10 years ago where the programmer had redefined "<" as a stream pipe (in the Unix sense). They had implemented a thread to set up a dialog box that would notify the user of progress of a data-collection operation. In the background it was contacting a server to download data. In the foreground it would give status updates. The "<" was used to say, "Send this string to the foreground process." It was fortunate some of the developers who were around when this metaphor was created were still around, and I could ask what this meant. I ended up reworking some of this code, because the status updates came as text strings through a textbox or listbox (I forget), and the customer just wanted progress bars. I think I got rid of the "<" pipe metaphor, because the dialog box didn't need most of the string updates anymore. As I remember, I substituted that for a singleton (I forget exactly how I structured this, but it worked. I know a singleton was in there somewhere) that acted as a go-between for the background and foreground processes, and just dealt with MFC's documented messaging incompatibility with threads via. a well-known work-around.

Tony Hopkinson

where they used operator overloading but completely changed the sense of the operators -- used < and > as a sort of redirection, though + and - made sense. Serious dandruff creator, that was (lots of head scratching). All to save them coming up with a function name...

Mark Miller

One student I remember was using C++ for his computer graphics course. He was using its operator overloading facility to do matrix computations. When I used C in graphics, I remember defining a struct or 2-D array for a matrix and then calling routines like matrix_add(a, b), matrix_multiply(a, b), etc. This one guy was able to take two Matrix values "a" and "b" and do stuff like: Matrix c = a + b; Matrix d = a * b; Matrix e = 5 * b; and so on. It looked really nice at the time. I was impressed that operator precedence even applied to the overloads.

Another student I knew was using an educational package called NachOS (kind of like Minix), with C++, for his operating systems course. It was an OS that was purposely broken (it had some missing pieces), and the idea was to fill in the missing pieces, like writing your own file system.

I had just taken a course where we covered Smalltalk for a couple of weeks (I fell in love with that language!), and what I saw of C++ at the time looked about as elegant (in a C sort of way), though I knew little about it. I got into learning C++ six years later, using Borland C++ 3.0, and it still seemed nice, though I didn't get into templates much. It wasn't until I got into MFC in 2000, when I took a job at a Microsoft shop where the C++ developers were using templates like crazy, that I finally saw what a mess it could be! Even so, I had been so conditioned by C (and the belief that I would never work with something as nice as Smalltalk again) that I put up with C++'s mess for a couple of years. From there I hopped from frypan to frypan for a while before I realized I was missing out on some good stuff.
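The matrix trick Mark mentions is easy to reconstruct. Below is a minimal sketch, assuming a toy fixed-size 2x2 matrix (the student's version was presumably dimension-general); nothing here is from the original code. Note that the overloads get ordinary operator precedence for free, which is the part that impressed him:

    #include <array>
    #include <cstddef>
    #include <iostream>

    // A toy 2x2 matrix, just enough to support the expressions quoted above.
    struct Matrix {
        std::array<std::array<double, 2>, 2> m;
    };

    // Element-wise addition.
    Matrix operator+(const Matrix& a, const Matrix& b) {
        Matrix r{};
        for (std::size_t i = 0; i < 2; ++i)
            for (std::size_t j = 0; j < 2; ++j)
                r.m[i][j] = a.m[i][j] + b.m[i][j];
        return r;
    }

    // Standard matrix multiplication.
    Matrix operator*(const Matrix& a, const Matrix& b) {
        Matrix r{};  // value-initialized to all zeros
        for (std::size_t i = 0; i < 2; ++i)
            for (std::size_t j = 0; j < 2; ++j)
                for (std::size_t k = 0; k < 2; ++k)
                    r.m[i][j] += a.m[i][k] * b.m[k][j];
        return r;
    }

    // Scaling by a scalar on the left, as in 5 * b.
    Matrix operator*(double s, const Matrix& b) {
        Matrix r{};
        for (std::size_t i = 0; i < 2; ++i)
            for (std::size_t j = 0; j < 2; ++j)
                r.m[i][j] = s * b.m[i][j];
        return r;
    }

    int main() {
        Matrix a = {1, 2, 3, 4};  // brace elision fills rows: [[1, 2], [3, 4]]
        Matrix b = {5, 6, 7, 8};

        Matrix c = a + b;
        Matrix d = a * b;
        Matrix e = 5 * b;
        Matrix f = a + b * c;  // precedence applies: a + (b * c)

        std::cout << c.m[0][0] << ' ' << d.m[0][0] << ' '
                  << e.m[0][0] << ' ' << f.m[0][0] << '\n';  // prints: 6 19 25 91
    }

Contrast this with the operator< pipe earlier in the thread: here the overloads mean what the symbols have always meant in mathematics, which is why matrix arithmetic is usually cited as the legitimate use case for the feature.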
