
Poll: Have you used C/C++?

Developer Justin James has never used C in his professional work and is curious to know if other developers use it in their careers.

When I entered this business, languages like Perl and VB were just coming to prominence and giving developers a chance to develop applications in languages other than COBOL, C/C++, FORTRAN, and Pascal. Those languages had really dominated application development for at least a decade, and even longer when you consider the history of COBOL and FORTRAN. By the time I graduated college, many schools had switched from teaching C or C++ to Java, and now it is not uncommon to find CS graduates who have never touched either language.

I used C in college briefly, but I've never had to use it professionally. Have you used C/C++ in your career?

J.Ja

About

Justin James is the Lead Architect for Conigent.

33 comments
oldbaritone

I spent 6 years programming the 6809 in C, running the embedded system controlling autonomous forklift trucks in a fully-automated warehouse. The vehicles were sent commands to position anywhere in a 500,000 sq. ft. warehouse within ±0.75" both horizontally and vertically, then extract or place the load. No rails, just a grid of guide wires. What amazed me from the day I arrived was that the system ran that accurately, using integer measurements, on a processor with 32K ROM and 8K RAM, clocked at 900 kHz. Typical latency was about 2 ms from interrupt to control response, and the relays (FETs weren't up to the task yet) had a mechanical response time of at least 20 ms. Commutated SCRs switching 300-750 amps of inductive load - talk about noise! Have you ever seen 4/0 copper wires visibly jump as the load switched on and off? (That's welder wire, roughly the diameter of your thumb.) 36 VDC, not counting the kV and MV transients... ;-) Good days. Fun.

StevenDDeacon

There are a lot of reasons why CS grads pass up C/C++ as development languages. For business applications it seems to be easier to learn the object-oriented (OOP) derivatives of C/C++, like Java, PHP, Ruby, C#, and Python. I find object-oriented languages cumbersome and difficult to learn. Having programmed in IBM 370/XA Assembler and Intel Assembler, I feel more comfortable with them than with C. I have always found structured, function-based programming languages easier to learn. Most of my business programming has been done with IBM COBOL and command-level CICS. In UNIX environments I used shells like Korn, Bourne, and Bash, along with Perl and a little C. For Web development, XHTML, CSS, and JavaScript seem to be the way to go. PHP seems to be used often for web development these days, and Ruby seems to have quite a following. Java is still considered the preeminent web development environment and is closely associated with SOA. For relational database access, it is good to know SQL.

CharlieSpencer

Does anyone still teach them? How about RPG or RPG II? Just wondering. I haven't dealt with COBOL or FORTRAN in over 15 years, although we were still running our manufacturing plant on a FORTRAN MRP application four years ago. I only saw RPG in school in the early '80s.

Mike Page

Universities that provide an education in computer science would be shortchanging their students if they didn't teach C/C++. Most modern languages hide a lot of complexity and allow the coder to use them without understanding how a computer works. C/C++ requires you to understand important concepts such as stack variables, heap variables, pointers, and memory management. In practical use, C/C++ is important for writing high-performance applications. You may be able to write slightly faster code in assembly language, but your productivity would be much lower and the code would not be portable, as it would be bound to the processor's instruction set. Languages are tools in a toolbox. Choose the one that solves the problem best. If performance isn't an issue, then you may be able to code more quickly in C# or Java.
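
For illustration, here is a minimal C++ sketch of the stack-versus-heap distinction Mike mentions (the variable names are purely illustrative):

    #include <iostream>

    int main() {
        int on_stack = 42;           // automatic storage: released when it goes out of scope
        int *on_heap = new int(42);  // dynamic storage: the programmer must release it

        std::cout << on_stack << " " << *on_heap << "\n";

        delete on_heap;              // forget this and the memory leaks
        on_heap = nullptr;           // avoid leaving a dangling pointer
        return 0;
    }

Garbage-collected languages such as Java or C# hide the last two lines entirely, which is exactly the complexity he is talking about.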

derek_c

I knew about C in the mid-to-late '80s; in fact, I bought a computer (a Sinclair QL) specifically because there was a C compiler available for it. Two or three years before that I had been writing professionally in Coral 66, and C was the closest fit in terms of features (but not syntax) with an affordable compiler. However, my first professional project in the C family was in 1989, using C++. It was an absolutely resounding success. I've been writing in C++ and C for a living ever since! Before 1989 I had been using a whole host of languages for real-time and engineering software; after 1989 there was simply no need to be a polyglot...

Tony Hopkinson

I did a bit of Turbo C, DEC C, and some C under Unix. I've done no C++ at all, beyond sight-reading and bug fixing. Academia chose Pascal for me. A very small percentage of an extensive career, though. I think it, or another procedural language, should still be taught in CS; I don't see how you can understand OO without knowing procedural programming.

TheWerewolf

Does a programmer get through a career without doing any C or C++???? These are THE most commonly used languages in the world... Especially if you include derivative languages like Java, C# and ObjC... We wouldn't even hire someone unless they had a solid understanding of C++.

Alpha_Dog

...the trouble is that no online school teaches it!

Sterling chip Camden

With only a few of those years not involving C or C++ almost every day. Mostly software for making software.

Mark Miller

I had C beginning in my 3rd year of college, and used it for 4 years (officially) in industry. I did a little application development with it for 16-bit Windows, but mostly used it to write tools for MS-DOS, and transaction servers for Unix. The three things that were "dangerous" about it in my experience were pointers (when there were a lot of them to deal with in a data structure), the strcpy() function (because it'd happily overrun a buffer without batting an eye), and floating-point arithmetic (do not expect precision!).

I used C++ for about a year-and-a-half. It was kind of disillusioning. I learned OOP in college with Smalltalk, and I kept expecting C++ to be a similar experience. It wasn't. My first foray into it was nice, at least from the perspective of having used C for a long time. I translated a C program into it, and noticed that C++ added some neat capabilities to the code, since I was familiar with OOP. One was greater clarity.

I think what screwed up C++ for me was templates, and my experience with MFC, somewhat. Templates seemed nice at first, because they helped make code more generic. It was really nice to be able to pass parameters of different types into a function and have it "just work." I only had to specify the templates once, and I could let the compiler figure out the rest. They got tiresome, though, when I got into working with a large application that was actually done using OOD. We used some design patterns, which frequently involved using templates, and a smart pointer to do memory allocation/deallocation. Templates became very visible. They didn't "just work" anymore. They became like any other type I'd have to specify anywhere else in the code. Having to wade through all that metadata, which was quite repetitive, became a chore.

A smart pointer, BTW, is a class (at least this is the way we implemented it) that acts as a container for an object on the heap. It handles creating the object, holding on to the actual pointer to it, and counting references to itself. All access to the object occurs through this container. Once the reference count to the smart pointer reaches zero, it deletes the object it's holding (not unlike dealing with COM in C). It requires a "gentleman's agreement": every programmer in the group has to use the same container for objects on the heap in order for it to work effectively to prevent memory leaks. There is a smart pointer type within STL, but from what I remember it did not do reference counting. It would simply allocate and delete an object automatically. You could "transfer ownership" of the object from one smart pointer to another, but that was it. Two STL smart pointers could not access the same object.

A lot of the C++ code I worked with didn't use templates at all, because the original programmer just wrote C code, only using objects to access the MFC API. So you could say I worked unofficially in C for a bit longer than 4 years.

MFC (Microsoft Foundation Classes) started out nice. I felt as though I was 10x more productive in it than I had been programming in C for either Windows or Unix. The thing was, this was only while I could stay in MFC's classes, which didn't provide complete coverage. Once I had to drop down into Win32 (C) code, it was not as pleasant. For anything beyond a simple MFC app, you eventually had to get into Win32. It didn't take that long, either, to find MFC's warts. Its generic containers sucked big time. We used STL's containers.
The screwiest thing I remember about MFC was the way it did message handling. The typical stuff wasn't that bad, but again, if you wanted to get beyond basic stuff, the warts would come out! C# and the .Net Framework looked positively sane by comparison, but eventually that turned out to be deceiving. The two things that killed that for me were Visual Studio's drag-and-drop method of application design, and ASP.Net WebForms. WinForms was actually decent, from what I could tell, if you'd skip using most of VS's tools, but hardly anybody in industry wanted to use it when it came out. That ship had sailed.
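
For readers who have not run into one, here is a minimal sketch of the kind of reference-counted smart pointer Mark describes. It is illustrative, not his team's actual class; the STL type he recalls is the old std::auto_ptr, which transferred ownership rather than counting references, and modern C++ now supplies std::shared_ptr for this job.

    #include <cstddef>

    // Illustrative reference-counted container for a heap object.
    // All access goes through the SmartPtr; when the last copy is
    // destroyed, the underlying object is deleted.
    template <typename T>
    class SmartPtr {
    public:
        explicit SmartPtr(T *p = nullptr) : ptr_(p), count_(new std::size_t(1)) {}

        SmartPtr(const SmartPtr &other) : ptr_(other.ptr_), count_(other.count_) {
            ++*count_;
        }

        SmartPtr &operator=(const SmartPtr &other) {
            if (this != &other) {
                release();
                ptr_ = other.ptr_;
                count_ = other.count_;
                ++*count_;
            }
            return *this;
        }

        ~SmartPtr() { release(); }

        T &operator*() const { return *ptr_; }
        T *operator->() const { return ptr_; }

    private:
        void release() {
            if (--*count_ == 0) {  // last reference gone: free the object and the counter
                delete ptr_;
                delete count_;
            }
        }

        T *ptr_;
        std::size_t *count_;
    };

The "gentleman's agreement" he mentions is exactly the weak point: nothing stops a colleague from keeping a raw pointer to the same object and deleting it behind the container's back.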

charlesnburns

I used it and taught it for years, but recently have written only C#, PHP, and Oracle-related code. I really like C++ and have rarely found significant merit in complaints against the language. It is a more challenging language in which to write some software, and some of its more advanced features can be confusing at first (multiple inheritance, template meta-programming), but many such features are simply absent from other languages, so it's a matter of "can use them but they are confusing" vs. "don't have the choice."

Ultimately, C++ has three major strengths:

1) It is based on C, which is itself close to assembly language, so it is closer to the machine. This not only makes for better compiler optimization, it also makes better programmers, even if they never use C++. Having a good idea of what is going on "under the hood" leads to better, faster programs and more confident software development. Library functions don't have to be mysterious and magical.

2) The core of most languages and operating systems is written in C/C++. C/C++ is also the standard for compatibility -- if you want a library that every language can use (even Java!), you write it in C.

3) C/C++ is simply faster. I've studied the avalanche of benchmarks that insist that Java "can be just as fast or faster", that C# provides "about 90% of the performance of C++", etc., but they are nearly all flawed. Every time I read one, the first step taken is to write a bunch of simple functions in each language. Setting aside the fact that the author may not know how to write them properly in every language, most compilers/VMs/interpreters can handle a bubble sort, a factorial, or another simple algorithm with minimal overhead. Whole applications are what matter in the real world -- applications that are affected by variables not often thoroughly tested, including start-up time, user-pattern disk I/O (no, STR and random I/O have little to do with real-world application performance), function call overhead, garbage collector overhead with tens of thousands of objects, threading overhead, and countless others. Even Facebook, long a champion of PHP, is now converting its PHP to C++ to get "70% more traffic on the same hardware" (and that's after the overhead of converting a high-level language to a lower-level one -- if the software had been written in C++ in the first place, the difference would be larger).

I love my C# with its properties, LINQ, consistent and powerful libraries, and other time-saving features, but when software performance is at the top of the list (and programmer time is not!), C++ will win every single time.

jkameleon

Machinocentric stuff, mostly for embedded systems for industry. Things C/C++ was intended for.

charlesnburns

I taught Fortran to students of nuclear engineering. It is widely used for reactor control and related software. I currently work at a semiconductor company whose MES (Manufacturing Execution System) is written in pure Fortran and runs on VMS. I have a few relatives at JPL and NASA who use it regularly to model spacecraft navigation problems. In "normal" industries, though, no, it's all C/C++, Java, .NET, web languages, and occasional assembly. I haven't seen much of COBOL or RPG. A friend of mine worked for a company that sold COBOL software running on low-end proprietary IBM workstations with AIX, but they went out of business a few years ago because, well, it was really old and no one knew how, or wanted, to update the software to move past 80x24 consoles.

Mark Miller

I can't remember when this was the case. Basic (I'm assuming this included VB) was the most commonly used language in developer surveys up into the mid-1990s. It wasn't until the late '90s that this dropped off. Maybe C/C++ enjoyed a brief stint at the top spot during this time. I remember when I went to one of the first .Net developer events, most of the people who showed up were VB developers. There were literally about 5 people in the whole place who programmed in C++. From what I understand, Java has taken over the top spot. I heard that Basic programmers switched to it several years ago after Microsoft made VB.Net incompatible with VB 6. I have no idea if that's the reason for its popularity.

Tony Hopkinson

If it's because you have a lot of C++ code to work on, that would make sense, though it makes your statement somewhat misleading. In 25 years, I've done maybe two to three weeks of C++ and probably 3-4 years of C, not counting deciphering API calls...

Slayer_

Schools stopped teaching C and C++ a long time ago, thinking that Java is close enough that you don't need to learn C. Many new developers don't even know there is such a thing as the Windows API. (Seriously.) We've made programming too easy....

coderancher

Most languages use the same IEEE standard to encode floating point numbers so the same rounding errors apply to all of them. I also have to take that into consideration when writing algorithms in Java, Python, and even MATLAB.
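
A quick C++ illustration of the point; the same result appears in Java, Python, or MATLAB because they all use IEEE 754 doubles:

    #include <cstdio>

    int main() {
        double a = 0.1 + 0.2;
        // Neither 0.1, 0.2, nor 0.3 has an exact binary representation,
        // so the comparison below is false in any IEEE 754 language.
        std::printf("%.20f\n", a);
        std::printf("a == 0.3 is %s\n", (a == 0.3) ? "true" : "false");
        return 0;
    }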

bboyd

Better than god-awful spreadsheet-embedded VB. That has a habit of showing up around my current shop. I guess that's what you get when engineers program when they don't really want to.

derek_c

VB has always really been a language for the GUI end of multi-tier applications. Then Java arrived to start taking that role over, but then the Sun and Microsoft tussle occurred, and Microsoft brought out C#, which has basically replaced Java in all-Microsoft projects. Of course, now there's the great swathe of Web projects where servers might be using C or C++ or C# or PHP or Python or Ruby (too many choices already!), but the front end might well be using JavaScript in a web browser, or ActionScript in a Flash front end. But C and C++ are still the most popular languages for system software, embedded software, and real-time systems.

dltaylor

I am still in school and, as far as I know, they are still teaching C++ here in Texas. My class was "Learn C++ by Making Games."

Mark Miller

If I heard a CS professor say that, I'd want to slap them upside the head. I'd also write a letter to their department recommending they be fired for incompetence. The only thing that's "close enough" to C about Java is its syntax. I don't think that's the reason they're teaching Java. At least I hope not! I think the main reason is they think it's the way industry is going. A big part of that, it seems to me, is that CS is moving away from its theoretical roots, and is essentially teaching software engineering, not computer science. C/C++ used to be taught as introductory languages in CS, as Java is today, which in hindsight I think was, and is, a mistake. C, C++, and Java were not designed as educational languages. I learned C 20 years ago in college, but I only took a half-semester course in it during my junior year. Most of my senior year courses used it. The introductory language was Pascal, which was much better suited for that purpose, because it was designed for students of programming. From what I saw some years ago, the introductory course sequence in C++ was 4 semesters, and most of it was in plain ol' C. A terrible waste of time (not the languages, but the length of time spent just on learning them).

Mark Miller

I realize that the IEEE standard is used by most languages. Some languages have implemented their own arithmetic functions for floating-point, providing greater precision. I read up on the IEEE standard several years ago, and it basically admitted that you cannot expect precision from it. It only produces an estimate of the actual value.

apotheon

We need more high quality fixed point decimal libraries.

apotheon

There are a couple of schools in my general area that use C and/or C++ for introductory programming classes.

apotheon

> engineering is a principled approach to making something.

How do you define "a principled approach", though? Does cut-and-paste cargo cult programming qualify?

> Even teaching the building of CRUD apps. is a kind of engineering.

The problem with this statement is that all of the engineering in most of this work was done thirty years ago, leaving these guys doing paint-by-number rather than engineering -- and, even worse, they aren't even incorporating advances in the state of the art that have been developed since then, which means they not only aren't doing the actual engineering (nor are many of them competent to do so if they wanted to), but also tend to ignore the people who are doing engineering. Engineering is not construction. It's defining how something should be constructed. When all you're doing is putting pieces together according to instructions handed down as sacred texts, blindly copying what others have done, and diverging from that only to the extent accidents of incompetence produce such divergence, I'm not inclined to call that "engineering".

> I agree, though, that the engineering we see today is not as good as that of just 10 years ago.

I would say, rather, that for the most part we aren't seeing engineering today -- but what little there is tends to be better than what we saw a decade ago, because the people actually engaging in activity that might reasonably be called "engineering" are building on the advances in the state of the art provided by previous generations of engineers.

Mark Miller

As Alan Kay says, though, engineering is a principled approach to making something. That doesn't mean it's of high quality. It's an outlook applied toward a solution. That's all. This goes back to what he's said before about how typical programmers make software. He compares it to how the ancient Egyptians built pyramids--big piles of bricks. Sure there were techniques for constructing them, but the quality of the engineering was low. It depended mostly on unskilled manpower, with a small cadre of skilled engineers and craftsmen, and bureaucratic organization. Compared to Chartres Cathedral, you can see a clear difference in the quality of the architecture, and the quality of building in terms of how it accommodated human beings. This speaks to the advanced outlook and skill of those builders. Both they and the Egyptians used engineering, but the builders of Chartres used a better form of it. Even teaching the building of CRUD apps. is a kind of engineering. At base, engineering is "how to" knowledge. *Modern engineering*, which most software has yet to meet, is more advanced, incorporating knowledge from science, providing a keen understanding of systems, tolerances of materials, and how scale affects what's being built. I agree, though, that the engineering we see today is not as good as that of just 10 years ago.

apotheon

It might be enough to give someone a grounding in theory with one highly flexible language, but I think it would be more optimal to start with such a language, then teach some others along the way (HLA being one of them; something like Python, Ruby, or Smalltalk being another -- and yes, I'm aware there are distinct differences between the three).

apotheon

I think your use of the term "software engineer" is either based on old information or entirely too kind. These days, they don't teach anything like engineering, judging by what I've seen; they teach daycoding. Clock in at eight thirty, write DB API code for CRUD apps for four hours, clock out for lunch, clock in again after half an hour, write DB API code for CRUD apps for four more hours, clock out, go home, and don't use a computer outside work except perhaps to buy Christmas presents and look at porn. The people turned out by college "Computer Science" departments seem barely trained to perform these basic tasks. As much as I refer to myself as a mediocre developer, I believe I'm a better programmer than most of the people graduating with (nominal) CompSci degrees today.

Sterling chip Camden

But those are really all implementation details that you learn how to manage -- a good programmer starts with a functional understanding of the problem, then adapts the solution to the tools with which s/he must work. The cleanest C code is written in a style that is almost free of side-effects. That's a lot easier done in C than it is in COBOL or Basic.
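
As a small, hypothetical example of the side-effect-free style Chip describes, compare a function that mutates shared state with one that takes its inputs as parameters and returns its result:

    static double running_total;            /* shared, mutable state */

    void add_to_total(double amount) {      /* side effect: changes a global */
        running_total += amount;
    }

    double total(const double *amounts, int n) {  /* pure: inputs in, result out */
        double sum = 0.0;
        for (int i = 0; i < n; ++i)
            sum += amounts[i];
        return sum;
    }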

Mark Miller

I remember having an argument with one of my CS professors when I was an undergrad over the popular-language issue. This was 22 years ago now. He told me what you said: the "hot" languages come and go. What you need is skill in understanding how to learn a language, so that you can learn any one that comes along. It's hard for me to say if teaching only Lisp would prepare people for any language. C is quite different, isn't it? I understand that the functional aspect, and the idea of references, would be similar. I'm talking about the memory model and pointer arithmetic. Are there analogs to these things in Lisp? It just seems like there's a lot more that can "go wrong" in C.

Sterling chip Camden

... ten years ago. That's the trouble with teaching supposedly marketable skills instead of theory: the schools can't keep up. But if they taught Lisp and all the theory behind it, they'd produce people who could code in any language.

Mark Miller

Dealing with double literals was okay. Where it got really hairy was when I'd use atof() or strtod() from the standard library, which converted a string (in my case, a currency value) to a double. I used to use these rather often when I was writing CRUD code for database apps. The numbers would appear to convert fine. If I printed out the value, it would show what I expected. I didn't get around to confirming this, but it seemed like the conversion would always leave some "stray bits" in the double value, because if I'd divide it by an evenly divisible value, I'd get junk. It would be really close to the value it should've been, but it'd always be a bit off. I didn't see this sort of extreme behavior with literal doubles. Looking back on it, maybe atof() was getting confused by having two significant decimal digits in the string. Anyway, the only solution I found at the time was to do gymnastics: converting the double to an integer, and sometimes converting it back, clean, if I needed to. It was really frustrating.

One time I saw a bug along these lines that horrified me. Our client app was receiving database records from our server. The code for one screen would convert a currency value (a string) to a double, and then display it for the user. I'd just hit "OK" on the screen, changing nothing. The value got converted back to a string (probably using sprintf(buffer, "%f", value)), and when it did so, 1 was added to the hundredths place. I did not add 0.01 to the value. Just the conversion did it... somehow. Every single time the value got transferred back and forth between the server and the client, a penny was being added to it! This drove me up the wall.

I agree with apotheon. A good fixed-point library would've been a lifesaver. There was one commercial package I remember seeing around at the time that promised greater floating-point precision (this was in the late '90s). I don't remember if I passed the idea by my boss, but I doubt my employer would've sprung for the money. It was an annoying problem as far as they were concerned, but it wasn't worth shelling out bucks to fix. Sometimes I think, though, they were penny-wise, pound-foolish about this sort of thing. Consider my time spent trying to fix problems like this; it was certainly costing them more than the library. Then again, there is the old rule that any time you introduce unfamiliar software into your system, you can introduce new bugs or portability problems. We had to worry about the latter, because our Unix software had to be ported to customers' systems.
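
A hedged sketch of the work-around Mark ended up using: parse the currency string, then do the arithmetic in integer cents rather than in doubles (the function name here is made up for illustration):

    #include <cstdio>
    #include <cstdlib>
    #include <cmath>

    /* Illustrative: convert a currency string to integer cents once,
       then stay in exact integer arithmetic from there on. */
    long to_cents(const char *s) {
        double d = std::atof(s);        /* the nearest double is slightly off */
        return std::lround(d * 100.0);  /* round once to recover the intended value */
    }

    int main() {
        const char *price = "19.99";

        std::printf("as double: %.17f\n", std::atof(price)); /* shows the stray bits */

        long cents = to_cents(price);
        std::printf("as cents:  %ld\n", cents);              /* exactly 1999 */
        return 0;
    }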

coderancher

Modern high-level languages (e.g., Java, Python, SQL) have exact (fixed-point) decimal types in their standard libraries, but computations using those types (which are implemented in software) are slower than floating-point arithmetic (which is implemented in hardware). Thus, fixed-point decimal types are often used only when penny-pinching is literally necessary (e.g., financial computations). In science and engineering, as long as the precision of the floating-point result significantly exceeds that of the measuring devices, it is a good enough compromise between computational efficiency and exactness. The simplest way to mitigate rounding errors is to always use double precision in computations. Also, instead of testing for equality, test whether a value is within twice the estimated error of another. There is an extensive paper on the issues of floating-point arithmetic: http://download.oracle.com/docs/cd/E19957-01/806-3568/ncg_goldberg.html
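
A small C++ sketch of the comparison rule suggested above, testing closeness against an estimated error bound instead of exact equality (the tolerance chosen here is purely illustrative):

    #include <cmath>
    #include <cstdio>

    /* Compare two doubles against an estimated error bound instead of with ==. */
    bool nearly_equal(double a, double b, double estimated_error) {
        return std::fabs(a - b) <= 2.0 * estimated_error;
    }

    int main() {
        double sum = 0.0;
        for (int i = 0; i < 10; ++i)
            sum += 0.1;                 /* accumulates a little rounding error */

        std::printf("sum == 1.0?          %s\n", (sum == 1.0) ? "yes" : "no");
        std::printf("nearly equal to 1.0? %s\n",
                    nearly_equal(sum, 1.0, 1e-9) ? "yes" : "no");
        return 0;
    }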