Software Development

Is there still a need for Hungarian notation?


I do a fair amount of Java programming, but it has always been for servlets and JSPs, so I've never had a need to do any Swing GUI programming. I've long been curious to learn more, so I recently picked up Herbert Schildt's Swing: A Beginner's Guide.

I haven't finished it yet, but it's been quite good -- one of the better books of this type that I've read. No complaints at all about the technical aspects of the book. The thing is, in all of his code examples, he's using a variant of Hungarian notation for his Swing components:

JLabel jlabBass;
JSlider jsldrBass;

I haven't actually seen Hungarian in quite some time -- mostly because I think many programmers (especially me) are kind of lazy. Remembering to consistently prepend an appropriate prefix to variable names is tedious.

Now, for you younger programmers: Hungarian came out of Microsoft and was essentially a set of rules for creating a prefix in front of all your variables. The prefix would tell you something about the type of that variable. The idea was that you could easily tell whether two variables were of compatible types just by looking at their prefixes.
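
For example, classic Hungarian looked something like this (the exact prefixes varied from shop to shop, so take these as illustrative rather than canonical):

int nVolume;          // n   = integer
boolean bMuted;       // b   = boolean
String strTrackName;  // str = string

// A glance at the prefixes alone tells you this assignment would be suspect:
// nVolume = strTrackName;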

As computers got faster, there was enough horsepower available for IDEs to start checking variable types in real time. So, as you code, the IDE can alert you to type incompatibilities (usually with the little squiggly red underline popularized by Microsoft Word's automatic spell-check).

Which brings me to my question: Is there still a place for Hungarian notation in a world with type-savvy IDEs? I probably would have written those variables like this:

JLabel BassLabel;
JSlider BassSlider;

On the other hand, I do see the benefit to consistent use of Hungarian. I think it actually makes sense for code that's going to be printed, as in Schildt's book. In paper form, Hungarian makes the code quickly readable at a glance.

So, do you think there is still a place for Hungarian notation? Do you use Hungarian where you work?

159 comments
JackOfAllTech

"Hungarian came out of Microsoft" Not a chance! Hungarian notation has been around since C was invented and Unix was born. Ralph

orlando-metcalf

I have read with interest everyone's comments and views on this question. My first programming job was in 1966. What I have learned over the past few decades is that there is a need for standardization. This leads to cleaner code reviews, better design testing, and fewer bugs in the initial release. Standardized coding processes are shop-level activities and independent of the language used.

Is it a pain to provide a header for every code module and subroutine? Yes. Is it necessary? Yes. With all the built-in capabilities of today's IDE tools, why is this necessary? Object-oriented languages permit a given function and variable to be overloaded or changed during runtime. How, when, and where do I learn what I am dealing with when the code produces the incorrect result? As a point of reference, in the early 70's I wrote self-modifying and adaptable code. We were overloading functions and variable data types before the fancy words existed to describe this ability.

Is there still a need for Hungarian notation? No, there is not a NEED for Hungarian notation. There is, however, a need for standardized coding processes within a given organization. Many of the respondents have touched on various portions of the "shop standardization" process:

1) Module/subroutine headers describing the function of the module, the input variables (typed), the output variables (typed), the author and date, and the revision history.
2) Shop naming convention - I still use Hungarian.
3) Code review procedure.
4) Code release procedure.
5) Code source control procedure.

I love these question and discussion threads. Please keep up the great exchange of ideas and experiences.

Zeroesque

Uh, I think it was really born from BCPL, which had no type-checking.

nickpixel

When I write script (VBScript, JavaScript, etc.), where I'm dealing with variants as opposed to typed variables, I always use Hungarian notation. When writing C# or Java, I find it redundant because Eclipse and VS.NET can tell me whatever I need to know about the variable and in the rare cases I open up source in notepad, I can still see the type wherever the variable is declared. In C#/Java, I'd much rather see "contractCount = 0" than "intContractCount = 0" but that's a personal thing. There is no right or wrong answer (unless, of course, your employer has coding standards you have to follow and then it doesn't matter what your opinion is :) ).

david.castlewitz

We abandoned strict Hungarian Notation about a year ago because "intellisense" (Visual Studio, C#) works better without it - at least, in our opinion. However, we still prepend member variables with an m to aid readability. Gone, however, are the s, n, g, z, and other type indicators.

jslarochelle

Where I work several programmers are allergic to it. I personally don't think that there is much value added when you use a strongly typed language. I think having some kind of convention to differentiate parameters from local variables, data members and static members (Ruby requires this: the @ and @@ prefixes) might be useful, but I'm not sure exactly what form this could take. JS
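
To make the idea concrete, a convention along these lines is one possibility in Java (the class and prefixes here are purely illustrative, not a recommendation of any particular standard):

public class VolumeControl {
    // One possible convention: s- for static members, m- for instance members
    private static int sInstanceCount;
    private int mVolume;

    public VolumeControl() {
        sInstanceCount++;
    }

    // a- marks a parameter; bare names are locals
    public void setVolume(int aVolume) {
        int clamped = Math.max(0, Math.min(100, aVolume));
        mVolume = clamped;
    }
}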

Wayne M.

aGood nGrammar vIs nImportant pTo aGood nCommunication. nThat vIs nWhy nI vPropose nWe vUse aHungarian nNotation pTo vIdentify aWord nTypes pIn aOur nWriting. nThis vWill vMake aOur nWriting aMore nClear.

Jalapeno Bob

I have used Hungarian notation since I was an undergraduate learning PL/1 in 1968 and I still use it today. We didn't call it Hungarian notation, however. It was just Dr. Braun's recommended way to name variables. I have used it in all languages I have worked with, except FORTRAN, in the years since, including PL/1, TAL, BCPL, C, Visual Basic, TCL/tk, Unix/Linux shell script, awk, at least half a dozen assembler languages, ...

Tony Hopkinson

is the way it exposes a specific implementation detail in what is meant to be -- certainly when coding for a Windows GUI -- a fairly high-level abstraction. If your implementations don't change that often, and you have a well-adhered-to single standard, and you maintain and evolve it in step with the code base, you can do it and gain the original benefit. If all those pieces aren't in place, though, or even weren't in place, it will harm more than help. Sensible standards, both in naming and design, which have continuity over a code base's lifetime are the single biggest factor in a maintainable design; anything after that is red wine gravy.

Wayne M.

[Reference: US TV commercial for Ho Hos - a snack cake] There is even less added value in prefixing a variant or any other dynamically typed or untyped variable with a data type. I can treat theBirthdate as a date or a string. I can treat theNumberOfOrders as an integer or a string. One of the few places where I ever encode data type information into a variable name is when I need to convert types (and then I find it is clearer to include the information as a suffix rather than a prefix). If there is no need to do type conversion, why try to specify an arbitrary data type?

Tony Hopkinson

Don't use Hungarian in dynamically typed scripting environments (Perl, Python, Ruby, etc.), that's massively counterproductive. I consider variants untyped, horrible, nasty things -- give yourself as much help as possible. As deepsand says, don't rely too much on the IDE; after all, you could be reduced to Notepad and the command line. :D Simple conventions, like EmployeeNumber is always numeric and EmployeeID is alphanumeric, along with decent names, usually take care of most things. I do use Hungarian when the name would be misleading, e.g. strFileDate or some such.
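
For instance, a sketch of the kind of case where the name alone would mislead (the names and date format below are made up for illustration):

// The value is conceptually a date, but it lives in a String,
// so the str prefix warns against treating it as a real date.
String strFileDate = "2007-03-15";

java.text.DateFormat format = new java.text.SimpleDateFormat("yyyy-MM-dd");
try {
    // Once parsed, the static type carries the information; no prefix needed.
    java.util.Date fileDate = format.parse(strFileDate);
} catch (java.text.ParseException e) {
    // A malformed file date would be handled here.
}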

deepsand

Unless you are developing for your sole personal use, never assume that the tools used for development will always be available to everyone. If for your use alone, make such assumption if you like, but be aware of the risk before doing so.

Paul W. Homer

The problem solved by Hungarian notation -- creating names for variables -- isn't a real problem if you understand what you are writing. Everything has at least one unique and consistent name, and most things have many. Thus, it is extra unwanted complexity that increases the likelihood of someone misunderstanding the code. A very negative thing indeed. Paul. http://theprogrammersparadox.blogspot.com/

deepsand

The application of taxonomy to computer programs' elements began as soon as those procedures which served to translate the programmer's code into machine language, i.e. interpreters, assemblers & compilers, along with storage resources, allowed for such.

Paul W. Homer

The funniest thing I've ever read in a technical book came from the intro to one of Charles Petzold's Programming Windows books. If I remember correctly, it was when the underlying implementation went from 24 bits to 32 bits. He was trying to justify (explain?) why the names of two of the variables -- WPARAM and L something -- were different, yet the underlying types were actually the same. I found it hilarious at the time mainly because it so adequately showed why Hungarian notation was such a bad idea. Paul. http://theprogrammersparadox.blogspot.com/

nickpixel

The value I find in using Hungarian notation with variants in scripting languages is that it provides a hackish way of declaring the way a variant is going to be used. In the grand scheme of things it shouldn't matter, since you should be checking type before performing type-specific operations on a variable (like checking IsArray() in VBScript before accessing an index), but it prevents me from having to scan through code to see how a variable is being used in order to determine its type. In JavaScript, it can save some concatenation/addition headaches. For instance, if I have 2 variables with generic names like "a" and "b", the "+" operator is going to act differently based on the values stored in the variables. If a = 1 and b = "2", (a + b) will evaluate to "12". If a = 1 and b = 2, (a + b) will evaluate to 3. With scripting language variants, HN serves more of a purpose in making sure I use variables a certain way.

Tony Hopkinson

It's so you can fool yourself into believing that you've written type safe code. I f'ing hate variants.

jslarochelle

Although I don't totally disagree with your point about tools, I must say that my position on this has shifted in recent years. I think that for OOP, a tool with a decent class browser is almost an absolute requirement. A large project made up of small, highly cohesive classes will often contain a lot of classes. Being able to visualize the hierarchy of a class and easily move to a class or method definition makes the "correct" OO approach much easier.

Several years ago (around 1991) one of my friends had a job programming a kind of multimedia application in Smalltalk. I remember being puzzled by his excitement about his "browser". What was so great about this feature? The rest of the graphical IDE he was using was excitement enough for me compared to what I was working with: a (text-based) text editor. I also wondered why Bertrand Meyer was spending so much time discussing the "proper environment" in his great book "Object Oriented Software Development". Of course I read the arguments, but I was not convinced.

Well, today, because of the years I have spent working on fairly large projects with lots of code written by other people, I am convinced: a good object-oriented technology is a language + an appropriate environment. I am not talking about a GUI builder but about a tool that supplies functionality that a computer is good at: "stupid things" like showing a class hierarchy or finding the declaration or usage of a method. Not generating code. Using this I can concentrate on writing good code and solving problems. Of course I can still use a text editor, but it is painful. Fortunately for me I work in Java, and several good open source OO IDEs are available.

I know I'm not totally "on topic" with this. Sorry. JS

nickpixel

I appreciate your concern and maybe I should've been more elaborate in my comment, but since the topic is 100% subjective (regardless of what some people think), I was just adding my 2 cents. The code standards for the company I work for specifically state not to use Hungarian notation (this was pulled directly from MS's C# Usage Guidelines). When we became a strictly .NET shop (even though we still have many existing Java apps to support) we updated our standards and used a lot - not all - of MS's suggestions.

Besides, there are always assumptions. As I said in my previous post, "in the rare cases I open up source in notepad, I can still see the type wherever the variable is declared". And if the variable happens to be in another class file, it's moot anyway unless you name the public properties using HN as well. Do you use HN to name public properties (for example: MyObject.strProperty)? If not, why not? I don't feel that assuming everyone who works on the code will have a text editor with a find feature is any less safe an assumption than assuming they have the necessary compiler to compile the code. If you do feel that is an unreasonable assumption then we'll just have to agree to disagree.

jslarochelle

My opinion on this is very similar to yours (short version): Hungarian notation adds "noise" to the code and is not a good idea. However, I think there are more important factors to focus on (I'm amazed that this blog is still going on) and I'd rather spend my energy fighting for those aspects of programming: good abstractions, loose coupling, encapsulation and information hiding, ... Anyway, once everybody agrees on those more important aspects of programming, I don't think it would be difficult to agree on Hungarian notation (don't use it) if it is a problem. I still think that some convention to indicate the scope of a variable (like in Ruby) might be useful. JS

Tony Hopkinson

closely followed by labels for jumps, calls and conditional branches. Then meaningful names for memory addresses. Early languages were simply a recognition of very low-level patterns such as loops, etc. Data types weren't expressed, obviously; scoping was, though -- $ for an address and # for a literal, back when I started. We should bring it back educationally; students would then appreciate all the tools they are now provided with.

Tony Hopkinson

with it, then OK. What about 1 + '2' and '2' + 1? Variants have all the cons of both static and dynamic typing, and not one of the advantages. You have my deepest sympathy; I did it for a bit, and it's too easy to make a very hard-to-find mistake in a complex piece of software. Any help you can give yourself is valuable at that point.

deepsand

Doesn't look "off-topic" at all. It's good to recognize that not only the results of one's work are dependent on the tools used for their creation, but so too is the ability to service them. And, that serviceability needs to be part of the design considerations.

nickpixel

Do you not know what Sophism is or have you just grown so tired of replying to your own fallacies that you've resigned yourself to posting arbitrary statements? If you do know what sophistry means then hi Pot, my name is Kettle. Nice to meet you. Do let me know when you post your tutorial on how to edit source code without a text editor. If it involves telepathically communicating in binary, I'm not sure I'd be able to learn but I'd love to read about it.

nickpixel

Do you have any shame? In post 121 of 140, I said "the entire quote, not an edited sentence". In post 123 of 140, I said "the entire quote, not an edited sentence". In post 125 of 140, I said "copy and paste the unedited quote". Of course I'm not psychic, I just knew you had 2 options. One was to paste the whole quote and admit you had never actually read it. The other was to paste an edited quote that, when read out of context, would make it appear you were addressing something I said. For a while you chose option 3 (ignore the question) which makes it pretty clear you knew you couldn't paste the whole quote. Strangely, the little ounce of shame that was preventing you from doing exactly what I predicted you would do (posting an edited quote to change the context of the statement) disappeared. Here is the quote in its entirety, which has a completely different meaning than the truncated version I accurately predicted you would post (thanks for being so predictable). ["When writing C# or Java, I find it redundant because Eclipse and VS.NET can tell me whatever I need to know about the variable and in the rare cases I open up source in notepad, I can still see the type wherever the variable is declared."] Not that you didn't already know, but the full quote makes it quite clear that not using HN causes me no problems when opening source in notepad. That contradicts your fallacy that I implied I was relying on these tools. In context, the only point being made about those tools is that they provide helpful features. Hopefully, the lack of shame displayed by you in this thread only exists because of the anonymity that the web provides.

nickpixel

I've read your initial reply and haven't said anything that would imply I didn't. You said ["Given your stated reliance on development tool"] but, even though I asked several times, you cannot produce a quote from me where I stated I'm relying on a development tool. If I did state this, it should take you no more than 30 seconds to copy and paste the unedited quote. The fact that you'd rather spend more time constructing straw men and addressing them tells all. That's 3 times now you've avoided my request. Let's see if you can make it 4.

deepsand

"Unless you are developing for your sole personal use, never assume that the tools used for development will always be available to everyone. If for your use alone, make such assumption if you like, but be aware of the risk before doing so." ... It would become clear that: 1) Given your stated reliance on development tool, such is reply is both germane & appropriate; and, 2) It is you have strayed from the narrow path that you insist must be kept to. Persist, if you like, in holding to the belief that it's "100% subjective." If that's the case, then why do you persist? You like chocolate; I like coffee. EOD.

Bob.Kerns

Free tools that allow you to quickly locate the definitions for things (as in a keystroke or two) have been available for longer than many programmers have been alive. Emacs (TECO-based) came on the scene in the late 1970s. It was followed by a series of portable clones, generally with an embedded LISP interpreter, culminating in GNU Emacs, which is still widely used today. Just about every IDE environment has had this available as well. The Mac environments did in the early 1980s. Now we have free, cross-language IDEs like Eclipse and friends.

If you don't have a tool that lets you quickly go to a definition of a class, variable, method, etc., then you're probably the sort that hangs pictures by grabbing a pencil, a beer mug, and pounding the pointed one into the nearest flat surface. Just because one guy still programs by highlighting dead trees isn't a good enough reason to degrade everyone's productivity writing and reading code with the most modern tools available. So I think it's perfectly reasonable to assume a least-common-denominator of tool functionality. Within a project, it may well be reasonable to assume a particular tool, so long as the assumption doesn't carry too heavy a burden if, say, tool choices evolve.

nickpixel

...but all I was saying was that, unless you only read the first half of the sentence where I mentioned using VS and Eclipse, I don't understand how you can say I was relying on either. The second half of that same sentence clearly states how and why not using HN for typed languages causes me no issue when opening or editing source in notepad (or any other text editor with a find feature). "Relying on" and "using" an IDE are not the same. If you were not implying that I claimed to be relying on an IDE, then I apologize but since you posted it as a reply to one of my posts, I'm sure you can understand why I would think that is what you were implying.

nickpixel

I'm glad you know Eclipse is a "developmental tool". Do you want a cookie? I don't recall asking you if Eclipse was a "developmental tool" but since all of your replies are to straw men and not the content of my posts, it doesn't surprise me. I asked you to quote my post where I ["indicated that [I'm] relying on the availability of a particular developmental tool..." (the entire quote, not an edited sentence)]. For some reason (maybe because I never said it) you decided to tell me that Eclipse is a developmental tool. Any other bombshells you want to drop on me?

I don't need to read the entire discussion because I was not replying to any post in the discussion. My post was in response to the topic "Is there still a need for Hungarian notation?". You're the one that replied to me. If my first question was answered then you believe the question "Is there still a need for Hungarian notation?" has an objective answer. Is that answer "Yes" or "No"? When I asked you ["Do you use Hungarian notation to name the public properties of your classes (for example: myObject.strProperty = "string")? If not, why not?"], what facts am I assuming? If you're going to avoid a question because, as you say, it ["assumes facts not in evidence and is irrelevant"] then you at least have to tell me what facts you assume I'm assuming and how it is irrelevant. Otherwise, I have no other option than to assume you don't want to answer the question.

Finally, in response to your statement: ["That I fail to agree with your belief that the matter(s) under discussion is(are) wholly subjective seems to greatly trouble you"]. Again, this shows that you have not been reading my posts or you are really committed to using Straw Man fallacies. In my first response to you I ended by saying ["then we'll just have to agree to disagree"]. The only thing that troubles me is that you're pretending to be replying to my comments when you're not. I don't want you to "apologize nor alter [your] position". I just want you to stop altering my position. I would've already stopped replying to you if my co-workers didn't find the whole thing hilarious. A couple of them are under the impression that you're not serious and are purposely messing with me, but I'm not sold on that yet.

Tony Hopkinson

would do it. Not that I'm recommending them! Wasn't my point anyway. Visual Studio interacts with the typing system to give you a very rich, on-hand description of what you are looking at. Without that facility, things become more difficult, so relying on it means you are relying on the tool -- that was all.

deepsand

2) Re. "Hungarian notation" vs. "taxonomy," you may wish to read the entire discussion to see how many share your view that we are here perforce limited to addressing solely the former. 3) Your 1st question was answered. 4) Your 2nd question both assumes facts not in evidence and is irrelevant. That I fail to agree with your belief that the matter(s) under discussion is(are) wholly subjective seems to greatly trouble you. For that I neither apologize nor alter my position.

nickpixel

...where I "indicated that [I'm] relying on the availability of a particular developmental tool..." (the entire quote, not an edited sentence). The fact that you have ignored the two simple, straightforward questions I have asked you is just as telling as if you had answered them. You ever think of going into politics? See, when I read "Is there still a need for Hungarian notation?", silly me thought it meant "Is there still a need for Hungarian notation?". I had no idea the question really was "Has taxonomy in programming been rendered obsolete?".

nickpixel

...since they're not long and it appears you may have missed a sentence or two. Specifically, the sentence that starts with the qualifier "When writing C# or Java..." and ends with "...in the rare cases I open up source in notepad, I can still see the type wherever the variable is declared". Otherwise, if you did read this sentence, I need to go back to school because I didn't know you could use untyped variables in C# or Java. Even when I use reflection, I still have to use typed variables (Type, MemberInfo, MethodInfo, etc., etc.). Since I'm always up for learning new things, if you didn't misunderstand my posts, could you point me to a document or tutorial on how to use untyped variables in either Java or C#? I'd appreciate it. Thanks.

Tony Hopkinson

is far more than find. Any language where you don't type your variables by explicitly declaring them will depend on what's found first.

deepsand

And, that not all languages are equally accommodating of such perforce allows of the need for comments, which is therefore also germane to this discussion. My responses to your post which began this sub-thread were prompted by the fact that you indicated that you were relying on the availability of a particular developmental tool, which may or may not be available to all at all times. My point is that one should not rely on a tool to retrieve knowledge which is easily & more reliably otherwise stored; i.e., a tool used for creation makes for a poor means of memory. The question "Is there still a need for Hungarian notation?" implies that the need for taxonomy in programming may have been rendered obsolete. As such a question allows of objective analysis, it is not "100% subjective."

nickpixel

Are you sure you're in the correct thread? I never said the constraints you mention are objective and am hoping you are just misunderstanding my posts (as opposed to creating straw men arguments to reply to). The topic of this conversation (thread) is "Is there still a need for Hungarian notation?". This is 100% subjective. Do you agree or disagree? Second question: Do you use Hungarian notation to name the public properties of your classes (for example: myObject.strProperty = "string")? If not, why not?

deepsand

We're not speaking of writing prose, but of computer code. Both the compiler/interpreter/assembler and the language itself constrain one to a specified and finite subset of the universe of possible inputs as being valid ones. These constraints are objective in nature. How we respond to such constraints does, therefore, allow of objectivity. It is only the extent to which one retains an objective stance that is subjective. As for assumptions, all that I've previously said re. such in no way means that such assumptions are perforce unavoidable; in fact, all of them are easily & simply avoided.

Tony Hopkinson

Fundamentals of Computing from the Open University (the recognised correspondence course degree in the UK), not exactly MIT. Language design is the key to it though, look how bad the VB one was. Just the continuation line is a horror story. The better the design the simpler the code, I think I'll make that Rule Three.

deepsand

It seems that the real question is whether theory & its practical application can and should be taught separately or as a unified whole. This does, in part at least, depend on both the discipline & one's intended use of such. As my study of compiler theory occurred the better part of 40 years ago, my memories of such, and of the extent to which theory & application were taught hand-in-hand, are rather vague. However, I can sufficiently recall my studies in both electronics & electrical engineering/analog computing to say that the hands-on lab experience was, particularly re. the latter, invaluable for reinforcing the theories. Actually building an analogue, by using integrators, differentiators, et al., created of physically real op amps, served to put flesh on the skeleton of learned theory; and it served to satisfy me that what I believed to be a correct understanding of the theory was in fact the case. As for compiler theory, that small quiet voice in my head suggests that I was less than satisfied with that course, that the ah-ha moment when theory met practice was too long in coming.

Bob.Kerns

(Reply to deepsand, but can't nest replies any deeper...) I won't get into whether Lisp is easy to read, other than to assert that a simple, regular syntax DOES make things easier for users than a convoluted, involuted syntax that makes it hard to predict. APL is not the contrary comparison I'd make here -- the only issue with APL is that the symbols are unfamiliar, not a difficult syntax.

Of course, if you view compilers as translators, then parsers have a role. But consider -- the grammar of the source language is just a part of understanding the source language -- and then you have the target language to consider. How can a compiler theory course ignore everything that happens after you produce a parse tree -- something that, for most languages, happens in the very first pass? (Ignore insanities like the C or C++ preprocessor!) Let's see -- semantic analysis, rewrite rules, macro processing, variable analysis, TRANSLATION, register allocation, various types of optimization, code generation, abstract machines, etc., etc. All ignored in favor of lengthy discussion of the very surface of compiler theory. Truth be told, I think part of the problem is that major parts of the topic LACK any true theory -- but not all!

But to compound the problem -- not only did they only cover parsers, they didn't even consider what parts of parser theory were actually useful. As I said, I've used very little from that course -- and, of course, I HAVE written quite a few parsers. So sure, a compiler theory class should touch on LALR parsers. I just think it should cover a bit more. Like maybe how a parser can fail with useful context, rather than just rejecting the input: "Sorry, but MyBigHairyClass.c++ is not a valid C++ file." Or how to design a language so that syntax failures can actually be reported and understood. Last time I used C++, ALL the compilers gave really bad errors if your input made something unhappy deep within the layers of the particular vendor's rendition of the STL... (Of course, maybe things have changed, and the compilers now at least agree as to what's a legal input!)

So my criticism is NOT that they include parsers in compiler theory, just that they label parser theory as compiler theory and ignore the meat of the topic!

deepsand

it's a bitch to read! Compared to it, APL is a piece of cake. As for compilers, since the purpose of a compiler or an interpreter is to translate one language into another, how can that be done without knowing how to parse that language?

Bob.Kerns

Hot button... Dunno how it is now, but it always annoyed the BEEP out of me that courses and books on parsers were labeled as "Compiler Theory". Hah! Grammars are grammars, not compilers! I had to take a "compiler" course at MIT. By that time, I was already MAINTAINING a compiler. A Lisp compiler. Lisp comes with a recursive-descent parser; the language is DESIGNED to be easy to parse. Not a single darned thing in that course was useful to me. Not then, and not much later, either.

deepsand

That served as an excellent introduction into how I/O works at the physical level. With that I was truly able to meld my knowledge of electronics with that of programming; i.e. hardware & software became a functional whole.

Tony Hopkinson

but that was parsing, I think. A disassembler for the Z80 was the first piece of non-trivial code I ever wrote, in BASIC on the Sinclair ZX80, line numbers and everything. I did an emulator for that as well; a very good way to learn a processor -- lots of code, fairly simple, and next to useless if you don't finish it.

deepsand

Now, all I need is a 1401, et al. Assemblers were, of course, the stepping stone from which all "higher level" languages were born. The implementation of mnemonics served to make for more readable code while maintaining an unobscured view of the underlying architecture of the target machine. Today, for many programmers, the target machine is simply a black box, the inner workings & hidden mechanisms of which are, to them, mysteries known to & understood by a select few who dabble in such arcane arts. Little do they understand that they see so far owing only to the fact that they stand on the shoulders of others. While I do know that the study of assemblers is no longer part of the 2-year Associate's degree curricula for programmers/developers, I suspect that it may still be part of the 4-year Bachelor's program for Comp. Sci. majors. I distinctly recall a class in which we were required to develop a virtual assembler, based on a given instruction set for a specified machine architecture, that would run on the IBM 360 series machines.
