Software Development

Ripoff Educations

115 comments
jk2001

I think the old MIT course is available online. I've listened to the first-semester course lectures (CS60A) from UC Berkeley, and it's basically the same course that was taught 20 years ago, updated with lots of OO concepts. It's a really great course, based on the classic Abelson and Sussman text. A lot of the cool ideas discussed in the text have been emerging as language features in mainstream languages for the past decade or so. Presumably, their second-semester course on data structures is still good. (In the MIT coursework, data structures are covered in the second semester.) I suggest that folks get the podcasts, and then look up the ideas discussed on Wikipedia. From there, you can learn to use language features like closures, anonymous functions, and generators in your own language.
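To make that concrete, here is a tiny sketch of a closure built from an anonymous function, written in C++ rather than Scheme just to show it in a mainstream language (the example and all names are my own, not from the Berkeley or MIT material):

#include <functional>
#include <iostream>

// Returns a closure: an anonymous function that remembers 'count' and 'step'
// even after make_counter() has returned.
std::function<int()> make_counter(int step) {
    int count = 0;
    return [count, step]() mutable {
        count += step;   // the captured state lives inside the closure
        return count;
    };
}

int main() {
    auto next = make_counter(2);
    std::cout << next() << " " << next() << " " << next() << "\n"; // prints 2 4 6
}

The lambda keeps its own copy of count and step alive after make_counter returns; that captured state is what makes it a closure rather than just an anonymous function.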

techkid

Would it be worth it, considering that programming is a very dynamic field, so much so that there is already widespread speculation that code generators will replace the developer? It would be better to teach programming toward the end of high school, because that is when kids become more career-specific.

ByteBin-20472379147970077837000261110898

Well, I don't know what Computer Science would entail. I thought some programming, but it was supposed to be "a little bit of everything" in computers. Not all schools are good either; one does have to shop around. And nowadays, "Computer Science" would be too general, as most folks need to take courses in the specific areas they are hoping to work in. IT is more niche-based than it was back in the 80s. There's more out there to specialize in. Knowing everything can't be done anymore. You can be a jack of all trades and know a little bit of this and that, but to really be good you have to have training in a specific area.

I used to want a Computer Science degree. I guess in a way I kinda still do, but I never could afford it. Instead, I'm taking online courses via my local school district's continuing education system. I've gone through two courses (Introduction and Intermediate) in Visual Basic.NET. I am currently doing one in C++. Next are two in C# (Beginner and Intermediate) before I break for summer vacation. The Visual Basic.NET course didn't just teach Visual Basic.NET. It did that well, but also covered OOP concepts and even database programming! Things I really needed and am now using at work! Same with C++: I'm learning more than just programming, and as in the VB.NET course, I'm also getting a lot of hands-on practice in writing programs.

There is so much to computer technology today that a 4-year degree program just can't cover all of it. Granted, some programs try to cover only one thing and should probably be renamed to reflect what they actually cover. I don't think there's such a thing as a "Computer Scientist" anymore, unless you mean a scientist who develops computers (but that's mostly a computer engineer) or one who writes programs (but that's a programmer, and there are many languages within that too). And there are also web developers now, who are also programmers but deal with the internet, now that .NET is usable on internet sites. Then you have media developers, graphic artists, on and on and on... I think it's impossible to lump everything into one, and I agree that schools do a bad thing trying to. I think they should be more careful what they name their programs.

Also, one should be careful to choose the right school with the right courses for what they really intend to do. 99% of it is students trying to plan for an uncertain future. Not easy, if not impossible. But if you know what area you want to work in, then you can get a good idea of what courses you need. Then find an affordable school with a program that teaches those courses. It's not all that easy to do, though, I admit.

Ed Woychowsky

In the 1980's, for an A.A.S. you needed the following "Computer Classes": Introduction to Computer Science (Pascal or Fortran), Introduction to Data Processing (COBOL), Data Structures, COBOL I, COBOL II, Assembler I, Assembler II, Operating Systems, Systems Analysis, and a Student Design Project. Electives: Business Data Processing, C, PL/I, Data Modeling, BASIC, Advanced BASIC. Now, however, things would be rather different due to so many possible career paths. For example, in 1984 there was an Internet, but there was no Web. That single addition could add classes on the following: PHP, IIS, Apache, ASP, JSP, ASP.NET, HTML, XML, XSLT, AJAX. Things have changed, and it seems that educational institutions are having trouble keeping pace. Lately it appears that if a class is available on a particular subject, then that subject is already well established. Sort of like the colleges that were still teaching their computer science students how to wire tabulator boards in the 1980's.

awk

If you hadn't heard the answer, it is "an educated dummy". Computer Science isn't shop class. But I did attend Computer Science School in Quantico, VA with the USMC while in the USMC. It was a multi-branch school when I went, and the Army at the time was the only branch that did not participate. The Marine Corps school at that time was rated even with IBM's school, as the story is told, by IBM. What the school taught then was what was needed and used in the field. I think that our universities have lost sight of what is needed in the field in what they teach. The dying breeds that business is losing are COBOL and Natural, to name two, with no new students entering those languages. Some classes are purely by the book, and the instructor has no in-depth clue about a question; I state this having taken a few courses as an experienced coder. Being a good/great coder is simple to me. There are fundamentals that apply to almost any coding (this is a mini topic for me, I won't cover them here). Having code that "works" is not always the best solution. The answer in the book is not always the best answer either. The largest amount of code does not always accomplish what you need to do. Keep it as simple as possible, never assume you have good data, never rely on defaults. The code should be structured... so the yahoo behind you can find what they need to change or fix. As far as the science... statistics generally have been the accepted norm, back when. Statistical information about what you process through the computer should be your gauge. The gauge should show how well your programs do under stress. Is it good code or robust? If it is robust, it should handle almost anything thrown at it. There is the science... it is not the language that you code in that makes the science.
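As a throwaway illustration of the "never assume you have good data, never rely on defaults" point (C++, and the function and field names are just my own example, not from any course):

#include <iostream>
#include <optional>
#include <string>

// Parse a quantity field instead of trusting it. Return nothing on bad input
// rather than silently falling back to some default value.
std::optional<int> parse_quantity(const std::string& field) {
    try {
        size_t used = 0;
        int value = std::stoi(field, &used);
        if (used != field.size() || value < 0) return std::nullopt; // reject trailing junk and negatives
        return value;
    } catch (const std::exception&) {
        return std::nullopt; // not a number at all
    }
}

int main() {
    for (const std::string s : {"12", "12x", "-3", "oops"}) {
        auto q = parse_quantity(s);
        std::cout << s << " -> " << (q ? std::to_string(*q) : "rejected") << "\n";
    }
}

The point is simply that the bad values get stopped at the edge instead of flowing through the rest of the program.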

Wayne M.

I can only speak from my own educational background, but I think that Computer Science is too young to have formal mechanisms for teaching how to develop computer programs. Until that changes, schools will largely teach a primer in various languages. If you had asked me 15 years ago what colleges should teach, I would probably have recommended a very rigid CMM/CMMI approach with well-defined and independent roles. Today, I would just as strongly argue for an agile approach with very general and overlapping roles. My recommendations for things I would like a CS graduate to know (as of today) would be heavily weighted in favor of soft skills: how to gather requirements and facilitate a requirements-gathering meeting, testing theory, how to write clear code, how to refactor code without breaking it, how to decompose something complicated into an independent set of uncomplicated items. As I look back on what I learned in college (BSEE), I use almost nothing I learned in the technical classes. The lasting knowledge came from elective courses in the business and English colleges. I'm not sure CS is ready to teach lasting knowledge.

rrvillan

Knowing every computer language by heart, or knowing the UNIX manual like the back of your hand, or even knowing enough calculus to be a math scientist doesn't amount to a hill of beans when you're trying to create an application a user can actually use. I've seen many smart "computer scientists" in my career who don't have a clue how to code an application per the user's requirements. They usually come up with some obscure application that only a "computer scientist" can operate. Why do you think so many coding jobs are being outsourced? It's a lot easier than actually gathering requirements and designing a system. Most companies want an IT person to know the business behind the application they are developing, NOT just the "nuts and bolts".

C_Tharp

The discussion is about the age-old problem of education: theory versus practice. An educational institution can teach theory and leave the student to learn the tools on their own, or teach the tools and leave the student to learn the theory on their own. The theory is usually more involved and takes longer to learn than the tools.

If theory is emphasized, the graduate must struggle to get the job without being able to show skill in a particular set of tools. Their market will be limited to the set of employers who appreciate the benefit that a thorough foundation in the science provides. If technical expertise is emphasized, the graduate will have tool skills for the set of employers who are eager for production and care little for optimum solutions. Neither group will get through the door of employers who are looking for on-the-job experience. Each graduate has a limited market of employers.

The real difference develops as they gain workplace experience. The one educated in theory has the ability to adapt to new tools quickly and the knowledge to produce solutions that are appropriate for the need; the theory is applied as needed. The one educated in practical tools is limited to the tools learned in school or learned at the expense of an employer, and has little means to learn the theory. The tools can be learned easily by the one knowledgeable in theory, but the theory cannot be acquired easily by the technician. Always, there are the exceptions, but they are rare and they have no way of documenting their achievement.

This is similar to the difference between the engineer and the tradesman. The point is well made that a tradesman's education is being given the greater certification. Value is being stolen from the more valuable education. The result is a pool of people, all holding apparently equal credentials, but with vast differences in knowledge and ability. Both groups lose something. Employers know which type they want and need, but the institutions are removing the distinction provided by the degree. Graduates cannot easily show that they have qualities different from the others; the degree should provide the differentiation.

Tony Hopkinson

Academia always lags technologically; after all, it takes a finite time to put together a course. So you get someone with a good grounding in the theory, and industry whinges about applicable skills. Set up a course for a particular skill and by the time it's taught it's no longer the 'in' skill, and you have someone who's out of date and weak on the theory required to translate their current skill into the next one. I started my CS education at thirteen; that gave me plenty of time to learn both, and to see a progression of theory and tools as my education, formal or otherwise, progressed. Start early and start simple is the only solution to both problems, unless a thirty-year-old academic who needs a route map to his cubicle is considered progress. So it's Reverse Polish Notation vs. how to put a formula in a worksheet cell.

Justin James

That was the only way I survived, what with my BA in liberal arts. :) I admin'ed Novell 3 and NT 3.1 on up, pulled cable, wrote code, did spreadsheet & DB work, all before graduating high school. The only formal CS education I had was a few programming courses in HS (a good mix of theory and hands-on), and a few CS courses in college before I switched majors. That is another difference I see between great coders and the not-so-great ones: starting early. Get the "how to use a computer" and fundamentals of programming out of the way early, teach them real theory in college to polish them up, and you have rock stars. Drop in someone who has never touched a PC beyond playing video games and posting messages to MySpace, and you have a problem. At best, they can write passable code in whatever languages their college taught them (if it was a hands-on program), or they are simply really specialized mathematicians if they went to a theory school. Third-world programmers are a great example of what I am talking about. J.Ja

Tony Hopkinson

int numberofinvoices; // the number of invoices
int j = 1; // a counter
// Loop through the invoices
while (j < numberofinvoices) // while j is less than the number of invoices
{
    j = j + 1; // add one to the counter
}

NickNielsen

"DO WHILE ... //start search loop" isn't good commenting? :^0

Justin James

I can always tell when someone copy/pasted from the docs or some example code, because it is written in a way that I'd hate myself for writing. All of the things that make a piece of code a good example (global variables, generically useless variable naming, ridiculously redundant comments like //This is the loop iterator) make it poor code... J.Ja PS... /* Comment written by me. */ ;)
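For illustration, a made-up fragment in that "example code" style (every name in it is invented):

#include <iostream>

int data[3] = {10, 20, 30}; // global variable holding the data
int i;                      // this is the loop iterator

int main() {
    // loop over the data
    for (i = 0; i < 3; i++) {
        // print the data
        std::cout << data[i] << "\n";
    }
    return 0; // return 0
}

Fine for showing one language feature in isolation; copy it into a real code base and the global state, throwaway names, and comments that restate the code are exactly what make it poor production code.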

Tony Hopkinson

One of the best coders I've ever worked with was of the female persuasion, and I'm sure that was because she approached problems in a very different way to the way a bloke generally would. I'm surprised that, now development (as opposed to coding) is a much softer discipline, we don't get a lot more women in our trade. Then you look at how blokes get into IT: games, electronics, hard science and maths, all of which are very male dominated, and it's not so much of a surprise after all.

Tony Hopkinson

Thinking black box, you can get away with a lack of understanding, but that of course reduces how detailed an implementation you can make. I freely admit to not understanding the physics behind semi-conductors in any useful way. :D It's not so much what they do get taught as what they don't. Global variables, for instance: how many times do you see academic code snippets using them to keep the emphasis on the logic rather than on a useful implementation of it? Don't even get me started on side effects, annotation and naming.

Justin James

My CS 111 class was around 25% - 30% female, and the overall class population seemed to be slightly weighted towards having more foreigners than the overall student body. My CS 112 class had 80 - 100 students; 4 were female. Another thing that always stuck out about those 4 women: one was from India, one from the Middle East, and one was from Asia (I do not recall specifically which country). The last one was born in the US. In other words, the ratio of foreigners to Americans was extremely skewed. I always suspected that there was something specific to American culture that turned women away from CS. The vast majority of the females I have met in IT were, like you mention, more in "soft skill" positions. One area that I see many women in is DBA work, for some reason. At my last job, one of our customers had only female DBAs, which really stuck out in my mind. DBA is a highly architectural code role. I think you may be onto something there, Mark. J.Ja

Mark Miller

I started programming when I was 12. Learned BASIC with line numbers. Even that was tough for a few months. It helps to develop a passion for it. After that I did a lot of programming on my own time. Back when I started, the schools were just starting to buy computers for students. Often the school would only have a few of them (8-bit computers), so you had to sign up for time. By the time I got into jr. high school, it was fairly easy to find a computer to work on. I tried my hand at writing some useful/fun programs, ones that other people would like. By the time I got into high school my work started getting noticed by other students. We had a computer club run by the computer teacher there. I and some other students got involved with a "computer science league" competition every year. We had programming contests, and we learned some basic CS theory, like boolean algebra. We got to go to the national finals a couple times. Even so I didn't jump ahead of anybody when I got into the college CS program. I took the intro. class like everyone else. Didn't learn much except for the last part on pointers (didn't learn that in high school). I think having the background helped though. Some years back when I saw discussions about "women in computer science", this got brought up, that typically girls were intimidated by it because the boys were always so far ahead in terms of skills. More recently I've been hearing that this isn't so much the case. Instead it's said that young women are turning away from computer science because they don't want to learn the theory and the programming. They like the more abstract areas like software architecture, project management, etc. I'm sure they'd see the importance of technical literacy, but they don't want to go through a bunch of programming/theory classes just to get to the "good stuff" for them. "Women in computer science" was always an interesting topic to me. I noticed when I was in college the vast majority of the students were male. At the time I was in the program we had 270 CS students. 4 were women. Ironically most of them didn't have a passion for this stuff. They were just smart enough to do it. All of them did well. I've sometimes wondered what made them different. Why did they gravitate towards it as opposed to most other female students who went into something else?

kenzo

re. software: If you don't understand how it works, it probably doesn't. You just don't know it yet...

Tony Hopkinson

the basics is all there was :D Basic with line numbers was a high level language. I started with 8-bit machine code for the 6502. Assembly was by hand, and then you typed in the hex for the mnemonics. This was a significant improvement; my physics professor programmed the PROM for the hex keypad with 24 switches and a push button. I started with basic logic circuits, electronics being my interest, before he even let me near his pride and joy. His mantra was always: if you don't know how it works, you can't make it do anything useful.

kenzo

You fail to see the problem that can result from "academics only". That is: concern for solving the client's business problem, _not_ simply experimenting with whiz-bang technology. I have worked with more than my share of Stanford C.S. Masters degree holders who a) can't predict what they can actually accomplish and how long it will take them to accomplish it, and b) don't care about the client's problem all that much anyway. They just want to continue to play with technology and continue to "study" at someone else's expense. After all, that is what they know and what they have been doing for the last 6 years...

Tony Hopkinson

because they think it's fun. Then they find someone who isn't interested in it will pay them to do it. Happy days. :D Then they find that the paymaster expects to get more value out of the exercise than a happy, smiling tech with the latest gizmo. :( Courses should be like other crafts where, as part of it, you are expected to finish something real. It's all right being able to do a really good mortice and tenon joint; it's much more useful to be able to make a sturdy table, and a half butt joint might be a practical solution. Cease with the itty-bitty stuff and do a few useful applications that illustrate the use of various techniques. Of course that would be much harder to teach, and possibly more subjective, but a pass would mean something in both academia and the workplace. Another thing that I've been recommending for a while is team development; having that taught well is worth at least 18 months of experience.

Tony Hopkinson

The essence of good software with some longevity is readability. You aren't ever going to pick that up with a 15 line snippet to implement a list in a circular array, which only you and your tutor ever see, and you get marked on the comments anyway.

jmgarvin

At New Mexico Tech there are courses that focus on teamwork and large projects. In a systems programming course we had to develop a "simple" client and server that could act as something of a mini web server. While it was nightmarish at the time, I'm glad I got to work in a team and glue the code together. I'm glad we had to meet milestones and create something that was semi-useful. I know a lot of colleges are looking for ways to develop real-world projects so that their graduates can hit the ground running.

Justin James

The two most valuable classes I took were both in high school. One involved spending 3 months writing a very large (for a high school student) program in COBOL from scratch, with a partner. I learned a lot about working in a team, working on a large project, and put a lot of theory into practice. The other was EdScheme, a course built around a barely fleshed-out functional programming language (based on Scheme): learning various theories, and using those theories to construct libraries that turned the base language into something useful. I learned a lot about the fundamental thought process of writing code, and an appreciation for basics like that. J.Ja

C_Tharp

The same thing can be said of technical graduates. People with the ability to focus on the problem and apply the technology appropriately exist in both categories. The focus of the discussion is education, not performance.

coderancher

Students need to be able to connect the abstract with the concrete. Algorithms and data structures eventually have to be implemented in code, which makes a course in at least one programming language a necessary part of any computer science curriculum. Whether the implementing language chosen by a college CS department will be in vogue with industry after the student graduates does not really matter, because once you learn one language, it is easy to pick up others, since basic concepts like data objects, loops, conditionals, and functions/procedures are readily transferable to almost every language. The real problem with the CS programs at some universities is that they make a poor attempt to integrate application into their theory courses. The courses have a heavy emphasis on theory, but homework assignments and test questions involve filling in or evaluating pieces of code. Programming is only briefly glossed over and is not even a prerequisite for the theory courses. The results are students unable to do their homework, high attrition rates for first- and second-year CS majors, and graduates with neither a proper appreciation for (nor grasp of) theory nor the ability to code. The people going to DeVry are probably getting a better education; at least they learn to do one thing well. It would be better if CS programs completely sheltered students from ever having to see a single line of code (not very practical) than to throw in half-baked attempts at implementation. You can't ignore programming as a foundation for the theoretical stuff.

jmgarvin

Too many CS programs don't know how to balance the theory and the programming aspects. I think the best I've seen was creating a practicum course in C and requiring it of the incoming freshmen. It seemed to help, although the problem was that between the realities of programming and the theory in the classroom, the new students tended to get confused (recursion was a real killer). On that note: does anybody know a good way to explain a double pointer without completely losing a first-year student?
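For context, the case I'm trying to get across is the classic one where a function has to re-point the caller's pointer. A minimal sketch (names invented purely for illustration):

#include <iostream>

// To change WHERE the caller's pointer points, the function needs the
// address of the pointer itself: an int** (a pointer to a pointer).
void point_at(int** target, int* new_location) {
    *target = new_location; // follow the double pointer once, overwrite the inner pointer
}

int main() {
    int a = 1, b = 2;
    int* p = &a;             // p holds the address of a
    point_at(&p, &b);        // &p is an int**: the address of the box named p
    std::cout << *p << "\n"; // prints 2, because p now points at b
}

The "box holding the address of another box" picture sometimes lands, sometimes doesn't.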

Justin James

You are right about the quality of the hands on work. That is why things like a compiler class are so valuable, they get your hands in the mud. All of the theory in the world is not too useful if you can't do it, and all of the practical knowledge is useless if you need someone dictating to you what to do down to the last line of code. CS at theory heavy schools has an insane attrition rate. I saw it at Rutgers. 800 total students for CS 111, 300 - 400 students total for CS 112, and about 200 total for the classes that only had 112 as a pre-req. In other words, 75% loss rate after 2 classes. On the other hand, I would rather see someone get "weeded out" of a particular major than to waste 4 years and a pile of cash, only to discover that they hated the work in the real world! Let people find their passion, teach them well, and let them run with it. J.Ja

coderancher

I agree that students should find out that they won't enjoy a particular field sooner rather than later, but bad teaching or a defective curriculum shouldn't be the cause of them hating the work. A good instructor who engages the students can turn a subject they had little initial appreciation for into something they develop a passion for. Inadequate prerequisites and allowing unprepared students to enroll in courses in the first place are not the right way to "weed out" people; it is just wasting their time and money.

Justin James

You are right; that was an unspoken assumption on my part. A CS major needs to be able to ramp up quickly with little help or external bootstrapping, but expecting someone to walk into CS 101 knowing C++ is not helpful to anyone. That is definitely the wrong way to "weed out". Expecting them to follow along with and comprehend (not knowing it up front) a discussion of linked lists is not. Expecting them to grok lambda calculus is unrealistic. Expecting them to be able to "get" the basics of compiling a program and debugging is not. J.Ja

CharlieSpencer

This discussion reminds me of the lab classes required for most science courses. You took three hours of lecture to get the theory, and spent an hour in the lab doing hands-on practical work. That's how other majors handle the theory vs. practice question, although I think a 3-to-1 ratio is too high for data theory vs. programming. Maybe 1.5 theory classes for every 1 programming course. Say 42 - 45 credit hours of theory, 24 - 27 hours of programming, and a 3 - 6 hour senior group project meeting the requirements of a willing off-campus company. (U. of South Cackalacky does this with their engineering students, because we have a group of them here every couple of years. Great PowerPoint skills...) If those hours seem out of whack, please keep in mind I haven't looked at degree requirements in years and don't know how many "ash and trash" non-major courses are required in a 120-hour C.S. degree these days. Why more theory hours than coding? I expect a tech school grad to have a set of skills that will allow him to go to work right away, but his skill set may not allow him to advance far or fast without additional study. I expect a B.S. holder not to be productive immediately, but to be able to ask intelligent questions, understand the issues that may affect a project, and have a better chance of long-term advancement.

NickNielsen

Most of the science classes I took integrated the lab into the course itself. The ratio varied, depending on the course. Basic chemistry was 3:4. Physics was 4:3. Intro to Programming (COBOL) was 4:1. You had one hour in lab to start punching cards, after that you were on your own. Don't forget a period and DON'T DROP THE BOX[ES]!! Edit: I did take Intro in college. The most interesting thing about the course was that the print routine for the COBOL compiler running on the school's brand new Burroughs 3500/II [u]automatically[/u] placed a period at the end of each line. So your code wouldn't compile, but the printout was perfect. (This was fixed after less than a month) I remember sorting through 2300+ cards just to find that missing d@mn period.

MikeBlane

As someone who went through a Computing Science program at Sam Houston State University - just down the road from Texas A&M - I agree that a well-rounded Computer Science student will have those fundamental data structures, programming languages, and software engineering-type classes. I also agree with Rex Baldazo's article about still requiring a compiler class. Students may never actually write a full-blown compiler, but there are so many data structures and computer science concepts involved that successfully writing one gives a great deal of confidence and satisfaction. I personally have used finite state automata processing in several of my programs and, in doing so, have created much more elegant, maintainable solutions than someone who tried to brute-force the same programs. Your really good CS professors - of which SHSU's Dr. David Burris ranks as one of the best - will push students to their maximum.
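As a toy illustration of the kind of finite state automata processing I mean (entirely my own example, not anything from the SHSU coursework): a three-state machine that scans a string and accepts simple signed integers such as "-42".

#include <cctype>
#include <iostream>
#include <string>

// States of a tiny automaton for strings matching: optional sign, then one or more digits.
enum class State { Start, Sign, Digits, Reject };

bool accepts_integer(const std::string& s) {
    State state = State::Start;
    for (char c : s) {
        switch (state) {
            case State::Start:
                if (c == '+' || c == '-') state = State::Sign;
                else if (isdigit(static_cast<unsigned char>(c))) state = State::Digits;
                else state = State::Reject;
                break;
            case State::Sign:
            case State::Digits:
                state = isdigit(static_cast<unsigned char>(c)) ? State::Digits : State::Reject;
                break;
            case State::Reject:
                return false; // once rejected, stay rejected
        }
    }
    return state == State::Digits; // only the Digits state is accepting
}

int main() {
    for (const std::string s : {"-42", "17", "+", "4x2"})
        std::cout << s << " -> " << (accepts_integer(s) ? "accept" : "reject") << "\n";
}

The nice part is that handling a new token shape means adding states and transitions, not piling up nested ifs, which is where the elegance and maintainability come from.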

techrepublic

I teach Computer Science at a local community college near Philadelphia. I am now, finally, teaching the course on Data Structures, which I had been wanting to do for several years. My students are in their second year and have already taken Java and C++ programming. I try to offer the practical considerations in all the courses I teach, based on my real-world experience. Just last week I dragged out my 3 volumes of Knuth, brought them to class, and passed them around. The students were intimidated by the contents -- and rightly so. I told them that such theory was de rigueur in a "real" school and that they should own a copy of Knuth and several other books that I recommended. Now I see that I may have been mistaken. I find it hard to believe that a so-called university doesn't touch data structures until the senior year. Justin, I sincerely hope that U. of South Carolina is the exception rather than the rule, or heaven help us.

Justin James

What I always found interesting about the old school folks like Knuth was the sheer cross-discipline work they did. It wasn't just "trees and hashes", it was "trees and hashes to solve problems". So much of what is still in use today (think stuff like soundex) came from folks like Knuth who truly pioneered the whole field. There is very little new in IT that someone didn't do or write about before the 90's. It may have been cruder and text-only, but it was there. That is why the fundamentals are so important. For example, after working on a VAX for a while, I can tell you that "thin client computing in the wild world of Web 2.0" does not appeal to me much. ;) J.Ja

Mark Miller

[i]after working on a VAX for a while, I can tell you that "thin client computing in the wild world of Web 2.0" does not appeal to me much.[/i] Heck, after working on an MVS system in college, web 1.0 was nothing new under the Sun. They used to call it batch processing. Now they call it "doing a round trip". I read an interesting interview with Alan Kay that's a couple years old at http://www.acmqueue.org/modules.php?name=Content&pa=showpage&pid=273. What he got across is that there's software technology that's existed for 30 years that the IT world still hasn't caught up to, though it seems like it's getting close. He was talking about the Smalltalk language and he said "It's obsolete", but only because he envisioned languages that were even better. In a different interview he said, "Twenty years ago at PARC I thought we would be way beyond where we are now. I was dissatisfied with what we did there. The irony is that today it looks pretty good." He explained that Intel and Motorola did not produce CPUs that would run the higher level languages well, and that's the reason they ran so slowly and fell out of favor. Programs ran faster when written in C, a compiled early-bound language that these CPUs ran well. He talked about the Burroughs B5000 machine from the early 1960s, which apparently ran p-code at the hardware level. If it had been modeled in a microprocessor it would've made the higher level languages like Smalltalk and Lisp much more efficient, but the data processing establishment didn't understand it, so it never caught on. He said the only reason higher level languages like Python and Ruby run reasonably well now is that the old CPU architectures have gotten fast enough that it doesn't matter much anymore. When I think about it, it feels pathetic. The technology has been sitting there all this time, but no one who's mattered has realized how to use it to best advantage. In a sense, what we'll be seeing in the next decade will be like the Renaissance, where modern-day engineers "look to the knowledge of the ancients" and dig up stuff that will seem revolutionary now. Interestingly, in the interview he says just what you said in your post: "but I fear -- as far as I can tell -- that most undergraduate degrees in computer science these days are basically Java vocational training. I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification." Yikes!

Justin James

You're right on the money with the cog sci remark... my academic days had a ton of cog sci in them (gotta love having Jerry Fodor, Steven Stich, and other fellows in that league down the hall from my professors, literally!). What I learned about cog sci has been invaluable to me, particularly in terms of usability. The rest of my "liberal arts degree" has also been immensely helpful; I learned a ton of soft skills as well as concept abstraction. It is odd, but I feel that my formal academic learning is even more applicable to my current high-level role than it was when I was a programmer. On the flip side, there is virtually no "computer science" in my role at this point; I am no longer working at the algorithmic level. The joke is, you need to rise through the programming ranks to get to architecture, but even quality CS theory is not as applicable in a higher role as the less CS-related learning. In other words, the learning that helps you rise doesn't help you too much when you've risen. Not that it is useless, by no means... but there are a lot of other considerations and tasks that have nothing to do with CS or programming. J.Ja

apotheon

[b]re: new COBOL[/b] Think about it. Right off the top of my head, I have three quick points for you. 1. Java is one of the most verbose "modern" languages in common use today. 2. Java is [b]the[/b] business language in today's market. 3. Java is [b]the[/b] example language for most universities these days. Those were among the defining factors of COBOL when it was big. [b]re: industry pushing CS[/b] I didn't mean to suggest that Computer Science is necessarily going to keep its place as the primary business-oriented degree for working with computers, just because it's chasing that carrot. Once CS degree program evolution converges sufficiently on CIS degree programs, the shortfalls of a CS degree for that purpose become more glaringly obvious. For instance, the incredible mass of (largely irrelevant) calculus credits required for graduation will drive students from CS and into the waiting arms of CIS as the two majors become increasingly similar in other ways. Other, similar effects come into play as well. [b]re: stagnation[/b] The way to make CS relevant again is to start teaching principles rather than practice where at all appropriate, and to start teaching practice where necessary for principles rather than merely "justified" by what "everyone" is doing. CIS should be more about learning what everyone is currently doing, and CS should be more about learning what should be done. For instance, teaching secure programming principles is more suited to CS, while teaching programming security testing suites in common use is more suited to CIS. Teaching compiler design is more suited to CS, and teaching makefile syntax is more suited to CIS. Extrapolate from there. I think that would help immensely -- though, frankly, I think the best bet for something CS-like in the future is not in CS at all, but in a more CS-oriented Cognitive Science major, at this point. CS has slipped too much to pull itself back from the brink on its own. It needs help. If CogSci picks up the slack in certain areas, CS might start being seen as needed to fill in the gaps in other areas.

Mark Miller

Apotheon-

Re: Java is the new COBOL

I've been hearing this recently, even from those who use Java regularly. They seem to have just accepted the idea. Hey, if the shoe fits... You'll get no argument from me there.

Re: CS is following where industry is pushing it and is not progressing

That is just as plausible as what I was talking about. The only question I'd have is, if what you say is true, why is the CIS program humming with new enrollees while the CS program is just barely keeping its head above water with new freshmen? It seems to me students are finding that the CIS program is just what they want, whereas I think they see CS as old hat and a path to nowhere. The CS program at my alma mater is more traditional. They teach C++ and Java, data structures, systems programming, etc. I didn't look at the curriculum in detail, but it looks like they don't teach about the web except for Ethernet. CIS teaches all Java, a little theory, soup-to-nuts web app development, and relational databases. It's kicking the pants off the CS program in terms of freshman interest. If the CS credential were more desirable to employers, why wouldn't more students go for the CS program?

Re: CS has stagnated

I'm in total agreement with you here. I haven't done a detailed study of CS programs across the country, but just from hearing what I have about them, I think this is the problem. I think another part of the problem is that the "data processing" mindset that has been a part of CS for a long time has run its course. It's matured. CS can't really go much farther with it, because the advancements in IT have less to do with CS than with system configuration and network protocols. There are some places they could move to, like putting more emphasis on loose coupling of component architectures and on dynamic languages, since those are up-and-coming things. Probably one area where they could REALLY improve, and industry would love them for it, is teaching software security--how to write software that's difficult to hack. From what I read it's still a big problem, and CS programs still don't teach it. These would all be fine and good, but would they draw people back to CS? I wonder. I think CS needs to start leading, rather than following, putting more of an emphasis on training for research and encouraging the development of new ideas. One area that could really use some work is how to execute algorithms on parallel architectures (multiple CPUs). Another might be rethinking the traditional CPU architecture, the von Neumann machine itself. That would be a start towards making it an exciting place to be again.

apotheon

I don't think that's really what's happening. It's more likely just a natural effect of the cycle of newness. Think about it: "back in the day" when CS was in its heyday, it was in many ways cutting edge. This is where people went to learn what was needed to get out in the world and do work that had never been done before -- to turn corporations like GM into highly networked, smooth-running machines; to develop the software that would revolutionize desktop computing; to invent new paradigms of networking, and to really exploit the potential of such new paradigms that had been invented by others only moments before you hit the scene. Now, the Web is old hat, "enterprise" is a marketing buzzword used to lure managers into multi-million dollar deals to acquire vertically integrated service oriented architecture stacks, and the movers-and-shakers of old are writing books rather than code. What happened is that all the things they used to teach, [i]they're still teaching[/i]. The problem isn't that they've regressed -- it's that they haven't [b]pro[/b]gressed. Because the technology has progressed without them, the same things that were once accomplished by inventing a new language are now done with the "new COBOL", Java. Meanwhile, the cutting-edge stuff is now mostly being done by passionate computer geeks in their free time, rather than at schools and in corporate research labs. We're seeing it happen with open source OSes like Plan 9 and OpenBSD, rather than in university graduate programs. Even the best universities for computer science usually aren't pushing the envelope -- you know the best because they're just teaching their students enough to understand what's happening where people [b]are[/b] pushing the envelope. Oh, sure, there are exceptions, but they're few and far between, and they're not happening in the classroom -- they're research grant projects on which professors are working while their aides teach their classes. Even most of those are tripe. What do you expect from a major program that has become so well-established that its undergrad program requires dozens of credits in mostly unrelated subjects (like calculus)? Computer Science has become not only established, but entrenched. Of course, part of the problem is also the way the corporate world has viewed CS and CIS degrees. CIS was invented to prepare people for working with computers in the business world, while CS was invented to train people for doing real cutting-edge work. Major corporations, however, pretty much refused to hire CIS grads -- they just wanted CS, because they thought it was "better". As a result, the old economic laws kicked in, and where there was a demand for CS graduates who were ready to become Java daycoders in little gray cubicles, the supply side -- universities -- learned to provide exactly that. In other words, there are two effects at work here, I think: 1. Rather than becoming more like a CIS degree as conscious emulation, I think CS looks more like CIS now because CIS has caught up with CS, due to the fact that CS just didn't advance. It's old, now, and nobody bothered to keep pushing it forward. Tenure and bureaucracy won out. 2. In direct contrast to the idea of emulating CIS, I think CS became more like CIS simply because that's where the corporate employment demand pushed it. People go into CS increasingly for the purpose of getting daycoder jobs because only CS grads get hired for these jobs. As a result, the CS degree program starts adjusting to suit the needs of these students. 
Ultimately, this turns CS into CIS, where everything is centered around the "new COBOL"... and yes, Java really [b]is[/b] the new COBOL. It doesn't take a rocket surgeon to recognize that -- it just takes someone willing to peer through the haze built up by the Sun marketing machine.

Mark Miller

Re: Maybe you're getting in touch with your feminine side Heh, good one. ;) I met with a friend yesterday who just quit a masters program at his (and my) alma mater after one semester. It sounds like the CS program has gone downhill there from the days when we took it. It doesn't sound like the curriculum is bad. The problem appears to be in "implementation". Apparently it's the quality of some of the faculty. They've stopped caring. His assessment of those who taught the undergrads was not much better. Enrollment is way down from when we were undergrads. I entered college in 1988. What I remember is there were anywhere from 60-100 CS freshmen. The expectation was that only a certain percentage of them would graduate with a CS degree. At the beginning of last fall's semester the department had 20-30 incoming CS freshmen--and this was an [i]increase[/i] from what it was a few years ago. It's possible that a higher percentage of these folks will graduate, because it used to be that people saw CS as a path to riches and they wanted to get in on it. Now it's people (alright, I'll say it...largely men) who are more passionate about the subject. He told me about an interesting, well...I guess depressing dynamic that's going on. He said that the Ph.D. program is desperate for graduate students. They can't get enough. Yet apparently most of the people teaching the masters program don't really care. They're just going through the motions. They care more about their research. Most appear unmotivated to teach, and they have no motivation to recruit (though the Ph.D. program does). My friend said that in contrast the Business College's CIS program has improved by leaps and bounds from when we remembered it. The CIS program used to be the butt of some jokes when we were undergrads. They mainly taught COBOL (considered a dead language then...) and relational databases. They had a little C, but not at the depth that we had. Now they have a strong Java program. They're teaching some theory, too. They emphasize practical hands-on skills. The main thing that I'm sure is a big draw is they teach their students about building web applications: HTML, CSS, Javascript, and the back end in Java. They continue to teach about relational databases. He thinks that the business school is getting students that in years past would have enrolled in the CS program. I told my friend about your postings on what you're seeing both in CS and in the work world of shake 'n bake programmers. He said that perhaps the reason the CS programs are turning into Java vocational schools is they're desperate. They're seeing that this is what the business CIS schools are teaching, and they're getting a lot of students. They're thinking that if they teach the same thing, then they'll get back the students that they lost. Not the best way to grow a program. Just copy what the other guy is doing.

Mark Miller

More recently I've been able to keep my code, working as a contractor. When I worked as a full-time employee several years ago, legally the company I worked for owned the source code, though I used to hear about people sneaking some code out the door. They felt they had a right to it because they worked hard on it. It wasn't that much. Usually a class or two. The way I did it is I just remembered the technique I used. I didn't keep the physical code. Just kept it in my head.

Justin James

Mark, that sums it up perfectly. We have machines thousands of times more powerful than the ones that the hardcore CS was developed on decades ago, more powerful than the ones that mapped the human genome... and most people use them to go to MySpace and YouTube as their most challenging task. I have not seen a Web application yet that did not do something that a desktop app didn't already do, or try to do. Our languages are getting dumber and dumber (Java, VB.Net, compared to Lisp, Smalltalk, even Pascal) to accommodate the shake 'n bakers. The people making the big bucks are the C++ guys and low-level guys writing the core libraries that the rest of the folks use. I am just grateful that my days are not spent writing junk Web apps anymore. I have written 5 lines of code so far for my new job, all SQL statements (SELECT * FROM tablename WHERE column = 'value' for all 5!) for troubleshooting purposes. :) I spend most of my days in Word, Excel, and Visio, racing to get control and bring some structure to a project that had no architect for too long. And I'm loving it! Wait... does this mean that I am in touch with my feminine side? ;) J.Ja

Justin James

I have not gotten to keep a piece of code I ever wrote for an employer, except the freelance stuff I occasionally do, and that is all dinky Web dev stuff, barely worth my time except that it occasionally puts some "play money" in my pocket. At this point, the freelance stuff is so infrequent that I would be better off being a WalMart greeter if I needed the extra cash. I just have not put effort into finding customers lately. J.Ja

NickNielsen

[i]The common denominator in most of what I've done is the app. I'm writing uses a database. Another one is the app. usually does some sort of accounting/invoicing function.[/i] The common denominator in anything I ever coded was the output. I still have several boilerplate print/display routines where all I have to do is change the report heading, column headings, make minor formatting changes, possibly change field names, compile, and POOF! Perfect output. Everything else is already done. Page numbering was a separate sub in almost all my work: the code was efficient and effective and I'd already invented that wheel. I had many other administrative subroutines and functions that I commonly used throughout my work, simply because they saved time and effort: I'd already been there, done that. It didn't hurt that even though I did the work, it was Copyright U.S. Air Force and was essentially public domain to the military. You don't specify in your post, but I can't imagine a programmer not using such a collection regularly. It has been a while, though, since I've followed that side of IT. Are you allowed to keep the rights to your code these days or does it depend on the situation?
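To give a flavor of what such a boilerplate routine looks like, here is a rough modern sketch (an invented C++ illustration, not the actual code I wrote for the Air Force; all names and the layout are made up): the report heading, column headings, and widths are parameters, and the layout logic itself never changes.

#include <iomanip>
#include <iostream>
#include <string>
#include <vector>

// Reusable report printer: a new report only supplies the heading, column
// headings, rows, and (optionally) a column width. Page numbering would live
// in its own routine and is omitted here.
void print_report(const std::string& heading,
                  const std::vector<std::string>& columns,
                  const std::vector<std::vector<std::string>>& rows,
                  int width = 14) {
    std::cout << heading << "\n\n";
    for (const auto& col : columns) std::cout << std::setw(width) << col;
    std::cout << "\n";
    for (const auto& row : rows) {
        for (const auto& cell : row) std::cout << std::setw(width) << cell;
        std::cout << "\n";
    }
}

int main() {
    print_report("Invoice Summary",
                 {"Invoice", "Customer", "Amount"},
                 {{"1001", "ACME", "250.00"}, {"1002", "Globex", "99.95"}});
}

Change the heading and columns, recompile, and the output takes care of itself; that is the whole point of keeping such routines around.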

Mark Miller

I can't help you out with the stuff you mentioned. I've heard others complain about writing the same app. over and over again. It's been an argument for the OSS movement, the idea being: invent the wheel once and then reuse it after that. I've been fortunate in that I haven't seen too much repetition. The common denominator in most of what I've done is that the app. I'm writing uses a database. Another one is that the app. usually does some sort of accounting/invoicing function. Other than that, though, there are different things put on top of it, which add some variety. I don't think I'm tired of it yet. I know I keep bringing up Smalltalk, but I've been learning a lot through it. Adding to what I said in my last post, it looks to me like commercial computer systems are technologically moving towards what was invented 30 years ago. Maybe someday we'll see them move towards technologies from 40 years ago. In Smalltalk, and indeed Lisp, everything lives within an image. When the system is running, that image is in memory. When saved, the image is saved to disk. This is like the system virtualization we see now, or the mere act of putting my Windows machine into hibernation. I hardly ever reboot my machine. When I turn it off I put it into hibernation. When I turn it on, the image from disk is loaded into memory, some device initialization takes place, and voila, I'm back where I left off. Some implementations of Smalltalk have a kind of "undo" function where you can restore some parts of the system back to their former state if you make a mistake. It's a bit like "System Restore" on Windows. Some people have asked if someday we'll see .Net or Java bytecode processors in hardware. With where Windows Vista is going, I think we will. This would be going back to technology that's more than 40 years old, though. As I mentioned earlier, it sounds like that Burroughs machine probably did the same thing. As I've probably mentioned before, collaborative computing with video conferencing capabilities has been around on PCs for the last several years. Douglas Engelbart and his team invented that stuff more than 30 years ago--even the video conferencing part (though he used analog technology for this). In that interview with Kay I cited earlier he said: "There are just two different worlds, and I don't think it's even that helpful for people from one world to complain about the other world--like people from a literary culture complaining about the majority of the world that doesn't read for ideas. It's futile. I don't spend time complaining about this stuff, because what happened in the last 20 years is quite normal, even though it was unfortunate. Once you have something that grows faster than education grows, you're always going to get a pop culture." He explained further what he meant by "pop culture": "One could actually argue--as I sometimes do--that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects. You could think of it as putting a low-pass filter on some of the good ideas from the '60s and '70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare.
What television was able to do was to capture people as they were. So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture." It's hard to know how to feel about this. On the one hand I'm starting to agree with him. On the other, I'm glad this "pop culture" took place when it did, otherwise...I don't know. I might not have gotten into computers at all. I was part of that pop culture. There have been times when I've felt as though computer technology was progressing in sophistication as fast as I was. There were other times when it felt like computer/software technology was not progressing fast enough. I was more sophisticated than it was. The early 1990s comes to mind. I've been feeling like this, again, for a year.

Justin James

Mark, that is absolutely on the money. Paul Graham's stuff on Lisp illustrates just how far behind "modern" languages are in comparison to some of the "ancient" ones. On the other hand, I am always amazed by how many times I see the same program get re-written, and no one ever asks, "has anyone else ever written this before?" Heck, I've spent much of my career essentially writing and re-writing the same Web application, just changing the database layout a bit, the design, and a few of the business rules. Yet we are still floundering to abstract that stuff out in a meaningful way (personally, I blame the use of SQL in an OO world for that one...). Why do "Web Mashups" remind me so much of the way FOSS coders have always written their code? And why do all of them just seem to tie some database to Google Maps anyway? J.Ja

bg6638

Are your students getting jobs in IT with only an AAB Degree? In Ohio, an AAB Degree might get you a job working part-time at Staples. I have nothing against an AAB degree. That's what I have, plus 20 yrs exp working with COBOL/Assembler/Foxpro, plus 10 more working with DOS 1.0-6.0, Win 3.1/95/98/NT 4.0/2K/XP/Exchange/SQL Server/ISA Server/Mac OS 9/OSX/IBM AIX. Now I've been unemployed for 2+ yrs, and most employers won't consider me because my resume lacks "BS in C.S.", even for Help Desk!

al881

Maybe a change in location would help? Northeast, TX, don't know.

techrepublic

1) The Associate of Science degree given at the school where I teach pretty much qualifies you to gain entry to a 4-year school and carry over most credits. The school has programs with many of the 4-year colleges in the area, and transfer of credits is an important consideration when developing courses.
2) Employers value experience over education.
3) In my teaching, I try to impart the practical side wherever possible. I tell my students that they should bring their completed homework assignments and tests to job interviews to show what they are capable of.
4) Several of my former students have gotten new jobs or promotions based on work done at the Community College.

Ed Woychowsky

I started with an A.A.S. from Middlesex County College in Edison, New Jersey, knowing S/370 Assembler, COBOL, C and Pascal. Over the years I've picked up DL/I, PL/I, VB, C#, SQL, XSLT, JavaScript and some Java. Instead of just adding degrees, I have a tendency to go for the classes first and then see what it adds up to after the fact. Alternatively, if something is new I'll read about it and just jump in and try it out without waiting for a class to be offered. People seem to be of the opinion that the purpose of a degree is to teach an individual everything that they need to know for the workplace. While that might be true for certain majors like physical education and accounting, it isn't the case for majors like medicine and computer science. Some fields are dynamic, requiring consistent study just to remain current. I've actually lost count of the number of former coworkers who never picked up a computer-related book once they graduated. You might recognize the type: the ones who ask people returning from a class how their vacation was, or who say they tried something in the past and it didn't work. A fair number of these folks are working in different fields now. Regardless of the letters involved, a degree is nothing more than a piece of paper that really doesn't mean very much. What matters is the person holding the piece of paper.

rrvillan

This is absolutely true.

CharlieSpencer

I cruised for years with a 1986 AA in Data Processing from South Carolina's technical college system. The two-year course at that time included Assembler, DCL, at least three programming languages (pick from COBOL, RPG II, Fortran, Basic), and options in Security and the then-new field of microcomputer applications. No, it didn't include much theory, but I didn't expect that from a tech / community college. But it was plenty good enough to meet J.Ja's earlier suggestion of coding from a designer's "blueprint". I don't use many of the details from those classes anymore, especially since I've been in support for over a decade, but they did give me the programming fundamentals to learn other languages and skills. If that's what Justin's friend is getting at USC, he can get it at Midlands Tech for a lot less and without the foreign language requirements :-P I eventually got a B.S. in Business Administration (Coker College, Hartsville, SC, via their military outreach program), finding it more useful in a manufacturing environment than a C.S. degree would be. I never saw the main campus; I hear it's nice.

onbliss

I always nursed a desire to study history and get a college degree in it. Some day I will do that..... Right now I am looking at pursuing an MBA. My wife, after seeing some of the books/articles that I have been reading, suggests I go for a Masters in Theology. Not bad for a hobby....

Ed Woychowsky

That was a problem when I went to college and in some areas it is still a problem. There is a possible solution, most colleges allow incoming students to "test-out" of a class. In other words, if you can pass the final you've proven that you know the subject. It is called a CLEP test (http://www.collegeboard.com/student/testing/clep/about.html).

bg6638

That was what I planned to do too..... but when I applied at several 4-yr schools, I was told that NONE of the credits that I earned would transfer (even though I was told 90 hrs would transfer before I started at the CC). Since I also attended a vocational HS instead of taking college prep courses, they wanted me to take 6 semesters of 090 courses, followed by the standard coursework for a 4-yr degree. Going part-time, it would have taken 14 yrs to complete. Quitting the job and going full-time for 7 yrs didn't look promising either. Now I'm in a hole so deep that a ladder won't help......

CharlieSpencer

I got the two year degree first, landed a job, and the company picked up the cost of the next two years through tuition reimbursement. Yeah, I had to go at night after working all day, and spend my weekends studying, but it wasn't coming out of my pocket.

Justin James

... I find myself yearning for a Master's in something totally unrelated to computers. MBAs definitely seem to be the ticket to the corner office, but I would love to study Economics for some bizarre reason, or possibly law. Midlands Tech does indeed offer about as much practical, hands-on instruction as USC, but as you say, without the extra cruft associated with a full BA/BS. Those general education requirements are part of my point. If someone simply wants to learn to program, why make them spend an extra two years in school at $10,000 per year, not to mention the fact that they could be making $35,000 - $45,000 for each of those years? That is a huge waste of money and time! It is a shame that someone with the 4-year degree is valued much more than the person with the 2-year degree when they learned the same things. J.Ja
