The Joys Of Programming

From my experience, a jack-of-all-trades person is usually a master of none in the programming field. I have worked with many jack-of-all-trades people, and everything they touch is overly complicated and usually has fundamental problems. While it is true that someone who knows one language could probably learn another language fairly easily, mastery of a language usually produces a better end product than someone who spent a few months learning it five years ago. While there are a few people out there who can pull it off, and it is quickly obvious who they are when you work with them, most people can't.

I think the real issue I see on a daily basis is people trying to learn the entire stack of their field and, even though it is good to have familiarity, pretending they are masters. As an example, take a Java developer who learns some DB/SQL, becomes familiar with JavaScript, and picks up basic-to-intermediate Linux knowledge. This person is then asked to build an application that runs on a Linux box, has a web front end, and saves content to a database. The jack of all trades will tackle the entire application: defining the DB schema, creating the webpage, and deciding how it will be deployed to the Linux box.

Now, I am not saying that it is bad to know Java, SQL, JavaScript, and Linux, but it is rare for someone to be a master of all of them, and this will usually yield a mediocre product that is a nightmare to maintain for anyone who is a master in one specific area, and one that probably has major limitations in extensibility and performance. Instead, I would prefer a Java developer who is a master of Java with maybe some basic knowledge of the others, and leave the DB design to a master database engineer, the web design to a master web designer, and the deployment to a Linux guru.

If the coder is not a master of Java but rather a master of PHP, then chances are their Java code will be bad until they master that language. And I don't think you can master a language in anything less than five years. Maybe you can get an understanding of the syntax, but being a master of a language is more about knowing the APIs that come with it and how to use them than about syntax alone. This could even extend to the same language on different environments, say C++ on Linux vs. Windows.
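That API point can be shown in any language; here's a hypothetical Python sketch (the word list and counts are invented for illustration) of the gap between knowing the syntax and knowing the library:

```python
from collections import Counter

words = "the quick brown fox jumps over the lazy dog the end".split()

# Someone who only knows the syntax tends to reinvent the library:
counts = {}
for w in words:
    if w in counts:
        counts[w] += 1
    else:
        counts[w] = 1

# Someone who knows the APIs reaches for what already exists:
assert counts == Counter(words)
assert Counter(words).most_common(1) == [("the", 3)]
```

Both produce the same answer, but the second version is shorter, harder to get wrong, and instantly readable to anyone fluent in the language.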

kazar wrote:

From my experience, a jack-of-all-trades person is usually a master of none in the programming field. I have worked with many jack-of-all-trades people, and everything they touch is overly complicated and usually has fundamental problems. While it is true that someone who knows one language could probably learn another language fairly easily, mastery of a language usually produces a better end product than someone who spent a few months learning it five years ago. While there are a few people out there who can pull it off, and it is quickly obvious who they are when you work with them, most people can't.

I think the real issue I see on a daily basis is people trying to learn the entire stack of their field and, even though it is good to have familiarity, pretending they are masters. As an example, take a Java developer who learns some DB/SQL, becomes familiar with JavaScript, and picks up basic-to-intermediate Linux knowledge. This person is then asked to build an application that runs on a Linux box, has a web front end, and saves content to a database. The jack of all trades will tackle the entire application: defining the DB schema, creating the webpage, and deciding how it will be deployed to the Linux box.

Now, I am not saying that it is bad to know Java, SQL, JavaScript, and Linux, but it is rare for someone to be a master of all of them, and this will usually yield a mediocre product that is a nightmare to maintain for anyone who is a master in one specific area, and one that probably has major limitations in extensibility and performance. Instead, I would prefer a Java developer who is a master of Java with maybe some basic knowledge of the others, and leave the DB design to a master database engineer, the web design to a master web designer, and the deployment to a Linux guru.

If the coder is not a master of Java but rather a master of PHP, then chances are their Java code will be bad until they master that language. And I don't think you can master a language in anything less than five years. Maybe you can get an understanding of the syntax, but being a master of a language is more about knowing the APIs that come with it and how to use them than about syntax alone. This could even extend to the same language on different environments, say C++ on Linux vs. Windows.

I have the reverse experience. Most people I have worked with who are "masters" at one thing and know nothing else are one-trick ponies who cannot see an entire problem space and can rarely do anything at full project scale. You have to cut things into tiny bits and feed the project to them one feature or function at a time, and integration-test everything twice, because dollars to donuts they've solved this problem at the expense of another. They tend to use their one tool to the exclusion of all others, even when that tool isn't the right one for the job, and they are often very resistant to updating/upgrading infrastructure and learning new things when the world shifts around them. Not to mention some awful experiences with prima donna complexes. They can be great in large environments, but in small, agile teams they are a nightmare to test and a liability to shipping a solid product that can stand up to real-world use.

Writing maintainable code has nothing to do with language. It has everything to do with the underlying principles of coding clearly, commenting, documentation, and an intelligent version control strategy. If someone skipped CompSci 101 (or has some personal issue with the commonly understood conventions) it won't matter whether they use five languages or they've been writing the One True Language handed down by Lovelace herself their entire career, you're going to be dealing with a pissed-off FSM instead of a smooth skein of code.

Maybe in desktop it's different, but in my realms, by the time you have five years of experience with one thing it's been through two or three versions of the browser and framework, and it has either morphed into something completely different or is on the way out. DHTML, then ColdFusion, then Ajax, and now what? I started coding HTML in 1992 off a background in LISP and ObjectPAL, and was doing SGML before that. I've been doing this since before you could align objects or put a background on the page.

I've been through four or five big shifts since then, and even now I'm finding my long years of experience in HTML and CSS are a hurdle to quickly mastering CSS3 and HTML5 rather than a help. Everything I "knew" about design has been turned on its head while I was busy with this other thing. I'll work this out, just as I have before.

Maybe you've been lucky and haven't had to maintain monolithic code bases written by people that would use languages and technologies they just learned.

With regard to the big picture, I completely disagree. Just look at how building a skyscraper works. You don't have engineers putting up the building; you get specialized tradesmen to do that. The engineer oversees the construction and the architect provides the designs. Software should be done in a similar fashion. Only the architect and lead engineer need a big-picture overview of everything, and you hire specialists to deal with the individual pieces: DB experts to do the DB schema and define the queries, application engineers to write the business logic, web developers to write the UI layer and make it look pretty and usable. Maybe for a small project with 2-3 people working on it, jack-of-all-trades people are OK, but on big projects they write code that is near impossible to maintain or update.

As for skills becoming old because of new technologies, I don't think that relates to the concept of a jack of all trades as that trade no longer exists (or is near death). A person needs to make sure that whatever they are a master at is something that is needed. If someone is a master web developer, then they need to know everything about how to develop web pages. That means learning HTML5, AJAX, and any other new technology that applies to their job.

I myself spend most of my time working in Java, and I am always learning the newest technologies within my mastery so I am able to adapt. If I started developing in Struts 1.x and never spent time learning Struts 2, Spring MVC, etc., then I would be letting my mastery slide into obsolete technologies. But I am not going to spend time learning more than rudimentary skills in areas outside my mastery and risk becoming mediocre overall.

There is way too much out there to be a jack of all trades. Good programmers can become a master of one. Great programmers, maybe a master of two or three. Anything more would require spending all available time studying and learning new technologies just to keep up.

Mastery is essential, of course: as a C++ developer who was forced to switch to Java, it took me about a year before I was writing what I'd now consider good Java code, and I'd hate to go back now and maintain the projects I wrote during that transition. But surely some knowledge outside your immediate area is always useful?

If your web UI developers have at least a basic grasp of how databases work, then change request discussions are going to be a lot easier between the two teams. Things like knowledge of scripting languages are useful for any developer, even if they're not part of their 'mastery'.

The problem here is that "jack of all trades" is a phrase that can hide a multitude of sins. There clearly are people who are excellent computer scientists with broad full-stack experience, who can be productive, non-morons when writing code, and who are capable of turning their hand to new tech and learning it easily (I'd probably flatter myself by placing myself in that category). Often it'll be someone who enjoys learning new skills but has a deeper mastery of their favourite techs (that T-shaped notion). That's the guy you want for your small agile dev team or comp.sci research team.

But I think all too often people label themselves "jack of all trades" to hide the fact that they've just dabbled in many things, with neither the commitment to actually follow through and learn something properly nor the fundamental computer science framework on which they could hang new skills. They don't have a deep mastery of anything, and they are just parrot-learning what they know.

Obviously some of this has to do with project size: for any project that gets sufficiently large and complex, it will eventually pay to bring domain specialists on board. But that's a specific type of project organisation, and not every project is structured that way.

e2a: more often than not I'd rather maintain code written by competent, conscientious programmers than by self-proclaimed experts.

Wow, thanks for all the advice, everyone. It has been very interesting to get everyone's perspectives on this issue. It certainly gives me a lot to think about. As a quick aside, the networking class offered at my school sucked, and the teacher is notorious for not being any good. I feel like I learned nothing in the class, but I worry that, as I approach graduation, network programming (and all the related stuff that goes with it) is an area where my education is seriously lacking. Any advice about what I should do to round out my skills in that area? The textbook we used seems to be fairly standard when I search for other courses on the topic, but I was hoping for more practical instruction/practice.

Maybe I have pushed too hard to one side of the issue. I don't believe that one should master only one thing and not know anything else. As a Java developer, I do know C++, C#, ASP, JavaScript, etc., and knowing these languages even at a rudimentary level gives me a broader understanding of how the entire stack works. I am just saying that just because I have an understanding of something doesn't mean I know it well enough to write production code.

The problem occurs where people who are "jacks of all trades" think they know everything and try to do everything, which almost always ends in a large amount of mediocre code that over the years degrades into a big ball of mud.

I think we agree more than we thought we did last night. I'm glad I read your post before I gave you both barrels. As a UI developer for N-tier applications, I have to know all that, plus a couple other things just to get my work to display on the screen. I have worked in both big and small companies, and in the case of the small ones you have to spread out to get the job done.

I also see your problem. But I don't attribute that to "jack of all trades" so much as to "incompetent, self-important jerk".

IlleBellus wrote:

Wow, thanks for all the advice, everyone. It has been very interesting to get everyone's perspectives on this issue. It certainly gives me a lot to think about. As a quick aside, the networking class offered at my school sucked, and the teacher is notorious for not being any good. I feel like I learned nothing in the class, but I worry that, as I approach graduation, network programming (and all the related stuff that goes with it) is an area where my education is seriously lacking. Any advice about what I should do to round out my skills in that area? The textbook we used seems to be fairly standard when I search for other courses on the topic, but I was hoping for more practical instruction/practice.

I'm not a network programmer by any stretch, but I would look into getting an appropriate MCSE book and running through the exercises in it. Check 2nd-hand bookstores for ones that are recent enough to be using the technologies.

kazar wrote:

Maybe you've been lucky and haven't had to maintain monolithic code bases written by people that would use languages and technologies they just learned.

With regard to the big picture, I completely disagree. Just look at how building a skyscraper works.

Am I missing something? You and momgamer agree when it comes to large projects.

It's amazing how much just a bit of proper programming education and common sense can help in the development world.

Mostly, the common sense.

It's amazing how many people seem to check that at the door (or lost it outside somewhere...)

kazar wrote:
MacBrave wrote:
*Legion* wrote:

Breadth is critical, because in a fast moving field like computing, your depth can be rendered somewhat irrelevant almost overnight.

Also, I find it's a lot easier for a programmer with solid breadth to deep dive into a new field of depth, but a programmer who has been tunnel-visioned to only his depth has a hell of a time re-building breadth.

"Jack of all trades, master of none,
Certainly better than a master of one?"

"Jack of all trades" usually means a person who is average at everything he does, but who can do anything. Would you rather work on code written by an average person, or by a master?

To give an example: you live in a village that has a baker, a blacksmith, and a jack of all trades who does both. Who would you buy your bread from? Who would you pay more?

To write a data-driven web application through all the layers you need the following (Microsoft stack):

HTML
CSS
jQuery/JavaScript
WCF/Service Architecture
C#
T-SQL
Thus the jack of all trades tends to make more, because they can do the work that a larger shop would segment across 2-3 other people. The depth of modern programming is staggering. I mean, I've worked in .NET for about 10 years now; does that make me a master? There are over 6k classes built into the framework... your average .NET developer probably uses a few hundred at most regularly. I guess I need a better definition of "master"?

bandit0013 wrote:

Thus the jack of all trades tends to make more, because they can do the work that a larger shop would segment across 2-3 other people. The depth of modern programming is staggering. I mean, I've worked in .NET for about 10 years now; does that make me a master? There are over 6k classes built into the framework... your average .NET developer probably uses a few hundred at most regularly. I guess I need a better definition of "master"?

Actually, I have been reading articles about this, and it turns out that jacks of all trades make less than specialists. But they tend to be the last to get laid off when things go bad.

kazar wrote:
bandit0013 wrote:

Thus the jack of all trades tends to make more, because they can do the work that a larger shop would segment across 2-3 other people. The depth of modern programming is staggering. I mean, I've worked in .NET for about 10 years now; does that make me a master? There are over 6k classes built into the framework... your average .NET developer probably uses a few hundred at most regularly. I guess I need a better definition of "master"?

Actually, I have been reading articles about this, and it turns out that jacks of all trades make less than specialists. But they tend to be the last to get laid off when things go bad.

Interesting, my anecdotal experience suggests otherwise. As a jack I've long made more than my specialized friends... could just be that I'm a better negotiator.

How do you guys feel about a CS degree at a college that starts the kids off with Java? Still plenty of opportunity for them to jump languages? Do you feel there is still plenty of opportunity over the next couple of years for kids leaving college with CS degrees?

Anybody have experience with IEC 61131-3 languages like Structured Text or Function Block Diagram?

karmajay wrote:

How do you guys feel about a CS degree at a college that starts the kids off with Java? Still plenty of opportunity for them to jump languages? Do you feel there is still plenty of opportunity over the next couple of years for kids leaving college with CS degrees?

CS is theory. Couldn't really care less about what language they use to teach it. As long as the graduate understands that Java is an extremely rigid OOP language and knows about other programming paradigms, it sounds good to me.

bandit0013 wrote:

Interesting, my anecdotal experience suggests otherwise. As a jack I've long made more than my specialized friends... could just be that I'm a better negotiator.

That must be another trade that you are good at.

SixteenBlue wrote:
karmajay wrote:

How do you guys feel about a CS degree at a college that starts the kids off with Java? Still plenty of opportunity for them to jump languages? Do you feel there is still plenty of opportunity over the next couple of years for kids leaving college with CS degrees?

CS is theory. Couldn't really care less about what language they use to teach it.

I'm gonna disagree, because these two statements are in conflict: knowing programming languages doesn't teach CS theory on its own.

The better question is: what do you want to do?
Be a software engineer? A coder? A programmer? A computer scientist? I can't remember; wasn't it in this thread that someone had a good link to a blog post or article about the difference?

We teach our undergrads Java and Haskell for the programming courses they have to take. What they are really being taught is the difference between functional and imperative programming, which is a somewhat more important theoretical concept to get your head around than any specific language.

Outside of the Java and Haskell courses they are free to use whichever languages they see fit; largely we don't really care what they use. The bright/committed ones will have exposed themselves to three or four by the time they're done.
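The functional-vs-imperative contrast those courses aim at fits in a few lines; Python is used here only so both styles appear in one snippet (the courses themselves use Java and Haskell):

```python
from functools import reduce

def total_imperative(xs):
    # Imperative style: step through the list, mutating an accumulator.
    acc = 0
    for x in xs:
        acc += x
    return acc

def total_functional(xs):
    # Functional style: describe the whole computation as a fold,
    # with no mutable state anywhere.
    return reduce(lambda acc, x: acc + x, xs, 0)

assert total_imperative([1, 2, 3, 4]) == 10
assert total_functional([1, 2, 3, 4]) == 10
```

Same result, but the second version is a declaration of *what* is computed rather than a recipe for *how*, and that shift in thinking is the real lesson.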

RolandofGilead wrote:
SixteenBlue wrote:
karmajay wrote:

How do you guys feel about a cs degree at a college that starts the kids off with java? Still plenty of opportunity for them to jump languages? You guys feel there is still plenty of opportunity next couple if years for kids leaving college with cs degrees?

CS is theory. Couldn't really care less about what language they use to teach it.

I'm gonna disagree, because these two statements are in conflict: knowing programming languages doesn't teach CS theory on its own.

The better question is: what do you want to do?
Be a software engineer? A coder? A programmer? A computer scientist? I can't remember; wasn't it in this thread that someone had a good link to a blog post or article about the difference?

From my experience there is no difference. 99% of jobs will treat someone with a two-year degree in programming with the same weight as someone with a master's in software engineering. Maybe not in pay, but in job duties.

RolandofGilead wrote:
SixteenBlue wrote:
karmajay wrote:

How do you guys feel about a cs degree at a college that starts the kids off with java? Still plenty of opportunity for them to jump languages? You guys feel there is still plenty of opportunity next couple if years for kids leaving college with cs degrees?

CS is theory. Couldn't really care less about what language they use to teach it.

I'm gonna disagree, because these two statements are in conflict: knowing programming languages doesn't teach CS theory on its own.

The better question is: what do you want to do?
Be a software engineer? A coder? A programmer? A computer scientist? I can't remember; wasn't it in this thread that someone had a good link to a blog post or article about the difference?

I'm confused; what two statements are in conflict? I didn't say "learn Java and you'll learn CS." I'm saying CS is theory, and you can use whatever you want to teach that theory. It could all be pseudocode for all I care.

Edit: A good CS program should include classes on software engineering, comparative languages, architecture, etc., so that you have a reasonable understanding of those concepts, but for general CS theory I don't really think it matters what language is used.

SixteenBlue wrote:

but for general CS theory I don't really think it matters what language is used.

which is why we should be teaching everything with Smalltalk and Prolog...

DanB wrote:
SixteenBlue wrote:

but for general CS theory I don't really think it matters what language is used.

which is why we should be teaching everything with Smalltalk and Prolog...

and Haskell

You guys could have been stuck with RPG II, assembly language, COBOL, SGML, and Apple BASIC like I was.

Then the design curriculum I was also taking added AutoLISP.

Guess how much of any of it I've touched in the last 15 years.

IMHO, it doesn't really matter what you start with, as long as you learn the basic principles and then how to apply them. Once you have that, another language is a minor stumbling block (unless you're one of those people who has trouble teaching themselves stuff - that's a huge handicap in this business).

I do wish more CS programs worked with something strongly typed and case-sensitive, though. Not having that experience seems to be a major drawback for people starting out in my niche of the business realms.

Most(?) CS courses teach Java these days because it has an acceptable object model, so it's a reasonably good choice for getting OOP into students' heads. Additionally, it's a reasonably employable skill, so there is demand for it both from students who want jobs and from the outside/commercial world, which wants educated students who can be productive.

karmajay wrote:

How do you guys feel about a CS degree at a college that starts the kids off with Java? Still plenty of opportunity for them to jump languages? Do you feel there is still plenty of opportunity over the next couple of years for kids leaving college with CS degrees?

Using Java for an algorithms or data structures course allows the teacher to focus on the high-level concepts without fussing with the language too much. That said, I think if students are insulated too long from things like static vs. dynamic memory and manual memory management, moving to a language later on that needs that knowledge isn't easy. My college curriculum used Pascal for the intro course, jumped to C++ in the next course, then delved into assembly, Scheme, Prolog, Perl, etc. Making the transition from C++ to pretty much any other imperative language is fairly easy, while moving from a garbage collected language to something like C or C++ can be quite difficult. If I had to pick a language to teach imperative programming these days I'd choose D, because it has garbage collected objects, C-style structs, and inline assembler, so you can work as high or low-level as desired.
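That insulation is easy to demonstrate: in a garbage-collected runtime, the reclamation students would otherwise do by hand simply happens behind their backs. A small CPython sketch (weakref lets us watch an object die without keeping it alive):

```python
import gc
import weakref

class Node:
    """A throwaway object so we can observe it being reclaimed."""
    pass

n = Node()
r = weakref.ref(n)   # observes n without keeping it alive
assert r() is n      # the object is still reachable

del n                # drop the only strong reference
gc.collect()         # CPython frees via refcounting; collect() for good measure
assert r() is None   # reclaimed automatically, with no free() in sight
```

A student who has only ever seen this behavior has never had to reason about object lifetimes, which is exactly the skill C and C++ demand from line one.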

SixteenBlue wrote:

I'm confused; what two statements are in conflict? I didn't say "learn Java and you'll learn CS." I'm saying CS is theory, and you can use whatever you want to teach that theory. It could all be pseudocode for all I care.

The only problem with computer science taught as theory is that real-world behavior can be significantly different. For example, traversing a heap is extremely efficient from a big-O perspective, but for large heaps, sibling nodes live on separate memory pages, so the comparison performed at each step requires the use of 3 separate pages of memory. This murders the VMM and the algorithm ends up performing horribly. I've yet to hear of a CS curriculum that teaches things like this even though the underlying technology (virtual memory, for example) has been in use for maybe 30 years.

[edit]

Here's an article that discusses this issue.
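To make the heap example concrete: in the standard array layout, the children of node i live at 2i+1 and 2i+2, so the deeper you go, the farther apart in memory each parent-to-child hop lands. A quick Python sketch of the arithmetic, assuming (for illustration) a 4 KiB page and 8-byte elements:

```python
PAGE_BYTES = 4096   # assumed page size
ELEM_BYTES = 8      # assumed element size, e.g. a 64-bit key

def left_child_gap(i):
    """Byte distance between node i and its left child (at 2*i + 1)
    in an array-backed binary heap."""
    return ((2 * i + 1) - i) * ELEM_BYTES   # simplifies to (i + 1) * ELEM_BYTES

# Near the root, parent and child sit side by side in memory:
assert left_child_gap(0) == ELEM_BYTES

# Deep in a large heap, every hop jumps far past a page boundary:
assert left_child_gap(100_000) > 100 * PAGE_BYTES
```

Big-O analysis never sees that gap, but the TLB and page cache certainly do.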

Used recursion at work.
Feels good man.

complexmath wrote:

The only problem with computer science taught as theory is that real-world behavior can be significantly different. For example, traversing a heap is extremely efficient from a big-O perspective, but for large heaps, sibling nodes live on separate memory pages, so the comparison performed at each step requires the use of 3 separate pages of memory. This murders the VMM and the algorithm ends up performing horribly. I've yet to hear of a CS curriculum that teaches things like this even though the underlying technology (virtual memory, for example) has been in use for maybe 30 years.

[edit]

Here's an article that discusses this issue.

Yeah, it's a tough balance. At my last job we had an intern who sent me an email one day asking where I learned [a list of X things that had come up lately], and they were all real-world things that just don't get discussed in CS programs. On the other hand, spend too much time teaching the real world and you lose the solid theory foundation.

Yup. And to be fair, things like the impact of multi-level storage on algorithms are taught at the theoretical level, just not to undergrads. But I agree that theory is very important. For example, I know a talented programmer with no formal background in the field, and he's told me stories of how he invented a cool solution to some problem only to discover later that it was an established algorithm. Learning the theory provides a language for describing efficient systems without getting too caught up in implementation details.

I guess, in theory, that's what internships during undergrad are for. Learn theory in class and real world in your internship. But since they're optional that can leave a lot of grads without that experience.

Like me, for example.