Tuesday, January 28, 2014

Controversial Programming Opinions I Agree With

I recently came across Bill the Lizard's blog post about controversial programming opinions. Some of these really caught me off guard. Others, however, I kind of agree with. I'm fairly new to this development thing, but I would still like to take this chance to chime in on some of the "controversial" opinions that I fully agree with.

5. "Googling it" is okay!

So, there's been a lot of debate about whether Google is helping or hindering us. One of my favorite articles about Google's psychological effects argues that we are becoming reliant on Google for "mindshare" (the practice of delegating memory tasks to the people and tools around us). Some people are disturbed by the idea that Google could be "replacing our memory," but I see it as a good thing, and here's why:
  • If we didn't have Google to quickly retrieve information for us, we would have to use some other form of reference (dictionaries, textbooks), which are expensive, not always available, and slower to search. Or maybe we would just neglect to pursue correct information altogether (let's face it, we're pretty lazy).
  • Consider the quote commonly attributed to Einstein: "Never memorize what you can look up in books." Ask yourself: doesn't the same principle apply to opinion 5?
  • There is too much information to store it all in your mind. Don't get me wrong, there are plenty of things we should know as citizens and craftsmen in general; however, computer science is growing by the second. We can't know it all. Personally, I'm thankful that Google knows things that I don't.

6. Not all programmers are created equal.

The gist of this one is that it's wrong to assume the amount of experience a developer has indicates how good a developer they are. I think Jeff Atwood of Coding Horror makes a few important points about this when he writes about becoming a better programmer without programming.

7. I fail to understand why people think that Java is absolutely the best “first” programming language to be taught in universities.

Here's why I agree with this one:
  • Although OOP is a good thing to teach, universities (perhaps unfortunately) have to cater to a variety of majors, not just computer science. OOP is not necessary for many engineering students (e.g. electrical engineers who program chips in C and Assembly).
  • Java is difficult to "ease into." Intro to programming classes are just that: an intro. They should illustrate basic concepts, but Java carries a lot of overhead. Even the classic "Hello, World" program has boilerplate that teachers, on students' first exposure, are forced to wave their hands at. Right off the bat, students are told to ignore most of what they're typing just to make the computer say hello. In simpler languages like Python and Ruby, "Hello, World" can be a single print statement and a string: two intro concepts that are isolated and far less distracting than a class wrapping everything.
  • For more advanced classes, a language like Java may be appropriate; however, there is a lot of speculation that Java is on the decline in industry. The point is, even if it is the current best language to teach, it won't always be. It will die.
  • Java is a great language for teaching OOP, I'll give it that. But it fails to illustrate certain slightly more advanced, fundamental concepts that schools like to teach, such as pointers and manual memory management. So even for advanced classes, it can't cover everything.
  • I am, personally, fine with schools using Java; it is a good language. The statement that "it is absolutely the best" is flawed in that it's just one of many excellent programming languages. To anyone who thinks otherwise, I would highly suggest Guido's PyCon keynote in which he criticizes language holy wars and trolls.
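To make the "Hello, World" contrast from the second bullet concrete, here is the classic Java version with each piece of boilerplate called out; the Python equivalent, for comparison, is the single line `print("Hello, World")`.

```java
// The classic "Hello, World" in Java. Before reaching the message
// itself, a beginner must type (and be told to ignore) a class
// declaration, an access modifier, a static method with a String[]
// parameter, and a fully qualified print call.
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World");
    }
}
```

Every line except the `println` exists to satisfy the language, not the student, which is exactly the hand-waving problem described above.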

9. It’s OK to write garbage code once in a while.

  • It's better to write quick garbage code that does the job and refactor it later than it is to be an Artist.
  • Proof-of-concept (POC) code just proves a point. It doesn't necessarily need to be maintained, read, or extended later.
  • Writing crappy code is part of learning to write better code. Also, refactoring your bad code can help you learn how to refactor others' bad code.

18. If you’re a developer, you should be able to write code.

Assuming "developer" means "one who constructs software," it seems necessary to be able to "stack the legos." Just as you can't make an omelette without breaking some eggs, you can't construct software without writing some code.

Those are the opinions I find myself pretty much for. Please feel free to disagree in the comments and let me know what you think.
