The UML Language Wars: Here We Go Again
The more I learn, the less I know. For example, in my youth I foolishly thought etymology was probably a useless study. Now I know that by figuring out the parts and roots of words, and by learning something about their origins (that is, etymology), I can figure out something about new words, and new words are keystones to new understanding. This realization has given me a reasonable working comprehension, but further study reveals a great deal about etymology that I still don't know. I am willing to learn, though. What's my point? A willingness to learn serves me far better than making it up as I go along. Yet making it up as we go is where we often find ourselves in the software development business.
OOP Language Wars
For many years now, we have been fighting OOP language (OOPL) wars. Debates over good versus bad OOP have raged in all sectors of our business. Historically, the problem was that everyone involved in these heated debates was a self-appointed expert. In reality, very few people are OOPL experts. The real experts are recognized as such, and many of their names are well known: Grady Booch, Bjarne Stroustrup, Scott Meyers, James Rumbaugh, Ivar Jacobson, Erich Gamma, Kent Beck, Martin Fowler, and a handful of others. Students of these recognized experts could gain insight and perspective about good OOP. Clem Arbuckle from Pigsknuckle, Arkansas, who makes it up as he goes along, is not an expert, and he likely misunderstands good OOP-ness.
Fortunately, design patterns and refactoring seem to be settling many of the OOP language war debates. Those who have learned from the masters are writing better OOP code, whereas those who follow in Clem's footsteps are groping blindly in the dark.
Here We Go Again
The Unified Modeling Language (UML) has placed a strong emphasis on software design and modeling in the software industry. The UML, like English and Spanish, or C# and Visual Basic .NET, is a language with keywords and a grammar. It describes the analysis, design, and implementation of a problem and a solution, whereas C# is used to actually implement a solution.
The good news is that we as an industry are using an industry-standard language to describe problems and solutions. The bad news is that we've failed to heed the words of the philosopher George Santayana: "Those who cannot remember the past are condemned to repeat it." Once again, we are fighting the same good-versus-bad debates about UML that we fought over C# and Visual Basic .NET. Here are some of the arguments (which should resonate as déjà vu):
- Whole words versus abbreviations
- Prefix notations
- Naming conventions
- Subjectivity regarding good versus bad designs
- Single monolithic elements versus multiple discrete elements
It has been established for some time that whole words, nouns, and verbs are better in code than arbitrary abbreviations. The UML's audience is even broader than the audience for code, so why would abbreviations be a good thing in the UML? The same argument applies to prefix notations and naming conventions: They don't add clarity in code; ergo, prefixes and arbitrary naming conventions won't add clarity to UML diagrams.
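The abbreviations-versus-whole-words argument is easy to see in a small sketch. The names and the order-total example here are hypothetical, not drawn from the column; Python stands in for the C# and Visual Basic .NET the column discusses:

```python
# With abbreviations and type prefixes, the reader must decode the names.
def calc_tot(int_qty, dbl_prc):
    return int_qty * dbl_prc

# With whole words (nouns and verbs), the intent is plain on first reading.
def calculate_order_total(quantity, unit_price):
    return quantity * unit_price

# Both versions compute the same thing; only the readability differs.
print(calculate_order_total(3, 5))
```

The compiler is equally happy with either version; the argument is entirely about the human reader, which is why it carries over from code to UML diagrams.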
Code is deemed good if it is based on well-known patterns and refactorings. Therefore, good UML models clearly should contain well-known patterns and refactored designs. Likewise, a single, monolithic implementation with one class or just one assembly is deemed significantly worse than multiple, well-defined discrete elements. This dictates separating a model into discrete, well-organized elements.
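A minimal sketch of the monolithic-versus-discrete point, using the well-known Strategy pattern; the shipping example and its names are illustrative, not from the article:

```python
# Monolithic: one routine accumulates every rule, and every change touches it.
def monolithic_shipping_cost(method, weight):
    if method == "ground":
        return 1.50 * weight
    elif method == "air":
        return 4.00 * weight
    raise ValueError("unknown method: " + method)

# Discrete: each rule is its own small, named element (the Strategy pattern),
# so new rules can be added without editing the existing ones.
class GroundShipping:
    def cost(self, weight):
        return 1.50 * weight

class AirShipping:
    def cost(self, weight):
        return 4.00 * weight

def shipping_cost(strategy, weight):
    return strategy.cost(weight)
```

Either version yields the same numbers; the difference the column argues for is in how the design reads and grows, and the same separation into discrete elements applies to a UML model.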
Consequently, if we apply Santayana to the arguments listed (and probably many others), arguing over good versus bad UML is a waste of time. Repeating the mistakes of the past twenty years of coding means that every Clem Arbuckle out there will have an equal voice and few will consult the wise old men who have long since settled these subjective debates.
On the other hand, Santayana may be wrong. Maybe we repeat the past as a way to closely examine our mistakes for the pearls of wisdom that may emerge. Through cycles of mistakes, we can make small refinements and revisions and acquire profound understanding. Maybe this is why history repeats itself and why we have to have UML language wars. (Or maybe language wars are a waste of time, and we should read more. I am not sure. I seem to know less and less each day.)
Words to the Wise
O world, thou choosest not the better part!
It is not wisdom to be only wise,
And on the inward vision close the eyes,
But it is wisdom to believe the heart.
(George Santayana)
About the Author
Paul Kimmel is the VB Today columnist for www.codeguru.com and has written several books on object-oriented programming and .NET. Check out his book Visual Basic .NET Power Coding from Addison-Wesley and his upcoming book UML DeMystified from McGraw-Hill/Osborne (Spring 2005). Paul is also the founder and chief architect for Software Conceptions, Inc., founded 1990. He is available to help design and build software worldwide. You may contact him for consulting opportunities or technology questions at email@example.com.
If you are interested in joining or sponsoring a .NET Users Group, check out www.glugnet.org.
Copyright © 2005 by Paul T. Kimmel. All Rights Reserved.