Building the Right Environment to Support AI, Machine Learning and Deep Learning
If you look at a software solution as a race across the ocean through a narrow canyon of icebergs, then companies start out by charting the maze. They have a goal (the finish line) and set checkpoints (milestones). They analyze where the currents are, where the icebergs are stable and where they drift, the weather conditions, the water temperature, the wind speed, right down to the native sea life. Then they try to pilot a supertanker through the maze and reach the end as quickly as possible. Many seasoned project managers in their skipper hats will attest that small course corrections made early can save a great deal of time and effort when aiming a supertanker at a narrow channel across the ocean. So they have to make certain the channel they are aiming for is the right one. In a sense, project managers and architects are responsible not only for steering the ship but also for predicting the future. If the finish line moves halfway through the race, it either takes far longer to reach, or the supertanker simply runs out of fuel (funding) and is dead in the water.
What is 2D?
2D stands for Differential Development. 2D is a stronger, more powerful approach to developing applications: it leverages bleeding-edge technology and the opinions of some very progressive thinkers, and it ardently rejects the notion of freezing any part of the design. It is important to know that the concepts embraced by 2D are not unproven or experimental. 2D seeks to synergistically unite compatible methodologies from a broad spectrum of leading IT experts. Much of what 2D advocates has already been practiced and successfully implemented within the industry. 2D seeks to build a supertanker-sized jet-ski that can handle the load of corporate business needs and effortlessly zip across the turbulent waters of changing requirements. It can also be a lot of fun to ride.
Structured Programming vs. Agile Development
Where structured programming freezes fundamental design decisions early, an Agile development methodology can support fundamental changes on any level without a complete rewrite, and without introducing a cascade of new bugs. This is important to understand. Very important. Fundamental changes on any level, without necessitating a rewrite.
In many situations, things such as renaming a base class or namespace, changing data types on an interface, renaming variables, or even changing the base data model become agonizing decisions, because they could require extensive revisions throughout the code and may also introduce new and unexpected bugs. The fear of "Breaking the Interface" (changing the face of one object so that it becomes unrecognizable to the rest of the objects in your application tree) results in leaving things as they are: often a compromised design that meets the functional requirements but is not as efficient, modular, or agile as it could have been had the team been free to recode it with what they had since learned.
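The fear described above can be shown in miniature. In this sketch (all class and method names are invented for illustration), a repository's `save` method gains a required parameter, and a call site written against the old interface breaks at runtime:

```python
# Hypothetical illustration of "Breaking the Interface": changing one
# object's face leaves existing callers unable to recognize it.

class RepositoryV1:
    """Original interface: save() takes a single record."""
    def save(self, record):
        return f"saved {record}"

class RepositoryV2:
    """Revised interface: save() now also requires the calling user."""
    def save(self, record, user):
        return f"saved {record} on behalf of {user}"

def legacy_caller(repo):
    # Written against the V1 interface; unaware of the new parameter.
    return repo.save("order-42")

print(legacy_caller(RepositoryV1()))   # works against the old interface
try:
    legacy_caller(RepositoryV2())      # the interface change breaks this call site
except TypeError as err:
    print("broken call site:", err)
```

In a statically typed language the compiler would flag every such call site at build time; in a dynamic language the breakage surfaces only when the call runs, which is exactly why these changes feel so risky.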
The approach to product architecture has a fundamental and unavoidable flaw:
The most important decisions are made at the start of a project with the least amount of road experience and the highest uncertainty factor.
These are critical design decisions that involve not only the foundation of the relational data model but also the architecture and interrelation of the layers/tiers between the UI and the database. How many times have you heard the word "Re-architect" mentioned in reference to a project halfway through the development life cycle? I would venture to guess not very often. That is because re-architecting is almost always synonymous with a major rewrite. And like rebuilding a house, the very act of committing to re-architecture implies demolishing the existing product and devaluing the time and effort spent on it. Usually, such drastic measures are taken only after enough enhancement and feature requests that the current version cannot support have piled up, or when product performance is unacceptable and cannot be fixed with hardware upgrades. More than likely, a bigger, more complex project (a supertanker) poses a far greater challenge to re-architecting than a simple, single-user application (a jet-ski).
Re-Architecting vs. Re-Factoring
Although Re-Architecting is a major undertaking that cannibalizes the existing design, Re-Factoring is quite the contrary. In fact, many developers have been quietly refactoring their code for years without saying a word. Some do not call it refactoring, instead referring to it as "tweaking" or "tuning," but the principle is the same. A developer builds a class or module, then goes back and removes redundant code, adds references to remote error handlers, renames the variables or reclassifies their types, adds attributes and comments, and so on. By the time the object, class, or module is submitted to version control, it has already gone through several personal revisions, versions, and iterations. Each developer has their own style and their own experience of what works better in different situations. Development environments in corporate IT often force developers to adopt uniform coding standards, which can be as ambiguous as "good user experience" (one of my most memorable functional requirements) or as specific as variable naming conventions. Though there are many varieties of coding standards and best practices, there seem to be surprisingly few flavors of refactoring methodology, leaving the process mostly in the hands of developers.
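The private refactoring loop described above can be sketched in a few lines. Here is a hypothetical before/after pair (both functions are invented for the example): the rewrite renames things descriptively and removes a redundant branch, while an assertion confirms the behavior is unchanged, which is the defining property of a refactor:

```python
# A minimal before/after refactoring sketch: same behavior, cleaner structure.

def total_before(l):
    # Original draft: cryptic names, a redundant else-branch.
    t = 0
    for x in l:
        if x > 0:
            t = t + x
        else:
            t = t + 0
    return t

def total_of_positives(values):
    """Refactored: descriptive name, redundant branch removed."""
    return sum(v for v in values if v > 0)

sample = [3, -1, 4, -1, 5]
assert total_before(sample) == total_of_positives(sample)  # behavior preserved
print(total_of_positives(sample))  # 12
```

The assertion is the essential step: a refactor that is not verified against the old behavior is just a rewrite with extra confidence.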
Differential Development (2D), applying refactoring principles, provides the following advantages:
- Keeps the application up to date with the latest technology.
- Keeps the design in line with current expectations and experience.
- Lets the dev team feed what they learn along the way back into the design at every level, including the core libraries.
- Produces modular components that adapt effortlessly to changing business requirements.
- Supports dynamic, ongoing testing of the product.
- Establishes a well-practiced process for applying changes at any level of the application and for understanding and managing their impact on the rest of it.
- Improves interoperability between the application layers.
- Keeps the data model unfrozen, so it can change and evolve throughout the development life cycle without significantly impacting deadlines or cost.
The advantage of refactoring over re-architecting is that the product design is in a constant state of revision, implementation, and testing. There is no "set" design, and nothing is frozen. Any design change, anywhere, is propagated throughout the product via dependency chains and references. You can call it "Extreme Development" or "Dynamic Design" or whatever makes sense, but the concept is important and quite powerful.
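Propagation through references is easiest to see when a design fact is defined exactly once and every dependent derives from it. In this hypothetical sketch (names invented for illustration), the model's field list is the single source of truth; changing it in one place automatically updates validation and serialization, with no hunt through the codebase:

```python
# A sketch of design change propagating through references: every
# dependent reads the shared definition rather than keeping its own copy.

CUSTOMER_FIELDS = ("name", "email")  # single source of truth for the model

def validate(record):
    # First dependent: checks a record against the shared field list.
    return all(field in record for field in CUSTOMER_FIELDS)

def to_csv_header():
    # Second dependent: derives the export header from the same definition.
    return ",".join(CUSTOMER_FIELDS)

print(validate({"name": "Ada", "email": "ada@example.com"}))  # True
print(to_csv_header())                                        # name,email
```

Add a field to `CUSTOMER_FIELDS` and both functions change behavior in lockstep; that is the dependency-chain propagation described above, in its smallest possible form.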