The majority of project planning efforts today are still performed in a strictly "deterministic" manner: every project task is assigned a well-defined timeframe and the resources to implement it. At the same time, we all know that real life necessitates corrective actions: resources are not always available, extra tasks pop up during the course of a project, and, most frequently of all, the initial estimate of the time required to perform a certain task turns out to be invalid. It's easy to mentally grasp such uncertainties and produce an accurate estimate when dealing with a few variables; for large-scale projects with multiple uncertainty factors of different natures, however, such analysis cannot be performed manually and demands a dedicated effort to assess the project accurately.
Such uncertainty in project planning is widely recognized across industries, and a wide variety of tools exists to help optimize the planning process and minimize the associated risks. These tools range from mathematically rigorous and sophisticated solvers (mainly constraint-based programming) to Excel add-ins (VBA macro based) that do similar things on a simpler level (you can even find freeware that performs some optimization of your Excel-based projects or dependencies).
Your situation is very common for a product development organization: how to rapidly develop an accurate project plan estimate that is critical to the overall development lifecycle (resources, relationships to other projects, and dependencies on other deliverables should always be well defined), while at the same time minimizing the time spent performing "if-then-else" analysis (your business and technical analysts and project managers have their hands full doing "real things": implementing new customer-requested functions and working on the next release of the product).
By adding a few extra steps to your planning process and applying a specialized tool to your project plan, I believe you can find the right solution and define the correct approach to this problem.
Your process of planning and assessment consists of five main steps:
- Define the project: During this phase, you put together the regular project plan in MS Project with the main tasks, timeframes, and deliverables. The project plan can either be created manually or be generated as the result of other efforts (preliminary use cases, object design, or work with cost-estimation tools).
- Identify uncertainty: In this phase, you determine which tasks or task inputs in the plan are uncertain, what you know about these uncertainties, and the best way to describe them: their behavior and the known range of values they might cover. During this phase, you also identify the most critical results or outputs of the plan that you want to examine, analyze, and optimize.
- Apply the tool and implement the model: I've selected Palisade's @Risk for Project (http://www.palisade.com) as my project optimization tool from the wide variety of available packages. I believe this tool is a good match for my requirements: it integrates with MS Project and offers an easy-to-set-up model, sufficient depth of simulation and outputs, and ease of analysis. The particular tool selected affects only the way the uncertainty model is applied; the general approach to the problem remains the same.
- Run the simulation: As the result of the Monte Carlo simulation (whose parameters can be configured as well), you obtain the ranges and probabilities of all outcomes for the outputs you identified during model setup. Typically, these are the ranges of the durations of particular phases or of the tasks with the most critical dependencies, the overall project timeframe, and so forth.
- Make a decision: Based on the simulation results and the outputs of the model, you obtain information about the most critical characteristics of the project. This data serves as the basis for subsequent optimization and corrective actions.
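The five steps above can be sketched in plain Python to show what the simulation is doing under the hood. The task names, three-point estimates, and the simple sequential dependency here are hypothetical illustrations; @Risk performs this kind of sampling directly against the MS Project plan.

```python
import random

# Steps 1-2: define the plan and describe each uncertain duration (in days)
# as a three-point (min, most likely, max) estimate. All values are made up.
tasks = {
    "design":    (10, 15, 25),
    "implement": (20, 30, 50),
    "test":      (5, 10, 20),
}

def simulate_once(plan):
    """One trial: sample every task duration and sum them (tasks run in sequence)."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in plan.values())

# Steps 3-4: implement the model and run the Monte Carlo simulation.
random.seed(42)
trials = sorted(simulate_once(tasks) for _ in range(10_000))

# Step 5: read off the outputs -- for example, the median and 90th-percentile finish.
p50 = trials[len(trials) // 2]
p90 = trials[int(len(trials) * 0.9)]
print(f"median total duration: {p50:.1f} days")
print(f"90% of trials finish within {p90:.1f} days")
```

The key output is not a single number but a range: the spread between the median and the 90th percentile is exactly the schedule risk that a deterministic plan hides.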
Working with the Tool
@Risk is tightly integrated with MS Project and appears as an additional tool set on the MS Project panel (see Figure 1). By selecting a particular project parameter (for example, the duration of a task) and clicking the "Define Distribution" icon on the @Risk toolbar, you can set up the model's characteristics.
Figure 1: Model Setup
The tool is quite flexible in defining distributions: out of the box, it's easy to find anything from the uniform or normal distribution to more exotic ones; see Figure 2.
Figure 2: Different available distribution functions
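The choice of distribution matters more than it may seem: the same most-likely estimate can carry very different tail risk. A small sketch using Python's standard library (the 10-day task and its parameters are illustrative assumptions, not @Risk settings) compares three ways of modeling one uncertain task:

```python
import random

random.seed(7)
N = 10_000

# Three candidate models for the same task with a most-likely duration of 10 days.
samples = {
    "uniform":    [random.uniform(6, 14)        for _ in range(N)],
    "normal":     [random.normalvariate(10, 2)  for _ in range(N)],
    "triangular": [random.triangular(6, 18, 10) for _ in range(N)],  # right-skewed
}

stats = {}
for name, draws in samples.items():
    draws.sort()
    mean, p90 = sum(draws) / N, draws[int(N * 0.9)]
    stats[name] = (mean, p90)
    print(f"{name:10s} mean={mean:5.1f}  p90={p90:5.1f}")
```

Even though all three peak around 10 days, the right-skewed triangular model pushes both the mean and the 90th percentile noticeably higher, which is why a realistic min/most-likely/max estimate beats a symmetric guess for schedule risk.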