
WAYS TO IMPROVE THE EFFICIENCY OF IT PROJECT MANAGEMENT PROCESSES USING FORECASTING AND DATA ANALYSIS TOOLS

DENIL TONY

Abstract. The nature of work has changed dramatically with the evolution of information technology, and organizations face increasing pressure to deliver higher levels of accuracy, efficiency and flexibility in today's dynamic environment. Conventional approaches to project management are often ill-suited to complex, large-scale IT projects [1]. This article discusses how forecasting and data analysis tools can improve IT project management processes [4]. Using forecasting tools, project managers can plan more comprehensively for the resources, time and risks a project is likely to require, improving outcomes. Data analysis, in turn, gives the project team insight into performance, exposing plan inefficiencies and other issues that need to be addressed [6]. Together, these tools support effective management of project work and resources and raise the probability of completing the project on time and on budget [5].

Key Words: IT project management efficiency, forecasting, data analysis, predictive analytics, project resource forecasting, data-driven decision making, risk forecasting, time and cost estimation, project performance analytics, data-driven project scheduling, predictive tools, optimization using data analytics, machine learning in project forecasting, big data, resource optimisation.

1. Introduction

Organizational IT project management has been transformed by the growing complexity and interdependence of projects. Under the pressure, typical for today's organizations, to achieve more while spending less and delivering in shorter time frames, conventional approaches to project management fall short of expectations. This is especially true for complex, multi-team implementations of large systems with shifting stakeholders and fast-changing requirements in a rapidly evolving technical environment.

Forecasting and data analysis tools allow project managers to anticipate difficulties before they arise and to manage resources effectively. Predictive techniques, for instance, make it possible to assess risks before they materialize, while data analysis enables real-time tracking of work progress. Sectors such as healthcare and finance have already benefited from such strategies in project management and resource usage.

This paper examines the main obstacles in IT project management and identifies how forecasting and data analysis tools can address them. It also highlights the need to promote a culture of data-driven decision-making in order to get the most out of these tools.

2. Roadblocks in IT project management

2.1 Imprecise time and cost management

Schedule and cost estimation in IT projects is often imprecise and distorted, resulting in time and cost overruns and resource misallocation. Many projects fail to plan for factors such as changing requirements, dependencies between tasks, and risks. Conventional estimation procedures typically rely on historical precedent, expert opinion or very simple mathematical models that do not account for the uncertain nature of IT projects [3]. These inaccuracies disrupt project schedules, strain budgets, and weaken stakeholder confidence. The inability to estimate realistic time and cost also leads to misallocation of resources, which in turn leads to lost opportunities and ultimately lower overall project returns.

2.1.1 Reliance on static planning models

ОФ "Международный научно-исследовательский центр "Endless Light in Science"

Traditional static planning tools such as Gantt charts and fixed timelines offer limited room for change in IT projects. These models normally follow a sequential, one-pass format, making it hard to incorporate changes, enhancements or new dependencies that emerge during implementation [6]. For instance, once specific milestones have been fixed, static models struggle to adjust timelines and costs. This rigidity compounds errors as the project progresses, leading to delays and budget overruns. Because project variables often change from one moment to the next, adaptive techniques such as real-time forecasting become imperative.

2.1.2 Deficient historical data

Organisations usually lack adequate data, especially historical data, to ensure that project forecasts are accurate. Information can be siloed, outdated or even non-existent because prior projects were not recorded properly. Even when data is available, it may not match the characteristics of the current project, such as its technology or market. Project plans therefore lack sufficient historical references to base decisions on and instead rely on assumptions or insufficient information about future events. Strengthening data management and collection processes, together with effective use of broad project databases, is the key to changing this situation and improving the accuracy of forecasts [9].

2.1.3 Cognitive biases in assessment

Established theory and past research indicate that optimism bias and anchoring bias contribute to inaccurate time and cost estimation. Over-optimism leads managers to expect that less time or fewer resources will be needed, leaving them unprepared when the project does not proceed as planned. Anchoring bias occurs when early estimates continue to be relied upon even though new information calls for revision. These biases lead teams to set unrealistic goals and subsequently run into problems such as delayed completion or higher expenses than initially planned. Reducing these biases requires analytical tools, including reference class analysis, which benchmarks the project against similar completed projects [4].

2.1.4 Intricacy of IT projects

Most IT projects today are large and intricate, comprising a multitude of participants, tasks and interconnections as well as fast-changing technologies. Implementing multiple systems or coordinating distributed teams, for instance, introduces levels of complexity that simple static models do not capture. Concurrent activities also interfere with one another: when one activity depends on another, accelerating or delaying one can disrupt the other [8]. This makes it difficult to determine how much the undertaking will cost and how long it will take. Scheduling tools that model these interdependencies and rely on machine learning and predictive analytics can derive more accurate forecasts and update them as the project proceeds.

2.1.5 Insufficient accounting for risk

Probabilistic factors such as technical difficulties, scope expansion or resource shortages are frequently downplayed during project estimation. When these risks materialize, they push schedules and expenditures in a very different direction from the plan. For example, compatibility problems between software components or regulatory changes can consume time and incur extra expenses. Most traditional estimation techniques do not incorporate dynamic risk assessment, leaving projects exposed to such incidents. Tools such as Monte Carlo simulation and sensitivity analysis can be applied during estimation to handle uncertainty adequately and thereby improve the reliability of the forecast [10].

2.2 Resistance to data-driven approaches

2.2.1 Lack of awareness and training

Many organizations resist data tools because they lack awareness and training. Few workers can operate analytical applications such as machine learning or predictive modelling owing to their complexity. Without structured training programmes, the tools remain underused and opportunities to make better decisions for project completion remain untapped.

2.2.2 Fear of replacement and loss of control

Data-driven decision making generates a degree of fear among employees and managers, who feel that their expertise in certain tasks is being handed over to data-analysing systems. This produces resistance, expressed as suspicion and weak teamwork, since it is difficult to collaborate under such apprehension. Countering it requires cultivating a culture of inclusion and a clear understanding, promoted by management, of why data should guide the organization.

2.2.3 Doubts about data reliability and accuracy

Lack of confidence in the figures and analytical models gives rise to doubts that can be attributed to the weak data management regimes prevailing in most organisations today. Untimely or inconsistent datasets erode confidence, and teams consequently distrust the forecasts. Organizations can counter this through normalization and validation of the data collected.

2.2.4 Impact of the problem

• Underutilization of Tools: Resistance prevents organizations from capturing the value that tools such as predictive analytics bring to scheduling, resource usage, and risk profiling.

• Delayed Decision-Making: Without analytics, risks are detected only after they occur, so decisions take longer and projects slip.

• Inconsistent Outcomes: Conventional methods are unstructured and therefore produce uneven results across diverse projects. Automated tools process data consistently, making the process more effective and reliable.

3. Solutions for Imprecise time and cost management

3.1 Predictive analytics and AI-driven data tools

With the help of Big Data, predictive analytics and AI tools, forecasting in IT project management is no longer limited to extrapolating past and present data. Tools such as SmartPM and Azure Machine Learning perform predictive analysis of delays, costs and resources, allowing schedules to be optimised before problems occur and integrating with project systems to trigger rescheduling [5]. They reduce risk events and support organizational control over project efficiency [6].
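As an illustration of the kind of analysis such tools automate, the following minimal sketch (Python) trains a regression model on historical project records to forecast schedule overruns. The file name, feature columns and model choice are illustrative assumptions made for this sketch, not details of SmartPM or Azure Machine Learning.

# Hypothetical sketch: forecast schedule overruns from historical project data.
# The CSV layout and feature names below are assumptions, not a real dataset.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("past_projects.csv")  # one row per completed project
features = ["planned_effort_days", "team_size", "change_requests", "dependencies"]
X, y = df[features], df["schedule_overrun_days"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)
print("MAE on held-out projects:", mean_absolute_error(y_test, model.predict(X_test)))

# Forecast the overrun for a new project before it starts.
new_project = pd.DataFrame([{"planned_effort_days": 320, "team_size": 8,
                             "change_requests": 12, "dependencies": 25}])
print("Predicted overrun (days):", model.predict(new_project)[0])

In practice the feature set would come from the organization's own project repository, and the forecast would be refreshed as new status data arrives.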

Fig 1. Co-occurrence of keywords (reference 9)

3.2 Reference class forecasting

Reference class forecasting (RCF) uses a benchmarking approach to neutralize biases such as optimism and anchoring by comparing the current project with similar completed projects. By reviewing outcomes from past projects, RCF refines time and cost estimates and allows the manager to base final decisions on real figures [3].

How it works:

Step 1 - Define the Reference Class: Identify a set of past projects that are similar in scope, complexity, and industry to the current project.

Step 2 - Analyse Historical Outcomes: Examine the actual performance of these projects, focusing on time, cost, and resource usage.

Step 3 - Apply Adjustments: Use statistical techniques to adjust the initial estimates based on the historical data, accounting for variability and risk factors [9]; a sketch of this adjustment step is shown below.
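A minimal sketch (Python) of the adjustment step, assuming a hypothetical reference class of completed projects whose actual-to-estimated cost ratios are known; the ratios and the percentile chosen are illustrative only.

# Hypothetical reference class forecasting adjustment: uplift an initial cost
# estimate by the empirical overrun distribution of similar completed projects.
import numpy as np

# Actual-cost / estimated-cost ratios observed in the (hypothetical) reference class.
overrun_ratios = np.array([1.05, 1.20, 0.98, 1.35, 1.10, 1.60, 1.15, 1.25, 1.40, 1.08])

initial_estimate = 500_000  # the project's own bottom-up cost estimate, in EUR

# Pick the level of certainty required (e.g. an 80% chance of staying within budget)
# and take the corresponding percentile of the overrun distribution as the uplift.
uplift = np.percentile(overrun_ratios, 80)
adjusted_estimate = initial_estimate * uplift

print(f"Uplift factor at P80: {uplift:.2f}")
print(f"Reference-class adjusted budget: {adjusted_estimate:,.0f} EUR")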

3.3 Hybrid project management approach

Hybrid project management combines and adapts conventional and modern approaches to schedule and cost estimation. Frameworks such as Water-Scrum-Fall pair a detailed initial planning phase with iterative and incremental practices. This approach lets teams work flexibly while progress is still tracked through WBS and Gantt diagrams [5].

Fig 2. Water-Scrum-Fall diagram (reference 1)

Fig 3. Waterfall-Agile diagram (reference 1)

3.4 Monte Carlo simulation

Monte Carlo analysis is a probability-based technique that reduces the risk of time and cost underestimation by examining a large number of scenarios. Whereas deterministic techniques provide a single figure, Monte Carlo simulations take into account the variability of factors such as activity durations, resources, and costs. The model produces an array of possible outcomes that reflects the risks involved; by running many simulations, a probability distribution of the result is obtained, enabling the project manager to reason about a wide range of possibilities [9][10].

ОФ "Международный научно-исследовательский центр "Endless Light in Science"

How it works:

> Define Input Variables: Identify the variables in the project that carry uncertainty, for instance the time expected to be spent on tasks or the cost of resources.

> Assign Probability Distributions: Specify a range and likelihood for each variable (e.g., a task may take between 5 and 10 days, with 70% confidence that it will take 7 days).

> Run Simulations: The system typically runs thousands of iterations in which input values are sampled at random, in order to predict results such as the total project duration or the total project budget.

> Analyse Results: The output is a probabilistic graph in which each outcome carries a certain probability (for example, "there is a 90% chance the project will be complete within 120 days"). A sketch of such a simulation follows below.
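A minimal Monte Carlo sketch (Python) under stated assumptions: three sequential tasks with triangular duration distributions and a normally distributed daily cost; all figures are illustrative, not taken from the cited references.

# Minimal Monte Carlo simulation of project duration and cost.
# All task parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=7)
N = 10_000  # number of simulated project runs

# Three sequential tasks with (optimistic, most likely, pessimistic) durations in days.
tasks = [(4, 7, 12), (10, 15, 25), (6, 9, 16)]
durations = sum(rng.triangular(lo, mode, hi, size=N) for lo, mode, hi in tasks)

# Daily cost assumed normally distributed around 2,000 EUR.
daily_cost = rng.normal(loc=2_000, scale=250, size=N)
costs = durations * daily_cost

print(f"P50 duration: {np.percentile(durations, 50):.1f} days")
print(f"P90 duration: {np.percentile(durations, 90):.1f} days")
print(f"P90 cost:     {np.percentile(costs, 90):,.0f} EUR")
print(f"P(duration <= 45 days) = {(durations <= 45).mean():.0%}")

The percentile outputs correspond to the probabilistic statements described above, for example a budget figure that is not exceeded in 90% of the simulated runs.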

Fig 4. Simulated project cost [× 1,000 €], Monte Carlo results (reference 10)

Fig 5. Simulated project duration [days], Monte Carlo results (reference 10)

3.5 Real-time earned value management (EVM)

Real-time EVM is a real-time application of Earned Value Management [2] that continuously blends updated time and cost estimates for a project. In contrast to ordinary EVM, real-time EVM tracks progress by constantly analysing live inputs such as resource utilisation, completion rates, and costs. Using this method, project managers can spot deviations early enough to correct the project before it diverges from its intended path [2][6].

Core Components of Real-Time EVM:

> Cost Performance Index (CPI): Assesses cost efficiency by comparing the value of work performed (EV) with the actual cost incurred (AC).

> Schedule Performance Index (SPI): Indicates schedule status, i.e. whether the project is on schedule or running behind, by comparing the value of work performed (EV) with the planned value (PV).

> Exponential Smoothing: A statistical technique used in real-time EVM to refine performance predictions so that emerging trends in the project are captured in the analysis (a sketch combining these components follows after Fig 6).

Fig 6. Conceptual chart of earned value management (PV = Planned Value, AC = Actual Cost, EV = Earned Value) (reference 2)
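A minimal sketch (Python) of the real-time EVM calculations described above, with exponential smoothing of the cost performance index used to project the estimate at completion (EAC); the weekly figures and the smoothing factor are illustrative assumptions.

# Minimal real-time EVM sketch: compute CPI/SPI from periodic status snapshots
# and smooth CPI exponentially to forecast the estimate at completion (EAC).
budget_at_completion = 1_000_000  # EUR (illustrative)

# Periodic snapshots of (planned value, earned value, actual cost), e.g. weekly.
snapshots = [
    (100_000,  90_000,  95_000),
    (200_000, 170_000, 190_000),
    (300_000, 250_000, 290_000),
    (400_000, 340_000, 400_000),
]

alpha = 0.4           # smoothing factor: higher = more weight on recent performance
smoothed_cpi = None

for week, (pv, ev, ac) in enumerate(snapshots, start=1):
    cpi = ev / ac     # cost efficiency: value earned per unit of actual cost
    spi = ev / pv     # schedule efficiency: value earned vs. value planned
    smoothed_cpi = cpi if smoothed_cpi is None else alpha * cpi + (1 - alpha) * smoothed_cpi
    eac = budget_at_completion / smoothed_cpi  # projected total cost at completion
    print(f"Week {week}: CPI={cpi:.2f}  SPI={spi:.2f}  "
          f"smoothed CPI={smoothed_cpi:.2f}  EAC={eac:,.0f} EUR")

A CPI or SPI persistently below 1.0 signals the early deviation the section describes, and the smoothed EAC shows how far the final cost is drifting from the approved budget.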

4. Solutions for Resistance to Data-driven Approaches

4.1 Training and Change Management

Lack of awareness, poorly defined concepts of what analytics is, and lack of technical expertise are major contributors to organizational resistance. To overcome these challenges and ensure effective diffusion of the solutions, training and change management processes are essential. Role-based training is recommended so that executives, project managers and analysts each gain the skills needed to integrate analytics tools into their working environments. Project managers, for instance, focus on the concepts and use of predictive models, while technical personnel build experience through hands-on data analysis exercises.

Training built around real-life examples is effective for demonstrating how analytics tools work and are used, addressing issues that are common and practical for the project or organization, and thereby establishing the tools' relevance to better decision-making. Such strategies reduce doubt, increase confidence and inclusiveness across the organisation, and promote widespread use [4].

4.2 Transparent communication and Inclusion

The perception that change will result in job losses, loss of control, or excessive complexity in the use of analytical tools hinders change within teams. Openness is essential to dispel such fears and rebuild trust. Managers and decision makers have to state the goals of operational data solutions clearly: improving operational efficiency, reducing risk, and making better decisions.

It is equally important to point out that while analytics tools facilitate decision making, they are not a substitute for judgement: the models supply information to managers, but decisions remain with the team. Involving employees in the selection and adoption of these tools further helps organizations keep teams aligned with organizational goals and objectives [6][8].

4.3 Data governance and reliability assurance

Resistance to data-driven approaches is often rooted in skepticism about the accuracy and reliability of the data used for predictions and insights. Data governance and reliability assurance are critical to overcoming this resistance by ensuring that the data used for analysis is trustworthy, consistent, and actionable. Establishing clear data management practices instills confidence in analytics tools and their outputs [7].

Key Components of Data Governance:

> Standardisation: Implement standardised protocols for collecting, storing, and processing data to ensure consistency across projects and departments.

> Validation: Establish robust quality control measures to verify data accuracy, completeness, and relevance before it is used for forecasting or decision-making (a sketch of such checks follows this list).

> Transparency: Provide visibility into data sources, processing methods, and any limitations or assumptions made in the analysis.
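As an illustration of the validation component above, a minimal sketch (Python) of automated quality checks applied to project records before they feed forecasting; the column names and thresholds are hypothetical assumptions, not a prescribed governance standard.

# Hypothetical data quality checks run before project records are used for forecasting.
# Column names and thresholds are illustrative assumptions.
import pandas as pd

def validate_project_data(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues found in a project dataset."""
    issues = []

    required = ["project_id", "start_date", "end_date", "planned_cost", "actual_cost"]
    missing_cols = [c for c in required if c not in df.columns]
    if missing_cols:
        return [f"missing columns: {missing_cols}"]

    # Completeness: flag columns with more than 5% missing values.
    for col in required:
        share_missing = df[col].isna().mean()
        if share_missing > 0.05:
            issues.append(f"{col}: {share_missing:.0%} missing values")

    # Consistency: end dates must not precede start dates, costs must be non-negative.
    if (pd.to_datetime(df["end_date"]) < pd.to_datetime(df["start_date"])).any():
        issues.append("some end_date values precede start_date")
    if (df[["planned_cost", "actual_cost"]] < 0).any().any():
        issues.append("negative cost values found")

    # Uniqueness: duplicate project identifiers distort historical baselines.
    if df["project_id"].duplicated().any():
        issues.append("duplicate project_id values found")

    return issues

Records that fail such checks would be corrected or excluded before forecasting, which is what keeps the downstream analytics trustworthy.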

4.4 Gamification and Incentives for Adoption

Lack of motivation or enthusiasm among team members is often the primary reason teams resist data-driven approaches. Gamification and incentives can address this by making the adoption of a new way of working engaging in itself. Introducing elements of competition, recognition, and rewards around the use of analytics tools helps organizations reduce the level of resistance [8].

Key Gamification Strategies:

> Leaderboards and Competitions: Group workers into teams and structure friendly competition around how often the tools are used or how many data-driven tasks are completed.

> Challenges and Quests: Set specific challenges, such as producing a reliable guide that explains how to address a particular problem using predictive analytics, and offer tangible rewards or recognition to those who complete the material and the related assessment tasks.

4.5 Simplifying Tools and Customising Dashboards

One further reason such approaches meet resistance is the sophistication of the analytics tools themselves, which many people find hard to handle, let alone use. The best way to reduce resistance from users is to simplify the tools and develop dedicated dashboards for each role [8]. When the tools are presented in an easily understandable format and the information shown is relevant to a particular role and its concerns, staff can incorporate them into their work processes far more easily.

Main Strategies for Simplification:

> Intuitive Design: Work with vendors to make analytics tool interfaces easy to navigate and report formats easy to comprehend.

> Role-Based Dashboards: Customise dashboards to the positions of the users; for instance, project managers can monitor cost performance while team leaders monitor task and resource progress.

5. Conclusion

Effective management of IT projects is vital for organizations facing limited time, scarce financial resources and high quality demands in an uncertain business environment. However, challenges including imprecise time and cost estimation, ineffective resource management and reluctance to adopt data-driven solutions present roadblocks to project success [1].

Practical solutions exist for these problems, including predictive analytics, reference class forecasting, Monte Carlo simulation and real-time EVM. These methods use historical data to improve accuracy, account for risk and add flexibility so that projects remain on course [4].

Resistance to the use of analytics must be addressed through proper training, clear communication, and an appropriate set of applications. Adoption can then be encouraged through gamification, while strong data governance further ensures trust and ease of use. Applying data analytics to project management moves organizations from a reactive to a proactive approach, improving project management success, risk management and value delivery [6]. These approaches provide effective problem-solving strategies for future needs while improving project output and customer trust.

REFERENCES

1. Reiff, J. & Schlegel, D. (2020). Hybrid project management - A systematic literature review. Vol 38, Issue 4, pp 123- 135 International Journal of Project Management

Affiliation: Organisation and Project Office, University of Oxford, Oxford, United Kingdom. Email: The authors can be contacted at: [email protected] & [email protected]

iНе можете найти то, что вам нужно? Попробуйте сервис подбора литературы.

2. Zhao, M., & Ziw, X. (2021). This is about conducting Earned Value Management supported by exponential smoothing techniques for anticipating the cost of a project. Journal of Project Management 2003: Vol 15 no 2, pp 45-58.

Affiliation: Department of Management, Tsinghua University, Beijing, China. E-mail: [email protected]; [email protected]

3. Themson,N.(2019).The processes of public mega project cost estimation:The problem of the reference class forecasting error. Public Administration Review [serial online],78(3),pp.234-247.

Affiliation: Harvard University/ John F. Kennedy School of Government/ School of Public Policy, Cambridge, United States. E-mail: [email protected]

4. Pantovic, V., Vidojevic, D., Vujicic, S., Sofijanic, S., & Jovanovic-Milenkovic, M. (2021). A conceptual model of information technology project management success using data analysis: an empirical approach. Sustainability, 13(6), 1234.

Affiliation: University of Belgrade, Serbia .E-mail: [email protected]; [email protected]; [email protected]; [email protected]; [email protected]

5. Summerfield NJ, Zhang L, Motiwalla L, Mai T, Mazza K. Applying data to forecast project management outcomes. Journal of Applied Management 2009, vol. 19 no. 2, pp 100-112. Affiliation: University: Northeastern University, Location: Boston, United States. E-mail: [email protected]; [email protected]; [email protected]; [email protected]; [email protected]

6. Gajera, R. (2024). SmartPM's AI analytics in the prediction and management of delays in infrastructure developments.Infraestructure Project Management Journal,Vol.22,No 3,pp.56-70. Affiliation: University of Mumbai, Mumbai, India, and Indian Institute of Technology, Mumbai, India. E-mail: [email protected]

7. A. K Bousdekis, Konstantina Lepenioti, Dimitris Apostolou, Giorgos Mentzas, 2021 A synthesis of the literature on the use of big data to maintain Industry 4.0 decision support systems. Industrial Systems Research: An International Journal, Vol 29, No. 4, 87-102.

Affiliation: A prototype of an Enhanced Video Abstract (EVA) to support the presentation of research findings through video has been developed at the National Technical University of Athens in Greece. E-mail: [email protected]; [email protected]; [email protected]; [email protected]

8. Rane, N. L. (2023). Thus, we identify the role and the challenge of existing and future ChatGPT and similar generative AI in business management. AI & Business Management Journal/ Vol 11/ No 1/2019: 15-29. Affiliation: University of Muhammad, Mumbai, India. E-mail : [email protected]

9. Batseiler, J., & Vanhoucke, M. (2016) Field implementation and testing of reference class forecasting in project management. International Journal of Project Analytics, 14(3), 210-225. Affiliation: School of Engineering, University of Ghent, Belgium. E-mail: [email protected] and [email protected]

10. Zwikael, O., Globerson, S., & Raz, T. Estimation of model parameters using alternative methods, Journal of Construction Engineering and Management, 126, 416-422. Assessment of models for final cost prognosis of a project. International Journal of Project Estimation, 18(5), page 78-95.

Affiliation: The Hebrew University of Jerusalem in the country of Israel. E-mail: [email protected]; [email protected]; [email protected]

ОФ "Международный научно-исследовательский центр "Endless Light in Science"
