1. Introduction
1.1 Background
Modern software can contain an enormous number of lines of code and be developed by several developers at the same time. Because of that, a version control system that keeps track of the work and controls it becomes crucial (Ruparelia, 2010). This vital role means a version control system can contribute to improving performance; on the other hand, it can also be a source of frustration (De Rosso and Jackson, 2013).
The history of version control systems began with Local Version Control Systems, which were used mainly to copy files into another directory. Thereupon Centralized Version Control Systems appeared and made it feasible to collaborate with developers on other systems; these were implemented with a single server that contained all the versioned files and a number of clients. In order to avoid the drawbacks of centralization, Distributed Version Control Systems, of which Git is one, were developed. These systems allow every client to keep a full copy of the whole repository (Chacon, 2009).
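As a minimal sketch of what "a full copy of the whole repository" means in practice (the repository URL and directory name below are hypothetical, chosen only for illustration), the following Python snippet drives Git through the standard subprocess module: after a single clone, the complete commit history can be read locally without contacting the server again.

```python
import subprocess

def clone_and_inspect(repo_url: str, target_dir: str) -> None:
    """Clone a Git repository and read its full history from the local copy.

    Because Git is distributed, the clone contains the entire repository,
    so the history can be listed without contacting the server again.
    """
    # One-time network operation: copy the whole repository locally.
    subprocess.run(["git", "clone", repo_url, target_dir], check=True)

    # Purely local operation: the complete commit history is already here.
    log = subprocess.run(
        ["git", "-C", target_dir, "log", "--oneline"],
        check=True, capture_output=True, text=True,
    )
    print(log.stdout)

if __name__ == "__main__":
    # Hypothetical repository URL and directory, used only for illustration.
    clone_and_inspect("https://example.com/project.git", "project")
```

Under a centralized system, by contrast, the equivalent of the second step would require a round trip to the server.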
ClearCase and Git are two of the best-known version control systems. The former was launched in the 1990s and is now offered by IBM; it supports parallel and geographically distributed software development (Allen et al., 1995). The latter was launched in 2005 by the Linux kernel developers as a free and open-source distributed version control system (Ruparelia, 2010).
According to O’Sullivan (2009), the choice of a version control system is a significant decision for each company that needs one: it should suit the type of work done by the company. Moreover, for a software developer, replacing one version control system with another might affect the development environment. For instance, replacing ClearCase with Git might involve both challenges and benefits. One study conducted by De Rosso and Jackson (2013) tried to shed light on the root causes of Git’s complexity by analyzing its conceptual model and identifying some undesirable properties. Yet no research has been done to address the differences between these two systems or the difficulties and benefits associated with each of them. This lack of knowledge raises the question of how to tell whether such a replacement is a step toward a better development environment or not. If these benefits and challenges were addressed clearly, the question could be answered more easily.
1.2 Research objectives
This study describes the potential practical benefits and drawbacks of using Git instead of ClearCase as a version control system, from the software developers’ point of view.
The Software Development Life Cycle is seldom used at my place of work. Unfortunately, recent developments in its use are deemed confidential. Due to this fact, this paper will examine in general terms one of the projects we are undertaking right now while at the same time attempting to maintain our confidentiality.
File management is used to keep, send and receive files efficiently so that files can be accessed easily. Using naming conventions and good file structuring are some of the ways people organize files and make it easier to access their work. Without good file management, users waste a lot of time and effort trying to find the right files. A good example of good file management is putting all the Python files in one folder rather than having them mixed up with other file types, as sketched below.
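As a small illustration of that idea (the folder name "my_project" and the extension-based layout are only assumptions for the example), the following Python sketch groups the files in a directory into subfolders by extension, so that, for instance, all Python files end up in one folder.

```python
from pathlib import Path
import shutil

def organize_by_extension(directory: str) -> None:
    """Move every file into a subfolder named after its extension.

    For example, all .py files end up in <directory>/py/, so the Python
    sources are no longer mixed up with other file types.
    """
    root = Path(directory)
    for path in list(root.iterdir()):
        if path.is_file() and path.suffix:
            target = root / path.suffix.lstrip(".")  # e.g. "py", "txt", "csv"
            target.mkdir(exist_ok=True)
            shutil.move(str(path), str(target / path.name))

if __name__ == "__main__":
    organize_by_extension("my_project")  # hypothetical project folder
```

Combined with consistent file names, a structure like this keeps related work together and makes it quicker to locate the right file.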
Standardize procedures and project management, for example by using the same language and the same encoding and decoding of software across the team.
Share information across all parties but do not assume mutual trust; all tools that are not open source are treated with suspicion.
GitHub is a way for people to share open source code. It is a powerful and sophisticated web-based repository system for developing software projects. It uses the Git revision control system and offers both paid plans for private repositories and free accounts for open source projects [1]. GitHub has been the most popular and best-known code repository site for open source projects. Git itself was developed by Linus Torvalds. Before going into the core explanation of GitHub, it is better to describe the term “version control” system. Its users may be designers, developers, or anyone else who works with code, and they all share a few common tasks, such as the ones sketched below.
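As a hedged, minimal sketch of those common tasks (the directory name, file name, commit message, and user identity below are invented for the example), the following Python code uses the standard subprocess module to drive Git: start tracking a project, record a change, and review the history.

```python
import subprocess
from pathlib import Path

def run_git(*args: str, cwd: str) -> str:
    """Run a git command in the given working directory and return its output."""
    result = subprocess.run(["git", *args], cwd=cwd, check=True,
                            capture_output=True, text=True)
    return result.stdout

def record_change(workdir: str) -> None:
    """Common version-control tasks: track a file, commit it, review the history."""
    Path(workdir).mkdir(exist_ok=True)
    run_git("init", cwd=workdir)                          # start tracking the project
    run_git("config", "user.name", "Demo User", cwd=workdir)          # local identity,
    run_git("config", "user.email", "demo@example.com", cwd=workdir)  # illustrative only
    Path(workdir, "notes.txt").write_text("first draft\n")
    run_git("add", "notes.txt", cwd=workdir)              # stage the new file
    run_git("commit", "-m", "Add first draft", cwd=workdir)
    print(run_git("log", "--oneline", cwd=workdir))       # review what was recorded

if __name__ == "__main__":
    record_change("demo_repo")  # hypothetical directory name
```

Pushing such a local repository to GitHub adds collaboration on top of these basic tasks, which remain the same regardless of who performs them.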
Proprietary software is defined as computer software on which the producer has set restrictions on use, private modification, copying, or republishing. Open source and free software are essentially the opposite: the source code is made available, which permits users to use, change, improve, and redistribute it in modified or unmodified form. These definitions first led me to believe that proprietary software was more secure than the alternatives because its code is not available. Recent observations show, however, that even proprietary software developers have recognized how successful open source development has been and have begun incorporating open source strategies into their business models. [3] These observations have led to the development of hybrid software that has elements of both proprietary and open source software. As a matter of fact, we are alread...
b. Identify appropriate project management tools and describe how they can be accessed. Provide an example of such a tool and outline how it relates to your project.
Consequently, the Waterfall model consists of seven procedural steps followed in linear order, with small gates where information, specifications, and designs are reviewed. The seven procedural steps performed by software companies, according to Lotz (2013), are: “1. Gather and document requirements, 2. Design, 3. Code and unit test, 4. Perform system testing, 5. Perform user acceptance testing (UAT), 6. Fix any issues, and 7. Deliver the finished product.” The Waterfall methodology’s clear and defined linear plan gives development teams distinct guidelines for each phase of development, but the methodology still has pros and cons. Its advantages are the discipline provided by the procedural phase structure, the ease with which the vendor and client can identify the team’s current phase, and efficient knowledge transfer between team members (Melonfire, 2008). Its disadvantages are that the phases are not flexible to change, developers cannot return to a previous phase, and originally developed designs may turn out not to be feasible. Finally, this lack of flexibility makes Waterfall appropriate for well-defined projects and for projects with a fixed price, a fixed timeline, and a non-adjustable scope (Base36).
1.0 Introduction
In this report I will be concentrating on the failure of software systems. To understand why software systems fail, we first need to understand what software systems are. Software systems are a type of information system, because a software system is essentially a means for hardware to process information. Flynn’s definition of an information system is: "An information system provides procedures to record and make available information, concerning part of an organization, to assist organization-related activities." Humans have been processing information manually for thousands of years, but the vast increase in the demand for knowledge this century has meant that a new method of information processing is needed. Software systems have provided a new means that is much faster and more efficient.
A cooperative IS, according to Massimo Mecella et al., “is a large scale information system that interconnects various systems of different and autonomous organizations, sharing common objectives” [30]. The main problems with these information systems are the many copies of the same objects (duplicate copies) and the possibility that poor data quality from one source spreads through the cooperating systems. It is therefore very important for the individual information systems to be trusted.
In Michiel van Genuchten’s case study, “Why is Software Late? An Empirical Study of Reasons for Delay in Software Development”, he explores the reasons why this might be the case. The study took place over approximately two years in a software development department concerned with the “development and integration of the system software in the operating system and data communicating fields”. One hundred and seventy-five engineers were responsible for roughly three hundred products. It is pointed out that the time spent on each activity was difficult to measure; as a result, there was a lack of responsibility and of documentation of recorded hours, and the difference between the data held by the project leaders was too large to be credible. The most significant finding of this case study is that only thirty percent of activities were finished according to plan. It was discovered that there was an ever-growing difference between subsequent phases of the project, and forty-five percent of the cause of these overruns was attributed to organizational issues. To conclude the study, various surveys were distributed to try to figure out why the phase estimates were off by such a large amount. It turns out that many respondents described an over-optimistic plan and an underestimation of complexity and
Detailed requirements that a small organization (fewer than 50 employees) can use to select project management software.
To improve the quality of work done by developers at the individual and team level, Humphrey developed the Personal Software Process (PSP) and the Team Software Process (TSP). PSP shows software engineers how to plan and track their work, and how the consistent practice of time management, good software engineering practices, and data tracking can lead to high-quality software. In an interview, Humphrey states that “Developers using PSP improve the quality of their work at...
A crucial component of the software development process, software documentation serves to describe the various operations or uses of computer software or source code. Commonly referred to as user guides or technical manuals, software documentation revolves around the explanation of software-related features and information, based on material published by Marie Keenan, contributor to the Salem Press Encyclopedia (Keenan, 2016). Evolving from printed manuals to vast electronic databases, software documentation encompasses nearly every component necessary to successfully use a given program or source code, and is available in an array of different formats and languages. From large-scale, company-based software development, to small-scale, personal-based
According to Goknil [6], the requirements of a system cannot be static; they are prone to change, and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations are made to the architecture and source code of the system. The process of integrating the new or modified requirements and adapting the software system is called change management. The size and complexity of software systems make change management costly and time consuming, so to reduce the cost of changes it is important to apply change management as early as possible in the software development cycle. Requirements traceability is considered crucial in change management for establishing and maintaining consistency between software development artifacts. It is the ability to link requirements back to stakeholders’ rationales and forward to corresponding design artifacts, code, and test cases. When changes to the requirements of the software system are proposed, the impact of these changes on other requirements, design elements, and source code should be traced in order to determine the parts of the software system to be changed.
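As an illustrative sketch only (the requirement, design, code, and test names below are invented), traceability links can be represented as a directed graph from requirements to design elements, source code, and tests, and a simple traversal then approximates the change impact analysis described above by returning every artifact reachable from a changed requirement.

```python
from collections import deque

# Hypothetical traceability links: each artifact maps to the artifacts
# derived from it (requirement -> design element -> source code -> test).
TRACE_LINKS = {
    "REQ-1": ["DES-login", "REQ-2"],
    "REQ-2": ["DES-session"],
    "DES-login": ["code/auth.py"],
    "DES-session": ["code/session.py"],
    "code/auth.py": ["tests/test_auth.py"],
    "code/session.py": ["tests/test_session.py"],
}

def impacted_artifacts(changed_requirement: str) -> set:
    """Return every artifact reachable from a changed requirement.

    A breadth-first traversal over the traceability links approximates the
    change impact analysis described in the text above.
    """
    impacted, queue = set(), deque([changed_requirement])
    while queue:
        current = queue.popleft()
        for successor in TRACE_LINKS.get(current, []):
            if successor not in impacted:
                impacted.add(successor)
                queue.append(successor)
    return impacted

if __name__ == "__main__":
    # Changing REQ-1 potentially affects REQ-2 plus the linked designs, code, and tests.
    print(sorted(impacted_artifacts("REQ-1")))
```

Keeping such links up to date is what allows the cost of a proposed change to be estimated before the change is made.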