Information And Communication Technology And Its Impact On Global Business
Information and Communication Technology (ICT) is changing how enterprises operate in the global economy and is creating competitive advantages for international enterprises. According to Laudon and Laudon (2016), ICT is the biggest driver of innovation in the present-day economy, and along with innovation it drives economic growth and productivity. ICT has a positive impact on the global elements of society and business, and it transforms those entities. Innovations in ICT have greatly increased the value generated in the global economy in recent years. Economic structures, jobs and the viability of business models are tied to the pace of technological innovation and to how seamlessly technology spreads across nations; some jobs are created and others are modified by technological innovation. ICT has reshaped global value chains and production in the global economy (Brettel et al., 2014), and the ICT industry pioneered the internationalization of dispersed global production systems. The main way ICT creates competitive advantage in the global economy is communication: the enormous advances in communication and information sharing have enabled organizations to share data worldwide, so an organization can track the activities and stock of an office in another country easily and accurately. Another major competitive advantage enabled by ICT is decision making; with accurate data, enterprises can make critical decisions accurately.
Advantages and Disadvantages of Outsourcing IT
The advantages of outsourcing IT are as follows.
- Management can defer IT-related decisions and day-to-day IT operations to a specialized organization.
- By outsourcing IT, client organizations can focus on their primary and core concerns while the expert organization handles the IT solutions (Toffolutti et al., 2017).
- Because the outsourcing organizations are experienced and employ IT experts, they can provide better and more effective IT solutions to the client organization.
- Access to world-class capabilities is a major advantage of outsourcing. Because these third-party service providers are highly specialized in their work, they bring world-class resources to meet client requirements.
- Outsourcing lets organizations invest in information and communication technology in different ways, according to their own business goals and objectives.
The disadvantages of outsourcing IT are as follows.
- If the outsourcing provider is not selected properly, the overall quality of the work can fall below acceptable standards
- Communication between the clients and the service providers can be difficult, and it has often been seen that communication breaks down completely
- The organizations have to share crucial business data with an external entity, which creates or increases the possibility of information leakage (Lacity, Yan & Khan, 2017)
- Poor communication between the client and the outsourcing organization can result in unrealistic or missed timelines
Benefits and Challenges of Virtual Storage Platforms
The benefits associated with virtual storage (cloud) platforms are as follows.
- The cloud relieves organizations of the cost of the hardware and software essential for system implementation; the cloud service provider is responsible for provisioning the required hardware and software.
- Virtual storage platforms are easy to access from any location, so end users can reach the system from anywhere with a smartphone and a working internet connection.
- Virtual storage platforms are easy to maintain because the provider handles all aspects of the technology (Hashem et al., 2015); the provider installs packages and updates the system as soon as a new version of the installed software is released.
- Because the cloud allows organizations to collaborate easily, virtual storage platforms are well suited to organizations that share data with one another; this practice is most visible in the medical industry.
The challenges and risks associated with virtual storage platforms are as follows.
- Loss of control over sensitive data is the most common risk of virtual storage platforms; even the biggest platforms face this security issue regularly, and complete control over organizational data is crucial for enterprises.
- Theft of sensitive data is another major issue; in the worst case, the service provider itself could sell crucial business data to competitors (Rittinghouse & Ransome, 2016).
- Data stored on virtual storage platforms travels between the cloud and the user and vice versa, which creates a serious security concern while the data is in transit over the open internet (see the encryption sketch after this list).
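One common mitigation for the loss-of-control and data-in-transit risks listed above is to encrypt data on the client side before it ever reaches the provider. The following is a minimal sketch only, not the practice of any particular platform discussed here; it assumes the widely used Python cryptography package, and upload_to_cloud is a hypothetical stand-in for whatever SDK a chosen provider actually offers.

```python
# Minimal sketch: encrypt a record on the client before uploading it to a
# virtual storage platform, so the provider only ever sees ciphertext.
from cryptography.fernet import Fernet  # pip install cryptography


def upload_to_cloud(name: str, payload: bytes) -> None:
    """Hypothetical stand-in for a real provider SDK call."""
    print(f"Uploading {len(payload)} encrypted bytes as '{name}'")


# The key must be kept by the client (e.g. in a key-management service),
# never stored alongside the data on the storage platform itself.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"client: Jane Doe, item: Victorian clock, price: 1200"
ciphertext = cipher.encrypt(record)
upload_to_cloud("sales/record-001.bin", ciphertext)

# On download, the same key recovers the original record.
assert cipher.decrypt(ciphertext) == record
```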
There are four entities in the ERD: client, antiques, sell and buy. These four entities have been created to store all the data about the business process in a database or another central location. The client entity will store all the data about a client that the organization requires to conduct its business processes. Each table has a unique identification attribute that serves as the primary key in the database; the primary key identifies each row of a record, and the database locates rows in the tables through these keys. As an example, if two people have the same name and live at the same address, the system could not identify a row uniquely from those values alone; the primary key distinguishes the two otherwise similar rows.
In the sell and buy tables it has been assumed that the client ID and antiques ID are references (foreign keys). The foreign keys connect the client and antiques tables with the buy and sell tables, so each record shows which client has sold or bought which antique, which is essential for tracking the business processes. Stillwater Antiques wants different information for buying and selling, so a few of the attributes differ, but otherwise the basic layout of the two entities is the same.
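To make the primary-key and foreign-key relationships concrete, the sketch below expresses one plausible reading of the ERD as SQL tables created through Python's built-in sqlite3 module. The column names beyond the keys are illustrative assumptions rather than part of the original design.

```python
# Illustrative schema for the four ERD entities (client, antiques, buy, sell).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE client (
    client_id   INTEGER PRIMARY KEY,   -- unique identification attribute
    name        TEXT NOT NULL,
    address     TEXT
);

CREATE TABLE antiques (
    antique_id  INTEGER PRIMARY KEY,
    description TEXT NOT NULL
);

-- 'buy' and 'sell' record which client bought or sold which antique.
CREATE TABLE buy (
    buy_id      INTEGER PRIMARY KEY,
    client_id   INTEGER NOT NULL REFERENCES client(client_id),
    antique_id  INTEGER NOT NULL REFERENCES antiques(antique_id),
    buy_date    TEXT,
    buy_price   REAL
);

CREATE TABLE sell (
    sell_id     INTEGER PRIMARY KEY,
    client_id   INTEGER NOT NULL REFERENCES client(client_id),
    antique_id  INTEGER NOT NULL REFERENCES antiques(antique_id),
    sell_date   TEXT,
    sell_price  REAL
);
""")

# Two clients can share a name and address yet remain distinguishable,
# because each row carries its own primary key.
conn.executemany("INSERT INTO client (name, address) VALUES (?, ?)",
                 [("John Smith", "12 High St"), ("John Smith", "12 High St")])
print(conn.execute("SELECT client_id, name FROM client").fetchall())
conn.close()
```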
Problems of Un-normalized Tables and Concurrent Access
Un-normalized tables suffer from various anomalies, namely insertion, deletion and update anomalies. The user cannot insert data as intended or delete data from a table without unwanted side effects on the business records. The biggest problem is with updates, because redundancy and the resulting data inaccuracy are major issues in database management.
As an example, suppose an organization has an employee table as follows.
Employee (EmpID, EmpName, DeptName, DeptType)
Now, if the organization wants to record a department on its own, the database will not accept the insertion (Alotaibi & Ramadan, 2017): if the primary key (EmpID) is missing from the values to be inserted, the row cannot be stored. The result is that an organization cannot have a department without any employee. Many organizations have departments or roles that are staffed only for a short period, after which the employees are transferred to their designated roles; in such cases the department effectively has to be re-created every time. This is the insertion anomaly.
If the organization removes all the employees of a single department, transferring them to various other departments, and intends to fill the department later with newly recruited employees, the department name cannot remain in the database until at least one employee is registered in that department again (Coronel & Morris, 2016). Suppose training is complete and the staff are sent to their respective positions; until new trainees are registered, the training department will not exist in the database. This is the deletion anomaly.
When updating the data of an employee who fills a role in an existing department and is also undergoing training, any change made to one entry has to be repeated in every other entry for that employee (Alotaibi & Ramadan, 2017); if this is not done, the data become inaccurate. Likewise, if the type of a department changes, every entry for that department has to be changed. This is the update anomaly.
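As an illustration only (the tables are not part of the original case), the sketch below normalizes the flat Employee relation into separate Employee and Department tables, which removes the three anomalies just described; it uses the same sqlite3 approach as the earlier schema sketch.

```python
# Normalizing Employee(EmpID, EmpName, DeptName, DeptType) into two tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE department (
    dept_name TEXT PRIMARY KEY,
    dept_type TEXT NOT NULL
);

CREATE TABLE employee (
    emp_id    INTEGER PRIMARY KEY,
    emp_name  TEXT NOT NULL,
    dept_name TEXT REFERENCES department(dept_name)
);
""")

# Insertion anomaly gone: a department can exist with no employees yet.
conn.execute("INSERT INTO department VALUES ('Training', 'Temporary')")

# Deletion anomaly gone: removing the last trainee keeps the department row.
conn.execute("INSERT INTO employee (emp_name, dept_name) VALUES ('Asha', 'Training')")
conn.execute("DELETE FROM employee WHERE emp_name = 'Asha'")
print(conn.execute("SELECT * FROM department").fetchall())  # Training survives

# Update anomaly gone: the department type is stored exactly once.
conn.execute("UPDATE department SET dept_type = 'Permanent' WHERE dept_name = 'Training'")
conn.close()
```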
The concurrent access problems, and the ways of managing them, are described below.
Lost Update: Suppose two transactions update the same data element at the same time and one of the updates is lost (Leis, Kemper & Neumann, 2014). Transaction T1 increases the value of an attribute by 10 while another transaction, T2, reduces the same attribute by 20. If T1 executes its write before T2 (or vice versa) and the change has not been committed before the other transaction reads the value, the earlier update is overwritten and lost. This issue can be prevented by restricting a transaction from writing an attribute until the transaction currently using it commits its changes.
Uncommitted Data: This occurs when T1 is rolled back after T2 has already accessed its uncommitted data. Suppose T2 reads the latest value of an attribute, but T1 is then rolled back; T2 ends up processing inaccurate data. This issue can be avoided by allowing a transaction to access a value only after the previous transaction has completely finished.
Pessimistic Locking: Pessimistic locking prevents two or more processes from accessing a record in the database at the same time. When a process intends to load an object in order to update it, it places a lock on the corresponding record until the update is complete (a Python sketch of this idea follows the list below).
Inconsistent Retrievals: Inconsistent retrievals occur when a transaction reads some data before and some data after another transaction has written it, so the values it retrieves do not belong to a single consistent state.
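The sketch below reproduces the lost-update problem with plain Python threads and then prevents it with a pessimistic-style lock. It is a simplified analogy for illustration, not the transaction machinery of an actual DBMS; the sleep call merely widens the window in which a second "transaction" can interleave.

```python
# Lost update demonstration: two "transactions" read-modify-write a shared
# value; a lock standing in for a pessimistic record lock prevents the loss.
import threading
import time

balance = 100            # the shared attribute both transactions touch
lock = threading.Lock()  # stands in for a pessimistic record lock


def transaction(delta: int, use_lock: bool) -> None:
    """Read the balance, pause (simulating work), then write it back."""
    global balance
    if use_lock:
        lock.acquire()            # block other writers until we "commit"
    current = balance             # read
    time.sleep(0.05)              # another transaction can sneak in here
    balance = current + delta     # write
    if use_lock:
        lock.release()


def run(use_lock: bool) -> int:
    global balance
    balance = 100
    threads = [threading.Thread(target=transaction, args=(delta, use_lock))
               for delta in (10, -20)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return balance


print("without lock:", run(False))  # often 110 or 80: one update is lost
print("with lock:   ", run(True))   # always 90: both updates survive
```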
Systems Development Life Cycle (SDLC) Phases
The systems development life cycle (SDLC) is the process of creating a system through a set of organized procedures carried out by one or more project staff. An SDLC can run as a sequence of phases, each following on from the previous one, or as iterative phases that are repeated until the final product is generated.
The basic phases of the SDLC are as follows.
- Requirement gathering: Gathering the business and user requirements.
- Design: Producing the UML and other designs that support implementation.
- Coding or implementation: The actual implementation of the system, during which the prototypes are developed (Sharma & Misra, 2017).
- Testing: Testing the functionality of the developed application.
- Deployment: Deploying the developed application in the organization's environment.
- Maintenance: Maintaining the developed system to support the business objectives.
The advantages of the SDLC model are as follows.
- The model allows the project manager to define the activities properly
- The sequential flow of the activities is determined early in the project
- The timeline of the project is developed in the planning phase
- Gathering the requirements of the project becomes easy (Bartolini & Lenzini, 2017)
- Sharing the resources among the activities is more effective
- Each of the project staff has a clear understanding of their responsibilities and work
The disadvantages of the SDLC model are as follows.
- The primary disadvantage of the SDLC model is excessive cost; for small projects, additional money is spent on planning and other activities that are only partially related to the project itself (Karim, Saba & Albuolayan, 2017)
- As each staff member has a different role, the project can be badly harmed if anyone leaves before the project is completed
Understanding Data Flow Diagrams (DFDs)
A Data Flow Diagram is a visual representation of the flow of data or information within a system or application (Wei et al., 2014). Data flows from process to process, with or without the intervention of the user.
The Data Flow Diagram has four components, as follows.
- External Entity: An entity that resides outside the system being modelled and sends information to it or receives information from it. These components can trigger a process within the system; for example, when a user logs in, the system takes the login ID and password as input and validates them.
- Process: A process acts on the data it receives; it can be a validation, information generation, verification or any other operation that changes the system state. There may be various sub-processes under a particular process.
- Data Store: A system must store data for further use; the files or other data-storage technology used by the system are represented in DFD models as data stores (Dick, Hull & Jackson, 2017).
- Data Flow: The most significant component of the DFD model, showing how data moves among the data stores, external entities and processes. Data can flow in both directions.
The biggest advantage of a DFD is that it defines the boundaries of the system. Users can get a better and more concise idea of the system by relating the diagram to their existing knowledge of it, and the DFD is an easy-to-read, straightforward graphical representation of data movement in the system.
Key Characteristics of a System Analyst
A system analyst is an IT expert responsible for ensuring that the organization's systems and IT infrastructure function effectively. In a project, the system analyst identifies the system requirements from the business requirements, deriving them from the organization's goals so that the system supports better profit making. A system analyst also suggests courses of action, coordinating with the project stakeholders to meet the particular requirements of the project.
The following are the characteristics or qualifications that a system analyst must have.
- A system analyst must have technical knowledge and skills covering both techniques and tools, and should be able to analyze the requirements of an application on any operating system (Earle & Ericson, 2014).
- System analysts take part in the early stages of the project, so they should have proper knowledge of the planning and requirement analysis stages.
- The system analyst should be capable of understanding the business issues. This understanding allows the system analyst to identify the system requirements properly.
- A system analyst must think analytically and logically to solve an issue
- A system analyst should be able to work in a group or alone
- A system analyst needs good imagination and strong reasoning skills
- A system analyst must update their personal skills and IT knowledge continuously
Types of Research Designs and Methodologies
There are four kinds of quantitative designs: correlational, descriptive, experimental and quasi-experimental. Qualitative designs are of three types: ethnography, phenomenology and grounded theory. Mixed methods, on the other hand, refers to combining both qualitative and quantitative data within one investigation. Researchers first identify the complexity of the intervention, because that complexity helps determine which methodology should be used. Where a natural setting is available, researchers tend to use a qualitative method (Creswell & Creswell, 2017), so they must look closely at the nature of the setting. If numerical analysis and statistics can generate a solution or test the theory, then a quantitative method is most suitable. Researchers can use a convergent design to compare findings from quantitative and qualitative data sources; for this they must collect the qualitative and quantitative data simultaneously and make sure the qualitative data collection runs alongside the quantitative instrument phase. Proper quantitative instrument development can also be supported by a mixed methods study design so that accurate measures in a PCHM context are obtained (Mertens, 2014). This exploratory sequential design involves first gathering qualitative exploratory data, examining the evidence, and using the results to develop a psychometric instrument well adapted to the model under study; the instrument is then, in turn, administered to a sample of a population.
Where a theory base is applied in the study, only the works that are essential should be referenced, and the analysis that is referenced should be rigorous. The theoretical analysis should not be copied from the referenced paper; instead, the researcher should add fresh ideas to the theory. Readers must be able to understand both the theory that has been referenced and the newly established theory (Bernard, 2017). The theories should point toward the single goal that is to be achieved through the study, and researchers can use various theory-based studies as references.
A literature review should cover all of the scientific literature in the field as defined by the author. Much of this is usually achieved by reference to previous reviews; it is then not necessary to review all of the past literature, but every study published since the most recent thorough review should be included (Creswell & Creswell, 2017). The amount and quality of analysis in a review can vary from little (a mostly descriptive review) to a lot (an analytical review). Researchers must avoid any kind of plagiarism by not reproducing the exact words of the works reviewed, and they must be able to justify whether the reviewed literature presents the right information for the study. Adding further researched information is a necessary task, as it helps the researcher make the paper unique (Mertens, 2014).
The common research strategies of quantitative research are as follows.
- Correlational
- Descriptive
- Experimental
- Quasi-experimental
The common research strategies of qualitative research are as follows.
- Ethnography
- Phenomenology
- Grounded theory
The common research strategies of mixed methods research are as follows.
- Sequential Transformative
- Concurrent Triangulation
- Concurrent Nested (Bernard, 2017)
- Concurrent Transformative
The key factors to consider when choosing a proper strategy for a research study are as follows.
- Philosophical Worldviews: It is suggested that individuals who plan or conduct research make their philosophical ideas explicit.
- The Postpositivist Worldview: The conventional forms of research are represented by the postpositivist worldview; its assumptions hold true mainly for quantitative research.
- The Constructivist Worldview: Also referred to as social constructivism, this is a perspective and an approach typically suited to qualitative research.
- The Transformative Worldview: Another set of researchers holds to the philosophical assumptions of the transformative approach (Creswell & Creswell, 2017). This worldview states that research inquiry needs to be intertwined with politics and a political change agenda to confront social oppression at whatever level it occurs.
- The Pragmatic Worldview: This worldview comes from the pragmatists. Philosophy appears in our lives in various forms; for many researchers, however, the pragmatic worldview arises out of actions, consequences and situations rather than antecedent conditions.
The main criterion of proper scope management is clear and concise communication, which helps ensure that all the stakeholders and the project management staff understand the scope of the project. The scope is essential for determining how the goals and objectives of the project will be met. Various risks might affect the project, but with a proper scope management plan these risks can be avoided, because the scope allows the team members to look for the loopholes in the project that might invite those risks. If the scope is properly identified, it is easy for the project manager and project sponsor to recognize what is required to complete the project. A major issue with improper scope identification is that not all requirements are collected at the beginning of the project (Goetsch & Davis, 2014); to add new scope later, the project manager has to go through iterations that are tremendously time consuming. A proper scope management plan gives the project manager all the essential requirements at hand from the beginning. An example of good scope management practice is a project manager who allocates an adequate amount of time to collect data and define the scope of the project so that all project-related scope is identified.
Communication is done for two basic purposes: providing information to someone or getting information from someone. Staff do not have all the information when they start their work, so the project manager has to make them understand all their responsibilities and roles. Communication is also essential for sharing project progress and risk-related data with the stakeholders. If the project manager establishes a proper communication management process, all the stakeholders will have access to all the project-related data, which helps the project manager retain the stakeholders' support throughout (Pelto et al., 2016). An example of a good communication management plan is the project manager holding periodic meetings with the stakeholders and the project owner.
The entire internal and external landscape of operations has to be considered when viewing the project from a systems viewpoint. The project must be treated as a whole, with a proper understanding of the system operations that will be executed. The system interfaces, hardware and other technical aspects are not the only elements of the system; the system is larger than most system owners think it is, being the combination of ICT with all the business processes and business goals.
The project manager must have a complete view of the enterprise, the systems approach, in order to make the project a success. An organization undertakes a project to gain additional profit from the business and to gain competitive advantage. The systems view allows the project manager to see the complete set of system objectives and the environment in which the system will be deployed (Brettel et al., 2014). If the systems view is not completely understood, the project can harm the business of the organization, resulting in a loss instead of a profit from the project. As an example, suppose the project manager does not understand the different tiers of customers and the benefits they should receive based on the products they buy; the result may be that unintended customers receive more benefits while the intended customers receive fewer benefits on purchase. This will dissatisfy the loyal customers, and the over-benefited customers may stop buying from the organization once the issue is corrected.
The project manager will use the work breakdown structure (WBS) as the core of good project management planning for the following purposes.
- Enable higher productivity: The work breakdown structure is essential for understanding the skill set required to complete each distinct activity of the project, which allows the project manager to recruit only those team members who have expertise in the identified skills. Through the WBS, the project can also identify how many people are needed to complete an activity, preventing wasted human resources.
- Provide more detailed steps: Team members can identify how many steps are required to complete a particular activity, since an activity can consist of two or more steps, and they will understand the connection between the steps and the deliverables more clearly.
- Encourage transparency and accountability: The project manager has to maintain proper control over the team members until the project is completed; the work breakdown structure gives the project manager a clear guideline so that every team member is bound by a defined line of responsibility.
One way of using the work breakdown structure dynamically is to keep project estimates accurate: project requirements are not always static, and it has been seen many times that the requirements change as the project progresses in order to meet the project objective.
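As an illustration only (the tasks and hour figures below are invented for the example), the following sketch models a small WBS as nested Python dictionaries and rolls leaf-level effort estimates up to phase and project totals, the kind of re-estimation that is easy to repeat whenever the requirements change.

```python
# A tiny WBS modelled as nested dictionaries; leaves carry effort in hours.
wbs = {
    "Antique sales system": {
        "Requirement gathering": 40,
        "Design": {"Database design": 24, "DFD and UML models": 16},
        "Implementation": {"Database": 60, "User interface": 80},
        "Testing": 40,
        "Deployment and maintenance": 30,
    }
}


def total_effort(node) -> int:
    """Roll leaf estimates up through every level of the breakdown."""
    if isinstance(node, dict):
        return sum(total_effort(child) for child in node.values())
    return node


for phase, children in wbs["Antique sales system"].items():
    print(f"{phase:30s} {total_effort(children):4d} h")
print(f"{'Project total':30s} {total_effort(wbs):4d} h")
```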
References
Alotaibi, Y., & Ramadan, B. (2017). A novel normalization forms for relational database design throughout matching related data attribute.
Bartolini, C., & Lenzini, G. (2017). Law and the software development life cycle.
Bernard, H. R. (2017). Research methods in anthropology: Qualitative and quantitative approaches. Rowman & Littlefield.
Brettel, M., Friederichsen, N., Keller, M., & Rosenberg, M. (2014). How virtualization, decentralization and network building change the manufacturing landscape: An Industry 4.0 Perspective. International Journal of Mechanical, Industrial Science and Engineering, 8(1), 37-44.
Coronel, C., & Morris, S. (2016). Database systems: design, implementation, & management. Cengage Learning.
Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches. Sage publications.
Dick, J., Hull, E., & Jackson, K. (2017). Requirements engineering. Springer.
Earle, T., & Ericson, J. (2014). Exchange systems in prehistory. Elsevier.
Goetsch, D. L., & Davis, S. B. (2014). Quality management for organizational excellence. Upper Saddle River, NJ: Pearson.
Hashem, I. A. T., Yaqoob, I., Anuar, N. B., Mokhtar, S., Gani, A., & Khan, S. U. (2015). The rise of “big data” on cloud computing: Review and open research issues. Information Systems, 47, 98-115.
Karim, N. S. A., Saba, T., & Albuolayan, A. (2017). Analysis of software security model in scenario of Software Development Life Cycle (SDLC). Journal of Engineering Technology (ISSN: 0747-9964), 6(2), 304-316.
Lacity, M., Yan, A., & Khan, S. (2017). Review of 23 Years of Empirical Research on Information Technology Outsourcing Decisions and Outcomes. In Proceedings of the 50th Hawaii International Conference on System Sciences.
Laudon, K. C., & Laudon, J. P. (2016). Management information system. Pearson Education India.
Leis, V., Kemper, A., & Neumann, T. (2014). Exploiting hardware transactional memory in main-memory databases. In Data Engineering (ICDE), 2014 IEEE 30th International Conference on (pp. 580-591). IEEE.
Mertens, D. M. (2014). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods. Sage publications.
Pelto, G. H., Martin, S. L., Van Liere, M., & Fabrizio, C. S. (2016). The scope and practice of behaviour change communication to improve infant and young child feeding in low- and middle-income countries: results of a practitioner study in international development organizations. Maternal & Child Nutrition, 12(2), 229-244.
Rittinghouse, J. W., & Ransome, J. F. (2016). Cloud computing: implementation, management, and security. CRC press.
Sharma, A., & Misra, P. K. (2017). Aspects of Enhancing Security in Software Development Life Cycle. Advances in Computational Sciences and Technology, 10(2), 203-210.
Toffolutti, V., Reeves, A., McKee, M., & Stuckler, D. (2017). Outsourcing cleaning services increases MRSA incidence: Evidence from 126 english acute trusts. Social Science & Medicine, 174, 64-69.
Wei, F., Roy, S., & Ou, X. (2014). Amandroid: A precise and general inter-component data flow analysis framework for security vetting of android apps. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security (pp. 1329-1341). ACM.