Information Technology (IT) and Architecture Essay

1. What is IT architecture?

IT architecture is the structure that allows an organization to operate and use information technology properly. It is created through the development of models, guidelines, and specifications (Allen & Morton, 1994), and the processes generally used for this have been refined in recent decades to meet the needs of those focused on IT quality. Without a sound architecture, the IT department of any company would be far less successful, and that could cause the entire company to struggle. Any good IT system has to be built around the specifications that allow it to work as intended and provide what the company needs (Allen & Morton, 1994). An IT department has to be ready for nearly everything, because companies rely so heavily on technology that they would be lost without it.

The operation of the IT department is essential to the operation of the company; if the architecture behind it is not designed to handle the company's needs, that shortfall is reflected in the quality of service given to customers (Allen & Morton, 1994). At that point, the company may lose those customers because it cannot provide what they need. Employees can develop a similar feeling about the business when technology fails, because it does not inspire confidence or encourage anyone to keep working for that company (Allen & Morton, 1994). When IT architecture is set up properly, though, both employees and customers benefit, as it provides a great deal of value to everyone involved.

Allen, T., Morton, M.S. (eds.). (1994). Information technology and the corporation of the 1990s. New York: Oxford University Press.

2. Explain the three-tier software architecture design.

Three-tier software architecture is a type of client-server architecture. It separates an application into independent modules, developed and maintained on separate platforms: the user interface, the functional process (business) logic, and data access and storage (Fowler, 2002). Keeping these in independent modules on separate platforms protects each one if something happens to any of the others. Each tier can also be replaced or upgraded independently (Fowler, 2002). A desktop PC can handle the standard user interface, the various logic modules can run on an application server, and a database server contains the data storage logic (Fowler, 2002); the middle tier may itself be multitiered (Fowler, 2002). The three tiers are the presentation tier, the application tier, and the data tier (Fowler, 2002). Each one has a specific function, and they work together to provide success from start to finish.

The presentation tier is the top level; it displays information related to the application's available services, communicating with the tiers below it and sending results back to the user's browser (Fowler, 2002). The second tier is the application tier, also called the logic or business logic tier. It sits directly beneath the presentation tier and performs the detailed processing that controls the overall functionality of the application (Fowler, 2002). The third tier is the data tier, which houses the servers where information is stored and retrieved (Fowler, 2002). The data on this tier is independent of the business logic and application servers, so it is insulated from problems that occur on one of the other tiers (Fowler, 2002). This multitiered separation is excellent protection for any architecture design.
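The separation described above can be sketched in a few lines of code. This is a minimal, illustrative example only; the class and method names are invented for the sketch and do not come from Fowler. The key property it demonstrates is that each tier talks only to the tier directly below it, so any tier can be swapped out (e.g., the in-memory store for a real database) without touching the others.

```python
# Data tier: storage only; knows nothing about business rules or display.
class DataTier:
    def __init__(self):
        self._rows = {}  # stand-in for a database table

    def save(self, key, value):
        self._rows[key] = value

    def load(self, key):
        return self._rows.get(key)


# Application (logic) tier: business rules; talks only to the data tier.
class ApplicationTier:
    def __init__(self, data):
        self.data = data

    def register_user(self, name):
        if not name:
            raise ValueError("name required")  # a business rule lives here
        user_id = name.lower()
        self.data.save(user_id, {"name": name})
        return user_id

    def get_user(self, user_id):
        return self.data.load(user_id)


# Presentation tier: formatting for display; talks only to the logic tier.
class PresentationTier:
    def __init__(self, app):
        self.app = app

    def show_user(self, user_id):
        user = self.app.get_user(user_id)
        return f"User: {user['name']}" if user else "Not found"
```

Wiring the tiers together mirrors the deployment described above: `PresentationTier` would run near the browser, `ApplicationTier` on an application server, and `DataTier` on a database server.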

Fowler, M. (2002). Patterns of enterprise application architecture. Boston: Addison-Wesley.

3. Define triple constraint. What is the critical path?

...

When one of the constraints changes, the other two are affected (Lewis, 2005). Anyone changing a project needs to be aware that a small change can become a big one, depending on how it affects each leg of the triangle. A project must be managed with all of the constraints in mind, all the time (Lewis, 2005). Doing so reduces the chance that the project will leave the client unhappy, take too long to complete, or run over budget. Those problems can still occur, but they are easier to avoid when it is clear what must be addressed with each constraint. The quality of the product or service is also affected when one or more constraints are adjusted, something a company must keep firmly in mind (Lewis, 2005).
The critical path is an important part of managing the triple constraint. It is the longest chain of dependent tasks and therefore defines the total length of the project (Lewis, 2005). Any task on the critical path that changes or slips, even by a day, lengthens the entire project (Lewis, 2005). If that happens, a company could be in serious trouble with promised deadlines, which could push it to take shortcuts that compromise the quality of the good or service it agreed to provide (Lewis, 2005). That can generally be avoided by building some leeway into the schedule, allowing for small difficulties and minor changes that will not derail the entire timeline. The more carefully a company manages the triple constraint triangle and the critical path, the more likely its project will succeed.
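The idea that the critical path is the longest chain of dependent tasks can be made concrete with a short calculation. The sketch below is a simplified illustration (task names and durations are made up, and real scheduling tools track much more, such as slack per task): it finds the earliest finish time of every task from its prerequisites and then walks back along the predecessors that determined those times.

```python
def critical_path(tasks, deps):
    """tasks: {name: duration}; deps: {name: [prerequisite names]}.
    Returns (total_duration, path) for the longest dependency chain."""
    memo = {}

    def finish(name):
        # Earliest finish = latest prerequisite finish + own duration.
        if name not in memo:
            start = max((finish(p) for p in deps.get(name, [])), default=0)
            memo[name] = start + tasks[name]
        return memo[name]

    end = max(tasks, key=finish)  # task that finishes last overall
    # Walk backwards along the predecessor that dictated each start time.
    path = [end]
    while deps.get(path[-1]):
        path.append(max(deps[path[-1]], key=finish))
    return memo[end], path[::-1]
```

For example, with `{"design": 3, "build": 5, "test": 2, "docs": 1}` where build follows design, test follows build, and docs follows design, the critical path is design, build, test, for a total of 10 units; docs has slack, but a one-day slip in any critical task lengthens the whole project, exactly as described above.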

Lewis, J.P. (2005). Project planning, scheduling & control (4th ed.). New York: McGraw-Hill.

4. Define the eight stages of the SDLC.

The eight stages of the Systems Development Life Cycle (SDLC) are system investigation, system analysis, design, environments, testing, training and transition, operations and maintenance, and evaluation (Blanchard & Fabrycky, 2006). The system investigation stage is focused on determining an opportunity or a need within the current system. The system analysis narrows the focus even more, to determine exactly where the opportunity or problem lies (Blanchard & Fabrycky, 2006). That is needed before an attempt to actually fix the system can be made. By breaking the system down into components, the area where the problem is being seen can be more easily discovered. Then design takes place, in order to clearly see how the problem is going to be fixed within the system, and what kinds of benefits will be achieved (Blanchard & Fabrycky, 2006).

Environments are controlled areas, so developers can build, install, and test things before they "go live" in the system itself. The creation of these environments is a valuable and protective step in the process (Blanchard & Fabrycky, 2006). Testing comes next, and it needs to be thorough, clear, and extensive in order to make sure every possible scenario has been considered. Overlooking things can cause serious problems down the line. Adequate testing of a system logically moves into training and transition, so everyone who needs to know how to work the system is able to do so during and after the transition (Blanchard & Fabrycky, 2006). Then the system will continue to operate, and it will be maintained and monitored in order to make sure it does not develop further problems (Blanchard & Fabrycky, 2006). Evaluation is the final step, used to measure how effective the changes actually were (Blanchard & Fabrycky, 2006).

Blanchard, B.S., & Fabrycky, W.J. (2006). Systems engineering and analysis (4th ed.). Upper Saddle River, NJ: Pearson Prentice Hall.

5. What is green computing?

Green computing is the study and practice of computing that is environmentally sustainable (Kurp, 2008). This includes designing and building computers, but also using and disposing of them properly, in a way that has minimal impact on the environment (Kurp, 2008). Naturally, this is an important issue. The environment is fragile, and a number of activities work against it: greenhouse gases, climate change, vanishing ecosystems, and other problems. Many people who work with computers want to reduce their footprint and take better care of the environment, which they can do in a couple of ways. One is to buy, build, and use computers with longevity (Kurp, 2008); the longer a computer is made to last, the longer it will be before it ends up in a landfill and a new one must be purchased. When a person does get rid of old electronics, there are recyclers that will take them and dispose of their components safely, further protecting the environment (Kurp, 2008).

The second way a person can practice green computing is in the parts and components that he or she purchases, because some of them are much more environmentally friendly than others (Kurp, 2008). When parts that are easy on the environment are used to build computers, those computers are a better choice for ensuring that the environment is not damaged by people wanting and needing computers for both business and personal use (Kurp, 2008). It is not always possible to protect the environment one hundred percent of the time, but there are many ways green computing can be practiced and used in everyday life (Kurp, 2008). Conscientious people who want to preserve their planet know the value of green computing, and will continue to abide by it as much as possible.

Kurp, P. (2008). Green computing. Communications of the ACM, 51(10), 11-23.

