XML Latest Changes Are In


The implications of security payloads and overheads on the performance of optimized XML networks (Choi, Wong, 2009) are inherent in the ongoing design of XML standards and protocols that attempt to compress these elements and optimize their performance. The integration of security into the eXtensible Business Reporting Language (XBRL) has had a minimal impact on the overall performance of XML networks, as the security features in this standard are compressed (Piechocki, Felden, Graning, Debreceny, 2009). Compression is also used specifically with the XML Key Management Specification to increase performance (Ekelhart, Fenz, Goluch, Steinkellner, Weippl, 2008). Compression algorithms shrink the contents of data containers, packets, and messages so they are more compact, which increases transmission speed and accuracy. The development and continual support for XML within Web Services has transformed transactional workflows from simplistic to multifaceted, supporting the development of trading networks (Kangasharju, Lindholm, Tarkoma, 2008). As companies increasingly choose Web Services to handle transactions with suppliers and customers, software developers are examining how XML can be used to make Web Services more efficient. A Web Service managing millions of transactions a day with suppliers and customers can slow to the point of occasionally failing. Developers are therefore looking at how XML can be used to spread the workload across several instances or installations of the same Web Service so that all transactions complete quickly. Spreading the workload across multiple Web Service installations in this way is commonly described as scalability (Warkentin, Johnston, 2006).
Developers building Web Services concentrate on making transaction workflows highly scalable, so that both the XML network and the Web Service continue to function even when millions of transactions occur each day.
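The compression approach described above can be illustrated with a short sketch. The snippet below is a minimal Python analogue (the networks discussed here are not Python-based, and the message content and function names are illustrative assumptions) showing how gzip-compressing a repetitive XML message shrinks the bytes placed on the wire:

```python
import gzip

def compress_xml(xml_text: str) -> bytes:
    """Gzip-compress an XML message before transmission."""
    return gzip.compress(xml_text.encode("utf-8"))

def decompress_xml(payload: bytes) -> str:
    """Restore the original XML on the receiving side."""
    return gzip.decompress(payload).decode("utf-8")

# A repetitive, order-transaction-style message (illustrative only).
# Highly repetitive tag structure is exactly what compression exploits.
message = (
    "<transactions>"
    + "<order><sku>A1</sku><qty>5</qty></order>" * 200
    + "</transactions>"
)

compressed = compress_xml(message)
print(len(message.encode("utf-8")), "bytes raw,", len(compressed), "bytes compressed")
assert decompress_xml(compressed) == message  # round trip is lossless
```

Because XML markup is verbose and repetitive, the compressed payload is typically a small fraction of the original, which is the performance lever the standards cited above rely on.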

The design objective of creating distributed order management systems that are scalable and secure enough to manage complex transactions is achievable with current advances in AJAX and XML technologies. The concurrent design of XML-based intelligent agent protocol frameworks that support role-based access and transactions has the potential to scale into trusted networks (Warkentin, Johnston, 2006). Many companies today use Virtual Private Networks (VPNs) to connect with remote employees and to secure their supply chain networks. A network that relies on a VPN is one form of trusted network (Warkentin, Johnston, 2006). Trusted networks hold considerable potential because they provide a secure connection from one computer to another, often carrying confidential cost, price, and customer data. VPNs can run on top of TCP/IP and XML networks, and because of this compatibility, companies are examining how they can grow their distributed order management systems without sacrificing performance or security.

XML is also progressing rapidly toward supporting secured, multi-role access over both private and public connections (Warkentin, Johnston, 2006).

The HTML optimization routines and techniques that have shown initial performance gains over XML were tested only at the page level, because HTML is a page-based development technology (Yang, Liao, Fang, 2007). To date, no research has been completed on optimizing XML networks to support higher-performance AJAX-based applications. This is one of the key objectives of this study: to determine how to optimize the performance of XML networks and AJAX applications to attain the highest possible levels of transaction efficiency and performance.

Studies and tests have shown, however, that HTML-based applications, when used in conjunction with XML, lack the inherent ability to be optimized due to limitations in HTML itself. Attempts to optimize the performance of HTML have continued to produce mixed results because of the page-based approach taken to defining content, navigation, and page structure (Choi, Wong, 2009). HTML's performance is further reduced by the many scripting languages that lack the critical security upgrades needed to make them suitable for use in transaction-intensive networks. When all of these factors are taken into account, it is clear that AJAX and optimized XML networks have significant upside potential for performance improvement. The intent of this research is to measure the performance of AJAX applications on XML and TCP/IP networks. Once those measurements are completed on each network, it will be possible to infer how larger, more complex networks will perform. These larger networks, called exchanges, often include many different suppliers, buyers, and customers, and the measured performance of AJAX applications over TCP/IP and XML networks will provide insight into how such exchanges will perform.

To attain the research objectives, it is necessary to concentrate on the parameters that best quantify the performance gains of XML networks and AJAX applications. The XMLHttpRequest object is used to measure the relative speed and performance of the network; it requests and delivers content of all types throughout an XML network. Because it is a JavaScript API, it can also be used as part of the container-based metafile testing methodology employed in this series of research efforts. Since XMLHttpRequest can both deliver and retrieve content, it forms the foundation of an effective construct for measuring network performance over time. It is also used to transport metafiles throughout the networks in a four-square test frame, in order to normalize the specific interferences of the network transport. Every attempt has been made to remove any factor that would detract from the accuracy of the research; this is why the network, operating systems, and servers are all kept consistent, and why XMLHttpRequest was chosen: it not only manages the sending and receiving of content but also allows network performance to be tracked. Because all networks carry highly randomized resource loads, running very busy or very slow depending on users' needs, it was important to introduce this factor as well, and to introduce it randomly into the analysis. XMLHttpRequest supports randomized payload sizes, which make the network either very busy or slow.
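The measurement construct described above can be sketched briefly. XMLHttpRequest itself is a browser-side JavaScript API; the snippet below is a Python analogue offered purely for illustration (the socket-pair transport, payload sizes, and sample count are assumptions, not the study's actual harness) that times echo round trips of randomized payload sizes over a local connection, mirroring the randomized-load timing methodology:

```python
import random
import socket
import time

def time_round_trip(payload: bytes) -> float:
    """Send `payload` across a local socket pair, read it back on the
    other end, and return the elapsed wall-clock time in seconds."""
    a, b = socket.socketpair()
    start = time.perf_counter()
    a.sendall(payload)
    received = bytearray()
    while len(received) < len(payload):
        received.extend(b.recv(65536))
    elapsed = time.perf_counter() - start
    a.close()
    b.close()
    assert bytes(received) == payload  # echo must be lossless
    return elapsed

# Randomized payload sizes stand in for busy vs. quiet network loads.
random.seed(42)
samples = []
for _ in range(20):
    size = random.randint(1_000, 30_000)  # bytes per request (kept small
    samples.append((size, time_round_trip(b"x" * size)))  # to fit buffers)

avg = sum(t for _, t in samples) / len(samples)
print(f"mean round-trip over {len(samples)} randomized payloads: {avg:.6f}s")
```

In the study itself the same idea is carried out in JavaScript over real XML and TCP/IP networks, where the randomized payloads exercise the network rather than an in-process socket pair.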

How companies use XML networks, and the performance challenges they encounter, form the foundation of this analysis. Focusing specifically on the use case of enterprise-wide adoption of databases to support transaction systems, this dissertation defines traffic flows randomized in both duration and payload to determine how optimized XML networks would perform over an extended period of time. XMLHttpRequest is one of the foundational mechanisms of the XML command set: it carries transaction data within its containers and uses an index value of other networks to navigate to the system where the transaction data needs to go. To fully replicate a distributed order management environment, a four-square test bed was devised in a closed-loop testing region. The methodology also accounts for the use of AJAX-based applets to measure performance over the network regardless of payload and the specifics of data transfer components in frames. The tests were completed in a lab in which none of the servers had Internet access, eliminating any Internet traffic that could influence the results. It is common practice in larger companies with software engineering teams to do all development in labs where servers have no Internet access; this is done for security and to ensure applications are not slowed by services that automatically access the Internet. This was critically important so that the tests contained here would effectively replicate the performance of a private trading exchange (PTX). The PTX has become a standard framework for many companies that have extensive supply chains and sales channels. Procter & Gamble (P&G), for example, has one of the most diverse supply chains in the consumer packaged goods industry and has standardized on the PTX framework.
P&G also has a diverse distribution channel that includes grocery stores and chains, mass merchandisers including Wal-Mart, Tesco and others, and packaged good wholesalers. The PTX framework is useful to P&G because it brings together[continue]
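The traffic flows randomized in duration and payload that the methodology describes can be sketched as follows. This Python sketch is illustrative only (the flow count, duration range, and payload range are assumptions, not the study's actual parameters); it generates a population of randomized flows and aggregates the load they offer to the network:

```python
import random

def generate_flows(n: int, seed: int = 7):
    """Generate `n` traffic flows, each with a random duration and payload.
    A fixed seed keeps runs reproducible, as a closed-loop test bed requires."""
    rng = random.Random(seed)
    return [
        {
            "duration_s": rng.uniform(0.5, 60.0),  # how long the flow lasts
            "payload_kb": rng.randint(1, 512),     # kilobytes per message
        }
        for _ in range(n)
    ]

def offered_load_kb(flows) -> int:
    """Total payload offered to the network across all flows."""
    return sum(f["payload_kb"] for f in flows)

flows = generate_flows(1_000)
print(f"{len(flows)} flows, total offered load: {offered_load_kb(flows)} KB")
```

Seeding the generator is the detail that matters here: it lets the same randomized load be replayed against each network configuration, so performance differences can be attributed to the network rather than to the traffic.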

Cite This Essay:

"XML Latest Changes Are In" (2010, May 15) Retrieved October 21, 2016, from http://www.paperdue.com/essay/xml-latest-changes-are-in-2425


