Server Architectures Using A Fat Assessment

Thick client technology can support more efficient asynchronous transfers, and the independence of data models at the individual worker level can significantly improve the overall performance of a given project. Thick client/server networks are especially well suited to concurrent engineering tasks given their data management and overhead requirements (Lee, 2002). In engineering consulting and design, it is also critically important for companies to have secure networks capable of supporting collaborative and concurrent design sessions. Thin client technology alone cannot scale to this requirement, and when companies have pushed it to this level of performance, security compromises have become commonplace (Vlissidis & Hickey, 2010). In other words, even if an engineering and design company strove to build a thin client/server network to support its collaborative engineering and concurrent workflows, it would fail on security grounds alone. Empirical studies have also shown that thick client/server architectures manage server resources more effectively than their thin client counterparts. For an engineering consulting and design firm, the licensing costs for highly complex CAD software are among the highest overall costs of operating the business. CAD software is much less expensive in thick client configurations, as software companies charge a premium for thin...
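The asynchronous-transfer pattern described above can be sketched in a few lines: the thick client edits its own local copy of the design data and a background thread pushes queued changes to the server, so the engineer never blocks on the network. All names here (`LocalModel`, `push_to_server`) are illustrative assumptions, not a real CAD or networking API.

```python
# Sketch of thick-client asynchronous transfer with a worker-level data model.
# Hypothetical names throughout; the "server" is stood in by a plain list.
import queue
import threading

class LocalModel:
    """Worker-level data model held entirely on the thick client."""
    def __init__(self):
        self.parts = {}                 # local copy of design data
        self.outbox = queue.Queue()     # changes awaiting upload

    def edit(self, part_id, geometry):
        # A local edit completes immediately; no server round trip.
        self.parts[part_id] = geometry
        self.outbox.put((part_id, geometry))

def push_to_server(model, server_log, stop):
    """Background thread: drains queued edits to the server asynchronously."""
    while not stop.is_set() or not model.outbox.empty():
        try:
            change = model.outbox.get(timeout=0.1)
        except queue.Empty:
            continue
        server_log.append(change)       # stand-in for a network upload

model = LocalModel()
server_log = []
stop = threading.Event()
worker = threading.Thread(target=push_to_server, args=(model, server_log, stop))
worker.start()

model.edit("bracket-01", "rev-A")       # returns instantly
model.edit("bracket-01", "rev-B")
stop.set()
worker.join()                           # all queued edits have been uploaded
```

The key property is that `edit()` never waits on the server: the local model stays responsive while transfers happen in the background, which is the independence the paragraph attributes to thick clients.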

...

A thick client/server architecture can also reduce licensing costs by assigning software licenses only to those systems that need them.
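As a minimal sketch of the licensing argument above (all names hypothetical): with thick clients, a CAD seat is purchased only for the workstations that actually run the CAD package, rather than being provisioned for every session on a shared thin-client server.

```python
# Hypothetical illustration: count CAD seats only for machines that need them.
def licenses_needed(workstations):
    """Number of CAD license seats required for a thick-client deployment."""
    return sum(1 for ws in workstations if ws["runs_cad"])

office = [
    {"host": "eng-01", "runs_cad": True},
    {"host": "eng-02", "runs_cad": True},
    {"host": "admin-01", "runs_cad": False},   # no CAD seat required
    {"host": "reception", "runs_cad": False},  # no CAD seat required
]

print(licenses_needed(office))  # 2 seats instead of one per user session
```

In a thin-client model the same four users might each consume a server-side session entitlement, which is the premium pricing the passage refers to.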
Conclusion

Thick client/server systems are well suited to the data-intensive tasks an engineering consulting company faces: running its CAD applications, supporting concurrent engineering, and rendering the demanding visualizations needed for design walk-throughs. All of these customer requirements point to a thick client architecture for this specific need.

Sources Used in Documents:

References

Guynes, C.S., & Windsor, J. (2011). Revisiting Client/Server Computing. Journal of Business & Economics Research, 9(1), 17-22.

Lai, A.M., & Nieh, J. (2006). On the performance of wide-area thin-client computing. ACM Transactions on Computer Systems, 24(2), 175-209.

Lee, S.K. (2002). Client server-based distributed architecture for concurrent design of DCS networks: A case study. Integrated Manufacturing Systems, 13(1), 47-47.

Royster, K., & Reed, J. (2008). Security audits: Don't ignore thick clients. Network World, 25(30), 21-21.


Cite this Document:

"Server Architectures Using A Fat" (2013, February 26) Retrieved April 27, 2024, from
https://www.paperdue.com/essay/server-architectures-using-a-fat-86244
