Deployment and Administration - Windows Server 2012
Deployment and Server Editions
i. The number of servers required and the roles to be combined
Server Requirements:
Careful evaluation of present and projected activity helps determine the server configuration. The number of servers required corresponds directly to the amount of data the deployment must handle over the next three to five years. If growth of 33% is projected, it would be prudent to fit the physical server with 16 GB of RAM. Generally, it is good practice to start a virtual machine with 12 GB of RAM and monitor for the need to upgrade as the project and operations expand (Serhad MAKBULOGLU, 2012).
Component | Estimated Memory (example)
Base operating system recommended RAM (Windows Server 2008) | 2 GB
LSASS internal tasks |
Monitoring agent |
Antivirus |
Database (Global Catalog) | GB
Cushion for backup to run and administrators to log on without impact | 1 GB
Total | 12 GB
Table 1: Calculation Summary Example
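The growth adjustment described above can be sketched in a few lines; the 12 GB baseline and the 33% growth figure come from the text, while the helper name is my own:

```python
import math

def projected_ram_gb(current_gb: float, growth_rate: float) -> int:
    """Round the growth-adjusted RAM estimate up to the next whole GB."""
    return math.ceil(current_gb * (1 + growth_rate))

# 12 GB baseline with 33% projected growth -> 16 GB
print(projected_ram_gb(12, 0.33))  # 16
```

Rounding up rather than to the nearest GB is deliberate: under-provisioning memory on a domain controller hurts far more than a spare gigabyte costs.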
If the server is hosted on the premises, each user would require approximately 7.5 MB of external bandwidth per day. A little more should be added to cover protocol overheads, rounding the figure up to about 8 MB. Spread over an 8-hour working day, this averages slightly over a quarter of a kilobyte per second, a rate that would support thousands of users even on a slow connection. However, such a calculation is superficial and misleading as regards data requirements, because it ignores many practical factors that affect performance. For example, if there is an on-premises mail system, there is a good chance it will handle a large volume of spam, and spam generally outnumbers actual mail by up to 9 to 1. In other words, if the firm receives 8 MB of actual mail, it should also expect up to 72 MB of spam. The consequence is that the per-user bandwidth requirement climbs to roughly 2.8 kilobytes per second, so the connection estimated to serve thousands can now serve only a couple of hundred (Peter Bright, 2012).
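The arithmetic above can be checked with a short script; the 8 MB daily figure, the 8-hour day, and the 9:1 spam ratio are taken from the text, and the function name is my own:

```python
# Per-user daily mail traffic, already rounded up for protocol overhead (MB)
MAIL_MB_PER_DAY = 8
WORK_SECONDS = 8 * 3600   # an 8-hour working day
SPAM_RATIO = 9            # spam outnumbers real mail by up to 9:1

def avg_kb_per_second(mb_per_day: float) -> float:
    """Average transfer rate in kilobytes per second over the working day."""
    return mb_per_day * 1024 / WORK_SECONDS

print(round(avg_kb_per_second(MAIL_MB_PER_DAY), 2))                     # 0.28
print(round(avg_kb_per_second(MAIL_MB_PER_DAY * (1 + SPAM_RATIO)), 2))  # 2.84
```

The first figure is the "quarter of a kilobyte per second" in the text; the second shows how a 9:1 spam load multiplies the requirement tenfold.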
ii. The edition of Windows to be used for each server
The Standard edition of Windows, in its x64 form, will be used. This ensures that even if your Active Directory currently runs on a 2003 x86 architecture and has a DIT smaller than 1.5 GB, the template provided in this paper can still be applied and will work satisfactorily. Capacity planning must be a continuous activity: you should constantly check the level of efficiency achieved by your system and upgrade as the situation demands. The optimal configuration will be reached only after several hardware deployments and lifecycles, as hardware costs change over time. For instance, as memory prices fall, the cost per core also falls, and the cost of optional storage likewise changes (Serhad MAKBULOGLU, 2012).
You should also plan for the peak periods of the day, breaking the day into 30-minute or one-hour spans. Avoid longer spans, because they conceal the actual peaks; much shorter spans are equally misleading. It is also prudent to plan for growth across the hardware lifecycle. One option is to add and upgrade hardware in staggered steps; another is to replace everything after three or five years. Whichever option you choose, you will need to estimate how the load on Active Directory will grow, and historical data, if you collect it, is a handy resource for such an assessment (Serhad MAKBULOGLU, 2012).
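One way to apply the 30-minute-span advice is to bucket load samples into fixed windows and compare per-window averages; a minimal sketch, where the sample data and function name are my own illustration:

```python
from collections import defaultdict

def peak_window_average(samples, window_seconds=1800):
    """samples: iterable of (timestamp_seconds, load) pairs.
    Buckets samples into fixed windows (default 30 minutes)
    and returns the highest per-window average load."""
    buckets = defaultdict(list)
    for ts, load in samples:
        buckets[ts // window_seconds].append(load)
    return max(sum(v) / len(v) for v in buckets.values())

# A morning logon spike stands out clearly in 30-minute windows
samples = [(0, 10), (600, 12), (1800, 95), (2400, 90), (3600, 20)]
print(peak_window_average(samples))  # 92.5
```

With a full-day window the spike would average away to a modest figure, which is exactly the concealment the text warns about.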
iii. Should servers be virtualized using Hyper-V?
The important element here is to make sure that the shared infrastructure can support the DC load along with the other resources that depend on it, such as the shared storage media and the pathways that link to it. This precaution is necessary regardless of whether a physical domain controller shares the same media on NAS, iSCSI, or SAN infrastructure with other servers or applications, or a guest uses pass-through access to a NAS,...