Why Threat Management Is Different from Vulnerability Management
Studies have examined the possibility of implementing an all-round technology that manages several layers of the OSI networking model. However, this approach has considerably lost influence because it is defeated by the varied nature of attacks. The hacking publication 2600 currently presents readers with numerous attack methodologies; indeed, hacking has become too complex for a single-headed approach. This research will be integral in differentiating threat management from vulnerability management. It also addresses their importance in implementing a hybrid approach: network management on the operating-system side and vulnerability management on the layer side. The research further clarifies that a security approach designed around this hybridism is responsive to all natures of attack across the OSI networking model.
The research is based on the following studies. Firstly, Nikolaidis (2003) inspired this study with his analysis of the TCP/IP networking model and its relevance to establishing a security perimeter. Secondly, Wu (2013) and Wenqiang (2010) assess the authentication approach and show how some of these systems are weak within a given network. Thirdly, Gandotra (2013), seconded by Andre (2008) and Hsu et al. (2012), clarifies the various software roles and their impact on network defenses. Fourthly, Sterling (2006) provides the technical approach regarding DMZ server placement and its effects on the first three OSI layers in the general debate of threat management versus vulnerability management.
In modern-day IT security, the desire to provide comprehensive services is paramount, chiefly because potential clients of a given technology often do not know the basic approaches of network security. Wenqiang (2010, p. 104) argues that traditional signature scanning found it impossible to keep pace with virus attacks, which were reaching epidemic levels at the time. Threat management (TM) technology was therefore applied so that a plethora of different gateway and desktop solutions could work hand in hand. TM evolved because hackers had begun using blended attacks; basic antivirus software could not stop such attacks, so it became necessary to extend mitigation to the file transfer system, the web, and email.
Threat management is a network-based security approach that focuses on the primary network gateway as an organization's defense mechanism. It can perform multiple network defense functions, including gateway antivirus, network intrusion prevention, gateway anti-spam, VPN content filtering, data-leak prevention, and load balancing. OSI (2006, p. 13) argues that the technology uses a multilayered approach to incorporate several security technologies. A TM system can be incorporated and configured so that its security features are quickly updated to meet evolving threats. Wu et al. (2013) join this argument with what they consider a triple-'A' approach to network and system management: authentication, authorization, and accounting.
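The triple-'A' flow described above can be sketched in code. The following is a minimal illustration, not any vendor's implementation: the user store, salt, and function names are all hypothetical, and a real system would use a dedicated password-hashing scheme and persistent audit storage.

```python
# Hedged sketch of the triple-'A' approach: authentication (prove the
# identity claim), authorization (check permissions), accounting (record
# the decision). All names and data here are illustrative.
import hashlib
import hmac
import time

_SALT = b"demo-salt"  # hypothetical fixed salt, for illustration only

# Hypothetical user store: username -> (salted password hash, allowed actions)
USERS = {
    "alice": (hashlib.sha256(_SALT + b"s3cret").hexdigest(), {"read", "write"}),
}

AUDIT_LOG = []  # accounting: an append-only record of access decisions


def authenticate(user: str, password: str) -> bool:
    """Authentication: prove the identity claim is authentic and valid."""
    record = USERS.get(user)
    if record is None:
        return False
    candidate = hashlib.sha256(_SALT + password.encode()).hexdigest()
    return hmac.compare_digest(candidate, record[0])


def authorize(user: str, action: str) -> bool:
    """Authorization: check the authenticated user may perform the action."""
    record = USERS.get(user)
    return record is not None and action in record[1]


def account(user: str, action: str, allowed: bool) -> None:
    """Accounting: record the decision for later audit."""
    AUDIT_LOG.append({"time": time.time(), "user": user,
                      "action": action, "allowed": allowed})


def handle_request(user: str, password: str, action: str) -> bool:
    """Run all three 'A's for a single request."""
    ok = authenticate(user, password) and authorize(user, action)
    account(user, action, ok)
    return ok
```

The design point is that every request passes through all three stages, so even a denied request leaves an accounting record.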
Authentication seeks to prove that an identity claim is authentic and valid. In most network attacks, proving the attacker's identity is the most serious hurdle; with this approach, however, the technology can identify the source of an attack and direct the response toward it. Secondly, Gandotra (2012, p. 290) argues that the technology can be configured with a graphical user interface so that administration is easy even for junior-level managers. As part of user management, the system is kept simple, which eases troubleshooting and reduces the total cost of ownership (TCO). Thirdly, the technology has reduced technical training requirements, since it is one comprehensive product. Fourthly, it is simplified and administered as a single software installation. However, TM does not remove the single point of compromise if the network is vulnerable; on the contrary, concentrating defenses in one device creates such a point. Lastly, the technology is bound to a single bandwidth path and cannot satisfy the demands of every network.
Based on this approach, Gandotra (2012, p. 491) seconds that TM is capable of mitigating fraud, pharming attacks, and phishing attacks. TM detects, analyzes, and remedies attacks that would otherwise cost lost productivity and system downtime in aggregate. As a further merit, Eaton (2001, p. 184) argues that TM replaces standalone solutions, which are complex and difficult to manage: each standalone system has separate maintenance processes and requires its own patch-management strategy, to be updated on every software release. Consequently, TM provides its own operating system, mandated to ensure centralized management with the monitoring and reporting advantages of a TM solution. Therefore, in contemplating whether to apply TM or other technologies, substantial concern should be directed at whether a given solution checks for the behavior of connecting to unusual ports.
On the other hand, vulnerability management focuses on a cycle of identifying, classifying, remediating, and mitigating potential vulnerabilities, whether introduced by software or by firmware mounted on a given system. In this regard, a vulnerability-management device utilizes a combination of hardware and software to examine previously identified open ports, insecure software configurations, or susceptibility to malware. The key limitation of a vulnerability scanner is the zero-day attack, which it cannot identify (Gandotra, 2012).
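The identify-classify-remediate-mitigate cycle can be sketched as a small pipeline. This is a toy illustration under assumed data: the `Finding` structure, the severity scale, and the patch threshold are all hypothetical, not part of any cited scanner.

```python
# Hedged sketch of the vulnerability-management cycle named above:
# identify -> classify -> remediate (patch) or mitigate (contain).
from dataclasses import dataclass


@dataclass
class Finding:
    host: str
    issue: str       # e.g. an open port or insecure configuration
    severity: int    # illustrative scale: 1 (low) .. 10 (critical)
    status: str = "open"


def identify(inventory):
    """Stand-in for a scanner pass over the asset inventory."""
    return [Finding(host, issue, sev) for host, issue, sev in inventory]


def classify(findings):
    """Order findings so the most severe are handled first."""
    return sorted(findings, key=lambda f: f.severity, reverse=True)


def remediate(findings, patch_threshold=7):
    """Patch findings at or above the threshold; mitigate the rest
    (e.g. by firewalling the service off) until a patch is available."""
    for f in findings:
        f.status = "patched" if f.severity >= patch_threshold else "mitigated"
    return findings
```

The point of the sketch is that the cycle is a loop over an inventory, not a one-off fix: re-running `identify` after remediation closes the cycle.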
In the IT industry, pundits have long treated vulnerabilities as simple, patchable annoyances that are quickly dealt with. In fact, the breadth of platforms and applications provides a plethora of weaknesses, and prevention is the only key remedy. In this regard, most organizations must secure networks containing a large number of different products. It is also prudent to note that researchers such as Hsu et al. (2012, p. 1439) have acknowledged that most organizations do not implement IT defenses to the last mile. As this is the case, patches will often appear irrelevant to implement, since support staff are either unaware of releases or lack precise knowledge of how to apply them.
Against this backdrop, vulnerability management provides patching technology built on vulnerability scanners, which can detect vulnerabilities from the network side with reasonable accuracy and from the host side with optimal accuracy. However, Andre (2008, p. 4) notes that with this system, the volume of data presents the key problem. Many organizations have large databases, and a scan-then-fix approach does not scale. It is prudent to mention that most approaches to vulnerability management fail to provide adequate network visibility into critical systems.
To develop a vulnerability-management process, Ariba et al. (2006, p. 256) suggest the following steps, which act in line with the overall mitigation plan. Firstly, the IT department should develop a hardening policy as the first step, including a definition of secure device configuration, resource access, and user identity. Secondly, the IT expert should be made aware of the vulnerabilities open to attack. Thirdly, the IT expert should be in a position to prioritize mitigation activities and to extend them with external threat information, for instance asset classification and security posture. Fourthly, the shield of the environment should be spread, in this case prior to eliminating the vulnerability, by using network- or desktop-driven tools. Fifthly, the administrator should seek to eliminate the vulnerability at the root, an approach considered the legitimate remedy. Sixthly, the environment should be continually monitored for deviation from policy, so that new vulnerabilities, including those affecting the DMZ zone, are identified.
Vulnerability assessment has three phases: discovery, reconnaissance, and white-box testing. Technically, the process identifies all hosts on a given network by process, version, type, and passiveness. The system applies a series of end-user protocols that allow a close analysis of application behavior. When a host is identified, the white-box scanner pings it to gather more information, and all information is backed up on a secure server. The subtlety of this approach lies in its use of ICMP echo requests to identify a given host and its origin. Through the mitigation process, most scanners technically manipulate the TCP/IP stack so that the system can override the networking protocols (Ariba, 2006, p. 259). Ideally, the technology seeks to map the entire system through a packet-sniffing process that identifies common ports in the network design. To achieve this, the…
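The discovery phase described above can be illustrated with the simplest possible probe: a plain TCP `connect()` attempt against a few common ports. This is an assumption-laden sketch, since real scanners also use ICMP echo requests, raw-packet manipulation, and passive fingerprinting, which need privileges this example avoids; the port list is illustrative, and such probes should only ever target hosts one is authorized to test.

```python
# Hedged sketch of host discovery via TCP connect() probes.
# Only a fraction of what the scanners described above do.
import socket

COMMON_PORTS = [22, 25, 80, 443]  # illustrative subset of common services


def probe(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refusal, timeout, and unreachable hosts
        return False


def discover(host: str) -> dict:
    """Report which common ports on the host accept connections."""
    return {port: probe(host, port) for port in COMMON_PORTS}
```

A full connect is the least stealthy technique; scanners that manipulate the TCP/IP stack directly (e.g. half-open SYN probes) trade this simplicity for speed and evasiveness.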