Privacy Protection: Commenting on the Research Proposal

Excerpt from Research Proposal:



Confab, however, is an architecture that bypasses these limitations and combines both approaches. Even so, it has limits of its own: a truly pervasive environment calls for complex preferences that can be easily managed by the end user.

Moreover, none of these approaches is fully sufficient to meet the challenges mentioned in section 3.2. For instance, PETs and privacy models do not explicitly contribute to a reduction in data collection, nor is that their intent or purpose. Anonymous data collection rests on the assumption that if data is collected anonymously it cannot be linked to any individual, and that data which cannot be related to an individual poses no privacy threat. Detailed privacy policies and safeguards for data are therefore not seen as critical in this model. By collecting anonymous data, one may argue that a true minimum of personal data is being collected. However, ensuring complete anonymity remains both technically and practically difficult.

For example, mix zones and changing pseudonyms are used to maintain anonymity, but it is also possible to break that anonymity and track a user through a mix zone. Pervasive computing therefore needs other, more robust means of minimizing the amount of data collected. Moreover, usability and efficiency issues arise with all of these approaches. Testing, for example, is typically done in a controlled environment under limited conditions, so the effectiveness of many of these solutions has not been adequately evaluated under typical, real-world conditions. In a true pervasive computing environment, users will move extensively between different computing environments and will interact with various devices (e.g., ranging from small portable handheld devices to large wall-sized displays) and applications. It is difficult to predict how privacy solutions will perform in a real user environment under such conditions.
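The timing attack on a mix zone alluded to above can be sketched in a few lines. The following is an illustrative simulation, not an implementation from the proposal: entry times, dwell times, and the `max_dwell` bound are assumed values, and pseudonyms are simply numbered labels. When the set of entries consistent with an observed exit time shrinks to one, the fresh pseudonym is re-linked to the old one and the mix zone provides no anonymity.

```python
def mix_zone_exit(entries, dwell_times):
    """Assign each entering user a fresh pseudonym on exit.
    entries: {old_pseudonym: entry_time}
    dwell_times: {old_pseudonym: time spent inside the zone}
    Returns (new_pseudonym, exit_time) pairs in exit order."""
    exits = []
    for i, (old, t_in) in enumerate(entries.items()):
        exits.append((f"anon-{i}", t_in + dwell_times[old]))
    exits.sort(key=lambda e: e[1])
    return exits

def timing_attack(entries, exits, max_dwell):
    """Link an exit back to an entry whenever only one entry time is
    consistent with the observed exit time (anonymity set of size 1)."""
    linked = {}
    for new, t_out in exits:
        candidates = [old for old, t_in in entries.items()
                      if 0 < t_out - t_in <= max_dwell]
        if len(candidates) == 1:
            linked[new] = candidates[0]
    return linked

# Two users whose visits to the zone do not overlap in time:
entries = {"alice": 0, "bob": 50}
exits = mix_zone_exit(entries, {"alice": 5, "bob": 5})
print(timing_attack(entries, exits, max_dwell=10))
# Both fresh pseudonyms are re-identified despite the pseudonym change.
```

The sketch shows why pseudonym rotation alone is insufficient: anonymity depends on several users being inside the zone simultaneously, which an attacker observing entry and exit times can often rule out.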

Thus, it will be necessary to find and incorporate a unique privacy model that accentuates both social and legal norms, while ensuring the technical ability to protect privacy.

Newman, A. 2008, Protectors of Privacy: Regulating Personal Data in the Global Economy, Cornell University Press.

Miller, S. and J. Weckert 2000, "Privacy, the Workplace and the Internet," Journal of Business Ethics, vol. 28, no. 3, pp. 255-65.

OECD 1980, Recommendation Concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, September 1980. Available at: http://www.oecd.org/document/18/0,2340,en_2649_34255_1815186_1_1_1_1,00.html

See also: Caloyannides, M. 2004, Privacy Protection and Computer Forensics, 2nd ed., Artech House; Bennett, C.J. and C. Raab 2006, The Governance of Privacy: Policy Instruments in Global Perspective, MIT Press.

For additional discussions on the efficacy of sensitive monitoring devices, see: Glasziou, P. et al., eds., 2008, Evidence-Based Medical Monitoring: From Principles to Practice, BMJ Press; Jovanov, M. and D. Raskovic 2000, "Issues in Wearable Computing for Medical Monitoring Applications," Wearable Computers: The Fourth International Symposium, pp. 43-49. Available at: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=888463

See: Blanchard, J. 2004, "Ethical Considerations of Home Monitoring Technology," Telemedicine Information Exchange, vol. 1, no. 4, pp. 63-64. Available at: http://tie.telemed.org/articles/article.asp?path=homehealth&article=ethicsAndHomeTech_jb_hhct04.xml

See: Yamada & Kamioka 2005.

See: International Telecommunications Union 2005, "Privacy and Ubiquitous Network Societies," ITU Workshop on Ubiquitous Network Societies, New Initiatives Program, April 5-8, 2005. Available at: http://www.itu.int/osg/spu/ni/ubiquitous/Papers/Privacy%20background%20paper.pdf

Do you really need this here? If you do, a complete scholarly reference will be necessary. I was not able to find one that fit your paragraph's intent.

See the technical discussion in: Golle, P. et al., n.d., "Data Collection With Self-Enforcing Privacy." Available at: http://crypto.stanford.edu/~pgolle/papers/selfprivacy.pdf

See: Aceituno, V. 2005, "On Information Security Paradigms," ISSA Journal, September; Casey, E. 2004, Digital Evidence and Computer Crime, 2nd ed., Academic Press.

