Excerpt from Term Paper:
As Jacko and Sears emphasize, "As the scope and sophistication of digital systems become ubiquitous, the pressure for improved human-computer interaction methodologies will continue to increase" (2003, p. 15). As noted above, the enabling technologies for a ubiquitous computing environment already exist to a large degree, but there are three things still missing from the picture that will provide the level of seamless interaction demanded by such an environment: (a) multi-industry cooperation, (b) the systems engineering required to make them all work together seamlessly, and, most importantly, (c) the human factors knowledge and experience to understand just what it would mean for them to be transparent to the human user (Jacko & Sears, 2003, p. 14). These three missing elements also form the purpose of the study proposed herein, which is described further below.
Purpose of the Study
The purpose of the proposed study is to provide a working definition of ubiquitous computing and, based on current and future trends in human-computer interaction, to extrapolate the likely paths that interaction will take in the future, paths that will provide humans with the ability to interact with a ubiquitous computing environment in the ways envisioned today.
Jacko, J.A. & Sears, A. (2003). Human-computer interaction handbook: Fundamentals, evolving technologies, and emerging applications. Mahwah, NJ: Lawrence Erlbaum Associates.
Dragon Speech Recognition Software. (2010). Nuance Software. [Online]. Available:
The Evolution of Software Engineering: Where is the Industry Headed?
Software engineering has evolved in major ways over the past six decades. During the mid-20th century, software engineers were limited both by scarce computer processing power and by a lack of experience in the field. By sharp contrast, software engineering today draws on a growing body of knowledge, and software engineers truly "stand on the shoulders of giants." This progression has moved software engineering from primitive assembly languages, through programming languages such as Fortran, BASIC and COBOL, to the highly intuitive software design tools of today, in ways that were unforeseeable just a few years ago, and it is likely that the next generations of software engineering will take the industry further still. Because software is typically designed based on estimates of computer processing speeds that are not yet available, it is important to determine how software engineers use their tools today in order to project where the industry is headed in the future, a need that forms the basis for the study proposed herein and which is discussed further below.
Statement of the Problem
In the Age of Information, computers have assumed a critical and global role in entertainment, business, government, education and commerce. Software engineering has been responsible for this explosion in computer usage and current indicators suggest that these trends will continue in the future as well. For example, according to Lohr (2001), "The rise of 'software engineering' was driven by the same force that led to COBOL - the recognition that computing was moving into the mainstream of business, commerce, and government, and that software was crucial to that happening, but also a growing problem" (p. 53). The "growing problem" referred to by Lohr concerned the need for better collaboration among software engineering teams and improved ways to allocate resources where they would do the most good (Karn & Cowling, 2008). According to these authors, "Software engineering team members must be able to work together effectively in order to maximize their potential" (Karn & Cowling, 2008, p. 583). There is also the issue of how to best use what has already been developed in formulating new applications. Throughout history, engineers have tended to reuse what has been shown to work best over and over until something better was identified. In this regard, Sutcliffe (2002) reports that, "Designs have been reused since the Egyptians invented the pyramid and Romans discovered the arch. In civil engineering, buildings are constructed from reusable components. Many houses come as prefabricated items, for example, window frames, roof segments, floors, and walls" (p. 1).
Likewise, software engineers have traditionally reused code that has been shown to function well, but this tendency has resulted in some stagnation in the industry. According to Sutcliffe, "This message has not been lost on the software industry, but the development of component-based software engineering has been less rosy. This is not for want of trying. Reuse started with component-based reuse in programming languages; for instance, the Include statement in COBOL's file definition enabled programmers to reuse data structures" (p. 1). Today, the majority of computer programming languages give software engineers the ability to reuse source and executable code; this process, though, ultimately leads to the question: "Just where does reuse start? Opportunistic reuse of code might be better described as salvaging or carrying code over from one application to another, whereas true reuse implies some conscious design of the module-design-code for reuse in the first place" (Sutcliffe, 2002, p. 1).
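Sutcliffe's distinction between salvaging code and designing for reuse can be made concrete with a brief sketch. The following illustration, written in Python purely for exposition (the function names and the tax-calculation scenario are hypothetical, not drawn from any of the cited sources), contrasts the two practices:

```python
# Salvaging: the same logic carried over from one application to another,
# duplicated, with the variable details hard-coded in each copy.
def report_invoice_total(prices):
    total = 0.0
    for p in prices:
        total += p * 1.08  # 8% tax rate baked into the copy
    return round(total, 2)

def report_order_total(prices):
    total = 0.0
    for p in prices:
        total += p * 1.08  # the same code, salvaged and duplicated
    return round(total, 2)

# True reuse: one component consciously designed for reuse, with the
# parts that vary between applications (tax rate, rounding precision)
# exposed as parameters.
def compute_total(prices, tax_rate=0.08, precision=2):
    """Reusable totaling component: callers configure what varies."""
    return round(sum(p * (1 + tax_rate) for p in prices), precision)
```

In the reusable version, both applications call the single component with their own parameters rather than each carrying a private copy of the logic, which is the "conscious design of the module for reuse" that Sutcliffe describes.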
In order for software engineers to take computing to the next, and subsequent, levels, there must be innovation and new ideas introduced into software engineering that mere reuse precludes. There is also a need for a more robust understanding of how humans actually use software in order for the Age of Information to realize its true potential. In this regard, Narayan and Nerurkar (2006) point out that, "Many good systems have failed because they did not capture the psychological aspects of the implementation. Competency of the information and communication technology, requirement engineering processes and ground level knowledge is a must for success" (p. 33). Improved competency among software engineers will also require better ways for them to collaborate in identifying and resolving developmental constraints and obstacles. For instance, according to Defranco-Tommarello and Deek (2005), "Problem solving is fundamental to software development. A comprehensive model that takes into consideration cognitive issues involved in a group collaborating during problem solving and program development is missing" (p. 5). While it is clear that software engineering has evolved in substantive ways over the past 60 years, it is equally clear that there are a number of problems that must be overcome in order for the industry to proceed in the future, and these issues form the purpose of the study proposed herein, which is described further below.
Purpose of the Study
The purpose of the proposed study is to provide a comprehensive analysis of the origins of the software engineering industry and the various stages it has progressed through up to the present day, in order to project these trends into the future and to determine what problems must be overcome for software engineering to live up to its expectations.
Defranco-Tommarello, J. & Deek, F.P. (2005). A study of collaborative software development using groupware tools. Journal of Interactive Learning Research, 16(1), 5-6.
Karn, J.S. & Cowling, A.J. (2008). Measuring the effect of conflict on software engineering teams. Behavior Research Methods, 40(2), 582-583.
Lohr, S. (2001). Go to: The story of the math majors, bridge players, engineers, chess wizards, maverick scientists, and iconoclasts, the programmers who created the software revolution. New York: Basic Books.
Narayan, G. & Nerurkar, A.N. (2006). Value-proposition of…
"New Applications For Artificial Intelligence" (2010, January 06) Retrieved December 10, 2016, from http://www.paperdue.com/essay/new-applications-for-artificial-intelligence-74497