Computers have brought great change to our lives. A spreadsheet is a software program that turns the computer screen into the sheet of paper one is working on, saving time by reducing errors and repetitive calculation. Spreadsheets are commonly used in physics laboratories to obtain accurate results and record data properly. Although they have traditionally been used by accountants for bookkeeping and budgeting, they can be powerful tools for engineers and scientists as well: raw data entered into a spreadsheet can be manipulated and plotted with a few simple commands, and this built-in plotting capability makes spreadsheets especially useful (Karmakar et al., 2007).
First introduced into the corporate world in the late 1970s and early 1980s, the spreadsheet proved an immediate success. Its usefulness, power, and adaptability helped ignite the broad-based approval and adoption of desktop computers in business. Today, electronic spreadsheets are ubiquitous, and there are plenty of reasons why (Karmakar et al., 2007).
Spreadsheets have become indispensable tools for everyone involved in financial calculation and modelling. According to a recent article on accountingWEB.com, around 92% of public companies use spreadsheets for their critical financial accounting activities, with usage ranging from revenue accounting to revenue scheduling, redistribution, and allocation (Karmakar et al., 2007). The article highlights that spreadsheets are so widely used because many of the key revenue-identification and reporting tasks are not fully automated in today's business systems.
The capabilities of spreadsheets, which include conditional statements, calculations, and programming through macros, make them well suited to creating all sorts of ad hoc applications. They can be used in many areas, including budgeting, inventory control, financial modelling, reporting, and entry of financial data (Karmakar et al., 2007).
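The kind of ad hoc conditional calculation described above, such as a spreadsheet formula of the form =IF(actual>budget,"over","ok"), can be sketched in a few lines of Python; the line items and figures below are hypothetical and are used purely for illustration.

```python
# A minimal sketch of a spreadsheet-style conditional budgeting check.
# The items and amounts are hypothetical.

budget_rows = [
    {"item": "travel",   "budget": 5000, "actual": 6200},
    {"item": "supplies", "budget": 1200, "actual": 900},
    {"item": "training", "budget": 3000, "actual": 3000},
]

def flag(row):
    """Mirror a spreadsheet IF formula: flag items that exceed budget."""
    return "over" if row["actual"] > row["budget"] else "ok"

for row in budget_rows:
    variance = row["actual"] - row["budget"]
    print(f'{row["item"]}: variance={variance}, status={flag(row)}')
```

The same logic a spreadsheet user would enter as a formula and copy down a column is expressed here as a function applied to each row.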
Paradoxically, what makes a spreadsheet so attractive proves to be one of its shortcomings. Companies need to weigh the ease of access, flexibility, and low procurement cost of spreadsheets against their shortcomings when used at the enterprise level. These shortcomings can be classified into the following three areas (Karmakar et al., 2007):
• a lack of data integrity, as values may be changed deliberately or accidentally;
• proneness to error, with mistakes arising during input, in logic, or at data interfaces;
• use that does not conform to the standard IT regimes expected of tools running highly critical applications, such as documentation, version control, and testing.
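As a sketch of one way the first shortcoming might be mitigated, a sheet's values can be fingerprinted with a cryptographic hash so that later changes, accidental or deliberate, become detectable; the sheet layout and values below are hypothetical.

```python
# A sketch of a data-integrity check: hash all cell values so that any
# later change alters the fingerprint. The sheet data is hypothetical.
import hashlib

def sheet_fingerprint(rows):
    """Hash all cell values in a stable order so any change is detectable."""
    h = hashlib.sha256()
    for row in rows:
        for cell in row:
            h.update(repr(cell).encode("utf-8"))
        h.update(b"\n")  # row boundary, so [[1, 2]] differs from [[1], [2]]
    return h.hexdigest()

original = [[100, 200], [300, 400]]
baseline = sheet_fingerprint(original)

tampered = [[100, 200], [300, 401]]  # a single cell silently changed
assert sheet_fingerprint(tampered) != baseline
```

Storing the baseline fingerprint outside the spreadsheet allows a later audit to confirm that no cell has been altered since it was taken.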
Spreadsheets are an important data-analysis tool used in decision making. However, they cannot perform the complete analysis that decision making requires; relying on them alone carries a high risk that decisions will be wrong and may lead to business losses (Karmakar et al., 2007).
Despite all these shortcomings, spreadsheets are used on a vast scale. Not every organisation can afford, or manage, to develop a customised application to replace spreadsheets in its business operations. Organisations therefore need to understand both the utility of spreadsheets and their likely shortcomings, so that they can plan better for the functions spreadsheets do not perform. At the same time, many functions that spreadsheets perform reliably would be highly error-prone if done manually, which is one reason organisations do not abandon them (Karmakar et al., 2007).
Risks inherent in the usage of spreadsheets
Errors in spreadsheets have been noted across the industry. In July 2002, The Journal of Property Management published a report stating that 30 to 90% of spreadsheets contain errors, and that the errors are likely to be significant from a decision-making point of view. In May 2004, a Computer World article presented similar findings, stating that 91% of the spreadsheets used in decision making contained errors (as cited in Ragsdale, 2008).
These errors are significant and the industry cannot ignore them, as they produce seriously misleading information. A few examples are given below (Ragsdale, 2008).
• An ad hoc process mistakenly reported a figure of $644M;
• Accumulated spreadsheet errors produced a figure of $30M;
• A faulty spreadsheet projected an error of $11 million;
• An erroneous data entry produced a figure of $118,387 (Ragsdale, 2008).
Several factors determine the range and magnitude of errors. Simple errors may occur in the coding of macros and spreadsheet formulas. These are sometimes called simple user errors and can be as basic as a copy-and-paste mistake. The probability of a material error increases with the complexity of the spreadsheet. Improperly referenced cells, cross-linking, and inappropriately defined cell ranges are also common causes of errors, and end users are often unable to see them. Macros in spreadsheets may be untested, undocumented, and outside any form of version control. Data copied or imported into spreadsheets may be altered or converted unintentionally. Such risks arise accidentally, but there are also many cases in which spreadsheets have been cleverly manipulated for fraud. Errors in spreadsheets therefore need to be identified promptly, as they can have drastic consequences for the decision-making process (Ragsdale, 2008).
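A toy sketch of the "inappropriately defined cell range" error described above: a total whose range fails to cover the last row is silently understated, and no warning of any kind is raised. The figures are hypothetical.

```python
# How an off-by-one cell range silently understates a total.
# The sales figures below are hypothetical.

sales = [1200, 950, 1100, 1300, 2750]  # cells B2:B6, say

correct_total = sum(sales)      # like =SUM(B2:B6)
faulty_total = sum(sales[:-1])  # like =SUM(B2:B5) -- range not extended

print(correct_total)  # 7300
print(faulty_total)   # 4550 -- silently wrong; no error is flagged
```

Nothing distinguishes the faulty total from the correct one at a glance, which is precisely why such range errors survive into material figures.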
Spreadsheets are used extensively in decision making despite the persistent occurrence of errors. Their tabular format offers a level of accessibility and flexibility in data analysis that makes the decision-making process more persuasive, and this is a core reason for their wide usage. Unique responsibilities and sensibilities are associated with the decision-making function (Ragsdale, 2008).
Decision-making and data analysis
Several essential attributes need to be considered when evaluating data analysis for business decision-making. Some of them are discussed below.
Decision makers may find it difficult simply to access the data, which is attributable to the data-request procedure in place at most organizations. A typical scenario runs as follows (Kong et al., 2009):
• An individual requests a data extract or report for analytical purposes.
• A data request is submitted to the IT department, specifying the data type, the information needed, the time period to be covered, and the output format.
• IT places the request on its schedule.
• The required data arrives two or three weeks later.
• An initial analysis is conducted.
• Some data turns out to be missing (such as a vendor ID, postal code, or account number), or additional information is required that was not identified when the request was first placed (Kong et al., 2009).
• An additional request is submitted to the IT department, and the waiting period starts again (Kong et al., 2009).
Both internal and external decision-making processes encounter the scenario described above. First, the decision-making cycle leaves little room for long waits, and decision makers cannot afford to wait several weeks for the required data to be extracted. Second, data extraction requires report writing or programming by IT, which introduces some filtering or alteration of the data and can unintentionally omit critical information. This raises questions about the integrity of the data and compromises both ongoing and ad hoc objectives (Kong et al., 2009).
Decision makers are under pressure to deliver more results in less time, which requires removing obstacles and streamlining the decision-making process. Decision makers need direct access to data, and an effective decision-making technology is one that provides it. This not only streamlines the decision-making process but also relieves a busy IT workforce of the frequent data requests that the decision-making function generates. How suitable a data-analysis technology is for decision making is largely governed by the variety, volume, and veracity of the data it can handle. Spreadsheets meet these requirements well, which makes the decision-making process quicker and more efficient (Kong et al., 2009).
To analyse the data available for decision making, an effective and appropriate analysis methodology must be used. In the decision-making process, statistical sampling is considered one of the most important tools for analysing data accurately and identifying errors, anomalies, and exceptions. Advanced data analysis not only gives a full record of business operations but also identifies transaction errors early, so that corrective action can be taken before a problem becomes unmanageable. This is why spreadsheet modelling tools are used for such analytical purposes (Şeref et al., 2008).
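The sampling approach described above can be sketched as follows: estimate the distribution of transaction amounts from a random sample, then flag transactions that deviate sharply from it. The threshold, seed, and data below are hypothetical choices made purely for illustration.

```python
# A minimal sketch of statistical sampling for anomaly detection in
# transaction data. Threshold, seed, and amounts are hypothetical.
import random
import statistics

def flag_outliers(transactions, sample_size=50, z_cutoff=3.0, seed=42):
    """Estimate mean and stdev from a random sample, then flag amounts
    that deviate from the sample mean by more than z_cutoff stdevs."""
    rng = random.Random(seed)
    sample = rng.sample(transactions, min(sample_size, len(transactions)))
    mean = statistics.mean(sample)
    stdev = statistics.stdev(sample)
    return [t for t in transactions if abs(t - mean) > z_cutoff * stdev]

# Mostly routine amounts, plus one gross data-entry error.
amounts = [100 + (i % 7) for i in range(200)] + [9999]
print(flag_outliers(amounts))  # the 9999 entry is flagged
```

Inspecting a sample rather than the full population keeps the check cheap, while gross errors still stand out against the estimated distribution.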
A research report by the Gartner Group notes that firms have gathered very large volumes of data, growing at around 30% per annum, in support of accurate decision making. The data exists in such large volumes that it is downloaded or imported to a PC for analysis. In order to coordinate…