Evaluation of Science and Technology Policies - Essay Example


Introduction

The investor-based research funding system implemented in European countries increases the importance of research evaluation. Investors require not misty promises but reliable data proving the value of a proposed research project. Therefore, the development of research evaluation policies and techniques is an integral part of research management. Evaluation techniques can be divided into ex ante and ex post. While ex post evaluation has always been present and is now a standard, ex ante assessment presents greater interest, as it allows one to assess the future returns on investments. Each evaluation of research, regardless of type, requires experts, funds, and time to be carried out properly. A number of measures, called performance indicators, are usually devised in order to simplify the evaluation process and make it more easily readable by non-expert groups (e.g. investors). "Simply put, performance indicators are measures that describe how well a programme is achieving its objectives. Indicators are usually quantitative measures but may also be qualitative observations. They define how performance will be measured along a scale or dimension" (USAID Center for Development Information and Evaluation, 1996). The question raised in this essay can be formulated as follows: is it possible to rely on performance indicators without evaluation itself, and what would the consequences be? To answer that question, the essay first clarifies the concept of evaluation, its development in research policy, its relations with performance indicators (PIs), and the limitations of PIs, and finally demonstrates with the help of two examples that substituting mere PIs for evaluation would lead to the decline of investor-funded science itself.
Definition of evaluation

Let us first get acquainted with the concept of evaluation by answering a simple question: what is evaluation, and why do we need it in research? Generally, evaluation can be defined as follows: "Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object" (Trochim, 2002). In other words, evaluation provides the interested parties with feedback that will be useful, i.e. will help in the decision-making process. This leads us to the answer to the second part of the question: evaluation is needed in research to make funding policy more effective. If the evaluation process provides correct feedback about the usefulness of candidate scientific projects, then the most 'useful' projects will receive funding, which will lead to the development of 'useful' science. The word 'useful' is placed in quotation marks advisedly, as it raises another important question: what science can be called useful? That question, however, lies outside the scope of this essay. Initially, evaluation can be divided into two types: formative and summative. Whereas formative evaluation examines the delivery of the project or technology, the quality of its implementation, and the assessment of the organisational context, personnel, procedures, inputs, and so on, summative evaluation analyses the effects of the project, determining its overall impact (Trochim, 2002). Each of these types benefits from the use of performance indicators, because to determine both the implementation and the impact, a number of measures have to be devised.

Development of evaluation

It is evident that the evaluation process itself is constantly changing. To put it differently, the emphasis of evaluation shifts in accordance with the current research evaluation policy.
"In most European countries an 'evaluation culture' in science, technology and innovation policies has evolved since the 1980s, including the ex post evaluation of research programmes and other policy initiatives, the evaluation of R&D centres and universities, and the evaluation of R&D funding agencies" (Kuhlmann, 2000). Rip characterises the changes in R&D evaluation through a triangle with accountability, strategic change, and decision support at its corners: "The triangle is not just a metric. It also presents the tensions and pressures at work around R&D evaluations, and can thus also be used as a diagnostic tool, and stimulate reflection" (Rip, 2000). Accountability reflects the interests of investors and leads to an audit-based type of evaluation, in which profit making is the main objective of research. Put simply, it neglects the scientific meaning and reduces projects to investment opportunities. Decision-support evaluation uses a combination of qualitative and quantitative measures in an attempt to assess the efficiency of a research project; a peer review can serve as an example of such evaluation. Finally, strategic change reflects the ability of policy makers to affect research evaluation in order to assess the appropriateness of policy goals and of the promises of actual and possible directions of research. Thus the triangle reflects the views of the three main groups affecting evaluation: investors trying to make a profit, experts or peers interested in increasing scientific knowledge, and policy makers striving to develop research policy.

The use of performance indicators

Performance indicators serve to measure a research project in certain ways. However, to determine those ways and to better understand what makes a good performance indicator, several requirements should be outlined.
In general, a golden rule for performance indicators can be formulated as follows: "Whatever Key Performance Indicators are selected, they must reflect the goals, they must be key to its success, and they must be quantifiable (measurable)" (Reh, 2005). For instance, if an accountability evaluation of research is undertaken, then its goal is to maximise investors' profit. Therefore, the measures should be financial: for example, profitability ratios. To make the best use of performance indicators, several steps should be carried out. First, a clear research goal should be stated. Second, quantifiable performance indicators reflecting progress towards the selected goal should be identified. Third, each candidate indicator must be analysed from the perspective of its usefulness. Finally, the fourth step is to pick out the best indicators reflecting the selected goal. As can be seen, the use of performance indicators varies greatly depending on the interested party that uses them.

Relations between performance indicators and evaluation

Indeed, performance indicators help in conducting research evaluation. Usually evaluation is a complex process of examining a research project from different perspectives. Without performance indicators, the results provided by an expert group after a significant amount of time are usually too difficult for a non-expert to comprehend. Meanwhile, investors do not want to hear blind promises; they want to know how exactly the proposed research will help them. Placed in this context, performance indicators are a perfect tool for making the results of a complicated analysis easily understood by every interested person. However, performance indicators are not all-powerful. In fact, they have a number of serious drawbacks that make them useless without research evaluation.
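As an illustration, the four-step indicator selection described above can be sketched as a short filter. This is only a hypothetical sketch: the goal, the candidate indicators, and the usefulness scores are all invented for the example, not taken from any of the cited frameworks.

```python
# Hypothetical sketch of the four-step performance-indicator selection.
# All indicator names and scores are invented for illustration.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    quantifiable: bool    # can it be measured along a scale? (golden rule)
    reflects_goal: bool   # does it track progress towards the stated goal?
    usefulness: float     # expert rating of the indicator, 0..1 (step 3)

def select_indicators(candidates, threshold=0.5):
    """Steps 2-4: keep quantifiable, goal-relevant indicators whose
    assessed usefulness clears the chosen threshold."""
    return [c.name for c in candidates
            if c.quantifiable and c.reflects_goal and c.usefulness >= threshold]

# Step 1: state a clear research goal (here, an accountability-style goal).
goal = "maximise return on research investment"
candidates = [
    Indicator("profitability ratio", True, True, 0.8),
    Indicator("scientific elegance", False, False, 0.9),  # judged, not counted
    Indicator("citation count", True, False, 0.6),        # measurable, off-goal
]
print(select_indicators(candidates))  # ['profitability ratio']
```

The sketch also makes the limitations visible: a qualitative matter such as "scientific elegance" is excluded at step 2 however highly experts rate it, which is exactly the "not everything that counts can be counted" problem discussed below.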
"Indicators in themselves poorly reflect the complexity of research and development. The initial experience suggests that indicators, if appropriately defined, collated and benchmarked, can offer useful insights into qualitative and quantitative research science and technology performance" (Bartle, 2001). Thus it is wise to use performance indicators in research evaluation: they make it easy to see whether the goals set were achieved at the post-assessment stage, and they help to provide more accurate forecasts at the initial stages. Nevertheless, one should not forget that the use of performance indicators is strongly limited. Their limitations are examined in the next part of the essay.

Limitations of performance indicators

Each of the four steps of performance indicator determination contains limitations that complicate their use. At the initial stage, figuring out the research goal is not always easy. Sometimes the goal identified by the expert group is less important than one it failed to identify. In that case, the indicators the group develops will reflect progress towards the less important goal, while the other remains unmonitored. The next step concerns determining the performance indicators, and not everything can be traced with them. "Not everything that counts can be counted. Some matters can only be judged, that is to say they can only be assessed in a qualitative way. Unless matters can be reduced to measurable standards and indicators, the managers will not be able to exert significant influence" (Spigelman, 2001). The third stage carries the risk of improper assessment of the indicators. Of all the risks, this is the least significant, because the criteria for good indicators are often discussed and well developed. The final step of choosing the right indicators depends greatly on the criteria of 'good' indicators developed in step three and on the set of indicators devised in step two.
Still, it is possible that the specifics of the research do not allow developing any performance indicator matching such criteria. All these limitations make the sole use of performance indicators, without research evaluation, too risky. The following examples, illustrating the combined use of performance indicators and evaluation, support that point.

Example 1: Use of performance indicators with peer review evaluation in the UK (HEFCE)

The Higher Education Funding Council for England (HEFCE) is an organisation promoting teaching and research in higher education. The Research Assessment Exercise (RAE) traditionally relies on a peer review evaluation method, in which a group of anonymous experts works independently of the proposal's authors and of one another to assess the research in question. The main criteria for approval of a candidate research project are validity, significance, and originality. Alongside the RAE, HEFCE also makes great use of performance indicators. This example draws on the research performance indicators selected for 2001-02, which were as follows: proportion of PhDs awarded per proportion of academic staff costs; proportion of PhDs awarded per proportion of funding council QR funding allocation; proportion of research grants and contracts obtained per proportion of academic staff; and proportion of research grants and contracts obtained per proportion of funding council allocation. "The indicators here look at numbers of PhDs awarded and amount of research grants and contracts obtained, relative to the academic staff costs of an institution, and relative to the funding council allocation of quality related (QR) research funds to that institution" (HEFCE, 2003). In spite of the importance of these indicators, HEFCE stresses, in point 1 of the accompanying tables, that "[performance indicators] are not intended to replace the Research Assessment Exercise, which remains the most reliable indicator relating to quality of research" (HEFCE, 2003).
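The first HEFCE ratio indicator above can be computed as a share-of-output over share-of-input ratio. The sketch below is hedged: all figures are invented, and HEFCE's published methodology, not this simplification, defines the real calculation.

```python
# Hedged sketch of a HEFCE-style ratio indicator: an institution's share of
# PhDs awarded relative to its share of academic staff costs. All figures
# are invented for illustration only.

def share(value, sector_total):
    """An institution's proportion of a sector-wide total."""
    return value / sector_total

def phds_per_staff_cost(phds, sector_phds, staff_cost, sector_staff_cost):
    """Proportion of PhDs awarded per proportion of academic staff costs.
    A value above 1.0 means the institution awards more than its
    cost-weighted 'expected' share of PhDs."""
    return share(phds, sector_phds) / share(staff_cost, sector_staff_cost)

# An institution awarding 5% of sector PhDs with 4% of sector staff costs:
ratio = phds_per_staff_cost(50, 1000, 4_000_000, 100_000_000)
print(round(ratio, 2))  # 1.25
```

Note how little the number says on its own: whether 1.25 reflects efficient doctoral training or lax standards is precisely the qualitative question the RAE's peer review is needed to answer.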
Therefore, while the performance indicators provided by HEFCE allow one to trace the performance of research, they are calculated alongside the evaluation carried out through the RAE. Moreover, that evaluation is explicitly given first place in importance.

Example 2: Use of performance indicators in bibliometrics

Bibliometrics, or the use of statistical methods in science evaluation, can serve as one of the best examples of performance indicators. The method is known for its quantitative nature, providing performance and activity indicators such as citation and publication indices. Nevertheless, bibliometrics "is not designed to evaluate research results … bibliometrics is not designed to override or even to substitute peer reviews or evaluation by experts, but qualitative and quantitative methods in science studies should complement each other" (Glanzel, 2002). Its tools can and should be used to enhance the peer review method, but it is almost impossible to represent the value of every piece of research with a single indicator. Many academic works call the power of the citation index into question. For example, Langford finds the citation index limited by time constraints: "the time sequence of development of citation information must be respected. Citation analysis has little to say about this year's work" (1999, p. 10). In turn, Gingras (1995) stresses that the number of papers is of course a direct measure of output, but is certainly not a direct measure of impact or quality. A good indicator ought to have a consistent meaning, while sociologists continue to debate the meaning of citations. Thus, while the citation index is a widely used indicator, it is of little use without the evaluation of research.

Conclusion

The trend to simplify the evaluation process, driven by investor parties striving to receive a transparent and simple answer on a project's usefulness, is easily understood.
"Research and innovation policy is more an exception than a rule in being evaluated by specialists. Reduction of research and innovation activities to a set of easily understood performance indicators has appeal for politicians and other senior administrators" (Georghiou, 2000). However, performance indicators can serve only as a complement. It is hard to perform a qualitative evaluation of research with performance indicators alone, and it is even harder to express the true meaning of research with only a few numbers. Therefore, the soundest choice seems to be qualitative research evaluation by qualified experts, with performance indicators serving only as secondary tools.

References

Bartle, D. (2001). Performance indicators and research evaluation. Proceedings of the conference Innovation and Links: Research Management and Development & Postgraduate Education, Auckland, New Zealand.

Georghiou, L. (2000). Evaluation of research and innovation policy in Europe - new policies, new frameworks. Proceedings of the US-European Workshop on Learning from Science and Policy Evaluation. University of Manchester, UK. Retrieved February 22, 2006 from http://www.isi.fhg.de/ti/bh-proceed.pdf

Gingras, Y. (1995). Performance indicators: Keeping the black box open. Proceedings of the Second International Symposium on Research Funding, Ottawa, Canada. Retrieved February 22, 2006 from http://www.ost.uqam.ca/OSTE/pdf/articles/1995/proceedings_research_funding.pdf

Glanzel, W. (2002). A concise introduction to bibliometrics & its history. Steunpunt O&O Statistieken, Netherlands. Retrieved February 22, 2006 from http://www.steunpuntoos.be/bibliometrics.html

HEFCE. (2003). Research performance indicators 2001-02. Retrieved February 22, 2006 from http://www.hefce.ac.uk/learning/perfind/2003/reports/default.asptab=9

Kuhlmann, S. (2000). Evaluation as a source of strategic intelligence. Proceedings of the US-European Workshop on Learning from Science and Policy Evaluation. Fraunhofer Institute for Systems and Innovation Research, Germany. Retrieved February 22, 2006 from http://www.isi.fhg.de/ti/bh-proceed.pdf

Langford, C.H. (1999). The evaluation of research done in post secondary institutions. Council of Ministers of Education, Canada. Retrieved February 22, 2006 from http://www.cmec.ca/postsec/evaluation.e.pdf

Reh, F.J. (2005). Key performance indicators. Your Guide to Management, About, Inc. Retrieved February 22, 2006 from http://management.about.com/cs/generalmanagement/a/keyperfindic.htm

Rip, A. (2000). Societal challenges for R&D evaluation. Proceedings of the US-European Workshop on Learning from Science and Policy Evaluation. University of Twente, Netherlands. Retrieved February 22, 2006 from http://www.isi.fhg.de/ti/bh-proceed.pdf

Spigelman, J.J. (2001). Quality in an age of measurement: The limitations of performance indicators. Retrieved February 22, 2006 from http://www.lawlink.nsw.gov.au/lawlink/supreme_court/ll_sc.nsf/pages/SCO_speech_spigelman_281101

Trochim, W. (2002). Introduction to evaluation. Research Methods Knowledge Base. Retrieved February 22, 2006 from http://www.socialresearchmethods.net/kb/intreval.htm

USAID Center for Development Information and Evaluation. (1996). Performance monitoring and evaluation tips, number 6. Retrieved February 22, 2006 from http://www.dec.org/pdf_docs/pnaby214.pdf