1. Introduction

The government of South Korea has advanced a plan to transform government-supported research institutes into guidance-style R&D organizations for innovation and, accordingly, has phased in a mission-oriented evaluation system for these institutes. The dominant characteristic of mission-oriented evaluation for government-supported research institutes is the shift of performance indicators from quantity, such as the number of papers and patents, to quality, such as citation indices. At first, the quality indicators for papers, regarded as the most basic output of R&D, were the “number of SCI papers in the top 20% by standardized IF (Impact Factor)” and the “number of SCI papers in the top 10% by standardized CI (Citation Index)”, and they were applied to all government-supported research institutes subject to government evaluation. Problems with the IF estimation method have since been revealed, however, and the CI, which measures the influence of papers, has come to be perceived as a more useful standard. The CI, in turn, tends to differ depending on the characteristics of the R&D field and becomes measurable only after citations accumulate over a certain period. This means that, before papers can be evaluated for quality, the appropriate timing and scope of measurement must first be analysed. Consequently, this study examined the proper timing of measurement as a component of the performance indicators introduced to evaluate the quality of papers produced by mission-oriented government-supported research institutes. Citation half-life and the immediacy index were used as analytical tools, and ways to shift the previous evaluation system toward quality evaluation for mission-oriented government-supported research institutes were proposed.

2. Analysis of the Citation Characteristics of Papers by Mission-Oriented Government-Supported Research Institute Type

To analyse the citation characteristics, citation data for all SCI papers published in English from 1994 to 2004 in the R&D fields covered by the government-funded research institutes in South Korea were collected, and their frequencies and variances were analysed with the SAS statistical program. The analysis showed an average citation period of 4.67 years and an immediacy index of 1.38. The R&D fields with above-average citation periods were found at the Korea Research Institute of Bioscience & Biotechnology (KRIBB), the Korea Institute of Civil Engineering and Building Technology (KICT), the Korea Food Research Institute (KFRI), and the Korea Institute of Oriental Medicine (KIOM). For high-quality papers in the top 10% of annual average CI, the citation period was 5.81 years and the immediacy index was 8.41; in other words, high-quality papers are cited over a longer period and begin to be cited sooner than less outstanding ones. For papers in the top 20% of annual average CI, the knowledge-transfer period was 5.70 years and the immediacy index was 5.75. Across all R&D fields, the top-20% papers showed a slightly shorter knowledge-transfer period than the top-10% papers, and their speed of knowledge transfer was relatively slower. The citation period and immediacy index results for all fields were then compared with the proportion of investment by mission type across all government-supported research institutes.
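As a rough illustration of the mechanics behind the two indicators used here, the sketch below shows one plausible way to compute a paper set's citation half-life, immediacy, and top-10%/20% selection from per-year citation counts. It is a minimal sketch, not the study's actual SAS procedure: the Paper structure, the citations_by_year field, the top_share_by_citations helper, and the exact operational definitions of half-life and immediacy are illustrative assumptions, since the study derived its figures from Web of Science records with SAS.

# A minimal sketch, not the study's actual SAS procedure: one plausible way to
# compute citation half-life, immediacy, and top-share selection from per-year
# citation counts. All names and the exact operational definitions below are
# illustrative assumptions.
from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List, Optional

@dataclass
class Paper:
    pub_year: int
    # citations_by_year[y] = citations received in calendar year y
    citations_by_year: Dict[int, int] = field(default_factory=dict)

    @property
    def total_citations(self) -> int:
        return sum(self.citations_by_year.values())

def citation_half_life(paper: Paper) -> Optional[int]:
    """Years after publication needed to accumulate half of all citations."""
    total = paper.total_citations
    if total == 0:
        return None
    cumulative = 0
    for year in sorted(paper.citations_by_year):
        cumulative += paper.citations_by_year[year]
        if cumulative * 2 >= total:
            return year - paper.pub_year
    return None

def immediacy(paper: Paper) -> int:
    """Citations received in the publication year itself (one common reading
    of immediacy; the study may aggregate this differently)."""
    return paper.citations_by_year.get(paper.pub_year, 0)

def top_share_by_citations(papers: List[Paper], share: float) -> List[Paper]:
    """Papers in the top `share` (e.g. 0.10 or 0.20) by total citations."""
    ranked = sorted(papers, key=lambda p: p.total_citations, reverse=True)
    return ranked[:max(1, int(len(ranked) * share))]

if __name__ == "__main__":
    papers = [
        Paper(1998, {1998: 1, 1999: 4, 2000: 6, 2001: 3, 2004: 2}),
        Paper(2000, {2001: 2, 2002: 2, 2003: 1}),
        Paper(1995, {1995: 5, 1996: 9, 1997: 4}),
    ]
    half_lives = [h for h in map(citation_half_life, papers) if h is not None]
    print("mean citation half-life (years):", mean(half_lives))
    print("mean immediacy (same-year citations):", mean(map(immediacy, papers)))
    print("papers in the top 20% by total citations:",
          len(top_share_by_citations(papers, 0.20)))

Figures such as the 4.67-year average citation period or the 1.38 immediacy index reported above would correspond to aggregating such per-paper values over the full 1994-2004 paper set.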
When these results were compared with investment proportions, the speed of knowledge transfer in R&D fields with more investment in industrialization appeared faster than in fields with less investment. However, contrary to the common expectation that the citation period in fundamental R&D fields would be longer than in applied R&D, the citation period showed no direct correlation with the nature of the R&D field. This may reflect the ambiguity of the analysis, in which papers in each field could not be clearly divided into technology and theory. Even a pure-science field such as astronomy showed little difference, as the dependence of large instruments on high technology has grown. The analysis above indicates that the speed of change in R&D fields is accelerating as technological development and dependence on technology intensify. Nevertheless, citation speed in public-technology fields is slower than in other fields, while it is faster in industry-related fields. In addition, the differences among R&D fields are not tied to their individual characteristics, and the minimum period for a valid citation index measurement should exceed six years.

3. Proposals to Improve the Evaluation System for Mission-Oriented Government-Supported Research Institutes

Based on the technical and statistical analysis, this study proposes improvements from the perspectives of evaluation period and evaluation standard, which address short-term policy alternatives for objective and reasonable quality evaluation, and from the perspective of evaluation management, which addresses long-term measures for the stable production of quality outcomes.

From the perspective of the evaluation period, measuring the CI of R&D output produced over only 30 months is not appropriate for securing objectivity. To improve this, first, the president's term of office and the evaluation period should be extended so that achievement of goals can be evaluated accurately. Second, the evaluation period should be set according to the period over which the main accomplishments are produced and accumulated, depending on the mission and characteristics of each institute. Third, tracking-type evaluation should be intensified in place of one-time evaluation, reflecting the fact that quality accomplishments accumulate over a long time. Fourth, the range or standard of research accomplishments for quality evaluation should be extended from the present standard of three years to at least six years.

From the perspective of the evaluation standard, the performance indicators for quality evaluation are not definite in their methods and standards, and no database management for performance measurement exists. To solve these problems, first, new performance indicators for objective quality evaluation should be studied continuously. Second, a performance management system and official real-time systems linked to obtain reliable performance data, such as the CI, should be installed and used in actual evaluations. Third, in place of the CI, which may distort the spill-over effect of R&D output, new indicators suitable for evaluating challenging and adventurous research projects should be introduced, and qualitative evaluation using peer review should be reflected. Fourth, the internal project and individual evaluations carried out in each institute should be linked to the external evaluation in a consistent way.
From the perspective of evaluation management, the managing institutions and committee members have added to the confusion caused by a lack of understanding of the characteristics of government-funded research institutes. The first solution is to establish an evaluation research department within the managing institution and have it continuously monitor and feed back input-process-output-outcome indicators tailored to the government-supported research institutes, so as to secure evaluation credibility and consistency, instead of relying on temporary task projects or task-force activity reports. Second, evaluation should be scored on an absolute scale rather than a relative one, based on the nature and characteristics of each R&D institute, and the autonomy of each institute should be expanded so that it can choose its own indicators in place of common indicators that do not fit it. Third, preliminary training for evaluation committee members should be intensified so that they gain a better understanding of the evaluation system, the evaluation standards, and each R&D institute's characteristics.

4. Limitations of this Study and Further Work

This study focused on deriving directions for improving paper quality evaluation using citation characteristics, including citation half-life and the immediacy index; however, it was limited in a few aspects of its analysis methods and their application. The data drawn from the Web of Science database of Thomson Reuters do not cover the whole paper output of all R&D fields, and there are logical limitations in generalizing from an analysis that did not reflect the detailed characteristics of each R&D field. To improve the present quality evaluation of R&D accomplishments, further analysis at a more detailed level, or by science and technology classification, is required. In addition, the quality differences, and the factors behind them, between the output of government-funded research institutes and that of world-leading R&D organizations should be analysed and applied to policy-making aimed at creating R&D of global standing. Finally, further studies on citation measurement methods and new measurement indicators are needed to evaluate the scientific quality of R&D papers.
Table of Contents
Chapter 1. Introduction
  Section 1. Background and purpose of the study
  Section 2. Scope and methods of the study
Chapter 2. Theoretical discussion and review of prior studies
  Section 1. Theoretical review of paper evaluation
  Section 2. Review of the citation analysis literature
    1. Existing studies on citation analysis in science and technology fields
    2. Studies on using citation information in R&D performance evaluation
  Section 3. Summary and implications of prior studies
  Section 4. Research design
    1. Research questions and definitions of terms
    2. Analytical framework
    3. Data selection and analysis
Chapter 3. Analysis of the citation characteristics of papers by mission-oriented institute type
  Section 1. Policy changes in national R&D performance evaluation and institutional evaluation
    1. The first basic plan for national R&D performance evaluation
    2. The second basic plan for national R&D performance evaluation
    3. Overview of the institutional evaluation system for government-funded research institutes
    4. The mission-oriented institutional evaluation framework
  Section 2. Analysis of the half-life and immediacy of papers by mission-oriented institute type
    1. Overview of the analysis
    2. Citation characteristics of papers from basic and future-leading research institutes
    3. Citation characteristics of papers from public-infrastructure research institutes
    4. Citation characteristics of papers from industrialization-oriented research institutes
  Section 3. Synthesis and implications of the citation characteristics analysis by mission-oriented institute type
Chapter 4. Improvements to mission-oriented institutional evaluation through citation cycle analysis
  Section 1. The evaluation period perspective
  Section 2. The evaluation standard perspective
  Section 3. The evaluation management perspective
Chapter 5. Conclusion
  Section 1. Summary and implications of the findings
  Section 2. Limitations and future research tasks
References
ABSTRACT