A classification and summarization method for analysis of research activities in an academic faculty
Loures, Eduardo Rocha; Liao, Yongxin; Canciglieri Junior, Osiris
Abstract
Nowadays, more and more scientific research activities are carried out in laboratories and universities, which not only play an important role in the development of science and technology, but also exert a significant influence on education. Improving the research capability of an academic faculty can directly impact the quality of education, bring innovations to Industrial Engineering curriculum proposals, and guarantee that subjects remain up to date. Investigating the issues in current research activities is usually considered the primary and most challenging step. As the output of research activities, academic articles are often treated as evidence-based resources for such an investigation. Although some methodological efforts have been made by existing article review methods, less attention has been paid to discovering the implicit academic relationships among academic staff and to investigating their research expertise. The objective of this study is to address this drawback through the proposition of an Academic Information Classification and Summarization method. A case study is carried out in the Industrial and Systems Engineering Graduate Program (PPGEPS), PUCPR, Brazil. The result not only highlights the advantages that can be obtained from this proposition from the education perspective related to Industrial Engineering, but can also be used as evidence to balance and compare an academic staff member's research expertise against his/her teaching disciplines.
Keywords
References
Adler, N., Elmquist, M., & Norrgren, F. (2009). The challenge of managing boundary-spanning research activities: experiences from the Swedish context. Research Policy, 38(7), 1136-1149. http://dx.doi.org/10.1016/j.respol.2009.05.001.
Arksey, H., & O’Malley, L. (2005). Scoping studies: towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19-32. http://dx.doi.org/10.1080/1364557032000119616.
Arundel, A., & Constantelou, A. (2006). Conventional and experimental indicators of knowledge flows. In Y. Caloghirou, A. Constantelou & N. Vonortas (Eds.), Knowledge flows in European industry (pp. 45-66). London: Routledge. Retrieved July 21, 2016, from http://ecite.utas.edu.au/75683
Bercovitz, J., & Feldman, M. (2011). The mechanisms of collaboration in inventive teams: composition, social networks, and geography. Research Policy, 40(1), 81-93. http://dx.doi.org/10.1016/j.respol.2010.09.008.
Chen, N.-S., Wei, C.-W., & Chen, H.-J. (2008). Mining e-Learning domain concept map from academic articles. Computers & Education, 50(3), 1009-1021. http://dx.doi.org/10.1016/j.compedu.2006.10.001.
Feldman, M. P., Lanahan, L., & Lendel, I. V. (2014). Experiments in the laboratories of democracy: state scientific capacity building. Economic Development Quarterly, 28(2), 107-131. http://dx.doi.org/10.1177/0891242413490018.
Geuna, A., & Martin, B. R. (2003). University research evaluation and funding: an international comparison. Minerva, 41(4), 277-304. http://dx.doi.org/10.1023/B:MINE.0000005155.70870.bd.
He, Z.-L., Geng, X.-S., & Campbell-Hunt, C. (2009). Research collaboration and research output: a longitudinal study of 65 biomedical scientists in a New Zealand university. Research Policy, 38(2), 306-317. http://dx.doi.org/10.1016/j.respol.2008.11.011.
Hoekman, J., Frenken, K., & Tijssen, R. J. W. (2010). Research collaboration at a distance: changing spatial patterns of scientific collaboration within Europe. Research Policy, 39(5), 662-673. http://dx.doi.org/10.1016/j.respol.2010.01.012.
Hulth, A. (2003). Improved automatic keyword extraction given more linguistic knowledge. In Proceedings of the 2003 Conference on Empirical Methods in Natural Language Processing (Vol. 10, pp. 216-223). Stroudsburg, USA. http://dx.doi.org/10.3115/1119355.1119383.
Khangura, S., Konnyu, K., Cushman, R., Grimshaw, J., & Moher, D. (2012). Evidence summaries: the evolution of a rapid review approach. Systematic Reviews, 1(1), 10. PMid:22587960. http://dx.doi.org/10.1186/2046-4053-1-10.
King, J. (1987). A review of bibliometric and other science indicators and their role in research evaluation. Journal of Information Science, 13(5), 261-276. http://dx.doi.org/10.1177/016555158701300501.
Li, F., Miao, Y., & Yang, C. (2014). How do alumni faculty behave in research collaboration? An analysis of Chang Jiang Scholars in China. Research Policy, 44(2), 438-450. http://dx.doi.org/10.1016/j.respol.2014.09.002.
Liao, Y., Lezoche, M., Loures, E. R., Panetto, H., & Boudjlida, N. (2014). Formal semantic annotations for models interoperability in a PLM environment. In Proceedings of the 19th World Congress of the International Federation of Automatic Control (IFAC 2014) (pp. 2382-2393), Cape Town, South Africa. http://dx.doi.org/10.3182/20140824-6-ZA-1003.02551.
Matsuo, Y., & Ishizuka, M. (2004). Keyword extraction from a single document using word co-occurrence statistical information. International Journal of Artificial Intelligence Tools, 13(1), 157-169. http://dx.doi.org/10.1142/S0218213004001466.
Mullen, B. (1986). Basic meta-analysis: description of a statistical package. Behavior Research Methods, Instruments, & Computers, 18(2), 165-167. http://dx.doi.org/10.3758/BF03201018.
Smeby, J. C., & Try, S. (2005). Departmental contexts and faculty research activity in Norway. Research in Higher Education, 46(6), 593-619. http://dx.doi.org/10.1007/s11162-004-4136-2.
Smith, S., Ward, V., & House, A. (2011). ‘Impact’ in the proposals for the UK’s research excellence framework: shifting the boundaries of academic autonomy. Research Policy, 40(10), 1369-1379. http://dx.doi.org/10.1016/j.respol.2011.05.026.
Yih, W., Goodman, J., & Carvalho, V. R. (2006). Finding advertising keywords on web pages. In Proceedings of the 15th International Conference on World Wide Web (WWW ’06) (pp. 213-222), Edinburgh, Scotland. http://dx.doi.org/10.1145/1135777.1135813.