DocumentCode :
1890728
Title :
Applying source code analysis techniques: A case study for a large mission-critical software system
Author :
Haralambiev, Haralambi ; Boychev, Stanimir ; Lilov, Delyan ; Kraichev, Kraicho
Author_Institution :
Appl. R&D Center, Musala Soft, Sofia, Bulgaria
fYear :
2011
fDate :
27-29 April 2011
Firstpage :
1
Lastpage :
2
Abstract :
Source code analysis has been, and still is, an extensively researched topic with various applications in the modern software industry. In this paper we share our experience in applying various source code analysis techniques to assess the quality of, and detect potential defects in, a large mission-critical software system. The case study concerns the maintenance of a software system of a Bulgarian government agency. The system had been developed by a third-party software vendor over a period of four years, producing over 4 million LOC using more than 20 technologies. Musala Soft won a tender for maintaining this system in 2008. Although the system was operational, various issues were known to its users, so a decision was made to assess the system's quality with various source code analysis tools. The expectation was that the findings would reveal some of the problems' causes, allowing us to correct the issues and thus improve the quality and focus on functional enhancements. Musala Soft had already established a special unit, the Applied Research and Development Center, dealing with research and advancements in the area of software system analysis. A natural next step was therefore for this unit to use its know-how and in-house developed tools to perform the assessment. The team used several techniques that had been the subject of intense research: software metrics, code clone detection, defect and "code smell" detection through flow-sensitive and points-to analysis, software visualization, and graph drawing. In addition to open-source and free commercial tools, the team used internally developed ones that complement or improve on what was available.
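One of the techniques the abstract lists, code clone detection, can be illustrated with a minimal sketch. This is a hypothetical token-window fingerprinting example for illustration only, not the authors' tooling; the function names and window size are assumptions.

```python
# Minimal sketch of clone detection: fingerprint fixed-size windows of
# whitespace-normalized lines and report fingerprints that occur twice.
import hashlib
from collections import defaultdict

def normalize(line: str) -> str:
    # Collapse whitespace so formatting differences do not hide clones.
    return " ".join(line.split())

def find_clones(lines, window=3):
    """Return {fingerprint: [start indices]} for duplicated windows."""
    seen = defaultdict(list)
    for i in range(len(lines) - window + 1):
        chunk = "\n".join(normalize(l) for l in lines[i:i + window])
        fp = hashlib.md5(chunk.encode()).hexdigest()
        seen[fp].append(i)
    return {fp: starts for fp, starts in seen.items() if len(starts) > 1}

code = [
    "x = a + b",
    "y = x * 2",
    "print(y)",
    "pass",
    "x = a + b",
    "y = x * 2",
    "print(y)",
]
clones = find_clones(code)
print(clones)  # one duplicated window, starting at indices 0 and 4
```

Real clone detectors (and the flow-sensitive analyses the paper mentions) work on lexed tokens or ASTs rather than raw lines, which makes them robust to renamed identifiers; the principle of fingerprinting and grouping repeated fragments is the same.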
The internally developed Smart Source Analyzer platform that was used focuses on several analysis areas: source code modeling, allowing easy navigation through code elements and relations for different programming languages; quality audit through software metrics, aggregating various metrics into a more meaningful quality characteristic (e.g. "maintainability"); and source code pattern recognition, to detect various security issues and "code smells". The produced results presented information about both the structure of the system and its quality. As the analysis was executed at the beginning of the maintenance tenure, it was vital for the team members to quickly grasp the architecture and the business logic. On the other hand, it was important to review the detected quality problems, as this guided the team to quick solutions for the existing issues and also highlighted areas that would impede future improvements. The iPlasma tool and its System Complexity View (Fig. 1) revealed where the business logic is concentrated and which elements of the system are the most important and the most complex. The analysis with our internal metrics framework (Fig. 2) pointed out places that need refactoring because the code is hard to modify on request or practically impossible to test. The code clone detection tools showed places where copy-and-paste programming had been applied. The PMD, FindBugs, and Klocwork Solo tools were used to detect various "code smells" (Fig. 3). A number of the reported occurrences were indeed bugs in the system. Although these results were productive for the successful execution of the project, there were some challenges that should be addressed in the future through more extensive research. The two aspects we consider most important are usability and integration.
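The aggregation of raw metrics into a single quality characteristic such as "maintainability", as described for the Smart Source Analyzer platform, can be sketched as a weighted combination of normalized metric values. The metric names, normalization bounds, and weights below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: combine raw metrics into one "maintainability" score
# in [0, 1], where 1 is best. Bounds and weights are assumptions.
def maintainability(metrics: dict) -> float:
    norm = {
        # Lower cyclomatic complexity is better; saturate at 50.
        "cyclomatic_complexity": max(0.0, 1 - metrics["cyclomatic_complexity"] / 50),
        # A comment ratio of 0.3 or more is treated as fully adequate.
        "comment_ratio": min(1.0, metrics["comment_ratio"] / 0.3),
        # Duplication is a fraction of cloned code; less is better.
        "duplication": max(0.0, 1 - metrics["duplication"]),
    }
    weights = {"cyclomatic_complexity": 0.5, "comment_ratio": 0.2, "duplication": 0.3}
    return sum(weights[k] * norm[k] for k in weights)

score = maintainability(
    {"cyclomatic_complexity": 10, "comment_ratio": 0.15, "duplication": 0.2}
)
print(round(score, 2))  # → 0.74
```

The value of such an aggregate is that a maintainer reads one interpretable characteristic instead of a table of raw numbers, which is precisely the usability gap discussed below.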
Because most of the tools require a very deep understanding of the underlying analysis, the whole process requires tight cooperation between the analysis team and the maintenance team. For example, most of the available metrics tools provide specific values for a given metric without any indication of what the value means or what the threshold is. Our internal metrics framework aggregates the met
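The usability problem described above, a raw metric value with no indication of its meaning or threshold, can be addressed by attaching interpretation bands to each metric. The threshold values and rating labels below are illustrative assumptions, not the paper's.

```python
# Sketch: map a raw metric value to a rating via per-metric thresholds,
# so a maintainer can act on the number without knowing the analysis.
# Thresholds here are assumed for illustration.
def rate(metric: str, value: float, thresholds: dict) -> str:
    low, high = thresholds[metric]
    if value <= low:
        return "ok"        # within the acceptable band
    if value <= high:
        return "review"    # worth a look during maintenance
    return "refactor"      # likely to impede changes and testing

THRESHOLDS = {
    "cyclomatic_complexity": (10, 20),
    "method_length": (30, 60),  # lines per method
}

print(rate("cyclomatic_complexity", 7, THRESHOLDS))  # → ok
print(rate("method_length", 75, THRESHOLDS))         # → refactor
```

Presenting the rating alongside the raw value reduces the dependence on the analysis team that the abstract identifies as a key challenge.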
Keywords :
DP industry; pattern recognition; public domain software; security of data; software maintenance; software metrics; software quality; software tools; source coding; Bulgarian government agency; iPlasma; PMD; applied research and development center; business logic; code clone detection; code smells detection; FindBugs tool; graph drawing; implementation quality audit; Klocwork Solo tool; mission critical software system; open source tool; programming language; quality assurance; quality characteristic; security issue; Smart Source Analyzer; software development; software industry; software metrics; software system maintenance; software visualization; source code analysis; source code pattern recognition; system complexity; third party software vendor; Cloning; Maintenance engineering; Research and development; Software metrics; Software systems; case study; implementation quality; software maintenance; software metrics; source code analysis;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
EUROCON - International Conference on Computer as a Tool (EUROCON), 2011 IEEE
Conference_Location :
Lisbon
Print_ISBN :
978-1-4244-7486-8
Type :
conf
DOI :
10.1109/EUROCON.2011.5929241
Filename :
5929241