Software metrics data analysis - exploring the relative performance of some commonly used modeling techniques
aut.researcher | MacDonell, Stephen Gerard | |
dc.contributor.author | Gray, AR | |
dc.contributor.author | MacDonell, SG | |
dc.date.accessioned | 2011-10-03T00:57:04Z | |
dc.date.available | 2011-10-03T00:57:04Z | |
dc.date.copyright | 1999 | |
dc.date.issued | 1999 | |
dc.description.abstract | Whilst some software measurement research has been unquestionably successful, other research has struggled to enable expected advances in project and process management. Contributing to this lack of advancement has been the incidence of inappropriate or non-optimal application of various model-building procedures. This obviously raises questions over the validity and reliability of any results obtained as well as the conclusions that may have been drawn regarding the appropriateness of the techniques in question. In this paper we investigate the influence of various data set characteristics and the purpose of analysis on the effectiveness of four model-building techniques—three statistical methods and one neural network method. In order to illustrate the impact of data set characteristics, three separate data sets, drawn from the literature, are used in this analysis. In terms of predictive accuracy, it is shown that no one modeling method is best in every case. Some consideration of the characteristics of data sets should therefore occur before analysis begins, so that the most appropriate modeling method is then used. Moreover, issues other than predictive accuracy may have a significant influence on the selection of model-building methods. These issues are also addressed here and a series of guidelines for selecting among and implementing these and other modeling techniques is discussed. | |
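To make the kind of comparison described in the abstract concrete, the sketch below contrasts one statistical technique (ordinary least-squares regression) with a small neural network on a synthetic software-metrics data set, scoring each by cross-validated mean magnitude of relative error (MMRE). The choice of these two techniques, the synthetic data, and the MMRE measure are illustrative assumptions only; the paper's actual methods, data sets, and accuracy criteria are given in the full text.

```python
# Illustrative sketch only: the techniques, data, and accuracy measure here
# are assumptions for demonstration, not the paper's own methods or data sets.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

# Synthetic stand-in for a small software-metrics data set:
# two size/complexity measures predicting development effort (kept positive).
X = rng.uniform(1.0, 100.0, size=(60, 2))
effort = 100.0 + 3.0 * X[:, 0] + 0.5 * X[:, 1] ** 1.2 + rng.normal(0.0, 20.0, size=60)

def mmre(actual, predicted):
    """Mean magnitude of relative error, a common accuracy measure in effort estimation."""
    return float(np.mean(np.abs(actual - predicted) / np.abs(actual)))

models = {
    "linear regression (statistical)": LinearRegression(),
    "multilayer perceptron (neural network)": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
    ),
}

# 5-fold cross-validation so both techniques are judged on held-out projects.
for name, model in models.items():
    scores = []
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model.fit(X[train_idx], effort[train_idx])
        scores.append(mmre(effort[test_idx], model.predict(X[test_idx])))
    print(f"{name}: mean MMRE = {np.mean(scores):.3f}")
```

Repeating such a comparison across data sets with different characteristics (size, skew, collinearity) is, at a high level, how the paper shows that no single modeling technique is best in every case.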
dc.identifier.citation | Empirical Software Engineering, vol.4(4), pp.297-316 | |
dc.identifier.doi | 10.1023/A:1009849100780 | |
dc.identifier.uri | https://hdl.handle.net/10292/2202 | |
dc.publisher | Springer Netherlands | |
dc.relation.uri | http://dx.doi.org/10.1023/A:1009849100780 | |
dc.rights | An author may self-archive an author-created version of his/her article on his/her own website and/or in his/her institutional repository. He/she may also deposit this version on his/her funder's or funder's designated repository at the funder's request or as a result of a legal obligation, provided it is not made publicly available until 12 months after official publication. He/she may not use the publisher's PDF version, which is posted on www.springerlink.com, for the purpose of self-archiving or deposit. Furthermore, the author may only post his/her version provided acknowledgement is given to the original source of publication and a link is inserted to the published article on Springer's website. The link must be accompanied by the following text: "The final publication is available at www.springerlink.com". (Please also see Publisher's Version and Citation) | |
dc.rights.accessrights | OpenAccess | |
dc.subject | Software metrics | |
dc.subject | Analysis | |
dc.subject | Statistical methods | |
dc.subject | Connectionist methods | |
dc.title | Software metrics data analysis - exploring the relative performance of some commonly used modeling techniques | |
dc.type | Journal Article | |
pubs.organisational-data | /AUT | |
pubs.organisational-data | /AUT/Design & Creative Technologies | |
pubs.organisational-data | /AUT/PBRF Researchers | |
pubs.organisational-data | /AUT/PBRF Researchers/Design & Creative Technologies PBRF Researchers | |
pubs.organisational-data | /AUT/PBRF Researchers/Design & Creative Technologies PBRF Researchers/DCT C & M Computing |