dc.contributor.author | Exarhos, M | en
dc.contributor.author | Papaodysseus, C | en
dc.contributor.author | Fragoulis, D | en
dc.contributor.author | Panagopoulos, Th | en
dc.contributor.author | Alexiou, C | en
dc.contributor.author | Roussopoulos, G | en
dc.date.accessioned | 2014-03-01T01:51:52Z |
dc.date.available | 2014-03-01T01:51:52Z |
dc.date.issued | 2002 | en
dc.identifier.uri | https://dspace.lib.ntua.gr/xmlui/handle/123456789/26485 |
dc.relation.uri | http://www.scopus.com/inward/record.url?eid=2-s2.0-4944229172&partnerID=40&md5=efb9efde163473ccf52819ed96106456 | en
dc.subject | Automated timbre classification | en
dc.subject | Instrument identification | en
dc.subject | Timbre perception | en
dc.subject | Timbre recognition | en
dc.subject.other | Data acquisition | en
dc.subject.other | Frequency domain analysis | en
dc.subject.other | Neural networks | en
dc.subject.other | Pattern recognition | en
dc.subject.other | Statistical methods | en
dc.subject.other | Timber | en
dc.subject.other | Automated timbre classification | en
dc.subject.other | Instrument identification | en
dc.subject.other | Timbre perception | en
dc.subject.other | Timbre recognition | en
dc.subject.other | Musical instruments | en
dc.subject.other | Data Processing | en
dc.subject.other | Lumber | en
dc.subject.other | Neural Networks | en
dc.subject.other | Pattern Recognition | en
dc.subject.other | Statistical Methods | en
dc.title | Efficient automated piano - guitar timbre classification | en
heal.type | journalArticle | en
heal.publicationDate | 2002 | en
heal.abstract | This paper identifies a new, decisively important factor in both the perceptual and the automated instrument identification process. This factor is determined by the inharmonic spectral content of a note, while it is, in practice, essentially independent of the harmonic part of the note spectrum. The conclusion is based on extended acoustical experiments performed on 612 isolated guitar notes and 926 isolated piano notes, covering the full pitch range of each instrument. The notes were recorded from six different performers, each playing a different instrument. A set of powerful criteria for classifying a note as guitar or piano is then proposed; using these criteria, automated classification of 754 piano and guitar test notes has been achieved with a 100% success rate. | en
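The abstract's central technical claim is that the inharmonic part of a note's spectrum, rather than its harmonic part, carries the information that separates piano from guitar. The following is a minimal sketch of that general idea only, not the criteria proposed in the paper: the band tolerance, the decision threshold, and all function names are illustrative assumptions.

```python
# Hypothetical sketch of an inharmonicity feature for isolated piano/guitar notes.
import numpy as np

def inharmonic_energy_ratio(signal, sample_rate, f0, tolerance=0.03):
    """Fraction of spectral energy lying outside narrow bands centred on
    the ideal harmonics k*f0, i.e. the 'inharmonic' spectral content."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    harmonic_mask = np.zeros(spectrum.shape, dtype=bool)
    k = 1
    while k * f0 * (1 - tolerance) <= freqs[-1]:
        band = (freqs >= k * f0 * (1 - tolerance)) & (freqs <= k * f0 * (1 + tolerance))
        harmonic_mask |= band
        k += 1
    total = spectrum.sum()
    return float(spectrum[~harmonic_mask].sum() / total) if total > 0 else 0.0

def classify_note(signal, sample_rate, f0, threshold=0.05):
    """Toy decision rule: the threshold and its direction are invented for
    illustration and would have to be fitted on labelled notes."""
    ratio = inharmonic_energy_ratio(signal, sample_rate, f0)
    return "piano" if ratio > threshold else "guitar"
```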
heal.publisher | World Scientific and Engineering Academy and Society | en
heal.journalName | Recent Advances in Computers, Computing and Communications | en
dc.identifier.spage | 235 | en
dc.identifier.epage | 240 | en