dc.contributor.author |
Fragoulis, D |
en |
dc.contributor.author |
Papaodysseus, C |
en |
dc.contributor.author |
Exarhos, M |
en |
dc.contributor.author |
Roussopoulos, G |
en |
dc.contributor.author |
Panagopoulos, T |
en |
dc.contributor.author |
Kamarotos, D |
en |
dc.date.accessioned |
2014-03-01T01:23:39Z |
|
dc.date.available |
2014-03-01T01:23:39Z |
|
dc.date.issued |
2006 |
en |
dc.identifier.issn |
1558-7916 |
en |
dc.identifier.uri |
https://dspace.lib.ntua.gr/xmlui/handle/123456789/17069 |
|
dc.subject |
Musical instrument classification |
en |
dc.subject |
Nontonal spectrum |
en |
dc.subject |
Timbre identification |
en |
dc.subject |
Timbre recognition |
en |
dc.subject.classification |
Acoustics |
en |
dc.subject.classification |
Engineering, Electrical & Electronic |
en |
dc.subject.other |
Musical instrument classification |
en |
dc.subject.other |
Nontonal spectrum |
en |
dc.subject.other |
Timbre identification |
en |
dc.subject.other |
Timbre recognition |
en |
dc.subject.other |
Acoustics |
en |
dc.subject.other |
Classification (of information) |
en |
dc.subject.other |
Spectrum analysis |
en |
dc.subject.other |
Speech analysis |
en |
dc.subject.other |
Musical instruments |
en |
dc.title |
Automated classification of piano-guitar notes |
en |
heal.type |
journalArticle |
en |
heal.identifier.primary |
10.1109/TSA.2005.857571 |
en |
heal.identifier.secondary |
http://dx.doi.org/10.1109/TSA.2005.857571 |
en |
heal.language |
English |
en |
heal.publicationDate |
2006 |
en |
heal.abstract |
In this paper, a new, decisively important factor in both the perceptual and the automated piano-guitar identification process is introduced. This factor is determined by the nontonal spectral content of a note, while in practice it is totally independent of the tonal part of the note spectrum. This conclusion and all related results are based on a number of extended acoustical experiments performed over the full pitch range of each instrument. The notes were recorded from six different performers, each of whom played a different instrument. Next, a number of powerful criteria for the classification between guitar and piano are proposed. Using these criteria, automated classification of 754 piano and guitar test notes has been achieved with a 100% success rate. © 2006 IEEE. |
en |
heal.publisher |
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
en |
heal.journalName |
IEEE Transactions on Audio, Speech, and Language Processing |
en |
dc.identifier.doi |
10.1109/TSA.2005.857571 |
en |
dc.identifier.isi |
ISI:000237140500028 |
en |
dc.identifier.volume |
14 |
en |
dc.identifier.issue |
3 |
en |
dc.identifier.spage |
1040 |
en |
dc.identifier.epage |
1050 |
en |