Towards classifying human phonemes without encodings via spatiotemporal liquid state machines

Alex Frid, Hananel Hazan, Larry Manevitz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Classifying human production of phonemes without additional encoding is accomplished at a level of about 77% using a version of reservoir computing. So far this has been achieved with: (1) artificial data; (2) artificial noise (designed to mimic natural noise); (3) natural human data with artificial noise; and (4) natural human data with its natural noise and variance, albeit for certain phonemes. Unlike most other methods, this mechanism works without any encoding of the signal and without transforming time into space; instead it uses the Liquid State Machine paradigm, an abstraction of natural cortical arrangements. The data is entered as an analogue signal without any modification, which keeps the methodology close to natural biological mechanisms.
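The paper itself provides no code. As an illustration only, the following is a minimal rate-based echo-state sketch in Python with NumPy — a common simplification of the spiking liquid state machine described above, not the authors' implementation. It drives a fixed random recurrent reservoir with a raw 1-D signal (no encoding step) and trains only a linear readout to separate two synthetic signal classes; all sizes and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir (echo-state abstraction; the paper's actual
# model uses a spiking liquid). Sizes are illustrative.
N = 200
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(signal):
    """Drive the reservoir sample by sample; return the final state."""
    x = np.zeros(N)
    for u in signal:
        x = np.tanh(W @ x + W_in * u)
    return x

def make_sample(freq):
    """Synthetic 'phoneme' stand-in: a noisy sinusoid (not real speech)."""
    t = np.linspace(0.0, 1.0, 100)
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.normal(size=t.size)

# Build a small labelled dataset from two frequency classes.
X, y = [], []
for _ in range(40):
    X.append(run_reservoir(make_sample(3.0))); y.append(0)
    X.append(run_reservoir(make_sample(7.0))); y.append(1)
X, y = np.array(X), np.array(y)

# Only the linear readout is trained (least squares on +/-1 targets);
# the reservoir weights stay fixed, as in reservoir computing.
train, test = slice(0, 60), slice(60, 80)
w, *_ = np.linalg.lstsq(X[train], 2 * y[train] - 1, rcond=None)
pred = (X[test] @ w > 0).astype(int)
accuracy = (pred == y[test]).mean()
```

The design point this sketch shares with the paper's approach is that the input is fed in as a raw time-varying signal and only the readout is learned; everything else (spiking dynamics, real phoneme data, the reported 77% figure) is outside its scope.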

Original language: English
Title of host publication: Proceedings - 2014 IEEE International Conference on Software Science, Technology and Engineering, SWSTE 2014
Publisher: IEEE Computer Society
Pages: 63-64
Number of pages: 2
ISBN (Print): 9780769551883
DOIs
State: Published - 2014
Externally published: Yes
Event: 2014 IEEE International Conference on Software Science, Technology and Engineering, SWSTE 2014 - Ramat Gan, Israel
Duration: 11 Jun 2014 - 12 Jun 2014

Publication series

Name: Proceedings - 2014 IEEE International Conference on Software Science, Technology and Engineering, SWSTE 2014

Conference

Conference: 2014 IEEE International Conference on Software Science, Technology and Engineering, SWSTE 2014
Country/Territory: Israel
City: Ramat Gan
Period: 11/06/14 - 12/06/14

Keywords

  • Liquid State Machine
  • Machine Learning
  • classification
  • speech synthesis
