Forum Replies Created
sunnysuhappy (Participant)
Besides, my app is an audio playback and recording app, so it is reasonable for the session to stay active while the user is in my app.
sunnysuhappy (Participant)
Got it.
I have changed some code in the OpenEars project so that it stops releasing the audio session while my app is active. This resolves my problem for now, and I don't think it will cause me other problems. Thanks a lot.
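A change like the one described could be sketched as follows: configure the shared `AVAudioSession` once and deliberately never deactivate it when listening stops. This is a hypothetical illustration of the idea, not the actual OpenEars internals; the function name is invented.

```swift
import AVFoundation

// Hypothetical sketch: keep one play-and-record audio session alive for
// the whole app lifetime instead of tearing it down when listening stops.
func keepAudioSessionAlive() throws {
    let session = AVAudioSession.sharedInstance()
    // PlayAndRecord matches the category OpenEars sets in the logs below.
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)
    // Deliberately never call session.setActive(false) when listening
    // stops; that deactivation is what triggers extra route changes.
}
```

Whether skipping deactivation is safe depends on how the rest of the app shares the session with other audio code.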
sunnysuhappy (Participant)
Sorry, here is the right one: the log from app launch to a ‘Stop Listening’ tap.
2015-07-03 17:51:36.269 OpenEarsSampleApp[5205:1529263] Starting OpenEars logging for OpenEars version 2.04 on 64-bit device (or build): iPhone running iOS version: 8.100000
2015-07-03 17:51:36.270 OpenEarsSampleApp[5205:1529263] Creating shared instance of OEPocketsphinxController
2015-07-03 17:51:36.303 OpenEarsSampleApp[5205:1529263] Starting dynamic language model generation
2015-07-03 17:51:36.349 OpenEarsSampleApp[5205:1529263] Done creating language model with CMUCLMTK in 0.045764 seconds.
2015-07-03 17:51:36.383 OpenEarsSampleApp[5205:1529263] I’m done running performDictionaryLookup and it took 0.029058 seconds
2015-07-03 17:51:36.390 OpenEarsSampleApp[5205:1529263] I’m done running dynamic language model generation and it took 0.116840 seconds
2015-07-03 17:51:36.398 OpenEarsSampleApp[5205:1529263] Starting dynamic language model generation
2015-07-03 17:51:36.468 OpenEarsSampleApp[5205:1529263] Done creating language model with CMUCLMTK in 0.069776 seconds.
2015-07-03 17:51:36.503 OpenEarsSampleApp[5205:1529263] The word QUIDNUNC was not found in the dictionary /private/var/mobile/Containers/Bundle/Application/524DAD99-8925-46E0-A0E4-2D4EBAFD463A/OpenEarsSampleApp.app/AcousticModelEnglish.bundle/LanguageModelGeneratorLookupList.text/LanguageModelGeneratorLookupList.text.
2015-07-03 17:51:36.504 OpenEarsSampleApp[5205:1529263] Now using the fallback method to look up the word QUIDNUNC
2015-07-03 17:51:36.504 OpenEarsSampleApp[5205:1529263] If this is happening more frequently than you would expect, the most likely cause for it is since you are using the English phonetic lookup dictionary is that your words are not in English or aren’t dictionary words.
2015-07-03 17:51:36.504 OpenEarsSampleApp[5205:1529263] Using convertGraphemes for the word or phrase QUIDNUNC which doesn’t appear in the dictionary
2015-07-03 17:51:36.518 OpenEarsSampleApp[5205:1529263] I’m done running performDictionaryLookup and it took 0.044882 seconds
2015-07-03 17:51:36.525 OpenEarsSampleApp[5205:1529263] I’m done running dynamic language model generation and it took 0.134767 seconds
2015-07-03 17:51:36.526 OpenEarsSampleApp[5205:1529263] Welcome to the OpenEars sample project. This project understands the words:
BACKWARD,
CHANGE,
FORWARD,
GO,
LEFT,
MODEL,
RIGHT,
TURN,
and if you say “CHANGE MODEL” it will switch to its dynamically-generated model which understands the words:
CHANGE,
MODEL,
MONDAY,
TUESDAY,
WEDNESDAY,
THURSDAY,
FRIDAY,
SATURDAY,
SUNDAY,
QUIDNUNC
2015-07-03 17:51:36.526 OpenEarsSampleApp[5205:1529263] Attempting to start listening session from startListeningWithLanguageModelAtPath:
2015-07-03 17:51:36.531 OpenEarsSampleApp[5205:1529263] User gave mic permission for this app.
2015-07-03 17:51:36.532 OpenEarsSampleApp[5205:1529263] setSecondsOfSilence wasn’t set, using default of 0.700000.
2015-07-03 17:51:36.532 OpenEarsSampleApp[5205:1529263] Successfully started listening session from startListeningWithLanguageModelAtPath:
2015-07-03 17:51:36.532 OpenEarsSampleApp[5205:1529282] Starting listening.
2015-07-03 17:51:36.533 OpenEarsSampleApp[5205:1529282] about to set up audio session
2015-07-03 17:51:36.533 OpenEarsSampleApp[5205:1529282] Creating audio session with default settings.
2015-07-03 17:51:36.809 OpenEarsSampleApp[5205:1529287] Audio route has changed for the following reason:
2015-07-03 17:51:36.812 OpenEarsSampleApp[5205:1529287] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 17:51:36.815 OpenEarsSampleApp[5205:1529287] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —SpeakerMicrophoneBuiltIn—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x174010040,
inputs = (null);
outputs = (
“<AVAudioSessionPortDescription: 0x174010010, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 17:51:36.822 OpenEarsSampleApp[5205:1529282] done starting audio unit
2015-07-03 17:51:36.851 OpenEarsSampleApp[5205:1529282] Restoring SmartCMN value of 41.135986
2015-07-03 17:51:36.852 OpenEarsSampleApp[5205:1529282] Listening.
2015-07-03 17:51:36.852 OpenEarsSampleApp[5205:1529282] Project has these words or phrases in its dictionary:
BACKWARD
CHANGE
FORWARD
GO
LEFT
MODEL
RIGHT
TURN
2015-07-03 17:51:36.852 OpenEarsSampleApp[5205:1529282] Recognition loop has started
2015-07-03 17:51:36.866 OpenEarsSampleApp[5205:1529263] Local callback: Pocketsphinx is now listening.
2015-07-03 17:51:36.867 OpenEarsSampleApp[5205:1529263] Local callback: Pocketsphinx started.
2015-07-03 17:51:38.453 OpenEarsSampleApp[5205:1529263] Stopping listening.
2015-07-03 17:51:38.791 OpenEarsSampleApp[5205:1529263] Unable to stop listening because because an utterance is still in progress; trying again.
2015-07-03 17:51:38.829 OpenEarsSampleApp[5205:1529287] Audio route has changed for the following reason:
2015-07-03 17:51:38.833 OpenEarsSampleApp[5205:1529287] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 17:51:38.839 OpenEarsSampleApp[5205:1529287] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —Speaker—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x170205130,
inputs = (
“<AVAudioSessionPortDescription: 0x170204f50, type = MicrophoneBuiltIn; name = iPhone \U9ea6\U514b\U98ce; UID = Built-In Microphone; selectedDataSource = \U4e0b>”
);
outputs = (
“<AVAudioSessionPortDescription: 0x170012660, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 17:51:38.843 OpenEarsSampleApp[5205:1529263] Attempting to stop an unstopped utterance so listening can stop.
2015-07-03 17:51:38.844 OpenEarsSampleApp[5205:1529263] No longer listening.
2015-07-03 17:51:38.848 OpenEarsSampleApp[5205:1529263] Local callback: Pocketsphinx has stopped listening.
sunnysuhappy (Participant)
Sorry, I divided it into two sections; this is the whole thing. This is the OpenEarsSampleApp's log, from a ‘Start Listening’ tap to a ‘Stop Listening’ tap. All I added was turning on the logging switch.
2015-07-03 17:30:56.244 OpenEarsSampleApp[5196:1526295] Attempting to start listening session from startListeningWithLanguageModelAtPath:
2015-07-03 17:30:56.247 OpenEarsSampleApp[5196:1526295] User gave mic permission for this app.
2015-07-03 17:30:56.247 OpenEarsSampleApp[5196:1526295] setSecondsOfSilence wasn’t set, using default of 0.700000.
2015-07-03 17:30:56.248 OpenEarsSampleApp[5196:1526295] Successfully started listening session from startListeningWithLanguageModelAtPath:
2015-07-03 17:30:56.249 OpenEarsSampleApp[5196:1526309] Starting listening.
2015-07-03 17:30:56.249 OpenEarsSampleApp[5196:1526309] about to set up audio session
2015-07-03 17:30:56.251 OpenEarsSampleApp[5196:1526309] Creating audio session with default settings.
2015-07-03 17:30:56.292 OpenEarsSampleApp[5196:1526313] Audio route has changed for the following reason:
2015-07-03 17:30:56.294 OpenEarsSampleApp[5196:1526313] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 17:30:56.300 OpenEarsSampleApp[5196:1526313] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —SpeakerMicrophoneBuiltIn—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x17000c850,
inputs = (null);
outputs = (
“<AVAudioSessionPortDescription: 0x17000c750, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 17:30:56.523 OpenEarsSampleApp[5196:1526309] done starting audio unit
2015-07-03 17:30:56.558 OpenEarsSampleApp[5196:1526309] Restoring SmartCMN value of 41.135986
2015-07-03 17:30:56.558 OpenEarsSampleApp[5196:1526309] Listening.
2015-07-03 17:30:56.558 OpenEarsSampleApp[5196:1526309] Project has these words or phrases in its dictionary:
BACKWARD
CHANGE
FORWARD
GO
LEFT
MODEL
RIGHT
TURN
2015-07-03 17:30:56.559 OpenEarsSampleApp[5196:1526309] Recognition loop has started
2015-07-03 17:30:56.561 OpenEarsSampleApp[5196:1526295] Local callback: Pocketsphinx is now listening.
2015-07-03 17:30:56.562 OpenEarsSampleApp[5196:1526295] Local callback: Pocketsphinx started.
2015-07-03 17:30:57.475 OpenEarsSampleApp[5196:1526309] Speech detected…
2015-07-03 17:30:57.476 OpenEarsSampleApp[5196:1526295] Local callback: Pocketsphinx has detected speech.
2015-07-03 17:30:58.245 OpenEarsSampleApp[5196:1526295] Stopping listening.
2015-07-03 17:30:58.349 OpenEarsSampleApp[5196:1526295] Unable to stop listening because because an utterance is still in progress; trying again.
2015-07-03 17:30:58.368 OpenEarsSampleApp[5196:1526313] Audio route has changed for the following reason:
2015-07-03 17:30:58.372 OpenEarsSampleApp[5196:1526313] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 17:30:58.375 OpenEarsSampleApp[5196:1526313] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —Speaker—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x170201540,
inputs = (
“<AVAudioSessionPortDescription: 0x170201090, type = MicrophoneBuiltIn; name = iPhone \U9ea6\U514b\U98ce; UID = Built-In Microphone; selectedDataSource = \U4e0b>”
);
outputs = (
“<AVAudioSessionPortDescription: 0x17000c9b0, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 17:30:58.401 OpenEarsSampleApp[5196:1526295] Attempting to stop an unstopped utterance so listening can stop.
2015-07-03 17:30:58.418 OpenEarsSampleApp[5196:1526295] No longer listening.
2015-07-03 17:30:58.420 OpenEarsSampleApp[5196:1526295] Local callback: Pocketsphinx has stopped listening.
sunnysuhappy (Participant)
This is the start log:
2015-07-03 15:25:12.188 OpenEarsSampleApp[5125:1504786] Attempting to start listening session from startListeningWithLanguageModelAtPath:
2015-07-03 15:25:12.189 OpenEarsSampleApp[5125:1504786] User gave mic permission for this app.
2015-07-03 15:25:12.190 OpenEarsSampleApp[5125:1504786] setSecondsOfSilence wasn’t set, using default of 0.700000.
2015-07-03 15:25:12.191 OpenEarsSampleApp[5125:1504786] Successfully started listening session from startListeningWithLanguageModelAtPath:
2015-07-03 15:25:12.191 OpenEarsSampleApp[5125:1504806] Starting listening.
2015-07-03 15:25:12.192 OpenEarsSampleApp[5125:1504806] about to set up audio session
2015-07-03 15:25:12.194 OpenEarsSampleApp[5125:1504806] Creating audio session with default settings.
2015-07-03 15:25:12.238 OpenEarsSampleApp[5125:1504807] Audio route has changed for the following reason:
2015-07-03 15:25:12.242 OpenEarsSampleApp[5125:1504807] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 15:25:12.467 OpenEarsSampleApp[5125:1504806] done starting audio unit
2015-07-03 15:25:12.477 OpenEarsSampleApp[5125:1504807] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —SpeakerMicrophoneBuiltIn—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x170214830,
inputs = (null);
outputs = (
“<AVAudioSessionPortDescription: 0x170214920, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 15:25:12.496 OpenEarsSampleApp[5125:1504806] Restoring SmartCMN value of 45.180664
2015-07-03 15:25:12.496 OpenEarsSampleApp[5125:1504806] Listening.
2015-07-03 15:25:12.497 OpenEarsSampleApp[5125:1504806] Project has these words or phrases in its dictionary:
BACKWARD
CHANGE
FORWARD
GO
LEFT
MODEL
RIGHT
TURN
2015-07-03 15:25:12.497 OpenEarsSampleApp[5125:1504806] Recognition loop has started
2015-07-03 15:25:12.498 OpenEarsSampleApp[5125:1504786] Local callback: Pocketsphinx is now listening.
2015-07-03 15:25:12.499 OpenEarsSampleApp[5125:1504786] Local callback: Pocketsphinx started.
2015-07-03 15:25:13.671 OpenEarsSampleApp[5125:1504806] Speech detected…
2015-07-03 15:25:13.672 OpenEarsSampleApp[5125:1504786] Local callback: Pocketsphinx has detected speech.
This is the stop log:
2015-07-03 15:25:13.871 OpenEarsSampleApp[5125:1504786] Stopping listening.
2015-07-03 15:25:14.020 OpenEarsSampleApp[5125:1504786] Unable to stop listening because because an utterance is still in progress; trying again.
2015-07-03 15:25:14.041 OpenEarsSampleApp[5125:1504807] Audio route has changed for the following reason:
2015-07-03 15:25:14.044 OpenEarsSampleApp[5125:1504807] There was a category change. The new category is AVAudioSessionCategoryPlayAndRecord
2015-07-03 15:25:14.046 OpenEarsSampleApp[5125:1504807] This is not a case in which OpenEars notifies of a route change. At the close of this function, the new audio route is —Speaker—. The previous route before changing to this route was <AVAudioSessionRouteDescription: 0x174018040,
inputs = (
“<AVAudioSessionPortDescription: 0x1740165b0, type = MicrophoneBuiltIn; name = iPhone \U9ea6\U514b\U98ce; UID = Built-In Microphone; selectedDataSource = \U4e0b>”
);
outputs = (
“<AVAudioSessionPortDescription: 0x174018d20, type = Speaker; name = \U626c\U58f0\U5668; UID = Speaker; selectedDataSource = (null)>”
)>.
2015-07-03 15:25:14.071 OpenEarsSampleApp[5125:1504786] Attempting to stop an unstopped utterance so listening can stop.
2015-07-03 15:25:14.079 OpenEarsSampleApp[5125:1504786] No longer listening.
2015-07-03 15:25:14.081 OpenEarsSampleApp[5125:1504786] Local callback: Pocketsphinx has stopped listening.
sunnysuhappy (Participant)
Hi Halle,
I have tested your OpenEars sample app and found that it also causes an audioSessionRouteChange. Could there be a bug?
sunnysuhappy (Participant)
Is there some way to avoid the route change while stopping listening?
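The route change on stop comes from the audio session category being changed, so it probably can't be avoided entirely; what an app can do is ignore the uninteresting reasons. A small self-contained sketch of such a filter follows. The enum below is a simplified stand-in for `AVAudioSession.RouteChangeReason`, not the real type, so the example runs anywhere.

```swift
// Simplified stand-in for AVAudioSession.RouteChangeReason (assumption:
// only the cases relevant to this thread are modeled).
enum RouteChangeReason {
    case categoryChange        // e.g. PlayAndRecord being set up or torn down
    case oldDeviceUnavailable  // e.g. headphones unplugged
    case newDeviceAvailable    // e.g. headphones plugged in
    case unknown
}

// Decide whether a route change should be surfaced to the app. The logs
// in this thread show OpenEars itself ignoring plain category changes
// ("This is not a case in which OpenEars notifies of a route change").
func isUserVisibleRouteChange(_ reason: RouteChangeReason) -> Bool {
    switch reason {
    case .categoryChange, .unknown:
        return false
    case .oldDeviceUnavailable, .newDeviceAvailable:
        return true
    }
}

print(isUserVisibleRouteChange(.categoryChange))       // false
print(isUserVisibleRouteChange(.oldDeviceUnavailable)) // true
```

In a real app the same filtering would live inside an observer of `AVAudioSession.routeChangeNotification`, reading the reason from the notification's user info.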
sunnysuhappy (Participant)
Is there a way to control when OpenEars gives me the result? Right now I need to wait until it has judged the silence.
Sorry, I meant that I must wait several seconds for the result, because OpenEars needs a period of silence before returning results.
Is there a way to get the result more quickly with JSGF?
sunnysuhappy (Participant)
OK, thanks very much.
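One knob that affects how quickly a hypothesis is returned is the end-of-utterance silence window; the logs above show its 0.7 s default ("setSecondsOfSilence wasn't set, using default of 0.700000"). Assuming `OEPocketsphinxController`'s `secondsOfSilenceToDetect` property works as its name and that log line suggest, shortening the wait might look like this (a hedged sketch; check the OpenEars documentation for the exact property and its valid range):

```swift
import OpenEars  // assumption: module name as imported in a Swift project

// Hedged sketch: ask for a shorter end-of-utterance silence window than
// the 0.7 s default seen in the logs. Too small a value risks cutting
// an utterance off between words.
OEPocketsphinxController.sharedInstance().secondsOfSilenceToDetect = 0.3
```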
sunnysuhappy (Participant)
OK, thanks.
If I want to recognize lots of sentences, I think I should clean up the language model when recognizing new words, to save memory, but there seem to be no APIs for that.
sunnysuhappy (Participant)
This situation happens when I use ReturnNBest mode.
sunnysuhappy (Participant)
Sorry, I meant: is there a way I can feed a custom audio buffer into it, instead of using the startListening API?
Now I think I have found a way to achieve my goal: use RapidEars for real-time recognition. But I find that RapidEars crashes after running for several seconds, and it seems to have a memory leak; memory usage keeps growing, by 100 MB or more each time.
sunnysuhappy (Participant)
OK.
Thanks for your reply. I want to know whether I can input an audio buffer for recognition instead of a WAV path.
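For recognition of already-captured audio, OpenEars does offer file-based recognition via `runRecognitionOnWavFileAtPath:`. A hedged sketch of that alternative follows; all paths are placeholders, and the Swift-bridged method name may differ slightly from what is shown.

```swift
import OpenEars  // assumption: module name in a Swift project

// Hedged sketch: recognize a pre-recorded mono WAV file instead of live
// microphone input. Every path below is a placeholder.
let sphinx = OEPocketsphinxController.sharedInstance()
sphinx?.runRecognitionOnWavFile(
    atPath: "/path/to/recording.wav",
    usingLanguageModelAtPath: "/path/to/Model.languagemodel",
    dictionaryAtPath: "/path/to/Model.dic",
    acousticModelAtPath: OEAcousticModel.path(toModel: "AcousticModelEnglish"),
    languageModelIsJSGF: false)
```

Results arrive through the same `OEEventsObserver` delegate callbacks as live listening, so no extra result-handling code should be needed.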