Reply To: followed tutorial, not working for me


#1022323
JonLocate
Participant

Hey Halle, thanks for your replies. I've updated my code a little to match the tutorial more closely. Also, I'm not sure what you mean by ivars?

The delegate methods are still in my code, but I didn't post them just to keep the code shorter. I also turned on verbosePocketSphinx (I think), and the logging output is the same as when only using [OpenEarsLogging startOpenEarsLogging].
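To make sure I understand the lazy-instantiation pattern from the tutorial (this is just my reading of it, so correct me if I'm wrong): going through the property via self calls the getter, which creates the object on first use, while reading the bare ivar does not trigger it:

```objc
// My understanding of the tutorial's lazy accessor (may be wrong):
- (PocketsphinxController *)pocketSphinxController
{
    if (pocketSphinxController == nil)
    {
        pocketSphinxController = [[PocketsphinxController alloc] init];
    }
    return pocketSphinxController;
}

// self.pocketSphinxController  -> calls the getter above, creating the object on first access
// pocketSphinxController       -> reads the ivar directly, which can still be nil
```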

header file

#import <UIKit/UIKit.h>
#import <OpenEars/PocketsphinxController.h>
#import <OpenEars/AcousticModel.h>
#import <OpenEars/OpenEarsEventsObserver.h>

@interface SpeechViewController : UIViewController <OpenEarsEventsObserverDelegate>
{
PocketsphinxController *pocketSphinxController;
OpenEarsEventsObserver *openEarsEventObserver;
}

@property (strong, nonatomic) PocketsphinxController *pocketSphinxController;
@property (strong, nonatomic) OpenEarsEventsObserver *openEarsEventObserver;

@property (strong, nonatomic) IBOutlet UILabel *resultsLabel;

- (IBAction)talkButton:(id)sender;

@end

implementation

#import "SpeechViewController.h"
#import <OpenEars/LanguageModelGenerator.h>
#import <OpenEars/OpenEarsLogging.h>

@interface SpeechViewController () <OpenEarsEventsObserverDelegate>

@end

@implementation SpeechViewController

@synthesize pocketSphinxController;
@synthesize openEarsEventObserver;

- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.

[openEarsEventObserver setDelegate:self];

pocketSphinxController.verbosePocketSphinx = true;

[OpenEarsLogging startOpenEarsLogging];
}

- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}

- (PocketsphinxController *)pocketSphinxController
{
if (pocketSphinxController == nil)
{
pocketSphinxController = [[PocketsphinxController alloc] init];
}

return pocketSphinxController;
}

- (OpenEarsEventsObserver *)openEarsEventObserver
{
if (openEarsEventObserver == nil)
{
openEarsEventObserver = [[OpenEarsEventsObserver alloc] init];
}

return openEarsEventObserver;
}

- (IBAction)talkButton:(id)sender
{

LanguageModelGenerator *LangGen = [[LanguageModelGenerator alloc] init];

NSArray *words = [NSArray arrayWithObjects:@"HELLO WORLD", @"HELLO", @"WORLD", @"TEST", @"SPEECH", @"LOCATION", nil];
NSString *name = @"LangModelName";
NSError *err = [LangGen generateLanguageModelFromArray:words withFilesNamed:name forAcousticModelAtPath:[AcousticModel pathToModel:@"AcousticModelEnglish"]];

NSDictionary *languageGeneratorResults = nil;

NSString *lmPath = nil;
NSString *dicPath = nil;

if ([err code] == noErr)
{
languageGeneratorResults = [err userInfo];

lmPath = [languageGeneratorResults objectForKey:@"LMPath"];
dicPath = [languageGeneratorResults objectForKey:@"DictionaryPath"];
}
else
{
NSLog(@"Error: %@", [err localizedDescription]);
}

//[openEarsEventObserver setDelegate:self];

[pocketSphinxController startListeningWithLanguageModelAtPath:lmPath dictionaryAtPath:dicPath acousticModelAtPath:[AcousticModel pathToModel:@"AcousticModelEnglish"] languageModelIsJSGF:NO];

}

- (void)pocketsphinxDidReceiveHypothesis:(NSString *)hypothesis recognitionScore:(NSString *)recognitionScore utteranceID:(NSString *)utteranceID
{
NSLog(@"The received hypothesis is %@ with a score of %@ and an ID of %@", hypothesis, recognitionScore, utteranceID);

[self.resultsLabel setText:hypothesis];
}

logging

2014-08-21 10:03:46.900 OpenEarsTest[192:60b] Starting OpenEars logging for OpenEars version 1.7 on 32-bit device: iPad running iOS version: 7.100000
2014-08-21 10:03:49.715 OpenEarsTest[192:60b] acousticModelPath is /var/mobile/Applications/7E1E39CB-4194-4F73-B3FC-8997C8C161A0/OpenEarsTest.app/AcousticModelEnglish.bundle
2014-08-21 10:03:49.800 OpenEarsTest[192:60b] Starting dynamic language model generation
2014-08-21 10:03:49.806 OpenEarsTest[192:60b] Able to open /var/mobile/Applications/7E1E39CB-4194-4F73-B3FC-8997C8C161A0/Library/Caches/LangModelName.corpus for reading
2014-08-21 10:03:49.809 OpenEarsTest[192:60b] Able to open /var/mobile/Applications/7E1E39CB-4194-4F73-B3FC-8997C8C161A0/Library/Caches/LangModelName_pipe.txt for writing
2014-08-21 10:03:49.811 OpenEarsTest[192:60b] Starting text2wfreq_impl
2014-08-21 10:03:49.834 OpenEarsTest[192:60b] Done with text2wfreq_impl
2014-08-21 10:03:49.836 OpenEarsTest[192:60b] Able to open /var/mobile/Applications/7E1E39CB-4194-4F73-B3FC-8997C8C161A0/Library/Caches/LangModelName_pipe.txt for reading.
2014-08-21 10:03:49.838 OpenEarsTest[192:60b] Able to open /var/mobile/Applications/7E1E39CB-4194-4F73-B3FC-8997C8C161A0/Library/Caches/LangModelName.vocab for reading.
2014-08-21 10:03:49.840 OpenEarsTest[192:60b] Starting wfreq2vocab
2014-08-21 10:03:49.842 OpenEarsTest[192:60b] Done with wfreq2vocab
2014-08-21 10:03:49.844 OpenEarsTest[192:60b] Starting text2idngram
2014-08-21 10:03:49.867 OpenEarsTest[192:60b] Done with text2idngram
2014-08-21 10:03:49.878 OpenEarsTest[192:60b] Starting idngram2lm
2014-08-21 10:03:49.892 OpenEarsTest[192:60b] Done with idngram2lm
2014-08-21 10:03:49.894 OpenEarsTest[192:60b] Starting sphinx_lm_convert
2014-08-21 10:03:49.908 OpenEarsTest[192:60b] Finishing sphinx_lm_convert
2014-08-21 10:03:49.915 OpenEarsTest[192:60b] Done creating language model with CMUCLMTK in 0.114059 seconds.
2014-08-21 10:03:50.093 OpenEarsTest[192:60b] I'm done running performDictionaryLookup and it took 0.147953 seconds
2014-08-21 10:03:50.101 OpenEarsTest[192:60b] I'm done running dynamic language model generation and it took 0.384307 seconds

I doubt it has anything to do with how I imported OpenEars into my project, since all the classes and methods are fully visible and show no errors, but I could be wrong; maybe there is a missing reference to something?

Thanks very much!