Problem when using NSDictionary



  • #1024877
    Wiraju
    Participant

    First I should let you know that I’m a novice programmer.

    I have created a very simple App for testing and learning.

    When I run the App on my iPad, I get the following message.

    2015-02-15 12:34:21.421 OE[10405:4642110] Error: you have invoked the method:

    startListeningWithLanguageModelAtPath:(NSString *)languageModelPath dictionaryAtPath:(NSString *)dictionaryPath acousticModelAtPath:(NSString *)acousticModelPath languageModelIsJSGF:(BOOL)languageModelIsJSGF

    with a languageModelPath which is nil. If your call to OELanguageModelGenerator did not return an error when you generated this grammar, that means the correct path to your grammar that you should pass to this method’s languageModelPath argument is as follows:

    NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"gram"];

    Feel free to copy and paste this code for your path to your grammar, but remember to replace the part that says “TheNameIChoseForMyLanguageModelAndDictionaryFile” with the name you actually chose for your grammar and dictionary file or you will get this error again.

    With my limited knowledge, I’m having trouble figuring out where to add the NSString as suggested in the message.

    Here are my .h and .m files.

    .h file:

    #import <UIKit/UIKit.h>

    #import <OpenEars/OEEventsObserver.h>
    #import <Slt/Slt.h>
    #import <OpenEars/OEFliteController.h>

    @interface ViewController : UIViewController <OEEventsObserverDelegate>

    @property (weak, nonatomic) IBOutlet UILabel *wordSpoken;
    @property (strong, nonatomic) OEEventsObserver *openEarsEventsObserver;

    @property (strong, nonatomic) OEFliteController *fliteController;
    @property (strong, nonatomic) Slt *slt;

    @end

    .m file:

    #import <UIKit/UIKit.h>

    #import <OpenEars/OEEventsObserver.h>
    #import <Slt/Slt.h>
    #import <OpenEars/OEFliteController.h>

    @interface ViewController : UIViewController <OEEventsObserverDelegate>

    @property (weak, nonatomic) IBOutlet UILabel *wordSpoken;
    @property (strong, nonatomic) OEEventsObserver *openEarsEventsObserver;

    @property (strong, nonatomic) OEFliteController *fliteController;
    @property (strong, nonatomic) Slt *slt;

    @end

    Thanks in advance to anyone that can help me with this.

    #1024878
    Wiraju
    Participant

    Oops!

    Here’s my .m file again:

    #import "ViewController.h"

    #import <OpenEars/OELanguageModelGenerator.h>

    #import <OpenEars/OEPocketsphinxController.h>
    #import <OpenEars/OEAcousticModel.h>

    @interface ViewController ()

    @end

    @implementation ViewController

    - (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    OELanguageModelGenerator *lmGenerator = [[OELanguageModelGenerator alloc] init];

    //NSArray *words = [NSArray arrayWithObjects: @"1", @"2", @"3", @"4", @"5", @"6", nil];

    NSDictionary *grammar = @{
        ThisWillBeSaidOnce : @[
            @{OneOfTheseCanBeSaidOnce : @[@"HELLO COMPUTER", @"GREETINGS ROBOT"]},
            @{OneOfTheseWillBeSaidOnce : @[@"DO THE FOLLOWING", @"INSTRUCTION"]},
            @{OneOfTheseWillBeSaidOnce : @[@"GO", @"MOVE"]},
            @{ThisWillBeSaidOnce : @[
                @{OneOfTheseWillBeSaidOnce : @[@"10", @"20", @"30"]},
                @{OneOfTheseWillBeSaidOnce : @[@"LEFT", @"RIGHT", @"FORWARD"]}
            ]},
            @{ThisCanBeSaidOnce : @[@"THANK YOU"]}
        ]
    };

    NSString *name = @"MyLanguageModelFiles";

    NSError *err = [lmGenerator generateGrammarFromDictionary:grammar withFilesNamed:name forAcousticModelAtPath:[OEAcousticModel pathToModel:@"AcousticModelEnglish"]];

    NSString *lmPath = nil;
    NSString *dicPath = nil;

    if(err == nil) {

    lmPath = [lmGenerator pathToSuccessfullyGeneratedLanguageModelWithRequestedName:@"MyLanguageModelFiles"];
    dicPath = [lmGenerator pathToSuccessfullyGeneratedDictionaryWithRequestedName:@"MyLanguageModelFiles"];

    } else {
    NSLog(@"Error: %@",[err localizedDescription]);
    }

    [[OEPocketsphinxController sharedInstance] setActive:TRUE error:nil];

    [[OEPocketsphinxController sharedInstance] startListeningWithLanguageModelAtPath:lmPath dictionaryAtPath:dicPath acousticModelAtPath:[OEAcousticModel pathToModel:@"AcousticModelEnglish"] languageModelIsJSGF:NO];
    self.openEarsEventsObserver = [[OEEventsObserver alloc] init];
    [self.openEarsEventsObserver setDelegate:self];

    self.fliteController = [[OEFliteController alloc] init];
    self.slt = [[Slt alloc] init];

    self.openEarsEventsObserver = [[OEEventsObserver alloc] init];
    [self.openEarsEventsObserver setDelegate:self];

    }

    - (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
    }

    - (void) pocketsphinxDidReceiveHypothesis:(NSString *)hypothesis recognitionScore:(NSString *)recognitionScore utteranceID:(NSString *)utteranceID {
    NSLog(@"The received hypothesis is %@ with a score of %@ and an ID of %@", hypothesis, recognitionScore, utteranceID);
    self.wordSpoken.text = hypothesis;
    [self.fliteController say:hypothesis withVoice:self.slt];
    }

    - (void) pocketsphinxDidStartListening {
    NSLog(@"Pocketsphinx is now listening.");
    }

    - (void) pocketsphinxDidDetectSpeech {
    NSLog(@"Pocketsphinx has detected speech.");
    }

    - (void) pocketsphinxDidDetectFinishedSpeech {
    NSLog(@"Pocketsphinx has detected a period of silence, concluding an utterance.");
    }

    - (void) pocketsphinxDidStopListening {
    NSLog(@"Pocketsphinx has stopped listening.");
    }

    - (void) pocketsphinxDidSuspendRecognition {
    NSLog(@"Pocketsphinx has suspended recognition.");
    }

    - (void) pocketsphinxDidResumeRecognition {
    NSLog(@"Pocketsphinx has resumed recognition.");
    }

    - (void) pocketsphinxDidChangeLanguageModelToFile:(NSString *)newLanguageModelPathAsString andDictionary:(NSString *)newDictionaryPathAsString {
    NSLog(@"Pocketsphinx is now using the following language model: \n%@ and the following dictionary: %@",newLanguageModelPathAsString,newDictionaryPathAsString);
    }

    - (void) pocketSphinxContinuousSetupDidFailWithReason:(NSString *)reasonForFailure {
    NSLog(@"Listening setup wasn't successful and returned the failure reason: %@", reasonForFailure);
    }

    - (void) pocketSphinxContinuousTeardownDidFailWithReason:(NSString *)reasonForFailure {
    NSLog(@"Listening teardown wasn't successful and returned the failure reason: %@", reasonForFailure);
    }

    - (void) testRecognitionCompleted {
    NSLog(@"A test file that was submitted for recognition is now complete.");
    }

    #1024880
    Halle Winkler
    Politepix

    Welcome,

    You’re using pathToSuccessfullyGeneratedLanguageModelWithRequestedName, but since you generated a grammar you need to use pathToSuccessfullyGeneratedGrammarWithRequestedName instead.
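
    Applied to your viewDidLoad, the path lookup block would then read something like this (a sketch only, reusing the "MyLanguageModelFiles" name from your post):

    NSString *lmPath = nil;
    NSString *dicPath = nil;

    if(err == nil) {
        // A grammar was generated rather than a language model, so ask the
        // generator for the grammar path; the dictionary lookup stays the same.
        lmPath = [lmGenerator pathToSuccessfullyGeneratedGrammarWithRequestedName:@"MyLanguageModelFiles"];
        dicPath = [lmGenerator pathToSuccessfullyGeneratedDictionaryWithRequestedName:@"MyLanguageModelFiles"];
    } else {
        NSLog(@"Error: %@", [err localizedDescription]);
    }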

    #1024881
    Wiraju
    Participant

    Halle,

    Thanks very much.
