October 2, 2012 at 7:09 am #11453
First I want to say that I am thrilled by this project. It opens up completely new possibilities for my app (which is not yet released). I only use the TTS functionality, and I am on iOS 6 with an iPhone 5.
My question: can I keep FliteController from stopping the “normal” music playing on my iPhone? Maybe by mixing it in like an overlay?
Thanks for any advice.
October 2, 2012 at 8:39 am #11455
This is currently undocumented and in normal use you should never need to interact with the AudioSessionManager directly, but see if it helps you to send this message to the audio session singleton from your app before using FliteController:
[AudioSessionManager sharedAudioSessionManager].soundMixing = TRUE;
October 2, 2012 at 10:13 am #11458
Thanks for your fast feedback!
Which Framework(s) do I need to use the AudioSessionManager? AVFoundation does not seem to be enough.
I am also curious how this would raise the risk of a rejection from Apple. Do you have experience with Apps on the App Store using the AudioSessionManager?
Uli
October 2, 2012 at 10:21 am #11459
Sorry, I wasn’t clear enough: AudioSessionManager is just an OpenEars class. It is usually only interacted with internally by other OpenEars classes, so I don’t like to encourage developers to send messages to it directly; under normal circumstances it should “just work” without any attention from the developer (which also means that interacting with it a lot is more likely to cause issues than to fix them). However, in this case there is a part of the OpenEars API that you can use to tell the AudioSessionManager directly to mix other system sounds or music into the OpenEars audio session, so I’m suggesting that you use it and see if it gives you the functionality you want. Since it’s part of OpenEars, you don’t have to worry that it accesses any private frameworks or anything like that; it’s completely OK to use.
If it isn’t enough to just add this line:
[AudioSessionManager sharedAudioSessionManager].soundMixing = TRUE;
You may also have to import the class into your implementation like so:
#import <OpenEars/AudioSessionManager.h>
October 2, 2012 at 10:43 am #11460
#import was the missing link.
soundMixing is read-only via dot syntax, so [[AudioSessionManager sharedAudioSessionManager] setSoundMixing:YES]; did the trick.
I can just say… wow. This place is fantastic.
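Putting the pieces above together, the working sequence looks roughly like this (a sketch; the say:withVoice: call and the voice argument are assumptions based on typical OpenEars examples rather than anything shown in this thread, and the exact FliteController signature varies between OpenEars versions):

```objc
#import <OpenEars/AudioSessionManager.h>
#import <OpenEars/FliteController.h>

// Enable mixing with other audio before any OpenEars speech output.
// The soundMixing property is read-only via dot syntax, so use the setter.
[[AudioSessionManager sharedAudioSessionManager] setSoundMixing:YES];

// Then use FliteController as usual; background music keeps playing.
// (Hypothetical property names -- adapt to your own view controller.)
[self.fliteController say:@"Hello world" withVoice:self.firstVoiceToUse];
```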
I have another question, about the system mute switch when using FliteController: FliteController ignores the mute switch and plays audio even when the switch is turned on. I understand that this is governed by the audio session category. Can I influence this when using FliteController? The sound is only peripheral and should be muted if the user wants it that way.
Should I open another thread for this?
Uli
October 2, 2012 at 11:19 am #11461
This is probably something where you’d need to start making your own alterations to the framework to get the exact audio session behavior you want, which shouldn’t be too difficult. You can check out what the audio session settings you need for that behavior are and then make your changes to the implementation of AudioSessionManager, and then re-compile the framework project to create a new framework for your app.
I think you are probably looking for a change of audio session category from the current PlayAndRecord setting to the Ambient setting; Google or Stack Overflow should give more detail on this.
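To make that concrete, the edit involved would be switching the category constant inside AudioSessionManager.m. The variable name and exact location here are hypothetical, since the implementation isn't shown in this thread, but with the C Audio Session Services API of that era the change would look something like:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Hypothetical sketch of the change inside AudioSessionManager.m:
// switch the category from PlayAndRecord to AmbientSound, then recompile.
UInt32 sessionCategory = kAudioSessionCategory_AmbientSound; // was kAudioSessionCategory_PlayAndRecord
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                        sizeof(sessionCategory),
                        &sessionCategory);
```

The Ambient category respects the ring/silent switch, which is exactly the mute behaviour being asked about.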
Please be aware that changing the audio session directly will usually prevent speech recognition from working (this won’t be an issue for you if you are just using FliteController, but if at some future point you have unexpected results, keep in mind that you’ve altered the settings in AudioSessionManager).
October 2, 2012 at 11:22 am #11462
This is exactly what I had in mind: use the Ambient category. I will try it out and let you know the result.
Uli
October 2, 2012 at 11:23 am #11463
You also might be able to get away with making a call to the audio session setting right before the use of FliteController like so:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryAmbient error: nil];
without having to make any changes to the OpenEars framework or recompile it. Just keep in mind that it is then your responsibility to keep track of which calls you are making to the audio session versus which ones OpenEars is making, because having two possible sources of audio session settings adds complexity to your code. I receive a lot of support requests about functionality that isn’t working because a second point of interaction with the audio session is being overlooked.
October 2, 2012 at 11:33 am #11464
Unfortunately, [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil]; did not change the behaviour. Thanks anyhow; it would have been nice not to have to change your code in the future.
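One thing worth checking when a category change appears to do nothing is the error out-parameter, which the one-liner above discards with nil; a sketch with basic error reporting:

```objc
#import <AVFoundation/AVFoundation.h>

NSError *audioSessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient
                                       error:&audioSessionError];
if (audioSessionError != nil) {
    // A non-nil error explains why the category was not applied.
    NSLog(@"Could not set audio session category: %@", audioSessionError);
}
```

A silent failure here would also be consistent with OpenEars re-applying its own session settings afterwards, which is the ordering issue warned about above.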
Good news on the other front.
Meanwhile I compiled the framework with the ambient category and it works like a charm.
Very happy! Thx a ton.
October 2, 2012 at 11:35 am #11465
Super, very glad it’s working for you. It’s probably best in the long run to make the change to the framework anyway in order to observe DRY.