What I meant was: when the hypothesis is calculated, will it allow given phrases to be recombined? I figured out that the answer is "only if I want them to be."
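For context, whether phrases can be recombined in a JSGF grammar is controlled entirely by how the rules are written. A minimal hypothetical sketch (the rule names are my own illustration, not from your actual grammar):

```jsgf
#JSGF V1.0;
grammar commands;

// <action> and <object> recombine freely, so "open door",
// "close window", etc. are all valid hypotheses.
// The fixed alternative "call home" cannot recombine with anything.
public <command> = <action> <object> | call home;
<action> = open | close;
<object> = door | window;
```

So if you want certain phrases to stay fixed, keep them as single alternatives; if you want recombination, split them into separate rules.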
It still isn’t clear to me what your goal is here.
Out of curiosity, how large is your language model/grammar? 8-10x processing sounds really unusual to me, as does reduced accuracy for JSGF. A memory warning at calibration is also very unusual in my experience, especially on a device that recent.
Have you changed anything in your setup so that it isn't stock? For instance, are you using PocketSphinx 0.7, or have you made changes to the library? Second question: do you get the same results when using your grammar with the sample app, without making any other changes?