The primary issue here is that the YSA does not use true AIs; it uses exceptionally smart computers. Computers in the SARP, for the vast bulk of things, exist solely as programs running on a mostly unchanging processor base. While those programs may be very advanced and complicated, and those processors may be capable of spectacular feats of raw serial processing, the systems are at their core little different from what we have now. The 'AI's are not truly capable of learning; they only gather information. While they can use that information to alter their programming to a degree, how they actually process it does not change (since the processor net is still the same). This kind of processing imposes a hard limit on what the construct can do.
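To make the distinction concrete, here is a toy sketch (my own illustration, not any actual SARP system) of such a "smart computer": it can gather information and tune its parameters, but the decision procedure itself is fixed at build time and never changes.

```python
class SmartComputer:
    """Toy 'smart computer': it gathers information and tunes
    parameters, but its processing rule is fixed at build time."""

    def __init__(self):
        self.threshold = 0.5  # a tunable parameter ("gathered information")

    def observe(self, outcomes):
        # It can adjust its parameters from accumulated data...
        self.threshold = sum(outcomes) / len(outcomes)

    def decide(self, signal):
        # ...but the decision procedure itself never changes:
        # it will always be a simple threshold comparison.
        return signal > self.threshold
```

No matter how much data `observe` ingests, the unit can only ever compare a signal against a threshold; it cannot invent a new way of deciding.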
A true AI (whether artificial or organic), on the other hand, utilizes associative processing: its processor core consists of many distinct nodes, each interconnecting with the others, the whole structure being termed a neural net. As the construct experiences things it learns from them, and the new information causes some connections to be moved around, old ones to be destroyed, and new ones to be created. This more distributed processing method allows for the incredible degree of parallel processing displayed in organic lifeforms. Since "gut instinct" is merely the sum of a creature's experience being applied to analyze a situation, a true AI would possess it.
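The rewiring described above can be sketched as a toy associative net (again my own illustration, under the simplifying assumption that "experience" means two concepts co-occurring): connections are created and strengthened by experience, and pruned away when unused.

```python
class AssociativeNet:
    """Toy associative processor: a set of nodes whose connections
    are created, strengthened, and pruned as experiences arrive."""

    def __init__(self, nodes):
        self.nodes = set(nodes)
        self.links = {}  # (a, b) -> connection strength

    def experience(self, a, b):
        # Concepts experienced together form or strengthen a connection...
        key = tuple(sorted((a, b)))
        self.links[key] = self.links.get(key, 0.0) + 1.0

    def decay(self, amount=0.5):
        # ...while unused connections weaken and are eventually destroyed.
        for key in list(self.links):
            self.links[key] -= amount
            if self.links[key] <= 0:
                del self.links[key]

    def associates(self, node):
        # "Gut instinct": whatever the accumulated wiring ties to a node.
        return {b if a == node else a
                for (a, b) in self.links
                if node in (a, b)}
```

Note that the net's "knowledge" lives entirely in which links exist and how strong they are, which is also why it cannot be safely rolled back: deleting an arbitrary link changes behavior in ways no outside observer can predict.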
One of the downsides to such a construct, however, is that it must learn. Its programming is only a component of its being; without learning it is nothing more than (and potentially much less than) a simple serial computer. This would make mass production difficult and was likely a serious mark against their production by the SA.
Secondly, as it learned different things (as it would be bound to do, since it would be operating in different areas from its brethren) it would develop uniqueness. This gaining of uniqueness, and thus of a personality, was almost assuredly the death knell for the mass production of such true AIs. It means you cannot be certain how the unit will react in a given situation (as you can be with a conventional computer), and that is most definitely not a desirable trait in what is destined to be military hardware. Further, a true AI cannot be "reset" to a previous 'version' the way simpler computers can. Once it begins learning and making connections within its neural net, those connections cannot be safely undone, simply because one does not know what each given connection "means". A given arrangement in one unit may produce completely different thought processes than it does in another. The only state to which it can be returned is its initial one, before it learned anything (and even this may cause damage to the net). This, obviously, means that everything it has experienced is lost, essentially losing the advantage of the unit (as it must now relearn everything and gain experience just as it did before).
The last problem I foresee people having with true AIs is that of prejudice (as has been said before). Humans (and presumably their progeny, the Nekos and Yamatians) instinctively fear change and the unknown. These traits have been shown throughout our history. Unfortunately, a true AI is a confluence point of change and uncertainty. The development of true AIs will lead to radical changes in society and in how people define themselves (as we will see here on Earth in the coming decades). Fears will be born and will grow, however unfounded, into prejudice against those machines that are truly sentient. The fact that they are essentially the same as their human creators (assuming they are taught similarly) will either be overlooked or will add to the fear of and prejudice against them. An AI would no more kill an innocent or plot the overthrow of its civilization than a normal person would. An AI could be trusted to make decisions just as good as (and probably better than, since its subjective time is so much faster) those of an organic commander, but this too will not really be considered. After all, who wants to take absolute orders from some inanimate box stuffed into a closet on the ship?
The last, last issue, which is really the primary one, is that people want to play the role of pilots. As long as this is so, there will always be reasons why smart computers (or possibly AIs) have not taken over positions for which they are far better suited than an organic person.