OK, let's review:
Devices run:
Fridge - 5.5 Ah/day
iPad - 1.8 Ah/day
Fan - 0.2 Ah/day
Lights - 0.2 Ah/day (I don't recall seeing how many hours the lights are on for in post #175 above, so this number assumes just one hour)
Total = 7.7 Ah/day.
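If you want to sanity-check that daily total yourself, here's a quick sketch; the device names and numbers are just the figures quoted above, and the lights entry carries my one-hour assumption:

```python
# Daily loads as quoted above, in amp-hours per day at 12V nominal.
loads_ah_per_day = {
    "fridge": 5.5,
    "ipad": 1.8,
    "fan": 0.2,
    "lights": 0.2,  # assumes one hour/day; post #175 didn't say
}

total = sum(loads_ah_per_day.values())
print(f"Total daily load: {total:.1f} Ah")  # 7.7 Ah
```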
You have a 150W panel, and when the sun is highest in the sky you get about 5A.
The battery bank is a single 80 Ah AGM that has been load tested and found good.
You are using a MorningStar SunSaver 10L solar controller that goes into float mode at 13.6 volts.
And you are monitoring all this with a Trimetric TM2030, which shows that the charging voltage has never exceeded 13.8 volts.
Did I get all that?
Ok, so then I have some more questions and some possible insights:
1) Where is this 150W panel you have? On the roof? Are there any shadows on it? (A fan/vent cover popped open and casting a shadow on even just one corner of a single cell can kill your output.)
2) Is it portable, and/or are the wires leading to it very small gauge? The reason I ask is:
3) At 80% nominal efficiency, you are getting about 120W out of that panel. Your batteries need to see about 0.5V higher than they are currently resting at in order to charge. So if they are reading 12.9 at sunset, the solar controller needs to give them 12.9 + 0.5 = 13.4V. You are seeing 13.8. That sounds OK, but..
4) After a few days, the charge seems to stabilize at 85% or so. At that state of charge (you said 12.5 volts resting), the charger only needs to deliver 13V, and it seems able to do that and hold the battery at 85%. This leads me to suspect a large amount of voltage drop between the panel and your battery, so that the battery is not really seeing 13.8.
5) AND, the difference in current at these two voltages is minimal - 120W / 13.4V = 8.96A and 120W / 13V = 9.23A; however, you are only seeing 5A. That's not right - see the shadow question above (and the quick math sketched below).
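To make points 3-5 concrete, here's a rough sketch of the numbers. Keep in mind the 0.5V charging margin and the 80% panel derating are my rules of thumb, not measured values for your setup:

```python
# Rough numbers behind points 3-5. The 0.5 V charging margin and the
# 80% panel derating are rules of thumb, not measurements.
panel_watts = 150
derated_watts = panel_watts * 0.80          # ~120 W actually available

def required_charge_voltage(resting_v, margin=0.5):
    """Battery must see roughly this much above resting to accept charge."""
    return resting_v + margin

v_full = required_charge_voltage(12.9)      # 13.4 V (battery near full)
v_85 = required_charge_voltage(12.5)        # 13.0 V (battery around 85%)

# Expected current at each voltage if the full ~120 W reached the battery:
for v in (v_full, v_85):
    print(f"{v:.1f} V -> {derated_watts / v:.2f} A expected")
# 13.4 V -> 8.96 A expected
# 13.0 V -> 9.23 A expected
# Seeing only ~5 A at solar noon points to shading or wiring losses,
# not to the charge math.
```

The takeaway is that the expected current barely changes between those two voltages, so the missing ~4A has to be lost before the battery, which is why I keep coming back to shadows and wire gauge.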