Sorry for the delay in responding; I have been traveling this past week. I just took a look, and it appears I am having the same problem. I noticed that other folks on the LA forums are saying that no new notes have been listed recently. That would explain PLS reaching its 300-attempt limit without detecting a new note list: PLS has a hard-coded minimum threshold of 15 new notes, which it uses to decide that a new batch has been listed. In other words, PLS continually polls LC until the note count increases by 15 or more over the current count, which indicates that the new batch is available. This threshold exists because LC continually adds and removes notes, and PLS needs some way to know when new notes have been added (timing can't be relied on, and there is no field in their API that indicates when new notes have been listed). Therefore, an increase of 15 or more notes indicates a new list and PLS starts. Since no notes, or only a few, have been added, the new loan list never gets detected.
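To make the failure mode concrete, here is a minimal sketch of that polling logic. This is not PLS's actual code; the `get_note_count` helper, the poll interval, and the return convention are all assumptions, only the 15-note threshold and 300-attempt limit come from the description above.

```python
import time

NEW_NOTE_THRESHOLD = 15   # hard-coded minimum increase that signals a new batch
MAX_ATTEMPTS = 300        # give up after this many polls
POLL_INTERVAL_SEC = 1.0   # assumed delay between polls

def wait_for_new_batch(get_note_count, sleep=time.sleep):
    """Poll until the note count rises by the threshold, or attempts run out."""
    baseline = get_note_count()           # current note count before the batch
    for _ in range(MAX_ATTEMPTS):
        current = get_note_count()
        if current - baseline >= NEW_NOTE_THRESHOLD:
            return True                   # new batch detected, PLS can start
        sleep(POLL_INTERVAL_SEC)
    return False                          # few/no notes added: list never detected
```

When LC lists only a handful of notes, the `current - baseline` delta never reaches 15, so the loop exhausts all 300 attempts and returns without ever starting.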
As for the invalid API response, PLS performs three verifications on the payload of the API response:
* Length of JSON string must be greater than 0 (loan data must be present)
* JSON string must contain a known field. PLS looks for "pubRec" in the header (arbitrarily chosen) to confirm that the payload contains loan data (full verification of all fields would take too long)
* Conversion from JSON to table records must complete successfully
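The three checks above can be sketched roughly as follows. This is a hypothetical reconstruction, not PLS's source: the function name, the error messages, and the record layout are assumptions; only the empty-payload check, the "pubRec" spot-check, and the JSON-conversion step come from the list above.

```python
import json

def validate_payload(raw: str):
    """Run the three payload verifications; return (records, error)."""
    # 1. Length of the JSON string must be greater than 0.
    if len(raw) == 0:
        return None, "empty payload"
    # 2. Spot-check one known field ("pubRec") instead of validating every field.
    if "pubRec" not in raw:
        return None, "missing expected field 'pubRec'"
    # 3. Conversion from JSON to records must complete successfully.
    try:
        records = json.loads(raw)
    except ValueError as exc:   # json.JSONDecodeError subclasses ValueError
        return None, f"conversion failed: {exc}"
    return records, None
```

The second check is a deliberate shortcut: scanning the raw string for one field name is cheap, while schema-validating every field of every loan record on each poll would not be.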
In the past, I have seen the payload be empty or missing. Those errors were likely due to some technical issue on LC's side, since no code has changed in PLS. I don't see the invalid API error in the most recent run of PLS; rather, the issue is that no new note list is being detected.
I am also curious why other 3rd party software sites have not reported this API issue (on the forum). Has anyone reading this used a 3rd party site and seen this behavior? Do they log/post the API responses to their users, or hide technical issues? I would assume they are experiencing the same problem, since we all connect via the same API.
I will keep an eye on this issue tomorrow and post any new findings.