Problem: The customer runs a synchronizer (a DOS script) every night to update sources from the mainframe. They deal with 30k program sources plus 47k copybook sources, spread among 20 PDS directories, and receive on average 20 new modules per day (with a peak of 100). After the initial load, the daily run takes 40 minutes for the first 4-5 days; after that, the time consumed by synchronization starts to increase. This is not the first time the customer has hit this problem: when the mainframe task is purged, synchronization again completes within a reasonable elapsed time, but after a few days the problem recurs.

Resolution: We examined the MFA logs; they showed nothing except evidence of heavy traffic. We then checked the memory allocated, the CPU usage, and whatever else was running on the machine at the same time and consuming resources, but everything looked normal. We then discovered that the .dat file (mfmon.dat) passed to the mfmonmx process was being replaced by a fresh copy every day. Syncmon must first analyze the PDS directories to determine which members need to be synchronized, based on the .dat file it is passed. With a fresh file, every run must "hash" everything from scratch, and this is what degrades performance. Retaining the old .dat file and reusing it on subsequent runs keeps the elapsed time of the synchronization constant.
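The fix above can be sketched in the nightly script: seed the state file once, then reuse it on every later run instead of copying in a fresh one. This is a minimal, hypothetical sketch (the paths, file names, and the commented-out mfmonmx invocation are illustrative, not the customer's actual configuration):

```shell
#!/bin/sh
set -e

# Work in a scratch directory so the sketch is self-contained;
# in the real job this would be the synchronizer's working directory.
SYNC_DIR=$(mktemp -d)
STATE="$SYNC_DIR/mfmon.dat"
SEED="$SYNC_DIR/mfmon.dat.initial"

# Stand-in for the freshly generated .dat that the old job copied in nightly.
echo "initial member hashes" > "$SEED"

sync_run() {
    # Seed the state file only on the very first run; later runs reuse
    # the existing copy, so the hashes computed previously survive and
    # only new or changed members need re-hashing.
    if [ ! -f "$STATE" ]; then
        cp "$SEED" "$STATE"
    fi
    # mfmonmx "$STATE" ...   # actual invocation is site-specific
}

sync_run                            # first run: state file is seeded
echo "updated hashes" > "$STATE"    # synchronizer updates its state
sync_run                            # later run: existing state is kept
cat "$STATE"                        # prints "updated hashes"
```

The key design point is the existence check: the nightly copy is demoted to a one-time seeding step, so the accumulated hash state is never thrown away.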