Restrict simultaneous execution of Accounting Processor
Description
Environment
Activity

Deepak Pansheriya February 26, 2019 at 2:54 PM
After careful consideration, it looks like the semaphore approach will not work in either case. Making posting parallel can introduce issues with average costing.
For example, if we implement a record-level semaphore (the same as in document completion) and we have two accounting processors executing, one posting a material receipt (MR) while at the same time another picks up a shipment (completed after the MR) that contains a product received on that MR whose posting is still in progress, the shipment will carry wrong cost information.
Parallel execution is only possible if documents are categorized into those that use or update cost versus those that do not use cost at all, for example credit memos, GL journals, or custom documents, when average costing is in use; or in implementations that use only standard costing.
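To make the cost problem in the MR/shipment scenario above concrete, here is a small worked example with invented numbers (plain Java, not iDempiere costing code): a shipment costed before the MR that feeds it has finished posting picks up the stale moving average.

// Illustrative only: why posting order matters for moving average cost.
// All quantities and prices are invented; this is not iDempiere costing code.
public class MovingAverageOrderDemo {
    public static void main(String[] args) {
        double qtyOnHand = 10, avgCost = 5.00;   // opening stock: 10 units @ 5.00

        // Correct order: the MR (10 units @ 7.00) is posted first and updates the average.
        double mrQty = 10, mrCost = 7.00;
        double newAvg = (qtyOnHand * avgCost + mrQty * mrCost) / (qtyOnHand + mrQty);  // 6.00
        double shipmentCostCorrect = 5 * newAvg;   // shipment of 5 units -> 30.00

        // Wrong order: the shipment is costed while the MR posting is still in progress,
        // so it still sees the old average of 5.00.
        double shipmentCostWrong = 5 * avgCost;    // -> 25.00

        System.out.printf("correct=%.2f, wrong=%.2f%n", shipmentCostCorrect, shipmentCostWrong);
    }
}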
So one approach to control how much parallelism can be achieved is to group by table and document type. The default is still that one accounting processor processes the records of all tables, as we do currently, but an admin can configure accounting processors to run for multiple groups of document types or tables (a sketch follows at the end of this comment).
Now the accounting processor can be configured to run on a single node using the current IP-based configuration on the scheduler, so no extra check is required. When an implementation wants to make sure it does not run simultaneously, we can advise hiding the menu entry for the accounting processor process.
The only known issue remaining is immediate posting, which can post a document before the documents it depends on; that still needs further thought.
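As a rough illustration of the grouping idea in this comment, here is a minimal sketch; the class AcctProcessorGroup and the table assignments are hypothetical (nothing like this exists in iDempiere yet), they only show how a processor could be restricted to its configured group while the costing-sensitive tables stay together in one serialized group.

// Hypothetical sketch only: grouping of accounting processors by table.
// Neither the class nor the configuration exists in iDempiere; the table assignments are examples.
import java.util.Set;

public class AcctProcessorGroup {
    private final String name;
    private final Set<String> tableNames;   // empty set = default group = process all tables

    public AcctProcessorGroup(String name, Set<String> tableNames) {
        this.name = name;
        this.tableNames = tableNames;
    }

    public String getName() {
        return name;
    }

    /** A processor bound to this group only picks up records of its own tables. */
    public boolean accepts(String tableName) {
        return tableNames.isEmpty() || tableNames.contains(tableName);
    }

    public static void main(String[] args) {
        // Costing-sensitive documents stay in one group so their posting order is preserved;
        // documents that never touch cost can be posted by a second, parallel processor.
        AcctProcessorGroup costing = new AcctProcessorGroup("costing",
                Set.of("M_InOut", "M_Production", "M_MatchInv"));
        AcctProcessorGroup nonCosting = new AcctProcessorGroup("non-costing",
                Set.of("GL_Journal", "C_Payment"));
        System.out.println(costing.accepts("M_InOut"));      // true
        System.out.println(nonCosting.accepts("M_InOut"));   // false
    }
}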
Carlos Ruiz February 25, 2019 at 1:35 PM
Hi Deepak,
> as we have complex logic for sequencing document posting based on time it is completed
This is because average costing is affected by the order in which the documents are posted. Posting by completed time was provided to get the same average cost when reposting.
> what we think is single accounting processor, processing documents is good enough
The current approach allows defining several accounting processors for different tables. I guess that can be a required option in high-volume implementations, so it is preferable not to destroy that feature.
> to add Accounting processor Running on detail on client info
Implementing a semaphore sounds tricky, as there are four ways to start posting:
manual posting (pushing the button in the document, potentially many users doing it in parallel)
immediate posting
running client accounting processor in menu (potentially multiple users in parallel)
server accounting processors (potentially multiple)
Taking all of that into account, I think the easiest and safest is maybe what I suggested above:
implement the setting of the Processing flag on the record being processed at the beginning, out of the transaction
We implemented something similar when completing documents to avoid double prepare or double complete - that used to happen in old Compiere.
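As a rough sketch of that suggestion (plain JDBC to stay self-contained, not the iDempiere DB/Trx API, and the exact WHERE condition is only an assumption): the Processing flag is claimed in a statement that commits immediately, before the posting transaction starts, so a second processor sees the claim right away and backs off.

// Illustrative compare-and-set of the Processing flag, committed outside the posting transaction.
// Plain JDBC is used only to keep the sketch self-contained; the condition is an assumption.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PostingClaim {
    /** Returns true if this processor managed to claim the record for posting. */
    public static boolean tryClaim(Connection conn, String tableName, int recordId) throws SQLException {
        String sql = "UPDATE " + tableName
                + " SET Processing='Y'"
                + " WHERE " + tableName + "_ID=? AND Processed='Y'"
                + " AND (Processing IS NULL OR Processing='N')";
        boolean oldAutoCommit = conn.getAutoCommit();
        conn.setAutoCommit(true);            // commit immediately, outside the posting transaction
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, recordId);
            return ps.executeUpdate() == 1;  // 0 rows updated -> another processor already claimed it
        } finally {
            conn.setAutoCommit(oldAutoCommit);
        }
    }
}

If the claim fails, the record is simply skipped in this run and picked up again later; the flag also has to be reset to 'N' when posting finishes or fails, which is what makes the crash/undo concern raised in the comments relevant.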
Regards,
Carlos Ruiz

Deepak Pansheriya February 25, 2019 at 12:59 PM
We thought of creating batches and letting documents be posted in parallel on different servers or in different threads. But as we have complex logic for sequencing document posting based on the time it is completed, and a shipment normally depends on material receipts and production because of average costing, that does not sound safe.
So what we think is that a single accounting processor processing documents is good enough. Updating all records would also contribute a significant performance hit, and we would need to undo it if our scheduler failed or the server crashed.
So what we have decided is to add an 'Accounting Processor Running On' detail on client info: when this column is null, the accounting processor can continue for that client; if an IP is present in this column, there is no need to execute. Whenever the server restarts, it checks whether this column holds its own IP and, if so, clears it so the next execution can continue.
To me this looks like the simplest and safest design. Please advise if anyone can see any risk here.
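A minimal sketch of the design described above, assuming a hypothetical AcctProcessorRunningOn column on AD_ClientInfo (the column name only stands in for the proposed 'Accounting Processor Running On' detail; it does not exist today):

// Sketch of the proposed client-level lock. The column AcctProcessorRunningOn is hypothetical;
// only AD_ClientInfo and AD_Client_ID are existing names.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ClientAcctProcessorLock {

    /** Claim the client for this node; returns false if another IP already holds the lock. */
    public static boolean claim(Connection conn, int clientId, String myIp) throws SQLException {
        String sql = "UPDATE AD_ClientInfo SET AcctProcessorRunningOn=?"
                + " WHERE AD_Client_ID=? AND AcctProcessorRunningOn IS NULL";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, myIp);
            ps.setInt(2, clientId);
            return ps.executeUpdate() == 1;   // 0 rows -> another node is already running the processor
        }
    }

    /** Release the lock when the run finishes. */
    public static void release(Connection conn, int clientId, String myIp) throws SQLException {
        String sql = "UPDATE AD_ClientInfo SET AcctProcessorRunningOn=NULL"
                + " WHERE AD_Client_ID=? AND AcctProcessorRunningOn=?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, clientId);
            ps.setString(2, myIp);
            ps.executeUpdate();
        }
    }

    /** On server startup, clear a stale lock left behind by a crash of this same node. */
    public static void clearStaleLockOnStartup(Connection conn, String myIp) throws SQLException {
        String sql = "UPDATE AD_ClientInfo SET AcctProcessorRunningOn=NULL WHERE AcctProcessorRunningOn=?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, myIp);
            ps.executeUpdate();
        }
    }
}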
Carlos Ruiz February 22, 2019 at 1:23 PM
Hi, raising the priority of this suggested improvement as the actual behavior can cause duplicate postings, which is data corruption.
One way, as you suggest, would be to implement some restriction on the accounting processor.
Another way could be to implement the setting of the Processing flag on the record being processed at the beginning, out of the transaction.
I mean, at this moment in Doc.post (line 537 in version 6.2) there is an update to Processing=Y for the record being processed.
I think maybe the problem is there: either that update is not behaving as expected (the first processor must do the update, and the second must wait because the record is locked), or we need to implement an extra check there - if the record being processed already has Processing=Y, skip it, because it means another processor is working on it.
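For illustration only (this is not the actual Doc.post code), the extra check could look like the sketch below: read the Processing flag and skip the record when another processor has already set it.

// Sketch of the suggested extra check; not the actual Doc.post code.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class PostingSkipCheck {
    /** True if another processor already set Processing='Y' on this record, so it should be skipped. */
    public static boolean alreadyInProgress(Connection conn, String tableName, int recordId) throws SQLException {
        String sql = "SELECT Processing FROM " + tableName + " WHERE " + tableName + "_ID=?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, recordId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() && "Y".equals(rs.getString(1));
            }
        }
    }
}

A plain read like this still leaves a small window between the check and the later update, so folding the condition into the UPDATE itself and testing the affected row count (as in the sketch under the February 25 comment above) is the safer variant.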
Regards,
Carlos Ruiz
Details
Assignee

Reporter

If two instances of the Accounting Processor are running for the same client, it sometimes causes duplicate postings. We have faced this as well.
To resolve it, we can restrict simultaneous execution of the Accounting Processor for the same client and also make sure we always run it on a single server.