Fundraising/Data and flow/Audits
This page documents the workflow for processing audit files from payment processors and importing missing messages into CiviCRM.
For all payment processors except PayPal, the parse-audit drush command processes the audit files. The code in the Drupal module handles reading the list of files from the directory, searching the database for existing transactions, and, for each transaction that isn't in the database, finding the missing information in the payments-wiki logs (mounted at /srv/archive/frlog1001/logs). The code that parses the individual files into an array of normalized transactions lives in the SmashPig codebase, in classes that implement the AuditParser interface.
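The overall flow can be sketched as below. This is a minimal illustration, not the real Drupal/SmashPig implementation; every callable and field name here (`parse_file`, `find_in_db`, `find_in_logs`, `enqueue`, `gateway_txn_id`) is a hypothetical stand-in.

```python
import os

def process_audit_files(incoming_dir, parse_file, find_in_db, find_in_logs, enqueue):
    """Sketch of the audit workflow: parse each file into normalized
    transactions, skip those already in CiviCRM, recover any missing
    details from the payments-wiki logs, and enqueue the rest.
    All callables are hypothetical stand-ins for the real code."""
    for name in sorted(os.listdir(incoming_dir)):
        path = os.path.join(incoming_dir, name)
        for txn in parse_file(path):  # parser yields normalized dicts
            if find_in_db(txn["gateway_txn_id"]):
                continue  # transaction already imported
            extra = find_in_logs(txn["gateway_txn_id"])  # payments-wiki logs
            txn.update(extra or {})
            enqueue(txn)
```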
Audit files are located on civi1001 in /var/spool/audit/[payment-processor] and divided into two directories: incoming and completed.
Payment Processor Specific Information
Ingenico (new integration)
Transactions from the new API come in the same file as transactions from the old API (see below), but these transactions can be identified because they have the tag EmailTypeIndicator.
The files are located in /var/spool/audit/globalcollect/incoming. While there are multiple kinds of files in this directory, the ones we parse begin with wx1 and are in xml.gz format.
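Selecting and decompressing those reports can be sketched as follows; the helper names are illustrative, not the actual SmashPig code.

```python
import fnmatch
import gzip
import os

def wx1_files(incoming_dir):
    """Return paths of the GlobalCollect files we actually parse:
    names starting with 'wx1', in gzipped-XML (xml.gz) format."""
    return [
        os.path.join(incoming_dir, name)
        for name in sorted(os.listdir(incoming_dir))
        if fnmatch.fnmatch(name, "wx1*") and name.endswith(".xml.gz")
    ]

def read_report(path):
    """Decompress one report and return its XML text."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return f.read()
```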
We pay attention to records classified with these codes (Recordcategory + recordtype):
- XON - Credit card item that has been processed, but not settled.
- +IP - Settled "Invoice Payment". Could be invoice, bank transfer (bt), real-time bank transfer (rtbt), check, prepaid card, e-wallet (ew), or cash
- -CB - Credit card chargeback
- -CR - Refund on collected credit card payment
- XCR - Any old refund
- +AP - Direct Debit collected
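The code-to-category mapping above can be expressed as a simple lookup; this is a sketch with assumed field names (`Recordcategory`, `Recordtype` as dict keys), not the parser's actual data structure.

```python
# Record codes we act on, keyed by Recordcategory + recordtype,
# with the descriptions from the list above.
RECORD_CODES = {
    "XON": "credit card processed, not settled",
    "+IP": "settled invoice payment",
    "-CB": "credit card chargeback",
    "-CR": "refund on collected credit card payment",
    "XCR": "any old refund",
    "+AP": "direct debit collected",
}

def classify(record):
    """Return the description for a record we care about, or None
    to skip it. Assumes the record exposes its Recordcategory and
    Recordtype fields as plain dict keys."""
    code = record.get("Recordcategory", "") + record.get("Recordtype", "")
    return RECORD_CODES.get(code)
```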
PayPal
PayPal uses a Python script deployed in the fundraising tools repository: https://phabricator.wikimedia.org/diffusion/WFTO/browse/master/audit/paypal/
We currently use PayPal's Express Checkout integration, and classify those donations using gateway value 'paypal_ec'. However, we still have some legacy recurring donations from the old integration, and those are classified with gateway value 'paypal'.
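That classification amounts to a simple rule, sketched here with a hypothetical `legacy_recurring` field name:

```python
def paypal_gateway(transaction):
    """Pick the gateway value for a PayPal donation: current Express
    Checkout donations are classified 'paypal_ec', while recurring
    donations left over from the old integration stay 'paypal'.
    The 'legacy_recurring' flag is a hypothetical field name."""
    return "paypal" if transaction.get("legacy_recurring") else "paypal_ec"
```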
Adyen
Adyen files are not produced nightly; rather, a file is produced when a 'batch' has been completed. We get an IPN message with the report path and download the file using a DownloadReportJob under the jobs-adyen job queue runner.
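What such a download job does can be sketched like this; the `reportPath` field and the `download` callable are assumptions, not the real DownloadReportJob API.

```python
import os

def handle_report_ipn(ipn, download, spool_dir):
    """Sketch of a report-download job: take the report path from the
    IPN message, fetch the file, and drop it into the incoming audit
    directory. 'download' stands in for the actual fetch; the
    'reportPath' field name is an assumption."""
    report_path = ipn["reportPath"]          # path announced in the IPN
    content = download(report_path)          # fetch the batch report
    dest = os.path.join(spool_dir, "incoming", os.path.basename(report_path))
    with open(dest, "wb") as f:
        f.write(content)
    return dest
```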
Amazon
Instead of an SFTP download, we have to call methods on the Amazon Pay SDK to get our reports. This is kicked off with the DownloadReports PHP script in SmashPig.
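The shape of that flow is roughly the following. This is a heavily hedged sketch: `client` is a hypothetical wrapper with `list_reports()`/`get_report()` methods, not the real Amazon Pay SDK API.

```python
import os

def download_amazon_reports(client, spool_dir):
    """Sketch of the Amazon flow: instead of SFTP, ask the SDK for
    available reports and save each one to the incoming audit
    directory. 'client' is a hypothetical wrapper, not the actual
    Amazon Pay SDK interface."""
    saved = []
    for report_id in client.list_reports():
        body = client.get_report(report_id)
        dest = os.path.join(spool_dir, "incoming", report_id + ".csv")
        with open(dest, "wb") as f:
            f.write(body)
        saved.append(dest)
    return saved
```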
Clearing out backed-up audit files
Sometimes the audit processor can't resolve all the transactions in a file, even after trying for several days. This can lead to a build-up of files in the incoming directory, and subsequent processor runs get longer and longer until they start timing out. The solution is to manually move the older files from the incoming directory to the completed directory. Since our personal accounts don't have permission to move the files, we do this with a one-off process-control job such as ingenico_move_audit_files. Since process control runs each command as a separate process under Python, we need to wrap any file globs that we want expanded in 'sh -c', for example:
sh -c "mv /srv/archive/civi1001/audit/globalcollect/incoming/wx1*202010[0-5]*xml* /var/spool/audit/globalcollect/"
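The reason for the 'sh -c' wrapping can be demonstrated directly: when a command is run as a bare argument list (as process control does), no shell is involved and a '*' is passed through literally, whereas handing the whole string to 'sh -c' lets the shell expand the glob.

```python
import subprocess

def run_with_glob(command):
    """Run a command string under 'sh -c' so the shell expands any
    globs in it, and return its standard output. Running the same
    command as a bare argument list would leave '*' unexpanded."""
    return subprocess.run(
        ["sh", "-c", command], capture_output=True, text=True
    ).stdout
```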
The directories are set up in the Drupal module here:
Run the audit parser:
drush parse-audit ingenico
When the parser finds a refund, it puts a message on the refund queue, which can then be consumed:
drush --user=1 -v -r /srv/civi-sites/wmff/drupal rfdqc