Cron is a software utility in Unix-like operating systems that runs scheduled jobs automatically at fixed times, dates, or intervals.
BriteCore’s DayCron script is executed on every live site’s Leader node around midnight every day.
Note: The frequency is based on the type of job. Advanced settings control the frequency for some of the utilities.
As the script executes, it logs to log/daycron.log and records the status of individual steps in the cron_jobs database table.
DayCron/Nightly processing dependencies
All sites (live, demo, and test) have a nightly processing file that executes a number of other processes, depending on whether it was called to run daycron or minutecron jobs.
Part of this script ensures that any site not configured to be truly live skips processing (this is why most demo and all test sites do nothing overnight).
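The live-site guard described above can be sketched as follows; the setting name and values are hypothetical stand-ins for BriteCore's actual configuration keys:

```python
def should_run_nightly(settings):
    """Skip nightly processing unless the site is configured as truly live.

    `site_mode` is an illustrative setting name, not BriteCore's real key.
    """
    return settings.get("site_mode") == "live"


print(should_run_nightly({"site_mode": "live"}))   # → True
print(should_run_nightly({"site_mode": "demo"}))   # → False
```

Because the check happens before any jobs are queued, demo and test sites exit early and never touch the cluster.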
When running DayCron jobs, BriteCore’s DayCron script kicks off the following subprocesses and functions, each of which sends jobs to the cluster:
- Process premium records
- Generate reports
- Process commissions
- Upload to vendors
Together, these processes send jobs to the instances they run on to be processed asynchronously; each result is returned to the originating instance upon completion.
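The dispatch-and-collect pattern described above can be sketched with a thread pool; this is a simplified, hypothetical stand-in for BriteCore's actual cluster queue, and the job function is a placeholder:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed


def process_premium_records(batch):
    # Placeholder for the real premium-records work done on a cluster node.
    return f"premiums:{batch}"


def dispatch_jobs(batches):
    """Send each batch out asynchronously and collect results back
    on the originating instance as the jobs complete."""
    results = []
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(process_premium_records, b) for b in batches]
        for future in as_completed(futures):
            results.append(future.result())
    return results


print(sorted(dispatch_jobs([1, 2, 3])))  # → ['premiums:1', 'premiums:2', 'premiums:3']
```

Results arrive in completion order, not submission order, which is why the caller sorts or otherwise correlates them before use.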
Details on Nightly processing dependencies
View additional details on dependencies
DayCron script
Before any jobs are queued at all, the DayCron script sets two “master” Redis locks:
- <client-name>:locks:df_cache
- <client-name>:locks:df_prep_cache
These locks represent that the processes for building data frames (DF) and prepared DF cache haven’t yet completed. Neither master lock is meant to be released until all locks nested under it have been released.
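A minimal in-memory sketch of this nested-lock pattern follows. In production these are Redis keys; the class and the lock names here are illustrative only:

```python
class LockRegistry:
    """Toy stand-in for DayCron's Redis locks: a master lock is
    released only after every sub-lock beneath it is gone."""

    def __init__(self):
        self.locks = set()

    def acquire(self, name):
        self.locks.add(name)

    def release(self, name):
        self.locks.discard(name)
        # If no sub-locks remain beneath the parent, release it too.
        master, _, _ = name.rpartition(":")
        if master and not any(l.startswith(master + ":") for l in self.locks):
            self.locks.discard(master)


registry = LockRegistry()
registry.acquire("client:locks:df_cache")           # master lock
registry.acquire("client:locks:df_cache:quotes")    # nested sub-locks
registry.acquire("client:locks:df_cache:claims")
registry.release("client:locks:df_cache:quotes")    # master still held
registry.release("client:locks:df_cache:claims")    # last sub-lock frees master
print(sorted(registry.locks))  # → []
```

Releasing a sub-lock while a sibling is still held leaves the master in place, which is the behavior the DayCron steps below depend on.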
Premium processing
This script kicks off the majority of the DF cache-related tasks each morning.
- Chunks the work of processing premium records into an arbitrary number of jobs. Each job sets a new Redis lock in the format <client-name>:locks:premium_records_X, where X is the index of that particular premium records job. These locks are released as each premium records job completes.
- Ensures the <client-name>:locks:df_prep_cache master lock is set and sets the first sub-key beneath it: <client-name>:locks:df_prep_cache:item_trans. Similar to the DF cache master lock, this lock signals that item transactions haven’t yet been built this morning.
- Sends the “parent” job for building DF cache to its cluster. Each DF cache build job that executes groups together a number of caches that need building, setting a new Redis lock under the df_cache master lock and releasing it as each individual DF is built. Once all DF caches have been built, the master df_cache lock is released.
- Sends the item transaction build job to its cluster. This script executes only after all DF cache has finished building. Once item transactions finish building, the df_prep_cache:item_trans lock is released, and if no other locks are set under df_prep_cache, the df_prep_cache master lock itself is released. This can be a source of problems!
- Chunks and builds prepared DF cache once all DF cache and item transactions have been built. If the df_prep_cache master lock was released too early, this process re-locks it. Afterward, individual Redis locks are set for each prepared DF that needs to be built and released as each build completes.
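The chunk-and-lock flow in the first step above can be sketched as follows; the chunk size and lock names are illustrative, not BriteCore's actual values:

```python
def chunk(records, size):
    """Split the premium records into fixed-size jobs."""
    return [records[i:i + size] for i in range(0, len(records), size)]


def run_premium_jobs(records, size=3):
    """Set one lock per chunked job, then release each as it completes."""
    locks = set()
    completed = []
    jobs = chunk(records, size)
    # One lock per job, e.g. client:locks:premium_records_0.
    for i, _ in enumerate(jobs):
        locks.add(f"client:locks:premium_records_{i}")
    for i, job in enumerate(jobs):
        completed.extend(job)  # stand-in for the real processing work
        locks.discard(f"client:locks:premium_records_{i}")
    return completed, locks


done, remaining = run_premium_jobs(list(range(7)))
print(done, remaining)  # all 7 records processed, no locks left
```

Any lock still present after the run would indicate a premium-records job that never finished, which is what downstream steps check for.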
Report runner
Depending on the time of month/year when DayCron executes, different reports run for each client. The job that runs these reports on cluster nodes is queued by the reports utility. It won’t execute until all DF cache and prepared DF cache locks are released (this is meant to wait for all DF cache to be built, but we currently don’t always wait for that, because building item transactions can also release the prepared DF cache lock).
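A hypothetical polling gate for this dependency might look like the following; the timings, lock names, and function signature are illustrative, not BriteCore's actual implementation:

```python
import time


def wait_for_locks(get_locks, prefixes, timeout=5.0, poll=0.05):
    """Block until no lock with any of the given prefixes remains held,
    or raise TimeoutError when the deadline expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        held = [l for l in get_locks() if l.startswith(tuple(prefixes))]
        if not held:
            return True
        time.sleep(poll)
    raise TimeoutError(f"locks still held: {held}")


# Reports start only once the df_cache and df_prep_cache locks are gone.
locks = set()  # pretend all cache builds have finished
print(wait_for_locks(lambda: locks,
                     ("client:locks:df_cache", "client:locks:df_prep_cache")))  # → True
```

A gate like this is only as reliable as the locks themselves, which is why an early release of the df_prep_cache master lock can let reports run before all cache is built.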
Process commissions
This is another script that executes differently based on the time of month; unlike the jobs above, however, it has no dependencies.
Upload to vendors
NxTech processing is sent to the supercluster for clients who integrate with that vendor. DF cache must finish building for this job to execute.
Frequency
The utilities run on a daily, monthly, and annual frequency. The tables below summarize the jobs processed.
Daily Processing jobs
View daily processing jobs
Table 1 summarizes the daily processing jobs.
Tool | Description |
--- | --- |
apply_pnc_lockbox_payments | Processes PNC payments. PNC is a third-party integration that provides a lockbox financial service to BriteCore customers. |
create_mailings | Merges, by recipient and policy, any documents that would print on a given day (date_cursor) into a single PDF ('Individual Mailing') and assigns that PDF a batch based on settings.printing.batches and the individual documents that comprise the merged PDF. |
dropped_non_ratables_report | Custom report of non-ratable items that have dropped from a policy's builder across terms. |
failed_check_payments | Sends an email alert for failed outgoing, QuickBooks-specific payments. |
generate_missing_deliverables | Finds any records of type “Invoice,” “Non Pay,” or “Return Premium” for the date DayCron is running and generates a snapshot (billing info, such as premium_overview: written premium, fees, etc., and details, such as whether it is paid in full). Once the snapshot is generated, it creates a deliverable connected to the record if a deliverable doesn't already exist. |
generate_non_renewal_notices | If a policy is past its payment due date, creates a non-renewal notice deliverable. |
generate_pre_auth_schedules | If an insured has elected to make payments automatically, sends them a schedule of the dates on which payments will be pulled from their credit/debit card. |
generate_return_premium | Sends out return premiums over the client-set dollar and day thresholds. |
index_edeliverable_documents | Creates EDeliverableDocument records in an index for further E-Delivery functionality. |
kill_abandoned_reports | Finds and kills any reports that were never run (obsolete jobs that were never started and should be removed). |
move_credits_on_expired_terms | Moves any credits on expired policy terms forward to the next term to prevent them from getting lost to the past. |
mysql_backup | Backs up the MySQL database and pushes it to S3. |
non_pay_cancel_pending_amount_due_report | Custom report for viewing policies in non-pay cancellation pending and amounts past due. |
process_auto_pays | Processes auto-pays for all eligible policies. |
process_active | Retrieves all policies with the status 'Cancellation Pending, Non-Payment of Premium' and checks whether they need to be marked active. This is a double-check for what should happen automatically when a payment is made, so it should not normally be needed; it catches changes made to a policy manually outside the context of a payment. |
process_cancellation_pending_or_non_renewals | Retrieves any non-renewal policy revision whose expiration date is today, cancels the policy, and generates a final billing statement or creates return premium for the insured. |
process_combined_billing | If an insured has multiple policies with the client, creates a combined billing statement for them instead of sending separate bills to the same address. |
process_non_pays_and_cancellations | Places policies into non-pay cancellation pending where appropriate. |
process_renewals | Finds all policies within a date range built around today's date and creates a renewal for each where required. |
reports.launch_periodic | Runs all brite_data reports tagged to run daily, weekly, monthly, and annually. |
save_notes | Loops through the contents of the instance's notes list and saves them in the db using the class config file. |
send_deliv_emails | Sends all emails to agents/billwhoms if the nightly-processing option in the settings table is set to automatic. |
send_emails | Loops through the contents of the instance's email list and sends out all the emails using the class config file. |
submit_commission_payments | Finalizes the commission process. |
submit_sweep_payments | Submits payments in the sweep queue that were made x days ago (determined in settings). This only applies to clients who have opted to hide the 'Submit Sweep' button. The days setting gives agents time to deposit money in the bank before their accounts are swept. |
submitted_applications_report | Custom submitted applications report. |
upload_print_backup | Runs `upload_print_backup` for a given date range to upload documents already printed for the client to their DropBox account. |
validate_holidays | Checks for holidays, as holidays affect nightly processing of deliverables. |
write_premium_records | Writes premium records for the day. |
Monthly Processing
View monthly processing jobs
Table 2 summarizes the monthly processing jobs.
Tool | Description |
--- | --- |
send_aplus_claim_rpt | Retrieves all named insured claim/loss data for the current month, enters the data into a fixed-width flat file, and sends it to ISO A-PLUS via FTP. |
send_lexisnexis_claim_rpt | Retrieves all claim/loss data for the current month, creates a fixed-width flat file, and sends it to LexisNexis via VPN. |
write_commission_payments | Writes commission payments to the commission_payments table. Runs on the first of each month. |
run_stone_river_commissions | Processes Stone River commissions. |
run_month_end_reports | Runs a set of reports for the previous month. |
Annual Processing
View annual processing jobs
Table 3 summarizes the annual processing jobs.
Tool | Description |
--- | --- |
run_year_end_reports | Runs a set of reports using report runner for the previous year. |