
How to Calibrate Mortality Tables With Contactless Vitals Data

How contactless vitals data from rPPG can calibrate mortality tables for life insurance underwriting, with research on resting heart rate, HRV, and actuarial modeling.

ayhealthbenefits.com Research Team

Mortality tables sit at the foundation of every life insurance product ever written. They are, in the most literal sense, the math that makes life insurance work. And for over a century, actuaries have calibrated those tables using a relatively stable set of inputs: age, sex, smoking status, medical history from attending physician statements, and whatever a paramedical exam turns up. The question facing the industry right now is whether contactless vitals data — captured through a smartphone camera in under a minute — can meaningfully improve how those tables reflect actual mortality risk.

"The relative risk of all-cause mortality with every 10 beats/min increment of resting heart rate was 1.09 (95% CI 1.07–1.12)." — Zhang et al., Canadian Medical Association Journal, 2016

What calibrating mortality tables with contactless vitals data actually means

Calibration, in actuarial terms, is the process of adjusting expected mortality rates so they better match observed experience. The standard industry mortality table in the U.S. is the 2017 CSO (Commissioners Standard Ordinary), maintained by the Society of Actuaries. Carriers take this base table and apply credibility-weighted adjustments based on their own book of business. If a carrier's actual claims experience shows lower mortality than the table predicts for a given risk class, they adjust downward. Higher than expected, they adjust up.
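In its simplest form, that adjustment is an actual-to-expected (A/E) scaling of the base table. A minimal sketch, with hypothetical rates and claim counts rather than real 2017 CSO values:

```python
# Experience-based calibration sketch: scale a base table's expected
# mortality rates by the carrier's observed actual-to-expected ratio.

def ae_ratio(actual_deaths, expected_deaths):
    """Actual-to-expected mortality ratio for a risk class."""
    return actual_deaths / expected_deaths

def calibrate(base_qx, ratio):
    """Adjust base table rates q_x by the observed A/E ratio."""
    return {age: q * ratio for age, q in base_qx.items()}

# Hypothetical base rates (deaths per life-year) for a preferred class
base_qx = {45: 0.0015, 46: 0.0016, 47: 0.0018}

# Carrier's book: 72 actual deaths against 90 expected, so experience
# is lighter than the table and the rates are adjusted downward
ratio = ae_ratio(72, 90)                 # 0.8
adjusted = calibrate(base_qx, ratio)
print(round(adjusted[45], 6))            # 0.0012
```

In practice this scaling is applied per risk class and blended with the base table by credibility weight, but the core operation is this simple.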

The problem is that traditional calibration inputs are coarse. A 45-year-old nonsmoking male with no flagged medical history gets slotted into the same risk band whether his resting heart rate is 58 bpm or 82 bpm. Those two people have meaningfully different cardiovascular risk profiles, but the underwriting process has historically had no practical way to capture that difference at scale without ordering a full paramedical exam.

Contactless vitals data changes that equation. Remote photoplethysmography (rPPG) extracts heart rate, heart rate variability, respiratory rate, and blood pressure estimates from facial video captured by a standard smartphone camera. The measurement takes 30 to 60 seconds. There is no equipment to ship, no nurse to schedule, no lab to process. The data is available immediately.

The mortality signal in resting heart rate

The connection between resting heart rate and mortality is one of the most replicated findings in cardiovascular epidemiology. A 2016 meta-analysis by Zhang et al. published in the Canadian Medical Association Journal pooled 46 studies covering over 1.2 million patients. They found that each 10 bpm increase in resting heart rate corresponded to a 9 percent increase in all-cause mortality risk and an 8 percent increase in cardiovascular mortality risk.
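Assuming the log-linear dose-response implicit in a per-10-bpm estimate, the pooled figure can be extrapolated to other heart-rate gaps, such as the 24 bpm difference between the two applicants described earlier:

```python
import math

# Extrapolate the Zhang et al. pooled estimate (RR 1.09 per +10 bpm)
# to an arbitrary heart-rate difference, assuming log-linearity.
def relative_risk(delta_bpm, rr_per_10=1.09):
    return math.exp((delta_bpm / 10) * math.log(rr_per_10))

# 58 bpm vs 82 bpm applicants: a 24 bpm gap
print(round(relative_risk(82 - 58), 2))  # ~1.23
```

Under that assumption, the higher-heart-rate applicant carries roughly 23 percent greater all-cause mortality risk, before any other factor is considered.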

Munich Re's actuarial research team published an analysis examining how biometric data segments mortality risk. Their model showed that resting heart rate alone can produce meaningful mortality segmentation. A cohort with resting heart rates in the low range showed mortality rates roughly 40 percent lower than a cohort with elevated resting heart rates, even after controlling for age and other standard underwriting factors.

That is a large signal. For context, the mortality difference between preferred and standard risk classes in most carriers' rate structures is 30 to 50 percent. Resting heart rate data, collected in under a minute through a phone camera, can produce segmentation of comparable magnitude.

| Biometric Signal | Mortality Segmentation Power | Data Source | Collection Method | Time to Capture |
| --- | --- | --- | --- | --- |
| Resting heart rate | 1.4x between low and high cohorts | Munich Re analysis | rPPG via smartphone camera | 30-60 seconds |
| Heart rate variability (SDNN) | 1.6x between normal and reduced HRV | Framingham Heart Study data | rPPG via smartphone camera | 30-60 seconds |
| Physical activity (steps/day) | 1.8x between active and sedentary | Wearable device studies | Wearable required | Continuous wear |
| Sleep duration | 1.3x between 7h and 4h sleepers | Wearable device studies | Wearable required | Continuous wear |
| Blood pressure | 2.1x between normal and Stage 2 HTN | Clinical measurement | Traditional cuff or rPPG | 30 seconds to 5 minutes |
| Smoking status | 2.0x smoker vs nonsmoker | Self-report + cotinine test | Questionnaire + lab | Days for lab results |

Heart rate variability adds another layer

Heart rate variability (HRV) measures the variation in time intervals between consecutive heartbeats. Higher HRV generally indicates better autonomic nervous system function and greater cardiovascular resilience. Low HRV has been associated with increased mortality risk independent of mean heart rate.

The Framingham Heart Study, one of the longest-running cardiovascular research cohorts in the world, has contributed data on HRV and mortality outcomes spanning decades. Research from the study has shown that reduced HRV in middle-aged adults predicts cardiovascular events and all-cause mortality even after adjusting for traditional risk factors. A 2024 analysis published in the Journal of Medical Internet Research by researchers at multiple institutions examined resting heart rate associations across clinical measures and confirmed the independent predictive value of cardiac rhythm metrics.

For mortality table calibration, HRV provides information that resting heart rate alone misses. Two people with identical resting heart rates of 72 bpm may have very different HRV profiles. The person with higher HRV has a more adaptable cardiovascular system. The person with reduced HRV may be showing early signs of autonomic dysfunction that will not appear in traditional underwriting data for years.

rPPG can extract HRV metrics from the same facial video scan used to measure heart rate. No additional measurement step is needed. The interbeat interval data comes from the same photoplethysmographic signal, just analyzed differently.
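The two standard time-domain HRV metrics, SDNN and RMSSD, are simple statistics over the interbeat-interval (IBI) series. A minimal sketch, with an illustrative IBI sequence:

```python
import statistics

# SDNN and RMSSD computed from interbeat intervals in milliseconds,
# as could be derived from an rPPG pulse signal. Values illustrative.

def sdnn(ibis_ms):
    """Standard deviation of all interbeat intervals (SDNN)."""
    return statistics.stdev(ibis_ms)

def rmssd(ibis_ms):
    """Root mean square of successive IBI differences (RMSSD)."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Eight beats from a subject averaging roughly 73 bpm
ibis = [812, 845, 790, 860, 828, 795, 851, 820]
print(round(sdnn(ibis), 1), round(rmssd(ibis), 1))
```

A production pipeline would compute these over the full 30-60 second scan and apply artifact rejection first, but the metrics themselves are this direct.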

How actuaries can integrate contactless data into table construction

The Society of Actuaries published a 2024 essay series on AI-driven longevity that discussed how new data sources, including continuous biometric monitoring, are changing mortality modeling approaches. The SOA's Retirement Plans Experience Committee (RPEC) continues to update mortality improvement scales annually, and their 2025 update noted that emerging data through mid-2025 showed small residual excess mortality in the 65+ population from pandemic aftereffects.

Building contactless vitals into mortality table calibration does not require throwing out existing methodology. The graduation techniques actuaries have used for decades — Whittaker-Henderson smoothing, cubic spline fitting, parametric models like Makeham-Gompertz — still apply. What changes is the input data.
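As an illustration of how that existing machinery carries over, here is a minimal Whittaker-Henderson graduation of crude rates, penalizing second-order differences. The rates and exposures are illustrative, not from a real experience study:

```python
import numpy as np

# Whittaker-Henderson graduation: choose smoothed rates v minimizing
#   sum_x w_x (v_x - u_x)^2  +  lam * sum (second differences of v)^2
# which has the closed-form solution (W + lam K'K) v = W u.

def whittaker_henderson(crude, weights, lam, order=2):
    n = len(crude)
    W = np.diag(np.asarray(weights, dtype=float))
    K = np.diff(np.eye(n), n=order, axis=0)      # difference operator
    return np.linalg.solve(W + lam * K.T @ K, W @ np.asarray(crude))

crude = np.array([0.0021, 0.0019, 0.0026, 0.0024, 0.0031,
                  0.0029, 0.0038, 0.0035, 0.0044, 0.0041])  # ages 45-54
exposure = np.full(len(crude), 1000.0)           # lives exposed per age

graduated = whittaker_henderson(crude, exposure, lam=1e7)
```

Larger `lam` trades fidelity to the crude rates for smoothness; the choice is the same judgment call actuaries have always made, regardless of whether the crude rates come from traditional classes or vitals bands.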

Here is how the integration works in practice:

Step 1: Collect baseline vitals at point of application. When an applicant applies for coverage, they complete a 60-second rPPG scan through their smartphone. This captures resting heart rate, HRV (RMSSD and SDNN), respiratory rate, and estimated blood pressure.

Step 2: Build a vitals-augmented experience study. Over time, the carrier accumulates mortality experience data that includes contactless vitals measurements alongside traditional underwriting variables. This creates a dataset where actual mortality outcomes can be analyzed against vitals data.

Step 3: Run multivariate credibility analysis. Using Bayesian credibility or Bühlmann-Straub methods, the actuary determines how much weight to assign the vitals data relative to traditional factors. Early in the program, when the vitals dataset is small, credibility will be low and the adjustment modest. As the book grows, credibility increases and the vitals signal carries more weight.

Step 4: Construct sub-tables by vitals risk band. Rather than a single mortality table modified by traditional underwriting class, the carrier can produce sub-tables that reflect vitals-based segmentation. A preferred nonsmoker with resting HR under 65 bpm and normal HRV gets one sub-table. A preferred nonsmoker with resting HR over 80 bpm and reduced HRV gets another.

Step 5: Validate against emerging experience. Compare predicted mortality from the vitals-augmented tables against actual claims experience. Adjust calibration factors as more data accumulates.
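Steps 3 and 4 can be sketched with limited-fluctuation credibility blending. The rates and claim counts below are hypothetical; the 1,082-claim full-credibility standard is the classical benchmark (90 percent confidence of being within 5 percent of the true frequency):

```python
import math

# Blend a vitals-band's observed mortality with the base table rate
# using limited-fluctuation (square-root) partial credibility.

FULL_CRED_CLAIMS = 1082   # classical (1.645 / 0.05)^2 standard

def credibility(claims):
    return min(1.0, math.sqrt(claims / FULL_CRED_CLAIMS))

def blended_qx(q_observed, q_base, claims):
    z = credibility(claims)
    return z * q_observed + (1 - z) * q_base

# Early program: a low-resting-HR band with only 120 observed claims
q_base = 0.0020    # base table rate for the underwriting class
q_band = 0.0014    # crude rate observed within the vitals band
print(round(blended_qx(q_band, q_base, 120), 5))  # 0.0018
```

With 120 claims the band carries about one-third credibility, so the sub-table rate moves only partway toward the observed experience; as claims accumulate toward the full-credibility standard, the vitals signal dominates.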

Practical considerations for carriers

Adopting contactless vitals data for mortality table calibration raises questions that go beyond the actuarial math.

Regulatory acceptance

State insurance regulators have not yet issued specific guidance on using rPPG-derived vitals data in rate-making. The NAIC's Accelerated Underwriting Working Group has been examining the use of non-traditional data sources in underwriting since 2020, and their focus has been on ensuring that new data inputs do not introduce unfair discrimination. Contactless vitals data has an advantage here: it measures objective physiological signals, not behavioral proxies or consumer data that might correlate with protected classes.

Data volume requirements

Mortality table calibration requires large datasets and long observation periods. A carrier writing 50,000 new policies per year with contactless vitals data collection would need three to five years of experience before the vitals-augmented sub-tables reach full credibility. In the interim, a blended approach using partial credibility weighting works fine. The actuarial profession has solved this problem before with every new underwriting factor that has been introduced over the past century.

Measurement consistency

For vitals data to be useful in mortality table construction, it needs to be collected under reasonably consistent conditions. rPPG measurements are sensitive to lighting, motion, and camera quality. Standardizing the measurement protocol — adequate lighting, face centered in frame, 60 seconds of stillness — reduces noise. Quality scoring algorithms can flag measurements taken under suboptimal conditions for exclusion or down-weighting.
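A down-weighting scheme along these lines might look as follows. The input names and thresholds are hypothetical, not a published protocol; a production system would derive them from validation against an ECG reference:

```python
# Hypothetical quality gate for a single rPPG measurement.

def quality_weight(brightness, motion_px, snr_db):
    """Return a down-weighting factor in [0, 1] for one measurement.

    brightness: mean face-region luminance (0-255)
    motion_px:  mean frame-to-frame landmark displacement, pixels
    snr_db:     signal-to-noise ratio of the pulse frequency band
    """
    if brightness < 60 or motion_px > 4.0 or snr_db < 0.0:
        return 0.0                     # exclude: unusable capture
    weight = 1.0
    if brightness < 90:
        weight *= 0.5                  # dim lighting: down-weight
    if snr_db < 5.0:
        weight *= 0.7                  # weak pulse signal: down-weight
    return weight

print(quality_weight(120, 1.2, 8.0))   # clean capture
print(quality_weight(75, 1.2, 3.0))    # dim and noisy capture
```

The weights then feed directly into the credibility analysis, so a noisy measurement contributes less to the experience study instead of being silently treated as clean data.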

Longitudinal tracking

The real power of contactless vitals for mortality calibration comes from repeated measurement. A single point-of-application reading gives one snapshot. If policyholders can be incentivized to take periodic scans — quarterly or annually — the carrier gets a time series of health trajectory data. Deteriorating HRV or increasing resting heart rate over time would trigger recalibration of individual mortality assumptions, enabling more accurate reserving and potentially more responsive policy management.

Current research and evidence

The research base connecting rPPG-measurable vitals to mortality outcomes draws from two bodies of work: the epidemiological literature on cardiovascular biomarkers and mortality, and the computer vision literature on rPPG measurement accuracy.

On the epidemiology side, the evidence is strong. The Zhang et al. meta-analysis (2016) in CMAJ covered over 1.2 million subjects. A separate analysis by the Copenhagen Male Study followed 2,798 men for 16 years and found that resting heart rate above 80 bpm was associated with a 1.6-fold increase in all-cause mortality compared to rates below 50 bpm. The HUNT study in Norway, following over 37,000 adults, reported similar dose-response relationships between resting heart rate and cardiovascular death.

On the rPPG accuracy side, validation studies have shown that camera-based heart rate measurement achieves mean absolute errors of 1 to 3 bpm against ECG reference under controlled conditions. A 2025 study in Bioengineering reported a mean absolute error of 1.061 bpm. For HRV, the accuracy is somewhat lower but still sufficient for risk stratification purposes — the goal is not clinical-grade precision for individual diagnosis, but population-level segmentation for actuarial modeling.

The SOA's 2025 essay series on AI and longevity specifically discussed how machine learning models trained on biometric time-series data can improve mortality prediction beyond what traditional actuarial factors achieve alone. Dr. Kojo Decardi-Nelson's essay for the SOA argued that cross-disciplinary collaboration between actuaries, data scientists, and healthcare professionals will be necessary to align AI-driven mortality models with practical applications in insurance.

The future of mortality table calibration

The life insurance industry has always adapted its mortality tables as new data becomes available. The shift from the 1980 CSO to the 2001 CSO to the 2017 CSO reflected changing population health, better data collection, and improved statistical methods. The next iteration will likely incorporate digital health data as a standard calibration input.

Contactless vitals data through rPPG is a practical starting point because the collection barrier is so low. No hardware. No clinical visit. No ongoing compliance requirement. Just a phone camera and 60 seconds. That makes it feasible to collect vitals data on a large enough population, with enough frequency, to build actuarially credible experience studies within a reasonable timeframe.

The carriers that start collecting this data now, even before the actuarial standards formally incorporate it, will have a multi-year head start on building the experience studies needed for credible calibration. Those that wait for regulatory clarity or industry-wide adoption will find themselves years behind on data accumulation.

Frequently asked questions

Can contactless vitals data replace traditional mortality tables?

No. Contactless vitals add a supplementary layer to existing mortality table frameworks. The base table structure, graduation methodology, and regulatory filing requirements remain the same. Vitals data provides additional calibration factors that improve granularity within existing risk classes.

How accurate does rPPG need to be for actuarial use?

For population-level mortality segmentation, the accuracy bar is lower than for individual clinical diagnosis. Mean absolute errors of 2-3 bpm for heart rate and reasonable HRV correlation are sufficient to identify risk bands. The actuarial application uses aggregate statistics across thousands of policies, which smooths out individual measurement noise.
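The smoothing effect is just standard-error arithmetic. Assuming independent per-measurement error with a standard deviation of about 3 bpm:

```python
import math

# Standard error of a cohort's mean heart rate shrinks with sqrt(n),
# so individual rPPG noise washes out at portfolio scale.

def standard_error(sigma_bpm, n_policies):
    return sigma_bpm / math.sqrt(n_policies)

print(round(standard_error(3.0, 1), 2))       # one applicant: 3.0 bpm
print(round(standard_error(3.0, 10_000), 2))  # 10,000 lives: 0.03 bpm
```

Across a cohort of ten thousand policies, a 3 bpm individual error contributes only a few hundredths of a beat to the cohort mean, well below the resolution needed to separate risk bands.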

What about adverse selection if only some applicants provide vitals data?

This is a real concern. If vitals collection is optional, healthier applicants may be more willing to scan, creating selection bias. Most carriers implementing contactless vitals are making the scan a standard part of the application process for all applicants to avoid this problem.

How long before vitals-augmented mortality tables reach full credibility?

It depends on the carrier's volume. A carrier writing 50,000 policies per year with vitals data would typically need three to five years of claims experience before the vitals-specific sub-tables carry full actuarial credibility. Partial credibility approaches allow earlier adoption with blended weighting against the standard table.

Companies like Circadify are building the rPPG infrastructure that makes large-scale contactless vitals collection practical for insurance carriers. The technology to capture the data exists today. The actuarial frameworks to use it are well understood. What remains is for carriers to begin the multi-year process of accumulating the experience data that will make vitals-augmented mortality tables a reality.
