HealthDay News — For patients receiving an implantable cardioverter-defibrillator (ICD), programming the device to require 30 of 40 intervals to detect ventricular arrhythmias (long detection) during spontaneous fast ventricular arrhythmia episodes is associated with reduced rates of antitachycardia pacing (ATP), shocks, and inappropriate shocks compared with the standard 18 of 24 intervals, results from the ADVANCE III trial indicate.

Patients in the prolonged-interval group had a 37% lower rate of delivered therapies than those in the standard-interval detection group at a median follow-up of 12 months, Maurizio Gasparini, MD, of the Humanitas Clinical and Research Center in Rozzano, Italy, and colleagues reported in the Journal of the American Medical Association.

The number of inappropriate shocks also dropped 45% in the long-interval detection group, whereas the number of appropriate shocks remained similar in both groups, as did the incidence of syncope.

“This programming strategy may be a useful approach for ICD recipients,” Gasparini and colleagues wrote.

The ADVANCE III findings may have implications for mortality, as data from another study — the MADIT-RIT randomized trial — suggest that any ICD therapy, whether appropriate or not, is associated with an increased risk for death and worsening of heart failure. However, follow-up in ADVANCE III was too short to determine whether the longer detection intervals, and the resulting reduction in therapies, translated into a mortality benefit.

The goal of both the ADVANCE III and MADIT-RIT trials was to reduce both appropriate and inappropriate ICD therapies by identifying the best device programming strategies — either through delaying the time before ATP interrupts ventricular tachyarrhythmias, or through prolonging the time before shocks are delivered for ventricular fibrillation, according to the researchers.

The single-blind ADVANCE III trial involved 1,902 primary and secondary prevention patients undergoing first ICD implant, 948 of whom were assigned to long-detection intervals and 954 of whom were assigned to standard-detection intervals. Average patient age was 65 years and 84% were men.

During the median 12-month follow-up, patients in the long-detection group received 346 delivered therapies versus 557 in the standard-detection group, a rate of 42 vs. 67 therapies per 100 person-years (incidence rate ratio [IRR], 0.63).
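As a back-of-the-envelope check, the crude rate ratio for total delivered therapies can be reproduced from the reported figures. Note that the person-year denominators below are back-derived from the published rates rather than taken directly from the paper, and the IRRs reported for individual endpoints are model-based estimates, so crude ratios will not match every reported IRR exactly — this is only an illustrative sketch.

```python
# Crude incidence-rate arithmetic for the total-therapies endpoint.
# Event counts (346 and 557) are from the article; the person-year
# totals are assumptions, back-derived from the reported rates of
# 42 and 67 per 100 person-years.

def rate_per_100py(events: int, person_years: float) -> float:
    """Events per 100 person-years of follow-up."""
    return 100 * events / person_years

long_events, long_py = 346, 346 / 0.42   # ~824 person-years (assumed)
std_events, std_py = 557, 557 / 0.67     # ~831 person-years (assumed)

long_rate = rate_per_100py(long_events, long_py)  # ~42 per 100 person-years
std_rate = rate_per_100py(std_events, std_py)     # ~67 per 100 person-years
crude_irr = long_rate / std_rate                  # ~0.63, matching the reported IRR
```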

The total number of ATPs was 23 per 100 person-years in the long-detection group compared with 37 in the standard-detection group (IRR, 0.58), and the number of shocks was 19 per 100 person-years versus 30 (IRR, 0.77; P = 0.06). There was also a significant reduction in the incidence of inappropriate shocks in the long- vs. standard-detection group (5.1 vs. 11.6 per 100 person-years; IRR, 0.55).

In an accompanying editorial, Merritt H. Raitt, MD, of the Portland Veterans Affairs Medical Center in Oregon, praised the study findings.

“Regardless of whether these programming interventions lead to reduced mortality, the unequivocal reduction in ICD shocks and the reduction in hospitalization without an increase in adverse events such as syncope suggests that this programming approach should be considered for adoption in the care of patients with ICDs and clinical characteristics similar to those enrolled in these studies,” Raitt wrote.


  1. Gasparini M, et al. JAMA. 2013;309(18):1903-1911.
  2. Raitt MH. JAMA. 2013;309(18):1937-1938.