Circulation: Arrhythmia and Electrophysiology On the Beat October 2017



Dr. Paul Wang: Welcome to the monthly podcast "On The Beat" for Circulation: Arrhythmia and Electrophysiology. I'm Dr. Paul Wang, editor-in-chief, with some of the key highlights from this month's issue. We'll also hear from Dr. Suraj Kapa reporting on new research from the latest journal articles in the field.

In our first manuscript this month, Cho and Associates investigate the need for readmission for dofetilide reloading. The FDA labeling for dofetilide loading states that dofetilide must be initiated or reinitiated in hospital with continuous electrocardiographic monitoring.

In this article, the authors retrospectively examined the hospital records of 138 patients admitted for dofetilide reloading for atrial arrhythmias. Of these 138 patients, 102 were reloaded at a previously tolerated dose, 30 at a dose higher than a previously tolerated dose, and 2 at a lower dose, with the prior dosage unknown in 4 patients.

In 44 patients, or 31.9%, dose adjustment or discontinuation of dofetilide was required. Although torsades de pointes occurred in two patients admitted for an increased dofetilide dose, no torsades de pointes was observed in patients reloaded at the same dose of dofetilide (0% versus 6.7%, P=0.05).

In 30 out of 102 patients, or 29.4%, reloaded at a previously tolerated dose, a dofetilide dose adjustment was required. In 11 out of 30 patients, or 36.7%, admitted for an increase in dose, a dose adjustment or discontinuation was required.

The authors therefore concluded that dosage adjustments or discontinuation were frequent, and that their observations support the need for hospitalization for dofetilide reloading.

In the next manuscript, Tilman Maurer and Associates report a novel superolateral approach to creating a mitral isthmus ablation line.

Because the creation of an endocardial mitral isthmus line with the endpoint of bidirectional block may be challenging, the authors examined 114 patients with perimitral annular flutter without a prior mitral isthmus ablation line.

The authors compared an initial group of 57 patients (group A), who underwent catheter ablation using a novel superolateral mitral isthmus ablation line connecting the left-sided pulmonary veins to the mitral annulus along the base of the left atrial appendage, visualized by selective angiography, with a second group of 57 patients (group B), who underwent ablation using a conventional mitral isthmus ablation line connecting the left inferior pulmonary vein to the mitral annulus.

The authors found that bidirectional block was achieved in 56 out of 57 patients in group A, or 98.2%, and in 50 out of 57 patients in group B, or 87.7% (P=0.06). Ablation from within the coronary sinus was required significantly less often for creation of a superolateral mitral isthmus ablation line compared to a conventional mitral isthmus ablation line, 7.0% versus 71.9%, P<0.01.

The need for epicardial ablation from within the coronary sinus and the total length of the mitral isthmus line, 29.3 versus 40.8 millimeters, were predictors of unsuccessful bidirectional mitral isthmus block. Pericardial tamponade was observed in group A but not in group B, 5.2% versus 0%, P=0.24.

The authors therefore concluded that the superolateral mitral isthmus ablation line has a higher acute success rate compared with the conventional mitral isthmus ablation line, with a low likelihood of needing ablation from within the coronary sinus.

In our next paper, Cronin and Associates examine the relationship between right ventricular pacing frequency and the incidence of ventricular arrhythmias leading to ICD shock.

Using the ALTITUDE database, the authors examined 389 appropriate shocks and 425,625 transmissions received from 8,435 patients over a mean follow-up of 15.0 months.

Transmissions with 80 to 98% right ventricular pacing were associated with a hazard ratio of 1.56 for an appropriate shock in the subsequent week compared to less than 1% right ventricular pacing (P=0.04), using a time-dependent Cox proportional hazards model. However, the authors found that greater than or equal to 98% right ventricular pacing trended toward a lower risk of appropriate shock, with a hazard ratio of 0.61.

Lifetime cumulative percentage of right ventricular pacing was similarly associated with an increased risk of appropriate shocks at 80 to 98% right ventricular pacing, but not at greater than or equal to 98% right ventricular pacing.

The authors, therefore, concluded that an increased frequency of right ventricular pacing is associated with an increased risk of appropriate ICD shocks until the right ventricular pacing is greater than or equal to 98%.

In the next manuscript, Wesley O'Neal and Associates examined, in 12,241 patients from the Atherosclerosis Risk in Communities (ARIC) study, the association of the individual QT interval components (that is, R-wave onset to R-wave peak, R-wave peak to R-wave end, the ST segment, T-wave onset to T-wave peak, and T-wave peak to T-wave end) with the occurrence of sudden cardiac death.

The authors identified a total of 346 cases of sudden cardiac death over a median follow-up of 23.6 years. Prolongation of the QT interval was associated with a 49% increased risk of sudden cardiac death. Of the components of the QT interval, only the T-wave onset to T-wave peak component was associated with sudden cardiac death, with a hazard ratio of 1.19 for each standard deviation increase.

The authors found similar results when the QT interval components were included in the same model. Thus, the authors concluded that the sudden cardiac death risk associated with QT prolongation is driven by prolongation of the T-wave onset to T-wave peak component.

In the next article, by Kalliopi Pilichou and Associates, the authors examined copy number variations, or CNVs, in arrhythmogenic cardiomyopathy patients. The authors studied 160 arrhythmogenic cardiomyopathy probands who were genotype-negative for 5 arrhythmogenic cardiomyopathy desmosomal genes on conventional mutation screening.

Using multiplex ligation-dependent probe amplification, or MLPA, 9 heterozygous copy number variations were identified in 11, or 6.9%, of the 160 probands. Of these, the authors found that 5 ranged from a deletion of the entire plakophilin-2 (PKP2) gene to a deletion of only a single PKP2 [exon 00:08:45], 1 was a deletion of PKP2 exons 6 to 11, and 1 a PKP2 duplication of the 5' UTR to exon 1. One was a desmocollin-2 (DSC2) duplication of exons 7 to 9, and one a large deletion of chromosome 18 comprising both the DSC2 and desmoglein-2 (DSG2) genes.

All probands were affected by moderate to severe forms of disease, and 10, or 32%, of the 31 family members carrying one of these deletions met the diagnostic criteria for arrhythmogenic cardiomyopathy.

The authors concluded that identifying copy number variations may increase the yield of genetic testing. In family members carrying the copy number variations but not displaying the phenotype, other factors are likely involved.

In the article by Rahul Samanta and Associates, the authors examined, in 7 sheep at a mean of 84 weeks post myocardial infarction, the influence of intramyocardial adipose tissue on scar tissue identification during endocardial contact mapping. The authors found that endocardial electrogram amplitude correlated significantly with intramyocardial adipose tissue.

The correlation with adipose tissue was R = -0.48 for unipolar and R = -0.45 for bipolar electrograms, with no significant correlation with collagen (unipolar R = -0.36, bipolar R = -0.43). Intramyocardial adipose tissue-dense regions of myocardium were reliably identified using endocardial mapping, with thresholds of less than 3.7 millivolts and less than 0.6 millivolts for the unipolar and bipolar modalities, respectively, as well as for the combined modality.

Unipolar mapping using optimal thresholding remained significantly reliable, with an AUC of 0.76, during mapping of intramyocardial adipose tissue confined to putative scar border zone regions (bipolar amplitude range of 0.5 to 1.5 millivolts).

The authors concluded that combined bipolar and unipolar voltage mapping with optimal thresholds may permit delineation of intramyocardial adipose-dense regions of myocardium following infarction.

In the next article, by Kevin Leong and Associates, the authors examined the substrate and electrophysiologic mechanisms that contribute to the characteristic ECG of Brugada syndrome. The authors studied 11 patients with concealed type 1 Brugada syndrome and 2 healthy controls by performing noninvasive electrocardiographic imaging, or ECGI, and ECG recordings during ajmaline infusion.

Following ajmaline infusion, the right ventricular outflow tract had the greatest increase in conduction delay and activation recovery interval prolongation compared to the right ventricle or the left ventricle. In controls, there was minimal change in the J-ST point elevation, conduction delay, or activation recovery intervals at any site with ajmaline.

In Brugada syndrome patients, conduction delay in the right ventricular outflow tract, but not in the right ventricle or left ventricle, correlated with the degree of J-ST point elevation (Pearson R = 0.81).

No correlation was found between the J-ST point elevation and activation recovery interval prolongation in the right ventricular outflow tract, the right ventricle, or the left ventricle.

The authors therefore concluded that the degree of conduction delay in the right ventricular outflow tract, and not prolongation of repolarization time, accounts for the ST- or J-point elevation seen in the type 1 Brugada syndrome pattern.

In the next article, by Jonas Diness and Associates, the authors investigated the role of inhibition of small-conductance calcium-activated potassium (SK) channels in atrial fibrillation termination.

Since these channels are predominantly expressed in the atria compared to the ventricles, they are a particularly attractive drug target. In a total of 43 pigs, atrial tachypacing was performed until the animals developed sustained atrial fibrillation that could not be reverted by vernakalant administration.

After the SK channel inhibitor AP14145 was administered, vernakalant-resistant AF reverted to sinus rhythm and could not be re-induced by burst pacing. In open-chest pigs, both vernakalant and AP14145 significantly prolonged atrial refractoriness and reduced AF duration without affecting ventricular refractoriness or blood pressure.

The authors concluded that SK currents play a role in porcine atrial repolarization, and that their inhibition by AP14145 demonstrates antiarrhythmic effects in a vernakalant-resistant porcine model of atrial fibrillation.

In our final article, by Padmini Sirish and Associates, the authors examined the role of several ion transporters in action potential duration and cardiac function. The solute carrier SLC26A6, which is highly expressed in cardiomyocytes, plays an important role in cardiac intracellular pH regulation.

Using SLC26A6 knockout mice, the authors found that ablation of SLC26A6 results in action potential shortening, reduced calcium transients, reduced sarcoplasmic reticulum calcium load, and decreased sarcomere shortening in the SLC26A6 knockout cardiomyocytes.

Ablation of SLC26A6 reduced fractional shortening and cardiac contractility in vivo. Intracellular pH was elevated in the SLC26A6 knockout cardiomyocytes, consistent with the chloride-bicarbonate exchange activity of SLC26A6.

The SLC26A6 knockout mice exhibited bradycardia and fragmented QRS complexes, supporting a role for SLC26A6 in the cardiac conduction system. The authors therefore provided evidence for the role of SLC26A6 as a cardiac electrogenic chloride-bicarbonate transporter in ventricular myocytes, contributing to intracellular pH regulation, excitability, and contractility.

That's it for this month, but keep listening. Suraj Kapa will be surveying all journals for the latest topics of interest in our field. Remember to download the podcast "On The Beat." Take it away Suraj.

Suraj Kapa: Thank you very much, Paul, and welcome everybody back to "On The Beat," where we'll review hard-hitting articles across the electrophysiology literature. It is my pleasure to introduce you to 15 different articles published this past September across all the journals in cardiovascular medicine.

The first area that we will be focusing on is atrial fibrillation, with a specific focus within the realm of anticoagulation, and we refer you to a paper published by [Kurshida Doll 00:16:55], entitled "Factors Associated With Anticoagulation Delay Following New-Onset Atrial Fibrillation," published in The American Journal of Cardiology on October 15, 2017.

In this publication, the authors reviewed the frequency with which there is a delay in the introduction of oral anticoagulation after a new diagnosis of atrial fibrillation, and its impact on overall outcomes. In a large electronic medical record, they identified incident episodes of atrial fibrillation between 2006 and 2014.

They used the CHADS2 score rather than the CHA2DS2-VASc score to estimate overall risk, and then reviewed the outcomes of the patients. They found that for those patients in whom oral anticoagulation would have been recommended, the median time to initiation was around five days, with an interquartile range of 1 to 43 days, and with by far most patients, about 86%, receiving warfarin.

Interestingly, 98 strokes occurred between the time of new atrial fibrillation diagnosis and the actual initiation of oral anticoagulation. Several factors were associated with this delay in oral anticoagulation, including female gender, absence of hypertension, prior falls, and the presence of chronic kidney disease.

However, ultimately, by 6 months over 90% of patients were appropriately on oral anticoagulants, though still with a slightly higher proportion in men than in women.

They noted that most patients with a new diagnosis of atrial fibrillation who were found to have an elevated stroke risk started oral anticoagulation within 1 week. Given these findings, it is important to consider how long we wait to introduce oral anticoagulation after the initial diagnosis. Since many initial diagnoses may be made by internists, or in some cases by the patients themselves on a remote or ambulatory monitor, it is important to consider how these individuals are tied in to the provider who would feel most comfortable, and who is most apt, to prescribe oral anticoagulation.

Changing gears within atrial fibrillation we next move on to cardiac mapping and ablation, and specifically focus on a paper published by Black-Maier et al, in the September edition of "Heart Rhythm" entitled "Risk Of Atrioesophageal Fistula Formation With Contact Force-Sensing Catheters."

While atrioesophageal fistula formation is a relatively rare complication of atrial fibrillation ablation, it can be life-threatening. Contact force-sensing catheters for ablation of atrial fibrillation have come into vogue, as they are felt to improve procedural effectiveness and potentially reduce complications by improving the operator's understanding of contact with the myocardium and of when contact is excessive.

However, there has been little exploration of the actual risk of atrioesophageal fistula. To examine this association, the authors [inaudible 00:19:50] reviewed the MAUDE database, or Manufacturer and User Facility Device Experience database, for adverse event reports.

Amongst almost 27,000 device reports, they identified a total of 78 atrioesophageal fistula cases. About 1,200 of the reports were related to contact force-sensing catheters, and almost 1,500 were related to non-contact force-sensing catheters.

Of the 78 atrioesophageal fistula cases reported, the vast majority occurred with contact force-sensing catheters, with a total of 65, or about 5 times more than with non-contact force-sensing catheters.

Unfortunately, esophageal temperature increases were mentioned in only about 2.5% of cases, and contact force and power settings were not consistently reported, so no conclusions could be drawn about them. They noted that the overall mortality with atrioesophageal fistula in this population was around 56%, with the vast majority of survivors having undergone surgical repair as opposed to stenting or no intervention.

While these data are somewhat skewed because they are based on data self-reported by proceduralists to the MAUDE database, it is important to consider whether there is actually an increased complication rate associated with contact force-sensing catheters, as these catheters are fundamentally different from the non-contact force-sensing catheters routinely used, owing to changes in the stiffness and mechanics of the catheter itself.

It is important to consider, when using any new catheter with new options for monitoring, or one that might alter the stiffness or other mechanical properties of the catheter, whether application of similar power settings remains appropriate.

While the data in this data set are potentially skewed, it will be important to consider going forward whether there are implications of some increased risk of complications, and how to mitigate that risk by altering our contact force and power setting decision-making.

Further study will be required to better understand these data and their implications. I would also refer readers to an article published by [inaudible 00:22:02] in Circulation, where they reviewed the mechanism of atrioesophageal injury, and to another publication in the Journal of Cardiovascular Electrophysiology this past month by [inaudible 00:22:11], a meta-analysis of the overall benefit of contact force-sensing catheters over non-contact force-sensing catheters.

In that paper, they demonstrated on meta-analysis that there seems to be an overall benefit in outcomes with contact force-sensing catheters, without a difference in procedural complications. However, I would point out that there are very limited randomized studies comparing contact force-sensing versus non-contact force-sensing catheters.

Next, also within the realm of cardiac mapping and ablation, we review a publication by Haldar et al., entitled "Resolving Bipolar Electrogram Voltages During Atrial Fibrillation Using Omnipolar Mapping," published in the last edition of Circulation: Arrhythmia and Electrophysiology, and also reviewed by Dr. Wang in last month's podcast.

The importance of this article lies in an improved understanding of what we mean when we talk about voltage or substrate mapping. In this paper, Haldar et al. tried to better understand what the bipolar electrogram actually represents by comparing traditional bipolar mapping with omnipolar mapping.

This becomes important as we consider low voltage-guided substrate modification, not just for atrial fibrillation ablation but also potentially for ventricular arrhythmia ablation. They sought to compare the use of peak-to-peak voltage for assessment of bipolar voltage with omnipolar peak-to-peak voltages in both sinus rhythm and atrial fibrillation.

They demonstrated that, in canines, vertical orientation of the catheter relative to the underlying tissue consistently resulted in a higher bipolar voltage in both sinus rhythm and atrial fibrillation. Furthermore, they showed that the maximal omnipolar voltages obtained were consistently larger than the horizontal and vertical bipolar voltages in both rhythms.

Vector field analysis of these wavefronts, during atrial fibrillation in particular, demonstrated that omnipolar electrograms can account for collision and fractionation and can acquire electrogram voltages independent of these effects. Thus, they suggested that omnipolar electrograms can obtain maximal voltages and can separate out the influence of directional factors, collision, and fractionation, especially when compared with contemporary bipolar techniques.

The implications of this study are several. First off, when performing substrate mapping, we traditionally do what we can to obtain appropriate bipolar signals for analysis. However, catheters have evolved significantly since the early studies of bipolar voltage mapping that established the voltage cutoffs.

There are many different multipolar catheters with varying interelectrode spacing, some of which favor parallel orientation to the underlying myocardium as opposed to vertical orientation. The fact that bipolar voltage can vary significantly based on both the orientation of the catheter and the rhythm is important when considering whether substrate actually exists in a specific location, and what "normal" voltage cutoffs for specific patients should be.

When we consider novel catheters with increasingly complex designs, including the introduction of mini-electrodes as well as omnipolar electrodes, it is important to consider whether the assessment of "normal voltage" should be the same. Further study will be required to better understand how best to analyze these results.

Moving to a different form of management in atrial fibrillation, we next refer you to a paper by Boris [Madal 00:25:44], published in this last month's edition of Heart Rhythm, entitled "Efficacy and safety of left atrial appendage closure with WATCHMAN in patients with or without contraindication to oral anticoagulation: 1-year follow-up outcome data of the EWOLUTION trial."

The EWOLUTION trial was a prospective multicenter registry examining the outcomes of WATCHMAN patients who had an indication for closure based on European Society of Cardiology guidelines. The authors sought to evaluate the 1-year follow-up of these patients.

The baseline CHA2DS2-VASc score was on average about 4.5, with a mean age of over 73 years. Almost a third of the patients had a prior transient ischemic attack or ischemic stroke. They noted that the vast majority of patients had a successful WATCHMAN implantation, with 1,005 of 1,025 patients having successful implantation, and only 3 of these 1,005 patients having any leak greater than 5 millimeters.

The majority, up to 87%, had TEE follow-up at least once after initial implantation. Interestingly, the vast majority used only antiplatelet therapy, with only 8% having vitamin K antagonist use in the post-WATCHMAN implantation period.

There was a reasonably high mortality of 10% in the first year after implantation, though this was felt to typically reflect advanced age and other comorbidities. Also, interestingly, almost 4% of patients had thrombus on their device, which was independent of the drug regimen used, in other words, whether antiplatelet therapy or vitamin K antagonists.

Overall, the ischemic stroke rate was relatively low at 1.1%, a relative risk reduction of 84% versus estimated historical data, and there was also a relatively low major bleeding rate of only 2.6%, predominantly non-procedure- and non-device-related.

Thus, they concluded that left atrial appendage closure with the WATCHMAN device has high implant and sealing success, and appears to be safe and effective in reducing ischemic stroke risk, given that the relative incidence was only 1.1% despite the fact that the vast majority were not actually using oral anticoagulation.

There are trials ongoing in the United States to evaluate whether patients can be safely kept off oral anticoagulation in the peri-implant period, as in some countries the standard of care is to place patients on anticoagulants in the immediate post-implantation period.

However, two other things need to be noted in this real-world analysis of outcomes with WATCHMAN. Almost 10%, or 1 out of 10, patients died within 1 year of follow-up; thus, it might be considered whether better patient selection is required to identify those patients who will receive maximal benefit from this invasive procedure.

Furthermore, almost 4% had device-related thrombus. What this means in terms of stroke risk, especially over long-term follow-up, also needs to be considered. I think over time we will gain a better understanding of what those risks might be for an endocardial system for left atrial appendage occlusion.

But staying within the realm of stroke risk in atrial fibrillation, we next review the article by King et al., published in the September 2017 edition of the Journal of the American College of Cardiology, entitled "Left Atrial Fibrosis and Risk of Cerebrovascular and Cardiovascular Events in Patients With Atrial Fibrillation."

Cardiac MRI to evaluate late gadolinium enhancement, suggesting regional cardiac fibrosis in atrial fibrillation, is slowly gaining steam, primarily as a method of assessing the potential efficacy of atrial fibrillation ablation, with greater amounts of delayed enhancement suggesting a lower likelihood of success of atrial fibrillation ablation.

King et al. sought to evaluate, in a retrospective cohort study, the risk of major cerebrovascular and cardiovascular events associated with the degree of delayed enhancement on MRI. They reviewed 1,228 patients undergoing cardiac MRI to assess left atrial fibrosis between 2007 and 2015.

They then staged these patients and stratified them according to their [Utah 00:29:45] stage, which had been previously reported for the degree of fibrosis seen. They demonstrated on follow-up that there was a significantly higher incidence of major cardiovascular and cerebrovascular events associated with higher degrees of late gadolinium enhancement, with a relative risk ratio of about 1.67.

However, the only individual component of these outcomes that remained significantly associated with advanced gadolinium enhancement was stroke or TIA, with a hazard ratio of 3.94. Thus, they concluded that severe left atrial late enhancement is principally associated with increased cerebrovascular events.

This study is important in that it highlights another potential risk factor that may need to be considered when risk-stratifying patients for stroke. We recognize that even some paroxysmal patients can have extensive left atrial fibrosis, and some persistent patients might not have much atrial fibrosis at all.

Whether this can further help risk-stratify patients in terms of overall stroke risk, and might help identify and characterize low-risk patients, needs to be further considered.

One of the key features of this evaluation also needs to be the mechanism. In theory, patients with greater endocardial injury of the atrium might be more prone to clot formation, and thus it may seem reasonable to expect that when there is more left atrial fibrosis, as suggested by delayed enhancement on MRI, there may in fact be a greater cerebrovascular event rate.

Finally, changing gears a little bit within the realm of risk stratification and management for atrial fibrillation, we focus on autonomics, and specifically a publication by Stavrakis et al. in last month's edition of JACC: Clinical Electrophysiology, entitled "Low-Level Vagus Nerve Stimulation Suppresses Post-Operative Atrial Fibrillation and Inflammation in a Randomized Study."

The group, headed up by Sonny [Poe 00:31:42], has previously published on both tragus stimulation and low-level vagus nerve stimulation in patients undergoing atrial fibrillation ablation. In this particular study, they sought to evaluate whether implantation of a low-level vagus nerve stimulator during cardiac surgery could reduce the risk of postoperative atrial fibrillation.

They sutured a bipolar wire to the vagus nerve preganglionic fibers along the lateral aspect of the superior vena cava at the time of surgery. They then performed high-frequency stimulation, at 50% below the threshold for slowing the heart rate, for 72 hours in those randomized to the vagus nerve stimulation group.

The second group was a sham cohort. Amongst the 54 patients randomized to either group, they demonstrated that the frequency of postoperative atrial fibrillation in the low-level vagus nerve stimulation group was almost a third of that in the control group.

Interestingly, not only was the frequency of atrial fibrillation lower, but the level of inflammatory markers also decreased, with both serum tumor necrosis factor alpha and interleukin-6 levels being significantly lower in the low-level vagus nerve stimulation cohort.

In line with prior data from atrial fibrillation ablation, these data suggest that low-level vagus nerve stimulation can suppress postoperative atrial fibrillation and attenuate the inflammatory response.

Also in this past month, there was a paper by [Yoo 00:33:09] et al. in the Journal of the American Heart Association, specifically looking at the use of vagus nerve stimulation at the level of the tragus in patients with obstructive sleep apnea-associated atrial fibrillation.

Similar to prior work from the Oklahoma group, they demonstrated that there is in fact a beneficial effect on reduction of atrial fibrillation, primarily mediated through attenuation of the autonomic factors that mediate obstructive sleep apnea-related atrial fibrillation.

Moving away from atrial fibrillation, we next delve into cellular electrophysiology, starting with an article published in Nature Scientific Reports this past month on very low-density lipoprotein in metabolic syndrome, and how it modulates gap junctions and slows cardiac conduction.

In the past year, there have been multiple studies regarding specific cell types and how they might interplay with cardiac fibrosis and the risk of conduction slowing. In this publication, the authors reviewed the effect of very low-density lipoproteins on cardiac conduction in in vitro models.

They demonstrated that, primarily through down-regulation of connexin 40 and connexin 43, very low-density lipoproteins have a significant impact on cardiac conduction, with prolongation of the P-wave, PR interval, QRS duration, and QTc interval.

Thus, they concluded that very low-density lipoproteins may contribute to the pathophysiology of both the atrial fibrillation and the ventricular arrhythmias that can be seen in metabolic syndrome. This report is important because it highlights the fact that less traditional factors, such as lipoproteins, can cause a significant slowing of cardiac conduction and thus mediate arrhythmogenesis.

In fact, there was one other paper published just a couple of weeks prior, also in Nature Scientific Reports, by [Lee 00:35:04] et al., entitled "Human Electronegative Low-Density Lipoprotein Modulates Cardiac Repolarization Via LOX-1-Mediated Alteration Of Sarcolemmal Ion Channels."

They showed that LDL can actually result in QTc prolongation in patients with ischemic heart disease through specific mechanisms involving LOX-1. Recognition of the mechanisms by which less traditional factors such as VLDL or LDL may mediate alterations in cardiac conduction is important when we consider potential novel targets for the treatment of arrhythmias, whether for prevention or treatment.

In light of this attempt to identify novel targets, we next move on to another paper in the realm of cellular electrophysiology, published by [Toib 00:35:52] et al. in the American Journal of Physiology: Heart and Circulatory Physiology, entitled "Remodeling Of Repolarization And Arrhythmia Susceptibility In A Myosin-Binding Protein C Knockout Mouse Model."

In hypertrophic cardiomyopathy, multiple mechanisms might lead to an increased risk of ventricular arrhythmias. Some may be scar-related, given that patients can burn out from hypertrophic cardiomyopathy over time and develop endocardial, epicardial, and mid-myocardial fibrosis. But the mechanisms that mediate the development of ventricular arrhythmias in hypertrophic cardiomyopathy remain to be elucidated, and there has been very limited evaluation of the effect of repolarizing potassium currents on this risk.

Thus, Toib et al. studied myosin-binding protein C knockout mice to look at what happens with repolarizing potassium currents in these cohorts. They demonstrated that in these knockout mice there was a prolongation of the corrected QT interval when compared with the wild type mice, along with overt ventricular arrhythmias.

They also demonstrated that there was action potential prolongation associated with a decrease in repolarizing potassium currents, and decreased mRNA levels of several key potassium channel subunits. Thus, they concluded that in this specific subtype of hypertrophic cardiomyopathy mediated by myosin-binding protein C mutations, part of the ventricular arrhythmia risk might be due to a decrease in repolarizing potassium currents, in turn leading to prolongation of the action potential and QT interval.

The reason this particular finding is important is that it might highlight drug selection in specific types of hypertrophic cardiomyopathy. One might postulate, for example, that class III antiarrhythmic drugs might actually increase risk in some subtypes of hypertrophic cardiomyopathy due to down regulation of potassium channel subunits.

Consideration of this is critical when evaluating how best to manage and treat these patients. Changing gears to another kind of channelopathy, we focus within the realm of genetic channelopathies and specifically on Brugada syndrome.

In this last month's edition of Heart Rhythm, Sierra et al. published their series on the long-term prognosis of drug-induced Brugada syndrome. They reviewed a consecutive cohort of 343 patients with drug-induced Brugada syndrome, and compared their outcomes with 78 patients with a spontaneous type 1 pattern.

The mean age of patients was around 41 years. Interestingly, about 4% of the patients had a clinical presentation of sudden cardiac death, and 25% had a clinical presentation of syncope. However, the majority of the patients, around 71%, were asymptomatic.

Most of the patients in the drug-induced Brugada syndrome cohort were female. They demonstrated that there were fewer ventricular arrhythmias, both induced during electrophysiology study and seen over follow-up of up to 62 months, in the drug-induced Brugada syndrome cohort as compared with the spontaneous type 1 cohort.

Overall, the event rate in drug-induced Brugada syndrome was 1.1% per person-year versus 2.3% per person-year in patients with a spontaneous type 1 pattern.

They suggested that presentation with sudden cardiac death or inducible ventricular arrhythmias at the time of EP study were independent risk factors associated with arrhythmic events in drug-induced Brugada syndrome. However, if a patient was asymptomatic and had no inducible ventricular arrhythmias, they had a significantly better prognosis with drug-induced Brugada syndrome than with a spontaneous type 1 pattern.

Thus, they concluded that even in drug-induced Brugada syndrome sudden cardiac death is possible. However, asymptomatic patients without a prior clinical presentation of sudden cardiac death or inducible ventricular arrhythmias during electrophysiology study may be relatively safer than their spontaneous type 1 counterparts.

This study highlights the importance of stratifying patients by how their genetic channelopathy presents, whether as a spontaneous finding or as a finding in the setting of other events. Further prospective analysis, however, is needed to best guide how to manage these patients and in whom to place a defibrillator, as I would note that almost 37% of these patients actually had an ICD placed, the vast majority without incident events.

Speaking of implantable devices, we next move to the realm of ICDs, pacemakers, and CRT, and specifically we review the publication by Samar et al., published in JACC: Clinical Electrophysiology this past month, on the diagnostic value of MRI in patients with implanted pacemakers and implantable cardioverter-defibrillators across a population: does the benefit justify the risk? A proof of concept study.

Increasingly, MRIs are being done in patients with even legacy defibrillators and permanent pacemakers. However, when assessing the benefit versus the risk, it's important to understand whether the MRI actually changed outcomes, and this was the specific question the authors tried to answer.

They took patients with conventional or legacy pacemakers or ICDs, and tried to evaluate what the actual benefit was in those patients in whom an MRI was done. They specifically asked four questions: one, did the primary diagnosis change; two, did the MRI provide additional information to the existing diagnosis; three, was the pre-MRI or tentative diagnosis confirmed; and four, did the patient management change?

They noted there were no safety issues encountered in any of the 136 patients in whom an MRI was performed. In 97% it was felt that MRI added value to the patient's diagnosis and management, with 49% of investigators feeling that MRI added additional valuable information to the primary diagnosis, and in nearly a third the MRI actually changed the principal diagnosis and subsequent management of the patient.

The increasing evidence suggesting that MRI can be safely performed even with legacy pacemakers and ICDs, and the fact that MRI can yield important diagnostic information, need to be taken into consideration as investigators and other centers try to identify methodologies for safely performing MRIs in these patient cohorts.

It seems thus far that the benefit of MRI might justify the risk of these procedures under controlled settings. Next, we move, also within the realm of implantable cardioverter-defibrillators, to a different assessment published by Kawada et al. in this past month's issue of Heart Rhythm, where they sought to evaluate the comparison of longevity and clinical outcomes of implantable cardioverter-defibrillator leads among manufacturers.

They specifically sought to assess the longevity of Linox S/SD leads by Biotronik compared with Sprint Fidelis by Medtronic, Sprint Quattro by Medtronic, and Endotak Reliance by Boston Scientific leads. The reasoning for this was that early failure of some of the Biotronik Linox leads has been reported. Thus, they retrospectively reviewed patients undergoing implantation of these different leads between 2000 and 2013.

They noted failure rates of the Linox versus Sprint Fidelis versus Endotak leads were 3.2% per year, versus 3.4% per year, versus 0.61% per year, respectively. No lead failure was noted over follow-up in Sprint Quattro leads. Thus, they felt that the survival probability of Linox leads was comparable to that of Sprint Fidelis leads, and lower than that of Endotak or Sprint Quattro leads.

They found that age was the primary predictor of Linox lead failure, with patients less than 58 years old having a significantly increased risk of lead failure compared with those greater than 58 years old. Thus, they concluded that this was the first description of a lower survival rate for Linox leads in an aging population.

Early identification of leads that might be at risk of failure is critical for patient risk stratification. The finding that there might be other leads at risk of failure highlights the importance of close monitoring of these leads and contribution to registry data.

I would note that this study was primarily done at one center, and the vast majority of patients actually received Linox leads. Thus, further evidence is clearly required from more centers to understand what the mechanism of this risk is, and also whether the risk is borne out consistently across multiple centers, particularly because only a small minority received some of the comparator leads, without any lead failure encountered.

Further, speaking about defibrillators, we focus on a different mechanism of failure, and specifically the publication by Thogersen et al., published in last month's edition of Circulation: Arrhythmia and Electrophysiology, entitled Failure To Treat Life-Threatening Ventricular Tachyarrhythmias In Contemporary Implantable Cardioverter-Defibrillators: Implications For Strategic Programming.

In this publication they focused not so much on lead failure, but on failure of the ICD to appropriately treat ventricular tachyarrhythmias due to strategic programming decisions. There are current consensus recommendations for using a generic rate threshold between 185 and 200 beats per minute in primary prevention ICD patients. Thus, they sought to determine in this case series what the relationship was between programmed parameters and failure of modern ICDs to treat VF.

Between 2015 and 2017, at four institutions, they reviewed cases where normally functioning ICDs failed to deliver timely therapy for VF. There was a small number of patients fitting these criteria, with only 10 ambulatory patients. Five actually died from their untreated VF, whereas four had witnessed cardiac arrests requiring external shocks, and one was ultimately rescued by a delayed ICD shock.

The main reason that patients were not appropriately treated was that the ventricular fibrillation event did not satisfy the programmed detection criteria, in nine out of ten patients. Seven of the patients had the slowest detection rates consistent with generic recommendations, but these had never been tested in a peer-reviewed trial for that manufacturer's ICDs.

Namely, the decision making on the appropriate generic rate threshold was tested on specific manufacturers' ICDs, but that programming decision was then applied to other manufacturers' ICDs. In some cases, manufacturer-specific factors interacted with fast detection rates to withhold therapy, such as enhancements to minimize T-wave oversensing.

Thus, they demonstrated that in this population, untreated VF despite recommended programming accounted overall for 56% of sudden deaths and 11% of all deaths in the overall cohort of patients during the study period.

Thus, over half of the cases of sudden death in patients with ICDs appeared to be due to untreated VF despite recommended programming. Thus, they concluded that these unanticipated interactions, or complex decision making regarding generic programmed parameters, might in part lead to inappropriate withholding of therapy in ventricular fibrillation.

This publication highlights the importance of thoughtful decision making when translating evidence based detection parameters both between manufacturers and applying them across individual patients.

While the overall number of patients is quite low, namely only ten patients affected by this event, the number of patients dying as a result of it is fairly high in percentage terms, with 56% of sudden deaths occurring as a result of untreated VF despite recommended programming.

Closer attention needs to be paid to understanding how to better assess which patients would benefit from the current generic rate thresholds, as opposed to who would be harmed by them. It is possible that a one-size-fits-all approach will always result in harm to some while benefiting others, as lowering the rate cutoff in some patients might lead to inappropriate therapies, which might be as life altering as untreated VF in many patients.

Finally, keeping within the realm of defibrillation, we review an article by Leyva et al., published in last month's edition of the Journal of the American College of Cardiology, entitled Outcomes of Cardiac Resynchronization Therapy With or Without Defibrillation in Patients With Nonischemic Cardiomyopathy.

There are several recent studies that have started to cast doubt on what the incremental benefit of defibrillation added to cardiac resynchronization therapy actually is in nonischemic cardiomyopathy.

However, we also know that in patients with scar noted on MRI there can be an increased risk of ICD therapy. Thus, part of the difficulty some individuals have is how we define the nonischemic cardiomyopathy cohorts. Namely, is all nonischemic cardiomyopathy created equal, and can we better risk stratify this population into subtypes, some of whom might benefit from primary prevention defibrillators and some of whom might not?

Thus, in this study they aimed to determine whether CRTD is superior to CRTP in patients with nonischemic cardiomyopathy based on the presence or absence of left ventricular midwall fibrosis detected by cardiac magnetic resonance.

There were a total of 68 patients who had midwall fibrosis and 184 patients who did not, and all of them underwent this evaluation prior to CRT implantation. They noted that the presence of midwall fibrosis was an independent predictor of total mortality, with a hazard ratio of 2.31, as well as of total mortality or heart failure hospitalization.

The hazard ratio for sudden cardiac death was about 3.75, with the increased risk attributable to the presence of midwall fibrosis. They also noted that total mortality or heart failure hospitalization, and total mortality or hospitalization for major adverse cardiac events, were significantly lower in patients with a CRT defibrillator than with a CRT pacemaker in those with midwall fibrosis, but not in those without midwall fibrosis.

These findings highlight that in some patients with nonischemic cardiomyopathy CRTD may be superior to CRTP, though this might be guided by the presence of abnormal substrate. The evaluation of what nonischemic cardiomyopathy means in an individual patient needs to be closely considered.

Nonischemic cardiomyopathy is a blanket term for all those patients who do not have an ischemic cardiomyopathy and who may or may not have been fully evaluated to discriminate another type of cardiomyopathy, such as infiltrative cardiomyopathies, for example sarcoidosis.

The value of cardiac magnetic resonance imaging is being increasingly understood as it applies both to risk stratification in nonischemic cardiomyopathy and to decision making regarding treatment of these patients. In a recent publication, also published this past month in JACC: Clinical Electrophysiology, by [inaudible] et al., they reviewed the efficacy of implantable cardioverter-defibrillator therapy in patients with nonischemic cardiomyopathy based on a meta-analysis of existing trials.

They demonstrated in a meta-analysis of randomized controlled trials that, compared to medical therapy, ICDs significantly improved survival among patients with nonischemic cardiomyopathy with an ejection fraction of less than or equal to 35%. However, CRT defibrillator overall was not associated with a statistically significant mortality benefit when compared to CRT pacemaker.

These findings are actually complementary to each other, but need to be considered in context. One of the considerations regarding the recently published DANISH trial was the fact that not only is CRT being increasingly utilized appropriately in patients with nonischemic cardiomyopathies, but also guideline-directed medical therapy has improved over the course of the last several years since the initial trials of defibrillator therapy as primary prevention.

Furthermore, the trial was actually powered based on a 25% reduction in overall events. Thus, even if there is a smaller benefit, the trial would not necessarily be powered to identify whether it is statistically significant. One issue, as stated, is the fact that nonischemic cardiomyopathy might be a milieu of different causes in individual patients, some of whom might be at high risk for sudden cardiac death and some of whom might not.

The publication by Leyva et al. highlights that better attempts at risk stratification, on the basis of either MRI or other modalities, might be important in helping us further assess who actually benefits from an ICD. However, when mixing prior trials with more recent trials that existed in different eras of medical therapy, and different eras of appropriate use of devices such as CRT, it is critical to consider whether or not the same cutoffs and the same power calculations still apply.

It is doubtless that defibrillator therapy is needed in many patients with both ischemic and nonischemic cardiomyopathy, even with improved therapies for these patients otherwise. However, this multitude of publications coming out to improve our assessment of the utility of ICDs should not necessarily call into question whether ICDs are merited at all, but should call into question whether we understand and have arrived at the best form of risk stratification for those patients who would most benefit, and thus this is an opportunity for us to identify those patients better.

Next, we will move to the realm of supraventricular tachycardias and specifically an article published by Yang et al. in last month's edition of Heart Rhythm, entitled Focal Atrial Tachycardias From The Parahisian Region: Strategies For Mapping And Catheter Ablation.

Focal atrial tachycardias from the parahisian region can potentially be targeted from multiple different regions: the right atrial septum, the noncoronary cusp, and the right middle septum. However, the optimal mapping and ablation strategy for these arrhythmias remains unclear, and thus they sought to investigate the electrophysiologic characteristics and optimal ablation sites for parahisian atrial tachycardias from these different areas.

They reviewed 362 patients with atrial tachycardias undergoing catheter ablation. They performed extensive ECG analysis and electrophysiology studies on these patients. Overall, 91 patients had a parahisian origin, and ablation was successful in the majority of these, up to 94.5%.

The majority of these patients had their AT successfully eliminated from the noncoronary cusp, with about 44 of the 91 having it targeted from this region, the remaining 23 from the right atrial septum, and 19 from the right middle septum. They noted that those who had the earliest potential at the distal His catheter tended to have their site of origin more successfully ablated from the noncoronary cusp.