Minimal residual disease assessment in CLL began two decades ago in studies that used insensitive, poorly specific two-color FLC methods, classifying all CD19+CD5+ light-chain-restricted cells as CLL. Back in 1992, Robertson et al.7 demonstrated that even when MRD negativity is defined by basic two-color FLC, CR patients can be divided into two groups, with a clear difference in progression-free survival (PFS) between MRD-positive and MRD-negative patients (19 months versus over 30 months, respectively). O’Brien et al.8 likewise showed a small but significant prolongation of PFS in MRD-negative CR patients after fludarabine and cyclophosphamide treatment in a group of 36 patients analyzed with the same insensitive FLC technique.
Later studies began using more complex multi-color, multiparameter FLC as well as PCR to assess MRD as a secondary end-point, and examined a variety of treatment regimens, yielding results in line with those presented above. These findings added solid evidence for the role of MRD surveillance in CLL during first-line as well as advanced-line treatments.
Minimal Residual Disease Negativity in First-Line Therapies
In a study by Bosch et al.,9 a total of 69 patients were treated with the fludarabine/cyclophosphamide/mitoxantrone (FCM) protocol as front-line therapy. Minimal residual disease status was assessed by multiparameter (four-color) FLC and ASO-PCR. Twenty-six percent of patients achieved MRD-negative CR and had a lower probability of progression at 2 years (9% for MRD-negative patients versus 20% for MRD-positive patients). Moreover, MRD-negative CR also led to improved overall survival (OS) compared to any inferior response.
Hillmen et al.10 reported similar results in an analysis of 297 previously untreated patients who received alemtuzumab versus chlorambucil. In this study, four-color FLC demonstrated an MRD-negative status in 26% of CR patients, and, in agreement with the study by Bosch et al., MRD-negative CR patients had a significantly improved PFS compared to those with MRD-positive CR.
The most substantial evidence for the benefit of achieving MRD negativity with first-line therapy comes from the German CLL Study Group (GCLLSG) trial designed, among other things, to compare MRD status in two treatment arms using four-color FLC 2 months after treatment.11 In this study, 817 untreated patients were randomized to either the fludarabine and cyclophosphamide (FC) or the FCR treatment protocol. Complete response rates were significantly higher with FCR (52%) than with FC (27%), as were rates of MRD negativity in the bone marrow (47.6% and 27.3%, respectively). Ten-year follow-up of patients in the FCR arm shows what appears to be a plateau in the survival curves, arising mostly from the low-MRD patient group, suggesting that some of these patients may have been cured. Of interest, MRD levels were found to be of higher prognostic value than the treatment regimen itself, cytogenetics (excluding the 17p deletion), pre-therapeutic white blood cell count, β2-microglobulin, and IgHV mutational status.
A retrospective analysis of over 200 patients treated with a variety of first-line therapies, conducted by Santacruz et al.,12 supports the above findings, clearly demonstrating that MRD negativity was a consistent predictor of both treatment-free survival and OS, with an almost doubled treatment-free survival interval for patients achieving MRD-negative CR (76 versus 40 months). This analysis also highlights that the advantages of achieving MRD negativity are not confined to one specific treatment protocol.
Minimal Residual Disease Negativity in Advanced-Line Therapies
The issue of MRD negativity has also been addressed in advanced-line treatments. In the study by Moreton et al.,13 a total of 91 patients refractory to purine analogues were treated for 9 weeks with alemtuzumab and assessed for bone marrow MRD by four-color FLC. In this study, treatment-free survival as well as median survival were significantly longer in MRD-negative patients compared with those achieving an MRD-positive CR, partial response, or no response. Median treatment-free survival for MRD-negative patients had not been reached at 60 months, versus 20 months for those with MRD-positive CRs.
In a similar vein, among 37 patients with resistant or relapsed CLL treated with FCM in the study by the Spanish GELCAB (Grup per l’Estudi dels Limfomes a Catalunya i Balears),14 median duration of PFS and overall response was longer for patients achieving MRD-negative versus MRD-positive CR.
Minimal Residual Disease Negativity in the Post-allogeneic Stem Cell Transplant Setting
Minimal residual disease has also been evaluated in the post-allogeneic stem cell transplant (alloSCT) setting. Post-alloSCT follow-up presents a decision-making challenge: although physicians have a therapeutic tool that can be continually adjusted for disease control, studies have not determined the exact scenarios that require intervention or the best method of intervention. A study by the GCLLSG examined MRD status at several time points after alloSCT and demonstrated enhanced event-free survival in patients who were MRD-negative at 12 months.15 This finding emphasizes not only the importance of MRD negativity per se, but also the significance of performing MRD analysis at the 12-month time point in alloSCT patients. Another small retrospective analysis16 also addressed this issue, demonstrating similar data regarding the correlation between MRD negativity and disease-free survival, and suggesting a potential role for sequential MRD monitoring, since the dynamics of MRD levels proved to be a relevant prognostic factor.
Importantly, these two studies report an association between MRD surveillance per se and improved event-free survival, suggesting that once the treating physician was equipped with knowledge of the MRD status, treatment modifications were made that led to improved survival. Possible treatment modifications in MRD-positive patients include the use of donor lymphocyte infusions or a reduction of the immunosuppressive drug dose. This observation emphasizes the need for trials specifically addressing treatment tailoring according to MRD dynamics after transplantation.
Can MRD Status Guide Risk-Adapted Therapy in CLL?
The available data imply that MRD negativity is a potent landmark on the way to improved survival, raising questions about its potential applicability to treatment de-escalation and toxicity reduction in patients who reach MRD negativity early in the course of therapy. Two prospective studies addressed this issue, suggesting that MRD negativity can guide decisions regarding therapy duration and intensity. In the study by Strati et al.,17 MRD negativity in bone marrow samples after three cycles of FCR resulted in the same PFS and OS as those found in patients with no detectable MRD after six courses of FCR. Data analysis of the GCLLSG CLL8 study reveals a similar picture, showing that PFS was similar in patients who had already achieved low MRD levels after three treatment cycles compared with those who required the full treatment (six cycles) to attain this status.11
These two studies show that MRD negativity is an independent prognostic factor at any time point, suggesting that, once reached, it may provide sufficient “protection” against relapse risk that is not further enhanced by treatment continuation.
If so, strict adherence to the full-length treatment protocols might prove to be unwarranted, and shortening or de-escalation of therapy could reduce toxicity in patients achieving early and deep responses. This hypothesis needs to be proven in a randomized trial, but currently available data are nevertheless thought-provoking.