Extended Wear: At Last Or Déjà Vu?
BY H. DWIGHT CAVANAUGH, MD, PHD
We stand at the threshold of a new era in contact lens development, with highly oxygen-transmissible materials that offer the promise of true extended wear for up to 30 nights. Do we have any convincing biological rationale for why these materials might be safer and more effective? The hopeful part of us wants to believe; the survivor of previous EW attempts in the 1980s remains unconvinced. The paradox is that unless the lenses are approved for lengthy wearing periods, we cannot perform the after-the-fact epidemiological studies needed to vindicate their performance.
Along the way, clinicians need to offer patients an acceptable alternative to permanent laser surgery. Many surveys have shown that EW is not a viable option unless wearing periods of about a month can be safely achieved. To make matters more difficult, clinical studies of 500 to 1,000 patients cannot adequately measure the incidence of, or risk factors for, infectious-ulcerative keratitis. To date, we have no surrogate, prospective, objective, potentially maskable outcome measure in clinical studies that could predict future risk for this devastating problem.
Up until now, measurements of corneal thickness have been gospel, and the Holden-Mertz curve has been the text. Unfortunately, corneal swelling does not predict risk for infectious keratitis. Recent studies of the binding of Pseudomonas aeruginosa to corneal cells shed from human contact lens wearers have shown promise as a prospective measure of risk. Indeed, results of such binding studies correlate exactly with the existing hierarchy of risk profiles for infectious-ulcerative keratitis, both by lens type and by wearing schedule.
The good news is that the new hyper-O2 transmissible materials, both rigid and soft, do not appear to increase bacterial binding compared to conventional materials. Thus, there is now a biological rationale for why the future behavior of these materials may differ from that of past materials. The final question will be answered by an epidemiological study: have the incidence and associated risk factors changed favorably or not? We have excellent baseline studies for comparison: the seminal 1989 papers in the New England Journal of Medicine by Poggio et al and Schein et al, and the recent study by Cheng et al (2000), which examined almost the entire contact lens-wearing population of the Netherlands.
What should the thoughtful clinician now do? Jump on board with EW, or wait? Should these materials be approved for up to 30-night wear if the classical clinical studies now in progress also succeed? This observer, as one clinician, favors approval of 30-night wear coupled with a follow-up epidemiological study, which should demonstrate lessened risk of infectious-ulcerative keratitis in both daily and EW modalities. Bacterial binding data would then be vindicated as a prospective, objective outcome measure serving as the backbone for testing all lenses. This can be done in groups of fewer than 50 individuals over three months, and would free the entire field for an explosive development of creativity in new lens materials and designs. I say things have never looked brighter, but the proof will come when the final science is in.