TOPIC: Glaucoma is the leading cause of irreversible blindness, despite having a good prognosis with early treatment. We evaluated the global extent of undetected glaucoma and the factors associated with it in this systematic review and meta-analysis. CLINICAL RELEVANCE: Undetected glaucoma increases the risk of vision impairment, which leads to detrimental effects on the quality of life and socioeconomic well-being of those affected. Detailed information on the extent of, and factors associated with, undetected glaucoma aids in the development of public health interventions. METHODS: We conducted a systematic review and meta-analysis of population-based studies published between January 1, 1990, and June 1, 2020. The article search was conducted in online databases (PubMed, Web of Science), grey literature (OpenGrey), and nongovernmental organization reports. Our outcome measure was the proportion of glaucoma cases that were previously undetected. Manifest glaucoma included any form of glaucoma reported in the original studies and may include primary open-angle glaucoma (POAG), primary angle-closure glaucoma, secondary glaucoma, or a combination thereof. Undetected glaucoma was defined as glaucoma cases that were undetected prior to diagnosis in the respective study. Random-effects meta-analysis was used to estimate the pooled proportion of undetected glaucoma. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses and the Meta-analysis of Observational Studies in Epidemiology guidelines in our study. RESULTS: We identified 61 articles from 55 population-based studies (n = 189 359 participants; n = 6949 manifest glaucoma). Globally, on average, more than half of all glaucoma cases in each geographical region were previously undetected. Africa (odds ratio [OR], 12.70; 95% confidence interval [CI], 4.91-32.86) and Asia (OR, 3.41; 95% CI, 1.63-7.16) showed higher odds of undetected glaucoma as compared with Europe.
Countries with low Human Development Index (HDI; <0.55) showed a higher proportion of undetected manifest glaucoma as compared with countries of medium to very high HDI (≥0.55; all P < 0.001). In 2020, 43.78 million POAG cases were projected to be undetected, of which 76.7% were in Africa and Asia. DISCUSSION: Undetected glaucoma is highly prevalent across diverse communities worldwide and more common in Africa and Asia. Strategies to improve detection are needed to prevent excess visual disability and blindness resulting from glaucoma.
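The pooled proportions above come from a random-effects meta-analysis. As a minimal illustrative sketch (not the study's actual code), DerSimonian-Laird pooling of study proportions can be done on the logit scale; the function name, continuity correction, and example data below are our own assumptions:

```python
import math

def pool_proportions_dl(events, totals):
    """DerSimonian-Laird random-effects pooling of study proportions,
    performed on the logit scale and back-transformed at the end."""
    yi, vi = [], []
    for e, n in zip(events, totals):
        p = (e + 0.5) / (n + 1.0)                  # continuity correction
        yi.append(math.log(p / (1.0 - p)))         # logit proportion
        vi.append(1.0 / (e + 0.5) + 1.0 / (n - e + 0.5))

    wi = [1.0 / v for v in vi]                     # fixed-effect weights
    ybar = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
    q = sum(w * (y - ybar) ** 2 for w, y in zip(wi, yi))  # Cochran's Q

    # Method-of-moments estimate of between-study variance tau^2
    c = sum(wi) - sum(w * w for w in wi) / sum(wi)
    tau2 = max(0.0, (q - (len(yi) - 1)) / c)

    # Random-effects weights add tau^2; back-transform the pooled logit
    wstar = [1.0 / (v + tau2) for v in vi]
    pooled = sum(w * y for w, y in zip(wstar, yi)) / sum(wstar)
    return 1.0 / (1.0 + math.exp(-pooled))

# Three hypothetical studies: 50/100, 60/100, 70/100 undetected cases
print(pool_proportions_dl([50, 60, 70], [100, 100, 100]))
```

Working on the logit scale keeps the pooled estimate inside (0, 1) even when individual study proportions are extreme.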
Sokol JT, Schechet SA, Komati R, Eliott D, Vavvas DG, Kaplan RI, Ittiara ST, Farooq AV, Sheth VS, MacCumber MW, Ke R, Gentile RC, Skondra D. Macular Hole Closure with Medical Treatment. Ophthalmol Retina 2021;5(7):711-713.
Solaguren-Beascoa M, Bujakowska KM, Méjécase C, Emmenegger L, Orhan E, Neuillé M, Mohand-Saïd S, Condroyer C, Lancelot M-E, Michiels C, Demontant V, Antonio A, Letexier M, Saraiva J-P, Lonjou C, Carpentier W, Léveillard T, Pierce EA, Dollfus H, Sahel J-A, Bhattacharya SS, Audo I, Zeitz C. WDR34, a candidate gene for non-syndromic rod-cone dystrophy. Clin Genet 2021;99(2):298-302.
Rod-cone dystrophy (RCD), also called retinitis pigmentosa, is characterized by rod followed by cone photoreceptor degeneration, leading to gradual visual loss. Mutations in over 65 genes have been associated with non-syndromic RCD explaining 60% to 70% of cases, with novel gene defects possibly accounting for the unsolved cases. Homozygosity mapping and whole-exome sequencing applied to a case of autosomal recessive non-syndromic RCD from a consanguineous union identified a homozygous variant in WDR34. Mutations in WDR34 have been previously associated with severe ciliopathy syndromes possibly associated with a retinal dystrophy. This is the first report of a homozygous mutation in WDR34 associated with non-syndromic RCD.
Post-keratoplasty infectious keratitis (PKIK) represents a unique clinical entity that often poses significant diagnostic and therapeutic challenges. It carries a high risk of serious complications such as graft rejection and failure, and less commonly endophthalmitis. Topical corticosteroids are often required to reduce the risk of graft rejection, but their use in PKIK may act as a double-edged sword, particularly in fungal infection. The increased uptake of lamellar keratoplasty in recent years has also led to complications such as graft-host interface infectious keratitis (IIK), which is particularly difficult to manage. The reported incidence of PKIK differs considerably across countries, with a higher incidence observed in developing countries (9.2-11.9%) than in developed countries (0.02-7.9%). Common risk factors for PKIK include the use of topical corticosteroids, suture-related problems, ocular surface diseases, and previous corneal infection. PKIK after penetrating keratoplasty or (deep) anterior lamellar keratoplasty is most commonly caused by ocular surface commensals, particularly Gram-positive bacteria, whereas PKIK after endothelial keratoplasty is usually caused by Candida spp. Empirical broad-spectrum antimicrobial treatment is the mainstay of treatment for PKIK, though surgical interventions are required in medically refractory cases (during the acute phase) and in those affected by visually significant scarring (during the late phase). In this paper, we aim to provide a comprehensive overview of PKIK, encompassing the epidemiology, risk factors, causes, management, and outcomes, and to propose a treatment algorithm for systematically managing this challenging condition.
PURPOSE: Characteristics of periodic flares of dry eye disease (DED) are not well understood. We conducted a rapid evidence assessment to identify evidence for and characteristics of DED flares. METHODS: Literature searches were performed in Embase® via Ovid®, MEDLINE®, and PubMed®. Clinical trials and observational studies published 2009-2019 were included if they investigated patients aged ≥18 years with clinically diagnosed DED who experienced a flare, defined as a temporary or transient episode of increased ocular discomfort, typically lasting days to a few weeks. Triggers of flares, patient-reported outcomes (symptoms), clinician-measured outcomes (signs), and changes in tear molecules were captured. RESULTS: Twenty-one publications that included 22 studies met inclusion criteria. Five observational studies described evidence of DED flares in daily life, 5 studies reported changes following cataract/refractive surgery in patients with preoperative DED, and 12 studies employed controlled environment (CE) models. Real-world triggers of DED flares included air conditioning, wind, reading, low humidity, watching television, and pollution. CE chambers (dry, moving air) and surgery also triggered DED flares. Exacerbations of symptoms and signs of DED, assessed through varied measures, were reported during flares. Across studies, matrix metalloproteinase-9 and interleukin-6 increased and epidermal growth factor decreased during DED flares. CONCLUSIONS: Evidence from 22 studies identified triggers and characteristics of DED flares. Further research is needed to assist clinicians in early diagnosis and treatment of patients experiencing flares.
OBJECTIVE: Severe preeclampsia complicates roughly 1% of all pregnancies. One defining feature of severe preeclampsia is new-onset visual disturbance. The accessibility of the choroid to high-resolution, noninvasive imaging makes it a reasonable target of investigation for disease prediction, stratification, or monitoring in preeclampsia. This study aimed to compare subfoveal choroidal thickness between women with severe preeclampsia and those with normotensive pregnancies, and to investigate associations between such findings and other indicators of disease severity, including gestational age and serum angiogenic factors. STUDY DESIGN: We designed a case-control study comprising 36 women diagnosed with severe preeclampsia (cases) matched to 37 normotensive women (controls) by race/ethnicity and parity, all diagnosed in the postpartum period. All patients underwent enhanced depth imaging spectral-domain optical coherence tomography and serum analysis. RESULTS: Cases showed no difference in subfoveal choroidal thickness compared with controls (P = 0.65). Amongst cases, subfoveal choroidal thickness and gestational age at delivery were inversely related (r = 0.86, P < .001). There was a positive association of placental growth factor with subfoveal choroidal thickness amongst cases (r = 0.54, P = 0.002). CONCLUSION: This study suggests a relationship between the degree of disease severity and the magnitude of choroidal thickening. We also show an association between this index and placental growth factor level in the postpartum period.
Autism spectrum disorder (ASD) is characterized by social deficits and atypical facial processing of emotional expressions. The underlying neuropathology of these abnormalities is still unclear. Recent studies implicate the cerebellum in emotional processing; other studies show cerebellar abnormalities in ASD. Here, we elucidate the spatiotemporal activation of cerebellar lobules during emotional processing of happy and angry faces in adolescents with ASD and typically developing (TD) controls. Using magnetoencephalography, we calculated dynamic statistical parametric maps across a period of 500 ms after emotional stimulus onset and determined differences between group activity to happy and angry emotions. Following happy face presentation, adolescents with ASD exhibited only left-hemispheric cerebellar activation in a cluster extending from lobule VI to lobule V (compared with TD controls). Following angry face presentation, adolescents with ASD exhibited only midline cerebellar activation (posterior IX vermis). Our findings indicate an early (125-175 ms) overactivation in cerebellar activity only for happy faces and a later overactivation for both happy (250-450 ms) and angry (250-350 ms) faces in adolescents with ASD. The prioritized hemispheric activity (happy faces) could reflect the promotion of a more flexible and adaptive social behavior, while the latter midline activity (angry faces) may guide conforming behavior.
Purpose: Develop equations to convert Cirrus central subfield thickness (CST) to Spectralis CST equivalents and vice versa in eyes with diabetic macular edema (DME). Methods: The DRCR Retina Network Protocol O data were split randomly to train (70% sample) and validate (30% sample) conversion equations. Data from an independent study (CADME) also validated the equations. Bland-Altman 95% limits of agreement between predicted and observed values evaluated the equations. Results: Protocol O included 374 CST scan pairs from 187 eyes (107 participants). The CADME study included 150 scan pairs of 37 eyes (37 participants). Proposed conversion equations are Spectralis = 40.78 + 0.95 × Cirrus and Cirrus = 1.82 + 0.94 × Spectralis regardless of age, sex, or CST. Predicted values were within 10% of observed values in 101 (90%) of Spectralis and 99 (88%) of Cirrus scans in the validation data; and in 136 (91%) of the Spectralis and 148 (99%) of the Cirrus scans in the CADME data. Adjusting for within-eye correlations, 95% of conversions are estimated to be within 17% (95% confidence interval, 14%-21%) of CST on Spectralis and within 22% (95% confidence interval, 18%-28%) of CST on Cirrus. Conclusions: Conversion equations developed in this study allow the harmonization of CST measurements for eyes with DME using a mix of current Cirrus and Spectralis device images. Translational Relevance: The CSTs measured on Cirrus and Spectralis devices are not directly comparable owing to outer boundary segmentation differences. Converting CST values across spectral domain optical coherence tomography instruments should benefit both clinical research and standard care efforts.
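The reported conversion equations are simple linear maps and can be applied directly. A minimal sketch (the function names are ours; the coefficients are those reported above); note that the two regressions are not exact algebraic inverses, so a round-trip conversion does not return the original value:

```python
def cirrus_to_spectralis(cst_cirrus_um: float) -> float:
    """Convert a Cirrus CST (micrometers) to its Spectralis equivalent
    using the Protocol O regression: Spectralis = 40.78 + 0.95 x Cirrus."""
    return 40.78 + 0.95 * cst_cirrus_um

def spectralis_to_cirrus(cst_spectralis_um: float) -> float:
    """Convert a Spectralis CST (micrometers) to its Cirrus equivalent
    using the Protocol O regression: Cirrus = 1.82 + 0.94 x Spectralis."""
    return 1.82 + 0.94 * cst_spectralis_um

# A Cirrus CST of 300 um maps to about 325.8 um on Spectralis;
# converting that value back yields ~308 um, not 300 um, because
# each direction has its own fitted regression.
print(cirrus_to_spectralis(300.0))
print(spectralis_to_cirrus(cirrus_to_spectralis(300.0)))
```

The positive intercept in the Cirrus-to-Spectralis direction reflects the outer-boundary segmentation difference noted in the abstract: Spectralis CST values run systematically thicker.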
PURPOSE: To compare the variability and ability to detect visual field (VF) progression of 24-2, central 12 locations of the 24-2 and 10-2 VF tests in eyes with abnormal VFs. DESIGN: Retrospective, multisite cohort. PARTICIPANTS: A total of 52 806 24-2 and 11 966 10-2 VF tests from 7307 eyes from the Glaucoma Research Network database were analyzed. Only eyes with ≥ 5 visits and ≥ 2 years of follow-up were included. METHODS: Linear regression models were used to calculate the rates of mean deviation (MD) change (slopes), whereas their residuals were used to assess variability across the entire MD range. Computer simulations (n = 10 000) based on real MD residuals of our sample were performed to estimate power to detect significant progression (P < 5%) at various rates of MD change. MAIN OUTCOME MEASURES: Time required to detect progression. RESULTS: For all 3 patterns, the MD variability was highest within the -5 to -20 decibel (dB) range and consistently lower with the 10-2 compared with 24-2 or central 24-2. Overall, time to detect confirmed significant progression at 80% power was the lowest with 10-2 VF, with a decrease of 14.6% to 18.5% when compared with 24-2 and a decrease of 22.9% to 26.5% when compared with central 24-2. CONCLUSIONS: Time to detect central VF progression was reduced with 10-2 MD compared with 24-2 and C24-2 MD in glaucoma eyes in this large dataset, in part because 10-2 tests had lower variability. These findings contribute to current evidence of the potential value of 10-2 testing in the clinical management of patients with glaucoma and in clinical trial design.
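The power simulations can be illustrated with a simplified sketch: generate noisy MD series at a fixed true rate of change, fit an ordinary least-squares slope, and count the fraction of runs flagged as significantly negative. This is our simplification (Gaussian noise, a normal approximation to the slope test, and hypothetical parameter values), not the study's method of resampling real MD residuals:

```python
import math
import random

def ols_slope_z(times, values):
    """OLS slope of values on times, and its z-score (slope / SE)."""
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(values) / n
    sxx = sum((t - tbar) ** 2 for t in times)
    sxy = sum((t - tbar) * (y - ybar) for t, y in zip(times, values))
    slope = sxy / sxx
    sse = sum((y - (ybar + slope * (t - tbar))) ** 2
              for t, y in zip(times, values))
    se = math.sqrt(sse / (n - 2) / sxx)
    return slope, slope / se

def progression_power(true_slope, noise_sd, n_visits=10, years=5.0,
                      n_sims=2000, z_crit=1.96, seed=1):
    """Fraction of simulated MD series (dB) whose fitted slope is
    significantly negative.  Uses a normal approximation to the
    slope t-test, which is rough for small numbers of visits."""
    rng = random.Random(seed)
    times = [i * years / (n_visits - 1) for i in range(n_visits)]
    hits = 0
    for _ in range(n_sims):
        series = [true_slope * t + rng.gauss(0.0, noise_sd) for t in times]
        slope, z = ols_slope_z(times, series)
        if slope < 0 and z < -z_crit:
            hits += 1
    return hits / n_sims

# Hypothetical example: -1 dB/year loss, 1 dB test-retest noise,
# 10 visits over 5 years -- power is near 1 under these assumptions.
print(progression_power(-1.0, 1.0))
```

Because the study found that MD variability is higher in the -5 to -20 dB range and higher for 24-2 than 10-2, raising `noise_sd` in this sketch lengthens the time (or visit count) needed to reach a given power, which is the mechanism behind the reported differences in time to detect progression.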
Purpose: One rehabilitation strategy taught to individuals with hemianopic field loss (HFL) is to make a large blind-side scan to quickly identify hazards. However, it is not clear what the minimum threshold is for how large the scan should be. Using driving simulation, we evaluated thresholds (criteria) for gaze and head scan magnitudes that best predict detection safety. Methods: Seventeen participants with complete HFL and 15 with normal vision (NV) drove through 4 routes in a virtual city while their eyes and head were tracked. Participants pressed the horn as soon as they detected a motorcycle (10 per drive) that appeared at 54 degrees eccentricity on cross-streets and approached the driver. Results: Those with HFL detected fewer motorcycles than those with NV and had worse detection on the blind side than the seeing side. On the blind side, both safe detections and early detections (detections before the hazard entered the intersection) could be predicted with both gaze (safe 18.5 degrees and early 33.8 degrees) and head (safe 19.3 degrees and early 27 degrees) scans. However, on the seeing side, only early detections could be classified with gaze (25.3 degrees) and head (9.0 degrees). Conclusions: Both head and gaze scan magnitude were significant predictors of detection on the blind side, but less predictive on the seeing side, which was likely driven by the ability to use peripheral vision. Interestingly, head scans were as predictive as gaze scans. Translational Relevance: The minimum scan magnitude could be a useful criterion for scanning training or for developing assistive technologies to improve scanning.
Eye and head movements are used to scan the environment when driving. In particular, when approaching an intersection, large gaze scans to the left and right, comprising head and multiple eye movements, are made. We detail an algorithm called the gaze scan algorithm that automatically quantifies the magnitude, duration, and composition of such large lateral gaze scans. The algorithm works by first detecting lateral saccades, then merging these lateral saccades into gaze scans, with the start and end points of each gaze scan marked in time and eccentricity. We evaluated the algorithm by comparing gaze scans generated by the algorithm to manually marked "consensus ground truth" gaze scans taken from gaze data collected in a high-fidelity driving simulator. We found that the gaze scan algorithm successfully marked 96% of gaze scans and produced magnitudes and durations close to ground truth. Furthermore, the differences between the algorithm and ground truth were similar to the differences found between expert coders. Therefore, the algorithm may be used in lieu of manual marking of gaze data, significantly accelerating the time-consuming marking of gaze movement data in driving simulator studies. The algorithm also complements existing eye tracking and mobility research by quantifying the number, direction, magnitude, and timing of gaze scans and can be used to better understand how individuals scan their environment.
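The merging step described above can be sketched in a few lines. This is a minimal illustration assuming lateral saccades have already been detected and are given as `(start_s, end_s, start_deg, end_deg)` tuples; the same-direction merging rule and the 0.4-s gap threshold are our illustrative assumptions, not the published algorithm's parameters:

```python
def merge_saccades_into_scans(saccades, max_gap_s=0.4):
    """Merge same-direction lateral saccades separated by short gaps
    into gaze scans, marking each scan's start/end time and eccentricity.

    saccades: iterable of (start_s, end_s, start_deg, end_deg) tuples.
    """
    scans = []
    for sac in sorted(saccades):                     # sort by start time
        direction = 1 if sac[3] > sac[2] else -1     # +1 right, -1 left
        if (scans and direction == scans[-1]["dir"]
                and sac[0] - scans[-1]["end_s"] <= max_gap_s):
            # Same direction, short gap: extend the current gaze scan
            scans[-1]["end_s"] = sac[1]
            scans[-1]["end_deg"] = sac[3]
        else:
            # Otherwise start a new gaze scan
            scans.append({"start_s": sac[0], "end_s": sac[1],
                          "start_deg": sac[2], "end_deg": sac[3],
                          "dir": direction})
    for s in scans:
        s["magnitude_deg"] = abs(s["end_deg"] - s["start_deg"])
        s["duration_s"] = s["end_s"] - s["start_s"]
    return scans

# Two rightward saccades 0.2 s apart merge into one 45-degree scan;
# the later leftward saccade starts a separate scan.
scans = merge_saccades_into_scans([(0.0, 0.1, 0.0, 20.0),
                                   (0.3, 0.4, 20.0, 45.0),
                                   (1.0, 1.1, 45.0, 10.0)])
print(scans)
```

Marking start and end points in both time and eccentricity, as the algorithm does, is what lets downstream analyses report magnitude, duration, and composition per scan.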
SIGNIFICANCE: Think Tank 2019 affirmed that the rate of infection associated with contact lenses has not changed in several decades. Also, there is a trend toward more serious infections associated with Acanthamoeba and fungi. The growing use of contact lenses in children demands our attention with surveillance and case-control studies. PURPOSE: The American Academy of Optometry (AAO) gathered researchers and key opinion leaders from around the world to discuss contact lens-associated microbial keratitis at the 2019 AAO Annual Meeting. METHODS: Experts presented within four sessions. Session 1 covered the epidemiology of microbial keratitis, pathogenesis of Pseudomonas aeruginosa, and the role of lens care systems and storage cases in corneal disease. Session 2 covered nonbacterial forms of keratitis in contact lens wearers. Session 3 covered future needs, challenges, and research questions in relation to microbial keratitis in youth and myopia control, microbiome, antimicrobial surfaces, and genetic susceptibility. Session 4 covered compliance and communication imperatives. RESULTS: The absolute rate of microbial keratitis has remained very consistent for three decades despite new technologies, and extended wear significantly increases the risk. Improved oxygen delivery afforded by silicone hydrogel lenses has not impacted the rates, and although the introduction of daily disposable lenses has minimized the risk of severe disease, there is no consistent evidence that they have altered the overall rate of microbial keratitis. Overnight orthokeratology lenses may increase the risk of microbial keratitis, especially secondary to Acanthamoeba, in children. Compliance remains a concern and a significant risk factor for disease. New insights into host microbiome and genetic susceptibility may uncover new theories. More studies such as case-control designs suited for rare diseases and registries are needed. 
CONCLUSIONS: The first annual AAO Think Tank acknowledged that the risk of microbial keratitis has not decreased over decades, despite innovation. Important questions and research directions remain.
Diagnosis and treatment planning in ophthalmology heavily depend on clinical examination and advanced imaging modalities, which can be time-consuming and carry the risk of human error. Artificial intelligence (AI) and deep learning (DL) are being used in different fields of ophthalmology, in particular for diagnostics and for predicting the outcomes of anterior segment surgeries. This review evaluates recent developments in AI for the diagnostics, surgical interventions, and prognosis of corneal diseases. It also provides a brief overview of newer AI-dependent modalities in corneal diseases.
SIGNIFICANCE: We assessed the prevalence of refractive error in a sample of children in Northern Mexico using the Refractive Error Study in Children protocol of the World Health Organization, which allows for comparison with other global studies. PURPOSE: Uncorrected refractive error is the main cause of visual impairment in children. The purpose of this study was to assess the refractive error and visual dysfunctions of students (15 to 18 years old) in the upper-middle school system of Sinaloa, Mexico. METHODS: A total of 3468 students in Sinaloa's high school system participated in the study from 2017 to 2019. Optometrists and student clinicians from the Optometry Program of the Autonomous University of Sinaloa conducted the testing. Tests included visual acuities and static retinoscopy. We did not use a cycloplegic agent. RESULTS: The results showed a high prevalence of uncorrected refractive errors. Myopia, defined as a refractive error ≤-0.50 D, had a prevalence of 36.11% (95% confidence interval, 33.47 to 38.83%); hyperopia, defined as a refractive error ≥+2.00 D, had a prevalence of 1.49% (95% confidence interval, 0.09 to 2.33%); and astigmatism, defined as a refractive error with a cylinder ≥0.75 D, had a prevalence of 29.17% (95% confidence interval, 26.60 to 31.76%). We found a significant effect of sex on visual acuity. CONCLUSIONS: Our results are consistent with the high prevalence of myopia reported in adolescents worldwide and in Mexico's northern regions. The results suggest that students attending high school and entering universities should be required to have an optometric eye examination. Additional studies are needed to investigate the prevalence of refractive errors in children in Mexico.
PURPOSE: To evaluate the safety and efficacy of a single, in-office administration of 5% povidone-iodine (PVP-I) compared with artificial tears (AT) for adenoviral conjunctivitis (Ad-Cs). DESIGN: Double-masked pilot randomized trial. METHODS: Patients presenting with presumed adenoviral conjunctivitis were screened at 9 U.S. clinics. INCLUSION CRITERIA: age ≥ 18 years, symptoms ≤ 4 days, and a positive AdenoPlus® test. EXCLUSION CRITERIA: thyroid disease, iodine allergy, recent ocular surgery, and ocular findings inconsistent with early-stage Ad-Cs. Patients were randomized to a single administration of 5% PVP-I or AT in one eye, with examinations on days 1-2, 4, 7, 14, and 21 and conjunctival swabs taken at each visit for quantitative polymerase chain reaction. The primary outcome was percent reduction from peak viral load. Secondary outcomes were improvement in clinical signs and symptoms. RESULTS: Of 56 patients randomized, 28 had detectable viral titers at baseline. On day 4 post-treatment, viral titers in the 5% PVP-I and AT groups were 2.5% ± 2.7% and 14.4% ± 10.5% of peak, respectively (P = 0.020). Severity of participant-reported tearing, lid swelling, and redness, as well as clinician-graded mucoid discharge, bulbar redness, and bulbar edema, were lower in the 5% PVP-I group than in the AT group on day 4 (P < 0.05). After day 4, viral titers and the severity of signs and symptoms decreased markedly in both groups, and no differences between groups were detected. CONCLUSIONS: Pilot data suggest a single, in-office administration of 5% PVP-I could reduce viral load and hasten improvement of clinical signs and symptoms in patients with Ad-Cs.
The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) emerged in December 2019 in Wuhan city, Hubei province, China. This is the third and largest coronavirus outbreak since the new millennium, after SARS in 2002 and Middle East respiratory syndrome (MERS) in 2012. Over 3 million people have been infected, and COVID-19 has caused more than 217 000 deaths. A concern exists regarding the vulnerability of patients who have been treated with immunosuppressive drugs prior to or during this pandemic. Would they be more susceptible to infection by SARS-CoV-2, and how would their clinical course be altered by their immunosuppressed state? This is a question that the wider medical fraternity, including ophthalmologists, rheumatologists, gastroenterologists, and transplant physicians among others, must answer. The evidence from the SARS and MERS outbreaks offers some degree of confidence that immunosuppression is largely safe in the current COVID-19 pandemic. Preliminary clinical experience based on case reports, small series, and observational studies shows that morbidity and mortality rates in immunosuppressed patients may not differ largely from those in the general population. Overwhelmingly, current best-practice guidelines worldwide recommend the continuation of immunosuppressive treatment in patients who require it, except perhaps for high-dose corticosteroid therapy and in patients with associated risk factors for severe COVID-19 disease.