PURPOSE: To estimate the point prevalence of uveal melanoma in patients with a germline BAP1 pathogenic variant. DESIGN: Cohort study with risk assessment using Bayesian analysis. METHODS: The point prevalence estimate was obtained by Bayes's rule of reverse conditional probabilities. The probability of uveal melanoma given that a germline BAP1 pathogenic variant is present was derived from the prevalence of uveal melanoma, the prevalence of germline BAP1 pathogenic variants, and the probability of a germline BAP1 pathogenic variant given that uveal melanoma is present. Confidence intervals (CIs) for each variable were calculated as the mean of Bernoulli random variables and, for the risk estimate, by the delta method. The age at diagnosis and the gender of uveal melanoma patients with germline BAP1 pathogenic variants, obtained from previous publications or from the authors' unpublished cohort, were compared with those in the Surveillance, Epidemiology, and End Results (SEER) database. RESULTS: The point prevalence of uveal melanoma in patients with germline BAP1 pathogenic variants in the US population was estimated to be 2.8% (95% CI, 0.88%-4.81%). In the SEER database, the median age at diagnosis of uveal melanoma was 63 years (range, 3-99 years), with a male-to-female ratio of 1.01:1. In comparison, uveal melanoma cases with germline BAP1 pathogenic variants from the US population (n = 27) had a median age at diagnosis of 50.5 years (range, 16-71 years). CONCLUSIONS: Quantification of the risk of developing uveal melanoma can enhance counseling regarding surveillance in patients with a germline BAP1 pathogenic variant.
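The reverse-conditional-probability step described in the methods can be written out directly as Bayes's rule. The sketch below shows the form of the calculation only; the input values are illustrative placeholders, not the study's actual prevalence estimates.

```python
# Sketch of the Bayes's-rule (reverse conditional probability) calculation
# described in the abstract. Input values are ILLUSTRATIVE placeholders,
# not the study's estimates.

def p_um_given_bap1(p_um, p_bap1, p_bap1_given_um):
    """P(UM | BAP1) = P(BAP1 | UM) * P(UM) / P(BAP1)."""
    return p_bap1_given_um * p_um / p_bap1

# Hypothetical inputs: uveal melanoma prevalence, germline BAP1 variant
# prevalence, and proportion of UM cases carrying a germline BAP1 variant.
estimate = p_um_given_bap1(p_um=6e-5, p_bap1=2e-5, p_bap1_given_um=0.01)
print(f"{estimate:.1%}")  # → 3.0%
```

With real inputs, the same identity yields the study's 2.8% point prevalence; the delta-method CI is then computed around this ratio.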
The conjunctival microvasculature consists of extensive branching of superficial and deep arterial systems and corresponding drainage pathways, and the translucent appearance of the conjunctiva allows for immediate visualization of changes in the circulation. Conjunctival hyperemia is caused by a pathological vasodilatory response of the microvasculature to inflammation arising from a myriad of infectious and non-infectious etiologies. It is one of the most common contributors to ocular complaints that prompt visits to medical centers. Our understanding of these neurogenic and immune-mediated pathways has progressed over time and played a critical role in developing targeted novel therapies. Because of the multitude of underlying etiologies, patients must be accurately diagnosed for efficacious management of conjunctival hyperemia. The diagnostic techniques used for grading conjunctival hyperemia have also evolved from descriptive, subjective grading scales to more reliable computer-based objective grading scales.
Conjunctival hyperemia is one of the most common causes of visits to primary care physicians, optometrists, ophthalmologists, and emergency rooms. Despite its high incidence, the treatment options for patients with conjunctival hyperemia are restricted to over-the-counter drugs that provide only symptomatic relief, owing to their short duration of action, tachyphylaxis, and rebound redness. As our understanding of the immunopathological pathways causing conjunctival hyperemia expands, newer therapeutic targets are being discovered. These insights have also contributed to the development of animal models that mimic the pathogenic changes in the microvasculature causing hyperemia. Furthermore, this progress has catalyzed the development of novel therapeutics that provide efficacious, long-term relief from conjunctival hyperemia with minimal adverse effects.
PURPOSE: To determine whether angiotensin-converting enzyme inhibitors (ACE-I) alter the incidence of non-infectious uveitis (NIU). Patients in a large healthcare claims database who initiated ACE-I (n = 695,557) were compared to patients who initiated angiotensin receptor blockers (ARB, n = 354,295). A second comparison was made between patients who initiated ACE-I (n = 505,958) and those who initiated beta-blockers (BB, n = 538,109). The primary outcome was incident NIU, defined as a first diagnosis code for NIU followed by a second instance of a NIU code within 120 days. For the secondary outcome, a corticosteroid prescription or code for an ocular corticosteroid injection within 120 days of the NIU diagnosis code was used instead of the second NIU diagnosis code. Data were analyzed using Cox regression modeling with inverse probability of treatment weighting (IPTW). Sub-analyses were performed by anatomic subtype. When comparing ACE-I to ARB initiators, the hazard ratio (HR) for incident NIU was not significantly different for the primary outcome [HR = 0.95, 95% Confidence Interval (CI): 0.85-1.07, P = .41] or secondary outcome [HR = 0.96, 95% CI: 0.86-1.07, P = .44]. Similarly, in the ACE-I and BB comparison, the HR for incident NIU was not significantly different for either outcome definition or any of the NIU anatomical subtypes. Our results suggest there is no evidence that ACE-I have a protective effect on NIU.
PURPOSE: The purpose of this study is to determine if statin therapy decreases the incidence of non-infectious uveitis (NIU) using a retrospective cohort study. METHODS: Patients enrolled in a national insurance plan who initiated statin (n = 711,734, statin cohort) or other lipid-lowering therapy (n = 148,044, non-statin cohort) were observed for NIU development. Incident NIU in the primary analysis was defined as a new diagnosis code for NIU followed by a second instance of a NIU code within 120 days. For the secondary outcome definition, a corticosteroid prescription or code for an ocular corticosteroid injection within 120 days of the NIU diagnosis code was used instead of the second NIU diagnosis code. Estimation of NIU incidence used multivariable Cox proportional hazards regression. The proportional hazards assumption was satisfied by creating two time periods of analysis, ≤ 150 and > 150 days. Subanalyses were performed by anatomic subtype. RESULTS: Overall, the primary outcome occurred 541 times over 690,465 person-years in the statin cohort and 103 times over 104,301 person-years in the non-statin cohort. No associations were seen in the ≤ 150-day analyses (p > 0.20 for all comparisons). However, after 150 days, the statin cohort was less likely to develop any uveitis [hazard ratio (HR) = 0.70, 95% confidence interval (CI): 0.51-0.97, P = 0.03] in the primary outcome analysis, but did not meet significance for the secondary outcome (HR = 0.85, 95% CI: 0.63-1.15, P = 0.30). Similarly, in the anatomic subtype analysis, after 150 days, the statin cohort was less likely to develop anterior uveitis (HR = 0.67, 95% CI: 0.47-0.97, P = 0.03) in the primary analysis, but the association did not reach significance for the secondary outcome (HR = 0.82, 95% CI: 0.56-1.20, P = 0.31). CONCLUSION: Our results suggest that statin therapy for > 150 days decreases the incidence of NIU.
PURPOSE: To determine if metformin is associated with noninfectious uveitis (NIU). METHODS: Patients in an insurance claims database who initiated metformin (n = 359,139) or other oral anti-diabetic medications (n = 162,847) were followed for NIU development. Both cohort and case-control analyses were performed to assess differing exposure lengths using Cox and conditional logistic regression, respectively. RESULTS: The hazard ratio (HR) for incident NIU was not significantly different between the metformin and non-metformin cohorts [HR = 1.19, 95% Confidence Interval (CI): 0.92-1.54, P = .19]. The case-control analysis similarly showed no association between any metformin use 2 years before the outcome date and NIU [odds ratio (OR) = 0.64, 95% CI: 0.39-1.04, P = .07]. However, there was a protective association between cumulative metformin duration [(445-729 days) adjusted OR (aOR) = 0.49, 95% CI: 0.27-0.90, P = .02] and dosage [>390,000 mg: aOR = 0.44, 95% CI: 0.25-0.78, P = .001] compared with no metformin use. CONCLUSIONS: Our results suggest metformin use for longer durations may be protective against NIU onset.
PURPOSE: To identify functionally related genes associated with diabetic retinopathy (DR) risk using gene set enrichment analyses (GSEA) applied to genome-wide association study (GWAS) meta-analyses. METHODS: We analyzed DR GWAS meta-analyses performed on 3,246 Europeans and 2,611 African Americans with type 2 diabetes. Gene sets relevant to five key DR pathophysiology processes were investigated: tissue injury, vascular events, metabolic events and glial dysregulation, neuronal dysfunction, and inflammation. Keywords relevant to these processes were queried in four pathway and ontology databases. Two GSEA methods, Meta-Analysis Gene set Enrichment of variaNT Associations (MAGENTA) and Multi-marker Analysis of GenoMic Annotation (MAGMA), were used. Gene sets were defined as enriched for gene associations with DR if the P value corrected for multiple testing (Pcorr) was <.05. RESULTS: Five gene sets were significantly enriched for multiple modest genetic associations with DR in one method (MAGENTA or MAGMA) and were also at least nominally significant (uncorrected P <.05) in the other method. These pathways were regulation of the lipid catabolic process (2-fold enrichment, Pcorr=.014); nitric oxide biosynthesis (1.92-fold enrichment, Pcorr=.022); lipid digestion, mobilization, and transport (1.6-fold enrichment, Pcorr=.032); apoptosis (1.53-fold enrichment, Pcorr=.041); and retinal ganglion cell degeneration (2-fold enrichment, Pcorr=.049). The interferon gamma (IFNG) gene, previously implicated in DR by protein-protein interactions in our GWAS, was among the top-ranked genes in the nitric oxide pathway (best variant P=.0001). CONCLUSIONS: These GSEA indicate that gene sets involved in oxidative stress, lipid transport and catabolism, and cell degeneration are enriched for genes associated with DR risk.
TOPIC: Glaucoma is the leading cause of irreversible blindness despite having a good prognosis with early treatment. We evaluated the global extent of undetected glaucoma and the factors associated with it in this systematic review and meta-analysis of population-based epidemiological studies. CLINICAL RELEVANCE: Undetected glaucoma increases the risk of vision impairment, which leads to detrimental effects on the quality of life and socio-economic well-being of those affected. Detailed information on the extent of, and factors associated with, undetected glaucoma aids the development of public health interventions. METHODS: We conducted a systematic review and meta-analysis of population-based studies published between January 1, 1990, and June 1, 2020. The article search was conducted in online databases (PubMed, Web of Science), grey literature (OpenGrey), and non-governmental organization (NGO) reports. Our outcome measure was the proportion of glaucoma cases that were previously undetected. Manifest glaucoma included any form of glaucoma reported in the respective study and could include primary open-angle glaucoma (POAG), primary angle-closure glaucoma (PACG), and/or secondary glaucoma. Undetected glaucoma was defined as glaucoma cases that were undetected prior to diagnosis in the respective study. Random-effects meta-analysis was used to estimate the pooled proportion of, and factors associated with, undetected glaucoma. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and Meta-analysis of Observational Studies in Epidemiology (MOOSE) guidelines. RESULTS: We identified 61 articles from 55 population-based studies (N = 189,359 participants; N = 6,949 manifest glaucoma; N = 5,558 undetected glaucoma). More than half of all glaucoma cases were previously undetected in each geographical region.
Regionally, Africa (OR 12.70, 95% CI 4.91-32.86) and Asia (OR 3.41, 95% CI 1.63-7.16) had higher odds of undetected glaucoma compared to Europe. Countries with a low human development index (HDI, <0.55) had a higher proportion of undetected manifest glaucoma than countries of medium to very high HDI (≥0.55, all P <0.001). In 2020, 43.78 million POAG cases were undetected, of which 76.7% resided in Africa and Asia. CONCLUSION: Undetected glaucoma is highly prevalent across diverse communities worldwide and is more common in Africa and Asia. Strategies to improve detection are needed to prevent excess visual disability and blindness due to glaucoma.
Solaguren-Beascoa M, Bujakowska KM, Méjécase C, Emmenegger L, Orhan E, Neuillé M, Mohand-Saïd S, Condroyer C, Lancelot M-E, Michiels C, Demontant V, Antonio A, Letexier M, Saraiva J-P, Lonjou C, Carpentier W, Léveillard T, Pierce EA, Dollfus H, Sahel J-A, Bhattacharya SS, Audo I, Zeitz C. WDR34, a candidate gene for non-syndromic rod-cone dystrophy. Clin Genet 2021;99(2):298-302.
Rod-cone dystrophy (RCD), also called retinitis pigmentosa, is characterized by rod followed by cone photoreceptor degeneration, leading to gradual visual loss. Mutations in over 65 genes have been associated with non-syndromic RCD explaining 60% to 70% of cases, with novel gene defects possibly accounting for the unsolved cases. Homozygosity mapping and whole-exome sequencing applied to a case of autosomal recessive non-syndromic RCD from a consanguineous union identified a homozygous variant in WDR34. Mutations in WDR34 have been previously associated with severe ciliopathy syndromes possibly associated with a retinal dystrophy. This is the first report of a homozygous mutation in WDR34 associated with non-syndromic RCD.
OBJECTIVE: Severe preeclampsia complicates roughly 1% of all pregnancies. One defining feature of severe preeclampsia is new-onset visual disturbance. The accessibility of the choroid to high-resolution, noninvasive imaging makes it a reasonable target of investigation for disease prediction, stratification, or monitoring in preeclampsia. This study aimed to compare subfoveal choroidal thickness between women with severe preeclampsia and those with normotensive pregnancies, and to investigate associations between such findings and other indicators of disease severity, including gestational age and serum angiogenic factors. STUDY DESIGN: We designed a case-control study comprising 36 women diagnosed with severe preeclampsia (cases) matched to 37 normotensive women (controls) by race/ethnicity and parity, all evaluated in the postpartum period. All patients underwent enhanced depth imaging spectral-domain optical coherence tomography and serum analysis. RESULTS: Cases showed no difference in subfoveal choroidal thickness compared with controls (P = 0.65). Amongst cases, subfoveal choroidal thickness and gestational age at delivery were inversely related (r = 0.86, P < .001). There was a positive association of placental growth factor with subfoveal choroidal thickness amongst cases (r = 0.54, P = 0.002). CONCLUSION: This study suggests a relationship between the degree of disease severity and the magnitude of choroidal thickening. We also show an association between this index and placental growth factor level in the postpartum period.
Autism spectrum disorder (ASD) is characterized by social deficits and atypical facial processing of emotional expressions. The underlying neuropathology of these abnormalities is still unclear. Recent studies implicate the cerebellum in emotional processing; other studies show cerebellar abnormalities in ASD. Here, we elucidate the spatiotemporal activation of cerebellar lobules during emotional processing of happy and angry faces in adolescents with ASD and typically developing (TD) controls. Using magnetoencephalography, we calculated dynamic statistical parametric maps across a period of 500 ms after emotional stimulus onset and determined differences between group activity to happy and angry emotions. Following happy face presentation, adolescents with ASD exhibited only left-hemispheric cerebellar activation in a cluster extending from lobule VI to lobule V (compared to TD controls). Following angry face presentation, adolescents with ASD exhibited only midline cerebellar activation (posterior IX vermis). Our findings indicate an early (125-175 ms) overactivation of cerebellar activity only for happy faces and a later overactivation for both happy (250-450 ms) and angry (250-350 ms) faces in adolescents with ASD. The prioritized hemispheric activity (happy faces) could reflect the promotion of more flexible and adaptive social behavior, while the later midline activity (angry faces) may guide conforming behavior.
Graduate medical education (GME) in ophthalmology has faced and overcome many challenges over the past years, and 2020 was a game-changing year. Although the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic disrupted medical education globally, ophthalmic educators rapidly transformed their curricula to novel and effective virtual learning formats. Thus, while the COVID-19 outbreak has been one of the most significant challenges faced in the history of medical education, it has also provided an impetus to develop innovative teaching practices, bringing with it unprecedented success in allowing medical students to continue their education in ophthalmology despite these challenges. We review and appraise novel educational interventions implemented by various institutions in response to the COVID-19 pandemic, highlighting their effectiveness and challenges and proposing future directions beyond the pandemic. Many of these innovations will persist even after the end of the pandemic because they have proven that face-to-face learning is not required for all aspects of the ophthalmic GME curriculum. As ophthalmic educators harness the power of educational technology, it is critical that their novel educational initiatives are incorporated into competency-based curricula with assessments mapped to the competencies. Future research should focus on evaluating the impact of this transformation to virtual learning environments on student performance, as well as implementing longitudinal assessment strategies for clinical competence in workplace-based practice.
PURPOSE: To compare the variability of, and the ability to detect visual field progression with, the 24-2, the central 12 locations of the 24-2 (Central 24-2), and the 10-2 visual field (VF) tests in eyes with abnormal VFs. DESIGN: Retrospective, multisite cohort. PARTICIPANTS: A total of 52,806 24-2 and 11,966 10-2 VF tests from 7,307 eyes from the Glaucoma Research Network database were analyzed. Only eyes with ≥ 5 visits and ≥ 2 years of follow-up were included. METHODS: Linear regression models were used to calculate the rates of mean deviation (MD) change (slopes), while their residuals were used to assess variability across the entire MD range. Computer simulations (n = 10,000) based on the real MD residuals of our sample were performed to estimate power to detect significant progression (P < 5%) at various rates of MD change. MAIN OUTCOME MEASURES: Time required to detect progression. RESULTS: For all 3 patterns, MD variability was highest within the -5 to -20 dB range and consistently lower with the 10-2 than with the 24-2 or Central 24-2. Overall, time to detect confirmed significant progression at 80% power was lowest with the 10-2 VF, with a decrease of 14.6% to 18.5% compared to the 24-2 and a decrease of 22.9% to 26.5% compared to the Central 24-2. CONCLUSION: Time to detect central VF progression was reduced with 10-2 MD compared with 24-2 and Central 24-2 MD in glaucoma eyes in this large dataset, in part because 10-2 tests had lower variability. These findings contribute to current evidence of the potential value of 10-2 testing in the clinical management of glaucoma patients and in clinical trial design.
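The simulation approach described in the methods can be illustrated with a minimal sketch: simulate MD series declining at a true rate with residual noise, fit a least-squares slope to each series, and count the fraction flagged as significantly negative. All parameters below (twice-yearly testing, Gaussian residuals, a normal-approximation z cutoff) are simplifying assumptions for illustration, not the study's actual procedure.

```python
# Sketch of simulation-based power estimation for detecting a negative MD
# slope. Parameters are ILLUSTRATIVE assumptions, not the study's values.
import math
import random

def fit_slope(t, y):
    """Ordinary least-squares slope and its standard error."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sxx
    intercept = ybar - slope * tbar
    sse = sum((yi - (intercept + slope * ti)) ** 2 for ti, yi in zip(t, y))
    return slope, math.sqrt(sse / (n - 2) / sxx)

def power_to_detect(true_slope, residual_sd, years, tests_per_year=2,
                    n_sim=2000, z_crit=1.96, seed=1):
    """Fraction of simulated eyes with a significantly negative MD slope."""
    rng = random.Random(seed)
    t = [k / tests_per_year for k in range(int(years * tests_per_year) + 1)]
    detected = 0
    for _ in range(n_sim):
        # true linear decline plus Gaussian residual variability
        y = [true_slope * ti + rng.gauss(0.0, residual_sd) for ti in t]
        slope, se = fit_slope(t, y)
        if slope < 0 and slope / se < -z_crit:
            detected += 1
    return detected / n_sim
```

The study's finding follows this logic: tests with smaller residual variability (here, `residual_sd`) yield larger slope-to-error ratios, so significant progression is confirmed in fewer years of follow-up.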
Eye and head movements are used to scan the environment when driving. In particular, when approaching an intersection, large gaze scans to the left and right, comprising head and multiple eye movements, are made. We detail an algorithm called the gaze scan algorithm that automatically quantifies the magnitude, duration, and composition of such large lateral gaze scans. The algorithm works by first detecting lateral saccades, then merging these lateral saccades into gaze scans, with the start and end points of each gaze scan marked in time and eccentricity. We evaluated the algorithm by comparing gaze scans generated by the algorithm to manually marked "consensus ground truth" gaze scans taken from gaze data collected in a high-fidelity driving simulator. We found that the gaze scan algorithm successfully marked 96% of gaze scans and produced magnitudes and durations close to ground truth. Furthermore, the differences between the algorithm and ground truth were similar to the differences found between expert coders. Therefore, the algorithm may be used in lieu of manual marking of gaze data, significantly accelerating the time-consuming marking of gaze movement data in driving simulator studies. The algorithm also complements existing eye tracking and mobility research by quantifying the number, direction, magnitude, and timing of gaze scans and can be used to better understand how individuals scan their environment.
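The two-stage idea described above (detect lateral saccades, then merge them into larger gaze scans with marked start/end points) can be sketched as follows. The velocity threshold, merge gap, and function names are illustrative assumptions, not the published algorithm's parameters.

```python
# Minimal sketch of a gaze scan algorithm: detect lateral saccades from a
# gaze trace, then merge same-direction saccades separated by short
# fixations into single large gaze scans. Thresholds are ILLUSTRATIVE.

def detect_saccades(times, gaze_deg, velocity_thresh=30.0):
    """Return (start_idx, end_idx) pairs where gaze speed (deg/s) exceeds
    the threshold, marking individual lateral saccades."""
    saccades, start = [], None
    for i in range(1, len(times)):
        speed = abs(gaze_deg[i] - gaze_deg[i - 1]) / (times[i] - times[i - 1])
        if speed >= velocity_thresh and start is None:
            start = i - 1                      # saccade onset
        elif speed < velocity_thresh and start is not None:
            saccades.append((start, i - 1))    # saccade offset
            start = None
    if start is not None:
        saccades.append((start, len(times) - 1))
    return saccades

def merge_into_scans(times, gaze_deg, saccades, max_gap_s=0.4):
    """Merge consecutive same-direction saccades separated by gaps shorter
    than max_gap_s into gaze scans; return (t_start, t_end, magnitude)."""
    scans = []
    for s, e in saccades:
        direction = 1 if gaze_deg[e] >= gaze_deg[s] else -1
        if scans:
            ps, pe, pdir = scans[-1]
            if direction == pdir and times[s] - times[pe] <= max_gap_s:
                scans[-1] = (ps, e, pdir)      # extend the current scan
                continue
        scans.append((s, e, direction))
    return [(times[s], times[e], gaze_deg[e] - gaze_deg[s])
            for s, e, _ in scans]
```

For example, two 20-degree rightward saccades separated by a 0.2 s fixation would be merged into one 40-degree gaze scan, matching the paper's description of large intersection scans composed of multiple eye movements.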
Purpose: One rehabilitation strategy taught to individuals with hemianopic field loss (HFL) is to make a large blind-side scan to quickly identify hazards. However, it is not clear what the minimum threshold is for how large the scan should be. Using driving simulation, we evaluated thresholds (criteria) for gaze and head scan magnitudes that best predict detection safety. Methods: Seventeen participants with complete HFL and 15 with normal vision (NV) drove through 4 routes in a virtual city while their eyes and head were tracked. Participants pressed the horn as soon as they detected a motorcycle (10 per drive) that appeared at 54 degrees eccentricity on cross-streets and approached the driver. Results: Those with HFL detected fewer motorcycles than those with NV and had worse detection on the blind side than the seeing side. On the blind side, both safe detections and early detections (detections before the hazard entered the intersection) could be predicted with both gaze (safe 18.5 degrees and early 33.8 degrees) and head (safe 19.3 degrees and early 27 degrees) scans. However, on the seeing side, only early detections could be classified with gaze (25.3 degrees) and head (9.0 degrees). Conclusions: Both head and gaze scan magnitude were significant predictors of detection on the blind side, but less predictive on the seeing side, which was likely driven by the ability to use peripheral vision. Interestingly, head scans were as predictive as gaze scans. Translational Relevance: The minimum scan magnitude could be a useful criterion for scanning training or for developing assistive technologies to improve scanning.
SIGNIFICANCE: Think Tank 2019 affirmed that the rate of infection associated with contact lenses has not changed in several decades. Also, there is a trend toward more serious infections associated with Acanthamoeba and fungi. The growing use of contact lenses in children demands our attention with surveillance and case-control studies. PURPOSE: The American Academy of Optometry (AAO) gathered researchers and key opinion leaders from around the world to discuss contact lens-associated microbial keratitis at the 2019 AAO Annual Meeting. METHODS: Experts presented within four sessions. Session 1 covered the epidemiology of microbial keratitis, pathogenesis of Pseudomonas aeruginosa, and the role of lens care systems and storage cases in corneal disease. Session 2 covered nonbacterial forms of keratitis in contact lens wearers. Session 3 covered future needs, challenges, and research questions in relation to microbial keratitis in youth and myopia control, microbiome, antimicrobial surfaces, and genetic susceptibility. Session 4 covered compliance and communication imperatives. RESULTS: The absolute rate of microbial keratitis has remained very consistent for three decades despite new technologies, and extended wear significantly increases the risk. Improved oxygen delivery afforded by silicone hydrogel lenses has not impacted the rates, and although the introduction of daily disposable lenses has minimized the risk of severe disease, there is no consistent evidence that they have altered the overall rate of microbial keratitis. Overnight orthokeratology lenses may increase the risk of microbial keratitis, especially secondary to Acanthamoeba, in children. Compliance remains a concern and a significant risk factor for disease. New insights into host microbiome and genetic susceptibility may uncover new theories. More studies such as case-control designs suited for rare diseases and registries are needed. 
CONCLUSIONS: The first annual AAO Think Tank acknowledged that the risk of microbial keratitis has not decreased over decades, despite innovation. Important questions and research directions remain.
Diagnosis and treatment planning in ophthalmology heavily depend on clinical examination and advanced imaging modalities, which can be time-consuming and carry the risk of human error. Artificial intelligence (AI) and deep learning (DL) are being used in different fields of ophthalmology, in particular for running diagnostics and predicting outcomes of anterior segment surgeries. This review evaluates recent developments in AI for the diagnosis, surgical intervention, and prognosis of corneal diseases. It also provides a brief overview of newer AI-dependent modalities in corneal disease.