Application of immunosignatures for diagnosis of valley fever

Krupa Arun Navalkar et al. Clin Vaccine Immunol. 2014 Aug;21(8):1169-77. doi: 10.1128/CVI.00228-14. Epub 2014 Jun 25.

Abstract

Valley fever (VF) is difficult to diagnose, partly because the symptoms of VF are confounded with those of other community-acquired pneumonias. Confirmatory diagnostics detect IgM and IgG antibodies against coccidioidal antigens via immunodiffusion (ID). The false-negative rate can be as high as 50% to 70%, with 5% of symptomatic patients never showing detectable antibody levels. In this study, we tested whether the immunosignature diagnostic can resolve VF false negatives. An immunosignature is the pattern of antibody binding to random-sequence peptides on a peptide microarray. A 10,000-peptide microarray was first used to determine whether valley fever patients can be distinguished from 3 other cohorts with similar infections. After determining the VF-specific peptides, a small 96-peptide diagnostic array was created and tested. The performances of the 10,000-peptide array and the 96-peptide diagnostic array were compared to that of the ID diagnostic standard. The 10,000-peptide microarray classified the VF samples from the other 3 infections with 98% accuracy. It also classified VF false-negative patients with 100% sensitivity in a blinded test set versus 28% sensitivity for ID. The immunosignature microarray has potential for simultaneously distinguishing valley fever patients from those with other fungal or bacterial infections. The same 10,000-peptide array can diagnose VF false-negative patients with 100% sensitivity. The smaller 96-peptide diagnostic array was less specific for diagnosing false negatives. We conclude that the performance of the immunosignature diagnostic exceeds that of the existing standard, and the immunosignature can distinguish related infections and might be used in lieu of existing diagnostics.
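The abstract describes selecting VF-specific peptides from the 10,000-peptide array and then classifying cohorts with the reduced 96-peptide set. The paper's actual feature selection and classifier are not reproduced here; the sketch below only illustrates that general workflow on simulated data (the cohort sizes, intensity model, variance ranking, and nearest-centroid rule are all assumptions).

```python
# Minimal sketch (not the authors' pipeline): classify simulated
# immunosignatures. Rows are patients, columns are peptide intensities;
# informative peptides are picked on training data and a nearest-centroid
# rule assigns each test sample to the closest cohort mean.
import numpy as np

rng = np.random.default_rng(0)

n_peptides = 10_000          # mimics the 10,000-peptide array
cohorts = ["VF", "Aspergillus", "Mycoplasma", "Chlamydia"]
n_per_cohort = 20

# Simulated log2 intensities: each cohort gets its own small set of
# "reactive" peptides with elevated binding.
X, y = [], []
for c_idx, cohort in enumerate(cohorts):
    base = rng.normal(8.0, 1.0, size=(n_per_cohort, n_peptides))
    reactive = slice(c_idx * 100, c_idx * 100 + 100)   # hypothetical signature
    base[:, reactive] += 2.0
    X.append(base)
    y += [cohort] * n_per_cohort
X = np.vstack(X)
y = np.array(y)

# Split into training and blinded test halves.
idx = rng.permutation(len(y))
train, test = idx[::2], idx[1::2]

# Rank peptides by between-cohort variance of the training means and keep
# the top 96, loosely analogous to the reduced 96-peptide diagnostic array.
means = np.array([X[train][y[train] == c].mean(axis=0) for c in cohorts])
informative = np.argsort(means.var(axis=0))[::-1][:96]

# Nearest-centroid classification on the selected peptides.
centroids = means[:, informative]
pred = [cohorts[np.argmin(((X[i, informative] - centroids) ** 2).sum(axis=1))]
        for i in test]
accuracy = np.mean(np.array(pred) == y[test])
print(f"blinded-test accuracy on simulated data: {accuracy:.2f}")
```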


Figures

FIG 1
Hierarchical clustering of informative peptides across five cohorts. Peptides (y axis) are colored by intensity, with blue corresponding to low intensity and red to high intensity. Patients (x axis) are grouped by their peptide binding values, with the Aspergillus (black), Mycoplasma (red), Chlamydia (green), normal (blue), and valley fever (brown) samples clustering by cohort, as computed in GeneSpring 7.3.1 (Agilent, Santa Clara, CA). The informative peptides were selected by Fisher's exact test.
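The legend notes that the informative peptides were chosen by Fisher's exact test. One common way to apply that test to array data, sketched below with scipy on simulated values, is to dichotomize each peptide's intensity and test the resulting 2x2 table of high/low binding versus in-cohort/out-of-cohort membership; the median threshold, one-vs-rest contrast, and selection cutoff here are assumptions, not the authors' settings.

```python
# Hedged sketch: one-vs-rest Fisher's exact test per peptide on
# dichotomized intensities (assumed design; thresholds are illustrative).
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(1)
n_patients, n_peptides = 60, 200
intensities = rng.normal(8.0, 1.0, size=(n_patients, n_peptides))
is_vf = np.arange(n_patients) < 20            # first 20 patients are "VF"
intensities[is_vf, :10] += 2.0                # 10 hypothetical VF-reactive peptides

high = intensities > np.median(intensities, axis=0)   # dichotomize per peptide

p_values = []
for j in range(n_peptides):
    a = np.sum(high[:, j] & is_vf)        # high binder, VF
    b = np.sum(high[:, j] & ~is_vf)       # high binder, other cohorts
    c = np.sum(~high[:, j] & is_vf)       # low binder, VF
    d = np.sum(~high[:, j] & ~is_vf)      # low binder, other cohorts
    _, p = fisher_exact([[a, b], [c, d]])
    p_values.append(p)

selected = np.argsort(p_values)[:20]      # keep the most significant peptides
print("top peptides by Fisher's exact test:", sorted(selected.tolist()))
```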
FIG 2
Hierarchical clustering of the valley fever immunosignature. A total of 1,586 peptides identified by a 1-way ANOVA between VF-infected and uninfected individuals are plotted on the y axis. The coloring is based on the signal intensities obtained from relative binding on the 10,000-peptide array, with blue representing low relative intensity and red representing high intensity. Each column represents the immunosignature of one individual: VF patients (red) versus uninfected individuals, including pre- and post-influenza-vaccination sera (blue).
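The 1,586 peptides in this figure came from a one-way ANOVA between VF-infected and uninfected samples. A minimal per-peptide version of that comparison, assuming simulated log intensities and an illustrative p-value cutoff, could look like this with scipy.stats.f_oneway:

```python
# Minimal sketch: per-peptide one-way ANOVA (VF vs. uninfected) on
# simulated intensities; group sizes and the 0.01 cutoff are assumptions.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
n_vf, n_ctrl, n_peptides = 45, 40, 10_000
vf = rng.normal(8.0, 1.0, size=(n_vf, n_peptides))
ctrl = rng.normal(8.0, 1.0, size=(n_ctrl, n_peptides))
vf[:, :1500] += 0.8                       # hypothetical VF-associated peptides

f_stat, p = f_oneway(vf, ctrl, axis=0)    # computed independently per peptide
significant = np.where(p < 0.01)[0]
print(f"{significant.size} peptides pass the illustrative p < 0.01 cutoff")
```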
FIG 3
Signal intensity (y axis) for the 96 peptides from the 10,000-peptide microarray that distinguish VF patients from influenza vaccine recipients. The x axis shows the signal averaged across patients at each CF titer; on the far right are the averaged signals for the influenza vaccine recipients and normal donors. These data originated from the full 10,000-peptide array. The 48 peptides that captured high antibody binding in VF patients and low signals in normal/influenza vaccine recipients are colored red; the 48 peptides showing higher signals in normal/influenza vaccine recipients and low signals in VF patients are colored blue. The signals were consistent across the valley fever patients and reversed for the non-VF patients. Inf_Pre, influenza vaccine recipients before vaccination; Inf_21, recipients 21 days postvaccination.
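Here the 96 peptides split into 48 with higher binding in VF patients and 48 with higher binding in the normal/influenza group. Assuming the split is made on the sign and size of the mean log-intensity difference (the paper's exact ranking is not shown here), a short sketch:

```python
# Hedged sketch: splitting predictor peptides into "up in VF" and
# "up in controls" halves by the sign of the mean difference.
import numpy as np

rng = np.random.default_rng(3)
n_peptides = 10_000
vf_mean = rng.normal(8.0, 1.0, n_peptides)       # simulated per-peptide means
ctrl_mean = rng.normal(8.0, 1.0, n_peptides)

diff = vf_mean - ctrl_mean                       # positive means higher in VF
order = np.argsort(diff)
vf_up = order[-48:]                              # 48 peptides highest in VF
ctrl_up = order[:48]                             # 48 peptides highest in controls
print("largest VF-minus-control difference:", round(float(diff[vf_up[-1]]), 2))
```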
FIG 4
Heat map showing normalized average signals from the 96 predictor peptides, as in Fig. 2, but displaying the cohort separation. The data were averaged per CF titer for the 45 VF patients (red bars) and for the 34 healthy controls (yellow bar), 7 pre-2006 influenza vaccine recipients (cyan bar) (flu pre), and 21-day postvaccination recipients (dark blue bar) (flu post) (x axis). A t test identified the 96 peptides (y axis) as highly significant for distinguishing VF patients from healthy donors (ND).
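The 96 predictor peptides in this heat map were identified by a t test between VF patients and healthy donors. Assuming Welch's two-sample t test per peptide and top-96 selection by p value (illustrative choices, not the authors' exact criteria), a sketch:

```python
# Hedged sketch: per-peptide two-sample t test (VF vs. healthy donors)
# and selection of the 96 smallest p values; Welch's variant and the
# top-96 cutoff are assumptions used for illustration.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
vf = rng.normal(8.0, 1.0, size=(45, 10_000))     # 45 VF patients
nd = rng.normal(8.0, 1.0, size=(34, 10_000))     # 34 healthy donors (ND)
vf[:, :200] += 1.0                               # hypothetical reactive peptides

t_stat, p = ttest_ind(vf, nd, axis=0, equal_var=False)
predictors = np.argsort(p)[:96]                  # 96 most significant peptides

# Cohort-averaged rows like those displayed in the heat map.
heatmap = np.vstack([vf[:, predictors].mean(axis=0),
                     nd[:, predictors].mean(axis=0)])
print("heat map block shape (cohorts x peptides):", heatmap.shape)
```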
FIG 5
Limits of detection graphed from a post hoc power calculation. The black curve in each panel represents ±delta, the minimum detectable fold change calculated from the statistical precision of each peptide independently. The probes along the x axis are sorted by the calculated power, forming a smooth curve. Delta was calculated using α = 1/(number of peptides per microarray), β = 0.20, and n = number of patients per group. The vertical bars (y axis) represent the log2 ratio between the healthy and VF-infected patients, with red bars indicating peptides selected to predict VF and blue bars indicating peptides selected for detection of non-VF conditions. The red circles on top of certain bars mark statistically significant fold changes at a P value of <0.01. The peptides used were the 10,440 random peptides (training data set) using VF and healthy controls (A), the 96 VF predictor peptides (training data set) within the 10,000-peptide microarray (B), the 96 resynthesized VF predictor peptides (training data set) for the VF diagnostic assay (C), and the 96 resynthesized VF predictor peptides (test data set) for the VF diagnostic assay (D).
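The legend defines delta as the minimum detectable fold change from a post hoc power calculation with α = 1/(peptides per array), β = 0.20, and n patients per group. A standard normal-approximation version of that calculation for a two-sample comparison on the log2 scale is sketched below; the pooled-SD estimate, the two-sided treatment of α, and the simulated variances are assumptions, since the authors' exact formula is not reproduced on this page.

```python
# Hedged sketch: minimum detectable log2 fold change (delta) per peptide
# from a two-sample normal approximation,
#   delta = (z_{1-alpha/2} + z_{1-beta}) * sigma * sqrt(2 / n).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_peptides, n_per_group = 10_440, 45
alpha = 1.0 / n_peptides          # caption: alpha = 1 / peptides per microarray
beta = 0.20                       # caption: beta = 0.20 (80% power)

# Simulated per-peptide standard deviations of log2 intensities.
sigma = rng.uniform(0.3, 1.2, n_peptides)

z_alpha = norm.ppf(1 - alpha / 2)           # two-sided critical value
z_beta = norm.ppf(1 - beta)
delta = (z_alpha + z_beta) * sigma * np.sqrt(2.0 / n_per_group)

# Sorting by delta gives the smooth "limit of detection" curve in the figure.
delta_sorted = np.sort(delta)
print(f"median minimum detectable log2 fold change: {np.median(delta):.2f}")
```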
