Built on Laura's code to port the strand bias filter from M1 and refactored code around SomaticGenotypingEngine. Added a new integration test.

This commit is contained in:
Takuto Sato 2016-01-13 16:06:57 +09:00
parent 4066bcd75c
commit 63e0865491
19 changed files with 935 additions and 509 deletions

View File

@ -241,7 +241,7 @@ public abstract class StrandBiasTest extends InfoFieldAnnotation implements Acti
/**
* Allocate and fill a 2x2 strand contingency table. In the end, it'll look something like this:
* fw rc
* fw rv
* allele1 # #
* allele2 # #
* @return a 2x2 contingency table

View File

@ -153,6 +153,7 @@ public class StrandOddsRatio extends StrandBiasTest implements StandardAnnotatio
double ratio = 0;
ratio += (augmentedTable[0][0] / augmentedTable[0][1]) * (augmentedTable[1][1] / augmentedTable[1][0]);
// TODO: repeated computation: how about ratio += 1/ratio, or ratio = ratio + 1/ratio to be explicit
ratio += (augmentedTable[0][1] / augmentedTable[0][0]) * (augmentedTable[1][0] / augmentedTable[1][1]);
final double refRatio = (Math.min(augmentedTable[0][0], augmentedTable[0][1])/Math.max(augmentedTable[0][0], augmentedTable[0][1]));
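The symmetric term the TODO above points at is just the reciprocal of the first ratio, so the sum can be written as `r + 1/r`. A minimal standalone sketch (class name and counts are illustrative; the table is assumed to already carry pseudocounts, as the `augmentedTable` name suggests):

```java
public class SorSketch {
    // Symmetric odds ratio over an augmented (pseudocount-added) 2x2 table,
    // written as r + 1/r per the TODO instead of recomputing the reciprocal term.
    static double symmetricRatio(double[][] t) {
        double r = (t[0][0] / t[0][1]) * (t[1][1] / t[1][0]);
        return r + 1.0 / r;
    }

    public static void main(String[] args) {
        // Illustrative counts with pseudocounts already added.
        double[][] augmented = { {10.0, 2.0}, {3.0, 9.0} };
        System.out.println(symmetricRatio(augmented));
    }
}
```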

View File

@ -0,0 +1,191 @@
/*
* By downloading the PROGRAM you agree to the following terms of use:
*
* BROAD INSTITUTE
* SOFTWARE LICENSE AGREEMENT
* FOR ACADEMIC NON-COMMERCIAL RESEARCH PURPOSES ONLY
*
* This Agreement is made between the Broad Institute, Inc. with a principal address at 415 Main Street, Cambridge, MA 02142 ("BROAD") and the LICENSEE and is effective at the date the downloading is completed ("EFFECTIVE DATE").
*
* WHEREAS, LICENSEE desires to license the PROGRAM, as defined hereinafter, and BROAD wishes to have this PROGRAM utilized in the public interest, subject only to the royalty-free, nonexclusive, nontransferable license rights of the United States Government pursuant to 48 CFR 52.227-14; and
* WHEREAS, LICENSEE desires to license the PROGRAM and BROAD desires to grant a license on the following terms and conditions.
* NOW, THEREFORE, in consideration of the promises and covenants made herein, the parties hereto agree as follows:
*
* 1. DEFINITIONS
* 1.1 PROGRAM shall mean copyright in the object code and source code known as GATK3 and related documentation, if any, as they exist on the EFFECTIVE DATE and can be downloaded from http://www.broadinstitute.org/gatk on the EFFECTIVE DATE.
*
* 2. LICENSE
* 2.1 Grant. Subject to the terms of this Agreement, BROAD hereby grants to LICENSEE, solely for academic non-commercial research purposes, a non-exclusive, non-transferable license to: (a) download, execute and display the PROGRAM and (b) create bug fixes and modify the PROGRAM. LICENSEE hereby automatically grants to BROAD a non-exclusive, royalty-free, irrevocable license to any LICENSEE bug fixes or modifications to the PROGRAM with unlimited rights to sublicense and/or distribute. LICENSEE agrees to provide any such modifications and bug fixes to BROAD promptly upon their creation.
* The LICENSEE may apply the PROGRAM in a pipeline to data owned by users other than the LICENSEE and provide these users the results of the PROGRAM provided LICENSEE does so for academic non-commercial purposes only. For clarification purposes, academic sponsored research is not a commercial use under the terms of this Agreement.
* 2.2 No Sublicensing or Additional Rights. LICENSEE shall not sublicense or distribute the PROGRAM, in whole or in part, without prior written permission from BROAD. LICENSEE shall ensure that all of its users agree to the terms of this Agreement. LICENSEE further agrees that it shall not put the PROGRAM on a network, server, or other similar technology that may be accessed by anyone other than the LICENSEE and its employees and users who have agreed to the terms of this agreement.
* 2.3 License Limitations. Nothing in this Agreement shall be construed to confer any rights upon LICENSEE by implication, estoppel, or otherwise to any computer software, trademark, intellectual property, or patent rights of BROAD, or of any other entity, except as expressly granted herein. LICENSEE agrees that the PROGRAM, in whole or part, shall not be used for any commercial purpose, including without limitation, as the basis of a commercial software or hardware product or to provide services. LICENSEE further agrees that the PROGRAM shall not be copied or otherwise adapted in order to circumvent the need for obtaining a license for use of the PROGRAM.
*
* 3. PHONE-HOME FEATURE
* LICENSEE expressly acknowledges that the PROGRAM contains an embedded automatic reporting system ("PHONE-HOME") which is enabled by default upon download. Unless LICENSEE requests disablement of PHONE-HOME, LICENSEE agrees that BROAD may collect limited information transmitted by PHONE-HOME regarding LICENSEE and its use of the PROGRAM. Such information shall include LICENSEE'S user identification, version number of the PROGRAM and tools being run, mode of analysis employed, and any error reports generated during run-time. Collection of such information is used by BROAD solely to monitor usage rates, fulfill reporting requirements to BROAD funding agencies, drive improvements to the PROGRAM, and facilitate adjustments to PROGRAM-related documentation.
*
* 4. OWNERSHIP OF INTELLECTUAL PROPERTY
* LICENSEE acknowledges that title to the PROGRAM shall remain with BROAD. The PROGRAM is marked with the following BROAD copyright notice and notice of attribution to contributors. LICENSEE shall retain such notice on all copies. LICENSEE agrees to include appropriate attribution if any results obtained from use of the PROGRAM are included in any publication.
* Copyright 2012-2016 Broad Institute, Inc.
* Notice of attribution: The GATK3 program was made available through the generosity of Medical and Population Genetics program at the Broad Institute, Inc.
* LICENSEE shall not use any trademark or trade name of BROAD, or any variation, adaptation, or abbreviation, of such marks or trade names, or any names of officers, faculty, students, employees, or agents of BROAD except as stated above for attribution purposes.
*
* 5. INDEMNIFICATION
* LICENSEE shall indemnify, defend, and hold harmless BROAD, and their respective officers, faculty, students, employees, associated investigators and agents, and their respective successors, heirs and assigns, (Indemnitees), against any liability, damage, loss, or expense (including reasonable attorneys fees and expenses) incurred by or imposed upon any of the Indemnitees in connection with any claims, suits, actions, demands or judgments arising out of any theory of liability (including, without limitation, actions in the form of tort, warranty, or strict liability and regardless of whether such action has any factual basis) pursuant to any right or license granted under this Agreement.
*
* 6. NO REPRESENTATIONS OR WARRANTIES
* THE PROGRAM IS DELIVERED AS IS. BROAD MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE PROGRAM OR THE COPYRIGHT, EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT OR OTHER DEFECTS, WHETHER OR NOT DISCOVERABLE. BROAD EXTENDS NO WARRANTIES OF ANY KIND AS TO PROGRAM CONFORMITY WITH WHATEVER USER MANUALS OR OTHER LITERATURE MAY BE ISSUED FROM TIME TO TIME.
* IN NO EVENT SHALL BROAD OR ITS RESPECTIVE DIRECTORS, OFFICERS, EMPLOYEES, AFFILIATED INVESTIGATORS AND AFFILIATES BE LIABLE FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES OF ANY KIND, INCLUDING, WITHOUT LIMITATION, ECONOMIC DAMAGES OR INJURY TO PROPERTY AND LOST PROFITS, REGARDLESS OF WHETHER BROAD SHALL BE ADVISED, SHALL HAVE OTHER REASON TO KNOW, OR IN FACT SHALL KNOW OF THE POSSIBILITY OF THE FOREGOING.
*
* 7. ASSIGNMENT
* This Agreement is personal to LICENSEE and any rights or obligations assigned by LICENSEE without the prior written consent of BROAD shall be null and void.
*
* 8. MISCELLANEOUS
* 8.1 Export Control. LICENSEE gives assurance that it will comply with all United States export control laws and regulations controlling the export of the PROGRAM, including, without limitation, all Export Administration Regulations of the United States Department of Commerce. Among other things, these laws and regulations prohibit, or require a license for, the export of certain types of software to specified countries.
* 8.2 Termination. LICENSEE shall have the right to terminate this Agreement for any reason upon prior written notice to BROAD. If LICENSEE breaches any provision hereunder, and fails to cure such breach within thirty (30) days, BROAD may terminate this Agreement immediately. Upon termination, LICENSEE shall provide BROAD with written assurance that the original and all copies of the PROGRAM have been destroyed, except that, upon prior written authorization from BROAD, LICENSEE may retain a copy for archive purposes.
* 8.3 Survival. The following provisions shall survive the expiration or termination of this Agreement: Articles 1, 3, 4, 5 and Sections 2.2, 2.3, 7.3, and 7.4.
* 8.4 Notice. Any notices under this Agreement shall be in writing, shall specifically refer to this Agreement, and shall be sent by hand, recognized national overnight courier, confirmed facsimile transmission, confirmed electronic mail, or registered or certified mail, postage prepaid, return receipt requested. All notices under this Agreement shall be deemed effective upon receipt.
* 8.5 Amendment and Waiver; Entire Agreement. This Agreement may be amended, supplemented, or otherwise modified only by means of a written instrument signed by all parties. Any waiver of any rights or failure to act in a specific instance shall relate only to such instance and shall not be construed as an agreement to waive any rights or fail to act in any other instance, whether or not similar. This Agreement constitutes the entire agreement among the parties with respect to its subject matter and supersedes prior agreements or understandings between the parties relating to its subject matter.
* 8.6 Binding Effect; Headings. This Agreement shall be binding upon and inure to the benefit of the parties and their respective permitted successors and assigns. All headings are for convenience only and shall not affect the meaning of any provision of this Agreement.
* 8.7 Governing Law. This Agreement shall be construed, governed, interpreted and applied in accordance with the internal laws of the Commonwealth of Massachusetts, U.S.A., without regard to conflict of laws principles.
*/
package org.broadinstitute.gatk.tools.walkers.cancer.m2;
import htsjdk.variant.variantcontext.Allele;
import htsjdk.variant.variantcontext.VariantContext;
import htsjdk.variant.vcf.VCFHeaderLineType;
import htsjdk.variant.vcf.VCFInfoHeaderLine;
import org.broadinstitute.gatk.tools.walkers.annotator.interfaces.ActiveRegionBasedAnnotation;
import org.broadinstitute.gatk.tools.walkers.annotator.interfaces.AnnotatorCompatible;
import org.broadinstitute.gatk.tools.walkers.annotator.interfaces.InfoFieldAnnotation;
import org.broadinstitute.gatk.utils.QualityUtils;
import org.broadinstitute.gatk.utils.contexts.AlignmentContext;
import org.broadinstitute.gatk.utils.contexts.ReferenceContext;
import org.broadinstitute.gatk.utils.genotyper.MostLikelyAllele;
import org.broadinstitute.gatk.utils.genotyper.PerReadAlleleLikelihoodMap;
import org.broadinstitute.gatk.utils.refdata.RefMetaDataTracker;
import org.broadinstitute.gatk.utils.sam.AlignmentUtils;
import org.broadinstitute.gatk.utils.sam.GATKSAMRecord;
import org.broadinstitute.gatk.utils.sam.ReadUtils;
import java.util.*;
/**
* Created by gauthier on 7/27/15.
*/
public class ClusteredEventsAnnotator extends InfoFieldAnnotation implements ActiveRegionBasedAnnotation {
private String tumorSampleName = null;
@Override
public List<String> getKeyNames() { return Arrays.asList("TUMOR_FWD_POS_MEDIAN","TUMOR_REV_POS_MEDIAN","TUMOR_FWD_POS_MAD","TUMOR_REV_POS_MAD"); }
@Override
public List<VCFInfoHeaderLine> getDescriptions() {
//TODO: this needs a lot of re-phrasing
return Arrays.asList(new VCFInfoHeaderLine("TUMOR_FWD_POS_MEDIAN", 1, VCFHeaderLineType.Integer, "Median offset of tumor variant position from positive read end"),
new VCFInfoHeaderLine("TUMOR_FWD_POS_MAD", 1, VCFHeaderLineType.Integer, "Median absolute deviation from the median for tumor forward read positions"),
new VCFInfoHeaderLine("TUMOR_REV_POS_MEDIAN", 1, VCFHeaderLineType.Integer, "Median offset of tumor variant position from negative read end"),
new VCFInfoHeaderLine("TUMOR_REV_POS_MAD", 1, VCFHeaderLineType.Integer, "Median absolute deviation from the median for tumor reverse read positions"));
}
@Override
public Map<String, Object> annotate(final RefMetaDataTracker tracker,
final AnnotatorCompatible walker,
final ReferenceContext ref,
final Map<String, AlignmentContext> stratifiedContexts,
final VariantContext vc,
final Map<String, PerReadAlleleLikelihoodMap> stratifiedPerReadAlleleLikelihoodMap) {
if (tumorSampleName == null){
if (walker instanceof MuTect2 ) {
tumorSampleName = ((MuTect2) walker).tumorSampleName;
} else {
// ts: log error and exit
throw new IllegalStateException("ClusteredEventsAnnotator: walker is not MuTect2");
}
}
final Map<String, Object> map = new HashMap<>();
if ( stratifiedPerReadAlleleLikelihoodMap != null ) {
final PerReadAlleleLikelihoodMap likelihoodMap = stratifiedPerReadAlleleLikelihoodMap.get(tumorSampleName);
MuTect2.logReadInfo("HAVCYADXX150109:2:2209:19034:53394", likelihoodMap.getLikelihoodReadMap().keySet(), "Present inside ClusteredEventsAnnotator:annotate");
if ( likelihoodMap != null && !likelihoodMap.isEmpty() ) {
double[] list = fillQualsFromLikelihoodMap(vc.getStart(), likelihoodMap); // [fwdMedian, revMedian, fwdMAD, revMAD]
final int FWDMEDIAN = 0, REVMEDIAN = 1, FWDMAD = 2, REVMAD = 3; // ts: make a class to contain these values
map.put("TUMOR_FWD_POS_MEDIAN", list[FWDMEDIAN]);
map.put("TUMOR_REV_POS_MEDIAN", list[REVMEDIAN]);
map.put("TUMOR_FWD_POS_MAD", list[FWDMAD]);
map.put("TUMOR_REV_POS_MAD", list[REVMAD]);
}
}
return map;
}
private double[] fillQualsFromLikelihoodMap(final int refLoc,
final PerReadAlleleLikelihoodMap likelihoodMap) {
final ArrayList<Double> tumorFwdOffset = new ArrayList<>();
final ArrayList<Double> tumorRevOffset = new ArrayList<>();
for ( final Map.Entry<GATKSAMRecord, Map<Allele,Double>> el : likelihoodMap.getLikelihoodReadMap().entrySet() ) {
final MostLikelyAllele a = PerReadAlleleLikelihoodMap.getMostLikelyAllele(el.getValue());
if ( ! a.isInformative() )
continue; // read is non-informative
final GATKSAMRecord read = el.getKey();
if ( isUsableRead(read, refLoc) ) {
if ( a.getMostLikelyAllele().isReference() )
continue;
final Double valueRight = getElementForRead(read, refLoc, ReadUtils.ClippingTail.RIGHT_TAIL);
if ( valueRight == null )
continue;
tumorFwdOffset.add(valueRight);
final Double valueLeft = getElementForRead(read, refLoc, ReadUtils.ClippingTail.LEFT_TAIL);
if ( valueLeft == null )
continue;
tumorRevOffset.add(valueLeft);
}
}
double fwdMedian = 0.0;
double revMedian = 0.0;
double fwdMAD = 0.0;
double revMAD = 0.0;
if (!tumorFwdOffset.isEmpty() && !tumorRevOffset.isEmpty()) {
fwdMedian = MuTectStats.getMedian(tumorFwdOffset);
revMedian = MuTectStats.getMedian(tumorRevOffset);
fwdMAD = MuTectStats.calculateMAD(tumorFwdOffset, fwdMedian);
revMAD = MuTectStats.calculateMAD(tumorRevOffset, revMedian);
}
return( new double[] {fwdMedian, revMedian, fwdMAD, revMAD} ); // TODO: make an object container instead of array
}
protected Double getElementForRead(final GATKSAMRecord read, final int refLoc, final ReadUtils.ClippingTail tail) {
final int offset = ReadUtils.getReadCoordinateForReferenceCoordinate(read.getSoftStart(), read.getCigar(), refLoc, tail, true);
if ( offset == ReadUtils.CLIPPING_GOAL_NOT_REACHED ) // offset is the number of bases in the read, including inserted bases, from start of read to the variant
return null;
int readPos = AlignmentUtils.calcAlignmentByteArrayOffset(read.getCigar(), offset, false, 0, 0); // readpos is the number of REF bases from start to variant. I would name it as such...
final int numAlignedBases = AlignmentUtils.getNumAlignedBasesCountingSoftClips( read );
if (readPos > numAlignedBases / 2)
readPos = numAlignedBases - (readPos + 1);
return (double)readPos;
}
/**
* Can the read be used in comparative tests between ref / alt bases?
*
* @param read the read to consider
* @param refLoc the reference location
* @return true if this read is meaningful for comparison, false otherwise
*/
protected boolean isUsableRead(final GATKSAMRecord read, final int refLoc) {
return !( read.getMappingQuality() == 0 ||
read.getMappingQuality() == QualityUtils.MAPPING_QUALITY_UNAVAILABLE );
}
}
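The `readPos` adjustment in `getElementForRead` folds a position within the aligned bases into a distance from the nearer read end. A standalone sketch of just that step (class and method names are illustrative):

```java
public class OffsetFoldSketch {
    // Fold a 0-based position within the aligned bases to the distance from
    // the nearer read end, mirroring the readPos adjustment in getElementForRead.
    static int distanceFromNearerEnd(int readPos, int numAlignedBases) {
        if (readPos > numAlignedBases / 2) {
            readPos = numAlignedBases - (readPos + 1);
        }
        return readPos;
    }

    public static void main(String[] args) {
        // With 100 aligned bases, position 97 is 2 bases from the right end.
        System.out.println(distanceFromNearerEnd(97, 100));  // 2
        System.out.println(distanceFromNearerEnd(3, 100));   // 3
    }
}
```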

View File

@ -54,7 +54,7 @@ and then run the following Queue command
java \
-Djava.io.tmpdir=$TEMPDIR \
-jar $QUEUE_JAR \
-S $GSA_UNSTABLE_HOME/private/gatk-tools-private/src/main/java/org/broadinstitute/gatk/tools/walkers/cancer/m2/run_M2_dream.scala \
-S $GSA_UNSTABLE_HOME/private/gatk-queue-extensions-internal/src/main/qscripts/org/broadinstitute/gatk/queue/qscripts/m2/run_M2_dream.scala \
--job_queue gsa -qsub -jobResReq virtual_free=5G -startFromScratch \
-sc 200 \
-normal $NORMAL_BAM \

View File

@ -135,6 +135,9 @@ public class M2ArgumentCollection extends AssemblyBasedCallerArgumentCollection
@Argument(fullName = "strand_artifact_power_threshold", required = false, doc = "power threshold for calling strand bias")
public float STRAND_ARTIFACT_POWER_THRESHOLD = 0.9f;
@Argument(fullName = "enable_strand_artifact_filter", required = false, doc = "turn on strand artifact filter")
public boolean ENABLE_STRAND_ARTIFACT_FILTER = false;
/**
* This argument is used for the M1-style read position filter
*/

View File

@ -0,0 +1,68 @@
# CRSP HapMap Sensitivity Evaluation
### Current M2 Performance
(gsa-unstable 9/1/15, commit:a08903d)

| Mixture | type | sensitivity |
|---------|-------|------------|
| 5-plex  | SNP   | 0.9691274  |
| 5-plex  | INDEL | 0.87466127 |
| 10-plex | SNP   | 0.97179496 |
| 10-plex | INDEL | 0.8888889  |
| 20-plex | SNP   | 0.9537307  |
| 20-plex | INDEL | 0.83281654 |
### Run Procedure
Run the script separately for each HapMap mixture bam:
inputDir=/dsde/working/mutect/laura/hapmapSensitivity/inputs/
Queue_Jar=<Queue jar of interest>
```
java -jar $Queue_Jar -S Qscript_HapMapPlex.scala \
-intervals $inputDir/agilent_5plex_intervalFiles.list \
-tumors $inputDir/agilent_5plex_bams.list \
-truthVCF $inputDir/agilent_5plex_truth_intervals.vcf \
-snpCounts $inputDir/agilent_5plex_truth_intervals.snpCounts.list \
-indelCounts $inputDir/agilent_5plex_truth_intervals.indelCounts.list \
-o <output.5plex.sensitivity.report> \
-qsub -jobQueue gsa -jobResReq virtual_free=5G -sc 50
```
The HapMap bams get run as tumors without normals because we're not interested in specificity here, so we don't need the normals to filter out noise.
### Inputs
Bam lists:
5- and 10-plex have 3 replicates, 20-plex has 9
Interval files:
If we're only interested in sensitivity, then we only need to run the caller around known true positive sites, which we take from the truth VCFs
This workaround repeats the truth filename for the number of bams -- in theory each could have a separate truth VCF, but they are the same titration mixture
SNP/INDEL counts:
This is the number of events in the truth VCFs so we can find the sensitivity across all samples
TODO: this could be generalized
### Outputs
Each run outputs its own SNP and INDEL sensitivity combined across all samples:
```
Sensitivity across all samples:
SNPs: 0.95156
INDELs: 0.7328859
```
Note that these are not filtered for depth as described in the CRSP documentation
### Resources
Truth file preparation for 5-plex:
Start with /cga/tcga-gsc/benchmark/data/crsp-truth/1kg_5plex_wgs_hc_calls.codingIndelSnp.db135.recode.vcf
Select out allele fraction greater than 20% using "vc.isBiallelic() ? AF >= 0.2 : vc.hasGenotypes() && vc.getCalledChrCount(vc.getAltAlleleWithHighestAlleleCount())/(1.0*vc.getCalledChrCount()) >= 0.2"
Similar for 10-plex source:
/cga/tcga-gsc/benchmark/data/crsp-truth/1kg_10plex_wgs_hc_calls.codingIndelSnp.db135.recode.vcf
And 20-plex source:
/cga/tcga-gsc/benchmark/data/crsp-truth/1kg_20plex_wgs_hc_calls.codingIndelSnp.db135.recode.vcf
both also using AF filter of 0.2

View File

@ -399,6 +399,13 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.EVENT_DISTANCE_MIN_KEY));
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.EVENT_DISTANCE_MAX_KEY));
if (MTAC.ENABLE_STRAND_ARTIFACT_FILTER){
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.TLOD_FWD_KEY));
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.TLOD_REV_KEY));
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.TUMOR_SB_POWER_FWD_KEY));
headerInfo.add(GATKVCFHeaderLines.getInfoLine(GATKVCFConstants.TUMOR_SB_POWER_REV_KEY));
}
headerInfo.add(GATKVCFHeaderLines.getFormatLine(GATKVCFConstants.ALLELE_FRACTION_KEY));
headerInfo.add(GATKVCFHeaderLines.getFilterLine(GATKVCFConstants.STR_CONTRACTION_FILTER_NAME));
@ -410,11 +417,8 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
headerInfo.add(GATKVCFHeaderLines.getFilterLine(GATKVCFConstants.TUMOR_LOD_FILTER_NAME));
headerInfo.add(GATKVCFHeaderLines.getFilterLine(GATKVCFConstants.GERMLINE_RISK_FILTER_NAME));
headerInfo.add(GATKVCFHeaderLines.getFilterLine(GATKVCFConstants.TRIALLELIC_SITE_FILTER_NAME));
headerInfo.add(GATKVCFHeaderLines.getFilterLine(GATKVCFConstants.STRAND_ARTIFACT_FILTER_NAME));
headerInfo.add(new VCFFilterHeaderLine("M1_CLUSTERED_READ_POSITION", "Variant appears in similar read positions"));
headerInfo.add(new VCFFilterHeaderLine("M1_STRAND_BIAS", "Forward LOD vs. reverse LOD comparison indicates strand bias"));
headerInfo.add(new VCFInfoHeaderLine("TLOD_FWD",1,VCFHeaderLineType.Integer,"TLOD from forward reads only"));
headerInfo.add(new VCFInfoHeaderLine("TLOD_REV",1,VCFHeaderLineType.Integer,"TLOD from reverse reads only"));
if ( ! doNotRunPhysicalPhasing ) {
headerInfo.add(GATKVCFHeaderLines.getFormatLine(GATKVCFConstants.HAPLOTYPE_CALLER_PHASING_ID_KEY));
@ -728,7 +732,7 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
filters.add(GATKVCFConstants.TUMOR_LOD_FILTER_NAME);
}
// if we are in artifact detection mode, apply the thresholds for the LOD scores
// if we are in artifact detection mode, apply the thresholds for the LOD scores
if (!MTAC.ARTIFACT_DETECTION_MODE) {
filters.addAll(calculateFilters(metaDataTracker, originalVC, eventDistanceAttributes));
}
@ -754,11 +758,8 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
annotatedCalls.add(vcb.make());
}
// TODO: find a better place for this debug message
// logger.info("We had " + TumorPowerCalculator.numCacheHits + " hits in strand artifact power calculation");
return annotatedCalls;
}
@ -834,16 +835,10 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
filters.add(GATKVCFConstants.CLUSTERED_EVENTS_FILTER_NAME);
}
Integer tumorFwdPosMedian = (Integer) vc.getAttribute("TUMOR_FWD_POS_MEDIAN");
Integer tumorRevPosMedian = (Integer) vc.getAttribute("TUMOR_REV_POS_MEDIAN");
Integer tumorFwdPosMAD = (Integer) vc.getAttribute("TUMOR_FWD_POS_MAD");
Integer tumorRevPosMAD = (Integer) vc.getAttribute("TUMOR_REV_POS_MAD");
//If the variant is near the read end (median threshold) and the positions are very similar (MAD threshold) then filter
if ( (tumorFwdPosMedian != null && tumorFwdPosMedian <= MTAC.PIR_MEDIAN_THRESHOLD && tumorFwdPosMAD != null && tumorFwdPosMAD <= MTAC.PIR_MAD_THRESHOLD) ||
(tumorRevPosMedian != null && tumorRevPosMedian <= MTAC.PIR_MEDIAN_THRESHOLD && tumorRevPosMAD != null && tumorRevPosMAD <= MTAC.PIR_MAD_THRESHOLD))
filters.add("M1_CLUSTERED_READ_POSITION");
// TODO: Add clustered read position filter here
// TODO: Move strand bias filter here
return filters;
}
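The clustered-read-position test above reduces to a predicate on the per-strand median/MAD annotations. A standalone sketch with illustrative threshold values (the real ones come from MTAC.PIR_MEDIAN_THRESHOLD and MTAC.PIR_MAD_THRESHOLD, which this sketch does not have access to):

```java
public class ReadPositionFilterSketch {
    // Illustrative thresholds; in MuTect2 these are the M2ArgumentCollection values.
    static final int PIR_MEDIAN_THRESHOLD = 10;
    static final int PIR_MAD_THRESHOLD = 3;

    // Filter when the variant sits near a read end (low median offset) and the
    // offsets cluster tightly (low MAD), on either strand. Null means no annotation.
    static boolean isClusteredReadPosition(Integer fwdMedian, Integer fwdMad,
                                           Integer revMedian, Integer revMad) {
        boolean fwd = fwdMedian != null && fwdMad != null
                && fwdMedian <= PIR_MEDIAN_THRESHOLD && fwdMad <= PIR_MAD_THRESHOLD;
        boolean rev = revMedian != null && revMad != null
                && revMedian <= PIR_MEDIAN_THRESHOLD && revMad <= PIR_MAD_THRESHOLD;
        return fwd || rev;
    }

    public static void main(String[] args) {
        // Forward strand trips the filter; reverse strand is well-distributed.
        System.out.println(isClusteredReadPosition(5, 2, 40, 12));  // true
    }
}
```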
@ -1064,9 +1059,9 @@ public class MuTect2 extends ActiveRegionWalker<List<VariantContext>, Integer> i
@Advanced
@Argument(fullName="annotation", shortName="A", doc="One or more specific annotations to apply to variant calls", required=false)
// protected List<String> annotationsToUse = new ArrayList<>(Arrays.asList(new String[]{"ClippingRankSumTest", "DepthPerSampleHC"}));
//protected List<String> annotationsToUse = new ArrayList<>(Arrays.asList(new String[]{"DepthPerAlleleBySample", "BaseQualitySumPerAlleleBySample", "TandemRepeatAnnotator",
// "RMSMappingQuality","MappingQualityRankSumTest","FisherStrand","StrandOddsRatio","ReadPosRankSumTest","QualByDepth", "Coverage"}));
protected List<String> annotationsToUse = new ArrayList<>(Arrays.asList(new String[]{"DepthPerAlleleBySample", "BaseQualitySumPerAlleleBySample", "TandemRepeatAnnotator", "OxoGReadCounts", "ClusteredEventsAnnotator"}));
// protected List<String> annotationsToUse = new ArrayList<>(Arrays.asList(new String[]{"DepthPerAlleleBySample", "BaseQualitySumPerAlleleBySample", "TandemRepeatAnnotator",
// "RMSMappingQuality","MappingQualityRankSumTest","FisherStrand","StrandOddsRatio","ReadPosRankSumTest","QualByDepth", "Coverage"}));
protected List<String> annotationsToUse = new ArrayList<>(Arrays.asList(new String[]{"DepthPerAlleleBySample", "BaseQualitySumPerAlleleBySample", "TandemRepeatAnnotator", "OxoGReadCounts"}));
/**
* Which annotations to exclude from output in the VCF file. Note that this argument has higher priority than the -A or -G arguments,

View File

@ -49,54 +49,60 @@
* 8.7 Governing Law. This Agreement shall be construed, governed, interpreted and applied in accordance with the internal laws of the Commonwealth of Massachusetts, U.S.A., without regard to conflict of laws principles.
*/
package org.broadinstitute.gatk.queue.qscripts.dev
import org.broadinstitute.gatk.queue.QScript
import org.broadinstitute.gatk.queue.extensions.gatk._
import org.broadinstitute.gatk.queue.util.QScriptUtils
class run_M2_ICE_NN extends QScript {
  @Argument(shortName = "bams", required = true, doc = "file of all BAM files")
  var allBams: String = ""
  @Argument(shortName = "o", required = false, doc = "Output prefix")
  var outputPrefix: String = ""
  @Argument(shortName = "pon", required = false, doc = "Normal PON")
  var panelOfNormals: String = "/dsde/working/mutect/panel_of_normals/panel_of_normals_m2_ice_wgs_territory/m2_406_ice_normals_wgs_calling_regions.vcf"
  @Argument(shortName = "sc", required = false, doc = "base scatter count")
  var scatter: Int = 10
  def script() {
    val bams = QScriptUtils.createSeqFromFile(allBams)
    for (tumor <- bams) {
      for (normal <- bams) {
        if (tumor != normal) add( createM2Config(tumor, normal, new File(panelOfNormals), outputPrefix))
      }
    }
  }
  def createM2Config(tumorBAM : File, normalBAM : File, panelOfNormals : File, outputPrefix : String): M2 = {
    val mutect2 = new MuTect2
    mutect2.reference_sequence = new File("/seq/references/Homo_sapiens_assembly19/v1/Homo_sapiens_assembly19.fasta")
    mutect2.cosmic :+= new File("/xchip/cga/reference/hg19/hg19_cosmic_v54_120711.vcf")
    mutect2.dbsnp = new File("/humgen/gsa-hpprojects/GATK/bundle/current/b37/dbsnp_138.b37.vcf")
    mutect2.normal_panel :+= panelOfNormals
    mutect2.intervalsString :+= new File("/dsde/working/mutect/crsp_nn/whole_exome_illumina_coding_v1.Homo_sapiens_assembly19.targets.no_empty.interval_list")
    mutect2.memoryLimit = 2
    mutect2.input_file = List(new TaggedFile(normalBAM, "normal"), new TaggedFile(tumorBAM, "tumor"))
    mutect2.scatterCount = scatter
    mutect2.out = outputPrefix + tumorBAM.getName + "-vs-" + normalBAM.getName + ".vcf"
    println("Adding " + tumorBAM + " vs " + normalBAM + " as " + mutect2.out)
    mutect2
  }
}
package org.broadinstitute.gatk.tools.walkers.cancer.m2;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
/**
 * Collection of statistical methods and tests used by MuTect
 */
public class MuTectStats {
    public static double calculateMAD(ArrayList<Double> xs, double median) {
        ArrayList<Double> deviations = new ArrayList<>(xs.size());
        for (double x : xs) {
            deviations.add(Math.abs(x - median));
        }
        return getMedian(deviations);
    }
    public static double getMedian(ArrayList<Double> data) {
        Collections.sort(data);
        Double result;
        if (data.size() % 2 == 1) {
            // Odd number of entries: take the middle value.
            // Integer division floors the index, dropping the remainder.
            result = data.get(data.size() / 2);
        } else {
            // Even number of entries: average the middle two values.
            Double lowerMiddle = data.get(data.size() / 2);
            Double upperMiddle = data.get(data.size() / 2 - 1);
            result = (lowerMiddle + upperMiddle) / 2;
        }
        return result;
    }
    public static double[] convertIntegersToDoubles(List<Integer> integers) {
        double[] ret = new double[integers.size()];
        for (int i = 0; i < ret.length; i++) {
            ret[i] = integers.get(i);
        }
        return ret;
    }
}
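As a quick sanity check of the median/MAD logic in this file, a minimal standalone reimplementation (class and method names are illustrative; it duplicates the computation so it compiles without the GATK tree):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;

public class MedianMadDemo {
    // Median of a list: middle value for odd sizes, mean of the two middles for even.
    static double median(ArrayList<Double> data) {
        Collections.sort(data);
        int n = data.size();
        return (n % 2 == 1) ? data.get(n / 2)
                            : (data.get(n / 2) + data.get(n / 2 - 1)) / 2;
    }

    // Median absolute deviation from a given median.
    static double mad(ArrayList<Double> xs, double med) {
        ArrayList<Double> dev = new ArrayList<>();
        for (double x : xs) dev.add(Math.abs(x - med));
        return median(dev);
    }

    public static void main(String[] args) {
        ArrayList<Double> xs = new ArrayList<>(Arrays.asList(1.0, 1.0, 2.0, 2.0, 4.0, 6.0, 9.0));
        double m = median(xs);                    // 2.0
        System.out.println(m + " " + mad(xs, m)); // prints "2.0 1.0"
    }
}
```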

View File

@ -22,16 +22,18 @@ TODO: write a simple tool to do this more easily
To calculate per pair-counts, run:
```
# for SNPs
for vcf in *.bam.vcf
do
cat $vcf | grep PASS | awk '{ if ( length($4) + length($5) == 2) print $0 }' | wc -l
done
for vcf in *.vcf
do
cat $vcf | grep PASS | awk '{ if ( length($4) + length($5) == 2) print $0 }' | wc -l
done > snp-fps.txt
cat snp-fps.txt | awk '{ sum += $1 } END { print sum }'
# for INDELs
for vcf in *.bam.vcf
do
cat $vcf | grep PASS | awk '{ if ( length($4) + length($5) != 2) print $0 }' | wc -l
done
for vcf in *.vcf
do
cat $vcf | grep PASS | awk '{ if ( length($4) + length($5) != 2) print $0 }' | wc -l
done > indel-fps.txt
cat indel-fps.txt | awk '{ sum += $1 } END { print sum }'
```
### Current M1 and Indelocator Performance
@ -72,7 +74,7 @@ and then run the following Queue command
java \
-Djava.io.tmpdir=$TEMPDIR \
-jar $QUEUE_JAR \
-S $GSA_UNSTABLE_HOME/private/gatk-tools-private/src/main/java/org/broadinstitute/gatk/tools/walkers/cancer/m2/run_M2_ICE_NN.scala \
-S $GSA_UNSTABLE_HOME/private/gatk-queue-extensions-internal/src/main/qscripts/org/broadinstitute/gatk/queue/qscripts/m2/run_M2_ICE_NN.scala \
-sc 50 \
--job_queue gsa -qsub -jobResReq virtual_free=5G -startFromScratch \
--allbams /humgen/gsa-hpprojects/NA12878Collection/bams/crsp_ice_validation//NA12878.intra.flowcell.replicate.bam_list \

View File

@ -5,7 +5,7 @@
* SOFTWARE LICENSE AGREEMENT
* FOR ACADEMIC NON-COMMERCIAL RESEARCH PURPOSES ONLY
*
* This Agreement is made between the Broad Institute, Inc. with a principal address at 415 Main Street, Cambridge, MA 02142 (BROAD) and the LICENSEE and is effective at the date the downloading is completed (EFFECTIVE DATE).
* This Agreement is made between the Broad Institute, Inc. with a principal address at 415 Main Street, Cambridge, MA 02142 ("BROAD") and the LICENSEE and is effective at the date the downloading is completed ("EFFECTIVE DATE").
*
* WHEREAS, LICENSEE desires to license the PROGRAM, as defined hereinafter, and BROAD wishes to have this PROGRAM utilized in the public interest, subject only to the royalty-free, nonexclusive, nontransferable license rights of the United States Government pursuant to 48 CFR 52.227-14; and
* WHEREAS, LICENSEE desires to license the PROGRAM and BROAD desires to grant a license on the following terms and conditions.
@ -21,11 +21,11 @@
* 2.3 License Limitations. Nothing in this Agreement shall be construed to confer any rights upon LICENSEE by implication, estoppel, or otherwise to any computer software, trademark, intellectual property, or patent rights of BROAD, or of any other entity, except as expressly granted herein. LICENSEE agrees that the PROGRAM, in whole or part, shall not be used for any commercial purpose, including without limitation, as the basis of a commercial software or hardware product or to provide services. LICENSEE further agrees that the PROGRAM shall not be copied or otherwise adapted in order to circumvent the need for obtaining a license for use of the PROGRAM.
*
* 3. PHONE-HOME FEATURE
* LICENSEE expressly acknowledges that the PROGRAM contains an embedded automatic reporting system (PHONE-HOME) which is enabled by default upon download. Unless LICENSEE requests disablement of PHONE-HOME, LICENSEE agrees that BROAD may collect limited information transmitted by PHONE-HOME regarding LICENSEE and its use of the PROGRAM. Such information shall include LICENSEES user identification, version number of the PROGRAM and tools being run, mode of analysis employed, and any error reports generated during run-time. Collection of such information is used by BROAD solely to monitor usage rates, fulfill reporting requirements to BROAD funding agencies, drive improvements to the PROGRAM, and facilitate adjustments to PROGRAM-related documentation.
* LICENSEE expressly acknowledges that the PROGRAM contains an embedded automatic reporting system ("PHONE-HOME") which is enabled by default upon download. Unless LICENSEE requests disablement of PHONE-HOME, LICENSEE agrees that BROAD may collect limited information transmitted by PHONE-HOME regarding LICENSEE and its use of the PROGRAM. Such information shall include LICENSEE'S user identification, version number of the PROGRAM and tools being run, mode of analysis employed, and any error reports generated during run-time. Collection of such information is used by BROAD solely to monitor usage rates, fulfill reporting requirements to BROAD funding agencies, drive improvements to the PROGRAM, and facilitate adjustments to PROGRAM-related documentation.
*
* 4. OWNERSHIP OF INTELLECTUAL PROPERTY
* LICENSEE acknowledges that title to the PROGRAM shall remain with BROAD. The PROGRAM is marked with the following BROAD copyright notice and notice of attribution to contributors. LICENSEE shall retain such notice on all copies. LICENSEE agrees to include appropriate attribution if any results obtained from use of the PROGRAM are included in any publication.
* Copyright 2012-2014 Broad Institute, Inc.
* Copyright 2012-2016 Broad Institute, Inc.
* Notice of attribution: The GATK3 program was made available through the generosity of Medical and Population Genetics program at the Broad Institute, Inc.
* LICENSEE shall not use any trademark or trade name of BROAD, or any variation, adaptation, or abbreviation, of such marks or trade names, or any names of officers, faculty, students, employees, or agents of BROAD except as states above for attribution purposes.
*
@@ -51,50 +51,145 @@
package org.broadinstitute.gatk.tools.walkers.cancer.m2;
import java.util.HashMap;
import htsjdk.variant.variantcontext.Allele;
public class AbstractPowerCalculator {
protected HashMap<PowerCacheKey, Double> cache = new HashMap<PowerCacheKey, Double>();
protected double constantEps;
protected double constantLodThreshold;
import java.util.*;
protected static class PowerCacheKey {
private int n;
private double delta;
/**
* A container for allele to value mapping.
*
* Each PerAlleleCollection may hold a value for each ALT allele and, optionally, a value for the REF allele.
* For example,
*
* PerAlleleCollection<Double> alleleFractions = PerAlleleCollection.createPerAltAlleleCollection()
*
* may be a container for allele fractions for ALT alleles in a variant context. While
*
* PerAlleleCollection<Double> alleleCount = PerAlleleCollection.createPerRefAndAltAlleleCollection()
*
* may hold the allele counts for the REF allele and all ALT alleles in a variant context.
*
*
**/
public class PerAlleleCollection<X> {
// ref allele and its value are empty Optionals until setRef is called
private Optional<Allele> refAllele;
private Optional<X> refValue;
private Map<Allele, X> altAlleleValueMap;
private boolean altOnly;
public PowerCacheKey(int n, double delta) {
this.n = n;
this.delta = delta;
private PerAlleleCollection(boolean altOnly){
this.altOnly = altOnly;
this.altAlleleValueMap = new HashMap<>();
this.refAllele = Optional.empty();
this.refValue = Optional.empty();
}
public static <X> PerAlleleCollection<X> createPerAltAlleleCollection(){
return new PerAlleleCollection<>(true);
}
public static <X> PerAlleleCollection<X> createPerRefAndAltAlleleCollection(){
return new PerAlleleCollection<>(false);
}
/**
* Take an allele, REF or ALT, and set its value appropriately
*
* @param allele REF or ALT allele
* @param newValue the value to associate with the allele
*/
public void set(Allele allele, X newValue){
if (allele == null || newValue == null){
throw new IllegalArgumentException("allele or newValue is null");
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
PowerCacheKey that = (PowerCacheKey) o;
if (Double.compare(that.delta, delta) != 0) return false;
if (n != that.n) return false;
return true;
if (allele.isReference() && altOnly){
throw new IllegalArgumentException("Collection stores values for alternate alleles only");
}
@Override
public int hashCode() {
int result;
long temp;
result = n;
temp = delta != +0.0d ? Double.doubleToLongBits(delta) : 0L;
result = 31 * result + (int) (temp ^ (temp >>> 32));
return result;
if (allele.isReference()){
this.setRef(allele, newValue);
} else {
this.setAlt(allele, newValue);
}
}
protected static double calculateLogLikelihood(int depth, int alts, double eps, double f) {
double a = (depth-alts) * Math.log10(f*eps + (1d-f)*(1d-eps));
double b = (alts) * Math.log10(f*(1d-eps) + (1d-f)*eps);
return (a+b);
public void setRef(Allele refAllele, X newValue){
if (refAllele == null || newValue == null){
throw new IllegalArgumentException("refAllele or newValue is null");
}
if (refAllele.isNonReference()){
throw new IllegalArgumentException("Setting Non-reference allele as reference");
}
if (this.refAllele.isPresent()){
throw new IllegalArgumentException("Resetting the reference allele not permitted");
}
this.refAllele = Optional.of(refAllele);
this.refValue = Optional.of(newValue);
}
}
public void setAlt(Allele altAllele, X newValue){
if (altAllele == null || newValue == null){
throw new IllegalArgumentException("altAllele or newValue is null");
}
if (altAllele.isReference()){
throw new IllegalArgumentException("Setting reference allele as alt");
}
altAlleleValueMap.put(altAllele, newValue);
}
/**
* Get the value for an allele, REF or ALT
* @param allele REF or ALT allele
* @return the value stored for the given allele
*/
public X get(Allele allele){
if (allele == null){
throw new IllegalArgumentException("allele is null");
}
if (allele.isReference()){
if (this.refAllele.isPresent() && allele.equals(this.refAllele.get())){
return(getRef());
} else {
throw new IllegalArgumentException("Requested ref allele does not match the stored ref allele");
}
} else {
return(getAlt(allele));
}
}
public X getRef(){
if (altOnly) {
throw new IllegalStateException("Collection does not hold the REF allele");
}
if (this.refAllele.isPresent()){
return(refValue.get());
} else {
throw new IllegalStateException("Collection's ref allele has not been set yet");
}
}
public X getAlt(Allele allele){
if (allele == null){
throw new IllegalArgumentException("allele is null");
}
if (allele.isReference()){
throw new IllegalArgumentException("allele is not an alt allele");
}
if (altAlleleValueMap.containsKey(allele)) {
return(altAlleleValueMap.get(allele));
} else {
throw new IllegalArgumentException("Requested alt allele is not in the collection");
}
}
public Set<Allele> getAltAlleles(){
return(altAlleleValueMap.keySet());
}
}
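For reference, the pattern above can be exercised with a small standalone sketch. `SimplePerAlleleCollection` below is a hypothetical stand-in that uses `String` labels in place of htsjdk's `Allele`, so it compiles without GATK on the classpath; it mirrors the set-once REF semantics and the alt-lookup checks of the class above.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical stand-in for PerAlleleCollection: String labels replace htsjdk Alleles.
public class SimplePerAlleleCollection<X> {
    private final Map<String, X> altValues = new HashMap<>();
    private Optional<X> refValue = Optional.empty();

    // The REF value may be set at most once, mirroring setRef() above.
    public void setRef(X value) {
        if (refValue.isPresent()) {
            throw new IllegalStateException("Resetting the reference value is not permitted");
        }
        refValue = Optional.of(value);
    }

    public void setAlt(String altAllele, X value) {
        altValues.put(altAllele, value);
    }

    public X getRef() {
        return refValue.orElseThrow(() -> new IllegalStateException("REF value not set"));
    }

    public X getAlt(String altAllele) {
        if (!altValues.containsKey(altAllele)) {
            throw new IllegalArgumentException("Requested alt allele is not in the collection");
        }
        return altValues.get(altAllele);
    }
}
```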


@@ -54,8 +54,9 @@ package org.broadinstitute.gatk.tools.walkers.cancer.m2;
import com.google.java.contract.Ensures;
import htsjdk.samtools.util.StringUtil;
import htsjdk.variant.variantcontext.*;
import org.apache.commons.lang.mutable.MutableDouble;
import org.apache.commons.lang.mutable.MutableInt;
import org.apache.log4j.Logger;
import org.broadinstitute.gatk.tools.walkers.genotyper.GenotypeLikelihoodsCalculationModel;
import org.broadinstitute.gatk.tools.walkers.genotyper.afcalc.AFCalculatorProvider;
import org.broadinstitute.gatk.tools.walkers.haplotypecaller.HaplotypeCallerGenotypingEngine;
import org.broadinstitute.gatk.utils.GenomeLoc;
@@ -77,8 +78,10 @@ import java.util.*;
public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
protected M2ArgumentCollection MTAC;
private TumorPowerCalculator strandArtifactPowerCalculator;
protected final M2ArgumentCollection MTAC;
private final TumorPowerCalculator strandArtifactPowerCalculator;
final boolean REF_AND_ALT = false;
final boolean ALT_ONLY = true;
private final static Logger logger = Logger.getLogger(SomaticGenotypingEngine.class);
@@ -86,8 +89,8 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
super(configuration, samples, genomeLocParser, afCalculatorProvider, doPhysicalPhasing);
this.MTAC = MTAC;
// coverage related initialization
double powerConstantEps = Math.pow(10, -1 * (MTAC.POWER_CONSTANT_QSCORE/10));
strandArtifactPowerCalculator = new TumorPowerCalculator(powerConstantEps, MTAC.STRAND_ARTIFACT_LOD_THRESHOLD, 0.0f);
final double errorProbability = Math.pow(10, -(MTAC.POWER_CONSTANT_QSCORE/10));
strandArtifactPowerCalculator = new TumorPowerCalculator(errorProbability, MTAC.STRAND_ARTIFACT_LOD_THRESHOLD, 0.0f);
}
/**
@@ -95,7 +98,7 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
* genotype likelihoods and assemble into a list of variant contexts and genomic events ready for calling
*
* The list of samples we're working with is obtained from the readLikelihoods
*
* @param haplotypes Haplotypes to assign likelihoods to
* @param readLikelihoods Map from reads->(haplotypes,likelihoods)
* @param perSampleFilteredReadList Map from sample to reads that were filtered after assembly and before calculating per-read likelihoods.
@@ -112,7 +115,7 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
// @Requires({"refLoc.containsP(activeRegionWindow)", "haplotypes.size() > 0"})
@Ensures("result != null")
// TODO - can this be refactored? this is hard to follow!
public HaplotypeCallerGenotypingEngine.CalledHaplotypes callMutations (
public CalledHaplotypes callMutations (
final List<Haplotype> haplotypes,
//final Map<String, PerReadAlleleLikelihoodMap> haplotypeReadMap,
final ReadLikelihoods<Haplotype> readLikelihoods,
@@ -145,7 +148,7 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
// Somatic Tumor/Normal Sample Handling
verifySamplePresence(tumorSampleName, readLikelihoods.samples());
final boolean hasNormal = (matchedNormalSampleName != null);
final boolean hasNormal = matchedNormalSampleName != null;
// update the haplotypes so we're ready to call, getting the ordered list of positions on the reference
// that carry events among the haplotypes
@@ -175,13 +178,6 @@
if( mergedVC == null ) { continue; }
final int numAlts = mergedVC.getNAlleles()-1;
// final VariantContextBuilder vcb = new VariantContextBuilder(mergedVC);
final GenotypeLikelihoodsCalculationModel.Model calculationModel = mergedVC.isSNP()
? GenotypeLikelihoodsCalculationModel.Model.SNP : GenotypeLikelihoodsCalculationModel.Model.INDEL;
if (emitReferenceConfidence)
mergedVC = addNonRefSymbolicAllele(mergedVC);
@@ -216,173 +212,183 @@
MuTect2.logReadInfo(DEBUG_READ_NAME, tumorPRALM.getLikelihoodReadMap().keySet(), "Present after filtering for overlapping reads");
// extend to multiple samples
//handle existence of secondary alts
double[] afs = estimateAlleleFraction(mergedVC, tumorPRALM);
// compute tumor LOD for each alternate allele
// TODO: somewhere we have to ensure that all the alleles in the variant context are in alleleFractions passed to getHetGenotypeLogLikelihoods; it will not check that for you
final PerAlleleCollection<Double> altAlleleFractions = estimateAlleleFraction(mergedVC, tumorPRALM, false);
final PerAlleleCollection<Double> tumorHetGenotypeLLs = getHetGenotypeLogLikelihoods(mergedVC, tumorPRALM, originalNormalReadQualities, altAlleleFractions);
if( configuration.DEBUG && logger != null ) {
String output = "Calculated allelic fraction at " + loc + " = ";
for (int i = 0; i<afs.length; i++)
output = output + afs[i];
if (logger != null) logger.info(output);
StringBuilder outputSB = new StringBuilder("Calculated allelic fraction at " + loc + " = [");
for (Allele allele : altAlleleFractions.getAltAlleles()){
outputSB.append( allele + ": " + altAlleleFractions.getAlt(allele) + ", ");
}
outputSB.append("]");
logger.info(outputSB.toString());
}
double[] tumorGLs = getVariableGenotypeLikelihoods(mergedVC, tumorPRALM, originalNormalReadQualities, afs);
final PerAlleleCollection<Double> tumorLods = PerAlleleCollection.createPerAltAlleleCollection();
for (Allele altAllele : mergedVC.getAlternateAlleles()){
tumorLods.set(altAllele, tumorHetGenotypeLLs.get(altAllele) - tumorHetGenotypeLLs.getRef());
}
PerReadAlleleLikelihoodMap forwardPRALM = new PerReadAlleleLikelihoodMap();
PerReadAlleleLikelihoodMap reversePRALM = new PerReadAlleleLikelihoodMap();
splitPRALMintoForwardAndReverseReads(tumorPRALM, forwardPRALM, reversePRALM);
// TODO: TS uncomment and fix
// double f_fwd = estimateAlleleFraction(mergedVC, forwardPRALM);
// double[] tumorGLs_fwd = getVariableGenotypeLikelihoods(mergedVC, forwardPRALM, f_fwd);
//
// double f_rev = estimateAlleleFraction(mergedVC, reversePRALM);
// double[] tumorGLs_rev = getVariableGenotypeLikelihoods(mergedVC, reversePRALM, f_rev);
if (configuration.DEBUG && logger != null) {
StringBuilder outputSB = new StringBuilder("Tumor LOD at " + loc + " = [");
for (Allele altAllele : tumorLods.getAltAlleles()) {
outputSB.append( altAllele + ": " + tumorLods.getAlt(altAllele) + ", ");
}
outputSB.append("]");
logger.info(outputSB.toString());
}
double INIT_NORMAL_LOD_THRESHOLD = -Double.MAX_VALUE;
double NORMAL_LOD_THRESHOLD = -Double.MAX_VALUE;
PerReadAlleleLikelihoodMap normalPRALM = null;
double[] normalGLs = null;
PerAlleleCollection<Double> normalLods = PerAlleleCollection.createPerAltAlleleCollection();
// if normal bam is available, compute normal LOD
if (hasNormal) {
normalPRALM = readAlleleLikelihoods.toPerReadAlleleLikelihoodMap(readAlleleLikelihoods.sampleIndex(matchedNormalSampleName));
filterPRALMForOverlappingReads(normalPRALM, mergedVC.getReference(), loc, true);
MuTect2.logReadInfo(DEBUG_READ_NAME, normalPRALM.getLikelihoodReadMap().keySet(), "Present after filtering for overlapping reads");
double[] diploidAFarray = new double[numAlts];
Arrays.fill(diploidAFarray, 0.5d);
normalGLs = getVariableGenotypeLikelihoods(mergedVC, normalPRALM, originalNormalReadQualities, diploidAFarray);
}
double INIT_NORMAL_LOD_THRESHOLD = -Double.MAX_VALUE;
double NORMAL_LOD_THRESHOLD = -Double.MAX_VALUE;
final int REF_INDEX = 0;
double[] tumorLods = new double[numAlts];
for (int altInd = 0; altInd < numAlts; altInd++) {
tumorLods[altInd] = tumorGLs[altInd+1] - tumorGLs[REF_INDEX];
}
if (configuration.DEBUG && logger != null) {
String output = "Tumor LOD at " + loc + " = ";
for (int i = 0; i<tumorLods.length; i++)
output = output + tumorLods[i];
if (logger != null) logger.info(output);
}
double[] normalLods = new double[numAlts];
// TODO: TS extend fwd-rev approach for multiple alt alleles
// int REF = 0, HET = 1;
// double tumorLod = tumorGLs[HET] - tumorGLs[REF];
// double tumorLod_fwd = tumorGLs_fwd[HET] - tumorGLs_fwd[REF];
// double tumorLod_rev = tumorGLs_rev[HET] - tumorGLs_rev[REF];
// double normalLod = 0;
if (hasNormal) {
GenomeLoc eventGenomeLoc = genomeLocParser.createGenomeLoc(activeRegionWindow.getContig(), loc);
Collection<VariantContext> cosmicVC = tracker.getValues(cosmicRod, eventGenomeLoc);
Collection<VariantContext> dbsnpVC = tracker.getValues(dbsnpRod, eventGenomeLoc);
// remove the effect of cosmic from dbSNP
boolean germlineAtRisk = (!dbsnpVC.isEmpty() && cosmicVC.isEmpty());
final boolean germlineAtRisk = (!dbsnpVC.isEmpty() && cosmicVC.isEmpty());
INIT_NORMAL_LOD_THRESHOLD = MTAC.INITIAL_NORMAL_LOD_THRESHOLD; //only set this if this job has a normal
NORMAL_LOD_THRESHOLD = (germlineAtRisk)?MTAC.NORMAL_DBSNP_LOD_THRESHOLD:MTAC.NORMAL_LOD_THRESHOLD;
for (int altInd = 0; altInd < numAlts; altInd++)
normalLods[altInd] = normalGLs[REF_INDEX] - normalGLs[altInd+1];
NORMAL_LOD_THRESHOLD = (germlineAtRisk)?MTAC.NORMAL_DBSNP_LOD_THRESHOLD:MTAC.NORMAL_LOD_THRESHOLD;
// compute normal LOD = LL(X|REF)/LL(X|ALT) where ALT is the diploid HET with AF = 0.5
// note normal LOD is REF over ALT, the reciprocal of the tumor LOD
final PerAlleleCollection<Double> diploidHetAlleleFractions = PerAlleleCollection.createPerRefAndAltAlleleCollection();
for (final Allele allele : mergedVC.getAlternateAlleles()){
diploidHetAlleleFractions.setAlt(allele, 0.5);
}
final PerAlleleCollection<Double> normalGenotypeLLs = getHetGenotypeLogLikelihoods(mergedVC, normalPRALM, originalNormalReadQualities, diploidHetAlleleFractions);
for (final Allele altAllele : mergedVC.getAlternateAlleles()){
normalLods.setAlt(altAllele, normalGenotypeLLs.getRef() - normalGenotypeLLs.getAlt(altAllele));
}
}
//reconcile multiple alts, if applicable
int numPassingAlts = 0;
int lodInd = 0;
for (int altInd = 0; altInd < numAlts; altInd++) {
if (tumorLods[altInd] >= MTAC.INITIAL_TUMOR_LOD_THRESHOLD && normalLods[altInd] >= INIT_NORMAL_LOD_THRESHOLD) {
Set<Allele> allelesThatPassThreshold = new HashSet<>();
Allele alleleWithHighestTumorLOD = null;
// TODO: use lambda
for (Allele altAllele : mergedVC.getAlternateAlleles()) {
final boolean passesTumorLodThreshold = tumorLods.getAlt(altAllele) >= MTAC.INITIAL_TUMOR_LOD_THRESHOLD;
final boolean passesNormalLodThreshold = hasNormal ? normalLods.getAlt(altAllele) >= INIT_NORMAL_LOD_THRESHOLD : true;
if (passesTumorLodThreshold && passesNormalLodThreshold) {
numPassingAlts++;
lodInd = altInd;
allelesThatPassThreshold.add(altAllele);
if (alleleWithHighestTumorLOD == null
|| tumorLods.getAlt(altAllele) > tumorLods.getAlt(alleleWithHighestTumorLOD)){
alleleWithHighestTumorLOD = altAllele;
}
}
}
// TS: if more than one alt allele passes the thresholds, the site gets filtered out later anyway, so it doesn't matter which one we pick
final double tumorLod = tumorLods[lodInd];
final double normalLod = normalLods[lodInd];
double tumorSBpower_fwd;
double tumorSBpower_rev;
// TODO: TS fix
// try {
// tumorSBpower_fwd = strandArtifactPowerCalculator.cachingPowerCalculation(forwardPRALM.getNumberOfStoredElements(), f_fwd);
// tumorSBpower_rev = strandArtifactPowerCalculator.cachingPowerCalculation(reversePRALM.getNumberOfStoredElements(), f_rev);
// }
// catch (Throwable t) {
// System.err.println("Error processing " + activeRegionWindow.getContig() + ":" + loc);
// t.printStackTrace(System.err);
//
// throw new RuntimeException(t);
// }
final boolean emitVariant = numPassingAlts > 0;
VariantContext call = null;
if (tumorLod >= MTAC.INITIAL_TUMOR_LOD_THRESHOLD && normalLod >= INIT_NORMAL_LOD_THRESHOLD) {
if (emitVariant) {
VariantContextBuilder callVcb = new VariantContextBuilder(mergedVC);
if (normalLod < NORMAL_LOD_THRESHOLD) {
callVcb.filter(GATKVCFConstants.GERMLINE_RISK_FILTER_NAME);
}
int haplotypeCount = alleleMapper.get(mergedVC.getAlternateAllele(lodInd)).size();
// TODO: TS revisit
// callVcb.attribute("TLOD_FWD",tumorLod_fwd);
// callVcb.attribute("TLOD_REV",tumorLod_rev);
// if ( (tumorSBpower_fwd >= MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_fwd < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD) ||
// (tumorSBpower_rev >= MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_rev < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD) )
// callVcb.filter("M1_STRAND_BIAS");
// TODO: TS revisit
// if ( (tumorSBpower_fwd >= MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_fwd < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD) ||
// (tumorSBpower_rev >= MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_rev < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD) )
// callVcb.filter("M1_STRAND_BIAS");
//
// // FIXME: can simply get first alternate since above we only deal with Bi-allelic sites...
// int haplotypeCount = alleleMapper.get(mergedVC.getAlternateAllele(0)).size();
// FIXME: can simply get first alternate since above we only deal with Bi-allelic sites...
int haplotypeCount = alleleMapper.get(mergedVC.getAlternateAllele(0)).size();
callVcb.attribute(GATKVCFConstants.HAPLOTYPE_COUNT_KEY, haplotypeCount);
callVcb.attribute(GATKVCFConstants.TUMOR_LOD_KEY, tumorLod);
callVcb.attribute(GATKVCFConstants.NORMAL_LOD_KEY, normalLod);
callVcb.attribute(GATKVCFConstants.TUMOR_LOD_KEY, tumorLods.getAlt(alleleWithHighestTumorLOD));
if (normalLod < NORMAL_LOD_THRESHOLD) {
callVcb.filter(GATKVCFConstants.GERMLINE_RISK_FILTER_NAME);
if (hasNormal) {
callVcb.attribute(GATKVCFConstants.NORMAL_LOD_KEY, normalLods.getAlt(alleleWithHighestTumorLOD));
if (normalLods.getAlt(alleleWithHighestTumorLOD) < NORMAL_LOD_THRESHOLD) {
callVcb.filter(GATKVCFConstants.GERMLINE_RISK_FILTER_NAME);
}
}
// M1-style strand artifact filter
// TODO: move code to MuTect2::calculateFilters()
// skip if VC has multiple alleles - it will get filtered later anyway
if (MTAC.ENABLE_STRAND_ARTIFACT_FILTER && numPassingAlts == 1) {
final PerReadAlleleLikelihoodMap forwardPRALM = new PerReadAlleleLikelihoodMap();
final PerReadAlleleLikelihoodMap reversePRALM = new PerReadAlleleLikelihoodMap();
splitPRALMintoForwardAndReverseReads(tumorPRALM, forwardPRALM, reversePRALM);
// TODO: build a new type for probability, likelihood, and log_likelihood. e.g. f_fwd :: probability[], tumorGLs_fwd :: likelihood[]
// TODO: don't want to call getHetGenotypeLogLikelihoods on more than one alternate allele. May need to overload it to take a scalar f_fwd.
final PerAlleleCollection<Double> alleleFractionsForward = estimateAlleleFraction(mergedVC, forwardPRALM, true);
final PerAlleleCollection<Double> tumorGenotypeLLForward = getHetGenotypeLogLikelihoods(mergedVC, forwardPRALM, originalNormalReadQualities, alleleFractionsForward);
final PerAlleleCollection<Double> alleleFractionsReverse = estimateAlleleFraction(mergedVC, reversePRALM, true);
final PerAlleleCollection<Double> tumorGenotypeLLReverse = getHetGenotypeLogLikelihoods(mergedVC, reversePRALM, originalNormalReadQualities, alleleFractionsReverse);
double tumorLod_fwd = tumorGenotypeLLForward.getAlt(alleleWithHighestTumorLOD) - tumorGenotypeLLForward.getRef();
double tumorLod_rev = tumorGenotypeLLReverse.getAlt(alleleWithHighestTumorLOD) - tumorGenotypeLLReverse.getRef();
double tumorSBpower_fwd = 0.0;
double tumorSBpower_rev = 0.0;
try {
// Note that we use the observed combined (+ and -) allele fraction for power calculation in either direction
tumorSBpower_fwd = strandArtifactPowerCalculator.cachedPowerCalculation(forwardPRALM.getNumberOfStoredElements(), altAlleleFractions.getAlt(alleleWithHighestTumorLOD));
tumorSBpower_rev = strandArtifactPowerCalculator.cachedPowerCalculation(reversePRALM.getNumberOfStoredElements(), altAlleleFractions.getAlt(alleleWithHighestTumorLOD));
}
catch (Throwable t) {
System.err.println("Error processing " + activeRegionWindow.getContig() + ":" + loc);
t.printStackTrace(System.err);
throw new RuntimeException(t);
}
callVcb.attribute(GATKVCFConstants.TLOD_FWD_KEY, tumorLod_fwd);
callVcb.attribute(GATKVCFConstants.TLOD_REV_KEY, tumorLod_rev);
callVcb.attribute(GATKVCFConstants.TUMOR_SB_POWER_FWD_KEY, tumorSBpower_fwd);
callVcb.attribute(GATKVCFConstants.TUMOR_SB_POWER_REV_KEY, tumorSBpower_rev);
// TODO: add vcf INFO fields. see callVcb.attribute(GATKVCFConstants.HAPLOTYPE_COUNT_KEY, haplotypeCount);
if ((tumorSBpower_fwd > MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_fwd < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD) ||
(tumorSBpower_rev > MTAC.STRAND_ARTIFACT_POWER_THRESHOLD && tumorLod_rev < MTAC.STRAND_ARTIFACT_LOD_THRESHOLD))
callVcb.filter(GATKVCFConstants.STRAND_ARTIFACT_FILTER_NAME);
}
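The decision rule applied in the filter above reduces to: flag the call if either strand had adequate power to detect the variant yet failed to reach the LOD threshold on that strand. A standalone sketch of that predicate follows; the class and method names are illustrative, not the MTAC fields.

```java
// Hypothetical sketch of the M1-style strand-artifact decision used above.
public class StrandArtifactCheck {
    // Flag when one strand had the power to see the variant but its LOD fell short there.
    public static boolean isLikelyArtifact(double powerFwd, double lodFwd,
                                           double powerRev, double lodRev,
                                           double powerThreshold, double lodThreshold) {
        return (powerFwd > powerThreshold && lodFwd < lodThreshold)
            || (powerRev > powerThreshold && lodRev < lodThreshold);
    }
}
```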
// TODO: this probably belongs in M2::calculateFilters()
if (numPassingAlts > 1) {
callVcb.filter(GATKVCFConstants.TRIALLELIC_SITE_FILTER_NAME);
}
// build genotypes TODO: this part needs review and refactor
List<Allele> tumorAlleles = new ArrayList<>();
tumorAlleles.add(mergedVC.getReference());
tumorAlleles.add(mergedVC.getAlternateAllele(lodInd));
GenotypeBuilder tumorGenotype =
new GenotypeBuilder(tumorSampleName, tumorAlleles);
tumorGenotype.attribute(GATKVCFConstants.ALLELE_FRACTION_KEY, afs[lodInd]);
// how should we set the genotype properly here?
List<Allele> refAlleles = new ArrayList<>();
refAlleles.add(mergedVC.getReference());
refAlleles.add(mergedVC.getReference());
tumorAlleles.add(alleleWithHighestTumorLOD);
Genotype tumorGenotype = new GenotypeBuilder(tumorSampleName, tumorAlleles)
.attribute(GATKVCFConstants.ALLELE_FRACTION_KEY, altAlleleFractions.getAlt(alleleWithHighestTumorLOD))
.make(); // TODO: add ADs?
List<Genotype> genotypes = new ArrayList<>();
genotypes.add(tumorGenotype.make());
genotypes.add(tumorGenotype);
// if we are calling with a normal, add that sample in
// We assume that the genotype in the normal is 0/0
// TODO: is normal always homozygous reference?
List<Allele> homRefAllelesforNormalGenotype = new ArrayList<>();
homRefAllelesforNormalGenotype.addAll(Collections.nCopies(2, mergedVC.getReference()));
// if we are calling with a normal, build the genotype for the sample to appear in vcf
int REF = 0, ALT = 1;
if (hasNormal) {
int[] normalCounts = getRefAltCount(mergedVC, normalPRALM);
int[] normalAD = new int[2];
normalAD[REF_INDEX] = normalCounts[REF_INDEX];
normalAD[1] = normalCounts[lodInd+1];
double normalF = (double) normalAD[1] / ((double) normalAD[REF_INDEX] + (double) normalAD[1]);
PerAlleleCollection<Integer> normalCounts = getRefAltCount(mergedVC, normalPRALM, false);
final int normalRefAlleleDepth = normalCounts.getRef();
final int normalAltAlleleDepth = normalCounts.getAlt(alleleWithHighestTumorLOD);
final int[] normalAlleleDepths = { normalRefAlleleDepth, normalAltAlleleDepth };
final double normalAlleleFraction = (double) normalAltAlleleDepth / ( normalRefAlleleDepth + normalAltAlleleDepth);
GenotypeBuilder normalGenotype =
new GenotypeBuilder(matchedNormalSampleName, refAlleles).AD(normalAD);
normalGenotype.attribute(GATKVCFConstants.ALLELE_FRACTION_KEY, normalF);
genotypes.add(normalGenotype.make());
final Genotype normalGenotype = new GenotypeBuilder(matchedNormalSampleName, homRefAllelesforNormalGenotype)
.AD(normalAlleleDepths)
.attribute(GATKVCFConstants.ALLELE_FRACTION_KEY, normalAlleleFraction)
.make();
genotypes.add(normalGenotype);
}
//only use alleles found in the tumor (
@@ -426,91 +432,125 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
}
}
/** Calculate the genotype likelihoods for variable allele fraction
/** Calculate the likelihoods of hom ref and each het genotype of the form ref/alt
*
* @param mergedVC input VC
* @param tumorPRALM read likelihoods
* @param originalNormalMQs original MQs, before boosting normals to avoid qual capping
* @param afs allele fraction(s) for alternate allele(s)
* @param alleleFractions allele fraction(s) for alternate allele(s)
*
* @return genotype likelihoods for homRef (index 0) and het for each alternate allele
* @return genotype likelihoods for homRef and het for each alternate allele
*/
private double[] getVariableGenotypeLikelihoods(final VariantContext mergedVC, final PerReadAlleleLikelihoodMap tumorPRALM,
final Map<String, Integer> originalNormalMQs, double[] afs) {
double[] genotypeLikelihoods = new double[mergedVC.getNAlleles()];
for(Map.Entry<GATKSAMRecord,Map<Allele, Double>> e : tumorPRALM.getLikelihoodReadMap().entrySet()) {
Map<Allele, Double> m = e.getValue();
Double refLL = m.get(mergedVC.getReference());
if (originalNormalMQs.get(e.getKey().getReadName()) != 0) {
genotypeLikelihoods[0] += Math.log10(Math.pow(10, refLL));
for (int altInd = 0; altInd < mergedVC.getNAlleles()-1; altInd++) {
Double altLL = m.get(mergedVC.getAlternateAllele(altInd));
genotypeLikelihoods[altInd+1] += Math.log10(Math.pow(10, refLL) * (1 - afs[altInd]) + Math.pow(10, altLL) * afs[altInd]);
}
}
private PerAlleleCollection<Double> getHetGenotypeLogLikelihoods(final VariantContext mergedVC,
final PerReadAlleleLikelihoodMap tumorPRALM,
final Map<String, Integer> originalNormalMQs,
final PerAlleleCollection<Double> alleleFractions) {
// make sure that alleles in alleleFraction are a subset of alleles in the variant context
if (! mergedVC.getAlternateAlleles().containsAll(alleleFractions.getAltAlleles()) ){
throw new IllegalArgumentException("alleleFractions has alleles that are not in the variant context");
}
return genotypeLikelihoods;
final PerAlleleCollection<MutableDouble> genotypeLogLikelihoods = PerAlleleCollection.createPerRefAndAltAlleleCollection();
for (final Allele allele : mergedVC.getAlleles()){
genotypeLogLikelihoods.set(allele, new MutableDouble(0.0));
}
final Allele refAllele = mergedVC.getReference();
for(Map.Entry<GATKSAMRecord,Map<Allele, Double>> readAlleleLikelihoodMap : tumorPRALM.getLikelihoodReadMap().entrySet()) {
Map<Allele, Double> alleleLikelihoodMap = readAlleleLikelihoodMap.getValue();
if (originalNormalMQs.get(readAlleleLikelihoodMap.getKey().getReadName()) == 0) {
continue;
}
final double readRefLogLikelihood = alleleLikelihoodMap.get(refAllele);
genotypeLogLikelihoods.getRef().add(readRefLogLikelihood);
for (Allele altAllele : alleleFractions.getAltAlleles()) {
double readAltLogLikelihood = alleleLikelihoodMap.get(altAllele);
double adjustedReadAltLL = Math.log10(
Math.pow(10, readRefLogLikelihood) * (1 - alleleFractions.getAlt(altAllele)) +
Math.pow(10, readAltLogLikelihood) * alleleFractions.getAlt(altAllele)
);
genotypeLogLikelihoods.get(altAllele).add(adjustedReadAltLL);
}
}
final PerAlleleCollection<Double> result = PerAlleleCollection.createPerRefAndAltAlleleCollection();
mergedVC.getAlleles().stream().forEach(a -> result.set(a,genotypeLogLikelihoods.get(a).toDouble()));
return result;
}
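The per-read mixture computed above has a simple closed form: for a candidate ref/alt het with alt allele fraction f, each read contributes log10((1-f)*10^refLL + f*10^altLL) to the het genotype's log likelihood. A minimal standalone sketch of that term (method name hypothetical):

```java
public class HetLikelihoodSketch {
    // Per-read log10 likelihood of a ref/alt het genotype with alt allele fraction f.
    public static double hetReadLogLikelihood(double refLog10, double altLog10, double f) {
        return Math.log10((1.0 - f) * Math.pow(10, refLog10)
                        + f * Math.pow(10, altLog10));
    }
}
```

Note the mixture collapses to the ref likelihood at f = 0 and the alt likelihood at f = 1, which is a quick sanity check on the implementation.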
/**
* Find the allele fractions for each alternate allele
*
* @param vc input VC, for alleles
* @param map read likelihoods
* @param pralm read likelihoods
* @return estimated AF for each alt
*/
// FIXME: calculate using the uncertainty rather than this cheap approach
private double[] estimateAlleleFraction(VariantContext vc, PerReadAlleleLikelihoodMap map) {
int[] counts = getRefAltCount(vc, map);
int numAlts = vc.getNAlleles()-1;
double[] afs = new double[numAlts];
int refCount = counts[0];
int altCount;
private PerAlleleCollection<Double> estimateAlleleFraction(final VariantContext vc,
final PerReadAlleleLikelihoodMap pralm,
final boolean oneStrandOnly) {
final PerAlleleCollection<Integer> alleleCounts = getRefAltCount(vc, pralm, oneStrandOnly);
final PerAlleleCollection<Double> alleleFractions = PerAlleleCollection.createPerAltAlleleCollection();
for(int altInd = 0; altInd < numAlts; altInd++) {
altCount = counts[altInd+1];
afs[altInd] = (double) altCount / ((double) refCount + (double) altCount);
//logger.info("Counted " + refCount + " ref and " + altCount + " alt " );
int refCount = alleleCounts.getRef();
for ( final Allele altAllele : vc.getAlternateAlleles() ) {
int altCount = alleleCounts.getAlt(altAllele);
double alleleFraction = (double) altCount / (refCount + altCount);
alleleFractions.setAlt(altAllele, alleleFraction);
// logger.info("Counted " + refCount + " ref and " + altCount + " alt " );
}
return afs;
return alleleFractions;
}
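As the FIXME notes, this is a point estimate with no uncertainty: AF = altCount / (refCount + altCount) per alt allele. Sketched standalone (names hypothetical):

```java
public class AlleleFractionSketch {
    // Naive allele-fraction point estimate from informative read counts.
    public static double estimate(int refCount, int altCount) {
        return (double) altCount / (refCount + altCount);
    }
}
```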
/**
* Evaluate the most likely allele for each read, if it is in fact informative
* Go through the PRALM and tally the most likely allele in each read. Only count informative reads.
*
* @param mergedVC input VC, for alleles
* @param afMap read likelihoods
* @param vc input VC, for alleles
* @param pralm read likelihoods
* @return the informative read counts for the ref and each alt allele
*/
-    // TODO: ensure there are only two alleles in the VC
-    private int[] getRefAltCount(VariantContext mergedVC, PerReadAlleleLikelihoodMap afMap) {
-        int counts[] = new int[mergedVC.getNAlleles()];
-        int REF = 0;
-        for(Map.Entry<GATKSAMRecord,Map<Allele, Double>> e : afMap.getLikelihoodReadMap().entrySet()) {
-            Map<Allele, Double> m = e.getValue();
-            Double rl = m.get(mergedVC.getReference());
-            for(int altInd=0; altInd<mergedVC.getNAlleles()-1;altInd++) {
-                Double al = m.get(mergedVC.getAlternateAllele(altInd));
-                logger.debug("At " + mergedVC.getStart() + ", for read " + e.getKey().getReadName() + ", al = " + al + ", rl = " + rl + ", diff = " + (al - rl));
-                if (arePairHMMLikelihoodsInformative(rl, al)) {
-                    if (rl > al) {
-                        counts[REF]++;
-                    } else {
-                        counts[altInd+1]++;
-                        logM2Debug("Using " + e.getKey().toString() + " towards alternate allele count");
-                    }
-                }
-            }
-            // if (al >= rl) logger.info("Alt found in " + e.getKey().getReadName());
-        }
-        return counts;
-    }
+    private PerAlleleCollection<Integer> getRefAltCount(final VariantContext vc,
+                                                        final PerReadAlleleLikelihoodMap pralm,
+                                                        final boolean oneStrandOnly) {
+        // Check that the alleles in the variant context are in the PRALM.
+        // Skip the check for a strand-conscious PRALM; + reads may not have alleles seen in - reads, for example.
+        final Set<Allele> vcAlleles = new HashSet<>(vc.getAlleles());
+        if ( ! oneStrandOnly && ! pralm.getAllelesSet().containsAll(vcAlleles) ) {
+            final StringBuilder message = new StringBuilder();
+            message.append("At locus chr" + vc.getContig() + ":" + vc.getStart() + ", the variant context had alleles that are not in the PRALM. ");
+            message.append("VC alleles = " + vcAlleles + ", PRALM alleles = " + pralm.getAllelesSet());
+            logger.warn(message);
+        }
+
+        final PerAlleleCollection<MutableInt> alleleCounts = PerAlleleCollection.createPerRefAndAltAlleleCollection();
+        // initialize the allele counts to 0
+        for (final Allele allele : vcAlleles) {
+            alleleCounts.set(allele, new MutableInt(0));
+        }
+
+        for (final Map.Entry<GATKSAMRecord, Map<Allele, Double>> readAlleleLikelihoodMap : pralm.getLikelihoodReadMap().entrySet()) {
+            final GATKSAMRecord read = readAlleleLikelihoodMap.getKey();
+            final Map<Allele, Double> alleleLikelihoodMap = readAlleleLikelihoodMap.getValue();
+            final MostLikelyAllele mostLikelyAllele = PerReadAlleleLikelihoodMap.getMostLikelyAllele(alleleLikelihoodMap, vcAlleles);
+            if (read.getMappingQuality() > 0 && mostLikelyAllele.isInformative()) {
+                alleleCounts.get(mostLikelyAllele.getMostLikelyAllele()).increment();
+            }
+        }
+
+        final PerAlleleCollection<Integer> result = PerAlleleCollection.createPerRefAndAltAlleleCollection();
+        vc.getAlleles().stream().forEach(a -> result.set(a, alleleCounts.get(a).toInteger()));
+        return result;
+    }
private void logM2Debug(String s) {
if (MTAC.M2_DEBUG) {
@ -518,14 +558,6 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
}
}
// would have used org.broadinstitute.sting.utils.genotyper.PerReadAlleleLikelihoodMap.getMostLikelyAllele but we have this case where
// there is a read that doesn't overlap the variant site, and thus supports both alleles equally.
private boolean arePairHMMLikelihoodsInformative(double l1, double l2) {
// TODO: should this be parameterized, or simply encoded
double EPSILON = 0.1;
return (Math.abs(l1 - l2) >= EPSILON);
}
private void filterPRALMForOverlappingReads(PerReadAlleleLikelihoodMap pralm, Allele ref, int location, boolean retainMismatches) {
Map<GATKSAMRecord, Map<Allele, Double>> m = pralm.getLikelihoodReadMap();
@ -598,36 +630,21 @@ public class SomaticGenotypingEngine extends HaplotypeCallerGenotypingEngine {
}
}
-    private void splitPRALMintoForwardAndReverseReads(final PerReadAlleleLikelihoodMap original, final PerReadAlleleLikelihoodMap forward, final PerReadAlleleLikelihoodMap reverse) {
-        Map<GATKSAMRecord, Map<Allele, Double>> origMap = original.getLikelihoodReadMap();
-        Map<GATKSAMRecord, Map<Allele, Double>> fwdMap = forward.getLikelihoodReadMap();
-        Map<GATKSAMRecord, Map<Allele, Double>> revMap = reverse.getLikelihoodReadMap();
-
-        // iterate through the reads, assign reads and likelihoods to the forward or reverse maps based on the read's strand
-        Set<GATKSAMRecord> forwardReads = new HashSet<>();
-        Set<GATKSAMRecord> reverseReads = new HashSet<>();
-        for(GATKSAMRecord rec : origMap.keySet()) {
-            if (rec.isStrandless())
-                continue;
-            if (rec.getReadNegativeStrandFlag())
-                reverseReads.add(rec);
-            else
-                forwardReads.add(rec);
-        }
-
-        final Iterator<Map.Entry<GATKSAMRecord, Map<Allele, Double>>> it = origMap.entrySet().iterator();
-        while ( it.hasNext() ) {
-            final Map.Entry<GATKSAMRecord, Map<Allele, Double>> record = it.next();
-            if(forwardReads.contains(record.getKey())) {
-                fwdMap.put(record.getKey(), record.getValue());
-                //logM2Debug("Dropping read " + record.getKey() + " due to overlapping read fragment rules");
-            }
-            else if (reverseReads.contains(record.getKey())){
-                revMap.put(record.getKey(), record.getValue());
-            }
-        }
-    }
+    private void splitPRALMintoForwardAndReverseReads(final PerReadAlleleLikelihoodMap originalPRALM, final PerReadAlleleLikelihoodMap forwardPRALM, final PerReadAlleleLikelihoodMap reversePRALM) {
+        final Map<GATKSAMRecord, Map<Allele, Double>> origReadAlleleLikelihoodMap = originalPRALM.getLikelihoodReadMap();
+        for (final GATKSAMRecord read : origReadAlleleLikelihoodMap.keySet()) {
+            if (read.isStrandless())
+                continue;
+
+            for (final Map.Entry<Allele, Double> alleleLikelihoodMap : origReadAlleleLikelihoodMap.get(read).entrySet()) {
+                final Allele allele = alleleLikelihoodMap.getKey();
+                final Double likelihood = alleleLikelihoodMap.getValue();
+                if (read.getReadNegativeStrandFlag())
+                    reversePRALM.add(read, allele, likelihood);
+                else
+                    forwardPRALM.add(read, allele, likelihood);
+            }
+        }
+    }
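The rewritten method streams each (read, allele, likelihood) entry straight into the forward or reverse PRALM instead of first building intermediate read sets. The partition logic can be sketched with a minimal stand-in `Read` type (all names below are illustrative, not GATK classes):

```java
import java.util.ArrayList;
import java.util.List;

public class StrandSplitSketch {
    // Minimal stand-in for a read: just a name and strand flags.
    static class Read {
        final String name;
        final boolean negativeStrand;
        final boolean strandless;
        Read(String name, boolean negativeStrand, boolean strandless) {
            this.name = name;
            this.negativeStrand = negativeStrand;
            this.strandless = strandless;
        }
    }

    // Partition reads by strand; strandless reads belong to neither side.
    static void split(List<Read> original, List<Read> forward, List<Read> reverse) {
        for (Read r : original) {
            if (r.strandless) continue;
            (r.negativeStrand ? reverse : forward).add(r);
        }
    }

    public static void main(String[] args) {
        List<Read> all = List.of(
                new Read("a", false, false),   // forward
                new Read("b", true, false),    // reverse
                new Read("c", false, true));   // strandless: dropped
        List<Read> fwd = new ArrayList<>(), rev = new ArrayList<>();
        split(all, fwd, rev);
        System.out.println(fwd.size() + " forward, " + rev.size() + " reverse");
    }
}
```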

View File

@ -5,7 +5,7 @@
* SOFTWARE LICENSE AGREEMENT
* FOR ACADEMIC NON-COMMERCIAL RESEARCH PURPOSES ONLY
*
* This Agreement is made between the Broad Institute, Inc. with a principal address at 415 Main Street, Cambridge, MA 02142 (BROAD) and the LICENSEE and is effective at the date the downloading is completed (EFFECTIVE DATE).
* This Agreement is made between the Broad Institute, Inc. with a principal address at 415 Main Street, Cambridge, MA 02142 ("BROAD") and the LICENSEE and is effective at the date the downloading is completed ("EFFECTIVE DATE").
*
* WHEREAS, LICENSEE desires to license the PROGRAM, as defined hereinafter, and BROAD wishes to have this PROGRAM utilized in the public interest, subject only to the royalty-free, nonexclusive, nontransferable license rights of the United States Government pursuant to 48 CFR 52.227-14; and
* WHEREAS, LICENSEE desires to license the PROGRAM and BROAD desires to grant a license on the following terms and conditions.
@ -21,11 +21,11 @@
* 2.3 License Limitations. Nothing in this Agreement shall be construed to confer any rights upon LICENSEE by implication, estoppel, or otherwise to any computer software, trademark, intellectual property, or patent rights of BROAD, or of any other entity, except as expressly granted herein. LICENSEE agrees that the PROGRAM, in whole or part, shall not be used for any commercial purpose, including without limitation, as the basis of a commercial software or hardware product or to provide services. LICENSEE further agrees that the PROGRAM shall not be copied or otherwise adapted in order to circumvent the need for obtaining a license for use of the PROGRAM.
*
* 3. PHONE-HOME FEATURE
* LICENSEE expressly acknowledges that the PROGRAM contains an embedded automatic reporting system (PHONE-HOME) which is enabled by default upon download. Unless LICENSEE requests disablement of PHONE-HOME, LICENSEE agrees that BROAD may collect limited information transmitted by PHONE-HOME regarding LICENSEE and its use of the PROGRAM. Such information shall include LICENSEES user identification, version number of the PROGRAM and tools being run, mode of analysis employed, and any error reports generated during run-time. Collection of such information is used by BROAD solely to monitor usage rates, fulfill reporting requirements to BROAD funding agencies, drive improvements to the PROGRAM, and facilitate adjustments to PROGRAM-related documentation.
* LICENSEE expressly acknowledges that the PROGRAM contains an embedded automatic reporting system ("PHONE-HOME") which is enabled by default upon download. Unless LICENSEE requests disablement of PHONE-HOME, LICENSEE agrees that BROAD may collect limited information transmitted by PHONE-HOME regarding LICENSEE and its use of the PROGRAM. Such information shall include LICENSEE'S user identification, version number of the PROGRAM and tools being run, mode of analysis employed, and any error reports generated during run-time. Collection of such information is used by BROAD solely to monitor usage rates, fulfill reporting requirements to BROAD funding agencies, drive improvements to the PROGRAM, and facilitate adjustments to PROGRAM-related documentation.
*
* 4. OWNERSHIP OF INTELLECTUAL PROPERTY
* LICENSEE acknowledges that title to the PROGRAM shall remain with BROAD. The PROGRAM is marked with the following BROAD copyright notice and notice of attribution to contributors. LICENSEE shall retain such notice on all copies. LICENSEE agrees to include appropriate attribution if any results obtained from use of the PROGRAM are included in any publication.
* Copyright 2012-2014 Broad Institute, Inc.
* Copyright 2012-2016 Broad Institute, Inc.
* Notice of attribution: The GATK3 program was made available through the generosity of Medical and Population Genetics program at the Broad Institute, Inc.
* LICENSEE shall not use any trademark or trade name of BROAD, or any variation, adaptation, or abbreviation, of such marks or trade names, or any names of officers, faculty, students, employees, or agents of BROAD except as states above for attribution purposes.
*
@ -54,83 +54,143 @@ package org.broadinstitute.gatk.tools.walkers.cancer.m2;
 import org.apache.commons.math.MathException;
 import org.apache.commons.math.distribution.BinomialDistribution;
 import org.apache.commons.math.distribution.BinomialDistributionImpl;
+import org.apache.commons.math3.util.Pair;
+
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.OptionalInt;
+import java.util.stream.IntStream;

-public class TumorPowerCalculator extends AbstractPowerCalculator{
-    private double constantContamination;
-    private boolean enableSmoothing;
-
-    public TumorPowerCalculator(double constantEps, double constantLodThreshold, double constantContamination) {
-        this(constantEps, constantLodThreshold, constantContamination, true);
-    }
-
-    public TumorPowerCalculator(double constantEps, double constantLodThreshold, double constantContamination, boolean enableSmoothing) {
-        this.constantEps = constantEps;
-        this.constantLodThreshold = constantLodThreshold;
-        this.constantContamination = constantContamination;
+/**
+ * We store a memo to avoid repeated computation of the statistical power to detect a variant.
+ * The key of the memo is a pair of numbers: the number of reads and the estimated allele fraction.
+ */
+public class TumorPowerCalculator {
+    private final double errorProbability;
+    private final double tumorLODThreshold;
+    private final double contamination;
+    private final boolean enableSmoothing;
+
+    public static int numCacheHits = 0;
+    private final HashMap<PowerCacheKey, Double> cache = new HashMap<>();
+
+    public TumorPowerCalculator(double errorProbability, double tumorLODThreshold, double contamination) {
+        this(errorProbability, tumorLODThreshold, contamination, true);
+    }
+
+    public TumorPowerCalculator(double errorProbability, double tumorLODThreshold, double contamination, boolean enableSmoothing) {
+        this.errorProbability = errorProbability;
+        this.tumorLODThreshold = tumorLODThreshold;
+        this.contamination = contamination;
         this.enableSmoothing = enableSmoothing;
     }
-    public double cachingPowerCalculation(int n, double delta) throws MathException {
-        PowerCacheKey key = new PowerCacheKey(n, delta);
+    /**
+     * A helper class that acts as the key to the memo of pre-computed power.
+     *
+     * TODO: It is not ideal to use a double as a key. Refactor so that the keys are numAlts and numReads,
+     * which are integers, and calculate numAlts/numReads when we need the allele fraction.
+     */
+    private static class PowerCacheKey extends Pair<Integer, Double> {
+        private final double alleleFraction;
+        private final int numReads;
+
+        public PowerCacheKey(final int numReads, final double alleleFraction) {
+            super(numReads, alleleFraction);
+            this.alleleFraction = alleleFraction;
+            this.numReads = numReads;
+        }
+
+        private static boolean closeEnough(final double x, final double y, final double epsilon){
+            return Math.abs(x - y) < epsilon;
+        }
+
+        @Override
+        public boolean equals(Object o) {
+            if (this == o) return true;
+            if (o == null || getClass() != o.getClass()) return false;
+            final PowerCacheKey that = (PowerCacheKey) o;
+            return closeEnough(alleleFraction, that.alleleFraction, 0.001) && numReads == that.numReads;
+        }
+
+        @Override
+        public int hashCode() {
+            int result = numReads;
+            final long temp = alleleFraction != +0.0d ? Double.doubleToLongBits(alleleFraction) : 0L;
+            result = 31 * result + (int) (temp ^ (temp >>> 32));
+            return result;
+        }
+    }
+
+    /**
+     * @param numReads total number of reads, REF and ALT combined, on the + or - strand
+     * @param alleleFraction the true allele fraction, estimated as the combined allele fraction from + and - reads
+     * @return probability of correctly calling the variant (i.e. power) given the estimated allele fraction and
+     *         number of reads; we compute power separately for each strand (+ and -)
+     * @throws MathException
+     */
+    public double cachedPowerCalculation(final int numReads, final double alleleFraction) throws MathException {
+        final PowerCacheKey key = new PowerCacheKey(numReads, alleleFraction);
+
+        // First look up whether the power for the given number of reads and allele fraction has already
+        // been computed and stored in the cache; if not, compute it and store it in the cache.
         Double power = cache.get(key);
         if (power == null) {
-            power = calculatePower(n, constantEps, constantLodThreshold, delta, constantContamination, enableSmoothing);
+            power = calculatePower(numReads, alleleFraction);
             cache.put(key, power);
         } else {
             numCacheHits++;
         }
         return power;
     }
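A caveat on the memo key above: an epsilon-based `equals` is inconsistent with `hashCode` (two keys that are `closeEnough` can still hash to different buckets, so the cache may miss anyway). One way to address the TODO, sketched here with illustrative names and an arbitrary placeholder for the real power computation, is to quantize the allele fraction into fixed-width bins so that equal keys hash identically:

```java
import java.util.HashMap;
import java.util.Map;

public class PowerMemoSketch {
    // Pack (numReads, binned allele fraction) into one long key.
    // Quantizing to 0.001-wide bins replaces the epsilon comparison in equals().
    static long key(int numReads, double alleleFraction) {
        final long bucket = Math.round(alleleFraction / 0.001);
        return ((long) numReads << 32) | bucket;
    }

    private final Map<Long, Double> cache = new HashMap<>();
    int cacheHits = 0;

    double cachedPower(int numReads, double alleleFraction) {
        final long k = key(numReads, alleleFraction);
        final Double hit = cache.get(k);
        if (hit != null) {
            cacheHits++;
            return hit;
        }
        final double power = expensivePower(numReads, alleleFraction);
        cache.put(k, power);
        return power;
    }

    // placeholder standing in for the real binomial power calculation
    private static double expensivePower(int numReads, double alleleFraction) {
        return 1 - Math.pow(1 - alleleFraction, numReads);
    }

    public static void main(String[] args) {
        PowerMemoSketch memo = new PowerMemoSketch();
        memo.cachedPower(30, 0.1);
        memo.cachedPower(30, 0.1);      // same key: served from cache
        memo.cachedPower(30, 0.1004);   // falls into the same 0.001 bin
        System.out.println(memo.cacheHits + " cache hits");
    }
}
```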
-    protected static double calculateTumorLod(int depth, int alts, double eps, double contam) {
-        double f = (double) alts / (double) depth;
-        return (AbstractPowerCalculator.calculateLogLikelihood(depth, alts, eps, f) - AbstractPowerCalculator.calculateLogLikelihood(depth, alts, eps, Math.min(f,contam)));
-    }
+    /* helper function for calculateTumorLod */
+    private double calculateLogLikelihood(final int numReads, final int numAlts, final double alleleFraction) {
+        return (numReads - numAlts) * Math.log10(alleleFraction * errorProbability + (1 - alleleFraction) * (1 - errorProbability)) +
+                numAlts * Math.log10(alleleFraction * (1 - errorProbability) + (1 - alleleFraction) * errorProbability);
+    }
+
+    private double calculateTumorLod(final int numReads, final int numAlts) {
+        final double alleleFraction = (double) numAlts / (double) numReads;
+        final double altLikelihood = calculateLogLikelihood(numReads, numAlts, alleleFraction);
+        final double refLikelihood = calculateLogLikelihood(numReads, numAlts, contamination);
+        return altLikelihood - refLikelihood;
+    }

-    protected static double calculatePower(int depth, double eps, double lodThreshold, double delta, double contam, boolean enableSmoothing) throws MathException {
-        if (depth==0) return 0;
-
-        // calculate the probability of each configuration
-        double p_alt_given_e_delta = delta*(1d-eps) + (1d-delta)*eps;
-        BinomialDistribution binom = new BinomialDistributionImpl(depth, p_alt_given_e_delta);
-        double[] p = new double[depth+1];
-        for(int i=0; i<p.length; i++) {
-            p[i] = binom.probability(i);
-        }
-
-        // calculate the LOD scores
-        double[] lod = new double[depth+1];
-        for(int i=0; i<lod.length; i++) {
-            lod[i] = calculateTumorLod(depth, i, eps, contam);
-        }
-
-        int k = -1;
-        for(int i=0; i<lod.length; i++) {
-            if (lod[i] >= lodThreshold) {
-                k = i;
-                break;
-            }
-        }
-
-        // if no depth meets the lod score, the power is zero
-        if (k == -1) {
-            return 0;
-        }
-
-        double power = 0;
-
-        // here we correct for the fact that the exact lod threshold is likely somewhere between
-        // the k and k-1 bin, so we prorate the power from that bin
-        // if k==0, it must be that lodThreshold == lod[k] so we don't have to make this correction
-        if ( enableSmoothing && k > 0 ) {
-            double x = 1d - (lodThreshold - lod[k-1]) / (lod[k] - lod[k-1]);
-            power = x*p[k-1];
-        }
-
-        for(int i=k; i<p.length; i++) {
-            power += p[i];
-        }
+    private double calculatePower(final int numReads, final double alleleFraction) throws MathException {
+        if (numReads == 0) return 0;
+
+        // probability that a read carries the ALT allele, including the 1/3 chance that a sequencing
+        // error turns a REF base into this particular ALT base
+        final double probAltRead = alleleFraction * (1 - errorProbability) + (1.0/3) * (1 - alleleFraction) * errorProbability;
+        final BinomialDistribution binom = new BinomialDistributionImpl(numReads, probAltRead);
+        final double[] binomialProbabilities = IntStream.range(0, numReads + 1).mapToDouble(binom::probability).toArray();
+
+        // find the smallest number of ALT reads k such that tumorLOD(k) > tumorLODThreshold
+        final OptionalInt smallestKAboveLogThreshold = IntStream.range(0, numReads + 1)
+                .filter(k -> calculateTumorLod(numReads, k) > tumorLODThreshold)
+                .findFirst();
+
+        // if no number of ALT reads meets the LOD threshold, the power is zero
+        if (! smallestKAboveLogThreshold.isPresent()){
+            return 0;
+        }
+
+        if (smallestKAboveLogThreshold.getAsInt() <= 0){
+            throw new IllegalStateException("smallest k that meets the tumor LOD threshold is less than or equal to 0");
+        }
+
+        double power = Arrays.stream(binomialProbabilities, smallestKAboveLogThreshold.getAsInt(), binomialProbabilities.length).sum();
+
+        // here we correct for the fact that the exact LOD threshold is likely somewhere between
+        // the k and k-1 bins, so we prorate the power from the k-1 bin
+        if ( enableSmoothing ){
+            final double tumorLODAtK = calculateTumorLod(numReads, smallestKAboveLogThreshold.getAsInt());
+            final double tumorLODAtKMinusOne = calculateTumorLod(numReads, smallestKAboveLogThreshold.getAsInt() - 1);
+            final double weight = 1 - (tumorLODThreshold - tumorLODAtKMinusOne) / (tumorLODAtK - tumorLODAtKMinusOne);
+            power += weight * binomialProbabilities[smallestKAboveLogThreshold.getAsInt() - 1];
+        }
+
+        return power;
+    }
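Stripped of the commons-math dependency, the cache, and the smoothing correction, the power computation reduces to: find the smallest ALT-read count k whose tumor LOD clears the threshold, then sum the binomial tail from k. A self-contained sketch of that logic (names are illustrative; this is not the shipped implementation):

```java
public class TumorPowerSketch {
    // log10 likelihood of observing numAlts ALT reads out of numReads given allele fraction f
    static double logLikelihood(int numReads, int numAlts, double f, double eps) {
        return (numReads - numAlts) * Math.log10(f * eps + (1 - f) * (1 - eps))
                + numAlts * Math.log10(f * (1 - eps) + (1 - f) * eps);
    }

    // LOD of the observed allele fraction against the contamination-only model
    static double tumorLod(int numReads, int numAlts, double eps, double contamination) {
        final double f = (double) numAlts / numReads;
        return logLikelihood(numReads, numAlts, f, eps) - logLikelihood(numReads, numAlts, contamination, eps);
    }

    // binomial pmf computed in log space to avoid overflow of the coefficient
    static double binomialPmf(int n, int k, double p) {
        double logCoeff = 0;
        for (int i = 1; i <= k; i++) logCoeff += Math.log(n - k + i) - Math.log(i);
        return Math.exp(logCoeff + k * Math.log(p) + (n - k) * Math.log(1 - p));
    }

    // power = P(#ALT reads >= smallest k whose LOD clears the threshold), no smoothing
    static double power(int numReads, double f, double eps, double lodThreshold, double contamination) {
        if (numReads == 0) return 0;
        final double probAltRead = f * (1 - eps) + (1.0 / 3) * (1 - f) * eps;
        int k = -1;
        for (int i = 0; i <= numReads; i++) {
            if (tumorLod(numReads, i, eps, contamination) > lodThreshold) { k = i; break; }
        }
        if (k == -1) return 0;
        double power = 0;
        for (int i = k; i <= numReads; i++) power += binomialPmf(numReads, i, probAltRead);
        return power;
    }

    public static void main(String[] args) {
        // with eps = 0.001, LOD threshold 2.0, no contamination
        System.out.println(power(100, 0.2, 0.001, 2.0, 0.0));
        System.out.println(power(30, 0.1, 0.001, 2.0, 0.0));
    }
}
```

Without the smoothing step this slightly underestimates the cached calculator's values (e.g. the n=30, f=0.1 case lands a little below the 0.8864 asserted in the unit test below), which is exactly the gap the prorated k-1 bin closes.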

View File

@ -72,6 +72,9 @@ public class MuTect2IntegrationTest extends WalkerTest {
final static String DREAM3_TP_INTERVALS_FILE = privateTestDir + "m2_dream3.tp.intervals";
final static String DREAM3_FP_INTERVALS_FILE = privateTestDir + "m2_dream3.fp.intervals";
final static String MULTIALLELIC_TUMOR_BAM = privateTestDir + "m2-multiallelic-tumor.bam";
final String commandLine =
"-T MuTect2 --no_cmdline_in_header -dt NONE --disableDithering -alwaysloadVectorHMM -pairHMM LOGLESS_CACHING -ip 50 -R %s --dbsnp %s --cosmic %s --normal_panel %s -I:tumor %s -I:normal %s -L %s";
@ -121,7 +124,7 @@ public class MuTect2IntegrationTest extends WalkerTest {
@Test
public void testMicroRegression() {
M2Test(CCLE_MICRO_TUMOR_BAM, CCLE_MICRO_NORMAL_BAM, CCLE_MICRO_INTERVALS_FILE, "", "a7658ccfb75bf1ce8d3d3cfbf3b552f0");
M2Test(CCLE_MICRO_TUMOR_BAM, CCLE_MICRO_NORMAL_BAM, CCLE_MICRO_INTERVALS_FILE, "", "dc6d742e85a59b237f5541109a6d343e");
}
/**
@ -131,7 +134,7 @@ public class MuTect2IntegrationTest extends WalkerTest {
*/
@Test
public void testTruePositivesDream3() {
M2Test(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, DREAM3_TP_INTERVALS_FILE, "", "91dee82a13275e5568f5d2e680e3162b");
M2Test(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, DREAM3_TP_INTERVALS_FILE, "", "7faeb329798cca63a42867404111847c");
}
/**
@ -140,7 +143,7 @@ public class MuTect2IntegrationTest extends WalkerTest {
@Test
public void testTruePositivesDream3TrackedDropped() {
M2TestWithDroppedReads(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, "21:10935369", "",
"4f1337df1de5dd4468e2d389403ca785",
"a2e6cc12a21219d510b6719ee86c676e",
"b536e76870326b4be01b8d6b83c1cf1c");
}
@ -150,7 +153,7 @@ public class MuTect2IntegrationTest extends WalkerTest {
*/
@Test
public void testFalsePositivesDream3() {
M2Test(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, DREAM3_FP_INTERVALS_FILE, "", "6be3fc318e2c22a28098f58b76c9a5a1");
M2Test(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, DREAM3_FP_INTERVALS_FILE, "", "fe3adcf8ac45e8ec9a9feb26908f67a9"); // e2413f4166b6ed20be6cdee6616ba43d
}
/**
@ -158,7 +161,7 @@ public class MuTect2IntegrationTest extends WalkerTest {
*/
@Test
public void testContaminationCorrection() {
M2Test(CCLE_MICRO_TUMOR_BAM, CCLE_MICRO_NORMAL_BAM, CCLE_MICRO_INTERVALS_FILE, "-contamination 0.1", "b1010a6614b0332c41fd6da9d5f6b14e");
M2Test(CCLE_MICRO_TUMOR_BAM, CCLE_MICRO_NORMAL_BAM, CCLE_MICRO_INTERVALS_FILE, "-contamination 0.1", "4ffcef4c72ac72b9b8738efdcf3e04e9");
}
/**
@ -166,7 +169,19 @@ public class MuTect2IntegrationTest extends WalkerTest {
*/
@Test
public void testTumorOnly(){
m2TumorOnlyTest(CCLE_MICRO_TUMOR_BAM, "2:166000000-167000000", "", "bb0cddfdc29500fbea68a0913d6706a3");
m2TumorOnlyTest(CCLE_MICRO_TUMOR_BAM, "2:166000000-167000000", "", "6044780242414820090c5b4b1d4b8ac0");
}
@Test
public void testStrandArtifactFilter(){
M2Test(DREAM3_TUMOR_BAM, DREAM3_NORMAL_BAM, DREAM3_FP_INTERVALS_FILE, "--enable_strand_artifact_filter", "b988ba4b5f3af4674e28b3501bd3b124");
}
// @Test
// public void testMultiAllelicSite(){
// // TODO need b38 reference
// m2TumorOnlyTest(MULTIALLELIC_TUMOR_BAM, "1:23558000-23560000", "", "5c7182623391c1faec3f7c05c0506781")
// }
}

View File

@ -49,92 +49,59 @@
* 8.7 Governing Law. This Agreement shall be construed, governed, interpreted and applied in accordance with the internal laws of the Commonwealth of Massachusetts, U.S.A., without regard to conflict of laws principles.
*/
-package org.broadinstitute.gatk.tools.walkers.cancer.m2
-
-import java.io.File
-
-import org.broadinstitute.gatk.queue.QScript
-import org.broadinstitute.gatk.queue.extensions.gatk._
-import org.broadinstitute.gatk.queue.function.CommandLineFunction
-import org.broadinstitute.gatk.queue.util.QScriptUtils
-import org.broadinstitute.gatk.utils.commandline.{Input, Output}
-import org.broadinstitute.gatk.utils.variant.GATKVariantContextUtils.FilteredRecordMergeType
-
-import scala.collection.mutable.ListBuffer
-
-class create_M2_pon extends QScript {
-  @Argument(shortName = "bams", required = true, doc = "file of all BAM files")
-  var allBams: String = ""
-
-  @Argument(shortName = "o", required = true, doc = "Output prefix")
-  var outputPrefix: String = ""
-
-  @Argument(shortName = "minN", required = false, doc = "minimum number of sample observations to include in PON")
-  var minN: Int = 2
-
-  @Argument(doc="Reference fasta file to process with", fullName="reference", shortName="R", required=false)
-  var reference = new File("/seq/references/Homo_sapiens_assembly19/v1/Homo_sapiens_assembly19.fasta")
-
-  @Argument(doc="Intervals file to process with", fullName="intervals", shortName="L", required=true)
-  var intervals : File = ""
-
-  @Argument(shortName = "sc", required = false, doc = "base scatter count")
-  var scatter: Int = 10
-
-  def script() {
-    val bams = QScriptUtils.createSeqFromFile(allBams)
-    val genotypesVcf = outputPrefix + ".genotypes.vcf"
-    val finalVcf = outputPrefix + ".vcf"
-    val perSampleVcfs = new ListBuffer[File]()
-
-    for (bam <- bams) {
-      val outputVcf = "sample-vcfs/" + bam.getName + ".vcf"
-      add( createM2Config(bam, outputVcf))
-      perSampleVcfs += outputVcf
-    }
-
-    val cv = new CombineVariants()
-    cv.reference_sequence = reference
-    cv.memoryLimit = 2
-    cv.setKey = "null"
-    cv.minimumN = minN
-    cv.memoryLimit = 16
-    cv.filteredrecordsmergetype = FilteredRecordMergeType.KEEP_IF_ANY_UNFILTERED
-    cv.filteredAreUncalled = true
-    cv.variant = perSampleVcfs
-    cv.out = genotypesVcf
-
-    // using this instead of "sites_only" because we want to keep the AC info
-    val vc = new VcfCutter()
-    vc.inVcf = genotypesVcf
-    vc.outVcf = finalVcf
-
-    add (cv, vc)
-  }
-
-  def createM2Config(bam : File, outputVcf : File): org.broadinstitute.gatk.queue.extensions.gatk.MuTect2 = {
-    val mutect2 = new org.broadinstitute.gatk.queue.extensions.gatk.MuTect2
-    mutect2.reference_sequence = reference
-    mutect2.artifact_detection_mode = true
-    mutect2.intervalsString :+= intervals
-    mutect2.memoryLimit = 2
-    mutect2.input_file = List(new TaggedFile(bam, "tumor"))
-    mutect2.scatterCount = scatter
-    mutect2.out = outputVcf
-    mutect2
-  }
-}
-
-class VcfCutter extends CommandLineFunction {
-  @Input(doc = "vcf to cut") var inVcf: File = _
-  @Output(doc = "output vcf") var outVcf: File = _
-
-  def commandLine = "cat %s | cut -f1-8 > %s".format(inVcf, outVcf)
-}
+package org.broadinstitute.gatk.tools.walkers.cancer.m2;
+
+import htsjdk.variant.variantcontext.Allele;
+import org.testng.annotations.Test;
+
+import static org.testng.Assert.*;
+
+/**
+ * Created by tsato on 6/21/16.
+ */
+public class PerAlleleCollectionTest {
+    @Test
+    public void testSet() throws Exception {
+        PerAlleleCollection<Integer> alleleCounts = PerAlleleCollection.createPerRefAndAltAlleleCollection();
+        Allele refA = Allele.create("A", true);
+        Allele altT = Allele.create("T", false);
+        alleleCounts.set(refA, 40);
+        alleleCounts.set(altT, 10);
+        assertEquals((int) alleleCounts.getRef(), 40);
+        assertEquals((int) alleleCounts.getAlt(altT), 10);
+    }
+
+    @Test
+    public void testGet() throws Exception {
+        PerAlleleCollection<Integer> alleleCounts = PerAlleleCollection.createPerRefAndAltAlleleCollection();
+        Allele refA = Allele.create("A", true);
+        Allele altT = Allele.create("T", false);
+        alleleCounts.set(refA, 40);
+        alleleCounts.set(altT, 10);
+        assertEquals((int) alleleCounts.get(refA), 40);
+        assertEquals((int) alleleCounts.get(altT), 10);
+    }
+
+    @Test
+    public void testGetAltAlleles() throws Exception {
+        PerAlleleCollection<Integer> alleleCounts = PerAlleleCollection.createPerAltAlleleCollection();
+        Allele altA = Allele.create("A", false);
+        Allele altC = Allele.create("C", false);
+        Allele altG = Allele.create("G", false);
+        Allele altT = Allele.create("T", false);
+        Allele[] altAlleles = {altA, altC, altG, altT};
+
+        for (Allele altAllele : altAlleles) {
+            alleleCounts.set(altAllele, 3);
+        }
+
+        for (Allele altAllele : altAlleles) {
+            assertTrue(alleleCounts.getAltAlleles().contains(altAllele));
+        }
+
+        assertFalse(alleleCounts.getAltAlleles().contains(Allele.create("A", true)));
+    }
+}

View File

@ -49,41 +49,32 @@
* 8.7 Governing Law. This Agreement shall be construed, governed, interpreted and applied in accordance with the internal laws of the Commonwealth of Massachusetts, U.S.A., without regard to conflict of laws principles.
*/
-package org.broadinstitute.gatk.queue.qscripts.dev
-
-import org.broadinstitute.gatk.queue.QScript
-import org.broadinstitute.gatk.queue.extensions.gatk._
-
-class run_M2_dream extends QScript {
-
-  @Argument(shortName = "L", required=false, doc = "Intervals file")
-  var intervalsFile: List[File] = Nil
-
-  @Argument(shortName = "normal", required=true, doc = "Normal sample BAM")
-  var normalBAM: String = ""
-
-  @Argument(shortName = "tumor", required=true, doc = "Tumor sample BAM")
-  var tumorBAM: String = ""
-
-  @Argument(shortName = "o", required=true, doc = "Output file")
-  var outputFile: String = ""
-
-  @Argument(shortName = "sc", required=false, doc = "base scatter count")
-  var scatter: Int = 10
-
-  def script() {
-    val mutect2 = new MuTect2
-
-    mutect2.reference_sequence = new File("/seq/references/Homo_sapiens_assembly19/v1/Homo_sapiens_assembly19.fasta")
-    mutect2.cosmic :+= new File("/xchip/cga/reference/hg19/hg19_cosmic_v54_120711.vcf")
-    mutect2.dbsnp = new File("/humgen/gsa-hpprojects/GATK/bundle/current/b37/dbsnp_138.b37.vcf")
-    mutect2.normal_panel :+= new File("/xchip/cga/reference/hg19/wgs_hg19_125_cancer_blood_normal_panel.vcf")
-    mutect2.intervalsString = intervalsFile
-    mutect2.memoryLimit = 2
-    mutect2.input_file = List(new TaggedFile(normalBAM, "normal"), new TaggedFile(tumorBAM, "tumor"))
-    mutect2.scatterCount = scatter
-    mutect2.out = outputFile
-
-    add(mutect2)
-  }
-}
+package org.broadinstitute.gatk.tools.walkers.cancer.m2;
+
+import org.testng.annotations.Test;
+
+import static org.testng.Assert.*;
+
+/**
+ * Created by tsato on 6/19/16.
+ */
+public class TumorPowerCalculatorTest {
+    private boolean closeEnough(double x, double y, double epsilon){
+        return Math.abs(x - y) < epsilon;
+    }
+
+    @Test
+    public void testCachedPowerCalculation() throws Exception {
+        TumorPowerCalculator tpc = new TumorPowerCalculator(0.001, 2.0, 0.0);
+        final double epsilon = 0.0001;
+        assertTrue(closeEnough(tpc.cachedPowerCalculation(100, 0.2), 1.0, epsilon));
+        assertTrue(closeEnough(tpc.cachedPowerCalculation(30, 0.1), 0.8864, epsilon));
+        assertTrue(closeEnough(tpc.cachedPowerCalculation(0, 0.02), 0.0, epsilon));
+        assertTrue(closeEnough(tpc.cachedPowerCalculation(5, 0.01), 0.0520, epsilon));
+    }
+}

View File

@ -303,10 +303,10 @@ public class VariantAnnotatorEngine {
if ( !(annotationType instanceof ActiveRegionBasedAnnotation) )
continue;
-            final Map<String, Object> annotationsFromCurrentType = annotationType.annotate(referenceContext, perReadAlleleLikelihoodMap, newGenotypeAnnotatedVC);
-            if (annotationsFromCurrentType != null) {
-                infoAnnotations.putAll(annotationsFromCurrentType);
-            }
+            final Map<String, Object> annotationsFromCurrentType = annotationType.annotate(null, walker, referenceContext, null, newGenotypeAnnotatedVC, perReadAlleleLikelihoodMap);
+            if (annotationsFromCurrentType != null) {
+                infoAnnotations.putAll(annotationsFromCurrentType);
+            }
}
// create a new VC with info and genotype annotations

View File

@ -488,6 +488,7 @@ public final class AlignmentUtils {
}
}
// pos counts read bases. alignmentPos counts ref bases
int pos = 0;
int alignmentPos = 0;

View File

@ -134,6 +134,11 @@ public final class GATKVCFConstants {
public static final String OXOG_FRACTION_KEY = "FOXOG";
public static final String AS_INSERT_SIZE_RANK_SUM_KEY = "AS_InsertSizeRankSum";
public static final String AS_RAW_INSERT_SIZE_RANK_SUM_KEY = "AS_RAW_InsertSizeRankSum";
public static final String TLOD_FWD_KEY = "TLOD_FWD";
public static final String TLOD_REV_KEY = "TLOD_REV";
public static final String TUMOR_SB_POWER_FWD_KEY = "TUMOR_SB_POWER_FWD";
public static final String TUMOR_SB_POWER_REV_KEY = "TUMOR_SB_POWER_REV";
//FORMAT keys
public static final String ALLELE_BALANCE_KEY = "AB";
@ -173,6 +178,7 @@ public final class GATKVCFConstants {
public static final String STR_CONTRACTION_FILTER_NAME = "str_contraction"; //M2
public static final String TUMOR_LOD_FILTER_NAME = "t_lod_fstar"; //M2
public static final String TRIALLELIC_SITE_FILTER_NAME = "triallelic_site"; //M2
public static final String STRAND_ARTIFACT_FILTER_NAME = "strand_artifact"; // M2
// Symbolic alleles
public final static String SYMBOLIC_ALLELE_DEFINITION_HEADER_TAG = "ALT";

View File

@ -72,6 +72,10 @@ public class GATKVCFHeaderLines {
addFilterLine(new VCFFilterHeaderLine(GATKVCFConstants.TUMOR_LOD_FILTER_NAME, "Tumor does not meet likelihood threshold"));
addFilterLine(new VCFFilterHeaderLine(GATKVCFConstants.STR_CONTRACTION_FILTER_NAME, "Site filtered due to contraction of short tandem repeat region"));
addFilterLine(new VCFFilterHeaderLine(GATKVCFConstants.TRIALLELIC_SITE_FILTER_NAME, "Site filtered because more than two alt alleles pass tumor LOD"));
addFilterLine(new VCFFilterHeaderLine(GATKVCFConstants.STRAND_ARTIFACT_FILTER_NAME, "Strand bias detected: evidence for alt allele comes from one read direction only"));
// addFilterLine(new VCFFilterHeaderLine(GATKVCFConstants.CLUSTERED_READ_POSITION_FILTER_NAME, "Variant appears in similar read positions"));
addFormatLine(new VCFFormatHeaderLine(ALLELE_BALANCE_KEY, 1, VCFHeaderLineType.Float, "Allele balance for each het genotype"));
addFormatLine(new VCFFormatHeaderLine(BASE_COUNTS_BY_SAMPLE_KEY, 4, VCFHeaderLineType.Integer, "Counts of each base by sample"));
@ -201,6 +205,10 @@ public class GATKVCFHeaderLines {
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.NORMAL_LOD_KEY, 1, VCFHeaderLineType.String, "Normal LOD score"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.PANEL_OF_NORMALS_COUNT_KEY, 1, VCFHeaderLineType.String, "Count from Panel of Normals"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.TUMOR_LOD_KEY, 1, VCFHeaderLineType.String, "Tumor LOD score"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.TLOD_FWD_KEY,1,VCFHeaderLineType.Float,"TLOD from forward reads only"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.TLOD_REV_KEY,1,VCFHeaderLineType.Float,"TLOD from reverse reads only"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.TUMOR_SB_POWER_FWD_KEY,1,VCFHeaderLineType.Float,"Strand bias power for forward reads"));
addInfoLine(new VCFInfoHeaderLine(GATKVCFConstants.TUMOR_SB_POWER_REV_KEY,1,VCFHeaderLineType.Float,"Strand bias power for reverse reads"));
}
}