This repository has been archived by the owner on Jan 27, 2020. It is now read-only.

Merge pull request #711 from MaxUlysse/Munin
Improve configuration file priorities
maxulysse authored Dec 21, 2018
2 parents 83d62b2 + bc331a8 commit 8e25ef0
Showing 7 changed files with 23 additions and 50 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -10,6 +10,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
### `Changed`

- [#710](https://github.com/SciLifeLab/Sarek/pull/710) - Improve release checklist and script
- [#711](https://github.com/SciLifeLab/Sarek/pull/711) - Improve configuration priorities

## [2.2.2] - 2018-12-19

1 change: 0 additions & 1 deletion conf/base.config
@@ -10,7 +10,6 @@ wf_repository = 'maxulysse'

params {
// set up default params
containerPath = '' // Path to Singularity images
docker = false // Don't use Docker to build buildContainers.nf
download = false // Don't download reference files in buildReferences.nf
explicitBqsrNeeded = true // Enable recalibration in main.nf
1 change: 1 addition & 0 deletions conf/munin.config
@@ -11,6 +11,7 @@ env {
}

params {
containerPath = '/btb/containers/'
genome_base = params.genome == 'GRCh37' ? '/btb/references/Homo_sapiens/GATK/GRCh37/' : params.genome == 'GRCh38' ? '/btb/references/Homo_sapiens/GATK/GRCh38/' : 'References/smallGRCh37'
singleCPUMem = 15.GB
totalMemory = 754.GB
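The `genome_base` line added to `conf/munin.config` is a chained ternary: it maps the value of `params.genome` to a reference directory, falling back to the small test reference. A more readable, line-broken formulation of the same expression (a sketch for illustration only, not part of this commit) would be:

```
// Same logic as the one-line ternary in conf/munin.config,
// broken across lines for readability:
params.genome_base =
    params.genome == 'GRCh37' ? '/btb/references/Homo_sapiens/GATK/GRCh37/' :
    params.genome == 'GRCh38' ? '/btb/references/Homo_sapiens/GATK/GRCh38/' :
                                'References/smallGRCh37' // fallback used for small test runs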
20 changes: 2 additions & 18 deletions docs/CONFIG.md
@@ -90,16 +90,6 @@ The default profile is `standard`, but Sarek has multiple predefined profiles wh
```bash
nextflow run SciLifeLab/Sarek --sample mysample.tsv -profile myprofile
```
awsbatch {
binac {
btb {
cfc {
docker {
singularity {
singularityPath {
slurm {
slurmDownload {
standard {

### `awsbatch`

@@ -136,18 +126,12 @@ Singularity images need to be set up.

This is another profile for use on a UPPMAX cluster, using the slurm job scheduler with Singularity.
Will run the workflow on `/scratch`.
Singularity images need to be set up.

### `slurmDownload`

This is another profile for use on a UPPMAX cluster, using the slurm job scheduler with Singularity.
Will run the workflow on `/scratch`.
Singularity images will be pulled automatically.
Singularity images are already set up.

### `standard`

This is the default profile, for local use on a UPPMAX cluster with Singularity.
Singularity images need to be set up.
Singularity images are already set up.

## Customisation
The recommended way to use custom settings is to supply Sarek with an additional configuration file. You can use the files in the [`conf/`](https://github.com/SciLifeLab/Sarek/tree/master/conf) directory as inspiration when creating this new `.config` file, and specify it using the `-c` flag:
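A minimal sketch of such a custom file, overriding two parameters that appear in the shipped configs (`containerPath` from `conf/base.config` and `singleCPUMem` from `conf/munin.config`); the paths and values shown are placeholders to adapt to your own system:

```
// custom.config -- minimal example of a user-supplied configuration file
params {
  containerPath = '/path/to/containers/' // where your Singularity images live
  singleCPUMem  = 8.GB                   // memory available per core on your machine
}
```

It would then be passed on the command line with `nextflow run SciLifeLab/Sarek --sample mysample.tsv -c custom.config`, where settings given with `-c` take precedence over the profile defaults.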
4 changes: 1 addition & 3 deletions docs/INSTALL_BIANCA.md
@@ -58,14 +58,12 @@ For more information about using Singularity with UPPMAX, follow the [Singularit
## Install Sarek

Sarek uses Singularity containers to package all the different tools.
All containers are already stored on UPPMAX.
All containers and all Reference files are already stored on UPPMAX.

As `bianca` is secure, no direct download is available, so Sarek will have to be installed and updated manually.

You can either download Sarek on your computer or on `rackham`, make an archive, and send it to `bianca` using `FileZilla` or `sftp`, depending on your preference.

All Reference files are already stored in `bianca`.

```bash
# Connect to rackham
> ssh -AX [USER]@rackham.uppmax.uu.se
17 changes: 9 additions & 8 deletions docs/INSTALL_RACKHAM.md
@@ -8,16 +8,17 @@ This workflow itself needs little installation.

You need to install [Nextflow][nextflow-link] and put it somewhere in your `$PATH`.

The Reference files are already stored in `rackham`.
Sarek uses Singularity containers to package all the different tools.

All containers and all Reference files are already stored on UPPMAX.

Nextflow will automatically fetch Sarek from GitHub when launched if `SciLifeLab/Sarek` is specified as the workflow name.

Sarek uses Singularity containers to package all the different tools.
All containers are already stored on UPPMAX.

## Test Sarek with small dataset and small reference

For more information, follow the [reference files documentation](REFERENCES.md). The following tutorial explains how to run Sarek on a small dataset using a small reference.
For more information, follow the [reference files documentation](REFERENCES.md).
The following tutorial explains how to run Sarek on a small dataset using a small reference.

```bash
# Connect to rackham
@@ -45,10 +46,10 @@ For more information, follow the [reference files documentation](REFERENCES.md).
> cd test_Sarek

# Build the smallGRCh37 reference
> nextflow run SciLifeLab/Sarek/buildReferences.nf --download --genome smallGRCh37 --project [PROJECT] -profile download
> nextflow run SciLifeLab/Sarek/buildReferences.nf --download --genome smallGRCh37 --project [PROJECT]

# Test the workflow on a test tiny set
> nextflow run SciLifeLab/Sarek --test --genome smallGRCh37 --noReports --project [PROJECT] -profile download
> nextflow run SciLifeLab/Sarek --test --genome smallGRCh37 --noReports --project [PROJECT]
```

## Update Sarek
@@ -63,12 +64,12 @@ For more information, follow the [reference files documentation](REFERENCES.md).

## Use Sarek with slurm

To use Sarek on rackham you will need to use the `slurmDownload` profile.
To use Sarek on rackham you will need to use the `slurm` profile.

```bash
# Connect to rackham
> ssh -AX [USER]@rackham.uppmax.uu.se

# Run the workflow directly on the login node
> nextflow run SciLifeLab/Sarek/main.nf --project [PROJECT] -profile slurmDownload
> nextflow run SciLifeLab/Sarek/main.nf --project [PROJECT] -profile slurm
```
29 changes: 9 additions & 20 deletions nextflow.config
@@ -27,9 +27,9 @@ profiles {
// Docker images will be pulled automatically
awsbatch {
includeConfig 'conf/base.config'
includeConfig 'conf/igenomes.config'
includeConfig 'conf/aws-batch.config'
includeConfig 'conf/docker.config'
includeConfig 'conf/igenomes.config'
includeConfig 'conf/resources.config'
includeConfig 'conf/containers.config'
}
@@ -49,8 +49,8 @@ profiles {
// Singularity images need to be set up
btb {
includeConfig 'conf/base.config'
includeConfig 'conf/igenomes.config'
includeConfig 'conf/munin.config'
includeConfig 'conf/igenomes.config'
includeConfig 'conf/singularity-path.config'
}
// Default config for CFC cluster in Tuebingen/Germany
@@ -66,59 +66,48 @@ profiles {
// Docker images will be pulled automatically
docker {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/travis.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/docker.config'
includeConfig 'conf/containers.config'
}
// Small testing with Singularity profile
// Singularity images will be pulled automatically
singularity {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/travis.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/singularity.config'
includeConfig 'conf/containers.config'
}
// Small testing with Singularity profile
// Singularity images need to be set up
singularityPath {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/travis.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/singularity-path.config'
}
// slurm profile for UPPMAX secure clusters
// Runs the pipeline using the job scheduler
// Singularity images need to be set up
// Singularity images are already set up
slurm {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/uppmax-slurm.config'
includeConfig 'conf/singularity-path.config'
}
// slurm profile for UPPMAX clusters
// Runs the pipeline using the job scheduler
// Singularity images will be pulled automatically
slurmDownload {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/uppmax-slurm.config'
includeConfig 'conf/singularity.config'
includeConfig 'conf/containers.config'
includeConfig 'conf/singularity-path.config'
}
// Default profile for UPPMAX secure clusters
// Runs the pipeline locally on a single 16-core node
// Singularity images need to be set up
// Singularity images are already set up
standard {
includeConfig 'conf/base.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/uppmax-localhost.config'
includeConfig 'conf/genomes.config'
includeConfig 'conf/singularity-path.config'
}
}
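The reordering of `includeConfig` lines above is the substance of this commit: in Nextflow, when several included files set the same parameter, the file included last wins, so moving the site- or test-specific config (`munin.config`, `travis.config`) after the genome configs lets its settings take priority. A sketch of the mechanism, using hypothetical file names rather than the real ones in `conf/`:

```
// conf/defaults.config (hypothetical) -- generic fallback value
params.genome_base = 'References/smallGRCh37'

// conf/site.config (hypothetical) -- site-specific value
params.genome_base = '/btb/references/Homo_sapiens/GATK/GRCh37/'

// nextflow.config
profiles {
  site {
    includeConfig 'conf/defaults.config'
    includeConfig 'conf/site.config' // included last, so its genome_base wins
  }
}
```

Reversing the two `includeConfig` lines would silently leave the generic fallback in effect, which is exactly the class of bug this reordering fixes.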


// Function to ensure that resource requirements don't go beyond
// a maximum limit
def check_max(obj, type) {
