EMBO workshop 2020
Dynamo is a [flexible toolbox] to help you solve problems in subtomogram averaging.
The goal of this workshop is to teach the principles of subtomogram averaging and to show some of the ways Dynamo can help if you want to make subtomogram averaging part of your research.
If at any time after the course you need help with Dynamo or with subtomogram averaging in general, don't hesitate to ask on the [forum in Google Groups].
Instructors
Daniel Castaño-Diez - University of Basel
Alister Burt - Institut de Biologie Structurale
Materials
Day 1
Basic principles
Guided presentation:
- Basic Dynamo jargon
- Tutorial on basic elements: help, data and metadata formats.
- Tutorial on the basic concept in Dynamo alignment: the project.
Working on your own:
- Basic walkthrough: creating a catalogue, picking particles, launching a project.
- Complete the advanced starters guide (~2 hours)
The data can be found in
/g/cryocourse/data/dynamo/crop.rec
The Chimera path you need in the tutorial is
/g/easybuild/x86_64/CentOS/7/haswell/software/Chimera/1.13-foss-2017b-Python-2.7.14/bin/chimera
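If you want to work on a local copy of the tutorial data, a minimal setup sketch is shown below; the working directory name is just an example, so adapt it to wherever you normally work.
# example working directory; any writable location will do
mkdir -p ~/dynamo_day1
cp /g/cryocourse/data/dynamo/crop.rec ~/dynamo_day1/
# check that the Chimera binary used in the tutorial is available
ls /g/easybuild/x86_64/CentOS/7/haswell/software/Chimera/1.13-foss-2017b-Python-2.7.14/bin/chimera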
- Further work:
Day 2
Geometric Modelling
Short guided presentation:
- Tutorial on membrane modelling with dtmslice
- Filament models with dtmslice
- Reusing model workflows (walkthrough)
- Further work: catalogue
- In the afternoon, we will focus on the extraction of particles from densely packed spherical geometry (~1 hour)
The data is available at
/g/cryocourse/data/dynamo/v17.rec
PCA-Based Classification
Walkthrough on PCA through the command line
Day 3
Fiducial-based alignment and reconstruction
These new features in Dynamo are at the testing stage.
Interfacing with Other Tools
Dynamo is designed to allow maximum user flexibility and to encourage users to design their own solutions to the specific problems posed by their data.
Alister recently spent some time integrating Dynamo with Warp and M, so that Dynamo can be used for tilt-series alignment and particle picking and the results plugged into M's multi-particle refinement: see Integration with Warp and M.
Creation of 3D scenes
Working on your own:
- Walkthrough on the FHV data set (~1 hour)
Further support material:
- Walkthrough on depiction and manipulation of triangulations (synthetic data).
Template matching
- Walkthrough for automated identification of proteasomes in a real tomogram through template matching (~1 hour)
The data can be found in
/g/cryocourse/data/dynamo/t20s.mrc
Submitting jobs for GPU computing
The following applies to subtomogram averaging projects set up to run in the "gpu_standalone" mode of the computing environment.
Once your project is unfolded, you should have an executable file
wizardTestProject.exe
The EMBL-HD cluster uses SLURM for job submission and resource allocation.
To run this project as a job on the cluster, we first need to set up a submission script. Here is an example:
#!/bin/bash
#SBATCH -N 1                               # number of nodes
#SBATCH -n 1                               # number of cores
#SBATCH -o slurm.%N.%j.out                 # STDOUT
#SBATCH -e slurm.%N.%j.err                 # STDERR
#SBATCH --mail-type=END,FAIL               # notifications for job done & fail
#SBATCH --mail-user=alisterburt@gmail.com  # send-to address
#SBATCH -p gpu                             # select gpu usage
#SBATCH --gres=gpu:1                       # number of gpus (if using gpus)

module load dynamo
./wizardTestProject.exe
Save this script (here as dynamo_test_gpu.sl) in the same directory as the executable file, then submit the job using sbatch:
sbatch dynamo_test_gpu.sl
sbatch should then confirm that the job has been submitted:
Submitted batch job 65146729
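Once the job is submitted, you can follow its progress with standard SLURM commands; the job ID below is the one from the example output above.
squeue -u $USER                                   # list your queued and running jobs
sacct -j 65146729 --format=JobID,State,Elapsed    # state and elapsed time of the job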
Submitting jobs for CPU computing
The following applies to subtomogram averaging projects set up to run in the "standalone" mode of the computing environment.
Once your project is unfolded, you should have an executable file
wizardTestProject.exe
The EMBL-HD cluster uses SLURM for job submission and resource allocation.
To run this project as a job on the cluster, we first need to set up a submission script. Here is an example using 6 cores on 1 node:
#!/bin/bash
#SBATCH -N 1                               # number of nodes
#SBATCH -n 6                               # number of cores
#SBATCH -o slurm.%N.%j.out                 # STDOUT
#SBATCH -e slurm.%N.%j.err                 # STDERR
#SBATCH --mail-type=END,FAIL               # notifications for job done & fail
#SBATCH --mail-user=alisterburt@gmail.com  # send-to address

module load dynamo
./wizardTestProject.exe
Save this script (here as dynamo_test_cpu.sl) in the same directory as the executable file, then submit the job using sbatch:
sbatch dynamo_test_cpu.sl
sbatch should then confirm that the job has been submitted:
Submitted batch job 65146755
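The job's STDOUT and STDERR go to the slurm.%N.%j.out and slurm.%N.%j.err files declared in the script, so you can follow the Dynamo output of a running job or cancel a mistaken submission as follows (job ID taken from the example output above).
tail -f slurm.*.65146755.out    # follow the Dynamo output as the job runs
scancel 65146755                # cancel the job if needed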