The following Python modules must be installed for blacktie to function properly:

- Mako
- PyYAML
- pprocess

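Before installing, you can quickly check which of these requirements are already present. This is a minimal sketch; the import names below (`mako`, `yaml` for PyYAML, `pprocess`) are assumptions based on the package names in the pip command later in this guide.

```python
import importlib.util

# Import names for the required packages; "yaml" is the import name for PyYAML.
# These names are assumed from the pip install command shown later in this guide.
required = ["mako", "yaml", "pprocess"]

# find_spec returns None when a module cannot be located on this interpreter
missing = [name for name in required if importlib.util.find_spec(name) is None]

if missing:
    print("Missing required modules: " + ", ".join(missing))
else:
    print("All required modules found.")
```

Any module reported as missing will be picked up automatically by the pip-based install methods below.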
The following modules will provide useful but optional functionality:


Installing the latest version from the git repository


Git is a very useful tool to have installed and to know how to use; see the official Git documentation to learn more.

Clone the repo:

$ git clone git://

Install blacktie, along with any unmet requirements, using pip:

$ [sudo] pip install -r blacktie/requirements.txt blacktie

Install using the standard setup script:

$ cd blacktie
$ [sudo] python setup.py install

Use pip to obtain the package from PyPI

$ [sudo] pip install blacktie Mako PyYAML pprocess

Installing without using git or pip for the download

After installing the requirements:

$ wget
$ unzip
$ cd blacktie-master
$ [sudo] python setup.py install

Test to see whether the install worked

To test whether your installation was successful, open a new terminal session and type the following command.

$ blacktie
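The same check can be done from Python: verify that the package imports and that the command-line script is on your PATH. This is a minimal sketch; it assumes the installed package and the console script both use the name blacktie, as shown throughout this guide.

```python
import importlib.util
import shutil

# Both names are taken from this guide: the Python package and the
# installed command-line script are assumed to share the name "blacktie".
package_ok = importlib.util.find_spec("blacktie") is not None
script_ok = shutil.which("blacktie") is not None

print("package importable:", package_ok)
print("script on PATH:", script_ok)
```

If the package imports but the script is not found, your Python scripts directory is probably not on your shell's PATH.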

You should see the help text for blacktie and it should look something like this:

usage: blacktie [-h] [--version]
                [--prog {tophat,cufflinks,cuffmerge,cuffdiff,cummerbund,all}]
                [--hide-logs] [--no-email]
                [--mode {analyze,dry_run,qsub_script}]

This script reads options from a yaml formatted file and organizes the
execution of tophat/cufflinks runs for multiple condition sets.

positional arguments:
  config_file           Path to a yaml formatted config file containing setup
                        options for the runs.

optional arguments:
  -h, --help            show this help message and exit
  --version             Print version number.
  --prog {tophat,cufflinks,cuffmerge,cuffdiff,cummerbund,all}
                        Which program do you want to run? (default: tophat)
  --hide-logs           Make your log directories hidden to keep a tidy
                        'looking' base directory. (default: False)
  --no-email            Don't send email notifications. (default: False)
  --mode {analyze,dry_run,qsub_script}
                        1) 'analyze': run the analysis pipeline. 2) 'dry_run':
                        walk through all steps that would be run and print out
                        the command lines; however, do not send the commands
                        to the system to be run. 3) 'qsub_script': generate
                        bash scripts suitable to be sent to a compute
                        cluster's SGE through the qsub command. (default:

If this worked, great!