MZMine

From UFRC
[[Category:Software]][[Category:Spectroscopy]][[Category:Biology]]
 
{|<!--CONFIGURATION: REQUIRED-->
|{{#vardefine:app|mzmine}}
 
|{{#vardefine:conf|}}          <!--CONFIGURATION-->
|{{#vardefine:exe|1}}            <!--ADDITIONAL INFO-->
|{{#vardefine:job|}}            <!--JOB SCRIPTS-->
 
|{{#vardefine:policy|}}        <!--POLICY-->
|{{#vardefine:testing|}}      <!--PROFILING-->
 
MZmine 2 is an open-source framework for processing, visualizing, and analyzing mass spectrometry-based molecular profile data. It is based on the original MZmine toolbox described in a 2006 Bioinformatics publication.
 
<!--Modules-->
==Environment Modules==
Run <code>module spider {{#var:app}}</code> to find out what environment modules are available for this application.
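For example, a typical interactive session first discovers and then loads the module (the versions listed on your system may differ):

```shell
# List the available versions of the mzmine environment module
module spider mzmine

# Load the default version into the current shell environment
module load mzmine
```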
 
<!--
 
===Parallel (OpenMP)===
 
* intel
 
* {{#var:app}}
 
===Parallel (MPI)===
 
* intel
 
* openmpi
 
* {{#var:app}}
 
-->
 
 
==System Variables==
* HPC_{{uc:{{#var:app}}}}_DIR - installation directory
 
<!--Configuration-->
{{#if: {{#var: conf}}|==Configuration==
 
mzMine can be run in batch mode according to the mzMine manual by calling startMZmine with a single argument corresponding to a saved batch script generated within the GUI.

We provide an alternate startMZmine script that correctly sets the Java heap memory based on either the HPC_MZMINE_MEM environment variable or, if that variable is absent, on the total amount of memory requested by the job. Please see the sample script below. Note that it appears to be necessary to simulate a virtual X11 environment for mzMine to run in batch mode.
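As an illustration only, the heap-size selection described above can be sketched in shell. The function name and the 4g fallback are hypothetical, and this assumes SLURM exposes the job's requested memory in megabytes via SLURM_MEM_PER_NODE; the actual wrapper script may differ:

```shell
# Hypothetical sketch of the heap-size selection described above;
# not the actual contents of the alternate startMZmine wrapper.
mzmine_heap_size() {
  if [ -n "${HPC_MZMINE_MEM:-}" ]; then
    # An explicit HPC_MZMINE_MEM setting wins
    echo "$HPC_MZMINE_MEM"
  elif [ -n "${SLURM_MEM_PER_NODE:-}" ]; then
    # Derive whole gigabytes from the job's requested memory (in MB)
    echo "$(( SLURM_MEM_PER_NODE / 1024 ))g"
  else
    # Fallback when run outside a SLURM job (illustrative value)
    echo "4g"
  fi
}
```

Under this sketch, a job submitted with <code>--mem=8gb</code> would get an 8g heap unless HPC_MZMINE_MEM overrides it.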
  
Note that HiPerGator2 nodes are diskless, so the '<code>/tmp</code>' directory that mzMine uses by default for its temporary files cannot be used. See [[Temporary Directories]] for details on how to set the $TMPDIR variable to point to a directory in your /blue space.
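Putting these pieces together, a minimal batch job script might look like the following sketch. The group name, directories, and batch-queue file name are placeholders, and xvfb-run is one common way to provide the virtual X11 display mentioned above:

```shell
#!/bin/bash
#SBATCH --job-name=mzmine_batch
#SBATCH --mem=8gb
#SBATCH --time=24:00:00
#SBATCH --output=mzmine_%j.log

module load mzmine

# Keep mzMine's temporary files out of the diskless node's /tmp;
# 'mygroup' is a placeholder for your /blue allocation.
export TMPDIR=/blue/mygroup/$USER/tmp
mkdir -p "$TMPDIR"

# Optional: set the Java heap explicitly instead of having the
# wrapper derive it from the job's requested memory.
export HPC_MZMINE_MEM=6g

# Run under a virtual X11 display; the argument is a batch-queue
# file saved from the mzMine GUI (placeholder name).
xvfb-run -a startMZmine my_batch_queue.xml
```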
  
If you need to run the mzMine GUI, you should use [[GUI Programs|OOD and the HiPerGator Desktop application]].
 
|}}
<!--Job scripts-->
{{#if: {{#var: job}}|==Job Script Examples==
See the [[{{PAGENAME}}_Job_Script]] page for {{#var: app}} job script examples.
 
|}}
 
<!--Policy-->

Latest revision as of 13:28, 7 July 2023
