Spark

Description

Spark website: http://spark.apache.org/

Apache Spark is a fast and general-purpose cluster computing system. It provides high-level APIs in Java, Scala, Python and R, and an optimized engine that supports general execution graphs. It also supports a rich set of higher-level tools including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming.
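
As a quick illustration of the high-level API described above, the sketch below builds a small DataFrame and queries it with Spark SQL from PySpark. It is a minimal example rather than a UFRC-specific recipe: it assumes the pyspark package is importable (for example after loading the spark environment module) and uses made-up sample data and column names.

  # Minimal PySpark sketch: build a DataFrame and query it with Spark SQL.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("spark-example").getOrCreate()

  # Hypothetical sample data: (name, value) pairs.
  df = spark.createDataFrame([("a", 1), ("b", 2), ("c", 3)], ["name", "value"])

  # Expose the DataFrame to SQL and run a query against it.
  df.createOrReplaceTempView("records")
  spark.sql("SELECT name, value FROM records WHERE value > 1").show()

  spark.stop()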

Environment Modules

Run module spider spark to find out what environment modules are available for this application.

System Variables

  • HPC_SPARK_DIR - installation directory
  • HPC_SPARK_BIN - executable directory
  • HPC_SPARK_SLURM - SLURM job script examples
  • SPARK_HOME - examples directory
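
Once the spark module is loaded, these variables can be read from a job script or an interactive session to locate the installation and the example job scripts. A short Python sketch, assuming the module has been loaded so that the variables are set:

  import os

  # Print the UFRC-provided Spark variables; they are only defined after
  # the spark environment module has been loaded.
  for var in ("HPC_SPARK_DIR", "HPC_SPARK_BIN", "HPC_SPARK_SLURM", "SPARK_HOME"):
      print(var, "=", os.environ.get(var, "<not set - load the spark module first>"))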