Spark Installation and Deployment

Spark installation:

 

1. Download Scala: get scala-2.11.8.tgz from http://www.scala-lang.org/download/2.11.8.html

   Download Spark: get spark-2.0.0-bin-hadoop2.7.tgz from http://spark.apache.org/downloads.html
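Both archives can also be fetched from the command line; the mirror URLs below are assumptions based on the standard Scala and Apache archive layouts:

[netlab@master ~]$ wget http://downloads.lightbend.com/scala/2.11.8/scala-2.11.8.tgz
[netlab@master ~]$ wget https://archive.apache.org/dist/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz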

2. Install Scala

   Extract the archive:

# tar -zxvf scala-2.11.8.tgz -C /home/netlab

   Configure the environment variables:

[netlab@master ~]$ vi .bash_profile

export SCALA_HOME=/home/netlab/scala-2.11.8

export PATH=$PATH:$SCALA_HOME/bin

[netlab@master ~]$ source .bash_profile

   Test the installation: run scala; if the REPL starts as shown below, the installation succeeded.
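A successful launch prints a banner roughly like the following (the JVM details vary by environment) and leaves you at the scala> prompt:

[netlab@master ~]$ scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92).
Type in expressions for evaluation. Or try :help.

scala>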

3. Install Spark

   Extract the archive: tar -zxvf spark-2.0.0-bin-hadoop2.7.tgz -C /usr/local/wl/spark

   Configure the environment variables:

[netlab@master ~]$ vi .bash_profile

export SPARK_HOME=/usr/local/wl/spark/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin


[netlab@master ~]$ source .bash_profile

[netlab@master ~]$ cd $SPARK_HOME

[netlab@master spark-2.0.0-bin-hadoop2.7]$ cd conf

[netlab@master conf]$ ls

[netlab@master conf]$ cp spark-env.sh.template spark-env.sh

[netlab@master conf]$ vi spark-env.sh

Add the JAVA_HOME, SCALA_HOME, and SPARK_HOME paths, as sketched below.

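As a minimal sketch, assuming the paths used earlier in this guide (the JAVA_HOME value is hypothetical; adjust it to your own JDK), spark-env.sh would contain lines like:

export JAVA_HOME=/usr/local/jdk1.8.0           # hypothetical JDK path; adjust to your installation
export SCALA_HOME=/home/netlab/scala-2.11.8
export SPARK_HOME=/usr/local/wl/spark/spark-2.0.0-bin-hadoop2.7
export SPARK_MASTER_HOST=master                # optional: hostname of the master node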

[netlab@master conf]$ source spark-env.sh

[netlab@master conf]$ cp slaves.template slaves

[netlab@master conf]$ vi slaves

Change the hostname entries to the names of the worker (slave) nodes, as shown below.
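For the two-node cluster assumed in this guide (one master, one worker whose hostname is slave), the slaves file would contain just the worker's hostname:

slave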

4. Copy the configured Spark directory from the master node to the worker node (this assumes /usr/local/wl/spark already exists on the worker, and that Scala and the environment variables from the steps above are set up there as well):

[netlab@master ~]$ scp -r $SPARK_HOME slave:/usr/local/wl/spark/
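Note that start-all.sh logs into every host listed in slaves over SSH, so passwordless SSH from the master to the workers should already be configured. If it is not, a typical setup (assuming the same netlab account on the worker) looks like:

[netlab@master ~]$ ssh-keygen -t rsa           # accept the defaults
[netlab@master ~]$ ssh-copy-id netlab@slave    # install the public key on the worker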

[netlab@master ~]$ cd $SPARK_HOME

[netlab@master spark-2.0.0-bin-hadoop2.7]$ cd sbin/

[netlab@master sbin]$ ls

[netlab@master sbin]$ ./start-all.sh

[netlab@master sbin]$ jps    # check that the daemons have started
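On the master you should see a Master process and on the worker a Worker process; typical jps output looks roughly like this (process IDs will differ):

[netlab@master sbin]$ jps
2345 Master
2678 Jps

[netlab@slave ~]$ jps
1890 Worker
1923 Jps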



Open http://master:8080 in a browser; the Spark master web UI should list the worker node(s) as alive.

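As a final sanity check, a small job can be run against the standalone master (7077 is the default master port):

[netlab@master ~]$ spark-shell --master spark://master:7077

scala> sc.parallelize(1 to 100).sum()
res0: Double = 5050.0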