Multiple ways to run Spark (including Spark on YARN client && cluster)

1. Local mode

        Launch commands: ./spark-shell --master local[2] --jars /...

                      ./spark-submit --master local[2] --jars /...

        Alternatively, edit spark-defaults.conf and hard-code spark.master local[2]; then --master local[2] can be omitted at launch.

               Similarly, setting spark.executor.extraClassPath=/home/hadoop/jarsURL

               and spark.driver.extraClassPath=/home/hadoop/jarsURL in spark-defaults.conf lets you omit --jars /..

        In practice, we usually maintain a separate spark-defaults.conf for each business line.
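The settings above can live together in one file. A minimal sketch of such a spark-defaults.conf, reusing the example paths from above:

```properties
# spark-defaults.conf -- sketch; values are the examples used above
spark.master                     local[2]
spark.executor.extraClassPath    /home/hadoop/jarsURL
spark.driver.extraClassPath      /home/hadoop/jarsURL
```

With this in place, spark-shell and spark-submit can be launched without --master or --jars.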


        A specific properties file can also be selected at submit time: ./spark-submit --properties-file /spark-defaults.conf

        Example: submitting a WordCount job with --master yarn (client deploy mode by default):

        bin/spark-submit --class com.ruzoe.spark.core.WordCount --master yarn /home/hadoop/app/spark-2.2.0-bin-2.6.0-cdh5.7.0/sparktrain4.0.jar /home/hadoop/data/datafile.txt



        The same kind of job in yarn cluster mode (the driver runs inside a YARN container):

        bin/spark-submit --class com.ruzoe.spark.core.WordCountZXC --master yarn --deploy-mode cluster /home/hadoop/app/spark-2.2.0-bin-2.6.0-cdh5.7.0/sparktrain111.jar /datafile.txt
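The two submit commands above differ only in --deploy-mode. A minimal sketch of a helper that composes such a command line; the build_submit function is illustrative, not part of Spark, and nothing is actually submitted here:

```shell
#!/bin/sh
# Hypothetical helper: assemble a spark-submit command line for
# YARN client or cluster mode. Only string assembly happens here.
build_submit() {
  mode="$1"        # "client" or "cluster"
  main_class="$2"  # fully qualified main class
  app_jar="$3"     # path to the application jar
  input="$4"       # application argument (input file)
  echo "bin/spark-submit --class $main_class --master yarn --deploy-mode $mode $app_jar $input"
}

# Client mode keeps the driver on the submitting machine;
# cluster mode runs the driver inside a YARN container.
build_submit client  com.ruzoe.spark.core.WordCount    sparktrain4.0.jar /home/hadoop/data/datafile.txt
build_submit cluster com.ruzoe.spark.core.WordCountZXC sparktrain111.jar /datafile.txt
```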


2. Standalone

        Launch command: ./spark-shell --master spark://<master-host>:7077 --jars /... (standalone mode takes the cluster's master URL rather than local[N])

3. Spark on YARN

        Launch command: ./spark-shell --master yarn --jars /... (spark-shell only supports client deploy mode; for cluster mode, use spark-submit --master yarn --deploy-mode cluster as shown earlier. The old yarn-client/yarn-cluster master values are deprecated in Spark 2.x.)
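The master flags from sections 1-3 can be summarized in one place. A sketch; the master_flags helper and the master-host name are illustrative, not from Spark:

```shell
#!/bin/sh
# Hypothetical helper: map a run-mode name to the spark-submit
# master/deploy-mode flags described in sections 1-3.
master_flags() {
  case "$1" in
    local)        echo "--master local[2]" ;;
    standalone)   echo "--master spark://master-host:7077" ;;  # illustrative host
    yarn-client)  echo "--master yarn --deploy-mode client" ;;
    yarn-cluster) echo "--master yarn --deploy-mode cluster" ;;
    *)            echo "unknown mode: $1" >&2; return 1 ;;
  esac
}

master_flags yarn-cluster   # prints: --master yarn --deploy-mode cluster
```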

4. Mesos

5. K8s