Spark Learning (Part 2): Building a Spark High-Availability Cluster
1. Download the Spark installation package
Official download page: http://spark.apache.org/downloads.html
2. Spark installation
2.1. Upload and extract the archive
[potter@potter2 ~]$ tar -zxvf spark-2.3.0-bin-hadoop2.7.tgz -C apps/
2.2. Edit the configuration files
(1) Enter the configuration directory
/home/potter/apps/spark-2.3.0-bin-hadoop2.7/conf
[potter@potter2 conf]$ ll
total 36
-rw-r--r-- 1 potter potter  996 Feb 23 03:42 docker.properties.template
-rw-r--r-- 1 potter potter 1105 Feb 23 03:42 fairscheduler.xml.template
-rw-r--r-- 1 potter potter 2025 Feb 23 03:42 log4j.properties.template
-rw-r--r-- 1 potter potter 7801 Feb 23 03:42 metrics.properties.template
-rw-r--r-- 1 potter potter  865 Feb 23 03:42 slaves.template
-rw-r--r-- 1 potter potter 1292 Feb 23 03:42 spark-defaults.conf.template
-rwxr-xr-x 1 potter potter 4221 Feb 23 03:42 spark-env.sh.template
(2) Edit spark-env.sh
Copy spark-env.sh.template to spark-env.sh, then append the following settings at the end of the file:
[potter@potter2 conf]$ cp spark-env.sh.template spark-env.sh
[potter@potter2 conf]$ vi spark-env.sh

export JAVA_HOME=/usr/local/java/jdk1.8.0_73
#export SCALA_HOME=/usr/share/scala
export HADOOP_HOME=/home/potter/apps/hadoop-2.7.5
export HADOOP_CONF_DIR=/home/potter/apps/hadoop-2.7.5/etc/hadoop
export SPARK_WORKER_MEMORY=500m
export SPARK_WORKER_CORES=1
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=potter2:2181,potter3:2181,potter4:2181,potter5:2181 -Dspark.deploy.zookeeper.dir=/spark"
Note: any existing line such as export SPARK_MASTER_IP=hadoop1 must be commented out. The memory values used here may differ from other guides; they are deliberately small to suit a personal machine, because if the memory setting is too large the machine runs very slowly.
Explanation of the options:
-Dspark.deploy.recoveryMode=ZOOKEEPER: the whole cluster's state is maintained, and recovered, through ZooKeeper; this is Spark's HA setup. If the active Master dies, a standby Master must first read the full cluster state from ZooKeeper before it can become active, restoring the state of every Worker, every Driver, and every Application.
-Dspark.deploy.zookeeper.url=potter2:2181,potter3:2181,potter4:2181,potter5:2181: list every ZooKeeper node, covering every machine that may become the active Master (four machines are used here, so all four are listed).
-Dspark.deploy.zookeeper.dir=/spark: the znode under which Spark keeps its metadata, including job state. This is not the same thing as dataDir in ZooKeeper's zoo.cfg: dataDir is where ZooKeeper stores its own data on local disk, while spark.deploy.zookeeper.dir is a path inside ZooKeeper. ZooKeeper thus holds the complete Spark cluster state: all Workers, all Applications, and all Drivers.
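As a quick sanity check, the ZooKeeper URL in SPARK_DAEMON_JAVA_OPTS can be assembled from the node list; this is only an illustrative sketch using the potter hostnames from this guide:

```shell
# Assemble the ZooKeeper quorum URL used in SPARK_DAEMON_JAVA_OPTS
# from the nodes that run ZooKeeper and may host a Master.
ZK_HOSTS="potter2 potter3 potter4 potter5"
ZK_URL=""
for h in $ZK_HOSTS; do
  ZK_URL="${ZK_URL:+$ZK_URL,}$h:2181"   # comma-join host:2181 pairs
done
echo "-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=$ZK_URL -Dspark.deploy.zookeeper.dir=/spark"
```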
(3) Copy slaves.template to slaves
[potter@potter2 conf]$ cp slaves.template slaves
[potter@potter2 conf]$ vi slaves
Add the following content (one hostname per line):
potter2
potter3
potter4
potter5
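The slaves file can also be generated from the shell; a small sketch (writing to a temp file here so it runs anywhere, while the real target is $SPARK_HOME/conf/slaves):

```shell
# Generate a slaves file with one Worker hostname per line.
SLAVES_FILE=$(mktemp)
printf '%s\n' potter2 potter3 potter4 potter5 > "$SLAVES_FILE"
cat "$SLAVES_FILE"
```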
(4) Distribute the installation to the other nodes
[potter@potter2 apps]$ scp -r spark-2.3.0-bin-hadoop2.7/ potter3:
[potter@potter2 apps]$ scp -r spark-2.3.0-bin-hadoop2.7/ potter4:
[potter@potter2 apps]$ scp -r spark-2.3.0-bin-hadoop2.7/ potter5:
2.3. Configure environment variables
[potter@potter2 ~]$ vi .bashrc
export SPARK_HOME=/home/potter/apps/spark-2.3.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
Save the file and make it take effect immediately:
[potter@potter2 ~]$ source .bashrc
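Note that PATH should be extended, not overwritten, so existing commands stay reachable. A minimal check:

```shell
# Append Spark's bin directory to PATH (rather than replacing PATH).
export SPARK_HOME=/home/potter/apps/spark-2.3.0-bin-hadoop2.7
export PATH="$PATH:$SPARK_HOME/bin"
# Confirm the directory now appears on PATH.
case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) PATH_OK=yes ;;
  *) PATH_OK=no ;;
esac
echo "spark bin on PATH: $PATH_OK"
```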
2.4. Configure spark-defaults.conf
Copy the template to create spark-defaults.conf:
[potter@potter2 conf]$ cp spark-defaults.conf.template spark-defaults.conf
[potter@potter2 conf]$ vi spark-defaults.conf

# This is useful for setting default environmental settings.
# Example:
spark.master                     spark://potter2:7077,potter3:7077,potter4:7077,potter5:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

Distribute the file to the other nodes:
[potter@potter2 conf]$ scp -r spark-defaults.conf potter3:
[potter@potter2 conf]$ scp -r spark-defaults.conf potter4:
[potter@potter2 conf]$ scp -r spark-defaults.conf potter5:
3. Start the cluster
3.1. Start the ZooKeeper cluster (on every ZooKeeper node)
[potter@potter2 ~]$ zkServer.sh start
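The per-node copies can be wrapped in one loop; this sketch only echoes the commands (a dry run), and the destination conf path is an assumption that mirrors the layout on potter2:

```shell
# Dry run: print one scp command per target node; remove "echo" to copy.
CMDS=$(for h in potter3 potter4 potter5; do
  echo "scp spark-defaults.conf $h:~/apps/spark-2.3.0-bin-hadoop2.7/conf/"
done)
printf '%s\n' "$CMDS"
```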
ZooKeeper JMX enabled by default
Using config: /home/potter/apps/zookeeper-3.4.10/bin/../conf/zoo.cfg
Starting zookeeper ... already running as process 3703.
[potter@potter2 ~]$ zkServer.sh status
ZooKeeper JMX enabled by default
Using config: /home/potter/apps/zookeeper-3.4.10/bin/../conf/zoo.cfg
Mode: follower
3.2. Start the HDFS cluster
Run on any one node:
[potter@potter2 ~]$ start-dfs.sh
3.3. Start the Spark cluster
[potter@potter2 ~]$ cd apps/spark-2.3.0-bin-hadoop2.7/sbin/
[potter@potter2 sbin]$ ./start-all.sh
3.4. Check the processes
[potter@potter2 sbin]$ jps
6464 Master
6528 Worker
6561 Jps
3909 NameNode
3703 QuorumPeerMain
5047 NodeManager
4412 DFSZKFailoverController
4204 JournalNode
4014 DataNode
[potter@potter3 conf]$ jps
4609 Jps
3441 DataNode
3284 QuorumPeerMain
4581 Worker
3879 NodeManager
3576 JournalNode
3372 NameNode
3676 DFSZKFailoverController
[potter@potter4 conf]$ jps
3456 JournalNode
3607 NodeManager
4123 Jps
3356 DataNode
3260 QuorumPeerMain
4095 Worker
[potter@potter5 conf]$ jps
3216 QuorumPeerMain
3447 NodeManager
3304 DataNode
3945 Jps
3916 Worker
3.5. Problem
Checking the processes shows that only potter2 successfully started a Master process; the other three nodes did not, so a Master must be started manually on each of them. On all three nodes, enter /home/potter/apps/spark-2.3.0-bin-hadoop2.7/sbin and run the following command:
[potter@potter3 ~]$ cd apps/spark-2.3.0-bin-hadoop2.7/sbin/
[potter@potter3 sbin]$ ./start-master.sh
[potter@potter4 ~]$ cd apps/spark-2.3.0-bin-hadoop2.7/sbin/
[potter@potter4 sbin]$ ./start-master.sh
[potter@potter5 ~]$ cd apps/spark-2.3.0-bin-hadoop2.7/sbin/
[potter@potter5 sbin]$ ./start-master.sh
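Rather than logging in to each node, the standby Masters could be started in one loop over ssh; a dry-run sketch (echoes only, and assumes passwordless ssh is already set up as for the Hadoop cluster):

```shell
# Dry run: print the ssh command that would start a Master on each node.
SPARK_HOME=/home/potter/apps/spark-2.3.0-bin-hadoop2.7
START_CMDS=$(for h in potter3 potter4 potter5; do
  echo "ssh $h $SPARK_HOME/sbin/start-master.sh"
done)
printf '%s\n' "$START_CMDS"
```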
3.6. Check the processes again
Both the Master and Worker processes now start successfully.
[potter@potter3 sbin]$ jps
3441 DataNode
3284 QuorumPeerMain
4581 Worker
3576 JournalNode
3372 NameNode
3676 DFSZKFailoverController
5020 Jps
4894 Master
[potter@potter4 sbin]$ jps
3456 JournalNode
4179 Master
3356 DataNode
3260 QuorumPeerMain
4238 Jps
4095 Worker
[potter@potter5 sbin]$ jps
3216 QuorumPeerMain
3304 DataNode
4057 Jps
3916 Worker
3998 Master
4. Verification
4.1. Check the Master status in the web UI
potter2 is in ALIVE state; potter3, potter4, and potter5 are all in STANDBY state. (Screenshots of the web UI for potter2, potter3, potter4, and potter5 omitted.)
4.2. Manually kill the Master process on potter2 and watch for automatic failover
[potter@potter2 ~]$ jps
6464 Master
6528 Worker
6901 Jps
3909 NameNode
3703 QuorumPeerMain
4412 DFSZKFailoverController
4204 JournalNode
4014 DataNode
[potter@potter2 ~]$ kill -9 6464
[potter@potter2 ~]$
After killing the Master process on potter2, check the web UI again:
potter2: the Master process is gone, so its web UI is no longer reachable.
potter3: its standby Master has successfully taken over and is now in ALIVE state.
potter4 and potter5 remain in STANDBY state. (Screenshots omitted.)
5. Running Spark programs on standalone
(1) Run the first Spark program
[potter@potter2 apps]$ cd
[potter@potter2 ~]$ /home/potter/apps/spark-2.3.0-bin-hadoop2.7/bin/spark-submit \
> --class org.apache.spark.examples.SparkPi \
> --master spark://potter2:7077 \
> --executor-memory 500m \
> --total-executor-cores 1 \
> /home/potter/apps/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar \
> 100
The spark://potter2:7077 here is the Master address shown in the web UI.
Run result: (screenshot omitted.)
(2) Start spark-shell
/home/potter/apps/spark-2.3.0-bin-hadoop2.7/bin/spark-shell \
--master spark://potter2:7077 \
--executor-memory 500m \
--total-executor-cores 1
Parameter notes:
--master spark://potter2:7077: the Master address
--executor-memory 500m: memory available to each executor (500 MB)
--total-executor-cores 1: total number of CPU cores used across the whole cluster
Note:
If spark-shell is started without a master address, it still starts and runs programs normally, but it runs in Spark's local mode: a single process on the local machine, with no connection to the cluster.
spark-shell initializes a SparkContext as the object sc by default; user code can use sc directly.
spark-shell also initializes a SparkSession as the object spark by default; user code can use spark directly.
(3) Write a word-count program in spark-shell
1) Create a workcount.txt file and upload it to the / directory on HDFS
[potter@potter2 ~]$ vi workcount.txt
[potter@potter2 ~]$ hadoop fs -put workcount.txt /
The content of workcount.txt is:
you,jump
i,jump
2) Write the Spark program in Scala in spark-shell
scala> sc.textFile("/workcount.txt").flatMap(_.split(",")).map((_,1)).reduceByKey(_+_).collect
Explanation:
sc is the SparkContext object, the entry point for submitting Spark programs.
textFile("/workcount.txt") reads the data from HDFS.
flatMap(_.split(",")) splits each line on commas and flattens the result into words.
map((_,1)) turns each word into a (word, 1) tuple.
reduceByKey(_+_) reduces by key, summing the values.
Appending saveAsTextFile("/spark/out") instead of collect would write the result back to HDFS.
Run result: (screenshot omitted.)
6. Running Spark programs on YARN
(1) Prerequisites
The ZooKeeper, HDFS, and YARN clusters are all up and running.
(2) Start Spark on YARN
[potter@potter2 ~]$ spark-shell --master yarn --deploy-mode client
This fails with errors such as an RPC connection failure and ClosedChannelException. (Error screenshot omitted.)
Cause: the memory given to the container is too small, so YARN kills the process outright, which surfaces as the RPC connection failure and ClosedChannelException errors.
Solution:
Stop the YARN service first, then add the following to yarn-site.xml:
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
  <description>Whether virtual memory limits will be enforced for containers</description>
</property>
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
  <description>Ratio between virtual memory to physical memory when setting memory limits for containers</description>
</property>
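To see why raising the ratio helps: the ceiling YARN enforces is the container's physical memory multiplied by yarn.nodemanager.vmem-pmem-ratio (YARN's stock default is 2.1). A rough calculation, where the 1024 MB container size is an assumption for illustration:

```shell
# Virtual-memory ceiling = physical container size * vmem-pmem ratio.
CONTAINER_MB=1024   # assumed container allocation, for illustration
DEFAULT_LIMIT=$(awk -v m="$CONTAINER_MB" -v r=2.1 'BEGIN{printf "%d", m*r}')
RAISED_LIMIT=$(awk -v m="$CONTAINER_MB" -v r=4 'BEGIN{printf "%d", m*r}')
echo "ratio 2.1 -> ${DEFAULT_LIMIT} MB ceiling; ratio 4 -> ${RAISED_LIMIT} MB ceiling"
```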
Distribute the new yarn-site.xml to the corresponding directory on every Hadoop node, then restart YARN.
Run the following command again to start Spark on YARN:
[potter@potter2 hadoop]$ spark-shell --master yarn --deploy-mode client
(3) Open the YARN web UI
Open the YARN web UI at http://potter4:8088.
The Spark shell application can be seen running. Clicking its ID link shows the application's details, and clicking the "ApplicationMaster" link opens the Spark UI. (Screenshots omitted.)
(4) Run a program
scala> val array = Array(1,2,3,4,5)
array: Array[Int] = Array(1, 2, 3, 4, 5)

scala> val rdd = sc.makeRDD(array)
rdd: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at makeRDD at <console>:26

scala> rdd.count
res0: Long = 5

scala>
Check the YARN web UI again. (Screenshot omitted.)
(5) Run Spark's bundled SparkPi example
spark-submit --class org.apache.spark.examples.SparkPi \
--master yarn \
--deploy-mode cluster \
--driver-memory 500m \
--executor-memory 500m \
--executor-cores 1 \
/home/potter/apps/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar \
10
Execution log:
2018-04-22 18:16:08 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-04-22 18:16:13 INFO Client:54 - Requesting a new application from cluster with 4 NodeManagers
2018-04-22 18:16:14 INFO Client:54 - Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2018-04-22 18:16:14 INFO Client:54 - Will allocate AM container, with 884 MB memory including 384 MB overhead
2018-04-22 18:16:14 INFO Client:54 - Setting up container launch context for our AM
2018-04-22 18:16:14 INFO Client:54 - Setting up the launch environment for our AM container
2018-04-22 18:16:14 INFO Client:54 - Preparing resources for our AM container
2018-04-22 18:16:18 WARN Client:66 - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2018-04-22 18:16:26 INFO Client:54 - Uploading resource file:/tmp/spark-e645ee0d-099c-4b22-8729-cb77babf5e0a/__spark_libs__3299474380368175903.zip -> hdfs://myha01/user/potter/.sparkStaging/application_1524389838076_0006/__spark_libs__3299474380368175903.zip
2018-04-22 18:17:04 INFO Client:54 - Uploading resource file:/home/potter/apps/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar -> hdfs://myha01/user/potter/.sparkStaging/application_1524389838076_0006/spark-examples_2.11-2.3.0.jar
2018-04-22 18:17:05 INFO Client:54 - Uploading resource file:/tmp/spark-e645ee0d-099c-4b22-8729-cb77babf5e0a/__spark_conf__7169638864757569614.zip -> hdfs://myha01/user/potter/.sparkStaging/application_1524389838076_0006/spark_conf.zip
2018-04-22 18:17:05 INFO SecurityManager:54 - Changing view acls to: potter
2018-04-22 18:17:05 INFO SecurityManager:54 - Changing modify acls to: potter
2018-04-22 18:17:05 INFO SecurityManager:54 - Changing view acls groups to:
2018-04-22 18:17:05 INFO SecurityManager:54 - Changing modify acls groups to:
2018-04-22 18:17:05 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(potter); groups with view permissions: Set(); users with modify permissions: Set(potter); groups with modify permissions: Set()
2018-04-22 18:17:06 INFO Client:54 - Submitting application application_1524389838076_0006 to ResourceManager
2018-04-22 18:17:06 INFO YarnClientImpl:273 - Submitted application application_1524389838076_0006
2018-04-22 18:17:07 INFO Client:54 - Application report for application_1524389838076_0006 (state: ACCEPTED)
2018-04-22 18:17:07 INFO Client:54 -
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1524392325362
final status: UNDEFINED
tracking URL: http://potter4:8088/proxy/application_1524389838076_0006/
user: potter
... (the ACCEPTED report line repeats about once per second until 2018-04-22 18:18:07) ...
2018-04-22 18:18:08 INFO Client:54 - Application report for application_1524389838076_0006 (state: RUNNING)
2018-04-22 18:18:08 INFO Client:54 -
client token: N/A
diagnostics: N/A
ApplicationMaster host: 192.168.123.102
ApplicationMaster RPC port: 0
queue: default
start time: 1524392325362
final status: UNDEFINED
tracking URL: http://potter4:8088/proxy/application_1524389838076_0006/
user: potter
... (the RUNNING report line repeats about once per second until 2018-04-22 18:19:05) ...
2018-04-22 18:19:06 INFO Client:54 - Application report for application_1524389838076_0006 (state: FINISHED)
2018-04-22 18:19:06 INFO Client:54 -
client token: N/A
diagnostics: N/A
ApplicationMaster host: 192.168.123.102
ApplicationMaster RPC port: 0
queue: default
start time: 1524392325362
final status: SUCCEEDED
tracking URL: http://potter4:8088/proxy/application_1524389838076_0006/
user: potter
2018-04-22 18:19:08 INFO ShutdownHookManager:54 - Shutdown hook called
2018-04-22 18:19:09 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-6f009c18-9d50-460b-b480-77b0ca856369
2018-04-22 18:19:09 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-e645ee0d-099c-4b22-8729-cb77babf5e0a
[potter@potter2 ~]$
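The AM sizing line in the log above ("884 MB memory including 384 MB overhead") matches Spark's usual default for YARN memory overhead, max(10% of the requested memory, 384 MB), applied to the 500 MB requested with --driver-memory. A quick check of the arithmetic:

```shell
# Reconstruct the AM container size: overhead = max(10% of request, 384 MB).
REQUEST_MB=500
TEN_PCT=$((REQUEST_MB / 10))
if [ "$TEN_PCT" -gt 384 ]; then OVERHEAD=$TEN_PCT; else OVERHEAD=384; fi
TOTAL=$((REQUEST_MB + OVERHEAD))
echo "AM container: ${TOTAL} MB including ${OVERHEAD} MB overhead"
```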