Adjusting the Spark Streaming log output level to fix the Spark Streaming error: Marking the coordinator * dead for group *

A Spark Streaming job that consumes data from Kafka got stuck at this point and stopped making progress; the log showed:

19/05/07 17:16:58 INFO AbstractCoordinator: Discovered coordinator slave2:9092 (id: 2147483645 rack: null) for group UserClikAnalysis
19/05/07 17:17:00 INFO AbstractCoordinator: Marking the coordinator slave2:9092 (id: 2147483645 rack: null) dead for group UserClikAnalysis

To track down the cause, raise the log output level to DEBUG so the Kafka client reports why it marks the coordinator dead (a sketch is shown below).
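The post does not show the code used to change the level, so the following is only a minimal Scala sketch, assuming a Spark 2.x Streaming driver; the app name and group name UserClikAnalysis come from the log above, while the master URL and batch interval are illustrative placeholders.

```scala
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DebugLogLevel {
  def main(args: Array[String]): Unit = {
    // App name matches the consumer group seen in the log; master/batch interval are illustrative.
    val conf = new SparkConf().setAppName("UserClikAnalysis").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Option 1: raise the whole driver's log level to DEBUG.
    ssc.sparkContext.setLogLevel("DEBUG")

    // Option 2: turn up only the Kafka consumer packages, which keeps
    // Spark's own logging quiet while still showing why the coordinator
    // is being marked dead.
    Logger.getLogger("org.apache.kafka.clients").setLevel(Level.DEBUG)

    // ... define the Kafka direct stream and output operations here,
    // then call ssc.start() and ssc.awaitTermination() as usual ...
  }
}
```

The same effect can also be had by setting log4j.logger.org.apache.kafka=DEBUG in conf/log4j.properties, which additionally applies to the executors.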
After restarting with DEBUG logging, the actual error became visible in the output (the original screenshot is not preserved).
An error like this means the machine running the job cannot reach the coordinator at the hostname Kafka advertises: here the client discovered the coordinator as slave2:9092, but the driver host could not resolve slave2, so a hosts mapping has to be added.
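A hypothetical /etc/hosts entry on the driver machine might look like the following; the IP address is a placeholder, and only slave2 appears in the log, so substitute the real broker IPs and add entries for any other broker hostnames in the cluster.

```
# /etc/hosts on the machine running the Spark Streaming driver
# (the IP below is a placeholder -- use the broker's real address)
192.168.1.102  slave2
```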
After adding the hosts mapping, the job started successfully!