
## Packages such as spark.sql.SQLContext cannot be found ##

The error message was as follows:

Packages such as spark.sql.SQLContext cannot be found.

I searched Baidu at length without finding the cause; a friend eventually pointed me to the fix.

Solution

Modify the pom.xml:

Comment out the `<scope>` element of the dependency, as in the sketch below.
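For reference, a minimal sketch of the change, assuming the offending entry was a spark-sql dependency marked `provided` (the group id, artifact id, version, and scope value are illustrative, not taken from the original screenshot):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
    <!-- Commented out: with provided, the IDE run configuration may leave this
         jar off the runtime classpath, so classes such as
         org.apache.spark.sql.SQLContext cannot be found when running main.
         With no scope element, the default (compile) applies and the jar is
         on all classpaths. -->
    <!-- <scope>provided</scope> -->
</dependency>
```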

Then reimport the Maven dependencies.


Re-run the main example; the problem is resolved.
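As a quick check that the class now loads, a minimal sketch of such a main (the object name is hypothetical; SQLContext is the Spark 1.x entry point, still present though deprecated in 2.x):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextCheck {
  def main(args: Array[String]): Unit = {
    // Local master so the example runs inside the IDE without a cluster.
    val conf = new SparkConf().setAppName("SqlContextCheck").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // This line is what fails (class not found) while spark-sql is scoped
    // off the runtime classpath.
    val sqlContext = new SQLContext(sc)
    println(s"Spark version: ${sc.version}")

    sc.stop()
  }
}
```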

A few supplementary notes on Maven dependency scopes:

scope:
This element refers to the classpath of the task at hand (compiling and runtime, testing, etc.) as well as how to limit the transitivity of a dependency. There are five scopes available:

compile - this is the default scope, used if none is specified. Compile dependencies are available in all classpaths. Furthermore, those dependencies are propagated to dependent projects.

provided - this is much like compile, but indicates you expect the JDK or a container to provide it at runtime. It is only available on the compilation and test classpath, and is not transitive.

runtime - this scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.

test - this scope indicates that the dependency is not required for normal use of the application, and is only available for the test compilation and execution phases. It is not transitive.

system - this scope is similar to provided except that you have to provide the JAR which contains it explicitly. The artifact is always available and is not looked up in a repository.

In short:

compile: compile/run/test; included when packaging; the default scope
provided: compile/test (the JDK or a container supplies it at runtime); not included when packaging
runtime: run/test only, not needed to compile; included when packaging
test: test compilation and execution only; not included when packaging
system: like provided, except the JAR is supplied explicitly from the local filesystem rather than resolved from a repository
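To make these scopes concrete, a hypothetical `<dependencies>` fragment (all artifacts and versions are illustrative choices, not from the original post):

```xml
<dependencies>
    <!-- compile (default, no scope element): on all classpaths and packaged -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

    <!-- provided: needed to compile and test, but the container supplies it
         at runtime, so it is not packaged -->
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
        <version>3.1.0</version>
        <scope>provided</scope>
    </dependency>

    <!-- runtime: not needed to compile, only to run, e.g. a JDBC driver
         loaded reflectively by class name -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.47</version>
        <scope>runtime</scope>
    </dependency>

    <!-- test: only on the test classpaths, never packaged -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
</dependencies>
```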