Querying Hive returns empty results
Problem description:
I am running Spark 2.1.0 on an AWS EMR cluster (set up following https://aws.amazon.com/blogs/big-data/running-jupyter-notebook-and-jupyterhub-on-amazon-emr/), and querying Hive over JDBC returns empty results.
I am querying a table that exists on the remote Hive server and contains data. Spark correctly infers the schema, but the table contents come back empty. Any ideas?
import os
import findspark
findspark.init('/usr/lib/spark/')
# Spark related imports
from pyspark.sql import SparkSession
from pyspark import SparkContext
sc = SparkContext.getOrCreate()
spark = SparkSession.builder.config(conf=sc.getConf()).getOrCreate()
remote_hive = "jdbc:hive2://myhost:10000/mydb"
driver = "org.apache.hive.jdbc.HiveDriver"
user="user"
password = "password"
df = spark.read.format("jdbc").\
    options(url=remote_hive,
            driver=driver,
            user=user,
            password=password,
            dbtable="mytable").load()
df.printSchema()
# returns the right schema
df.count()
0
Answer
You could try:
df = spark\
    .read.format("jdbc")\
    .option("driver", driver)\
    .option("url", remote_hive)\
    .option("dbtable", "mytable")\
    .option("user", "user")\
    .option("password", "password")\
    .load()
Same result - the table is still empty (df.count() = 0) –