In PySpark I registered a temporary table:
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
# keep the DataFrame itself (no .collect()), so there is something to register
df = sqlContext.sql("select * from test")
df.registerTempTable("testing")
sqlContext.sql("show tables").show()
+--------------------+-----------+
| tableName|isTemporary|
+--------------------+-----------+
| testing| true|
| check| false|
+--------------------+-----------+
So the temporary table "testing" is visible from PySpark.
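To confirm the registration really worked, the temp table can also be queried from the same sqlContext (a minimal check that uses only what is set up above):

# query the temp table through the HiveContext that registered it
sqlContext.sql("select count(*) from testing").show()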
I then started the Spark Thrift server.
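For reference, it was started with the script that ships with Spark; the master URL and bind host below are assumptions, shown only to make the setup concrete:

$ ./sbin/start-thriftserver.sh \
    --hiveconf hive.server2.thrift.port=10000 \
    --hiveconf hive.server2.thrift.bind.host=ip \
    --master local[*]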
Next I launched beeline as a JDBC client and connected to the Spark Thrift server:
$ ./bin/beeline
beeline> !connect jdbc:hive2://ip:10000
Connecting to jdbc:hive2://ip:10000
Enter username for jdbc:hive2://ip:
Enter password for jdbc:hive2://ip:10000:
16/03/06 13:17:41 INFO jdbc.Utils: Supplied authorities: :10000
16/03/06 13:17:41 INFO jdbc.Utils: Resolved authority: :10000
16/03/06 13:17:41 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://ip:10000
Connected to: Spark SQL (version 1.5.2)
Driver: Spark Project Core (version 1.5.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://ip.> show tables;
+-------------+--------------+--+
| tableName | isTemporary |
+-------------+--------------+--+
| check | false |
+-------------+--------------+--+
2 rows selected (0.842 seconds)
0: jdbc:hive2://ip.>
From beeline I cannot see the temporary table. Is there something I am missing?