Running the hive queries from a file through code via sparkcontext or hivecontext (not through command line)
NickName: Siva kumar  Ask DateTime: 2016-11-19T20:40:26


Suppose a file contains a few Hive queries; my goal is to run that file using HiveContext or SparkContext.

From the command line I can do this with hive -f 'filepath/filename', but I need to run it from code (HiveContext or SparkContext). Can anybody help with this?

For a single query I can use:

hiveContext.sql("query")

But here I need to run a file that contains multiple queries.

Copyright Notice: Content author 「Siva kumar」, reproduced under the CC 4.0 BY-SA license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/40692999/running-the-hive-queries-from-a-file-through-code-via-sparkcontext-or-hivecontex

Answers
A. BENCHAMA 2017-09-07T09:45:34

You can do it using Spark/Scala by reading the file line by line and running each line as a query:

import scala.io.Source

val queryFile = "path_of_your_file_queries"
Source.fromFile(queryFile, "utf-8").getLines().foreach(query => hiveContext.sql(query))
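Note that getLines() treats every physical line as a separate statement, so this only works when each query fits on a single line. If the queries in your file span multiple lines and are terminated by semicolons, a variation like the sketch below splits on ';' instead. This is only a sketch: "path_of_your_file_queries" is a placeholder path, and hiveContext is assumed to be an existing HiveContext instance.

import scala.io.Source

// Read the whole file, split it into statements on ';',
// drop empty fragments, and run each statement in turn.
val queries = Source.fromFile("path_of_your_file_queries", "utf-8").mkString
queries.split(";").map(_.trim).filter(_.nonEmpty).foreach(q => hiveContext.sql(q))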

