How to configure Spark source code
1 Answer
Download the Spark binary package and extract it on a *nix machine. In the code below, replace '/Users/jilu/Downloads/' with your own path. This is a standalone program that runs Spark SQL code; the sqlContext is already created, as part of a Spark SQL tutorial I have been writing, which I just updated for the 1.3 release. Barring surprises, the program should work on any 1.x version with minor changes.
PS: this is the Python API; the Scala version is similar.
import os
import sys
import traceback

# Path to the extracted Spark binary package (replace with your own path)
os.environ['SPARK_HOME'] = "/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4"

# Append pyspark and its bundled py4j to the Python path
sys.path.append("/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4/python/")
sys.path.append("/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4/python/lib/py4j-0.8.2.1-src.zip")

# Try to import the needed modules
try:
    from pyspark import SparkContext
    from pyspark import SparkConf
    from pyspark.sql import SQLContext, Row
    print("Successfully imported Spark Modules")
except ImportError:
    print("Can not import Spark Modules {}".format(traceback.format_exc()))
    sys.exit(1)

# Configure the Spark environment: local mode, app name "myApp"
conf = SparkConf().setAppName("myApp").setMaster("local")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
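
To check that the sqlContext actually works, you can run a small query against it. Here is a minimal sketch: the table name "people" and the sample rows are invented for illustration, while createDataFrame, registerTempTable, and sql are the standard SQLContext calls in the 1.3 API.

# Hypothetical smoke test for the sqlContext created above;
# the "people" table and its rows are made-up sample data.
rdd = sc.parallelize([Row(name="Alice", age=25), Row(name="Bob", age=30)])
df = sqlContext.createDataFrame(rdd)   # schema is inferred from the Rows
df.registerTempTable("people")         # register the DataFrame for SQL queries
sqlContext.sql("SELECT name FROM people WHERE age > 26").show()

Because SPARK_HOME and sys.path are set at the top of the script, it should be runnable with a plain python interpreter, without going through spark-submit.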