How do I run a Python script that uses Spark?
1 answer
2016-02-23
~spark$ bin/spark-submit first.py
-----------first.py-------------------------------
from pyspark import SparkConf, SparkContext

# Run Spark locally in a single thread; the app name shows up in logs and the UI.
conf = SparkConf().setMaster("local").setAppName("My App")
sc = SparkContext(conf=conf)

# Load this very script as a text file, one RDD element per line.
lines = sc.textFile("first.py")

# Keep only the lines that contain the string "Python".
pythonLines = lines.filter(lambda line: "Python" in line)

print("hello python")
print(pythonLines.first())
print(pythonLines.first())
print("hello spark!")
---------------------------------------------------
Output:
hello python
pythonLines = lines.filter(lambda line: "Python" in line)
pythonLines = lines.filter(lambda line: "Python" in line)
hello spark!

(The filter's only match is the line that defines pythonLines itself, since that is the only line of first.py containing the capitalized string "Python"; first() is called twice, so the same line prints twice.)
In short: go to the bin/ directory under your Spark installation and run spark-submit ***.py.
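As a small variation (my own sketch, not part of the original answer): spark-submit passes any arguments after the script name through to the script, so you can read the input path from sys.argv instead of hard-coding it. The file name second.py and the fallback path here are just illustrative.

-----------second.py------------------------------
import sys

from pyspark import SparkConf, SparkContext

# Fall back to reading this script itself if no path is given on the command line.
path = sys.argv[1] if len(sys.argv) > 1 else "second.py"

conf = SparkConf().setMaster("local").setAppName("My App")
sc = SparkContext(conf=conf)

lines = sc.textFile(path)
pythonLines = lines.filter(lambda line: "Python" in line)

print(pythonLines.first())

sc.stop()  # shut the SparkContext down cleanly
---------------------------------------------------

Run it the same way, optionally with a path argument: bin/spark-submit second.py /path/to/input.txt. You can also pass a flag such as --master local[2] before the script name to use two local threads.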