How do I build a jar containing a Spark program in Eclipse?
Take WordCount as an example:
package com.lxw.test

import org.apache.spark.{SparkConf, SparkContext}
import SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    if (args.length < 2) {
      println("Usage: WordCount <hdfs-input-path> <hdfs-output-path>")
      System.exit(1)
    }
    val hdfsIn = args(0)
    val hdfsOut = args(1)
    val sc = new SparkContext(new SparkConf().setAppName("WordCount"))
    // Split each line on whitespace, emit (word, 1) pairs, then sum the counts per word
    val srcData = sc.textFile(hdfsIn)
    val result = srcData.flatMap(_.split("\\s+")).map((_, 1)).reduceByKey(_ + _)
    result.saveAsTextFile(hdfsOut)
  }
}
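Before packaging, the transformation chain (flatMap, map, reduceByKey) can be sanity-checked locally with plain Scala collections, no cluster required. This is only a sketch: groupBy plus a group-size count stands in for Spark's reduceByKey, and the sample input lines are made up.

```scala
object WordCountLocal {
  def main(args: Array[String]): Unit = {
    // Hypothetical sample input, standing in for lines read from HDFS
    val lines = Seq("hello spark", "hello eclipse spark")
    // Same word-count logic as the RDD version, on a local collection:
    // split on whitespace, group identical words, count each group
    val counts = lines
      .flatMap(_.split("\\s+"))
      .groupBy(identity)
      .map { case (word, occurrences) => (word, occurrences.size) }
    // Sort for stable output; prints (eclipse,1), (hello,2), (spark,2)
    counts.toSeq.sortBy(_._1).foreach(println)
  }
}
```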
In Eclipse, simply export the program as an ordinary Java jar.
Then, on a Spark client machine, submit the jar with spark-submit:
$SPARK_HOME/bin/spark-submit \
  --name "lxw1234-wordcount" \
  --master spark://192.168.1.130:7077 \
  --executor-memory 1G \
  --class com.lxw.test.WordCount \
  /home/lxw1234/lxw-spark.jar /logs/site/2015-05-14/ /tmp/lxwoutput