Spark Installation and Basic Programming
Linux: Ubuntu 16.04
Hadoop: 2.7.1
JDK: 1.8
Spark: 2.4.3
1. Download the installation files
http://spark.apache.org/downloads.html
https://archive.apache.org/dist/spark/
hadoop@dblab:/usr/local$ sudo wget http://mirror.bit.edu.cn/apache/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
hadoop@dblab:/usr/local$ sudo tar -zxf spark-2.4.3-bin-hadoop2.7.tgz -C /usr/local/
hadoop@dblab:/usr/local$ sudo mv spark-2.4.3-bin-hadoop2.7/ spark/
hadoop@dblab:/usr/local$ sudo chown -R hadoop:hadoop spark/
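At this point /usr/local/spark should contain the usual top-level layout (bin, conf, examples, jars, and so on); a quick sanity check:
hadoop@dblab:/usr/local$ ls /usr/local/spark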
2. Configure the relevant files
hadoop@dblab:/usr/local/spark$ cp ./conf/spark-env.sh.template ./conf/spark-env.sh
Then edit spark-env.sh and add the following line at the top:
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
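One way to append the line without opening an editor, as a minimal sketch (the single quotes keep $(...) from being expanded by the shell at this point, so the literal command lands in the file):
hadoop@dblab:/usr/local/spark$ echo 'export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)' >> ./conf/spark-env.sh
With this setting Spark picks up the locally installed Hadoop classpath and can read and write HDFS data; without it, Spark can only work with local files.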
# Verify that Spark was installed successfully
hadoop@dblab:/usr/local/spark$ bin/run-example SparkPi
Pi is roughly 3.139035695178476
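The example prints a large amount of log output around the result; to see only the line with the estimate, the run can be piped through grep (a small convenience, not required):
hadoop@dblab:/usr/local/spark$ bin/run-example SparkPi 2>&1 | grep "Pi is"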
3. Start the Spark shell
hadoop@dblab:/usr/local/spark$ ./bin/spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_212)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
scala> 8*2+5
res0: Int = 21
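The shell starts with a SparkContext already bound to the name sc (and, in Spark 2.x, a SparkSession bound to spark); sc is what the file-reading examples below use. Any ordinary Scala expression can also be evaluated interactively, for example (a trivial sketch, names and values are arbitrary):
scala> val greeting = "Hello, Spark"
greeting: String = Hello, Spark
scala> greeting.length
res1: Int = 12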
4. Reading files
1. Reading a local file
Start the Hadoop daemons first; HDFS is not required for reading a local file, but the HDFS example in the next subsection uses it:
hadoop@dblab:/usr/local/hadoop$ ./sbin/start-dfs.sh
scala> val textFile=sc.textFile("file:///usr/local/spark/README.md")
textFile: org.apache.spark.rdd.RDD[String] = file:///usr/local/spark/README.md MapPartitionsRDD[1] at textFile at
scala> textFile.first()
res0: String = # Apache Spark
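A few more standard RDD operations on the same textFile (count and filter are part of the core RDD API):
scala> textFile.count()      // number of lines in the file
scala> textFile.filter(line => line.contains("Spark")).count()   // lines that mention "Spark"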
2. Reading an HDFS file
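If the HDFS home directory for the hadoop user does not exist yet, create it first (this assumes a freshly formatted HDFS; -mkdir -p is the standard HDFS shell option):
hadoop@dblab:/usr/local/hadoop$ ./bin/hdfs dfs -mkdir -p /user/hadoop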
hadoop@dblab:/usr/local/hadoop$ ./bin/hdfs dfs -put /usr/local/spark/README.md .
hadoop@dblab:/usr/local/hadoop$ ./bin/hdfs dfs -cat README.md
scala> val textFile=sc.textFile("hdfs://localhost:9000/user/hadoop/README.md")
textFile: org.apache.spark.rdd.RDD[String] = hdfs://localhost:9000/user/hadoop/README.md MapPartitionsRDD[3] at textFile at
scala> textFile.first()
res1: String = # Apache Spark
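Because fs.defaultFS is hdfs://localhost:9000 and the shell runs as the hadoop user (both carried over from this article's Hadoop setup), the same file can also be opened with a relative path, which is resolved against /user/hadoop:
scala> val textFile=sc.textFile("README.md")
scala> textFile.first()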
scala> :quit