In the singularities/spark 2.2 image:
Hadoop version: 2.8.2
Spark version: 2.2.1
Scala version: 2.11.8
Java version: 1.8.0_151
Pull the image:
[root@localhost docker-spark-2.1.0]# docker pull singularities/spark
List the image:
[root@localhost docker-spark-2.1.0]# docker image ls
REPOSITORY                      TAG      IMAGE ID       CREATED        SIZE
docker.io/singularities/spark   latest   84222b254621   6 months ago   1.39 GB
Create the docker-compose.yml file:
[root@localhost home]# mkdir singularitiesCR
[root@localhost home]# cd singularitiesCR
[root@localhost singularitiesCR]# touch docker-compose.yml
Contents:
version: "2"
services:
  master:
    image: singularities/spark
    command: start-spark master
    hostname: master
    ports:
      - "6066:6066"
      - "7070:7070"
      - "8080:8080"
      - "50070:50070"
  worker:
    image: singularities/spark
    command: start-spark worker master
    environment:
      SPARK_WORKER_CORES: 1
      SPARK_WORKER_MEMORY: 2g
    links:
      - master
Running docker-compose up starts a Spark cluster with a single worker node in standalone mode:
[root@localhost singularitiesCR]# docker-compose up -d
Creating singularitiescr_master_1 ... done
Creating singularitiescr_worker_1 ... done
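To confirm the cluster actually came up, it can help to check the service logs and the master's web UI before moving on. A minimal sketch (the service names match the compose file above; the UI check assumes the master has finished starting):

```shell
# Tail the most recent startup log lines from both services
docker-compose logs --tail=20 master worker

# The master web UI is published on host port 8080 (per the
# compose file); once the worker has registered, the page lists it
curl -s http://localhost:8080 | grep -i worker
```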
List the containers:
[root@localhost singularitiesCR]# docker-compose ps
          Name                     Command            State                           Ports
--------------------------------------------------------------------------------------------------------------------
singularitiescr_master_1   start-spark master          Up     10020/tcp, 13562/tcp, 14000/tcp, 19888/tcp,
                                                              50010/tcp, 50020/tcp, 0.0.0.0:50070->50070/tcp,
                                                              50075/tcp, 50090/tcp, 50470/tcp, 50475/tcp,
                                                              0.0.0.0:6066->6066/tcp, 0.0.0.0:7070->7070/tcp,
                                                              7077/tcp, 8020/tcp, 0.0.0.0:8080->8080/tcp,
                                                              8081/tcp, 9000/tcp
singularitiescr_worker_1   start-spark worker master   Up     10020/tcp, 13562/tcp, 14000/tcp, 19888/tcp,
                                                              50010/tcp, 50020/tcp, 50070/tcp, 50075/tcp,
                                                              50090/tcp, 50470/tcp, 50475/tcp, 6066/tcp,
                                                              7077/tcp, 8020/tcp, 8080/tcp, 8081/tcp, 9000/tcp
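Because the worker service publishes no host ports and only links to the master, additional workers can be added by scaling that service. A sketch, assuming a docker-compose version that supports the `--scale` flag on `up` (older releases use the separate `scale` subcommand):

```shell
# Run three worker containers; each registers with the master
docker-compose up -d --scale worker=3

# On older docker-compose versions the equivalent is:
# docker-compose scale worker=3

# The new workers should appear in the container list
docker-compose ps
```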
Check the result (the master web UI is available on the mapped host port 8080):
Stop the containers:
[root@localhost singularitiesCR]# docker-compose stop
Stopping singularitiescr_worker_1 ... done
Stopping singularitiescr_master_1 ... done
[root@localhost singularitiesCR]# docker-compose ps
          Name                     Command             State     Ports
-----------------------------------------------------------------------
singularitiescr_master_1   start-spark master          Exit 137
singularitiescr_worker_1   start-spark worker master   Exit 137
Remove the containers:
[root@localhost singularitiesCR]# docker-compose rm
Going to remove singularitiescr_worker_1, singularitiescr_master_1
Are you sure? [yN] y
Removing singularitiescr_worker_1 ... done
Removing singularitiescr_master_1 ... done
[root@localhost singularitiesCR]# docker-compose ps
Name   Command   State   Ports
------------------------------
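The two steps above (stop, then rm) can also be collapsed into one command:

```shell
# Stops and removes the compose project's containers in one step,
# and also removes the default network the project created
docker-compose down
```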
Enter the master container and check the versions:
[root@localhost singularitiesCR]# docker exec -it 497 /bin/bash
root@master:/# hadoop version
Hadoop 2.8.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 66c47f2a01ad9637879e95f80c41f798373828fb
Compiled by jdu on 2017-10-19T20:39Z
Compiled with protoc 2.5.0
From source with checksum dce55e5afe30c210816b39b631a53b1d
This command was run using /usr/local/hadoop-2.8.2/share/hadoop/common/hadoop-common-2.8.2.jar
root@master:/# which hadoop
/usr/local/hadoop-2.8.2/bin/hadoop
root@master:/# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/08/14 09:20:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.18.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1534238447256).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
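Note that the spark-shell session above runs with `master = local[*]`, i.e. it does not actually use the standalone cluster. To exercise the cluster itself, point a job at the standalone master on port 7077. A sketch using the bundled SparkPi example, run from inside the master container (the examples jar path follows the stock Spark 2.2.1 layout and may differ in this image):

```shell
# Attach the shell to the standalone master instead of local mode
spark-shell --master spark://master:7077

# Or submit the bundled SparkPi example to the cluster;
# the jar path is an assumption based on a standard Spark 2.2.1 install
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077 \
  /usr/local/spark-2.2.1/examples/jars/spark-examples_2.11-2.2.1.jar 100
```

A successfully submitted application also shows up under "Running Applications" in the master web UI on port 8080.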