# 1. Installation

Project directory: D:/env/docker-compose/spark

```shell
docker-compose up -d
```

## Starting Hadoop

1. Enter the master node and run `./start-hadoop.sh`.

# 2. Web UIs

- [Spark cluster manager](http://localhost:8080/)
- [HDFS admin UI](http://localhost:9870/dfshealth.html#tab-overview)
- [YARN resource manager](http://localhost:8088/cluster)

# 3. FAQ

## WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

See [Hadoop 2.x installation: cannot load native library - java.library.path error (likecs.com)](https://www.likecs.com/show-308371162.html). The fix requires editing the hadoop-env.sh configuration file.
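As a sketch of the hadoop-env.sh change for the NativeCodeLoader warning, a commonly used fix is to point `java.library.path` at Hadoop's bundled native libraries. The path `$HADOOP_HOME/lib/native` is an assumption about this image's layout; verify it inside the container first.

```shell
# Fragment appended to $HADOOP_HOME/etc/hadoop/hadoop-env.sh on the master node.
# Assumption: the image ships its native libs under $HADOOP_HOME/lib/native.
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

If the warning persists, the native libraries may have been built for a different platform than the container's; in that case the warning is harmless and Hadoop falls back to the builtin Java classes.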
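Putting the installation steps above together, a minimal bring-up sketch. The service name `master` is an assumption about this particular docker-compose.yml; check it with `docker-compose ps`.

```shell
# Run from the compose project directory (D:/env/docker-compose/spark).
docker-compose up -d                   # start the Spark/Hadoop containers in the background

# Attach to the master node and launch Hadoop.
# "master" is the assumed service name from docker-compose.yml.
docker-compose exec master ./start-hadoop.sh

# Once up, the web UIs are reachable on localhost:8080 (Spark),
# localhost:9870 (HDFS), and localhost:8088 (YARN).
```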