[Five Minutes of Big Data a Day, Issue 1] Pseudo-Distributed Setup + Hadoop Streaming

Before We Start

A while back I planned to start on big data and algorithm content once the LeetCode topic series wrapped up.

The LeetCode topic review is already more than halfway done.

The Big Data Plan

Many people are already fluent with Hive, HBase, Spark, and Flink; others have used them but still feel fuzzy about big data as a whole.

Going forward we will tackle HDFS, Hive, HBase, Spark, YARN, Kafka, ZooKeeper, and more, one at a time!

Getting a Feel for Big Data

Let's start by getting hands-on! Today's plan:

1. Set up a Hadoop pseudo-distributed environment (the files you need are already prepared; grab them at the end of the post);

2. Get a feel for it with the official WordCount program;

3. Write a WordCount of our own with Hadoop Streaming.

PS: pseudo-distributed mode and WordCount are old hat, but a pseudo-distributed setup is still great for testing features, and the two make a fitting opener for a big data series!

Setting Up a Pseudo-Distributed Environment

I set this up on a single server with 2 cores and 2 GB of RAM.

You can also build it in a virtual machine!

Step ① Install the JDK

Official download page: https://www.oracle.com/java/technologies/downloads/#java8

JDK 8 is chosen here (download link at the end of the post).

Under /usr/local/src/, create a java directory and extract the JDK into it:

mkdir -p /usr/local/src/java
tar -xvf jdk-8u311-linux-x64.tar -C /usr/local/src/java

Step ② Configure environment variables

Here the configuration goes at the end of /etc/profile; adjust to your own setup if needed.

vim /etc/profile
# append at the end of the file
export JAVA_HOME=/usr/local/src/java/jdk1.8.0_311/
export PATH=$JAVA_HOME/bin:$PATH

Reload the configuration:

source /etc/profile

Verify that Java is installed correctly:

java -version
java version "1.8.0_311"
Java(TM) SE Runtime Environment (build 1.8.0_311-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.311-b11, mixed mode)
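If java -version is not found, the new variables are probably not active in the current shell yet. A minimal check (nothing Hadoop-specific, just confirming the profile edit took effect):

# confirm the variables from /etc/profile are visible in this shell
echo $JAVA_HOME
which java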

Step ③ Upload the Hadoop tarball (link at the end of the post)

Version 2.6.1 is used here; extract it to the target directory:

mkdir -p /usr/local/src/hadoop
tar -xvf  hadoop-2.6.1.tar.gz -C /usr/local/src/hadoop/

Step ④ Edit the Hadoop configuration files

We need to modify 5 files, all located under /usr/local/src/hadoop/hadoop-2.6.1/etc/hadoop.

The 5 files are:

hadoop-env.sh
core-site.xml
hdfs-site.xml
mapred-site.xml.template
yarn-site.xml

4.1 hadoop-env.sh

This file mainly configures the Java environment. Open it with vim and set:

Note: this must be the literal path, not the name of the environment variable.

export JAVA_HOME=/usr/local/src/java/jdk1.8.0_311/

4.2 core-site.xml

This file defines system-level parameters such as the HDFS URI and Hadoop's temporary directory:


<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/src/hadoop/hadoop-2.6.1/data</value>
    </property>
</configuration>

4.3 hdfs-site.xml

This is where the HDFS replication factor is set. It is normally 3, but since today's setup is pseudo-distributed on a single machine, 1 is enough:


<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

4.4 mapred-site.xml

Hadoop ships only a template, mapred-site.xml.template, so copy it first:

cp mapred-site.xml.template mapred-site.xml

Then configure the YARN master host and the shuffle service that passes map output to reduce:


<configuration>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>localhost</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

4.5 yarn-site.xml

Configure the ResourceManager and NodeManager communication ports, the web monitoring port, and so on:


<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
    <property>
        <name>yarn.resourcemanager.address</name>
        <value>0.0.0.0:8032</value>
    </property>
    <property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>0.0.0.0:8030</value>
    </property>
    <property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>0.0.0.0:8035</value>
    </property>
    <property>
        <name>yarn.resourcemanager.admin.address</name>
        <value>0.0.0.0:8033</value>
    </property>
    <property>
        <name>yarn.resourcemanager.webapp.address</name>
        <value>0.0.0.0:8088</value>
    </property>
</configuration>

Step ⑤ Add Hadoop to the environment variables

Configure the environment variables just like for Java:

vim /etc/profile
# append at the end of the file
export HADOOP_HOME=/usr/local/src/hadoop/hadoop-2.6.1
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$JAVA_HOME/bin:$PATH

Reload the configuration:

source /etc/profile
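As a quick sanity check before moving on (assuming the PATH edit above is correct), the hadoop binary should now resolve and report its version:

which hadoop
hadoop version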

Step ⑥ Format the NameNode

Initialize HDFS, which is essentially formatting the file system.

This writes files into /usr/local/src/hadoop/hadoop-2.6.1/data.

hadoop namenode -format

The output should report that formatting succeeded.

Step ⑦ Configure passwordless SSH to localhost

Without passwordless SSH to localhost, the startup scripts will keep prompting for the user's password.

yum -y install openssh-server

If local SSH already works, you can skip this installation.

Then:

 ssh-keygen -t rsa

Keep pressing Enter through all the prompts:

[root@iZ2zebkqy02hia7o7gj8paZ sbin]# ssh-keygen -t rsa
Generating public/private rsa key pair.

Enter file in which to save the key (/root/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /root/.ssh/id_rsa.

Your public key has been saved in /root/.ssh/id_rsa.pub.

The key fingerprint is:
SHA256:oZgJuXGvxnk5pkkfd3r5CT/+uxq9nJMn2xJWk5pa6so root@iZ2zebkqy02hia7o7gj8paZ
The key's randomart image is:
+---[RSA 3072]----+
|                 |
|   .             |
|  + .   .       .|
|   = = . .     o.|
|  . + o S     o..|
|   . o .     =o  |
|    * * . o.=..o |
|   o * +.oo=.+=+.|
|    o . .Eo+*+OO.|
+----[SHA256]-----+

Then:

cd ~/.ssh/
cp id_rsa.pub authorized_keys
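If ssh localhost still prompts for a password afterwards, it is often a file-permission issue: on many systems sshd ignores the key unless the permissions are strict. A hedged fix, depending on your distribution's sshd settings:

# sshd usually requires the key directory and file to be private
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys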

Verify it:

ssh localhost

If it logs you in without asking for a password, you are all set!

Step ⑧ Start the Hadoop cluster

The components to start are HDFS and YARN.

Start HDFS first, then YARN; both scripts are under /usr/local/src/hadoop/hadoop-2.6.1/sbin:

./start-dfs.sh
./start-yarn.sh
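For reference, the matching shutdown scripts live in the same sbin directory; a minimal sketch for stopping the pseudo-cluster later, in reverse order:

./stop-yarn.sh
./stop-dfs.sh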

After starting, check the processes with jps:

35072 NodeManager
34498 DataNode
34644 SecondaryNameNode
34788 ResourceManager
34380 NameNode
82205 Jps

Seeing these processes means the startup succeeded.

Since my setup is on a server, the HDFS and YARN web UIs can be reached directly via the server's IP address.

HDFS web UI: ip:50070 (replace ip with your own address)

YARN web UI: ip:8088 (replace ip with your own address)

At this point the pseudo-distributed cluster is ready, including HDFS, YARN, MapReduce, and related components.

Let's start by running WordCount with the example that ships with Hadoop.

Built-in WordCount Demo

First, the bundled WordCount lives in hadoop-mapreduce-examples-2.6.1.jar under /usr/local/src/hadoop/hadoop-2.6.1/share/hadoop/mapreduce.

All we need to prepare is a couple of text files; then we use the provided jar to count the words in them.

I created data1.txt and data2.txt under /usr/local/src/hadoop/hadoop-2.6.1/data/:

vim data1.txt
vim data2.txt

Contents of data1.txt:

hadoop flink spark
kafka hive
hbase flink hadoop spark
spark
hbase spark hadoop

Contents of data2.txt:

hadoop flink spark
kafka hive hbase flink hadoop spark
spark hbase spark hadoop

Then create an /input directory on HDFS and upload both files:

# create the /input directory
hadoop dfs -mkdir /input
# upload the files to /input
hadoop dfs -put data* /input/

List the directory to confirm the upload succeeded:

 hadoop dfs -ls /input
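You can also print one of the uploaded files straight from HDFS to double-check its content (a small optional check):

hadoop fs -cat /input/data1.txt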

You can also browse the files in the HDFS web UI.

Run the WordCount demo using the bundled example jar:

hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.1.jar wordcount /input /out
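One note if you run this more than once: MapReduce refuses to write into an existing output directory, so the previous /out has to be removed first (a sketch; adjust the path if you chose a different output directory):

hadoop fs -rm -r /out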

You can see job number 1 running on the local cluster.

Finally, check the computed results:

[root@iZ2zebkqy02hia7o7gj8paZ hadoop-2.6.1]# hadoop dfs -ls /out
DEPRECATED: Use of this script to execute hdfs command is deprecated.

Instead use the hdfs command for it.

Found 2 items
-rw-r--r--   1 root supergroup          0 2021-12-06 18:28 /out/_SUCCESS
-rw-r--r--   1 root supergroup         48 2021-12-06 18:28 /out/part-r-00000
[root@iZ2zebkqy02hia7o7gj8paZ hadoop-2.6.1]# hadoop dfs -text /out/part-r-00000
DEPRECATED: Use of this script to execute hdfs command is deprecated.

Instead use the hdfs command for it.

flink   4
hadoop  6
hbase   4
hive    2
kafka   2
spark   8

The output can also be viewed in the web UI.

OK! With that, we have run our first MapReduce job on the pseudo-distributed environment.

Having tried the built-in example, let's now write a MapReduce job of our own.

Your First MR Program

An MR (MapReduce) program is usually written in Java; the Hadoop ecosystem itself is implemented in Java.

So Java is normally the best choice for MR development.

But today we will use Python as the MR development language.

There are two reasons. First, Python's friendliness toward many algorithm engineers goes without saying. Second, MR programs usually run as offline jobs: the latency requirements are loose, but development speed matters a lot, and a Python MR program works out of the box. With Java you would need to set up jar dependencies, package the build, upload it...

So let's implement WordCount with Python.

The official documentation puts it this way:

Hadoop Streaming is a utility that ships with Hadoop. It helps users create and run a special class of map/reduce jobs in which executables or script files act as the mapper or the reducer.
For example:

$HADOOP_HOME/bin/hadoop jar $HADOOP_HOME/hadoop-streaming.jar \
    -input myInputDirs \
    -output myOutputDir \
    -mapper /bin/cat \
    -reducer /bin/wc

Besides that, we need a mapper program and a reducer program.

Let's get to it!

First we need a mapper program that reads its input from standard input.

Write mapper.py:

#!/usr/bin/python

import sys
import re

# read lines from stdin, split on runs of spaces, and emit "word<TAB>1" pairs
for line in sys.stdin:
    words = re.split(" +", line.strip())
    for word in words:
        print("%s\t%s" % (word, "1"))

The regular expression split guards against multiple spaces between words.
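Before involving Hadoop at all, the mapper can be sanity-checked on its own; for example (assuming mapper.py sits in the current directory), feeding it one line by hand should print one word<TAB>1 pair per word, even with the doubled space:

echo "hadoop  flink spark" | python mapper.py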

Next, write reducer.py:

#!/usr/bin/python

import sys

# Streaming sorts the mapper output by key, so identical words arrive
# consecutively; accumulate a count until the word changes.
sum = 0
last_word = None

for line in sys.stdin:
    word = line.strip().split("\t")
    if len(word) != 2:
        continue
    word = word[0]

    if last_word is None:
        last_word = word

    if last_word != word:
        # the previous word's group is complete: emit it and reset the count
        print('\t'.join([last_word, str(sum)]))
        last_word = word
        sum = 0
    sum += 1

# emit the final group (guard against completely empty input)
if last_word is not None:
    print('\t'.join([last_word, str(sum)]))

You can test the whole pipeline locally first with a single shell command:

cat input_file | python mapper.py | sort -k1 | python reducer.py
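With the two sample files from earlier, the local pipeline can be exercised like this (a sketch, assuming mapper.py, reducer.py, and the data files are in the same directory); the counts should line up with the built-in WordCount output shown above:

# run both sample files through the local mapper/sort/reducer pipeline
cat data1.txt data2.txt | python mapper.py | sort -k1 | python reducer.py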

Finally, we can wire everything into the Hadoop Streaming tool described in the documentation.

Write main.sh to submit the mapper and the reducer:

#!/bin/bash

# path to the hadoop executable and to the streaming jar
HADOOP_HOME="/usr/local/src/hadoop/hadoop-2.6.1/bin/hadoop"
STREAM_JAR_PATH="/usr/local/src/hadoop/hadoop-2.6.1/share/hadoop/tools/lib/hadoop-streaming-2.6.1.jar"

INPUT_PATH="/input"
OUTPUT_PATH="/out_streaming"

# clear the output directory from the previous run
${HADOOP_HOME} dfs -rmr ${OUTPUT_PATH}

${HADOOP_HOME} jar ${STREAM_JAR_PATH} \
    -input ${INPUT_PATH} \
    -output ${OUTPUT_PATH} \
    -mapper "python mapper.py" \
    -reducer "python reducer.py" \
    -file ./mapper.py \
    -file ./reducer.py

Run it; the -x flag makes the shell print each command as it executes:

sh -x main.sh

Now check the result:

[root@iZ2zebkqy02hia7o7gj8paZ script]# hadoop fs -ls /out_streaming
Found 2 items
-rw-r--r--   1 root supergroup          0 2021-12-07 15:43 /out_streaming/_SUCCESS
-rw-r--r--   1 root supergroup         48 2021-12-07 15:43 /out_streaming/part-00000
[root@iZ2zebkqy02hia7o7gj8paZ script]# hadoop fs -text /out_streaming/part-00000
flink   4
hadoop  6
hbase   4
hive    2
kafka   2
spark   8

The result matches the output of the built-in WordCount example above exactly!

That's it for this issue. See you next time! Bye~~

Original: https://www.cnblogs.com/yydsxiaozhu/p/15673429.html
Author: 技术gogogo
Title: 【每天五分钟大数据-第一期】 伪分布式+Hadoopstreaming
