[Big Data Learning 02] Preparation for Hadoop cluster installation

I. Connecting virtual machines to the Internet
II. Configuring a fixed IP for the virtual machine
III. Creating users and configuring sudo permissions
IV. Configuring the virtual machine host name
V. Closing the firewall
VI. Installing and configuring the SSH service
VII. Configuring the mapping between host names and IPs
VIII. Installing the JDK

I. Connecting virtual machines to the Internet

If the virtual machine cannot reach the network through its NIC, the interface needs to be enabled at boot:

```shell
vi /etc/sysconfig/network-scripts/ifcfg-eth0   # edit the NIC configuration file
# make the following change:
ONBOOT=yes
# then restart the network service:
service network restart
```

II. Configuring a fixed IP for the virtual machine

```shell
vi /etc/sysconfig/network-scripts/ifcfg-eth0   # edit the NIC configuration file
# make the following changes:
BOOTPROTO=static
IPADDR=192.168.80.133
NETMASK=255.255.255.0
GATEWAY=192.168.80.2
# then restart the network service:
service network restart
```
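One thing worth noting: with `BOOTPROTO=static` the machine no longer receives a DNS server from DHCP, so external name resolution can fail even though the IP itself works. A DNS entry can be added to the same file; the address below is an assumption (on VMware NAT networks the gateway at .2 usually also serves DNS; substitute your network's DNS server if it differs):

```shell
# in /etc/sysconfig/network-scripts/ifcfg-eth0
DNS1=192.168.80.2   # assumed value; VMware NAT typically serves DNS on the gateway
```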

III. Creating users and configuring sudo permissions

Create the user:

```shell
useradd -m hadoop   # -m also creates the home directory /home/hadoop
```
Configure sudo permissions

```shell
visudo   # edit the sudoers file as root
# append at the end of the file:
hadoop  ALL=(ALL)  NOPASSWD:ALL
```

IV. Configuring the virtual machine host name

  1. Temporary modification (generally not used):

```shell
hostname 123   # 123 is the temporary host name; it is lost on reboot
```

  2. Permanent modification:

```shell
vi /etc/sysconfig/network
# modify the host name:
HOSTNAME=123
# restart for the change to take effect:
reboot
```
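A version note: on CentOS 7 and later, /etc/sysconfig/network is no longer consulted for the host name; on those systems `hostnamectl` makes the change permanent without a reboot:

```shell
hostnamectl set-hostname 123   # permanent on systemd systems (CentOS 7+), no reboot needed
```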

V. Closing the firewall

The firewall protects the server, but it interferes with communication between the cluster nodes, so it needs to be turned off.

Temporarily (until the next reboot):

```shell
service iptables status   # check the firewall status
service iptables stop     # stop the firewall
```

Permanently:

```shell
chkconfig iptables off   # does not take effect immediately
reboot                   # takes effect after the reboot
```
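The commands above target the iptables service of CentOS 6. On CentOS 7 and later the default firewall is firewalld, so the equivalent steps differ:

```shell
systemctl status firewalld    # check the firewall status
systemctl stop firewalld      # stop it until the next reboot
systemctl disable firewalld   # keep it from starting on boot
```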

VI. Installing and configuring the SSH service

```shell
yum install -y openssh-clients   # install the ssh client
su hadoop                        # switch to the user that needs ssh configured
ssh-keygen -t rsa                # generate a key pair
cd /home/hadoop/.ssh
cp id_rsa.pub authorized_keys    # put the public key into the authorized keys file
sudo passwd hadoop               # set a password for the hadoop user
ssh learn                        # test the ssh connection
```
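One step worth adding: sshd refuses to use authorized_keys when the permissions are too open, so passwordless login can still fail after the steps above. The .ssh directory should be 700 and the file 600. The sketch below demonstrates this in a throwaway directory so it is safe to run anywhere; on the cluster node, apply the same chmod commands to /home/hadoop/.ssh:

```shell
# demo in a temporary directory; on the real machine run the same
# chmod commands against /home/hadoop/.ssh instead
demo=$(mktemp -d)
mkdir -p "$demo/.ssh"
touch "$demo/.ssh/authorized_keys"
chmod 700 "$demo/.ssh"                   # only the owner may enter the directory
chmod 600 "$demo/.ssh/authorized_keys"   # only the owner may read/write the file
stat -c '%a' "$demo/.ssh" "$demo/.ssh/authorized_keys"   # prints 700, then 600
```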

VII. Configuring the mapping between host names and IPs

```shell
vi /etc/hosts
# add the following line (IP, then host name):
192.168.80.133 learn
```
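A quick way to confirm the entry is well-formed is to grep for it. The sketch below uses a temporary file standing in for /etc/hosts so it can be run without root; on the real machine, grep /etc/hosts itself and then try `ping learn`:

```shell
hosts=$(mktemp)                          # stand-in for /etc/hosts in this demo
echo '192.168.80.133 learn' >> "$hosts"
# the entry should match: IP, whitespace, host name
grep -cE '^192\.168\.80\.133[[:space:]]+learn$' "$hosts"   # prints 1
```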

VIII. Installing the JDK

Because Hadoop is written in Java, a Java runtime is required to run it. JDK 1.7 can be downloaded from this link:
https://www.oracle.com/technetwork/java/javase/downloads/java-archive-downloads-javase7-521261.html

```shell
su hadoop
mkdir /home/hadoop/app   # directory for installing software
cd /home/hadoop/app
mkdir modules            # holds the unpacked software
mkdir soft               # holds the downloaded archives
# upload the downloaded JDK archive into the soft directory, then:
cd soft
sudo chown hadoop:hadoop jdk-7u80-linux-x64.tar.gz
tar -zxvf jdk-7u80-linux-x64.tar.gz -C ../modules   # unpack into modules
```

Configure JDK environment variables

```shell
sudo vi /etc/profile
# append the following:
# JAVA Environment
export JAVA_HOME=/home/hadoop/app/modules/jdk1.7.0_80
export CLASSPATH=$JAVA_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$PATH

source /etc/profile   # make the environment variables take effect
java -version         # check whether the configuration succeeded
```
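The order in the PATH line matters: prepending $JAVA_HOME/bin means this JDK is found before any system-installed java. A minimal sketch of what the export does:

```shell
# same JAVA_HOME as configured above
JAVA_HOME=/home/hadoop/app/modules/jdk1.7.0_80
PATH="$JAVA_HOME/bin:$PATH"
# the first PATH entry is now the JDK's bin directory,
# so `java` resolves there first
echo "$PATH" | cut -d: -f1   # prints /home/hadoop/app/modules/jdk1.7.0_80/bin
```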

3 December 2019, 21:55
