Installing Hadoop on Mac Step-by-Step Guide


Learn how to install and configure Hadoop on your Mac, including setting up Homebrew, installing Java 8, configuring the Hadoop properties files, and more. Follow the detailed instructions to successfully set up Hadoop for your data processing needs.

  • Hadoop installation
  • Mac setup
  • Homebrew
  • Java 8
  • Configuration

Uploaded on Mar 01, 2025



Presentation Transcript


  1. Install Hadoop on Mac

  2. Install Homebrew
     $ /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
     Press RETURN to initiate the installation. The process may take about 10 minutes; wait patiently until the command prompt returns.
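Before continuing, it may help to confirm that the `brew` command is actually on your PATH. A minimal check (nothing here is Hadoop-specific):

```shell
# Capture brew's version line if Homebrew is installed, else a notice.
if command -v brew >/dev/null 2>&1; then
  brew_status=$(brew --version | head -n 1)
else
  brew_status="Homebrew not found on PATH; re-run the installer or open a new shell"
fi
echo "$brew_status"
```

If the fallback message appears, the installer's final output usually tells you which line to add to your shell profile.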

  3. Install Java 8
     Use the following command to check your existing Java version:
     $ java -version
     If you don't have Java, or your Java version is not 1.8, you need to install JDK 1.8. Issue the following command to install Java 8:
     $ brew install --cask homebrew/cask-versions/adoptopenjdk8

  4. Install Hadoop
     Use the following command to install the latest version of Hadoop under /usr/local/Cellar/hadoop:
     $ brew install hadoop

  5. Configure Hadoop
     Use the following commands to open hadoop-env.sh:
     $ cd /usr/local/Cellar/hadoop/3.3.0/libexec/etc/hadoop
     $ open hadoop-env.sh

  6. Add the following line to hadoop-env.sh:
     export JAVA_HOME=/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home
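Rather than hard-coding the path, you can ask macOS for it: the system ships a /usr/libexec/java_home helper that prints the home directory of an installed JDK. A hedged sketch (the -v 1.8 selector assumes a Java 8 install like the AdoptOpenJDK one above; on non-macOS systems the helper does not exist and the script just says so):

```shell
# Ask macOS for the home of an installed Java 1.8 JDK, if the helper exists.
if [ -x /usr/libexec/java_home ]; then
  java8_home=$(/usr/libexec/java_home -v 1.8 2>/dev/null || echo "no 1.8 JDK found")
else
  java8_home="java_home helper not available (not macOS)"
fi
echo "JAVA_HOME candidate: $java8_home"
```

Whatever path this prints is the value to paste after `export JAVA_HOME=` in hadoop-env.sh.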

  7. Then open core-site.xml:
     $ open -e core-site.xml
     Replace the empty <configuration></configuration> with the following:
     <configuration>
       <property>
         <name>fs.defaultFS</name>
         <value>hdfs://localhost:9000</value>
       </property>
     </configuration>

  8. Then open hdfs-site.xml:
     $ open -e hdfs-site.xml
     Replace the empty <configuration></configuration> with the following:
     <configuration>
       <property>
         <name>dfs.replication</name>
         <value>1</value>
       </property>
     </configuration>

  9. Then open mapred-site.xml:
     $ open -e mapred-site.xml
     Replace the empty <configuration></configuration> with the following:
     <configuration>
       <property>
         <name>mapreduce.framework.name</name>
         <value>yarn</value>
       </property>
       <property>
         <name>mapreduce.application.classpath</name>
         <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
       </property>
     </configuration>

  10. Then open yarn-site.xml:
      $ open -e yarn-site.xml
      Replace the empty <configuration></configuration> with the following:
      <configuration>
        <property>
          <name>yarn.nodemanager.aux-services</name>
          <value>mapreduce_shuffle</value>
        </property>
        <property>
          <name>yarn.nodemanager.env-whitelist</name>
          <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
        </property>
      </configuration>
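The four edits in steps 7-10 can also be scripted instead of made in an editor. The sketch below writes the same <configuration> bodies with here-documents; conf_dir is a stand-in for your real .../libexec/etc/hadoop directory (it points at a temporary directory here, so the script is safe to try without touching your install):

```shell
# Write the four Hadoop config files from steps 7-10 into $conf_dir.
conf_dir=$(mktemp -d)   # substitute your real .../libexec/etc/hadoop path

cat > "$conf_dir/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$conf_dir/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

cat > "$conf_dir/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
  </property>
</configuration>
EOF

cat > "$conf_dir/yarn-site.xml" <<'EOF'
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>
EOF

ls "$conf_dir"
```

The quoted 'EOF' delimiters stop the shell from expanding $HADOOP_MAPRED_HOME at write time; YARN expands those variables itself when launching containers.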

  11. Remove password requirement
      Check whether you can ssh without a password:
      $ ssh localhost
      If it asks for a password, use the following commands one at a time to remove the requirement:
      $ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
      $ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
      $ chmod 0600 ~/.ssh/authorized_keys
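The commands above are standard OpenSSH: generate a keypair with an empty passphrase, authorize your own public key, and tighten the file mode so sshd accepts it. If you want to rehearse the sequence without touching your real ~/.ssh, the same steps can be run against a throwaway directory (availability of ssh-keygen is assumed):

```shell
# Rehearse the passwordless-ssh setup against a temporary directory.
demo=$(mktemp -d)
if command -v ssh-keygen >/dev/null 2>&1; then
  ssh-keygen -t rsa -P '' -f "$demo/id_rsa" -q        # empty-passphrase keypair
  cat "$demo/id_rsa.pub" >> "$demo/authorized_keys"   # authorize our own key
  chmod 0600 "$demo/authorized_keys"                  # sshd refuses looser modes
  keygen_result="authorized_keys created in $demo"
else
  keygen_result="ssh-keygen not available"
fi
echo "$keygen_result"
```

For the real setup, macOS also requires Remote Login to be enabled (System Preferences > Sharing) for `ssh localhost` to connect at all.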

  12. Format NameNode
      Issue the following commands to format the NameNode:
      $ cd /usr/local/Cellar/hadoop/3.3.0/libexec/bin
      $ hdfs namenode -format

  13. Run Hadoop
      $ cd /usr/local/Cellar/hadoop/3.3.0/libexec/sbin
      $ ./start-all.sh
      $ jps
      When the commands succeed, jps should list the Hadoop daemons, such as NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager.

  14. Access http://localhost:9870 in a browser to view the NameNode web interface.

  15. Close Hadoop
      $ cd /usr/local/Cellar/hadoop/3.3.0/libexec/sbin
      $ ./stop-all.sh
