Important Safety Measures for Handling Data on Hadoop Cluster
This guide covers clean-up procedures, warns against potential dangers, and emphasizes the need for caution when performing tasks on the Hadoop cluster, stressing data integrity and proper handling techniques to keep the system running smoothly.
0. Clean Up the Hard Disks
Delete the tmp/ folder from workspace/mdp-lab3
Delete unneeded downloads
0. ¡Peligro! (Danger!) Please be careful of what you are doing! Think twice before running: rm, mv, cp, kill, or editing configuration files in emacs/vim
1. Download Tools
http://aidanhogan.com/teaching/cc5212-1/tools/
Unzip them somewhere you can find them
2. Log In with PuTTY
3. Open DFS Browser http://cluster.dcc.uchile.cl:50070/
3. PuTTY: Upload Data to HDFS
hadoop fs -ls /
hadoop fs -ls /uhadoop
hadoop fs -mkdir /uhadoop/[username]
[username] = first letter of first name + last name (e.g., ahogan)
cd /data/hadoop/hadoop/data/
hadoop fs -copyFromLocal /data/hadoop/hadoop/data/es-abstracts.txt /uhadoop/[username]/es-abstracts.txt
Note on namespaces: if you need to disambiguate local and remote files:
HDFS file: hdfs://cm:9000/uhadoop/
Local file: file:///data/hadoop/...
4. Let's Build Our First MapReduce Job
Hint: Use Monday's slides for inspiration: http://aidanhogan.com/teaching/cc5212-1/
1. Implement the map(.,.,.,.) method
2. Implement the reduce(.,.,.,.) method
3. Implement the main(.) method
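As a rough illustration of what the three methods above need to do (not the course's reference solution, and without the Hadoop API itself), here is a minimal, self-contained Java sketch of the WordCount logic: map emits a (word, 1) pair per token, the pairs are grouped by key as in the shuffle phase, and reduce sums the counts per word. The class and method signatures here are invented for illustration; on the cluster you would implement Hadoop's Mapper and Reducer interfaces instead.

```java
import java.util.*;

// Hypothetical local sketch of WordCount's map/shuffle/reduce phases.
public class WordCountSketch {

    // "map" phase: emit a (word, 1) pair for every token in a line
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // "reduce" phase: sum the values grouped under one key
    static int reduce(String word, List<Integer> counts) {
        int sum = 0;
        for (int c : counts) sum += c;
        return sum;
    }

    // Driver: map over all lines, group by key ("shuffle"), then reduce.
    public static Map<String, Integer> wordCount(List<String> lines) {
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : lines) {
            for (Map.Entry<String, Integer> pair : map(line)) {
                grouped.computeIfAbsent(pair.getKey(), k -> new ArrayList<>())
                       .add(pair.getValue());
            }
        }
        Map<String, Integer> result = new TreeMap<>();
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            result.put(e.getKey(), reduce(e.getKey(), e.getValue()));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("de la de", "el de");
        System.out.println(wordCount(lines)); // prints {de=3, el=1, la=1}
    }
}
```

In the real job, Hadoop performs the grouping step across machines; the only parts you write are the map and reduce bodies plus a main method that configures and submits the job.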
5. Eclipse: Build the Jar
Right-click build.xml > dist (you might need to create a dist folder first)
6. WinSCP: Copy the .jar to the Master Server
Don't save your password!
6. WinSCP: Copy the .jar to the Master Server
Create the directory /data/2014/uhadoop/[username]/
Copy your mdp-lab4.jar into it
7. PuTTY: Run the Job
hadoop jar /data/2014/uhadoop/[username]/mdp-lab4.jar WordCount /uhadoop/[username]/es-abstracts.txt /uhadoop/[username]/wc/
(All one command!)
8. Look at the Output
hadoop fs -ls /uhadoop/[username]/wc/
hadoop fs -cat /uhadoop/[username]/wc/part-00000 | more
(All one command!)
Look for de: 4575144 occurrences in the local run
hadoop fs -cat /uhadoop/[username]/wc/part-00000 | grep -e "^de" | more
9. Look at the Output Through the Browser
http://cluster.dcc.uchile.cl:50070/