Important Safety Measures for Handling Data on a Hadoop Cluster

This guide covers critical clean-up procedures, warns against potential dangers, and emphasizes the need for caution when performing tasks on the Hadoop cluster. It stresses the importance of data integrity and proper handling techniques to keep the system running smoothly.



Presentation Transcript


  1. Hola Hadoop

  2. 0. Clean Up the Hard Disks: delete the tmp/ folder from workspace/mdp-lab3 and delete unneeded downloads.

  3. 0. ¡Peligro! (Danger!) Please please please ... (a full slide repeating "please" for emphasis)

  4. 0. ¡Peligro! Please please please ... (a second full slide repeating "please")

  5. ¡Peligro! Please.

  6. ¡Peligro! Please be careful of what you are doing! Think twice before running rm, mv, cp, or kill, or before editing configuration files with emacs/vim.

  7. ¡Peligro! Please.

  8. cluster.dcc.uchile.cl

  9. 1. Download the tools from http://aidanhogan.com/teaching/cc5212-1/tools/ and unzip them somewhere you can find them.

  10. 2. Log in with PuTTY (the slide shows numbered screenshots, steps 1-3).

  11. 3. Open the DFS browser: http://cluster.dcc.uchile.cl:50070/

  12. 3. PuTTY: Upload Data to HDFS
      hadoop fs -ls /
      hadoop fs -ls /uhadoop
      hadoop fs -mkdir /uhadoop/[username]
      where [username] = first letter of first name + last name (e.g., ahogan)
      cd /data/hadoop/hadoop/data/
      hadoop fs -copyFromLocal /data/hadoop/hadoop/data/es-abstracts.txt /uhadoop/[username]/es-abstracts.txt
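
For context, the same upload can also be done through the Hadoop FileSystem Java API. The following is a minimal sketch, not part of the lab instructions; the class name is hypothetical, and ahogan is the example username from the slide:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class UploadToHdfs {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the default file system (HDFS when run on the cluster)
        FileSystem fs = FileSystem.get(conf);
        // Mirrors: hadoop fs -mkdir /uhadoop/[username]
        fs.mkdirs(new Path("/uhadoop/ahogan"));
        // Mirrors: hadoop fs -copyFromLocal <local> <hdfs>
        fs.copyFromLocalFile(
            new Path("file:///data/hadoop/hadoop/data/es-abstracts.txt"),
            new Path("/uhadoop/ahogan/es-abstracts.txt"));
        fs.close();
      }
    }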

  13. Note on Namespaces: If you need to disambiguate local/remote files:
      HDFS file: hdfs://cm:9000/uhadoop/
      Local file: file:///data/hadoop/...
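
In Java code, the same disambiguation works through fully qualified Path URIs. A minimal sketch (the local path below is a hypothetical example, since the slide truncates it):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class NamespaceDemo {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Fully qualified HDFS path (host/port from the slide)
        Path remote = new Path("hdfs://cm:9000/uhadoop/");
        // Fully qualified local path (hypothetical example)
        Path local = new Path("file:///data/hadoop/example.txt");
        // Each URI scheme resolves to its own FileSystem implementation
        System.out.println(remote.getFileSystem(conf).getUri()); // hdfs://cm:9000
        System.out.println(local.getFileSystem(conf).getUri());  // file:///
      }
    }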

  14. 4. Let's Build Our First MapReduce Job. Hint: use Monday's slides for inspiration: http://aidanhogan.com/teaching/cc5212-1/
      1. Implement the map(.,.,.,.) method
      2. Implement the reduce(.,.,.,.) method
      3. Implement the main(.) method
      (A rough sketch follows below.)
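
For orientation only, here is a minimal WordCount sketch using the classic org.apache.hadoop.mapred API (the one that writes the part-00000 files referenced later). The lab skeleton from Monday's slides may structure this differently; treat the class layout here as an assumption:

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.StringTokenizer;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.*;

    public class WordCount {
      public static class Map extends MapReduceBase
          implements Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(LongWritable key, Text value,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
          // Split the input line into tokens and emit (word, 1) for each
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            output.collect(word, ONE);
          }
        }
      }

      public static class Reduce extends MapReduceBase
          implements Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterator<IntWritable> values,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
          // Sum all counts emitted for this word
          int sum = 0;
          while (values.hasNext()) {
            sum += values.next().get();
          }
          output.collect(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        // args[0] = input path on HDFS, args[1] = output path on HDFS
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setReducerClass(Reduce.class);
        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
      }
    }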

  15. 5. Eclipse: Build the jar. Right-click build.xml > dist (you might need to make a dist folder first).

  16. 6. WinSCP: Copy the .jar to the Master Server (the slide shows numbered screenshots, steps 1-4). Don't save your password!

  17. 6. WinSCP: Copy the .jar to the Master Server (continued; screenshot).

  18. 6. WinSCP: Copy the .jar to the Master Server. Create the directory /data/2014/uhadoop/[username]/ and copy your mdp-lab4.jar into it.

  19. 7. PuTTY: Run the Job (all one command!):
      hadoop jar /data/2014/uhadoop/[username]/mdp-lab4.jar WordCount /uhadoop/[username]/es-abstracts.txt /uhadoop/[username]/wc/

  20. 8. Look at the Output:
      hadoop fs -ls /uhadoop/[username]/wc/
      hadoop fs -cat /uhadoop/[username]/wc/part-00000 | more
      Look for "de": 4575144 occurrences in the local run (all one command!):
      hadoop fs -cat /uhadoop/[username]/wc/part-00000 | grep -e "^de" | more

  21. 9. Look at the output through the browser: http://cluster.dcc.uchile.cl:50070/
