
Namenode formatting in a Hadoop installation?

I have provided the path for both the configuration file directory and the directory containing the .sh scripts.

I managed to do the configuration at this path:

root@ratan-Inspiron-N5110:/home/hduser/hadoop/hadoop-2.4.0/etc/hadoop# gedit core-site.xml

<configuration>

<!-- In: conf/core-site.xml -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hduser/tmp</value>
  <description>A base for other temporary directories.</description>
</property>

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
</property>

</configuration>
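
A point worth checking here: hadoop.tmp.dir must be a directory that exists (or can be created) and is writable by the user that starts the daemons, otherwise the NameNode cannot lay down its data there. Assuming the daemons run as hduser, as the log file names later in the post suggest, preparing it would look roughly like this:

mkdir -p /home/hduser/tmp
chmod 750 /home/hduser/tmp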

root@ratan-Inspiron-N5110:/home/hduser/hadoop/hadoop-2.4.0/etc/hadoop# gedit mapred-site.xml

<configuration>

 <!-- In: conf/mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
  <description>The host and port that the MapReduce job tracker runs
  at.  If "local", then jobs are run in-process as a single map
  and reduce task.
  </description>
</property>

</configuration>
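
Note that mapred.job.tracker is a Hadoop 1.x (MRv1) property. On a 2.4.0 install that runs the YARN daemons, MapReduce jobs are normally handed to YARN with the property below; this is a suggested addition, not something shown in the configuration above:

<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>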

root@ratan-Inspiron-N5110:/home/hduser/hadoop/hadoop-2.4.0/etc/hadoop# gedit hdfs-site.xml

<configuration>

 <!-- In: conf/hdfs-site.xml -->
<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
</property>

</configuration>
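
As a sanity check, the effective configuration can be queried from the install directory. This is a hypothetical verification step, not part of the transcript below; fs.defaultFS is simply the current name of the deprecated fs.default.name key:

bin/hdfs getconf -confKey dfs.replication
bin/hdfs getconf -confKey fs.defaultFS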

Details of my directory:

hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc$ cd ./hadoop
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc/hadoop$ ls
capacity-scheduler.xml      hadoop-policy.xml        mapred-queues.xml.template
configuration.xsl           hdfs-site.xml            mapred-site.xml
container-executor.cfg      hdfs-site.xml~           mapred-site.xml~
core-site.xml               httpfs-env.sh            mapred-site.xml.template
core-site.xml~              httpfs-log4j.properties  slaves
hadoop-env.cmd              httpfs-signature.secret  ssl-client.xml.example
hadoop-env.sh               httpfs-site.xml          ssl-server.xml.example
hadoop-env.sh~              log4j.properties         yarn-env.cmd
hadoop-metrics2.properties  mapred-env.cmd           yarn-env.sh
hadoop-metrics.properties   mapred-env.sh            yarn-site.xml
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc/hadoop$ cd ..
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc$ ls
hadoop

hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc$ cd ..
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0$ ls
bin  etc  include  lib  libexec  LICENSE.txt  logs  NOTICE.txt  README.txt  sbin  share
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0$ cd ./etc
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc$ ls
hadoop
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/etc$ cd ..
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0$ cd ./sbin
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/sbin$ ls
distribute-exclude.sh  hdfs-config.sh           slaves.sh          start-dfs.cmd        start-yarn.sh     stop-dfs.cmd        stop-yarn.sh
hadoop-daemon.sh       httpfs.sh                start-all.cmd      start-dfs.sh         stop-all.cmd      stop-dfs.sh         yarn-daemon.sh
hadoop-daemons.sh      mr-jobhistory-daemon.sh  start-all.sh       start-secure-dns.sh  stop-all.sh       stop-secure-dns.sh  yarn-daemons.sh
hdfs-config.cmd        refresh-namenodes.sh     start-balancer.sh  start-yarn.cmd       stop-balancer.sh  stop-yarn.cmd

When I start the daemons:

hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/sbin$ ls
distribute-exclude.sh    start-all.cmd        stop-all.sh
hadoop-daemon.sh         start-all.sh         stop-balancer.sh
hadoop-daemons.sh        start-balancer.sh    stop-dfs.cmd
hdfs-config.cmd          start-dfs.cmd        stop-dfs.sh
hdfs-config.sh           start-dfs.sh         stop-secure-dns.sh
httpfs.sh                start-secure-dns.sh  stop-yarn.cmd
mr-jobhistory-daemon.sh  start-yarn.cmd       stop-yarn.sh
refresh-namenodes.sh     start-yarn.sh        yarn-daemon.sh
slaves.sh                stop-all.cmd         yarn-daemons.sh
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/sbin$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/hduser/hadoop/hadoop-2.4.0/logs/hadoop-hduser-namenode-ratan-Inspiron-N5110.out
localhost: starting datanode, logging to /home/hduser/hadoop/hadoop-2.4.0/logs/hadoop-hduser-datanode-ratan-Inspiron-N5110.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hduser/hadoop/hadoop-2.4.0/logs/hadoop-hduser-secondarynamenode-ratan-Inspiron-N5110.out
starting yarn daemons
starting resourcemanager, logging to /home/hduser/hadoop/hadoop-2.4.0/logs/yarn-hduser-resourcemanager-ratan-Inspiron-N5110.out
localhost: starting nodemanager, logging to /home/hduser/hadoop/hadoop-2.4.0/logs/yarn-hduser-nodemanager-ratan-Inspiron-N5110.out
hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/sbin$ jps

1441 DataNode
1608 SecondaryNameNode
1912 NodeManager
2448 Jps
1775 ResourceManager

hduser@ratan-Inspiron-N5110:~/hadoop/hadoop-2.4.0/sbin$

The problem is that I cannot find the namenode that is supposed to be formatted, and when I start the daemons the NameNode does not show up anywhere. Where am I going wrong?

Accepted answer:

Check this log file:

hadoop-hduser-namenode-ratan-Inspiron-N5110.log

If it says the namenode is not formatted, format it:

bin/hadoop namenode -format
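
On Hadoop 2.x that command still works but prints a deprecation notice; bin/hdfs namenode -format is the current form. A rough sequence, assuming it is run from ~/hadoop/hadoop-2.4.0 and keeping in mind that formatting erases any existing HDFS data:

sbin/stop-all.sh
bin/hdfs namenode -format
sbin/start-dfs.sh
sbin/start-yarn.sh
jps    # NameNode should now appear in the list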

