Using hadoop_eclipse and the HDT Plugin
 
 2018-5-15 
 
Editor's note:
This article originally appeared on cnblogs. It targets the Windows version of Eclipse and introduces an alternative way to install, import, and use the plugin.

Hadoop Development Tools (HDT) is an Eclipse plugin for developing Hadoop applications. Its features, installation, and usage are described at http://hdt.incubator.apache.org/. This article focuses on the Windows version of Eclipse and walks through an alternative way to install and use the plugin.

1 Download HDT

Open http://hdt.incubator.apache.org/download.html; part of the page is shown below:

Download the HDT 0.0.2.incubating (Binary) release. Clicking "tar.gz" jumps to:

http://www.apache.org/dyn/closer.cgi/incubator/hdt/hdt-0.0.2.incubating/hdt-0.0.2.incubating-bin.tar.gz; part of the page:

Click the link in the red box to download HDT. After extracting the archive, the folder contents look like this:

2 Install the HDT plugin

Download the current latest Eclipse release (Eclipse Oxygen).

Click Download Packages.

Download the 64-bit version. The file is eclipse-jee-oxygen-3a-win32-x86_64.zip; extract it:

Copy the files from HDT's features and plugins directories into the corresponding features and plugins folders of the Eclipse installation above.

3 Download Hadoop and configure the environment variables

Download Hadoop

Open http://hadoop.apache.org/; you will see the section below.

Click Download to go to the download page:

Download the 2.6.5 binary release. When downloading, pick a mirror close to you so the download is faster. Extract it to a directory of your choice, for example E:\hadoop-2.6.5. The folder contains:

ÅäÖû·¾³±äÁ¿

Configure the HADOOP_HOME and HADOOP_USER_NAME environment variables, and PATH (system variables).

Set HADOOP_HOME to E:\hadoop-2.6.5 and add %HADOOP_HOME%\bin to PATH.

WindowsÏ¿ª·¢

ΪÁËÄÜÔÚWindowsƽ̨ÏÂ×ö¿ª·¢£¬»¹ÐèÒªÁ½¸öÎļþwinutils.exeºÍhadoop.dll

Put winutils.exe in the E:\hadoop-2.6.5\bin directory and hadoop.dll in C:\Windows\System32.
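If Eclipse was started before HADOOP_HOME was set, the variable may not be visible to programs launched from it. A commonly used fallback, shown here only as a sketch that assumes the extraction path above, is to set the hadoop.home.dir system property at the top of the driver; Hadoop's Windows shell helper locates winutils.exe through this property or through HADOOP_HOME.

// Fallback when the HADOOP_HOME environment variable is not picked up:
// point Hadoop directly at the extracted distribution before any Hadoop
// classes are used (the path is the example directory from above).
System.setProperty("hadoop.home.dir", "E:\\hadoop-2.6.5");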

4 Set up HDT in Eclipse

1) Click, in order: File -> New -> Other -> expand Hadoop, as shown in the two figures below:

2) Select Map/Reduce Project, as shown in the figure below:

¸øÏîĿȡһ¸öÃû³Æ£ºMapReduce_4_27£¬²¢Ñ¡Ôñ¡°Use default Hadoop¡±£¨Ä¬ÈϵÄÉèÖã©¡£

3) Configure the Hadoop installation directory

Click the configuration option in step 2); the path to configure is the directory where Hadoop was extracted earlier.

Click "Apply and Close"; the following screen is shown:

Continue, and the following screen is shown:

Finally, click "Finish".

4£©µ¼È뿪·¢°üºÍjavadocÎĵµ

Right-click the project -> Properties -> in the dialog that opens, select Java Build Path from the list on the left -> Libraries -> Add Library -> select User Library in the popup -> click Next -> click User Libraries -> click New -> enter the required information in the dialog -> add the necessary jar files.

The required development jars are under E:\hadoop-2.6.5\share\hadoop, inside the folder where the Hadoop package was extracted earlier.

Attach the javadoc documentation

Right-click the lib folder -> Build Path -> Configure Build Path.

Click Javadoc Location -> click Browse and select the path of the doc documentation.

Click Validate to check whether the path is correct; below, the validation messages for a correct path and an incorrect path are shown.

5 ʹÓÃHDT£¨MapReduce±à³Ì£©

Set the JVM parameters

After the Map/Reduce Project has been created, set the JVM parameter to:

-Djava.library.path=$HADOOP_HOME/lib/native
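Note that Eclipse passes VM arguments literally, so $HADOOP_HOME is typically not expanded there; on Windows the full path (for example E:\hadoop-2.6.5\lib\native) is usually written out instead. To confirm that the setting actually reached the JVM, a small, purely illustrative check (the class name EnvCheck is made up here) can print the relevant values:

public class EnvCheck {
    public static void main(String[] args) {
        // Print what the JVM launched by Eclipse actually sees.
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        System.out.println("HADOOP_HOME       = " + System.getenv("HADOOP_HOME"));
    }
}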

Mapper: create a subclass of the Mapper class

Example: the map function skeleton generated automatically by the template

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class Tmap extends Mapper<LongWritable, Text, Text, IntWritable> {
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
    }
}
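The generated skeleton leaves the method body empty; it has to be filled in by hand. Purely as an illustration (a word-count style job is assumed, and the class name WordCountMap is not part of the template), a completed mapper might look like this:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Illustrative word-count mapper: splits each input line into tokens
// and emits (word, 1) for every token it finds.
public class WordCountMap extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokens = new StringTokenizer(value.toString());
        while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE);
        }
    }
}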

Reducer: create a subclass of the Reducer class

Example: the reduce function skeleton generated automatically by the template

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class Treduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        for (IntWritable value : values) {
            // process each value for this key
        }
    }
}
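Again only as an illustration of how the skeleton is usually completed (the class name WordCountReduce is an assumption, matching the word-count mapper sketched above), a reducer that sums the counts emitted for each word could look like this:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Illustrative word-count reducer: adds up all counts received for a key
// and writes (word, total) to the output.
public class WordCountReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        result.set(sum);
        context.write(key, result);
    }
}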

MapReduce Driver: create the driver

Example: the driver skeleton generated automatically by the template

import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TMR {
    public static void main(String[] args)
            throws IOException, InterruptedException, ClassNotFoundException {
        Job job = new Job();
        job.setJarByClass(...);
        job.setJobName("a nice name");

        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // TODO: specify a mapper
        job.setMapperClass(...);
        // TODO: specify a reducer
        job.setReducerClass(...);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        boolean success = job.waitForCompletion(true);
        System.exit(success ? 0 : 1);
    }
}
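To make the driver runnable, the TODOs must be replaced with concrete classes. A minimal sketch that wires in the Tmap and Treduce classes from above is shown below; it uses Job.getInstance, the non-deprecated factory method in Hadoop 2.x, and the class name TMRDriver is only an example.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative driver that wires the Tmap mapper and Treduce reducer together.
public class TMRDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "tmr example");
        job.setJarByClass(TMRDriver.class);

        // Input and output paths are passed as program arguments
        // in the Eclipse run configuration.
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(Tmap.class);
        job.setReducerClass(Treduce.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}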

New MR Cluster: cluster configuration

You can click New MR Cluster in the figure below to configure a cluster.

You can also click the Eclipse toolbar icon to configure a cluster:

The configuration page looks like this:

Resource Manager Node: configures the resource manager node; it corresponds to the ResourceManager address in the Hadoop configuration files.

DFS Master: configures the master node of the distributed file system, i.e. the host and port of the NameNode. It corresponds to the value of fs.default.name in the configuration file.
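The DFS Master address is the same URI that client code uses to reach HDFS. As a rough connectivity check only (hdfs://localhost:9000 is an assumed address and must match the fs.default.name value of your cluster), the file system can be listed from a small program:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Lists the HDFS root directory to verify the DFS Master address is reachable.
public class DfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Must match the DFS Master / fs.default.name configured above;
        // hdfs://localhost:9000 is only an example value.
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}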

   