
Storing Apache Hadoop output in a MySQL database

I need to store the output of my map-reduce program in a database. Is there any way to do that?

If so, is it possible to store the output in multiple columns / tables of the database according to the requirement?

Please suggest some solutions.

Thanks.

Solution:

A very good example of this is shown on this blog. I tried it and it works well; I quote the most important parts of the code below.

First, you must create a class representing the data you want to store. The class must implement the DBWritable interface:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.lib.db.DBWritable;

public class DBOutputWritable implements Writable, DBWritable
{
   private String name;
   private int count;

   public DBOutputWritable(String name, int count) {
     this.name = name;
     this.count = count;
   }

   // Writable serialization is not used on the output path, so these stay empty.
   public void readFields(DataInput in) throws IOException {   }

   // Reads one row from a ResultSet (used when reading via DBInputFormat).
   public void readFields(ResultSet rs) throws SQLException {
     name = rs.getString(1);
     count = rs.getInt(2);
   }

   public void write(DataOutput out) throws IOException {    }

   // Fills one row of the PreparedStatement (used when writing via DBOutputFormat).
   public void write(PreparedStatement ps) throws SQLException {
     ps.setString(1, name);
     ps.setInt(2, count);
   }
}

In the Reducer you create objects of the class defined above:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class Reduce extends Reducer<Text, IntWritable, DBOutputWritable, NullWritable> {

   protected void reduce(Text key, Iterable<IntWritable> values, Context ctx) {
     int sum = 0;

     for(IntWritable value : values) {
       sum += value.get();
     }

     try {
       // Each record written here becomes one row in the database table.
       ctx.write(new DBOutputWritable(key.toString(), sum), NullWritable.get());
     } catch(IOException e) {
       e.printStackTrace();
     } catch(InterruptedException e) {
       e.printStackTrace();
     }
   }
}
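
The mapper is referenced but not shown in the quoted example. As a sketch of what the rest of the job might look like, assuming plain text input read with TextInputFormat, a minimal word-count-style mapper that emits the Text/IntWritable pairs expected by the reducer above could be (the class name Map and the tokenization are illustrative assumptions, not part of the original code; if you read the input from the database instead, the input key/value types would differ):

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: splits each input line into words and emits (word, 1).
public class Map extends Mapper<LongWritable, Text, Text, IntWritable> {

   private static final IntWritable ONE = new IntWritable(1);
   private final Text word = new Text();

   protected void map(LongWritable key, Text value, Context ctx)
       throws IOException, InterruptedException {
     for (String token : value.toString().split("\\s+")) {
       if (!token.isEmpty()) {
         word.set(token);
         ctx.write(word, ONE);
       }
     }
   }
}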

Finally, you must configure the connection to your database (don't forget to add the database connector jar to the classpath) and register the input/output data types of your mapper and reducer:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
import org.apache.hadoop.mapreduce.lib.db.DBInputFormat;
import org.apache.hadoop.mapreduce.lib.db.DBOutputFormat;

public class Main
{
   public static void main(String[] args) throws Exception
   {
     Configuration conf = new Configuration();
     DBConfiguration.configureDB(conf,
     "com.mysql.jdbc.Driver",   // driver class
     "jdbc:mysql://localhost:3306/testDb", // db url
     "user",     // username
     "password"); // password

     Job job = new Job(conf);
     job.setJarByClass(Main.class);
     job.setMapperClass(Map.class); // your mapper - not shown in this example
     job.setReducerClass(Reduce.class);
     job.setMapOutputKeyClass(Text.class);      // mapper's KEYOUT
     job.setMapOutputValueClass(IntWritable.class); // mapper's VALUEOUT
     job.setOutputKeyClass(DBOutputWritable.class); // reducer's KEYOUT
     job.setOutputValueClass(NullWritable.class);   // reducer's VALUEOUT
     job.setInputFormatClass(...); // depends on where your input comes from
     job.setOutputFormatClass(DBOutputFormat.class);

     DBInputFormat.setInput(...); // only needed if the input is also read from the database

     DBOutputFormat.setOutput(
     job,
     "output",    // output table name
     new String[] { "name", "count" }   // table columns
     );

     System.exit(job.waitForCompletion(true) ? 0 : 1);
   }
}
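
Note that DBOutputFormat does not create the output table; it has to exist before the job runs, with columns matching the String[] passed to DBOutputFormat.setOutput(). As a minimal sketch (table and column types are assumptions derived from the fields of DBOutputWritable), it could be created with a one-off JDBC helper reusing the same connection settings as the driver:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Hypothetical one-off helper: creates the "output" table the job writes to.
// Connection details mirror the DBConfiguration.configureDB() call above.
public class CreateOutputTable {
   public static void main(String[] args) throws Exception {
     Class.forName("com.mysql.jdbc.Driver");
     try (Connection conn = DriverManager.getConnection(
              "jdbc:mysql://localhost:3306/testDb", "user", "password");
          Statement stmt = conn.createStatement()) {
       // Column names and order must match DBOutputFormat.setOutput() and
       // the order used in DBOutputWritable.write(PreparedStatement).
       stmt.executeUpdate(
           "CREATE TABLE IF NOT EXISTS output (`name` VARCHAR(255), `count` INT)");
     }
   }
}

At runtime the MySQL connector jar also has to be visible to the job, for example by bundling it in the job jar's lib/ directory or distributing it with -libjars (the latter requires parsing generic options, e.g. via ToolRunner).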
