Uploading Files to HDFS with the Java API
Original post · Author: wx62be9d88ce294 · Category: Big Data / Hadoop · Tags: hdfs, java, hadoop, apache
Method 1: upload in a single call with FileSystem.copyFromLocalFile.
package com.yqq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsAPIDemo01 {
    @Test
    public void copyFromLocalFile() throws URISyntaxException, IOException, InterruptedException {
        Configuration configuration = new Configuration();
        configuration.set("dfs.blocksize", "1048576");  // 1 MiB block size for this file
        configuration.set("dfs.replication", "2");      // 2 replicas per block
        // Connect to the NameNode as user "root"
        FileSystem fileSystem = FileSystem.get(new URI("hdfs://node1:9820"), configuration, "root");
        // Copy the local file to HDFS in one call; missing target directories are created
        fileSystem.copyFromLocalFile(new Path("E:\\yqq.txt"), new Path("/api/show/yqq2.txt"));
        fileSystem.close();
        System.out.println("Upload complete");
    }
}
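As a side note on the configuration values above: "1048576" is exactly 1 MiB (the stock HDFS default is far larger, typically 128 MiB), so even a small test file is split into several blocks. A minimal sketch of that arithmetic (class name is illustrative, not from the original post):

```java
public class BlockSizeCheck {
    public static void main(String[] args) {
        // dfs.blocksize was set to "1048576" = 1024 * 1024 bytes = 1 MiB
        long blockSize = Long.parseLong("1048576");
        System.out.println(blockSize == 1024L * 1024L);  // true

        // A 3 MiB file uploaded with this setting occupies ceil(3 MiB / 1 MiB) blocks
        long fileSize = 3L * 1024 * 1024;
        long blocks = (fileSize + blockSize - 1) / blockSize;
        System.out.println(blocks);  // 3
    }
}
```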
Method 2: stream the file manually through an FSDataOutputStream.
package com.yqq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.FileInputStream;
import java.net.URI;

public class HdfsAPIDemo02 {
    @Test
    public void uploadFile() throws Exception {
        Configuration configuration = new Configuration();
        FileSystem fileSystem = FileSystem.get(new URI("hdfs://node1:9820"), configuration, "root");
        // Open the local source file and create the target file on HDFS
        FileInputStream fileInputStream = new FileInputStream("E:\\yqq.txt");
        FSDataOutputStream fsDataOutputStream = fileSystem.create(new Path("/api/show1/hh.txt"));
        // Copy in 1 KiB chunks until the local file is exhausted
        byte[] data = new byte[1024];
        int len;
        while ((len = fileInputStream.read(data)) != -1) {
            fsDataOutputStream.write(data, 0, len);
        }
        fileInputStream.close();
        fsDataOutputStream.flush();
        fsDataOutputStream.close();
        fileSystem.close();
    }
}
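The core of Method 2 is the buffered read/write loop, which is independent of HDFS itself. A minimal sketch of that same pattern over plain java.io streams (class and method names are illustrative), runnable without a Hadoop cluster:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyLoopDemo {
    // The same 1 KiB buffered copy loop used in uploadFile(), generalized to any streams
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] data = new byte[1024];
        long total = 0;
        int len;
        while ((len = in.read(data)) != -1) {  // read() returns -1 at end of stream
            out.write(data, 0, len);           // write only the bytes actually read
            total += len;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[3000];  // deliberately larger than one 1 KiB buffer
        try (InputStream in = new ByteArrayInputStream(payload);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            long copied = copy(in, out);
            System.out.println(copied);              // 3000
            System.out.println(out.size() == 3000);  // true
        }
    }
}
```

The `len` returned by each `read()` matters: the final chunk is usually shorter than the buffer, so writing the full `data` array instead of `data, 0, len` would append garbage bytes to the uploaded file.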
Original: https://blog.51cto.com/u_15704423/5434923 (author: wx62be9d88ce294)