Writing Avro data files

Published on 2024-10-30 16:48:03

The following code simply writes data in Avro format, then reads the same data back from the Avro file it just wrote and displays it. I was just trying out the example from the book Hadoop: The Definitive Guide. It worked the first time I ran it; after that I got the error below. Since it did work once, I'm not sure what mistake I'm making.

This is the exception:

Exception in thread "main" java.io.EOFException: No content to map to Object due to end of input
    at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2173)
    at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2106)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1065)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1040)
    at org.apache.avro.Schema.parse(Schema.java:895)
    at org.avro.example.SimpleAvro.AvroExample.avrocreate(AvroDataExample.java:23)
    at org.avro.example.SimpleAvro.AvroDataExample.main(AvroDataExample.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

This is the code:

package org.avro.example.SimpleAvro;

import java.io.File;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;

class AvroExample{

    AvroExample(){

    }
    void avrocreate() throws Exception{

        Schema schema=Schema.parse(getClass().getResourceAsStream("Pair.avsc"));

        GenericRecord datum=new GenericData.Record(schema);
        datum.put("left", "L");
        datum.put("right", "R");

        File file=new File("data.avro");
        DatumWriter<GenericRecord> writer=new GenericDatumWriter<GenericRecord>(schema);
        DataFileWriter<GenericRecord> dataFileWriter=new DataFileWriter<GenericRecord>(writer);
        dataFileWriter.create(schema, file);
        dataFileWriter.append(datum);
        dataFileWriter.close();

        System.out.println("Written to avro data file");
        //reading from the avro data file

        DatumReader<GenericRecord> reader= new GenericDatumReader<GenericRecord>();
        DataFileReader<GenericRecord> dataFileReader=new DataFileReader<GenericRecord>(file,reader);
        GenericRecord result=dataFileReader.next();
        System.out.println("data" + result.get("left").toString());

        result=dataFileReader.next();
        System.out.println("data :" + result.get("left").toString());


    }

}
public class AvroDataExample {
    public static void main(String args[])throws Exception{

        AvroExample a=new AvroExample();
        a.avrocreate();
    }



}

The following is the Pair.avsc file [given in the book's example code]:

{
  "type": "record",
  "name": "Pair",
  "doc": "A pair of strings.",
  "fields": [
    {"name": "left", "type": "string"},
    {"name": "right", "type": "string"}
  ]
}



Comments (3)

活雷疯 2024-11-06 16:48:03

You are probably not reading the schema file correctly. I suspect this is the problem because the stack trace shows that it is failing to parse the schema:

Exception in thread "main" java.io.EOFException: No content to map to Object due to end of input
    at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2173)
    at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2106)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1065)
    at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1040)
    at org.apache.avro.Schema.parse(Schema.java:895)

Reading files from "resources" is fraught with problems unless your environment is set up just right. Also, since you mentioned that it worked once before, you may simply have changed some environment setting (such as the working directory) between runs.

Try copy-pasting the schema string into a String variable and parsing it directly rather than using the resource loader:

String schemaJson = "paste schema here (and fix quotes)";
Schema schema = Schema.parse(schemaJson);
GenericRecord datum = new GenericData.Record(schema);
...

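To make this suggestion concrete, here is a minimal, self-contained sketch that inlines the Pair schema from the question as a Java string, so no resource lookup is involved. It assumes the Avro jar is on the classpath; the class name `InlineSchemaExample` is made up for illustration, and it uses `new Schema.Parser().parse(...)`, the non-deprecated replacement for `Schema.parse(String)` in newer Avro versions.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class InlineSchemaExample {
    public static void main(String[] args) {
        // The Pair schema from the question, inlined so the resource
        // loader (and its classpath pitfalls) is taken out of the picture.
        String schemaJson =
            "{"
          + "  \"type\": \"record\","
          + "  \"name\": \"Pair\","
          + "  \"doc\": \"A pair of strings.\","
          + "  \"fields\": ["
          + "    {\"name\": \"left\", \"type\": \"string\"},"
          + "    {\"name\": \"right\", \"type\": \"string\"}"
          + "  ]"
          + "}";

        Schema schema = new Schema.Parser().parse(schemaJson);
        GenericRecord datum = new GenericData.Record(schema);
        datum.put("left", "L");
        datum.put("right", "R");

        System.out.println(schema.getName());
        System.out.println(datum.get("left"));
    }
}
```

If this version parses fine, the original failure is almost certainly in how the `Pair.avsc` resource is being located, not in the schema itself.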
彡翼 2024-11-06 16:48:03
    GenericRecord result=dataFileReader.next();
    System.out.println("data" + result.get("left").toString());
    result=dataFileReader.next();
    System.out.println("data :" + result.get("left").toString());

I guess this is where you are going wrong.

You should read the "left" attribute and the "right" attribute of your record.

Try it.

It worked for me.

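A related pitfall in the quoted snippet: the question's code appends only one record but calls `next()` twice, so once the schema problem is fixed the second `next()` would still fail. A minimal sketch of a safer read loop, guarding each read with `hasNext()` (assumes the Avro jar on the classpath; the class name `SafeReadExample` is made up for illustration):

```java
import java.io.File;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

public class SafeReadExample {
    public static void main(String[] args) throws IOException {
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Pair\",\"fields\":["
          + "{\"name\":\"left\",\"type\":\"string\"},"
          + "{\"name\":\"right\",\"type\":\"string\"}]}");

        // Write a single record, as in the question.
        File file = new File("data.avro");
        DataFileWriter<GenericRecord> writer =
            new DataFileWriter<GenericRecord>(new GenericDatumWriter<GenericRecord>(schema));
        writer.create(schema, file);
        GenericRecord datum = new GenericData.Record(schema);
        datum.put("left", "L");
        datum.put("right", "R");
        writer.append(datum);
        writer.close();

        // Read records back; hasNext() prevents reading past the last one.
        DataFileReader<GenericRecord> reader = new DataFileReader<GenericRecord>(
            file, new GenericDatumReader<GenericRecord>());
        while (reader.hasNext()) {
            GenericRecord result = reader.next();
            System.out.println("left: " + result.get("left")
                + ", right: " + result.get("right"));
        }
        reader.close();
    }
}
```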

绝情姑娘 2024-11-06 16:48:03

If the file is at the root of your jar, put a slash before the file name.

Schema.parse(getClass().getResourceAsStream("/Pair.avsc"));

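A short sketch of why this matters: without the leading slash, `getResourceAsStream` resolves the name relative to the class's package (`org/avro/example/SimpleAvro/`), and a missing resource silently returns `null`, which then surfaces as a confusing parse failure. Checking for `null` first makes the real problem obvious. (The class name `ResourceSchemaExample` is made up for illustration; assumes the Avro jar on the classpath.)

```java
import java.io.InputStream;

import org.apache.avro.Schema;

public class ResourceSchemaExample {
    public static void main(String[] args) throws Exception {
        // Leading slash: resolve from the classpath root,
        // not relative to this class's package.
        InputStream in = ResourceSchemaExample.class.getResourceAsStream("/Pair.avsc");
        if (in == null) {
            // Fail fast with a clear message instead of letting the
            // schema parser choke on a missing/empty input.
            throw new IllegalStateException("Pair.avsc not found on the classpath");
        }
        Schema schema = new Schema.Parser().parse(in);
        System.out.println(schema.getFullName());
    }
}
```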