I'm trying to read a parquet file bundled as a resource inside a JAR, ideally as a stream.
Does anyone have a working example that doesn't involve writing the resource out as a temporary file first?
Here is the code I'm using to read the files, which works fine in the IDE before bundling into a JAR:
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.hadoop.ParquetReader;
import org.apache.parquet.hadoop.util.HadoopInputFile;

import java.io.IOException;
import java.net.URISyntaxException;

try {
    Path path = new Path(classLoader.getResource(pattern_id).toURI());
    Configuration conf = new Configuration();
    try (ParquetReader<GenericRecord> r = AvroParquetReader.<GenericRecord>builder(
                HadoopInputFile.fromPath(path, conf))
            .disableCompatibility()
            .build()) {
        patternsFound.add(pattern_id);
        GenericRecord record;
        while ((record = r.read()) != null) {
            // Do some work
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
} catch (NullPointerException | URISyntaxException e) {
    e.printStackTrace();
}
When running this code from a JAR file, I get this error:
org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "jar"
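(For context on why this happens: when a resource lives inside a JAR, `getResource()` returns a `jar:` URL, and Hadoop has no `FileSystem` implementation registered for that scheme. A minimal sketch, with a hypothetical JAR path, showing the scheme such a URL carries:)

```java
import java.net.URL;

public class JarUrlDemo {
    public static void main(String[] args) throws Exception {
        // Inside a JAR, ClassLoader.getResource() returns a "jar:" URL like
        // this one (the path here is hypothetical, for illustration only).
        URL url = new URL("jar:file:/tmp/app.jar!/data/file.parquet");
        System.out.println(url.getProtocol()); // prints: jar
    }
}
```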
Which I figured I could get around by using:
InputStream inputFile = classLoader.getResourceAsStream(pattern_id);
But I don't know how to get AvroParquetReader to work with an InputStream.
CodePudding user response:
I wound up being able to read a parquet file as a resource stream by tweaking the solution here: https://stackoverflow.com/a/58261488/3112960
import org.apache.commons.io.IOUtils;
import org.apache.parquet.io.DelegatingSeekableInputStream;
import org.apache.parquet.io.InputFile;
import org.apache.parquet.io.SeekableInputStream;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ParquetStream implements InputFile {
    private final String streamId;
    private final byte[] data;

    // ByteArrayInputStream's cursor (pos) is protected, so a subclass can
    // expose it to support seeking.
    private static class SeekableByteArrayInputStream extends ByteArrayInputStream {
        public SeekableByteArrayInputStream(byte[] buf) {
            super(buf);
        }

        public void setPos(int pos) {
            this.pos = pos;
        }

        public int getPos() {
            return this.pos;
        }
    }

    public ParquetStream(String streamId, InputStream stream) throws IOException {
        this.streamId = streamId;
        this.data = IOUtils.toByteArray(stream);
    }

    @Override
    public long getLength() {
        return this.data.length;
    }

    @Override
    public SeekableInputStream newStream() throws IOException {
        return new DelegatingSeekableInputStream(new SeekableByteArrayInputStream(this.data)) {
            @Override
            public void seek(long newPos) {
                ((SeekableByteArrayInputStream) this.getStream()).setPos((int) newPos);
            }

            @Override
            public long getPos() {
                return ((SeekableByteArrayInputStream) this.getStream()).getPos();
            }
        };
    }

    @Override
    public String toString() {
        return "ParquetStream[" + streamId + "]";
    }
}
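(The trick that makes this work is that ByteArrayInputStream keeps its cursor in a protected `pos` field, so a subclass can turn it into a seekable stream. A standalone sketch of just that mechanism:)

```java
import java.io.ByteArrayInputStream;

public class SeekableDemo {
    // ByteArrayInputStream's read position is stored in the protected 'pos'
    // field, so a subclass can expose it for random access (seeking).
    static class SeekableByteArrayInputStream extends ByteArrayInputStream {
        SeekableByteArrayInputStream(byte[] buf) { super(buf); }
        void setPos(int pos) { this.pos = pos; }
        int getPos()         { return this.pos; }
    }

    public static void main(String[] args) {
        SeekableByteArrayInputStream s =
                new SeekableByteArrayInputStream(new byte[] {10, 20, 30, 40});
        s.read();          // consume one byte; cursor advances to 1
        s.setPos(3);       // "seek" to offset 3
        System.out.println(s.read());   // prints: 40
        System.out.println(s.getPos()); // prints: 4
    }
}
```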
Then I can just do this (using try-with-resources so the reader is closed when done):

InputStream in = classLoader.getResourceAsStream(pattern_id);
try {
    ParquetStream parquetStream = new ParquetStream(pattern_id, in);
    try (ParquetReader<GenericRecord> r = AvroParquetReader.<GenericRecord>builder(parquetStream)
            .disableCompatibility()
            .build()) {
        GenericRecord record;
        while ((record = r.read()) != null) {
            // do some work
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
Maybe this will help someone in the future, since I couldn't find any straightforward answers to this.