I'm getting a ClassCastException in Java and can't get rid of it. Can anyone tell me what I'm missing here?
Here is my method:
private List<Record> getGenericRecordsforHierarchy(int hID, Schema schema, List<HierarchyStruc> hs, int maxLevel) throws BookHierarchyException {
    List<GenericData.Record> recordList = new ArrayList<GenericData.Record>();
    GenericData.Record record = new GenericData.Record(schema);
    for (int i = 0; i < hs.size(); i++) {
        record = new GenericData.Record(schema);
        record.put("HierarchyId", hID);
        for (int j = 0; j <= (maxLevel * 3) && j < ((Constants.MAX_ALLOWED) * 3); j++) {
            int k = 0;
            record.put("Level" + (k + 1) + "Id", hs.get(j));
            record.put("Level" + (k + 1) + "Desc", hs.get(j + 1));
            record.put("Level" + (k + 1) + "nodeId", hs.get(j + 2));
            j = j + 2;
            k++;
            if (j + 1 > (maxLevel * 3) || null == hs.get(j + 1)) {
                record.put("parentNodeId", hs.get(i).getParentNodeId());
                record.put("BookName", hs.get(i).getBookName());
                record.put("HierarchyName", hs.get(i).getHierarchyName());
                record.put("NodeDesc", hs.get(i).getNodeDesc());
                break;
            }
        }
        recordList.add(record);
    }
    return recordList;
}
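For context on the types involved: hs is a List<HierarchyStruc>, so hs.get(j) returns a whole HierarchyStruc bean, and put() stores that bean directly under fields like Level1Id. If those fields are declared as string in the Avro schema (an assumption; the schema isn't shown), a string field would need a CharSequence value extracted from the bean instead. A sketch of that, with getLevelId()/getLevelDesc()/getNodeId() as hypothetical accessor names:

// Hedged sketch only: assumes the Level*Id/Desc/nodeId fields are Avro strings.
// The three accessors on HierarchyStruc are hypothetical names for illustration.
void putLevel(GenericData.Record record, int k, HierarchyStruc h) {
    record.put("Level" + (k + 1) + "Id", h.getLevelId());     // a String, not the bean
    record.put("Level" + (k + 1) + "Desc", h.getLevelDesc()); // a String
    record.put("Level" + (k + 1) + "nodeId", h.getNodeId());  // a String
}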
My GenericData.Record class looks like this:
public static class Record implements GenericRecord, Comparable<Record> {
    private final Schema schema;
    private final Object[] values;

    public Record(Schema schema) {
        if (schema == null || !Type.RECORD.equals(schema.getType()))
            throw new AvroRuntimeException("Not a record schema: " + schema);
        this.schema = schema;
        this.values = new Object[schema.getFields().size()];
    }

    public Record(Record other, boolean deepCopy) {
        schema = other.schema;
        values = new Object[schema.getFields().size()];
        if (deepCopy) {
            for (int ii = 0; ii < values.length; ii++) {
                values[ii] = INSTANCE.deepCopy(
                    schema.getFields().get(ii).schema(), other.values[ii]);
            }
        } else {
            System.arraycopy(other.values, 0, values, 0, other.values.length);
        }
    }

    @Override public Schema getSchema() { return schema; }

    @Override public void put(String key, Object value) {
        Schema.Field field = schema.getField(key);
        if (field == null)
            throw new AvroRuntimeException("Not a valid schema field: " + key);
        values[field.pos()] = value;
    }

    @Override public void put(int i, Object v) { values[i] = v; }

    @Override public Object get(String key) {
        Field field = schema.getField(key);
        if (field == null) return null;
        return values[field.pos()];
    }

    @Override public Object get(int i) { return values[i]; }

    @Override public boolean equals(Object o) {
        if (o == this) return true;               // identical object
        if (!(o instanceof Record)) return false; // not a record
        Record that = (Record) o;
        if (!this.schema.equals(that.schema))
            return false;                         // not the same schema
        return GenericData.get().compare(this, that, schema, true) == 0;
    }

    @Override public int hashCode() {
        return GenericData.get().hashCode(this, schema);
    }

    @Override public int compareTo(Record that) {
        return GenericData.get().compare(this, that, schema);
    }

    @Override public String toString() {
        return GenericData.get().toString(this);
    }
}
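Note that put() above just does values[field.pos()] = value with no type check against the field's schema, so an ill-typed value is accepted silently and only fails once something downstream casts it. A minimal, self-contained sketch of that (the PutDemo class and its one-field schema are made up for illustration):

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;

public class PutDemo {
    public static void main(String[] args) {
        // Made-up one-field record schema with a string column.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Row\",\"fields\":["
            + "{\"name\":\"Level1Id\",\"type\":\"string\"}]}");
        GenericData.Record record = new GenericData.Record(schema);
        // put() stores the value as-is, so a non-CharSequence in a string
        // field is accepted silently here...
        record.put("Level1Id", new Object());
        // ...and only fails later, when a writer casts it to CharSequence.
        System.out.println(record.get("Level1Id").getClass()); // class java.lang.Object
    }
}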
I get the error in the method below:
private void writeToParquet(String hadoopPath, List<Record> recordList, Schema schema)
        throws BookHierarchyException {
    org.apache.hadoop.fs.Path path = new org.apache.hadoop.fs.Path(hadoopPath);
    ParquetWriter<GenericData.Record> writer = null;
    Configuration configuration = new Configuration(false);
    configuration.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
    try {
        writer = AvroParquetWriter.<GenericData.Record>builder(path)
                .withRowGroupSize(ParquetWriter.DEFAULT_BLOCK_SIZE)
                .withPageSize(ParquetWriter.DEFAULT_PAGE_SIZE)
                .withSchema(schema)
                .withConf(new Configuration())
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .withValidation(false)
                .withDictionaryEncoding(false)
                .build();
        for (GenericData.Record record : recordList) {
            writer.write(record);
        }
        log.info("File writing done. Closing file");
        writer.close();
    } catch (IOException e) {
        log.error(e);
        throw new BookHierarchyException("Error in file handling");
    }
}
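Unrelated to the exception, but worth noting: writer.close() is only reached if every write() succeeds, so the writer leaks when an exception is thrown mid-loop. ParquetWriter implements Closeable, so a try-with-resources variant (a sketch using the same builder calls as above) closes it either way:

// Sketch: same builder chain as above, but close() runs even if write() throws.
try (ParquetWriter<GenericData.Record> w = AvroParquetWriter
        .<GenericData.Record>builder(path)
        .withSchema(schema)
        .withConf(new Configuration())
        .withCompressionCodec(CompressionCodecName.SNAPPY)
        .build()) {
    for (GenericData.Record record : recordList) {
        w.write(record);
    }
} catch (IOException e) {
    log.error(e);
    throw new BookHierarchyException("Error in file handling");
}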
The error:
java.lang.ClassCastException: HierarchyStruc cannot be cast to java.lang.CharSequence
Can anyone tell me what I'm doing wrong here? I have compared the schema's types with the column types of HierarchyStruc, and they match as well. It seems to be some other problem that I've been trying to solve since this morning. The full stack trace:
java.lang.ClassCastException: com.package.beans.HierarchyStruc cannot be cast to java.lang.CharSequence
    at org.apache.parquet.avro.AvroWriteSupport.fromAvroString(AvroWriteSupport.java:371)
    at org.apache.parquet.avro.AvroWriteSupport.writeValueWithoutConversion(AvroWriteSupport.java:346)
    at org.apache.parquet.avro.AvroWriteSupport.writeValue(AvroWriteSupport.java:278)
    at org.apache.parquet.avro.AvroWriteSupport.writeRecordFields(AvroWriteSupport.java:191)
    at org.apache.parquet.avro.AvroWriteSupport.write(AvroWriteSupport.java:165)
    at org.apache.parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:128)
    at org.apache.parquet.hadoop.ParquetWriter.write(ParquetWriter.java:299)
    at com.package.utilities.ParquetFileHandler.writeToParquet(ParquetFileHandler.java:73)
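The trace ends in AvroWriteSupport.fromAvroString, i.e. the failure happens while writing a field whose Avro type is string: whatever object sits in that field gets cast to CharSequence at write time, not at put() time. One way to surface the mismatch earlier is Avro's GenericData.validate(Schema, Object), which checks a datum against a schema; a sketch of a pre-write check that could go inside the try block of writeToParquet:

// Sketch: fail fast with the offending record visible, instead of a
// ClassCastException deep inside AvroWriteSupport.
for (GenericData.Record record : recordList) {
    if (!GenericData.get().validate(schema, record)) {
        throw new BookHierarchyException("Record does not match schema: " + record);
    }
    writer.write(record);
}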
The calling method looks like this:
public void persistHierarchyData(String dateYYYY_MM_DD, int hID, List<HierarchyStruc> hs, int maxLevel)
        throws BookHierarchyException {
    Schema schema = parquetSchemaParser.parseSchema(avroSchemaName);
    String hadoopPath = getUpdatedDatePattern(hadoopDirectory + hadoopFileName, dateYYYY_MM_DD);
    hadoopPath = getUpdatedHierarchyPattern(hadoopPath, hID);
    List<GenericData.Record> recordList = getGenericRecordsforHierarchy(hID, schema, hs, maxLevel);
    log.info("recordList size is " + recordList.size());
    parquetFileHandler.persist(schema, recordList, hadoopPath);
}