How to read HBase data with Spark/Scala

2024-12-25 20:48:31
Recommended answers (1)
Answer 1:

If you hit
java.io.NotSerializableException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
it is because ImmutableBytesWritable does not implement java.io.Serializable, so launch the shell with the Kryo serializer instead:
spark-shell --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
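
If you are running a standalone application rather than spark-shell, the same setting can be applied on the SparkConf. A minimal sketch follows (the app name is a placeholder; registering the HBase classes with Kryo is optional and only makes serialization more compact):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.client.Result

// Same Kryo setting as the --conf flag above, applied in code.
val sparkConf = new SparkConf()
  .setAppName("hbase-read-example")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Optional: register the HBase record classes for more compact Kryo output.
  .registerKryoClasses(Array(classOf[ImmutableBytesWritable], classOf[Result]))
val sc = new SparkContext(sparkConf)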
The following code was verified against MapR-DB:
import org.apache.spark._
import org.apache.spark.rdd.NewHadoopRDD
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import org.apache.hadoop.hbase.client.HBaseAdmin
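
The answer's snippet stops at the imports above, so below is a minimal sketch of the usual newAPIHadoopRDD pattern for scanning a table, not the original answer's code. It assumes sc is the SparkContext provided by spark-shell, and the table name "my_table" and column "cf:col" are placeholders to replace with your own:

// Additional imports beyond those listed above.
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.util.Bytes

// Point the scan at the table to read ("my_table" is a placeholder).
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableInputFormat.INPUT_TABLE, "my_table")

// Each record produced by TableInputFormat is a (row key, Result) pair.
val hbaseRDD = sc.newAPIHadoopRDD(
  hbaseConf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

// Decode the row key and one column value ("cf"/"col" are placeholders).
val rows = hbaseRDD.map { case (key, result) =>
  val rowKey = Bytes.toString(key.get())
  val value  = Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col")))
  (rowKey, value)
}

rows.take(10).foreach(println)

Note that result.getValue returns null when a row lacks that column, so real code should guard the conversion, for example by wrapping it in Option.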