Errors encountered while running Spark code that inserts data into an HBase table
1. Missing hadoop-mapreduce-client-core-2.5.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
2. Missing hbase-protocol-1.3.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingInterface
3. Missing metrics-core-2.2.0.jar
Error shown in the terminal: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
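For context, the write path that pulls in these classes can be sketched as below. This is a hypothetical minimal example, not the original FlumeKafkaToHbase code: the table name "t_user", the column family "info", and the ZooKeeper quorum are all assumptions.

```scala
// Sketch only: table name, column family, and quorum below are assumptions.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapred.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapred.JobConf // class from error 1 (hadoop-mapreduce-client-core)
import org.apache.spark.{SparkConf, SparkContext}

object HbaseWriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HbaseWriteSketch"))

    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set("hbase.zookeeper.quorum", "localhost:2181") // assumed quorum

    // saveAsHadoopDataset uses the old mapred API, hence the JobConf dependency
    val jobConf = new JobConf(hbaseConf)
    jobConf.setOutputFormat(classOf[TableOutputFormat])
    jobConf.set(TableOutputFormat.OUTPUT_TABLE, "t_user") // assumed table name

    sc.parallelize(Seq(("row1", "alice"), ("row2", "bob")))
      .map { case (rowKey, name) =>
        val put = new Put(Bytes.toBytes(rowKey))
        put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes(name))
        (new ImmutableBytesWritable, put)
      }
      .saveAsHadoopDataset(jobConf)

    sc.stop()
  }
}
```

The HBase client side of this write path internally talks to the master via MasterProtos (error 2) and reports metrics via com.yammer.metrics (error 3), which is why those jars are needed even though the application code never imports them directly.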
4. Required jars
hadoop-mapreduce-client-core-2.5.1.jar // provides org.apache.hadoop.mapred.JobConf
hbase-client-1.3.1.jar
hbase-common-1.3.1.jar
hbase-server-1.3.1.jar // mainly for operating HBase
hbase-protocol-1.3.1.jar // provides org.apache.hadoop.hbase.protobuf.generated.MasterProtos
kafka-clients-1.0.0.jar
kafka_2.11-1.0.0.jar // mainly for operating Kafka
spark-core_2.11-2.1.1.jar
spark-streaming_2.11-2.1.1.jar
spark-streaming-kafka-0-10_2.11-2.1.1.jar // mainly for Spark Streaming
zkclient-0.10.jar
zookeeper-3.4.10.jar // mainly for operating ZooKeeper
FlumeKafkaToHbase.jar // custom application jar
5. Submitting the job
/home/spark/bin/spark-submit \
  --master local[2] \
  --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar \
  --class com..FlumeKafkaToHbase \
  --executor-memory 4G \
  --total-executor-cores 2 \
  FlumeKafkaToHbase.jar
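As an alternative to copying each dependency into Spark's lib directory, the same jars can be passed at submit time with --jars. A hedged sketch, assuming the HBase jars live under /usr/local/hbase/lib (the truncated package name com.. is kept exactly as in the original command; note also that --total-executor-cores applies to standalone/Mesos masters and is ignored under local[2]):

```shell
# Sketch: ship dependencies with --jars instead of copying them into Spark.
# Paths are assumptions; adjust to the actual cluster layout.
/home/spark/bin/spark-submit \
  --master "local[2]" \
  --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar \
  --jars "$(ls /usr/local/hbase/lib/*.jar | tr '\n' ',' | sed 's/,$//')" \
  --class com..FlumeKafkaToHbase \
  --executor-memory 4G \
  FlumeKafkaToHbase.jar
```

The --jars flag takes a comma-separated list, which is why the file list is joined with tr; this keeps the submit command in sync with whatever is in the HBase lib directory instead of hard-coding each jar version.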