How to fix "Failed to load driver" (Hive keeps failing to start)
mxj Published: 2023-06-21 01:58:58 74
I had just finished installing the big-data cluster, but Hive kept failing to start.
Step 1: carefully check the hive database and user created in MySQL.
Create the hive database with default character set utf8 and collation utf8_general_ci; create the user 'hive'@'%' identified by 'Hive-123' and grant it all privileges on hive.*. You must then run FLUSH PRIVILEGES, otherwise local connections will fail with a permissions error. Likewise create 'hive'@'localhost' identified by 'Hive-123' and grant it all privileges on hive.*. In SQL:

CREATE DATABASE hive DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
CREATE USER 'hive'@'%' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;

After doing all of the above, the error below still appeared. Note the "Failed to load driver" message.
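To keep the grants for the '%' and 'localhost' host patterns consistent, a small helper can generate the statements. The user name and password are the ones from this post; the helper itself is only an illustration, not part of the original fix.

```shell
# Illustrative helper: print the CREATE USER / GRANT statements for a
# given MySQL host pattern, so '%' and 'localhost' stay in sync.
# 'hive' / 'Hive-123' are the credentials used in this post.
hive_grants() {
  local host="$1"
  cat <<EOF
CREATE USER 'hive'@'$host' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'$host';
EOF
}

# Pipe into the mysql client as root, then flush:
#   { hive_grants '%'; hive_grants localhost; echo 'FLUSH PRIVILEGES;'; } | mysql -u root -p
```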
So the initial diagnosis was that the MySQL driver (com.mysql.jdbc.Driver) was missing. Following the error report, I located Hive's lib directory.
The lib directory is /usr/hdp/3.0.1.0-187/hive/lib. Fetch the connector from /usr/share/java and copy it into that directory on every node:

cd /usr/share/java/
scp -r mysql-connector-java.jar slave1:/usr/hdp/3.0.1.0-187/hive/lib
scp -r mysql-connector-java.jar slave2:/usr/hdp/3.0.1.0-187/hive/lib
cp mysql-connector-java.jar /usr/hdp/3.0.1.0-187/hive/lib

Start Hive again, and it finally worked.
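Before restarting, it can help to confirm the jar actually landed in each lib directory. A quick check like the following can be run on every node; the path is the HDP 3.0.1.0-187 one from above, and the function itself is just a sketch.

```shell
# Sketch: report whether a MySQL connector jar is present in a Hive
# lib directory; run on each node after copying.
check_mysql_driver() {
  local lib_dir="$1"
  if ls "$lib_dir"/mysql-connector-java*.jar >/dev/null 2>&1; then
    echo "driver found in $lib_dir"
  else
    echo "driver missing in $lib_dir"
  fi
}

# e.g. check_mysql_driver /usr/hdp/3.0.1.0-187/hive/lib
```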
stderr (abridged; the SLF4J multiple-binding warnings and most of the Ambari traceback are trimmed):

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_metastore.py", line 201, in <module>
    HiveMetastore().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_metastore.py", line 61, in start
    create_metastore_schema()  # execute without config lock
  ...
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/ ; /usr/hdp/current/hive-server2/bin/schematool ...' failed:
Metastore connection URL:     jdbc:mysql://master/hive
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User:    hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to load driver
Underlying cause: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
... 11 more
*** schemaTool failed ***

stdout: Ambari agent configuration log (users, directories, and config files being created under /etc/hive/3.0.1.0-187/0 and /usr/hdp/current/hive-metastore/conf); trimmed, as it contains no further clues about the failure.
Finally, register the MySQL driver path used at startup, then run the start again; this time it succeeded.
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
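If it is unclear whether the setup command took effect, Ambari records the driver path in /etc/ambari-server/conf/ambari.properties. The property name server.jdbc.driver.path below is my understanding of Ambari's configuration, so verify it against your version; the helper is only a sketch.

```shell
# Sketch: print the custom JDBC driver path recorded by
# `ambari-server setup --jdbc-driver=...`, if any.
# The property name server.jdbc.driver.path is an assumption; check
# the ambari.properties of your Ambari version.
jdbc_driver_path() {
  grep '^server\.jdbc\.driver\.path=' "$1" | cut -d= -f2-
}

# e.g. jdbc_driver_path /etc/ambari-server/conf/ambari.properties
```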
Copyright notice: unless otherwise stated, articles on this site are original to 零度游戏网; when reposting, please credit the source and include a link to this article.