DataX: Writing MySQL Data to Hive
1. Write the job script mysql-to-hive.json
{
    "job": {
        "setting": {
            "speed": {
                "channel": 3
            },
            "errorLimit": {
                "record": 0,
                "percentage": 0.02
            }
        },
        "content": [
            {
                "reader": {
                    "name": "mysqlreader",
                    "parameter": {
                        "username": "your_username",
                        "password": "your_password",
                        "column": [
                            "deptno",
                            "dname",
                            "loc"
                        ],
                        "connection": [
                            {
                                "table": [
                                    "dept"
                                ],
                                "jdbcUrl": [
                                    "jdbc:mysql://IP:3306/test"
                                ]
                            }
                        ]
                    }
                },
                "writer": {
                    "name": "hdfswriter",
                    "parameter": {
                        "defaultFS": "hdfs://hdfs-ha",
                        "hadoopConfig": {
                            "dfs.nameservices": "hdfs-ha",
                            "dfs.ha.namenodes.hdfs-ha": "nn1,nn2",
                            "dfs.namenode.rpc-address.hdfs-ha.nn1": "node01:8020",
                            "dfs.namenode.rpc-address.hdfs-ha.nn2": "node02:8020",
                            "dfs.client.failover.proxy.provider.hdfs-ha": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
                        },
                        "fileType": "text",
                        "path": "/user/hive/warehouse/ods.db/datax_dept",
                        "fileName": "202104",
                        "column": [
                            {
                                "name": "deptno",
                                "type": "int"
                            },
                            {
                                "name": "dname",
                                "type": "varchar"
                            },
                            {
                                "name": "loc",
                                "type": "varchar"
                            }
                        ],
                        "writeMode": "append",
                        "fieldDelimiter": "\t"
                    }
                }
            }
        ]
    }
}
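Note that hdfswriter only writes data files into the HDFS directory given by "path"; it does not create the Hive table itself, so the target table should exist before the job runs, with a field delimiter matching "fieldDelimiter" above. A minimal DDL sketch (the ods database and datax_dept table name are inferred from the "path" in the config; adjust to your environment):

-- Assumed target table; names taken from the "path" setting above.
-- FIELDS TERMINATED BY '\t' must match "fieldDelimiter" in the job.
CREATE DATABASE IF NOT EXISTS ods;

CREATE TABLE IF NOT EXISTS ods.datax_dept (
    deptno INT,
    dname  STRING,
    loc    STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;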
2. Run the job script
python /datax/bin/datax.py ./mysql-to-hive.json
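Once the job finishes, the load can be checked from the Hive CLI (table name assumed from the DDL sketch above):

-- Quick sanity check on the loaded rows
SELECT * FROM ods.datax_dept LIMIT 5;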
