Reading Data from HDFS and Writing to MySQL
1) Rename the file uploaded in the previous case
[atguigu@hadoop102 datax]$ hadoop fs -mv /student.txt* /student.txt
2) View the official template
[atguigu@hadoop102 datax]$ python bin/datax.py -r hdfsreader -w mysqlwriter
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "hdfsreader",
                    "parameter": {
                        "column": [],
                        "defaultFS": "",
                        "encoding": "UTF-8",
                        "fieldDelimiter": ",",
                        "fileType": "orc",
                        "path": ""
                    }
                },
                "writer": {
                    "name": "mysqlwriter",
                    "parameter": {
                        "column": [],
                        "connection": [
                            {
                                "jdbcUrl": "",
                                "table": []
                            }
                        ],
                        "password": "",
                        "preSql": [],
                        "session": [],
                        "username": "",
                        "writeMode": ""
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": ""
            }
        }
    }
}
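The blank fields in the template are exactly the ones a job file must fill in. As a quick aid, here is a minimal Python sketch (the template dict below is copied from the output above) that walks the template and lists every empty parameter:

```python
import json

# The hdfsreader -> mysqlwriter template, as printed by datax.py -r/-w above.
template = json.loads("""
{
    "job": {
        "content": [{
            "reader": {
                "name": "hdfsreader",
                "parameter": {
                    "column": [],
                    "defaultFS": "",
                    "encoding": "UTF-8",
                    "fieldDelimiter": ",",
                    "fileType": "orc",
                    "path": ""
                }
            },
            "writer": {
                "name": "mysqlwriter",
                "parameter": {
                    "column": [],
                    "connection": [{"jdbcUrl": "", "table": []}],
                    "password": "",
                    "preSql": [],
                    "session": [],
                    "username": "",
                    "writeMode": ""
                }
            }
        }],
        "setting": {"speed": {"channel": ""}}
    }
}
""")

def empty_fields(node, prefix=""):
    """Recursively collect dotted paths whose value is an empty string or empty list."""
    found = []
    if isinstance(node, dict):
        for k, v in node.items():
            found += empty_fields(v, f"{prefix}.{k}" if prefix else k)
    elif isinstance(node, list):
        if not node:
            found.append(prefix)
        for i, v in enumerate(node):
            found += empty_fields(v, f"{prefix}[{i}]")
    elif node == "":
        found.append(prefix)
    return found

todo = empty_fields(template)
print(todo)
```

Running this flags twelve fields to fill in (defaultFS, path, columns, JDBC settings, credentials, writeMode, and channel), which is the checklist the next step works through.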
3) Create the configuration file
[atguigu@hadoop102 datax]$ vim job/hdfs2mysql.json
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "hdfsreader",
                    "parameter": {
                        "column": ["*"],
                        "defaultFS": "hdfs://hadoop102:9000",
                        "encoding": "UTF-8",
                        "fieldDelimiter": "\t",
                        "fileType": "text",
                        "path": "/student.txt"
                    }
                },
                "writer": {
                    "name": "mysqlwriter",
                    "parameter": {
                        "column": [
                            "id",
                            "name"
                        ],
                        "connection": [
                            {
                                "jdbcUrl": "jdbc:mysql://hadoop102:3306/datax",
                                "table": ["student2"]
                            }
                        ],
                        "password": "000000",
                        "username": "root",
                        "writeMode": "insert"
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "1"
            }
        }
    }
}
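With fileType "text" and fieldDelimiter "\t", hdfsreader splits each line of /student.txt on tabs and passes the fields through in file order (column ["*"] means all fields). A small Python sketch of that splitting, using made-up sample rows (the values below are illustrative, not the actual file contents):

```python
# Hypothetical sample lines in the tab-delimited layout /student.txt is
# assumed to have (id<TAB>name), matching fieldDelimiter "\t" above.
lines = ["1001\tzhangsan", "1002\tlisi", "1003\twangwu"]

FIELD_DELIMITER = "\t"  # same value as the reader's fieldDelimiter

# column: ["*"] -> keep every field, in file order.
records = [line.split(FIELD_DELIMITER) for line in lines]
print(records)  # [['1001', 'zhangsan'], ['1002', 'lisi'], ['1003', 'wangwu']]
```

If the delimiter in the job file does not match the file's actual delimiter, each line arrives as a single field and the writer's two-column mapping fails, so this is the first thing to check when the job errors out.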
4) Create the student2 table in the datax database in MySQL
mysql> use datax;
mysql> create table student2(id int,name varchar(20));
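mysqlwriter with writeMode "insert" turns each record into an INSERT against student2 using the configured column list. A rough Python illustration of the statement it conceptually issues (the records are hypothetical, and DataX's actual batching and escaping differ):

```python
# Writer settings from the job file above.
table = "student2"
columns = ["id", "name"]

# A couple of hypothetical records as read from HDFS.
records = [("1001", "zhangsan"), ("1002", "lisi")]

def to_insert(table, columns, row):
    """Build a plain INSERT statement for illustration; real code should use
    parameterized queries rather than string formatting."""
    cols = ", ".join(columns)
    vals = ", ".join(f"'{v}'" for v in row)
    return f"insert into {table} ({cols}) values ({vals})"

stmts = [to_insert(table, columns, r) for r in records]
print(stmts[0])  # insert into student2 (id, name) values ('1001', 'zhangsan')
```

Note that the writer's column list must line up positionally with the fields the reader emits, which is why the table was created with (id, name) in that order.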
5) Run the job
[atguigu@hadoop102 datax]$ python bin/datax.py job/hdfs2mysql.json
6) Query the student2 table
mysql> select * from student2;