APACHE-ATLAS-2.1.0 - Installing the Hive Hook (Part 6)


Preface

This post uses the capture of Hive metadata as an example to walk through the workflow and the source code.
Please have Hadoop and Hive environments installed in advance for testing.
Atlas documentation: https://atlas.apache.org/#/HookHive

Metadata sources supported by Atlas

[Figure: metadata sources supported by Atlas, such as Hive, HBase, Sqoop, Storm, and Kafka.]

What is a Hive Hook?

A hook is a mechanism for intercepting events, messages, or function calls during processing. In this sense, Hive hooks provide a way to extend Hive and integrate it with external functionality.

How the Hive Hook works

Hive -> Atlas Hive Hook -> Kafka -> Atlas
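The hook is asynchronous: rather than calling the Atlas REST API inline, it publishes a notification message to the Kafka topic ATLAS_HOOK, which the Atlas server consumes and turns into entity create/update operations. The relevant settings live in atlas-application.properties; a minimal sketch, with the host and port values as placeholders for your environment:

# Use an external Kafka instead of the embedded one
atlas.notification.embedded=false
atlas.kafka.zookeeper.connect=localhost:2181
atlas.kafka.bootstrap.servers=localhost:9092
# Consumer group used by the hook notification consumers
atlas.kafka.hook.group.id=atlas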

Scenario for installing the Hive Hook

(1) First create a database named my_test in Hive (see the one-liner after this list);
(2) Install and configure the Hive hook;
(3) Run the import script to load Hive's existing metadata, then verify in Atlas that the Hive database my_test can be found;
(4) Create a table named user_click_info in the my_test database, then verify in Atlas that the Hive table user_click_info can be found.
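Step (1) is a single statement in the Hive CLI, executed before the hook is configured, so that my_test exists only as historical metadata for the import script to pick up:

hive> create database my_test;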

Steps to install the Hive Hook

(1) Add the following property to the hive-site.xml file under the Hive installation to configure the Atlas hook:

  <property>
    <name>hive.exec.post.hooks</name>
    <value>org.apache.atlas.hive.hook.HiveHook</value>
    <description>
      Comma-separated list of post-execution hooks to be invoked for each statement.
      A post-execution hook is specified as the name of a Java class which implements the
      org.apache.hadoop.hive.ql.hooks.ExecuteWithHookContext interface.
    </description>
  </property>
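Note that hive.exec.post.hooks is a comma-separated list, so if your hive-site.xml already registers other post-execution hooks, keep them and append the Atlas hook. A sketch, using Hive's built-in LineageLogger purely as a stand-in for a pre-existing hook:

  <property>
    <name>hive.exec.post.hooks</name>
    <value>org.apache.hadoop.hive.ql.hooks.LineageLogger,org.apache.atlas.hive.hook.HiveHook</value>
  </property>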

(2) Extract apache-atlas-2.1.0-hive-hook.tar.gz inside the Atlas directory; unpacking produces two new directories, hook and hook-bin:

[root@hm apache-atlas-2.1.0]# pwd
/opt/software/apache-atlas-2.1.0
[root@hm apache-atlas-2.1.0]# tar -zxvf apache-atlas-2.1.0-hive-hook.tar.gz
[root@hm apache-atlas-2.1.0]# ll
total 28
drwxr-xr-x. 2 root root  4096 May 24 13:59 bin
drwxr-xr-x. 5 root root   231 Jun  1 17:09 conf
drwxr-xr-x. 5 root root    65 May 19 01:04 data
-rwxr-xr-x. 1 root root   210 Jul  8  2020 DISCLAIMER.txt
drwxr-xr-x. 8 root root   194 May 18 15:40 hbase
drwxr-xr-x  3 root root    18 Jun  5 22:25 hook
drwxr-xr-x  2 root root    28 Jun  5 22:25 hook-bin
-rwxr-xr-x. 1 root root 14289 Jul  8  2020 LICENSE
drwxr-xr-x. 2 root root   212 Jun  5 19:06 logs
drwxr-xr-x. 7 root root   107 Jul  8  2020 models
-rwxr-xr-x. 1 root root   169 Jul  8  2020 NOTICE
drwxr-xr-x. 3 root root    20 May 18 15:39 server
drwxr-xr-x. 9 root root   201 May 18 11:45 solr
drwxr-xr-x  2 root root    22 May 30 11:05 target
drwxr-xr-x. 4 root root    62 May 18 15:39 tools
[root@hm apache-atlas-2.1.0]#

(3) Add the following to the hive-env.sh file under the Hive installation, to point Hive at the extra libraries it needs at compile/execution time — here, the Atlas Hive hook:

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/opt/software/apache-atlas-2.1.0/hook/hive
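Before restarting Hive, it is worth checking that this path actually contains the jars unpacked in step (2); with the 2.1.0 hive-hook tarball the directory should hold the plugin classloader and bridge-shim jars (exact names may vary by build):

[root@hm apache-atlas-2.1.0]# ls /opt/software/apache-atlas-2.1.0/hook/hive
atlas-hive-plugin-impl  atlas-plugin-classloader-2.1.0.jar  hive-bridge-shim-2.1.0.jar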

(4) Copy the conf/atlas-application.properties file from the Atlas installation directory into the Hive service's conf directory.
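With the layout used throughout this article, that is one copy command (the import log in step (6) confirms the file is then loaded from the Hive conf directory):

[root@hm apache-atlas-2.1.0]# cp conf/atlas-application.properties /opt/software/apache-hive-3.1.0-bin/conf/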
(5) Run the script that synchronizes the existing Hive metadata:

[root@hm hook-bin]# pwd
/opt/software/apache-atlas-2.1.0/hook-bin
[root@hm hook-bin]# ll
total 8
-rwxr-xr-x 1 root root 4304 Jun  5 22:25 import-hive.sh
# Run the metadata import script (you will be prompted for the Atlas username and password: admin/admin). This imports the metadata of whatever already exists in Hive.
[root@hm hook-bin]# ./import-hive.sh

(6) The metadata is imported successfully (Hive Meta Data imported successfully!!!):

[root@hm hook-bin]#
[root@hm hook-bin]# pwd
/opt/software/apache-atlas-2.1.0/hook-bin
[root@hm hook-bin]# ./import-hive.sh
Using Hive configuration directory [/opt/software/apache-hive-3.1.0-bin/conf]
Log file for import is /opt/software/apache-atlas-2.1.0/logs/import-hive.log
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/software/apache-hive-3.1.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/software/hadoop-3.1.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2023-06-05T22:47:05,338 INFO [main] org.apache.atlas.ApplicationProperties - Looking for atlas-application.properties in classpath
2023-06-05T22:47:05,342 INFO [main] org.apache.atlas.ApplicationProperties - Loading atlas-application.properties from file:/opt/software/apache-hive-3.1.0-bin/conf/atlas-application.properties
2023-06-05T22:47:05,381 INFO [main] org.apache.atlas.ApplicationProperties - Using graphdb backend 'janus'
2023-06-05T22:47:05,381 INFO [main] org.apache.atlas.ApplicationProperties - Using storage backend 'hbase2'
2023-06-05T22:47:05,381 INFO [main] org.apache.atlas.ApplicationProperties - Using index backend 'solr'
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Atlas is running in MODE: PROD.
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Setting solr-wait-searcher property 'true'
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Setting index.search.map-name property 'false'
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Setting atlas.graph.index.search.max-result-set-size = 150
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache = true
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache-clean-wait = 20
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.db-cache-size = 0.5
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.tx-cache-size = 15000
2023-06-05T22:47:05,382 INFO [main] org.apache.atlas.ApplicationProperties - Property (set to default) atlas.graph.cache.tx-dirty-size = 120
Enter username for atlas :- admin
Enter password for atlas :-
2023-06-05T22:47:11,008 INFO [main] org.apache.atlas.AtlasBaseClient - Client has only one service URL, will use that for all actions: http://localhost:21000
2023-06-05T22:47:11,050 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Found configuration file file:/opt/software/apache-hive-3.1.0-bin/conf/hive-site.xml
2023-06-05T22:47:12,117 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-06-05T22:47:12,289 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2023-06-05T22:47:12,311 WARN [main] org.apache.hadoop.hive.metastore.ObjectStore - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2023-06-05T22:47:12,318 INFO [main] org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called
2023-06-05T22:47:12,318 INFO [main] org.apache.hadoop.hive.metastore.conf.MetastoreConf - Found configuration file file:/opt/software/apache-hive-3.1.0-bin/conf/hive-site.xml
2023-06-05T22:47:12,319 INFO [main] org.apache.hadoop.hive.metastore.conf.MetastoreConf - Unable to find config file hivemetastore-site.xml
2023-06-05T22:47:12,319 INFO [main] org.apache.hadoop.hive.metastore.conf.MetastoreConf - Found configuration file null
2023-06-05T22:47:12,320 INFO [main] org.apache.hadoop.hive.metastore.conf.MetastoreConf - Unable to find config file metastore-site.xml
2023-06-05T22:47:12,320 INFO [main] org.apache.hadoop.hive.metastore.conf.MetastoreConf - Found configuration file null
2023-06-05T22:47:12,520 INFO [main] DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored
2023-06-05T22:47:12,911 INFO [main] com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
2023-06-05T22:47:12,919 WARN [main] com.zaxxer.hikari.util.DriverDataSource - Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
2023-06-05T22:47:13,108 INFO [main] com.zaxxer.hikari.pool.PoolBase - HikariPool-1 - Driver does not support get/set network timeout for connections. (Feature not implemented: No details.)
2023-06-05T22:47:13,115 INFO [main] com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Start completed.
2023-06-05T22:47:13,142 INFO [main] com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Starting...
2023-06-05T22:47:13,143 WARN [main] com.zaxxer.hikari.util.DriverDataSource - Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
2023-06-05T22:47:13,148 INFO [main] com.zaxxer.hikari.pool.PoolBase - HikariPool-2 - Driver does not support get/set network timeout for connections. (Feature not implemented: No details.)
2023-06-05T22:47:13,151 INFO [main] com.zaxxer.hikari.HikariDataSource - HikariPool-2 - Start completed.
2023-06-05T22:47:13,486 INFO [main] org.apache.hadoop.hive.metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2023-06-05T22:47:13,633 INFO [main] org.apache.hadoop.hive.metastore.MetaStoreDirectSql - Using direct SQL, underlying DB is DERBY
2023-06-05T22:47:13,636 INFO [main] org.apache.hadoop.hive.metastore.ObjectStore - Initialized ObjectStore
2023-06-05T22:47:13,823 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:13,824 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:13,824 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:13,825 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:13,825 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:13,825 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,508 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,512 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,512 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,513 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,513 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:14,513 WARN [main] DataNucleus.MetaData - Metadata has jdbc-type of null yet this is not valid. Ignored
2023-06-05T22:47:15,484 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - Added admin role in metastore
2023-06-05T22:47:15,486 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - Added public role in metastore
2023-06-05T22:47:15,514 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - No user is added in admin role, since config is empty
2023-06-05T22:47:15,670 INFO [main] org.apache.hadoop.hive.metastore.RetryingMetaStoreClient - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2023-06-05T22:47:15,691 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_all_functions
2023-06-05T22:47:15,694 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_all_functions
2023-06-05T22:47:15,745 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - Importing Hive metadata
2023-06-05T22:47:15,745 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_databases: @hive#
2023-06-05T22:47:15,745 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_databases: @hive#
2023-06-05T22:47:15,756 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - Found 2 databases
2023-06-05T22:47:15,756 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_database: @hive#default
2023-06-05T22:47:15,756 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_database: @hive#default
2023-06-05T22:47:15,845 INFO [main] org.apache.atlas.AtlasBaseClient - method=GET path=api/atlas/v2/entity/uniqueAttribute/type/ contentType=application/json; charset=UTF-8 accept=application/json status=404
2023-06-05T22:47:17,519 INFO [main] org.apache.atlas.AtlasBaseClient - method=POST path=api/atlas/v2/entity/ contentType=application/json; charset=UTF-8 accept=application/json status=200
2023-06-05T22:47:17,627 INFO [main] org.apache.atlas.AtlasBaseClient - method=GET path=api/atlas/v2/entity/guid/ contentType=application/json; charset=UTF-8 accept=application/json status=200
2023-06-05T22:47:17,636 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - Created hive_db entity: name=default@primary, guid=99c04f6e-e8cf-498e-b113-db9d1d1c281f
2023-06-05T22:47:17,677 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_tables: db=@hive#default pat=.*
2023-06-05T22:47:17,678 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_tables: db=@hive#default pat=.*
2023-06-05T22:47:17,709 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - No tables to import in database default
2023-06-05T22:47:17,710 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_database: @hive#my_test
2023-06-05T22:47:17,714 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_database: @hive#my_test
2023-06-05T22:47:17,735 INFO [main] org.apache.atlas.AtlasBaseClient - method=GET path=api/atlas/v2/entity/uniqueAttribute/type/ contentType=application/json; charset=UTF-8 accept=application/json status=404
2023-06-05T22:47:18,600 INFO [main] org.apache.atlas.AtlasBaseClient - method=POST path=api/atlas/v2/entity/ contentType=application/json; charset=UTF-8 accept=application/json status=200
2023-06-05T22:47:18,641 INFO [main] org.apache.atlas.AtlasBaseClient - method=GET path=api/atlas/v2/entity/guid/ contentType=application/json; charset=UTF-8 accept=application/json status=200
2023-06-05T22:47:18,641 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - Created hive_db entity: name=my_test@primary, guid=425c3888-3610-462f-965d-4b6ddbe84d3f
2023-06-05T22:47:18,642 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: get_tables: db=@hive#my_test pat=.*
2023-06-05T22:47:18,642 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore.audit - ugi=root     ip=unknown-ip-addr      cmd=get_tables: db=@hive#my_test pat=.*
2023-06-05T22:47:18,643 INFO [main] org.apache.atlas.hive.bridge.HiveMetaStoreBridge - No tables to import in database my_test
Hive Meta Data imported successfully!!!

[Figure: the imported my_test database is now visible in the Atlas UI.]

(7) Create a table in Hive and confirm that the table's information appears in Atlas, which shows that ongoing changes to Hive metadata are synced to Atlas as well:

-- Step 1: create the table user_click_info in Hive:

create table user_click_info
(user_id string comment "user id",
click_time string comment "click time",
item_id string comment "item id")
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

-- Step 2: query in Atlas:

[Figure: the user_click_info table appears in the Atlas UI search results.]
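The same check can also be made from the command line through Atlas's v2 basic-search REST endpoint; a sketch, with the host, port, and admin/admin credentials matching this article's setup:

[root@hm hook-bin]# curl -s -u admin:admin \
  'http://localhost:21000/api/atlas/v2/search/basic?typeName=hive_table&query=user_click_info'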
