Using Apache Knox 2.0.0


Contents

Introduction

Usage

gateway-site.xml

users.ldif

my_hdfs.xml

my_yarn.xml

Other notes


Introduction

The Apache Knox Gateway is a system that provides a single point of authentication and access for Apache Hadoop services in a cluster. The goal is to simplify Hadoop security for both users (i.e. who access the cluster data and execute jobs) and operators (i.e. who control access and manage the cluster). The gateway runs as a server (or cluster of servers) that provides centralized access to one or more Hadoop clusters. In general the goals of the gateway are as follows:

  • Provide perimeter security for Hadoop REST APIs to make Hadoop security easier to set up and use
    • Provide authentication and token verification at the perimeter
    • Enable authentication integration with enterprise and cloud identity management systems
    • Provide service level authorization at the perimeter
  • Expose a single URL hierarchy that aggregates REST APIs of a Hadoop cluster
    • Limit the network endpoints (and therefore firewall holes) required to access a Hadoop cluster
    • Hide the internal Hadoop cluster topology from potential attackers

Usage

After unpacking the distribution, the directory layout looks like this:

Enter the conf directory:

The files of interest are described below.

gateway-site.xml

This is the gateway configuration file. Modify it as follows:

    <property>
        <name>gateway.port</name>
        <value>18483</value>
        <description>The HTTP port for the Gateway.</description>
    </property>
    <property>
        <name>gateway.path</name>
        <value>my/mimi</value>
        <description>The default context path for the gateway.</description>
    </property>
    <property>
        <name>gateway.dispatch.whitelist</name>
        <value>.*$</value>
    </property>
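A note on `gateway.dispatch.whitelist`: the value `.*$` matches every possible dispatch URL, which effectively disables the whitelist protection. That is convenient for a test setup like this one, but risky in production. A quick sketch (my own illustration, not Knox code) of what this regex accepts:

```python
import re

# The whitelist value from gateway-site.xml above.
whitelist = re.compile(r".*$")

# ".*$" matches any URL at all, so every dispatch target is allowed,
# including hosts outside the cluster.
print(bool(whitelist.match("http://hadoop02:50070/webhdfs")))   # internal host
print(bool(whitelist.match("http://evil.example.com/steal")))   # external host
```

For production, a pattern restricted to your cluster's domain is the safer choice.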

users.ldif

This is the user definition file; the full contents follow.

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

version: 1

# Please replace with site specific values
dn: dc=hadoop,dc=apache,dc=org
objectclass: organization
objectclass: dcObject
o: Hadoop
dc: hadoop

# Entry for a sample people container
# Please replace with site specific values
dn: ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: people

dn: ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: myhadoop

# Entry for a sample end user
# Please replace with site specific values
dn: uid=guest,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Guest
sn: User
uid: guest
userPassword:123456

dn: uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Myclient
sn: Client
uid: myclient
userPassword:qwe

dn: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: myhdfs
sn: myhdfs
uid: myhdfs
userPassword:123456

dn: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: myyarn
sn: myyarn
uid: myyarn
userPassword:123


# entry for sample user admin
dn: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Admin
sn: Admin
uid: admin
userPassword:123456

dn: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Admin
sn: Admin
uid: root
userPassword:123456

# entry for sample user sam
dn: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: sam
sn: sam
uid: sam
userPassword:sam-password

# entry for sample user tom
dn: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: tom
sn: tom
uid: tom
userPassword:tom-password

# create FIRST Level groups branch
dn: ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: groups
description: generic groups branch

# create the analyst group under groups
dn: cn=analyst,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: analyst
description: analyst group
member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
member: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org

# create the scientist group under groups
dn: cn=scientist,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: scientist
description: scientist group
member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org

# create the admin group under groups
dn: cn=admin,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: admin
description: admin group
member: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org

Note that I added two users to this file: myhdfs and myyarn.

In the knox-2.0.0/conf/topologies directory you will find a sandbox.xml file.

It is a template: copy it under a new name when you use it. My versions follow.

my_hdfs.xml

<?xml version="1.0" encoding="UTF-8"?>
<topology>
   <name>my_hdfs</name>
   <gateway>
      <provider>
         <role>authentication</role>
         <name>ShiroProvider</name>
         <enabled>true</enabled>
         <param>
            <name>sessionTimeout</name>
            <value>30</value>
         </param>
         <param>
            <name>main.ldapRealm</name>
            <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealm</value>
         </param>
         <param>
            <name>main.ldapContextFactory</name>
            <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactory</value>
         </param>
         <param>
            <name>main.ldapRealm.contextFactory</name>
            <value>$ldapContextFactory</value>
         </param>


		<param>
			<name>main.ldapRealm.contextFactory.systemUsername</name>
			<value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org</value>
		</param>
		<param>
			<name>main.ldapRealm.contextFactory.systemPassword</name>
			<value>${ALIAS=ldcSystemPassword}</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchBase</name>
			<value>ou=myhadoop,dc=hadoop,dc=apache,dc=org</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchAttributeName</name>
			<value>uid</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchFilter</name>
			<value>(&amp;(objectclass=person)(uid={0})(uid=myhdfs))</value>
		</param>

         <param>
            <name>main.ldapRealm.contextFactory.url</name>
            <value>ldap://localhost:33389</value>
         </param>
         <param>
            <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
            <value>simple</value>
         </param>
         <param>
            <name>urls./**</name>
            <value>authcBasic</value>
         </param>
      </provider>
      <provider>
         <role>identity-assertion</role>
         <name>Default</name>
         <enabled>true</enabled>
      </provider>
      <provider>
         <role>hostmap</role>
         <name>static</name>
         <enabled>true</enabled>
         <param>
            <name>localhost</name>
            <value>my_hdfs</value>
         </param>
      </provider>
   </gateway>
<!--
   <service>
      <role>NAMENODE</role>
      <url>hdfs://hadoop02:9000</url>
   </service>
   <service>
      <role>WEBHDFS</role>
      <url>http://hadoop02:50070/webhdfs</url>
   </service>
-->   
   <service>
      <role>HDFSUI</role>
      <url>http://hadoop02:50070</url>
      <version>2.7.0</version>
   </service>

</topology>
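The key line in this topology is the `userSearchFilter`: `(&(objectclass=person)(uid={0})(uid=myhdfs))` requires the entry to be a person, to have the uid the user typed at login (`{0}`), and to have the literal uid `myhdfs`. All three must hold, so only the myhdfs account can authenticate against this topology. A small sketch of that filter logic (my own illustration, not Knox or LDAP code):

```python
def filter_matches(entry, login_uid, required_uid="myhdfs"):
    """Mimic (&(objectclass=person)(uid={0})(uid=myhdfs)):
    an AND filter where every clause must match the entry."""
    return (
        "person" in entry["objectclass"]       # (objectclass=person)
        and entry["uid"] == login_uid          # (uid={0})
        and entry["uid"] == required_uid       # (uid=myhdfs)
    )

myhdfs = {"objectclass": ["person", "inetOrgPerson"], "uid": "myhdfs"}
myyarn = {"objectclass": ["person", "inetOrgPerson"], "uid": "myyarn"}

print(filter_matches(myhdfs, "myhdfs"))  # True  -> myhdfs can log in here
print(filter_matches(myyarn, "myyarn"))  # False -> myyarn is rejected here
```

This is what makes one topology file per user necessary: the filter hard-codes the allowed uid.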

my_yarn.xml

<?xml version="1.0" encoding="UTF-8"?>
<topology>
   <name>my_yarn</name>
   <gateway>
      <provider>
         <role>authentication</role>
         <name>ShiroProvider</name>
         <enabled>true</enabled>
         <param>
            <name>sessionTimeout</name>
            <value>30</value>
         </param>
         <param>
            <name>main.ldapRealm</name>
            <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealm</value>
         </param>
         <param>
            <name>main.ldapContextFactory</name>
            <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactory</value>
         </param>
         <param>
            <name>main.ldapRealm.contextFactory</name>
            <value>$ldapContextFactory</value>
         </param>
         
		 <!-- Login test -->
        <param>
			<name>main.ldapRealm.contextFactory.systemUsername</name>
			<value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org</value>
		</param>
		<param>
			<name>main.ldapRealm.contextFactory.systemPassword</name>
			<value>${ALIAS=ldcSystemPassword}</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchBase</name>
			<value>ou=myhadoop,dc=hadoop,dc=apache,dc=org</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchAttributeName</name>
			<value>uid</value>
		</param>
		<param>
			<name>main.ldapRealm.userSearchFilter</name>
			<value>(&amp;(objectclass=person)(uid={0})(uid=myyarn))</value>
		</param>
		 <!-- Security test -->
		<param><name>csrf.enabled</name><value>true</value></param>
		<param><name>csrf.customHeader</name><value>X-XSRF-Header</value></param>
		<param><name>csrf.methodsToIgnore</name><value>GET,OPTIONS,HEAD</value></param>
		<param><name>cors.enabled</name><value>false</value></param>
		<param><name>xframe.options.enabled</name><value>true</value></param>
		<param><name>xss.protection.enabled</name><value>true</value></param>
		<param><name>strict.transport.enabled</name><value>true</value></param>
		<param><name>rate.limiting.enabled</name><value>true</value></param>

         <param>
            <name>main.ldapRealm.contextFactory.url</name>
            <value>ldap://localhost:33389</value>
         </param>
         <param>
            <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
            <value>simple</value>
         </param>
         <param>
            <name>urls./**</name>
            <value>authcBasic</value>
         </param>
      </provider>
      <provider>
         <role>identity-assertion</role>
         <name>Default</name>
         <enabled>true</enabled>
      </provider>
      <provider>
         <role>hostmap</role>
         <name>static</name>
         <enabled>true</enabled>
         <param>
            <name>localhost</name>
            <value>my_yarn</value>
         </param>
      </provider>
   </gateway>
   <service>
      <role>YARNUI</role>
      <url>http://hadoop03:8088</url>
   </service>
   <service>
    <role>JOBHISTORYUI</role>
    <url>http://hadoop03:19888</url>
  </service>

</topology>
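About the security params in this topology: with `csrf.enabled` set to `true`, any request whose method is not listed in `csrf.methodsToIgnore` must carry the `X-XSRF-Header` header, or Knox rejects it. A sketch of that rule (my own illustration, not Knox code):

```python
# Values taken from the security params in my_yarn.xml above.
CSRF_ENABLED = True
METHODS_TO_IGNORE = {"GET", "OPTIONS", "HEAD"}   # csrf.methodsToIgnore
CUSTOM_HEADER = "X-XSRF-Header"                  # csrf.customHeader

def csrf_check_passes(method, headers):
    """A request passes if CSRF is disabled, the method is in the
    ignore list, or the custom header is present."""
    if not CSRF_ENABLED or method.upper() in METHODS_TO_IGNORE:
        return True
    return CUSTOM_HEADER in headers

print(csrf_check_passes("GET", {}))                     # True  (ignored method)
print(csrf_check_passes("POST", {}))                    # False (header missing)
print(csrf_check_passes("POST", {CUSTOM_HEADER: "1"}))  # True  (header present)
```

Plain browser page views are GET requests, so enabling CSRF protection does not break the YARN UI.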

Note: `${ALIAS=ldcSystemPassword}` in the configuration refers to an encrypted password stored under a Knox credential alias (it can be created with `knoxcli.sh create-alias ldcSystemPassword --cluster <topology_name> --value <password>`).

If authentication fails, replace the alias expression with the actual plaintext password and test again.

Then go to knox-2.0.0/bin and run the following (note: these must not be run as the root user):

./ldap.sh start
./knoxcli.sh create-master
./gateway.sh start

Access in the browser:

https://192.168.200.11:18483/my/mimi/my_yarn/yarn

Username: myyarn, password: 123

https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs

Username: myhdfs, password: 123456
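The external URLs follow a fixed pattern: the gateway address, then `gateway.path` from gateway-site.xml, then the topology file name, then the service path. A quick sketch of that mapping (my own illustration):

```python
def knox_url(host, port, gateway_path, topology, service):
    """Build the URL Knox exposes for a proxied service UI:
    https://<host>:<port>/<gateway.path>/<topology>/<service>"""
    return f"https://{host}:{port}/{gateway_path}/{topology}/{service}"

# gateway.port=18483 and gateway.path=my/mimi come from gateway-site.xml;
# my_yarn and my_hdfs are the topology file names from conf/topologies.
print(knox_url("192.168.200.11", 18483, "my/mimi", "my_yarn", "yarn"))
print(knox_url("192.168.200.11", 18483, "my/mimi", "my_hdfs", "hdfs"))
```

Renaming a topology file therefore changes its URL: the topology name segment is taken from the file name.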

Other notes

After repeated testing I managed to give different users access to different pages. A few remaining observations:

1. A separate topology XML file is required per user; that was the only way I found to restrict different users to different pages.

2. I only succeeded with HDFS, YARN, and HBase. I then tried Hue and could not get it to work no matter what.

Note: the single-user topology XMLs in this article can be copied directly and then modified. Many of the configurations described in the official documentation failed in my tests.
