A Curated List of Public Datasets, Open-Source Tools, and Classic Papers for Text Generation


    This is a list of public datasets, open-source tools, and classic papers on text generation, curated by the Tsinghua University Natural Language Processing Group. Papers are continually added and the list is revised over time; it is shared here for reference.

    Source link: https://github.com/THUNLP-MT/TG-Reading-List

Contents

    Public Datasets

        Story Generation

        Text Generation

    Open-Source Tools

    Classic Papers

        Related Papers

        Seq2Seq-Based Methods

        Variational Autoencoder-Based Methods

        GAN-Based Methods

        Reinforcement Learning-Based Methods

        Knowledge-Based Methods

        Style Transfer

Public Datasets

Story Generation

        ROCStories: Mostafazadeh, Nasrin and Chambers, Nathanael and He, Xiaodong and Parikh, Devi and Batra, Dhruv and Vanderwende, Lucy and Kohli, Pushmeet and Allen, James. 2016. A Corpus and Evaluation Framework for Deeper Understanding of Commonsense Stories. In Proceedings of NAACL-HLT 2016.

        VIST: Huang, Ting-Hao (Kenneth) and Ferraro, Francis and Mostafazadeh, Nasrin and Misra, Ishan and Agrawal, Aishwarya and Devlin, Jacob and Girshick, Ross and He, Xiaodong and Kohli, Pushmeet and Batra, Dhruv and Zitnick, C. Lawrence and Parikh, Devi and Vanderwende, Lucy and Galley, Michel and Mitchell, Margaret. 2016. Visual Storytelling. In Proceedings of ACL 2016.

        WritingPrompts: Fan, Angela and Lewis, Mike and Dauphin, Yann. 2018. Hierarchical Neural Story Generation. In Proceedings of ACL 2018.
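
To experiment with these story corpora, the short sketch below shows one way to load a ROCStories release and reassemble each five-sentence story into a single training string. The file name and the column layout (storyid, storytitle, sentence1–sentence5) are assumptions based on the commonly distributed CSV export and may differ across versions; this is a minimal illustration, not part of the original list.

```python
# Minimal sketch: loading ROCStories for story-generation experiments.
# Assumed input: a local CSV export with columns
#   storyid, storytitle, sentence1 ... sentence5
# (column names vary across releases; adjust to your copy).
import pandas as pd


def load_rocstories(path: str) -> list:
    df = pd.read_csv(path)
    stories = []
    for _, row in df.iterrows():
        sentences = [row[f"sentence{i}"] for i in range(1, 6)]
        stories.append({
            "id": row["storyid"],
            "title": row["storytitle"],
            "story": " ".join(sentences),  # one flat string per story
        })
    return stories


if __name__ == "__main__":
    data = load_rocstories("ROCStories_spring2016.csv")  # hypothetical local path
    print(len(data), "stories loaded")
    print(data[0]["title"], "->", data[0]["story"])
```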

Text Generation

        Yelp Review Generation Dataset: Xu, Jingjing and Ren, Xuancheng and Lin, Junyang and Sun, Xu. 2018. Diversity-Promoting GAN: A Cross-Entropy Based Generative Adversarial Network for Diversified Text Generation. In Proceedings of EMNLP 2018.

        Amazon Review Generation Dataset: McAuley, Julian John and Leskovec, Jure. 2013. From Amateurs to Connoisseurs: Modeling The Evolution of User Expertise Through Online Reviews. In Proceedings of WWW 2013.

        Zhihu Dataset and Composition Dataset: Feng, Xiaocheng and Liu, Ming and Liu, Jiahao and Qin, Bing and Sun, Yibo and Liu, Ting. 2018. Topic-to-essay generation with neural networks. In Proceedings of IJCAI 2018.

        ACL Title and Abstract Dataset: Wang, Qingyun and Zhou, Zhihao and Huang, Lifu and Whitehead, Spencer and Zhang, Boliang and Ji, Heng and Knight, Kevin. 2018. Paper Abstract Writing through Editing Mechanism. In Proceedings of ACL 2018.

        AGENDA Dataset: Koncel-Kedziorski, Rik and Bekal, Dhanush and Luan, Yi and Lapata, Mirella and Hajishirzi, Hannaneh. 2019. Text Generation from Knowledge Graphs with Graph Transformers. In Proceedings of NAACL-HLT 2019.

Open-Source Tools

        Hu, Zhiting and Yang, Zichao and Zhao, Tiancheng and Shi, Haoran and He, Junxian and Wang, Di and Ma, Xuezhe and Liu, Zhengzhong and Liang, Xiaodan and Qin, Lianhui and others. 2018. Texar: A Modularized, Versatile, and Extensible Toolbox for Text Generation. In Proceedings of ACL 2018. (GitHub)

        Zhu, Yaoming and Lu, Sidi and Zheng, Lei and Guo, Jiaxian and Zhang, Weinan and Wang, Jun and Yu, Yong. 2018. Texygen: A Benchmarking Platform for Text Generation Models. In Proceedings of SIGIR 2018. (GitHub)

        Radford, Alec and Wu, Jeffrey and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya. 2019. Language models are unsupervised multitask learners. OpenAI Blog, 1:8. (GitHub)

        Goldfarb-Tarrant, Seraphina and Feng, Haining and Peng, Nanyun. 2019. Plan, Write, and Revise: an Interactive System for Open-Domain Story Generation. In Proceedings of NAACL-HLT 2019. (GitHub)
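
As a quick illustration of how a listed model can be exercised end to end, here is a minimal text-generation sketch. It does not use the original OpenAI GPT-2 repository linked above; instead it assumes the Hugging Face transformers package and its public "gpt2" checkpoint, so the library choice and model name are illustrative assumptions rather than part of the original list.

```python
# Minimal sketch: sampling from a pretrained GPT-2 language model.
# Assumption: the Hugging Face `transformers` package is installed
# (pip install transformers torch); this is not the original OpenAI
# repository referenced in the list above.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Once upon a time, a small robot"
inputs = tokenizer(prompt, return_tensors="pt")

# Nucleus sampling keeps the continuation diverse while truncating the low-probability tail.
outputs = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```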

Classic Papers

Related Papers

        Kingma, Diederik P and Welling, Max. 2014. Auto-Encoding Variational Bayes. In Proceedings of ICLR 2014. (Citation: 4,317)

        Ilya Sutskever, Oriol Vinyals, and Quoc V. Le. 2014. Sequence to Sequence Learning with Neural Networks. In Proceedings of NeurIPS 2014. (Citation: 6,076)

        Goodfellow, Ian and Pouget-Abadie, Jean and Mirza, Mehdi and Xu, Bing and Warde-Farley, David and Ozair, Sherjil and Courville, Aaron and Bengio, Yoshua. 2014. Generative Adversarial Nets. In Proceedings of NeurIPS 2014. (Citation: 7,952)

        Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. 2015. Neural Machine Translation by Jointly Learning to Align and Translate. In Proceedings of ICLR 2015. (Citation: 6,317)

        Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. Attention is All You Need. In Proceedings of NeurIPS 2017. (Citation: 1,393)

        Devlin, Jacob and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of NAACL-HLT 2019. (Citation: 345)

Seq2Seq-Based Methods

        Huang, Ting-Hao (Kenneth) and Ferraro, Francis and Mostafazadeh, Nasrin and Misra, Ishan and Agrawal, Aishwarya and Devlin, Jacob and Girshick, Ross and He, Xiaodong and Kohli, Pushmeet and Batra, Dhruv and Zitnick, C. Lawrence and Parikh, Devi and Vanderwende, Lucy and Galley, Michel and Mitchell, Margaret. 2016. Visual Storytelling. In Proceedings of ACL 2016. (Citation: 76)

        Melissa Roemmele. 2016. Writing Stories with Help from Recurrent Neural Networks. In Proceedings of AAAI 2016. (Citation: 13)

        Jain, Parag and Agrawal, Priyanka and Mishra, Abhijit and Sukhwani, Mohak and Laha, Anirban and Sankaranarayanan, Karthik. 2017. Story Generation from Sequence of Independent Short Descriptions. arXiv preprint arXiv:1707.05501. (Citation: 10)

        Liu, Tianyu and Wang, Kexiang and Sha, Lei and Chang, Baobao and Sui, Zhifang. 2017. Table-to-text Generation by Structure-aware Seq2seq Learning. In Proceedings of AAAI 2018. (Citation: 14)

        Fan, Angela and Lewis, Mike and Dauphin, Yann. 2018. Hierarchical Neural Story Generation. In Proceedings of ACL 2018. (Citation: 18)

        Song, Linfeng and Zhang, Yue and Wang, Zhiguo and Gildea, Daniel. 2018. A Graph-to-Sequence Model for AMR-to-Text Generation. In Proceedings of ACL 2018. (Citation: 10)

        Martin, Lara J and Ammanabrolu, Prithviraj and Wang, Xinyu and Hancock, William and Singh, Shruti and Harrison, Brent and Riedl, Mark O. 2018. Event Representations for Automated Story Generation with Deep Neural Nets. In Proceedings of AAAI 2018. (Citation: 30)

        Clark, Elizabeth and Ji, Yangfeng and Smith, Noah A. 2018. Neural Text Generation in Stories Using Entity Representations as Context. In Proceedings of NAACL-HLT 2018. (Citation: 7)

        Wiseman, Sam and Shieber, Stuart and Rush, Alexander. 2018. Learning Neural Templates for Text Generation. In Proceedings of EMNLP 2018. (Citation: 5)

        Chaturvedi, Snigdha and Peng, Haoruo and Roth, Dan. 2018. Story Comprehension for Predicting What Happens Next. In Proceedings of EMNLP 2018. (Citation: 15)

        Zhang, Yue and Liu, Qi and Song, Linfeng. 2018. Sentence-State LSTM for Text Representation. In Proceedings of ACL 2018. (Citation: 5)

        Kezar, Lee. 2018. Mixed Feelings: Natural Text Generation with Variable, Coexistent Affective Categories. In Proceedings of ACL 2018, Student Research Workshop.

        Welleck, Sean and Brantley, Kianté and Daumé III, Hal and Cho, Kyunghyun. 2019. Non-Monotonic Sequential Text Generation. In Proceedings of ICML 2019. (Citation: 1)

        Pappas, Nikolaos and Henderson, James. 2019. Deep Residual Output Layers for Neural Language Generation. In Proceedings of ICML 2019.

        Moryossef, Amit and Goldberg, Yoav and Dagan, Ido. 2019. Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation. In Proceedings of NAACL-HLT 2019.

        Shen, Sheng and Fried, Daniel and Andreas, Jacob and Klein, Dan. 2019. Pragmatically Informative Text Generation. In Proceedings of NAACL-HLT 2019.

        Fan, Angela and Lewis, Mike and Dauphin, Yann. 2019. Strategies for Structuring Story Generation. In Proceedings of ACL 2019.

Variational Autoencoder-Based Methods

        Li, Jiwei and Luong, Thang and Jurafsky, Dan. 2015. A Hierarchical Neural Autoencoder for Paragraphs and Documents. In Proceedings of ACL 2015. (Citation: 283)

        Semeniuta, Stanislau and Severyn, Aliaksei and Barth, Erhardt. 2017. A Hybrid Convolutional Variational Autoencoder for Text Generation. In Proceedings of EMNLP 2017. (Citation: 57)

        Serban, Iulian Vlad and Ororbia II, Alexander and Pineau, Joelle and Courville, Aaron. 2017. Piecewise Latent Variables for Neural Variational Text Processing. In Proceedings of EMNLP 2017. (Citation: 11)

        Yang, Zichao and Hu, Zhiting and Salakhutdinov, Ruslan and Berg-Kirkpatrick, Taylor. 2017. Improved Variational Autoencoders for Text Modeling using Dilated Convolutions. In Proceedings of ICML 2017. (Citation: 72)

        Hu, Zhiting and Yang, Zichao and Liang, Xiaodan and Salakhutdinov, Ruslan and Xing, Eric P. 2017. Toward Controlled Generation of Text. In Proceedings of ICML 2017. (Citation: 120)

        Deng, Yuntian and Kim, Yoon and Chiu, Justin and Guo, Demi and Rush, Alexander. 2018. Latent Alignment and Variational Attention. In Proceedings of NeurIPS 2018. (Citation: 9)

        Kim, Yoon and Wiseman, Sam and Miller, Andrew C and Sontag, David and Rush, Alexander M. 2018. Semi-Amortized Variational Autoencoders. In Proceedings of ICML 2018. (Citation: 27)

        Bahuleyan, Hareesh and Mou, Lili and Vechtomova, Olga and Poupart, Pascal. 2018. Variational Attention for Sequence-to-Sequence Models. In Proceedings of COLING 2018. (Citation: 14)

        Xu, Jiacheng and Durrett, Greg. 2018. Spherical Latent Spaces for Stable Variational Autoencoders. In Proceedings of EMNLP 2018. (Citation: 6)

        Yoo, Kang Min and Shin, Youhyun and Lee, Sang-goo. 2019. Data Augmentation for Spoken Language Understanding via Joint Variational Generation. In Proceedings of AAAI 2019. (Citation: 2)

        Wang, Wenlin and Gan, Zhe and Xu, Hongteng and Zhang, Ruiyi and Wang, Guoyin and Shen, Dinghan and Chen, Changyou and Carin, Lawrence. 2019. Topic-Guided Variational Auto-Encoder for Text Generation. In Proceedings of NAACL-HLT 2019.

        Bahuleyan, Hareesh and Mou, Lili and Vamaraju, Kartik and Zhou, Hao and Vechtomova, Olga. 2019. Probabilistic Natural Language Generation with Wasserstein Autoencoders. In Proceedings of NAACL-HLT 2019. (Citation: 3)

        Gu, Xiaodong and Cho, Kyunghyun and Ha, Jung-Woo and Kim, Sunghun. 2019. DialogWAE: Multimodal Response Generation with Conditional Wasserstein Auto-Encoder. In Proceedings of ICLR 2019. (Citation: 9)

        Zhang, Xinyuan and Yang, Yi and Yuan, Siyang and Shen, Dinghan and Carin, Lawrence. 2019. Syntax-Infused Variational Autoencoder for Text Generation. In Proceedings of ACL 2019.

        Shen, Dinghan and Celikyilmaz, Asli and Zhang, Yizhe and Chen, Liqun and Wang, Xin and Gao, Jianfeng and Carin, Lawrence. 2019. Towards Generating Long and Coherent Text with Multi-Level Latent Variable Models. In Proceedings of ACL 2019.

GAN-Based Methods

        Kusner, Matt J and Hernández-Lobato, José Miguel. 2016. GANS for Sequences of Discrete Elements with the Gumbel-softmax Distribution. arXiv preprint arXiv:1611.04051. (Citation: 71)

        Gulrajani, Ishaan and Ahmed, Faruk and Arjovsky, Martin and Dumoulin, Vincent and Courville, Aaron C. 2017. Improved Training of Wasserstein GANs. In Proceedings of NeurIPS 2017. (Citation: 1,102)

        Yu, Lantao and Zhang, Weinan and Wang, Jun and Yu, Yong. 2017. SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient. In Proceedings of AAAI 2017. (Citation: 436)

        Liang, Xiaodan and Hu, Zhiting and Zhang, Hao and Gan, Chuang and Xing, Eric P. 2017. Recurrent Topic-Transition GAN for Visual Paragraph Generation. In Proceedings of ICCV 2017. (Citation: 65)

        Zhang, Yizhe and Gan, Zhe and Fan, Kai and Chen, Zhi and Henao, Ricardo and Shen, Dinghan and Carin, Lawrence. 2017. Adversarial Feature Matching for Text Generation. In Proceedings of ICML 2017. (Citation: 68)

        Guo, Jiaxian and Lu, Sidi and Cai, Han and Zhang, Weinan and Yu, Yong and Wang, Jun. 2017. Long Text Generation via Adversarial Training with Leaked Information. In Proceedings of AAAI 2018. (Citation: 46)

        Xu, Jingjing and Ren, Xuancheng and Lin, Junyang and Sun, Xu. 2018. Diversity-Promoting GAN: A Cross-Entropy Based Generative Adversarial Network for Diversified Text Generation. In Proceedings of EMNLP 2018. (Citation: 2)

        Mroueh, Youssef and Li, Chun-Liang and Sercu, Tom and Raj, Anant and Cheng, Yu. 2018. Sobolev GAN. In Proceedings of ICLR 2018. (Citation: 22)

        Fedus, William and Goodfellow, Ian and Dai, Andrew M. 2018. MaskGAN: Better Text Generation via Filling in the_. In Proceedings of ICLR 2018. (Citation: 58)

        Li, Jianing and Lan, Yanyan and Guo, Jiafeng and Xu, Jun and Cheng, Xueqi. 2019. Differentiated Distribution Recovery for Neural Text Generation. In Proceedings of AAAI 2019.

        Nie, Weili and Narodytska, Nina and Patel, Ankit. 2019. RelGAN: Relational Generative Adversarial Networks for Text Generation. In Proceedings of ICLR 2019. (Citation: 5)

        Chen, Francine and Chen, Yan-Ying. 2019. Adversarial Domain Adaptation Using Artificial Titles for Abstractive Title Generation. In Proceedings of ACL 2019.

Reinforcement Learning-Based Methods

        Lin, Kevin and Li, Dianqi and He, Xiaodong and Zhang, Zhengyou and Sun, Ming-Ting. 2017. Adversarial Ranking for Language Generation. In Proceedings of NeurIPS 2017. (Citation: 54)

        Che, Tong and Li, Yanran and Zhang, Ruixiang and Hjelm, R Devon and Li, Wenjie and Song, Yangqiu and Bengio, Yoshua. 2017. Maximum-Likelihood Augmented Discrete Generative Adversarial Networks. arXiv preprint arXiv:1702.07983. (Citation: 64)

        Xu, Jingjing and Zhang, Yi and Zeng, Qi and Ren, Xuancheng and Cai, Xiaoyan and Sun, Xu. 2018. A Skeleton-Based Model for Promoting Coherence Among Sentences in Narrative Story Generation. In Proceedings of EMNLP 2018. (Citation: 4)

        Wang, Xin and Chen, Wenhu and Wang, Yuan-Fang and Wang, William Yang. 2018. No Metrics Are Perfect: Adversarial Reward Learning for Visual Storytelling. In Proceedings of ACL 2018. (Citation: 19)

        Hjelm, R Devon and Jacob, Athul Paul and Che, Tong and Trischler, Adam and Cho, Kyunghyun and Bengio, Yoshua. 2018. Boundary-Seeking Generative Adversarial Networks. In Proceedings of ICLR 2018. (Citation: 52)

        Shi, Zhan and Chen, Xinchi and Qiu, Xipeng and Huang, Xuanjing. 2018. Towards Diverse Text Generation with Inverse Reinforcement Learning. In Proceedings of IJCAI 2018. (Citation: 4)

        Subramanian, Sandeep and Mudumba, Sai Rajeswar and Sordoni, Alessandro and Trischler, Adam and Courville, Aaron C and Pal, Chris. 2018. Towards Text Generation with Adversarially Learned Neural Outlines. In Proceedings of NeurIPS 2018. (Citation: 2)

        Huang, Qiuyuan and Gan, Zhe and Celikyilmaz, Asli and Wu, Dapeng and Wang, Jianfeng and He, Xiaodong. 2019. Hierarchically Structured Reinforcement Learning for Topically Coherent Visual Story Generation. In Proceedings of AAAI 2019. (Citation: 9)

        Hashimoto, Kazuma and Tsuruoka, Yoshimasa. 2019. Accelerated Reinforcement Learning for Sentence Generation by Vocabulary Prediction. In Proceedings of NAACL-HLT 2019.

        Chan, Hou Pong and Chen, Wang and Wang, Lu and King, Irwin. 2019. Neural Keyphrase Generation via Reinforcement Learning with Adaptive Rewards. In Proceedings of ACL 2019.

Knowledge-Based Methods

        Liu, Hugo and Singh, Push. 2002. MAKEBELIEVE: Using Commonsense Knowledge to Generate Stories. In Proceedings of AAAI 2002. (Citation: 86)

        Yang, Bishan and Mitchell, Tom. 2017. Leveraging Knowledge Bases in LSTMs for Improving Machine Reading. In Proceedings of ACL 2017. (Citation: 36)

        Ghazvininejad, Marjan and Brockett, Chris and Chang, Ming-Wei and Dolan, Bill and Gao, Jianfeng and Yih, Wen-tau and Galley, Michel. 2018. A Knowledge-Grounded Neural Conversation Model. In Proceedings of AAAI 2018. (Citation: 61)

        Li, Qian and Li, Ziwei and Wei, Jin-Mao and Gu, Yanhui and Jatowt, Adam and Yang, Zhenglu. 2018. A Multi-Attention Based Neural Network with External Knowledge for Story Ending Predicting Task. In Proceedings of COLING 2018. (Citation: 4)

        Jian Guan, Yansen Wang and Minlie Huang. 2019. Story Ending Generation with Incremental Encoding and Commonsense Knowledge. In Proceedings of AAAI 2019. (Citation: 3)

        Chen, Jiaao and Chen, Jianshu and Yu, Zhou. 2019. Incorporating Structured Commonsense Knowledge in Story Completion. In Proceedings of AAAI 2019.

        Shang, Mingyue and Fu, Zhenxin and Yin, Hongzhi and Tang, Bo and Zhao, Dongyan and Yan, Rui. 2019. Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test? In Student Abstract of AAAI 2019.

        Li, Christy Y. and Liang, Xiaodan and Hu, Zhiting and Xing, Eric P. 2019. Knowledge-Driven Encode, Retrieve, Paraphrase for Medical Image Report Generation. In Proceedings of AAAI 2019. (Citation: 4)

        Koncel-Kedziorski, Rik and Bekal, Dhanush and Luan, Yi and Lapata, Mirella and Hajishirzi, Hannaneh. 2019. Text Generation from Knowledge Graphs with Graph Transformers. In Proceedings of NAACL-HLT 2019.

        Hajdik, Valerie and Buys, Jan and Goodman, Michael W. and Bender, Emily M. 2019. Neural Text Generation from Rich Semantic Representations. In Proceedings of NAACL-HLT 2019.

        Yang, Pengcheng and Luo, Fuli and Chen, Peng and Li, Lei and Chang, Baobao and Sui, Zhifang and Sun, Xu. 2019. Knowledgeable Storyteller: A Commonsense-Driven Generative Model for Visual Storytelling. In Proceedings of IJCAI 2019.

        Yang, Pengcheng and Li, Lei and Luo, Fuli and Liu, Tianyu and Sun, Xu. 2019. Enhancing Topic-to-Essay Generation with External Commonsense Knowledge. In Proceedings of ACL 2019.

Style Transfer

        Hu, Zhiting and Yang, Zichao and Liang, Xiaodan and Salakhutdinov, Ruslan and Xing, Eric P. 2017. Toward Controlled Generation of Text. In Proceedings of ICML 2017. [code] (Citation: 179)

        Shen, Tianxiao and Lei, Tao and Barzilay, Regina and Jaakkola, Tommi. 2017. Style Transfer from Non-Parallel Text by Cross-Alignment. In Proceedings of NeurIPS 2017. [code] (Citation: 123)

        Han, Mengqiao and Wu, Ou and Niu, Zhendong. 2017. Unsupervised Automatic Text Style Transfer using LSTM. In Proceedings of NLPCC 2017. (Citation: 5)

        Li, Juncen and Jia, Robin and He, He and Liang, Percy. 2018. Delete, retrieve, generate: A simple approach to sentiment and style transfer. In Proceedings of NAACL-HLT 2018. [code] (Citation: 53)

        Zhang, Ye and Ding, Nan and Soricut, Radu. 2018. SHAPED: Shared-Private Encoder-Decoder for Text Style Adaptation. In Proceedings of NAACL-HLT 2018. (Citation: 9)

        Prabhumoye, Shrimai and Tsvetkov, Yulia and Salakhutdinov, Ruslan and Black, Alan W. 2018. Style Transfer Through Back-Translation. In Proceedings of ACL 2018. [code] (Citation: 47)

        Xu, Jingjing and Sun, Xu and Zeng, Qi and Ren, Xuancheng and Zhang, Xiaodong and Wang, Houfeng and Li, Wenjie. 2018. Unpaired sentiment-to-sentiment translation: A cycled reinforcement learning approach. In Proceedings of ACL 2018. [code] (Citation: 21)

        Santos, Cicero Nogueira dos and Melnyk, Igor and Padhi, Inkit. 2018. Fighting offensive language on social media with unsupervised text style transfer. In Proceedings of ACL 2018. (Citation: 9)

        Yang, Zichao and Hu, Zhiting and Dyer, Chris and Xing, Eric P. and Berg-Kirkpatrick, Taylor. 2018. Unsupervised Text Style Transfer using Language Models as Discriminators. In Proceedings of NeurIPS 2018. (Citation: 31)

        Zhang, Zhirui and Ren, Shuo and Liu, Shujie and Wang, Jianyong and Chen, Peng and Li, Mu and Zhou, Ming and Chen, Enhong. 2018. Style Transfer as Unsupervised Machine Translation. arXiv preprint arXiv:1808.07894. (Citation: 5)

        Gong, Hongyu and Bhat, Suma and Wu, Lingfei and Xiong, Jinjun and Hwu, Wen-mei. 2019. Reinforcement Learning Based Text Style Transfer without Parallel Training Corpus. In Proceedings of NAACL-HLT 2019. (Citation: 1)

        Luo, Fuli and Li, Peng and Zhou, Jie and Yang, Pengcheng and Chang, Baobao and Sui, Zhifang and Sun, Xu. 2019. A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer. In Proceedings of IJCAI 2019. [code] (Citation: 3)

        Lee, Joseph and Xie, Ziang and Wang, Cindy and Drach, Max and Jurafsky, Dan and Ng, Andrew Y. 2019. Neural Text Style Transfer via Denoising and Reranking. In Proceedings of ACL 2019 Workshop.

 
