pyopenssl

Is it possible to set subjectAltName using pyOpenSSL?

半城伤御伤魂 submitted on 2021-02-04 18:50:30
Question: I need to generate SSL certificates from Python using pyOpenSSL. Does anyone know if it's possible to set subjectAltName? From the documentation (https://pythonhosted.org/pyOpenSSL/api/crypto.html#x509-objects) it doesn't seem possible; in fact, only a set_subject method is provided. Is there any way to add that to the certificate?

Answer 1: Use add_extensions with an X509Extension:

    san_list = ["DNS:*.google.com", "DNS:google.ym"]
    cert.add_extensions([
        OpenSSL.crypto.X509Extension(
            "subjectAltName", False, ", ".join(san_list)
        )
    ])

Answer 2: I
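For completeness, here is a runnable sketch of the whole flow, from key generation to signing, with a SAN attached. The domain names and lifetime are placeholders, and note that recent pyOpenSSL versions require bytes (not str) for the extension type and value:

```python
from OpenSSL import crypto

# Generate a key pair for the self-signed certificate.
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)

cert = crypto.X509()
cert.get_subject().CN = "example.com"          # placeholder common name
cert.set_serial_number(1)
cert.gmtime_adj_notBefore(0)
cert.gmtime_adj_notAfter(365 * 24 * 60 * 60)   # valid for one year
cert.set_issuer(cert.get_subject())            # self-signed: issuer == subject
cert.set_pubkey(key)

# Attach the subjectAltName extension; type name and value must be bytes here.
san = b"DNS:example.com, DNS:*.example.com"
cert.add_extensions([crypto.X509Extension(b"subjectAltName", False, san)])

cert.sign(key, "sha256")
```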

How to select a specific cipher when sending a request via the Python requests module

可紊 submitted on 2021-02-04 04:40:41
Question: Use case: I want to find out how many ciphers a hostname supports, using the Python requests module. I am not able to find a way to pass a cipher name through a requests hook. Can anyone suggest a way to specify the cipher?

    import ssl
    from requests.adapters import HTTPAdapter
    from requests.packages.urllib3.poolmanager import PoolManager

    class Ssl3HttpAdapter(HTTPAdapter):
        """Transport adapter that allows us to use SSLv3."""
        def init_poolmanager(self, connections,
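One commonly suggested route, sketched below (the adapter name and cipher string are illustrative, and nothing here is tested against a particular server), is to build an ssl.SSLContext, restrict it with set_ciphers, and hand it to urllib3's PoolManager from a custom transport adapter:

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager


class CipherAdapter(HTTPAdapter):
    """Transport adapter that restricts TLS to a chosen OpenSSL cipher string."""

    def __init__(self, cipher, **kwargs):
        self.cipher = cipher        # must be set before super().__init__,
        super().__init__(**kwargs)  # which itself calls init_poolmanager()

    def init_poolmanager(self, connections, maxsize, block=False, **pool_kwargs):
        ctx = ssl.create_default_context()
        ctx.set_ciphers(self.cipher)  # raises ssl.SSLError if nothing matches
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize,
            block=block, ssl_context=ctx, **pool_kwargs)


session = requests.Session()
session.mount("https://", CipherAdapter("ECDHE-RSA-AES128-GCM-SHA256"))
# session.get("https://example.com")  # would now offer only that suite
#                                     # (plus the TLS 1.3 defaults)
```

To count the ciphers a host supports, one could loop over candidate cipher strings, mount a fresh adapter per attempt, and treat a handshake failure as "unsupported".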

Installing Scrapy on Python 3.7 (Windows)

大憨熊 submitted on 2021-01-10 17:01:12
This post is in two parts. The first, larger part walks through installing Scrapy by hand on Windows; at the end, readers who are new to programming (or who would rather skip the manual steps) are pointed to the Scrapy Chinese community site, whose recommended approach is simply to install Scrapy via Anaconda.

The do-it-yourself route: Scrapy has quite a few dependencies. Before installing it, make sure the following libraries are present: wheel, lxml, pyOpenSSL, Twisted, and pywin32. Install any that are missing first, then install Scrapy.

Installing wheel. Purpose: pip is convenient, but installs occasionally fail. wheel and egg are both packaging formats that support installation without a compile or build step; wheel is now considered Python's standard binary package format. Install command: pip install wheel. Note: if you have just installed Python and have never installed wheel, you can run the command above directly. If your pip is out of date, update it first by running python -m pip install --upgrade pip, then run the install command.

Installing lxml. Purpose: a Python parsing library that supports HTML and XML parsing and XPath queries, with very high parsing efficiency. Install command: pip install lxml.

Installing zope.interface. Purpose: Python itself does not ship an implementation of interfaces

Get .pfx Cert File Expiration with pyOpenSSL

好久不见. submitted on 2021-01-05 12:42:44
Question: I'm trying to use pyOpenSSL to check the expiration of a .pfx file that the client will need to use with my application. We issue the cert to the client, and it expires every two years. I know this works with openssl on the command line, by converting to a .pem and then running '-noout -enddate' on the resulting .pem file. There is a good chance the client will not have openssl installed, so I'd like to use the library if possible. How would I check the .pfx expiration date? I've gotten the cert
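As a sketch of one way to do this without the openssl binary: pyOpenSSL's old crypto.load_pkcs12 helper was deprecated and has been removed in recent releases, so the route below goes through the cryptography package (which pyOpenSSL itself depends on). The path and password are placeholders:

```python
from datetime import datetime

from cryptography.hazmat.primitives.serialization import pkcs12


def pfx_not_after(path: str, password: bytes) -> datetime:
    """Return the expiration (notAfter) of the certificate inside a .pfx/.p12 file."""
    with open(path, "rb") as f:
        key, cert, extra_certs = pkcs12.load_key_and_certificates(f.read(), password)
    return cert.not_valid_after  # naive UTC datetime
```

Newer cryptography releases also offer not_valid_after_utc, which returns a timezone-aware datetime instead.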

Python Confluent-Kafka SSL Configuration

最后都变了- submitted on 2020-12-13 03:49:47
Question: A basic Confluent-Kafka producer and consumer were created to send plaintext messages. After messages flowed successfully from producer to consumer, additional configs were added to use SSL rather than PLAINTEXT. The following configs were implemented, and they result in the error "Message Timed Out".

Producer configs:
    bootstrap.servers: localhost:9093
    security.protocol: SSL
    ssl.keystore.location: ../keystore.p12
    ssl.keystore.password: [password]
    ssl.ca.location: ../CARoot
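For reference, a minimal SSL producer config sketch for confluent-kafka. Every path and password below is a placeholder; note that librdkafka reads a PKCS#12 client bundle from ssl.keystore.location (with ssl.keystore.password), while PEM-format client material goes in ssl.certificate.location / ssl.key.location, and ssl.ca.location must point at the CA that signed the broker's certificate:

```python
# Placeholder SSL settings for a confluent-kafka Producer; adjust paths to your setup.
producer_conf = {
    "bootstrap.servers": "localhost:9093",    # host:port -- a missing colon will not resolve
    "security.protocol": "SSL",
    "ssl.ca.location": "CARoot.pem",          # CA that signed the broker's certificate
    "ssl.keystore.location": "keystore.p12",  # client key + cert as PKCS#12
    "ssl.keystore.password": "changeit",
}

# Commented out because it needs a running SSL-enabled broker:
# from confluent_kafka import Producer
# producer = Producer(producer_conf)
# producer.produce("my-topic", b"hello over TLS")
# producer.flush()
```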

Advanced Python crawling with Scrapy

[亡魂溺海] submitted on 2020-12-04 05:36:55
Crawling Baidu images with Scrapy. A while back I used Python's requests and BeautifulSoup libraries to scrape 6,000 Maoyan movie reviews of the Detective Conan theatrical film. This time we'll use the Scrapy framework for the crawling task: fetching Baidu's "aesthetic images". I've uploaded the whole project's source to GitHub; interested readers can download it, and a star for the project would be very welcome. Project address: https://github.com/ITBoy-China/scrapy

First, a look at the results. Watching the images come down one by one brings a real sense of accomplishment, doesn't it? Now let's walk through how the image crawl is implemented.

1. Preparation. The tools used this time are: python3.7.3, PyCharm5.0.3, Scrapy1.7.4. If you don't have Scrapy yet, install the framework from the command line with pip install scrapy. On Windows the install will initially fail, because Scrapy needs some other dependency libraries installed first: lxml, pyOpenSSL, Twisted, and pywin32. Once those libraries are in place, installing Scrapy no longer errors. After installation finishes, type scrapy at the command line to check that it is installed; the output follows. Then we create the Scrapy project by running, at the command line: scrapy startproject XXX, where XXX is your project name