twisted

Sending data from one Protocol to another Protocol in Twisted?

Submitted by 。_饼干妹妹 on 2019-11-27 06:29:44
Question: One of my protocols is connected to a server, and I'd like to send its output to the other protocol. I need to access the 'msg' method in ClassA from ClassB, but I keep getting: exceptions.AttributeError: 'NoneType' object has no attribute 'write'. Actual code:

from twisted.words.protocols import irc
from twisted.internet import protocol
from twisted.internet.protocol import Protocol, ClientFactory
from twisted.internet import reactor

IRC_USERNAME = 'xxx'
IRC_CHANNEL = '#xxx' T

Threads vs. Async

Submitted by 落花浮王杯 on 2019-11-27 05:06:08
Question: I have been reading up on the threaded model of programming versus the asynchronous model in this really good article: http://krondo.com/blog/?p=1209. The article makes the following points: an async program will simply outperform a sync program by switching between tasks whenever there is I/O, and threads are managed by the operating system. I remember reading that threads are managed by the operating system by moving TCBs between the Ready-Queue and the Waiting-Queue
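The task switching the article describes can be seen directly with Python's standard asyncio module (used here purely as an illustration; the article itself is about Twisted): each await is a point where a task gives up control, and the event loop interleaves the tasks the way a reactor interleaves I/O-bound work.

```python
import asyncio

async def worker(name, results):
    # Each await is a point where the event loop may switch tasks,
    # mimicking how an async program overlaps waiting on I/O.
    for i in range(3):
        await asyncio.sleep(0)  # yield control back to the loop
        results.append((name, i))

async def main():
    results = []
    # Run both workers concurrently on a single thread.
    await asyncio.gather(worker("a", results), worker("b", results))
    return results

order = asyncio.run(main())
print(order)  # the two workers interleave: a0, b0, a1, b1, a2, b2
```

No threads are involved: the interleaving comes entirely from the tasks voluntarily yielding at each await, which is the cooperative multitasking the article contrasts with OS-scheduled threads.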

Asynchronous Programming in Python Twisted

Submitted by 余生颓废 on 2019-11-27 05:04:57
Question: I'm having trouble developing a reverse proxy in Twisted. It works, but it seems overly complex and convoluted; so much of it feels like voodoo. Are there any simple, solid examples of asynchronous program structure on the web or in books? A sort of best-practices guide? When I complete my program, I'd like to still be able to see its structure in some way, not be looking at a bowl of spaghetti. Answer 1: Twisted contains a large number of examples. One in particular, the "evolution of Finger"

What is the difference between event driven model and reactor pattern? [closed]

Submitted by 亡梦爱人 on 2019-11-27 04:55:26
Question: From the Wikipedia article on the reactor pattern: "The reactor design pattern is an event handling pattern for handling service requests delivered concurrently to a service handler by one or more inputs." It names a few examples, e.g. nodejs, twisted, eventmachine. As I understand it, these are popular event-driven frameworks, so does that also make them reactor-pattern frameworks? How do I differentiate between the two? Or are they the same? Answer 1: The reactor pattern is more specific than "event driven
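To make the distinction concrete, here is a minimal toy reactor built on Python's standard selectors module (an illustrative sketch, not Twisted's implementation): handlers are registered for I/O readiness events, and a single loop demultiplexes those events and dispatches synchronously to the handlers, which is the defining structure of the reactor pattern.

```python
import selectors
import socket

class MiniReactor:
    """A toy single-threaded reactor: register sockets with callbacks,
    then demultiplex readiness events in one loop and dispatch."""

    def __init__(self):
        self.sel = selectors.DefaultSelector()
        self.running = False

    def add_reader(self, sock, callback):
        # Associate a handler with "this socket became readable".
        self.sel.register(sock, selectors.EVENT_READ, callback)

    def stop(self):
        self.running = False

    def run(self):
        self.running = True
        while self.running:
            for key, _ in self.sel.select(timeout=1):
                key.data(key.fileobj)  # dispatch to the registered handler

reactor = MiniReactor()
a, b = socket.socketpair()
received = []

def on_readable(sock):
    received.append(sock.recv(1024))
    reactor.stop()

reactor.add_reader(b, on_readable)
a.send(b"ping")     # make b readable before the loop starts
reactor.run()
a.close(); b.close()
print(received)     # [b'ping']
```

"Event driven" is the broader idea (react to events rather than follow a fixed control flow); the reactor pattern is one specific way to structure it, with a synchronous demultiplexer (select here) feeding a single dispatch loop.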

Learning twisted (the deferred mechanism)

Submitted by 僤鯓⒐⒋嵵緔 on 2019-11-27 04:25:25
1. In asynchronous programming, the defer mechanism is mainly used to manage callback functions. Much of Twisted is implemented on an event-driven model, and error handling in asynchronous programs differs from synchronous ones: an asynchronous program will keep executing right past an error, so handling errors properly is especially important. defer helps us manage our callback and errback functions, and arranging deferreds sensibly matters a great deal in asynchronous code.
2. A deferred contains a chain of callback pairs: one side handles the normal case, the other handles errors. Within each pair the two sides are mutually exclusive; if the callback runs, the errback will not, and if the errback runs, the callback will not.

from twisted.internet.defer import Deferred

def successHandle(result):
    print 'It is success:'
    print result

def failedHandle(reason):
    print 'Error.'

d = Deferred()
# add a callback/errback pair to the chain
d.addCallbacks(successHandle, failedHandle)
# fire the chain with a normal result
d
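The mutual exclusivity of each callback/errback pair can be illustrated with a toy pure-Python model (a simplified sketch of the semantics described above, not Twisted's actual Deferred): a normal result runs only the callback side of a pair, a failure runs only the errback side, and a handler that raises switches the chain onto the error side.

```python
class ToyDeferred:
    """Toy model of a Deferred's callback/errback pairs (not Twisted's
    real implementation): within each pair, exactly one side runs."""

    def __init__(self):
        self.pairs = []

    def addCallbacks(self, callback, errback):
        self.pairs.append((callback, errback))

    def callback(self, result):
        self._run(result, failed=False)

    def errback(self, reason):
        self._run(reason, failed=True)

    def _run(self, value, failed):
        for cb, eb in self.pairs:
            try:
                value = eb(value) if failed else cb(value)
                failed = False   # a handler returning normally resumes the callback side
            except Exception as exc:
                value = exc
                failed = True    # an exception switches to the errback side

log = []
d = ToyDeferred()
d.addCallbacks(lambda r: log.append(("ok", r)), lambda f: log.append(("err", str(f))))
d.callback("a result")           # fires the callback, skips the errback

e = ToyDeferred()
e.addCallbacks(lambda r: log.append(("ok", r)), lambda f: log.append(("err", str(f))))
e.errback(ValueError("boom"))    # fires the errback, skips the callback
print(log)                       # [('ok', 'a result'), ('err', 'boom')]
```

The real Deferred adds much more (chained return values, Failure objects, pause/unpause), but this captures the either/or routing described in point 2.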

zg手册: twisted development (2) -- the Deferreds component

Submitted by 荒凉一梦 on 2019-11-27 04:25:12
Deferreds: asynchronous callback sequences. A Deferred is essentially a collection of callback functions; twisted provides a mechanism for deferred (delayed) invocation of functions. Internally, twisted uses Deferred objects to manage callback sequences: when the result of an asynchronous request arrives, the Deferred object invokes the functions in its callback chain. The following examples show how Deferreds are used and how they work. First, a simple example:

from twisted.internet import reactor, defer

def getDummyData(x):
    """Create a Deferred object and return it."""
    d = defer.Deferred()
    # Fire the Deferred's callback chain after 2 seconds, passing x * 3
    # to the first function in the chain.
    # reactor.callLater schedules a delayed call.
    reactor.callLater(2, d.callback, x * 3)
    return d

def printData(result):
    """Print the result."""
    print result

d = getDummyData(3)
# add a callback to the callback chain
d.addCallback(printData)
# stop the reactor loop (exit the process) after 4 seconds
reactor.callLater(4, reactor
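As a rough standard-library analogy to the example above (not Twisted; threading.Timer stands in for reactor.callLater, and a minimal hand-rolled class stands in for Deferred), the pattern is: return an object immediately, and fire its registered callbacks later when the result arrives.

```python
import threading

class DelayedResult:
    """Toy stand-in for a Deferred: callbacks registered now,
    fired later when callback() delivers the result."""

    def __init__(self):
        self.callbacks = []
        self.fired = threading.Event()

    def addCallback(self, fn):
        self.callbacks.append(fn)

    def callback(self, result):
        for fn in self.callbacks:
            fn(result)
        self.fired.set()

def getDummyData(x):
    d = DelayedResult()
    # Deliver x * 3 shortly, like reactor.callLater(2, d.callback, x * 3).
    threading.Timer(0.1, d.callback, args=(x * 3,)).start()
    return d

seen = []
d = getDummyData(3)      # returns immediately, result not ready yet
d.addCallback(seen.append)
d.fired.wait(timeout=2)  # block only so the script can observe the result
print(seen)              # [9]
```

Note the key difference from Twisted: here a background thread delivers the result, whereas Twisted's reactor does everything on one thread by multiplexing timers and I/O in its event loop.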

(Repost) Twisted, Part 8: A poetry download client using Deferred

Submitted by 拥有回忆 on 2019-11-27 04:24:58
Client 4.0. Now that we have some understanding of deferreds, we can rewrite our client with them. You can see the implementation in twisted-client-4/get-poetry.py. Here, get_poetry no longer needs callback and errback arguments; instead, it returns a new deferred to which the caller can add callbacks and errbacks as needed.

def get_poetry(host, port):
    """
    Download a poem from the given host and port. This function
    returns a Deferred which will be fired with the complete text of
    the poem or a Failure if the poem could not be downloaded.
    """
    d = defer.Deferred()
    from twisted.internet import reactor
    factory = PoetryClientFactory(d)
    reactor.connectTCP(host, port, factory)
    return d

Here the factory is initialized with a deferred rather than a callback/errback pair.

Scrapy crawl from script always blocks script execution after scraping

Submitted by 三世轮回 on 2019-11-27 04:24:38
I am following this guide http://doc.scrapy.org/en/0.16/topics/practices.html#run-scrapy-from-a-script to run scrapy from my script. Here is part of my script:

crawler = Crawler(Settings(settings))
crawler.configure()
spider = crawler.spiders.create(spider_name)
crawler.crawl(spider)
crawler.start()
log.start()
reactor.run()
print "It can't be printed out!"

It works as it should: it visits pages, scrapes the needed info, and stores the output JSON where I told it to (via FEED_URI). But when the spider finishes its work (I can see that by the item count in the output JSON), execution of my script doesn't resume. Probably it isn

Twisted DeferredList usage

Submitted by 南笙酒味 on 2019-11-27 04:24:34
DeferredList. Sometimes you want to be notified once all of several events have happened, rather than after each one, for example, when every connection in a list has closed. twisted.internet.defer.DeferredList does exactly that. To create a DeferredList from several Deferreds, simply pass it a list:

# Create a DeferredList
dl = defer.DeferredList([deferred1, deferred2, deferred3])

You can treat this DeferredList like an ordinary Deferred; you can call addCallbacks and so on. The DeferredList will fire its callback once all of the deferreds have completed. Example:

from twisted.internet import defer

def printResult(result):
    for (success, value) in result:
        if success:
            print 'Success', value
        else:
            print 'Failure', value.getErrorMessage()

deferred1 = defer.Deferred()
deferred2 = defer.Deferred()
deferred3 =
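The result shape printResult iterates over, one (success, value) pair per deferred in the original order, can be sketched with a toy pure-Python stand-in (not Twisted's DeferredList): it records one pair per slot and fires its completion handler only when the last result arrives.

```python
class ToyDeferredList:
    """Toy sketch of DeferredList's behavior: collect one
    (success, value) pair per deferred, fire when all are in."""

    def __init__(self, count, on_done):
        self.results = [None] * count
        self.remaining = count
        self.on_done = on_done

    def deliver(self, index, success, value):
        # In real Twisted this would be wired to each Deferred's
        # callback/errback; here we deliver results by hand.
        self.results[index] = (success, value)
        self.remaining -= 1
        if self.remaining == 0:
            self.on_done(self.results)

collected = []
dl = ToyDeferredList(3, collected.extend)
dl.deliver(0, True, "poem one")
dl.deliver(2, False, "timeout")   # nothing fires yet: one result still pending
dl.deliver(1, True, "poem two")   # last result in: on_done fires now
print(collected)  # [(True, 'poem one'), (True, 'poem two'), (False, 'timeout')]
```

Note that results are reported in the order the deferreds were listed, not the order in which they completed, which matches the (success, value) tuples printResult expects.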

ReactorNotRestartable - Twisted and scrapy

Submitted by 时光总嘲笑我的痴心妄想 on 2019-11-27 03:01:11
Question: Before you link me to other answers related to this, note that I've read them and am still a bit confused. Alright, here we go. I am creating a webapp in Django and importing the newest scrapy library to crawl a website. I am not using celery (I know very little about it, but saw it mentioned in other topics related to this). One of the URLs of our website, /crawl/, is meant to start the crawler running. It's the only URL on our site that requires scrapy. Here is the function which is