twisted

Autobahn: sending user-specific and broadcast messages from an external application

Deadly submitted on 2019-12-05 14:34:24
Question: I'm totally new to websockets, and I'm having a bit of trouble understanding how to interact with Python Autobahn / Twisted from another application; I cannot seem to find any useful examples. I have a Python application running that, on certain events, needs to send one of two types of messages. The first is a broadcast message to all users. The second type is to a single specific user. Using the following two examples I can receive messages and send a response. However I do not need to receive
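A common answer to this question is the client-registry pattern: the server factory tracks connected protocol instances so an external caller can broadcast to everyone or target one user. The sketch below models that pattern without Autobahn itself; `FakeClient` stands in for a `WebSocketServerProtocol`, and the names `BroadcastRegistry` and `user_id` are illustrative assumptions, not Autobahn API.

```python
# Minimal sketch of the client-registry pattern commonly used with
# Autobahn. FakeClient is a stand-in for a WebSocketServerProtocol;
# only its sendMessage signature mirrors Autobahn's.

class FakeClient:
    """Stand-in for an Autobahn WebSocketServerProtocol."""
    def __init__(self, user_id):
        self.user_id = user_id
        self.sent = []

    def sendMessage(self, payload, isBinary=False):
        self.sent.append(payload)

class BroadcastRegistry:
    def __init__(self):
        self.clients = {}          # user_id -> protocol instance

    def register(self, client):
        self.clients[client.user_id] = client

    def unregister(self, client):
        self.clients.pop(client.user_id, None)

    def broadcast(self, message):
        # send to every connected client
        for client in self.clients.values():
            client.sendMessage(message.encode("utf8"))

    def send_to(self, user_id, message):
        # send to a single specific user, if connected
        client = self.clients.get(user_id)
        if client is not None:
            client.sendMessage(message.encode("utf8"))

registry = BroadcastRegistry()
alice, bob = FakeClient("alice"), FakeClient("bob")
registry.register(alice)
registry.register(bob)
registry.broadcast("hello all")
registry.send_to("bob", "just for bob")
```

With real Autobahn, `register`/`unregister` would be called from the protocol's `onOpen`/`onClose`, and the external application would call `broadcast` or `send_to` via `reactor.callFromThread`.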

Understanding the reactor and Deferred in Twisted

岁酱吖の submitted on 2019-12-05 14:18:30
```python
from twisted.internet import reactor, defer

def getDummyData(inputData):
    """
    This function is a dummy which simulates a delayed result and
    returns a Deferred which will fire with that result. Don't try
    too hard to understand this.
    """
    print('getDummyData called')
    deferred = defer.Deferred()
    # simulate a delayed result by asking the reactor to fire the
    # Deferred in 2 seconds time with the result inputData * 3
    reactor.callLater(2, deferred.callback, inputData * 3)
    return deferred

def cbPrintData(result):
    """
    Data handling function to be added as a callback: handles the
    data by printing the
```
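The core idea in the excerpt above is that a Deferred holds a queue of callbacks which fire in order once a result arrives, each receiving the previous callback's return value. The toy class below illustrates just that chaining behaviour; it is a deliberately simplified stand-in, not `twisted.internet.defer.Deferred` itself (no errbacks, no reactor).

```python
# Toy illustration of the Deferred idea without the reactor:
# callbacks are queued, then run in order when the result arrives.

class ToyDeferred:
    def __init__(self):
        self._callbacks = []
        self._fired = False
        self._result = None

    def addCallback(self, fn):
        if self._fired:
            self._result = fn(self._result)   # already fired: run now
        else:
            self._callbacks.append(fn)
        return self

    def callback(self, result):
        # fire the chain; each callback receives the previous return value
        self._fired = True
        self._result = result
        for fn in self._callbacks:
            self._result = fn(self._result)

d = ToyDeferred()
results = []
d.addCallback(lambda r: r * 3)        # like getDummyData's inputData * 3
d.addCallback(results.append)         # like cbPrintData consuming it
d.callback(7)                         # like reactor.callLater firing it
```

In real Twisted, `reactor.callLater(2, deferred.callback, ...)` plays the role of the final `d.callback(7)` line: the reactor fires the chain later, and everything added with `addCallback` runs at that point.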

Twisted log level switch

荒凉一梦 submitted on 2019-12-05 13:36:54
Is there any way in Twisted to change the logging level of messages that should be logged? I am using three levels in the project:

```python
log.msg('agent nr.1 has free slots', logging.DEBUG)  # debug message
log.msg('agent nr.1 has free slots')                 # info message
log.err('agent nr.1 has free slots')                 # error message
```

And I configure logging this way:

```python
from twisted.python import log
from twisted.python.logfile import LogFile

logfile = LogFile("someFile.log", '/some/path/', rotateLength=1000,
                  maxRotatedFiles=100)
application.setComponent(log.ILogObserver, log.FileLogObserver(logfile).emit)
```

But I need to set which
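In Twisted's legacy log system, an observer is just a callable that receives an event dict, so level filtering is usually done by wrapping the real observer in one that drops events below a threshold. The sketch below shows that wrapper in plain Python; the `logLevel` key is an assumption about how levels are attached (e.g. `log.msg(..., logLevel=logging.DEBUG)` rather than the positional form in the question), so check your Twisted version's documentation.

```python
# Sketch of a level-filtering observer in the style of Twisted's
# legacy log system: events are dicts, and a wrapper observer drops
# anything below a threshold before delegating to the real observer.
import logging

def make_filtering_observer(inner, minimum=logging.INFO):
    def observer(event_dict):
        level = event_dict.get("logLevel", logging.INFO)
        if event_dict.get("isError"):
            level = logging.ERROR      # log.err events count as errors
        if level >= minimum:
            inner(event_dict)
    return observer

captured = []
obs = make_filtering_observer(captured.append, minimum=logging.INFO)
obs({"message": "debug detail", "logLevel": logging.DEBUG})  # dropped
obs({"message": "free slots"})                               # info, kept
obs({"message": "boom", "isError": True})                    # error, kept
```

With real Twisted you would wrap `log.FileLogObserver(logfile).emit` with such a filter before passing it to `setComponent`. (Newer Twisted versions have `twisted.logger` with built-in `LogLevel` filtering, which is the cleaner route if you can migrate.)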

Python OSX $ which Python gives /Library/Frameworks/Python.framework/Versions/2.7/bin/python

大城市里の小女人 submitted on 2019-12-05 13:13:16
Hello, I'm trying to run Twisted with Python, but Python cannot find twisted. I ran $ pip install twisted successfully, but it is still not available: ImportError: No module named twisted.internet.protocol. It seems that most people's $ which python reports /usr/local/bin/python, but I get /Library/Frameworks/Python.framework/Versions/2.7/bin/python. Could this be the issue? If so, how can I change the PATH environment variable?

It is just fine. Python may be installed in multiple places on your computer. When you get a new Mac, the default python directory may be '/usr/bin/python2.7'. You may also have a
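The usual cause here is that `pip` installed twisted into a different Python than the one first on your PATH. A quick diagnostic is to ask the running interpreter where it lives and whether it can see the package; run this with each interpreter you have (e.g. `python check.py` vs. `/usr/local/bin/python check.py`, filename hypothetical) and compare.

```python
# Diagnostic: which interpreter is running, and can it see twisted?
# If `pip install twisted` targeted a different Python than the one
# on your PATH, the two outputs below will disagree across interpreters.
import sys
import importlib.util

interpreter = sys.executable                 # interpreter actually in use
spec = importlib.util.find_spec("twisted")   # None if not importable here
twisted_location = spec.origin if spec else None

print("interpreter:", interpreter)
print("twisted found at:", twisted_location)
```

If the interpreter that has twisted is not the one `which python` resolves to, either invoke that interpreter explicitly, reorder PATH in your shell profile, or reinstall with `python -m pip install twisted` so pip and python are guaranteed to match.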

Python Scrapy - mimetype based filter to avoid non-text file downloads

孤者浪人 submitted on 2019-12-05 10:59:58
I have a running Scrapy project, but it is bandwidth-intensive because it tries to download a lot of binary files (zip, tar, mp3, etc.). I think the best solution is to filter the requests based on the mimetype (Content-Type:) HTTP header. I looked at the Scrapy code and found this setting:

```python
DOWNLOADER_HTTPCLIENTFACTORY = 'scrapy.core.downloader.webclient.ScrapyHTTPClientFactory'
```

I changed it to:

```python
DOWNLOADER_HTTPCLIENTFACTORY = 'myproject.webclients.ScrapyHTTPClientFactory'
```

And played a little with ScrapyHTTPPageGetter; here are the edits highlighted: class ScrapyHTTPPageGetter
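Whatever hook you attach it to (a custom client factory as above, or more conventionally a downloader middleware's `process_response`), the mimetype check itself is small. The sketch below shows just that check; the allowed set is an illustrative assumption, and the Scrapy middleware wiring is omitted.

```python
# Sketch of the Content-Type check: allow only text-ish responses.
# In Scrapy this logic would typically live in a downloader
# middleware's process_response (wiring omitted here).
ALLOWED_PREFIXES = ("text/", "application/xhtml", "application/xml",
                    "application/json")

def is_text_response(content_type):
    # the header may carry parameters, e.g. "text/html; charset=utf-8",
    # so strip everything after the first semicolon before matching
    mimetype = content_type.split(";")[0].strip().lower()
    return mimetype.startswith(ALLOWED_PREFIXES)

keep = is_text_response("text/html; charset=utf-8")   # True
drop = is_text_response("application/zip")            # False
```

Dropping a response in `process_response` (e.g. by raising `IgnoreRequest`) still downloads the body first; to save bandwidth for large files, filtering by URL extension before the request is sent, or issuing a HEAD request first, works better.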

Non-blocking server in Twisted

烂漫一生 submitted on 2019-12-05 10:21:20
I am building an application that needs to run a TCP server on a thread other than the main one. When trying to run the following code:

```python
reactor.listenTCP(ServerConfiguration.tcpport, TcpCommandFactory())
reactor.run()
```

I get the following error:

exceptions.ValueError: signal only works in main thread

Can I run Twisted servers on threads other than the main one? Twisted can run in any thread, but only one thread at a time. If you want to run in a non-main thread, simply do reactor.run(installSignalHandlers=False). However, you cannot use a reactor on a non-main thread to spawn subprocesses,
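The error comes from CPython's rule that signal handlers may only be installed from the main thread, which `reactor.run()` tries to do by default. The sketch below shows the shape of the fix from the answer; a stand-in function replaces `reactor.run` so the threading pattern is testable without Twisted installed.

```python
# Pattern from the answer: start the event loop in a worker thread
# with signal handling disabled, since only the main thread may
# install signal handlers. run_loop is a stand-in for
# reactor.run(installSignalHandlers=False).
import threading

events = []

def run_loop(installSignalHandlers=True):
    # a real reactor.run would block here servicing events
    events.append(("started",
                   installSignalHandlers,
                   threading.current_thread() is threading.main_thread()))

t = threading.Thread(target=run_loop,
                     kwargs={"installSignalHandlers": False})
t.start()
t.join()
```

With real Twisted, remember that all reactor calls from other threads must then go through `reactor.callFromThread`, since the reactor is not thread-safe.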

How to make a Twisted Python client with readline functionality

倖福魔咒の submitted on 2019-12-05 08:26:49
I'm trying to write a client for a simple TCP server using Python Twisted. I'm pretty new to Python and have just started looking at Twisted, so I could be doing it all wrong. The server is simple and you're intended to use nc or telnet; there is no authentication. You just connect and get a simple console. I'd like to write a client that adds some readline functionality (history and emacs-like ctrl-a/ctrl-e are what I'm after). Below is code I've written that works just as well as using netcat from the command line like this: nc localhost 4118

from twisted.internet import reactor, protocol,
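The history part of readline is the easy half to reason about: a list of past lines plus a cursor that up/down arrows move through. The toy class below models only that behaviour; a real client would get it (and ctrl-a/ctrl-e editing) from the stdlib `readline` module or a curses/urwid front end rather than hand-rolling it.

```python
# Toy sketch of line-history recall (the up/down-arrow behaviour
# being asked for). Not a substitute for the stdlib readline module.
class LineHistory:
    def __init__(self):
        self.lines = []
        self.cursor = 0            # index into history while browsing

    def add(self, line):
        self.lines.append(line)
        self.cursor = len(self.lines)   # reset to "past the end"

    def previous(self):
        # up arrow: step back through history
        if self.cursor > 0:
            self.cursor -= 1
        return self.lines[self.cursor] if self.lines else ""

    def next(self):
        # down arrow: step forward; an empty string past the newest entry
        if self.cursor < len(self.lines):
            self.cursor += 1
        if self.cursor == len(self.lines):
            return ""
        return self.lines[self.cursor]

h = LineHistory()
h.add("help")
h.add("status")
```

The harder half is getting keystrokes before Enter; that requires putting the terminal in raw/cbreak mode, which is why most Twisted clients delegate line editing to `readline` running in a thread, or to a terminal library.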

Python Twisted integration with Cmd module

拥有回忆 submitted on 2019-12-05 08:20:31
I like Python's Twisted and Cmd, and I want to use them together. I got some things working, but so far I haven't figured out how to make tab-completion work, because I don't see how to receive tab keypress events right away (without pressing Enter) in Twisted's LineReceiver. Here's my code so far:

```python
#!/usr/bin/env python
from cmd import Cmd
from twisted.internet import reactor
from twisted.internet.stdio import StandardIO
from twisted.protocols.basic import LineReceiver

class CommandProcessor(Cmd):
    def do_EOF(self, line):
        return True

class LineProcessor(LineReceiver):
    from os import linesep as
```
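The reason TAB never arrives is that LineReceiver buffers input until it sees a line delimiter. Switching to raw mode (`setRawMode()` in Twisted, which routes bytes to `rawDataReceived`) delivers every byte as it arrives, so a TAB can trigger completion immediately. The stand-in below models that `rawDataReceived` behaviour without Twisted itself.

```python
# Sketch of why LineReceiver can't see TAB: it buffers until a line
# delimiter. A raw-mode receiver gets every byte as it arrives, so
# TAB can fire completion at once. This class only mimics the
# rawDataReceived hook; it is not a Twisted protocol.
TAB = b"\t"

class RawKeyReceiver:
    def __init__(self):
        self.buffer = b""
        self.tab_events = 0

    def rawDataReceived(self, data):
        # iterate byte by byte, as raw mode delivers arbitrary chunks
        for byte in (data[i:i + 1] for i in range(len(data))):
            if byte == TAB:
                self.tab_events += 1      # fire completion immediately
            else:
                self.buffer += byte

r = RawKeyReceiver()
r.rawDataReceived(b"he\tlp")
```

Note that even with raw mode on the protocol side, the terminal itself must be switched out of line-buffered (canonical) mode, e.g. via the stdlib `tty`/`termios` modules, or the OS will still hold bytes until Enter.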

Socket.IO vs. Twisted [closed]

試著忘記壹切 submitted on 2019-12-05 08:09:58
Closed. This question is opinion-based and is not currently accepting answers. Closed 6 years ago.

My idea is to build a simple chat application for iOS and Android. In any case, my question is related to the server side. From what I've read, the best option for a chat application is to build on a socket. As for the database, my intention is to use MySQL, which may also be important to take into account when choosing one of the possibilities. My question is, in

Scrapy: how to debug scrapy lost requests

夙愿已清 submitted on 2019-12-05 06:43:33
I have a Scrapy spider, but sometimes it doesn't return requests. I found that out by adding log messages before yielding the request and after getting the response. The spider iterates over pages and parses a link for item scraping on each page. Here is a part of the code:

```python
class SampleSpider(BaseSpider):
    ....
    def parse_page(self, response):
        ...
        request = Request(target_link, callback=self.parse_item_general)
        request.meta['date_updated'] = date_updated
        self.log('parse_item_general_send {url}'.format(url=request.url),
                 level=log.INFO)
        yield request

    def parse_item_general(self, response):
        self.log('parse_item
```
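The log-before/log-after approach above amounts to diffing the set of requests sent against the set of responses parsed; whatever is left over was dropped somewhere in between (in Scrapy, commonly by the dupefilter or the offsite middleware). The sketch below shows that bookkeeping in plain Python; the URLs are illustrative.

```python
# Sketch of the debugging idea: record every request yielded and
# every response parsed, then diff the two sets to find requests
# that never came back. In Scrapy these are often dupefilter drops
# (fix: Request(..., dont_filter=True)) or offsite-filtered URLs.
sent, received = set(), set()

def log_sent(url):
    sent.add(url)        # call where the spider yields the Request

def log_received(url):
    received.add(url)    # call at the top of the parse callback

log_sent("http://example.com/item/1")
log_sent("http://example.com/item/2")
log_received("http://example.com/item/1")

lost = sent - received   # requests that never produced a response
```

Enabling `DUPEFILTER_DEBUG = True` (and checking the stats for `dupefilter/filtered`) then confirms whether the lost URLs were filtered as duplicates.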