Tokenization in Python 3.x
I have the following code in Python 2.x:
from StringIO import StringIO
from tokenize import tokenize

class _CHAIN(object):
    def __init__(self, execution_context=None):
        self.execution_context = execution_context

    def eat(self, toktype, tokval, rowcol, line, logical_line):
        # some code and error checking
        pass

operations = _CHAIN(execution_context)
tokenize(StringIO(somevalue).readline, operations.eat)
Now the problem is that in Python 3.x the second argument does not exist. I need the function operations.eat() to be called during tokenization. How can I perform the above task in Python 3.x? One idea is to directly call the function tokenize.eat() before the 'tokenize' statement (the last line of the code), but I am not sure about the arguments to pass. I'm sure there must be a better way to do it.
Comments (3)
You're using a slightly odd legacy system, where you pass the function an iterable along with a callable which can accept the tokens. The new way is conceptually simpler, and works in both Python 2 and 3:
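A minimal sketch of that approach with tokenize.generate_tokens(), assuming somevalue and operations are defined as in the question (on Python 2 you would import StringIO from the StringIO module instead of io):

from io import StringIO
from tokenize import generate_tokens

# generate_tokens() yields one 5-tuple per token instead of invoking a
# callback, so the old tokeneater argument becomes a plain loop.
for toktype, tokval, start, end, logical_line in generate_tokens(StringIO(somevalue).readline):
    operations.eat(toktype, tokval, start, end, logical_line)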
This is technically undocumented for Python 3, but unlikely to be taken away. The official tokenize function in Python 3 expects bytes, rather than strings. There's a request for an official API to tokenize strings, but it seems to have stalled.
According to http://docs.python.org/py3k/library/tokenize.html, you should now use tokenize.tokenize(readline):
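A short sketch of that call, again assuming the somevalue and operations from the question; note that tokenize.tokenize() wants a readline that returns bytes, so a str source has to be encoded first:

from io import BytesIO
from tokenize import tokenize

# tokenize() yields TokenInfo named tuples (type, string, start, end, line);
# the first one is an ENCODING token reporting the detected source encoding.
for tok in tokenize(BytesIO(somevalue.encode('utf-8')).readline):
    operations.eat(tok.type, tok.string, tok.start, tok.end, tok.line)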