Changeset 391 for python/trunk/Doc/library/tokenize.rst
- Timestamp: Mar 19, 2014, 11:31:01 PM
- Location: python/trunk
- Files: 2 edited
python/trunk
  Property svn:mergeinfo set to:
    /python/vendor/Python-2.7.6  merged eligible
    /python/vendor/current      merged eligible
python/trunk/Doc/library/tokenize.rst
r2 → r391:

-
 :mod:`tokenize` --- Tokenizer for Python source
 ===============================================
…
 .. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>

+**Source code:** :source:`Lib/tokenize.py`
+
+--------------

 The :mod:`tokenize` module provides a lexical scanner for Python source code,
…
 well, making it useful for implementing "pretty-printers," including colorizers
 for on-screen displays.
+
+To simplify token stream handling, all :ref:`operators` and :ref:`delimiters`
+tokens are returned using the generic :data:`token.OP` token type. The exact
+type can be determined by checking the token ``string`` field on the
+:term:`named tuple` returned from :func:`tokenize.tokenize` for the character
+sequence that identifies a specific operator token.

 The primary entry point is a :term:`generator`:
…
    :meth:`readline` method of built-in file objects (see section
    :ref:`bltin-file-objects`).  Each call to the function should return one line
-   of input as a string.
+   of input as a string. Alternately, *readline* may be a callable object that
+   signals completion by raising :exc:`StopIteration`.

 The generator produces 5-tuples with these members: the token type; the token
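As a side note (not part of the changeset), the behaviour documented by the added
paragraphs can be sketched in a few lines. The sketch below assumes the module's
``generate_tokens()`` generator and an ``io.StringIO`` object as the *readline*
supplier; the sample source string is made up for illustration. It shows that
every operator and delimiter arrives with the generic ``token.OP`` type, and that
the exact operator is recovered from the token's ``string`` field::

   import io
   import token
   import tokenize

   # Made-up sample source; any Python source string would do.
   source = u"x = (1 + 2) * 3\n"

   # io.StringIO(...).readline returns one line of input per call, which is
   # the readline interface the generator expects.
   tokens = tokenize.generate_tokens(io.StringIO(source).readline)

   for tok_type, tok_string, start, end, line in tokens:
       # Operators and delimiters all share the generic OP token type; the
       # specific operator is identified by the token's string field.
       if tok_type == token.OP:
           print("OP token %r starting at row/col %s" % (tok_string, start))

Running the sketch reports the ``=``, ``(``, ``+``, ``)`` and ``*`` tokens, each
tagged with the generic OP type.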