OpenQuizz
An application for managing educational content
Public Member Functions

def __init__ (self, environment)
def tokenize (self, source, name=None, filename=None, state=None)
def wrap (self, stream, name=None, filename=None)
def tokeniter (self, source, name, filename=None, state=None)

Data Fields

lstrip_unless_re
newline_sequence
keep_trailing_newline
rules
Class that implements a lexer for a given environment. It is created automatically by the environment class, so you usually do not have to instantiate it yourself. Note that the lexer is not automatically bound to an environment; multiple environments can share the same lexer.
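The signatures on this page match the Jinja2 lexer API, so a minimal usage sketch follows; the `jinja2` import path and the names `env`, `other`, and `lexer` are assumptions for illustration, since the environment normally builds and caches the lexer for you.

    from jinja2 import Environment

    # The environment creates the lexer automatically; read it as a property.
    env = Environment()
    lexer = env.lexer

    # Lexers are cached by their settings, so another environment configured
    # the same way can share the exact same lexer instance (assumption based
    # on the "multiple environments can share the same lexer" note above).
    other = Environment()
    print(other.lexer is lexer)  # typically True for identical settings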
def __init__ (self, environment)
def tokeniter (self, source, name, filename=None, state=None)
This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.
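A hedged sketch of iterating the raw tokens with tokeniter; the template source, the name "example", and the assumption that each yielded item is a (lineno, token_type, value) tuple come from the Jinja2-style API rather than from this page.

    from jinja2 import Environment

    env = Environment()
    source = "Hello {{ name }}!"

    # Walk the raw token tuples without wrapping them in a token stream.
    for lineno, token_type, value in env.lexer.tokeniter(source, name="example"):
        print(lineno, token_type, repr(value))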
def tokenize (self, source, name=None, filename=None, state=None)
Calls `tokeniter` and wraps the resulting token generator in a token stream via `wrap`.
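A sketch of tokenize under the same assumptions; the returned stream is expected to yield Token objects carrying a line number, a type, and an already-converted value.

    from jinja2 import Environment

    env = Environment()
    source = "Hello {{ name }}!"

    # Tokenize the source and iterate the resulting token stream.
    stream = env.lexer.tokenize(source, name="example")
    for token in stream:
        print(token.lineno, token.type, token.value)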
def wrap (self, stream, name=None, filename=None)
This is called with the raw stream as returned by `tokeniter`; it wraps every token in a :class:`Token` and converts the value.
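A sketch of feeding tokeniter output through wrap by hand, which is what tokenize is described as doing internally; the names and the tuple layout are again assumptions based on the Jinja2 lexer.

    from jinja2 import Environment

    env = Environment()
    source = "Hello {{ name }}!"

    # Wrap the raw tuples yielded by tokeniter into Token objects.
    raw = env.lexer.tokeniter(source, name="example")
    for token in env.lexer.wrap(raw, name="example"):
        print(token.type, token.value)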
keep_trailing_newline
lstrip_unless_re
newline_sequence
rules
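These fields mirror the whitespace and newline settings of the environment that built the lexer. A hedged sketch, assuming Jinja2-style constructor options (newline_sequence, keep_trailing_newline, lstrip_blocks); the printed values are illustrative, not guaranteed.

    from jinja2 import Environment

    env = Environment(newline_sequence="\n",
                      keep_trailing_newline=True,
                      lstrip_blocks=True)
    lexer = env.lexer

    print(lexer.newline_sequence)       # "\n"
    print(lexer.keep_trailing_newline)  # True
    print(lexer.lstrip_unless_re)       # compiled regex when lstrip_blocks is set, else None (assumption)
    print(lexer.rules)                  # mapping of lexer states to their token rules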