
Searched refs:new_tokenizer (Results 1 – 1 of 1) sorted by relevance

/linux/tools/lib/python/kdoc/c_lex.py
344 def groups(self, new_tokenizer): argument
368 tokens = new_tokenizer.tokens
452 def tokens(self, new_tokenizer): argument
453 level, groups = self.groups(new_tokenizer)
592 new_tokenizer = CTokenizer(tokenizer.tokens[start:end + 1])
595 yield new_tokenizer
597 yield str(new_tokenizer)
627 new_tokenizer = CTokenizer()
638 new_tokenizer.tokens += tokenizer.tokens[pos:start]
642 new_tokenizer.tokens += args_match.tokens(new)
[all …]
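The matches above share one pattern: a sub-tokenizer is built by slicing an inclusive token range out of an existing tokenizer (`CTokenizer(tokenizer.tokens[start:end + 1])`), and tokens are appended across tokenizers via `+=` on the `.tokens` list. A minimal sketch of that pattern follows; the `CTokenizer` class and `split_group` helper here are simplified stand-ins, not the kernel's actual kdoc implementation.

```python
class CTokenizer:
    """Minimal stand-in for the CTokenizer referenced in c_lex.py.

    The real class in the kernel's kdoc tooling is richer; this model
    keeps only the token list, which is all the slicing pattern needs.
    """

    def __init__(self, tokens=None):
        self.tokens = list(tokens) if tokens is not None else []

    def __str__(self):
        # Join tokens with spaces for a rough textual rendering.
        return " ".join(str(t) for t in self.tokens)


def split_group(tokenizer, start, end):
    """Hypothetical helper: build a sub-tokenizer from an inclusive
    token slice, mirroring the tokens[start:end + 1] usage in the
    search results."""
    return CTokenizer(tokenizer.tokens[start:end + 1])


tok = CTokenizer(["struct", "foo", "{", "int", "a", ";", "}"])
sub = split_group(tok, 2, 6)   # tokens at indices 2..6 inclusive
print(str(sub))                # -> "{ int a ; }"
```

Note the `end + 1` in the slice: Python slices exclude the upper bound, so an inclusive token range `[start, end]` requires extending the slice by one.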