
Searched refs:tokenizer (Results 1 – 3 of 3) sorted by relevance

/linux/tools/lib/python/kdoc/
c_lex.py
508 def _search(self, tokenizer):
535 for i, tok in enumerate(tokenizer.tokens):
571 s = str(tokenizer)
574 yield start, len(tokenizer.tokens)
585 tokenizer = source
588 tokenizer = CTokenizer(source)
591 for start, end in self._search(tokenizer):
592 new_tokenizer = CTokenizer(tokenizer.tokens[start:end + 1])
618 tokenizer = source
621 tokenizer = CTokenizer(source)
[all …]
/linux/Documentation/tools/
kdoc_ancillary.rst
24 C tokenizer
/linux/tools/unittests/
test_tokenizer.py
51 tokenizer = CTokenizer(data["source"])
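
The c_lex.py hits above (lines 585–592 and 618–621) show a dual-input pattern: `CTokenizer` is constructed either from raw source or from an existing token list, so a match span can be re-wrapped as a new tokenizer without re-lexing. Below is a minimal toy sketch of that pattern; this `CTokenizer` is a stand-in written for illustration, not the kernel's kdoc implementation, and its naive regex lexer is an assumption.

```python
import re

class CTokenizer:
    """Toy stand-in for the CTokenizer seen in the search hits.

    Accepts either a raw source string or an already-built token list
    (e.g. a slice of another tokenizer's .tokens), mirroring the
    re-wrapping pattern at c_lex.py line 592.
    """

    def __init__(self, source):
        if isinstance(source, list):
            # Reuse pre-built tokens instead of re-lexing.
            self.tokens = source
        else:
            # Naive lexing: identifiers/numbers, then single punctuation chars.
            self.tokens = re.findall(r"\w+|[^\w\s]", source)

    def __str__(self):
        return " ".join(self.tokens)

full = CTokenizer("int foo(void);")
sub = CTokenizer(full.tokens[0:2])  # re-wrap a token slice as a new tokenizer
```

Here `str(full)` yields `int foo ( void ) ;` and `str(sub)` yields `int foo`, showing that the slice-backed tokenizer behaves like one built from source.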