Searched refs: new_tokenizer (Results 1 – 1 of 1) sorted by relevance

344  def groups(self, new_tokenizer): argument
368      tokens = new_tokenizer.tokens
452  def tokens(self, new_tokenizer): argument
453      level, groups = self.groups(new_tokenizer)
592      new_tokenizer = CTokenizer(tokenizer.tokens[start:end + 1])
595      yield new_tokenizer
597      yield str(new_tokenizer)
627      new_tokenizer = CTokenizer()
638      new_tokenizer.tokens += tokenizer.tokens[pos:start]
642      new_tokenizer.tokens += args_match.tokens(new)
[all …]