Struct pomelo_parse::Parser
pub struct Parser<'a> {
current_pos: usize,
tokens: Vec<Token<'a>>,
errors: Vec<Error>,
builder: NodeBuilder,
}
The main interface for manipulating the lexed source.
This handles iterating over the lexed tokens as well as building the syntax tree (see rowan::GreenNodeBuilder for details).
Fields
current_pos: usize
tokens: Vec<Token<'a>>
Tokens are stored in reverse order.
errors: Vec<Error>
builder: NodeBuilder
Implementations
impl<'a> Parser<'a>
pub fn new(src: &'a str) -> Self
Create a new Parser over the given source string.
pub fn parse(self) -> SyntaxTree
Parse an entire source file.
pub fn parse_expr(self) -> SyntaxTree
Parse a single expression.
pub fn parse_pat(self) -> SyntaxTree
Parse a single pattern.
pub fn parse_type(self) -> SyntaxTree
Parse a single type.
pub fn parse_dec(self) -> SyntaxTree
Parse a single declaration.
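For example, parsing takes just a couple of calls. A minimal sketch of the public entry points (the SML snippets are arbitrary examples):

use pomelo_parse::Parser;

fn main() {
    // Parse an entire source file into a SyntaxTree.
    let _tree = Parser::new("val x = 1").parse();

    // Or parse just a single expression.
    let _expr_tree = Parser::new("1 + 2").parse_expr();
}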
impl<'a> Parser<'a>
pub(crate) fn parse_inner<F>(self, f: F) -> SyntaxTree
where
    F: FnMut(&mut Parser<'_>),
Parse according to a specified parsing function f.
This is here for testing purposes.
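A sketch of how a test inside the crate might use this. The name grammar::expr is hypothetical; it stands in for whatever crate-internal parsing function is under test:

#[test]
fn parse_inner_smoke_test() {
    // f can be any FnMut(&mut Parser<'_>).
    let _tree = Parser::new("1 + 2").parse_inner(|p| {
        crate::grammar::expr(p); // hypothetical parsing function
    });
}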
pub(crate) fn peek(&self) -> SyntaxKind
Peek at the kind of the next token.
pub(crate) fn peek_nth(&self, n: usize) -> SyntaxKind
Peek ahead n tokens.
pub(crate) fn peek_next_nontrivia(&self, skip: usize) -> SyntaxKind
Skips past skip nontrivia tokens, then peeks at the kind of the next one.
pub(crate) fn peek_token(&self) -> Option<&Token<'_>>
Peek at the next token.
pub(crate) fn peek_token_next_nontrivia(&self, skip: usize) -> Option<&Token<'_>>
Peeks past skip nontrivia tokens, then peeks at the next token.
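Together, the peek methods support arbitrary lookahead without consuming anything. A minimal sketch, assuming hypothetical SyntaxKind variants (FUN_KW, IDENT) and that skip = 0 yields the first nontrivia token:

fn at_fun_binding(p: &Parser<'_>) -> bool {
    // Is the next nontrivia token "fun", followed by an identifier?
    p.peek_next_nontrivia(0) == SyntaxKind::FUN_KW
        && p.peek_next_nontrivia(1) == SyntaxKind::IDENT
}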
pub(crate) fn eat(&mut self, kind: SyntaxKind) -> bool
If kind matches the next token's kind, consumes the token and returns true. Otherwise, returns false.
pub(crate) fn eat_any(&mut self) -> SyntaxKind
Consume the next token, regardless of its kind.
pub(crate) fn eat_trivia(&mut self) -> bool
While the next token is trivia, consume it.
pub(crate) fn eat_through_trivia(&mut self, kind: SyntaxKind) -> bool
If the next nontrivia token matches kind, consumes it (and any leading trivia) and returns true. Otherwise, returns false.
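A minimal sketch of how these compose, using the EQ kind mentioned under is_vid below:

fn eat_equals(p: &mut Parser<'_>) -> bool {
    // Two-step version: drop leading trivia unconditionally, then
    // conditionally consume the "=".
    p.eat_trivia();
    p.eat(SyntaxKind::EQ)
}

fn eat_equals_alt(p: &mut Parser<'_>) -> bool {
    // One-step version; per the docs, trivia is only consumed when the
    // next nontrivia token really is an "=".
    p.eat_through_trivia(SyntaxKind::EQ)
}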
pub(crate) fn expect(&mut self, kind: SyntaxKind)
Consumes the next token if it matches kind; otherwise, generates an error.
pub(crate) fn error(&mut self, msg: impl Into<String> + Clone)
Append a new error to the stored list of errors.
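A sketch of error-tolerant consumption, assuming hypothetical L_PAREN and R_PAREN kinds:

fn parse_paren_group(p: &mut Parser<'_>) {
    // expect records an error itself if "(" is missing.
    p.expect(SyntaxKind::L_PAREN);

    // ... parse the contents ...

    // Or check manually and report a custom message.
    if !p.eat(SyntaxKind::R_PAREN) {
        p.error("expected a closing parenthesis");
    }
}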
pub(crate) fn eat_mapped(&mut self, kind: SyntaxKind) -> SyntaxKind
Consumes the next token but remaps its SyntaxKind to kind.
pub(crate) fn is_eof(&self) -> bool
Check if the parser is at the end of the token stream.
pub(crate) fn is_vid(&self) -> bool
Check if the current token is a valid VId.
With the current lexing strategy, correct symbolic identifiers are caught at the lexing stage. Thus, the only special one we need to check for is EQ.
However, it is unclear whether this is a good strategy for error resilience. The lexer obviously has less context than the parser for deciding what to do when there is an error. This may be a general issue with gluing tokens together at lex time (e.g., "=>" as THICK_ARROW, "…" as ELLIPSIS, etc.).
pub(crate) fn next_nontrivia_is_vid(&self) -> bool
Check if the next nontrivia token is a valid VId.
The same caveats as for is_vid apply.
fn pop(&mut self) -> Token<'a>
Pop the next token. Since tokens are stored in reverse order, this takes from the back of the token vector.
impl<'a> Parser<'a>
pub(crate) fn start_node(&mut self, kind: SyntaxKind) -> NodeGuard
Start a new node in the syntax tree.
Note that the returned NodeGuard will be dropped immediately if it is not bound to a variable.
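A minimal sketch (the TUPLE_EXP kind is an assumption). Binding the guard to a named variable keeps the node open; a plain let _ = ... would drop the guard, and hence close the node, immediately:

fn parse_tuple(p: &mut Parser<'_>) {
    // The node stays open for as long as _guard is alive.
    let _guard = p.start_node(SyntaxKind::TUPLE_EXP);
    p.expect(SyntaxKind::L_PAREN);
    // ... parse the tuple elements into the open node ...
    p.expect(SyntaxKind::R_PAREN);
    // _guard is dropped here, finishing the node.
}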
pub(crate) fn checkpoint(&self) -> Checkpoint
Set a checkpoint that can be used later to create a node earlier in the tree.
This is essentially a form of lookahead: parse ahead first, then decide afterwards whether what was parsed should be wrapped in an enclosing node. See the sketch after start_node_at below.
pub(crate) fn start_node_at(
    &mut self,
    checkpoint: Checkpoint,
    kind: SyntaxKind,
) -> NodeGuard
Use a previously set checkpoint to create a new node at its position (higher up in the tree).
Note that the returned NodeGuard will be dropped immediately if it is not bound to a variable.
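A sketch of the checkpoint pattern (parse_atomic_expr, INFIX_EXP, and PLUS are hypothetical): parse a left operand first, and only if an operator follows, retroactively wrap it in an infix node starting at the checkpoint:

fn parse_maybe_infix(p: &mut Parser<'_>) {
    let checkpoint = p.checkpoint();
    parse_atomic_expr(p); // hypothetical helper for atomic expressions

    if p.peek_next_nontrivia(0) == SyntaxKind::PLUS {
        // Wrap everything parsed since the checkpoint in a new INFIX_EXP
        // node, then continue adding children to it.
        let _guard = p.start_node_at(checkpoint, SyntaxKind::INFIX_EXP);
        p.eat_through_trivia(SyntaxKind::PLUS);
        parse_atomic_expr(p);
    }
}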
pub fn push_token(&mut self, token: Token<'_>)
Add a token to the current node.