Struct pomelo_parse::Parser

pub struct Parser<'a> {
    current_pos: usize,
    tokens: Vec<Token<'a>>,
    errors: Vec<Error>,
    builder: NodeBuilder,
}

The main interface for manipulating the lexed source.

This handles iterating over the lexed tokens as well as building the syntax tree (see rowan::GreenNodeBuilder for details).

Fields§

current_pos: usize

tokens: Vec<Token<'a>>

Tokens are stored in reverse order.

errors: Vec<Error>

builder: NodeBuilder

Implementations§

Parse an entire source file.

Parse a single expression.

Parse a single pattern.

Parse a single type.

Parse a single declaration.

Parse according to a specified parsing function f.

This is here for testing purposes.
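
For orientation, a hypothetical sketch of how these entry points might be driven. The constructor name (Parser::new) and the method names (parse, parse_expr, test) are assumptions for illustration and are not confirmed by this page.

// Hypothetical usage sketch; `Parser::new`, `parse`, `parse_expr`, and
// `test` are assumed names.
use pomelo_parse::Parser;

fn parse_file(src: &str) {
    let mut parser = Parser::new(src);   // assumed constructor
    let _tree = parser.parse();          // parse an entire source file
}

fn parse_one_expr() {
    let mut parser = Parser::new("1 + 2");
    let _expr = parser.parse_expr();     // parse a single expression
}

// Drive a single grammar rule directly via the testing hook.
fn check_rule() {
    fn my_rule(p: &mut Parser) {
        // a grammar rule written against the Parser API
        let _ = p;
    }
    let mut parser = Parser::new("val x = 3");
    let _out = parser.test(my_rule);
}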

Peek at the kind of the next token.

Peek ahead n tokens.

Skips past the first skip nontrivia tokens, then peeks at the kind of the next one.

Peek at the next token.

Peek at the text of the next token.

Peeks past the first skip nontrivia tokens, then peeks at the next token.
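
As an illustration of the peek family, a hypothetical lookahead-driven dispatch. The method names (peek, peek_next_nontrivia, error) and the SyntaxKind variants are assumptions; Parser and SyntaxKind are assumed to be in scope.

// Hypothetical lookahead-driven dispatch; method names and SyntaxKind
// variants are assumed for illustration.
fn declaration(p: &mut Parser) {
    match p.peek() {
        SyntaxKind::VAL_KW => { /* parse "val ..." */ }
        SyntaxKind::FUN_KW => { /* parse "fun ..." */ }
        _ => {
            // Look past leading trivia (whitespace, comments) before
            // deciding how to report the problem.
            let _next = p.peek_next_nontrivia(0);
            p.error("expected a declaration");
        }
    }
}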

If kind matches the next token kind, consumes the token and returns true. Else, returns false.

Consume the next token, regardless of its kind.

While the next token is trivia, consume it.

If the next nontrivia token matches kind, consume it (and any leading trivia) and return true. Else returns false.

Consume the next token if it matches kind, else generate an error.

Append a new error to the stored list of errors.

Consume the next token but remap its SyntaxKind to be kind.
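
A hypothetical grammar rule built from the consuming helpers described above; eat, expect, eat_trivia, and the SyntaxKind variants are assumed names, with Parser and SyntaxKind assumed in scope.

// Hypothetical rule for a parenthesized, comma-separated sequence;
// names are assumed for illustration.
fn paren_seq(p: &mut Parser) {
    p.expect(SyntaxKind::L_PAREN);        // error if "(" is missing
    loop {
        p.eat_trivia();                   // drop whitespace/comments
        /* parse one element here */
        if !p.eat(SyntaxKind::COMMA) {    // consume "," if it is next
            break;
        }
    }
    p.expect(SyntaxKind::R_PAREN);        // error if ")" is missing
}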

Check if the current token is a valid VId.

With the current lexing strategy, correct symbolic identifiers are already caught at the lexing stage. Thus, the only special one we need to check for is EQ.

However, it is not clear whether this is a good strategy for error resilience. The lexer obviously has less context than the parser for deciding what to do when there is an error. This may be a general issue with gluing together tokens at lex time (like “=>” as THICK_ARROW, “…” as ELLIPSIS, etc.)


Check if the current token is a valid StrId.
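
The check described above might look roughly like this; the function shape and the variant names other than EQ are assumptions.

// Hypothetical shape of the VId check: symbolic identifiers are already
// classified by the lexer, so EQ is the only extra case to accept here.
// Method and variant names (other than EQ) are assumed.
fn next_is_vid(p: &Parser) -> bool {
    matches!(p.peek(), SyntaxKind::IDENT | SyntaxKind::EQ)
}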

Start a new node in the syntax tree.

Note that the returned NodeGuard will be dropped immediately if not bound to a variable.

Set a checkpoint that can be used later to create a node earlier in the tree.

This is essentially just lookahead (see the sketch below for an example).

Use a previously set checkpoint to create a new node at its position (higher up in the tree).

Note that the returned NodeGuard will be dropped immediately if not bound to a variable.

Add a token to the current node.
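
Below is a sketch of the rowan-style pattern these builder methods enable: a checkpoint recorded before the left operand lets a later infix node wrap everything already parsed. All method names (checkpoint, start_node, start_node_at, eat_any, peek) and SyntaxKind variants are assumptions, and the checkpoint handle is assumed to be Copy, as rowan's is.

// Hypothetical left-associative infix rule; names are assumed for
// illustration.
fn infix_expr(p: &mut Parser) {
    let checkpoint = p.checkpoint();      // remember where the LHS begins
    atom(p);                              // left operand

    while p.peek() == SyntaxKind::INFIX_OP {
        // Wrap everything parsed since the checkpoint in a new node.
        // The guard must be bound to a variable, or the node is closed
        // immediately when the guard is dropped.
        let _node = p.start_node_at(checkpoint, SyntaxKind::INFIX_EXPR);
        p.eat_any();                      // consume the operator
        atom(p);                          // right operand
    }
}

fn atom(p: &mut Parser) {
    let _node = p.start_node(SyntaxKind::ATOMIC_EXPR); // open a node
    p.eat_any();                          // add the token to it
}   // `_node` dropped here, closing the ATOMIC_EXPR node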

Trait Implementations§

Clone: returns a copy of the value; clone_from performs copy-assignment from source.

Debug: formats the value using the given formatter.
