How do you define a decimal number in a PEG grammar?


I have the following grammar:

Arithmetic:
    Term     < Factor (Add / Sub)*
    Add      < "+" Factor
    Sub      < "-" Factor
    Factor   < Primary (Mul / Div)*
    Mul      < "*" Primary
    Div      < "/" Primary
    Primary  < Parens / Neg / Pos / Number / Variable
    Parens   < "(" Term ")"
    Neg      < "-" Primary
    Pos      < "+" Primary

    Dot      < "."
    Decimal  < ~(digit+ Dot? digit*)
    Integer  < digits
    Number   < Integer / Decimal

    Function < identifier (space+ Primary)+
    Variable <- identifier

Almost everything works, but when I try to parse a decimal number (such as 0.5), it fails. What is the proper syntax to define a Decimal parser in PEG?

I am using the Pegged library in D; see the Pegged documentation for details.

CodePudding user response:

Since PEG alternatives are ordered, you need to write:

Number   < Decimal / Integer

As written, Integer / Decimal always succeeds on the Integer alternative: for an input like 0.5, Integer matches the leading 0, and because a PEG choice commits to the first alternative that succeeds, Decimal is never tried and the .5 is left unconsumed. Listing Decimal first lets the longer alternative match.
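
For reference, here is a minimal, self-contained sketch of exercising the reordered grammar with Pegged. The grammar string is the one from the question with Number fixed; the Function rule, which no other rule references, is omitted, and the test input 0.5 is just an illustration.

import std.stdio;
import pegged.grammar;

// Generate a parser named Arithmetic at compile time.
mixin(grammar(`
Arithmetic:
    Term     < Factor (Add / Sub)*
    Add      < "+" Factor
    Sub      < "-" Factor
    Factor   < Primary (Mul / Div)*
    Mul      < "*" Primary
    Div      < "/" Primary
    Primary  < Parens / Neg / Pos / Number / Variable
    Parens   < "(" Term ")"
    Neg      < "-" Primary
    Pos      < "+" Primary

    Dot      < "."
    # Decimal before Integer, so the longer match is tried first.
    Decimal  < ~(digit+ Dot? digit*)
    Integer  < digits
    Number   < Decimal / Integer

    Variable <- identifier
`));

void main()
{
    // With the reordered choice, a decimal literal now parses fully.
    auto tree = Arithmetic("0.5");
    writeln(tree.successful); // true
    writeln(tree);            // dumps the parse tree
}

Assuming Pegged is available (for example as a dub dependency), this compiles and prints true followed by the parse tree; swapping Number back to Integer / Decimal makes the parser stop after the leading 0, leaving .5 unconsumed.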
