An implementation of parser combinators for Rust, inspired by the Haskell library Parsec. As in Parsec, the parsers are LL(1) by default, but they can opt in to arbitrary lookahead using the `attempt` combinator.
```rust
extern crate combine;
use combine::{many1, Parser, sep_by};
use combine::parser::char::{letter, space};

// Construct a parser that parses *many* (and at least *1*) *letter*s
let word = many1(letter());

// Construct a parser that parses many *word*s where each word is *separated by* a (white)*space*
let mut parser = sep_by(word, space())
    // Combine can collect into any type implementing `Default + Extend` so we need to assist rustc
    // by telling it that `sep_by` should collect into a `Vec` and `many1` should collect into a `String`
    .map(|mut words: Vec<String>| words.pop());

let result = parser.parse("Pick up that word!");
// `parse` returns `Result` where `Ok` contains a tuple of the parser's output and any remaining input.
assert_eq!(result, Ok((Some("word".to_string()), "!")));
```
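Since the overview above mentions opting in to lookahead with `attempt`, here is a minimal sketch of two alternatives that share a prefix (this assumes a combine 3.x-style API; the keyword example itself is made up):

```rust
use combine::{attempt, Parser};
use combine::parser::char::string;

// On the input "in", `string("integer")` consumes "in" before failing, so
// without `attempt` the `or` branch would never get a chance to run.
let mut keyword = attempt(string("integer")).or(string("in"));
assert_eq!(keyword.parse("integer"), Ok(("integer", "")));
assert_eq!(keyword.parse("in"), Ok(("in", "")));
```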
Larger examples can be found in the examples, tests and benches folders.
A tutorial as well as explanations on what goes on inside combine can be found in the wiki.
- Parse arbitrary streams - Combine can parse anything from `&[u8]` and `&str` to iterators and `Read` instances. If none of the builtin streams fit your use case you can even implement a couple of traits yourself to create your own custom stream!
- Zero-copy parsing - When parsing in-memory data, combine can parse without copying. See the `range` module for parsers specialized for zero-copy parsing (a short sketch follows this list).
- Partial parsing - Combine parsers can be stopped at any point during parsing and later be resumed without losing any progress. This makes it possible to start parsing partial data coming from an I/O device such as a socket without worrying about whether enough data is present to complete the parse. If more data is needed the parser will stop and may be resumed at the same point once more data is available. See the async example for an example and this post for an introduction.
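As a rough illustration of the zero-copy parsing mentioned above, the sketch below borrows a slice of the input instead of collecting into an owned `String`. It assumes a combine 3.x-style API where `take_while1` is available in `combine::parser::range`:

```rust
use combine::Parser;
use combine::parser::range::take_while1;

// `take_while1` matches one or more tokens and returns them as a borrowed
// slice of the original input instead of allocating a new `String`.
let mut digits = take_while1(|c: char| c.is_ascii_digit());
assert_eq!(digits.parse("123 apples"), Ok(("123", " apples")));
```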
A parser combinator is, broadly speaking, a function which takes several parsers as arguments and returns a new parser, created by combining those parsers. For instance, the `many` parser takes one parser, `p`, as input and returns a new parser which applies `p` zero or more times. Thanks to the modularity that parser combinators give, it is possible to define parsers for a wide range of tasks without needing to implement the low-level plumbing, while still having the full power of Rust when you need it.
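To make that concrete, here is a small sketch of `many` turning a single-character parser into one that matches a whole run (the choice of `digit` and the result type annotation are just illustrative):

```rust
use combine::{many, Parser};
use combine::parser::char::digit;

// `many` applies `digit` zero or more times; the annotation on `result`
// tells combine to collect the matched characters into a `String`.
let mut number = many(digit());
let result: Result<(String, &str), _> = number.parse("42 apples");
assert_eq!(result, Ok(("42".to_string(), " apples")));
```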
The library adheres to semantic versioning.
If you end up trying it I welcome any feedback from your experience with it. I am usually reachable within a day by opening an issue, sending an email or posting a message on Gitter.
Since `combine` aims to create parsers with little to no overhead, streams over `&str` and `&[T]` do not carry any extra position information; instead they only rely on comparing the pointer of the buffer to check which `Stream` is further ahead than another `Stream`. To retrieve a better position, either call `translate_position` on the `PointerOffset` which represents the position, or wrap your stream with `State`.
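As a rough sketch of the second option, assuming the combine 3.x layout where `State` lives in `combine::stream::state` and `easy_parse` produces errors carrying a `SourcePosition`:

```rust
use combine::Parser;
use combine::parser::char::digit;
use combine::stream::state::State;

// Wrapping the input in `State` makes errors report line/column positions
// instead of a raw pointer offset.
let mut parser = digit();
let err = parser.easy_parse(State::new("not a digit")).unwrap_err();
// Nothing was consumed, so the error should point at line 1, column 1.
println!("error at line {}, column {}", err.position.line, err.position.column);
```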
Issue #73 contains discussion and links to comparisons with nom.
- GraphQL https://github.com/graphql-rust/graphql-parser (Uses a custom tokenizer as input)
- DiffX https://github.com/brennie/diffx-rs
- Redis redis-rs/redis-rs#141 (Uses partial parsing)
- Toml https://github.com/ordian/toml_edit
- Maker Interchange Format https://github.com/aidanhs/frametool (Uses combine as a lexer)
- Javascript https://github.com/freemasen/ress
- JPEG Metadata https://github.com/vadixidav/exifsd
- Template language https://github.com/tailhook/trimmer
- Code exercises https://github.com/dgel/adventOfCode2017
- Programming language
- Query parser (+ more) https://github.com/mozilla/mentat
- Query parser https://github.com/tantivy-search/tantivy
There is an additional crate, combine-language, which has parsers to lex and parse programming languages.
The easiest way to contribute is to just open an issue about any problems you encounter using combine, but if you are interested in adding something to the library, here is a list of some of the easier things to work on to get started.
- Add additional parsers - If you have a suggestion for another parser, just open an issue or a PR with an implementation.
- Add additional examples - More examples for using combine will always be useful!
- Add and improve the docs - Not the fanciest of work, but one cannot overstate the importance of good documentation.