
Tokenizer is needed #724

Closed
gracjan opened this issue Jun 15, 2015 · 1 comment

Comments

@gracjan
Contributor

gracjan commented Jun 15, 2015

Plenty of code relies on local, ad-hoc regexes to find tokens. All of these are half-baked.

We need a common next-token function as a building block for many more advanced features.

Reference: #648, #628, #572, #549, #450, #229. Possibly many more.
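To make the idea concrete, here is a minimal sketch of what a shared next-token helper could look like. Everything below is an illustrative assumption, not this project's API: the function names (`next_token`, `tokens`), the token classes, and the regex are all hypothetical, chosen only to show the "one tokenizer, many consumers" shape.

```python
import re

# Hypothetical token grammar -- the classes and patterns here are
# illustrative assumptions, not the project's actual syntax rules.
TOKEN_RE = re.compile(r"""
    (?P<whitespace>\s+)
  | (?P<comment>--[^\n]*)
  | (?P<number>\d+)
  | (?P<identifier>[A-Za-z_][A-Za-z0-9_']*)
  | (?P<operator>[-+*/=<>:.|&!@#$%^~?\\]+)
  | (?P<punct>[()\[\]{},;`])
""", re.VERBOSE)

def next_token(text, pos=0):
    """Return (kind, lexeme, end_pos) for the token at pos, or None at EOF.

    Whitespace is skipped; an unknown character is reported as a
    one-character 'error' token so callers can never loop forever.
    """
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if m is None:
            return ("error", text[pos], pos + 1)
        if m.lastgroup == "whitespace":
            pos = m.end()
            continue
        return (m.lastgroup, m.group(), m.end())
    return None

def tokens(text):
    """Iterate over all tokens, using next_token as the single building block."""
    pos = 0
    while (tok := next_token(text, pos)) is not None:
        yield tok[:2]
        pos = tok[2]
```

The point is that every feature listed above would call the same `next_token` instead of carrying its own regex, so token-boundary bugs get fixed once, in one place.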

@geraldus
Contributor

Good idea. I guess it will be a lot of work.

Development

No branches or pull requests

2 participants