First Docker snapshot

Ali
2026-02-20 16:06:40 +09:00
commit f31e2e8ed3
8818 changed files with 1605323 additions and 0 deletions

_node_modules/chevrotain/CHANGELOG.md generated Normal file

@@ -0,0 +1 @@
See: https://chevrotain.io/docs/changes/CHANGELOG.html

_node_modules/chevrotain/LICENSE.txt generated Normal file

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

_node_modules/chevrotain/README.md generated Normal file

@@ -0,0 +1,20 @@
# Chevrotain
For details see:
- Chevrotain's [website](https://chevrotain.io/docs/).
- Chevrotain's root [README](https://github.com/chevrotain/chevrotain).
## Install
Using npm:
```sh
npm install chevrotain
```
or using yarn:
```sh
yarn add chevrotain
```

_node_modules/chevrotain/chevrotain.d.ts generated vendored Normal file

@@ -0,0 +1,2 @@
export * from "@chevrotain/types"
export as namespace chevrotain


@@ -0,0 +1 @@
See [online docs](https://chevrotain.io/docs/guide/generating_syntax_diagrams.html).


@@ -0,0 +1,85 @@
svg.railroad-diagram path {
stroke-width: 3;
stroke: black;
fill: rgba(0, 0, 0, 0);
}
svg.railroad-diagram text {
font: bold 14px monospace;
text-anchor: middle;
}
svg.railroad-diagram text.label {
text-anchor: start;
}
svg.railroad-diagram text.comment {
font: italic 12px monospace;
}
svg.railroad-diagram g.non-terminal rect {
fill: hsl(223, 100%, 83%);
}
svg.railroad-diagram rect {
stroke-width: 3;
stroke: black;
fill: hsl(190, 100%, 83%);
}
.diagramHeader {
display: inline-block;
-webkit-touch-callout: default;
-webkit-user-select: text;
-khtml-user-select: text;
-moz-user-select: text;
-ms-user-select: text;
user-select: text;
font-weight: bold;
font-family: monospace;
font-size: 18px;
margin-bottom: -8px;
text-align: center;
}
.diagramHeaderDef {
background-color: lightgreen;
}
svg.railroad-diagram text {
-webkit-touch-callout: default;
-webkit-user-select: text;
-khtml-user-select: text;
-moz-user-select: text;
-ms-user-select: text;
user-select: text;
}
svg.railroad-diagram g.non-terminal rect.diagramRectUsage {
color: green;
fill: yellow;
stroke-width: 5;
}
svg.railroad-diagram g.terminal rect.diagramRectUsage {
color: green;
fill: yellow;
stroke-width: 5;
}
div {
-webkit-touch-callout: none;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-ms-user-select: none;
user-select: none;
}
svg {
width: 100%;
}
svg.railroad-diagram g.non-terminal text {
cursor: pointer;
}


@@ -0,0 +1,221 @@
;(function (root, factory) {
if (typeof define === "function" && define.amd) {
// AMD. Register as an anonymous module.
define([], factory)
} else if (typeof module === "object" && module.exports) {
// Node. Does not work with strict CommonJS, but
// only CommonJS-like environments that support module.exports,
// like Node.
module.exports = factory()
} else {
// Browser globals (root is window)
root.diagrams_behavior = factory()
}
})(this, function () {
/**
* @param [scrollingEnabled=true] {boolean} - Whether scrolling from a non-terminal usage to its
* definition is enabled. Enabled by default, but this flow is not relevant in all use cases
* (e.g. the playground), and is therefore parameterized.
*/
function initDiagramsBehavior(scrollingEnabled) {
if (scrollingEnabled === undefined) {
scrollingEnabled = true
}
var diagramHeaders = toArr(document.getElementsByClassName("diagramHeader"))
diagramHeaders.forEach(function (header) {
header.addEventListener(
"mouseover",
toggleNonTerminalUsageAndDef_fromHeader
)
header.addEventListener(
"mouseout",
toggleNonTerminalUsageAndDef_fromHeader
)
})
var noneTerminals = toArr(document.getElementsByClassName("non-terminal"))
var noneTerminalsText = findDomChildrenByTagName(noneTerminals, "text")
noneTerminalsText.forEach(function (nonTerminal) {
nonTerminal.addEventListener(
"mouseover",
toggleNonTerminalUsageAndDef_fromNoneTerminal
)
nonTerminal.addEventListener(
"mouseout",
toggleNonTerminalUsageAndDef_fromNoneTerminal
)
if (scrollingEnabled) {
nonTerminal.addEventListener("click", jumpToNoneTerminalDef)
}
})
var terminals = toArr(document.getElementsByClassName("terminal"))
var terminalsText = findDomChildrenByTagName(terminals, "text")
terminalsText.forEach(function (terminal) {
terminal.addEventListener("mouseover", toggleTerminalUsage)
terminal.addEventListener("mouseout", toggleTerminalUsage)
})
}
function toggleTerminalUsage(mouseEvent) {
var terminalName = mouseEvent.target.getAttribute("label")
var rects = getUsageSvgRect(terminalName, "terminal", "label")
toggleClassForNodes(rects, "diagramRectUsage")
}
function toggleNonTerminalUsageAndDef_fromNoneTerminal(mouseEvent) {
var rectsHeaderAndRuleName = getUsageRectAndDefHeader(mouseEvent.target)
toggleClassForNodes(rectsHeaderAndRuleName.rects, "diagramRectUsage")
toggleClass(rectsHeaderAndRuleName.header, "diagramHeaderDef")
}
function jumpToNoneTerminalDef(mouseEvent) {
var header = findHeader(mouseEvent.target.getAttribute("rulename"))
scrollToY(header.offsetTop, 666, "easeInOutQuint")
}
function toggleNonTerminalUsageAndDef_fromHeader(mouseEvent) {
toggleClass(mouseEvent.target, "diagramHeaderDef")
// the header is a regular H2 element, not an SVG element, so it's OK to use innerHTML here.
var definitionName = mouseEvent.target.innerHTML
var rects = getUsageSvgRect(definitionName, "non-terminal", "rulename")
toggleClassForNodes(rects, "diagramRectUsage")
}
function getUsageSvgRect(definitionName, className, attributeName) {
var classDomElements = toArr(document.getElementsByClassName(className))
var rects = findDomChildrenByTagName(classDomElements, "rect")
return rects.filter(function (currRect) {
var textNode = currRect.parentNode.getElementsByTagName("text")[0]
return textNode.getAttribute(attributeName) === definitionName
})
}
function findHeader(headerName) {
var headers = toArr(document.getElementsByClassName("diagramHeader"))
var header = headers.find(function (currHeader) {
// this works on H2 dom elements and not SVG elements so innerHTML usage is safe.
return currHeader.innerHTML === headerName
})
return header
}
function getUsageRectAndDefHeader(target) {
var headerName = target.getAttribute("rulename")
var rects = getUsageSvgRect(headerName, "non-terminal", "rulename")
var header = findHeader(headerName)
return {
rects: rects,
header: header,
ruleName: headerName
}
}
// utils
// IE 10/11 does not support this on svg elements.
// I'm uncertain I really care... :)
// https://developer.mozilla.org/en-US/docs/Web/API/Element/classList
function toggleClass(domNode, className) {
if (domNode.classList.contains(className)) {
domNode.classList.remove(className)
} else {
domNode.classList.add(className)
}
}
function toggleClassForNodes(domNodes, className) {
domNodes.forEach(function (currDomNode) {
toggleClass(currDomNode, className)
})
}
function toArr(htmlCollection) {
return Array.prototype.slice.call(htmlCollection)
}
// first add raf shim
// http://www.paulirish.com/2011/requestanimationframe-for-smart-animating/
var requestAnimFrame = (function () {
return (
window.requestAnimationFrame ||
window.webkitRequestAnimationFrame ||
window.mozRequestAnimationFrame ||
function (callback) {
window.setTimeout(callback, 1000 / 60)
}
)
})()
// https://stackoverflow.com/questions/8917921/cross-browser-javascript-not-jquery-scroll-to-top-animation
function scrollToY(scrollTargetY, speed, easing) {
// scrollTargetY: the target scrollY property of the window
// speed: scroll speed, in pixels per second
// easing: easing equation to use
var scrollY = window.scrollY,
scrollTargetY = scrollTargetY || 0,
speed = speed || 2000,
easing = easing || "easeOutSine",
currentTime = 0
// min time .1, max time .8 seconds
var time = Math.max(
0.1,
Math.min(Math.abs(scrollY - scrollTargetY) / speed, 0.8)
)
// easing equations from https://github.com/danro/easing-js/blob/master/easing.js
var PI_D2 = Math.PI / 2,
easingEquations = {
easeOutSine: function (pos) {
return Math.sin(pos * (Math.PI / 2))
},
easeInOutSine: function (pos) {
return -0.5 * (Math.cos(Math.PI * pos) - 1)
},
easeInOutQuint: function (pos) {
if ((pos /= 0.5) < 1) {
return 0.5 * Math.pow(pos, 5)
}
return 0.5 * (Math.pow(pos - 2, 5) + 2)
}
}
// add animation loop
function tick() {
currentTime += 1 / 60
var p = currentTime / time
var t = easingEquations[easing](p)
if (p < 1) {
requestAnimFrame(tick)
window.scrollTo(0, scrollY + (scrollTargetY - scrollY) * t)
} else {
window.scrollTo(0, scrollTargetY)
}
}
// call it once to get started
tick()
}
function findDomChildrenByTagName(domElements, tagName) {
var elemsFound = []
domElements.forEach(function (currDomNode) {
toArr(currDomNode.children).forEach(function (currChild) {
if (currChild.tagName === tagName) {
elemsFound.push(currChild)
}
})
})
return elemsFound
}
return {
initDiagramsBehavior: initDiagramsBehavior
}
})
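
The duration math buried in `scrollToY` above can be read in isolation. A sketch of just the clamp (the function name `scrollDuration` is mine, not the library's):

```js
// Sketch of scrollToY's duration computation: the animation time is
// distance / speed seconds, clamped to the range [0.1, 0.8].
function scrollDuration(scrollY, scrollTargetY, speed) {
  return Math.max(0.1, Math.min(Math.abs(scrollY - scrollTargetY) / speed, 0.8))
}

console.log(scrollDuration(0, 100, 2000))   // 0.05 clamps up to 0.1
console.log(scrollDuration(0, 1000, 2000))  // 0.5 (within range)
console.log(scrollDuration(0, 10000, 2000)) // 5 clamps down to 0.8
```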


@@ -0,0 +1,204 @@
;(function (root, factory) {
if (typeof define === "function" && define.amd) {
// AMD. Register as an anonymous module.
// TODO: remove dependency on Chevrotain
define(["../vendor/railroad-diagrams"], factory)
} else if (typeof module === "object" && module.exports) {
// Node. Does not work with strict CommonJS, but
// only CommonJS-like environments that support module.exports,
// like Node.
// TODO: remove dependency on Chevrotain
module.exports = factory(require("../vendor/railroad-diagrams"))
} else {
// Browser globals (root is window)
root.diagrams_builder = factory(root.railroad)
}
})(this, function (railroad) {
var Diagram = railroad.Diagram
var Sequence = railroad.Sequence
var Choice = railroad.Choice
var Optional = railroad.Optional
var OneOrMore = railroad.OneOrMore
var ZeroOrMore = railroad.ZeroOrMore
// var Terminal = railroad.Terminal
var NonTerminal = railroad.NonTerminal
/**
* @param {chevrotain.gast.ISerializedGast} topRules
*
* @returns {string} - The htmlText that will render the diagrams
*/
function buildSyntaxDiagramsText(topRules) {
var diagramsHtml = ""
topRules.forEach(function (production) {
var currDiagramHtml = convertProductionToDiagram(
production,
production.name
)
diagramsHtml +=
'<h2 class="diagramHeader">' +
production.name +
"</h2>" +
currDiagramHtml
})
return diagramsHtml
}
function definitionsToSubDiagrams(definitions, topRuleName) {
var subDiagrams = definitions.map(function (subProd) {
return convertProductionToDiagram(subProd, topRuleName)
})
return subDiagrams
}
/**
* @param {chevrotain.gast.ISerializedTerminal} prod
* @param {string} topRuleName
* @param {string} dslRuleName
*
* @return {RailRoadDiagram.Terminal}
*/
function createTerminalFromSerializedGast(prod, topRuleName, dslRuleName) {
// PATTERN static property will not exist when using custom lexers (hand built or other lexer generators)
var toolTipTitle = undefined
// avoid trying to use a custom token pattern as the title.
if (
typeof prod.pattern === "string" ||
Object.prototype.toString.call(prod.pattern) === "[object RegExp]"
) {
toolTipTitle = prod.pattern
}
return railroad.Terminal(
prod.label,
undefined,
toolTipTitle,
prod.occurrenceInParent,
topRuleName,
dslRuleName,
prod.name
)
}
/**
* @param prod
* @param topRuleName
*
* Converts a single Chevrotain Grammar production to a RailRoad Diagram.
* This is also exported to allow custom logic in the creation of the diagrams.
* @returns {*}
*/
function convertProductionToDiagram(prod, topRuleName) {
if (prod.type === "NonTerminal") {
// must handle NonTerminal separately from the other AbstractProductions as we do not want to expand the subDefinition
// of a reference and cause infinite loops
return NonTerminal(
getNonTerminalName(prod),
undefined,
prod.occurrenceInParent,
topRuleName
)
} else if (prod.type !== "Terminal") {
var subDiagrams = definitionsToSubDiagrams(prod.definition, topRuleName)
if (prod.type === "Rule") {
return Diagram.apply(this, subDiagrams)
} else if (prod.type === "Alternative") {
return Sequence.apply(this, subDiagrams)
} else if (prod.type === "Option") {
if (subDiagrams.length > 1) {
return Optional(Sequence.apply(this, subDiagrams))
} else if (subDiagrams.length === 1) {
return Optional(subDiagrams[0])
} else {
throw Error("Empty Optional production, OOPS!")
}
} else if (prod.type === "Repetition") {
if (subDiagrams.length > 1) {
return ZeroOrMore(Sequence.apply(this, subDiagrams))
} else if (subDiagrams.length === 1) {
return ZeroOrMore(subDiagrams[0])
} else {
throw Error("Empty Repetition production, OOPS!")
}
} else if (prod.type === "Alternation") {
// the first argument of Choice is the index of the alternative drawn on the straight (default) path
return Choice.apply(this, [0].concat(subDiagrams))
} else if (prod.type === "RepetitionMandatory") {
if (subDiagrams.length > 1) {
return OneOrMore(Sequence.apply(this, subDiagrams))
} else if (subDiagrams.length === 1) {
return OneOrMore(subDiagrams[0])
} else {
throw Error("Empty RepetitionMandatory production, OOPS!")
}
} else if (prod.type === "RepetitionWithSeparator") {
if (subDiagrams.length > 0) {
// MANY_SEP(separator, definition) === (definition (separator definition)*)?
return Optional(
Sequence.apply(
this,
subDiagrams.concat([
ZeroOrMore(
Sequence.apply(
this,
[
createTerminalFromSerializedGast(
prod.separator,
topRuleName,
"many_sep"
)
].concat(subDiagrams)
)
)
])
)
)
} else {
throw Error("Empty RepetitionWithSeparator production, OOPS!")
}
} else if (prod.type === "RepetitionMandatoryWithSeparator") {
if (subDiagrams.length > 0) {
// AT_LEAST_ONE_SEP(separator, definition) === definition (separator definition)*
return Sequence.apply(
this,
subDiagrams.concat([
ZeroOrMore(
Sequence.apply(
this,
[
createTerminalFromSerializedGast(
prod.separator,
topRuleName,
"at_least_one_sep"
)
].concat(subDiagrams)
)
)
])
)
} else {
throw Error("Empty RepetitionMandatoryWithSeparator production, OOPS!")
}
}
} else if (prod.type === "Terminal") {
return createTerminalFromSerializedGast(prod, topRuleName, "consume")
} else {
throw Error("non exhaustive match")
}
}
function getNonTerminalName(prod) {
if (prod.nonTerminalName !== undefined) {
return prod.nonTerminalName
} else {
return prod.name
}
}
return {
buildSyntaxDiagramsText: buildSyntaxDiagramsText,
convertProductionToDiagram: convertProductionToDiagram
}
})
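
`buildSyntaxDiagramsText` above expects an array of serialized GAST rules. A hypothetical input, using only field names the converter actually reads (`type`, `name`, `label`, `pattern`, `nonTerminalName`, `occurrenceInParent`, `definition`); the grammar itself is invented for illustration:

```js
// Hypothetical serialized GAST input for buildSyntaxDiagramsText.
const topRules = [
  {
    type: "Rule",
    name: "selectStatement",
    definition: [
      {
        type: "Terminal",
        name: "Select",
        label: "SELECT",
        pattern: "SELECT",
        occurrenceInParent: 1
      },
      {
        type: "NonTerminal",
        nonTerminalName: "columnList",
        occurrenceInParent: 1,
        definition: []
      }
    ]
  }
]
// Each top rule becomes one <h2 class="diagramHeader"> plus one SVG diagram.
console.log(topRules[0].definition.map(function (p) { return p.type }))
```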


@@ -0,0 +1,20 @@
/**
* @param {string} targetFilePath - The path and file name to serialize to.
* @param {string} varName - The name of the global variable that will expose the serialized contents.
* @param {chevrotain.Parser} parserInstance - A parser instance whose grammar will be serialized.
*/
function serializeGrammarToFile(targetFilePath, varName, parserInstance) {
var fs = require("fs")
var serializedGrammar = parserInstance.getSerializedGastProductions()
var serializedGrammarText = JSON.stringify(serializedGrammar, null, "\t")
// generates a JavaScript file which exposes the serialized grammar on the global scope (window)
fs.writeFileSync(
targetFilePath,
"var " + varName + " = " + serializedGrammarText
)
}
module.exports = {
serializeGrammarToFile: serializeGrammarToFile
}


@@ -0,0 +1,30 @@
;(function (root, factory) {
if (typeof define === "function" && define.amd) {
// AMD. Register as an anonymous module.
define(["./diagrams_builder", "./diagrams_behavior"], factory)
} else if (typeof module === "object" && module.exports) {
// Node. Does not work with strict CommonJS, but
// only CommonJS-like environments that support module.exports,
// like Node.
module.exports = factory(
require("./diagrams_builder"),
require("./diagrams_behavior")
)
} else {
// Browser globals (root is window)
root.main = factory(root.diagrams_builder, root.diagrams_behavior)
}
})(this, function (builder, behavior) {
return {
drawDiagramsFromParserInstance: function (parserInstanceToDraw, targetDiv) {
var topRules = parserInstanceToDraw.getSerializedGastProductions()
targetDiv.innerHTML = builder.buildSyntaxDiagramsText(topRules)
behavior.initDiagramsBehavior()
},
drawDiagramsFromSerializedGrammar: function (serializedGrammar, targetDiv) {
targetDiv.innerHTML = builder.buildSyntaxDiagramsText(serializedGrammar)
behavior.initDiagramsBehavior()
}
}
})


@@ -0,0 +1,965 @@
/*
Railroad Diagrams
by Tab Atkins Jr. (and others)
http://xanthir.com
http://twitter.com/tabatkins
http://github.com/tabatkins/railroad-diagrams
This document and all associated files in the github project are licensed under CC0: http://creativecommons.org/publicdomain/zero/1.0/
This means you can reuse, remix, or otherwise appropriate this project for your own use WITHOUT RESTRICTION.
(The actual legal meaning can be found at the above link.)
Don't ask me for permission to use any part of this project, JUST USE IT.
I would appreciate attribution, but that is not required by the license.
*/
/*
This file uses a module pattern to avoid leaking names into the global scope.
The only accidental leakage is the name "temp".
The exported names can be found at the bottom of this file;
simply change the names in the array of strings to change what they are called in your application.
As well, several configuration constants are passed into the module function at the bottom of this file.
At runtime, these constants can be found on the Diagram class.
*/
;(function (options) {
function subclassOf(baseClass, superClass) {
baseClass.prototype = Object.create(superClass.prototype)
baseClass.prototype.$super = superClass.prototype
}
function unnull(/* children */) {
return [].slice.call(arguments).reduce(function (sofar, x) {
return sofar !== undefined ? sofar : x
})
}
function determineGaps(outer, inner) {
var diff = outer - inner
switch (Diagram.INTERNAL_ALIGNMENT) {
case "left":
return [0, diff]
break
case "right":
return [diff, 0]
break
case "center":
default:
return [diff / 2, diff / 2]
break
}
}
function wrapString(value) {
return typeof value == "string" ? new Terminal(value) : value
}
function stackAtIllegalPosition(items) {
/* The height of the last line of the Stack is determined by the last child and
therefore any element outside the Stack could overlap with other elements.
If the Stack is the last element no overlap can occur. */
for (var i = 0; i < items.length; i++) {
if (items[i] instanceof Stack && i !== items.length - 1) {
return true
}
}
return false
}
function SVG(name, attrs, text) {
attrs = attrs || {}
text = text || ""
var el = document.createElementNS("http://www.w3.org/2000/svg", name)
for (var attr in attrs) {
if (attr === "xlink:href")
el.setAttributeNS("http://www.w3.org/1999/xlink", "href", attrs[attr])
else el.setAttribute(attr, attrs[attr])
}
el.textContent = text
return el
}
function FakeSVG(tagName, attrs, text) {
if (!(this instanceof FakeSVG)) return new FakeSVG(tagName, attrs, text)
if (text) this.children = text
else this.children = []
this.tagName = tagName
this.attrs = unnull(attrs, {})
return this
}
FakeSVG.prototype.format = function (x, y, width) {
// Virtual
}
FakeSVG.prototype.addTo = function (parent) {
if (parent instanceof FakeSVG) {
parent.children.push(this)
return this
} else {
var svg = this.toSVG()
parent.appendChild(svg)
return svg
}
}
FakeSVG.prototype.escapeString = function (string) {
// Escape markdown and HTML special characters
return string.replace(/[*_\`\[\]<&]/g, function (charString) {
return "&#" + charString.charCodeAt(0) + ";"
})
}
FakeSVG.prototype.toSVG = function () {
var el = SVG(this.tagName, this.attrs)
if (typeof this.children == "string") {
el.textContent = this.children
} else {
this.children.forEach(function (e) {
el.appendChild(e.toSVG())
})
}
return el
}
FakeSVG.prototype.toString = function () {
var str = "<" + this.tagName
var group = this.tagName == "g" || this.tagName == "svg"
for (var attr in this.attrs) {
str +=
" " +
attr +
'="' +
(this.attrs[attr] + "").replace(/&/g, "&amp;").replace(/"/g, "&quot;") +
'"'
}
str += ">"
if (group) str += "\n"
if (typeof this.children == "string") {
str += FakeSVG.prototype.escapeString(this.children)
} else {
this.children.forEach(function (e) {
str += e
})
}
str += "</" + this.tagName + ">\n"
return str
}
function Path(x, y) {
if (!(this instanceof Path)) return new Path(x, y)
FakeSVG.call(this, "path")
this.attrs.d = "M" + x + " " + y
}
subclassOf(Path, FakeSVG)
Path.prototype.m = function (x, y) {
this.attrs.d += "m" + x + " " + y
return this
}
Path.prototype.h = function (val) {
this.attrs.d += "h" + val
return this
}
Path.prototype.right = Path.prototype.h
Path.prototype.left = function (val) {
return this.h(-val)
}
Path.prototype.v = function (val) {
this.attrs.d += "v" + val
return this
}
Path.prototype.down = Path.prototype.v
Path.prototype.up = function (val) {
return this.v(-val)
}
Path.prototype.arc = function (sweep) {
var x = Diagram.ARC_RADIUS
var y = Diagram.ARC_RADIUS
if (sweep[0] == "e" || sweep[1] == "w") {
x *= -1
}
if (sweep[0] == "s" || sweep[1] == "n") {
y *= -1
}
if (sweep == "ne" || sweep == "es" || sweep == "sw" || sweep == "wn") {
var cw = 1
} else {
var cw = 0
}
this.attrs.d +=
"a" +
Diagram.ARC_RADIUS +
" " +
Diagram.ARC_RADIUS +
" 0 0 " +
cw +
" " +
x +
" " +
y
return this
}
Path.prototype.format = function () {
// All paths in this library start/end horizontally.
// The extra .5 ensures a minor overlap, so there's no seams in bad rasterizers.
this.attrs.d += "h.5"
return this
}
function Diagram(items) {
if (!(this instanceof Diagram)) return new Diagram([].slice.call(arguments))
FakeSVG.call(this, "svg", { class: Diagram.DIAGRAM_CLASS })
if (stackAtIllegalPosition(items)) {
throw new RangeError(
"Stack() must only occur at the very last position of Diagram()."
)
}
this.items = items.map(wrapString)
this.items.unshift(new Start())
this.items.push(new End())
this.width =
this.items.reduce(function (sofar, el) {
return sofar + el.width + (el.needsSpace ? 20 : 0)
}, 0) + 1
this.height = this.items.reduce(function (sofar, el) {
return sofar + el.height
}, 0)
this.up = Math.max.apply(
null,
this.items.map(function (x) {
return x.up
})
)
this.down = Math.max.apply(
null,
this.items.map(function (x) {
return x.down
})
)
this.formatted = false
}
subclassOf(Diagram, FakeSVG)
for (var option in options) {
Diagram[option] = options[option]
}
Diagram.prototype.format = function (paddingt, paddingr, paddingb, paddingl) {
paddingt = unnull(paddingt, 20)
paddingr = unnull(paddingr, paddingt, 20)
paddingb = unnull(paddingb, paddingt, 20)
paddingl = unnull(paddingl, paddingr, 20)
var x = paddingl
var y = paddingt
y += this.up
var g = FakeSVG(
"g",
Diagram.STROKE_ODD_PIXEL_LENGTH ? { transform: "translate(.5 .5)" } : {}
)
for (var i = 0; i < this.items.length; i++) {
var item = this.items[i]
if (item.needsSpace) {
Path(x, y).h(10).addTo(g)
x += 10
}
item.format(x, y, item.width + item.offsetX).addTo(g)
x += item.width + item.offsetX
y += item.height
if (item.needsSpace) {
Path(x, y).h(10).addTo(g)
x += 10
}
}
this.attrs.width = this.width + paddingl + paddingr
this.attrs.height = this.up + this.height + this.down + paddingt + paddingb
this.attrs.viewBox = "0 0 " + this.attrs.width + " " + this.attrs.height
g.addTo(this)
this.formatted = true
return this
}
Diagram.prototype.addTo = function (parent) {
var scriptTag = document.getElementsByTagName("script")
scriptTag = scriptTag[scriptTag.length - 1]
var parentTag = scriptTag.parentNode
parent = parent || parentTag
return this.$super.addTo.call(this, parent)
}
Diagram.prototype.toSVG = function () {
if (!this.formatted) {
this.format()
}
return this.$super.toSVG.call(this)
}
Diagram.prototype.toString = function () {
if (!this.formatted) {
this.format()
}
return this.$super.toString.call(this)
}
function ComplexDiagram() {
var diagram = new Diagram([].slice.call(arguments))
var items = diagram.items
items.shift()
items.pop()
items.unshift(new Start(false))
items.push(new End(false))
diagram.items = items
return diagram
}
function Sequence(items) {
if (!(this instanceof Sequence))
return new Sequence([].slice.call(arguments))
FakeSVG.call(this, "g")
if (stackAtIllegalPosition(items)) {
throw new RangeError(
"Stack() must only occur at the very last position of Sequence()."
)
}
this.items = items.map(wrapString)
this.width = this.items.reduce(function (sofar, el) {
return sofar + el.width + (el.needsSpace ? 20 : 0)
}, 0)
this.offsetX = 0
this.height = this.items.reduce(function (sofar, el) {
return sofar + el.height
}, 0)
this.up = this.items.reduce(function (sofar, el) {
return Math.max(sofar, el.up)
}, 0)
this.down = this.items.reduce(function (sofar, el) {
return Math.max(sofar, el.down)
}, 0)
}
subclassOf(Sequence, FakeSVG)
Sequence.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y + this.height)
.h(gaps[1])
.addTo(this)
x += gaps[0]
for (var i = 0; i < this.items.length; i++) {
var item = this.items[i]
if (item.needsSpace) {
Path(x, y).h(10).addTo(this)
x += 10
}
item.format(x, y, item.width).addTo(this)
x += item.width
y += item.height
if (item.needsSpace) {
Path(x, y).h(10).addTo(this)
x += 10
}
}
return this
}
function Stack(items) {
if (!(this instanceof Stack)) return new Stack([].slice.call(arguments))
FakeSVG.call(this, "g")
if (stackAtIllegalPosition(items)) {
throw new RangeError(
"Stack() must only occur at the very last position of Stack()."
)
}
if (items.length === 0) {
throw new RangeError("Stack() must have at least one child.")
}
this.items = items.map(wrapString)
this.width = this.items.reduce(function (sofar, el) {
return Math.max(sofar, el.width + (el.needsSpace ? 20 : 0))
}, 0)
if (this.items.length > 1) {
this.width += Diagram.ARC_RADIUS * 2
}
this.up = this.items[0].up
this.down = this.items[this.items.length - 1].down
this.height = 0
for (var i = 0; i < this.items.length; i++) {
this.height += this.items[i].height
if (i !== this.items.length - 1) {
this.height +=
Math.max(this.items[i].down, Diagram.VERTICAL_SEPARATION) +
Math.max(this.items[i + 1].up, Diagram.VERTICAL_SEPARATION) +
Diagram.ARC_RADIUS * 4
}
}
if (this.items.length === 0) {
this.offsetX = 0
} else {
// the value is usually negative because the linebreak resets the x value for the next element
this.offsetX = -(
this.width -
this.items[this.items.length - 1].width -
this.items[this.items.length - 1].offsetX -
(this.items[this.items.length - 1].needsSpace ? 20 : 0)
)
if (this.items.length > 1) {
this.offsetX += Diagram.ARC_RADIUS * 2
}
}
}
subclassOf(Stack, FakeSVG)
Stack.prototype.format = function (x, y, width) {
var xInitial = x
for (var i = 0; i < this.items.length; i++) {
var item = this.items[i]
if (item.needsSpace) {
Path(x, y).h(10).addTo(this)
x += 10
}
item
.format(
x,
y,
Math.max(item.width + item.offsetX, Diagram.ARC_RADIUS * 2)
)
.addTo(this)
x += Math.max(item.width + item.offsetX, Diagram.ARC_RADIUS * 2)
y += item.height
if (item.needsSpace) {
Path(x, y).h(10).addTo(this)
x += 10
}
if (i !== this.items.length - 1) {
Path(x, y)
.arc("ne")
.down(Math.max(item.down, Diagram.VERTICAL_SEPARATION))
.arc("es")
.left(x - xInitial - Diagram.ARC_RADIUS * 2)
.arc("nw")
.down(Math.max(this.items[i + 1].up, Diagram.VERTICAL_SEPARATION))
.arc("ws")
.addTo(this)
y +=
Math.max(item.down, Diagram.VERTICAL_SEPARATION) +
Math.max(this.items[i + 1].up, Diagram.VERTICAL_SEPARATION) +
Diagram.ARC_RADIUS * 4
x = xInitial + Diagram.ARC_RADIUS * 2
}
}
}
}
Path(x, y)
.h(width - (this.width + this.offsetX))
.addTo(this)
return this
}
function Choice(normal, items) {
if (!(this instanceof Choice))
return new Choice(normal, [].slice.call(arguments, 1))
FakeSVG.call(this, "g")
if (typeof normal !== "number" || normal !== Math.floor(normal)) {
throw new TypeError("The first argument of Choice() must be an integer.")
} else if (normal < 0 || normal >= items.length) {
throw new RangeError(
"The first argument of Choice() must be an index for one of the items."
)
} else {
this.normal = normal
}
this.items = items.map(wrapString)
this.width =
this.items.reduce(function (sofar, el) {
return Math.max(sofar, el.width)
}, 0) +
Diagram.ARC_RADIUS * 4
this.offsetX = 0
this.height = this.items[normal].height
this.up = this.down = 0
for (var i = 0; i < this.items.length; i++) {
var item = this.items[i]
if (i < normal) {
this.up += Math.max(
Diagram.ARC_RADIUS,
item.up + item.height + item.down + Diagram.VERTICAL_SEPARATION
)
}
if (i == normal) {
this.up += Math.max(Diagram.ARC_RADIUS, item.up)
this.down += Math.max(Diagram.ARC_RADIUS, item.down)
}
if (i > normal) {
this.down += Math.max(
Diagram.ARC_RADIUS,
Diagram.VERTICAL_SEPARATION + item.up + item.down + item.height
)
}
}
}
subclassOf(Choice, FakeSVG)
Choice.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y + this.height)
.h(gaps[1])
.addTo(this)
x += gaps[0]
var last = this.items.length - 1
var innerWidth = this.width - Diagram.ARC_RADIUS * 4
// Do the elements that curve above
for (var i = this.normal - 1; i >= 0; i--) {
var item = this.items[i]
if (i == this.normal - 1) {
var distanceFromY = Math.max(
Diagram.ARC_RADIUS * 2,
this.items[i + 1].up +
Diagram.VERTICAL_SEPARATION +
item.height +
item.down
)
}
Path(x, y)
.arc("se")
.up(distanceFromY - Diagram.ARC_RADIUS * 2)
.arc("wn")
.addTo(this)
item
.format(x + Diagram.ARC_RADIUS * 2, y - distanceFromY, innerWidth)
.addTo(this)
Path(
x + Diagram.ARC_RADIUS * 2 + innerWidth,
y - distanceFromY + item.height
)
.arc("ne")
.down(
distanceFromY -
item.height +
this.items[this.normal].height -
Diagram.ARC_RADIUS * 2
)
.arc("ws")
.addTo(this)
distanceFromY += Math.max(
Diagram.ARC_RADIUS,
item.up +
Diagram.VERTICAL_SEPARATION +
(i == 0 ? 0 : this.items[i - 1].down + this.items[i - 1].height)
)
}
// Do the straight-line path.
Path(x, y)
.right(Diagram.ARC_RADIUS * 2)
.addTo(this)
this.items[this.normal]
.format(x + Diagram.ARC_RADIUS * 2, y, innerWidth)
.addTo(this)
Path(x + Diagram.ARC_RADIUS * 2 + innerWidth, y + this.height)
.right(Diagram.ARC_RADIUS * 2)
.addTo(this)
// Do the elements that curve below
for (var i = this.normal + 1; i <= last; i++) {
var item = this.items[i]
if (i == this.normal + 1) {
var distanceFromY = Math.max(
Diagram.ARC_RADIUS * 2,
this.items[i - 1].height +
this.items[i - 1].down +
Diagram.VERTICAL_SEPARATION +
item.up
)
}
Path(x, y)
.arc("ne")
.down(distanceFromY - Diagram.ARC_RADIUS * 2)
.arc("ws")
.addTo(this)
item
.format(x + Diagram.ARC_RADIUS * 2, y + distanceFromY, innerWidth)
.addTo(this)
Path(
x + Diagram.ARC_RADIUS * 2 + innerWidth,
y + distanceFromY + item.height
)
.arc("se")
.up(
distanceFromY -
Diagram.ARC_RADIUS * 2 +
item.height -
this.items[this.normal].height
)
.arc("wn")
.addTo(this)
distanceFromY += Math.max(
Diagram.ARC_RADIUS,
item.height +
item.down +
Diagram.VERTICAL_SEPARATION +
(i == last ? 0 : this.items[i + 1].up)
)
}
return this
}
function Optional(item, skip) {
if (skip === undefined) return Choice(1, Skip(), item)
else if (skip === "skip") return Choice(0, Skip(), item)
else throw "Unknown value for Optional()'s 'skip' argument."
}
function OneOrMore(item, rep) {
if (!(this instanceof OneOrMore)) return new OneOrMore(item, rep)
FakeSVG.call(this, "g")
rep = rep || new Skip()
this.item = wrapString(item)
this.rep = wrapString(rep)
this.width =
Math.max(this.item.width, this.rep.width) + Diagram.ARC_RADIUS * 2
this.offsetX = 0
this.height = this.item.height
this.up = this.item.up
this.down = Math.max(
Diagram.ARC_RADIUS * 2,
this.item.down +
Diagram.VERTICAL_SEPARATION +
this.rep.up +
this.rep.height +
this.rep.down
)
}
subclassOf(OneOrMore, FakeSVG)
OneOrMore.prototype.needsSpace = true
OneOrMore.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y + this.height)
.h(gaps[1])
.addTo(this)
x += gaps[0]
// Draw item
Path(x, y).right(Diagram.ARC_RADIUS).addTo(this)
this.item
.format(x + Diagram.ARC_RADIUS, y, this.width - Diagram.ARC_RADIUS * 2)
.addTo(this)
Path(x + this.width - Diagram.ARC_RADIUS, y + this.height)
.right(Diagram.ARC_RADIUS)
.addTo(this)
// Draw repeat arc
var distanceFromY = Math.max(
Diagram.ARC_RADIUS * 2,
this.item.height +
this.item.down +
Diagram.VERTICAL_SEPARATION +
this.rep.up
)
Path(x + Diagram.ARC_RADIUS, y)
.arc("nw")
.down(distanceFromY - Diagram.ARC_RADIUS * 2)
.arc("ws")
.addTo(this)
this.rep
.format(
x + Diagram.ARC_RADIUS,
y + distanceFromY,
this.width - Diagram.ARC_RADIUS * 2
)
.addTo(this)
Path(
x + this.width - Diagram.ARC_RADIUS,
y + distanceFromY + this.rep.height
)
.arc("se")
.up(
distanceFromY -
Diagram.ARC_RADIUS * 2 +
this.rep.height -
this.item.height
)
.arc("en")
.addTo(this)
return this
}
function ZeroOrMore(item, rep, skip) {
return Optional(OneOrMore(item, rep), skip)
}
function Start(simpleType) {
if (!(this instanceof Start)) return new Start(simpleType)
FakeSVG.call(this, "path")
this.width = 20
this.height = 0
this.offsetX = 0
this.up = 10
this.down = 10
this.simpleType = simpleType
}
subclassOf(Start, FakeSVG)
Start.prototype.format = function (x, y) {
if (this.simpleType === false) {
this.attrs.d = "M " + x + " " + (y - 10) + " v 20 m 0 -10 h 20.5"
} else {
this.attrs.d =
"M " + x + " " + (y - 10) + " v 20 m 10 -20 v 20 m -10 -10 h 20.5"
}
return this
}
function End(simpleType) {
if (!(this instanceof End)) return new End(simpleType)
FakeSVG.call(this, "path")
this.width = 20
this.height = 0
this.offsetX = 0
this.up = 10
this.down = 10
this.simpleType = simpleType
}
subclassOf(End, FakeSVG)
End.prototype.format = function (x, y) {
if (this.simpleType === false) {
this.attrs.d = "M " + x + " " + y + " h 20 m 0 -10 v 20"
} else {
this.attrs.d = "M " + x + " " + y + " h 20 m -10 -10 v 20 m 10 -20 v 20"
}
return this
}
function Terminal(
text,
href,
title,
occurrenceIdx,
topRuleName,
dslRuleName,
tokenName
) {
if (!(this instanceof Terminal))
return new Terminal(
text,
href,
title,
occurrenceIdx,
topRuleName,
dslRuleName,
tokenName
)
FakeSVG.call(this, "g", { class: "terminal" })
this.text = text
this.label = text
this.href = href
this.title = title
this.occurrenceIdx = occurrenceIdx
this.topRuleName = topRuleName
this.dslRuleName = dslRuleName
this.tokenName = tokenName
this.width =
text.length * 8 +
20 /* Assume that each char is .5em, and that the em is 16px */
this.height = 0
this.offsetX = 0
this.up = 11
this.down = 11
}
subclassOf(Terminal, FakeSVG)
Terminal.prototype.needsSpace = true
Terminal.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y)
.h(gaps[1])
.addTo(this)
x += gaps[0]
FakeSVG("rect", {
x: x,
y: y - 11,
width: this.width,
height: this.up + this.down,
rx: 10,
ry: 10
}).addTo(this)
var text = FakeSVG(
"text",
{
x: x + this.width / 2,
y: y + 4,
occurrenceIdx: this.occurrenceIdx,
topRuleName: this.topRuleName,
dslRuleName: this.dslRuleName,
tokenName: this.tokenName,
label: this.label
},
this.text
)
var title = FakeSVG("title", {}, this.title)
if (this.href) FakeSVG("a", { "xlink:href": this.href }, [text]).addTo(this)
else {
text.addTo(this)
if (this.title !== undefined) {
title.addTo(this)
}
}
return this
}
function NonTerminal(text, href, occurrenceIdx, topRuleName) {
if (!(this instanceof NonTerminal))
return new NonTerminal(text, href, occurrenceIdx, topRuleName)
FakeSVG.call(this, "g", { class: "non-terminal" })
this.text = text
this.ruleName = text
this.href = href
this.occurrenceIdx = occurrenceIdx
this.topRuleName = topRuleName
this.width = text.length * 8 + 20
this.height = 0
this.offsetX = 0
this.up = 11
this.down = 11
}
subclassOf(NonTerminal, FakeSVG)
NonTerminal.prototype.needsSpace = true
NonTerminal.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y)
.h(gaps[1])
.addTo(this)
x += gaps[0]
FakeSVG("rect", {
x: x,
y: y - 11,
width: this.width,
height: this.up + this.down
}).addTo(this)
var text = FakeSVG(
"text",
{
x: x + this.width / 2,
y: y + 4,
occurrenceIdx: this.occurrenceIdx,
topRuleName: this.topRuleName,
ruleName: this.ruleName
},
this.text
)
if (this.href) FakeSVG("a", { "xlink:href": this.href }, [text]).addTo(this)
else text.addTo(this)
return this
}
function Comment(text) {
if (!(this instanceof Comment)) return new Comment(text)
FakeSVG.call(this, "g")
this.text = text
this.width = text.length * 7 + 10
this.height = 0
this.offsetX = 0
this.up = 11
this.down = 11
}
subclassOf(Comment, FakeSVG)
Comment.prototype.needsSpace = true
Comment.prototype.format = function (x, y, width) {
// Hook up the two sides if this is narrower than its stated width.
var gaps = determineGaps(width, this.width)
Path(x, y).h(gaps[0]).addTo(this)
Path(x + gaps[0] + this.width, y + this.height)
.h(gaps[1])
.addTo(this)
x += gaps[0]
FakeSVG(
"text",
{
x: x + this.width / 2,
y: y + 5,
class: "comment"
},
this.text
).addTo(this)
return this
}
function Skip() {
if (!(this instanceof Skip)) return new Skip()
FakeSVG.call(this, "g")
this.width = 0
this.height = 0
this.offsetX = 0
this.up = 0
this.down = 0
}
subclassOf(Skip, FakeSVG)
Skip.prototype.format = function (x, y, width) {
Path(x, y).right(width).addTo(this)
return this
}
var root
if (typeof define === "function" && define.amd) {
// AMD. Register as an anonymous module.
root = {}
define([], function () {
return root
})
} else if (typeof exports === "object") {
// CommonJS for node
root = exports
} else {
// Browser globals (root is window.railroad)
this.railroad = {}
root = this.railroad
}
var temp = [
Diagram,
ComplexDiagram,
Sequence,
Stack,
Choice,
Optional,
OneOrMore,
ZeroOrMore,
Terminal,
NonTerminal,
Comment,
Skip
]
/*
These are the names that the internal classes are exported as.
If you would like different names, adjust them here.
*/
;[
"Diagram",
"ComplexDiagram",
"Sequence",
"Stack",
"Choice",
"Optional",
"OneOrMore",
"ZeroOrMore",
"Terminal",
"NonTerminal",
"Comment",
"Skip"
].forEach(function (e, i) {
root[e] = temp[i]
})
}.call(this, {
VERTICAL_SEPARATION: 8,
ARC_RADIUS: 10,
DIAGRAM_CLASS: "railroad-diagram",
STROKE_ODD_PIXEL_LENGTH: true,
INTERNAL_ALIGNMENT: "center"
}))
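The `Path` helper above builds SVG path data by appending command strings to a `d` attribute through a chained, fluent API. A minimal standalone sketch of that technique (`MiniPath` is a hypothetical name, not part of the library; the real `Path` also renders arcs and integrates with `FakeSVG`):

```javascript
// Minimal sketch of the chained SVG path-string builder used by Path().
// Each call appends one relative SVG path command and returns `this`
// so calls can be chained.
class MiniPath {
  constructor(x, y) {
    // "M x y": absolute move-to, the start of every path
    this.d = "M" + x + " " + y
  }
  h(val) { this.d += "h" + val; return this } // relative horizontal line
  right(val) { return this.h(val) }
  left(val) { return this.h(-val) }
  v(val) { this.d += "v" + val; return this } // relative vertical line
  down(val) { return this.v(val) }
  up(val) { return this.v(-val) }
}

// Commands accumulate left to right into the "d" attribute:
console.log(new MiniPath(3, 4).right(10).up(2).d) // "M3 4h10v-2"
```

The directional aliases (`right`/`left`, `down`/`up`) mirror the library's choice to express every segment as a positive distance in a named direction, which keeps the diagram-layout code readable.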

17824
_node_modules/chevrotain/lib/chevrotain.js generated Normal file

File diff suppressed because it is too large

2
_node_modules/chevrotain/lib/chevrotain.min.js generated vendored Normal file

File diff suppressed because one or more lines are too long

76
_node_modules/chevrotain/lib/src/api.js generated Normal file

@@ -0,0 +1,76 @@
"use strict";
/* istanbul ignore file - tricky to import some things from this module during testing */
Object.defineProperty(exports, "__esModule", { value: true });
exports.Parser = exports.createSyntaxDiagramsCode = exports.clearCache = exports.generateCstDts = exports.GAstVisitor = exports.serializeProduction = exports.serializeGrammar = exports.Terminal = exports.Rule = exports.RepetitionWithSeparator = exports.RepetitionMandatoryWithSeparator = exports.RepetitionMandatory = exports.Repetition = exports.Option = exports.NonTerminal = exports.Alternative = exports.Alternation = exports.defaultLexerErrorProvider = exports.NoViableAltException = exports.NotAllInputParsedException = exports.MismatchedTokenException = exports.isRecognitionException = exports.EarlyExitException = exports.defaultParserErrorProvider = exports.LLkLookaheadStrategy = exports.getLookaheadPaths = exports.tokenName = exports.tokenMatcher = exports.tokenLabel = exports.EOF = exports.createTokenInstance = exports.createToken = exports.LexerDefinitionErrorType = exports.Lexer = exports.EMPTY_ALT = exports.ParserDefinitionErrorType = exports.EmbeddedActionsParser = exports.CstParser = exports.VERSION = void 0;
// semantic version
var version_1 = require("./version");
Object.defineProperty(exports, "VERSION", { enumerable: true, get: function () { return version_1.VERSION; } });
var parser_1 = require("./parse/parser/parser");
Object.defineProperty(exports, "CstParser", { enumerable: true, get: function () { return parser_1.CstParser; } });
Object.defineProperty(exports, "EmbeddedActionsParser", { enumerable: true, get: function () { return parser_1.EmbeddedActionsParser; } });
Object.defineProperty(exports, "ParserDefinitionErrorType", { enumerable: true, get: function () { return parser_1.ParserDefinitionErrorType; } });
Object.defineProperty(exports, "EMPTY_ALT", { enumerable: true, get: function () { return parser_1.EMPTY_ALT; } });
var lexer_public_1 = require("./scan/lexer_public");
Object.defineProperty(exports, "Lexer", { enumerable: true, get: function () { return lexer_public_1.Lexer; } });
Object.defineProperty(exports, "LexerDefinitionErrorType", { enumerable: true, get: function () { return lexer_public_1.LexerDefinitionErrorType; } });
// Tokens utilities
var tokens_public_1 = require("./scan/tokens_public");
Object.defineProperty(exports, "createToken", { enumerable: true, get: function () { return tokens_public_1.createToken; } });
Object.defineProperty(exports, "createTokenInstance", { enumerable: true, get: function () { return tokens_public_1.createTokenInstance; } });
Object.defineProperty(exports, "EOF", { enumerable: true, get: function () { return tokens_public_1.EOF; } });
Object.defineProperty(exports, "tokenLabel", { enumerable: true, get: function () { return tokens_public_1.tokenLabel; } });
Object.defineProperty(exports, "tokenMatcher", { enumerable: true, get: function () { return tokens_public_1.tokenMatcher; } });
Object.defineProperty(exports, "tokenName", { enumerable: true, get: function () { return tokens_public_1.tokenName; } });
// Lookahead
var lookahead_1 = require("./parse/grammar/lookahead");
Object.defineProperty(exports, "getLookaheadPaths", { enumerable: true, get: function () { return lookahead_1.getLookaheadPaths; } });
var llk_lookahead_1 = require("./parse/grammar/llk_lookahead");
Object.defineProperty(exports, "LLkLookaheadStrategy", { enumerable: true, get: function () { return llk_lookahead_1.LLkLookaheadStrategy; } });
// Other Utilities
var errors_public_1 = require("./parse/errors_public");
Object.defineProperty(exports, "defaultParserErrorProvider", { enumerable: true, get: function () { return errors_public_1.defaultParserErrorProvider; } });
var exceptions_public_1 = require("./parse/exceptions_public");
Object.defineProperty(exports, "EarlyExitException", { enumerable: true, get: function () { return exceptions_public_1.EarlyExitException; } });
Object.defineProperty(exports, "isRecognitionException", { enumerable: true, get: function () { return exceptions_public_1.isRecognitionException; } });
Object.defineProperty(exports, "MismatchedTokenException", { enumerable: true, get: function () { return exceptions_public_1.MismatchedTokenException; } });
Object.defineProperty(exports, "NotAllInputParsedException", { enumerable: true, get: function () { return exceptions_public_1.NotAllInputParsedException; } });
Object.defineProperty(exports, "NoViableAltException", { enumerable: true, get: function () { return exceptions_public_1.NoViableAltException; } });
var lexer_errors_public_1 = require("./scan/lexer_errors_public");
Object.defineProperty(exports, "defaultLexerErrorProvider", { enumerable: true, get: function () { return lexer_errors_public_1.defaultLexerErrorProvider; } });
// grammar reflection API
var gast_1 = require("@chevrotain/gast");
Object.defineProperty(exports, "Alternation", { enumerable: true, get: function () { return gast_1.Alternation; } });
Object.defineProperty(exports, "Alternative", { enumerable: true, get: function () { return gast_1.Alternative; } });
Object.defineProperty(exports, "NonTerminal", { enumerable: true, get: function () { return gast_1.NonTerminal; } });
Object.defineProperty(exports, "Option", { enumerable: true, get: function () { return gast_1.Option; } });
Object.defineProperty(exports, "Repetition", { enumerable: true, get: function () { return gast_1.Repetition; } });
Object.defineProperty(exports, "RepetitionMandatory", { enumerable: true, get: function () { return gast_1.RepetitionMandatory; } });
Object.defineProperty(exports, "RepetitionMandatoryWithSeparator", { enumerable: true, get: function () { return gast_1.RepetitionMandatoryWithSeparator; } });
Object.defineProperty(exports, "RepetitionWithSeparator", { enumerable: true, get: function () { return gast_1.RepetitionWithSeparator; } });
Object.defineProperty(exports, "Rule", { enumerable: true, get: function () { return gast_1.Rule; } });
Object.defineProperty(exports, "Terminal", { enumerable: true, get: function () { return gast_1.Terminal; } });
// GAST Utilities
var gast_2 = require("@chevrotain/gast");
Object.defineProperty(exports, "serializeGrammar", { enumerable: true, get: function () { return gast_2.serializeGrammar; } });
Object.defineProperty(exports, "serializeProduction", { enumerable: true, get: function () { return gast_2.serializeProduction; } });
Object.defineProperty(exports, "GAstVisitor", { enumerable: true, get: function () { return gast_2.GAstVisitor; } });
var cst_dts_gen_1 = require("@chevrotain/cst-dts-gen");
Object.defineProperty(exports, "generateCstDts", { enumerable: true, get: function () { return cst_dts_gen_1.generateCstDts; } });
/* istanbul ignore next */
function clearCache() {
console.warn("The clearCache function was 'soft' removed from the Chevrotain API." +
"\n\t It performs no action other than printing this message." +
"\n\t Please avoid using it as it will be completely removed in the future");
}
exports.clearCache = clearCache;
var render_public_1 = require("./diagrams/render_public");
Object.defineProperty(exports, "createSyntaxDiagramsCode", { enumerable: true, get: function () { return render_public_1.createSyntaxDiagramsCode; } });
var Parser = /** @class */ (function () {
function Parser() {
throw new Error("The Parser class has been deprecated, use CstParser or EmbeddedActionsParser instead.\t\n" +
"See: https://chevrotain.io/docs/changes/BREAKING_CHANGES.html#_7-0-0");
}
return Parser;
}());
exports.Parser = Parser;
//# sourceMappingURL=api.js.map
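api.js above re-exports every symbol through `Object.defineProperty` with a getter rather than plain assignment. A small sketch of why (with `sourceModule` and `reExports` as hypothetical stand-ins for a required module and the `exports` object): the getter keeps the binding live, so a later change in the source module is still observed through the re-export.

```javascript
// Sketch of the live re-export pattern TypeScript emits above.
var sourceModule = { VERSION: "1.0.0" }
var reExports = {}

// Plain assignment would copy the value once; a getter re-reads it
// from sourceModule on every access.
Object.defineProperty(reExports, "VERSION", {
  enumerable: true,
  get: function () { return sourceModule.VERSION }
})

sourceModule.VERSION = "2.0.0" // mutated after the re-export was set up
console.log(reExports.VERSION) // "2.0.0" -- the update is visible
```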


@@ -0,0 +1 @@
{"version":3,"file":"api.js","sourceRoot":"","sources":["../../src/api.ts"],"names":[],"mappings":";AAAA,yFAAyF;;;AAEzF,mBAAmB;AACnB,qCAAmC;AAA1B,kGAAA,OAAO,OAAA;AAEhB,gDAK8B;AAJ5B,mGAAA,SAAS,OAAA;AACT,+GAAA,qBAAqB,OAAA;AACrB,mHAAA,yBAAyB,OAAA;AACzB,mGAAA,SAAS,OAAA;AAGX,oDAAqE;AAA5D,qGAAA,KAAK,OAAA;AAAE,wHAAA,wBAAwB,OAAA;AAExC,mBAAmB;AACnB,sDAO6B;AAN3B,4GAAA,WAAW,OAAA;AACX,oHAAA,mBAAmB,OAAA;AACnB,oGAAA,GAAG,OAAA;AACH,2GAAA,UAAU,OAAA;AACV,6GAAA,YAAY,OAAA;AACZ,0GAAA,SAAS,OAAA;AAGX,YAAY;AAEZ,uDAA6D;AAApD,8GAAA,iBAAiB,OAAA;AAE1B,+DAAoE;AAA3D,qHAAA,oBAAoB,OAAA;AAE7B,kBAAkB;AAElB,uDAAkE;AAAzD,2HAAA,0BAA0B,OAAA;AAEnC,+DAMkC;AALhC,uHAAA,kBAAkB,OAAA;AAClB,2HAAA,sBAAsB,OAAA;AACtB,6HAAA,wBAAwB,OAAA;AACxB,+HAAA,0BAA0B,OAAA;AAC1B,yHAAA,oBAAoB,OAAA;AAGtB,kEAAsE;AAA7D,gIAAA,yBAAyB,OAAA;AAElC,yBAAyB;AACzB,yCAWyB;AAVvB,mGAAA,WAAW,OAAA;AACX,mGAAA,WAAW,OAAA;AACX,mGAAA,WAAW,OAAA;AACX,8FAAA,MAAM,OAAA;AACN,kGAAA,UAAU,OAAA;AACV,2GAAA,mBAAmB,OAAA;AACnB,wHAAA,gCAAgC,OAAA;AAChC,+GAAA,uBAAuB,OAAA;AACvB,4FAAA,IAAI,OAAA;AACJ,gGAAA,QAAQ,OAAA;AAGV,iBAAiB;AAEjB,yCAIyB;AAHvB,wGAAA,gBAAgB,OAAA;AAChB,2GAAA,mBAAmB,OAAA;AACnB,mGAAA,WAAW,OAAA;AAGb,uDAAwD;AAA/C,6GAAA,cAAc,OAAA;AAEvB,0BAA0B;AAC1B,SAAgB,UAAU;IACxB,OAAO,CAAC,IAAI,CACV,qEAAqE;QACnE,8DAA8D;QAC9D,2EAA2E,CAC9E,CAAA;AACH,CAAC;AAND,gCAMC;AAED,0DAAmE;AAA1D,yHAAA,wBAAwB,OAAA;AAEjC;IACE;QACE,MAAM,IAAI,KAAK,CACb,2FAA2F;YACzF,sEAAsE,CACzE,CAAA;IACH,CAAC;IACH,aAAC;AAAD,CAAC,AAPD,IAOC;AAPY,wBAAM"}


@@ -0,0 +1,16 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.createSyntaxDiagramsCode = void 0;
var version_1 = require("../version");
function createSyntaxDiagramsCode(grammar, _a) {
var _b = _a === void 0 ? {} : _a, _c = _b.resourceBase, resourceBase = _c === void 0 ? "https://unpkg.com/chevrotain@".concat(version_1.VERSION, "/diagrams/") : _c, _d = _b.css, css = _d === void 0 ? "https://unpkg.com/chevrotain@".concat(version_1.VERSION, "/diagrams/diagrams.css") : _d;
var header = "\n<!-- This is a generated file -->\n<!DOCTYPE html>\n<meta charset=\"utf-8\">\n<style>\n body {\n background-color: hsl(30, 20%, 95%)\n }\n</style>\n\n";
var cssHtml = "\n<link rel='stylesheet' href='".concat(css, "'>\n");
var scripts = "\n<script src='".concat(resourceBase, "vendor/railroad-diagrams.js'></script>\n<script src='").concat(resourceBase, "src/diagrams_builder.js'></script>\n<script src='").concat(resourceBase, "src/diagrams_behavior.js'></script>\n<script src='").concat(resourceBase, "src/main.js'></script>\n");
var diagramsDiv = "\n<div id=\"diagrams\" align=\"center\"></div> \n";
var serializedGrammar = "\n<script>\n window.serializedGrammar = ".concat(JSON.stringify(grammar, null, " "), ";\n</script>\n");
var initLogic = "\n<script>\n var diagramsDiv = document.getElementById(\"diagrams\");\n main.drawDiagramsFromSerializedGrammar(serializedGrammar, diagramsDiv);\n</script>\n";
return (header + cssHtml + scripts + diagramsDiv + serializedGrammar + initLogic);
}
exports.createSyntaxDiagramsCode = createSyntaxDiagramsCode;
//# sourceMappingURL=render_public.js.map
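The compiled `var _b = _a === void 0 ? {} : _a, …` dance at the top of `createSyntaxDiagramsCode` is what TypeScript emits for a destructured options parameter with per-property defaults. Roughly, the original source corresponds to a sketch like this (names and the `example.invalid` URLs are illustrative; the real defaults are unpkg.com URLs built from the chevrotain `VERSION`):

```javascript
// Illustrative reconstruction of the destructured-options pattern whose
// down-leveled form appears in render_public.js above.
function makeResourceUrls({
  resourceBase = "https://example.invalid/diagrams/",
  css = "https://example.invalid/diagrams/diagrams.css"
} = {}) {
  // The `= {}` default lets callers omit the options object entirely.
  return { resourceBase: resourceBase, css: css }
}

console.log(makeResourceUrls().css) // "https://example.invalid/diagrams/diagrams.css"
console.log(makeResourceUrls({ css: "/local.css" }).css) // "/local.css"
```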


@@ -0,0 +1 @@
{"version":3,"file":"render_public.js","sourceRoot":"","sources":["../../../src/diagrams/render_public.ts"],"names":[],"mappings":";;;AAAA,sCAAoC;AAGpC,SAAgB,wBAAwB,CACtC,OAA0B,EAC1B,EAMM;QANN,qBAMI,EAAE,KAAA,EALJ,oBAAkE,EAAlE,YAAY,mBAAG,uCAAgC,iBAAO,eAAY,KAAA,EAClE,WAAqE,EAArE,GAAG,mBAAG,uCAAgC,iBAAO,2BAAwB,KAAA;IAMvE,IAAM,MAAM,GAAG,+JAUhB,CAAA;IACC,IAAM,OAAO,GAAG,yCACa,GAAG,SACjC,CAAA;IAEC,IAAM,OAAO,GAAG,yBACH,YAAY,kEACZ,YAAY,8DACZ,YAAY,+DACZ,YAAY,6BAC1B,CAAA;IACC,IAAM,WAAW,GAAG,sDAErB,CAAA;IACC,IAAM,iBAAiB,GAAG,qDAEK,IAAI,CAAC,SAAS,CAAC,OAAO,EAAE,IAAI,EAAE,IAAI,CAAC,mBAEnE,CAAA;IAEC,IAAM,SAAS,GAAG,oKAKnB,CAAA;IACC,OAAO,CACL,MAAM,GAAG,OAAO,GAAG,OAAO,GAAG,WAAW,GAAG,iBAAiB,GAAG,SAAS,CACzE,CAAA;AACH,CAAC;AAjDD,4DAiDC"}


@@ -0,0 +1,14 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.defineNameProp = void 0;
var NAME = "name";
function defineNameProp(obj, nameValue) {
Object.defineProperty(obj, NAME, {
enumerable: false,
configurable: true,
writable: false,
value: nameValue
});
}
exports.defineNameProp = defineNameProp;
//# sourceMappingURL=lang_extensions.js.map
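A function's `name` property is non-writable but configurable, which is why lang_extensions.js goes through `Object.defineProperty` instead of assigning to it. A short usage sketch (the rule name is a made-up example):

```javascript
// Usage sketch for the defineNameProp helper above. Plain assignment to
// fn.name has no effect (the property is non-writable), but redefining
// it works because the property is configurable.
function defineNameProp(obj, nameValue) {
  Object.defineProperty(obj, "name", {
    enumerable: false,
    configurable: true,
    writable: false,
    value: nameValue
  })
}

function myRule() {}
defineNameProp(myRule, "parenthesisExpression")
console.log(myRule.name) // "parenthesisExpression"
```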


@@ -0,0 +1 @@
{"version":3,"file":"lang_extensions.js","sourceRoot":"","sources":["../../../src/lang/lang_extensions.ts"],"names":[],"mappings":";;;AAAA,IAAM,IAAI,GAAG,MAAM,CAAA;AAEnB,SAAgB,cAAc,CAAC,GAAO,EAAE,SAAiB;IACvD,MAAM,CAAC,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE;QAC/B,UAAU,EAAE,KAAK;QACjB,YAAY,EAAE,IAAI;QAClB,QAAQ,EAAE,KAAK;QACf,KAAK,EAAE,SAAS;KACjB,CAAC,CAAA;AACJ,CAAC;AAPD,wCAOC"}


@@ -0,0 +1,6 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.IN = void 0;
// TODO: can this be removed? where is it used?
exports.IN = "_~IN~_";
//# sourceMappingURL=constants.js.map


@@ -0,0 +1 @@
{"version":3,"file":"constants.js","sourceRoot":"","sources":["../../../src/parse/constants.ts"],"names":[],"mappings":";;;AAAA,+CAA+C;AAClC,QAAA,EAAE,GAAG,QAAQ,CAAA"}


@@ -0,0 +1,78 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.addNoneTerminalToCst = exports.addTerminalToCst = exports.setNodeLocationFull = exports.setNodeLocationOnlyOffset = void 0;
/**
* This nodeLocation tracking is not efficient and should only be used
* when error recovery is enabled or the Token Vector contains virtual Tokens
* (e.g., Python Indent/Outdent)
* As it executes the calculation for every single terminal/nonTerminal
* and does not rely on the fact that the token vector is **sorted**
*/
function setNodeLocationOnlyOffset(currNodeLocation, newLocationInfo) {
// First (valid) update for this cst node
if (isNaN(currNodeLocation.startOffset) === true) {
// assumption1: Token location information is either NaN or a valid number
// assumption2: Token location information is fully valid if it exists
// (both start/end offsets exist and are numbers).
currNodeLocation.startOffset = newLocationInfo.startOffset;
currNodeLocation.endOffset = newLocationInfo.endOffset;
}
// Once the startOffset has been updated with a valid number it should never receive
// any further updates as the Token vector is sorted.
// We still have to check this condition for every new possible location info
// because with error recovery enabled we may encounter invalid tokens (NaN location props)
else if (currNodeLocation.endOffset < newLocationInfo.endOffset === true) {
currNodeLocation.endOffset = newLocationInfo.endOffset;
}
}
exports.setNodeLocationOnlyOffset = setNodeLocationOnlyOffset;
/**
* This nodeLocation tracking is not efficient and should only be used
* when error recovery is enabled or the Token Vector contains virtual Tokens
* (e.g., Python Indent/Outdent)
* As it executes the calculation for every single terminal/nonTerminal
* and does not rely on the fact that the token vector is **sorted**
*/
function setNodeLocationFull(currNodeLocation, newLocationInfo) {
// First (valid) update for this cst node
if (isNaN(currNodeLocation.startOffset) === true) {
// assumption1: Token location information is either NaN or a valid number
// assumption2: Token location information is fully valid if it exists
// (all start/end props exist and are numbers).
currNodeLocation.startOffset = newLocationInfo.startOffset;
currNodeLocation.startColumn = newLocationInfo.startColumn;
currNodeLocation.startLine = newLocationInfo.startLine;
currNodeLocation.endOffset = newLocationInfo.endOffset;
currNodeLocation.endColumn = newLocationInfo.endColumn;
currNodeLocation.endLine = newLocationInfo.endLine;
}
// Once the start props have been updated with a valid number they should never receive
// any further updates as the Token vector is sorted.
// We still have to check this condition for every new possible location info
// because with error recovery enabled we may encounter invalid tokens (NaN location props)
else if (currNodeLocation.endOffset < newLocationInfo.endOffset === true) {
currNodeLocation.endOffset = newLocationInfo.endOffset;
currNodeLocation.endColumn = newLocationInfo.endColumn;
currNodeLocation.endLine = newLocationInfo.endLine;
}
}
exports.setNodeLocationFull = setNodeLocationFull;
function addTerminalToCst(node, token, tokenTypeName) {
if (node.children[tokenTypeName] === undefined) {
node.children[tokenTypeName] = [token];
}
else {
node.children[tokenTypeName].push(token);
}
}
exports.addTerminalToCst = addTerminalToCst;
function addNoneTerminalToCst(node, ruleName, ruleResult) {
if (node.children[ruleName] === undefined) {
node.children[ruleName] = [ruleResult];
}
else {
node.children[ruleName].push(ruleResult);
}
}
exports.addNoneTerminalToCst = addNoneTerminalToCst;
//# sourceMappingURL=cst.js.map
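The offset-tracking code above relies on two invariants: a token's location props are either all NaN or all valid numbers, and the token vector is sorted by offset. A standalone sketch (this `setOffsets` helper is hypothetical and only mirrors `setNodeLocationOnlyOffset`'s behavior; it is not chevrotain's export) shows both branches, including how a recovered token with NaN offsets is silently ignored:

```javascript
// Standalone sketch mirroring setNodeLocationOnlyOffset (hypothetical helper).
function setOffsets(nodeLoc, tokenLoc) {
  if (isNaN(nodeLoc.startOffset)) {
    // first valid token for this node: initialize both offsets
    nodeLoc.startOffset = tokenLoc.startOffset;
    nodeLoc.endOffset = tokenLoc.endOffset;
  } else if (nodeLoc.endOffset < tokenLoc.endOffset) {
    // later token in a sorted vector: only the end offset can grow
    nodeLoc.endOffset = tokenLoc.endOffset;
  }
}

const loc = { startOffset: NaN, endOffset: NaN };
setOffsets(loc, { startOffset: 5, endOffset: 9 });     // first valid update
setOffsets(loc, { startOffset: NaN, endOffset: NaN }); // invalid (recovered) token: NaN < x is false, so ignored
setOffsets(loc, { startOffset: 12, endOffset: 20 });   // extends endOffset only
console.log(loc); // { startOffset: 5, endOffset: 20 }
```

Note that the NaN comparison does double duty: it rejects invalid tokens without any explicit isNaN check in the second branch.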


@@ -0,0 +1 @@
{"version":3,"file":"cst.js","sourceRoot":"","sources":["../../../../src/parse/cst/cst.ts"],"names":[],"mappings":";;;AAEA;;;;;;GAMG;AACH,SAAgB,yBAAyB,CACvC,gBAAiC,EACjC,eAAoE;IAEpE,yCAAyC;IACzC,IAAI,KAAK,CAAC,gBAAgB,CAAC,WAAW,CAAC,KAAK,IAAI,EAAE;QAChD,0EAA0E;QAC1E,qEAAqE;QACrE,kDAAkD;QAClD,gBAAgB,CAAC,WAAW,GAAG,eAAe,CAAC,WAAW,CAAA;QAC1D,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;KACvD;IACD,oFAAoF;IACpF,qDAAqD;IACrD,kFAAkF;IAClF,2FAA2F;SACtF,IAAI,gBAAgB,CAAC,SAAU,GAAG,eAAe,CAAC,SAAS,KAAK,IAAI,EAAE;QACzE,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;KACvD;AACH,CAAC;AAnBD,8DAmBC;AAED;;;;;;GAMG;AACH,SAAgB,mBAAmB,CACjC,gBAAiC,EACjC,eAAgC;IAEhC,yCAAyC;IACzC,IAAI,KAAK,CAAC,gBAAgB,CAAC,WAAW,CAAC,KAAK,IAAI,EAAE;QAChD,0EAA0E;QAC1E,qEAAqE;QACrE,+CAA+C;QAC/C,gBAAgB,CAAC,WAAW,GAAG,eAAe,CAAC,WAAW,CAAA;QAC1D,gBAAgB,CAAC,WAAW,GAAG,eAAe,CAAC,WAAW,CAAA;QAC1D,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;QACtD,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;QACtD,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;QACtD,gBAAgB,CAAC,OAAO,GAAG,eAAe,CAAC,OAAO,CAAA;KACnD;IACD,oFAAoF;IACpF,qDAAqD;IACrD,kFAAkF;IAClF,2FAA2F;SACtF,IAAI,gBAAgB,CAAC,SAAU,GAAG,eAAe,CAAC,SAAU,KAAK,IAAI,EAAE;QAC1E,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;QACtD,gBAAgB,CAAC,SAAS,GAAG,eAAe,CAAC,SAAS,CAAA;QACtD,gBAAgB,CAAC,OAAO,GAAG,eAAe,CAAC,OAAO,CAAA;KACnD;AACH,CAAC;AAzBD,kDAyBC;AAED,SAAgB,gBAAgB,CAC9B,IAAa,EACb,KAAa,EACb,aAAqB;IAErB,IAAI,IAAI,CAAC,QAAQ,CAAC,aAAa,CAAC,KAAK,SAAS,EAAE;QAC9C,IAAI,CAAC,QAAQ,CAAC,aAAa,CAAC,GAAG,CAAC,KAAK,CAAC,CAAA;KACvC;SAAM;QACL,IAAI,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;KACzC;AACH,CAAC;AAVD,4CAUC;AAED,SAAgB,oBAAoB,CAClC,IAAa,EACb,QAAgB,EAChB,UAAe;IAEf,IAAI,IAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,KAAK,SAAS,EAAE;QACzC,IAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,GAAG,CAAC,UAAU,CAAC,CAAA;KACvC;SAAM;QACL,IAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAA;KACzC;AACH,CAAC;AAVD,oDAUC"}


@@ -0,0 +1,109 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.validateMissingCstMethods = exports.validateVisitor = exports.CstVisitorDefinitionError = exports.createBaseVisitorConstructorWithDefaults = exports.createBaseSemanticVisitorConstructor = exports.defaultVisit = void 0;
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var compact_1 = __importDefault(require("lodash/compact"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var filter_1 = __importDefault(require("lodash/filter"));
var keys_1 = __importDefault(require("lodash/keys"));
var isFunction_1 = __importDefault(require("lodash/isFunction"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var lang_extensions_1 = require("../../lang/lang_extensions");
function defaultVisit(ctx, param) {
var childrenNames = (0, keys_1.default)(ctx);
var childrenNamesLength = childrenNames.length;
for (var i = 0; i < childrenNamesLength; i++) {
var currChildName = childrenNames[i];
var currChildArray = ctx[currChildName];
var currChildArrayLength = currChildArray.length;
for (var j = 0; j < currChildArrayLength; j++) {
var currChild = currChildArray[j];
            // distinction between Token children and CstNode children
if (currChild.tokenTypeIdx === undefined) {
this[currChild.name](currChild.children, param);
}
}
}
// defaultVisit does not support generic out param
}
exports.defaultVisit = defaultVisit;
function createBaseSemanticVisitorConstructor(grammarName, ruleNames) {
var derivedConstructor = function () { };
// can be overwritten according to:
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/
// name?redirectlocale=en-US&redirectslug=JavaScript%2FReference%2FGlobal_Objects%2FFunction%2Fname
(0, lang_extensions_1.defineNameProp)(derivedConstructor, grammarName + "BaseSemantics");
var semanticProto = {
visit: function (cstNode, param) {
// enables writing more concise visitor methods when CstNode has only a single child
if ((0, isArray_1.default)(cstNode)) {
// A CST Node's children dictionary can never have empty arrays as values
// If a key is defined there will be at least one element in the corresponding value array.
cstNode = cstNode[0];
}
// enables passing optional CstNodes concisely.
if ((0, isUndefined_1.default)(cstNode)) {
return undefined;
}
return this[cstNode.name](cstNode.children, param);
},
validateVisitor: function () {
var semanticDefinitionErrors = validateVisitor(this, ruleNames);
if (!(0, isEmpty_1.default)(semanticDefinitionErrors)) {
var errorMessages = (0, map_1.default)(semanticDefinitionErrors, function (currDefError) { return currDefError.msg; });
throw Error("Errors Detected in CST Visitor <".concat(this.constructor.name, ">:\n\t") +
"".concat(errorMessages.join("\n\n").replace(/\n/g, "\n\t")));
}
}
};
derivedConstructor.prototype = semanticProto;
derivedConstructor.prototype.constructor = derivedConstructor;
derivedConstructor._RULE_NAMES = ruleNames;
return derivedConstructor;
}
exports.createBaseSemanticVisitorConstructor = createBaseSemanticVisitorConstructor;
function createBaseVisitorConstructorWithDefaults(grammarName, ruleNames, baseConstructor) {
var derivedConstructor = function () { };
// can be overwritten according to:
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/
// name?redirectlocale=en-US&redirectslug=JavaScript%2FReference%2FGlobal_Objects%2FFunction%2Fname
(0, lang_extensions_1.defineNameProp)(derivedConstructor, grammarName + "BaseSemanticsWithDefaults");
var withDefaultsProto = Object.create(baseConstructor.prototype);
(0, forEach_1.default)(ruleNames, function (ruleName) {
withDefaultsProto[ruleName] = defaultVisit;
});
derivedConstructor.prototype = withDefaultsProto;
derivedConstructor.prototype.constructor = derivedConstructor;
return derivedConstructor;
}
exports.createBaseVisitorConstructorWithDefaults = createBaseVisitorConstructorWithDefaults;
var CstVisitorDefinitionError;
(function (CstVisitorDefinitionError) {
CstVisitorDefinitionError[CstVisitorDefinitionError["REDUNDANT_METHOD"] = 0] = "REDUNDANT_METHOD";
CstVisitorDefinitionError[CstVisitorDefinitionError["MISSING_METHOD"] = 1] = "MISSING_METHOD";
})(CstVisitorDefinitionError = exports.CstVisitorDefinitionError || (exports.CstVisitorDefinitionError = {}));
function validateVisitor(visitorInstance, ruleNames) {
var missingErrors = validateMissingCstMethods(visitorInstance, ruleNames);
return missingErrors;
}
exports.validateVisitor = validateVisitor;
function validateMissingCstMethods(visitorInstance, ruleNames) {
var missingRuleNames = (0, filter_1.default)(ruleNames, function (currRuleName) {
return (0, isFunction_1.default)(visitorInstance[currRuleName]) === false;
});
var errors = (0, map_1.default)(missingRuleNames, function (currRuleName) {
return {
msg: "Missing visitor method: <".concat(currRuleName, "> on ").concat((visitorInstance.constructor.name), " CST Visitor."),
type: CstVisitorDefinitionError.MISSING_METHOD,
methodName: currRuleName
};
});
return (0, compact_1.default)(errors);
}
exports.validateMissingCstMethods = validateMissingCstMethods;
//# sourceMappingURL=cst_visitor.js.map
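The dispatch inside `defaultVisit` can be seen in isolation: `children` is a dictionary of arrays, entries carrying a `tokenTypeIdx` are terminals (Tokens) and are skipped, while CstNode entries are dispatched via `this[child.name]`. A minimal sketch with hand-built nodes (not produced by a real chevrotain parser; `walk` is a hypothetical stand-in for `defaultVisit`):

```javascript
// Minimal hand-built CST fragment; tokenTypeIdx marks terminal (Token) children.
const cst = {
  name: "statement",
  children: {
    Semicolon: [{ image: ";", tokenTypeIdx: 1 }],       // terminal: skipped
    expression: [{ name: "expression", children: {} }]  // non-terminal: dispatched
  }
};

const visited = [];
const visitor = {
  // defaultVisit-style walk: iterate all children, recurse only into CstNodes
  walk(ctx) {
    for (const key of Object.keys(ctx)) {
      for (const child of ctx[key]) {
        if (child.tokenTypeIdx === undefined) {
          this[child.name](child.children);
        }
      }
    }
  },
  expression(ctx) {
    visited.push("expression");
    this.walk(ctx);
  }
};

visitor.walk(cst.children);
console.log(visited); // [ 'expression' ]
```

This is why generated base visitors only need one method per grammar rule: terminals never trigger a dispatch.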


@@ -0,0 +1 @@
{"version":3,"file":"cst_visitor.js","sourceRoot":"","sources":["../../../../src/parse/cst/cst_visitor.ts"],"names":[],"mappings":";;;;;;AAAA,2DAAoC;AACpC,2DAAoC;AACpC,2DAAoC;AACpC,mDAA4B;AAC5B,2DAAoC;AACpC,yDAAkC;AAClC,qDAA8B;AAC9B,iEAA0C;AAC1C,mEAA4C;AAC5C,8DAA2D;AAG3D,SAAgB,YAAY,CAAK,GAAQ,EAAE,KAAS;IAClD,IAAM,aAAa,GAAG,IAAA,cAAI,EAAC,GAAG,CAAC,CAAA;IAC/B,IAAM,mBAAmB,GAAG,aAAa,CAAC,MAAM,CAAA;IAChD,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,mBAAmB,EAAE,CAAC,EAAE,EAAE;QAC5C,IAAM,aAAa,GAAG,aAAa,CAAC,CAAC,CAAC,CAAA;QACtC,IAAM,cAAc,GAAG,GAAG,CAAC,aAAa,CAAC,CAAA;QACzC,IAAM,oBAAoB,GAAG,cAAc,CAAC,MAAM,CAAA;QAClD,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,oBAAoB,EAAE,CAAC,EAAE,EAAE;YAC7C,IAAM,SAAS,GAAQ,cAAc,CAAC,CAAC,CAAC,CAAA;YACxC,2DAA2D;YAC3D,IAAI,SAAS,CAAC,YAAY,KAAK,SAAS,EAAE;gBACxC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,CAAC,SAAS,CAAC,QAAQ,EAAE,KAAK,CAAC,CAAA;aAChD;SACF;KACF;IACD,kDAAkD;AACpD,CAAC;AAhBD,oCAgBC;AAED,SAAgB,oCAAoC,CAClD,WAAmB,EACnB,SAAmB;IAInB,IAAM,kBAAkB,GAAQ,cAAa,CAAC,CAAA;IAE9C,mCAAmC;IACnC,6FAA6F;IAC7F,mGAAmG;IACnG,IAAA,gCAAc,EAAC,kBAAkB,EAAE,WAAW,GAAG,eAAe,CAAC,CAAA;IAEjE,IAAM,aAAa,GAAG;QACpB,KAAK,EAAE,UAAU,OAA4B,EAAE,KAAU;YACvD,oFAAoF;YACpF,IAAI,IAAA,iBAAO,EAAC,OAAO,CAAC,EAAE;gBACpB,yEAAyE;gBACzE,2FAA2F;gBAC3F,OAAO,GAAG,OAAO,CAAC,CAAC,CAAC,CAAA;aACrB;YAED,+CAA+C;YAC/C,IAAI,IAAA,qBAAW,EAAC,OAAO,CAAC,EAAE;gBACxB,OAAO,SAAS,CAAA;aACjB;YAED,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,KAAK,CAAC,CAAA;QACpD,CAAC;QAED,eAAe,EAAE;YACf,IAAM,wBAAwB,GAAG,eAAe,CAAC,IAAI,EAAE,SAAS,CAAC,CAAA;YACjE,IAAI,CAAC,IAAA,iBAAO,EAAC,wBAAwB,CAAC,EAAE;gBACtC,IAAM,aAAa,GAAG,IAAA,aAAG,EACvB,wBAAwB,EACxB,UAAC,YAAY,IAAK,OAAA,YAAY,CAAC,GAAG,EAAhB,CAAgB,CACnC,CAAA;gBACD,MAAM,KAAK,CACT,0CAAmC,IAAI,CAAC,WAAW,CAAC,IAAI,WAAQ;oBAC9D,UAAG,aAAa,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,OAAO,CAAC,KAAK,EAAE,MAAM,CAAC,CAAE,CACzD,CAAA;aACF;QACH,CAAC;KACF,CAAA;IAED,kBAAkB,CAAC,SAAS,GAAG,aAAa,CAAA;IAC5C,kBAAkB,CAAC,SAAS,CAAC,WAAW,GAAG,kBAAkB,CAAA;IAE7D,kBAAkB,CAAC,WAAW,GAAG,SAAS,CAAA;IAE1C,OAAO,kBAAkB,CAAA
;AAC3B,CAAC;AAnDD,oFAmDC;AAED,SAAgB,wCAAwC,CACtD,WAAmB,EACnB,SAAmB,EACnB,eAAyB;IAIzB,IAAM,kBAAkB,GAAQ,cAAa,CAAC,CAAA;IAE9C,mCAAmC;IACnC,6FAA6F;IAC7F,mGAAmG;IACnG,IAAA,gCAAc,EAAC,kBAAkB,EAAE,WAAW,GAAG,2BAA2B,CAAC,CAAA;IAE7E,IAAM,iBAAiB,GAAG,MAAM,CAAC,MAAM,CAAC,eAAe,CAAC,SAAS,CAAC,CAAA;IAClE,IAAA,iBAAO,EAAC,SAAS,EAAE,UAAC,QAAQ;QAC1B,iBAAiB,CAAC,QAAQ,CAAC,GAAG,YAAY,CAAA;IAC5C,CAAC,CAAC,CAAA;IAEF,kBAAkB,CAAC,SAAS,GAAG,iBAAiB,CAAA;IAChD,kBAAkB,CAAC,SAAS,CAAC,WAAW,GAAG,kBAAkB,CAAA;IAE7D,OAAO,kBAAkB,CAAA;AAC3B,CAAC;AAvBD,4FAuBC;AAED,IAAY,yBAGX;AAHD,WAAY,yBAAyB;IACnC,iGAAgB,CAAA;IAChB,6FAAc,CAAA;AAChB,CAAC,EAHW,yBAAyB,GAAzB,iCAAyB,KAAzB,iCAAyB,QAGpC;AAQD,SAAgB,eAAe,CAC7B,eAA8C,EAC9C,SAAmB;IAEnB,IAAM,aAAa,GAAG,yBAAyB,CAAC,eAAe,EAAE,SAAS,CAAC,CAAA;IAE3E,OAAO,aAAa,CAAA;AACtB,CAAC;AAPD,0CAOC;AAED,SAAgB,yBAAyB,CACvC,eAA8C,EAC9C,SAAmB;IAEnB,IAAM,gBAAgB,GAAG,IAAA,gBAAM,EAAC,SAAS,EAAE,UAAC,YAAY;QACtD,OAAO,IAAA,oBAAU,EAAE,eAAuB,CAAC,YAAY,CAAC,CAAC,KAAK,KAAK,CAAA;IACrE,CAAC,CAAC,CAAA;IAEF,IAAM,MAAM,GAA8B,IAAA,aAAG,EAC3C,gBAAgB,EAChB,UAAC,YAAY;QACX,OAAO;YACL,GAAG,EAAE,mCAA4B,YAAY,kBAAa,CACxD,eAAe,CAAC,WAAW,CAAC,IAAI,CACjC,kBAAe;YAChB,IAAI,EAAE,yBAAyB,CAAC,cAAc;YAC9C,UAAU,EAAE,YAAY;SACzB,CAAA;IACH,CAAC,CACF,CAAA;IAED,OAAO,IAAA,iBAAO,EAA0B,MAAM,CAAC,CAAA;AACjD,CAAC;AAtBD,8DAsBC"}


@@ -0,0 +1,193 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.defaultGrammarValidatorErrorProvider = exports.defaultGrammarResolverErrorProvider = exports.defaultParserErrorProvider = void 0;
var tokens_public_1 = require("../scan/tokens_public");
var first_1 = __importDefault(require("lodash/first"));
var map_1 = __importDefault(require("lodash/map"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var gast_1 = require("@chevrotain/gast");
var gast_2 = require("@chevrotain/gast");
exports.defaultParserErrorProvider = {
buildMismatchTokenMessage: function (_a) {
var expected = _a.expected, actual = _a.actual, previous = _a.previous, ruleName = _a.ruleName;
var hasLabel = (0, tokens_public_1.hasTokenLabel)(expected);
var expectedMsg = hasLabel
? "--> ".concat((0, tokens_public_1.tokenLabel)(expected), " <--")
: "token of type --> ".concat(expected.name, " <--");
var msg = "Expecting ".concat(expectedMsg, " but found --> '").concat(actual.image, "' <--");
return msg;
},
buildNotAllInputParsedMessage: function (_a) {
var firstRedundant = _a.firstRedundant, ruleName = _a.ruleName;
return "Redundant input, expecting EOF but found: " + firstRedundant.image;
},
buildNoViableAltMessage: function (_a) {
var expectedPathsPerAlt = _a.expectedPathsPerAlt, actual = _a.actual, previous = _a.previous, customUserDescription = _a.customUserDescription, ruleName = _a.ruleName;
var errPrefix = "Expecting: ";
// TODO: issue: No Viable Alternative Error may have incomplete details. #502
var actualText = (0, first_1.default)(actual).image;
var errSuffix = "\nbut found: '" + actualText + "'";
if (customUserDescription) {
return errPrefix + customUserDescription + errSuffix;
}
else {
var allLookAheadPaths = (0, reduce_1.default)(expectedPathsPerAlt, function (result, currAltPaths) { return result.concat(currAltPaths); }, []);
var nextValidTokenSequences = (0, map_1.default)(allLookAheadPaths, function (currPath) {
return "[".concat((0, map_1.default)(currPath, function (currTokenType) { return (0, tokens_public_1.tokenLabel)(currTokenType); }).join(", "), "]");
});
var nextValidSequenceItems = (0, map_1.default)(nextValidTokenSequences, function (itemMsg, idx) { return " ".concat(idx + 1, ". ").concat(itemMsg); });
var calculatedDescription = "one of these possible Token sequences:\n".concat(nextValidSequenceItems.join("\n"));
return errPrefix + calculatedDescription + errSuffix;
}
},
buildEarlyExitMessage: function (_a) {
var expectedIterationPaths = _a.expectedIterationPaths, actual = _a.actual, customUserDescription = _a.customUserDescription, ruleName = _a.ruleName;
var errPrefix = "Expecting: ";
// TODO: issue: No Viable Alternative Error may have incomplete details. #502
var actualText = (0, first_1.default)(actual).image;
var errSuffix = "\nbut found: '" + actualText + "'";
if (customUserDescription) {
return errPrefix + customUserDescription + errSuffix;
}
else {
var nextValidTokenSequences = (0, map_1.default)(expectedIterationPaths, function (currPath) {
return "[".concat((0, map_1.default)(currPath, function (currTokenType) { return (0, tokens_public_1.tokenLabel)(currTokenType); }).join(","), "]");
});
var calculatedDescription = "expecting at least one iteration which starts with one of these possible Token sequences::\n " +
"<".concat(nextValidTokenSequences.join(" ,"), ">");
return errPrefix + calculatedDescription + errSuffix;
}
}
};
Object.freeze(exports.defaultParserErrorProvider);
exports.defaultGrammarResolverErrorProvider = {
buildRuleNotFoundError: function (topLevelRule, undefinedRule) {
var msg = "Invalid grammar, reference to a rule which is not defined: ->" +
undefinedRule.nonTerminalName +
"<-\n" +
"inside top level rule: ->" +
topLevelRule.name +
"<-";
return msg;
}
};
exports.defaultGrammarValidatorErrorProvider = {
buildDuplicateFoundError: function (topLevelRule, duplicateProds) {
function getExtraProductionArgument(prod) {
if (prod instanceof gast_1.Terminal) {
return prod.terminalType.name;
}
else if (prod instanceof gast_1.NonTerminal) {
return prod.nonTerminalName;
}
else {
return "";
}
}
var topLevelName = topLevelRule.name;
var duplicateProd = (0, first_1.default)(duplicateProds);
var index = duplicateProd.idx;
var dslName = (0, gast_2.getProductionDslName)(duplicateProd);
var extraArgument = getExtraProductionArgument(duplicateProd);
var hasExplicitIndex = index > 0;
var msg = "->".concat(dslName).concat(hasExplicitIndex ? index : "", "<- ").concat(extraArgument ? "with argument: ->".concat(extraArgument, "<-") : "", "\n appears more than once (").concat(duplicateProds.length, " times) in the top level rule: ->").concat(topLevelName, "<-. \n For further details see: https://chevrotain.io/docs/FAQ.html#NUMERICAL_SUFFIXES \n ");
        // whitespace trimming time! better to trim afterwards, as it allows using WELL formatted multi-line template strings...
msg = msg.replace(/[ \t]+/g, " ");
msg = msg.replace(/\s\s+/g, "\n");
return msg;
},
buildNamespaceConflictError: function (rule) {
var errMsg = "Namespace conflict found in grammar.\n" +
"The grammar has both a Terminal(Token) and a Non-Terminal(Rule) named: <".concat(rule.name, ">.\n") +
"To resolve this make sure each Terminal and Non-Terminal names are unique\n" +
"This is easy to accomplish by using the convention that Terminal names start with an uppercase letter\n" +
"and Non-Terminal names start with a lower case letter.";
return errMsg;
},
buildAlternationPrefixAmbiguityError: function (options) {
var pathMsg = (0, map_1.default)(options.prefixPath, function (currTok) {
return (0, tokens_public_1.tokenLabel)(currTok);
}).join(", ");
var occurrence = options.alternation.idx === 0 ? "" : options.alternation.idx;
var errMsg = "Ambiguous alternatives: <".concat(options.ambiguityIndices.join(" ,"), "> due to common lookahead prefix\n") +
"in <OR".concat(occurrence, "> inside <").concat(options.topLevelRule.name, "> Rule,\n") +
"<".concat(pathMsg, "> may appears as a prefix path in all these alternatives.\n") +
"See: https://chevrotain.io/docs/guide/resolving_grammar_errors.html#COMMON_PREFIX\n" +
"For Further details.";
return errMsg;
},
buildAlternationAmbiguityError: function (options) {
var pathMsg = (0, map_1.default)(options.prefixPath, function (currtok) {
return (0, tokens_public_1.tokenLabel)(currtok);
}).join(", ");
var occurrence = options.alternation.idx === 0 ? "" : options.alternation.idx;
var currMessage = "Ambiguous Alternatives Detected: <".concat(options.ambiguityIndices.join(" ,"), "> in <OR").concat(occurrence, ">") +
" inside <".concat(options.topLevelRule.name, "> Rule,\n") +
"<".concat(pathMsg, "> may appears as a prefix path in all these alternatives.\n");
currMessage =
currMessage +
"See: https://chevrotain.io/docs/guide/resolving_grammar_errors.html#AMBIGUOUS_ALTERNATIVES\n" +
"For Further details.";
return currMessage;
},
buildEmptyRepetitionError: function (options) {
var dslName = (0, gast_2.getProductionDslName)(options.repetition);
if (options.repetition.idx !== 0) {
dslName += options.repetition.idx;
}
var errMsg = "The repetition <".concat(dslName, "> within Rule <").concat(options.topLevelRule.name, "> can never consume any tokens.\n") +
"This could lead to an infinite loop.";
return errMsg;
},
// TODO: remove - `errors_public` from nyc.config.js exclude
// once this method is fully removed from this file
buildTokenNameError: function (options) {
/* istanbul ignore next */
return "deprecated";
},
buildEmptyAlternationError: function (options) {
var errMsg = "Ambiguous empty alternative: <".concat(options.emptyChoiceIdx + 1, ">") +
" in <OR".concat(options.alternation.idx, "> inside <").concat(options.topLevelRule.name, "> Rule.\n") +
"Only the last alternative may be an empty alternative.";
return errMsg;
},
buildTooManyAlternativesError: function (options) {
var errMsg = "An Alternation cannot have more than 256 alternatives:\n" +
"<OR".concat(options.alternation.idx, "> inside <").concat(options.topLevelRule.name, "> Rule.\n has ").concat(options.alternation.definition.length + 1, " alternatives.");
return errMsg;
},
buildLeftRecursionError: function (options) {
var ruleName = options.topLevelRule.name;
var pathNames = (0, map_1.default)(options.leftRecursionPath, function (currRule) { return currRule.name; });
var leftRecursivePath = "".concat(ruleName, " --> ").concat(pathNames
.concat([ruleName])
.join(" --> "));
var errMsg = "Left Recursion found in grammar.\n" +
"rule: <".concat(ruleName, "> can be invoked from itself (directly or indirectly)\n") +
"without consuming any Tokens. The grammar path that causes this is: \n ".concat(leftRecursivePath, "\n") +
" To fix this refactor your grammar to remove the left recursion.\n" +
"see: https://en.wikipedia.org/wiki/LL_parser#Left_factoring.";
return errMsg;
},
// TODO: remove - `errors_public` from nyc.config.js exclude
// once this method is fully removed from this file
buildInvalidRuleNameError: function (options) {
/* istanbul ignore next */
return "deprecated";
},
buildDuplicateRuleNameError: function (options) {
var ruleName;
if (options.topLevelRule instanceof gast_1.Rule) {
ruleName = options.topLevelRule.name;
}
else {
ruleName = options.topLevelRule;
}
var errMsg = "Duplicate definition, rule: ->".concat(ruleName, "<- is already defined in the grammar: ->").concat(options.grammarName, "<-");
return errMsg;
}
};
//# sourceMappingURL=errors_public.js.map
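The two `replace` calls at the end of `buildDuplicateFoundError` implement a small trick: the message is written as a readable, indented multi-line template string, then cleaned up after the fact. Step one collapses runs of spaces/tabs, step two turns any remaining multi-whitespace run (newline plus leftover indentation) into a single newline. A standalone sketch of the same two-step cleanup (`trimFormattedMessage` is a hypothetical name; the regexes match the ones above):

```javascript
// Sketch of the post-hoc trimming used in buildDuplicateFoundError:
// step 1 collapses runs of spaces/tabs into a single space,
// step 2 collapses the remaining newline+space runs into single newlines.
function trimFormattedMessage(msg) {
  return msg.replace(/[ \t]+/g, " ").replace(/\s\s+/g, "\n");
}

const raw = "->SUBRULE<- with argument: ->expr<-\n    appears more than once (2 times)";
console.log(trimFormattedMessage(raw));
// "->SUBRULE<- with argument: ->expr<-\nappears more than once (2 times)"
```

The order matters: collapsing horizontal whitespace first leaves a `"\n "` pair that the second pass can normalize to a bare newline.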

File diff suppressed because one or more lines are too long


@@ -0,0 +1,100 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.EarlyExitException = exports.NotAllInputParsedException = exports.NoViableAltException = exports.MismatchedTokenException = exports.isRecognitionException = void 0;
var includes_1 = __importDefault(require("lodash/includes"));
var MISMATCHED_TOKEN_EXCEPTION = "MismatchedTokenException";
var NO_VIABLE_ALT_EXCEPTION = "NoViableAltException";
var EARLY_EXIT_EXCEPTION = "EarlyExitException";
var NOT_ALL_INPUT_PARSED_EXCEPTION = "NotAllInputParsedException";
var RECOGNITION_EXCEPTION_NAMES = [
MISMATCHED_TOKEN_EXCEPTION,
NO_VIABLE_ALT_EXCEPTION,
EARLY_EXIT_EXCEPTION,
NOT_ALL_INPUT_PARSED_EXCEPTION
];
Object.freeze(RECOGNITION_EXCEPTION_NAMES);
// hack to work around the lack of support for custom Errors in javascript/typescript
function isRecognitionException(error) {
// can't do instanceof on hacked custom js exceptions
return (0, includes_1.default)(RECOGNITION_EXCEPTION_NAMES, error.name);
}
exports.isRecognitionException = isRecognitionException;
var RecognitionException = /** @class */ (function (_super) {
__extends(RecognitionException, _super);
function RecognitionException(message, token) {
var _newTarget = this.constructor;
var _this = _super.call(this, message) || this;
_this.token = token;
_this.resyncedTokens = [];
// fix prototype chain when typescript target is ES5
Object.setPrototypeOf(_this, _newTarget.prototype);
/* istanbul ignore next - V8 workaround to remove constructor from stacktrace when typescript target is ES5 */
if (Error.captureStackTrace) {
Error.captureStackTrace(_this, _this.constructor);
}
return _this;
}
return RecognitionException;
}(Error));
var MismatchedTokenException = /** @class */ (function (_super) {
__extends(MismatchedTokenException, _super);
function MismatchedTokenException(message, token, previousToken) {
var _this = _super.call(this, message, token) || this;
_this.previousToken = previousToken;
_this.name = MISMATCHED_TOKEN_EXCEPTION;
return _this;
}
return MismatchedTokenException;
}(RecognitionException));
exports.MismatchedTokenException = MismatchedTokenException;
var NoViableAltException = /** @class */ (function (_super) {
__extends(NoViableAltException, _super);
function NoViableAltException(message, token, previousToken) {
var _this = _super.call(this, message, token) || this;
_this.previousToken = previousToken;
_this.name = NO_VIABLE_ALT_EXCEPTION;
return _this;
}
return NoViableAltException;
}(RecognitionException));
exports.NoViableAltException = NoViableAltException;
var NotAllInputParsedException = /** @class */ (function (_super) {
__extends(NotAllInputParsedException, _super);
function NotAllInputParsedException(message, token) {
var _this = _super.call(this, message, token) || this;
_this.name = NOT_ALL_INPUT_PARSED_EXCEPTION;
return _this;
}
return NotAllInputParsedException;
}(RecognitionException));
exports.NotAllInputParsedException = NotAllInputParsedException;
var EarlyExitException = /** @class */ (function (_super) {
__extends(EarlyExitException, _super);
function EarlyExitException(message, token, previousToken) {
var _this = _super.call(this, message, token) || this;
_this.previousToken = previousToken;
_this.name = EARLY_EXIT_EXCEPTION;
return _this;
}
return EarlyExitException;
}(RecognitionException));
exports.EarlyExitException = EarlyExitException;
//# sourceMappingURL=exceptions_public.js.map
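Because these exception classes patch their prototype chain for ES5 targets (subclassing built-in `Error` breaks `instanceof` when transpiled), callers identify them by the `name` property, exactly as `isRecognitionException` does above. A self-contained sketch of the same pattern, using a plain ES2015 class instead of the compiled `__extends` helper (`isRecognitionError` and the class here are illustrative, not chevrotain's exports):

```javascript
const RECOGNITION_NAMES = ["MismatchedTokenException", "NoViableAltException"];

// ES2015-class sketch of the pattern used by the compiled helpers above.
class MismatchedTokenException extends Error {
  constructor(message, token) {
    super(message);
    this.token = token;
    this.name = "MismatchedTokenException"; // the name is the discriminator
    // restore the prototype chain (matters when transpiled to ES5)
    Object.setPrototypeOf(this, new.target.prototype);
  }
}

function isRecognitionError(error) {
  // name check instead of instanceof, which breaks on patched prototypes
  return RECOGNITION_NAMES.indexOf(error.name) !== -1;
}

const err = new MismatchedTokenException("Expecting --> ; <--", { image: "}" });
console.log(isRecognitionError(err));             // true
console.log(isRecognitionError(new TypeError())); // false
```

The upside of the name check is that it also works across realms and bundler duplicates, where two copies of the same class would fail an `instanceof` test.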


@@ -0,0 +1 @@
{"version":3,"file":"exceptions_public.js","sourceRoot":"","sources":["../../../src/parse/exceptions_public.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;AAAA,6DAAsC;AAOtC,IAAM,0BAA0B,GAAG,0BAA0B,CAAA;AAC7D,IAAM,uBAAuB,GAAG,sBAAsB,CAAA;AACtD,IAAM,oBAAoB,GAAG,oBAAoB,CAAA;AACjD,IAAM,8BAA8B,GAAG,4BAA4B,CAAA;AAEnE,IAAM,2BAA2B,GAAG;IAClC,0BAA0B;IAC1B,uBAAuB;IACvB,oBAAoB;IACpB,8BAA8B;CAC/B,CAAA;AAED,MAAM,CAAC,MAAM,CAAC,2BAA2B,CAAC,CAAA;AAE1C,wEAAwE;AACxE,SAAgB,sBAAsB,CAAC,KAAY;IACjD,qDAAqD;IACrD,OAAO,IAAA,kBAAQ,EAAC,2BAA2B,EAAE,KAAK,CAAC,IAAI,CAAC,CAAA;AAC1D,CAAC;AAHD,wDAGC;AAED;IACU,wCAAK;IAMb,8BAAsB,OAAe,EAAS,KAAa;;QAA3D,YACE,kBAAM,OAAO,CAAC,SASf;QAV6C,WAAK,GAAL,KAAK,CAAQ;QAF3D,oBAAc,GAAa,EAAE,CAAA;QAK3B,oDAAoD;QACpD,MAAM,CAAC,cAAc,CAAC,KAAI,EAAE,WAAW,SAAS,CAAC,CAAA;QAEjD,8GAA8G;QAC9G,IAAI,KAAK,CAAC,iBAAiB,EAAE;YAC3B,KAAK,CAAC,iBAAiB,CAAC,KAAI,EAAE,KAAI,CAAC,WAAW,CAAC,CAAA;SAChD;;IACH,CAAC;IACH,2BAAC;AAAD,CAAC,AAlBD,CACU,KAAK,GAiBd;AAED;IAA8C,4CAAoB;IAChE,kCAAY,OAAe,EAAE,KAAa,EAAS,aAAqB;QAAxE,YACE,kBAAM,OAAO,EAAE,KAAK,CAAC,SAEtB;QAHkD,mBAAa,GAAb,aAAa,CAAQ;QAEtE,KAAI,CAAC,IAAI,GAAG,0BAA0B,CAAA;;IACxC,CAAC;IACH,+BAAC;AAAD,CAAC,AALD,CAA8C,oBAAoB,GAKjE;AALY,4DAAwB;AAOrC;IAA0C,wCAAoB;IAC5D,8BAAY,OAAe,EAAE,KAAa,EAAS,aAAqB;QAAxE,YACE,kBAAM,OAAO,EAAE,KAAK,CAAC,SAEtB;QAHkD,mBAAa,GAAb,aAAa,CAAQ;QAEtE,KAAI,CAAC,IAAI,GAAG,uBAAuB,CAAA;;IACrC,CAAC;IACH,2BAAC;AAAD,CAAC,AALD,CAA0C,oBAAoB,GAK7D;AALY,oDAAoB;AAOjC;IAAgD,8CAAoB;IAClE,oCAAY,OAAe,EAAE,KAAa;QAA1C,YACE,kBAAM,OAAO,EAAE,KAAK,CAAC,SAEtB;QADC,KAAI,CAAC,IAAI,GAAG,8BAA8B,CAAA;;IAC5C,CAAC;IACH,iCAAC;AAAD,CAAC,AALD,CAAgD,oBAAoB,GAKnE;AALY,gEAA0B;AAOvC;IAAwC,sCAAoB;IAC1D,4BAAY,OAAe,EAAE,KAAa,EAAS,aAAqB;QAAxE,YACE,kBAAM,OAAO,EAAE,KAAK,CAAC,SAEtB;QAHkD,mBAAa,GAAb,aAAa,CAAQ;QAEtE,KAAI,CAAC,IAAI,GAAG,oBAAoB,CAAA;;IAClC,CAAC;IACH,yBAAC;AAAD,CAAC,AALD,CAAwC,oBAAoB,GAK3D;AALY,gDAAkB"}


@@ -0,0 +1,516 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __assign = (this && this.__assign) || function () {
__assign = Object.assign || function(t) {
for (var s, i = 1, n = arguments.length; i < n; i++) {
s = arguments[i];
for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p))
t[p] = s[p];
}
return t;
};
return __assign.apply(this, arguments);
};
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.checkPrefixAlternativesAmbiguities = exports.validateSomeNonEmptyLookaheadPath = exports.validateTooManyAlts = exports.RepetitionCollector = exports.validateAmbiguousAlternationAlternatives = exports.validateEmptyOrAlternative = exports.getFirstNoneTerminal = exports.validateNoLeftRecursion = exports.validateRuleIsOverridden = exports.validateRuleDoesNotAlreadyExist = exports.OccurrenceValidationCollector = exports.identifyProductionForDuplicates = exports.validateGrammar = exports.validateLookahead = void 0;
var first_1 = __importDefault(require("lodash/first"));
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var drop_1 = __importDefault(require("lodash/drop"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var filter_1 = __importDefault(require("lodash/filter"));
var reject_1 = __importDefault(require("lodash/reject"));
var difference_1 = __importDefault(require("lodash/difference"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var groupBy_1 = __importDefault(require("lodash/groupBy"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var pickBy_1 = __importDefault(require("lodash/pickBy"));
var values_1 = __importDefault(require("lodash/values"));
var includes_1 = __importDefault(require("lodash/includes"));
var flatMap_1 = __importDefault(require("lodash/flatMap"));
var clone_1 = __importDefault(require("lodash/clone"));
var parser_1 = require("../parser/parser");
var gast_1 = require("@chevrotain/gast");
var lookahead_1 = require("./lookahead");
var interpreter_1 = require("./interpreter");
var gast_2 = require("@chevrotain/gast");
var gast_3 = require("@chevrotain/gast");
var dropRight_1 = __importDefault(require("lodash/dropRight"));
var compact_1 = __importDefault(require("lodash/compact"));
var tokens_1 = require("../../scan/tokens");
function validateLookahead(options) {
var lookaheadValidationErrorMessages = options.lookaheadStrategy.validate({
rules: options.rules,
tokenTypes: options.tokenTypes,
grammarName: options.grammarName
});
return (0, map_1.default)(lookaheadValidationErrorMessages, function (errorMessage) { return (__assign({ type: parser_1.ParserDefinitionErrorType.CUSTOM_LOOKAHEAD_VALIDATION }, errorMessage)); });
}
exports.validateLookahead = validateLookahead;
function validateGrammar(topLevels, tokenTypes, errMsgProvider, grammarName) {
var duplicateErrors = (0, flatMap_1.default)(topLevels, function (currTopLevel) { return validateDuplicateProductions(currTopLevel, errMsgProvider); });
var termsNamespaceConflictErrors = checkTerminalAndNoneTerminalsNameSpace(topLevels, tokenTypes, errMsgProvider);
var tooManyAltsErrors = (0, flatMap_1.default)(topLevels, function (curRule) {
return validateTooManyAlts(curRule, errMsgProvider);
});
var duplicateRulesError = (0, flatMap_1.default)(topLevels, function (curRule) {
return validateRuleDoesNotAlreadyExist(curRule, topLevels, grammarName, errMsgProvider);
});
return duplicateErrors.concat(termsNamespaceConflictErrors, tooManyAltsErrors, duplicateRulesError);
}
exports.validateGrammar = validateGrammar;
function validateDuplicateProductions(topLevelRule, errMsgProvider) {
var collectorVisitor = new OccurrenceValidationCollector();
topLevelRule.accept(collectorVisitor);
var allRuleProductions = collectorVisitor.allProductions;
var productionGroups = (0, groupBy_1.default)(allRuleProductions, identifyProductionForDuplicates);
var duplicates = (0, pickBy_1.default)(productionGroups, function (currGroup) {
return currGroup.length > 1;
});
var errors = (0, map_1.default)((0, values_1.default)(duplicates), function (currDuplicates) {
var firstProd = (0, first_1.default)(currDuplicates);
var msg = errMsgProvider.buildDuplicateFoundError(topLevelRule, currDuplicates);
var dslName = (0, gast_1.getProductionDslName)(firstProd);
var defError = {
message: msg,
type: parser_1.ParserDefinitionErrorType.DUPLICATE_PRODUCTIONS,
ruleName: topLevelRule.name,
dslName: dslName,
occurrence: firstProd.idx
};
var param = getExtraProductionArgument(firstProd);
if (param) {
defError.parameter = param;
}
return defError;
});
return errors;
}
function identifyProductionForDuplicates(prod) {
return "".concat((0, gast_1.getProductionDslName)(prod), "_#_").concat(prod.idx, "_#_").concat(getExtraProductionArgument(prod));
}
exports.identifyProductionForDuplicates = identifyProductionForDuplicates;
function getExtraProductionArgument(prod) {
if (prod instanceof gast_2.Terminal) {
return prod.terminalType.name;
}
else if (prod instanceof gast_2.NonTerminal) {
return prod.nonTerminalName;
}
else {
return "";
}
}
var OccurrenceValidationCollector = /** @class */ (function (_super) {
__extends(OccurrenceValidationCollector, _super);
function OccurrenceValidationCollector() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.allProductions = [];
return _this;
}
OccurrenceValidationCollector.prototype.visitNonTerminal = function (subrule) {
this.allProductions.push(subrule);
};
OccurrenceValidationCollector.prototype.visitOption = function (option) {
this.allProductions.push(option);
};
OccurrenceValidationCollector.prototype.visitRepetitionWithSeparator = function (manySep) {
this.allProductions.push(manySep);
};
OccurrenceValidationCollector.prototype.visitRepetitionMandatory = function (atLeastOne) {
this.allProductions.push(atLeastOne);
};
OccurrenceValidationCollector.prototype.visitRepetitionMandatoryWithSeparator = function (atLeastOneSep) {
this.allProductions.push(atLeastOneSep);
};
OccurrenceValidationCollector.prototype.visitRepetition = function (many) {
this.allProductions.push(many);
};
OccurrenceValidationCollector.prototype.visitAlternation = function (or) {
this.allProductions.push(or);
};
OccurrenceValidationCollector.prototype.visitTerminal = function (terminal) {
this.allProductions.push(terminal);
};
return OccurrenceValidationCollector;
}(gast_3.GAstVisitor));
exports.OccurrenceValidationCollector = OccurrenceValidationCollector;
function validateRuleDoesNotAlreadyExist(rule, allRules, className, errMsgProvider) {
var errors = [];
var occurrences = (0, reduce_1.default)(allRules, function (result, curRule) {
if (curRule.name === rule.name) {
return result + 1;
}
return result;
}, 0);
if (occurrences > 1) {
var errMsg = errMsgProvider.buildDuplicateRuleNameError({
topLevelRule: rule,
grammarName: className
});
errors.push({
message: errMsg,
type: parser_1.ParserDefinitionErrorType.DUPLICATE_RULE_NAME,
ruleName: rule.name
});
}
return errors;
}
exports.validateRuleDoesNotAlreadyExist = validateRuleDoesNotAlreadyExist;
// TODO: is there any way to get only the rule names of rules inherited from the super grammars?
// This is not part of the IGrammarErrorProvider because the validation cannot be performed on
// the grammar structure, only at runtime.
function validateRuleIsOverridden(ruleName, definedRulesNames, className) {
var errors = [];
var errMsg;
if (!(0, includes_1.default)(definedRulesNames, ruleName)) {
        errMsg =
            "Invalid rule override, rule: ->".concat(ruleName, "<- cannot be overridden in the grammar: ->").concat(className, "<-") +
                " as it is not defined in any of the super grammars";
errors.push({
message: errMsg,
type: parser_1.ParserDefinitionErrorType.INVALID_RULE_OVERRIDE,
ruleName: ruleName
});
}
return errors;
}
exports.validateRuleIsOverridden = validateRuleIsOverridden;
function validateNoLeftRecursion(topRule, currRule, errMsgProvider, path) {
if (path === void 0) { path = []; }
var errors = [];
var nextNonTerminals = getFirstNoneTerminal(currRule.definition);
if ((0, isEmpty_1.default)(nextNonTerminals)) {
return [];
}
else {
var ruleName = topRule.name;
var foundLeftRecursion = (0, includes_1.default)(nextNonTerminals, topRule);
if (foundLeftRecursion) {
errors.push({
message: errMsgProvider.buildLeftRecursionError({
topLevelRule: topRule,
leftRecursionPath: path
}),
type: parser_1.ParserDefinitionErrorType.LEFT_RECURSION,
ruleName: ruleName
});
}
// we are only looking for cyclic paths leading back to the specific topRule
        // other cyclic paths are ignored; we still need this difference to avoid infinite loops...
var validNextSteps = (0, difference_1.default)(nextNonTerminals, path.concat([topRule]));
var errorsFromNextSteps = (0, flatMap_1.default)(validNextSteps, function (currRefRule) {
var newPath = (0, clone_1.default)(path);
newPath.push(currRefRule);
return validateNoLeftRecursion(topRule, currRefRule, errMsgProvider, newPath);
});
return errors.concat(errorsFromNextSteps);
}
}
exports.validateNoLeftRecursion = validateNoLeftRecursion;
function getFirstNoneTerminal(definition) {
var result = [];
if ((0, isEmpty_1.default)(definition)) {
return result;
}
var firstProd = (0, first_1.default)(definition);
/* istanbul ignore else */
if (firstProd instanceof gast_2.NonTerminal) {
result.push(firstProd.referencedRule);
}
else if (firstProd instanceof gast_2.Alternative ||
firstProd instanceof gast_2.Option ||
firstProd instanceof gast_2.RepetitionMandatory ||
firstProd instanceof gast_2.RepetitionMandatoryWithSeparator ||
firstProd instanceof gast_2.RepetitionWithSeparator ||
firstProd instanceof gast_2.Repetition) {
result = result.concat(getFirstNoneTerminal(firstProd.definition));
}
else if (firstProd instanceof gast_2.Alternation) {
// each sub definition in alternation is a FLAT
result = (0, flatten_1.default)((0, map_1.default)(firstProd.definition, function (currSubDef) {
return getFirstNoneTerminal(currSubDef.definition);
}));
}
else if (firstProd instanceof gast_2.Terminal) {
// nothing to see, move along
}
else {
throw Error("non exhaustive match");
}
var isFirstOptional = (0, gast_1.isOptionalProd)(firstProd);
var hasMore = definition.length > 1;
if (isFirstOptional && hasMore) {
var rest = (0, drop_1.default)(definition);
return result.concat(getFirstNoneTerminal(rest));
}
else {
return result;
}
}
exports.getFirstNoneTerminal = getFirstNoneTerminal;
var OrCollector = /** @class */ (function (_super) {
__extends(OrCollector, _super);
function OrCollector() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.alternations = [];
return _this;
}
OrCollector.prototype.visitAlternation = function (node) {
this.alternations.push(node);
};
return OrCollector;
}(gast_3.GAstVisitor));
function validateEmptyOrAlternative(topLevelRule, errMsgProvider) {
var orCollector = new OrCollector();
topLevelRule.accept(orCollector);
var ors = orCollector.alternations;
var errors = (0, flatMap_1.default)(ors, function (currOr) {
var exceptLast = (0, dropRight_1.default)(currOr.definition);
return (0, flatMap_1.default)(exceptLast, function (currAlternative, currAltIdx) {
var possibleFirstInAlt = (0, interpreter_1.nextPossibleTokensAfter)([currAlternative], [], tokens_1.tokenStructuredMatcher, 1);
if ((0, isEmpty_1.default)(possibleFirstInAlt)) {
return [
{
message: errMsgProvider.buildEmptyAlternationError({
topLevelRule: topLevelRule,
alternation: currOr,
emptyChoiceIdx: currAltIdx
}),
type: parser_1.ParserDefinitionErrorType.NONE_LAST_EMPTY_ALT,
ruleName: topLevelRule.name,
occurrence: currOr.idx,
alternative: currAltIdx + 1
}
];
}
else {
return [];
}
});
});
return errors;
}
exports.validateEmptyOrAlternative = validateEmptyOrAlternative;
function validateAmbiguousAlternationAlternatives(topLevelRule, globalMaxLookahead, errMsgProvider) {
var orCollector = new OrCollector();
topLevelRule.accept(orCollector);
var ors = orCollector.alternations;
// New Handling of ignoring ambiguities
// - https://github.com/chevrotain/chevrotain/issues/869
ors = (0, reject_1.default)(ors, function (currOr) { return currOr.ignoreAmbiguities === true; });
var errors = (0, flatMap_1.default)(ors, function (currOr) {
var currOccurrence = currOr.idx;
var actualMaxLookahead = currOr.maxLookahead || globalMaxLookahead;
var alternatives = (0, lookahead_1.getLookaheadPathsForOr)(currOccurrence, topLevelRule, actualMaxLookahead, currOr);
var altsAmbiguityErrors = checkAlternativesAmbiguities(alternatives, currOr, topLevelRule, errMsgProvider);
var altsPrefixAmbiguityErrors = checkPrefixAlternativesAmbiguities(alternatives, currOr, topLevelRule, errMsgProvider);
return altsAmbiguityErrors.concat(altsPrefixAmbiguityErrors);
});
return errors;
}
exports.validateAmbiguousAlternationAlternatives = validateAmbiguousAlternationAlternatives;
var RepetitionCollector = /** @class */ (function (_super) {
__extends(RepetitionCollector, _super);
function RepetitionCollector() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.allProductions = [];
return _this;
}
RepetitionCollector.prototype.visitRepetitionWithSeparator = function (manySep) {
this.allProductions.push(manySep);
};
RepetitionCollector.prototype.visitRepetitionMandatory = function (atLeastOne) {
this.allProductions.push(atLeastOne);
};
RepetitionCollector.prototype.visitRepetitionMandatoryWithSeparator = function (atLeastOneSep) {
this.allProductions.push(atLeastOneSep);
};
RepetitionCollector.prototype.visitRepetition = function (many) {
this.allProductions.push(many);
};
return RepetitionCollector;
}(gast_3.GAstVisitor));
exports.RepetitionCollector = RepetitionCollector;
function validateTooManyAlts(topLevelRule, errMsgProvider) {
var orCollector = new OrCollector();
topLevelRule.accept(orCollector);
var ors = orCollector.alternations;
var errors = (0, flatMap_1.default)(ors, function (currOr) {
if (currOr.definition.length > 255) {
return [
{
message: errMsgProvider.buildTooManyAlternativesError({
topLevelRule: topLevelRule,
alternation: currOr
}),
type: parser_1.ParserDefinitionErrorType.TOO_MANY_ALTS,
ruleName: topLevelRule.name,
occurrence: currOr.idx
}
];
}
else {
return [];
}
});
return errors;
}
exports.validateTooManyAlts = validateTooManyAlts;
function validateSomeNonEmptyLookaheadPath(topLevelRules, maxLookahead, errMsgProvider) {
var errors = [];
(0, forEach_1.default)(topLevelRules, function (currTopRule) {
var collectorVisitor = new RepetitionCollector();
currTopRule.accept(collectorVisitor);
var allRuleProductions = collectorVisitor.allProductions;
(0, forEach_1.default)(allRuleProductions, function (currProd) {
var prodType = (0, lookahead_1.getProdType)(currProd);
var actualMaxLookahead = currProd.maxLookahead || maxLookahead;
var currOccurrence = currProd.idx;
var paths = (0, lookahead_1.getLookaheadPathsForOptionalProd)(currOccurrence, currTopRule, prodType, actualMaxLookahead);
var pathsInsideProduction = paths[0];
if ((0, isEmpty_1.default)((0, flatten_1.default)(pathsInsideProduction))) {
var errMsg = errMsgProvider.buildEmptyRepetitionError({
topLevelRule: currTopRule,
repetition: currProd
});
errors.push({
message: errMsg,
type: parser_1.ParserDefinitionErrorType.NO_NON_EMPTY_LOOKAHEAD,
ruleName: currTopRule.name
});
}
});
});
return errors;
}
exports.validateSomeNonEmptyLookaheadPath = validateSomeNonEmptyLookaheadPath;
function checkAlternativesAmbiguities(alternatives, alternation, rule, errMsgProvider) {
var foundAmbiguousPaths = [];
var identicalAmbiguities = (0, reduce_1.default)(alternatives, function (result, currAlt, currAltIdx) {
// ignore (skip) ambiguities with this alternative
if (alternation.definition[currAltIdx].ignoreAmbiguities === true) {
return result;
}
(0, forEach_1.default)(currAlt, function (currPath) {
var altsCurrPathAppearsIn = [currAltIdx];
(0, forEach_1.default)(alternatives, function (currOtherAlt, currOtherAltIdx) {
if (currAltIdx !== currOtherAltIdx &&
(0, lookahead_1.containsPath)(currOtherAlt, currPath) &&
// ignore (skip) ambiguities with this "other" alternative
alternation.definition[currOtherAltIdx].ignoreAmbiguities !== true) {
altsCurrPathAppearsIn.push(currOtherAltIdx);
}
});
if (altsCurrPathAppearsIn.length > 1 &&
!(0, lookahead_1.containsPath)(foundAmbiguousPaths, currPath)) {
foundAmbiguousPaths.push(currPath);
result.push({
alts: altsCurrPathAppearsIn,
path: currPath
});
}
});
return result;
}, []);
var currErrors = (0, map_1.default)(identicalAmbiguities, function (currAmbDescriptor) {
var ambgIndices = (0, map_1.default)(currAmbDescriptor.alts, function (currAltIdx) { return currAltIdx + 1; });
var currMessage = errMsgProvider.buildAlternationAmbiguityError({
topLevelRule: rule,
alternation: alternation,
ambiguityIndices: ambgIndices,
prefixPath: currAmbDescriptor.path
});
return {
message: currMessage,
type: parser_1.ParserDefinitionErrorType.AMBIGUOUS_ALTS,
ruleName: rule.name,
occurrence: alternation.idx,
alternatives: currAmbDescriptor.alts
};
});
return currErrors;
}
function checkPrefixAlternativesAmbiguities(alternatives, alternation, rule, errMsgProvider) {
// flatten
var pathsAndIndices = (0, reduce_1.default)(alternatives, function (result, currAlt, idx) {
var currPathsAndIdx = (0, map_1.default)(currAlt, function (currPath) {
return { idx: idx, path: currPath };
});
return result.concat(currPathsAndIdx);
}, []);
var errors = (0, compact_1.default)((0, flatMap_1.default)(pathsAndIndices, function (currPathAndIdx) {
var alternativeGast = alternation.definition[currPathAndIdx.idx];
// ignore (skip) ambiguities with this alternative
if (alternativeGast.ignoreAmbiguities === true) {
return [];
}
var targetIdx = currPathAndIdx.idx;
var targetPath = currPathAndIdx.path;
var prefixAmbiguitiesPathsAndIndices = (0, filter_1.default)(pathsAndIndices, function (searchPathAndIdx) {
// prefix ambiguity can only be created from lower idx (higher priority) path
return (
// ignore (skip) ambiguities with this "other" alternative
alternation.definition[searchPathAndIdx.idx].ignoreAmbiguities !==
true &&
searchPathAndIdx.idx < targetIdx &&
// checking for strict prefix because identical lookaheads
            // will be detected using a different validation.
(0, lookahead_1.isStrictPrefixOfPath)(searchPathAndIdx.path, targetPath));
});
var currPathPrefixErrors = (0, map_1.default)(prefixAmbiguitiesPathsAndIndices, function (currAmbPathAndIdx) {
var ambgIndices = [currAmbPathAndIdx.idx + 1, targetIdx + 1];
var occurrence = alternation.idx === 0 ? "" : alternation.idx;
var message = errMsgProvider.buildAlternationPrefixAmbiguityError({
topLevelRule: rule,
alternation: alternation,
ambiguityIndices: ambgIndices,
prefixPath: currAmbPathAndIdx.path
});
return {
message: message,
type: parser_1.ParserDefinitionErrorType.AMBIGUOUS_PREFIX_ALTS,
ruleName: rule.name,
occurrence: occurrence,
alternatives: ambgIndices
};
});
return currPathPrefixErrors;
}));
return errors;
}
exports.checkPrefixAlternativesAmbiguities = checkPrefixAlternativesAmbiguities;
function checkTerminalAndNoneTerminalsNameSpace(topLevels, tokenTypes, errMsgProvider) {
var errors = [];
var tokenNames = (0, map_1.default)(tokenTypes, function (currToken) { return currToken.name; });
(0, forEach_1.default)(topLevels, function (currRule) {
var currRuleName = currRule.name;
if ((0, includes_1.default)(tokenNames, currRuleName)) {
var errMsg = errMsgProvider.buildNamespaceConflictError(currRule);
errors.push({
message: errMsg,
type: parser_1.ParserDefinitionErrorType.CONFLICT_TOKENS_RULES_NAMESPACE,
ruleName: currRuleName
});
}
});
return errors;
}
//# sourceMappingURL=checks.js.map
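The traversal performed by `validateNoLeftRecursion` and `getFirstNoneTerminal` above can be illustrated without chevrotain's GAST classes. The sketch below uses a hypothetical grammar model (a plain map from rule name to an array of productions, each an array of symbol names) rather than chevrotain's actual data structures, but it shows the same idea: follow the leftmost nonterminal of each production and report a rule whose leftmost derivation cycles back to itself.

```javascript
// Hypothetical model: keys of `grammar` are nonterminals; any other symbol
// is treated as a terminal. Unlike chevrotain's getFirstNoneTerminal, this
// sketch does not handle optional leading symbols.
function isLeftRecursive(grammar, topRule, currRule = topRule, seen = new Set()) {
  const productions = grammar[currRule] || [];
  for (const prod of productions) {
    const firstSym = prod[0];
    if (firstSym === undefined || !(firstSym in grammar)) {
      continue; // empty production or terminal first symbol: no recursion here
    }
    if (firstSym === topRule) {
      return true; // the leftmost derivation leads back to the starting rule
    }
    if (!seen.has(firstSym)) {
      seen.add(firstSym); // mirrors the path bookkeeping that avoids infinite loops
      if (isLeftRecursive(grammar, topRule, firstSym, seen)) {
        return true;
      }
    }
  }
  return false;
}

// "expr -> expr '+' term | term" is directly left recursive.
const toyGrammar = {
  expr: [["expr", "+", "term"], ["term"]],
  term: [["NUM"]]
};
console.log(isLeftRecursive(toyGrammar, "expr")); // true
console.log(isLeftRecursive(toyGrammar, "term")); // false
```

The `seen` set plays the same role as the `path` argument in `validateNoLeftRecursion`: it distinguishes cycles that return to `topRule` (an error) from other cycles that merely need to be skipped.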

File diff suppressed because one or more lines are too long


@@ -0,0 +1,69 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.firstForTerminal = exports.firstForBranching = exports.firstForSequence = exports.first = void 0;
var flatten_1 = __importDefault(require("lodash/flatten"));
var uniq_1 = __importDefault(require("lodash/uniq"));
var map_1 = __importDefault(require("lodash/map"));
var gast_1 = require("@chevrotain/gast");
var gast_2 = require("@chevrotain/gast");
function first(prod) {
/* istanbul ignore else */
if (prod instanceof gast_1.NonTerminal) {
// this could in theory cause infinite loops if
// (1) prod A refs prod B.
// (2) prod B refs prod A
// (3) AB can match the empty set
// in other words a cycle where everything is optional so the first will keep
// looking ahead for the next optional part and will never exit
// currently there is no safeguard for this unique edge case because
// (1) not sure a grammar in which this can happen is useful for anything (productive)
return first(prod.referencedRule);
}
else if (prod instanceof gast_1.Terminal) {
return firstForTerminal(prod);
}
else if ((0, gast_2.isSequenceProd)(prod)) {
return firstForSequence(prod);
}
else if ((0, gast_2.isBranchingProd)(prod)) {
return firstForBranching(prod);
}
else {
throw Error("non exhaustive match");
}
}
exports.first = first;
function firstForSequence(prod) {
var firstSet = [];
var seq = prod.definition;
var nextSubProdIdx = 0;
var hasInnerProdsRemaining = seq.length > nextSubProdIdx;
var currSubProd;
    // so we enter the loop at least once (if the definition is not empty)
var isLastInnerProdOptional = true;
    // scan a sequence until its end, or until we have found a non-optional production in it
while (hasInnerProdsRemaining && isLastInnerProdOptional) {
currSubProd = seq[nextSubProdIdx];
isLastInnerProdOptional = (0, gast_2.isOptionalProd)(currSubProd);
firstSet = firstSet.concat(first(currSubProd));
nextSubProdIdx = nextSubProdIdx + 1;
hasInnerProdsRemaining = seq.length > nextSubProdIdx;
}
return (0, uniq_1.default)(firstSet);
}
exports.firstForSequence = firstForSequence;
function firstForBranching(prod) {
var allAlternativesFirsts = (0, map_1.default)(prod.definition, function (innerProd) {
return first(innerProd);
});
return (0, uniq_1.default)((0, flatten_1.default)(allAlternativesFirsts));
}
exports.firstForBranching = firstForBranching;
function firstForTerminal(terminal) {
return [terminal.terminalType];
}
exports.firstForTerminal = firstForTerminal;
//# sourceMappingURL=first.js.map
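The scan in `firstForSequence` above can be sketched with plain objects instead of chevrotain's GAST classes. In this hypothetical model a symbol is `{ name, optional }`, and the FIRST set of a sequence collects leading symbols up to and including the first non-optional one, exactly as the `while` loop does.

```javascript
// Sketch of the firstForSequence scan (hypothetical symbol model, not
// chevrotain's API): accumulate names while the prefix stays optional,
// and stop at the first mandatory symbol.
function firstOfSequence(seq) {
  const firstSet = [];
  for (const sym of seq) {
    firstSet.push(sym.name);
    if (!sym.optional) {
      break; // a mandatory symbol ends the scan
    }
  }
  return [...new Set(firstSet)]; // dedupe, mirroring the uniq call above
}

// With an optional prefix, both the optional symbol and the mandatory
// symbol after it can start the sequence.
const numberLiteral = [
  { name: "Minus", optional: true },
  { name: "Int", optional: false },
  { name: "Dot", optional: false }
];
console.log(firstOfSequence(numberLiteral)); // [ 'Minus', 'Int' ]
```

Note that `Dot` never enters the set: once a mandatory symbol (`Int`) is reached, nothing after it can be the first token of the sequence.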


@@ -0,0 +1 @@
{"version":3,"file":"first.js","sourceRoot":"","sources":["../../../../src/parse/grammar/first.ts"],"names":[],"mappings":";;;;;;AAAA,2DAAoC;AACpC,qDAA8B;AAC9B,mDAA4B;AAC5B,yCAAwD;AACxD,yCAIyB;AAGzB,SAAgB,KAAK,CAAC,IAAiB;IACrC,0BAA0B;IAC1B,IAAI,IAAI,YAAY,kBAAW,EAAE;QAC/B,+CAA+C;QAC/C,0BAA0B;QAC1B,yBAAyB;QACzB,iCAAiC;QACjC,6EAA6E;QAC7E,+DAA+D;QAC/D,oEAAoE;QACpE,sFAAsF;QACtF,OAAO,KAAK,CAAe,IAAK,CAAC,cAAc,CAAC,CAAA;KACjD;SAAM,IAAI,IAAI,YAAY,eAAQ,EAAE;QACnC,OAAO,gBAAgB,CAAW,IAAI,CAAC,CAAA;KACxC;SAAM,IAAI,IAAA,qBAAc,EAAC,IAAI,CAAC,EAAE;QAC/B,OAAO,gBAAgB,CAAC,IAAI,CAAC,CAAA;KAC9B;SAAM,IAAI,IAAA,sBAAe,EAAC,IAAI,CAAC,EAAE;QAChC,OAAO,iBAAiB,CAAC,IAAI,CAAC,CAAA;KAC/B;SAAM;QACL,MAAM,KAAK,CAAC,sBAAsB,CAAC,CAAA;KACpC;AACH,CAAC;AArBD,sBAqBC;AAED,SAAgB,gBAAgB,CAAC,IAEhC;IACC,IAAI,QAAQ,GAAgB,EAAE,CAAA;IAC9B,IAAM,GAAG,GAAG,IAAI,CAAC,UAAU,CAAA;IAC3B,IAAI,cAAc,GAAG,CAAC,CAAA;IACtB,IAAI,sBAAsB,GAAG,GAAG,CAAC,MAAM,GAAG,cAAc,CAAA;IACxD,IAAI,WAAW,CAAA;IACf,qEAAqE;IACrE,IAAI,uBAAuB,GAAG,IAAI,CAAA;IAClC,yFAAyF;IACzF,OAAO,sBAAsB,IAAI,uBAAuB,EAAE;QACxD,WAAW,GAAG,GAAG,CAAC,cAAc,CAAC,CAAA;QACjC,uBAAuB,GAAG,IAAA,qBAAc,EAAC,WAAW,CAAC,CAAA;QACrD,QAAQ,GAAG,QAAQ,CAAC,MAAM,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC,CAAA;QAC9C,cAAc,GAAG,cAAc,GAAG,CAAC,CAAA;QACnC,sBAAsB,GAAG,GAAG,CAAC,MAAM,GAAG,cAAc,CAAA;KACrD;IAED,OAAO,IAAA,cAAI,EAAC,QAAQ,CAAC,CAAA;AACvB,CAAC;AApBD,4CAoBC;AAED,SAAgB,iBAAiB,CAAC,IAEjC;IACC,IAAM,qBAAqB,GAAkB,IAAA,aAAG,EAC9C,IAAI,CAAC,UAAU,EACf,UAAC,SAAS;QACR,OAAO,KAAK,CAAC,SAAS,CAAC,CAAA;IACzB,CAAC,CACF,CAAA;IACD,OAAO,IAAA,cAAI,EAAC,IAAA,iBAAO,EAAY,qBAAqB,CAAC,CAAC,CAAA;AACxD,CAAC;AAVD,8CAUC;AAED,SAAgB,gBAAgB,CAAC,QAAkB;IACjD,OAAO,CAAC,QAAQ,CAAC,YAAY,CAAC,CAAA;AAChC,CAAC;AAFD,4CAEC"}


@@ -0,0 +1,74 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.buildInProdFollowPrefix = exports.buildBetweenProdsFollowPrefix = exports.computeAllProdsFollows = exports.ResyncFollowsWalker = void 0;
var rest_1 = require("./rest");
var first_1 = require("./first");
var forEach_1 = __importDefault(require("lodash/forEach"));
var assign_1 = __importDefault(require("lodash/assign"));
var constants_1 = require("../constants");
var gast_1 = require("@chevrotain/gast");
// This ResyncFollowsWalker computes all of the follows required for RESYNC
// (skipping reference production).
var ResyncFollowsWalker = /** @class */ (function (_super) {
__extends(ResyncFollowsWalker, _super);
function ResyncFollowsWalker(topProd) {
var _this = _super.call(this) || this;
_this.topProd = topProd;
_this.follows = {};
return _this;
}
ResyncFollowsWalker.prototype.startWalking = function () {
this.walk(this.topProd);
return this.follows;
};
ResyncFollowsWalker.prototype.walkTerminal = function (terminal, currRest, prevRest) {
// do nothing! just like in the public sector after 13:00
};
ResyncFollowsWalker.prototype.walkProdRef = function (refProd, currRest, prevRest) {
var followName = buildBetweenProdsFollowPrefix(refProd.referencedRule, refProd.idx) +
this.topProd.name;
var fullRest = currRest.concat(prevRest);
var restProd = new gast_1.Alternative({ definition: fullRest });
var t_in_topProd_follows = (0, first_1.first)(restProd);
this.follows[followName] = t_in_topProd_follows;
};
return ResyncFollowsWalker;
}(rest_1.RestWalker));
exports.ResyncFollowsWalker = ResyncFollowsWalker;
function computeAllProdsFollows(topProductions) {
var reSyncFollows = {};
(0, forEach_1.default)(topProductions, function (topProd) {
var currRefsFollow = new ResyncFollowsWalker(topProd).startWalking();
(0, assign_1.default)(reSyncFollows, currRefsFollow);
});
return reSyncFollows;
}
exports.computeAllProdsFollows = computeAllProdsFollows;
function buildBetweenProdsFollowPrefix(inner, occurenceInParent) {
return inner.name + occurenceInParent + constants_1.IN;
}
exports.buildBetweenProdsFollowPrefix = buildBetweenProdsFollowPrefix;
function buildInProdFollowPrefix(terminal) {
var terminalName = terminal.terminalType.name;
return terminalName + terminal.idx + constants_1.IN;
}
exports.buildInProdFollowPrefix = buildInProdFollowPrefix;
//# sourceMappingURL=follow.js.map
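The core idea of `walkProdRef` above is that the RESYNC follow set of a rule reference is the FIRST set of whatever comes after the reference in the enclosing rule. The sketch below uses a hypothetical flat-array model (terminals only after the reference, not chevrotain's GAST), just to make that relationship concrete.

```javascript
// Hypothetical sketch: the follow of a rule reference is derived from the
// rest of the enclosing sequence. With only mandatory terminals after the
// reference, FIRST of the rest is simply its first symbol.
function followOfReference(sequence, refName) {
  const idx = sequence.indexOf(refName);
  if (idx === -1) {
    return []; // the reference does not appear in this sequence
  }
  const rest = sequence.slice(idx + 1); // currRest.concat(prevRest) in the walker
  return rest.length > 0 ? [rest[0]] : [];
}

// In "stmt -> expr Semicolon", a parser resynchronizing after an error
// inside "expr" can skip tokens until it sees Semicolon.
console.log(followOfReference(["expr", "Semicolon"], "expr")); // [ 'Semicolon' ]
console.log(followOfReference(["expr"], "expr")); // []
```

In the real walker the rest of the sequence may itself start with optional or composite productions, which is why `walkProdRef` wraps it in an `Alternative` and delegates to `first` instead of taking the first symbol directly.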


@@ -0,0 +1 @@
{"version":3,"file":"follow.js","sourceRoot":"","sources":["../../../../src/parse/grammar/follow.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;AAAA,+BAAmC;AACnC,iCAA+B;AAC/B,2DAAoC;AACpC,yDAAkC;AAClC,0CAAiC;AACjC,yCAA2E;AAG3E,2EAA2E;AAC3E,mCAAmC;AACnC;IAAyC,uCAAU;IAGjD,6BAAoB,OAAa;QAAjC,YACE,iBAAO,SACR;QAFmB,aAAO,GAAP,OAAO,CAAM;QAF1B,aAAO,GAAgC,EAAE,CAAA;;IAIhD,CAAC;IAED,0CAAY,GAAZ;QACE,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,CAAA;QACvB,OAAO,IAAI,CAAC,OAAO,CAAA;IACrB,CAAC;IAED,0CAAY,GAAZ,UACE,QAAkB,EAClB,QAAuB,EACvB,QAAuB;QAEvB,yDAAyD;IAC3D,CAAC;IAED,yCAAW,GAAX,UACE,OAAoB,EACpB,QAAuB,EACvB,QAAuB;QAEvB,IAAM,UAAU,GACd,6BAA6B,CAAC,OAAO,CAAC,cAAc,EAAE,OAAO,CAAC,GAAG,CAAC;YAClE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAA;QACnB,IAAM,QAAQ,GAAkB,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAA;QACzD,IAAM,QAAQ,GAAG,IAAI,kBAAW,CAAC,EAAE,UAAU,EAAE,QAAQ,EAAE,CAAC,CAAA;QAC1D,IAAM,oBAAoB,GAAG,IAAA,aAAK,EAAC,QAAQ,CAAC,CAAA;QAC5C,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,GAAG,oBAAoB,CAAA;IACjD,CAAC;IACH,0BAAC;AAAD,CAAC,AAjCD,CAAyC,iBAAU,GAiClD;AAjCY,kDAAmB;AAmChC,SAAgB,sBAAsB,CACpC,cAAsB;IAEtB,IAAM,aAAa,GAAG,EAAE,CAAA;IAExB,IAAA,iBAAO,EAAC,cAAc,EAAE,UAAC,OAAO;QAC9B,IAAM,cAAc,GAAG,IAAI,mBAAmB,CAAC,OAAO,CAAC,CAAC,YAAY,EAAE,CAAA;QACtE,IAAA,gBAAM,EAAC,aAAa,EAAE,cAAc,CAAC,CAAA;IACvC,CAAC,CAAC,CAAA;IACF,OAAO,aAAa,CAAA;AACtB,CAAC;AAVD,wDAUC;AAED,SAAgB,6BAA6B,CAC3C,KAAW,EACX,iBAAyB;IAEzB,OAAO,KAAK,CAAC,IAAI,GAAG,iBAAiB,GAAG,cAAE,CAAA;AAC5C,CAAC;AALD,sEAKC;AAED,SAAgB,uBAAuB,CAAC,QAAkB;IACxD,IAAM,YAAY,GAAG,QAAQ,CAAC,YAAY,CAAC,IAAI,CAAA;IAC/C,OAAO,YAAY,GAAG,QAAQ,CAAC,GAAG,GAAG,cAAE,CAAA;AACzC,CAAC;AAHD,0DAGC"}


@@ -0,0 +1,30 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.validateGrammar = exports.resolveGrammar = void 0;
var forEach_1 = __importDefault(require("lodash/forEach"));
var defaults_1 = __importDefault(require("lodash/defaults"));
var resolver_1 = require("../resolver");
var checks_1 = require("../checks");
var errors_public_1 = require("../../errors_public");
function resolveGrammar(options) {
var actualOptions = (0, defaults_1.default)(options, {
errMsgProvider: errors_public_1.defaultGrammarResolverErrorProvider
});
var topRulesTable = {};
(0, forEach_1.default)(options.rules, function (rule) {
topRulesTable[rule.name] = rule;
});
return (0, resolver_1.resolveGrammar)(topRulesTable, actualOptions.errMsgProvider);
}
exports.resolveGrammar = resolveGrammar;
function validateGrammar(options) {
options = (0, defaults_1.default)(options, {
errMsgProvider: errors_public_1.defaultGrammarValidatorErrorProvider
});
return (0, checks_1.validateGrammar)(options.rules, options.tokenTypes, options.errMsgProvider, options.grammarName);
}
exports.validateGrammar = validateGrammar;
//# sourceMappingURL=gast_resolver_public.js.map


@@ -0,0 +1 @@
{"version":3,"file":"gast_resolver_public.js","sourceRoot":"","sources":["../../../../../src/parse/grammar/gast/gast_resolver_public.ts"],"names":[],"mappings":";;;;;;AACA,2DAAoC;AACpC,6DAAsC;AACtC,wCAAiE;AACjE,oCAAiE;AACjE,qDAG4B;AAY5B,SAAgB,cAAc,CAC5B,OAA2B;IAE3B,IAAM,aAAa,GAAiC,IAAA,kBAAQ,EAAC,OAAO,EAAE;QACpE,cAAc,EAAE,mDAAmC;KACpD,CAAC,CAAA;IAEF,IAAM,aAAa,GAAiC,EAAE,CAAA;IACtD,IAAA,iBAAO,EAAC,OAAO,CAAC,KAAK,EAAE,UAAC,IAAI;QAC1B,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,GAAG,IAAI,CAAA;IACjC,CAAC,CAAC,CAAA;IACF,OAAO,IAAA,yBAAiB,EAAC,aAAa,EAAE,aAAa,CAAC,cAAc,CAAC,CAAA;AACvE,CAAC;AAZD,wCAYC;AAED,SAAgB,eAAe,CAAC,OAK/B;IACC,OAAO,GAAG,IAAA,kBAAQ,EAAC,OAAO,EAAE;QAC1B,cAAc,EAAE,oDAAoC;KACrD,CAAC,CAAA;IAEF,OAAO,IAAA,wBAAkB,EACvB,OAAO,CAAC,KAAK,EACb,OAAO,CAAC,UAAU,EAClB,OAAO,CAAC,cAAc,EACtB,OAAO,CAAC,WAAW,CACpB,CAAA;AACH,CAAC;AAhBD,0CAgBC"}


@@ -0,0 +1,559 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.nextPossibleTokensAfter = exports.possiblePathsFrom = exports.NextTerminalAfterAtLeastOneSepWalker = exports.NextTerminalAfterAtLeastOneWalker = exports.NextTerminalAfterManySepWalker = exports.NextTerminalAfterManyWalker = exports.AbstractNextTerminalAfterProductionWalker = exports.NextAfterTokenWalker = exports.AbstractNextPossibleTokensWalker = void 0;
var rest_1 = require("./rest");
var first_1 = __importDefault(require("lodash/first"));
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var dropRight_1 = __importDefault(require("lodash/dropRight"));
var drop_1 = __importDefault(require("lodash/drop"));
var last_1 = __importDefault(require("lodash/last"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var clone_1 = __importDefault(require("lodash/clone"));
var first_2 = require("./first");
var gast_1 = require("@chevrotain/gast");
var AbstractNextPossibleTokensWalker = /** @class */ (function (_super) {
__extends(AbstractNextPossibleTokensWalker, _super);
function AbstractNextPossibleTokensWalker(topProd, path) {
var _this = _super.call(this) || this;
_this.topProd = topProd;
_this.path = path;
_this.possibleTokTypes = [];
_this.nextProductionName = "";
_this.nextProductionOccurrence = 0;
_this.found = false;
_this.isAtEndOfPath = false;
return _this;
}
AbstractNextPossibleTokensWalker.prototype.startWalking = function () {
this.found = false;
if (this.path.ruleStack[0] !== this.topProd.name) {
throw Error("The path does not start with the walker's top Rule!");
}
// immutable for the win
        this.ruleStack = (0, clone_1.default)(this.path.ruleStack).reverse(); // IntelliJ bug requires assertion
        this.occurrenceStack = (0, clone_1.default)(this.path.occurrenceStack).reverse(); // IntelliJ bug requires assertion
// already verified that the first production is valid, we now seek the 2nd production
this.ruleStack.pop();
this.occurrenceStack.pop();
this.updateExpectedNext();
this.walk(this.topProd);
return this.possibleTokTypes;
};
AbstractNextPossibleTokensWalker.prototype.walk = function (prod, prevRest) {
if (prevRest === void 0) { prevRest = []; }
// stop scanning once we found the path
if (!this.found) {
_super.prototype.walk.call(this, prod, prevRest);
}
};
AbstractNextPossibleTokensWalker.prototype.walkProdRef = function (refProd, currRest, prevRest) {
// found the next production, need to keep walking in it
if (refProd.referencedRule.name === this.nextProductionName &&
refProd.idx === this.nextProductionOccurrence) {
var fullRest = currRest.concat(prevRest);
this.updateExpectedNext();
this.walk(refProd.referencedRule, fullRest);
}
};
AbstractNextPossibleTokensWalker.prototype.updateExpectedNext = function () {
// need to consume the Terminal
if ((0, isEmpty_1.default)(this.ruleStack)) {
// must reset nextProductionXXX to avoid walking down another Top Level production while what we are
// really seeking is the last Terminal...
this.nextProductionName = "";
this.nextProductionOccurrence = 0;
this.isAtEndOfPath = true;
}
else {
this.nextProductionName = this.ruleStack.pop();
this.nextProductionOccurrence = this.occurrenceStack.pop();
}
};
return AbstractNextPossibleTokensWalker;
}(rest_1.RestWalker));
exports.AbstractNextPossibleTokensWalker = AbstractNextPossibleTokensWalker;
var NextAfterTokenWalker = /** @class */ (function (_super) {
__extends(NextAfterTokenWalker, _super);
function NextAfterTokenWalker(topProd, path) {
var _this = _super.call(this, topProd, path) || this;
_this.path = path;
_this.nextTerminalName = "";
_this.nextTerminalOccurrence = 0;
_this.nextTerminalName = _this.path.lastTok.name;
_this.nextTerminalOccurrence = _this.path.lastTokOccurrence;
return _this;
}
NextAfterTokenWalker.prototype.walkTerminal = function (terminal, currRest, prevRest) {
if (this.isAtEndOfPath &&
terminal.terminalType.name === this.nextTerminalName &&
terminal.idx === this.nextTerminalOccurrence &&
!this.found) {
var fullRest = currRest.concat(prevRest);
var restProd = new gast_1.Alternative({ definition: fullRest });
this.possibleTokTypes = (0, first_2.first)(restProd);
this.found = true;
}
};
return NextAfterTokenWalker;
}(AbstractNextPossibleTokensWalker));
exports.NextAfterTokenWalker = NextAfterTokenWalker;
/**
 * This walker only "walks" a single "TOP" level in the Grammar Ast; it
 * never "follows" production refs
*/
var AbstractNextTerminalAfterProductionWalker = /** @class */ (function (_super) {
__extends(AbstractNextTerminalAfterProductionWalker, _super);
function AbstractNextTerminalAfterProductionWalker(topRule, occurrence) {
var _this = _super.call(this) || this;
_this.topRule = topRule;
_this.occurrence = occurrence;
_this.result = {
token: undefined,
occurrence: undefined,
isEndOfRule: undefined
};
return _this;
}
AbstractNextTerminalAfterProductionWalker.prototype.startWalking = function () {
this.walk(this.topRule);
return this.result;
};
return AbstractNextTerminalAfterProductionWalker;
}(rest_1.RestWalker));
exports.AbstractNextTerminalAfterProductionWalker = AbstractNextTerminalAfterProductionWalker;
var NextTerminalAfterManyWalker = /** @class */ (function (_super) {
__extends(NextTerminalAfterManyWalker, _super);
function NextTerminalAfterManyWalker() {
return _super !== null && _super.apply(this, arguments) || this;
}
NextTerminalAfterManyWalker.prototype.walkMany = function (manyProd, currRest, prevRest) {
if (manyProd.idx === this.occurrence) {
var firstAfterMany = (0, first_1.default)(currRest.concat(prevRest));
this.result.isEndOfRule = firstAfterMany === undefined;
if (firstAfterMany instanceof gast_1.Terminal) {
this.result.token = firstAfterMany.terminalType;
this.result.occurrence = firstAfterMany.idx;
}
}
else {
_super.prototype.walkMany.call(this, manyProd, currRest, prevRest);
}
};
return NextTerminalAfterManyWalker;
}(AbstractNextTerminalAfterProductionWalker));
exports.NextTerminalAfterManyWalker = NextTerminalAfterManyWalker;
var NextTerminalAfterManySepWalker = /** @class */ (function (_super) {
__extends(NextTerminalAfterManySepWalker, _super);
function NextTerminalAfterManySepWalker() {
return _super !== null && _super.apply(this, arguments) || this;
}
NextTerminalAfterManySepWalker.prototype.walkManySep = function (manySepProd, currRest, prevRest) {
if (manySepProd.idx === this.occurrence) {
var firstAfterManySep = (0, first_1.default)(currRest.concat(prevRest));
this.result.isEndOfRule = firstAfterManySep === undefined;
if (firstAfterManySep instanceof gast_1.Terminal) {
this.result.token = firstAfterManySep.terminalType;
this.result.occurrence = firstAfterManySep.idx;
}
}
else {
_super.prototype.walkManySep.call(this, manySepProd, currRest, prevRest);
}
};
return NextTerminalAfterManySepWalker;
}(AbstractNextTerminalAfterProductionWalker));
exports.NextTerminalAfterManySepWalker = NextTerminalAfterManySepWalker;
var NextTerminalAfterAtLeastOneWalker = /** @class */ (function (_super) {
__extends(NextTerminalAfterAtLeastOneWalker, _super);
function NextTerminalAfterAtLeastOneWalker() {
return _super !== null && _super.apply(this, arguments) || this;
}
NextTerminalAfterAtLeastOneWalker.prototype.walkAtLeastOne = function (atLeastOneProd, currRest, prevRest) {
if (atLeastOneProd.idx === this.occurrence) {
var firstAfterAtLeastOne = (0, first_1.default)(currRest.concat(prevRest));
this.result.isEndOfRule = firstAfterAtLeastOne === undefined;
if (firstAfterAtLeastOne instanceof gast_1.Terminal) {
this.result.token = firstAfterAtLeastOne.terminalType;
this.result.occurrence = firstAfterAtLeastOne.idx;
}
}
else {
_super.prototype.walkAtLeastOne.call(this, atLeastOneProd, currRest, prevRest);
}
};
return NextTerminalAfterAtLeastOneWalker;
}(AbstractNextTerminalAfterProductionWalker));
exports.NextTerminalAfterAtLeastOneWalker = NextTerminalAfterAtLeastOneWalker;
// TODO: reduce code duplication in the AfterWalkers
var NextTerminalAfterAtLeastOneSepWalker = /** @class */ (function (_super) {
__extends(NextTerminalAfterAtLeastOneSepWalker, _super);
function NextTerminalAfterAtLeastOneSepWalker() {
return _super !== null && _super.apply(this, arguments) || this;
}
NextTerminalAfterAtLeastOneSepWalker.prototype.walkAtLeastOneSep = function (atleastOneSepProd, currRest, prevRest) {
if (atleastOneSepProd.idx === this.occurrence) {
var firstAfterAtLeastOneSep = (0, first_1.default)(currRest.concat(prevRest));
this.result.isEndOfRule = firstAfterAtLeastOneSep === undefined;
if (firstAfterAtLeastOneSep instanceof gast_1.Terminal) {
this.result.token = firstAfterAtLeastOneSep.terminalType;
this.result.occurrence = firstAfterAtLeastOneSep.idx;
}
}
else {
_super.prototype.walkAtLeastOneSep.call(this, atleastOneSepProd, currRest, prevRest);
}
};
return NextTerminalAfterAtLeastOneSepWalker;
}(AbstractNextTerminalAfterProductionWalker));
exports.NextTerminalAfterAtLeastOneSepWalker = NextTerminalAfterAtLeastOneSepWalker;
function possiblePathsFrom(targetDef, maxLength, currPath) {
if (currPath === void 0) { currPath = []; }
// avoid side effects
currPath = (0, clone_1.default)(currPath);
var result = [];
var i = 0;
// TODO: avoid inner funcs
function remainingPathWith(nextDef) {
return nextDef.concat((0, drop_1.default)(targetDef, i + 1));
}
// TODO: avoid inner funcs
function getAlternativesForProd(definition) {
var alternatives = possiblePathsFrom(remainingPathWith(definition), maxLength, currPath);
return result.concat(alternatives);
}
/**
* Mandatory productions will halt the loop as the paths computed from their recursive calls will already contain the
* following (rest) of the targetDef.
*
 * For optional productions (Option/Repetition/...) the loop will continue to represent the paths that do not
 * include the optional production.
*/
while (currPath.length < maxLength && i < targetDef.length) {
var prod = targetDef[i];
/* istanbul ignore else */
if (prod instanceof gast_1.Alternative) {
return getAlternativesForProd(prod.definition);
}
else if (prod instanceof gast_1.NonTerminal) {
return getAlternativesForProd(prod.definition);
}
else if (prod instanceof gast_1.Option) {
result = getAlternativesForProd(prod.definition);
}
else if (prod instanceof gast_1.RepetitionMandatory) {
var newDef = prod.definition.concat([
new gast_1.Repetition({
definition: prod.definition
})
]);
return getAlternativesForProd(newDef);
}
else if (prod instanceof gast_1.RepetitionMandatoryWithSeparator) {
var newDef = [
new gast_1.Alternative({ definition: prod.definition }),
new gast_1.Repetition({
definition: [new gast_1.Terminal({ terminalType: prod.separator })].concat(prod.definition)
})
];
return getAlternativesForProd(newDef);
}
else if (prod instanceof gast_1.RepetitionWithSeparator) {
var newDef = prod.definition.concat([
new gast_1.Repetition({
definition: [new gast_1.Terminal({ terminalType: prod.separator })].concat(prod.definition)
})
]);
result = getAlternativesForProd(newDef);
}
else if (prod instanceof gast_1.Repetition) {
var newDef = prod.definition.concat([
new gast_1.Repetition({
definition: prod.definition
})
]);
result = getAlternativesForProd(newDef);
}
else if (prod instanceof gast_1.Alternation) {
(0, forEach_1.default)(prod.definition, function (currAlt) {
// TODO: this is a limited check for empty alternatives
// It would prevent a common case of infinite loops during parser initialization.
// However **in-directly** empty alternatives may still cause issues.
if ((0, isEmpty_1.default)(currAlt.definition) === false) {
result = getAlternativesForProd(currAlt.definition);
}
});
return result;
}
else if (prod instanceof gast_1.Terminal) {
currPath.push(prod.terminalType);
}
else {
throw Error("non exhaustive match");
}
i++;
}
result.push({
partialPath: currPath,
suffixDef: (0, drop_1.default)(targetDef, i)
});
return result;
}
exports.possiblePathsFrom = possiblePathsFrom;
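A standalone sketch of the branching idea in `possiblePathsFrom`: optional productions spawn a recursive call that enters the production while the loop continues for paths that skip it. The `Terminal`/`Option` plain objects below are illustrative stand-ins, not chevrotain's actual gast classes.

```javascript
// Simplified path enumeration: Terminals extend the current partial path,
// Options branch (enter vs. skip), mirroring the loop structure above.
function paths(def, maxLen, curr) {
  curr = (curr || []).slice(); // avoid side effects, as in the original
  let result = [];
  let i = 0;
  while (curr.length < maxLen && i < def.length) {
    const prod = def[i];
    if (prod.type === "Option") {
      // branch into the option: recurse with its body spliced before the rest
      result = result.concat(
        paths(prod.definition.concat(def.slice(i + 1)), maxLen, curr)
      );
      // then fall through: the loop continues for paths that skip the option
    } else {
      curr.push(prod.token); // Terminal: extend the current partial path
    }
    i++;
  }
  result.push({ partialPath: curr, suffixDef: def.slice(i) });
  return result;
}

const def = [
  { type: "Terminal", token: "A" },
  { type: "Option", definition: [{ type: "Terminal", token: "B" }] },
  { type: "Terminal", token: "C" },
];
// With maxLen = 2: one path entering the option ("A","B"), one skipping it ("A","C")
console.log(paths(def, 2).map((p) => p.partialPath.join(""))); // [ 'AB', 'AC' ]
```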
function nextPossibleTokensAfter(initialDef, tokenVector, tokMatcher, maxLookAhead) {
var EXIT_NON_TERMINAL = "EXIT_NON_TERMINAL";
// to avoid creating a new Array each time.
var EXIT_NON_TERMINAL_ARR = [EXIT_NON_TERMINAL];
var EXIT_ALTERNATIVE = "EXIT_ALTERNATIVE";
var foundCompletePath = false;
var tokenVectorLength = tokenVector.length;
var minimalAlternativesIndex = tokenVectorLength - maxLookAhead - 1;
var result = [];
var possiblePaths = [];
possiblePaths.push({
idx: -1,
def: initialDef,
ruleStack: [],
occurrenceStack: []
});
while (!(0, isEmpty_1.default)(possiblePaths)) {
var currPath = possiblePaths.pop();
// skip alternatives if no more results can be found (assuming deterministic grammar with fixed lookahead)
if (currPath === EXIT_ALTERNATIVE) {
if (foundCompletePath &&
(0, last_1.default)(possiblePaths).idx <= minimalAlternativesIndex) {
// remove irrelevant alternative
possiblePaths.pop();
}
continue;
}
var currDef = currPath.def;
var currIdx = currPath.idx;
var currRuleStack = currPath.ruleStack;
var currOccurrenceStack = currPath.occurrenceStack;
// For Example: an empty path could exist in a valid grammar in the case of an EMPTY_ALT
if ((0, isEmpty_1.default)(currDef)) {
continue;
}
var prod = currDef[0];
/* istanbul ignore else */
if (prod === EXIT_NON_TERMINAL) {
var nextPath = {
idx: currIdx,
def: (0, drop_1.default)(currDef),
ruleStack: (0, dropRight_1.default)(currRuleStack),
occurrenceStack: (0, dropRight_1.default)(currOccurrenceStack)
};
possiblePaths.push(nextPath);
}
else if (prod instanceof gast_1.Terminal) {
/* istanbul ignore else */
if (currIdx < tokenVectorLength - 1) {
var nextIdx = currIdx + 1;
var actualToken = tokenVector[nextIdx];
if (tokMatcher(actualToken, prod.terminalType)) {
var nextPath = {
idx: nextIdx,
def: (0, drop_1.default)(currDef),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPath);
}
// end of the line
}
else if (currIdx === tokenVectorLength - 1) {
// IGNORE ABOVE ELSE
result.push({
nextTokenType: prod.terminalType,
nextTokenOccurrence: prod.idx,
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
});
foundCompletePath = true;
}
else {
throw Error("non exhaustive match");
}
}
else if (prod instanceof gast_1.NonTerminal) {
var newRuleStack = (0, clone_1.default)(currRuleStack);
newRuleStack.push(prod.nonTerminalName);
var newOccurrenceStack = (0, clone_1.default)(currOccurrenceStack);
newOccurrenceStack.push(prod.idx);
var nextPath = {
idx: currIdx,
def: prod.definition.concat(EXIT_NON_TERMINAL_ARR, (0, drop_1.default)(currDef)),
ruleStack: newRuleStack,
occurrenceStack: newOccurrenceStack
};
possiblePaths.push(nextPath);
}
else if (prod instanceof gast_1.Option) {
// the order of alternatives is meaningful, FILO (Last path will be traversed first).
var nextPathWithout = {
idx: currIdx,
def: (0, drop_1.default)(currDef),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWithout);
// required marker to avoid backtracking paths whose higher priority alternatives already matched
possiblePaths.push(EXIT_ALTERNATIVE);
var nextPathWith = {
idx: currIdx,
def: prod.definition.concat((0, drop_1.default)(currDef)),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWith);
}
else if (prod instanceof gast_1.RepetitionMandatory) {
// TODO:(THE NEW operators here take a while...) (convert once?)
var secondIteration = new gast_1.Repetition({
definition: prod.definition,
idx: prod.idx
});
var nextDef = prod.definition.concat([secondIteration], (0, drop_1.default)(currDef));
var nextPath = {
idx: currIdx,
def: nextDef,
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPath);
}
else if (prod instanceof gast_1.RepetitionMandatoryWithSeparator) {
// TODO:(THE NEW operators here take a while...) (convert once?)
var separatorGast = new gast_1.Terminal({
terminalType: prod.separator
});
var secondIteration = new gast_1.Repetition({
definition: [separatorGast].concat(prod.definition),
idx: prod.idx
});
var nextDef = prod.definition.concat([secondIteration], (0, drop_1.default)(currDef));
var nextPath = {
idx: currIdx,
def: nextDef,
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPath);
}
else if (prod instanceof gast_1.RepetitionWithSeparator) {
// the order of alternatives is meaningful, FILO (Last path will be traversed first).
var nextPathWithout = {
idx: currIdx,
def: (0, drop_1.default)(currDef),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWithout);
// required marker to avoid backtracking paths whose higher priority alternatives already matched
possiblePaths.push(EXIT_ALTERNATIVE);
var separatorGast = new gast_1.Terminal({
terminalType: prod.separator
});
var nthRepetition = new gast_1.Repetition({
definition: [separatorGast].concat(prod.definition),
idx: prod.idx
});
var nextDef = prod.definition.concat([nthRepetition], (0, drop_1.default)(currDef));
var nextPathWith = {
idx: currIdx,
def: nextDef,
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWith);
}
else if (prod instanceof gast_1.Repetition) {
// the order of alternatives is meaningful, FILO (Last path will be traversed first).
var nextPathWithout = {
idx: currIdx,
def: (0, drop_1.default)(currDef),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWithout);
// required marker to avoid backtracking paths whose higher priority alternatives already matched
possiblePaths.push(EXIT_ALTERNATIVE);
// TODO: an empty repetition will cause infinite loops here, will the parser detect this in selfAnalysis?
var nthRepetition = new gast_1.Repetition({
definition: prod.definition,
idx: prod.idx
});
var nextDef = prod.definition.concat([nthRepetition], (0, drop_1.default)(currDef));
var nextPathWith = {
idx: currIdx,
def: nextDef,
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(nextPathWith);
}
else if (prod instanceof gast_1.Alternation) {
// the order of alternatives is meaningful, FILO (Last path will be traversed first).
for (var i = prod.definition.length - 1; i >= 0; i--) {
var currAlt = prod.definition[i];
var currAltPath = {
idx: currIdx,
def: currAlt.definition.concat((0, drop_1.default)(currDef)),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
};
possiblePaths.push(currAltPath);
possiblePaths.push(EXIT_ALTERNATIVE);
}
}
else if (prod instanceof gast_1.Alternative) {
possiblePaths.push({
idx: currIdx,
def: prod.definition.concat((0, drop_1.default)(currDef)),
ruleStack: currRuleStack,
occurrenceStack: currOccurrenceStack
});
}
else if (prod instanceof gast_1.Rule) {
// last because we should only encounter at most a single one of these per invocation.
possiblePaths.push(expandTopLevelRule(prod, currIdx, currRuleStack, currOccurrenceStack));
}
else {
throw Error("non exhaustive match");
}
}
return result;
}
exports.nextPossibleTokensAfter = nextPossibleTokensAfter;
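A minimal illustration of the explicit-stack traversal in `nextPossibleTokensAfter`, restricted to Terminal productions modeled as plain strings (the gast machinery, rule stacks, and EXIT markers are elided for brevity).

```javascript
// Paths that disagree with the already-seen token vector are dropped;
// once the whole vector has been consumed, the head of a surviving path
// is a possible next token -- the content-assist core of the code above.
function nextTokensAfter(def, tokens) {
  const result = [];
  const possiblePaths = [{ idx: -1, def }];
  while (possiblePaths.length > 0) {
    const currPath = possiblePaths.pop();
    if (currPath.def.length === 0) continue;
    const prod = currPath.def[0];
    if (currPath.idx < tokens.length - 1) {
      // still have input to match: only keep paths consistent with it
      if (tokens[currPath.idx + 1] === prod) {
        possiblePaths.push({ idx: currPath.idx + 1, def: currPath.def.slice(1) });
      }
    } else {
      // the whole token vector was consumed: prod is a possible next token
      result.push(prod);
    }
  }
  return result;
}

// After seeing ["A"], the only possible next token in the sequence A B C is "B"
console.log(nextTokensAfter(["A", "B", "C"], ["A"])); // [ 'B' ]
```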
function expandTopLevelRule(topRule, currIdx, currRuleStack, currOccurrenceStack) {
var newRuleStack = (0, clone_1.default)(currRuleStack);
newRuleStack.push(topRule.name);
var newCurrOccurrenceStack = (0, clone_1.default)(currOccurrenceStack);
// top rule is always assumed to have been called with occurrence index 1
newCurrOccurrenceStack.push(1);
return {
idx: currIdx,
def: topRule.definition,
ruleStack: newRuleStack,
occurrenceStack: newCurrOccurrenceStack
};
}
//# sourceMappingURL=interpreter.js.map

@@ -0,0 +1,30 @@
"use strict";
// Lookahead keys are 32Bit integers in the form
// TTTTTTTT-ZZZZZZZZZZZZ-YYYY-XXXXXXXX
// XXXX -> Occurrence Index bitmap.
// YYYY -> DSL Method Type bitmap.
// ZZZZZZZZZZZZ -> Rule short Index bitmap.
// TTTTTTTT -> alternation alternative index bitmap
Object.defineProperty(exports, "__esModule", { value: true });
exports.getKeyForAutomaticLookahead = exports.AT_LEAST_ONE_SEP_IDX = exports.MANY_SEP_IDX = exports.AT_LEAST_ONE_IDX = exports.MANY_IDX = exports.OPTION_IDX = exports.OR_IDX = exports.BITS_FOR_ALT_IDX = exports.BITS_FOR_RULE_IDX = exports.BITS_FOR_OCCURRENCE_IDX = exports.BITS_FOR_METHOD_TYPE = void 0;
exports.BITS_FOR_METHOD_TYPE = 4;
exports.BITS_FOR_OCCURRENCE_IDX = 8;
exports.BITS_FOR_RULE_IDX = 12;
// TODO: validation, this means that there may be at most 2^8 --> 256 alternatives for an alternation.
exports.BITS_FOR_ALT_IDX = 8;
// short string used as part of mapping keys.
// being short improves the performance when composing KEYS for maps out of these
// The bits above the occurrence index (16 possible values) are reserved for the DSL method indices
exports.OR_IDX = 1 << exports.BITS_FOR_OCCURRENCE_IDX;
exports.OPTION_IDX = 2 << exports.BITS_FOR_OCCURRENCE_IDX;
exports.MANY_IDX = 3 << exports.BITS_FOR_OCCURRENCE_IDX;
exports.AT_LEAST_ONE_IDX = 4 << exports.BITS_FOR_OCCURRENCE_IDX;
exports.MANY_SEP_IDX = 5 << exports.BITS_FOR_OCCURRENCE_IDX;
exports.AT_LEAST_ONE_SEP_IDX = 6 << exports.BITS_FOR_OCCURRENCE_IDX;
// this actually returns a number, but it is always used as a string (object prop key)
function getKeyForAutomaticLookahead(ruleIdx, dslMethodIdx, occurrence) {
return occurrence | dslMethodIdx | ruleIdx;
}
exports.getKeyForAutomaticLookahead = getKeyForAutomaticLookahead;
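Recomputing the bit layout documented at the top of this file: the occurrence index, DSL method type, and rule index occupy disjoint bit ranges, so a plain `|` composes a unique key. The constants are mirrored locally and the rule index is a hypothetical example value.

```javascript
// Local mirror of the key-composition scheme (illustrative, not the module itself).
const BITS_FOR_OCCURRENCE_IDX = 8;
const BITS_FOR_METHOD_TYPE = 4;
const OPTION_IDX = 2 << BITS_FOR_OCCURRENCE_IDX; // 512: method type 2
function getKeyForAutomaticLookahead(ruleIdx, dslMethodIdx, occurrence) {
  return occurrence | dslMethodIdx | ruleIdx;
}
// a rule whose short index is 1, shifted past the occurrence + method-type fields:
const ruleIdx = 1 << (BITS_FOR_OCCURRENCE_IDX + BITS_FOR_METHOD_TYPE); // 4096
const key = getKeyForAutomaticLookahead(ruleIdx, OPTION_IDX, 3);
console.log(key); // 4096 | 512 | 3 = 4611
```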
var BITS_START_FOR_ALT_IDX = 32 - exports.BITS_FOR_ALT_IDX;
//# sourceMappingURL=keys.js.map

@@ -0,0 +1 @@
{"version":3,"file":"keys.js","sourceRoot":"","sources":["../../../../src/parse/grammar/keys.ts"],"names":[],"mappings":";AAAA,gDAAgD;AAChD,sCAAsC;AACtC,mCAAmC;AACnC,kCAAkC;AAClC,8CAA8C;AAC9C,oDAAoD;;;AAEvC,QAAA,oBAAoB,GAAG,CAAC,CAAA;AACxB,QAAA,uBAAuB,GAAG,CAAC,CAAA;AAC3B,QAAA,iBAAiB,GAAG,EAAE,CAAA;AACnC,mGAAmG;AACtF,QAAA,gBAAgB,GAAG,CAAC,CAAA;AAEjC,6CAA6C;AAC7C,iFAAiF;AACjF,+EAA+E;AAClE,QAAA,MAAM,GAAG,CAAC,IAAI,+BAAuB,CAAA;AACrC,QAAA,UAAU,GAAG,CAAC,IAAI,+BAAuB,CAAA;AACzC,QAAA,QAAQ,GAAG,CAAC,IAAI,+BAAuB,CAAA;AACvC,QAAA,gBAAgB,GAAG,CAAC,IAAI,+BAAuB,CAAA;AAC/C,QAAA,YAAY,GAAG,CAAC,IAAI,+BAAuB,CAAA;AAC3C,QAAA,oBAAoB,GAAG,CAAC,IAAI,+BAAuB,CAAA;AAEhE,sFAAsF;AACtF,SAAgB,2BAA2B,CACzC,OAAe,EACf,YAAoB,EACpB,UAAkB;IAElB,OAAO,UAAU,GAAG,YAAY,GAAG,OAAO,CAAA;AAC5C,CAAC;AAND,kEAMC;AAED,IAAM,sBAAsB,GAAG,EAAE,GAAG,wBAAgB,CAAA"}

@@ -0,0 +1,66 @@
"use strict";
var __spreadArray = (this && this.__spreadArray) || function (to, from, pack) {
if (pack || arguments.length === 2) for (var i = 0, l = from.length, ar; i < l; i++) {
if (ar || !(i in from)) {
if (!ar) ar = Array.prototype.slice.call(from, 0, i);
ar[i] = from[i];
}
}
return to.concat(ar || Array.prototype.slice.call(from));
};
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.LLkLookaheadStrategy = void 0;
var flatMap_1 = __importDefault(require("lodash/flatMap"));
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var errors_public_1 = require("../errors_public");
var parser_1 = require("../parser/parser");
var checks_1 = require("./checks");
var lookahead_1 = require("./lookahead");
var LLkLookaheadStrategy = /** @class */ (function () {
function LLkLookaheadStrategy(options) {
var _a;
this.maxLookahead =
(_a = options === null || options === void 0 ? void 0 : options.maxLookahead) !== null && _a !== void 0 ? _a : parser_1.DEFAULT_PARSER_CONFIG.maxLookahead;
}
LLkLookaheadStrategy.prototype.validate = function (options) {
var leftRecursionErrors = this.validateNoLeftRecursion(options.rules);
if ((0, isEmpty_1.default)(leftRecursionErrors)) {
var emptyAltErrors = this.validateEmptyOrAlternatives(options.rules);
var ambiguousAltsErrors = this.validateAmbiguousAlternationAlternatives(options.rules, this.maxLookahead);
var emptyRepetitionErrors = this.validateSomeNonEmptyLookaheadPath(options.rules, this.maxLookahead);
var allErrors = __spreadArray(__spreadArray(__spreadArray(__spreadArray([], leftRecursionErrors, true), emptyAltErrors, true), ambiguousAltsErrors, true), emptyRepetitionErrors, true);
return allErrors;
}
return leftRecursionErrors;
};
LLkLookaheadStrategy.prototype.validateNoLeftRecursion = function (rules) {
return (0, flatMap_1.default)(rules, function (currTopRule) {
return (0, checks_1.validateNoLeftRecursion)(currTopRule, currTopRule, errors_public_1.defaultGrammarValidatorErrorProvider);
});
};
LLkLookaheadStrategy.prototype.validateEmptyOrAlternatives = function (rules) {
return (0, flatMap_1.default)(rules, function (currTopRule) {
return (0, checks_1.validateEmptyOrAlternative)(currTopRule, errors_public_1.defaultGrammarValidatorErrorProvider);
});
};
LLkLookaheadStrategy.prototype.validateAmbiguousAlternationAlternatives = function (rules, maxLookahead) {
return (0, flatMap_1.default)(rules, function (currTopRule) {
return (0, checks_1.validateAmbiguousAlternationAlternatives)(currTopRule, maxLookahead, errors_public_1.defaultGrammarValidatorErrorProvider);
});
};
LLkLookaheadStrategy.prototype.validateSomeNonEmptyLookaheadPath = function (rules, maxLookahead) {
return (0, checks_1.validateSomeNonEmptyLookaheadPath)(rules, maxLookahead, errors_public_1.defaultGrammarValidatorErrorProvider);
};
LLkLookaheadStrategy.prototype.buildLookaheadForAlternation = function (options) {
return (0, lookahead_1.buildLookaheadFuncForOr)(options.prodOccurrence, options.rule, options.maxLookahead, options.hasPredicates, options.dynamicTokensEnabled, lookahead_1.buildAlternativesLookAheadFunc);
};
LLkLookaheadStrategy.prototype.buildLookaheadForOptional = function (options) {
return (0, lookahead_1.buildLookaheadFuncForOptionalProd)(options.prodOccurrence, options.rule, options.maxLookahead, options.dynamicTokensEnabled, (0, lookahead_1.getProdType)(options.prodType), lookahead_1.buildSingleAlternativeLookaheadFunction);
};
return LLkLookaheadStrategy;
}());
exports.LLkLookaheadStrategy = LLkLookaheadStrategy;
//# sourceMappingURL=llk_lookahead.js.map

@@ -0,0 +1 @@
{"version":3,"file":"llk_lookahead.js","sourceRoot":"","sources":["../../../../src/parse/grammar/llk_lookahead.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;AAQA,2DAAoC;AACpC,2DAAoC;AACpC,kDAAuE;AACvE,2CAAwD;AACxD,mCAKiB;AACjB,yCAMoB;AAGpB;IAGE,8BAAY,OAAmC;;QAC7C,IAAI,CAAC,YAAY;YACf,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,mCAAI,8BAAqB,CAAC,YAAY,CAAA;IAC/D,CAAC;IAED,uCAAQ,GAAR,UAAS,OAIR;QACC,IAAM,mBAAmB,GAAG,IAAI,CAAC,uBAAuB,CAAC,OAAO,CAAC,KAAK,CAAC,CAAA;QAEvE,IAAI,IAAA,iBAAO,EAAC,mBAAmB,CAAC,EAAE;YAChC,IAAM,cAAc,GAAG,IAAI,CAAC,2BAA2B,CAAC,OAAO,CAAC,KAAK,CAAC,CAAA;YACtE,IAAM,mBAAmB,GAAG,IAAI,CAAC,wCAAwC,CACvE,OAAO,CAAC,KAAK,EACb,IAAI,CAAC,YAAY,CAClB,CAAA;YACD,IAAM,qBAAqB,GAAG,IAAI,CAAC,iCAAiC,CAClE,OAAO,CAAC,KAAK,EACb,IAAI,CAAC,YAAY,CAClB,CAAA;YACD,IAAM,SAAS,+DACV,mBAAmB,SACnB,cAAc,SACd,mBAAmB,SACnB,qBAAqB,OACzB,CAAA;YACD,OAAO,SAAS,CAAA;SACjB;QACD,OAAO,mBAAmB,CAAA;IAC5B,CAAC;IAED,sDAAuB,GAAvB,UAAwB,KAAa;QACnC,OAAO,IAAA,iBAAO,EAAC,KAAK,EAAE,UAAC,WAAW;YAChC,OAAA,IAAA,gCAAuB,EACrB,WAAW,EACX,WAAW,EACX,oDAAoC,CACrC;QAJD,CAIC,CACF,CAAA;IACH,CAAC;IAED,0DAA2B,GAA3B,UAA4B,KAAa;QACvC,OAAO,IAAA,iBAAO,EAAC,KAAK,EAAE,UAAC,WAAW;YAChC,OAAA,IAAA,mCAA0B,EACxB,WAAW,EACX,oDAAoC,CACrC;QAHD,CAGC,CACF,CAAA;IACH,CAAC;IAED,uEAAwC,GAAxC,UACE,KAAa,EACb,YAAoB;QAEpB,OAAO,IAAA,iBAAO,EAAC,KAAK,EAAE,UAAC,WAAW;YAChC,OAAA,IAAA,iDAAwC,EACtC,WAAW,EACX,YAAY,EACZ,oDAAoC,CACrC;QAJD,CAIC,CACF,CAAA;IACH,CAAC;IAED,gEAAiC,GAAjC,UACE,KAAa,EACb,YAAoB;QAEpB,OAAO,IAAA,0CAAiC,EACtC,KAAK,EACL,YAAY,EACZ,oDAAoC,CACrC,CAAA;IACH,CAAC;IAED,2DAA4B,GAA5B,UAA6B,OAM5B;QACC,OAAO,IAAA,mCAAuB,EAC5B,OAAO,CAAC,cAAc,EACtB,OAAO,CAAC,IAAI,EACZ,OAAO,CAAC,YAAY,EACpB,OAAO,CAAC,aAAa,EACrB,OAAO,CAAC,oBAAoB,EAC5B,0CAA8B,CAC/B,CAAA;IACH,CAAC;IAED,wDAAyB,GAAzB,UAA0B,OAMzB;QACC,OAAO,IAAA,6CAAiC,EACtC,OAAO,CAAC,cAAc,EACtB,OAAO,CAAC,IAAI,EACZ,OAAO,CAAC,YAAY,EACpB,OAAO,CAAC,oBAAoB,EAC5B,IAAA,uBAAW,EAAC,OAAO,CAAC,QAAQ,CAAC,EAC7B,mDAAuC,CACxC,CAAA;IACH,CAAC;IACH,2BAAC;AAAD,CAAC,AAhHD,IAgHC;AAhHY,oDAAoB"}

@@ -0,0 +1,526 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.areTokenCategoriesNotUsed = exports.isStrictPrefixOfPath = exports.containsPath = exports.getLookaheadPathsForOptionalProd = exports.getLookaheadPathsForOr = exports.lookAheadSequenceFromAlternatives = exports.buildSingleAlternativeLookaheadFunction = exports.buildAlternativesLookAheadFunc = exports.buildLookaheadFuncForOptionalProd = exports.buildLookaheadFuncForOr = exports.getLookaheadPaths = exports.getProdType = exports.PROD_TYPE = void 0;
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var every_1 = __importDefault(require("lodash/every"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var has_1 = __importDefault(require("lodash/has"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var interpreter_1 = require("./interpreter");
var rest_1 = require("./rest");
var tokens_1 = require("../../scan/tokens");
var gast_1 = require("@chevrotain/gast");
var gast_2 = require("@chevrotain/gast");
var PROD_TYPE;
(function (PROD_TYPE) {
PROD_TYPE[PROD_TYPE["OPTION"] = 0] = "OPTION";
PROD_TYPE[PROD_TYPE["REPETITION"] = 1] = "REPETITION";
PROD_TYPE[PROD_TYPE["REPETITION_MANDATORY"] = 2] = "REPETITION_MANDATORY";
PROD_TYPE[PROD_TYPE["REPETITION_MANDATORY_WITH_SEPARATOR"] = 3] = "REPETITION_MANDATORY_WITH_SEPARATOR";
PROD_TYPE[PROD_TYPE["REPETITION_WITH_SEPARATOR"] = 4] = "REPETITION_WITH_SEPARATOR";
PROD_TYPE[PROD_TYPE["ALTERNATION"] = 5] = "ALTERNATION";
})(PROD_TYPE = exports.PROD_TYPE || (exports.PROD_TYPE = {}));
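The IIFE above is TypeScript's standard numeric-enum output: the inner assignment expression returns the numeric value, so one statement writes both the name-to-number and number-to-name entries. The sketch below reproduces the pattern in isolation with a demo enum name.

```javascript
// TypeScript numeric-enum transpilation pattern (demo names, not PROD_TYPE itself):
// DEMO_TYPE["OPTION"] = 0 evaluates to 0, which then keys the reverse mapping.
var DEMO_TYPE = {};
(function (DEMO_TYPE) {
  DEMO_TYPE[DEMO_TYPE["OPTION"] = 0] = "OPTION";
  DEMO_TYPE[DEMO_TYPE["REPETITION"] = 1] = "REPETITION";
})(DEMO_TYPE);
console.log(DEMO_TYPE.OPTION, DEMO_TYPE[0]); // 0 OPTION
```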
function getProdType(prod) {
/* istanbul ignore else */
if (prod instanceof gast_1.Option || prod === "Option") {
return PROD_TYPE.OPTION;
}
else if (prod instanceof gast_1.Repetition || prod === "Repetition") {
return PROD_TYPE.REPETITION;
}
else if (prod instanceof gast_1.RepetitionMandatory ||
prod === "RepetitionMandatory") {
return PROD_TYPE.REPETITION_MANDATORY;
}
else if (prod instanceof gast_1.RepetitionMandatoryWithSeparator ||
prod === "RepetitionMandatoryWithSeparator") {
return PROD_TYPE.REPETITION_MANDATORY_WITH_SEPARATOR;
}
else if (prod instanceof gast_1.RepetitionWithSeparator ||
prod === "RepetitionWithSeparator") {
return PROD_TYPE.REPETITION_WITH_SEPARATOR;
}
else if (prod instanceof gast_1.Alternation || prod === "Alternation") {
return PROD_TYPE.ALTERNATION;
}
else {
throw Error("non exhaustive match");
}
}
exports.getProdType = getProdType;
function getLookaheadPaths(options) {
var occurrence = options.occurrence, rule = options.rule, prodType = options.prodType, maxLookahead = options.maxLookahead;
var type = getProdType(prodType);
if (type === PROD_TYPE.ALTERNATION) {
return getLookaheadPathsForOr(occurrence, rule, maxLookahead);
}
else {
return getLookaheadPathsForOptionalProd(occurrence, rule, type, maxLookahead);
}
}
exports.getLookaheadPaths = getLookaheadPaths;
function buildLookaheadFuncForOr(occurrence, ruleGrammar, maxLookahead, hasPredicates, dynamicTokensEnabled, laFuncBuilder) {
var lookAheadPaths = getLookaheadPathsForOr(occurrence, ruleGrammar, maxLookahead);
var tokenMatcher = areTokenCategoriesNotUsed(lookAheadPaths)
? tokens_1.tokenStructuredMatcherNoCategories
: tokens_1.tokenStructuredMatcher;
return laFuncBuilder(lookAheadPaths, hasPredicates, tokenMatcher, dynamicTokensEnabled);
}
exports.buildLookaheadFuncForOr = buildLookaheadFuncForOr;
/**
* When dealing with an Optional production (OPTION/MANY/2nd iteration of AT_LEAST_ONE/...) we need to compare
* the lookahead "inside" the production and the lookahead immediately "after" it in the same top level rule (context free).
*
* Example: given a production:
* ABC(DE)?DF
*
 * The optional '(DE)?' should only be entered if we see 'DE'. A single Token 'D' is not sufficient to distinguish between the two
 * alternatives.
*
* @returns A Lookahead function which will return true IFF the parser should parse the Optional production.
*/
function buildLookaheadFuncForOptionalProd(occurrence, ruleGrammar, k, dynamicTokensEnabled, prodType, lookaheadBuilder) {
var lookAheadPaths = getLookaheadPathsForOptionalProd(occurrence, ruleGrammar, prodType, k);
var tokenMatcher = areTokenCategoriesNotUsed(lookAheadPaths)
? tokens_1.tokenStructuredMatcherNoCategories
: tokens_1.tokenStructuredMatcher;
return lookaheadBuilder(lookAheadPaths[0], tokenMatcher, dynamicTokensEnabled);
}
exports.buildLookaheadFuncForOptionalProd = buildLookaheadFuncForOptionalProd;
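A worked version of the ABC(DE)?DF example from the doc comment above. The two lookahead paths are hardcoded here purely for illustration: with k = 1 both branches begin with 'D' and collide; k = 2 separates them.

```javascript
// Comparing lookahead prefixes "inside" vs. "after" an optional production.
const insidePaths = [["D", "E"]]; // token path entering the optional (DE)?
const afterPaths = [["D", "F"]];  // token path when the optional is skipped
const prefixes = (paths, k) => paths.map((p) => p.slice(0, k).join(""));
// k = 1: both branches start with 'D' -> ambiguous
console.log(prefixes(insidePaths, 1), prefixes(afterPaths, 1)); // [ 'D' ] [ 'D' ]
// k = 2: 'DE' vs 'DF' -> the optional production can be decided
console.log(prefixes(insidePaths, 2), prefixes(afterPaths, 2)); // [ 'DE' ] [ 'DF' ]
```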
function buildAlternativesLookAheadFunc(alts, hasPredicates, tokenMatcher, dynamicTokensEnabled) {
var numOfAlts = alts.length;
var areAllOneTokenLookahead = (0, every_1.default)(alts, function (currAlt) {
return (0, every_1.default)(currAlt, function (currPath) {
return currPath.length === 1;
});
});
// This version takes into account the predicates as well.
if (hasPredicates) {
/**
* @returns {number} - The chosen alternative index
*/
return function (orAlts) {
// unfortunately the predicates must be extracted every single time
// as they cannot be cached due to references to parameters(vars) which are no longer valid.
// note that in the common case of no predicates, no cpu time will be wasted on this (see else block)
var predicates = (0, map_1.default)(orAlts, function (currAlt) { return currAlt.GATE; });
for (var t = 0; t < numOfAlts; t++) {
var currAlt = alts[t];
var currNumOfPaths = currAlt.length;
var currPredicate = predicates[t];
if (currPredicate !== undefined && currPredicate.call(this) === false) {
// if the predicate does not match there is no point in checking the paths
continue;
}
nextPath: for (var j = 0; j < currNumOfPaths; j++) {
var currPath = currAlt[j];
var currPathLength = currPath.length;
for (var i = 0; i < currPathLength; i++) {
var nextToken = this.LA(i + 1);
if (tokenMatcher(nextToken, currPath[i]) === false) {
// mismatch in current path
// try the next path
continue nextPath;
}
}
// found a full path that matches.
// this will also work for an empty ALT as the loop will be skipped
return t;
}
// none of the paths for the current alternative matched
// try the next alternative
}
// none of the alternatives could be matched
return undefined;
};
}
else if (areAllOneTokenLookahead && !dynamicTokensEnabled) {
// optimized (common) case of all the lookaheads paths requiring only
// a single token lookahead. These Optimizations cannot work if dynamically defined Tokens are used.
var singleTokenAlts = (0, map_1.default)(alts, function (currAlt) {
return (0, flatten_1.default)(currAlt);
});
var choiceToAlt_1 = (0, reduce_1.default)(singleTokenAlts, function (result, currAlt, idx) {
(0, forEach_1.default)(currAlt, function (currTokType) {
if (!(0, has_1.default)(result, currTokType.tokenTypeIdx)) {
result[currTokType.tokenTypeIdx] = idx;
}
(0, forEach_1.default)(currTokType.categoryMatches, function (currExtendingType) {
if (!(0, has_1.default)(result, currExtendingType)) {
result[currExtendingType] = idx;
}
});
});
return result;
}, {});
/**
* @returns {number} - The chosen alternative index
*/
return function () {
var nextToken = this.LA(1);
return choiceToAlt_1[nextToken.tokenTypeIdx];
};
}
else {
// optimized lookahead without needing to check the predicates at all.
// this causes code duplication which is intentional to improve performance.
/**
* @returns {number} - The chosen alternative index
*/
return function () {
for (var t = 0; t < numOfAlts; t++) {
var currAlt = alts[t];
var currNumOfPaths = currAlt.length;
nextPath: for (var j = 0; j < currNumOfPaths; j++) {
var currPath = currAlt[j];
var currPathLength = currPath.length;
for (var i = 0; i < currPathLength; i++) {
var nextToken = this.LA(i + 1);
if (tokenMatcher(nextToken, currPath[i]) === false) {
// mismatch in current path
                        // try the next path
continue nextPath;
}
}
// found a full path that matches.
// this will also work for an empty ALT as the loop will be skipped
return t;
}
// none of the paths for the current alternative matched
// try the next alternative
}
// none of the alternatives could be matched
return undefined;
};
}
}
exports.buildAlternativesLookAheadFunc = buildAlternativesLookAheadFunc;
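The single-token branch above collapses alternation dispatch into one object lookup: a map from `tokenTypeIdx` (and category indices) to the first alternative that can start with that token. The following is a minimal standalone sketch of that idea, not chevrotain's API; the token-type shapes and indices are hypothetical.

```javascript
// Sketch: build a tokenTypeIdx -> alternative-index map, mirroring the
// "first alternative wins" guard (`if (!has(result, ...))`) in the code above.
function buildChoiceMap(alts) {
  // alts: one array of possible first token types per alternative,
  // each token type shaped like { tokenTypeIdx, categoryMatches }
  var choiceToAlt = {};
  alts.forEach(function (altTokTypes, idx) {
    altTokTypes.forEach(function (tokType) {
      if (!(tokType.tokenTypeIdx in choiceToAlt)) {
        choiceToAlt[tokType.tokenTypeIdx] = idx;
      }
      // category indices also dispatch to this alternative
      tokType.categoryMatches.forEach(function (catIdx) {
        if (!(catIdx in choiceToAlt)) {
          choiceToAlt[catIdx] = idx;
        }
      });
    });
  });
  return choiceToAlt;
}

var Plus = { tokenTypeIdx: 1, categoryMatches: [] };
var Minus = { tokenTypeIdx: 2, categoryMatches: [] };
var choiceMap = buildChoiceMap([[Plus], [Minus]]);
// choosing an alternative is now a single property read on LA(1)'s type:
var chosen = choiceMap[Minus.tokenTypeIdx]; // -> 1
```

This is why the optimized branch needs neither loops nor predicates at parse time: all path enumeration happens once, up front.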
function buildSingleAlternativeLookaheadFunction(alt, tokenMatcher, dynamicTokensEnabled) {
var areAllOneTokenLookahead = (0, every_1.default)(alt, function (currPath) {
return currPath.length === 1;
});
var numOfPaths = alt.length;
// optimized (common) case of all the lookaheads paths requiring only
// a single token lookahead.
if (areAllOneTokenLookahead && !dynamicTokensEnabled) {
var singleTokensTypes = (0, flatten_1.default)(alt);
if (singleTokensTypes.length === 1 &&
(0, isEmpty_1.default)(singleTokensTypes[0].categoryMatches)) {
var expectedTokenType = singleTokensTypes[0];
var expectedTokenUniqueKey_1 = expectedTokenType.tokenTypeIdx;
return function () {
return this.LA(1).tokenTypeIdx === expectedTokenUniqueKey_1;
};
}
else {
var choiceToAlt_2 = (0, reduce_1.default)(singleTokensTypes, function (result, currTokType, idx) {
result[currTokType.tokenTypeIdx] = true;
(0, forEach_1.default)(currTokType.categoryMatches, function (currExtendingType) {
result[currExtendingType] = true;
});
return result;
}, []);
return function () {
var nextToken = this.LA(1);
return choiceToAlt_2[nextToken.tokenTypeIdx] === true;
};
}
}
else {
return function () {
nextPath: for (var j = 0; j < numOfPaths; j++) {
var currPath = alt[j];
var currPathLength = currPath.length;
for (var i = 0; i < currPathLength; i++) {
var nextToken = this.LA(i + 1);
if (tokenMatcher(nextToken, currPath[i]) === false) {
// mismatch in current path
                    // try the next path
continue nextPath;
}
}
// found a full path that matches.
return true;
}
// none of the paths matched
return false;
};
}
}
exports.buildSingleAlternativeLookaheadFunction = buildSingleAlternativeLookaheadFunction;
var RestDefinitionFinderWalker = /** @class */ (function (_super) {
__extends(RestDefinitionFinderWalker, _super);
function RestDefinitionFinderWalker(topProd, targetOccurrence, targetProdType) {
var _this = _super.call(this) || this;
_this.topProd = topProd;
_this.targetOccurrence = targetOccurrence;
_this.targetProdType = targetProdType;
return _this;
}
RestDefinitionFinderWalker.prototype.startWalking = function () {
this.walk(this.topProd);
return this.restDef;
};
RestDefinitionFinderWalker.prototype.checkIsTarget = function (node, expectedProdType, currRest, prevRest) {
if (node.idx === this.targetOccurrence &&
this.targetProdType === expectedProdType) {
this.restDef = currRest.concat(prevRest);
return true;
}
// performance optimization, do not iterate over the entire Grammar ast after we have found the target
return false;
};
RestDefinitionFinderWalker.prototype.walkOption = function (optionProd, currRest, prevRest) {
if (!this.checkIsTarget(optionProd, PROD_TYPE.OPTION, currRest, prevRest)) {
_super.prototype.walkOption.call(this, optionProd, currRest, prevRest);
}
};
RestDefinitionFinderWalker.prototype.walkAtLeastOne = function (atLeastOneProd, currRest, prevRest) {
if (!this.checkIsTarget(atLeastOneProd, PROD_TYPE.REPETITION_MANDATORY, currRest, prevRest)) {
_super.prototype.walkOption.call(this, atLeastOneProd, currRest, prevRest);
}
};
RestDefinitionFinderWalker.prototype.walkAtLeastOneSep = function (atLeastOneSepProd, currRest, prevRest) {
if (!this.checkIsTarget(atLeastOneSepProd, PROD_TYPE.REPETITION_MANDATORY_WITH_SEPARATOR, currRest, prevRest)) {
_super.prototype.walkOption.call(this, atLeastOneSepProd, currRest, prevRest);
}
};
RestDefinitionFinderWalker.prototype.walkMany = function (manyProd, currRest, prevRest) {
if (!this.checkIsTarget(manyProd, PROD_TYPE.REPETITION, currRest, prevRest)) {
_super.prototype.walkOption.call(this, manyProd, currRest, prevRest);
}
};
RestDefinitionFinderWalker.prototype.walkManySep = function (manySepProd, currRest, prevRest) {
if (!this.checkIsTarget(manySepProd, PROD_TYPE.REPETITION_WITH_SEPARATOR, currRest, prevRest)) {
_super.prototype.walkOption.call(this, manySepProd, currRest, prevRest);
}
};
return RestDefinitionFinderWalker;
}(rest_1.RestWalker));
/**
 * Returns the definition of a target production in a top-level rule.
*/
var InsideDefinitionFinderVisitor = /** @class */ (function (_super) {
__extends(InsideDefinitionFinderVisitor, _super);
function InsideDefinitionFinderVisitor(targetOccurrence, targetProdType, targetRef) {
var _this = _super.call(this) || this;
_this.targetOccurrence = targetOccurrence;
_this.targetProdType = targetProdType;
_this.targetRef = targetRef;
_this.result = [];
return _this;
}
InsideDefinitionFinderVisitor.prototype.checkIsTarget = function (node, expectedProdName) {
if (node.idx === this.targetOccurrence &&
this.targetProdType === expectedProdName &&
(this.targetRef === undefined || node === this.targetRef)) {
this.result = node.definition;
}
};
InsideDefinitionFinderVisitor.prototype.visitOption = function (node) {
this.checkIsTarget(node, PROD_TYPE.OPTION);
};
InsideDefinitionFinderVisitor.prototype.visitRepetition = function (node) {
this.checkIsTarget(node, PROD_TYPE.REPETITION);
};
InsideDefinitionFinderVisitor.prototype.visitRepetitionMandatory = function (node) {
this.checkIsTarget(node, PROD_TYPE.REPETITION_MANDATORY);
};
InsideDefinitionFinderVisitor.prototype.visitRepetitionMandatoryWithSeparator = function (node) {
this.checkIsTarget(node, PROD_TYPE.REPETITION_MANDATORY_WITH_SEPARATOR);
};
InsideDefinitionFinderVisitor.prototype.visitRepetitionWithSeparator = function (node) {
this.checkIsTarget(node, PROD_TYPE.REPETITION_WITH_SEPARATOR);
};
InsideDefinitionFinderVisitor.prototype.visitAlternation = function (node) {
this.checkIsTarget(node, PROD_TYPE.ALTERNATION);
};
return InsideDefinitionFinderVisitor;
}(gast_2.GAstVisitor));
function initializeArrayOfArrays(size) {
var result = new Array(size);
for (var i = 0; i < size; i++) {
result[i] = [];
}
return result;
}
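`initializeArrayOfArrays` uses an explicit loop rather than the tempting one-liner `new Array(size).fill([])`, because `fill` copies the *same* array reference into every slot. A small sketch of the difference:

```javascript
// Array.prototype.fill aliases one array across all slots:
var shared = new Array(3).fill([]);
shared[0].push("x");
// shared[1] now also "contains" "x" -- every row is the same object.

// The explicit loop (as in initializeArrayOfArrays) gives independent rows:
var independent = new Array(3);
for (var i = 0; i < 3; i++) {
  independent[i] = [];
}
independent[0].push("x");
// independent[1] stays empty.
```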
/**
* A sort of hash function between a Path in the grammar and a string.
* Note that this returns multiple "hashes" to support the scenario of token categories.
* - A single path with categories may match multiple **actual** paths.
*/
function pathToHashKeys(path) {
var keys = [""];
for (var i = 0; i < path.length; i++) {
var tokType = path[i];
var longerKeys = [];
for (var j = 0; j < keys.length; j++) {
var currShorterKey = keys[j];
longerKeys.push(currShorterKey + "_" + tokType.tokenTypeIdx);
for (var t = 0; t < tokType.categoryMatches.length; t++) {
var categoriesKeySuffix = "_" + tokType.categoryMatches[t];
longerKeys.push(currShorterKey + categoriesKeySuffix);
}
}
keys = longerKeys;
}
return keys;
}
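The fan-out in `pathToHashKeys` is multiplicative: each token in the path multiplies the key set by `1 + categoryMatches.length`, since a category token can stand in for any of its extending types. A standalone re-sketch of the same loop, traced on hypothetical token types:

```javascript
// Self-contained sketch of pathToHashKeys' key fan-out.
function pathKeysSketch(path) {
  var keys = [""];
  for (var i = 0; i < path.length; i++) {
    var tokType = path[i];
    var longer = [];
    for (var j = 0; j < keys.length; j++) {
      longer.push(keys[j] + "_" + tokType.tokenTypeIdx);
      for (var t = 0; t < tokType.categoryMatches.length; t++) {
        longer.push(keys[j] + "_" + tokType.categoryMatches[t]);
      }
    }
    keys = longer;
  }
  return keys;
}

// hypothetical token types: Num belongs to one category (idx 21)
var Ident = { tokenTypeIdx: 10, categoryMatches: [] };
var Num = { tokenTypeIdx: 20, categoryMatches: [21] };
var keys = pathKeysSketch([Ident, Num]);
// keys -> ["_10_20", "_10_21"]: one "hash" per actual path the
// category-bearing path may match
```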
/**
* Imperative style due to being called from a hot spot
*/
function isUniquePrefixHash(altKnownPathsKeys, searchPathKeys, idx) {
for (var currAltIdx = 0; currAltIdx < altKnownPathsKeys.length; currAltIdx++) {
// We only want to test vs the other alternatives
if (currAltIdx === idx) {
continue;
}
var otherAltKnownPathsKeys = altKnownPathsKeys[currAltIdx];
for (var searchIdx = 0; searchIdx < searchPathKeys.length; searchIdx++) {
var searchKey = searchPathKeys[searchIdx];
if (otherAltKnownPathsKeys[searchKey] === true) {
return false;
}
}
}
// None of the SearchPathKeys were found in any of the other alternatives
return true;
}
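To make the uniqueness check concrete: a prefix is "unique" to alternative `idx` iff none of its hash keys appear in any *other* alternative's known key set. A minimal sketch with hand-built key sets (keys and indices are hypothetical):

```javascript
// Same logic as isUniquePrefixHash, in standalone form.
function isUniqueSketch(altKeySets, searchKeys, idx) {
  for (var a = 0; a < altKeySets.length; a++) {
    if (a === idx) continue; // only compare against the other alternatives
    for (var s = 0; s < searchKeys.length; s++) {
      if (altKeySets[a][searchKeys[s]] === true) {
        return false;
      }
    }
  }
  return true;
}

var altKeySets = [
  { _1: true, _1_2: true }, // alternative 0's known path keys
  { _1: true, _1_3: true }  // alternative 1's known path keys
];
// "_1" occurs in both alternatives -> one token cannot decide
// "_1_2" occurs only in alternative 0 -> two tokens decide it
```

This is exactly why `lookAheadSequenceFromAlternatives` keeps extending only the ambiguous prefixes: unique ones terminate early.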
function lookAheadSequenceFromAlternatives(altsDefs, k) {
var partialAlts = (0, map_1.default)(altsDefs, function (currAlt) {
return (0, interpreter_1.possiblePathsFrom)([currAlt], 1);
});
var finalResult = initializeArrayOfArrays(partialAlts.length);
var altsHashes = (0, map_1.default)(partialAlts, function (currAltPaths) {
var dict = {};
(0, forEach_1.default)(currAltPaths, function (item) {
var keys = pathToHashKeys(item.partialPath);
(0, forEach_1.default)(keys, function (currKey) {
dict[currKey] = true;
});
});
return dict;
});
var newData = partialAlts;
// maxLookahead loop
for (var pathLength = 1; pathLength <= k; pathLength++) {
var currDataset = newData;
newData = initializeArrayOfArrays(currDataset.length);
var _loop_1 = function (altIdx) {
var currAltPathsAndSuffixes = currDataset[altIdx];
// paths in current alternative loop
for (var currPathIdx = 0; currPathIdx < currAltPathsAndSuffixes.length; currPathIdx++) {
var currPathPrefix = currAltPathsAndSuffixes[currPathIdx].partialPath;
var suffixDef = currAltPathsAndSuffixes[currPathIdx].suffixDef;
var prefixKeys = pathToHashKeys(currPathPrefix);
var isUnique = isUniquePrefixHash(altsHashes, prefixKeys, altIdx);
// End of the line for this path.
if (isUnique || (0, isEmpty_1.default)(suffixDef) || currPathPrefix.length === k) {
var currAltResult = finalResult[altIdx];
// TODO: Can we implement a containsPath using Maps/Dictionaries?
if (containsPath(currAltResult, currPathPrefix) === false) {
currAltResult.push(currPathPrefix);
// Update all new keys for the current path.
for (var j = 0; j < prefixKeys.length; j++) {
var currKey = prefixKeys[j];
altsHashes[altIdx][currKey] = true;
}
}
}
// Expand longer paths
else {
var newPartialPathsAndSuffixes = (0, interpreter_1.possiblePathsFrom)(suffixDef, pathLength + 1, currPathPrefix);
newData[altIdx] = newData[altIdx].concat(newPartialPathsAndSuffixes);
// Update keys for new known paths
(0, forEach_1.default)(newPartialPathsAndSuffixes, function (item) {
var prefixKeys = pathToHashKeys(item.partialPath);
(0, forEach_1.default)(prefixKeys, function (key) {
altsHashes[altIdx][key] = true;
});
});
}
}
};
// alternatives loop
for (var altIdx = 0; altIdx < currDataset.length; altIdx++) {
_loop_1(altIdx);
}
}
return finalResult;
}
exports.lookAheadSequenceFromAlternatives = lookAheadSequenceFromAlternatives;
function getLookaheadPathsForOr(occurrence, ruleGrammar, k, orProd) {
var visitor = new InsideDefinitionFinderVisitor(occurrence, PROD_TYPE.ALTERNATION, orProd);
ruleGrammar.accept(visitor);
return lookAheadSequenceFromAlternatives(visitor.result, k);
}
exports.getLookaheadPathsForOr = getLookaheadPathsForOr;
function getLookaheadPathsForOptionalProd(occurrence, ruleGrammar, prodType, k) {
var insideDefVisitor = new InsideDefinitionFinderVisitor(occurrence, prodType);
ruleGrammar.accept(insideDefVisitor);
var insideDef = insideDefVisitor.result;
var afterDefWalker = new RestDefinitionFinderWalker(ruleGrammar, occurrence, prodType);
var afterDef = afterDefWalker.startWalking();
var insideFlat = new gast_1.Alternative({ definition: insideDef });
var afterFlat = new gast_1.Alternative({ definition: afterDef });
return lookAheadSequenceFromAlternatives([insideFlat, afterFlat], k);
}
exports.getLookaheadPathsForOptionalProd = getLookaheadPathsForOptionalProd;
function containsPath(alternative, searchPath) {
compareOtherPath: for (var i = 0; i < alternative.length; i++) {
var otherPath = alternative[i];
if (otherPath.length !== searchPath.length) {
continue;
}
for (var j = 0; j < otherPath.length; j++) {
var searchTok = searchPath[j];
var otherTok = otherPath[j];
var matchingTokens = searchTok === otherTok ||
otherTok.categoryMatchesMap[searchTok.tokenTypeIdx] !== undefined;
if (matchingTokens === false) {
continue compareOtherPath;
}
}
return true;
}
return false;
}
exports.containsPath = containsPath;
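The position-by-position comparison in `containsPath` treats two token types as matching when they are identical or related via `categoryMatchesMap`. A standalone sketch of that inner comparison, assuming chevrotain's convention that a category token type's `categoryMatchesMap` maps the `tokenTypeIdx` of each extending type to `true` (the token shapes below are hypothetical):

```javascript
// Sketch of the category-aware path comparison used by containsPath.
function pathsMatch(otherPath, searchPath) {
  if (otherPath.length !== searchPath.length) {
    return false;
  }
  for (var j = 0; j < otherPath.length; j++) {
    var searchTok = searchPath[j];
    var otherTok = otherPath[j];
    var matching = searchTok === otherTok ||
      otherTok.categoryMatchesMap[searchTok.tokenTypeIdx] !== undefined;
    if (!matching) {
      return false;
    }
  }
  return true;
}

var NumLit = { tokenTypeIdx: 2, categoryMatchesMap: {} };
var Literal = { tokenTypeIdx: 1, categoryMatchesMap: { 2: true } }; // parent of NumLit
// a known path using the Literal category matches a concrete NumLit path:
pathsMatch([Literal], [NumLit]); // true
// but not the other way around: NumLit is not a category of Literal
pathsMatch([NumLit], [Literal]); // false
```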
function isStrictPrefixOfPath(prefix, other) {
return (prefix.length < other.length &&
(0, every_1.default)(prefix, function (tokType, idx) {
var otherTokType = other[idx];
return (tokType === otherTokType ||
otherTokType.categoryMatchesMap[tokType.tokenTypeIdx]);
}));
}
exports.isStrictPrefixOfPath = isStrictPrefixOfPath;
function areTokenCategoriesNotUsed(lookAheadPaths) {
return (0, every_1.default)(lookAheadPaths, function (singleAltPaths) {
return (0, every_1.default)(singleAltPaths, function (singlePath) {
return (0, every_1.default)(singlePath, function (token) { return (0, isEmpty_1.default)(token.categoryMatches); });
});
});
}
exports.areTokenCategoriesNotUsed = areTokenCategoriesNotUsed;
//# sourceMappingURL=lookahead.js.map

File diff suppressed because one or more lines are too long


@@ -0,0 +1,66 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.GastRefResolverVisitor = exports.resolveGrammar = void 0;
var parser_1 = require("../parser/parser");
var forEach_1 = __importDefault(require("lodash/forEach"));
var values_1 = __importDefault(require("lodash/values"));
var gast_1 = require("@chevrotain/gast");
function resolveGrammar(topLevels, errMsgProvider) {
var refResolver = new GastRefResolverVisitor(topLevels, errMsgProvider);
refResolver.resolveRefs();
return refResolver.errors;
}
exports.resolveGrammar = resolveGrammar;
var GastRefResolverVisitor = /** @class */ (function (_super) {
__extends(GastRefResolverVisitor, _super);
function GastRefResolverVisitor(nameToTopRule, errMsgProvider) {
var _this = _super.call(this) || this;
_this.nameToTopRule = nameToTopRule;
_this.errMsgProvider = errMsgProvider;
_this.errors = [];
return _this;
}
GastRefResolverVisitor.prototype.resolveRefs = function () {
var _this = this;
(0, forEach_1.default)((0, values_1.default)(this.nameToTopRule), function (prod) {
_this.currTopLevel = prod;
prod.accept(_this);
});
};
GastRefResolverVisitor.prototype.visitNonTerminal = function (node) {
var ref = this.nameToTopRule[node.nonTerminalName];
if (!ref) {
var msg = this.errMsgProvider.buildRuleNotFoundError(this.currTopLevel, node);
this.errors.push({
message: msg,
type: parser_1.ParserDefinitionErrorType.UNRESOLVED_SUBRULE_REF,
ruleName: this.currTopLevel.name,
unresolvedRefName: node.nonTerminalName
});
}
else {
node.referencedRule = ref;
}
};
return GastRefResolverVisitor;
}(gast_1.GAstVisitor));
exports.GastRefResolverVisitor = GastRefResolverVisitor;
//# sourceMappingURL=resolver.js.map
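The resolver above links every `NonTerminal` node to its rule object by name and turns unknown names into definition errors instead of runtime failures. A minimal self-contained sketch of that pass (the rule and node shapes here are hypothetical, not chevrotain's types):

```javascript
// Sketch: resolve nonterminal references against a name -> rule map.
function resolveRefsSketch(nameToRule, nonTerminals) {
  var errors = [];
  nonTerminals.forEach(function (nt) {
    var ref = nameToRule[nt.nonTerminalName];
    if (!ref) {
      // collect an error instead of throwing, as the visitor above does
      errors.push({ unresolvedRefName: nt.nonTerminalName });
    } else {
      nt.referencedRule = ref;
    }
  });
  return errors;
}

var rules = { expression: { name: "expression" } };
var good = { nonTerminalName: "expression" };
var bad = { nonTerminalName: "statment" }; // typo -> stays unresolved
var errs = resolveRefsSketch(rules, [good, bad]);
// good.referencedRule is now linked; errs reports "statment"
```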


@@ -0,0 +1 @@
{"version":3,"file":"resolver.js","sourceRoot":"","sources":["../../../../src/parse/grammar/resolver.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;AAAA,2CAGyB;AACzB,2DAAoC;AACpC,yDAAkC;AAElC,yCAA8C;AAM9C,SAAgB,cAAc,CAC5B,SAA+B,EAC/B,cAAoD;IAEpD,IAAM,WAAW,GAAG,IAAI,sBAAsB,CAAC,SAAS,EAAE,cAAc,CAAC,CAAA;IACzE,WAAW,CAAC,WAAW,EAAE,CAAA;IACzB,OAAO,WAAW,CAAC,MAAM,CAAA;AAC3B,CAAC;AAPD,wCAOC;AAED;IAA4C,0CAAW;IAIrD,gCACU,aAAmC,EACnC,cAAoD;QAF9D,YAIE,iBAAO,SACR;QAJS,mBAAa,GAAb,aAAa,CAAsB;QACnC,oBAAc,GAAd,cAAc,CAAsC;QALvD,YAAM,GAA0C,EAAE,CAAA;;IAQzD,CAAC;IAEM,4CAAW,GAAlB;QAAA,iBAKC;QAJC,IAAA,iBAAO,EAAC,IAAA,gBAAM,EAAC,IAAI,CAAC,aAAa,CAAC,EAAE,UAAC,IAAI;YACvC,KAAI,CAAC,YAAY,GAAG,IAAI,CAAA;YACxB,IAAI,CAAC,MAAM,CAAC,KAAI,CAAC,CAAA;QACnB,CAAC,CAAC,CAAA;IACJ,CAAC;IAEM,iDAAgB,GAAvB,UAAwB,IAAiB;QACvC,IAAM,GAAG,GAAG,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,eAAe,CAAC,CAAA;QAEpD,IAAI,CAAC,GAAG,EAAE;YACR,IAAM,GAAG,GAAG,IAAI,CAAC,cAAc,CAAC,sBAAsB,CACpD,IAAI,CAAC,YAAY,EACjB,IAAI,CACL,CAAA;YACD,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC;gBACf,OAAO,EAAE,GAAG;gBACZ,IAAI,EAAE,kCAAyB,CAAC,sBAAsB;gBACtD,QAAQ,EAAE,IAAI,CAAC,YAAY,CAAC,IAAI;gBAChC,iBAAiB,EAAE,IAAI,CAAC,eAAe;aACxC,CAAC,CAAA;SACH;aAAM;YACL,IAAI,CAAC,cAAc,GAAG,GAAG,CAAA;SAC1B;IACH,CAAC;IACH,6BAAC;AAAD,CAAC,AApCD,CAA4C,kBAAW,GAoCtD;AApCY,wDAAsB"}


@@ -0,0 +1,117 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.RestWalker = void 0;
var drop_1 = __importDefault(require("lodash/drop"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var gast_1 = require("@chevrotain/gast");
/**
 * A Grammar Walker that computes the "remaining" grammar "after" a production in the grammar.
*/
var RestWalker = /** @class */ (function () {
function RestWalker() {
}
RestWalker.prototype.walk = function (prod, prevRest) {
var _this = this;
if (prevRest === void 0) { prevRest = []; }
(0, forEach_1.default)(prod.definition, function (subProd, index) {
var currRest = (0, drop_1.default)(prod.definition, index + 1);
/* istanbul ignore else */
if (subProd instanceof gast_1.NonTerminal) {
_this.walkProdRef(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.Terminal) {
_this.walkTerminal(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.Alternative) {
_this.walkFlat(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.Option) {
_this.walkOption(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.RepetitionMandatory) {
_this.walkAtLeastOne(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.RepetitionMandatoryWithSeparator) {
_this.walkAtLeastOneSep(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.RepetitionWithSeparator) {
_this.walkManySep(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.Repetition) {
_this.walkMany(subProd, currRest, prevRest);
}
else if (subProd instanceof gast_1.Alternation) {
_this.walkOr(subProd, currRest, prevRest);
}
else {
throw Error("non exhaustive match");
}
});
};
RestWalker.prototype.walkTerminal = function (terminal, currRest, prevRest) { };
RestWalker.prototype.walkProdRef = function (refProd, currRest, prevRest) { };
RestWalker.prototype.walkFlat = function (flatProd, currRest, prevRest) {
// ABCDEF => after the D the rest is EF
var fullOrRest = currRest.concat(prevRest);
this.walk(flatProd, fullOrRest);
};
RestWalker.prototype.walkOption = function (optionProd, currRest, prevRest) {
// ABC(DE)?F => after the (DE)? the rest is F
var fullOrRest = currRest.concat(prevRest);
this.walk(optionProd, fullOrRest);
};
RestWalker.prototype.walkAtLeastOne = function (atLeastOneProd, currRest, prevRest) {
// ABC(DE)+F => after the (DE)+ the rest is (DE)?F
var fullAtLeastOneRest = [
new gast_1.Option({ definition: atLeastOneProd.definition })
].concat(currRest, prevRest);
this.walk(atLeastOneProd, fullAtLeastOneRest);
};
RestWalker.prototype.walkAtLeastOneSep = function (atLeastOneSepProd, currRest, prevRest) {
        // ABC DE(,DE)* F => after the (,DE)* the rest is (,DE)?F
var fullAtLeastOneSepRest = restForRepetitionWithSeparator(atLeastOneSepProd, currRest, prevRest);
this.walk(atLeastOneSepProd, fullAtLeastOneSepRest);
};
RestWalker.prototype.walkMany = function (manyProd, currRest, prevRest) {
// ABC(DE)*F => after the (DE)* the rest is (DE)?F
var fullManyRest = [
new gast_1.Option({ definition: manyProd.definition })
].concat(currRest, prevRest);
this.walk(manyProd, fullManyRest);
};
RestWalker.prototype.walkManySep = function (manySepProd, currRest, prevRest) {
// ABC (DE(,DE)*)? F => after the (,DE)* the rest is (,DE)?F
var fullManySepRest = restForRepetitionWithSeparator(manySepProd, currRest, prevRest);
this.walk(manySepProd, fullManySepRest);
};
RestWalker.prototype.walkOr = function (orProd, currRest, prevRest) {
var _this = this;
// ABC(D|E|F)G => when finding the (D|E|F) the rest is G
var fullOrRest = currRest.concat(prevRest);
// walk all different alternatives
(0, forEach_1.default)(orProd.definition, function (alt) {
// wrapping each alternative in a single definition wrapper
// to avoid errors in computing the rest of that alternative in the invocation to computeInProdFollows
            // (otherwise for OR([alt1,alt2]) alt2 will be considered in 'rest' of alt1)
var prodWrapper = new gast_1.Alternative({ definition: [alt] });
_this.walk(prodWrapper, fullOrRest);
});
};
return RestWalker;
}());
exports.RestWalker = RestWalker;
function restForRepetitionWithSeparator(repSepProd, currRest, prevRest) {
var repSepRest = [
new gast_1.Option({
definition: [
new gast_1.Terminal({ terminalType: repSepProd.separator })
].concat(repSepProd.definition)
})
];
var fullRepSepRest = repSepRest.concat(currRest, prevRest);
return fullRepSepRest;
}
//# sourceMappingURL=rest.js.map
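The core of the `RestWalker` above: after position `i` in a sequence, the remaining grammar is everything following `i` in that sequence (`drop(definition, i + 1)`) concatenated with whatever followed the whole sequence (`prevRest`). A tiny sketch using `slice` in place of lodash's `drop` (symbols are hypothetical placeholders for productions):

```javascript
// Sketch: compute the "rest" after position `index` in a definition.
function restAfter(definition, index, prevRest) {
  // everything after index, then whatever followed the enclosing sequence
  return definition.slice(index + 1).concat(prevRest);
}

var def = ["A", "B", "C", "D"];
// walking "B" inside ABCD with nothing after the sequence:
restAfter(def, 1, []); // -> ["C", "D"]
// walking "B" when "F" follows the whole sequence (as prevRest):
restAfter(def, 1, ["F"]); // -> ["C", "D", "F"]
```

Repetitions add one twist on top of this, as the comments above show: after `(DE)*` the rest also gains an optional `(DE)?`, because the loop may iterate again.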


@@ -0,0 +1 @@
{"version":3,"file":"rest.js","sourceRoot":"","sources":["../../../../src/parse/grammar/rest.ts"],"names":[],"mappings":";;;;;;AAAA,qDAA8B;AAC9B,2DAAoC;AACpC,yCAUyB;AAGzB;;GAEG;AACH;IAAA;IAiIA,CAAC;IAhIC,yBAAI,GAAJ,UAAK,IAAmC,EAAE,QAAoB;QAA9D,iBA0BC;QA1ByC,yBAAA,EAAA,aAAoB;QAC5D,IAAA,iBAAO,EAAC,IAAI,CAAC,UAAU,EAAE,UAAC,OAAoB,EAAE,KAAK;YACnD,IAAM,QAAQ,GAAG,IAAA,cAAI,EAAC,IAAI,CAAC,UAAU,EAAE,KAAK,GAAG,CAAC,CAAC,CAAA;YACjD,0BAA0B;YAC1B,IAAI,OAAO,YAAY,kBAAW,EAAE;gBAClC,KAAI,CAAC,WAAW,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC9C;iBAAM,IAAI,OAAO,YAAY,eAAQ,EAAE;gBACtC,KAAI,CAAC,YAAY,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC/C;iBAAM,IAAI,OAAO,YAAY,kBAAW,EAAE;gBACzC,KAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC3C;iBAAM,IAAI,OAAO,YAAY,aAAM,EAAE;gBACpC,KAAI,CAAC,UAAU,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC7C;iBAAM,IAAI,OAAO,YAAY,0BAAmB,EAAE;gBACjD,KAAI,CAAC,cAAc,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aACjD;iBAAM,IAAI,OAAO,YAAY,uCAAgC,EAAE;gBAC9D,KAAI,CAAC,iBAAiB,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aACpD;iBAAM,IAAI,OAAO,YAAY,8BAAuB,EAAE;gBACrD,KAAI,CAAC,WAAW,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC9C;iBAAM,IAAI,OAAO,YAAY,iBAAU,EAAE;gBACxC,KAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aAC3C;iBAAM,IAAI,OAAO,YAAY,kBAAW,EAAE;gBACzC,KAAI,CAAC,MAAM,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAA;aACzC;iBAAM;gBACL,MAAM,KAAK,CAAC,sBAAsB,CAAC,CAAA;aACpC;QACH,CAAC,CAAC,CAAA;IACJ,CAAC;IAED,iCAAY,GAAZ,UACE,QAAkB,EAClB,QAAuB,EACvB,QAAuB,IAChB,CAAC;IAEV,gCAAW,GAAX,UACE,OAAoB,EACpB,QAAuB,EACvB,QAAuB,IAChB,CAAC;IAEV,6BAAQ,GAAR,UACE,QAAqB,EACrB,QAAuB,EACvB,QAAuB;QAEvB,uCAAuC;QACvC,IAAM,UAAU,GAAG,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAA;QAC5C,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAO,UAAU,CAAC,CAAA;IACtC,CAAC;IAED,+BAAU,GAAV,UACE,UAAkB,EAClB,QAAuB,EACvB,QAAuB;QAEvB,6CAA6C;QAC7C,IAAM,UAAU,GAAG,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAA;QAC5C,IAAI,CAAC,IAAI,CAAC,UAAU,EAAO,UAAU,CAAC,CAAA;IACxC,CAAC;IAED,mCAAc,GAAd,UACE,cAAmC,EACnC,QAAuB,EACvB,QAAuB;QAEvB,kDAAkD;QAClD,IAAM,kBAAkB,GAAkB;YACxC,IAAI
,aAAM,CAAC,EAAE,UAAU,EAAE,cAAc,CAAC,UAAU,EAAE,CAAC;SACtD,CAAC,MAAM,CAAM,QAAQ,EAAO,QAAQ,CAAC,CAAA;QACtC,IAAI,CAAC,IAAI,CAAC,cAAc,EAAE,kBAAkB,CAAC,CAAA;IAC/C,CAAC;IAED,sCAAiB,GAAjB,UACE,iBAAmD,EACnD,QAAuB,EACvB,QAAuB;QAEvB,yDAAyD;QACzD,IAAM,qBAAqB,GAAG,8BAA8B,CAC1D,iBAAiB,EACjB,QAAQ,EACR,QAAQ,CACT,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,iBAAiB,EAAE,qBAAqB,CAAC,CAAA;IACrD,CAAC;IAED,6BAAQ,GAAR,UACE,QAAoB,EACpB,QAAuB,EACvB,QAAuB;QAEvB,kDAAkD;QAClD,IAAM,YAAY,GAAkB;YAClC,IAAI,aAAM,CAAC,EAAE,UAAU,EAAE,QAAQ,CAAC,UAAU,EAAE,CAAC;SAChD,CAAC,MAAM,CAAM,QAAQ,EAAO,QAAQ,CAAC,CAAA;QACtC,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAE,YAAY,CAAC,CAAA;IACnC,CAAC;IAED,gCAAW,GAAX,UACE,WAAoC,EACpC,QAAuB,EACvB,QAAuB;QAEvB,4DAA4D;QAC5D,IAAM,eAAe,GAAG,8BAA8B,CACpD,WAAW,EACX,QAAQ,EACR,QAAQ,CACT,CAAA;QACD,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE,eAAe,CAAC,CAAA;IACzC,CAAC;IAED,2BAAM,GAAN,UACE,MAAmB,EACnB,QAAuB,EACvB,QAAuB;QAHzB,iBAeC;QAVC,wDAAwD;QACxD,IAAM,UAAU,GAAG,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAA;QAC5C,kCAAkC;QAClC,IAAA,iBAAO,EAAC,MAAM,CAAC,UAAU,EAAE,UAAC,GAAG;YAC7B,2DAA2D;YAC3D,sGAAsG;YACtG,2EAA2E;YAC3E,IAAM,WAAW,GAAG,IAAI,kBAAW,CAAC,EAAE,UAAU,EAAE,CAAC,GAAG,CAAC,EAAE,CAAC,CAAA;YAC1D,KAAI,CAAC,IAAI,CAAC,WAAW,EAAO,UAAU,CAAC,CAAA;QACzC,CAAC,CAAC,CAAA;IACJ,CAAC;IACH,iBAAC;AAAD,CAAC,AAjID,IAiIC;AAjIqB,gCAAU;AAmIhC,SAAS,8BAA8B,CACrC,UAAmC,EACnC,QAAuB,EACvB,QAAuB;IAEvB,IAAM,UAAU,GAAG;QACjB,IAAI,aAAM,CAAC;YACT,UAAU,EAAE;gBACV,IAAI,eAAQ,CAAC,EAAE,YAAY,EAAE,UAAU,CAAC,SAAS,EAAE,CAAgB;aACpE,CAAC,MAAM,CAAC,UAAU,CAAC,UAAU,CAAC;SAChC,CAAgB;KAClB,CAAA;IACD,IAAM,cAAc,GAAkB,UAAU,CAAC,MAAM,CAAC,QAAQ,EAAE,QAAQ,CAAC,CAAA;IAC3E,OAAO,cAAc,CAAA;AACvB,CAAC"}


@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=types.js.map


@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../../../src/parse/grammar/types.ts"],"names":[],"mappings":""}


@@ -0,0 +1,240 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.EmbeddedActionsParser = exports.CstParser = exports.Parser = exports.EMPTY_ALT = exports.ParserDefinitionErrorType = exports.DEFAULT_RULE_CONFIG = exports.DEFAULT_PARSER_CONFIG = exports.END_OF_FILE = void 0;
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var values_1 = __importDefault(require("lodash/values"));
var has_1 = __importDefault(require("lodash/has"));
var clone_1 = __importDefault(require("lodash/clone"));
var utils_1 = require("@chevrotain/utils");
var follow_1 = require("../grammar/follow");
var tokens_public_1 = require("../../scan/tokens_public");
var errors_public_1 = require("../errors_public");
var gast_resolver_public_1 = require("../grammar/gast/gast_resolver_public");
var recoverable_1 = require("./traits/recoverable");
var looksahead_1 = require("./traits/looksahead");
var tree_builder_1 = require("./traits/tree_builder");
var lexer_adapter_1 = require("./traits/lexer_adapter");
var recognizer_api_1 = require("./traits/recognizer_api");
var recognizer_engine_1 = require("./traits/recognizer_engine");
var error_handler_1 = require("./traits/error_handler");
var context_assist_1 = require("./traits/context_assist");
var gast_recorder_1 = require("./traits/gast_recorder");
var perf_tracer_1 = require("./traits/perf_tracer");
var apply_mixins_1 = require("./utils/apply_mixins");
var checks_1 = require("../grammar/checks");
exports.END_OF_FILE = (0, tokens_public_1.createTokenInstance)(tokens_public_1.EOF, "", NaN, NaN, NaN, NaN, NaN, NaN);
Object.freeze(exports.END_OF_FILE);
exports.DEFAULT_PARSER_CONFIG = Object.freeze({
recoveryEnabled: false,
maxLookahead: 3,
dynamicTokensEnabled: false,
outputCst: true,
errorMessageProvider: errors_public_1.defaultParserErrorProvider,
nodeLocationTracking: "none",
traceInitPerf: false,
skipValidations: false
});
exports.DEFAULT_RULE_CONFIG = Object.freeze({
recoveryValueFunc: function () { return undefined; },
resyncEnabled: true
});
var ParserDefinitionErrorType;
(function (ParserDefinitionErrorType) {
ParserDefinitionErrorType[ParserDefinitionErrorType["INVALID_RULE_NAME"] = 0] = "INVALID_RULE_NAME";
ParserDefinitionErrorType[ParserDefinitionErrorType["DUPLICATE_RULE_NAME"] = 1] = "DUPLICATE_RULE_NAME";
ParserDefinitionErrorType[ParserDefinitionErrorType["INVALID_RULE_OVERRIDE"] = 2] = "INVALID_RULE_OVERRIDE";
ParserDefinitionErrorType[ParserDefinitionErrorType["DUPLICATE_PRODUCTIONS"] = 3] = "DUPLICATE_PRODUCTIONS";
ParserDefinitionErrorType[ParserDefinitionErrorType["UNRESOLVED_SUBRULE_REF"] = 4] = "UNRESOLVED_SUBRULE_REF";
ParserDefinitionErrorType[ParserDefinitionErrorType["LEFT_RECURSION"] = 5] = "LEFT_RECURSION";
ParserDefinitionErrorType[ParserDefinitionErrorType["NONE_LAST_EMPTY_ALT"] = 6] = "NONE_LAST_EMPTY_ALT";
ParserDefinitionErrorType[ParserDefinitionErrorType["AMBIGUOUS_ALTS"] = 7] = "AMBIGUOUS_ALTS";
ParserDefinitionErrorType[ParserDefinitionErrorType["CONFLICT_TOKENS_RULES_NAMESPACE"] = 8] = "CONFLICT_TOKENS_RULES_NAMESPACE";
ParserDefinitionErrorType[ParserDefinitionErrorType["INVALID_TOKEN_NAME"] = 9] = "INVALID_TOKEN_NAME";
ParserDefinitionErrorType[ParserDefinitionErrorType["NO_NON_EMPTY_LOOKAHEAD"] = 10] = "NO_NON_EMPTY_LOOKAHEAD";
ParserDefinitionErrorType[ParserDefinitionErrorType["AMBIGUOUS_PREFIX_ALTS"] = 11] = "AMBIGUOUS_PREFIX_ALTS";
ParserDefinitionErrorType[ParserDefinitionErrorType["TOO_MANY_ALTS"] = 12] = "TOO_MANY_ALTS";
ParserDefinitionErrorType[ParserDefinitionErrorType["CUSTOM_LOOKAHEAD_VALIDATION"] = 13] = "CUSTOM_LOOKAHEAD_VALIDATION";
})(ParserDefinitionErrorType = exports.ParserDefinitionErrorType || (exports.ParserDefinitionErrorType = {}));
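The block above is TypeScript's compiled numeric-enum pattern: the inner assignment expression evaluates to the numeric value, so a single statement installs both the name-to-number and number-to-name mappings. A minimal illustration with a hypothetical enum:

```javascript
// One statement per member creates a reverse-mapped numeric enum:
var Color = {};
Color[Color["RED"] = 0] = "RED";     // Color.RED = 0, then Color[0] = "RED"
Color[Color["GREEN"] = 1] = "GREEN"; // Color.GREEN = 1, then Color[1] = "GREEN"

// Both directions are now available:
Color.RED; // -> 0
Color[0];  // -> "RED"
```

This is why `ParserDefinitionErrorType[someError.type]` yields a readable name for an error code.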
function EMPTY_ALT(value) {
if (value === void 0) { value = undefined; }
return function () {
return value;
};
}
exports.EMPTY_ALT = EMPTY_ALT;
var Parser = /** @class */ (function () {
function Parser(tokenVocabulary, config) {
this.definitionErrors = [];
this.selfAnalysisDone = false;
var that = this;
that.initErrorHandler(config);
that.initLexerAdapter();
that.initLooksAhead(config);
that.initRecognizerEngine(tokenVocabulary, config);
that.initRecoverable(config);
that.initTreeBuilder(config);
that.initContentAssist();
that.initGastRecorder(config);
that.initPerformanceTracer(config);
if ((0, has_1.default)(config, "ignoredIssues")) {
throw new Error("The <ignoredIssues> IParserConfig property has been deprecated.\n\t" +
"Please use the <IGNORE_AMBIGUITIES> flag on the relevant DSL method instead.\n\t" +
"See: https://chevrotain.io/docs/guide/resolving_grammar_errors.html#IGNORING_AMBIGUITIES\n\t" +
"For further details.");
}
this.skipValidations = (0, has_1.default)(config, "skipValidations")
? config.skipValidations // casting assumes the end user passing the correct type
: exports.DEFAULT_PARSER_CONFIG.skipValidations;
}
/**
* @deprecated use the **instance** method with the same name instead
*/
Parser.performSelfAnalysis = function (parserInstance) {
throw Error("The **static** `performSelfAnalysis` method has been deprecated." +
"\t\nUse the **instance** method with the same name instead.");
};
Parser.prototype.performSelfAnalysis = function () {
var _this = this;
this.TRACE_INIT("performSelfAnalysis", function () {
var defErrorsMsgs;
_this.selfAnalysisDone = true;
var className = _this.className;
_this.TRACE_INIT("toFastProps", function () {
// Without this voodoo magic the parser would be x3-x4 slower
// It seems it is better to invoke `toFastProperties` **before**
// Any manipulations of the `this` object done during the recording phase.
(0, utils_1.toFastProperties)(_this);
});
_this.TRACE_INIT("Grammar Recording", function () {
try {
_this.enableRecording();
// Building the GAST
(0, forEach_1.default)(_this.definedRulesNames, function (currRuleName) {
var wrappedRule = _this[currRuleName];
var originalGrammarAction = wrappedRule["originalGrammarAction"];
var recordedRuleGast;
_this.TRACE_INIT("".concat(currRuleName, " Rule"), function () {
recordedRuleGast = _this.topLevelRuleRecord(currRuleName, originalGrammarAction);
});
_this.gastProductionsCache[currRuleName] = recordedRuleGast;
});
}
finally {
_this.disableRecording();
}
});
var resolverErrors = [];
_this.TRACE_INIT("Grammar Resolving", function () {
resolverErrors = (0, gast_resolver_public_1.resolveGrammar)({
rules: (0, values_1.default)(_this.gastProductionsCache)
});
_this.definitionErrors = _this.definitionErrors.concat(resolverErrors);
});
_this.TRACE_INIT("Grammar Validations", function () {
// Only perform the additional grammar validations IFF no resolving errors have occurred,
// as an unresolved grammar may lead to unhandled runtime exceptions in the follow-up validations.
if ((0, isEmpty_1.default)(resolverErrors) && _this.skipValidations === false) {
var validationErrors = (0, gast_resolver_public_1.validateGrammar)({
rules: (0, values_1.default)(_this.gastProductionsCache),
tokenTypes: (0, values_1.default)(_this.tokensMap),
errMsgProvider: errors_public_1.defaultGrammarValidatorErrorProvider,
grammarName: className
});
var lookaheadValidationErrors = (0, checks_1.validateLookahead)({
lookaheadStrategy: _this.lookaheadStrategy,
rules: (0, values_1.default)(_this.gastProductionsCache),
tokenTypes: (0, values_1.default)(_this.tokensMap),
grammarName: className
});
_this.definitionErrors = _this.definitionErrors.concat(validationErrors, lookaheadValidationErrors);
}
});
// this analysis may fail if the grammar is not perfectly valid
if ((0, isEmpty_1.default)(_this.definitionErrors)) {
// The results of these computations are not needed unless error recovery is enabled.
if (_this.recoveryEnabled) {
_this.TRACE_INIT("computeAllProdsFollows", function () {
var allFollows = (0, follow_1.computeAllProdsFollows)((0, values_1.default)(_this.gastProductionsCache));
_this.resyncFollows = allFollows;
});
}
_this.TRACE_INIT("ComputeLookaheadFunctions", function () {
var _a, _b;
(_b = (_a = _this.lookaheadStrategy).initialize) === null || _b === void 0 ? void 0 : _b.call(_a, {
rules: (0, values_1.default)(_this.gastProductionsCache)
});
_this.preComputeLookaheadFunctions((0, values_1.default)(_this.gastProductionsCache));
});
}
if (!Parser.DEFER_DEFINITION_ERRORS_HANDLING &&
!(0, isEmpty_1.default)(_this.definitionErrors)) {
defErrorsMsgs = (0, map_1.default)(_this.definitionErrors, function (defError) { return defError.message; });
throw new Error("Parser Definition Errors detected:\n ".concat(defErrorsMsgs.join("\n-------------------------------\n")));
}
});
};
// Set this flag to true if you don't want the Parser to throw an error when problems in its definition are detected.
// (normally during the parser's constructor).
// This is a design-time flag; it does not affect the parser's runtime error handling, only design-time errors,
// for example: duplicate rule names, referencing an unresolved subrule, etc.
// This flag should not be enabled during normal usage; it is intended for special situations, for example when
// needing to display the parser definition errors in some GUI (e.g. an online playground).
Parser.DEFER_DEFINITION_ERRORS_HANDLING = false;
return Parser;
}());
exports.Parser = Parser;
(0, apply_mixins_1.applyMixins)(Parser, [
recoverable_1.Recoverable,
looksahead_1.LooksAhead,
tree_builder_1.TreeBuilder,
lexer_adapter_1.LexerAdapter,
recognizer_engine_1.RecognizerEngine,
recognizer_api_1.RecognizerApi,
error_handler_1.ErrorHandler,
context_assist_1.ContentAssist,
gast_recorder_1.GastRecorder,
perf_tracer_1.PerformanceTracer
]);
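The `applyMixins` call above is how a single `Parser` class aggregates ten independent traits (`Recoverable`, `LooksAhead`, `TreeBuilder`, ...) without multiple inheritance: each trait's prototype methods are copied onto `Parser.prototype`. A minimal sketch of that pattern, assuming a simple copy-own-methods implementation (the toy `Counter`/`Reporter`/`Host` traits here are hypothetical, not chevrotain's real traits):

```javascript
// Copy every own prototype method of each trait onto the target's prototype.
// A sketch of the shape of apply_mixins_1.applyMixins, not the real implementation.
function applyMixins(derivedCtor, traitCtors) {
  traitCtors.forEach(function (trait) {
    Object.getOwnPropertyNames(trait.prototype).forEach(function (name) {
      if (name !== "constructor") {
        derivedCtor.prototype[name] = trait.prototype[name];
      }
    });
  });
}

// Hypothetical toy traits for illustration only.
function Counter() {}
Counter.prototype.initCounter = function () { this.count = 0; };
Counter.prototype.inc = function () { this.count++; };

function Reporter() {}
Reporter.prototype.report = function () { return "count=" + this.count; };

function Host() {
  this.initCounter(); // trait init methods are callable once mixed in
}
applyMixins(Host, [Counter, Reporter]);

var h = new Host();
h.inc();
h.inc();
console.log(h.report()); // → "count=2"
```

This mirrors why the `Parser` constructor above calls each trait's `init*` method explicitly: mixed-in methods exist on the prototype, but each trait's state must still be initialized on the instance.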
var CstParser = /** @class */ (function (_super) {
__extends(CstParser, _super);
function CstParser(tokenVocabulary, config) {
if (config === void 0) { config = exports.DEFAULT_PARSER_CONFIG; }
var configClone = (0, clone_1.default)(config);
configClone.outputCst = true;
return _super.call(this, tokenVocabulary, configClone) || this;
}
return CstParser;
}(Parser));
exports.CstParser = CstParser;
var EmbeddedActionsParser = /** @class */ (function (_super) {
__extends(EmbeddedActionsParser, _super);
function EmbeddedActionsParser(tokenVocabulary, config) {
if (config === void 0) { config = exports.DEFAULT_PARSER_CONFIG; }
var configClone = (0, clone_1.default)(config);
configClone.outputCst = false;
return _super.call(this, tokenVocabulary, configClone) || this;
}
return EmbeddedActionsParser;
}(Parser));
exports.EmbeddedActionsParser = EmbeddedActionsParser;
//# sourceMappingURL=parser.js.map
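`performSelfAnalysis` above runs a fixed pipeline: record the grammar, resolve references, run the validations only if resolving produced no errors, and finally throw one aggregated error unless `DEFER_DEFINITION_ERRORS_HANDLING` is set. A minimal sketch of that accumulate-then-throw staging, under the assumption of generic stage objects (the stage shapes and the "dup rule" error are made up for illustration):

```javascript
// Sketch of the staged error-accumulation pattern used by performSelfAnalysis:
// guarded stages run only when earlier ones produced no errors, and all
// collected errors are thrown together at the end (unless deferred).
function runPipeline(stages, deferErrors) {
  var errors = [];
  stages.forEach(function (stage) {
    // mirrors the "only validate IFF no resolving errors" guard above
    if (stage.onlyIfClean && errors.length > 0) return;
    errors = errors.concat(stage.run());
  });
  if (!deferErrors && errors.length > 0) {
    throw new Error("Definition Errors detected:\n" + errors.join("\n---\n"));
  }
  return errors;
}

// Hypothetical stages for illustration.
var stages = [
  { run: function () { return []; } }, // "resolving": clean
  { onlyIfClean: true, run: function () { return ["dup rule 'expr'"]; } } // "validation"
];

var collected = runPipeline(stages, /* deferErrors */ true);
console.log(collected); // the validation error is collected, not thrown
```

With `deferErrors` set to `false` the same pipeline throws, which is the library's default behavior; deferring corresponds to the GUI/playground use case the `DEFER_DEFINITION_ERRORS_HANDLING` comment describes.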

File diff suppressed because one or more lines are too long


@@ -0,0 +1,33 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.ContentAssist = void 0;
var interpreter_1 = require("../../grammar/interpreter");
var first_1 = __importDefault(require("lodash/first"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var ContentAssist = /** @class */ (function () {
function ContentAssist() {
}
ContentAssist.prototype.initContentAssist = function () { };
ContentAssist.prototype.computeContentAssist = function (startRuleName, precedingInput) {
var startRuleGast = this.gastProductionsCache[startRuleName];
if ((0, isUndefined_1.default)(startRuleGast)) {
throw Error("Rule ->".concat(startRuleName, "<- does not exist in this grammar."));
}
return (0, interpreter_1.nextPossibleTokensAfter)([startRuleGast], precedingInput, this.tokenMatcher, this.maxLookahead);
};
// TODO: should this be a member method or a utility? It does not have any state or use 'this'...
// TODO: should this be more explicitly part of the public API?
ContentAssist.prototype.getNextPossibleTokenTypes = function (grammarPath) {
var topRuleName = (0, first_1.default)(grammarPath.ruleStack);
var gastProductions = this.getGAstProductions();
var topProduction = gastProductions[topRuleName];
var nextPossibleTokenTypes = new interpreter_1.NextAfterTokenWalker(topProduction, grammarPath).startWalking();
return nextPossibleTokenTypes;
};
return ContentAssist;
}());
exports.ContentAssist = ContentAssist;
//# sourceMappingURL=context_assist.js.map


@@ -0,0 +1 @@
{"version":3,"file":"context_assist.js","sourceRoot":"","sources":["../../../../../src/parse/parser/traits/context_assist.ts"],"names":[],"mappings":";;;;;;AAMA,yDAGkC;AAClC,uDAAgC;AAChC,mEAA4C;AAG5C;IAAA;IAqCA,CAAC;IApCC,yCAAiB,GAAjB,cAAqB,CAAC;IAEf,4CAAoB,GAA3B,UAEE,aAAqB,EACrB,cAAwB;QAExB,IAAM,aAAa,GAAG,IAAI,CAAC,oBAAoB,CAAC,aAAa,CAAC,CAAA;QAE9D,IAAI,IAAA,qBAAW,EAAC,aAAa,CAAC,EAAE;YAC9B,MAAM,KAAK,CAAC,iBAAU,aAAa,uCAAoC,CAAC,CAAA;SACzE;QAED,OAAO,IAAA,qCAAuB,EAC5B,CAAC,aAAa,CAAC,EACf,cAAc,EACd,IAAI,CAAC,YAAY,EACjB,IAAI,CAAC,YAAY,CAClB,CAAA;IACH,CAAC;IAED,sGAAsG;IACtG,+DAA+D;IACxD,iDAAyB,GAAhC,UAEE,WAA8B;QAE9B,IAAM,WAAW,GAAG,IAAA,eAAK,EAAC,WAAW,CAAC,SAAS,CAAE,CAAA;QACjD,IAAM,eAAe,GAAG,IAAI,CAAC,kBAAkB,EAAE,CAAA;QACjD,IAAM,aAAa,GAAG,eAAe,CAAC,WAAW,CAAC,CAAA;QAClD,IAAM,sBAAsB,GAAG,IAAI,kCAAoB,CACrD,aAAa,EACb,WAAW,CACZ,CAAC,YAAY,EAAE,CAAA;QAChB,OAAO,sBAAsB,CAAA;IAC/B,CAAC;IACH,oBAAC;AAAD,CAAC,AArCD,IAqCC;AArCY,sCAAa"}


@@ -0,0 +1,89 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.ErrorHandler = void 0;
var exceptions_public_1 = require("../../exceptions_public");
var has_1 = __importDefault(require("lodash/has"));
var clone_1 = __importDefault(require("lodash/clone"));
var lookahead_1 = require("../../grammar/lookahead");
var parser_1 = require("../parser");
/**
* Trait responsible for runtime parsing errors.
*/
var ErrorHandler = /** @class */ (function () {
function ErrorHandler() {
}
ErrorHandler.prototype.initErrorHandler = function (config) {
this._errors = [];
this.errorMessageProvider = (0, has_1.default)(config, "errorMessageProvider")
? config.errorMessageProvider // assumes end user provides the correct config value/type
: parser_1.DEFAULT_PARSER_CONFIG.errorMessageProvider;
};
ErrorHandler.prototype.SAVE_ERROR = function (error) {
if ((0, exceptions_public_1.isRecognitionException)(error)) {
error.context = {
ruleStack: this.getHumanReadableRuleStack(),
ruleOccurrenceStack: (0, clone_1.default)(this.RULE_OCCURRENCE_STACK)
};
this._errors.push(error);
return error;
}
else {
throw Error("Trying to save an Error which is not a RecognitionException");
}
};
Object.defineProperty(ErrorHandler.prototype, "errors", {
get: function () {
return (0, clone_1.default)(this._errors);
},
set: function (newErrors) {
this._errors = newErrors;
},
enumerable: false,
configurable: true
});
// TODO: consider caching the error message computed information
ErrorHandler.prototype.raiseEarlyExitException = function (occurrence, prodType, userDefinedErrMsg) {
var ruleName = this.getCurrRuleFullName();
var ruleGrammar = this.getGAstProductions()[ruleName];
var lookAheadPathsPerAlternative = (0, lookahead_1.getLookaheadPathsForOptionalProd)(occurrence, ruleGrammar, prodType, this.maxLookahead);
var insideProdPaths = lookAheadPathsPerAlternative[0];
var actualTokens = [];
for (var i = 1; i <= this.maxLookahead; i++) {
actualTokens.push(this.LA(i));
}
var msg = this.errorMessageProvider.buildEarlyExitMessage({
expectedIterationPaths: insideProdPaths,
actual: actualTokens,
previous: this.LA(0),
customUserDescription: userDefinedErrMsg,
ruleName: ruleName
});
throw this.SAVE_ERROR(new exceptions_public_1.EarlyExitException(msg, this.LA(1), this.LA(0)));
};
// TODO: consider caching the error message computed information
ErrorHandler.prototype.raiseNoAltException = function (occurrence, errMsgTypes) {
var ruleName = this.getCurrRuleFullName();
var ruleGrammar = this.getGAstProductions()[ruleName];
// TODO: getLookaheadPathsForOr can be slow for large enough maxLookahead and certain grammars, consider caching ?
var lookAheadPathsPerAlternative = (0, lookahead_1.getLookaheadPathsForOr)(occurrence, ruleGrammar, this.maxLookahead);
var actualTokens = [];
for (var i = 1; i <= this.maxLookahead; i++) {
actualTokens.push(this.LA(i));
}
var previousToken = this.LA(0);
var errMsg = this.errorMessageProvider.buildNoViableAltMessage({
expectedPathsPerAlt: lookAheadPathsPerAlternative,
actual: actualTokens,
previous: previousToken,
customUserDescription: errMsgTypes,
ruleName: this.getCurrRuleFullName()
});
throw this.SAVE_ERROR(new exceptions_public_1.NoViableAltException(errMsg, this.LA(1), previousToken));
};
return ErrorHandler;
}());
exports.ErrorHandler = ErrorHandler;
//# sourceMappingURL=error_handler.js.map
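Note that the `errors` getter above returns a lodash `clone` of the internal `_errors` array, so callers cannot mutate the parser's error state through the returned reference. The same defensive-copy accessor in a self-contained sketch (the `ErrorBag` class and its error strings are made up for illustration; `Array.prototype.slice` stands in for `lodash/clone` on an array):

```javascript
// Defensive-copy accessor: expose a shallow copy so external pushes/splices
// cannot corrupt the internal error list (same idea as ErrorHandler's getter).
function ErrorBag() {
  this._errors = [];
}
Object.defineProperty(ErrorBag.prototype, "errors", {
  get: function () {
    return this._errors.slice(); // shallow clone, like lodash/clone on an array
  },
  set: function (newErrors) {
    this._errors = newErrors;
  },
  enumerable: false,
  configurable: true
});
ErrorBag.prototype.save = function (err) {
  this._errors.push(err);
};

var bag = new ErrorBag();
bag.save("NoViableAlt at offset 3");
var snapshot = bag.errors;
snapshot.push("tampered"); // mutates only the copy
console.log(bag.errors.length); // → 1
```

The setter intentionally keeps the passed array by reference, matching the trait above, where replacing the whole error list is an explicit, trusted operation.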


@@ -0,0 +1 @@
{"version":3,"file":"error_handler.js","sourceRoot":"","sources":["../../../../../src/parse/parser/traits/error_handler.ts"],"names":[],"mappings":";;;;;;AAKA,6DAIgC;AAChC,mDAA4B;AAC5B,uDAAgC;AAChC,qDAIgC;AAEhC,oCAAiD;AAEjD;;GAEG;AACH;IAAA;IAmGA,CAAC;IA/FC,uCAAgB,GAAhB,UAAiB,MAAqB;QACpC,IAAI,CAAC,OAAO,GAAG,EAAE,CAAA;QACjB,IAAI,CAAC,oBAAoB,GAAG,IAAA,aAAG,EAAC,MAAM,EAAE,sBAAsB,CAAC;YAC7D,CAAC,CAAE,MAAM,CAAC,oBAAoD,CAAC,0DAA0D;YACzH,CAAC,CAAC,8BAAqB,CAAC,oBAAoB,CAAA;IAChD,CAAC;IAED,iCAAU,GAAV,UAEE,KAA4B;QAE5B,IAAI,IAAA,0CAAsB,EAAC,KAAK,CAAC,EAAE;YACjC,KAAK,CAAC,OAAO,GAAG;gBACd,SAAS,EAAE,IAAI,CAAC,yBAAyB,EAAE;gBAC3C,mBAAmB,EAAE,IAAA,eAAK,EAAC,IAAI,CAAC,qBAAqB,CAAC;aACvD,CAAA;YACD,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA;YACxB,OAAO,KAAK,CAAA;SACb;aAAM;YACL,MAAM,KAAK,CAAC,6DAA6D,CAAC,CAAA;SAC3E;IACH,CAAC;IAED,sBAAI,gCAAM;aAAV;YACE,OAAO,IAAA,eAAK,EAAC,IAAI,CAAC,OAAO,CAAC,CAAA;QAC5B,CAAC;aAED,UAAW,SAAkC;YAC3C,IAAI,CAAC,OAAO,GAAG,SAAS,CAAA;QAC1B,CAAC;;;OAJA;IAMD,gEAAgE;IAChE,8CAAuB,GAAvB,UAEE,UAAkB,EAClB,QAAmB,EACnB,iBAAqC;QAErC,IAAM,QAAQ,GAAG,IAAI,CAAC,mBAAmB,EAAE,CAAA;QAC3C,IAAM,WAAW,GAAG,IAAI,CAAC,kBAAkB,EAAE,CAAC,QAAQ,CAAC,CAAA;QACvD,IAAM,4BAA4B,GAAG,IAAA,4CAAgC,EACnE,UAAU,EACV,WAAW,EACX,QAAQ,EACR,IAAI,CAAC,YAAY,CAClB,CAAA;QACD,IAAM,eAAe,GAAG,4BAA4B,CAAC,CAAC,CAAC,CAAA;QACvD,IAAM,YAAY,GAAG,EAAE,CAAA;QACvB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,IAAI,IAAI,CAAC,YAAY,EAAE,CAAC,EAAE,EAAE;YAC3C,YAAY,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;SAC9B;QACD,IAAM,GAAG,GAAG,IAAI,CAAC,oBAAoB,CAAC,qBAAqB,CAAC;YAC1D,sBAAsB,EAAE,eAAe;YACvC,MAAM,EAAE,YAAY;YACpB,QAAQ,EAAE,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC;YACpB,qBAAqB,EAAE,iBAAiB;YACxC,QAAQ,EAAE,QAAQ;SACnB,CAAC,CAAA;QAEF,MAAM,IAAI,CAAC,UAAU,CAAC,IAAI,sCAAkB,CAAC,GAAG,EAAE,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC,CAAA;IAC5E,CAAC;IAED,gEAAgE;IAChE,0CAAmB,GAAnB,UAEE,UAAkB,EAClB,WAA+B;QAE/B,IAAM,QAAQ,GAAG,IAAI,CAAC,mBAAmB,EAAE,CAAA;QAC3C,IAAM,WAAW,GAAG,IAAI,CAAC,kBAAkB,EAAE,CAAC,QAAQ,CAAC,CAAA;QACvD,kH
AAkH;QAClH,IAAM,4BAA4B,GAAG,IAAA,kCAAsB,EACzD,UAAU,EACV,WAAW,EACX,IAAI,CAAC,YAAY,CAClB,CAAA;QAED,IAAM,YAAY,GAAG,EAAE,CAAA;QACvB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,IAAI,IAAI,CAAC,YAAY,EAAE,CAAC,EAAE,EAAE;YAC3C,YAAY,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAA;SAC9B;QACD,IAAM,aAAa,GAAG,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;QAEhC,IAAM,MAAM,GAAG,IAAI,CAAC,oBAAoB,CAAC,uBAAuB,CAAC;YAC/D,mBAAmB,EAAE,4BAA4B;YACjD,MAAM,EAAE,YAAY;YACpB,QAAQ,EAAE,aAAa;YACvB,qBAAqB,EAAE,WAAW;YAClC,QAAQ,EAAE,IAAI,CAAC,mBAAmB,EAAE;SACrC,CAAC,CAAA;QAEF,MAAM,IAAI,CAAC,UAAU,CACnB,IAAI,wCAAoB,CAAC,MAAM,EAAE,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,aAAa,CAAC,CAC5D,CAAA;IACH,CAAC;IACH,mBAAC;AAAD,CAAC,AAnGD,IAmGC;AAnGY,oCAAY"}


@@ -0,0 +1,315 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.GastRecorder = void 0;
var last_1 = __importDefault(require("lodash/last"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var some_1 = __importDefault(require("lodash/some"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var isFunction_1 = __importDefault(require("lodash/isFunction"));
var has_1 = __importDefault(require("lodash/has"));
var gast_1 = require("@chevrotain/gast");
var lexer_public_1 = require("../../../scan/lexer_public");
var tokens_1 = require("../../../scan/tokens");
var tokens_public_1 = require("../../../scan/tokens_public");
var parser_1 = require("../parser");
var keys_1 = require("../../grammar/keys");
var RECORDING_NULL_OBJECT = {
description: "This Object indicates the Parser is in the Recording Phase"
};
Object.freeze(RECORDING_NULL_OBJECT);
var HANDLE_SEPARATOR = true;
var MAX_METHOD_IDX = Math.pow(2, keys_1.BITS_FOR_OCCURRENCE_IDX) - 1;
var RFT = (0, tokens_public_1.createToken)({ name: "RECORDING_PHASE_TOKEN", pattern: lexer_public_1.Lexer.NA });
(0, tokens_1.augmentTokenTypes)([RFT]);
var RECORDING_PHASE_TOKEN = (0, tokens_public_1.createTokenInstance)(RFT, "This IToken indicates the Parser is in Recording Phase\n\t" +
"" +
"See: https://chevrotain.io/docs/guide/internals.html#grammar-recording for details",
// Using "-1" instead of NaN (as in EOF) because an actual number is less likely to
// cause errors if the output of LA or CONSUME were (incorrectly) used during the recording phase.
-1, -1, -1, -1, -1, -1);
Object.freeze(RECORDING_PHASE_TOKEN);
var RECORDING_PHASE_CSTNODE = {
name: "This CSTNode indicates the Parser is in Recording Phase\n\t" +
"See: https://chevrotain.io/docs/guide/internals.html#grammar-recording for details",
children: {}
};
/**
* This trait handles the creation of the GAST structure for Chevrotain Grammars
*/
var GastRecorder = /** @class */ (function () {
function GastRecorder() {
}
GastRecorder.prototype.initGastRecorder = function (config) {
this.recordingProdStack = [];
this.RECORDING_PHASE = false;
};
GastRecorder.prototype.enableRecording = function () {
var _this = this;
this.RECORDING_PHASE = true;
this.TRACE_INIT("Enable Recording", function () {
var _loop_1 = function (i) {
var idx = i > 0 ? i : "";
_this["CONSUME".concat(idx)] = function (arg1, arg2) {
return this.consumeInternalRecord(arg1, i, arg2);
};
_this["SUBRULE".concat(idx)] = function (arg1, arg2) {
return this.subruleInternalRecord(arg1, i, arg2);
};
_this["OPTION".concat(idx)] = function (arg1) {
return this.optionInternalRecord(arg1, i);
};
_this["OR".concat(idx)] = function (arg1) {
return this.orInternalRecord(arg1, i);
};
_this["MANY".concat(idx)] = function (arg1) {
this.manyInternalRecord(i, arg1);
};
_this["MANY_SEP".concat(idx)] = function (arg1) {
this.manySepFirstInternalRecord(i, arg1);
};
_this["AT_LEAST_ONE".concat(idx)] = function (arg1) {
this.atLeastOneInternalRecord(i, arg1);
};
_this["AT_LEAST_ONE_SEP".concat(idx)] = function (arg1) {
this.atLeastOneSepFirstInternalRecord(i, arg1);
};
};
/**
* Warning Dark Voodoo Magic upcoming!
* We are "replacing" the public parsing DSL methods API
* With **new** alternative implementations on the Parser **instance**
*
* So far this is the only way I've found to avoid performance regressions during parsing time.
* - Approx 30% performance regression was measured on Chrome 75 Canary when attempting to replace the "internal"
* implementations directly instead.
*/
for (var i = 0; i < 10; i++) {
_loop_1(i);
}
// DSL methods with the idx(suffix) as an argument
_this["consume"] = function (idx, arg1, arg2) {
return this.consumeInternalRecord(arg1, idx, arg2);
};
_this["subrule"] = function (idx, arg1, arg2) {
return this.subruleInternalRecord(arg1, idx, arg2);
};
_this["option"] = function (idx, arg1) {
return this.optionInternalRecord(arg1, idx);
};
_this["or"] = function (idx, arg1) {
return this.orInternalRecord(arg1, idx);
};
_this["many"] = function (idx, arg1) {
this.manyInternalRecord(idx, arg1);
};
_this["atLeastOne"] = function (idx, arg1) {
this.atLeastOneInternalRecord(idx, arg1);
};
_this.ACTION = _this.ACTION_RECORD;
_this.BACKTRACK = _this.BACKTRACK_RECORD;
_this.LA = _this.LA_RECORD;
});
};
GastRecorder.prototype.disableRecording = function () {
var _this = this;
this.RECORDING_PHASE = false;
// By deleting these **instance** properties, any future invocation
// will be deferred to the original methods on the **prototype** object.
// This seems to get rid of any incorrect optimizations V8 may have
// performed during the recording phase.
this.TRACE_INIT("Deleting Recording methods", function () {
var that = _this;
for (var i = 0; i < 10; i++) {
var idx = i > 0 ? i : "";
delete that["CONSUME".concat(idx)];
delete that["SUBRULE".concat(idx)];
delete that["OPTION".concat(idx)];
delete that["OR".concat(idx)];
delete that["MANY".concat(idx)];
delete that["MANY_SEP".concat(idx)];
delete that["AT_LEAST_ONE".concat(idx)];
delete that["AT_LEAST_ONE_SEP".concat(idx)];
}
delete that["consume"];
delete that["subrule"];
delete that["option"];
delete that["or"];
delete that["many"];
delete that["atLeastOne"];
delete that.ACTION;
delete that.BACKTRACK;
delete that.LA;
});
};
// Parser methods are called inside an ACTION?
// Maybe try/catch/finally on ACTIONS while disabling the recorder's state changes?
// @ts-expect-error -- noop place holder
GastRecorder.prototype.ACTION_RECORD = function (impl) {
// NO-OP during recording
};
// Executing backtracking logic will break our recording logic assumptions
GastRecorder.prototype.BACKTRACK_RECORD = function (grammarRule, args) {
return function () { return true; };
};
// LA is part of the official API and may be used for custom lookahead logic
// by end users who may forget to wrap it in ACTION or inside a GATE
GastRecorder.prototype.LA_RECORD = function (howMuch) {
// We cannot use RECORDING_PHASE_TOKEN here because someone may depend
// on LA returning EOF at the end of the input, so an infinite loop could occur.
return parser_1.END_OF_FILE;
};
GastRecorder.prototype.topLevelRuleRecord = function (name, def) {
try {
var newTopLevelRule = new gast_1.Rule({ definition: [], name: name });
newTopLevelRule.name = name;
this.recordingProdStack.push(newTopLevelRule);
def.call(this);
this.recordingProdStack.pop();
return newTopLevelRule;
}
catch (originalError) {
if (originalError.KNOWN_RECORDER_ERROR !== true) {
try {
originalError.message =
originalError.message +
'\n\t This error was thrown during the "grammar recording phase". For more info see:\n\t' +
"https://chevrotain.io/docs/guide/internals.html#grammar-recording";
}
catch (mutabilityError) {
// We may not be able to modify the original error object
throw originalError;
}
}
throw originalError;
}
};
// Implementation of parsing DSL
GastRecorder.prototype.optionInternalRecord = function (actionORMethodDef, occurrence) {
return recordProd.call(this, gast_1.Option, actionORMethodDef, occurrence);
};
GastRecorder.prototype.atLeastOneInternalRecord = function (occurrence, actionORMethodDef) {
recordProd.call(this, gast_1.RepetitionMandatory, actionORMethodDef, occurrence);
};
GastRecorder.prototype.atLeastOneSepFirstInternalRecord = function (occurrence, options) {
recordProd.call(this, gast_1.RepetitionMandatoryWithSeparator, options, occurrence, HANDLE_SEPARATOR);
};
GastRecorder.prototype.manyInternalRecord = function (occurrence, actionORMethodDef) {
recordProd.call(this, gast_1.Repetition, actionORMethodDef, occurrence);
};
GastRecorder.prototype.manySepFirstInternalRecord = function (occurrence, options) {
recordProd.call(this, gast_1.RepetitionWithSeparator, options, occurrence, HANDLE_SEPARATOR);
};
GastRecorder.prototype.orInternalRecord = function (altsOrOpts, occurrence) {
return recordOrProd.call(this, altsOrOpts, occurrence);
};
GastRecorder.prototype.subruleInternalRecord = function (ruleToCall, occurrence, options) {
assertMethodIdxIsValid(occurrence);
if (!ruleToCall || (0, has_1.default)(ruleToCall, "ruleName") === false) {
var error = new Error("<SUBRULE".concat(getIdxSuffix(occurrence), "> argument is invalid") +
" expecting a Parser method reference but got: <".concat(JSON.stringify(ruleToCall), ">") +
"\n inside top level rule: <".concat(this.recordingProdStack[0].name, ">"));
error.KNOWN_RECORDER_ERROR = true;
throw error;
}
var prevProd = (0, last_1.default)(this.recordingProdStack);
var ruleName = ruleToCall.ruleName;
var newNoneTerminal = new gast_1.NonTerminal({
idx: occurrence,
nonTerminalName: ruleName,
label: options === null || options === void 0 ? void 0 : options.LABEL,
// The resolving of the `referencedRule` property will be done once all the Rules' GASTs have been created
referencedRule: undefined
});
prevProd.definition.push(newNoneTerminal);
return this.outputCst ? RECORDING_PHASE_CSTNODE : RECORDING_NULL_OBJECT;
};
GastRecorder.prototype.consumeInternalRecord = function (tokType, occurrence, options) {
assertMethodIdxIsValid(occurrence);
if (!(0, tokens_1.hasShortKeyProperty)(tokType)) {
var error = new Error("<CONSUME".concat(getIdxSuffix(occurrence), "> argument is invalid") +
" expecting a TokenType reference but got: <".concat(JSON.stringify(tokType), ">") +
"\n inside top level rule: <".concat(this.recordingProdStack[0].name, ">"));
error.KNOWN_RECORDER_ERROR = true;
throw error;
}
var prevProd = (0, last_1.default)(this.recordingProdStack);
var newNoneTerminal = new gast_1.Terminal({
idx: occurrence,
terminalType: tokType,
label: options === null || options === void 0 ? void 0 : options.LABEL
});
prevProd.definition.push(newNoneTerminal);
return RECORDING_PHASE_TOKEN;
};
return GastRecorder;
}());
exports.GastRecorder = GastRecorder;
function recordProd(prodConstructor, mainProdArg, occurrence, handleSep) {
if (handleSep === void 0) { handleSep = false; }
assertMethodIdxIsValid(occurrence);
var prevProd = (0, last_1.default)(this.recordingProdStack);
var grammarAction = (0, isFunction_1.default)(mainProdArg) ? mainProdArg : mainProdArg.DEF;
var newProd = new prodConstructor({ definition: [], idx: occurrence });
if (handleSep) {
newProd.separator = mainProdArg.SEP;
}
if ((0, has_1.default)(mainProdArg, "MAX_LOOKAHEAD")) {
newProd.maxLookahead = mainProdArg.MAX_LOOKAHEAD;
}
this.recordingProdStack.push(newProd);
grammarAction.call(this);
prevProd.definition.push(newProd);
this.recordingProdStack.pop();
return RECORDING_NULL_OBJECT;
}
function recordOrProd(mainProdArg, occurrence) {
var _this = this;
assertMethodIdxIsValid(occurrence);
var prevProd = (0, last_1.default)(this.recordingProdStack);
// When mainProdArg is only an array of alternatives there are no additional options
var hasOptions = (0, isArray_1.default)(mainProdArg) === false;
var alts = hasOptions === false ? mainProdArg : mainProdArg.DEF;
var newOrProd = new gast_1.Alternation({
definition: [],
idx: occurrence,
ignoreAmbiguities: hasOptions && mainProdArg.IGNORE_AMBIGUITIES === true
});
if ((0, has_1.default)(mainProdArg, "MAX_LOOKAHEAD")) {
newOrProd.maxLookahead = mainProdArg.MAX_LOOKAHEAD;
}
var hasPredicates = (0, some_1.default)(alts, function (currAlt) { return (0, isFunction_1.default)(currAlt.GATE); });
newOrProd.hasPredicates = hasPredicates;
prevProd.definition.push(newOrProd);
(0, forEach_1.default)(alts, function (currAlt) {
var currAltFlat = new gast_1.Alternative({ definition: [] });
newOrProd.definition.push(currAltFlat);
if ((0, has_1.default)(currAlt, "IGNORE_AMBIGUITIES")) {
currAltFlat.ignoreAmbiguities = currAlt.IGNORE_AMBIGUITIES; // assumes end user provides the correct config value/type
}
// **implicit** ignoreAmbiguities due to usage of gate
else if ((0, has_1.default)(currAlt, "GATE")) {
currAltFlat.ignoreAmbiguities = true;
}
_this.recordingProdStack.push(currAltFlat);
currAlt.ALT.call(_this);
_this.recordingProdStack.pop();
});
return RECORDING_NULL_OBJECT;
}
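`recordProd` and `recordOrProd` above build the grammar tree by pushing a new production onto `recordingProdStack`, invoking the user's grammar action (whose nested DSL calls then append to that production), and popping when done. The same push/run/pop recording idea in a self-contained sketch; the node shapes and the `recordTerminal`/`recordProd` helpers here are simplified stand-ins, not chevrotain's GAST classes:

```javascript
// Minimal stack-based recorder: "DSL" calls append to whatever node is
// currently on top of the stack, so nesting falls out of push/run/pop.
var recordingStack = [{ name: "root", definition: [] }];

function recordTerminal(tokName) {
  recordingStack[recordingStack.length - 1].definition.push({ terminal: tokName });
}

function recordProd(name, action) {
  var newProd = { name: name, definition: [] };
  recordingStack[recordingStack.length - 1].definition.push(newProd);
  recordingStack.push(newProd);
  try {
    action(); // nested record* calls land inside newProd
  } finally {
    recordingStack.pop(); // restore the parent as the recording target
  }
}

recordProd("option", function () {
  recordTerminal("Comma");
  recordTerminal("Identifier");
});
recordTerminal("Semicolon");

var root = recordingStack[0];
console.log(JSON.stringify(root.definition.map(function (n) { return n.name || n.terminal; })));
// → ["option","Semicolon"]
```

The real trait adds per-production details (occurrence `idx`, separators, `MAX_LOOKAHEAD`) on top of this skeleton, but the tree shape comes entirely from the stack discipline.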
function getIdxSuffix(idx) {
return idx === 0 ? "" : "".concat(idx);
}
function assertMethodIdxIsValid(idx) {
if (idx < 0 || idx > MAX_METHOD_IDX) {
var error = new Error(
// The stack trace will contain all the needed details
"Invalid DSL Method idx value: <".concat(idx, ">\n\t") +
"Idx value must be a non-negative value smaller than ".concat(MAX_METHOD_IDX + 1));
error.KNOWN_RECORDER_ERROR = true;
throw error;
}
}
//# sourceMappingURL=gast_recorder.js.map

File diff suppressed because one or more lines are too long


@@ -0,0 +1,81 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.LexerAdapter = void 0;
var parser_1 = require("../parser");
/**
 * Trait responsible for abstracting over the interaction with the Lexer's output (Token vector).
*
* This could be generalized to support other kinds of lexers, e.g.
* - Just in Time Lexing / Lexer-Less parsing.
* - Streaming Lexer.
*/
var LexerAdapter = /** @class */ (function () {
function LexerAdapter() {
}
LexerAdapter.prototype.initLexerAdapter = function () {
this.tokVector = [];
this.tokVectorLength = 0;
this.currIdx = -1;
};
Object.defineProperty(LexerAdapter.prototype, "input", {
get: function () {
return this.tokVector;
},
set: function (newInput) {
// @ts-ignore - `this parameter` not supported in setters/getters
// - https://www.typescriptlang.org/docs/handbook/functions.html#this-parameters
if (this.selfAnalysisDone !== true) {
throw Error("Missing <performSelfAnalysis> invocation at the end of the Parser's constructor.");
}
// @ts-ignore - `this parameter` not supported in setters/getters
// - https://www.typescriptlang.org/docs/handbook/functions.html#this-parameters
this.reset();
this.tokVector = newInput;
this.tokVectorLength = newInput.length;
},
enumerable: false,
configurable: true
});
// skips a token and returns the next token
LexerAdapter.prototype.SKIP_TOKEN = function () {
if (this.currIdx <= this.tokVector.length - 2) {
this.consumeToken();
return this.LA(1);
}
else {
return parser_1.END_OF_FILE;
}
};
// Lexer (accessing Token vector) related methods which can be overridden to implement lazy lexers
// or lexers dependent on parser context.
LexerAdapter.prototype.LA = function (howMuch) {
var soughtIdx = this.currIdx + howMuch;
if (soughtIdx < 0 || this.tokVectorLength <= soughtIdx) {
return parser_1.END_OF_FILE;
}
else {
return this.tokVector[soughtIdx];
}
};
LexerAdapter.prototype.consumeToken = function () {
this.currIdx++;
};
LexerAdapter.prototype.exportLexerState = function () {
return this.currIdx;
};
LexerAdapter.prototype.importLexerState = function (newState) {
this.currIdx = newState;
};
LexerAdapter.prototype.resetLexerState = function () {
this.currIdx = -1;
};
LexerAdapter.prototype.moveToTerminatedState = function () {
this.currIdx = this.tokVector.length - 1;
};
LexerAdapter.prototype.getLexerPosition = function () {
return this.exportLexerState();
};
return LexerAdapter;
}());
exports.LexerAdapter = LexerAdapter;
//# sourceMappingURL=lexer_adapter.js.map
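The adapter above is just a cursor over an in-memory token array: `LA(1)` peeks, `consumeToken` advances, and out-of-range lookups return a sentinel `END_OF_FILE`. A minimal version of that contract (the `EOF` sentinel, `TokenCursor` name, and token shape are simplified placeholders, not chevrotain's real types):

```javascript
// Cursor over a token vector with the same LA/consume semantics as LexerAdapter:
// currIdx starts at -1 so LA(1) is the first token before anything is consumed.
var EOF = { image: "<EOF>" }; // placeholder sentinel, like parser_1.END_OF_FILE

function TokenCursor(tokens) {
  this.tokVector = tokens;
  this.currIdx = -1;
}
TokenCursor.prototype.LA = function (howMuch) {
  var soughtIdx = this.currIdx + howMuch;
  if (soughtIdx < 0 || soughtIdx >= this.tokVector.length) {
    return EOF;
  }
  return this.tokVector[soughtIdx];
};
TokenCursor.prototype.consumeToken = function () {
  this.currIdx++;
};

var cursor = new TokenCursor([{ image: "let" }, { image: "x" }]);
console.log(cursor.LA(1).image); // → "let"
cursor.consumeToken();
console.log(cursor.LA(1).image); // → "x"
cursor.consumeToken();
console.log(cursor.LA(1) === EOF); // → true
```

Starting `currIdx` at -1 (rather than 0) is what makes `LA(0)` mean "the previously consumed token", which the error-handler trait above relies on when it passes `this.LA(0)` as the `previous` token.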


@@ -0,0 +1 @@
{"version":3,"file":"lexer_adapter.js","sourceRoot":"","sources":["../../../../../src/parse/parser/traits/lexer_adapter.ts"],"names":[],"mappings":";;;AAAA,oCAAuC;AAIvC;;;;;;GAMG;AACH;IAAA;IA0EA,CAAC;IArEC,uCAAgB,GAAhB;QACE,IAAI,CAAC,SAAS,GAAG,EAAE,CAAA;QACnB,IAAI,CAAC,eAAe,GAAG,CAAC,CAAA;QACxB,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC,CAAA;IACnB,CAAC;IAED,sBAAI,+BAAK;aAeT;YACE,OAAO,IAAI,CAAC,SAAS,CAAA;QACvB,CAAC;aAjBD,UAAU,QAAkB;YAC1B,iEAAiE;YACjE,kFAAkF;YAClF,IAAI,IAAI,CAAC,gBAAgB,KAAK,IAAI,EAAE;gBAClC,MAAM,KAAK,CACT,kFAAkF,CACnF,CAAA;aACF;YACD,iEAAiE;YACjE,kFAAkF;YAClF,IAAI,CAAC,KAAK,EAAE,CAAA;YACZ,IAAI,CAAC,SAAS,GAAG,QAAQ,CAAA;YACzB,IAAI,CAAC,eAAe,GAAG,QAAQ,CAAC,MAAM,CAAA;QACxC,CAAC;;;OAAA;IAMD,2CAA2C;IAC3C,iCAAU,GAAV;QACE,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC7C,IAAI,CAAC,YAAY,EAAE,CAAA;YACnB,OAAO,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;SAClB;aAAM;YACL,OAAO,oBAAW,CAAA;SACnB;IACH,CAAC;IAED,kGAAkG;IAClG,yCAAyC;IACzC,yBAAE,GAAF,UAAwB,OAAe;QACrC,IAAM,SAAS,GAAG,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACxC,IAAI,SAAS,GAAG,CAAC,IAAI,IAAI,CAAC,eAAe,IAAI,SAAS,EAAE;YACtD,OAAO,oBAAW,CAAA;SACnB;aAAM;YACL,OAAO,IAAI,CAAC,SAAS,CAAC,SAAS,CAAC,CAAA;SACjC;IACH,CAAC;IAED,mCAAY,GAAZ;QACE,IAAI,CAAC,OAAO,EAAE,CAAA;IAChB,CAAC;IAED,uCAAgB,GAAhB;QACE,OAAO,IAAI,CAAC,OAAO,CAAA;IACrB,CAAC;IAED,uCAAgB,GAAhB,UAAsC,QAAgB;QACpD,IAAI,CAAC,OAAO,GAAG,QAAQ,CAAA;IACzB,CAAC;IAED,sCAAe,GAAf;QACE,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC,CAAA;IACnB,CAAC;IAED,4CAAqB,GAArB;QACE,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,CAAA;IAC1C,CAAC;IAED,uCAAgB,GAAhB;QACE,OAAO,IAAI,CAAC,gBAAgB,EAAE,CAAA;IAChC,CAAC;IACH,mBAAC;AAAD,CAAC,AA1ED,IA0EC;AA1EY,oCAAY"}


@@ -0,0 +1,167 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.collectMethods = exports.LooksAhead = void 0;
var forEach_1 = __importDefault(require("lodash/forEach"));
var has_1 = __importDefault(require("lodash/has"));
var parser_1 = require("../parser");
var keys_1 = require("../../grammar/keys");
var gast_1 = require("@chevrotain/gast");
var gast_2 = require("@chevrotain/gast");
var llk_lookahead_1 = require("../../grammar/llk_lookahead");
/**
* Trait responsible for the lookahead related utilities and optimizations.
*/
var LooksAhead = /** @class */ (function () {
function LooksAhead() {
}
LooksAhead.prototype.initLooksAhead = function (config) {
this.dynamicTokensEnabled = (0, has_1.default)(config, "dynamicTokensEnabled")
? config.dynamicTokensEnabled // assumes end user provides the correct config value/type
: parser_1.DEFAULT_PARSER_CONFIG.dynamicTokensEnabled;
this.maxLookahead = (0, has_1.default)(config, "maxLookahead")
? config.maxLookahead // assumes end user provides the correct config value/type
: parser_1.DEFAULT_PARSER_CONFIG.maxLookahead;
this.lookaheadStrategy = (0, has_1.default)(config, "lookaheadStrategy")
? config.lookaheadStrategy // assumes end user provides the correct config value/type
: new llk_lookahead_1.LLkLookaheadStrategy({ maxLookahead: this.maxLookahead });
this.lookAheadFuncsCache = new Map();
};
LooksAhead.prototype.preComputeLookaheadFunctions = function (rules) {
var _this = this;
(0, forEach_1.default)(rules, function (currRule) {
_this.TRACE_INIT("".concat(currRule.name, " Rule Lookahead"), function () {
var _a = collectMethods(currRule), alternation = _a.alternation, repetition = _a.repetition, option = _a.option, repetitionMandatory = _a.repetitionMandatory, repetitionMandatoryWithSeparator = _a.repetitionMandatoryWithSeparator, repetitionWithSeparator = _a.repetitionWithSeparator;
(0, forEach_1.default)(alternation, function (currProd) {
var prodIdx = currProd.idx === 0 ? "" : currProd.idx;
_this.TRACE_INIT("".concat((0, gast_2.getProductionDslName)(currProd)).concat(prodIdx), function () {
var laFunc = _this.lookaheadStrategy.buildLookaheadForAlternation({
prodOccurrence: currProd.idx,
rule: currRule,
maxLookahead: currProd.maxLookahead || _this.maxLookahead,
hasPredicates: currProd.hasPredicates,
dynamicTokensEnabled: _this.dynamicTokensEnabled
});
var key = (0, keys_1.getKeyForAutomaticLookahead)(_this.fullRuleNameToShort[currRule.name], keys_1.OR_IDX, currProd.idx);
_this.setLaFuncCache(key, laFunc);
});
});
(0, forEach_1.default)(repetition, function (currProd) {
_this.computeLookaheadFunc(currRule, currProd.idx, keys_1.MANY_IDX, "Repetition", currProd.maxLookahead, (0, gast_2.getProductionDslName)(currProd));
});
(0, forEach_1.default)(option, function (currProd) {
_this.computeLookaheadFunc(currRule, currProd.idx, keys_1.OPTION_IDX, "Option", currProd.maxLookahead, (0, gast_2.getProductionDslName)(currProd));
});
(0, forEach_1.default)(repetitionMandatory, function (currProd) {
_this.computeLookaheadFunc(currRule, currProd.idx, keys_1.AT_LEAST_ONE_IDX, "RepetitionMandatory", currProd.maxLookahead, (0, gast_2.getProductionDslName)(currProd));
});
(0, forEach_1.default)(repetitionMandatoryWithSeparator, function (currProd) {
_this.computeLookaheadFunc(currRule, currProd.idx, keys_1.AT_LEAST_ONE_SEP_IDX, "RepetitionMandatoryWithSeparator", currProd.maxLookahead, (0, gast_2.getProductionDslName)(currProd));
});
(0, forEach_1.default)(repetitionWithSeparator, function (currProd) {
_this.computeLookaheadFunc(currRule, currProd.idx, keys_1.MANY_SEP_IDX, "RepetitionWithSeparator", currProd.maxLookahead, (0, gast_2.getProductionDslName)(currProd));
});
});
});
};
LooksAhead.prototype.computeLookaheadFunc = function (rule, prodOccurrence, prodKey, prodType, prodMaxLookahead, dslMethodName) {
var _this = this;
this.TRACE_INIT("".concat(dslMethodName).concat(prodOccurrence === 0 ? "" : prodOccurrence), function () {
var laFunc = _this.lookaheadStrategy.buildLookaheadForOptional({
prodOccurrence: prodOccurrence,
rule: rule,
maxLookahead: prodMaxLookahead || _this.maxLookahead,
dynamicTokensEnabled: _this.dynamicTokensEnabled,
prodType: prodType
});
var key = (0, keys_1.getKeyForAutomaticLookahead)(_this.fullRuleNameToShort[rule.name], prodKey, prodOccurrence);
_this.setLaFuncCache(key, laFunc);
});
};
// this actually returns a number, but it is always used as a string (object prop key)
LooksAhead.prototype.getKeyForAutomaticLookahead = function (dslMethodIdx, occurrence) {
var currRuleShortName = this.getLastExplicitRuleShortName();
return (0, keys_1.getKeyForAutomaticLookahead)(currRuleShortName, dslMethodIdx, occurrence);
};
LooksAhead.prototype.getLaFuncFromCache = function (key) {
return this.lookAheadFuncsCache.get(key);
};
/* istanbul ignore next */
LooksAhead.prototype.setLaFuncCache = function (key, value) {
this.lookAheadFuncsCache.set(key, value);
};
return LooksAhead;
}());
exports.LooksAhead = LooksAhead;
var DslMethodsCollectorVisitor = /** @class */ (function (_super) {
__extends(DslMethodsCollectorVisitor, _super);
function DslMethodsCollectorVisitor() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.dslMethods = {
option: [],
alternation: [],
repetition: [],
repetitionWithSeparator: [],
repetitionMandatory: [],
repetitionMandatoryWithSeparator: []
};
return _this;
}
DslMethodsCollectorVisitor.prototype.reset = function () {
this.dslMethods = {
option: [],
alternation: [],
repetition: [],
repetitionWithSeparator: [],
repetitionMandatory: [],
repetitionMandatoryWithSeparator: []
};
};
DslMethodsCollectorVisitor.prototype.visitOption = function (option) {
this.dslMethods.option.push(option);
};
DslMethodsCollectorVisitor.prototype.visitRepetitionWithSeparator = function (manySep) {
this.dslMethods.repetitionWithSeparator.push(manySep);
};
DslMethodsCollectorVisitor.prototype.visitRepetitionMandatory = function (atLeastOne) {
this.dslMethods.repetitionMandatory.push(atLeastOne);
};
DslMethodsCollectorVisitor.prototype.visitRepetitionMandatoryWithSeparator = function (atLeastOneSep) {
this.dslMethods.repetitionMandatoryWithSeparator.push(atLeastOneSep);
};
DslMethodsCollectorVisitor.prototype.visitRepetition = function (many) {
this.dslMethods.repetition.push(many);
};
DslMethodsCollectorVisitor.prototype.visitAlternation = function (or) {
this.dslMethods.alternation.push(or);
};
return DslMethodsCollectorVisitor;
}(gast_1.GAstVisitor));
var collectorVisitor = new DslMethodsCollectorVisitor();
function collectMethods(rule) {
collectorVisitor.reset();
rule.accept(collectorVisitor);
var dslMethods = collectorVisitor.dslMethods;
// avoid uncleaned references
collectorVisitor.reset();
return dslMethods;
}
exports.collectMethods = collectMethods;
//# sourceMappingURL=looksahead.js.map
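`collectMethods` above reuses a single module-level visitor instance, resetting it both before the walk and after copying out the result so that no grammar-node references outlive the call ("avoid uncleaned references"). A minimal standalone sketch of that reset-collect-reset pattern (the node shapes and `visit` dispatch here are hypothetical, not the chevrotain `GAstVisitor` API):

```javascript
// One shared collector instance, in the style of DslMethodsCollectorVisitor:
// reset before and after each use so no node references are retained.
class Collector {
  constructor() {
    this.reset();
  }
  reset() {
    this.methods = { option: [], repetition: [] };
  }
  visit(node) {
    // dispatch on a simple type tag (chevrotain instead dispatches via accept())
    if (node.type === "option") this.methods.option.push(node);
    if (node.type === "repetition") this.methods.repetition.push(node);
    (node.children || []).forEach((child) => this.visit(child));
  }
}

const sharedCollector = new Collector();

function collect(rootNode) {
  sharedCollector.reset();
  sharedCollector.visit(rootNode);
  const result = sharedCollector.methods;
  // clear the shared instance again, as collectMethods does above
  sharedCollector.reset();
  return result;
}
```

The shared instance avoids re-allocating the collector per rule; the trailing `reset()` is what keeps it from pinning the last-visited grammar in memory.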

File diff suppressed because one or more lines are too long


@@ -0,0 +1,7 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.EmbeddedActionsParser = exports.CstParser = void 0;
var parser_1 = require("../parser");
exports.CstParser = (parser_1.CstParser);
exports.EmbeddedActionsParser = parser_1.EmbeddedActionsParser;
//# sourceMappingURL=parser_traits.js.map


@@ -0,0 +1 @@
{"version":3,"file":"parser_traits.js","sourceRoot":"","sources":["../../../../../src/parse/parser/traits/parser_traits.ts"],"names":[],"mappings":";;;AAOA,oCAIkB;AAiCL,QAAA,SAAS,GAAqC,CACzD,kBAAyB,CAC1B,CAAA;AASY,QAAA,qBAAqB,GAEjC,8BAAoC,CAAA"}


@@ -0,0 +1,58 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.PerformanceTracer = void 0;
var has_1 = __importDefault(require("lodash/has"));
var utils_1 = require("@chevrotain/utils");
var parser_1 = require("../parser");
/**
 * Trait responsible for tracing the parser's initialization performance.
*/
var PerformanceTracer = /** @class */ (function () {
function PerformanceTracer() {
}
PerformanceTracer.prototype.initPerformanceTracer = function (config) {
if ((0, has_1.default)(config, "traceInitPerf")) {
var userTraceInitPerf = config.traceInitPerf;
var traceIsNumber = typeof userTraceInitPerf === "number";
this.traceInitMaxIdent = traceIsNumber
? userTraceInitPerf
: Infinity;
this.traceInitPerf = traceIsNumber
? userTraceInitPerf > 0
: userTraceInitPerf; // assumes end user provides the correct config value/type
}
else {
this.traceInitMaxIdent = 0;
this.traceInitPerf = parser_1.DEFAULT_PARSER_CONFIG.traceInitPerf;
}
this.traceInitIndent = -1;
};
PerformanceTracer.prototype.TRACE_INIT = function (phaseDesc, phaseImpl) {
        // No need to optimize this using the NOOP pattern because
        // it is not called in a hot spot...
if (this.traceInitPerf === true) {
this.traceInitIndent++;
var indent = new Array(this.traceInitIndent + 1).join("\t");
if (this.traceInitIndent < this.traceInitMaxIdent) {
console.log("".concat(indent, "--> <").concat(phaseDesc, ">"));
}
var _a = (0, utils_1.timer)(phaseImpl), time = _a.time, value = _a.value;
/* istanbul ignore next - Difficult to reproduce specific performance behavior (>10ms) in tests */
var traceMethod = time > 10 ? console.warn : console.log;
if (this.traceInitIndent < this.traceInitMaxIdent) {
traceMethod("".concat(indent, "<-- <").concat(phaseDesc, "> time: ").concat(time, "ms"));
}
this.traceInitIndent--;
return value;
}
else {
return phaseImpl();
}
};
return PerformanceTracer;
}());
exports.PerformanceTracer = PerformanceTracer;
//# sourceMappingURL=perf_tracer.js.map
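`TRACE_INIT` above wraps each initialization phase in a `timer` helper from `@chevrotain/utils`, indents nested phases, and switches from `console.log` to `console.warn` for phases slower than 10ms. A self-contained sketch of the same wrap-and-time pattern (the `{ time, value }` shape of `timer` is inferred from its usage above, not copied from `@chevrotain/utils`):

```javascript
// Measure a phase and return both its duration and its return value,
// mirroring how TRACE_INIT destructures { time, value } above.
function timer(fn) {
  const start = Date.now();
  const value = fn();
  return { time: Date.now() - start, value };
}

let traceIndent = -1;

function tracePhase(phaseDesc, phaseImpl) {
  traceIndent++;
  const indent = "\t".repeat(traceIndent);
  console.log(`${indent}--> <${phaseDesc}>`);
  const { time, value } = timer(phaseImpl);
  // slow phases (>10ms) are surfaced via console.warn, as in TRACE_INIT
  const log = time > 10 ? console.warn : console.log;
  log(`${indent}<-- <${phaseDesc}> time: ${time}ms`);
  traceIndent--;
  return value;
}
```

Because the phase's return value is passed through, tracing can be wrapped around any initialization step without changing its callers.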


@@ -0,0 +1 @@
{"version":3,"file":"perf_tracer.js","sourceRoot":"","sources":["../../../../../src/parse/parser/traits/perf_tracer.ts"],"names":[],"mappings":";;;;;;AACA,mDAA4B;AAC5B,2CAAyC;AAEzC,oCAAiD;AAEjD;;GAEG;AACH;IAAA;IA4CA,CAAC;IAvCC,iDAAqB,GAArB,UAAsB,MAAqB;QACzC,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,eAAe,CAAC,EAAE;YAChC,IAAM,iBAAiB,GAAG,MAAM,CAAC,aAAa,CAAA;YAC9C,IAAM,aAAa,GAAG,OAAO,iBAAiB,KAAK,QAAQ,CAAA;YAC3D,IAAI,CAAC,iBAAiB,GAAG,aAAa;gBACpC,CAAC,CAAS,iBAAiB;gBAC3B,CAAC,CAAC,QAAQ,CAAA;YACZ,IAAI,CAAC,aAAa,GAAG,aAAa;gBAChC,CAAC,CAAC,iBAAiB,GAAG,CAAC;gBACvB,CAAC,CAAE,iBAA6B,CAAA,CAAC,0DAA0D;SAC9F;aAAM;YACL,IAAI,CAAC,iBAAiB,GAAG,CAAC,CAAA;YAC1B,IAAI,CAAC,aAAa,GAAG,8BAAqB,CAAC,aAAa,CAAA;SACzD;QAED,IAAI,CAAC,eAAe,GAAG,CAAC,CAAC,CAAA;IAC3B,CAAC;IAED,sCAAU,GAAV,UAAmC,SAAiB,EAAE,SAAkB;QACtE,sDAAsD;QACtD,oCAAoC;QACpC,IAAI,IAAI,CAAC,aAAa,KAAK,IAAI,EAAE;YAC/B,IAAI,CAAC,eAAe,EAAE,CAAA;YACtB,IAAM,MAAM,GAAG,IAAI,KAAK,CAAC,IAAI,CAAC,eAAe,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,CAAA;YAC7D,IAAI,IAAI,CAAC,eAAe,GAAG,IAAI,CAAC,iBAAiB,EAAE;gBACjD,OAAO,CAAC,GAAG,CAAC,UAAG,MAAM,kBAAQ,SAAS,MAAG,CAAC,CAAA;aAC3C;YACK,IAAA,KAAkB,IAAA,aAAK,EAAC,SAAS,CAAC,EAAhC,IAAI,UAAA,EAAE,KAAK,WAAqB,CAAA;YACxC,kGAAkG;YAClG,IAAM,WAAW,GAAG,IAAI,GAAG,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAA;YAC1D,IAAI,IAAI,CAAC,eAAe,GAAG,IAAI,CAAC,iBAAiB,EAAE;gBACjD,WAAW,CAAC,UAAG,MAAM,kBAAQ,SAAS,qBAAW,IAAI,OAAI,CAAC,CAAA;aAC3D;YACD,IAAI,CAAC,eAAe,EAAE,CAAA;YACtB,OAAO,KAAK,CAAA;SACb;aAAM;YACL,OAAO,SAAS,EAAE,CAAA;SACnB;IACH,CAAC;IACH,wBAAC;AAAD,CAAC,AA5CD,IA4CC;AA5CY,8CAAiB"}


@@ -0,0 +1,347 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.RecognizerApi = void 0;
var values_1 = __importDefault(require("lodash/values"));
var includes_1 = __importDefault(require("lodash/includes"));
var exceptions_public_1 = require("../../exceptions_public");
var parser_1 = require("../parser");
var errors_public_1 = require("../../errors_public");
var checks_1 = require("../../grammar/checks");
var gast_1 = require("@chevrotain/gast");
/**
* This trait is responsible for implementing the public API
* for defining Chevrotain parsers, i.e:
* - CONSUME
* - RULE
* - OPTION
* - ...
*/
var RecognizerApi = /** @class */ (function () {
function RecognizerApi() {
}
RecognizerApi.prototype.ACTION = function (impl) {
return impl.call(this);
};
RecognizerApi.prototype.consume = function (idx, tokType, options) {
return this.consumeInternal(tokType, idx, options);
};
RecognizerApi.prototype.subrule = function (idx, ruleToCall, options) {
return this.subruleInternal(ruleToCall, idx, options);
};
RecognizerApi.prototype.option = function (idx, actionORMethodDef) {
return this.optionInternal(actionORMethodDef, idx);
};
RecognizerApi.prototype.or = function (idx, altsOrOpts) {
return this.orInternal(altsOrOpts, idx);
};
RecognizerApi.prototype.many = function (idx, actionORMethodDef) {
return this.manyInternal(idx, actionORMethodDef);
};
RecognizerApi.prototype.atLeastOne = function (idx, actionORMethodDef) {
return this.atLeastOneInternal(idx, actionORMethodDef);
};
RecognizerApi.prototype.CONSUME = function (tokType, options) {
return this.consumeInternal(tokType, 0, options);
};
RecognizerApi.prototype.CONSUME1 = function (tokType, options) {
return this.consumeInternal(tokType, 1, options);
};
RecognizerApi.prototype.CONSUME2 = function (tokType, options) {
return this.consumeInternal(tokType, 2, options);
};
RecognizerApi.prototype.CONSUME3 = function (tokType, options) {
return this.consumeInternal(tokType, 3, options);
};
RecognizerApi.prototype.CONSUME4 = function (tokType, options) {
return this.consumeInternal(tokType, 4, options);
};
RecognizerApi.prototype.CONSUME5 = function (tokType, options) {
return this.consumeInternal(tokType, 5, options);
};
RecognizerApi.prototype.CONSUME6 = function (tokType, options) {
return this.consumeInternal(tokType, 6, options);
};
RecognizerApi.prototype.CONSUME7 = function (tokType, options) {
return this.consumeInternal(tokType, 7, options);
};
RecognizerApi.prototype.CONSUME8 = function (tokType, options) {
return this.consumeInternal(tokType, 8, options);
};
RecognizerApi.prototype.CONSUME9 = function (tokType, options) {
return this.consumeInternal(tokType, 9, options);
};
RecognizerApi.prototype.SUBRULE = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 0, options);
};
RecognizerApi.prototype.SUBRULE1 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 1, options);
};
RecognizerApi.prototype.SUBRULE2 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 2, options);
};
RecognizerApi.prototype.SUBRULE3 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 3, options);
};
RecognizerApi.prototype.SUBRULE4 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 4, options);
};
RecognizerApi.prototype.SUBRULE5 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 5, options);
};
RecognizerApi.prototype.SUBRULE6 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 6, options);
};
RecognizerApi.prototype.SUBRULE7 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 7, options);
};
RecognizerApi.prototype.SUBRULE8 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 8, options);
};
RecognizerApi.prototype.SUBRULE9 = function (ruleToCall, options) {
return this.subruleInternal(ruleToCall, 9, options);
};
RecognizerApi.prototype.OPTION = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 0);
};
RecognizerApi.prototype.OPTION1 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 1);
};
RecognizerApi.prototype.OPTION2 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 2);
};
RecognizerApi.prototype.OPTION3 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 3);
};
RecognizerApi.prototype.OPTION4 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 4);
};
RecognizerApi.prototype.OPTION5 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 5);
};
RecognizerApi.prototype.OPTION6 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 6);
};
RecognizerApi.prototype.OPTION7 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 7);
};
RecognizerApi.prototype.OPTION8 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 8);
};
RecognizerApi.prototype.OPTION9 = function (actionORMethodDef) {
return this.optionInternal(actionORMethodDef, 9);
};
RecognizerApi.prototype.OR = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 0);
};
RecognizerApi.prototype.OR1 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 1);
};
RecognizerApi.prototype.OR2 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 2);
};
RecognizerApi.prototype.OR3 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 3);
};
RecognizerApi.prototype.OR4 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 4);
};
RecognizerApi.prototype.OR5 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 5);
};
RecognizerApi.prototype.OR6 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 6);
};
RecognizerApi.prototype.OR7 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 7);
};
RecognizerApi.prototype.OR8 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 8);
};
RecognizerApi.prototype.OR9 = function (altsOrOpts) {
return this.orInternal(altsOrOpts, 9);
};
RecognizerApi.prototype.MANY = function (actionORMethodDef) {
this.manyInternal(0, actionORMethodDef);
};
RecognizerApi.prototype.MANY1 = function (actionORMethodDef) {
this.manyInternal(1, actionORMethodDef);
};
RecognizerApi.prototype.MANY2 = function (actionORMethodDef) {
this.manyInternal(2, actionORMethodDef);
};
RecognizerApi.prototype.MANY3 = function (actionORMethodDef) {
this.manyInternal(3, actionORMethodDef);
};
RecognizerApi.prototype.MANY4 = function (actionORMethodDef) {
this.manyInternal(4, actionORMethodDef);
};
RecognizerApi.prototype.MANY5 = function (actionORMethodDef) {
this.manyInternal(5, actionORMethodDef);
};
RecognizerApi.prototype.MANY6 = function (actionORMethodDef) {
this.manyInternal(6, actionORMethodDef);
};
RecognizerApi.prototype.MANY7 = function (actionORMethodDef) {
this.manyInternal(7, actionORMethodDef);
};
RecognizerApi.prototype.MANY8 = function (actionORMethodDef) {
this.manyInternal(8, actionORMethodDef);
};
RecognizerApi.prototype.MANY9 = function (actionORMethodDef) {
this.manyInternal(9, actionORMethodDef);
};
RecognizerApi.prototype.MANY_SEP = function (options) {
this.manySepFirstInternal(0, options);
};
RecognizerApi.prototype.MANY_SEP1 = function (options) {
this.manySepFirstInternal(1, options);
};
RecognizerApi.prototype.MANY_SEP2 = function (options) {
this.manySepFirstInternal(2, options);
};
RecognizerApi.prototype.MANY_SEP3 = function (options) {
this.manySepFirstInternal(3, options);
};
RecognizerApi.prototype.MANY_SEP4 = function (options) {
this.manySepFirstInternal(4, options);
};
RecognizerApi.prototype.MANY_SEP5 = function (options) {
this.manySepFirstInternal(5, options);
};
RecognizerApi.prototype.MANY_SEP6 = function (options) {
this.manySepFirstInternal(6, options);
};
RecognizerApi.prototype.MANY_SEP7 = function (options) {
this.manySepFirstInternal(7, options);
};
RecognizerApi.prototype.MANY_SEP8 = function (options) {
this.manySepFirstInternal(8, options);
};
RecognizerApi.prototype.MANY_SEP9 = function (options) {
this.manySepFirstInternal(9, options);
};
RecognizerApi.prototype.AT_LEAST_ONE = function (actionORMethodDef) {
this.atLeastOneInternal(0, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE1 = function (actionORMethodDef) {
return this.atLeastOneInternal(1, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE2 = function (actionORMethodDef) {
this.atLeastOneInternal(2, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE3 = function (actionORMethodDef) {
this.atLeastOneInternal(3, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE4 = function (actionORMethodDef) {
this.atLeastOneInternal(4, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE5 = function (actionORMethodDef) {
this.atLeastOneInternal(5, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE6 = function (actionORMethodDef) {
this.atLeastOneInternal(6, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE7 = function (actionORMethodDef) {
this.atLeastOneInternal(7, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE8 = function (actionORMethodDef) {
this.atLeastOneInternal(8, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE9 = function (actionORMethodDef) {
this.atLeastOneInternal(9, actionORMethodDef);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP = function (options) {
this.atLeastOneSepFirstInternal(0, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP1 = function (options) {
this.atLeastOneSepFirstInternal(1, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP2 = function (options) {
this.atLeastOneSepFirstInternal(2, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP3 = function (options) {
this.atLeastOneSepFirstInternal(3, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP4 = function (options) {
this.atLeastOneSepFirstInternal(4, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP5 = function (options) {
this.atLeastOneSepFirstInternal(5, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP6 = function (options) {
this.atLeastOneSepFirstInternal(6, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP7 = function (options) {
this.atLeastOneSepFirstInternal(7, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP8 = function (options) {
this.atLeastOneSepFirstInternal(8, options);
};
RecognizerApi.prototype.AT_LEAST_ONE_SEP9 = function (options) {
this.atLeastOneSepFirstInternal(9, options);
};
RecognizerApi.prototype.RULE = function (name, implementation, config) {
if (config === void 0) { config = parser_1.DEFAULT_RULE_CONFIG; }
if ((0, includes_1.default)(this.definedRulesNames, name)) {
var errMsg = errors_public_1.defaultGrammarValidatorErrorProvider.buildDuplicateRuleNameError({
topLevelRule: name,
grammarName: this.className
});
var error = {
message: errMsg,
type: parser_1.ParserDefinitionErrorType.DUPLICATE_RULE_NAME,
ruleName: name
};
this.definitionErrors.push(error);
}
this.definedRulesNames.push(name);
var ruleImplementation = this.defineRule(name, implementation, config);
this[name] = ruleImplementation;
return ruleImplementation;
};
RecognizerApi.prototype.OVERRIDE_RULE = function (name, impl, config) {
if (config === void 0) { config = parser_1.DEFAULT_RULE_CONFIG; }
var ruleErrors = (0, checks_1.validateRuleIsOverridden)(name, this.definedRulesNames, this.className);
this.definitionErrors = this.definitionErrors.concat(ruleErrors);
var ruleImplementation = this.defineRule(name, impl, config);
this[name] = ruleImplementation;
return ruleImplementation;
};
RecognizerApi.prototype.BACKTRACK = function (grammarRule, args) {
return function () {
// save org state
this.isBackTrackingStack.push(1);
var orgState = this.saveRecogState();
try {
grammarRule.apply(this, args);
            // if no exception was thrown we have succeeded in parsing the rule.
return true;
}
catch (e) {
if ((0, exceptions_public_1.isRecognitionException)(e)) {
return false;
}
else {
throw e;
}
}
finally {
this.reloadRecogState(orgState);
this.isBackTrackingStack.pop();
}
};
};
// GAST export APIs
RecognizerApi.prototype.getGAstProductions = function () {
return this.gastProductionsCache;
};
RecognizerApi.prototype.getSerializedGastProductions = function () {
return (0, gast_1.serializeGrammar)((0, values_1.default)(this.gastProductionsCache));
};
return RecognizerApi;
}());
exports.RecognizerApi = RecognizerApi;
//# sourceMappingURL=recognizer_api.js.map
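`defineRule` (in recognizer_engine.js below) maps each full rule name to a small integer shifted past the method-type and occurrence bits, and the lookahead traits then OR that integer together with a pre-shifted DSL-method key and an occurrence index to form a single numeric `Map` key. A sketch of that packing scheme (the bit widths and the `OR_IDX` value here are illustrative assumptions, not the actual constants from `grammar/keys`):

```javascript
// Pack (ruleShortName, dslMethodIdx, occurrence) into one integer so the
// lookahead cache can use a fast numeric Map key instead of a string.
// Illustrative widths: 4 bits for the DSL method type, 8 for the occurrence.
const BITS_FOR_METHOD_TYPE = 4;
const BITS_FOR_OCCURRENCE_IDX = 8;

// DSL method keys occupy the bits just above the occurrence index
// (hypothetical value; the real OR_IDX lives in grammar/keys).
const OR_IDX = 1 << BITS_FOR_OCCURRENCE_IDX;

function ruleShortName(ruleIdx) {
  // rule ids occupy the high bits, as in defineRule's left shift
  return ruleIdx << (BITS_FOR_METHOD_TYPE + BITS_FOR_OCCURRENCE_IDX);
}

function getKeyForAutomaticLookahead(shortRuleName, dslMethodIdx, occurrence) {
  // the three fields use disjoint bit ranges, so OR-ing them is lossless
  return occurrence | dslMethodIdx | shortRuleName;
}
```

Because the three fields never overlap, the same `(rule, method, occurrence)` triple always reproduces the same integer, which is what lets `getLaFuncFromCache` hit the precomputed lookahead function with a single numeric `Map.get`.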

File diff suppressed because one or more lines are too long


@@ -0,0 +1,576 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.RecognizerEngine = void 0;
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var every_1 = __importDefault(require("lodash/every"));
var uniq_1 = __importDefault(require("lodash/uniq"));
var isObject_1 = __importDefault(require("lodash/isObject"));
var has_1 = __importDefault(require("lodash/has"));
var values_1 = __importDefault(require("lodash/values"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var clone_1 = __importDefault(require("lodash/clone"));
var keys_1 = require("../../grammar/keys");
var exceptions_public_1 = require("../../exceptions_public");
var lookahead_1 = require("../../grammar/lookahead");
var interpreter_1 = require("../../grammar/interpreter");
var parser_1 = require("../parser");
var recoverable_1 = require("./recoverable");
var tokens_public_1 = require("../../../scan/tokens_public");
var tokens_1 = require("../../../scan/tokens");
/**
* This trait is responsible for the runtime parsing engine
* Used by the official API (recognizer_api.ts)
*/
var RecognizerEngine = /** @class */ (function () {
function RecognizerEngine() {
}
RecognizerEngine.prototype.initRecognizerEngine = function (tokenVocabulary, config) {
this.className = this.constructor.name;
// TODO: would using an ES6 Map or plain object be faster (CST building scenario)
this.shortRuleNameToFull = {};
this.fullRuleNameToShort = {};
this.ruleShortNameIdx = 256;
this.tokenMatcher = tokens_1.tokenStructuredMatcherNoCategories;
this.subruleIdx = 0;
this.definedRulesNames = [];
this.tokensMap = {};
this.isBackTrackingStack = [];
this.RULE_STACK = [];
this.RULE_OCCURRENCE_STACK = [];
this.gastProductionsCache = {};
if ((0, has_1.default)(config, "serializedGrammar")) {
throw Error("The Parser's configuration can no longer contain a <serializedGrammar> property.\n" +
"\tSee: https://chevrotain.io/docs/changes/BREAKING_CHANGES.html#_6-0-0\n" +
"\tFor Further details.");
}
if ((0, isArray_1.default)(tokenVocabulary)) {
// This only checks for Token vocabularies provided as arrays.
// That is good enough because the main objective is to detect users of pre-V4.0 APIs
// rather than all edge cases of empty Token vocabularies.
if ((0, isEmpty_1.default)(tokenVocabulary)) {
throw Error("A Token Vocabulary cannot be empty.\n" +
"\tNote that the first argument for the parser constructor\n" +
"\tis no longer a Token vector (since v4.0).");
}
if (typeof tokenVocabulary[0].startOffset === "number") {
throw Error("The Parser constructor no longer accepts a token vector as the first argument.\n" +
"\tSee: https://chevrotain.io/docs/changes/BREAKING_CHANGES.html#_4-0-0\n" +
"\tFor Further details.");
}
}
if ((0, isArray_1.default)(tokenVocabulary)) {
this.tokensMap = (0, reduce_1.default)(tokenVocabulary, function (acc, tokType) {
acc[tokType.name] = tokType;
return acc;
}, {});
}
else if ((0, has_1.default)(tokenVocabulary, "modes") &&
(0, every_1.default)((0, flatten_1.default)((0, values_1.default)(tokenVocabulary.modes)), tokens_1.isTokenType)) {
var allTokenTypes_1 = (0, flatten_1.default)((0, values_1.default)(tokenVocabulary.modes));
var uniqueTokens = (0, uniq_1.default)(allTokenTypes_1);
this.tokensMap = (0, reduce_1.default)(uniqueTokens, function (acc, tokType) {
acc[tokType.name] = tokType;
return acc;
}, {});
}
else if ((0, isObject_1.default)(tokenVocabulary)) {
this.tokensMap = (0, clone_1.default)(tokenVocabulary);
}
else {
throw new Error("<tokensDictionary> argument must be An Array of Token constructors," +
" A dictionary of Token constructors or an IMultiModeLexerDefinition");
}
        // Always add EOF to the tokenNames -> constructors map. It is useful to ensure all the input has been
        // parsed with a clear error message ("expecting EOF but found ...")
this.tokensMap["EOF"] = tokens_public_1.EOF;
var allTokenTypes = (0, has_1.default)(tokenVocabulary, "modes")
? (0, flatten_1.default)((0, values_1.default)(tokenVocabulary.modes))
: (0, values_1.default)(tokenVocabulary);
var noTokenCategoriesUsed = (0, every_1.default)(allTokenTypes, function (tokenConstructor) {
return (0, isEmpty_1.default)(tokenConstructor.categoryMatches);
});
this.tokenMatcher = noTokenCategoriesUsed
? tokens_1.tokenStructuredMatcherNoCategories
: tokens_1.tokenStructuredMatcher;
// Because ES2015+ syntax should be supported for creating Token classes
// We cannot assume that the Token classes were created using the "extendToken" utilities
// Therefore we must augment the Token classes both on Lexer initialization and on Parser initialization
(0, tokens_1.augmentTokenTypes)((0, values_1.default)(this.tokensMap));
};
RecognizerEngine.prototype.defineRule = function (ruleName, impl, config) {
if (this.selfAnalysisDone) {
throw Error("Grammar rule <".concat(ruleName, "> may not be defined after the 'performSelfAnalysis' method has been called'\n") +
"Make sure that all grammar rule definitions are done before 'performSelfAnalysis' is called.");
}
var resyncEnabled = (0, has_1.default)(config, "resyncEnabled")
? config.resyncEnabled // assumes end user provides the correct config value/type
: parser_1.DEFAULT_RULE_CONFIG.resyncEnabled;
var recoveryValueFunc = (0, has_1.default)(config, "recoveryValueFunc")
? config.recoveryValueFunc // assumes end user provides the correct config value/type
: parser_1.DEFAULT_RULE_CONFIG.recoveryValueFunc;
// performance optimization: Use small integers as keys for the longer human readable "full" rule names.
// this greatly improves Map access time (as much as 8% for some performance benchmarks).
var shortName = this.ruleShortNameIdx << (keys_1.BITS_FOR_METHOD_TYPE + keys_1.BITS_FOR_OCCURRENCE_IDX);
this.ruleShortNameIdx++;
this.shortRuleNameToFull[shortName] = ruleName;
this.fullRuleNameToShort[ruleName] = shortName;
var invokeRuleWithTry;
// Micro optimization, only check the condition **once** on rule definition
// instead of **every single** rule invocation.
if (this.outputCst === true) {
            invokeRuleWithTry = function invokeRuleWithTryCst() {
var args = [];
for (var _i = 0; _i < arguments.length; _i++) {
args[_i] = arguments[_i];
}
try {
this.ruleInvocationStateUpdate(shortName, ruleName, this.subruleIdx);
impl.apply(this, args);
var cst = this.CST_STACK[this.CST_STACK.length - 1];
this.cstPostRule(cst);
return cst;
}
catch (e) {
return this.invokeRuleCatch(e, resyncEnabled, recoveryValueFunc);
}
finally {
this.ruleFinallyStateUpdate();
}
};
}
else {
            invokeRuleWithTry = function invokeRuleWithTry() {
var args = [];
for (var _i = 0; _i < arguments.length; _i++) {
args[_i] = arguments[_i];
}
try {
this.ruleInvocationStateUpdate(shortName, ruleName, this.subruleIdx);
return impl.apply(this, args);
}
catch (e) {
return this.invokeRuleCatch(e, resyncEnabled, recoveryValueFunc);
}
finally {
this.ruleFinallyStateUpdate();
}
};
}
var wrappedGrammarRule = Object.assign(invokeRuleWithTry, { ruleName: ruleName, originalGrammarAction: impl });
return wrappedGrammarRule;
};
RecognizerEngine.prototype.invokeRuleCatch = function (e, resyncEnabledConfig, recoveryValueFunc) {
var isFirstInvokedRule = this.RULE_STACK.length === 1;
        // Note: reSync is always enabled for the first rule invocation, because we must always be able to
        // reSync with EOF and just output some INVALID ParseTree.
        // During backtracking, reSync recovery is disabled; otherwise we can't be certain the backtracking
        // path is really the most valid one.
var reSyncEnabled = resyncEnabledConfig && !this.isBackTracking() && this.recoveryEnabled;
if ((0, exceptions_public_1.isRecognitionException)(e)) {
var recogError = e;
if (reSyncEnabled) {
var reSyncTokType = this.findReSyncTokenType();
if (this.isInCurrentRuleReSyncSet(reSyncTokType)) {
recogError.resyncedTokens = this.reSyncTo(reSyncTokType);
if (this.outputCst) {
var partialCstResult = this.CST_STACK[this.CST_STACK.length - 1];
partialCstResult.recoveredNode = true;
return partialCstResult;
}
else {
return recoveryValueFunc(e);
}
}
else {
if (this.outputCst) {
var partialCstResult = this.CST_STACK[this.CST_STACK.length - 1];
partialCstResult.recoveredNode = true;
recogError.partialCstResult = partialCstResult;
}
                // to be handled further up the call stack
throw recogError;
}
}
else if (isFirstInvokedRule) {
// otherwise a Redundant input error will be created as well and we cannot guarantee that this is indeed the case
this.moveToTerminatedState();
// the parser should never throw one of its own errors outside its flow.
// even if error recovery is disabled
return recoveryValueFunc(e);
}
else {
            // to be recovered further up the call stack
throw recogError;
}
}
else {
// some other Error type which we don't know how to handle (for example a built in JavaScript Error)
throw e;
}
};
// Implementation of parsing DSL
RecognizerEngine.prototype.optionInternal = function (actionORMethodDef, occurrence) {
var key = this.getKeyForAutomaticLookahead(keys_1.OPTION_IDX, occurrence);
return this.optionInternalLogic(actionORMethodDef, occurrence, key);
};
RecognizerEngine.prototype.optionInternalLogic = function (actionORMethodDef, occurrence, key) {
var _this = this;
var lookAheadFunc = this.getLaFuncFromCache(key);
var action;
if (typeof actionORMethodDef !== "function") {
action = actionORMethodDef.DEF;
var predicate_1 = actionORMethodDef.GATE;
// predicate present
if (predicate_1 !== undefined) {
var orgLookaheadFunction_1 = lookAheadFunc;
lookAheadFunc = function () {
return predicate_1.call(_this) && orgLookaheadFunction_1.call(_this);
};
}
}
else {
action = actionORMethodDef;
}
if (lookAheadFunc.call(this) === true) {
return action.call(this);
}
return undefined;
};
RecognizerEngine.prototype.atLeastOneInternal = function (prodOccurrence, actionORMethodDef) {
var laKey = this.getKeyForAutomaticLookahead(keys_1.AT_LEAST_ONE_IDX, prodOccurrence);
return this.atLeastOneInternalLogic(prodOccurrence, actionORMethodDef, laKey);
};
RecognizerEngine.prototype.atLeastOneInternalLogic = function (prodOccurrence, actionORMethodDef, key) {
var _this = this;
var lookAheadFunc = this.getLaFuncFromCache(key);
var action;
if (typeof actionORMethodDef !== "function") {
action = actionORMethodDef.DEF;
var predicate_2 = actionORMethodDef.GATE;
// predicate present
if (predicate_2 !== undefined) {
var orgLookaheadFunction_2 = lookAheadFunc;
lookAheadFunc = function () {
return predicate_2.call(_this) && orgLookaheadFunction_2.call(_this);
};
}
}
else {
action = actionORMethodDef;
}
if (lookAheadFunc.call(this) === true) {
var notStuck = this.doSingleRepetition(action);
while (lookAheadFunc.call(this) === true &&
notStuck === true) {
notStuck = this.doSingleRepetition(action);
}
}
else {
throw this.raiseEarlyExitException(prodOccurrence, lookahead_1.PROD_TYPE.REPETITION_MANDATORY, actionORMethodDef.ERR_MSG);
}
// note that while it may seem that this can cause an error because by using a recursive call to
// AT_LEAST_ONE we change the grammar to AT_LEAST_TWO, AT_LEAST_THREE ... , the possible recursive call
// from the tryInRepetitionRecovery(...) will only happen IFF there really are TWO/THREE/.... items.
// Performance optimization: "attemptInRepetitionRecovery" will be defined as NOOP unless recovery is enabled
this.attemptInRepetitionRecovery(this.atLeastOneInternal, [prodOccurrence, actionORMethodDef], lookAheadFunc, keys_1.AT_LEAST_ONE_IDX, prodOccurrence, interpreter_1.NextTerminalAfterAtLeastOneWalker);
};
RecognizerEngine.prototype.atLeastOneSepFirstInternal = function (prodOccurrence, options) {
var laKey = this.getKeyForAutomaticLookahead(keys_1.AT_LEAST_ONE_SEP_IDX, prodOccurrence);
this.atLeastOneSepFirstInternalLogic(prodOccurrence, options, laKey);
};
RecognizerEngine.prototype.atLeastOneSepFirstInternalLogic = function (prodOccurrence, options, key) {
var _this = this;
var action = options.DEF;
var separator = options.SEP;
var firstIterationLookaheadFunc = this.getLaFuncFromCache(key);
// 1st iteration
if (firstIterationLookaheadFunc.call(this) === true) {
action.call(this);
// TODO: Optimization can move this function construction into "attemptInRepetitionRecovery"
// because it is only needed in error recovery scenarios.
var separatorLookAheadFunc = function () {
return _this.tokenMatcher(_this.LA(1), separator);
};
// 2nd..nth iterations
while (this.tokenMatcher(this.LA(1), separator) === true) {
// note that this CONSUME will never enter recovery because
// the separatorLookAheadFunc checks that the separator really does exist.
this.CONSUME(separator);
action.call(this);
}
// Performance optimization: "attemptInRepetitionRecovery" will be defined as NOOP unless recovery is enabled
this.attemptInRepetitionRecovery(this.repetitionSepSecondInternal, [
prodOccurrence,
separator,
separatorLookAheadFunc,
action,
interpreter_1.NextTerminalAfterAtLeastOneSepWalker
], separatorLookAheadFunc, keys_1.AT_LEAST_ONE_SEP_IDX, prodOccurrence, interpreter_1.NextTerminalAfterAtLeastOneSepWalker);
}
else {
throw this.raiseEarlyExitException(prodOccurrence, lookahead_1.PROD_TYPE.REPETITION_MANDATORY_WITH_SEPARATOR, options.ERR_MSG);
}
};
RecognizerEngine.prototype.manyInternal = function (prodOccurrence, actionORMethodDef) {
var laKey = this.getKeyForAutomaticLookahead(keys_1.MANY_IDX, prodOccurrence);
return this.manyInternalLogic(prodOccurrence, actionORMethodDef, laKey);
};
RecognizerEngine.prototype.manyInternalLogic = function (prodOccurrence, actionORMethodDef, key) {
var _this = this;
var lookaheadFunction = this.getLaFuncFromCache(key);
var action;
if (typeof actionORMethodDef !== "function") {
action = actionORMethodDef.DEF;
var predicate_3 = actionORMethodDef.GATE;
// predicate present
if (predicate_3 !== undefined) {
var orgLookaheadFunction_3 = lookaheadFunction;
lookaheadFunction = function () {
return predicate_3.call(_this) && orgLookaheadFunction_3.call(_this);
};
}
}
else {
action = actionORMethodDef;
}
var notStuck = true;
while (lookaheadFunction.call(this) === true && notStuck === true) {
notStuck = this.doSingleRepetition(action);
}
// Performance optimization: "attemptInRepetitionRecovery" will be defined as NOOP unless recovery is enabled
this.attemptInRepetitionRecovery(this.manyInternal, [prodOccurrence, actionORMethodDef], lookaheadFunction, keys_1.MANY_IDX, prodOccurrence, interpreter_1.NextTerminalAfterManyWalker,
// The notStuck parameter is only relevant when "attemptInRepetitionRecovery"
// is invoked from manyInternal, in the MANY_SEP case and AT_LEAST_ONE[_SEP]
// An infinite loop cannot occur as:
// - Either the lookahead is guaranteed to consume something (Single Token Separator)
// - AT_LEAST_ONE by definition is guaranteed to consume something (or error out).
notStuck);
};
RecognizerEngine.prototype.manySepFirstInternal = function (prodOccurrence, options) {
var laKey = this.getKeyForAutomaticLookahead(keys_1.MANY_SEP_IDX, prodOccurrence);
this.manySepFirstInternalLogic(prodOccurrence, options, laKey);
};
RecognizerEngine.prototype.manySepFirstInternalLogic = function (prodOccurrence, options, key) {
var _this = this;
var action = options.DEF;
var separator = options.SEP;
var firstIterationLaFunc = this.getLaFuncFromCache(key);
// 1st iteration
if (firstIterationLaFunc.call(this) === true) {
action.call(this);
var separatorLookAheadFunc = function () {
return _this.tokenMatcher(_this.LA(1), separator);
};
// 2nd..nth iterations
while (this.tokenMatcher(this.LA(1), separator) === true) {
// note that this CONSUME will never enter recovery because
// the separatorLookAheadFunc checks that the separator really does exist.
this.CONSUME(separator);
// No need for checking infinite loop here due to consuming the separator.
action.call(this);
}
// Performance optimization: "attemptInRepetitionRecovery" will be defined as NOOP unless recovery is enabled
this.attemptInRepetitionRecovery(this.repetitionSepSecondInternal, [
prodOccurrence,
separator,
separatorLookAheadFunc,
action,
interpreter_1.NextTerminalAfterManySepWalker
], separatorLookAheadFunc, keys_1.MANY_SEP_IDX, prodOccurrence, interpreter_1.NextTerminalAfterManySepWalker);
}
};
RecognizerEngine.prototype.repetitionSepSecondInternal = function (prodOccurrence, separator, separatorLookAheadFunc, action, nextTerminalAfterWalker) {
while (separatorLookAheadFunc()) {
// note that this CONSUME will never enter recovery because
// the separatorLookAheadFunc checks that the separator really does exist.
this.CONSUME(separator);
action.call(this);
}
        // we can only arrive at this function after an error
        // has occurred (hence the name 'second') so the following
        // IF will always be entered, it's possible to remove it...
        // however it is kept to avoid confusion and be consistent.
// Performance optimization: "attemptInRepetitionRecovery" will be defined as NOOP unless recovery is enabled
/* istanbul ignore else */
this.attemptInRepetitionRecovery(this.repetitionSepSecondInternal, [
prodOccurrence,
separator,
separatorLookAheadFunc,
action,
nextTerminalAfterWalker
], separatorLookAheadFunc, keys_1.AT_LEAST_ONE_SEP_IDX, prodOccurrence, nextTerminalAfterWalker);
};
RecognizerEngine.prototype.doSingleRepetition = function (action) {
var beforeIteration = this.getLexerPosition();
action.call(this);
var afterIteration = this.getLexerPosition();
// This boolean will indicate if this repetition progressed
// or if we are "stuck" (potential infinite loop in the repetition).
return afterIteration > beforeIteration;
};
RecognizerEngine.prototype.orInternal = function (altsOrOpts, occurrence) {
var laKey = this.getKeyForAutomaticLookahead(keys_1.OR_IDX, occurrence);
var alts = (0, isArray_1.default)(altsOrOpts) ? altsOrOpts : altsOrOpts.DEF;
var laFunc = this.getLaFuncFromCache(laKey);
var altIdxToTake = laFunc.call(this, alts);
if (altIdxToTake !== undefined) {
var chosenAlternative = alts[altIdxToTake];
return chosenAlternative.ALT.call(this);
}
this.raiseNoAltException(occurrence, altsOrOpts.ERR_MSG);
};
RecognizerEngine.prototype.ruleFinallyStateUpdate = function () {
this.RULE_STACK.pop();
this.RULE_OCCURRENCE_STACK.pop();
// NOOP when cst is disabled
this.cstFinallyStateUpdate();
if (this.RULE_STACK.length === 0 && this.isAtEndOfInput() === false) {
var firstRedundantTok = this.LA(1);
var errMsg = this.errorMessageProvider.buildNotAllInputParsedMessage({
firstRedundant: firstRedundantTok,
ruleName: this.getCurrRuleFullName()
});
this.SAVE_ERROR(new exceptions_public_1.NotAllInputParsedException(errMsg, firstRedundantTok));
}
};
RecognizerEngine.prototype.subruleInternal = function (ruleToCall, idx, options) {
var ruleResult;
try {
var args = options !== undefined ? options.ARGS : undefined;
this.subruleIdx = idx;
ruleResult = ruleToCall.apply(this, args);
this.cstPostNonTerminal(ruleResult, options !== undefined && options.LABEL !== undefined
? options.LABEL
: ruleToCall.ruleName);
return ruleResult;
}
catch (e) {
throw this.subruleInternalError(e, options, ruleToCall.ruleName);
}
};
RecognizerEngine.prototype.subruleInternalError = function (e, options, ruleName) {
if ((0, exceptions_public_1.isRecognitionException)(e) && e.partialCstResult !== undefined) {
this.cstPostNonTerminal(e.partialCstResult, options !== undefined && options.LABEL !== undefined
? options.LABEL
: ruleName);
delete e.partialCstResult;
}
throw e;
};
RecognizerEngine.prototype.consumeInternal = function (tokType, idx, options) {
var consumedToken;
try {
var nextToken = this.LA(1);
if (this.tokenMatcher(nextToken, tokType) === true) {
this.consumeToken();
consumedToken = nextToken;
}
else {
this.consumeInternalError(tokType, nextToken, options);
}
}
catch (eFromConsumption) {
consumedToken = this.consumeInternalRecovery(tokType, idx, eFromConsumption);
}
this.cstPostTerminal(options !== undefined && options.LABEL !== undefined
? options.LABEL
: tokType.name, consumedToken);
return consumedToken;
};
RecognizerEngine.prototype.consumeInternalError = function (tokType, nextToken, options) {
var msg;
var previousToken = this.LA(0);
if (options !== undefined && options.ERR_MSG) {
msg = options.ERR_MSG;
}
else {
msg = this.errorMessageProvider.buildMismatchTokenMessage({
expected: tokType,
actual: nextToken,
previous: previousToken,
ruleName: this.getCurrRuleFullName()
});
}
throw this.SAVE_ERROR(new exceptions_public_1.MismatchedTokenException(msg, nextToken, previousToken));
};
RecognizerEngine.prototype.consumeInternalRecovery = function (tokType, idx, eFromConsumption) {
// no recovery allowed during backtracking, otherwise backtracking may recover invalid syntax and accept it
// but the original syntax could have been parsed successfully without any backtracking + recovery
if (this.recoveryEnabled &&
// TODO: more robust checking of the exception type. Perhaps Typescript extending expressions?
eFromConsumption.name === "MismatchedTokenException" &&
!this.isBackTracking()) {
var follows = this.getFollowsForInRuleRecovery(tokType, idx);
try {
return this.tryInRuleRecovery(tokType, follows);
}
catch (eFromInRuleRecovery) {
if (eFromInRuleRecovery.name === recoverable_1.IN_RULE_RECOVERY_EXCEPTION) {
// failed in RuleRecovery.
// throw the original error in order to trigger reSync error recovery
throw eFromConsumption;
}
else {
throw eFromInRuleRecovery;
}
}
}
else {
throw eFromConsumption;
}
};
RecognizerEngine.prototype.saveRecogState = function () {
// errors is a getter which will clone the errors array
var savedErrors = this.errors;
var savedRuleStack = (0, clone_1.default)(this.RULE_STACK);
return {
errors: savedErrors,
lexerState: this.exportLexerState(),
RULE_STACK: savedRuleStack,
CST_STACK: this.CST_STACK
};
};
RecognizerEngine.prototype.reloadRecogState = function (newState) {
this.errors = newState.errors;
this.importLexerState(newState.lexerState);
this.RULE_STACK = newState.RULE_STACK;
};
RecognizerEngine.prototype.ruleInvocationStateUpdate = function (shortName, fullName, idxInCallingRule) {
this.RULE_OCCURRENCE_STACK.push(idxInCallingRule);
this.RULE_STACK.push(shortName);
// NOOP when cst is disabled
this.cstInvocationStateUpdate(fullName);
};
RecognizerEngine.prototype.isBackTracking = function () {
return this.isBackTrackingStack.length !== 0;
};
RecognizerEngine.prototype.getCurrRuleFullName = function () {
var shortName = this.getLastExplicitRuleShortName();
return this.shortRuleNameToFull[shortName];
};
RecognizerEngine.prototype.shortRuleNameToFullName = function (shortName) {
return this.shortRuleNameToFull[shortName];
};
RecognizerEngine.prototype.isAtEndOfInput = function () {
return this.tokenMatcher(this.LA(1), tokens_public_1.EOF);
};
RecognizerEngine.prototype.reset = function () {
this.resetLexerState();
this.subruleIdx = 0;
this.isBackTrackingStack = [];
this.errors = [];
this.RULE_STACK = [];
// TODO: extract a specific reset for TreeBuilder trait
this.CST_STACK = [];
this.RULE_OCCURRENCE_STACK = [];
};
return RecognizerEngine;
}());
exports.RecognizerEngine = RecognizerEngine;
//# sourceMappingURL=recognizer_engine.js.map
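Editor's note: the recovery path above (`consumeInternalRecovery` → `tryInRuleRecovery`) prefers the smallest possible repair: first single-token insertion (the mismatched token is a legal follower of the expected one), then single-token deletion (the token after the mismatch is what was expected). A minimal standalone sketch of that decision, using simplified token objects and a plain follow-set array rather than Chevrotain's real token/API shapes:

```javascript
// Standalone sketch of Chevrotain-style "in rule" recovery.
// Token objects, the matcher, and the follow set are simplified
// stand-ins for illustration, not the real chevrotain API.
function tokenMatcher(tok, tokType) {
    return tok.type === tokType;
}

function tryInRuleRecovery(tokens, pos, expectedType, follows) {
    var next = tokens[pos];
    // Insertion: the mismatched token is a legal follower of the expected
    // token, so pretend the expected token was present (mark it inserted).
    if (follows.indexOf(next.type) !== -1) {
        return {
            action: "insert",
            token: { type: expectedType, image: "", isInsertedInRecovery: true },
            newPos: pos
        };
    }
    // Deletion: the token AFTER the mismatch is what we expected,
    // so skip the mismatched token and consume the following one.
    if (pos + 1 < tokens.length && tokenMatcher(tokens[pos + 1], expectedType)) {
        return { action: "delete", token: tokens[pos + 1], newPos: pos + 2 };
    }
    // Neither repair applies; caller falls back to re-sync recovery.
    throw new Error("InRuleRecoveryException");
}

// "a , , b" where the grammar expects "a , b": delete the extra comma.
var toks = [{ type: "Comma" }, { type: "Ident", image: "b" }];
var repaired = tryInRuleRecovery(toks, 0, "Ident", []);
```

The greedy ordering matters: insertion never consumes input, so when both repairs are possible it is the least destructive change to the token stream.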

File diff suppressed because one or more lines are too long


@@ -0,0 +1,334 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.attemptInRepetitionRecovery = exports.Recoverable = exports.InRuleRecoveryException = exports.IN_RULE_RECOVERY_EXCEPTION = exports.EOF_FOLLOW_KEY = void 0;
var tokens_public_1 = require("../../../scan/tokens_public");
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var dropRight_1 = __importDefault(require("lodash/dropRight"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var map_1 = __importDefault(require("lodash/map"));
var find_1 = __importDefault(require("lodash/find"));
var has_1 = __importDefault(require("lodash/has"));
var includes_1 = __importDefault(require("lodash/includes"));
var clone_1 = __importDefault(require("lodash/clone"));
var exceptions_public_1 = require("../../exceptions_public");
var constants_1 = require("../../constants");
var parser_1 = require("../parser");
exports.EOF_FOLLOW_KEY = {};
exports.IN_RULE_RECOVERY_EXCEPTION = "InRuleRecoveryException";
var InRuleRecoveryException = /** @class */ (function (_super) {
__extends(InRuleRecoveryException, _super);
function InRuleRecoveryException(message) {
var _this = _super.call(this, message) || this;
_this.name = exports.IN_RULE_RECOVERY_EXCEPTION;
return _this;
}
return InRuleRecoveryException;
}(Error));
exports.InRuleRecoveryException = InRuleRecoveryException;
/**
* This trait is responsible for the error recovery and fault tolerant logic
*/
var Recoverable = /** @class */ (function () {
function Recoverable() {
}
Recoverable.prototype.initRecoverable = function (config) {
this.firstAfterRepMap = {};
this.resyncFollows = {};
this.recoveryEnabled = (0, has_1.default)(config, "recoveryEnabled")
? config.recoveryEnabled // assumes end user provides the correct config value/type
: parser_1.DEFAULT_PARSER_CONFIG.recoveryEnabled;
// performance optimization, NOOP will be inlined which
// effectively means that this optional feature does not exist
// when not used.
if (this.recoveryEnabled) {
this.attemptInRepetitionRecovery = attemptInRepetitionRecovery;
}
};
Recoverable.prototype.getTokenToInsert = function (tokType) {
var tokToInsert = (0, tokens_public_1.createTokenInstance)(tokType, "", NaN, NaN, NaN, NaN, NaN, NaN);
tokToInsert.isInsertedInRecovery = true;
return tokToInsert;
};
Recoverable.prototype.canTokenTypeBeInsertedInRecovery = function (tokType) {
return true;
};
Recoverable.prototype.canTokenTypeBeDeletedInRecovery = function (tokType) {
return true;
};
Recoverable.prototype.tryInRepetitionRecovery = function (grammarRule, grammarRuleArgs, lookAheadFunc, expectedTokType) {
var _this = this;
// TODO: can the resyncTokenType be cached?
var reSyncTokType = this.findReSyncTokenType();
var savedLexerState = this.exportLexerState();
var resyncedTokens = [];
var passedResyncPoint = false;
var nextTokenWithoutResync = this.LA(1);
var currToken = this.LA(1);
var generateErrorMessage = function () {
var previousToken = _this.LA(0);
            // we are preemptively re-syncing before an error has been detected, therefore we must reproduce
            // the error that would have been thrown
var msg = _this.errorMessageProvider.buildMismatchTokenMessage({
expected: expectedTokType,
actual: nextTokenWithoutResync,
previous: previousToken,
ruleName: _this.getCurrRuleFullName()
});
var error = new exceptions_public_1.MismatchedTokenException(msg, nextTokenWithoutResync, _this.LA(0));
// the first token here will be the original cause of the error, this is not part of the resyncedTokens property.
error.resyncedTokens = (0, dropRight_1.default)(resyncedTokens);
_this.SAVE_ERROR(error);
};
while (!passedResyncPoint) {
            // re-synced to a point where we can safely exit the repetition.
if (this.tokenMatcher(currToken, expectedTokType)) {
generateErrorMessage();
return; // must return here to avoid reverting the inputIdx
}
else if (lookAheadFunc.call(this)) {
// we skipped enough tokens so we can resync right back into another iteration of the repetition grammar rule
generateErrorMessage();
                // recursive invocation in order to support multiple re-syncs in the same top level repetition grammar rule
grammarRule.apply(this, grammarRuleArgs);
return; // must return here to avoid reverting the inputIdx
}
else if (this.tokenMatcher(currToken, reSyncTokType)) {
passedResyncPoint = true;
}
else {
currToken = this.SKIP_TOKEN();
this.addToResyncTokens(currToken, resyncedTokens);
}
}
// we were unable to find a CLOSER point to resync inside the Repetition, reset the state.
// The parsing exception we were trying to prevent will happen in the NEXT parsing step. it may be handled by
// "between rules" resync recovery later in the flow.
this.importLexerState(savedLexerState);
};
Recoverable.prototype.shouldInRepetitionRecoveryBeTried = function (expectTokAfterLastMatch, nextTokIdx, notStuck) {
// Edge case of arriving from a MANY repetition which is stuck
// Attempting recovery in this case could cause an infinite loop
if (notStuck === false) {
return false;
}
// no need to recover, next token is what we expect...
if (this.tokenMatcher(this.LA(1), expectTokAfterLastMatch)) {
return false;
}
// error recovery is disabled during backtracking as it can make the parser ignore a valid grammar path
// and prefer some backtracking path that includes recovered errors.
if (this.isBackTracking()) {
return false;
}
// if we can perform inRule recovery (single token insertion or deletion) we always prefer that recovery algorithm
// because if it works, it makes the least amount of changes to the input stream (greedy algorithm)
//noinspection RedundantIfStatementJS
if (this.canPerformInRuleRecovery(expectTokAfterLastMatch, this.getFollowsForInRuleRecovery(expectTokAfterLastMatch, nextTokIdx))) {
return false;
}
return true;
};
// Error Recovery functionality
Recoverable.prototype.getFollowsForInRuleRecovery = function (tokType, tokIdxInRule) {
var grammarPath = this.getCurrentGrammarPath(tokType, tokIdxInRule);
var follows = this.getNextPossibleTokenTypes(grammarPath);
return follows;
};
Recoverable.prototype.tryInRuleRecovery = function (expectedTokType, follows) {
if (this.canRecoverWithSingleTokenInsertion(expectedTokType, follows)) {
var tokToInsert = this.getTokenToInsert(expectedTokType);
return tokToInsert;
}
if (this.canRecoverWithSingleTokenDeletion(expectedTokType)) {
var nextTok = this.SKIP_TOKEN();
this.consumeToken();
return nextTok;
}
throw new InRuleRecoveryException("sad sad panda");
};
Recoverable.prototype.canPerformInRuleRecovery = function (expectedToken, follows) {
return (this.canRecoverWithSingleTokenInsertion(expectedToken, follows) ||
this.canRecoverWithSingleTokenDeletion(expectedToken));
};
Recoverable.prototype.canRecoverWithSingleTokenInsertion = function (expectedTokType, follows) {
var _this = this;
if (!this.canTokenTypeBeInsertedInRecovery(expectedTokType)) {
return false;
}
// must know the possible following tokens to perform single token insertion
if ((0, isEmpty_1.default)(follows)) {
return false;
}
var mismatchedTok = this.LA(1);
var isMisMatchedTokInFollows = (0, find_1.default)(follows, function (possibleFollowsTokType) {
return _this.tokenMatcher(mismatchedTok, possibleFollowsTokType);
}) !== undefined;
return isMisMatchedTokInFollows;
};
Recoverable.prototype.canRecoverWithSingleTokenDeletion = function (expectedTokType) {
if (!this.canTokenTypeBeDeletedInRecovery(expectedTokType)) {
return false;
}
var isNextTokenWhatIsExpected = this.tokenMatcher(this.LA(2), expectedTokType);
return isNextTokenWhatIsExpected;
};
Recoverable.prototype.isInCurrentRuleReSyncSet = function (tokenTypeIdx) {
var followKey = this.getCurrFollowKey();
var currentRuleReSyncSet = this.getFollowSetFromFollowKey(followKey);
return (0, includes_1.default)(currentRuleReSyncSet, tokenTypeIdx);
};
Recoverable.prototype.findReSyncTokenType = function () {
var allPossibleReSyncTokTypes = this.flattenFollowSet();
// this loop will always terminate as EOF is always in the follow stack and also always (virtually) in the input
var nextToken = this.LA(1);
var k = 2;
while (true) {
var foundMatch = (0, find_1.default)(allPossibleReSyncTokTypes, function (resyncTokType) {
var canMatch = (0, tokens_public_1.tokenMatcher)(nextToken, resyncTokType);
return canMatch;
});
if (foundMatch !== undefined) {
return foundMatch;
}
nextToken = this.LA(k);
k++;
}
};
Recoverable.prototype.getCurrFollowKey = function () {
// the length is at least one as we always add the ruleName to the stack before invoking the rule.
if (this.RULE_STACK.length === 1) {
return exports.EOF_FOLLOW_KEY;
}
var currRuleShortName = this.getLastExplicitRuleShortName();
var currRuleIdx = this.getLastExplicitRuleOccurrenceIndex();
var prevRuleShortName = this.getPreviousExplicitRuleShortName();
return {
ruleName: this.shortRuleNameToFullName(currRuleShortName),
idxInCallingRule: currRuleIdx,
inRule: this.shortRuleNameToFullName(prevRuleShortName)
};
};
Recoverable.prototype.buildFullFollowKeyStack = function () {
var _this = this;
var explicitRuleStack = this.RULE_STACK;
var explicitOccurrenceStack = this.RULE_OCCURRENCE_STACK;
return (0, map_1.default)(explicitRuleStack, function (ruleName, idx) {
if (idx === 0) {
return exports.EOF_FOLLOW_KEY;
}
return {
ruleName: _this.shortRuleNameToFullName(ruleName),
idxInCallingRule: explicitOccurrenceStack[idx],
inRule: _this.shortRuleNameToFullName(explicitRuleStack[idx - 1])
};
});
};
Recoverable.prototype.flattenFollowSet = function () {
var _this = this;
var followStack = (0, map_1.default)(this.buildFullFollowKeyStack(), function (currKey) {
return _this.getFollowSetFromFollowKey(currKey);
});
return (0, flatten_1.default)(followStack);
};
Recoverable.prototype.getFollowSetFromFollowKey = function (followKey) {
if (followKey === exports.EOF_FOLLOW_KEY) {
return [tokens_public_1.EOF];
}
var followName = followKey.ruleName + followKey.idxInCallingRule + constants_1.IN + followKey.inRule;
return this.resyncFollows[followName];
};
// It does not make any sense to include a virtual EOF token in the list of resynced tokens
// as EOF does not really exist and thus does not contain any useful information (line/column numbers)
Recoverable.prototype.addToResyncTokens = function (token, resyncTokens) {
if (!this.tokenMatcher(token, tokens_public_1.EOF)) {
resyncTokens.push(token);
}
return resyncTokens;
};
Recoverable.prototype.reSyncTo = function (tokType) {
var resyncedTokens = [];
var nextTok = this.LA(1);
while (this.tokenMatcher(nextTok, tokType) === false) {
nextTok = this.SKIP_TOKEN();
this.addToResyncTokens(nextTok, resyncedTokens);
}
// the last token is not part of the error.
return (0, dropRight_1.default)(resyncedTokens);
};
Recoverable.prototype.attemptInRepetitionRecovery = function (prodFunc, args, lookaheadFunc, dslMethodIdx, prodOccurrence, nextToksWalker, notStuck) {
// by default this is a NO-OP
// The actual implementation is with the function(not method) below
};
Recoverable.prototype.getCurrentGrammarPath = function (tokType, tokIdxInRule) {
var pathRuleStack = this.getHumanReadableRuleStack();
var pathOccurrenceStack = (0, clone_1.default)(this.RULE_OCCURRENCE_STACK);
var grammarPath = {
ruleStack: pathRuleStack,
occurrenceStack: pathOccurrenceStack,
lastTok: tokType,
lastTokOccurrence: tokIdxInRule
};
return grammarPath;
};
Recoverable.prototype.getHumanReadableRuleStack = function () {
var _this = this;
return (0, map_1.default)(this.RULE_STACK, function (currShortName) {
return _this.shortRuleNameToFullName(currShortName);
});
};
return Recoverable;
}());
exports.Recoverable = Recoverable;
function attemptInRepetitionRecovery(prodFunc, args, lookaheadFunc, dslMethodIdx, prodOccurrence, nextToksWalker, notStuck) {
var key = this.getKeyForAutomaticLookahead(dslMethodIdx, prodOccurrence);
var firstAfterRepInfo = this.firstAfterRepMap[key];
if (firstAfterRepInfo === undefined) {
var currRuleName = this.getCurrRuleFullName();
var ruleGrammar = this.getGAstProductions()[currRuleName];
var walker = new nextToksWalker(ruleGrammar, prodOccurrence);
firstAfterRepInfo = walker.startWalking();
this.firstAfterRepMap[key] = firstAfterRepInfo;
}
var expectTokAfterLastMatch = firstAfterRepInfo.token;
var nextTokIdx = firstAfterRepInfo.occurrence;
var isEndOfRule = firstAfterRepInfo.isEndOfRule;
// special edge case of a TOP most repetition after which the input should END.
// this will force an attempt for inRule recovery in that scenario.
if (this.RULE_STACK.length === 1 &&
isEndOfRule &&
expectTokAfterLastMatch === undefined) {
expectTokAfterLastMatch = tokens_public_1.EOF;
nextTokIdx = 1;
}
// We don't have anything to re-sync to...
// this condition was extracted from `shouldInRepetitionRecoveryBeTried` to act as a type-guard
if (expectTokAfterLastMatch === undefined || nextTokIdx === undefined) {
return;
}
if (this.shouldInRepetitionRecoveryBeTried(expectTokAfterLastMatch, nextTokIdx, notStuck)) {
// TODO: performance optimization: instead of passing the original args here, we modify
// the args param (or create a new one) and make sure the lookahead func is explicitly provided
// to avoid searching the cache for it once more.
this.tryInRepetitionRecovery(prodFunc, args, lookaheadFunc, expectTokAfterLastMatch);
}
}
exports.attemptInRepetitionRecovery = attemptInRepetitionRecovery;
//# sourceMappingURL=recoverable.js.map
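Editor's note: when the single-token repairs above don't apply, `Recoverable` falls back to re-syncing (`findReSyncTokenType` / `reSyncTo`): skip tokens until one that appears in a follow set, collecting the skipped tokens so they can be attached to the recognition error. A minimal standalone sketch under the same simplifying assumptions (plain token objects, follow set as an array of type names; not the real chevrotain API):

```javascript
// Standalone sketch of the re-sync strategy used by reSyncTo():
// skip tokens until one in the follow set, collecting what was skipped.
// Token shape and follow set are simplified stand-ins for illustration.
function reSyncTo(tokens, pos, followSet) {
    var skipped = [];
    while (pos < tokens.length && followSet.indexOf(tokens[pos].type) === -1) {
        skipped.push(tokens[pos]);
        pos++;
    }
    // pos now points at the re-sync token (or past the end of input);
    // the skipped tokens are reported on the error for diagnostics.
    return { skipped: skipped, newPos: pos };
}

// Error inside a statement: skip ahead to the next Semicolon so parsing
// can resume with the statement that follows.
var input = [
    { type: "Bad1" }, { type: "Bad2" },
    { type: "Semicolon" }, { type: "Ident" }
];
var sync = reSyncTo(input, 0, ["Semicolon", "EOF"]);
// sync.newPos === 2, pointing at the Semicolon re-sync point
```

Because EOF is always (virtually) present in the input and always in the outermost follow set, the real implementation's search loop is guaranteed to terminate.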

File diff suppressed because one or more lines are too long


@@ -0,0 +1,204 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TreeBuilder = void 0;
var cst_1 = require("../../cst/cst");
var noop_1 = __importDefault(require("lodash/noop"));
var has_1 = __importDefault(require("lodash/has"));
var keys_1 = __importDefault(require("lodash/keys"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var cst_visitor_1 = require("../../cst/cst_visitor");
var parser_1 = require("../parser");
/**
* This trait is responsible for the CST building logic.
*/
var TreeBuilder = /** @class */ (function () {
function TreeBuilder() {
}
TreeBuilder.prototype.initTreeBuilder = function (config) {
this.CST_STACK = [];
        // outputCst is no longer exposed/defined in the public API
this.outputCst = config.outputCst;
this.nodeLocationTracking = (0, has_1.default)(config, "nodeLocationTracking")
? config.nodeLocationTracking // assumes end user provides the correct config value/type
: parser_1.DEFAULT_PARSER_CONFIG.nodeLocationTracking;
if (!this.outputCst) {
this.cstInvocationStateUpdate = noop_1.default;
this.cstFinallyStateUpdate = noop_1.default;
this.cstPostTerminal = noop_1.default;
this.cstPostNonTerminal = noop_1.default;
this.cstPostRule = noop_1.default;
}
else {
if (/full/i.test(this.nodeLocationTracking)) {
if (this.recoveryEnabled) {
this.setNodeLocationFromToken = cst_1.setNodeLocationFull;
this.setNodeLocationFromNode = cst_1.setNodeLocationFull;
this.cstPostRule = noop_1.default;
this.setInitialNodeLocation = this.setInitialNodeLocationFullRecovery;
}
else {
this.setNodeLocationFromToken = noop_1.default;
this.setNodeLocationFromNode = noop_1.default;
this.cstPostRule = this.cstPostRuleFull;
this.setInitialNodeLocation = this.setInitialNodeLocationFullRegular;
}
}
else if (/onlyOffset/i.test(this.nodeLocationTracking)) {
if (this.recoveryEnabled) {
this.setNodeLocationFromToken = cst_1.setNodeLocationOnlyOffset;
this.setNodeLocationFromNode = cst_1.setNodeLocationOnlyOffset;
this.cstPostRule = noop_1.default;
this.setInitialNodeLocation =
this.setInitialNodeLocationOnlyOffsetRecovery;
}
else {
this.setNodeLocationFromToken = noop_1.default;
this.setNodeLocationFromNode = noop_1.default;
this.cstPostRule = this.cstPostRuleOnlyOffset;
this.setInitialNodeLocation =
this.setInitialNodeLocationOnlyOffsetRegular;
}
}
else if (/none/i.test(this.nodeLocationTracking)) {
this.setNodeLocationFromToken = noop_1.default;
this.setNodeLocationFromNode = noop_1.default;
this.cstPostRule = noop_1.default;
this.setInitialNodeLocation = noop_1.default;
}
else {
throw Error("Invalid <nodeLocationTracking> config option: \"".concat(config.nodeLocationTracking, "\""));
}
}
};
TreeBuilder.prototype.setInitialNodeLocationOnlyOffsetRecovery = function (cstNode) {
cstNode.location = {
startOffset: NaN,
endOffset: NaN
};
};
TreeBuilder.prototype.setInitialNodeLocationOnlyOffsetRegular = function (cstNode) {
cstNode.location = {
// without error recovery the starting Location of a new CstNode is guaranteed
// To be the next Token's startOffset (for valid inputs).
// For invalid inputs there won't be any CSTOutput so this potential
// inaccuracy does not matter
startOffset: this.LA(1).startOffset,
endOffset: NaN
};
};
TreeBuilder.prototype.setInitialNodeLocationFullRecovery = function (cstNode) {
cstNode.location = {
startOffset: NaN,
startLine: NaN,
startColumn: NaN,
endOffset: NaN,
endLine: NaN,
endColumn: NaN
};
};
/**
     * @see setInitialNodeLocationOnlyOffsetRegular for an explanation of why this works
* @param cstNode
*/
TreeBuilder.prototype.setInitialNodeLocationFullRegular = function (cstNode) {
var nextToken = this.LA(1);
cstNode.location = {
startOffset: nextToken.startOffset,
startLine: nextToken.startLine,
startColumn: nextToken.startColumn,
endOffset: NaN,
endLine: NaN,
endColumn: NaN
};
};
TreeBuilder.prototype.cstInvocationStateUpdate = function (fullRuleName) {
var cstNode = {
name: fullRuleName,
children: Object.create(null)
};
this.setInitialNodeLocation(cstNode);
this.CST_STACK.push(cstNode);
};
TreeBuilder.prototype.cstFinallyStateUpdate = function () {
this.CST_STACK.pop();
};
TreeBuilder.prototype.cstPostRuleFull = function (ruleCstNode) {
// casts to `required<CstNodeLocation>` are safe because `cstPostRuleFull` should only be invoked when full location is enabled
var prevToken = this.LA(0);
var loc = ruleCstNode.location;
        // If this condition is true, it means we consumed at least one Token
        // in this CstNode.
if (loc.startOffset <= prevToken.startOffset === true) {
loc.endOffset = prevToken.endOffset;
loc.endLine = prevToken.endLine;
loc.endColumn = prevToken.endColumn;
}
// "empty" CstNode edge case
else {
loc.startOffset = NaN;
loc.startLine = NaN;
loc.startColumn = NaN;
}
};
TreeBuilder.prototype.cstPostRuleOnlyOffset = function (ruleCstNode) {
var prevToken = this.LA(0);
        // `location` is not null because `cstPostRuleOnlyOffset` will only be invoked when location tracking is enabled.
var loc = ruleCstNode.location;
        // If this condition is true, it means we consumed at least one Token
        // in this CstNode.
if (loc.startOffset <= prevToken.startOffset === true) {
loc.endOffset = prevToken.endOffset;
}
// "empty" CstNode edge case
else {
loc.startOffset = NaN;
}
};
TreeBuilder.prototype.cstPostTerminal = function (key, consumedToken) {
var rootCst = this.CST_STACK[this.CST_STACK.length - 1];
(0, cst_1.addTerminalToCst)(rootCst, consumedToken, key);
// This is only used when **both** error recovery and CST Output are enabled.
this.setNodeLocationFromToken(rootCst.location, consumedToken);
};
TreeBuilder.prototype.cstPostNonTerminal = function (ruleCstResult, ruleName) {
var preCstNode = this.CST_STACK[this.CST_STACK.length - 1];
(0, cst_1.addNoneTerminalToCst)(preCstNode, ruleName, ruleCstResult);
// This is only used when **both** error recovery and CST Output are enabled.
this.setNodeLocationFromNode(preCstNode.location, ruleCstResult.location);
};
TreeBuilder.prototype.getBaseCstVisitorConstructor = function () {
if ((0, isUndefined_1.default)(this.baseCstVisitorConstructor)) {
var newBaseCstVisitorConstructor = (0, cst_visitor_1.createBaseSemanticVisitorConstructor)(this.className, (0, keys_1.default)(this.gastProductionsCache));
this.baseCstVisitorConstructor = newBaseCstVisitorConstructor;
return newBaseCstVisitorConstructor;
}
return this.baseCstVisitorConstructor;
};
TreeBuilder.prototype.getBaseCstVisitorConstructorWithDefaults = function () {
if ((0, isUndefined_1.default)(this.baseCstVisitorWithDefaultsConstructor)) {
var newConstructor = (0, cst_visitor_1.createBaseVisitorConstructorWithDefaults)(this.className, (0, keys_1.default)(this.gastProductionsCache), this.getBaseCstVisitorConstructor());
this.baseCstVisitorWithDefaultsConstructor = newConstructor;
return newConstructor;
}
return this.baseCstVisitorWithDefaultsConstructor;
};
TreeBuilder.prototype.getLastExplicitRuleShortName = function () {
var ruleStack = this.RULE_STACK;
return ruleStack[ruleStack.length - 1];
};
TreeBuilder.prototype.getPreviousExplicitRuleShortName = function () {
var ruleStack = this.RULE_STACK;
return ruleStack[ruleStack.length - 2];
};
TreeBuilder.prototype.getLastExplicitRuleOccurrenceIndex = function () {
var occurrenceStack = this.RULE_OCCURRENCE_STACK;
return occurrenceStack[occurrenceStack.length - 1];
};
return TreeBuilder;
}());
exports.TreeBuilder = TreeBuilder;
//# sourceMappingURL=tree_builder.js.map
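The "only offset" post-rule logic above can be illustrated with a minimal standalone sketch (not the library's actual API, just the same comparison re-implemented for clarity): a node is non-empty iff its recorded `startOffset` is at or before the offset of the last consumed token; otherwise its location is reset to `NaN`.

```javascript
// Standalone sketch mirroring cstPostRuleOnlyOffset above.
function cstPostRuleOnlyOffset(loc, prevToken) {
    // NaN <= n is always false, so a node whose start was never set
    // (startOffset: NaN) falls into the "empty" branch below.
    if (loc.startOffset <= prevToken.startOffset) {
        loc.endOffset = prevToken.endOffset;
    }
    else {
        // "empty" CstNode edge case: no token was consumed inside this rule.
        loc.startOffset = NaN;
    }
    return loc;
}

var nonEmpty = cstPostRuleOnlyOffset(
    { startOffset: 0, endOffset: NaN },
    { startOffset: 3, endOffset: 5 }
);
var empty = cstPostRuleOnlyOffset(
    { startOffset: NaN, endOffset: NaN },
    { startOffset: 3, endOffset: 5 }
);
```

Note how the deliberately redundant-looking `NaN` comparison does the real work: it cleanly separates nodes that consumed tokens from nodes that consumed none.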

File diff suppressed because one or more lines are too long


@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=types.js.map


@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../../../src/parse/parser/types.ts"],"names":[],"mappings":""}


@@ -0,0 +1,24 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.applyMixins = void 0;
function applyMixins(derivedCtor, baseCtors) {
baseCtors.forEach(function (baseCtor) {
var baseProto = baseCtor.prototype;
Object.getOwnPropertyNames(baseProto).forEach(function (propName) {
if (propName === "constructor") {
return;
}
var basePropDescriptor = Object.getOwnPropertyDescriptor(baseProto, propName);
// Handle Accessors
if (basePropDescriptor &&
(basePropDescriptor.get || basePropDescriptor.set)) {
Object.defineProperty(derivedCtor.prototype, propName, basePropDescriptor);
}
else {
derivedCtor.prototype[propName] = baseCtor.prototype[propName];
}
});
});
}
exports.applyMixins = applyMixins;
//# sourceMappingURL=apply_mixins.js.map
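How `applyMixins` is meant to be used can be shown with a small self-contained sketch. The helper is copied verbatim below so the example runs on its own; the `Walker`/`Swimmer`/`Duck` names are purely illustrative and not part of the library.

```javascript
// Standalone copy of the applyMixins helper above, plus a usage sketch.
function applyMixins(derivedCtor, baseCtors) {
    baseCtors.forEach(function (baseCtor) {
        Object.getOwnPropertyNames(baseCtor.prototype).forEach(function (propName) {
            if (propName === "constructor") {
                return;
            }
            var desc = Object.getOwnPropertyDescriptor(baseCtor.prototype, propName);
            // Accessors (get/set) must be copied via defineProperty,
            // plain methods can be assigned directly.
            if (desc && (desc.get || desc.set)) {
                Object.defineProperty(derivedCtor.prototype, propName, desc);
            }
            else {
                derivedCtor.prototype[propName] = baseCtor.prototype[propName];
            }
        });
    });
}

// Two small "trait" classes mixed into one host class (hypothetical names).
function Walker() {}
Walker.prototype.walk = function () { return "walking"; };
function Swimmer() {}
Swimmer.prototype.swim = function () { return "swimming"; };

function Duck() {}
applyMixins(Duck, [Walker, Swimmer]);

var duck = new Duck();
var walkResult = duck.walk(); // "walking"
var swimResult = duck.swim(); // "swimming"
```

This is the same pattern chevrotain uses internally to compose the parser traits (TreeBuilder, Recognizer, etc.) onto a single Parser class.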


@@ -0,0 +1 @@
{"version":3,"file":"apply_mixins.js","sourceRoot":"","sources":["../../../../../src/parse/parser/utils/apply_mixins.ts"],"names":[],"mappings":";;;AAAA,SAAgB,WAAW,CAAC,WAAgB,EAAE,SAAgB;IAC5D,SAAS,CAAC,OAAO,CAAC,UAAC,QAAQ;QACzB,IAAM,SAAS,GAAG,QAAQ,CAAC,SAAS,CAAA;QACpC,MAAM,CAAC,mBAAmB,CAAC,SAAS,CAAC,CAAC,OAAO,CAAC,UAAC,QAAQ;YACrD,IAAI,QAAQ,KAAK,aAAa,EAAE;gBAC9B,OAAM;aACP;YAED,IAAM,kBAAkB,GAAG,MAAM,CAAC,wBAAwB,CACxD,SAAS,EACT,QAAQ,CACT,CAAA;YACD,mBAAmB;YACnB,IACE,kBAAkB;gBAClB,CAAC,kBAAkB,CAAC,GAAG,IAAI,kBAAkB,CAAC,GAAG,CAAC,EAClD;gBACA,MAAM,CAAC,cAAc,CACnB,WAAW,CAAC,SAAS,EACrB,QAAQ,EACR,kBAAkB,CACnB,CAAA;aACF;iBAAM;gBACL,WAAW,CAAC,SAAS,CAAC,QAAQ,CAAC,GAAG,QAAQ,CAAC,SAAS,CAAC,QAAQ,CAAC,CAAA;aAC/D;QACH,CAAC,CAAC,CAAA;IACJ,CAAC,CAAC,CAAA;AACJ,CAAC;AA3BD,kCA2BC"}


@@ -0,0 +1,951 @@
"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.charCodeToOptimizedIndex = exports.minOptimizationVal = exports.buildLineBreakIssueMessage = exports.LineTerminatorOptimizedTester = exports.isShortPattern = exports.isCustomPattern = exports.cloneEmptyGroups = exports.performWarningRuntimeChecks = exports.performRuntimeChecks = exports.addStickyFlag = exports.addStartOfInput = exports.findUnreachablePatterns = exports.findModesThatDoNotExist = exports.findInvalidGroupType = exports.findDuplicatePatterns = exports.findUnsupportedFlags = exports.findStartOfInputAnchor = exports.findEmptyMatchRegExps = exports.findEndOfInputAnchor = exports.findInvalidPatterns = exports.findMissingPatterns = exports.validatePatterns = exports.analyzeTokenTypes = exports.enableSticky = exports.disableSticky = exports.SUPPORT_STICKY = exports.MODES = exports.DEFAULT_MODE = void 0;
var regexp_to_ast_1 = require("regexp-to-ast");
var lexer_public_1 = require("./lexer_public");
var first_1 = __importDefault(require("lodash/first"));
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var compact_1 = __importDefault(require("lodash/compact"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var values_1 = __importDefault(require("lodash/values"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var reject_1 = __importDefault(require("lodash/reject"));
var difference_1 = __importDefault(require("lodash/difference"));
var indexOf_1 = __importDefault(require("lodash/indexOf"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var isString_1 = __importDefault(require("lodash/isString"));
var isFunction_1 = __importDefault(require("lodash/isFunction"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var find_1 = __importDefault(require("lodash/find"));
var has_1 = __importDefault(require("lodash/has"));
var keys_1 = __importDefault(require("lodash/keys"));
var isRegExp_1 = __importDefault(require("lodash/isRegExp"));
var filter_1 = __importDefault(require("lodash/filter"));
var defaults_1 = __importDefault(require("lodash/defaults"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var includes_1 = __importDefault(require("lodash/includes"));
var utils_1 = require("@chevrotain/utils");
var reg_exp_1 = require("./reg_exp");
var reg_exp_parser_1 = require("./reg_exp_parser");
var PATTERN = "PATTERN";
exports.DEFAULT_MODE = "defaultMode";
exports.MODES = "modes";
exports.SUPPORT_STICKY = typeof new RegExp("(?:)").sticky === "boolean";
function disableSticky() {
exports.SUPPORT_STICKY = false;
}
exports.disableSticky = disableSticky;
function enableSticky() {
exports.SUPPORT_STICKY = true;
}
exports.enableSticky = enableSticky;
function analyzeTokenTypes(tokenTypes, options) {
options = (0, defaults_1.default)(options, {
useSticky: exports.SUPPORT_STICKY,
debug: false,
safeMode: false,
positionTracking: "full",
lineTerminatorCharacters: ["\r", "\n"],
tracer: function (msg, action) { return action(); }
});
var tracer = options.tracer;
tracer("initCharCodeToOptimizedIndexMap", function () {
initCharCodeToOptimizedIndexMap();
});
var onlyRelevantTypes;
tracer("Reject Lexer.NA", function () {
onlyRelevantTypes = (0, reject_1.default)(tokenTypes, function (currType) {
return currType[PATTERN] === lexer_public_1.Lexer.NA;
});
});
var hasCustom = false;
var allTransformedPatterns;
tracer("Transform Patterns", function () {
hasCustom = false;
allTransformedPatterns = (0, map_1.default)(onlyRelevantTypes, function (currType) {
var currPattern = currType[PATTERN];
/* istanbul ignore else */
if ((0, isRegExp_1.default)(currPattern)) {
var regExpSource = currPattern.source;
if (regExpSource.length === 1 &&
                    // these are the only regExp meta characters that can appear in a length-one regExp
regExpSource !== "^" &&
regExpSource !== "$" &&
regExpSource !== "." &&
!currPattern.ignoreCase) {
return regExpSource;
}
else if (regExpSource.length === 2 &&
regExpSource[0] === "\\" &&
// not a meta character
!(0, includes_1.default)([
"d",
"D",
"s",
"S",
"t",
"r",
"n",
"t",
"0",
"c",
"b",
"B",
"f",
"v",
"w",
"W"
], regExpSource[1])) {
// escaped meta Characters: /\+/ /\[/
// or redundant escaping: /\a/
// without the escaping "\"
return regExpSource[1];
}
else {
return options.useSticky
? addStickyFlag(currPattern)
: addStartOfInput(currPattern);
}
}
else if ((0, isFunction_1.default)(currPattern)) {
hasCustom = true;
// CustomPatternMatcherFunc - custom patterns do not require any transformations, only wrapping in a RegExp Like object
return { exec: currPattern };
}
else if (typeof currPattern === "object") {
hasCustom = true;
// ICustomPattern
return currPattern;
}
else if (typeof currPattern === "string") {
if (currPattern.length === 1) {
return currPattern;
}
else {
var escapedRegExpString = currPattern.replace(/[\\^$.*+?()[\]{}|]/g, "\\$&");
var wrappedRegExp = new RegExp(escapedRegExpString);
return options.useSticky
? addStickyFlag(wrappedRegExp)
: addStartOfInput(wrappedRegExp);
}
}
else {
throw Error("non exhaustive match");
}
});
});
var patternIdxToType;
var patternIdxToGroup;
var patternIdxToLongerAltIdxArr;
var patternIdxToPushMode;
var patternIdxToPopMode;
tracer("misc mapping", function () {
patternIdxToType = (0, map_1.default)(onlyRelevantTypes, function (currType) { return currType.tokenTypeIdx; });
patternIdxToGroup = (0, map_1.default)(onlyRelevantTypes, function (clazz) {
var groupName = clazz.GROUP;
/* istanbul ignore next */
if (groupName === lexer_public_1.Lexer.SKIPPED) {
return undefined;
}
else if ((0, isString_1.default)(groupName)) {
return groupName;
}
else if ((0, isUndefined_1.default)(groupName)) {
return false;
}
else {
throw Error("non exhaustive match");
}
});
patternIdxToLongerAltIdxArr = (0, map_1.default)(onlyRelevantTypes, function (clazz) {
var longerAltType = clazz.LONGER_ALT;
if (longerAltType) {
var longerAltIdxArr = (0, isArray_1.default)(longerAltType)
? (0, map_1.default)(longerAltType, function (type) { return (0, indexOf_1.default)(onlyRelevantTypes, type); })
: [(0, indexOf_1.default)(onlyRelevantTypes, longerAltType)];
return longerAltIdxArr;
}
});
patternIdxToPushMode = (0, map_1.default)(onlyRelevantTypes, function (clazz) { return clazz.PUSH_MODE; });
patternIdxToPopMode = (0, map_1.default)(onlyRelevantTypes, function (clazz) {
return (0, has_1.default)(clazz, "POP_MODE");
});
});
var patternIdxToCanLineTerminator;
tracer("Line Terminator Handling", function () {
var lineTerminatorCharCodes = getCharCodes(options.lineTerminatorCharacters);
patternIdxToCanLineTerminator = (0, map_1.default)(onlyRelevantTypes, function (tokType) { return false; });
if (options.positionTracking !== "onlyOffset") {
patternIdxToCanLineTerminator = (0, map_1.default)(onlyRelevantTypes, function (tokType) {
if ((0, has_1.default)(tokType, "LINE_BREAKS")) {
return !!tokType.LINE_BREAKS;
}
else {
return (checkLineBreaksIssues(tokType, lineTerminatorCharCodes) === false &&
(0, reg_exp_1.canMatchCharCode)(lineTerminatorCharCodes, tokType.PATTERN));
}
});
}
});
var patternIdxToIsCustom;
var patternIdxToShort;
var emptyGroups;
var patternIdxToConfig;
tracer("Misc Mapping #2", function () {
patternIdxToIsCustom = (0, map_1.default)(onlyRelevantTypes, isCustomPattern);
patternIdxToShort = (0, map_1.default)(allTransformedPatterns, isShortPattern);
emptyGroups = (0, reduce_1.default)(onlyRelevantTypes, function (acc, clazz) {
var groupName = clazz.GROUP;
if ((0, isString_1.default)(groupName) && !(groupName === lexer_public_1.Lexer.SKIPPED)) {
acc[groupName] = [];
}
return acc;
}, {});
patternIdxToConfig = (0, map_1.default)(allTransformedPatterns, function (x, idx) {
return {
pattern: allTransformedPatterns[idx],
longerAlt: patternIdxToLongerAltIdxArr[idx],
canLineTerminator: patternIdxToCanLineTerminator[idx],
isCustom: patternIdxToIsCustom[idx],
short: patternIdxToShort[idx],
group: patternIdxToGroup[idx],
push: patternIdxToPushMode[idx],
pop: patternIdxToPopMode[idx],
tokenTypeIdx: patternIdxToType[idx],
tokenType: onlyRelevantTypes[idx]
};
});
});
var canBeOptimized = true;
var charCodeToPatternIdxToConfig = [];
if (!options.safeMode) {
tracer("First Char Optimization", function () {
charCodeToPatternIdxToConfig = (0, reduce_1.default)(onlyRelevantTypes, function (result, currTokType, idx) {
if (typeof currTokType.PATTERN === "string") {
var charCode = currTokType.PATTERN.charCodeAt(0);
var optimizedIdx = charCodeToOptimizedIndex(charCode);
addToMapOfArrays(result, optimizedIdx, patternIdxToConfig[idx]);
}
else if ((0, isArray_1.default)(currTokType.START_CHARS_HINT)) {
var lastOptimizedIdx_1;
(0, forEach_1.default)(currTokType.START_CHARS_HINT, function (charOrInt) {
var charCode = typeof charOrInt === "string"
? charOrInt.charCodeAt(0)
: charOrInt;
var currOptimizedIdx = charCodeToOptimizedIndex(charCode);
// Avoid adding the config multiple times
/* istanbul ignore else */
                        // - Difficult to test this scenario's effects as it is only a performance
                        //   optimization that does not change correctness
if (lastOptimizedIdx_1 !== currOptimizedIdx) {
lastOptimizedIdx_1 = currOptimizedIdx;
addToMapOfArrays(result, currOptimizedIdx, patternIdxToConfig[idx]);
}
});
}
else if ((0, isRegExp_1.default)(currTokType.PATTERN)) {
if (currTokType.PATTERN.unicode) {
canBeOptimized = false;
if (options.ensureOptimizations) {
(0, utils_1.PRINT_ERROR)("".concat(reg_exp_1.failedOptimizationPrefixMsg) +
"\tUnable to analyze < ".concat(currTokType.PATTERN.toString(), " > pattern.\n") +
"\tThe regexp unicode flag is not currently supported by the regexp-to-ast library.\n" +
"\tThis will disable the lexer's first char optimizations.\n" +
"\tFor details See: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#UNICODE_OPTIMIZE");
}
}
else {
var optimizedCodes = (0, reg_exp_1.getOptimizedStartCodesIndices)(currTokType.PATTERN, options.ensureOptimizations);
/* istanbul ignore if */
// start code will only be empty given an empty regExp or failure of regexp-to-ast library
// the first should be a different validation and the second cannot be tested.
if ((0, isEmpty_1.default)(optimizedCodes)) {
// we cannot understand what codes may start possible matches
// The optimization correctness requires knowing start codes for ALL patterns.
// Not actually sure this is an error, no debug message
canBeOptimized = false;
}
(0, forEach_1.default)(optimizedCodes, function (code) {
addToMapOfArrays(result, code, patternIdxToConfig[idx]);
});
}
}
else {
if (options.ensureOptimizations) {
(0, utils_1.PRINT_ERROR)("".concat(reg_exp_1.failedOptimizationPrefixMsg) +
"\tTokenType: <".concat(currTokType.name, "> is using a custom token pattern without providing <start_chars_hint> parameter.\n") +
"\tThis will disable the lexer's first char optimizations.\n" +
"\tFor details See: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#CUSTOM_OPTIMIZE");
}
canBeOptimized = false;
}
return result;
}, []);
});
}
return {
emptyGroups: emptyGroups,
patternIdxToConfig: patternIdxToConfig,
charCodeToPatternIdxToConfig: charCodeToPatternIdxToConfig,
hasCustom: hasCustom,
canBeOptimized: canBeOptimized
};
}
exports.analyzeTokenTypes = analyzeTokenTypes;
function validatePatterns(tokenTypes, validModesNames) {
var errors = [];
var missingResult = findMissingPatterns(tokenTypes);
errors = errors.concat(missingResult.errors);
var invalidResult = findInvalidPatterns(missingResult.valid);
var validTokenTypes = invalidResult.valid;
errors = errors.concat(invalidResult.errors);
errors = errors.concat(validateRegExpPattern(validTokenTypes));
errors = errors.concat(findInvalidGroupType(validTokenTypes));
errors = errors.concat(findModesThatDoNotExist(validTokenTypes, validModesNames));
errors = errors.concat(findUnreachablePatterns(validTokenTypes));
return errors;
}
exports.validatePatterns = validatePatterns;
function validateRegExpPattern(tokenTypes) {
var errors = [];
var withRegExpPatterns = (0, filter_1.default)(tokenTypes, function (currTokType) {
return (0, isRegExp_1.default)(currTokType[PATTERN]);
});
errors = errors.concat(findEndOfInputAnchor(withRegExpPatterns));
errors = errors.concat(findStartOfInputAnchor(withRegExpPatterns));
errors = errors.concat(findUnsupportedFlags(withRegExpPatterns));
errors = errors.concat(findDuplicatePatterns(withRegExpPatterns));
errors = errors.concat(findEmptyMatchRegExps(withRegExpPatterns));
return errors;
}
function findMissingPatterns(tokenTypes) {
var tokenTypesWithMissingPattern = (0, filter_1.default)(tokenTypes, function (currType) {
return !(0, has_1.default)(currType, PATTERN);
});
var errors = (0, map_1.default)(tokenTypesWithMissingPattern, function (currType) {
return {
message: "Token Type: ->" +
currType.name +
"<- missing static 'PATTERN' property",
type: lexer_public_1.LexerDefinitionErrorType.MISSING_PATTERN,
tokenTypes: [currType]
};
});
var valid = (0, difference_1.default)(tokenTypes, tokenTypesWithMissingPattern);
return { errors: errors, valid: valid };
}
exports.findMissingPatterns = findMissingPatterns;
function findInvalidPatterns(tokenTypes) {
var tokenTypesWithInvalidPattern = (0, filter_1.default)(tokenTypes, function (currType) {
var pattern = currType[PATTERN];
return (!(0, isRegExp_1.default)(pattern) &&
!(0, isFunction_1.default)(pattern) &&
!(0, has_1.default)(pattern, "exec") &&
!(0, isString_1.default)(pattern));
});
var errors = (0, map_1.default)(tokenTypesWithInvalidPattern, function (currType) {
return {
message: "Token Type: ->" +
currType.name +
"<- static 'PATTERN' can only be a RegExp, a" +
" Function matching the {CustomPatternMatcherFunc} type or an Object matching the {ICustomPattern} interface.",
type: lexer_public_1.LexerDefinitionErrorType.INVALID_PATTERN,
tokenTypes: [currType]
};
});
var valid = (0, difference_1.default)(tokenTypes, tokenTypesWithInvalidPattern);
return { errors: errors, valid: valid };
}
exports.findInvalidPatterns = findInvalidPatterns;
var end_of_input = /[^\\][$]/;
function findEndOfInputAnchor(tokenTypes) {
var EndAnchorFinder = /** @class */ (function (_super) {
__extends(EndAnchorFinder, _super);
function EndAnchorFinder() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.found = false;
return _this;
}
EndAnchorFinder.prototype.visitEndAnchor = function (node) {
this.found = true;
};
return EndAnchorFinder;
}(regexp_to_ast_1.BaseRegExpVisitor));
var invalidRegex = (0, filter_1.default)(tokenTypes, function (currType) {
var pattern = currType.PATTERN;
try {
var regexpAst = (0, reg_exp_parser_1.getRegExpAst)(pattern);
var endAnchorVisitor = new EndAnchorFinder();
endAnchorVisitor.visit(regexpAst);
return endAnchorVisitor.found;
}
catch (e) {
// old behavior in case of runtime exceptions with regexp-to-ast.
/* istanbul ignore next - cannot ensure an error in regexp-to-ast*/
return end_of_input.test(pattern.source);
}
});
var errors = (0, map_1.default)(invalidRegex, function (currType) {
return {
message: "Unexpected RegExp Anchor Error:\n" +
"\tToken Type: ->" +
currType.name +
"<- static 'PATTERN' cannot contain end of input anchor '$'\n" +
                "\tSee https://chevrotain.io/docs/guide/resolving_lexer_errors.html#ANCHORS" +
"\tfor details.",
type: lexer_public_1.LexerDefinitionErrorType.EOI_ANCHOR_FOUND,
tokenTypes: [currType]
};
});
return errors;
}
exports.findEndOfInputAnchor = findEndOfInputAnchor;
function findEmptyMatchRegExps(tokenTypes) {
var matchesEmptyString = (0, filter_1.default)(tokenTypes, function (currType) {
var pattern = currType.PATTERN;
return pattern.test("");
});
var errors = (0, map_1.default)(matchesEmptyString, function (currType) {
return {
message: "Token Type: ->" +
currType.name +
"<- static 'PATTERN' must not match an empty string",
type: lexer_public_1.LexerDefinitionErrorType.EMPTY_MATCH_PATTERN,
tokenTypes: [currType]
};
});
return errors;
}
exports.findEmptyMatchRegExps = findEmptyMatchRegExps;
var start_of_input = /[^\\[][\^]|^\^/;
function findStartOfInputAnchor(tokenTypes) {
var StartAnchorFinder = /** @class */ (function (_super) {
__extends(StartAnchorFinder, _super);
function StartAnchorFinder() {
var _this = _super !== null && _super.apply(this, arguments) || this;
_this.found = false;
return _this;
}
StartAnchorFinder.prototype.visitStartAnchor = function (node) {
this.found = true;
};
return StartAnchorFinder;
}(regexp_to_ast_1.BaseRegExpVisitor));
var invalidRegex = (0, filter_1.default)(tokenTypes, function (currType) {
var pattern = currType.PATTERN;
try {
var regexpAst = (0, reg_exp_parser_1.getRegExpAst)(pattern);
var startAnchorVisitor = new StartAnchorFinder();
startAnchorVisitor.visit(regexpAst);
return startAnchorVisitor.found;
}
catch (e) {
// old behavior in case of runtime exceptions with regexp-to-ast.
/* istanbul ignore next - cannot ensure an error in regexp-to-ast*/
return start_of_input.test(pattern.source);
}
});
var errors = (0, map_1.default)(invalidRegex, function (currType) {
return {
message: "Unexpected RegExp Anchor Error:\n" +
"\tToken Type: ->" +
currType.name +
"<- static 'PATTERN' cannot contain start of input anchor '^'\n" +
"\tSee https://chevrotain.io/docs/guide/resolving_lexer_errors.html#ANCHORS" +
"\tfor details.",
type: lexer_public_1.LexerDefinitionErrorType.SOI_ANCHOR_FOUND,
tokenTypes: [currType]
};
});
return errors;
}
exports.findStartOfInputAnchor = findStartOfInputAnchor;
function findUnsupportedFlags(tokenTypes) {
var invalidFlags = (0, filter_1.default)(tokenTypes, function (currType) {
var pattern = currType[PATTERN];
return pattern instanceof RegExp && (pattern.multiline || pattern.global);
});
var errors = (0, map_1.default)(invalidFlags, function (currType) {
return {
message: "Token Type: ->" +
currType.name +
"<- static 'PATTERN' may NOT contain global('g') or multiline('m')",
type: lexer_public_1.LexerDefinitionErrorType.UNSUPPORTED_FLAGS_FOUND,
tokenTypes: [currType]
};
});
return errors;
}
exports.findUnsupportedFlags = findUnsupportedFlags;
// This can only test for identical duplicate RegExps, not semantically equivalent ones.
function findDuplicatePatterns(tokenTypes) {
var found = [];
var identicalPatterns = (0, map_1.default)(tokenTypes, function (outerType) {
return (0, reduce_1.default)(tokenTypes, function (result, innerType) {
if (outerType.PATTERN.source === innerType.PATTERN.source &&
!(0, includes_1.default)(found, innerType) &&
innerType.PATTERN !== lexer_public_1.Lexer.NA) {
                // This avoids duplicates in the result: each Token Type may only appear in one "set".
                // In essence we are creating equivalence classes based on the equality relation.
found.push(innerType);
result.push(innerType);
return result;
}
return result;
}, []);
});
identicalPatterns = (0, compact_1.default)(identicalPatterns);
var duplicatePatterns = (0, filter_1.default)(identicalPatterns, function (currIdenticalSet) {
return currIdenticalSet.length > 1;
});
var errors = (0, map_1.default)(duplicatePatterns, function (setOfIdentical) {
var tokenTypeNames = (0, map_1.default)(setOfIdentical, function (currType) {
return currType.name;
});
var dupPatternSrc = (0, first_1.default)(setOfIdentical).PATTERN;
return {
message: "The same RegExp pattern ->".concat(dupPatternSrc, "<-") +
                " has been used in all of the following Token Types: ".concat(tokenTypeNames.join(", "), " <-"),
type: lexer_public_1.LexerDefinitionErrorType.DUPLICATE_PATTERNS_FOUND,
tokenTypes: setOfIdentical
};
});
return errors;
}
exports.findDuplicatePatterns = findDuplicatePatterns;
function findInvalidGroupType(tokenTypes) {
var invalidTypes = (0, filter_1.default)(tokenTypes, function (clazz) {
if (!(0, has_1.default)(clazz, "GROUP")) {
return false;
}
var group = clazz.GROUP;
return group !== lexer_public_1.Lexer.SKIPPED && group !== lexer_public_1.Lexer.NA && !(0, isString_1.default)(group);
});
var errors = (0, map_1.default)(invalidTypes, function (currType) {
return {
message: "Token Type: ->" +
currType.name +
"<- static 'GROUP' can only be Lexer.SKIPPED/Lexer.NA/A String",
type: lexer_public_1.LexerDefinitionErrorType.INVALID_GROUP_TYPE_FOUND,
tokenTypes: [currType]
};
});
return errors;
}
exports.findInvalidGroupType = findInvalidGroupType;
function findModesThatDoNotExist(tokenTypes, validModes) {
var invalidModes = (0, filter_1.default)(tokenTypes, function (clazz) {
return (clazz.PUSH_MODE !== undefined && !(0, includes_1.default)(validModes, clazz.PUSH_MODE));
});
var errors = (0, map_1.default)(invalidModes, function (tokType) {
var msg = "Token Type: ->".concat(tokType.name, "<- static 'PUSH_MODE' value cannot refer to a Lexer Mode ->").concat(tokType.PUSH_MODE, "<-") +
            " which does not exist";
return {
message: msg,
type: lexer_public_1.LexerDefinitionErrorType.PUSH_MODE_DOES_NOT_EXIST,
tokenTypes: [tokType]
};
});
return errors;
}
exports.findModesThatDoNotExist = findModesThatDoNotExist;
function findUnreachablePatterns(tokenTypes) {
var errors = [];
var canBeTested = (0, reduce_1.default)(tokenTypes, function (result, tokType, idx) {
var pattern = tokType.PATTERN;
if (pattern === lexer_public_1.Lexer.NA) {
return result;
}
// a more comprehensive validation for all forms of regExps would require
// deeper regExp analysis capabilities
if ((0, isString_1.default)(pattern)) {
result.push({ str: pattern, idx: idx, tokenType: tokType });
}
else if ((0, isRegExp_1.default)(pattern) && noMetaChar(pattern)) {
result.push({ str: pattern.source, idx: idx, tokenType: tokType });
}
return result;
}, []);
(0, forEach_1.default)(tokenTypes, function (tokType, testIdx) {
(0, forEach_1.default)(canBeTested, function (_a) {
var str = _a.str, idx = _a.idx, tokenType = _a.tokenType;
if (testIdx < idx && testTokenType(str, tokType.PATTERN)) {
var msg = "Token: ->".concat(tokenType.name, "<- can never be matched.\n") +
"Because it appears AFTER the Token Type ->".concat(tokType.name, "<-") +
                    " in the lexer's definition.\n" +
"See https://chevrotain.io/docs/guide/resolving_lexer_errors.html#UNREACHABLE";
errors.push({
message: msg,
type: lexer_public_1.LexerDefinitionErrorType.UNREACHABLE_PATTERN,
tokenTypes: [tokType, tokenType]
});
}
});
});
return errors;
}
exports.findUnreachablePatterns = findUnreachablePatterns;
function testTokenType(str, pattern) {
/* istanbul ignore else */
if ((0, isRegExp_1.default)(pattern)) {
var regExpArray = pattern.exec(str);
return regExpArray !== null && regExpArray.index === 0;
}
else if ((0, isFunction_1.default)(pattern)) {
// maintain the API of custom patterns
return pattern(str, 0, [], {});
}
else if ((0, has_1.default)(pattern, "exec")) {
// maintain the API of custom patterns
return pattern.exec(str, 0, [], {});
}
else if (typeof pattern === "string") {
return pattern === str;
}
else {
throw Error("non exhaustive match");
}
}
function noMetaChar(regExp) {
//https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp
var metaChars = [
".",
"\\",
"[",
"]",
"|",
"^",
"$",
"(",
")",
"?",
"*",
"+",
"{"
];
return ((0, find_1.default)(metaChars, function (char) { return regExp.source.indexOf(char) !== -1; }) === undefined);
}
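The `noMetaChar` check above decides whether a RegExp source is a plain literal string, which is what makes the unreachable-pattern test possible. A standalone sketch of the same idea (re-implemented with `Array.prototype.every` instead of lodash, so it runs on its own):

```javascript
// Standalone sketch of the noMetaChar check above: a RegExp source is
// treated as "literal" only if it contains none of the meta characters.
function noMetaChar(regExp) {
    var metaChars = [".", "\\", "[", "]", "|", "^", "$", "(", ")", "?", "*", "+", "{"];
    return metaChars.every(function (char) {
        return regExp.source.indexOf(char) === -1;
    });
}

var literal = noMetaChar(/while/); // true - a plain keyword pattern
var notLiteral = noMetaChar(/\d+/); // false - source contains "\" and "+"
```

Only patterns passing this check (or plain string patterns) are simple enough to be tested for unreachability without deeper regExp analysis, which is exactly the limitation the comment in `findUnreachablePatterns` describes.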
function addStartOfInput(pattern) {
var flags = pattern.ignoreCase ? "i" : "";
    // always wrapping in a non-capturing group preceded by '^' to make sure matching can only work on start of input.
// duplicate/redundant start of input markers have no meaning (/^^^^A/ === /^A/)
return new RegExp("^(?:".concat(pattern.source, ")"), flags);
}
exports.addStartOfInput = addStartOfInput;
function addStickyFlag(pattern) {
var flags = pattern.ignoreCase ? "iy" : "y";
    // The sticky ('y') flag anchors matching at the regExp's lastIndex,
    // so no '^' start-of-input wrapper is needed here.
return new RegExp("".concat(pattern.source), flags);
}
exports.addStickyFlag = addStickyFlag;
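The difference between the two anchoring strategies above can be demonstrated with a minimal standalone sketch (the helpers are copied here so the example is self-contained; it assumes an engine with ES2015 sticky-flag support, which the `SUPPORT_STICKY` probe earlier in this file detects):

```javascript
// '^'-anchored variant: matching can only succeed at the start of the string.
function addStartOfInput(pattern) {
    var flags = pattern.ignoreCase ? "i" : "";
    return new RegExp("^(?:" + pattern.source + ")", flags);
}
// Sticky variant: the 'y' flag anchors matching at lastIndex instead,
// letting the lexer resume scanning mid-string without slicing the input.
function addStickyFlag(pattern) {
    var flags = pattern.ignoreCase ? "iy" : "y";
    return new RegExp(pattern.source, flags);
}

var anchored = addStartOfInput(/\d+/);
var anchoredMatch = anchored.exec("12ab"); // matches "12" at index 0
var anchoredMiss = anchored.exec("ab12"); // null - digits are not at the start

var sticky = addStickyFlag(/\d+/);
sticky.lastIndex = 2; // resume matching at offset 2
var stickyMatch = sticky.exec("ab12"); // matches "12" at offset 2
```

This is why the sticky path is preferred when available: the '^'-anchored fallback would force the lexer to keep slicing the remaining input, while the sticky regExp just advances `lastIndex`.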
function performRuntimeChecks(lexerDefinition, trackLines, lineTerminatorCharacters) {
var errors = [];
// some run time checks to help the end users.
if (!(0, has_1.default)(lexerDefinition, exports.DEFAULT_MODE)) {
errors.push({
message: "A MultiMode Lexer cannot be initialized without a <" +
exports.DEFAULT_MODE +
"> property in its definition\n",
type: lexer_public_1.LexerDefinitionErrorType.MULTI_MODE_LEXER_WITHOUT_DEFAULT_MODE
});
}
if (!(0, has_1.default)(lexerDefinition, exports.MODES)) {
errors.push({
message: "A MultiMode Lexer cannot be initialized without a <" +
exports.MODES +
"> property in its definition\n",
type: lexer_public_1.LexerDefinitionErrorType.MULTI_MODE_LEXER_WITHOUT_MODES_PROPERTY
});
}
if ((0, has_1.default)(lexerDefinition, exports.MODES) &&
(0, has_1.default)(lexerDefinition, exports.DEFAULT_MODE) &&
!(0, has_1.default)(lexerDefinition.modes, lexerDefinition.defaultMode)) {
errors.push({
message: "A MultiMode Lexer cannot be initialized with a ".concat(exports.DEFAULT_MODE, ": <").concat(lexerDefinition.defaultMode, ">") +
"which does not exist\n",
type: lexer_public_1.LexerDefinitionErrorType.MULTI_MODE_LEXER_DEFAULT_MODE_VALUE_DOES_NOT_EXIST
});
}
if ((0, has_1.default)(lexerDefinition, exports.MODES)) {
(0, forEach_1.default)(lexerDefinition.modes, function (currModeValue, currModeName) {
(0, forEach_1.default)(currModeValue, function (currTokType, currIdx) {
if ((0, isUndefined_1.default)(currTokType)) {
errors.push({
message: "A Lexer cannot be initialized using an undefined Token Type. Mode:" +
"<".concat(currModeName, "> at index: <").concat(currIdx, ">\n"),
type: lexer_public_1.LexerDefinitionErrorType.LEXER_DEFINITION_CANNOT_CONTAIN_UNDEFINED
});
}
else if ((0, has_1.default)(currTokType, "LONGER_ALT")) {
var longerAlt = (0, isArray_1.default)(currTokType.LONGER_ALT)
? currTokType.LONGER_ALT
: [currTokType.LONGER_ALT];
(0, forEach_1.default)(longerAlt, function (currLongerAlt) {
if (!(0, isUndefined_1.default)(currLongerAlt) &&
!(0, includes_1.default)(currModeValue, currLongerAlt)) {
errors.push({
message: "A MultiMode Lexer cannot be initialized with a longer_alt <".concat(currLongerAlt.name, "> on token <").concat(currTokType.name, "> outside of mode <").concat(currModeName, ">\n"),
type: lexer_public_1.LexerDefinitionErrorType.MULTI_MODE_LEXER_LONGER_ALT_NOT_IN_CURRENT_MODE
});
}
});
}
});
});
}
return errors;
}
exports.performRuntimeChecks = performRuntimeChecks;
function performWarningRuntimeChecks(lexerDefinition, trackLines, lineTerminatorCharacters) {
var warnings = [];
var hasAnyLineBreak = false;
var allTokenTypes = (0, compact_1.default)((0, flatten_1.default)((0, values_1.default)(lexerDefinition.modes)));
var concreteTokenTypes = (0, reject_1.default)(allTokenTypes, function (currType) { return currType[PATTERN] === lexer_public_1.Lexer.NA; });
var terminatorCharCodes = getCharCodes(lineTerminatorCharacters);
if (trackLines) {
(0, forEach_1.default)(concreteTokenTypes, function (tokType) {
var currIssue = checkLineBreaksIssues(tokType, terminatorCharCodes);
if (currIssue !== false) {
var message = buildLineBreakIssueMessage(tokType, currIssue);
var warningDescriptor = {
message: message,
type: currIssue.issue,
tokenType: tokType
};
warnings.push(warningDescriptor);
}
else {
// we don't want to attempt to scan if the user explicitly specified the line_breaks option.
if ((0, has_1.default)(tokType, "LINE_BREAKS")) {
if (tokType.LINE_BREAKS === true) {
hasAnyLineBreak = true;
}
}
else {
if ((0, reg_exp_1.canMatchCharCode)(terminatorCharCodes, tokType.PATTERN)) {
hasAnyLineBreak = true;
}
}
}
});
}
if (trackLines && !hasAnyLineBreak) {
warnings.push({
message: "Warning: No LINE_BREAKS Found.\n" +
"\tThis Lexer has been defined to track line and column information,\n" +
"\tBut none of the Token Types can be identified as matching a line terminator.\n" +
"\tSee https://chevrotain.io/docs/guide/resolving_lexer_errors.html#LINE_BREAKS \n" +
"\tfor details.",
type: lexer_public_1.LexerDefinitionErrorType.NO_LINE_BREAKS_FLAGS
});
}
return warnings;
}
exports.performWarningRuntimeChecks = performWarningRuntimeChecks;
function cloneEmptyGroups(emptyGroups) {
var clonedResult = {};
var groupKeys = (0, keys_1.default)(emptyGroups);
(0, forEach_1.default)(groupKeys, function (currKey) {
var currGroupValue = emptyGroups[currKey];
/* istanbul ignore else */
if ((0, isArray_1.default)(currGroupValue)) {
clonedResult[currKey] = [];
}
else {
throw Error("non exhaustive match");
}
});
return clonedResult;
}
exports.cloneEmptyGroups = cloneEmptyGroups;
// TODO: refactor to avoid duplication
function isCustomPattern(tokenType) {
var pattern = tokenType.PATTERN;
/* istanbul ignore else */
if ((0, isRegExp_1.default)(pattern)) {
return false;
}
else if ((0, isFunction_1.default)(pattern)) {
// CustomPatternMatcherFunc - custom patterns do not require any transformations, only wrapping in a RegExp Like object
return true;
}
else if ((0, has_1.default)(pattern, "exec")) {
// ICustomPattern
return true;
}
else if ((0, isString_1.default)(pattern)) {
return false;
}
else {
throw Error("non exhaustive match");
}
}
exports.isCustomPattern = isCustomPattern;
function isShortPattern(pattern) {
if ((0, isString_1.default)(pattern) && pattern.length === 1) {
return pattern.charCodeAt(0);
}
else {
return false;
}
}
exports.isShortPattern = isShortPattern;
/**
* Faster than using a RegExp for default newline detection during lexing.
*/
exports.LineTerminatorOptimizedTester = {
// implements /\n|\r\n?/g.test
test: function (text) {
var len = text.length;
for (var i = this.lastIndex; i < len; i++) {
var c = text.charCodeAt(i);
if (c === 10) {
this.lastIndex = i + 1;
return true;
}
else if (c === 13) {
if (text.charCodeAt(i + 1) === 10) {
this.lastIndex = i + 2;
}
else {
this.lastIndex = i + 1;
}
return true;
}
}
return false;
},
lastIndex: 0
};
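// Illustrative usage (a sketch, not part of the library's public API): the tester
// mirrors /\n|\r\n?/g.test semantics, advancing lastIndex past each matched terminator.
//   exports.LineTerminatorOptimizedTester.lastIndex = 0;
//   exports.LineTerminatorOptimizedTester.test("a\r\nb"); // true, lastIndex === 3 ("\r\n" consumed as one terminator)
//   exports.LineTerminatorOptimizedTester.test("a\r\nb"); // false, no further terminators before end of input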
function checkLineBreaksIssues(tokType, lineTerminatorCharCodes) {
if ((0, has_1.default)(tokType, "LINE_BREAKS")) {
// if the user explicitly declared the line_breaks option we will respect their choice
// and assume it is correct.
return false;
}
else {
/* istanbul ignore else */
if ((0, isRegExp_1.default)(tokType.PATTERN)) {
try {
// TODO: why is the casting suddenly needed?
(0, reg_exp_1.canMatchCharCode)(lineTerminatorCharCodes, tokType.PATTERN);
}
catch (e) {
/* istanbul ignore next - to test this we would have to mock <canMatchCharCode> to throw an error */
return {
issue: lexer_public_1.LexerDefinitionErrorType.IDENTIFY_TERMINATOR,
errMsg: e.message
};
}
return false;
}
else if ((0, isString_1.default)(tokType.PATTERN)) {
// string literal patterns can always be analyzed to detect line terminator usage
return false;
}
else if (isCustomPattern(tokType)) {
// custom token types
return { issue: lexer_public_1.LexerDefinitionErrorType.CUSTOM_LINE_BREAK };
}
else {
throw Error("non exhaustive match");
}
}
}
function buildLineBreakIssueMessage(tokType, details) {
/* istanbul ignore else */
if (details.issue === lexer_public_1.LexerDefinitionErrorType.IDENTIFY_TERMINATOR) {
return ("Warning: unable to identify line terminator usage in pattern.\n" +
"\tThe problem is in the <".concat(tokType.name, "> Token Type\n") +
"\t Root cause: ".concat(details.errMsg, ".\n") +
"\tFor details See: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#IDENTIFY_TERMINATOR");
}
else if (details.issue === lexer_public_1.LexerDefinitionErrorType.CUSTOM_LINE_BREAK) {
return ("Warning: A Custom Token Pattern should specify the <line_breaks> option.\n" +
"\tThe problem is in the <".concat(tokType.name, "> Token Type\n") +
"\tFor details See: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#CUSTOM_LINE_BREAK");
}
else {
throw Error("non exhaustive match");
}
}
exports.buildLineBreakIssueMessage = buildLineBreakIssueMessage;
function getCharCodes(charsOrCodes) {
var charCodes = (0, map_1.default)(charsOrCodes, function (numOrString) {
if ((0, isString_1.default)(numOrString)) {
return numOrString.charCodeAt(0);
}
else {
return numOrString;
}
});
return charCodes;
}
function addToMapOfArrays(map, key, value) {
if (map[key] === undefined) {
map[key] = [value];
}
else {
map[key].push(value);
}
}
exports.minOptimizationVal = 256;
/**
 * We are mapping charCodes above ASCII (256) into buckets, each of size 256.
 * This is because ASCII chars are the most common start chars, so each one of those gets its own
 * possible token configs vector.
 *
 * Tokens starting with charCodes "above" ASCII are uncommon, so we can "afford"
 * to place these into buckets of possible token configs. What we gain from
 * this is avoiding the creation of an optimization 'charCodeToPatternIdxToConfig'
 * which would contain 10,000+ arrays of small size (e.g. the unicode Identifiers scenario).
* Our 'charCodeToPatternIdxToConfig' max size will now be:
* 256 + (2^16 / 2^8) - 1 === 511
*
* note the hack for fast division integer part extraction
* See: https://stackoverflow.com/a/4228528
*/
var charCodeToOptimizedIdxMap = [];
function charCodeToOptimizedIndex(charCode) {
return charCode < exports.minOptimizationVal
? charCode
: charCodeToOptimizedIdxMap[charCode];
}
exports.charCodeToOptimizedIndex = charCodeToOptimizedIndex;
/**
 * This is a compromise between cold start and hot running performance.
 * Creating this array takes ~3ms on a modern machine,
 * but if we perform the computation at runtime as needed, the CSS Lexer benchmark
 * performance degrades by ~10%.
*
* TODO: Perhaps it should be lazy initialized only if a charCode > 255 is used.
*/
function initCharCodeToOptimizedIndexMap() {
if ((0, isEmpty_1.default)(charCodeToOptimizedIdxMap)) {
charCodeToOptimizedIdxMap = new Array(65536);
for (var i = 0; i < 65536; i++) {
charCodeToOptimizedIdxMap[i] = i > 255 ? 255 + ~~(i / 255) : i;
}
}
}
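// Illustrative values (a sketch, assuming initCharCodeToOptimizedIndexMap has already run):
//   charCodeToOptimizedIndex(65)     === 65    // ASCII char codes map to themselves
//   charCodeToOptimizedIndex(0x4E2D) === 333   // 255 + ~~(20013 / 255): bucketed above the ASCII range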
//# sourceMappingURL=lexer.js.map

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,12 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.defaultLexerErrorProvider = void 0;
exports.defaultLexerErrorProvider = {
buildUnableToPopLexerModeMessage: function (token) {
return "Unable to pop Lexer Mode after encountering Token ->".concat(token.image, "<- The Mode Stack is empty");
},
buildUnexpectedCharactersMessage: function (fullText, startOffset, length, line, column) {
return ("unexpected character: ->".concat(fullText.charAt(startOffset), "<- at offset: ").concat(startOffset, ",") + " skipped ".concat(length, " characters."));
}
};
//# sourceMappingURL=lexer_errors_public.js.map

View File

@@ -0,0 +1 @@
{"version":3,"file":"lexer_errors_public.js","sourceRoot":"","sources":["../../../src/scan/lexer_errors_public.ts"],"names":[],"mappings":";;;AAEa,QAAA,yBAAyB,GAA+B;IACnE,gCAAgC,YAAC,KAAa;QAC5C,OAAO,8DAAuD,KAAK,CAAC,KAAK,+BAA4B,CAAA;IACvG,CAAC;IAED,gCAAgC,YAC9B,QAAgB,EAChB,WAAmB,EACnB,MAAc,EACd,IAAa,EACb,MAAe;QAEf,OAAO,CACL,kCAA2B,QAAQ,CAAC,MAAM,CACxC,WAAW,CACZ,2BAAiB,WAAW,MAAG,GAAG,mBAAY,MAAM,iBAAc,CACpE,CAAA;IACH,CAAC;CACF,CAAA"}

View File

@@ -0,0 +1,669 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.Lexer = exports.LexerDefinitionErrorType = void 0;
var lexer_1 = require("./lexer");
var noop_1 = __importDefault(require("lodash/noop"));
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var last_1 = __importDefault(require("lodash/last"));
var reject_1 = __importDefault(require("lodash/reject"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var keys_1 = __importDefault(require("lodash/keys"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var identity_1 = __importDefault(require("lodash/identity"));
var assign_1 = __importDefault(require("lodash/assign"));
var reduce_1 = __importDefault(require("lodash/reduce"));
var clone_1 = __importDefault(require("lodash/clone"));
var utils_1 = require("@chevrotain/utils");
var tokens_1 = require("./tokens");
var lexer_errors_public_1 = require("./lexer_errors_public");
var reg_exp_parser_1 = require("./reg_exp_parser");
var LexerDefinitionErrorType;
(function (LexerDefinitionErrorType) {
LexerDefinitionErrorType[LexerDefinitionErrorType["MISSING_PATTERN"] = 0] = "MISSING_PATTERN";
LexerDefinitionErrorType[LexerDefinitionErrorType["INVALID_PATTERN"] = 1] = "INVALID_PATTERN";
LexerDefinitionErrorType[LexerDefinitionErrorType["EOI_ANCHOR_FOUND"] = 2] = "EOI_ANCHOR_FOUND";
LexerDefinitionErrorType[LexerDefinitionErrorType["UNSUPPORTED_FLAGS_FOUND"] = 3] = "UNSUPPORTED_FLAGS_FOUND";
LexerDefinitionErrorType[LexerDefinitionErrorType["DUPLICATE_PATTERNS_FOUND"] = 4] = "DUPLICATE_PATTERNS_FOUND";
LexerDefinitionErrorType[LexerDefinitionErrorType["INVALID_GROUP_TYPE_FOUND"] = 5] = "INVALID_GROUP_TYPE_FOUND";
LexerDefinitionErrorType[LexerDefinitionErrorType["PUSH_MODE_DOES_NOT_EXIST"] = 6] = "PUSH_MODE_DOES_NOT_EXIST";
LexerDefinitionErrorType[LexerDefinitionErrorType["MULTI_MODE_LEXER_WITHOUT_DEFAULT_MODE"] = 7] = "MULTI_MODE_LEXER_WITHOUT_DEFAULT_MODE";
LexerDefinitionErrorType[LexerDefinitionErrorType["MULTI_MODE_LEXER_WITHOUT_MODES_PROPERTY"] = 8] = "MULTI_MODE_LEXER_WITHOUT_MODES_PROPERTY";
LexerDefinitionErrorType[LexerDefinitionErrorType["MULTI_MODE_LEXER_DEFAULT_MODE_VALUE_DOES_NOT_EXIST"] = 9] = "MULTI_MODE_LEXER_DEFAULT_MODE_VALUE_DOES_NOT_EXIST";
LexerDefinitionErrorType[LexerDefinitionErrorType["LEXER_DEFINITION_CANNOT_CONTAIN_UNDEFINED"] = 10] = "LEXER_DEFINITION_CANNOT_CONTAIN_UNDEFINED";
LexerDefinitionErrorType[LexerDefinitionErrorType["SOI_ANCHOR_FOUND"] = 11] = "SOI_ANCHOR_FOUND";
LexerDefinitionErrorType[LexerDefinitionErrorType["EMPTY_MATCH_PATTERN"] = 12] = "EMPTY_MATCH_PATTERN";
LexerDefinitionErrorType[LexerDefinitionErrorType["NO_LINE_BREAKS_FLAGS"] = 13] = "NO_LINE_BREAKS_FLAGS";
LexerDefinitionErrorType[LexerDefinitionErrorType["UNREACHABLE_PATTERN"] = 14] = "UNREACHABLE_PATTERN";
LexerDefinitionErrorType[LexerDefinitionErrorType["IDENTIFY_TERMINATOR"] = 15] = "IDENTIFY_TERMINATOR";
LexerDefinitionErrorType[LexerDefinitionErrorType["CUSTOM_LINE_BREAK"] = 16] = "CUSTOM_LINE_BREAK";
LexerDefinitionErrorType[LexerDefinitionErrorType["MULTI_MODE_LEXER_LONGER_ALT_NOT_IN_CURRENT_MODE"] = 17] = "MULTI_MODE_LEXER_LONGER_ALT_NOT_IN_CURRENT_MODE";
})(LexerDefinitionErrorType = exports.LexerDefinitionErrorType || (exports.LexerDefinitionErrorType = {}));
var DEFAULT_LEXER_CONFIG = {
deferDefinitionErrorsHandling: false,
positionTracking: "full",
lineTerminatorsPattern: /\n|\r\n?/g,
lineTerminatorCharacters: ["\n", "\r"],
ensureOptimizations: false,
safeMode: false,
errorMessageProvider: lexer_errors_public_1.defaultLexerErrorProvider,
traceInitPerf: false,
skipValidations: false,
recoveryEnabled: true
};
Object.freeze(DEFAULT_LEXER_CONFIG);
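// Minimal usage sketch (an illustration, not part of this file; assumes createToken
// from chevrotain's public API and the Lexer class defined below):
//   const Integer = createToken({ name: "Integer", pattern: /\d+/ });
//   const WS = createToken({ name: "WS", pattern: /\s+/, group: Lexer.SKIPPED });
//   const lexer = new Lexer([Integer, WS], { positionTracking: "onlyOffset" });
//   const { tokens, errors } = lexer.tokenize("1 2 3");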
var Lexer = /** @class */ (function () {
function Lexer(lexerDefinition, config) {
if (config === void 0) { config = DEFAULT_LEXER_CONFIG; }
var _this = this;
this.lexerDefinition = lexerDefinition;
this.lexerDefinitionErrors = [];
this.lexerDefinitionWarning = [];
this.patternIdxToConfig = {};
this.charCodeToPatternIdxToConfig = {};
this.modes = [];
this.emptyGroups = {};
this.trackStartLines = true;
this.trackEndLines = true;
this.hasCustom = false;
this.canModeBeOptimized = {};
// Duplicated from the parser's perf trace trait to allow future extraction
// of the lexer to a separate package.
this.TRACE_INIT = function (phaseDesc, phaseImpl) {
            // No need to optimize this using the NOOP pattern because
            // it is not called in a hot spot...
if (_this.traceInitPerf === true) {
_this.traceInitIndent++;
var indent = new Array(_this.traceInitIndent + 1).join("\t");
if (_this.traceInitIndent < _this.traceInitMaxIdent) {
console.log("".concat(indent, "--> <").concat(phaseDesc, ">"));
}
var _a = (0, utils_1.timer)(phaseImpl), time = _a.time, value = _a.value;
/* istanbul ignore next - Difficult to reproduce specific performance behavior (>10ms) in tests */
var traceMethod = time > 10 ? console.warn : console.log;
if (_this.traceInitIndent < _this.traceInitMaxIdent) {
traceMethod("".concat(indent, "<-- <").concat(phaseDesc, "> time: ").concat(time, "ms"));
}
_this.traceInitIndent--;
return value;
}
else {
return phaseImpl();
}
};
if (typeof config === "boolean") {
throw Error("The second argument to the Lexer constructor is now an ILexerConfig Object.\n" +
"a boolean 2nd argument is no longer supported");
}
// todo: defaults func?
this.config = (0, assign_1.default)({}, DEFAULT_LEXER_CONFIG, config);
var traceInitVal = this.config.traceInitPerf;
if (traceInitVal === true) {
this.traceInitMaxIdent = Infinity;
this.traceInitPerf = true;
}
else if (typeof traceInitVal === "number") {
this.traceInitMaxIdent = traceInitVal;
this.traceInitPerf = true;
}
this.traceInitIndent = -1;
this.TRACE_INIT("Lexer Constructor", function () {
var actualDefinition;
var hasOnlySingleMode = true;
_this.TRACE_INIT("Lexer Config handling", function () {
if (_this.config.lineTerminatorsPattern ===
DEFAULT_LEXER_CONFIG.lineTerminatorsPattern) {
                    // optimized built-in implementation for the default definition of lineTerminators
_this.config.lineTerminatorsPattern = lexer_1.LineTerminatorOptimizedTester;
}
else {
if (_this.config.lineTerminatorCharacters ===
DEFAULT_LEXER_CONFIG.lineTerminatorCharacters) {
throw Error("Error: Missing <lineTerminatorCharacters> property on the Lexer config.\n" +
"\tFor details See: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#MISSING_LINE_TERM_CHARS");
}
}
if (config.safeMode && config.ensureOptimizations) {
throw Error('"safeMode" and "ensureOptimizations" flags are mutually exclusive.');
}
_this.trackStartLines = /full|onlyStart/i.test(_this.config.positionTracking);
_this.trackEndLines = /full/i.test(_this.config.positionTracking);
// Convert SingleModeLexerDefinition into a IMultiModeLexerDefinition.
if ((0, isArray_1.default)(lexerDefinition)) {
actualDefinition = {
modes: { defaultMode: (0, clone_1.default)(lexerDefinition) },
defaultMode: lexer_1.DEFAULT_MODE
};
}
else {
                    // no conversion needed, the input should already be an IMultiModeLexerDefinition
hasOnlySingleMode = false;
actualDefinition = (0, clone_1.default)(lexerDefinition);
}
});
if (_this.config.skipValidations === false) {
_this.TRACE_INIT("performRuntimeChecks", function () {
_this.lexerDefinitionErrors = _this.lexerDefinitionErrors.concat((0, lexer_1.performRuntimeChecks)(actualDefinition, _this.trackStartLines, _this.config.lineTerminatorCharacters));
});
_this.TRACE_INIT("performWarningRuntimeChecks", function () {
_this.lexerDefinitionWarning = _this.lexerDefinitionWarning.concat((0, lexer_1.performWarningRuntimeChecks)(actualDefinition, _this.trackStartLines, _this.config.lineTerminatorCharacters));
});
}
            // for extra robustness, to avoid throwing a non-informative error message
actualDefinition.modes = actualDefinition.modes
? actualDefinition.modes
: {};
// an error of undefined TokenTypes will be detected in "performRuntimeChecks" above.
// this transformation is to increase robustness in the case of partially invalid lexer definition.
(0, forEach_1.default)(actualDefinition.modes, function (currModeValue, currModeName) {
actualDefinition.modes[currModeName] = (0, reject_1.default)(currModeValue, function (currTokType) { return (0, isUndefined_1.default)(currTokType); });
});
var allModeNames = (0, keys_1.default)(actualDefinition.modes);
(0, forEach_1.default)(actualDefinition.modes, function (currModDef, currModName) {
_this.TRACE_INIT("Mode: <".concat(currModName, "> processing"), function () {
_this.modes.push(currModName);
if (_this.config.skipValidations === false) {
_this.TRACE_INIT("validatePatterns", function () {
_this.lexerDefinitionErrors = _this.lexerDefinitionErrors.concat((0, lexer_1.validatePatterns)(currModDef, allModeNames));
});
}
                    // If definition errors were encountered, the analysis phase may fail unexpectedly.
                    // Considering a lexer with definition errors may never be used, there is no point
                    // in performing the analysis anyhow...
if ((0, isEmpty_1.default)(_this.lexerDefinitionErrors)) {
(0, tokens_1.augmentTokenTypes)(currModDef);
var currAnalyzeResult_1;
_this.TRACE_INIT("analyzeTokenTypes", function () {
currAnalyzeResult_1 = (0, lexer_1.analyzeTokenTypes)(currModDef, {
lineTerminatorCharacters: _this.config.lineTerminatorCharacters,
positionTracking: config.positionTracking,
ensureOptimizations: config.ensureOptimizations,
safeMode: config.safeMode,
tracer: _this.TRACE_INIT
});
});
_this.patternIdxToConfig[currModName] =
currAnalyzeResult_1.patternIdxToConfig;
_this.charCodeToPatternIdxToConfig[currModName] =
currAnalyzeResult_1.charCodeToPatternIdxToConfig;
_this.emptyGroups = (0, assign_1.default)({}, _this.emptyGroups, currAnalyzeResult_1.emptyGroups);
_this.hasCustom = currAnalyzeResult_1.hasCustom || _this.hasCustom;
_this.canModeBeOptimized[currModName] =
currAnalyzeResult_1.canBeOptimized;
}
});
});
_this.defaultMode = actualDefinition.defaultMode;
if (!(0, isEmpty_1.default)(_this.lexerDefinitionErrors) &&
!_this.config.deferDefinitionErrorsHandling) {
var allErrMessages = (0, map_1.default)(_this.lexerDefinitionErrors, function (error) {
return error.message;
});
var allErrMessagesString = allErrMessages.join("-----------------------\n");
throw new Error("Errors detected in definition of Lexer:\n" + allErrMessagesString);
}
            // Only print warnings if there are no errors.
(0, forEach_1.default)(_this.lexerDefinitionWarning, function (warningDescriptor) {
(0, utils_1.PRINT_WARNING)(warningDescriptor.message);
});
_this.TRACE_INIT("Choosing sub-methods implementations", function () {
// Choose the relevant internal implementations for this specific parser.
// These implementations should be in-lined by the JavaScript engine
// to provide optimal performance in each scenario.
if (lexer_1.SUPPORT_STICKY) {
_this.chopInput = identity_1.default;
_this.match = _this.matchWithTest;
}
else {
_this.updateLastIndex = noop_1.default;
_this.match = _this.matchWithExec;
}
if (hasOnlySingleMode) {
_this.handleModes = noop_1.default;
}
if (_this.trackStartLines === false) {
_this.computeNewColumn = identity_1.default;
}
if (_this.trackEndLines === false) {
_this.updateTokenEndLineColumnLocation = noop_1.default;
}
if (/full/i.test(_this.config.positionTracking)) {
_this.createTokenInstance = _this.createFullToken;
}
else if (/onlyStart/i.test(_this.config.positionTracking)) {
_this.createTokenInstance = _this.createStartOnlyToken;
}
else if (/onlyOffset/i.test(_this.config.positionTracking)) {
_this.createTokenInstance = _this.createOffsetOnlyToken;
}
else {
throw Error("Invalid <positionTracking> config option: \"".concat(_this.config.positionTracking, "\""));
}
if (_this.hasCustom) {
_this.addToken = _this.addTokenUsingPush;
_this.handlePayload = _this.handlePayloadWithCustom;
}
else {
_this.addToken = _this.addTokenUsingMemberAccess;
_this.handlePayload = _this.handlePayloadNoCustom;
}
});
_this.TRACE_INIT("Failed Optimization Warnings", function () {
var unOptimizedModes = (0, reduce_1.default)(_this.canModeBeOptimized, function (cannotBeOptimized, canBeOptimized, modeName) {
if (canBeOptimized === false) {
cannotBeOptimized.push(modeName);
}
return cannotBeOptimized;
}, []);
if (config.ensureOptimizations && !(0, isEmpty_1.default)(unOptimizedModes)) {
throw Error("Lexer Modes: < ".concat(unOptimizedModes.join(", "), " > cannot be optimized.\n") +
'\t Disable the "ensureOptimizations" lexer config flag to silently ignore this and run the lexer in an un-optimized mode.\n' +
"\t Or inspect the console log for details on how to resolve these issues.");
}
});
_this.TRACE_INIT("clearRegExpParserCache", function () {
(0, reg_exp_parser_1.clearRegExpParserCache)();
});
_this.TRACE_INIT("toFastProperties", function () {
(0, utils_1.toFastProperties)(_this);
});
});
}
Lexer.prototype.tokenize = function (text, initialMode) {
if (initialMode === void 0) { initialMode = this.defaultMode; }
if (!(0, isEmpty_1.default)(this.lexerDefinitionErrors)) {
var allErrMessages = (0, map_1.default)(this.lexerDefinitionErrors, function (error) {
return error.message;
});
var allErrMessagesString = allErrMessages.join("-----------------------\n");
throw new Error("Unable to Tokenize because Errors detected in definition of Lexer:\n" +
allErrMessagesString);
}
return this.tokenizeInternal(text, initialMode);
};
    // There is quite a bit of duplication between this and "tokenizeInternalLazy".
    // This is intentional due to performance considerations.
    // This method also uses quite a few `!` non-null assertions because it is too optimized
    // for `tsc` to always understand it is "safe"
Lexer.prototype.tokenizeInternal = function (text, initialMode) {
var _this = this;
var i, j, k, matchAltImage, longerAlt, matchedImage, payload, altPayload, imageLength, group, tokType, newToken, errLength, droppedChar, msg, match;
var orgText = text;
var orgLength = orgText.length;
var offset = 0;
var matchedTokensIndex = 0;
// initializing the tokensArray to the "guessed" size.
// guessing too little will still reduce the number of array re-sizes on pushes.
// guessing too large (Tested by guessing x4 too large) may cost a bit more of memory
// but would still have a faster runtime by avoiding (All but one) array resizing.
var guessedNumberOfTokens = this.hasCustom
            ? 0 // guessing a size would break custom token pattern APIs: the matchedTokens array would contain undefined elements.
: Math.floor(text.length / 10);
var matchedTokens = new Array(guessedNumberOfTokens);
var errors = [];
var line = this.trackStartLines ? 1 : undefined;
var column = this.trackStartLines ? 1 : undefined;
var groups = (0, lexer_1.cloneEmptyGroups)(this.emptyGroups);
var trackLines = this.trackStartLines;
var lineTerminatorPattern = this.config.lineTerminatorsPattern;
var currModePatternsLength = 0;
var patternIdxToConfig = [];
var currCharCodeToPatternIdxToConfig = [];
var modeStack = [];
var emptyArray = [];
Object.freeze(emptyArray);
var getPossiblePatterns;
function getPossiblePatternsSlow() {
return patternIdxToConfig;
}
function getPossiblePatternsOptimized(charCode) {
var optimizedCharIdx = (0, lexer_1.charCodeToOptimizedIndex)(charCode);
var possiblePatterns = currCharCodeToPatternIdxToConfig[optimizedCharIdx];
if (possiblePatterns === undefined) {
return emptyArray;
}
else {
return possiblePatterns;
}
}
var pop_mode = function (popToken) {
            // TODO: perhaps avoid this error in the edge case where there is no more input?
if (modeStack.length === 1 &&
// if we have both a POP_MODE and a PUSH_MODE this is in-fact a "transition"
// So no error should occur.
popToken.tokenType.PUSH_MODE === undefined) {
                // if we try to pop the last mode, the lexer will no longer have ANY mode.
                // thus the pop is ignored; an error will be created and the lexer will continue parsing in the previous mode.
var msg_1 = _this.config.errorMessageProvider.buildUnableToPopLexerModeMessage(popToken);
errors.push({
offset: popToken.startOffset,
line: popToken.startLine,
column: popToken.startColumn,
length: popToken.image.length,
message: msg_1
});
}
else {
modeStack.pop();
var newMode = (0, last_1.default)(modeStack);
patternIdxToConfig = _this.patternIdxToConfig[newMode];
currCharCodeToPatternIdxToConfig =
_this.charCodeToPatternIdxToConfig[newMode];
currModePatternsLength = patternIdxToConfig.length;
var modeCanBeOptimized = _this.canModeBeOptimized[newMode] && _this.config.safeMode === false;
if (currCharCodeToPatternIdxToConfig && modeCanBeOptimized) {
getPossiblePatterns = getPossiblePatternsOptimized;
}
else {
getPossiblePatterns = getPossiblePatternsSlow;
}
}
};
function push_mode(newMode) {
modeStack.push(newMode);
currCharCodeToPatternIdxToConfig =
this.charCodeToPatternIdxToConfig[newMode];
patternIdxToConfig = this.patternIdxToConfig[newMode];
currModePatternsLength = patternIdxToConfig.length;
var modeCanBeOptimized = this.canModeBeOptimized[newMode] && this.config.safeMode === false;
if (currCharCodeToPatternIdxToConfig && modeCanBeOptimized) {
getPossiblePatterns = getPossiblePatternsOptimized;
}
else {
getPossiblePatterns = getPossiblePatternsSlow;
}
}
// this pattern seems to avoid a V8 de-optimization, although that de-optimization does not
    // seem to matter performance-wise.
push_mode.call(this, initialMode);
var currConfig;
var recoveryEnabled = this.config.recoveryEnabled;
while (offset < orgLength) {
matchedImage = null;
var nextCharCode = orgText.charCodeAt(offset);
var chosenPatternIdxToConfig = getPossiblePatterns(nextCharCode);
var chosenPatternsLength = chosenPatternIdxToConfig.length;
for (i = 0; i < chosenPatternsLength; i++) {
currConfig = chosenPatternIdxToConfig[i];
var currPattern = currConfig.pattern;
payload = null;
// manually in-lined because > 600 chars won't be in-lined in V8
var singleCharCode = currConfig.short;
if (singleCharCode !== false) {
if (nextCharCode === singleCharCode) {
// single character string
matchedImage = currPattern;
}
}
else if (currConfig.isCustom === true) {
match = currPattern.exec(orgText, offset, matchedTokens, groups);
if (match !== null) {
matchedImage = match[0];
if (match.payload !== undefined) {
payload = match.payload;
}
}
else {
matchedImage = null;
}
}
else {
this.updateLastIndex(currPattern, offset);
matchedImage = this.match(currPattern, text, offset);
}
if (matchedImage !== null) {
                    // even though this pattern matched, we must still try the longer alternatives.
// this can be used to prioritize keywords over identifiers
longerAlt = currConfig.longerAlt;
if (longerAlt !== undefined) {
// TODO: micro optimize, avoid extra prop access
// by saving/linking longerAlt on the original config?
var longerAltLength = longerAlt.length;
for (k = 0; k < longerAltLength; k++) {
var longerAltConfig = patternIdxToConfig[longerAlt[k]];
var longerAltPattern = longerAltConfig.pattern;
altPayload = null;
// single Char can never be a longer alt so no need to test it.
// manually in-lined because > 600 chars won't be in-lined in V8
if (longerAltConfig.isCustom === true) {
match = longerAltPattern.exec(orgText, offset, matchedTokens, groups);
if (match !== null) {
matchAltImage = match[0];
if (match.payload !== undefined) {
altPayload = match.payload;
}
}
else {
matchAltImage = null;
}
}
else {
this.updateLastIndex(longerAltPattern, offset);
matchAltImage = this.match(longerAltPattern, text, offset);
}
if (matchAltImage && matchAltImage.length > matchedImage.length) {
matchedImage = matchAltImage;
payload = altPayload;
currConfig = longerAltConfig;
// Exit the loop early after matching one of the longer alternatives
// The first matched alternative takes precedence
break;
}
}
}
break;
}
}
// successful match
if (matchedImage !== null) {
imageLength = matchedImage.length;
group = currConfig.group;
if (group !== undefined) {
tokType = currConfig.tokenTypeIdx;
// TODO: "offset + imageLength" and the new column may be computed twice in case of "full" location information inside
// createFullToken method
newToken = this.createTokenInstance(matchedImage, offset, tokType, currConfig.tokenType, line, column, imageLength);
this.handlePayload(newToken, payload);
// TODO: optimize NOOP in case there are no special groups?
if (group === false) {
matchedTokensIndex = this.addToken(matchedTokens, matchedTokensIndex, newToken);
}
else {
groups[group].push(newToken);
}
}
text = this.chopInput(text, imageLength);
offset = offset + imageLength;
// TODO: with newlines the column may be assigned twice
column = this.computeNewColumn(column, imageLength);
if (trackLines === true && currConfig.canLineTerminator === true) {
var numOfLTsInMatch = 0;
var foundTerminator = void 0;
var lastLTEndOffset = void 0;
lineTerminatorPattern.lastIndex = 0;
do {
foundTerminator = lineTerminatorPattern.test(matchedImage);
if (foundTerminator === true) {
lastLTEndOffset = lineTerminatorPattern.lastIndex - 1;
numOfLTsInMatch++;
}
} while (foundTerminator === true);
if (numOfLTsInMatch !== 0) {
line = line + numOfLTsInMatch;
column = imageLength - lastLTEndOffset;
this.updateTokenEndLineColumnLocation(newToken, group, lastLTEndOffset, numOfLTsInMatch, line, column, imageLength);
}
}
                // will be a NOOP if no modes are present
this.handleModes(currConfig, pop_mode, push_mode, newToken);
}
else {
// error recovery, drop characters until we identify a valid token's start point
var errorStartOffset = offset;
var errorLine = line;
var errorColumn = column;
var foundResyncPoint = recoveryEnabled === false;
while (foundResyncPoint === false && offset < orgLength) {
// Identity Func (when sticky flag is enabled)
text = this.chopInput(text, 1);
offset++;
for (j = 0; j < currModePatternsLength; j++) {
var currConfig_1 = patternIdxToConfig[j];
var currPattern = currConfig_1.pattern;
// manually in-lined because > 600 chars won't be in-lined in V8
var singleCharCode = currConfig_1.short;
if (singleCharCode !== false) {
if (orgText.charCodeAt(offset) === singleCharCode) {
// single character string
foundResyncPoint = true;
}
}
else if (currConfig_1.isCustom === true) {
foundResyncPoint =
currPattern.exec(orgText, offset, matchedTokens, groups) !== null;
}
else {
this.updateLastIndex(currPattern, offset);
foundResyncPoint = currPattern.exec(text) !== null;
}
if (foundResyncPoint === true) {
break;
}
}
}
errLength = offset - errorStartOffset;
// at this point we either re-synced or reached the end of the input text
msg = this.config.errorMessageProvider.buildUnexpectedCharactersMessage(orgText, errorStartOffset, errLength, errorLine, errorColumn);
errors.push({
offset: errorStartOffset,
line: errorLine,
column: errorColumn,
length: errLength,
message: msg
});
if (recoveryEnabled === false) {
break;
}
}
}
    // custom patterns may push tokens directly into the tokens array,
    // in that case matchedTokensIndex cannot be relied upon for shrinking.
    // TODO: custom tokens should not push directly??
if (!this.hasCustom) {
// if we guessed a too large size for the tokens array this will shrink it to the right size.
matchedTokens.length = matchedTokensIndex;
}
return {
tokens: matchedTokens,
groups: groups,
errors: errors
};
};
Lexer.prototype.handleModes = function (config, pop_mode, push_mode, newToken) {
if (config.pop === true) {
            // need to save the PUSH_MODE property first, because once the mode is popped
            // patternIdxToPopMode is updated to reflect the new mode at the top of the stack
var pushMode = config.push;
pop_mode(newToken);
if (pushMode !== undefined) {
push_mode.call(this, pushMode);
}
}
else if (config.push !== undefined) {
push_mode.call(this, config.push);
}
};
Lexer.prototype.chopInput = function (text, length) {
return text.substring(length);
};
Lexer.prototype.updateLastIndex = function (regExp, newLastIndex) {
regExp.lastIndex = newLastIndex;
};
    // TODO: decrease this to under 600 characters? inspect the strip-comments option in the TSC compiler
Lexer.prototype.updateTokenEndLineColumnLocation = function (newToken, group, lastLTIdx, numOfLTsInMatch, line, column, imageLength) {
var lastCharIsLT, fixForEndingInLT;
if (group !== undefined) {
            // a non-skipped multi-line Token, need to update endLine/endColumn
lastCharIsLT = lastLTIdx === imageLength - 1;
fixForEndingInLT = lastCharIsLT ? -1 : 0;
if (!(numOfLTsInMatch === 1 && lastCharIsLT === true)) {
                // if a token ends in an LT, that last LT only affects the line numbering of following Tokens
newToken.endLine = line + fixForEndingInLT;
                // the last LT in a token does not affect the endColumn either, as the
                // [columnStart ... columnEnd) range is inclusive to exclusive.
newToken.endColumn = column - 1 + -fixForEndingInLT;
}
// else single LT in the last character of a token, no need to modify the endLine/EndColumn
}
};
Lexer.prototype.computeNewColumn = function (oldColumn, imageLength) {
return oldColumn + imageLength;
};
Lexer.prototype.createOffsetOnlyToken = function (image, startOffset, tokenTypeIdx, tokenType) {
return {
image: image,
startOffset: startOffset,
tokenTypeIdx: tokenTypeIdx,
tokenType: tokenType
};
};
Lexer.prototype.createStartOnlyToken = function (image, startOffset, tokenTypeIdx, tokenType, startLine, startColumn) {
return {
image: image,
startOffset: startOffset,
startLine: startLine,
startColumn: startColumn,
tokenTypeIdx: tokenTypeIdx,
tokenType: tokenType
};
};
Lexer.prototype.createFullToken = function (image, startOffset, tokenTypeIdx, tokenType, startLine, startColumn, imageLength) {
return {
image: image,
startOffset: startOffset,
endOffset: startOffset + imageLength - 1,
startLine: startLine,
endLine: startLine,
startColumn: startColumn,
endColumn: startColumn + imageLength - 1,
tokenTypeIdx: tokenTypeIdx,
tokenType: tokenType
};
};
Lexer.prototype.addTokenUsingPush = function (tokenVector, index, tokenToAdd) {
tokenVector.push(tokenToAdd);
return index;
};
Lexer.prototype.addTokenUsingMemberAccess = function (tokenVector, index, tokenToAdd) {
tokenVector[index] = tokenToAdd;
index++;
return index;
};
Lexer.prototype.handlePayloadNoCustom = function (token, payload) { };
Lexer.prototype.handlePayloadWithCustom = function (token, payload) {
if (payload !== null) {
token.payload = payload;
}
};
Lexer.prototype.matchWithTest = function (pattern, text, offset) {
var found = pattern.test(text);
if (found === true) {
return text.substring(offset, pattern.lastIndex);
}
return null;
};
Lexer.prototype.matchWithExec = function (pattern, text) {
var regExpArray = pattern.exec(text);
return regExpArray !== null ? regExpArray[0] : null;
};
    Lexer.SKIPPED = "This marks a skipped Token pattern, this means each token identified by it will " +
        "be consumed and then thrown into oblivion, this can be used, for example, to completely ignore whitespace.";
Lexer.NA = /NOT_APPLICABLE/;
return Lexer;
}());
exports.Lexer = Lexer;
//# sourceMappingURL=lexer_public.js.map
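The `chopInput` / `updateLastIndex` pair above reflects the lexer's two offset-matching strategies: without the sticky (`y`) flag the consumed prefix must be chopped off the input string, while with it the match position is controlled via `lastIndex` with no string copy. A minimal standalone sketch (illustrative input, not chevrotain's API):

```javascript
// Two ways to match a pattern at offset 4 of the same input.
const input = "let x = 1";

// Strategy 1: chop the consumed prefix and match against the remainder.
const chopped = input.substring(4); // drop "let "
const identPattern = /[a-z]+/;
const viaChop = identPattern.exec(chopped)[0];

// Strategy 2: sticky flag — match at an exact offset, no substring allocation.
const stickyPattern = /[a-z]+/y;
stickyPattern.lastIndex = 4;
const viaSticky = stickyPattern.exec(input)[0];

console.log(viaChop, viaSticky); // x x
```

This is why `chopInput` becomes an identity function when sticky regexps are available: the sticky path avoids re-allocating the remaining input on every token.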

"use strict";
var __extends = (this && this.__extends) || (function () {
var extendStatics = function (d, b) {
extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (Object.prototype.hasOwnProperty.call(b, p)) d[p] = b[p]; };
return extendStatics(d, b);
};
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.canMatchCharCode = exports.firstCharOptimizedIndices = exports.getOptimizedStartCodesIndices = exports.failedOptimizationPrefixMsg = void 0;
var regexp_to_ast_1 = require("regexp-to-ast");
var isArray_1 = __importDefault(require("lodash/isArray"));
var every_1 = __importDefault(require("lodash/every"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var find_1 = __importDefault(require("lodash/find"));
var values_1 = __importDefault(require("lodash/values"));
var includes_1 = __importDefault(require("lodash/includes"));
var utils_1 = require("@chevrotain/utils");
var reg_exp_parser_1 = require("./reg_exp_parser");
var lexer_1 = require("./lexer");
var complementErrorMessage = "Complement Sets are not supported for first char optimization";
exports.failedOptimizationPrefixMsg = 'Unable to use "first char" lexer optimizations:\n';
function getOptimizedStartCodesIndices(regExp, ensureOptimizations) {
if (ensureOptimizations === void 0) { ensureOptimizations = false; }
try {
var ast = (0, reg_exp_parser_1.getRegExpAst)(regExp);
var firstChars = firstCharOptimizedIndices(ast.value, {}, ast.flags.ignoreCase);
return firstChars;
}
catch (e) {
/* istanbul ignore next */
        // Testing this relies on the regexp-to-ast library having a bug...
// TODO: only the else branch needs to be ignored, try to fix with newer prettier / tsc
if (e.message === complementErrorMessage) {
if (ensureOptimizations) {
(0, utils_1.PRINT_WARNING)("".concat(exports.failedOptimizationPrefixMsg) +
"\tUnable to optimize: < ".concat(regExp.toString(), " >\n") +
"\tComplement Sets cannot be automatically optimized.\n" +
"\tThis will disable the lexer's first char optimizations.\n" +
"\tSee: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#COMPLEMENT for details.");
}
}
else {
var msgSuffix = "";
if (ensureOptimizations) {
msgSuffix =
"\n\tThis will disable the lexer's first char optimizations.\n" +
"\tSee: https://chevrotain.io/docs/guide/resolving_lexer_errors.html#REGEXP_PARSING for details.";
}
(0, utils_1.PRINT_ERROR)("".concat(exports.failedOptimizationPrefixMsg, "\n") +
"\tFailed parsing: < ".concat(regExp.toString(), " >\n") +
"\tUsing the regexp-to-ast library version: ".concat(regexp_to_ast_1.VERSION, "\n") +
"\tPlease open an issue at: https://github.com/bd82/regexp-to-ast/issues" +
msgSuffix);
}
}
return [];
}
exports.getOptimizedStartCodesIndices = getOptimizedStartCodesIndices;
function firstCharOptimizedIndices(ast, result, ignoreCase) {
switch (ast.type) {
case "Disjunction":
for (var i = 0; i < ast.value.length; i++) {
firstCharOptimizedIndices(ast.value[i], result, ignoreCase);
}
break;
case "Alternative":
var terms = ast.value;
for (var i = 0; i < terms.length; i++) {
var term = terms[i];
                // skip terms that cannot affect the first char results
switch (term.type) {
case "EndAnchor":
                    // A group back reference cannot affect potential starting chars,
                    // because if a back reference is the first production then the
                    // group being referenced must have come BEFORE, so its codes have already been added
case "GroupBackReference":
// assertions do not affect potential starting codes
case "Lookahead":
case "NegativeLookahead":
case "StartAnchor":
case "WordBoundary":
case "NonWordBoundary":
continue;
}
var atom = term;
switch (atom.type) {
case "Character":
addOptimizedIdxToResult(atom.value, result, ignoreCase);
break;
case "Set":
if (atom.complement === true) {
throw Error(complementErrorMessage);
}
(0, forEach_1.default)(atom.value, function (code) {
if (typeof code === "number") {
addOptimizedIdxToResult(code, result, ignoreCase);
}
else {
// range
var range = code;
                            // cannot optimize when ignoreCase is enabled
if (ignoreCase === true) {
for (var rangeCode = range.from; rangeCode <= range.to; rangeCode++) {
addOptimizedIdxToResult(rangeCode, result, ignoreCase);
}
}
// Optimization (2 orders of magnitude less work for very large ranges)
else {
// handle unoptimized values
for (var rangeCode = range.from; rangeCode <= range.to && rangeCode < lexer_1.minOptimizationVal; rangeCode++) {
addOptimizedIdxToResult(rangeCode, result, ignoreCase);
}
                                // Less common charCodes, where we optimize for faster init time by using larger "buckets"
if (range.to >= lexer_1.minOptimizationVal) {
var minUnOptVal = range.from >= lexer_1.minOptimizationVal
? range.from
: lexer_1.minOptimizationVal;
var maxUnOptVal = range.to;
var minOptIdx = (0, lexer_1.charCodeToOptimizedIndex)(minUnOptVal);
var maxOptIdx = (0, lexer_1.charCodeToOptimizedIndex)(maxUnOptVal);
for (var currOptIdx = minOptIdx; currOptIdx <= maxOptIdx; currOptIdx++) {
result[currOptIdx] = currOptIdx;
}
}
}
}
});
break;
case "Group":
firstCharOptimizedIndices(atom.value, result, ignoreCase);
break;
/* istanbul ignore next */
default:
throw Error("Non Exhaustive Match");
}
// reached a mandatory production, no more **start** codes can be found on this alternative
var isOptionalQuantifier = atom.quantifier !== undefined && atom.quantifier.atLeast === 0;
if (
// A group may be optional due to empty contents /(?:)/
// or if everything inside it is optional /((a)?)/
(atom.type === "Group" && isWholeOptional(atom) === false) ||
// If this term is not a group it may only be optional if it has an optional quantifier
(atom.type !== "Group" && isOptionalQuantifier === false)) {
break;
}
}
break;
/* istanbul ignore next */
default:
throw Error("non exhaustive match!");
}
// console.log(Object.keys(result).length)
return (0, values_1.default)(result);
}
exports.firstCharOptimizedIndices = firstCharOptimizedIndices;
function addOptimizedIdxToResult(code, result, ignoreCase) {
var optimizedCharIdx = (0, lexer_1.charCodeToOptimizedIndex)(code);
result[optimizedCharIdx] = optimizedCharIdx;
if (ignoreCase === true) {
handleIgnoreCase(code, result);
}
}
function handleIgnoreCase(code, result) {
var char = String.fromCharCode(code);
var upperChar = char.toUpperCase();
/* istanbul ignore else */
if (upperChar !== char) {
var optimizedCharIdx = (0, lexer_1.charCodeToOptimizedIndex)(upperChar.charCodeAt(0));
result[optimizedCharIdx] = optimizedCharIdx;
}
else {
var lowerChar = char.toLowerCase();
if (lowerChar !== char) {
var optimizedCharIdx = (0, lexer_1.charCodeToOptimizedIndex)(lowerChar.charCodeAt(0));
result[optimizedCharIdx] = optimizedCharIdx;
}
}
}
function findCode(setNode, targetCharCodes) {
return (0, find_1.default)(setNode.value, function (codeOrRange) {
if (typeof codeOrRange === "number") {
return (0, includes_1.default)(targetCharCodes, codeOrRange);
}
else {
// range
var range_1 = codeOrRange;
return ((0, find_1.default)(targetCharCodes, function (targetCode) { return range_1.from <= targetCode && targetCode <= range_1.to; }) !== undefined);
}
});
}
function isWholeOptional(ast) {
var quantifier = ast.quantifier;
if (quantifier && quantifier.atLeast === 0) {
return true;
}
if (!ast.value) {
return false;
}
return (0, isArray_1.default)(ast.value)
? (0, every_1.default)(ast.value, isWholeOptional)
: isWholeOptional(ast.value);
}
var CharCodeFinder = /** @class */ (function (_super) {
__extends(CharCodeFinder, _super);
function CharCodeFinder(targetCharCodes) {
var _this = _super.call(this) || this;
_this.targetCharCodes = targetCharCodes;
_this.found = false;
return _this;
}
CharCodeFinder.prototype.visitChildren = function (node) {
// No need to keep looking...
if (this.found === true) {
return;
}
        // special-case lookaheads, as they do not actually consume any characters, thus
        // finding a charCode in a lookahead context does not mean the regexp can actually contain it in a match.
switch (node.type) {
case "Lookahead":
this.visitLookahead(node);
return;
case "NegativeLookahead":
this.visitNegativeLookahead(node);
return;
}
_super.prototype.visitChildren.call(this, node);
};
CharCodeFinder.prototype.visitCharacter = function (node) {
if ((0, includes_1.default)(this.targetCharCodes, node.value)) {
this.found = true;
}
};
CharCodeFinder.prototype.visitSet = function (node) {
if (node.complement) {
if (findCode(node, this.targetCharCodes) === undefined) {
this.found = true;
}
}
else {
if (findCode(node, this.targetCharCodes) !== undefined) {
this.found = true;
}
}
};
return CharCodeFinder;
}(regexp_to_ast_1.BaseRegExpVisitor));
function canMatchCharCode(charCodes, pattern) {
if (pattern instanceof RegExp) {
var ast = (0, reg_exp_parser_1.getRegExpAst)(pattern);
var charCodeFinder = new CharCodeFinder(charCodes);
charCodeFinder.visit(ast);
return charCodeFinder.found;
}
else {
return ((0, find_1.default)(pattern, function (char) {
return (0, includes_1.default)(charCodes, char.charCodeAt(0));
}) !== undefined);
}
}
exports.canMatchCharCode = canMatchCharCode;
//# sourceMappingURL=reg_exp.js.map
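The `handleIgnoreCase` helper above records the opposite-case counterpart of each potential first char so case-insensitive patterns still benefit from the start-code optimization. A standalone re-implementation of that idea (illustrative only, not the exported API):

```javascript
// Collect a char code plus its opposite-case counterpart, if one exists.
function caseVariantCodes(code) {
  const char = String.fromCharCode(code);
  const codes = [code];
  const upper = char.toUpperCase();
  if (upper !== char) {
    codes.push(upper.charCodeAt(0));
  } else {
    const lower = char.toLowerCase();
    if (lower !== char) codes.push(lower.charCodeAt(0));
  }
  return codes;
}

console.log(caseVariantCodes(97)); // 'a' -> [ 97, 65 ]
console.log(caseVariantCodes(53)); // '5' -> [ 53 ] (no case counterpart)
```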

"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.clearRegExpParserCache = exports.getRegExpAst = void 0;
var regexp_to_ast_1 = require("regexp-to-ast");
var regExpAstCache = {};
var regExpParser = new regexp_to_ast_1.RegExpParser();
function getRegExpAst(regExp) {
var regExpStr = regExp.toString();
if (regExpAstCache.hasOwnProperty(regExpStr)) {
return regExpAstCache[regExpStr];
}
else {
var regExpAst = regExpParser.pattern(regExpStr);
regExpAstCache[regExpStr] = regExpAst;
return regExpAst;
}
}
exports.getRegExpAst = getRegExpAst;
function clearRegExpParserCache() {
regExpAstCache = {};
}
exports.clearRegExpParserCache = clearRegExpParserCache;
//# sourceMappingURL=reg_exp_parser.js.map
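`getRegExpAst` caches parse results keyed by `regExp.toString()`, so two structurally identical `RegExp` objects share a single parse. A standalone sketch of that caching technique (the fake parser here is a stand-in for `regExpParser.pattern`):

```javascript
// Memoize an expensive parse by the regexp's string form.
const cache = {};
let parseCalls = 0;

function cachedParse(regExp) {
  const key = regExp.toString();
  if (Object.prototype.hasOwnProperty.call(cache, key)) {
    return cache[key];
  }
  parseCalls++;
  const fakeAst = { source: key }; // stand-in for regExpParser.pattern(key)
  cache[key] = fakeAst;
  return fakeAst;
}

cachedParse(/ab+c/g);
cachedParse(/ab+c/g); // cache hit — a distinct RegExp object, same toString()
console.log(parseCalls); // 1
```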

{"version":3,"file":"reg_exp_parser.js","sourceRoot":"","sources":["../../../src/scan/reg_exp_parser.ts"],"names":[],"mappings":";;;AAAA,+CAOsB;AAEtB,IAAI,cAAc,GAAuC,EAAE,CAAA;AAC3D,IAAM,YAAY,GAAG,IAAI,4BAAY,EAAE,CAAA;AAUvC,SAAgB,YAAY,CAAC,MAAc;IACzC,IAAM,SAAS,GAAG,MAAM,CAAC,QAAQ,EAAE,CAAA;IACnC,IAAI,cAAc,CAAC,cAAc,CAAC,SAAS,CAAC,EAAE;QAC5C,OAAO,cAAc,CAAC,SAAS,CAAC,CAAA;KACjC;SAAM;QACL,IAAM,SAAS,GAAG,YAAY,CAAC,OAAO,CAAC,SAAS,CAAC,CAAA;QACjD,cAAc,CAAC,SAAS,CAAC,GAAG,SAAS,CAAA;QACrC,OAAO,SAAS,CAAA;KACjB;AACH,CAAC;AATD,oCASC;AAED,SAAgB,sBAAsB;IACpC,cAAc,GAAG,EAAE,CAAA;AACrB,CAAC;AAFD,wDAEC"}

"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.isTokenType = exports.hasExtendingTokensTypesMapProperty = exports.hasExtendingTokensTypesProperty = exports.hasCategoriesProperty = exports.hasShortKeyProperty = exports.singleAssignCategoriesToksMap = exports.assignCategoriesMapProp = exports.assignCategoriesTokensProp = exports.assignTokenDefaultProps = exports.expandCategories = exports.augmentTokenTypes = exports.tokenIdxToClass = exports.tokenShortNameIdx = exports.tokenStructuredMatcherNoCategories = exports.tokenStructuredMatcher = void 0;
var isEmpty_1 = __importDefault(require("lodash/isEmpty"));
var compact_1 = __importDefault(require("lodash/compact"));
var isArray_1 = __importDefault(require("lodash/isArray"));
var flatten_1 = __importDefault(require("lodash/flatten"));
var difference_1 = __importDefault(require("lodash/difference"));
var map_1 = __importDefault(require("lodash/map"));
var forEach_1 = __importDefault(require("lodash/forEach"));
var has_1 = __importDefault(require("lodash/has"));
var includes_1 = __importDefault(require("lodash/includes"));
var clone_1 = __importDefault(require("lodash/clone"));
function tokenStructuredMatcher(tokInstance, tokConstructor) {
var instanceType = tokInstance.tokenTypeIdx;
if (instanceType === tokConstructor.tokenTypeIdx) {
return true;
}
else {
return (tokConstructor.isParent === true &&
tokConstructor.categoryMatchesMap[instanceType] === true);
}
}
exports.tokenStructuredMatcher = tokenStructuredMatcher;
// Optimized tokenMatcher in case our grammar does not use token categories
// Being so tiny it is much more likely to be in-lined and this avoids the function call overhead
function tokenStructuredMatcherNoCategories(token, tokType) {
return token.tokenTypeIdx === tokType.tokenTypeIdx;
}
exports.tokenStructuredMatcherNoCategories = tokenStructuredMatcherNoCategories;
exports.tokenShortNameIdx = 1;
exports.tokenIdxToClass = {};
function augmentTokenTypes(tokenTypes) {
// collect the parent Token Types as well.
var tokenTypesAndParents = expandCategories(tokenTypes);
// add required tokenType and categoryMatches properties
assignTokenDefaultProps(tokenTypesAndParents);
// fill up the categoryMatches
assignCategoriesMapProp(tokenTypesAndParents);
assignCategoriesTokensProp(tokenTypesAndParents);
(0, forEach_1.default)(tokenTypesAndParents, function (tokType) {
tokType.isParent = tokType.categoryMatches.length > 0;
});
}
exports.augmentTokenTypes = augmentTokenTypes;
function expandCategories(tokenTypes) {
var result = (0, clone_1.default)(tokenTypes);
var categories = tokenTypes;
var searching = true;
while (searching) {
categories = (0, compact_1.default)((0, flatten_1.default)((0, map_1.default)(categories, function (currTokType) { return currTokType.CATEGORIES; })));
var newCategories = (0, difference_1.default)(categories, result);
result = result.concat(newCategories);
if ((0, isEmpty_1.default)(newCategories)) {
searching = false;
}
else {
categories = newCategories;
}
}
return result;
}
exports.expandCategories = expandCategories;
function assignTokenDefaultProps(tokenTypes) {
(0, forEach_1.default)(tokenTypes, function (currTokType) {
if (!hasShortKeyProperty(currTokType)) {
exports.tokenIdxToClass[exports.tokenShortNameIdx] = currTokType;
currTokType.tokenTypeIdx = exports.tokenShortNameIdx++;
}
// CATEGORIES? : TokenType | TokenType[]
if (hasCategoriesProperty(currTokType) &&
!(0, isArray_1.default)(currTokType.CATEGORIES)
// &&
// !isUndefined(currTokType.CATEGORIES.PATTERN)
) {
currTokType.CATEGORIES = [currTokType.CATEGORIES];
}
if (!hasCategoriesProperty(currTokType)) {
currTokType.CATEGORIES = [];
}
if (!hasExtendingTokensTypesProperty(currTokType)) {
currTokType.categoryMatches = [];
}
if (!hasExtendingTokensTypesMapProperty(currTokType)) {
currTokType.categoryMatchesMap = {};
}
});
}
exports.assignTokenDefaultProps = assignTokenDefaultProps;
function assignCategoriesTokensProp(tokenTypes) {
(0, forEach_1.default)(tokenTypes, function (currTokType) {
// avoid duplications
currTokType.categoryMatches = [];
(0, forEach_1.default)(currTokType.categoryMatchesMap, function (val, key) {
currTokType.categoryMatches.push(exports.tokenIdxToClass[key].tokenTypeIdx);
});
});
}
exports.assignCategoriesTokensProp = assignCategoriesTokensProp;
function assignCategoriesMapProp(tokenTypes) {
(0, forEach_1.default)(tokenTypes, function (currTokType) {
singleAssignCategoriesToksMap([], currTokType);
});
}
exports.assignCategoriesMapProp = assignCategoriesMapProp;
function singleAssignCategoriesToksMap(path, nextNode) {
(0, forEach_1.default)(path, function (pathNode) {
nextNode.categoryMatchesMap[pathNode.tokenTypeIdx] = true;
});
(0, forEach_1.default)(nextNode.CATEGORIES, function (nextCategory) {
var newPath = path.concat(nextNode);
// avoids infinite loops due to cyclic categories.
if (!(0, includes_1.default)(newPath, nextCategory)) {
singleAssignCategoriesToksMap(newPath, nextCategory);
}
});
}
exports.singleAssignCategoriesToksMap = singleAssignCategoriesToksMap;
function hasShortKeyProperty(tokType) {
return (0, has_1.default)(tokType, "tokenTypeIdx");
}
exports.hasShortKeyProperty = hasShortKeyProperty;
function hasCategoriesProperty(tokType) {
return (0, has_1.default)(tokType, "CATEGORIES");
}
exports.hasCategoriesProperty = hasCategoriesProperty;
function hasExtendingTokensTypesProperty(tokType) {
return (0, has_1.default)(tokType, "categoryMatches");
}
exports.hasExtendingTokensTypesProperty = hasExtendingTokensTypesProperty;
function hasExtendingTokensTypesMapProperty(tokType) {
return (0, has_1.default)(tokType, "categoryMatchesMap");
}
exports.hasExtendingTokensTypesMapProperty = hasExtendingTokensTypesMapProperty;
function isTokenType(tokType) {
return (0, has_1.default)(tokType, "tokenTypeIdx");
}
exports.isTokenType = isTokenType;
//# sourceMappingURL=tokens.js.map
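`tokenStructuredMatcher` above performs a two-step check: an exact `tokenTypeIdx` comparison, falling back to a `categoryMatchesMap` lookup for category (parent) token types. A standalone sketch with hypothetical token types (real ones are wired up by `augmentTokenTypes`):

```javascript
// Hypothetical token types: Keyword is a sub-category of Identifier.
const Keyword = { tokenTypeIdx: 7 };
const Identifier = {
  tokenTypeIdx: 3,
  isParent: true,
  categoryMatchesMap: { 7: true }
};

// Same shape as tokenStructuredMatcher: exact match, then category map.
function matches(token, tokType) {
  if (token.tokenTypeIdx === tokType.tokenTypeIdx) return true;
  return tokType.isParent === true &&
    tokType.categoryMatchesMap[token.tokenTypeIdx] === true;
}

console.log(matches({ tokenTypeIdx: 7 }, Keyword));    // true — exact match
console.log(matches({ tokenTypeIdx: 7 }, Identifier)); // true — via category map
console.log(matches({ tokenTypeIdx: 3 }, Keyword));    // false
```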

{"version":3,"file":"tokens.js","sourceRoot":"","sources":["../../../src/scan/tokens.ts"],"names":[],"mappings":";;;;;;AAAA,2DAAoC;AACpC,2DAAoC;AACpC,2DAAoC;AACpC,2DAAoC;AACpC,iEAA0C;AAC1C,mDAA4B;AAC5B,2DAAoC;AACpC,mDAA4B;AAC5B,6DAAsC;AACtC,uDAAgC;AAGhC,SAAgB,sBAAsB,CACpC,WAAmB,EACnB,cAAyB;IAEzB,IAAM,YAAY,GAAG,WAAW,CAAC,YAAY,CAAA;IAC7C,IAAI,YAAY,KAAK,cAAc,CAAC,YAAY,EAAE;QAChD,OAAO,IAAI,CAAA;KACZ;SAAM;QACL,OAAO,CACL,cAAc,CAAC,QAAQ,KAAK,IAAI;YAChC,cAAc,CAAC,kBAAmB,CAAC,YAAY,CAAC,KAAK,IAAI,CAC1D,CAAA;KACF;AACH,CAAC;AAbD,wDAaC;AAED,2EAA2E;AAC3E,gGAAgG;AAChG,SAAgB,kCAAkC,CAChD,KAAa,EACb,OAAkB;IAElB,OAAO,KAAK,CAAC,YAAY,KAAK,OAAO,CAAC,YAAY,CAAA;AACpD,CAAC;AALD,gFAKC;AAEU,QAAA,iBAAiB,GAAG,CAAC,CAAA;AACnB,QAAA,eAAe,GAAsC,EAAE,CAAA;AAEpE,SAAgB,iBAAiB,CAAC,UAAuB;IACvD,0CAA0C;IAC1C,IAAM,oBAAoB,GAAG,gBAAgB,CAAC,UAAU,CAAC,CAAA;IAEzD,wDAAwD;IACxD,uBAAuB,CAAC,oBAAoB,CAAC,CAAA;IAE7C,8BAA8B;IAC9B,uBAAuB,CAAC,oBAAoB,CAAC,CAAA;IAC7C,0BAA0B,CAAC,oBAAoB,CAAC,CAAA;IAEhD,IAAA,iBAAO,EAAC,oBAAoB,EAAE,UAAC,OAAO;QACpC,OAAO,CAAC,QAAQ,GAAG,OAAO,CAAC,eAAgB,CAAC,MAAM,GAAG,CAAC,CAAA;IACxD,CAAC,CAAC,CAAA;AACJ,CAAC;AAdD,8CAcC;AAED,SAAgB,gBAAgB,CAAC,UAAuB;IACtD,IAAI,MAAM,GAAG,IAAA,eAAK,EAAC,UAAU,CAAC,CAAA;IAE9B,IAAI,UAAU,GAAG,UAAU,CAAA;IAC3B,IAAI,SAAS,GAAG,IAAI,CAAA;IACpB,OAAO,SAAS,EAAE;QAChB,UAAU,GAAG,IAAA,iBAAO,EAClB,IAAA,iBAAO,EAAC,IAAA,aAAG,EAAC,UAAU,EAAE,UAAC,WAAW,IAAK,OAAA,WAAW,CAAC,UAAU,EAAtB,CAAsB,CAAC,CAAC,CAClE,CAAA;QAED,IAAM,aAAa,GAAG,IAAA,oBAAU,EAAC,UAAU,EAAE,MAAM,CAAC,CAAA;QAEpD,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,aAAa,CAAC,CAAA;QAErC,IAAI,IAAA,iBAAO,EAAC,aAAa,CAAC,EAAE;YAC1B,SAAS,GAAG,KAAK,CAAA;SAClB;aAAM;YACL,UAAU,GAAG,aAAa,CAAA;SAC3B;KACF;IACD,OAAO,MAAM,CAAA;AACf,CAAC;AArBD,4CAqBC;AAED,SAAgB,uBAAuB,CAAC,UAAuB;IAC7D,IAAA,iBAAO,EAAC,UAAU,EAAE,UAAC,WAAW;QAC9B,IAAI,CAAC,mBAAmB,CAAC,WAAW,CAAC,EAAE;YACrC,uBAAe,CAAC,yBAAiB,CAAC,GAAG,WAAW,CAC/C;YAAM,WAAY,CAAC,YAAY,GAAG,yBAAiB,EAAE,CAAA;SACvD;QAED,wCAAwC;QACxC,IACE,qBAAqB,CAAC,WAAW,CAAC;YAClC,CAAC,IAAA,iBAAO,EAAC,WAAW,CAAC,UAAU,CAAC;QAChC,KAAK;QACL,+
CAA+C;UAC/C;YACA,WAAW,CAAC,UAAU,GAAG,CAAC,WAAW,CAAC,UAAkC,CAAC,CAAA;SAC1E;QAED,IAAI,CAAC,qBAAqB,CAAC,WAAW,CAAC,EAAE;YACvC,WAAW,CAAC,UAAU,GAAG,EAAE,CAAA;SAC5B;QAED,IAAI,CAAC,+BAA+B,CAAC,WAAW,CAAC,EAAE;YACjD,WAAW,CAAC,eAAe,GAAG,EAAE,CAAA;SACjC;QAED,IAAI,CAAC,kCAAkC,CAAC,WAAW,CAAC,EAAE;YACpD,WAAW,CAAC,kBAAkB,GAAG,EAAE,CAAA;SACpC;IACH,CAAC,CAAC,CAAA;AACJ,CAAC;AA7BD,0DA6BC;AAED,SAAgB,0BAA0B,CAAC,UAAuB;IAChE,IAAA,iBAAO,EAAC,UAAU,EAAE,UAAC,WAAW;QAC9B,qBAAqB;QACrB,WAAW,CAAC,eAAe,GAAG,EAAE,CAAA;QAChC,IAAA,iBAAO,EAAC,WAAW,CAAC,kBAAmB,EAAE,UAAC,GAAG,EAAE,GAAG;YAChD,WAAW,CAAC,eAAgB,CAAC,IAAI,CAC/B,uBAAe,CAAC,GAAwB,CAAC,CAAC,YAAa,CACxD,CAAA;QACH,CAAC,CAAC,CAAA;IACJ,CAAC,CAAC,CAAA;AACJ,CAAC;AAVD,gEAUC;AAED,SAAgB,uBAAuB,CAAC,UAAuB;IAC7D,IAAA,iBAAO,EAAC,UAAU,EAAE,UAAC,WAAW;QAC9B,6BAA6B,CAAC,EAAE,EAAE,WAAW,CAAC,CAAA;IAChD,CAAC,CAAC,CAAA;AACJ,CAAC;AAJD,0DAIC;AAED,SAAgB,6BAA6B,CAC3C,IAAiB,EACjB,QAAmB;IAEnB,IAAA,iBAAO,EAAC,IAAI,EAAE,UAAC,QAAQ;QACrB,QAAQ,CAAC,kBAAmB,CAAC,QAAQ,CAAC,YAAa,CAAC,GAAG,IAAI,CAAA;IAC7D,CAAC,CAAC,CAAA;IAEF,IAAA,iBAAO,EAAC,QAAQ,CAAC,UAAU,EAAE,UAAC,YAAY;QACxC,IAAM,OAAO,GAAG,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAA;QACrC,kDAAkD;QAClD,IAAI,CAAC,IAAA,kBAAQ,EAAC,OAAO,EAAE,YAAY,CAAC,EAAE;YACpC,6BAA6B,CAAC,OAAO,EAAE,YAAY,CAAC,CAAA;SACrD;IACH,CAAC,CAAC,CAAA;AACJ,CAAC;AAfD,sEAeC;AAED,SAAgB,mBAAmB,CAAC,OAAkB;IACpD,OAAO,IAAA,aAAG,EAAC,OAAO,EAAE,cAAc,CAAC,CAAA;AACrC,CAAC;AAFD,kDAEC;AAED,SAAgB,qBAAqB,CAAC,OAAkB;IACtD,OAAO,IAAA,aAAG,EAAC,OAAO,EAAE,YAAY,CAAC,CAAA;AACnC,CAAC;AAFD,sDAEC;AAED,SAAgB,+BAA+B,CAAC,OAAkB;IAChE,OAAO,IAAA,aAAG,EAAC,OAAO,EAAE,iBAAiB,CAAC,CAAA;AACxC,CAAC;AAFD,0EAEC;AAED,SAAgB,kCAAkC,CAChD,OAAkB;IAElB,OAAO,IAAA,aAAG,EAAC,OAAO,EAAE,oBAAoB,CAAC,CAAA;AAC3C,CAAC;AAJD,gFAIC;AAED,SAAgB,WAAW,CAAC,OAAkB;IAC5C,OAAO,IAAA,aAAG,EAAC,OAAO,EAAE,cAAc,CAAC,CAAA;AACrC,CAAC;AAFD,kCAEC"}

"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.EOF_TOKEN_TYPE = void 0;
exports.EOF_TOKEN_TYPE = 1;
//# sourceMappingURL=tokens_constants.js.map

{"version":3,"file":"tokens_constants.js","sourceRoot":"","sources":["../../../src/scan/tokens_constants.ts"],"names":[],"mappings":";;;AAAa,QAAA,cAAc,GAAG,CAAC,CAAA"}

"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.tokenMatcher = exports.createTokenInstance = exports.EOF = exports.createToken = exports.hasTokenLabel = exports.tokenName = exports.tokenLabel = void 0;
var isString_1 = __importDefault(require("lodash/isString"));
var has_1 = __importDefault(require("lodash/has"));
var isUndefined_1 = __importDefault(require("lodash/isUndefined"));
var lexer_public_1 = require("./lexer_public");
var tokens_1 = require("./tokens");
function tokenLabel(tokType) {
if (hasTokenLabel(tokType)) {
return tokType.LABEL;
}
else {
return tokType.name;
}
}
exports.tokenLabel = tokenLabel;
function tokenName(tokType) {
return tokType.name;
}
exports.tokenName = tokenName;
function hasTokenLabel(obj) {
return (0, isString_1.default)(obj.LABEL) && obj.LABEL !== "";
}
exports.hasTokenLabel = hasTokenLabel;
var PARENT = "parent";
var CATEGORIES = "categories";
var LABEL = "label";
var GROUP = "group";
var PUSH_MODE = "push_mode";
var POP_MODE = "pop_mode";
var LONGER_ALT = "longer_alt";
var LINE_BREAKS = "line_breaks";
var START_CHARS_HINT = "start_chars_hint";
function createToken(config) {
return createTokenInternal(config);
}
exports.createToken = createToken;
function createTokenInternal(config) {
var pattern = config.pattern;
var tokenType = {};
tokenType.name = config.name;
if (!(0, isUndefined_1.default)(pattern)) {
tokenType.PATTERN = pattern;
}
if ((0, has_1.default)(config, PARENT)) {
throw ("The parent property is no longer supported.\n" +
"See: https://github.com/chevrotain/chevrotain/issues/564#issuecomment-349062346 for details.");
}
if ((0, has_1.default)(config, CATEGORIES)) {
        // casting to ANY as this will be fixed inside `augmentTokenTypes`
tokenType.CATEGORIES = config[CATEGORIES];
}
(0, tokens_1.augmentTokenTypes)([tokenType]);
if ((0, has_1.default)(config, LABEL)) {
tokenType.LABEL = config[LABEL];
}
if ((0, has_1.default)(config, GROUP)) {
tokenType.GROUP = config[GROUP];
}
if ((0, has_1.default)(config, POP_MODE)) {
tokenType.POP_MODE = config[POP_MODE];
}
if ((0, has_1.default)(config, PUSH_MODE)) {
tokenType.PUSH_MODE = config[PUSH_MODE];
}
if ((0, has_1.default)(config, LONGER_ALT)) {
tokenType.LONGER_ALT = config[LONGER_ALT];
}
if ((0, has_1.default)(config, LINE_BREAKS)) {
tokenType.LINE_BREAKS = config[LINE_BREAKS];
}
if ((0, has_1.default)(config, START_CHARS_HINT)) {
tokenType.START_CHARS_HINT = config[START_CHARS_HINT];
}
return tokenType;
}
exports.EOF = createToken({ name: "EOF", pattern: lexer_public_1.Lexer.NA });
(0, tokens_1.augmentTokenTypes)([exports.EOF]);
function createTokenInstance(tokType, image, startOffset, endOffset, startLine, endLine, startColumn, endColumn) {
return {
image: image,
startOffset: startOffset,
endOffset: endOffset,
startLine: startLine,
endLine: endLine,
startColumn: startColumn,
endColumn: endColumn,
tokenTypeIdx: tokType.tokenTypeIdx,
tokenType: tokType
};
}
exports.createTokenInstance = createTokenInstance;
function tokenMatcher(token, tokType) {
return (0, tokens_1.tokenStructuredMatcher)(token, tokType);
}
exports.tokenMatcher = tokenMatcher;
//# sourceMappingURL=tokens_public.js.map
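`tokenLabel` above prefers a non-empty `LABEL` string and falls back to the token type's `name`; this is what error messages use for human-friendly token names. A standalone re-implementation of that fallback (illustrative, not the exported function):

```javascript
// Prefer a non-empty LABEL, otherwise use the token type's name.
function label(tokType) {
  return typeof tokType.LABEL === "string" && tokType.LABEL !== ""
    ? tokType.LABEL
    : tokType.name;
}

console.log(label({ name: "Semicolon", LABEL: ";" })); // ;
console.log(label({ name: "Identifier" }));            // Identifier
```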

{"version":3,"file":"tokens_public.js","sourceRoot":"","sources":["../../../src/scan/tokens_public.ts"],"names":[],"mappings":";;;;;;AAAA,6DAAsC;AACtC,mDAA4B;AAC5B,mEAA4C;AAC5C,+CAAsC;AACtC,mCAAoE;AAGpE,SAAgB,UAAU,CAAC,OAAkB;IAC3C,IAAI,aAAa,CAAC,OAAO,CAAC,EAAE;QAC1B,OAAO,OAAO,CAAC,KAAK,CAAA;KACrB;SAAM;QACL,OAAO,OAAO,CAAC,IAAI,CAAA;KACpB;AACH,CAAC;AAND,gCAMC;AAED,SAAgB,SAAS,CAAC,OAAkB;IAC1C,OAAO,OAAO,CAAC,IAAI,CAAA;AACrB,CAAC;AAFD,8BAEC;AAED,SAAgB,aAAa,CAC3B,GAAc;IAEd,OAAO,IAAA,kBAAQ,EAAC,GAAG,CAAC,KAAK,CAAC,IAAI,GAAG,CAAC,KAAK,KAAK,EAAE,CAAA;AAChD,CAAC;AAJD,sCAIC;AAED,IAAM,MAAM,GAAG,QAAQ,CAAA;AACvB,IAAM,UAAU,GAAG,YAAY,CAAA;AAC/B,IAAM,KAAK,GAAG,OAAO,CAAA;AACrB,IAAM,KAAK,GAAG,OAAO,CAAA;AACrB,IAAM,SAAS,GAAG,WAAW,CAAA;AAC7B,IAAM,QAAQ,GAAG,UAAU,CAAA;AAC3B,IAAM,UAAU,GAAG,YAAY,CAAA;AAC/B,IAAM,WAAW,GAAG,aAAa,CAAA;AACjC,IAAM,gBAAgB,GAAG,kBAAkB,CAAA;AAE3C,SAAgB,WAAW,CAAC,MAAoB;IAC9C,OAAO,mBAAmB,CAAC,MAAM,CAAC,CAAA;AACpC,CAAC;AAFD,kCAEC;AAED,SAAS,mBAAmB,CAAC,MAAoB;IAC/C,IAAM,OAAO,GAAG,MAAM,CAAC,OAAO,CAAA;IAE9B,IAAM,SAAS,GAAmB,EAAE,CAAA;IACpC,SAAS,CAAC,IAAI,GAAG,MAAM,CAAC,IAAI,CAAA;IAE5B,IAAI,CAAC,IAAA,qBAAW,EAAC,OAAO,CAAC,EAAE;QACzB,SAAS,CAAC,OAAO,GAAG,OAAO,CAAA;KAC5B;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,MAAM,CAAC,EAAE;QACvB,MAAM,CACJ,+CAA+C;YAC/C,8FAA8F,CAC/F,CAAA;KACF;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,UAAU,CAAC,EAAE;QAC3B,mEAAmE;QACnE,SAAS,CAAC,UAAU,GAAQ,MAAM,CAAC,UAAU,CAAC,CAAA;KAC/C;IAED,IAAA,0BAAiB,EAAC,CAAC,SAAS,CAAC,CAAC,CAAA;IAE9B,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,KAAK,CAAC,EAAE;QACtB,SAAS,CAAC,KAAK,GAAG,MAAM,CAAC,KAAK,CAAC,CAAA;KAChC;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,KAAK,CAAC,EAAE;QACtB,SAAS,CAAC,KAAK,GAAG,MAAM,CAAC,KAAK,CAAC,CAAA;KAChC;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,QAAQ,CAAC,EAAE;QACzB,SAAS,CAAC,QAAQ,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAA;KACtC;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,SAAS,CAAC,EAAE;QAC1B,SAAS,CAAC,SAAS,GAAG,MAAM,CAAC,SAAS,CAAC,CAAA;KACxC;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,UAAU,CAAC,EAAE;QAC3B,SAAS,CAAC,UAAU,GAAG,MAAM,CAAC,UAAU,CAAC,CAAA;KAC1C;IAED,IAAI,IAAA,aAAG
,EAAC,MAAM,EAAE,WAAW,CAAC,EAAE;QAC5B,SAAS,CAAC,WAAW,GAAG,MAAM,CAAC,WAAW,CAAC,CAAA;KAC5C;IAED,IAAI,IAAA,aAAG,EAAC,MAAM,EAAE,gBAAgB,CAAC,EAAE;QACjC,SAAS,CAAC,gBAAgB,GAAG,MAAM,CAAC,gBAAgB,CAAC,CAAA;KACtD;IAED,OAAO,SAAS,CAAA;AAClB,CAAC;AAEY,QAAA,GAAG,GAAG,WAAW,CAAC,EAAE,IAAI,EAAE,KAAK,EAAE,OAAO,EAAE,oBAAK,CAAC,EAAE,EAAE,CAAC,CAAA;AAClE,IAAA,0BAAiB,EAAC,CAAC,WAAG,CAAC,CAAC,CAAA;AAExB,SAAgB,mBAAmB,CACjC,OAAkB,EAClB,KAAa,EACb,WAAmB,EACnB,SAAiB,EACjB,SAAiB,EACjB,OAAe,EACf,WAAmB,EACnB,SAAiB;IAEjB,OAAO;QACL,KAAK,OAAA;QACL,WAAW,aAAA;QACX,SAAS,WAAA;QACT,SAAS,WAAA;QACT,OAAO,SAAA;QACP,WAAW,aAAA;QACX,SAAS,WAAA;QACT,YAAY,EAAQ,OAAQ,CAAC,YAAY;QACzC,SAAS,EAAE,OAAO;KACnB,CAAA;AACH,CAAC;AArBD,kDAqBC;AAED,SAAgB,YAAY,CAAC,KAAa,EAAE,OAAkB;IAC5D,OAAO,IAAA,+BAAsB,EAAC,KAAK,EAAE,OAAO,CAAC,CAAA;AAC/C,CAAC;AAFD,oCAEC"}

"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.isValidRange = exports.Range = void 0;
var Range = /** @class */ (function () {
function Range(start, end) {
this.start = start;
this.end = end;
if (!isValidRange(start, end)) {
throw new Error("INVALID RANGE");
}
}
Range.prototype.contains = function (num) {
return this.start <= num && this.end >= num;
};
Range.prototype.containsRange = function (other) {
return this.start <= other.start && this.end >= other.end;
};
Range.prototype.isContainedInRange = function (other) {
return other.containsRange(this);
};
Range.prototype.strictlyContainsRange = function (other) {
return this.start < other.start && this.end > other.end;
};
Range.prototype.isStrictlyContainedInRange = function (other) {
return other.strictlyContainsRange(this);
};
return Range;
}());
exports.Range = Range;
function isValidRange(start, end) {
return !(start < 0 || end < start);
}
exports.isValidRange = isValidRange;
//# sourceMappingURL=range.js.map
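The generated `range.js` above defines a small inclusive-range helper. The following standalone sketch restates its semantics in modern syntax so it can be run without the package; it is an illustration of the same contract (a range is valid when `start >= 0` and `end >= start`, and both bounds are inclusive), not the library file itself.

```javascript
// Minimal standalone sketch of the Range semantics from range.js above.
// Assumes the same contract: valid when start >= 0 and end >= start.
class Range {
  constructor(start, end) {
    if (start < 0 || end < start) throw new Error("INVALID RANGE");
    this.start = start;
    this.end = end;
  }
  // Both bounds are inclusive.
  contains(num) { return this.start <= num && num <= this.end; }
  // Non-strict containment: equal bounds still count as contained.
  containsRange(other) { return this.start <= other.start && other.end <= this.end; }
  // Strict containment: both bounds must be strictly inside.
  strictlyContainsRange(other) { return this.start < other.start && other.end < this.end; }
}

const outer = new Range(0, 10);
const inner = new Range(2, 5);
console.log(outer.contains(10));                            // true: upper bound is inclusive
console.log(outer.containsRange(inner));                    // true
console.log(outer.strictlyContainsRange(new Range(0, 10))); // false: equal bounds are not strict
```

Note that `containsRange` returns `true` for an identical range while `strictlyContainsRange` does not; this distinction is why the generated file exposes both methods.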


@@ -0,0 +1 @@
{"version":3,"file":"range.js","sourceRoot":"","sources":["../../../src/text/range.ts"],"names":[],"mappings":";;;AAeA;IACE,eAAmB,KAAa,EAAS,GAAW;QAAjC,UAAK,GAAL,KAAK,CAAQ;QAAS,QAAG,GAAH,GAAG,CAAQ;QAClD,IAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,CAAC,EAAE;YAC7B,MAAM,IAAI,KAAK,CAAC,eAAe,CAAC,CAAA;SACjC;IACH,CAAC;IAED,wBAAQ,GAAR,UAAS,GAAW;QAClB,OAAO,IAAI,CAAC,KAAK,IAAI,GAAG,IAAI,IAAI,CAAC,GAAG,IAAI,GAAG,CAAA;IAC7C,CAAC;IAED,6BAAa,GAAb,UAAc,KAAa;QACzB,OAAO,IAAI,CAAC,KAAK,IAAI,KAAK,CAAC,KAAK,IAAI,IAAI,CAAC,GAAG,IAAI,KAAK,CAAC,GAAG,CAAA;IAC3D,CAAC;IAED,kCAAkB,GAAlB,UAAmB,KAAa;QAC9B,OAAO,KAAK,CAAC,aAAa,CAAC,IAAI,CAAC,CAAA;IAClC,CAAC;IAED,qCAAqB,GAArB,UAAsB,KAAa;QACjC,OAAO,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC,KAAK,IAAI,IAAI,CAAC,GAAG,GAAG,KAAK,CAAC,GAAG,CAAA;IACzD,CAAC;IAED,0CAA0B,GAA1B,UAA2B,KAAa;QACtC,OAAO,KAAK,CAAC,qBAAqB,CAAC,IAAI,CAAC,CAAA;IAC1C,CAAC;IACH,YAAC;AAAD,CAAC,AA1BD,IA0BC;AA1BY,sBAAK;AA4BlB,SAAgB,YAAY,CAAC,KAAa,EAAE,GAAW;IACrD,OAAO,CAAC,CAAC,KAAK,GAAG,CAAC,IAAI,GAAG,GAAG,KAAK,CAAC,CAAA;AACpC,CAAC;AAFD,oCAEC"}


@@ -0,0 +1,8 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.VERSION = void 0;
// needs a separate module as this is required inside chevrotain productive code
// and also in the entry point for webpack(api.ts).
// A separate file avoids cyclic dependencies and webpack errors.
exports.VERSION = "10.5.0";
//# sourceMappingURL=version.js.map


@@ -0,0 +1 @@
{"version":3,"file":"version.js","sourceRoot":"","sources":["../../src/version.ts"],"names":[],"mappings":";;;AAAA,gFAAgF;AAChF,mDAAmD;AACnD,iEAAiE;AACpD,QAAA,OAAO,GAAG,QAAQ,CAAA"}


@@ -0,0 +1,42 @@
import mod from "../lib/src/api.js";
export default mod;
export const Alternation = mod.Alternation;
export const Alternative = mod.Alternative;
export const CstParser = mod.CstParser;
export const EMPTY_ALT = mod.EMPTY_ALT;
export const EOF = mod.EOF;
export const EarlyExitException = mod.EarlyExitException;
export const EmbeddedActionsParser = mod.EmbeddedActionsParser;
export const GAstVisitor = mod.GAstVisitor;
export const LLkLookaheadStrategy = mod.LLkLookaheadStrategy;
export const Lexer = mod.Lexer;
export const LexerDefinitionErrorType = mod.LexerDefinitionErrorType;
export const MismatchedTokenException = mod.MismatchedTokenException;
export const NoViableAltException = mod.NoViableAltException;
export const NonTerminal = mod.NonTerminal;
export const NotAllInputParsedException = mod.NotAllInputParsedException;
export const Option = mod.Option;
export const Parser = mod.Parser;
export const ParserDefinitionErrorType = mod.ParserDefinitionErrorType;
export const Repetition = mod.Repetition;
export const RepetitionMandatory = mod.RepetitionMandatory;
export const RepetitionMandatoryWithSeparator = mod.RepetitionMandatoryWithSeparator;
export const RepetitionWithSeparator = mod.RepetitionWithSeparator;
export const Rule = mod.Rule;
export const Terminal = mod.Terminal;
export const VERSION = mod.VERSION;
export const clearCache = mod.clearCache;
export const createSyntaxDiagramsCode = mod.createSyntaxDiagramsCode;
export const createToken = mod.createToken;
export const createTokenInstance = mod.createTokenInstance;
export const defaultLexerErrorProvider = mod.defaultLexerErrorProvider;
export const defaultParserErrorProvider = mod.defaultParserErrorProvider;
export const generateCstDts = mod.generateCstDts;
export const getLookaheadPaths = mod.getLookaheadPaths;
export const isRecognitionException = mod.isRecognitionException;
export const serializeGrammar = mod.serializeGrammar;
export const serializeProduction = mod.serializeProduction;
export const tokenLabel = mod.tokenLabel;
export const tokenMatcher = mod.tokenMatcher;
export const tokenName = mod.tokenName;

Some files were not shown because too many files have changed in this diff.