mirror of https://github.com/mermaid-js/mermaid.git, synced 2025-11-18 11:44:07 +01:00

Commit: Chevrotain POC

instructions.md (new file, 221 lines)

@@ -0,0 +1,221 @@
# Jison to Chevrotain Parser Conversion Instructions

## Overview

This guide provides step-by-step instructions for converting a Jison-based parser to Chevrotain, specifically for the flowchart parser located at `src/diagrams/flowchart/parser/flow.jison`.

## Critical Requirements

- **Multi-mode lexing is MANDATORY** - This is crucial for mirroring Jison's lexical states
- Preserve the existing parser structure to maintain compatibility
- All original test cases must be included in the converted test suite
- Minimize changes to test implementation

## Understanding Jison States

The Jison parser uses multiple lexical states defined with `%x`:

- string, md_string, acc_title, acc_descr, acc_descr_multiline
- dir, vertex, text, ellipseText, trapText, edgeText
- thickEdgeText, dottedEdgeText, click, href, callbackname
- callbackargs, shapeData, shapeDataStr, shapeDataEndBracket

### State Management in Jison

- `this.pushState(stateName)` or `this.begin(stateName)` - Enter a new state
- `this.popState()` - Return to the previous state
- States operate as a stack (LIFO - Last In, First Out)
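Since Chevrotain's lexer modes must reproduce this stack discipline exactly, it is worth being precise about the LIFO behavior. The following is a minimal plain-JavaScript sketch of the semantics (an illustration only, not the Jison or Chevrotain API):

```javascript
// Minimal illustration of Jison's lexical state stack (LIFO).
class StateStack {
  constructor(initial = "INITIAL") {
    this.stack = [initial];
  }
  pushState(name) {
    // mirrors this.pushState(name) / this.begin(name) in Jison actions
    this.stack.push(name);
  }
  popState() {
    // mirrors this.popState(): return to the previously active state
    return this.stack.pop();
  }
  get current() {
    return this.stack[this.stack.length - 1];
  }
}

const s = new StateStack();
s.pushState("string");
s.pushState("md_string");
console.log(s.current); // "md_string"
s.popState();
console.log(s.current); // "string" - the earlier state is restored
```

Nested states (e.g. a string inside edge text) work only because popping always restores the most recently suspended state, which is why Chevrotain's `push_mode`/`pop_mode` must be wired up one-to-one with the Jison transitions.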
## Conversion Process

### Phase 1: Analysis

1. **Study the Jison file thoroughly**
   - Map all lexical states and their purposes
   - Document which tokens are available in each state
   - Note all state transitions (when states are entered/exited)
   - Identify semantic actions and their data transformations

2. **Create a state transition diagram**
   - Document which tokens trigger state changes
   - Map the relationships between states
   - Identify any nested state scenarios

### Phase 2: Lexer Implementation

1. **Set up the Chevrotain multi-mode lexer structure**
   - Create a mode for each Jison state
   - Define a default mode corresponding to Jison's INITIAL state
   - Ensure mode names match Jison state names for clarity

2. **Convert token definitions**
   - For each Jison token rule, create an equivalent Chevrotain token
   - Pay special attention to tokens that trigger state changes
   - Preserve token precedence and ordering from Jison

3. **Implement state transitions**
   - Tokens that call `pushState` should use Chevrotain's `push_mode`
   - Tokens that call `popState` should use Chevrotain's `pop_mode`
   - Maintain the stack-based behavior of Jison states

### Phase 3: Parser Implementation

1. **Convert grammar rules**
   - Translate each Jison grammar rule to Chevrotain's format
   - Preserve the rule hierarchy and structure
   - Maintain the same rule names where possible

2. **Handle semantic actions**
   - Convert Jison's semantic actions to Chevrotain's visitor pattern
   - Ensure data structures remain compatible
   - Preserve any side effects or state mutations
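The shape of that conversion can be pictured without pulling in Chevrotain itself. The sketch below is a plain-JavaScript analogue using a hand-built CST node and visitor; in a real conversion the visitor base class would come from the parser (Chevrotain exposes `getBaseCstVisitorConstructor()`), and the node/field names here are hypothetical:

```javascript
// A Jison action like `$$ = { start: $1, end: $3 }` reads positional
// symbols; a CST visitor instead reads *named* children from the node.
const cst = {
  name: "edge",
  children: {
    start: [{ image: "A" }],
    arrow: [{ image: "-->" }],
    end: [{ image: "B" }],
  },
};

class EdgeVisitor {
  // One method per grammar rule; ctx holds the named children.
  edge(ctx) {
    return {
      start: ctx.start[0].image,
      end: ctx.end[0].image,
      type: "arrow_point",
    };
  }
  visit(node) {
    return this[node.name](node.children);
  }
}

const result = new EdgeVisitor().visit(cst);
console.log(result); // { start: "A", end: "B", type: "arrow_point" }
```

The practical consequence: the data structures handed to `FlowDB` (vertices, edges) can stay byte-for-byte compatible, only the place where they are assembled moves from inline grammar actions into visitor methods.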
### Phase 4: Testing Strategy

1. **Test file naming convention**
   - Original: `*.spec.js`
   - Converted: `*-chev.spec.ts`
   - Keep test files in the same directory: `src/diagrams/flowchart/parser/`

2. **Test conversion approach**
   - Copy each original test file
   - Rename it with the `-chev.spec.ts` suffix
   - Modify only the import statements and parser initialization
   - Keep test cases and assertions unchanged
   - Run tests individually: `vitest packages/mermaid/src/diagrams/flowchart/parser/flow-chev.spec.ts --run`

3. **Validation checklist**
   - All original test cases must pass
   - Test coverage should match the original
   - Performance should be comparable or better

### Phase 5: Integration

1. **API compatibility**
   - Ensure the new parser exposes the same public interface
   - Return values should match the original parser
   - Error messages should be equivalent

2. **Gradual migration**
   - Create a feature flag to switch between parsers
   - Allow parallel testing of both implementations
   - Monitor for any behavioral differences
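Such a flag can be sketched as a simple selector in front of the two implementations (module shapes and the flag name here are hypothetical, not mermaid's actual config API):

```javascript
// Hypothetical stand-ins for the two parser modules.
const jisonParser = { parse: (src) => ({ engine: "jison", src }) };
const chevrotainParser = { parse: (src) => ({ engine: "chevrotain", src }) };

// Pick the implementation behind a feature flag so both can run in
// parallel and be compared on the same input during migration.
function getParser(flags = {}) {
  return flags.useChevrotainParser ? chevrotainParser : jisonParser;
}

console.log(getParser({ useChevrotainParser: true }).parse("graph TD;A-->B;").engine); // "chevrotain"
console.log(getParser().parse("graph TD;A-->B;").engine); // "jison"
```

Keeping the flag default on the Jison side means downstream behavior is unchanged until the Chevrotain parser has passed the full suite.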
## Common Pitfalls to Avoid

1. **State management differences**
   - Chevrotain's modes are more rigid than Jison's states
   - Ensure proper mode-stack behavior is maintained
   - Test deeply nested state scenarios

2. **Token precedence**
   - Chevrotain's token ordering matters more than in Jison
   - Longer patterns should generally come before shorter ones
   - Test edge cases with ambiguous inputs
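The prefix-matching hazard is easy to reproduce with a toy first-match tokenizer (illustrative plain JavaScript, not Chevrotain's lexer): when a shorter pattern precedes a longer one that shares its prefix, the longer token can never match.

```javascript
// Naive first-match tokenizer: tries patterns strictly in order.
function tokenize(input, tokenDefs) {
  const tokens = [];
  let pos = 0;
  while (pos < input.length) {
    const rest = input.slice(pos);
    const match = tokenDefs
      .map((t) => ({ name: t.name, m: t.pattern.exec(rest) }))
      .find((r) => r.m !== null);
    if (!match) throw new Error(`No token at position ${pos}`);
    tokens.push(match.name);
    pos += match.m[0].length;
  }
  return tokens;
}

const Arrow = { name: "Arrow", pattern: /^-->/ };
const Dash = { name: "Dash", pattern: /^-/ };
const Gt = { name: "Gt", pattern: /^>/ };
const Id = { name: "Id", pattern: /^[A-Za-z]+/ };

// Longer pattern first: "A-->B" lexes as expected.
console.log(tokenize("A-->B", [Arrow, Dash, Gt, Id])); // ["Id", "Arrow", "Id"]
// Shorter pattern first: the arrow degrades into dashes.
console.log(tokenize("A-->B", [Dash, Arrow, Gt, Id])); // ["Id", "Dash", "Dash", "Gt", "Id"]
```

This is exactly the class of bug to look for when an arrow like `-.->` or `==>` silently parses as a sequence of smaller tokens after the conversion.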
3. **Semantic action timing**
   - Chevrotain processes semantic actions differently
   - Ensure actions execute at the correct parse phase
   - Validate that data flows correctly through the parse tree

## Success Criteria

- All original tests pass with the new parser
- No changes required to downstream code
- Performance is equal or better
- Parser behavior is identical for all valid inputs
- Error handling remains consistent
# Reference: How Chevrotain Handles Multi-Mode Lexing

## Summary: Using Multi-Mode Lexing in Chevrotain

Chevrotain supports *multi-mode lexing*, allowing you to define different sets of tokenization rules (modes) that the lexer can switch between based on context. This is essential for parsing languages with embedded or context-sensitive syntax, such as HTML or templating languages[3][2].

**Key Concepts:**

- **Modes:** Each mode is an array of token types (constructors) defining the valid tokens in that context.
- **Mode Stack:** The lexer maintains a stack of modes. Only the top (current) mode's tokens are active at any time[2].
- **Switching Modes:**
  - Use `push_mode` on a token to switch to a new mode after matching that token.
  - Use `pop_mode` on a token to return to the previous mode.
**Implementation Steps:**

1. **Define Tokens with Mode Switching:**
   - Tokens can specify `push_mode` or `pop_mode` to control mode transitions.

   ```javascript
   const EnterLetters = createToken({ name: "EnterLetters", pattern: /LETTERS/, push_mode: "letter_mode" });
   const ExitLetters = createToken({ name: "ExitLetters", pattern: /EXIT_LETTERS/, pop_mode: true });
   ```

2. **Create the Multi-Mode Lexer Definition:**
   - Structure your modes as an object mapping mode names to arrays of token constructors.

   ```javascript
   const multiModeLexerDefinition = {
     modes: {
       numbers_mode: [One, Two, EnterLetters, ExitNumbers, Whitespace],
       letter_mode: [Alpha, Beta, ExitLetters, Whitespace],
     },
     defaultMode: "numbers_mode",
   };
   ```

3. **Instantiate the Lexer:**
   - Pass the multi-mode definition to the Chevrotain `Lexer` constructor.

   ```javascript
   const MultiModeLexer = new Lexer(multiModeLexerDefinition);
   ```

4. **Tokenize Input:**
   - The lexer will automatically switch modes as it encounters tokens with `push_mode` or `pop_mode`.

   ```javascript
   const lexResult = MultiModeLexer.tokenize(input);
   ```
5. **Parser Integration:**
   - When constructing the parser, provide a flat array of all token constructors used in all modes, as the parser does not natively accept the multi-mode structure[1].

   ```javascript
   // Flatten all tokens from all modes into the parser's token vocabulary
   let tokenCtors = [];
   for (const mode in multiModeLexerDefinition.modes) {
     tokenCtors = tokenCtors.concat(multiModeLexerDefinition.modes[mode]);
   }

   class MultiModeParser extends CstParser {
     constructor() {
       // The constructor takes the token vocabulary; the token stream is
       // supplied later via `parser.input = lexResult.tokens`.
       super(tokenCtors);
     }
   }
   ```
**Best Practices:**

- Place more specific tokens before more general ones to avoid prefix-matching issues[2].
- Use the mode stack judiciously to manage nested or recursive language constructs.

**References:**

- Chevrotain documentation on [lexer modes][3]
- Example code and integration notes from Chevrotain issues and docs[1][2]

This approach enables robust, context-sensitive lexing for complex language grammars in Chevrotain.

[1] https://github.com/chevrotain/chevrotain/issues/395
[2] https://chevrotain.io/documentation/0_7_2/classes/lexer.html
[3] https://chevrotain.io/docs/features/lexer_modes.html
[4] https://github.com/SAP/chevrotain/issues/370
[5] https://galaxy.ai/youtube-summarizer/understanding-lexers-parsers-and-interpreters-with-chevrotain-l-jMsoAY64k
[6] https://chevrotain.io/documentation/8_0_1/classes/lexer.html
[7] https://fastly.jsdelivr.net/npm/chevrotain@11.0.3/src/scan/lexer.ts
[8] https://chevrotain.io/docs/guide/resolving_lexer_errors.html
[9] https://www.youtube.com/watch?v=l-jMsoAY64k
[10] https://github.com/SAP/chevrotain/blob/master/packages/chevrotain/test/scan/lexer_spec.ts
**Important**

Always assume I want the exact code edit!
Always assume I want you to apply these fixes directly!

# Running tests

Run tests in one file from the project root using this command:

`vitest #filename-relative-to-project-root# --run`

Example:

`vitest packages/mermaid/src/diagrams/flowchart/parser/flow-chev.spec.ts --run`

To run all flowchart tests for the migration:

`vitest packages/mermaid/src/diagrams/flowchart/parser/*flow*-chev.spec.ts --run`

To run a specific test in a test file:

`vitest #filename-relative-to-project-root# -t "string-matching-test" --run`

Example:

`vitest packages/mermaid/src/diagrams/flowchart/parser/flow-chev-singlenode.spec.js -t "diamond node with html in it (SN3)" --run`
@@ -71,6 +71,7 @@
     "@iconify/utils": "^2.1.33",
     "@mermaid-js/parser": "workspace:^",
     "@types/d3": "^7.4.3",
     "chevrotain": "^11.0.3",
     "cytoscape": "^3.29.3",
     "cytoscape-cose-bilkent": "^4.1.0",
     "cytoscape-fcose": "^2.2.0",
@@ -2,9 +2,9 @@ import type { MermaidConfig } from '../../config.type.js';
 import { setConfig } from '../../diagram-api/diagramAPI.js';
 import { FlowDB } from './flowDb.js';
 import renderer from './flowRenderer-v3-unified.js';
-// @ts-ignore: JISON doesn't support types
-//import flowParser from './parser/flow.jison';
-import flowParser from './parser/flowParser.ts';
+// Replace the Jison import with Chevrotain parser
+// import flowParser from './parser/flow.jison';
+import flowParser from './parser/flowParserAdapter.js';
 import flowStyles from './styles.js';

 export const diagram = {
@@ -0,0 +1,27 @@
import type { MermaidConfig } from '../../config.type.js';
import { setConfig } from '../../diagram-api/diagramAPI.js';
import { FlowDB } from './flowDb.js';
import renderer from './flowRenderer-v3-unified.js';
// @ts-ignore: JISON doesn't support types
//import flowParser from './parser/flow.jison';
import flowParser from './parser/flowParser.ts';
import flowStyles from './styles.js';

export const diagram = {
  parser: flowParser,
  get db() {
    return new FlowDB();
  },
  renderer,
  styles: flowStyles,
  init: (cnf: MermaidConfig) => {
    if (!cnf.flowchart) {
      cnf.flowchart = {};
    }
    if (cnf.layout) {
      setConfig({ layout: cnf.layout });
    }
    cnf.flowchart.arrowMarkerAbsolute = cnf.arrowMarkerAbsolute;
    setConfig({ flowchart: { arrowMarkerAbsolute: cnf.arrowMarkerAbsolute } });
  },
};
@@ -0,0 +1,244 @@
import { FlowDB } from '../flowDb.js';
import flow from './flowParserAdapter.js';
import { setConfig } from '../../../config.js';

setConfig({
  securityLevel: 'strict',
});

describe('[Chevrotain Arrows] when parsing', () => {
  beforeEach(function () {
    flow.yy = new FlowDB();
    flow.yy.clear();
  });

  it('should handle basic arrow', function () {
    const res = flow.parse('graph TD;A-->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_point');
  });

  it('should handle arrow with text', function () {
    const res = flow.parse('graph TD;A-->|text|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
  });

  it('should handle dotted arrow', function () {
    const res = flow.parse('graph TD;A-.->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_dotted');
  });

  it('should handle dotted arrow with text', function () {
    const res = flow.parse('graph TD;A-.-|text|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
    expect(edges[0].type).toBe('arrow_dotted');
  });

  it('should handle thick arrow', function () {
    const res = flow.parse('graph TD;A==>B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_thick');
  });

  it('should handle thick arrow with text', function () {
    const res = flow.parse('graph TD;A==|text|==>B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
    expect(edges[0].type).toBe('arrow_thick');
  });

  it('should handle open arrow', function () {
    const res = flow.parse('graph TD;A---B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_open');
  });

  it('should handle open arrow with text', function () {
    const res = flow.parse('graph TD;A---|text|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
    expect(edges[0].type).toBe('arrow_open');
  });

  it('should handle cross arrow', function () {
    const res = flow.parse('graph TD;A--xB;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_cross');
  });

  it('should handle circle arrow', function () {
    const res = flow.parse('graph TD;A--oB;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_circle');
  });

  it('should handle bidirectional arrow', function () {
    const res = flow.parse('graph TD;A<-->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('double_arrow_point');
  });

  it('should handle bidirectional arrow with text', function () {
    const res = flow.parse('graph TD;A<--|text|-->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
    expect(edges[0].type).toBe('double_arrow_point');
  });

  it('should handle multiple arrows in sequence', function () {
    const res = flow.parse('graph TD;A-->B-->C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
  });

  it('should handle multiple arrows with different types', function () {
    const res = flow.parse('graph TD;A-->B-.->C==>D;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(3);
    expect(edges[0].type).toBe('arrow_point');
    expect(edges[1].type).toBe('arrow_dotted');
    expect(edges[2].type).toBe('arrow_thick');
  });

  it('should handle long arrows', function () {
    const res = flow.parse('graph TD;A---->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_point');
    expect(edges[0].length).toBe('long');
  });

  it('should handle extra long arrows', function () {
    const res = flow.parse('graph TD;A------>B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_point');
    expect(edges[0].length).toBe('extralong');
  });
});
@@ -0,0 +1,240 @@
import { FlowDB } from '../flowDb.js';
import flow from './flowParserAdapter.js';
import { setConfig } from '../../../config.js';

setConfig({
  securityLevel: 'strict',
});

describe('[Chevrotain Edges] when parsing', () => {
  beforeEach(function () {
    flow.yy = new FlowDB();
    flow.yy.clear();
  });

  it('should handle a single edge', function () {
    const res = flow.parse('graph TD;A-->B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
  });

  it('should handle multiple edges', function () {
    const res = flow.parse('graph TD;A-->B;B-->C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
  });

  it('should handle chained edges', function () {
    const res = flow.parse('graph TD;A-->B-->C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
  });

  it('should handle edges with text', function () {
    const res = flow.parse('graph TD;A-->|text|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text');
  });

  it('should handle edges with quoted text', function () {
    const res = flow.parse('graph TD;A-->|"quoted text"|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('quoted text');
  });

  it('should handle edges with complex text', function () {
    const res = flow.parse('graph TD;A-->|"text with spaces and symbols!"|B;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].text).toBe('text with spaces and symbols!');
  });

  it('should handle multiple edges from one node', function () {
    const res = flow.parse('graph TD;A-->B;A-->C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[1].start).toBe('A');
    expect(edges[1].end).toBe('C');
  });

  it('should handle multiple edges to one node', function () {
    const res = flow.parse('graph TD;A-->C;B-->C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('C');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
  });

  it('should handle edges with node shapes', function () {
    const res = flow.parse('graph TD;A[Start]-->B{Decision};');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('A').type).toBe('square');
    expect(vert.get('A').text).toBe('Start');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('B').type).toBe('diamond');
    expect(vert.get('B').text).toBe('Decision');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
  });

  it('should handle complex edge patterns', function () {
    const res = flow.parse('graph TD;A[Start]-->B{Decision};B-->|Yes|C[Process];B-->|No|D[End];');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(3);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
    expect(edges[1].text).toBe('Yes');
    expect(edges[2].start).toBe('B');
    expect(edges[2].end).toBe('D');
    expect(edges[2].text).toBe('No');
  });

  it('should handle edges with ampersand syntax', function () {
    const res = flow.parse('graph TD;A & B --> C;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('C');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('C');
  });

  it('should handle edges with multiple ampersands', function () {
    const res = flow.parse('graph TD;A & B & C --> D;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(vert.get('C').id).toBe('C');
    expect(vert.get('D').id).toBe('D');
    expect(edges.length).toBe(3);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('D');
    expect(edges[1].start).toBe('B');
    expect(edges[1].end).toBe('D');
    expect(edges[2].start).toBe('C');
    expect(edges[2].end).toBe('D');
  });

  it('should handle self-referencing edges', function () {
    const res = flow.parse('graph TD;A-->A;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('A');
  });

  it('should handle edges with numeric node IDs', function () {
    const res = flow.parse('graph TD;1-->2;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('1').id).toBe('1');
    expect(vert.get('2').id).toBe('2');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('1');
    expect(edges[0].end).toBe('2');
  });

  it('should handle edges with mixed alphanumeric node IDs', function () {
    const res = flow.parse('graph TD;A1-->B2;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A1').id).toBe('A1');
    expect(vert.get('B2').id).toBe('B2');
    expect(edges.length).toBe(1);
    expect(edges[0].start).toBe('A1');
    expect(edges[0].end).toBe('B2');
  });
});
@@ -0,0 +1,362 @@
import { FlowDB } from '../flowDb.js';
import flow from './flowParserAdapter.js';
import { setConfig } from '../../../config.js';

setConfig({
  securityLevel: 'strict',
});

const keywords = [
  'graph',
  'flowchart',
  'flowchart-elk',
  'style',
  'default',
  'linkStyle',
  'interpolate',
  'classDef',
  'class',
  'href',
  'call',
  'click',
  '_self',
  '_blank',
  '_parent',
  '_top',
  'end',
  'subgraph',
];

const specialChars = ['#', ':', '0', '&', ',', '*', '.', '\\', 'v', '-', '/', '_'];

describe('[Chevrotain Singlenodes] when parsing', () => {
  beforeEach(function () {
    flow.yy = new FlowDB();
    flow.yy.clear();
  });

  it('should handle a single node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;A;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('A').styles.length).toBe(0);
  });

  it('should handle a single node with white space after it (SN1)', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;A ;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('A').styles.length).toBe(0);
  });

  it('should handle a single square node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a[A];');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').styles.length).toBe(0);
    expect(vert.get('a').type).toBe('square');
  });

  it('should handle a single round square node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a[A];');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').styles.length).toBe(0);
    expect(vert.get('a').type).toBe('square');
  });

  it('should handle a single circle node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a((A));');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('circle');
  });

  it('should handle a single round node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a(A);');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('round');
  });

  it('should handle a single diamond node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a{A};');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('diamond');
  });

  it('should handle a single diamond node with whitespace after it', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a{A} ;');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('diamond');
  });

  it('should handle a single diamond node with html in it (SN3)', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a{A <br> end};');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('diamond');
    expect(vert.get('a').text).toBe('A <br> end');
  });

  it('should handle a single hexagon node', function () {
    // Silly but syntactically correct
    const res = flow.parse('graph TD;a{{A}};');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(edges.length).toBe(0);
    expect(vert.get('a').type).toBe('hexagon');
  });

  it('should handle a single hexagon node with html in it', function () {
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;a{{A <br> end}};');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('a').type).toBe('hexagon');
|
||||
expect(vert.get('a').text).toBe('A <br> end');
|
||||
});
|
||||
|
||||
it('should handle a single round node with html in it', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;a(A <br> end);');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('a').type).toBe('round');
|
||||
expect(vert.get('a').text).toBe('A <br> end');
|
||||
});
|
||||
|
||||
it('should handle a single double circle node', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;a(((A)));');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('a').type).toBe('doublecircle');
|
||||
});
|
||||
|
||||
it('should handle a single double circle node with whitespace after it', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;a(((A))) ;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('a').type).toBe('doublecircle');
|
||||
});
|
||||
|
||||
it('should handle a single double circle node with html in it (SN3)', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;a(((A <br> end)));');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('a').type).toBe('doublecircle');
|
||||
expect(vert.get('a').text).toBe('A <br> end');
|
||||
});
|
||||
|
||||
it('should handle a single node with alphanumerics starting on a char', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;id1;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('id1').styles.length).toBe(0);
|
||||
});
|
||||
|
||||
it('should handle a single node with a single digit', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;1;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('1').text).toBe('1');
|
||||
});
|
||||
|
||||
it('should handle a single node with a single digit in a subgraph', function () {
|
||||
// Silly but syntactically correct
|
||||
|
||||
const res = flow.parse('graph TD;subgraph "hello";1;end;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('1').text).toBe('1');
|
||||
});
|
||||
|
||||
it('should handle a single node with alphanumerics starting on a num', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;1id;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('1id').styles.length).toBe(0);
|
||||
});
|
||||
|
||||
it('should handle a single node with alphanumerics containing a minus sign', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;i-d;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('i-d').styles.length).toBe(0);
|
||||
});
|
||||
|
||||
it('should handle a single node with alphanumerics containing a underscore sign', function () {
|
||||
// Silly but syntactically correct
|
||||
const res = flow.parse('graph TD;i_d;');
|
||||
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.length).toBe(0);
|
||||
expect(vert.get('i_d').styles.length).toBe(0);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle keywords between dashes "-"', function (keyword) {
|
||||
const res = flow.parse(`graph TD;a-${keyword}-node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a-${keyword}-node`).text).toBe(`a-${keyword}-node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle keywords between periods "."', function (keyword) {
|
||||
const res = flow.parse(`graph TD;a.${keyword}.node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a.${keyword}.node`).text).toBe(`a.${keyword}.node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle keywords between underscores "_"', function (keyword) {
|
||||
const res = flow.parse(`graph TD;a_${keyword}_node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a_${keyword}_node`).text).toBe(`a_${keyword}_node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle nodes ending in %s', function (keyword) {
|
||||
const res = flow.parse(`graph TD;node_${keyword};node.${keyword};node-${keyword};`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`node_${keyword}`).text).toBe(`node_${keyword}`);
|
||||
expect(vert.get(`node.${keyword}`).text).toBe(`node.${keyword}`);
|
||||
expect(vert.get(`node-${keyword}`).text).toBe(`node-${keyword}`);
|
||||
});
|
||||
|
||||
const errorKeywords = [
|
||||
'graph',
|
||||
'flowchart',
|
||||
'flowchart-elk',
|
||||
'style',
|
||||
'linkStyle',
|
||||
'interpolate',
|
||||
'classDef',
|
||||
'class',
|
||||
'_self',
|
||||
'_blank',
|
||||
'_parent',
|
||||
'_top',
|
||||
'end',
|
||||
'subgraph',
|
||||
];
|
||||
|
||||
it.each(errorKeywords)('should throw error at nodes beginning with %s', function (keyword) {
|
||||
const str = `graph TD;${keyword}.node;${keyword}-node;${keyword}/node`;
|
||||
const vert = flow.yy.getVertices();
|
||||
|
||||
expect(() => flow.parse(str)).toThrowError();
|
||||
});
|
||||
|
||||
const workingKeywords = ['default', 'href', 'click', 'call'];
|
||||
|
||||
it.each(workingKeywords)('should parse node beginning with %s', function (keyword) {
|
||||
flow.parse(`graph TD; ${keyword}.node;${keyword}-node;${keyword}/node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`${keyword}.node`).text).toBe(`${keyword}.node`);
|
||||
expect(vert.get(`${keyword}-node`).text).toBe(`${keyword}-node`);
|
||||
expect(vert.get(`${keyword}/node`).text).toBe(`${keyword}/node`);
|
||||
});
|
||||
|
||||
it.each(specialChars)(
|
||||
'should allow node ids of single special characters',
|
||||
function (specialChar) {
|
||||
flow.parse(`graph TD; ${specialChar} --> A`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`${specialChar}`).text).toBe(`${specialChar}`);
|
||||
}
|
||||
);
|
||||
|
||||
it.each(specialChars)(
|
||||
'should allow node ids with special characters at start of id',
|
||||
function (specialChar) {
|
||||
flow.parse(`graph TD; ${specialChar}node --> A`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`${specialChar}node`).text).toBe(`${specialChar}node`);
|
||||
}
|
||||
);
|
||||
|
||||
it.each(specialChars)(
|
||||
'should allow node ids with special characters at end of id',
|
||||
function (specialChar) {
|
||||
flow.parse(`graph TD; node${specialChar} --> A`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`node${specialChar}`).text).toBe(`node${specialChar}`);
|
||||
}
|
||||
);
|
||||
});
|
||||
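The shape tests above all follow the same pattern: a pair of delimiters around the label selects the vertex type (`[A]` → square, `((A))` → circle, `(((A)))` → doublecircle, and so on). A minimal sketch of that delimiter-to-type mapping, using a hypothetical helper rather than the actual Chevrotain grammar rules:

```javascript
// Hypothetical lookup table mirroring the delimiter/shape pairs exercised
// by the tests above; the real mapping lives in the parser grammar.
const shapeByDelimiter = [
  { open: '(((', close: ')))', type: 'doublecircle' },
  { open: '((', close: '))', type: 'circle' },
  { open: '{{', close: '}}', type: 'hexagon' },
  { open: '[', close: ']', type: 'square' },
  { open: '(', close: ')', type: 'round' },
  { open: '{', close: '}', type: 'diamond' },
];

function classifyNode(src) {
  // Longest delimiters are checked first so '(((' wins over '((' and '('.
  for (const { open, close, type } of shapeByDelimiter) {
    if (src.startsWith(open) && src.endsWith(close)) {
      return { type, text: src.slice(open.length, src.length - close.length) };
    }
  }
  return { type: 'default', text: src };
}
```

Ordering the table from longest to shortest delimiter is the same disambiguation the lexer has to perform, which is one reason multi-mode lexing matters here.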
230 packages/mermaid/src/diagrams/flowchart/parser/flow-chev.spec.js Normal file
@@ -0,0 +1,230 @@
import { FlowDB } from '../flowDb.js';
import flow from './flowParserAdapter.js';
import { cleanupComments } from '../../../diagram-api/comments.js';
import { setConfig } from '../../../config.js';

setConfig({
  securityLevel: 'strict',
});

describe('parsing a flow chart with Chevrotain', function () {
  beforeEach(function () {
    flow.yy = new FlowDB();
    flow.yy.clear();
  });

  it('should handle a trailing whitespaces after statements', function () {
    const res = flow.parse(cleanupComments('graph TD;\n\n\n %% Comment\n A-->B; \n B-->C;'));

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('A').id).toBe('A');
    expect(vert.get('B').id).toBe('B');
    expect(edges.length).toBe(2);
    expect(edges[0].start).toBe('A');
    expect(edges[0].end).toBe('B');
    expect(edges[0].type).toBe('arrow_point');
    expect(edges[0].text).toBe('');
  });

  it('should handle node names with "end" substring', function () {
    const res = flow.parse('graph TD\nendpoint --> sender');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('endpoint').id).toBe('endpoint');
    expect(vert.get('sender').id).toBe('sender');
    expect(edges[0].start).toBe('endpoint');
    expect(edges[0].end).toBe('sender');
  });

  it('should handle node names ending with keywords', function () {
    const res = flow.parse('graph TD\nblend --> monograph');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('blend').id).toBe('blend');
    expect(vert.get('monograph').id).toBe('monograph');
    expect(edges[0].start).toBe('blend');
    expect(edges[0].end).toBe('monograph');
  });

  it('should allow default in the node name/id', function () {
    const res = flow.parse('graph TD\ndefault --> monograph');

    const vert = flow.yy.getVertices();
    const edges = flow.yy.getEdges();

    expect(vert.get('default').id).toBe('default');
    expect(vert.get('monograph').id).toBe('monograph');
    expect(edges[0].start).toBe('default');
    expect(edges[0].end).toBe('monograph');
  });

  describe('special characters should be handled.', function () {
    const charTest = function (char, result) {
      const res = flow.parse('graph TD;A(' + char + ')-->B;');

      const vert = flow.yy.getVertices();
      const edges = flow.yy.getEdges();

      expect(vert.get('A').id).toBe('A');
      expect(vert.get('B').id).toBe('B');
      if (result) {
        expect(vert.get('A').text).toBe(result);
      } else {
        expect(vert.get('A').text).toBe(char);
      }
      flow.yy.clear();
    };

    it("should be able to parse a '.'", function () {
      charTest('.');
      charTest('Start 103a.a1');
    });

    it("should be able to parse a ':'", function () {
      charTest(':');
    });

    it("should be able to parse a ','", function () {
      charTest(',');
    });

    it("should be able to parse text containing '-'", function () {
      charTest('a-b');
    });

    it("should be able to parse a '+'", function () {
      charTest('+');
    });

    it("should be able to parse a '*'", function () {
      charTest('*');
    });

    it("should be able to parse a '<'", function () {
      charTest('<', '&lt;');
    });

    it("should be able to parse a '&'", function () {
      charTest('&');
    });
  });

  it('should be possible to use direction in node ids', function () {
    let statement = '';

    statement = statement + 'graph TD;' + '\n';
    statement = statement + ' node1TB\n';

    const res = flow.parse(statement);
    const vertices = flow.yy.getVertices();
    const classes = flow.yy.getClasses();
    expect(vertices.get('node1TB').id).toBe('node1TB');
  });

  it('should be possible to use direction in node ids', function () {
    let statement = '';

    statement = statement + 'graph TD;A--x|text including URL space|B;';
    const res = flow.parse(statement);
    const vertices = flow.yy.getVertices();
    const classes = flow.yy.getClasses();
    expect(vertices.get('A').id).toBe('A');
  });

  it('should be possible to use numbers as labels', function () {
    let statement = '';

    statement = statement + 'graph TB;subgraph "number as labels";1;end;';
    const res = flow.parse(statement);
    const vertices = flow.yy.getVertices();

    expect(vertices.get('1').id).toBe('1');
  });

  it('should add accTitle and accDescr to flow chart', function () {
    const flowChart = `graph LR
      accTitle: Big decisions
      accDescr: Flow chart of the decision making process
      A[Hard] -->|Text| B(Round)
      B --> C{Decision}
      C -->|One| D[Result 1]
      C -->|Two| E[Result 2]
      `;

    flow.parse(flowChart);
    expect(flow.yy.getAccTitle()).toBe('Big decisions');
    expect(flow.yy.getAccDescription()).toBe('Flow chart of the decision making process');
  });

  it('should add accTitle and a multi line accDescr to flow chart', function () {
    const flowChart = `graph LR
      accTitle: Big decisions

      accDescr {
        Flow chart of the decision making process
        with a second line
      }

      A[Hard] -->|Text| B(Round)
      B --> C{Decision}
      C -->|One| D[Result 1]
      C -->|Two| E[Result 2]
      `;

    flow.parse(flowChart);
    expect(flow.yy.getAccTitle()).toBe('Big decisions');
    expect(flow.yy.getAccDescription()).toBe(
      `Flow chart of the decision making process
with a second line`
    );
  });

  for (const unsafeProp of ['__proto__', 'constructor']) {
    it(`should work with node id ${unsafeProp}`, function () {
      const flowChart = `graph LR
      ${unsafeProp} --> A;`;

      expect(() => {
        flow.parse(flowChart);
      }).not.toThrow();
    });

    it(`should work with tooltip id ${unsafeProp}`, function () {
      const flowChart = `graph LR
      click ${unsafeProp} callback "${unsafeProp}";`;

      expect(() => {
        flow.parse(flowChart);
      }).not.toThrow();
    });

    it(`should work with class id ${unsafeProp}`, function () {
      const flowChart = `graph LR
      ${unsafeProp} --> A;
      classDef ${unsafeProp} color:#ffffff,fill:#000000;
      class ${unsafeProp} ${unsafeProp};`;

      expect(() => {
        flow.parse(flowChart);
      }).not.toThrow();
    });

    it(`should work with subgraph id ${unsafeProp}`, function () {
      const flowChart = `graph LR
      ${unsafeProp} --> A;
      subgraph ${unsafeProp}
        C --> D;
      end;`;

      expect(() => {
        flow.parse(flowChart);
      }).not.toThrow();
    });
  }
});
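The diffs that follow migrate existing spec files from the Jison-style `flow.parser.parse(...)` / `flow.parser.yy` API to the adapter's flat `flow.parse(...)` / `flow.yy` API. A minimal sketch of what such an adapter shape could look like (an assumed structure for illustration, not the actual `flowParserAdapter.js`):

```javascript
// Hypothetical adapter: exposes the flat API the Chevrotain tests use
// (parse, yy) on top of an inner parser implementation.
function createFlowAdapter(parserImpl) {
  const adapter = {
    yy: null, // FlowDB instance, assigned by the tests in beforeEach
    parse(text) {
      parserImpl.yy = adapter.yy; // keep the inner parser's yy in sync
      return parserImpl.parse(text);
    },
  };
  return adapter;
}
```

Keeping `yy` on the adapter itself is what lets the test files drop the `.parser` level with no other changes.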
@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';
 import { cleanupComments } from '../../../diagram-api/comments.js';

@@ -9,15 +9,15 @@ setConfig({

 describe('[Comments] when parsing', () => {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
+    flow.yy = new FlowDB();
+    flow.yy.clear();
   });

   it('should handle comments', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n%% Comment\n A-->B;'));
+    const res = flow.parse(cleanupComments('graph TD;\n%% Comment\n A-->B;'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -29,10 +29,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle comments at the start', function () {
-    const res = flow.parser.parse(cleanupComments('%% Comment\ngraph TD;\n A-->B;'));
+    const res = flow.parse(cleanupComments('%% Comment\ngraph TD;\n A-->B;'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -44,10 +44,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle comments at the end', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n A-->B\n %% Comment at the end\n'));
+    const res = flow.parse(cleanupComments('graph TD;\n A-->B\n %% Comment at the end\n'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -59,10 +59,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle comments at the end no trailing newline', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n A-->B\n%% Comment'));
+    const res = flow.parse(cleanupComments('graph TD;\n A-->B\n%% Comment'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -74,10 +74,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle comments at the end many trailing newlines', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n A-->B\n%% Comment\n\n\n'));
+    const res = flow.parse(cleanupComments('graph TD;\n A-->B\n%% Comment\n\n\n'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -89,10 +89,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle no trailing newlines', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n A-->B'));
+    const res = flow.parse(cleanupComments('graph TD;\n A-->B'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -104,10 +104,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle many trailing newlines', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n A-->B\n\n'));
+    const res = flow.parse(cleanupComments('graph TD;\n A-->B\n\n'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -119,10 +119,10 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle a comment with blank rows in-between', function () {
-    const res = flow.parser.parse(cleanupComments('graph TD;\n\n\n %% Comment\n A-->B;'));
+    const res = flow.parse(cleanupComments('graph TD;\n\n\n %% Comment\n A-->B;'));

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -134,14 +134,14 @@ describe('[Comments] when parsing', () => {
   });

   it('should handle a comment with mermaid flowchart code in them', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       cleanupComments(
         'graph TD;\n\n\n %% Test od>Odd shape]-->|Two line<br>edge comment|ro;\n A-->B;'
       )
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';

 setConfig({
@@ -8,18 +8,18 @@ setConfig({

 describe('when parsing directions', function () {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
-    flow.parser.yy.setGen('gen-2');
+    flow.yy = new FlowDB();
+    flow.yy.clear();
+    flow.yy.setGen('gen-2');
   });

   it('should use default direction from top level', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph A
       a --> b
     end`);

-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -29,13 +29,13 @@ describe('when parsing directions', function () {
     expect(subgraph.dir).toBe(undefined);
   });
   it('should handle a subgraph with a direction', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph A
       direction BT
       a --> b
     end`);

-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -45,14 +45,14 @@ describe('when parsing directions', function () {
     expect(subgraph.dir).toBe('BT');
   });
   it('should use the last defined direction', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph A
       direction BT
       a --> b
       direction RL
     end`);

-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -63,7 +63,7 @@ describe('when parsing directions', function () {
   });

   it('should handle nested subgraphs 1', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph A
       direction RL
       b-->B
@@ -75,7 +75,7 @@ describe('when parsing directions', function () {
       c
     end`);

-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(2);

     const subgraphA = subgraphs.find((o) => o.id === 'A');
@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';

 setConfig({
@@ -63,27 +63,27 @@ const regularEdges = [

 describe('[Edges] when parsing', () => {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
+    flow.yy = new FlowDB();
+    flow.yy.clear();
   });

   it('should handle open ended edges', function () {
-    const res = flow.parser.parse('graph TD;A---B;');
-    const edges = flow.parser.yy.getEdges();
+    const res = flow.parse('graph TD;A---B;');
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_open');
   });

   it('should handle cross ended edges', function () {
-    const res = flow.parser.parse('graph TD;A--xB;');
-    const edges = flow.parser.yy.getEdges();
+    const res = flow.parse('graph TD;A--xB;');
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_cross');
   });

   it('should handle open ended edges', function () {
-    const res = flow.parser.parse('graph TD;A--oB;');
-    const edges = flow.parser.yy.getEdges();
+    const res = flow.parse('graph TD;A--oB;');
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_circle');
   });
@@ -92,11 +92,9 @@ describe('[Edges] when parsing', () => {
   describe('open ended edges with ids and labels', function () {
     regularEdges.forEach((edgeType) => {
       it(`should handle ${edgeType.stroke} ${edgeType.type} with no text`, function () {
-        const res = flow.parser.parse(
-          `flowchart TD;\nA e1@${edgeType.edgeStart}${edgeType.edgeEnd} B;`
-        );
-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const res = flow.parse(`flowchart TD;\nA e1@${edgeType.edgeStart}${edgeType.edgeEnd} B;`);
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();
         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
         expect(edges.length).toBe(1);
@@ -108,11 +106,9 @@ describe('[Edges] when parsing', () => {
         expect(edges[0].stroke).toBe(`${edgeType.stroke}`);
       });
       it(`should handle ${edgeType.stroke} ${edgeType.type} with text`, function () {
-        const res = flow.parser.parse(
-          `flowchart TD;\nA e1@${edgeType.edgeStart}${edgeType.edgeEnd} B;`
-        );
-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const res = flow.parse(`flowchart TD;\nA e1@${edgeType.edgeStart}${edgeType.edgeEnd} B;`);
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();
         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
         expect(edges.length).toBe(1);
@@ -125,11 +121,11 @@ describe('[Edges] when parsing', () => {
     });
   });
   it('should handle normal edges where you also have a node with metadata', function () {
-    const res = flow.parser.parse(`flowchart LR
+    const res = flow.parse(`flowchart LR
A id1@-->B
A@{ shape: 'rect' }
`);
-    const edges = flow.parser.yy.getEdges();
+    const edges = flow.yy.getEdges();

     expect(edges[0].id).toBe('id1');
   });
@@ -137,11 +133,11 @@ A@{ shape: 'rect' }
   describe('double ended edges with ids and labels', function () {
     doubleEndedEdges.forEach((edgeType) => {
       it(`should handle ${edgeType.stroke} ${edgeType.type} with text`, function () {
-        const res = flow.parser.parse(
+        const res = flow.parse(
           `flowchart TD;\nA e1@${edgeType.edgeStart} label ${edgeType.edgeEnd} B;`
         );
-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();
         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
         expect(edges.length).toBe(1);
@@ -159,10 +155,10 @@ A@{ shape: 'rect' }
   describe('edges', function () {
     doubleEndedEdges.forEach((edgeType) => {
       it(`should handle ${edgeType.stroke} ${edgeType.type} with no text`, function () {
-        const res = flow.parser.parse(`graph TD;\nA ${edgeType.edgeStart}${edgeType.edgeEnd} B;`);
+        const res = flow.parse(`graph TD;\nA ${edgeType.edgeStart}${edgeType.edgeEnd} B;`);

-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();

         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
@@ -175,12 +171,12 @@ A@{ shape: 'rect' }
       });

       it(`should handle ${edgeType.stroke} ${edgeType.type} with text`, function () {
-        const res = flow.parser.parse(
+        const res = flow.parse(
           `graph TD;\nA ${edgeType.edgeStart} text ${edgeType.edgeEnd} B;`
         );

-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();

         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
@@ -195,12 +191,12 @@ A@{ shape: 'rect' }
       it.each(keywords)(
         `should handle ${edgeType.stroke} ${edgeType.type} with %s text`,
         function (keyword) {
-          const res = flow.parser.parse(
+          const res = flow.parse(
             `graph TD;\nA ${edgeType.edgeStart} ${keyword} ${edgeType.edgeEnd} B;`
           );

-          const vert = flow.parser.yy.getVertices();
-          const edges = flow.parser.yy.getEdges();
+          const vert = flow.yy.getVertices();
+          const edges = flow.yy.getEdges();

           expect(vert.get('A').id).toBe('A');
           expect(vert.get('B').id).toBe('B');
@@ -216,11 +212,11 @@ A@{ shape: 'rect' }
   });

   it('should handle multiple edges', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;A---|This is the 123 s text|B;\nA---|This is the second edge|B;'
     );
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -242,10 +238,10 @@ A@{ shape: 'rect' }
   describe('edge length', function () {
     for (let length = 1; length <= 3; ++length) {
       it(`should handle normal edges with length ${length}`, function () {
-        const res = flow.parser.parse(`graph TD;\nA -${'-'.repeat(length)}- B;`);
+        const res = flow.parse(`graph TD;\nA -${'-'.repeat(length)}- B;`);

-        const vert = flow.parser.yy.getVertices();
-        const edges = flow.parser.yy.getEdges();
+        const vert = flow.yy.getVertices();
+        const edges = flow.yy.getEdges();

         expect(vert.get('A').id).toBe('A');
         expect(vert.get('B').id).toBe('B');
@@ -261,10 +257,10 @@ A@{ shape: 'rect' }

     for (let length = 1; length <= 3; ++length) {
       it(`should handle normal labelled edges with length ${length}`, function () {
-        const res = flow.parser.parse(`graph TD;\nA -- Label -${'-'.repeat(length)}- B;`);
+        const res = flow.parse(`graph TD;\nA -- Label -${'-'.repeat(length)}- B;`);
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -280,10 +276,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle normal edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -${'-'.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA -${'-'.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -299,10 +295,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle normal labelled edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -- Label -${'-'.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA -- Label -${'-'.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -318,10 +314,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle normal edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <-${'-'.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <-${'-'.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -337,10 +333,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle normal labelled edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <-- Label -${'-'.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <-- Label -${'-'.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -356,10 +352,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick edges with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA =${'='.repeat(length)}= B;`);
|
||||
const res = flow.parse(`graph TD;\nA =${'='.repeat(length)}= B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -375,10 +371,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick labelled edges with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA == Label =${'='.repeat(length)}= B;`);
|
||||
const res = flow.parse(`graph TD;\nA == Label =${'='.repeat(length)}= B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -394,10 +390,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA =${'='.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA =${'='.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -413,10 +409,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick labelled edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA == Label =${'='.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA == Label =${'='.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -432,10 +428,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <=${'='.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <=${'='.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -451,10 +447,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle thick labelled edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <== Label =${'='.repeat(length)}> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <== Label =${'='.repeat(length)}> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -470,10 +466,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted edges with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -${'.'.repeat(length)}- B;`);
|
||||
const res = flow.parse(`graph TD;\nA -${'.'.repeat(length)}- B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -489,10 +485,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted labelled edges with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -. Label ${'.'.repeat(length)}- B;`);
|
||||
const res = flow.parse(`graph TD;\nA -. Label ${'.'.repeat(length)}- B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -508,10 +504,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -${'.'.repeat(length)}-> B;`);
|
||||
const res = flow.parse(`graph TD;\nA -${'.'.repeat(length)}-> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -527,10 +523,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted labelled edges with arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA -. Label ${'.'.repeat(length)}-> B;`);
|
||||
const res = flow.parse(`graph TD;\nA -. Label ${'.'.repeat(length)}-> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -546,10 +542,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <-${'.'.repeat(length)}-> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <-${'.'.repeat(length)}-> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
@@ -565,10 +561,10 @@ A@{ shape: 'rect' }
|
||||
|
||||
for (let length = 1; length <= 3; ++length) {
|
||||
it(`should handle dotted edges with double arrows with length ${length}`, function () {
|
||||
const res = flow.parser.parse(`graph TD;\nA <-. Label ${'.'.repeat(length)}-> B;`);
|
||||
const res = flow.parse(`graph TD;\nA <-. Label ${'.'.repeat(length)}-> B;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('B').id).toBe('B');
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { FlowDB } from '../flowDb.js';
|
||||
import flow from './flowParser.ts';
|
||||
import flow from './flowParserAdapter.js';
|
||||
import { setConfig } from '../../../config.js';
|
||||
import { vi } from 'vitest';
|
||||
const spyOn = vi.spyOn;
|
||||
@@ -12,26 +12,26 @@ describe('[Interactions] when parsing', () => {
|
||||
let flowDb;
|
||||
beforeEach(function () {
|
||||
flowDb = new FlowDB();
|
||||
flow.parser.yy = flowDb;
|
||||
flow.parser.yy.clear();
|
||||
flow.yy = flowDb;
|
||||
flow.yy.clear();
|
||||
});
|
||||
|
||||
it('should be possible to use click to a callback', function () {
|
||||
spyOn(flowDb, 'setClickEvent');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A callback');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A callback');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setClickEvent).toHaveBeenCalledWith('A', 'callback');
|
||||
});
|
||||
|
||||
it('should be possible to use click to a click and call callback', function () {
|
||||
spyOn(flowDb, 'setClickEvent');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A call callback()');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A call callback()');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setClickEvent).toHaveBeenCalledWith('A', 'callback');
|
||||
});
|
||||
@@ -39,10 +39,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should be possible to use click to a callback with tooltip', function () {
|
||||
spyOn(flowDb, 'setClickEvent');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A callback "tooltip"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A callback "tooltip"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setClickEvent).toHaveBeenCalledWith('A', 'callback');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
@@ -51,10 +51,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should be possible to use click to a click and call callback with tooltip', function () {
|
||||
spyOn(flowDb, 'setClickEvent');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A call callback() "tooltip"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A call callback() "tooltip"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setClickEvent).toHaveBeenCalledWith('A', 'callback');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
@@ -62,30 +62,30 @@ describe('[Interactions] when parsing', () => {
|
||||
|
||||
it('should be possible to use click to a callback with an arbitrary number of args', function () {
|
||||
spyOn(flowDb, 'setClickEvent');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A call callback("test0", test1, test2)');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A call callback("test0", test1, test2)');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setClickEvent).toHaveBeenCalledWith('A', 'callback', '"test0", test1, test2');
|
||||
});
|
||||
|
||||
it('should handle interaction - click to a link', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A "click.html"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A "click.html"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html');
|
||||
});
|
||||
|
||||
it('should handle interaction - click to a click and href link', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A href "click.html"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A href "click.html"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html');
|
||||
});
|
||||
@@ -93,10 +93,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should handle interaction - click to a link with tooltip', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A "click.html" "tooltip"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A "click.html" "tooltip"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
@@ -105,10 +105,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should handle interaction - click to a click and href link with tooltip', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A href "click.html" "tooltip"');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A href "click.html" "tooltip"');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
@@ -116,20 +116,20 @@ describe('[Interactions] when parsing', () => {
|
||||
|
||||
it('should handle interaction - click to a link with target', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A "click.html" _blank');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A "click.html" _blank');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html', '_blank');
|
||||
});
|
||||
|
||||
it('should handle interaction - click to a click and href link with target', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A href "click.html" _blank');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A href "click.html" _blank');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html', '_blank');
|
||||
});
|
||||
@@ -137,10 +137,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should handle interaction - click to a link with tooltip and target', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A "click.html" "tooltip" _blank');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A "click.html" "tooltip" _blank');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html', '_blank');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
@@ -149,10 +149,10 @@ describe('[Interactions] when parsing', () => {
|
||||
it('should handle interaction - click to a click and href link with tooltip and target', function () {
|
||||
spyOn(flowDb, 'setLink');
|
||||
spyOn(flowDb, 'setTooltip');
|
||||
const res = flow.parser.parse('graph TD\nA-->B\nclick A href "click.html" "tooltip" _blank');
|
||||
const res = flow.parse('graph TD\nA-->B\nclick A href "click.html" "tooltip" _blank');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(flowDb.setLink).toHaveBeenCalledWith('A', 'click.html', '_blank');
|
||||
expect(flowDb.setTooltip).toHaveBeenCalledWith('A', 'tooltip');
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { FlowDB } from '../flowDb.js';
|
||||
import flow from './flowParser.ts';
|
||||
import flow from './flowParserAdapter.js';
|
||||
import { setConfig } from '../../../config.js';
|
||||
|
||||
setConfig({
|
||||
@@ -8,21 +8,21 @@ setConfig({
|
||||
|
||||
describe('[Lines] when parsing', () => {
|
||||
beforeEach(function () {
|
||||
flow.parser.yy = new FlowDB();
|
||||
flow.parser.yy.clear();
|
||||
flow.yy = new FlowDB();
|
||||
flow.yy.clear();
|
||||
});
|
||||
|
||||
it('should handle line interpolation default definitions', function () {
|
||||
const res = flow.parser.parse('graph TD\n' + 'A-->B\n' + 'linkStyle default interpolate basis');
|
||||
const res = flow.parse('graph TD\n' + 'A-->B\n' + 'linkStyle default interpolate basis');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.defaultInterpolate).toBe('basis');
|
||||
});
|
||||
|
||||
it('should handle line interpolation numbered definitions', function () {
|
||||
const res = flow.parser.parse(
|
||||
const res = flow.parse(
|
||||
'graph TD\n' +
|
||||
'A-->B\n' +
|
||||
'A-->C\n' +
|
||||
@@ -30,38 +30,38 @@ describe('[Lines] when parsing', () => {
|
||||
'linkStyle 1 interpolate cardinal'
|
||||
);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].interpolate).toBe('basis');
|
||||
expect(edges[1].interpolate).toBe('cardinal');
|
||||
});
|
||||
|
||||
it('should handle line interpolation multi-numbered definitions', function () {
|
||||
const res = flow.parser.parse(
|
||||
const res = flow.parse(
|
||||
'graph TD\n' + 'A-->B\n' + 'A-->C\n' + 'linkStyle 0,1 interpolate basis'
|
||||
);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].interpolate).toBe('basis');
|
||||
expect(edges[1].interpolate).toBe('basis');
|
||||
});
|
||||
|
||||
it('should handle line interpolation default with style', function () {
|
||||
const res = flow.parser.parse(
|
||||
const res = flow.parse(
|
||||
'graph TD\n' + 'A-->B\n' + 'linkStyle default interpolate basis stroke-width:1px;'
|
||||
);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges.defaultInterpolate).toBe('basis');
|
||||
});
|
||||
|
||||
it('should handle line interpolation numbered with style', function () {
|
||||
const res = flow.parser.parse(
|
||||
const res = flow.parse(
|
||||
'graph TD\n' +
|
||||
'A-->B\n' +
|
||||
'A-->C\n' +
|
||||
@@ -69,20 +69,20 @@ describe('[Lines] when parsing', () => {
|
||||
'linkStyle 1 interpolate cardinal stroke-width:1px;'
|
||||
);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].interpolate).toBe('basis');
|
||||
expect(edges[1].interpolate).toBe('cardinal');
|
||||
});
|
||||
|
||||
it('should handle line interpolation multi-numbered with style', function () {
|
||||
const res = flow.parser.parse(
|
||||
const res = flow.parse(
|
||||
'graph TD\n' + 'A-->B\n' + 'A-->C\n' + 'linkStyle 0,1 interpolate basis stroke-width:1px;'
|
||||
);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].interpolate).toBe('basis');
|
||||
expect(edges[1].interpolate).toBe('basis');
|
||||
@@ -90,28 +90,28 @@ describe('[Lines] when parsing', () => {
|
||||
|
||||
describe('it should handle new line type notation', function () {
|
||||
it('should handle regular lines', function () {
|
||||
const res = flow.parser.parse('graph TD;A-->B;');
|
||||
const res = flow.parse('graph TD;A-->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('normal');
|
||||
});
|
||||
|
||||
it('should handle dotted lines', function () {
|
||||
const res = flow.parser.parse('graph TD;A-.->B;');
|
||||
const res = flow.parse('graph TD;A-.->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('dotted');
|
||||
});
|
||||
|
||||
it('should handle dotted lines', function () {
|
||||
const res = flow.parser.parse('graph TD;A==>B;');
|
||||
const res = flow.parse('graph TD;A==>B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('thick');
|
||||
});
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { FlowDB } from '../flowDb.js';
|
||||
import flow from './flowParser.ts';
|
||||
import flow from './flowParserAdapter.js';
|
||||
import { setConfig } from '../../../config.js';
|
||||
|
||||
setConfig({
|
||||
@@ -8,16 +8,16 @@ setConfig({
|
||||
|
||||
describe('parsing a flow chart with markdown strings', function () {
|
||||
beforeEach(function () {
|
||||
flow.parser.yy = new FlowDB();
|
||||
flow.parser.yy.clear();
|
||||
flow.yy = new FlowDB();
|
||||
flow.yy.clear();
|
||||
});
|
||||
|
||||
it('markdown formatting in nodes and labels', function () {
|
||||
const res = flow.parser.parse(`flowchart
|
||||
const res = flow.parse(`flowchart
|
||||
A["\`The cat in **the** hat\`"]-- "\`The *bat* in the chat\`" -->B["The dog in the hog"] -- "The rat in the mat" -->C;`);
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(vert.get('A').id).toBe('A');
|
||||
expect(vert.get('A').text).toBe('The cat in **the** hat');
|
||||
@@ -38,7 +38,7 @@ A["\`The cat in **the** hat\`"]-- "\`The *bat* in the chat\`" -->B["The dog in t
|
||||
expect(edges[1].labelType).toBe('string');
|
||||
});
|
||||
it('markdown formatting in subgraphs', function () {
|
||||
const res = flow.parser.parse(`flowchart LR
|
||||
const res = flow.parse(`flowchart LR
|
||||
subgraph "One"
|
||||
a("\`The **cat**
|
||||
in the hat\`") -- "1o" --> b{{"\`The **dog** in the hog\`"}}
|
||||
@@ -48,7 +48,7 @@ subgraph "\`**Two**\`"
|
||||
in the hat\`") -- "\`1o **ipa**\`" --> d("The dog in the hog")
|
||||
end`);
|
||||
|
||||
const subgraphs = flow.parser.yy.getSubGraphs();
|
||||
const subgraphs = flow.yy.getSubGraphs();
|
||||
expect(subgraphs.length).toBe(2);
|
||||
const subgraph = subgraphs[0];
|
||||
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { FlowDB } from '../flowDb.js';
|
||||
import flow from './flowParser.ts';
|
||||
import flow from './flowParserAdapter.js';
|
||||
import { setConfig } from '../../../config.js';
|
||||
|
||||
setConfig({
|
||||
@@ -8,105 +8,105 @@ setConfig({
|
||||
|
||||
describe('when parsing directions', function () {
|
||||
beforeEach(function () {
|
||||
flow.parser.yy = new FlowDB();
|
||||
flow.parser.yy.clear();
|
||||
flow.parser.yy.setGen('gen-2');
|
||||
flow.yy = new FlowDB();
|
||||
flow.yy.clear();
|
||||
flow.yy.setGen('gen-2');
|
||||
});
|
||||
|
||||
it('should handle basic shape data statements', function () {
|
||||
const res = flow.parser.parse(`flowchart TB
|
||||
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded}`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
   });
   it('should handle basic shape data statements', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded }`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
   });

   it('should handle basic shape data statements with &', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(2);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle shape data statements with edges', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } --> E`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(2);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle basic shape data statements with amp and edges 1', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E --> F`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(3);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle basic shape data statements with amp and edges 2', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E@{ shape: rounded } --> F`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(3);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle basic shape data statements with amp and edges 3', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E@{ shape: rounded } --> F & G@{ shape: rounded }`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(4);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle basic shape data statements with amp and edges 4', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E@{ shape: rounded } --> F@{ shape: rounded } & G@{ shape: rounded }`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(4);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should handle basic shape data statements with amp and edges 5, trailing space', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded } & E@{ shape: rounded } --> F{ shape: rounded } & G{ shape: rounded } `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(4);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
     expect(data4Layout.nodes[1].label).toEqual('E');
   });
   it('should no matter of there are no leading spaces', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{shape: rounded}`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();

     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
@@ -114,10 +114,10 @@ describe('when parsing directions', function () {
   });

   it('should no matter of there are many leading spaces', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded}`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();

     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
@@ -125,27 +125,27 @@ describe('when parsing directions', function () {
   });

   it('should be forgiving with many spaces before the end', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded }`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();

     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('D');
   });
   it('should be possible to add multiple properties on the same line', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       D@{ shape: rounded , label: "DD"}`);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();

     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('rounded');
     expect(data4Layout.nodes[0].label).toEqual('DD');
   });
   it('should be possible to link to a node with more data', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       A --> D@{
         shape: circle
         other: "clock"
@@ -153,7 +153,7 @@ describe('when parsing directions', function () {

     `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(2);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('A');
@@ -163,7 +163,7 @@ describe('when parsing directions', function () {
     expect(data4Layout.edges.length).toBe(1);
   });
   it('should not disturb adding multiple nodes after each other', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
       A[hello]
       B@{
         shape: circle
@@ -175,7 +175,7 @@ describe('when parsing directions', function () {
       }
     `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(3);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('hello');
@@ -185,21 +185,21 @@ describe('when parsing directions', function () {
     expect(data4Layout.nodes[2].label).toEqual('Hello');
   });
   it('should use handle bracket end (}) character inside the shape data', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: "This is }"
       other: "clock"
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is }');
   });
   it('should error on nonexistent shape', function () {
     expect(() => {
-      flow.parser.parse(`flowchart TB
+      flow.parse(`flowchart TB
       A@{ shape: this-shape-does-not-exist }
      `);
     }).toThrow('No such shape: this-shape-does-not-exist.');
@@ -207,23 +207,23 @@ describe('when parsing directions', function () {
   it('should error on internal-only shape', function () {
     expect(() => {
       // this shape does exist, but it's only supposed to be for internal/backwards compatibility use
-      flow.parser.parse(`flowchart TB
+      flow.parse(`flowchart TB
       A@{ shape: rect_left_inv_arrow }
      `);
     }).toThrow('No such shape: rect_left_inv_arrow. Shape names should be lowercase.');
   });
   it('Diamond shapes should work as usual', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A{This is a label}
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('diamond');
     expect(data4Layout.nodes[0].label).toEqual('This is a label');
   });
   it('Multi line strings should be supported', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: |
         This is a
@@ -232,13 +232,13 @@ describe('when parsing directions', function () {
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is a\nmultiline string\n');
   });
   it('Multi line strings should be supported', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: "This is a
       multiline string"
@@ -246,57 +246,57 @@ describe('when parsing directions', function () {
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is a<br/>multiline string');
   });
   it('should be possible to use } in strings', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: "This is a string with }"
       other: "clock"
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is a string with }');
   });
   it('should be possible to use @ in strings', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: "This is a string with @"
       other: "clock"
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is a string with @');
   });
   it('should be possible to use @ in strings', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      A@{
       label: "This is a string with}"
       other: "clock"
      }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(1);
     expect(data4Layout.nodes[0].shape).toEqual('squareRect');
     expect(data4Layout.nodes[0].label).toEqual('This is a string with}');
   });

   it('should be possible to use @ syntax to add labels on multi nodes', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
      n2["label for n2"] & n4@{ label: "label for n4"} & n5@{ label: "label for n5"}
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(3);
     expect(data4Layout.nodes[0].label).toEqual('label for n2');
     expect(data4Layout.nodes[1].label).toEqual('label for n4');
@@ -304,12 +304,12 @@ describe('when parsing directions', function () {
   });

   it('should be possible to use @ syntax to add labels on multi nodes with edge/link', function () {
-    const res = flow.parser.parse(`flowchart TD
+    const res = flow.parse(`flowchart TD
      A["A"] --> B["for B"] & C@{ label: "for c"} & E@{label : "for E"}
      D@{label: "for D"}
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(5);
     expect(data4Layout.nodes[0].label).toEqual('A');
     expect(data4Layout.nodes[1].label).toEqual('for B');
@@ -319,7 +319,7 @@ describe('when parsing directions', function () {
   });

   it('should be possible to use @ syntax in labels', function () {
-    const res = flow.parser.parse(`flowchart TD
+    const res = flow.parse(`flowchart TD
      A["@A@"] --> B["@for@ B@"] & C@{ label: "@for@ c@"} & E{"\`@for@ E@\`"} & D(("@for@ D@"))
      H1{{"@for@ H@"}}
      H2{{"\`@for@ H@\`"}}
@@ -329,7 +329,7 @@ describe('when parsing directions', function () {
      AS2>"\`@for@ AS@\`"]
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(11);
     expect(data4Layout.nodes[0].label).toEqual('@A@');
     expect(data4Layout.nodes[1].label).toEqual('@for@ B@');
@@ -345,12 +345,12 @@ describe('when parsing directions', function () {
   });

   it('should handle unique edge creation with using @ and &', function () {
-    const res = flow.parser.parse(`flowchart TD
+    const res = flow.parse(`flowchart TD
       A & B e1@--> C & D
       A1 e2@--> C1 & D1
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(7);
     expect(data4Layout.edges.length).toBe(6);
     expect(data4Layout.edges[0].id).toEqual('L_A_C_0');
@@ -362,12 +362,12 @@ describe('when parsing directions', function () {
   });

   it('should handle redefine same edge ids again', function () {
-    const res = flow.parser.parse(`flowchart TD
+    const res = flow.parse(`flowchart TD
       A & B e1@--> C & D
       A1 e1@--> C1 & D1
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(7);
     expect(data4Layout.edges.length).toBe(6);
     expect(data4Layout.edges[0].id).toEqual('L_A_C_0');
@@ -379,7 +379,7 @@ describe('when parsing directions', function () {
   });

   it('should handle overriding edge animate again', function () {
-    const res = flow.parser.parse(`flowchart TD
+    const res = flow.parse(`flowchart TD
       A e1@--> B
       C e2@--> D
       E e3@--> F
@@ -389,7 +389,7 @@ describe('when parsing directions', function () {
       e3@{ animate: false }
    `);

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(6);
     expect(data4Layout.edges.length).toBe(3);
     expect(data4Layout.edges[0].id).toEqual('e1');
@@ -401,12 +401,12 @@ describe('when parsing directions', function () {
   });

   it.skip('should be possible to use @ syntax to add labels with trail spaces', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       `flowchart TB
      n2["label for n2"] & n4@{ label: "label for n4"} & n5@{ label: "label for n5"} `
     );

-    const data4Layout = flow.parser.yy.getData();
+    const data4Layout = flow.yy.getData();
     expect(data4Layout.nodes.length).toBe(3);
     expect(data4Layout.nodes[0].label).toEqual('label for n2');
     expect(data4Layout.nodes[1].label).toEqual('label for n4');

@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';

 setConfig({
@@ -31,26 +31,26 @@ const specialChars = ['#', ':', '0', '&', ',', '*', '.', '\\', 'v', '-', '/', '_

 describe('[Singlenodes] when parsing', () => {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
+    flow.yy = new FlowDB();
+    flow.yy.clear();
   });

   it('should handle a single node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;A;');
+    const res = flow.parse('graph TD;A;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('A').styles.length).toBe(0);
   });
   it('should handle a single node with white space after it (SN1)', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;A ;');
+    const res = flow.parse('graph TD;A ;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('A').styles.length).toBe(0);
@@ -58,10 +58,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single square node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a[A];');
+    const res = flow.parse('graph TD;a[A];');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').styles.length).toBe(0);
@@ -70,10 +70,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single round square node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a[A];');
+    const res = flow.parse('graph TD;a[A];');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').styles.length).toBe(0);
@@ -82,10 +82,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single circle node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a((A));');
+    const res = flow.parse('graph TD;a((A));');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('circle');
@@ -93,10 +93,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single round node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a(A);');
+    const res = flow.parse('graph TD;a(A);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('round');
@@ -104,10 +104,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single odd node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a>A];');
+    const res = flow.parse('graph TD;a>A];');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('odd');
@@ -115,10 +115,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single diamond node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a{A};');
+    const res = flow.parse('graph TD;a{A};');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('diamond');
@@ -126,10 +126,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single diamond node with whitespace after it', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a{A} ;');
+    const res = flow.parse('graph TD;a{A} ;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('diamond');
@@ -137,10 +137,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single diamond node with html in it (SN3)', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a{A <br> end};');
+    const res = flow.parse('graph TD;a{A <br> end};');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('diamond');
@@ -149,10 +149,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single hexagon node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a{{A}};');
+    const res = flow.parse('graph TD;a{{A}};');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('hexagon');
@@ -160,10 +160,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single hexagon node with html in it', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a{{A <br> end}};');
+    const res = flow.parse('graph TD;a{{A <br> end}};');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('hexagon');
@@ -172,10 +172,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single round node with html in it', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a(A <br> end);');
+    const res = flow.parse('graph TD;a(A <br> end);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('round');
@@ -184,10 +184,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single double circle node', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a(((A)));');
+    const res = flow.parse('graph TD;a(((A)));');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('doublecircle');
@@ -195,10 +195,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single double circle node with whitespace after it', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a(((A))) ;');
+    const res = flow.parse('graph TD;a(((A))) ;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('doublecircle');
@@ -206,10 +206,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single double circle node with html in it (SN3)', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;a(((A <br> end)));');
+    const res = flow.parse('graph TD;a(((A <br> end)));');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('a').type).toBe('doublecircle');
@@ -218,10 +218,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single node with alphanumerics starting on a char', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;id1;');
+    const res = flow.parse('graph TD;id1;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('id1').styles.length).toBe(0);
@@ -229,10 +229,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single node with a single digit', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;1;');
+    const res = flow.parse('graph TD;1;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('1').text).toBe('1');
@@ -241,10 +241,10 @@ describe('[Singlenodes] when parsing', () => {
   it('should handle a single node with a single digit in a subgraph', function () {
     // Silly but syntactically correct

-    const res = flow.parser.parse('graph TD;subgraph "hello";1;end;');
+    const res = flow.parse('graph TD;subgraph "hello";1;end;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('1').text).toBe('1');
@@ -252,10 +252,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single node with alphanumerics starting on a num', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;1id;');
+    const res = flow.parse('graph TD;1id;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('1id').styles.length).toBe(0);
@@ -263,10 +263,10 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single node with alphanumerics containing a minus sign', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;i-d;');
+    const res = flow.parse('graph TD;i-d;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('i-d').styles.length).toBe(0);
@@ -274,36 +274,36 @@ describe('[Singlenodes] when parsing', () => {

   it('should handle a single node with alphanumerics containing a underscore sign', function () {
     // Silly but syntactically correct
-    const res = flow.parser.parse('graph TD;i_d;');
+    const res = flow.parse('graph TD;i_d;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges.length).toBe(0);
     expect(vert.get('i_d').styles.length).toBe(0);
   });

it.each(keywords)('should handle keywords between dashes "-"', function (keyword) {
|
||||
const res = flow.parser.parse(`graph TD;a-${keyword}-node;`);
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const res = flow.parse(`graph TD;a-${keyword}-node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a-${keyword}-node`).text).toBe(`a-${keyword}-node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle keywords between periods "."', function (keyword) {
|
||||
const res = flow.parser.parse(`graph TD;a.${keyword}.node;`);
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const res = flow.parse(`graph TD;a.${keyword}.node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a.${keyword}.node`).text).toBe(`a.${keyword}.node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle keywords between underscores "_"', function (keyword) {
|
||||
const res = flow.parser.parse(`graph TD;a_${keyword}_node;`);
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const res = flow.parse(`graph TD;a_${keyword}_node;`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`a_${keyword}_node`).text).toBe(`a_${keyword}_node`);
|
||||
});
|
||||
|
||||
it.each(keywords)('should handle nodes ending in %s', function (keyword) {
|
||||
const res = flow.parser.parse(`graph TD;node_${keyword};node.${keyword};node-${keyword};`);
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const res = flow.parse(`graph TD;node_${keyword};node.${keyword};node-${keyword};`);
|
||||
const vert = flow.yy.getVertices();
|
||||
expect(vert.get(`node_${keyword}`).text).toBe(`node_${keyword}`);
|
||||
expect(vert.get(`node.${keyword}`).text).toBe(`node.${keyword}`);
|
||||
expect(vert.get(`node-${keyword}`).text).toBe(`node-${keyword}`);
|
||||
@@ -327,16 +327,16 @@ describe('[Singlenodes] when parsing', () => {
   ];
   it.each(errorKeywords)('should throw error at nodes beginning with %s', function (keyword) {
     const str = `graph TD;${keyword}.node;${keyword}-node;${keyword}/node`;
-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

-    expect(() => flow.parser.parse(str)).toThrowError();
+    expect(() => flow.parse(str)).toThrowError();
   });

   const workingKeywords = ['default', 'href', 'click', 'call'];

   it.each(workingKeywords)('should parse node beginning with %s', function (keyword) {
-    flow.parser.parse(`graph TD; ${keyword}.node;${keyword}-node;${keyword}/node;`);
-    const vert = flow.parser.yy.getVertices();
+    flow.parse(`graph TD; ${keyword}.node;${keyword}-node;${keyword}/node;`);
+    const vert = flow.yy.getVertices();
     expect(vert.get(`${keyword}.node`).text).toBe(`${keyword}.node`);
     expect(vert.get(`${keyword}-node`).text).toBe(`${keyword}-node`);
     expect(vert.get(`${keyword}/node`).text).toBe(`${keyword}/node`);
@@ -345,8 +345,8 @@ describe('[Singlenodes] when parsing', () => {
   it.each(specialChars)(
     'should allow node ids of single special characters',
     function (specialChar) {
-      flow.parser.parse(`graph TD; ${specialChar} --> A`);
-      const vert = flow.parser.yy.getVertices();
+      flow.parse(`graph TD; ${specialChar} --> A`);
+      const vert = flow.yy.getVertices();
       expect(vert.get(`${specialChar}`).text).toBe(`${specialChar}`);
     }
   );
@@ -354,8 +354,8 @@ describe('[Singlenodes] when parsing', () => {
   it.each(specialChars)(
     'should allow node ids with special characters at start of id',
     function (specialChar) {
-      flow.parser.parse(`graph TD; ${specialChar}node --> A`);
-      const vert = flow.parser.yy.getVertices();
+      flow.parse(`graph TD; ${specialChar}node --> A`);
+      const vert = flow.yy.getVertices();
       expect(vert.get(`${specialChar}node`).text).toBe(`${specialChar}node`);
     }
   );
@@ -363,8 +363,8 @@ describe('[Singlenodes] when parsing', () => {
   it.each(specialChars)(
     'should allow node ids with special characters at end of id',
     function (specialChar) {
-      flow.parser.parse(`graph TD; node${specialChar} --> A`);
-      const vert = flow.parser.yy.getVertices();
+      flow.parse(`graph TD; node${specialChar} --> A`);
+      const vert = flow.yy.getVertices();
       expect(vert.get(`node${specialChar}`).text).toBe(`node${specialChar}`);
     }
   );

@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';

 setConfig({
@@ -8,27 +8,27 @@ setConfig({

 describe('[Style] when parsing', () => {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
-    flow.parser.yy.setGen('gen-2');
+    flow.yy = new FlowDB();
+    flow.yy.clear();
+    flow.yy.setGen('gen-2');
   });

-  // log.debug(flow.parser.parse('graph TD;style Q background:#fff;'));
+  // log.debug(flow.parse('graph TD;style Q background:#fff;'));
   it('should handle styles for vertices', function () {
-    const res = flow.parser.parse('graph TD;style Q background:#fff;');
+    const res = flow.parse('graph TD;style Q background:#fff;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('Q').styles.length).toBe(1);
     expect(vert.get('Q').styles[0]).toBe('background:#fff');
   });

   it('should handle multiple styles for a vortex', function () {
-    const res = flow.parser.parse('graph TD;style R background:#fff,border:1px solid red;');
+    const res = flow.parse('graph TD;style R background:#fff,border:1px solid red;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('R').styles.length).toBe(2);
     expect(vert.get('R').styles[0]).toBe('background:#fff');
@@ -36,12 +36,12 @@ describe('[Style] when parsing', () => {
   });

   it('should handle multiple styles in a graph', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;style S background:#aaa;\nstyle T background:#bbb,border:1px solid red;'
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('S').styles.length).toBe(1);
     expect(vert.get('T').styles.length).toBe(2);
@@ -51,12 +51,12 @@ describe('[Style] when parsing', () => {
   });

   it('should handle styles and graph definitions in a graph', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;S-->T;\nstyle S background:#aaa;\nstyle T background:#bbb,border:1px solid red;'
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('S').styles.length).toBe(1);
     expect(vert.get('T').styles.length).toBe(2);
@@ -66,10 +66,10 @@ describe('[Style] when parsing', () => {
   });

   it('should handle styles and graph definitions in a graph', function () {
-    const res = flow.parser.parse('graph TD;style T background:#bbb,border:1px solid red;');
-    // const res = flow.parser.parse('graph TD;style T background: #bbb;');
+    const res = flow.parse('graph TD;style T background:#bbb,border:1px solid red;');
+    // const res = flow.parse('graph TD;style T background: #bbb;');

-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

     expect(vert.get('T').styles.length).toBe(2);
     expect(vert.get('T').styles[0]).toBe('background:#bbb');
@@ -77,11 +77,11 @@ describe('[Style] when parsing', () => {
   });

   it('should keep node label text (if already defined) when a style is applied', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;A(( ));B((Test));C;style A background:#fff;style D border:1px solid red;'
     );

-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

     expect(vert.get('A').text).toBe('');
     expect(vert.get('B').text).toBe('Test');
@@ -90,12 +90,12 @@ describe('[Style] when parsing', () => {
   });

   it('should be possible to declare a class', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;classDef exClass background:#bbb,border:1px solid red;'
     );
-    // const res = flow.parser.parse('graph TD;style T background: #bbb;');
+    // const res = flow.parse('graph TD;style T background: #bbb;');

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
@@ -103,11 +103,11 @@ describe('[Style] when parsing', () => {
   });

   it('should be possible to declare multiple classes', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;classDef firstClass,secondClass background:#bbb,border:1px solid red;'
     );

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('firstClass').styles.length).toBe(2);
     expect(classes.get('firstClass').styles[0]).toBe('background:#bbb');
@@ -119,24 +119,24 @@ describe('[Style] when parsing', () => {
   });

   it('should be possible to declare a class with a dot in the style', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;classDef exClass background:#bbb,border:1.5px solid red;'
     );
-    // const res = flow.parser.parse('graph TD;style T background: #bbb;');
+    // const res = flow.parse('graph TD;style T background: #bbb;');

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
     expect(classes.get('exClass').styles[1]).toBe('border:1.5px solid red');
   });
   it('should be possible to declare a class with a space in the style', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD;classDef exClass background: #bbb,border:1.5px solid red;'
     );
-    // const res = flow.parser.parse('graph TD;style T background : #bbb;');
+    // const res = flow.parse('graph TD;style T background : #bbb;');

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background: #bbb');
@@ -150,9 +150,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'a-->b;' + '\n';
     statement = statement + 'class a exClass;';

-    const res = flow.parser.parse(statement);
+    const res = flow.parse(statement);

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
@@ -166,9 +166,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'a_a-->b_b;' + '\n';
     statement = statement + 'class a_a exClass;';

-    const res = flow.parser.parse(statement);
+    const res = flow.parse(statement);

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
@@ -181,9 +181,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'classDef exClass background:#bbb,border:1px solid red;' + '\n';
     statement = statement + 'a-->b[test]:::exClass;' + '\n';

-    const res = flow.parser.parse(statement);
-    const vertices = flow.parser.yy.getVertices();
-    const classes = flow.parser.yy.getClasses();
+    const res = flow.parse(statement);
+    const vertices = flow.yy.getVertices();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(vertices.get('b').classes[0]).toBe('exClass');
@@ -198,9 +198,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'classDef exClass background:#bbb,border:1px solid red;' + '\n';
     statement = statement + 'b[test]:::exClass;' + '\n';

-    const res = flow.parser.parse(statement);
-    const vertices = flow.parser.yy.getVertices();
-    const classes = flow.parser.yy.getClasses();
+    const res = flow.parse(statement);
+    const vertices = flow.yy.getVertices();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(vertices.get('b').classes[0]).toBe('exClass');
@@ -215,9 +215,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'classDef exClass background:#bbb,border:1px solid red;' + '\n';
     statement = statement + 'A[test]:::exClass-->B[test2];' + '\n';

-    const res = flow.parser.parse(statement);
-    const vertices = flow.parser.yy.getVertices();
-    const classes = flow.parser.yy.getClasses();
+    const res = flow.parse(statement);
+    const vertices = flow.yy.getVertices();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(vertices.get('A').classes[0]).toBe('exClass');
@@ -232,9 +232,9 @@ describe('[Style] when parsing', () => {
     statement = statement + 'classDef exClass background:#bbb,border:1px solid red;' + '\n';
     statement = statement + 'a-->b[1 a a text!.]:::exClass;' + '\n';

-    const res = flow.parser.parse(statement);
-    const vertices = flow.parser.yy.getVertices();
-    const classes = flow.parser.yy.getClasses();
+    const res = flow.parse(statement);
+    const vertices = flow.yy.getVertices();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(vertices.get('b').classes[0]).toBe('exClass');
@@ -249,10 +249,10 @@ describe('[Style] when parsing', () => {
     statement = statement + 'a-->b;' + '\n';
     statement = statement + 'class a,b exClass;';

-    const res = flow.parser.parse(statement);
+    const res = flow.parse(statement);

-    const classes = flow.parser.yy.getClasses();
-    const vertices = flow.parser.yy.getVertices();
+    const classes = flow.yy.getClasses();
+    const vertices = flow.yy.getVertices();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
@@ -262,7 +262,7 @@ describe('[Style] when parsing', () => {
   });

   it('should handle style definitions with more then 1 digit in a row', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD\n' +
         'A-->B1\n' +
         'A-->B2\n' +
@@ -278,8 +278,8 @@ describe('[Style] when parsing', () => {
         'linkStyle 10 stroke-width:1px;'
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_point');
   });
@@ -299,17 +299,17 @@ describe('[Style] when parsing', () => {
   });

   it('should handle style definitions within number of edges', function () {
-    const res = flow.parser.parse(`graph TD
+    const res = flow.parse(`graph TD
 A-->B
 linkStyle 0 stroke-width:1px;`);

-    const edges = flow.parser.yy.getEdges();
+    const edges = flow.yy.getEdges();

     expect(edges[0].style[0]).toBe('stroke-width:1px');
   });

   it('should handle multi-numbered style definitions with more then 1 digit in a row', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD\n' +
         'A-->B1\n' +
         'A-->B2\n' +
@@ -326,41 +326,41 @@ describe('[Style] when parsing', () => {
         'linkStyle 10,11 stroke-width:1px;'
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_point');
   });

   it('should handle classDefs with style in classes', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nclassDef exClass font-style:bold;');
+    const res = flow.parse('graph TD\nA-->B\nclassDef exClass font-style:bold;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_point');
   });

   it('should handle classDefs with % in classes', function () {
-    const res = flow.parser.parse(
+    const res = flow.parse(
       'graph TD\nA-->B\nclassDef exClass fill:#f96,stroke:#333,stroke-width:4px,font-size:50%,font-style:bold;'
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_point');
   });

   it('should handle multiple vertices with style', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
 graph TD
 classDef C1 stroke-dasharray:4
 classDef C2 stroke-dasharray:6
 A & B:::C1 & D:::C1 --> E:::C2
 `);

-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

     expect(vert.get('A').classes.length).toBe(0);
     expect(vert.get('B').classes[0]).toBe('C1');

@@ -1,5 +1,5 @@
|
||||
import { FlowDB } from '../flowDb.js';
|
||||
import flow from './flowParser.ts';
|
||||
import flow from './flowParserAdapter.js';
|
||||
import { setConfig } from '../../../config.js';
|
||||
|
||||
setConfig({
|
||||
@@ -8,187 +8,187 @@ setConfig({
|
||||
|
||||
describe('[Text] when parsing', () => {
|
||||
beforeEach(function () {
|
||||
flow.parser.yy = new FlowDB();
|
||||
flow.parser.yy.clear();
|
||||
flow.yy = new FlowDB();
|
||||
flow.yy.clear();
|
||||
});
|
||||
|
||||
describe('it should handle text on edges', function () {
|
||||
it('should handle text without space', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|textNoSpace|B;');
|
||||
const res = flow.parse('graph TD;A--x|textNoSpace|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle with space', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including space|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including space|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle text with /', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text with / should work|B;');
|
||||
const res = flow.parse('graph TD;A--x|text with / should work|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].text).toBe('text with / should work');
|
||||
});
|
||||
|
||||
it('should handle space and space between vertices and link', function () {
|
||||
const res = flow.parser.parse('graph TD;A --x|textNoSpace| B;');
|
||||
const res = flow.parse('graph TD;A --x|textNoSpace| B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle space and CAPS', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including CAPS space|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including CAPS space|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle space and dir', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including URL space|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including URL space|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(edges[0].text).toBe('text including URL space');
|
||||
});
|
||||
|
||||
it('should handle space and send', function () {
|
||||
const res = flow.parser.parse('graph TD;A--text including URL space and send-->B;');
|
||||
const res = flow.parse('graph TD;A--text including URL space and send-->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_point');
|
||||
expect(edges[0].text).toBe('text including URL space and send');
|
||||
});
|
||||
it('should handle space and send', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- text including URL space and send -->B;');
|
||||
const res = flow.parse('graph TD;A-- text including URL space and send -->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_point');
|
||||
expect(edges[0].text).toBe('text including URL space and send');
|
||||
});
|
||||
|
||||
it('should handle space and dir (TD)', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including R TD space|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including R TD space|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(edges[0].text).toBe('text including R TD space');
|
||||
});
|
||||
it('should handle `', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including `|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including `|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(edges[0].text).toBe('text including `');
|
||||
});
|
||||
it('should handle v in node ids only v', function () {
|
||||
// only v
|
||||
const res = flow.parser.parse('graph TD;A--xv(my text);');
|
||||
const res = flow.parse('graph TD;A--xv(my text);');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(vert.get('v').text).toBe('my text');
|
||||
});
|
||||
it('should handle v in node ids v at end', function () {
|
||||
// v at end
|
||||
const res = flow.parser.parse('graph TD;A--xcsv(my text);');
|
||||
const res = flow.parse('graph TD;A--xcsv(my text);');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(vert.get('csv').text).toBe('my text');
|
||||
});
|
||||
it('should handle v in node ids v in middle', function () {
|
||||
// v in middle
|
||||
const res = flow.parser.parse('graph TD;A--xava(my text);');
|
||||
const res = flow.parse('graph TD;A--xava(my text);');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(vert.get('ava').text).toBe('my text');
|
||||
});
|
||||
it('should handle v in node ids, v at start', function () {
|
||||
// v at start
|
||||
const res = flow.parser.parse('graph TD;A--xva(my text);');
|
||||
const res = flow.parse('graph TD;A--xva(my text);');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
expect(vert.get('va').text).toBe('my text');
|
||||
});
|
||||
it('should handle keywords', function () {
|
||||
const res = flow.parser.parse('graph TD;A--x|text including graph space|B;');
|
||||
const res = flow.parse('graph TD;A--x|text including graph space|B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].text).toBe('text including graph space');
|
||||
});
|
||||
it('should handle keywords', function () {
|
||||
const res = flow.parser.parse('graph TD;V-->a[v]');
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const res = flow.parse('graph TD;V-->a[v]');
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
expect(vert.get('a').text).toBe('v');
|
||||
});
|
||||
it('should handle quoted text', function () {
|
||||
const res = flow.parser.parse('graph TD;V-- "test string()" -->a[v]');
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const res = flow.parse('graph TD;V-- "test string()" -->a[v]');
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
expect(edges[0].text).toBe('test string()');
|
||||
});
|
||||
});
|
||||
|
||||
describe('it should handle text on lines', () => {
|
||||
it('should handle normal text on lines', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- test text with == -->B;');
|
||||
const res = flow.parse('graph TD;A-- test text with == -->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('normal');
|
||||
});
|
||||
it('should handle dotted text on lines (TD3)', function () {
|
||||
const res = flow.parser.parse('graph TD;A-. test text with == .->B;');
|
||||
const res = flow.parse('graph TD;A-. test text with == .->B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('dotted');
|
||||
});
|
||||
it('should handle thick text on lines', function () {
|
||||
const res = flow.parser.parse('graph TD;A== test text with - ==>B;');
|
||||
const res = flow.parse('graph TD;A== test text with - ==>B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].stroke).toBe('thick');
|
||||
});
|
||||
@@ -196,99 +196,99 @@ describe('[Text] when parsing', () => {
|
||||
|
||||
describe('it should handle text on edges using the new notation', function () {
|
||||
it('should handle text without space', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- textNoSpace --xB;');
|
||||
const res = flow.parse('graph TD;A-- textNoSpace --xB;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle text with multiple leading space', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- textNoSpace --xB;');
|
||||
const res = flow.parse('graph TD;A-- textNoSpace --xB;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle with space', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- text including space --xB;');
|
||||
const res = flow.parse('graph TD;A-- text including space --xB;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle text with /', function () {
|
||||
const res = flow.parser.parse('graph TD;A -- text with / should work --x B;');
|
||||
const res = flow.parse('graph TD;A -- text with / should work --x B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].text).toBe('text with / should work');
|
||||
});
|
||||
|
||||
it('should handle space and space between vertices and link', function () {
|
||||
const res = flow.parser.parse('graph TD;A -- textNoSpace --x B;');
|
||||
const res = flow.parse('graph TD;A -- textNoSpace --x B;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
const edges = flow.parser.yy.getEdges();
|
||||
const vert = flow.yy.getVertices();
|
||||
const edges = flow.yy.getEdges();
|
||||
|
||||
expect(edges[0].type).toBe('arrow_cross');
|
||||
});
|
||||
|
||||
it('should handle space and CAPS', function () {
|
||||
const res = flow.parser.parse('graph TD;A-- text including CAPS space --xB;');
|
||||
const res = flow.parse('graph TD;A-- text including CAPS space --xB;');
|
||||
|
||||
const vert = flow.parser.yy.getVertices();
|
||||
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_cross');
   });

   it('should handle space and dir', function () {
-    const res = flow.parser.parse('graph TD;A-- text including URL space --xB;');
+    const res = flow.parse('graph TD;A-- text including URL space --xB;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_cross');
     expect(edges[0].text).toBe('text including URL space');
   });

   it('should handle space and dir (TD2)', function () {
-    const res = flow.parser.parse('graph TD;A-- text including R TD space --xB;');
+    const res = flow.parse('graph TD;A-- text including R TD space --xB;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_cross');
     expect(edges[0].text).toBe('text including R TD space');
   });
   it('should handle keywords', function () {
-    const res = flow.parser.parse('graph TD;A-- text including graph space and v --xB;');
+    const res = flow.parse('graph TD;A-- text including graph space and v --xB;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].text).toBe('text including graph space and v');
   });
   it('should handle keywords', function () {
-    const res = flow.parser.parse('graph TD;A-- text including graph space and v --xB[blav]');
+    const res = flow.parse('graph TD;A-- text including graph space and v --xB[blav]');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].text).toBe('text including graph space and v');
   });
   // it.skip('should handle text on open links',function(){
-  //   const res = flow.parser.parse('graph TD;A-- text including graph space --B');
+  //   const res = flow.parse('graph TD;A-- text including graph space --B');
   //
-  //   const vert = flow.parser.yy.getVertices();
-  //   const edges = flow.parser.yy.getEdges();
+  //   const vert = flow.yy.getVertices();
+  //   const edges = flow.yy.getEdges();
   //
   //   expect(edges[0].text).toBe('text including graph space');
   //
@@ -297,10 +297,10 @@ describe('[Text] when parsing', () => {

 describe('it should handle text in vertices, ', function () {
   it('should handle space', function () {
-    const res = flow.parser.parse('graph TD;A-->C(Chimpansen hoppar);');
+    const res = flow.parse('graph TD;A-->C(Chimpansen hoppar);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('C').type).toBe('round');
     expect(vert.get('C').text).toBe('Chimpansen hoppar');
@@ -347,109 +347,109 @@ describe('[Text] when parsing', () => {

   shapes.forEach((shape) => {
     it.each(keywords)(`should handle %s keyword in ${shape.name} vertex`, function (keyword) {
-      const rest = flow.parser.parse(
+      const rest = flow.parse(
         `graph TD;A_${keyword}_node-->B${shape.start}This node has a ${keyword} as text${shape.end};`
       );

-      const vert = flow.parser.yy.getVertices();
-      const edges = flow.parser.yy.getEdges();
+      const vert = flow.yy.getVertices();
+      const edges = flow.yy.getEdges();
       expect(vert.get('B').type).toBe(`${shape.name}`);
       expect(vert.get('B').text).toBe(`This node has a ${keyword} as text`);
     });
   });

   it.each(keywords)('should handle %s keyword in rect vertex', function (keyword) {
-    const rest = flow.parser.parse(
+    const rest = flow.parse(
       `graph TD;A_${keyword}_node-->B[|borders:lt|This node has a ${keyword} as text];`
     );

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
     expect(vert.get('B').type).toBe('rect');
     expect(vert.get('B').text).toBe(`This node has a ${keyword} as text`);
   });

   it('should handle edge case for odd vertex with node id ending with minus', function () {
-    const res = flow.parser.parse('graph TD;A_node-->odd->Vertex Text];');
-    const vert = flow.parser.yy.getVertices();
+    const res = flow.parse('graph TD;A_node-->odd->Vertex Text];');
+    const vert = flow.yy.getVertices();

     expect(vert.get('odd-').type).toBe('odd');
     expect(vert.get('odd-').text).toBe('Vertex Text');
   });
   it('should allow forward slashes in lean_right vertices', function () {
-    const rest = flow.parser.parse(`graph TD;A_node-->B[/This node has a / as text/];`);
+    const rest = flow.parse(`graph TD;A_node-->B[/This node has a / as text/];`);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
     expect(vert.get('B').type).toBe('lean_right');
     expect(vert.get('B').text).toBe(`This node has a / as text`);
   });

   it('should allow back slashes in lean_left vertices', function () {
-    const rest = flow.parser.parse(`graph TD;A_node-->B[\\This node has a \\ as text\\];`);
+    const rest = flow.parse(`graph TD;A_node-->B[\\This node has a \\ as text\\];`);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
     expect(vert.get('B').type).toBe('lean_left');
     expect(vert.get('B').text).toBe(`This node has a \\ as text`);
   });

   it('should handle åäö and minus', function () {
-    const res = flow.parser.parse('graph TD;A-->C{Chimpansen hoppar åäö-ÅÄÖ};');
+    const res = flow.parse('graph TD;A-->C{Chimpansen hoppar åäö-ÅÄÖ};');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('C').type).toBe('diamond');
     expect(vert.get('C').text).toBe('Chimpansen hoppar åäö-ÅÄÖ');
   });

   it('should handle with åäö, minus and space and br', function () {
-    const res = flow.parser.parse('graph TD;A-->C(Chimpansen hoppar åäö <br> - ÅÄÖ);');
+    const res = flow.parse('graph TD;A-->C(Chimpansen hoppar åäö <br> - ÅÄÖ);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('C').type).toBe('round');
     expect(vert.get('C').text).toBe('Chimpansen hoppar åäö <br> - ÅÄÖ');
   });
   // it.skip('should handle åäö, minus and space and br',function(){
-  //   const res = flow.parser.parse('graph TD; A[Object(foo,bar)]-->B(Thing);');
+  //   const res = flow.parse('graph TD; A[Object(foo,bar)]-->B(Thing);');
   //
-  //   const vert = flow.parser.yy.getVertices();
-  //   const edges = flow.parser.yy.getEdges();
+  //   const vert = flow.yy.getVertices();
+  //   const edges = flow.yy.getEdges();
   //
   //   expect(vert.get('C').type).toBe('round');
   //   expect(vert.get('C').text).toBe(' A[Object(foo,bar)]-->B(Thing);');
   // });
   it('should handle unicode chars', function () {
-    const res = flow.parser.parse('graph TD;A-->C(Начало);');
+    const res = flow.parse('graph TD;A-->C(Начало);');

-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

     expect(vert.get('C').text).toBe('Начало');
   });
   it('should handle backslash', function () {
-    const res = flow.parser.parse('graph TD;A-->C(c:\\windows);');
+    const res = flow.parse('graph TD;A-->C(c:\\windows);');

-    const vert = flow.parser.yy.getVertices();
+    const vert = flow.yy.getVertices();

     expect(vert.get('C').text).toBe('c:\\windows');
   });
   it('should handle CAPS', function () {
-    const res = flow.parser.parse('graph TD;A-->C(some CAPS);');
+    const res = flow.parse('graph TD;A-->C(some CAPS);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('C').type).toBe('round');
     expect(vert.get('C').text).toBe('some CAPS');
   });
   it('should handle directions', function () {
-    const res = flow.parser.parse('graph TD;A-->C(some URL);');
+    const res = flow.parse('graph TD;A-->C(some URL);');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('C').type).toBe('round');
     expect(vert.get('C').text).toBe('some URL');
@@ -457,10 +457,10 @@ describe('[Text] when parsing', () => {
   });

   it('should handle multi-line text', function () {
-    const res = flow.parser.parse('graph TD;A--o|text space|B;\n B-->|more text with space|C;');
+    const res = flow.parse('graph TD;A--o|text space|B;\n B-->|more text with space|C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(edges[0].type).toBe('arrow_circle');
     expect(edges[1].type).toBe('arrow_point');
@@ -477,102 +477,102 @@ describe('[Text] when parsing', () => {
   });

   it('should handle text in vertices with space', function () {
-    const res = flow.parser.parse('graph TD;A[chimpansen hoppar]-->C;');
+    const res = flow.parse('graph TD;A[chimpansen hoppar]-->C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('square');
     expect(vert.get('A').text).toBe('chimpansen hoppar');
   });

   it('should handle text in vertices with space with spaces between vertices and link', function () {
-    const res = flow.parser.parse('graph TD;A[chimpansen hoppar] --> C;');
+    const res = flow.parse('graph TD;A[chimpansen hoppar] --> C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('square');
     expect(vert.get('A').text).toBe('chimpansen hoppar');
   });
   it('should handle text including _ in vertices', function () {
-    const res = flow.parser.parse('graph TD;A[chimpansen_hoppar] --> C;');
+    const res = flow.parse('graph TD;A[chimpansen_hoppar] --> C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('square');
     expect(vert.get('A').text).toBe('chimpansen_hoppar');
   });

   it('should handle quoted text in vertices ', function () {
-    const res = flow.parser.parse('graph TD;A["chimpansen hoppar ()[]"] --> C;');
+    const res = flow.parse('graph TD;A["chimpansen hoppar ()[]"] --> C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('square');
     expect(vert.get('A').text).toBe('chimpansen hoppar ()[]');
   });

   it('should handle text in circle vertices with space', function () {
-    const res = flow.parser.parse('graph TD;A((chimpansen hoppar))-->C;');
+    const res = flow.parse('graph TD;A((chimpansen hoppar))-->C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('circle');
     expect(vert.get('A').text).toBe('chimpansen hoppar');
   });

   it('should handle text in ellipse vertices', function () {
-    const res = flow.parser.parse('graph TD\nA(-this is an ellipse-)-->B');
+    const res = flow.parse('graph TD\nA(-this is an ellipse-)-->B');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('ellipse');
     expect(vert.get('A').text).toBe('this is an ellipse');
   });

   it('should not freeze when ellipse text has a `(`', function () {
-    expect(() => flow.parser.parse('graph\nX(- My Text (')).toThrowError();
+    expect(() => flow.parse('graph\nX(- My Text (')).toThrowError();
   });

   it('should handle text in diamond vertices with space', function () {
-    const res = flow.parser.parse('graph TD;A(chimpansen hoppar)-->C;');
+    const res = flow.parse('graph TD;A(chimpansen hoppar)-->C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').type).toBe('round');
     expect(vert.get('A').text).toBe('chimpansen hoppar');
   });

   it('should handle text in with ?', function () {
-    const res = flow.parser.parse('graph TD;A(?)-->|?|C;');
+    const res = flow.parse('graph TD;A(?)-->|?|C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').text).toBe('?');
     expect(edges[0].text).toBe('?');
   });
   it('should handle text in with éèêàçô', function () {
-    const res = flow.parser.parse('graph TD;A(éèêàçô)-->|éèêàçô|C;');
+    const res = flow.parse('graph TD;A(éèêàçô)-->|éèêàçô|C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').text).toBe('éèêàçô');
     expect(edges[0].text).toBe('éèêàçô');
   });

   it('should handle text in with ,.?!+-*', function () {
-    const res = flow.parser.parse('graph TD;A(,.?!+-*)-->|,.?!+-*|C;');
+    const res = flow.parse('graph TD;A(,.?!+-*)-->|,.?!+-*|C;');

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').text).toBe(',.?!+-*');
     expect(edges[0].text).toBe(',.?!+-*');
@@ -580,30 +580,30 @@ describe('[Text] when parsing', () => {

   it('should throw error at nested set of brackets', function () {
     const str = 'graph TD; A[This is a () in text];';
-    expect(() => flow.parser.parse(str)).toThrowError("got 'PS'");
+    expect(() => flow.parse(str)).toThrowError("got 'PS'");
   });

   it('should throw error for strings and text at the same time', function () {
     const str = 'graph TD;A(this node has "string" and text)-->|this link has "string" and text|C;';

-    expect(() => flow.parser.parse(str)).toThrowError("got 'STR'");
+    expect(() => flow.parse(str)).toThrowError("got 'STR'");
   });

   it('should throw error for escaping quotes in text state', function () {
     //prettier-ignore
     const str = 'graph TD; A[This is a \"()\" in text];'; //eslint-disable-line no-useless-escape

-    expect(() => flow.parser.parse(str)).toThrowError("got 'STR'");
+    expect(() => flow.parse(str)).toThrowError("got 'STR'");
   });

   it('should throw error for nested quotation marks', function () {
     const str = 'graph TD; A["This is a "()" in text"];';

-    expect(() => flow.parser.parse(str)).toThrowError("Expecting 'SQE'");
+    expect(() => flow.parse(str)).toThrowError("Expecting 'SQE'");
   });

   it('should throw error', function () {
     const str = `graph TD; node[hello ) world] --> works`;
-    expect(() => flow.parser.parse(str)).toThrowError("got 'PE'");
+    expect(() => flow.parse(str)).toThrowError("got 'PE'");
   });
 });
@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';

 setConfig({
@@ -8,19 +8,19 @@ setConfig({

 describe('when parsing flowcharts', function () {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
-    flow.parser.yy.setGen('gen-2');
+    flow.yy = new FlowDB();
+    flow.yy.clear();
+    flow.yy.setGen('gen-2');
   });

   it('should handle chaining of vertices', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A-->B-->C;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -36,13 +36,13 @@ describe('when parsing flowcharts', function () {
     expect(edges[1].text).toBe('');
   });
   it('should handle chaining of vertices', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A & B --> C;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -58,13 +58,13 @@ describe('when parsing flowcharts', function () {
     expect(edges[1].text).toBe('');
   });
   it('should multiple vertices in link statement in the beginning', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A-->B & C;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -80,13 +80,13 @@ describe('when parsing flowcharts', function () {
     expect(edges[1].text).toBe('');
   });
   it('should multiple vertices in link statement at the end', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A & B--> C & D;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -111,13 +111,13 @@ describe('when parsing flowcharts', function () {
     expect(edges[3].text).toBe('');
   });
   it('should handle chaining of vertices at both ends at once', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A & B--> C & D;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -142,13 +142,13 @@ describe('when parsing flowcharts', function () {
     expect(edges[3].text).toBe('');
   });
   it('should handle chaining and multiple nodes in link statement FVC ', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A --> B & B2 & C --> D2;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

     expect(vert.get('A').id).toBe('A');
     expect(vert.get('B').id).toBe('B');
@@ -182,16 +182,16 @@ describe('when parsing flowcharts', function () {
     expect(edges[5].text).toBe('');
   });
   it('should handle chaining and multiple nodes in link statement with extra info in statements', function () {
-    const res = flow.parser.parse(`
+    const res = flow.parse(`
     graph TD
       A[ h ] -- hello --> B[" test "]:::exClass & C --> D;
       classDef exClass background:#bbb,border:1px solid red;
     `);

-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();

-    const classes = flow.parser.yy.getClasses();
+    const classes = flow.yy.getClasses();

     expect(classes.get('exClass').styles.length).toBe(2);
     expect(classes.get('exClass').styles[0]).toBe('background:#bbb');
951 packages/mermaid/src/diagrams/flowchart/parser/flowAst.ts Normal file
@@ -0,0 +1,951 @@
import type { IToken } from 'chevrotain';
import { FlowchartParser } from './flowParser.js';

// Define interfaces matching existing Mermaid structures
interface Vertex {
  id: string;
  text?: string;
  type?: string;
  style?: string;
  classes?: string[];
  dir?: string;
  props?: Record<string, string>;
}

interface Edge {
  start: string | string[];
  end: string | string[];
  type?: string;
  stroke?: string;
  length?: number;
  text?: string;
}

interface ParseResult {
  vertices: Record<string, Vertex>;
  edges: Edge[];
  classes: Record<string, string>;
  subGraphs: any[];
  direction: string;
  clickEvents: any[];
  tooltips: Record<string, string>;
  accTitle: string;
  accDescription: string;
}

const BaseVisitor = new FlowchartParser().getBaseCstVisitorConstructor();

export class FlowchartAstVisitor extends BaseVisitor {
  private vertices: Record<string, Vertex> = {};
  private edges: Edge[] = [];
  private classes: Record<string, string> = {};
  private subGraphs: any[] = [];
  private direction = 'TB';
  private clickEvents: any[] = [];
  private tooltips: Record<string, string> = {};
  private subCount = 0;
  private accTitle = '';
  private accDescription = '';

  constructor() {
    super();
    this.validateVisitor();
  }

  // Clear visitor state for new parse
  clear(): void {
    this.vertices = {};
    this.edges = [];
    this.classes = {};
    this.subGraphs = [];
    this.direction = 'TB';
    this.clickEvents = [];
    this.tooltips = {};
    this.subCount = 0;
    this.lastNodeId = '';
    this.accTitle = '';
    this.accDescription = '';
  }

  flowchart(ctx: any): ParseResult {
    this.visit(ctx.graphDeclaration);

    if (ctx.statement) {
      ctx.statement.forEach((stmt: any) => this.visit(stmt));
    }

    return {
      vertices: this.vertices,
      edges: this.edges,
      classes: this.classes,
      subGraphs: this.subGraphs,
      direction: this.direction,
      clickEvents: this.clickEvents,
      tooltips: this.tooltips,
      accTitle: this.accTitle,
      accDescription: this.accDescription,
    };
  }

  graphDeclaration(ctx: any): void {
    if (ctx.DirectionValue) {
      this.direction = ctx.DirectionValue[0].image;
    }
  }

  statement(ctx: any): void {
    if (ctx.vertexStatement) {
      this.visit(ctx.vertexStatement);
    } else if (ctx.styleStatement) {
      this.visit(ctx.styleStatement);
    } else if (ctx.linkStyleStatement) {
      this.visit(ctx.linkStyleStatement);
    } else if (ctx.classDefStatement) {
      this.visit(ctx.classDefStatement);
    } else if (ctx.classStatement) {
      this.visit(ctx.classStatement);
    } else if (ctx.clickStatement) {
      this.visit(ctx.clickStatement);
    } else if (ctx.subgraphStatement) {
      this.visit(ctx.subgraphStatement);
    } else if (ctx.directionStatement) {
      this.visit(ctx.directionStatement);
    } else if (ctx.accStatement) {
      this.visit(ctx.accStatement);
    }
  }

  // Statement separator (does nothing)
  statementSeparator(_ctx: any): void {
    // No action needed for separators
  }

  // Vertex statement - handles node chains with links
  vertexStatement(ctx: any): void {
    const nodes = ctx.node || [];
    const links = ctx.link || [];

    // Process first node and collect all node IDs (for ampersand syntax)
    let startNodeIds: string[] = [];
    if (nodes.length > 0) {
      startNodeIds = this.visit(nodes[0]);
    }

    // Process alternating links and nodes
    for (const [i, link] of links.entries()) {
      const linkData = this.visit(link);
      const nextNode = nodes[i + 1];

      if (nextNode) {
        // Collect target node IDs (for ampersand syntax)
        const endNodeIds = this.visit(nextNode);

        // Create edges from each start node to each end node
        for (const startNodeId of startNodeIds) {
          for (const endNodeId of endNodeIds) {
            const edge: any = {
              start: startNodeId,
              end: endNodeId,
              type: linkData.type,
              text: linkData.text || '',
            };

            // Include length property if present
            if (linkData.length) {
              edge.length = linkData.length;
            }

            this.edges.push(edge);
          }
        }

        // Update start nodes for next iteration
        startNodeIds = endNodeIds;
      }
    }
  }

  // Node - handles multiple nodes with ampersand
  node(ctx: any): string[] {
    const styledVertices = ctx.styledVertex || [];
    const nodeIds: string[] = [];
    for (const vertex of styledVertices) {
      this.visit(vertex);
      // Collect the node ID that was just processed
      nodeIds.push(this.lastNodeId);
    }
    return nodeIds;
  }

  // Styled vertex
  styledVertex(ctx: any): void {
    if (ctx.vertex) {
      this.visit(ctx.vertex);
    }
    // TODO: Handle styling
  }

  // Vertex - handles different node shapes
  vertex(ctx: any): void {
    const nodeIds = ctx.nodeId || [];
    const nodeTexts = ctx.nodeText || [];

    if (nodeIds.length > 0) {
      const nodeId = this.visit(nodeIds[0]);
      let nodeText = nodeId;
      let nodeType = 'default';

      // Determine node type based on tokens present
      if (ctx.SquareStart) {
        nodeType = 'square';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      } else if (ctx.DoubleCircleStart) {
        nodeType = 'doublecircle';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      } else if (ctx.CircleStart) {
        nodeType = 'circle';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      } else if (ctx.PS) {
        nodeType = 'round';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      } else if (ctx.HexagonStart) {
        nodeType = 'hexagon';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      } else if (ctx.DiamondStart) {
        nodeType = 'diamond';
        if (nodeTexts.length > 0) {
          nodeText = this.visit(nodeTexts[0]);
        }
      }

      // Add vertex to the graph
      this.vertices[nodeId] = {
        id: nodeId,
        text: nodeText,
        type: nodeType,
        classes: [],
      };
      this.lastNodeId = nodeId;
    }
  }

  // Helper to get last processed node ID
  private lastNodeId = '';
  private getLastNodeId(): string {
    return this.lastNodeId;
  }

  nodeStatement(ctx: any): void {
    const nodes: string[] = [];

    // Process first node
    const firstNode = this.visit(ctx.nodeDefinition[0]);
    nodes.push(firstNode.id);

    // Process additional nodes (connected with &)
    if (ctx.Ampersand) {
      ctx.nodeDefinition.slice(1).forEach((nodeDef: any) => {
        const node = this.visit(nodeDef);
        nodes.push(node.id);
      });
    }

    // Process link chain if present
    if (ctx.linkChain) {
      const linkedNodes = this.visit(ctx.linkChain);
      // Create edges between nodes
      const startNodes = nodes;
      linkedNodes.forEach((linkData: any) => {
        const edge: any = {
          start: startNodes,
          end: linkData.node,
          type: linkData.linkType,
          text: linkData.linkText,
        };

        // Include length property if present
        if (linkData.linkLength) {
          edge.length = linkData.linkLength;
        }

        this.edges.push(edge);
      });
    }
  }

  nodeDefinition(ctx: any): any {
    const nodeId = this.visit(ctx.nodeId);
    let text = nodeId;
    let type = 'default';

    if (ctx.nodeShape) {
      const shapeData = this.visit(ctx.nodeShape);
      text = shapeData.text || nodeId;
      type = shapeData.type;
    }

    // Add to vertices if not exists
    if (!this.vertices[nodeId]) {
      this.vertices[nodeId] = {
        id: nodeId,
        text: text,
        type: type,
      };
    }

    // Handle style class
    if (ctx.StyleSeparator && ctx.className) {
      const className = this.visit(ctx.className);
      this.vertices[nodeId].classes = [className];
    }

    return { id: nodeId, text, type };
  }

  nodeId(ctx: any): string {
    let nodeId = '';

    if (ctx.NODE_STRING) {
      nodeId = ctx.NODE_STRING[0].image;
    } else if (ctx.NumberToken) {
      nodeId = ctx.NumberToken[0].image;
    } else if (ctx.Default) {
      nodeId = ctx.Default[0].image;
    } else if (ctx.Ampersand) {
      // Standalone & (uses CONSUME2)
      nodeId = ctx.Ampersand[0].image;
    } else if (ctx.Minus) {
      // Standalone - (uses CONSUME2)
      nodeId = ctx.Minus[0].image;
    } else if (ctx.DirectionValue) {
      // Standalone direction value (uses CONSUME2)
      nodeId = ctx.DirectionValue[0].image;
    } else if (ctx.Colon) {
      // Standalone : character
      nodeId = ctx.Colon[0].image;
    } else if (ctx.Comma) {
      // Standalone , character
      nodeId = ctx.Comma[0].image;
    }

    // If no nodeId was found, it might be an empty context
    if (!nodeId) {
      throw new Error('Unable to parse node ID from context');
    }

    // Validate node ID against reserved keywords
    this.validateNodeId(nodeId);

    return nodeId;
  }

private validateNodeId(nodeId: string): void {
|
||||
// Keywords that should throw errors when used as node ID prefixes
|
||||
const errorKeywords = [
|
||||
'graph',
|
||||
'flowchart',
|
||||
'flowchart-elk',
|
||||
'style',
|
||||
'linkStyle',
|
||||
'interpolate',
|
||||
'classDef',
|
||||
'class',
|
||||
'_self',
|
||||
'_blank',
|
||||
'_parent',
|
||||
'_top',
|
||||
'end',
|
||||
'subgraph',
|
||||
];
|
||||
|
||||
// Check if node ID starts with any error keyword followed by a delimiter
|
||||
// This matches the original JISON parser behavior where keywords are only
|
||||
// rejected when followed by delimiters like '.', '-', '/', etc.
|
||||
for (const keyword of errorKeywords) {
|
||||
if (nodeId.startsWith(keyword)) {
|
||||
// Allow if the keyword is not followed by a delimiter (e.g., "endpoint" is OK, "end.node" is not)
|
||||
const afterKeyword = nodeId.substring(keyword.length);
|
||||
if (afterKeyword.length === 0 || /^[./\-]/.test(afterKeyword)) {
|
||||
throw new Error(`Node ID cannot start with reserved keyword: ${keyword}`);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
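The delimiter rule above is easy to exercise in isolation. Below is a minimal standalone sketch of the same check, testable without the parser class; `isReservedPrefix` is a hypothetical helper name and the keyword list is abbreviated:

```typescript
// Standalone sketch of the validateNodeId rule above: a reserved keyword only
// invalidates a node ID when it is followed by end-of-string or a delimiter.
const errorKeywords = ['graph', 'flowchart', 'style', 'end', 'subgraph'];

function isReservedPrefix(nodeId: string): boolean {
  for (const keyword of errorKeywords) {
    if (nodeId.startsWith(keyword)) {
      const afterKeyword = nodeId.substring(keyword.length);
      // Bare keyword, or keyword followed by '.', '/', or '-', is rejected.
      if (afterKeyword.length === 0 || /^[./\-]/.test(afterKeyword)) {
        return true;
      }
    }
  }
  return false;
}

// 'endpoint' merely starts with 'end', so it is a legal node ID;
// 'end.node' has a delimiter right after the keyword, so it is rejected.
```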

  nodeShape(ctx: any): any {
    if (ctx.squareShape) {
      return this.visit(ctx.squareShape);
    } else if (ctx.circleShape) {
      return this.visit(ctx.circleShape);
    } else if (ctx.diamondShape) {
      return this.visit(ctx.diamondShape);
    }
    return { type: 'default', text: '' };
  }

  squareShape(ctx: any): any {
    const text = this.visit(ctx.nodeText);
    return { type: 'square', text };
  }

  circleShape(ctx: any): any {
    const text = this.visit(ctx.nodeText);
    return { type: 'doublecircle', text };
  }

  diamondShape(ctx: any): any {
    const text = this.visit(ctx.nodeText);
    return { type: 'diamond', text };
  }

  nodeText(ctx: any): string {
    if (ctx.TextContent) {
      return ctx.TextContent[0].image;
    } else if (ctx.NODE_STRING) {
      return ctx.NODE_STRING[0].image;
    } else if (ctx.StringContent) {
      return ctx.StringContent[0].image;
    } else if (ctx.QuotedString) {
      // Remove quotes from quoted string
      const quoted = ctx.QuotedString[0].image;
      return quoted.slice(1, -1);
    } else if (ctx.NumberToken) {
      return ctx.NumberToken[0].image;
    }
    return '';
  }

  linkChain(ctx: any): any[] {
    const links: any[] = [];

    for (const [i, link] of ctx.link.entries()) {
      const linkData = this.visit(link);
      const nodeData = this.visit(ctx.nodeDefinition[i]);

      const linkInfo: any = {
        linkType: linkData.type,
        linkText: linkData.text,
        node: nodeData.id,
      };

      // Include length property if present
      if (linkData.length) {
        linkInfo.linkLength = linkData.length;
      }

      links.push(linkInfo);
    }

    return links;
  }

  link(ctx: any): any {
    let linkData = { type: 'arrow', text: '' };

    if (ctx.linkStatement) {
      linkData = this.visit(ctx.linkStatement);
    } else if (ctx.linkWithEdgeText) {
      linkData = this.visit(ctx.linkWithEdgeText);
    } else if (ctx.linkWithArrowText) {
      linkData = this.visit(ctx.linkWithArrowText);
    }

    return linkData;
  }

  linkStatement(ctx: any): any {
    if (ctx.LINK) {
      return this.parseLinkToken(ctx.LINK[0]);
    } else if (ctx.THICK_LINK) {
      return this.parseLinkToken(ctx.THICK_LINK[0]);
    } else if (ctx.DOTTED_LINK) {
      return this.parseLinkToken(ctx.DOTTED_LINK[0]);
    }
    return { type: 'arrow', text: '' };
  }

  parseLinkToken(token: IToken): any {
    const image = token.image;
    let type = 'arrow_point'; // Default for --> arrows
    let length: string | undefined = undefined;

    // Check for bidirectional arrows first
    if (image.startsWith('<') && image.endsWith('>')) {
      if (image.includes('.')) {
        type = 'double_arrow_dotted';
      } else if (image.includes('=')) {
        type = 'double_arrow_thick';
      } else {
        type = 'double_arrow_point';
      }
      return { type, text: '', length };
    }

    // Determine link type based on pattern
    if (image.includes('.')) {
      type = 'arrow_dotted';
    } else if (image.includes('=')) {
      type = 'arrow_thick';
    }

    // Check for special endings
    if (image.endsWith('o')) {
      type = type.replace('_point', '_circle');
      type = type.replace('_dotted', '_dotted_circle');
      type = type.replace('_thick', '_thick_circle');
    } else if (image.endsWith('x')) {
      type = type.replace('_point', '_cross');
      type = type.replace('_dotted', '_dotted_cross');
      type = type.replace('_thick', '_thick_cross');
    } else if (image.endsWith('-') && !image.includes('.') && !image.includes('=')) {
      type = 'arrow_open'; // Open arrow (no arrowhead)
    }

    // Determine arrow length based on number of dashes
    if (image.includes('-')) {
      const dashCount = (image.match(/-/g) || []).length;
      if (dashCount >= 6) {
        length = 'extralong'; // cspell:disable-line
      } else if (dashCount >= 4) {
        length = 'long';
      }
    }

    const result: any = { type, text: '' };
    if (length) {
      result.length = length;
    }
    return result;
  }
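The method above derives an edge type and length purely from the token image. The same branch logic can be sketched as a standalone function, testable without Chevrotain; `classifyLink` is a hypothetical name, not a parser export:

```typescript
// Standalone re-implementation of the parseLinkToken branches above,
// operating directly on the raw arrow text.
function classifyLink(image: string): { type: string; length?: string } {
  // Bidirectional arrows short-circuit the rest of the classification.
  if (image.startsWith('<') && image.endsWith('>')) {
    if (image.includes('.')) return { type: 'double_arrow_dotted' };
    if (image.includes('=')) return { type: 'double_arrow_thick' };
    return { type: 'double_arrow_point' };
  }

  let type = 'arrow_point';
  if (image.includes('.')) type = 'arrow_dotted';
  else if (image.includes('=')) type = 'arrow_thick';

  // Special endings: circle, cross, or open (no arrowhead).
  if (image.endsWith('o')) {
    type = type
      .replace('_point', '_circle')
      .replace('_dotted', '_dotted_circle')
      .replace('_thick', '_thick_circle');
  } else if (image.endsWith('x')) {
    type = type
      .replace('_point', '_cross')
      .replace('_dotted', '_dotted_cross')
      .replace('_thick', '_thick_cross');
  } else if (image.endsWith('-') && !image.includes('.') && !image.includes('=')) {
    type = 'arrow_open';
  }

  // Length is encoded in the dash count.
  const dashCount = (image.match(/-/g) || []).length;
  if (dashCount >= 6) return { type, length: 'extralong' }; // cspell:disable-line
  if (dashCount >= 4) return { type, length: 'long' };
  return { type };
}
```

For example, `'-->'` classifies as `arrow_point`, `'-.->'` as `arrow_dotted`, and seven dashes promote the length to `extralong`.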

  linkWithEdgeText(ctx: any): any {
    let text = '';
    if (ctx.edgeText) {
      text = this.visit(ctx.edgeText);
    }

    // Get the link type from the START_* token or EdgeTextEnd token
    let linkData: any = { type: 'arrow', text: '' };

    // Check for bidirectional arrows first
    if (ctx.START_LINK && ctx.EdgeTextEnd) {
      const startToken = ctx.START_LINK[0].image;
      const endToken = ctx.EdgeTextEnd[0].image;

      if (startToken.includes('<') && endToken.includes('>')) {
        if (startToken.includes('.') || endToken.includes('.')) {
          linkData = { type: 'double_arrow_dotted', text: '' };
        } else if (startToken.includes('=') || endToken.includes('=')) {
          linkData = { type: 'double_arrow_thick', text: '' };
        } else {
          linkData = { type: 'double_arrow_point', text: '' };
        }
      } else {
        linkData = { type: 'arrow_point', text: '' };

        // Check for arrow length in the START_LINK token
        const dashCount = (startToken.match(/-/g) || []).length;
        if (dashCount >= 6) {
          linkData.length = 'extralong'; // cspell:disable-line
        } else if (dashCount >= 4) {
          linkData.length = 'long';
        }
      }
    } else if (ctx.START_DOTTED_LINK) {
      linkData = { type: 'arrow_dotted', text: '' };
    } else if (ctx.START_THICK_LINK) {
      linkData = { type: 'arrow_thick', text: '' };
    } else if (ctx.START_LINK) {
      linkData = { type: 'arrow_point', text: '' };
    } else if (ctx.EdgeTextEnd) {
      linkData = this.parseLinkToken(ctx.EdgeTextEnd[0]);
    }

    linkData.text = text;
    return linkData;
  }

  linkWithArrowText(ctx: any): any {
    // Get the link type from the link token
    let linkData: any = { type: 'arrow', text: '' };
    if (ctx.LINK) {
      linkData = this.parseLinkToken(ctx.LINK[0]);
    } else if (ctx.THICK_LINK) {
      linkData = this.parseLinkToken(ctx.THICK_LINK[0]);
    } else if (ctx.DOTTED_LINK) {
      linkData = this.parseLinkToken(ctx.DOTTED_LINK[0]);
    }

    // Get the arrow text
    if (ctx.arrowText) {
      linkData.text = this.visit(ctx.arrowText);
    }

    return linkData;
  }

  edgeText(ctx: any): string {
    let text = '';
    if (ctx.EdgeTextContent) {
      ctx.EdgeTextContent.forEach((token: IToken) => {
        text += token.image;
      });
    }
    // Note: EdgeTextPipe tokens (|) are delimiters and should not be included in the text
    if (ctx.NODE_STRING) {
      ctx.NODE_STRING.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.EDGE_TEXT) {
      return ctx.EDGE_TEXT[0].image;
    } else if (ctx.String) {
      return ctx.String[0].image.slice(1, -1); // Remove quotes
    } else if (ctx.MarkdownString) {
      return ctx.MarkdownString[0].image.slice(2, -2); // Remove markdown quotes
    }
    return text;
  }

  linkText(ctx: any): string {
    let text = '';
    if (ctx.NodeText) {
      ctx.NodeText.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.String) {
      ctx.String.forEach((token: IToken) => {
        text += token.image.slice(1, -1); // Remove quotes
      });
    }
    if (ctx.MarkdownString) {
      ctx.MarkdownString.forEach((token: IToken) => {
        text += token.image.slice(2, -2); // Remove markdown quotes
      });
    }
    return text;
  }

  arrowText(ctx: any): string {
    if (ctx.text) {
      return this.visit(ctx.text);
    }
    return '';
  }

  text(ctx: any): string {
    let text = '';
    if (ctx.TextContent) {
      ctx.TextContent.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.NODE_STRING) {
      ctx.NODE_STRING.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.NumberToken) {
      ctx.NumberToken.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.WhiteSpace) {
      ctx.WhiteSpace.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.Colon) {
      ctx.Colon.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.Minus) {
      ctx.Minus.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.Ampersand) {
      ctx.Ampersand.forEach((token: IToken) => {
        text += token.image;
      });
    }
    if (ctx.QuotedString) {
      ctx.QuotedString.forEach((token: IToken) => {
        // Remove quotes from quoted string
        text += token.image.slice(1, -1);
      });
    }
    return text;
  }

  styleStatement(ctx: any): void {
    const nodeId = this.visit(ctx.nodeId);
    const styles = this.visit(ctx.styleList);

    if (this.vertices[nodeId]) {
      // Ensure styles is an array before calling join
      const styleArray = Array.isArray(styles) ? styles : [styles];
      this.vertices[nodeId].style = styleArray.join(',');
    }
  }

  classDefStatement(ctx: any): void {
    const className = this.visit(ctx.className);
    const styles = this.visit(ctx.styleList);

    // Ensure styles is an array before calling join
    const styleArray = Array.isArray(styles) ? styles : [styles];
    this.classes[className] = styleArray.join(',');
  }

  classStatement(ctx: any): void {
    const nodeIds = this.visit(ctx.nodeIdList);
    const className = this.visit(ctx.className);

    nodeIds.forEach((nodeId: string) => {
      if (this.vertices[nodeId]) {
        this.vertices[nodeId].classes = [className];
      }
    });
  }

  clickStatement(ctx: any): void {
    const nodeId = this.visit(ctx.nodeId);

    if (ctx.clickHref) {
      const hrefData = this.visit(ctx.clickHref);
      this.clickEvents.push({
        id: nodeId,
        type: 'href',
        href: hrefData.href,
        target: hrefData.target,
      });
    } else if (ctx.clickCall) {
      const callData = this.visit(ctx.clickCall);
      this.clickEvents.push({
        id: nodeId,
        type: 'call',
        functionName: callData.functionName,
        args: callData.args,
      });
    }

    // Handle tooltip
    if (ctx.String) {
      const tooltip = ctx.String[0].image.slice(1, -1);
      this.tooltips[nodeId] = tooltip;
    }
  }

  subgraphStatement(ctx: any): void {
    const subgraph: any = {
      id: `subGraph${this.subCount++}`,
      title: '',
      nodes: [],
    };

    if (ctx.subgraphId) {
      subgraph.id = this.visit(ctx.subgraphId);
    }

    if (ctx.nodeText) {
      subgraph.title = this.visit(ctx.nodeText);
    } else if (ctx.QuotedString) {
      subgraph.title = ctx.QuotedString[0].image.slice(1, -1); // Remove quotes
    }

    // Store current state
    const prevVertices = this.vertices;

    // Process subgraph statements
    if (ctx.statement) {
      ctx.statement.forEach((stmt: any) => this.visit(stmt));
    }

    // Collect nodes added in subgraph
    Object.keys(this.vertices).forEach((key) => {
      if (!prevVertices[key]) {
        subgraph.nodes.push(key);
      }
    });

    this.subGraphs.push(subgraph);
  }

  directionStatement(ctx: any): void {
    this.direction = ctx.DirectionValue[0].image;
  }

  // Helper methods for remaining rules...
  className(ctx: any): string {
    return ctx.NODE_STRING[0].image;
  }

  nodeIdList(ctx: any): string[] {
    const ids = [this.visit(ctx.nodeId[0])];
    if (ctx.nodeId.length > 1) {
      ctx.nodeId.slice(1).forEach((node: any) => {
        ids.push(this.visit(node));
      });
    }
    return ids;
  }

  styleList(ctx: any): string[] {
    const styles: string[] = [];
    if (ctx.style) {
      ctx.style.forEach((style: any) => {
        styles.push(this.visit(style));
      });
    }
    return styles;
  }

  style(ctx: any): string {
    // Collect all tokens with their positions, excluding semicolons
    const allTokens: IToken[] = [];

    Object.keys(ctx).forEach((key) => {
      if (ctx[key] && Array.isArray(ctx[key]) && key !== 'Semicolon') {
        allTokens.push(...ctx[key]);
      }
    });

    // Sort tokens by their position in the input
    allTokens.sort((a, b) => a.startOffset - b.startOffset);

    // Concatenate tokens in order
    return allTokens.map((token) => token.image).join('');
  }
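The style visitor above reassembles the original style string by flattening all child token arrays and sorting on `startOffset`. A small sketch of that reassembly, with plain objects standing in for Chevrotain's `IToken` (`joinByOffset` and the token shapes are hypothetical, for illustration only):

```typescript
// Minimal stand-in for chevrotain's IToken: only the fields used here.
interface TokenLike {
  image: string;
  startOffset: number;
}

// Reassemble source text from out-of-order token groups, mirroring the
// sort-by-startOffset approach used by the style() visitor.
function joinByOffset(groups: TokenLike[][]): string {
  const all = groups.flat();
  all.sort((a, b) => a.startOffset - b.startOffset);
  return all.map((t) => t.image).join('');
}

// 'fill', ':' and '#f9f' arrive in separate CST arrays,
// but their offsets recover the original source order.
const css = joinByOffset([
  [{ image: '#f9f', startOffset: 5 }],
  [{ image: 'fill', startOffset: 0 }],
  [{ image: ':', startOffset: 4 }],
]);
// css === 'fill:#f9f'
```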

  standaloneLinkStatement(ctx: any): void {
    const startNodeId = this.visit(ctx.nodeId[0]);
    const endNodeId = this.visit(ctx.nodeId[1]);
    const linkData = this.visit(ctx.link);

    const edge: any = {
      start: startNodeId,
      end: endNodeId,
      type: linkData.type,
      text: linkData.text,
    };

    // Include length property if present
    if (linkData.length) {
      edge.length = linkData.length;
    }

    this.edges.push(edge);
  }

  // Missing visitor methods
  linkStyleStatement(_ctx: any): void {
    // Handle link style statements
    // TODO: Implement link styling
  }

  linkIndexList(_ctx: any): number[] {
    // Handle link index lists
    // TODO: Implement link index parsing
    return [];
  }

  numberList(_ctx: any): number[] {
    // Handle number lists
    // TODO: Implement number list parsing
    return [];
  }

  accStatement(ctx: any): void {
    if (ctx.AccTitle && ctx.AccTitleValue) {
      this.accTitle = ctx.AccTitleValue[0].image.trim();
    } else if (ctx.AccDescr && ctx.AccDescrValue) {
      this.accDescription = ctx.AccDescrValue[0].image.trim();
    } else if (ctx.AccDescrMultiline && ctx.AccDescrMultilineValue) {
      this.accDescription = ctx.AccDescrMultilineValue[0].image.trim();
    }
  }

  clickHref(ctx: any): any {
    let href = '';
    if (ctx.NODE_STRING) {
      href = ctx.NODE_STRING[0].image;
    } else if (ctx.QuotedString) {
      href = ctx.QuotedString[0].image.slice(1, -1); // Remove quotes
    }
    return {
      href: href,
      target: undefined,
    };
  }

  clickCall(ctx: any): any {
    let functionName = '';

    if (ctx.Call) {
      if (ctx.NODE_STRING) {
        functionName = ctx.NODE_STRING[0].image;
      } else if (ctx.QuotedString) {
        functionName = ctx.QuotedString[0].image.slice(1, -1); // Remove quotes
      }
      return {
        functionName: functionName,
        args: [], // TODO: Parse arguments if present
      };
    } else if (ctx.Callback) {
      if (ctx.NODE_STRING) {
        functionName = ctx.NODE_STRING[0].image;
      } else if (ctx.QuotedString) {
        functionName = ctx.QuotedString[0].image.slice(1, -1); // Remove quotes
      } else if (ctx.StringStart && ctx.StringContent && ctx.StringEnd) {
        functionName = ctx.StringContent[0].image; // String content without quotes
      }
      return {
        functionName: functionName,
        args: [],
      };
    }
    return {
      functionName: '',
      args: [],
    };
  }

  subgraphId(ctx: any): string {
    if (ctx.NODE_STRING) {
      return ctx.NODE_STRING[0].image;
    } else if (ctx.QuotedString) {
      return ctx.QuotedString[0].image.slice(1, -1); // Remove quotes
    } else if (ctx.StringStart && ctx.StringContent && ctx.StringEnd) {
      return ctx.StringContent[0].image; // String content without quotes
    }
    return '';
  }

  // Return the complete AST
  getAST() {
    return {
      vertices: this.vertices,
      edges: this.edges,
      classes: this.classes,
      subGraphs: this.subGraphs,
      direction: this.direction,
      clickEvents: this.clickEvents,
      tooltips: this.tooltips,
      accTitle: this.accTitle,
      accDescription: this.accDescription,
    };
  }
}
855
packages/mermaid/src/diagrams/flowchart/parser/flowLexer.ts
Normal file
@@ -0,0 +1,855 @@

import { createToken, Lexer } from 'chevrotain';

// Debug flag for lexer logging
const DEBUG_LEXER = false; // Set to true to enable debug logging

// ============================================================================
// JISON TO CHEVROTAIN MULTI-MODE LEXER IMPLEMENTATION
// Following the instructions to implement all Jison states as Chevrotain modes
// Based on flow.jison lines 9-28 and state transitions throughout the file
// ============================================================================

// ============================================================================
// SHARED TOKENS (used across multiple modes)
// ============================================================================

// Whitespace and comments (skipped in all modes)
const WhiteSpace = createToken({
  name: 'WhiteSpace',
  pattern: /[\t ]+/, // Only spaces and tabs, not newlines
  group: Lexer.SKIPPED,
});

const Comment = createToken({
  name: 'Comment',
  pattern: /%%[^\n]*/,
  group: Lexer.SKIPPED,
});

// Basic structural tokens
const Newline = createToken({
  name: 'Newline',
  pattern: /(\r?\n)+/,
});

const Semicolon = createToken({
  name: 'Semicolon',
  pattern: /;/,
});

const Space = createToken({
  name: 'Space',
  pattern: /\s/,
});

const EOF = createToken({
  name: 'EOF',
  pattern: Lexer.NA,
});

// ============================================================================
// NODE STRING AND IDENTIFIERS
// ============================================================================

// Node string pattern from JISON lines 205-207.
// Modified to include special characters and handle minus-character edge cases:
// allows '-' in node IDs, including standalone '-', minus-at-start, and
// minus-at-end patterns, while avoiding conflicts with link tokens via
// negative lookahead for link patterns.
// Handles compound cases like &node, -node, vnode, where special characters
// are followed by word characters.
const NODE_STRING = createToken({
  name: 'NODE_STRING',
  pattern:
    /\\\w+|\w+\\|&[\w!"#$%&'*+,./:?\\`]+[\w!"#$%&'*+,./:?\\`-]*|-[\w!"#$%&'*+,./:?\\`]+[\w!"#$%&'*+,./:?\\`-]*|[<>^v][\w!"#$%&'*+,./:?\\`]+[\w!"#$%&'*+,./:?\\`-]*|[\w!"#$%&'*+,./:?\\`](?:[\w!"#$%&'*+,./:?\\`]|-(?![.=-])|\.(?!-))*[\w!"#$%&'*+,./:?\\`]|[\w!"#$%&'*+,./:?\\`]|&|-|\\|\//,
});

// ============================================================================
// KEYWORDS (with longer_alt to handle conflicts)
// ============================================================================

const Graph = createToken({
  name: 'Graph',
  pattern: /graph|flowchart|flowchart-elk/i,
  longer_alt: NODE_STRING,
});

const Subgraph = createToken({
  name: 'Subgraph',
  pattern: /subgraph/i,
  longer_alt: NODE_STRING,
});

const End = createToken({
  name: 'End',
  pattern: /end/i,
  longer_alt: NODE_STRING,
});

const Style = createToken({
  name: 'Style',
  pattern: /style/i,
  longer_alt: NODE_STRING,
});

const LinkStyle = createToken({
  name: 'LinkStyle',
  pattern: /linkstyle/i,
  longer_alt: NODE_STRING,
});

const ClassDef = createToken({
  name: 'ClassDef',
  pattern: /classdef/i,
  longer_alt: NODE_STRING,
});

const Class = createToken({
  name: 'Class',
  pattern: /class/i,
  longer_alt: NODE_STRING,
});

const Click = createToken({
  name: 'Click',
  pattern: /click/i,
  longer_alt: NODE_STRING,
});

const Href = createToken({
  name: 'Href',
  pattern: /href/i,
  longer_alt: NODE_STRING,
});

const Callback = createToken({
  name: 'Callback',
  pattern: /callback/i,
  longer_alt: NODE_STRING,
});

const Call = createToken({
  name: 'Call',
  pattern: /call/i,
  longer_alt: NODE_STRING,
});

const Default = createToken({
  name: 'Default',
  pattern: /default/i,
  longer_alt: NODE_STRING,
});

// ============================================================================
// DIRECTION TOKENS (JISON lines 127-137)
// ============================================================================

const DirectionValue = createToken({
  name: 'DirectionValue',
  pattern: /LR|RL|TB|BT|TD|BR|<|>|\^|v/,
});

// ============================================================================
// ACCESSIBILITY TOKENS (JISON lines 31-37)
// ============================================================================

// Mode-switching tokens for accessibility
const AccTitle = createToken({
  name: 'AccTitle',
  pattern: /accTitle\s*:\s*/,
  push_mode: 'acc_title_mode',
});

const AccDescr = createToken({
  name: 'AccDescr',
  pattern: /accDescr\s*:\s*/,
  push_mode: 'acc_descr_mode',
});

const AccDescrMultiline = createToken({
  name: 'AccDescrMultiline',
  pattern: /accDescr\s*{\s*/,
  push_mode: 'acc_descr_multiline_mode',
});
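The `push_mode`/`pop_mode` options on these tokens drive Chevrotain's lexer mode stack, which mirrors Jison's `pushState()`/`popState()` LIFO discipline described in instructions.md. A toy model of that stack, independent of Chevrotain (`ModeStack` is an illustrative name, not a library API):

```typescript
// Toy model of the LIFO mode stack that push_mode / pop_mode manipulate.
class ModeStack {
  private stack: string[] = ['initial_mode'];

  push(mode: string): void {
    this.stack.push(mode);
  }

  pop(): string {
    // Never pop the initial mode off the stack.
    if (this.stack.length > 1) {
      this.stack.pop();
    }
    return this.current();
  }

  current(): string {
    return this.stack[this.stack.length - 1];
  }
}

const modes = new ModeStack();
modes.push('acc_descr_multiline_mode'); // e.g. on matching AccDescrMultiline
// ...tokens are now matched against the multiline-description mode...
modes.pop(); // e.g. on matching AccDescrMultilineEnd, back to initial_mode
```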
|
||||
|
||||
// ============================================================================
|
||||
// STRING TOKENS (JISON lines 82-87)
|
||||
// ============================================================================
|
||||
|
||||
// Mode-switching tokens for strings
|
||||
const StringStart = createToken({
|
||||
name: 'StringStart',
|
||||
pattern: /"/,
|
||||
push_mode: 'string_mode',
|
||||
});
|
||||
|
||||
const MarkdownStringStart = createToken({
|
||||
name: 'MarkdownStringStart',
|
||||
pattern: /"`/,
|
||||
push_mode: 'md_string_mode',
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// SHAPE DATA TOKENS (JISON lines 41-64)
|
||||
// ============================================================================
|
||||
|
||||
const ShapeDataStart = createToken({
|
||||
name: 'ShapeDataStart',
|
||||
pattern: /@{/,
|
||||
push_mode: 'shapeData_mode',
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// LINK TOKENS (JISON lines 154-164)
|
||||
// ============================================================================
|
||||
|
||||
const LINK = createToken({
|
||||
name: 'LINK',
|
||||
pattern: /\s*[<ox]?--+[>ox-]\s*/,
|
||||
});
|
||||
|
||||
const START_LINK = createToken({
|
||||
name: 'START_LINK',
|
||||
pattern: /\s*[<ox]?--\s*/,
|
||||
push_mode: 'edgeText_mode',
|
||||
});
|
||||
|
||||
const THICK_LINK = createToken({
|
||||
name: 'THICK_LINK',
|
||||
pattern: /\s*[<ox]?==+[=>ox-]?\s*/,
|
||||
});
|
||||
|
||||
const START_THICK_LINK = createToken({
|
||||
name: 'START_THICK_LINK',
|
||||
pattern: /\s*[<ox]?==(?=\s*\|)\s*/,
|
||||
push_mode: 'thickEdgeText_mode',
|
||||
});
|
||||
|
||||
const DOTTED_LINK = createToken({
|
||||
name: 'DOTTED_LINK',
|
||||
pattern: /\s*[<ox]?-?\.+-[>ox-]?\s*/,
|
||||
});
|
||||
|
||||
const START_DOTTED_LINK = createToken({
|
||||
name: 'START_DOTTED_LINK',
|
||||
pattern: /\s*[<ox]?-\.(?!-)\s*/,
|
||||
push_mode: 'dottedEdgeText_mode',
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// SHAPE TOKENS (JISON lines 169-194)
|
||||
// ============================================================================
|
||||
|
||||
// Mode-switching tokens for shapes
|
||||
const SquareStart = createToken({
|
||||
name: 'SquareStart',
|
||||
pattern: /\[/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
const PS = createToken({
|
||||
name: 'PS',
|
||||
pattern: /\(/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
// Circle and double circle tokens (must come before PS)
|
||||
const DoubleCircleStart = createToken({
|
||||
name: 'DoubleCircleStart',
|
||||
pattern: /\({3}/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
const CircleStart = createToken({
|
||||
name: 'CircleStart',
|
||||
pattern: /\(\(/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
// Hexagon tokens
|
||||
const HexagonStart = createToken({
|
||||
name: 'HexagonStart',
|
||||
pattern: /{{/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
const DiamondStart = createToken({
|
||||
name: 'DiamondStart',
|
||||
pattern: /{/,
|
||||
push_mode: 'text_mode',
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// BASIC PUNCTUATION
|
||||
// ============================================================================
|
||||
|
||||
const Colon = createToken({
|
||||
name: 'Colon',
|
||||
pattern: /:/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Comma = createToken({
|
||||
name: 'Comma',
|
||||
pattern: /,/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Pipe = createToken({
|
||||
name: 'Pipe',
|
||||
pattern: /\|/,
|
||||
});
|
||||
|
||||
const Ampersand = createToken({
|
||||
name: 'Ampersand',
|
||||
pattern: /&/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Minus = createToken({
|
||||
name: 'Minus',
|
||||
pattern: /-/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
// Additional special character tokens for node IDs
|
||||
const Hash = createToken({
|
||||
name: 'Hash',
|
||||
pattern: /#/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Asterisk = createToken({
|
||||
name: 'Asterisk',
|
||||
pattern: /\*/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Dot = createToken({
|
||||
name: 'Dot',
|
||||
pattern: /\./,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
// Backslash token removed - handled entirely by NODE_STRING
|
||||
|
||||
const Slash = createToken({
|
||||
name: 'Slash',
|
||||
pattern: /\//,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const Underscore = createToken({
|
||||
name: 'Underscore',
|
||||
pattern: /_/,
|
||||
longer_alt: NODE_STRING,
|
||||
});
|
||||
|
||||
const NumberToken = createToken({
|
||||
name: 'NumberToken',
|
||||
pattern: /\d+/,
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// MODE-SPECIFIC TOKENS
|
||||
// ============================================================================
|
||||
|
||||
// Tokens for acc_title mode (JISON line 32)
|
||||
const AccTitleValue = createToken({
|
||||
name: 'AccTitleValue',
|
||||
pattern: /[^\n]+/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for acc_descr mode (JISON line 34)
|
||||
const AccDescrValue = createToken({
|
||||
name: 'AccDescrValue',
|
||||
pattern: /[^\n]+/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for acc_descr_multiline mode (JISON lines 36-37)
|
||||
const AccDescrMultilineValue = createToken({
|
||||
name: 'AccDescrMultilineValue',
|
||||
pattern: /[^}]+/,
|
||||
});
|
||||
|
||||
const AccDescrMultilineEnd = createToken({
|
||||
name: 'AccDescrMultilineEnd',
|
||||
pattern: /}/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for string mode (JISON lines 85-86)
|
||||
const StringContent = createToken({
|
||||
name: 'StringContent',
|
||||
pattern: /[^"]+/,
|
||||
});
|
||||
|
||||
const StringEnd = createToken({
|
||||
name: 'StringEnd',
|
||||
pattern: /"/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for md_string mode (JISON lines 82-83)
|
||||
const MarkdownStringContent = createToken({
|
||||
name: 'MarkdownStringContent',
|
||||
pattern: /[^"`]+/,
|
||||
});
|
||||
|
||||
const MarkdownStringEnd = createToken({
|
||||
name: 'MarkdownStringEnd',
|
||||
pattern: /`"/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for text mode (JISON lines 272-283)
|
||||
const TextContent = createToken({
|
||||
name: 'TextContent',
|
||||
pattern: /[^"()[\]{|}]+/,
|
||||
});
|
||||
|
||||
const QuotedString = createToken({
|
||||
name: 'QuotedString',
|
||||
pattern: /"[^"]*"/,
|
||||
});
|
||||
|
||||
const SquareEnd = createToken({
|
||||
name: 'SquareEnd',
|
||||
pattern: /]/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
const PE = createToken({
|
||||
name: 'PE',
|
||||
pattern: /\)/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Circle and double circle end tokens (must come before PE)
|
||||
const DoubleCircleEnd = createToken({
|
||||
name: 'DoubleCircleEnd',
|
||||
pattern: /\){3}/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
const CircleEnd = createToken({
|
||||
name: 'CircleEnd',
|
||||
pattern: /\)\)/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Hexagon end token
|
||||
const HexagonEnd = createToken({
|
||||
name: 'HexagonEnd',
|
||||
pattern: /}}/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
const DiamondEnd = createToken({
|
||||
name: 'DiamondEnd',
|
||||
pattern: /}/,
|
||||
pop_mode: true,
|
||||
});
|
||||
|
||||
// Tokens for edge text modes (JISON lines 156, 160, 164)
|
||||
const EdgeTextContent = createToken({
|
||||
name: 'EdgeTextContent',
|
||||
pattern: /[^|-]+/,
|
||||
});
|
||||
|
||||
const EdgeTextPipe = createToken({
|
||||
name: 'EdgeTextPipe',
|
||||
pattern: /\|/,
|
||||
});
|
||||
|
||||
const EdgeTextEnd = createToken({
|
||||
name: 'EdgeTextEnd',
|
||||
pattern: /(-+[>ox-])|(=+[=>ox])/,
|
||||
pop_mode: true,
|
||||
});

// Tokens for shapeData mode (JISON lines 57-64)
const ShapeDataContent = createToken({
  name: 'ShapeDataContent',
  pattern: /[^"}]+/,
});

const ShapeDataStringStart = createToken({
  name: 'ShapeDataStringStart',
  pattern: /"/,
  push_mode: 'shapeDataStr_mode',
});

const ShapeDataEnd = createToken({
  name: 'ShapeDataEnd',
  pattern: /}/,
  pop_mode: true,
});

// Tokens for shapeDataStr mode (JISON lines 49-56)
const ShapeDataStringContent = createToken({
  name: 'ShapeDataStringContent',
  pattern: /[^"]+/,
});

const ShapeDataStringEnd = createToken({
  name: 'ShapeDataStringEnd',
  pattern: /"/,
  pop_mode: true,
});

// ============================================================================
// MULTI-MODE LEXER DEFINITION
// Following JISON states exactly
// ============================================================================

const multiModeLexerDefinition = {
  modes: {
    // INITIAL mode - equivalent to JISON default state
    initial_mode: [
      WhiteSpace,
      Comment,

      // Accessibility tokens (must come before other patterns)
      AccTitle,
      AccDescr,
      AccDescrMultiline,

      // Keywords (must come before NODE_STRING)
      Graph,
      Subgraph,
      End,
      Style,
      LinkStyle,
      ClassDef,
      Class,
      Click,
      Href,
      Callback,
      Call,
      Default,

      // Links (order matters for precedence - must come before DirectionValue)
      START_THICK_LINK,
      THICK_LINK,
      START_DOTTED_LINK,
      DOTTED_LINK,
      LINK,
      START_LINK,

      // Direction values (must come after LINK tokens)
      DirectionValue,

      // String starts (QuotedString must come before StringStart to avoid conflicts)
      MarkdownStringStart,
      QuotedString,
      StringStart,

      // Shape data
      ShapeDataStart,

      // Shape starts (order matters - longer patterns first)
      SquareStart,
      DoubleCircleStart,
      CircleStart,
      PS,
      HexagonStart,
      DiamondStart,

      // Basic punctuation (must come before NODE_STRING)
      Pipe,
      Colon,
      Comma,
      Ampersand,
      Minus,

      // Node strings and numbers (must come after punctuation)
      NODE_STRING,
      NumberToken,

      // Structural tokens
      Newline,
      Semicolon,
      Space,
      EOF,
    ],

    // acc_title mode (JISON line 32)
    acc_title_mode: [WhiteSpace, Comment, AccTitleValue],

    // acc_descr mode (JISON line 34)
    acc_descr_mode: [WhiteSpace, Comment, AccDescrValue],

    // acc_descr_multiline mode (JISON lines 36-37)
    acc_descr_multiline_mode: [WhiteSpace, Comment, AccDescrMultilineEnd, AccDescrMultilineValue],

    // string mode (JISON lines 85-86)
    string_mode: [StringEnd, StringContent],

    // md_string mode (JISON lines 82-83)
    md_string_mode: [MarkdownStringEnd, MarkdownStringContent],

    // text mode (JISON lines 272-283)
    text_mode: [
      WhiteSpace,
      Comment,
      SquareEnd,
      DoubleCircleEnd,
      CircleEnd,
      PE,
      HexagonEnd,
      DiamondEnd,
      QuotedString,
      Pipe, // Special handling for pipe in text mode
      TextContent,
    ],

    // edgeText mode (JISON line 156)
    edgeText_mode: [WhiteSpace, Comment, EdgeTextEnd, EdgeTextPipe, QuotedString, EdgeTextContent],

    // thickEdgeText mode (JISON line 160)
    thickEdgeText_mode: [
      WhiteSpace,
      Comment,
      EdgeTextEnd,
      EdgeTextPipe,
      QuotedString,
      EdgeTextContent,
    ],

    // dottedEdgeText mode (JISON line 164)
    dottedEdgeText_mode: [
      WhiteSpace,
      Comment,
      EdgeTextEnd,
      EdgeTextPipe,
      QuotedString,
      EdgeTextContent,
    ],

    // shapeData mode (JISON lines 57-64)
    shapeData_mode: [WhiteSpace, Comment, ShapeDataEnd, ShapeDataStringStart, ShapeDataContent],

    // shapeDataStr mode (JISON lines 49-56)
    shapeDataStr_mode: [ShapeDataStringEnd, ShapeDataStringContent],
  },

  defaultMode: 'initial_mode',
};

const FlowchartLexer = new Lexer(multiModeLexerDefinition);

// Debug wrapper for lexer tokenization
const tokenizeWithDebug = (input: string) => {
  const lexResult = FlowchartLexer.tokenize(input);

  if (DEBUG_LEXER) {
    // eslint-disable-next-line no-console
    console.debug('Errors:\n', lexResult.errors);
    // eslint-disable-next-line no-console
    console.debug(
      'Tokens:\n',
      lexResult.tokens.map((t) => [t.image, t.tokenType.name])
    );
  }

  return lexResult;
};

// Extend FlowchartLexer with debug capability
const FlowchartLexerWithDebug = {
  ...FlowchartLexer,
  tokenize: tokenizeWithDebug,
};
||||
// Export all tokens for parser use
export const allTokens = [
  // Basic tokens
  WhiteSpace,
  Comment,
  Newline,
  Semicolon,
  Space,
  EOF,

  // Node strings and identifiers
  NODE_STRING,
  NumberToken,

  // Keywords
  Graph,
  Subgraph,
  End,
  Style,
  LinkStyle,
  ClassDef,
  Class,
  Click,
  Href,
  Callback,
  Call,
  Default,

  // Direction
  DirectionValue,

  // Accessibility
  AccTitle,
  AccTitleValue,
  AccDescr,
  AccDescrValue,
  AccDescrMultiline,
  AccDescrMultilineValue,
  AccDescrMultilineEnd,

  // Strings
  StringStart,
  StringContent,
  StringEnd,
  MarkdownStringStart,
  MarkdownStringContent,
  MarkdownStringEnd,

  // Shape data
  ShapeDataStart,
  ShapeDataContent,
  ShapeDataStringStart,
  ShapeDataStringContent,
  ShapeDataStringEnd,
  ShapeDataEnd,

  // Links
  LINK,
  START_LINK,
  THICK_LINK,
  START_THICK_LINK,
  DOTTED_LINK,
  START_DOTTED_LINK,

  // Edge text
  EdgeTextContent,
  EdgeTextPipe,
  EdgeTextEnd,

  // Shapes
  SquareStart,
  SquareEnd,
  DoubleCircleStart,
  DoubleCircleEnd,
  CircleStart,
  CircleEnd,
  PS,
  PE,
  HexagonStart,
  HexagonEnd,
  DiamondStart,
  DiamondEnd,

  // Text content
  TextContent,
  QuotedString,

  // Basic punctuation
  Colon,
  Comma,
  Pipe,
  Ampersand,
  Minus,
];
export { FlowchartLexerWithDebug as FlowchartLexer };

// Export individual tokens for parser use
export {
  // Basic tokens
  WhiteSpace,
  Comment,
  Newline,
  Semicolon,
  Space,
  EOF,

  // Node strings and identifiers
  NODE_STRING,
  NumberToken,

  // Keywords
  Graph,
  Subgraph,
  End,
  Style,
  LinkStyle,
  ClassDef,
  Class,
  Click,
  Href,
  Callback,
  Call,
  Default,

  // Direction
  DirectionValue,

  // Accessibility
  AccTitle,
  AccTitleValue,
  AccDescr,
  AccDescrValue,
  AccDescrMultiline,
  AccDescrMultilineValue,
  AccDescrMultilineEnd,

  // Strings
  StringStart,
  StringContent,
  StringEnd,
  MarkdownStringStart,
  MarkdownStringContent,
  MarkdownStringEnd,

  // Shape data
  ShapeDataStart,
  ShapeDataContent,
  ShapeDataStringStart,
  ShapeDataStringContent,
  ShapeDataStringEnd,
  ShapeDataEnd,

  // Links
  LINK,
  START_LINK,
  THICK_LINK,
  START_THICK_LINK,
  DOTTED_LINK,
  START_DOTTED_LINK,

  // Edge text
  EdgeTextContent,
  EdgeTextPipe,
  EdgeTextEnd,

  // Shapes
  SquareStart,
  SquareEnd,
  DoubleCircleStart,
  DoubleCircleEnd,
  CircleStart,
  CircleEnd,
  PS,
  PE,
  HexagonStart,
  HexagonEnd,
  DiamondStart,
  DiamondEnd,

  // Text content
  TextContent,
  QuotedString,

  // Basic punctuation
  Colon,
  Comma,
  Pipe,
  Ampersand,
  Minus,
};
@@ -0,0 +1,277 @@

import { createToken, Lexer } from 'chevrotain';

// Define lexer mode names following JISON states
const MODES = {
  DEFAULT: 'default_mode',
  STRING: 'string_mode',
  MD_STRING: 'md_string_mode',
  ACC_TITLE: 'acc_title_mode',
  ACC_DESCR: 'acc_descr_mode',
  ACC_DESCR_MULTILINE: 'acc_descr_multiline_mode',
  DIR: 'dir_mode',
  VERTEX: 'vertex_mode',
  TEXT: 'text_mode',
  ELLIPSE_TEXT: 'ellipseText_mode',
  TRAP_TEXT: 'trapText_mode',
  EDGE_TEXT: 'edgeText_mode',
  THICK_EDGE_TEXT: 'thickEdgeText_mode',
  DOTTED_EDGE_TEXT: 'dottedEdgeText_mode',
  CLICK: 'click_mode',
  HREF: 'href_mode',
  CALLBACK_NAME: 'callbackname_mode',
  CALLBACK_ARGS: 'callbackargs_mode',
  SHAPE_DATA: 'shapeData_mode',
  SHAPE_DATA_STR: 'shapeDataStr_mode',
  SHAPE_DATA_END_BRACKET: 'shapeDataEndBracket_mode',
};

// Whitespace and comments (skipped in all modes)
const WhiteSpace = createToken({
  name: 'WhiteSpace',
  pattern: /\s+/,
  group: Lexer.SKIPPED,
});

const Comment = createToken({
  name: 'Comment',
  pattern: /%%[^\n]*/,
  group: Lexer.SKIPPED,
});

// Keywords - following JISON patterns exactly
// (longest alternative first: regex alternation is ordered, so 'flowchart'
// would otherwise win over 'flowchart-elk' and leave '-elk' unconsumed)
const Graph = createToken({
  name: 'Graph',
  pattern: /flowchart-elk|flowchart|graph/i,
});

const Direction = createToken({
  name: 'Direction',
  pattern: /direction/i,
});

const Subgraph = createToken({
  name: 'Subgraph',
  pattern: /subgraph/i,
});

const End = createToken({
  name: 'End',
  pattern: /end/i,
});

// Mode switching tokens - following JISON patterns exactly

// Links with edge text - following JISON lines 154-164
const LINK = createToken({
  name: 'LINK',
  pattern: /\s*[<ox]?--+[>ox-]\s*/,
});

const START_LINK = createToken({
  name: 'START_LINK',
  pattern: /\s*[<ox]?--\s*/,
});

const THICK_LINK = createToken({
  name: 'THICK_LINK',
  pattern: /\s*[<ox]?==+[=>ox]\s*/,
});

const START_THICK_LINK = createToken({
  name: 'START_THICK_LINK',
  pattern: /\s*[<ox]?==\s*/,
});

const DOTTED_LINK = createToken({
  name: 'DOTTED_LINK',
  pattern: /\s*[<ox]?-?\.+-[>ox]?\s*/,
});

const START_DOTTED_LINK = createToken({
  name: 'START_DOTTED_LINK',
  pattern: /\s*[<ox]?-\.\s*/,
});
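The six link patterns above are order-sensitive: the complete-arrow tokens must be tried before their open-ended `START_*` counterparts, which exist only to switch into an edge-text mode. A standalone sanity sketch (patterns copied from the definitions above, anchored with `^…$` here so a match means the whole arrow is consumed):

```typescript
// Link patterns copied from the token definitions above, anchored for testing.
const LINK_RE = /^\s*[<ox]?--+[>ox-]\s*$/;        // complete arrow: A --> B
const START_LINK_RE = /^\s*[<ox]?--\s*$/;         // open arrow: A -- text --> B
const THICK_LINK_RE = /^\s*[<ox]?==+[=>ox]\s*$/;  // thick arrow: A ==> B
const DOTTED_LINK_RE = /^\s*[<ox]?-?\.+-[>ox]?\s*$/; // dotted arrow: A -.-> B

console.log(LINK_RE.test('-->'));         // true: complete arrow
console.log(LINK_RE.test('--'));          // false: no arrowhead, START_LINK territory
console.log(START_LINK_RE.test('--'));    // true: begins an edge-text segment
console.log(THICK_LINK_RE.test('==>'));   // true
console.log(DOTTED_LINK_RE.test('-.->')); // true
```

Because `--` is a prefix of `-->`, listing `START_LINK` ahead of `LINK` in a token list would swallow every arrow's leading dashes and push the lexer into edge-text mode spuriously.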

// Edge text tokens
const EDGE_TEXT = createToken({
  name: 'EDGE_TEXT',
  pattern: /[^-]+/,
});

// Shape tokens that trigger text mode - following JISON lines 272-283
const PIPE = createToken({
  name: 'PIPE',
  pattern: /\|/,
});

const PS = createToken({
  name: 'PS',
  pattern: /\(/,
});

const PE = createToken({
  name: 'PE',
  pattern: /\)/,
});

const SQS = createToken({
  name: 'SQS',
  pattern: /\[/,
});

const SQE = createToken({
  name: 'SQE',
  pattern: /]/,
});

const DIAMOND_START = createToken({
  name: 'DIAMOND_START',
  pattern: /{/,
});

const DIAMOND_STOP = createToken({
  name: 'DIAMOND_STOP',
  pattern: /}/,
});

// Text content - following JISON line 283
const TEXT = createToken({
  name: 'TEXT',
  pattern: /[^"()[\]{|}]+/,
});

// Node string - simplified pattern for now
const NODE_STRING = createToken({
  name: 'NODE_STRING',
  pattern: /[\w!"#$%&'*+./?\\`]+/,
});

// Basic tokens
const NUM = createToken({
  name: 'NUM',
  pattern: /\d+/,
});

const MINUS = createToken({
  name: 'MINUS',
  pattern: /-/,
});

const AMP = createToken({
  name: 'AMP',
  pattern: /&/,
});

const SEMI = createToken({
  name: 'SEMI',
  pattern: /;/,
});

const COMMA = createToken({
  name: 'COMMA',
  pattern: /,/,
});

const COLON = createToken({
  name: 'COLON',
  pattern: /:/,
});

const QUOTE = createToken({
  name: 'QUOTE',
  pattern: /"/,
});

const NEWLINE = createToken({
  name: 'NEWLINE',
  pattern: /(\r?\n)+/,
});

const SPACE = createToken({
  name: 'SPACE',
  pattern: /\s/,
});

// Create a simple single-mode lexer for now
const allTokens = [
  // Whitespace and comments (skipped)
  WhiteSpace,
  Comment,

  // Keywords
  Graph,
  Direction,
  Subgraph,
  End,

  // Links (must come before MINUS)
  LINK,
  START_LINK,
  THICK_LINK,
  START_THICK_LINK,
  DOTTED_LINK,
  START_DOTTED_LINK,

  // Shapes
  PS, // (
  PE, // )
  SQS, // [
  SQE, // ]
  DIAMOND_START, // {
  DIAMOND_STOP, // }
  PIPE, // |

  // Text and identifiers
  NODE_STRING,
  TEXT,
  NUM,

  // Single characters
  NEWLINE,
  SPACE,
  SEMI,
  COMMA,
  COLON,
  AMP,
  MINUS,
  QUOTE,
];

// Create simple single-mode lexer
const FlowchartMultiModeLexer = new Lexer(allTokens);

// Export tokens and lexer
export {
  FlowchartMultiModeLexer,
  MODES,
  // Export all tokens
  Graph,
  Direction,
  Subgraph,
  End,
  LINK,
  START_LINK,
  THICK_LINK,
  START_THICK_LINK,
  DOTTED_LINK,
  START_DOTTED_LINK,
  EDGE_TEXT,
  PIPE,
  PS,
  PE,
  SQS,
  SQE,
  DIAMOND_START,
  DIAMOND_STOP,
  TEXT,
  NODE_STRING,
  NUM,
  MINUS,
  AMP,
  SEMI,
  COMMA,
  COLON,
  QUOTE,
  NEWLINE,
  SPACE,
};
@@ -1,12 +1,556 @@

// @ts-ignore: JISON doesn't support types
import flowJisonParser from './flow.jison';
import { CstParser } from 'chevrotain';
import * as tokens from './flowLexer.js';

const newParser = Object.assign({}, flowJisonParser);

newParser.parse = (src: string): unknown => {
  // remove the trailing whitespace after closing curly braces when ending a line break
  const newSrc = src.replace(/}\s*\n/g, '}\n');
  return flowJisonParser.parse(newSrc);
};

export default newParser;

export class FlowchartParser extends CstParser {
  constructor() {
    super(tokens.allTokens, {
      recoveryEnabled: true,
      nodeLocationTracking: 'full',
    });
    this.performSelfAnalysis();
  }

  // Root rule
  public flowchart = this.RULE('flowchart', () => {
    this.SUBRULE(this.graphDeclaration);
    // Handle statements and separators more flexibly
    this.MANY(() => {
      this.SUBRULE(this.statement);
      // Optional separator after statement
      this.OPTION(() => {
        this.SUBRULE(this.statementSeparator);
      });
    });
  });

  // Graph declaration
  private graphDeclaration = this.RULE('graphDeclaration', () => {
    this.CONSUME(tokens.Graph);
    this.OPTION(() => {
      this.CONSUME(tokens.DirectionValue);
    });
    this.OPTION2(() => {
      this.SUBRULE(this.statementSeparator);
    });
  });

  // Statement separator
  private statementSeparator = this.RULE('statementSeparator', () => {
    this.OR([
      { ALT: () => this.CONSUME(tokens.Newline) },
      { ALT: () => this.CONSUME(tokens.Semicolon) },
      { ALT: () => this.CONSUME(tokens.WhiteSpace) }, // Allow whitespace as separator
    ]);
    // Allow trailing whitespace and newlines after separators
    this.MANY(() => {
      this.OR2([
        { ALT: () => this.CONSUME2(tokens.WhiteSpace) },
        { ALT: () => this.CONSUME2(tokens.Newline) },
      ]);
    });
  });

  // Statement - following JISON structure
  private statement = this.RULE('statement', () => {
    this.OR([
      { ALT: () => this.SUBRULE(this.vertexStatement) },
      { ALT: () => this.SUBRULE(this.styleStatement) },
      { ALT: () => this.SUBRULE(this.linkStyleStatement) },
      { ALT: () => this.SUBRULE(this.classDefStatement) },
      { ALT: () => this.SUBRULE(this.classStatement) },
      { ALT: () => this.SUBRULE(this.clickStatement) },
      { ALT: () => this.SUBRULE(this.subgraphStatement) },
      // Direction statement only when DirectionValue is followed by separator
      {
        ALT: () => this.SUBRULE(this.directionStatement),
        GATE: () =>
          this.LA(1).tokenType === tokens.DirectionValue &&
          (this.LA(2).tokenType === tokens.Semicolon ||
            this.LA(2).tokenType === tokens.Newline ||
            this.LA(2).tokenType === tokens.WhiteSpace ||
            this.LA(2) === undefined), // EOF
      },
      { ALT: () => this.SUBRULE(this.accStatement) }, // Re-enabled
    ]);
  });

  // Vertex statement - avoiding left recursion
  private vertexStatement = this.RULE('vertexStatement', () => {
    this.SUBRULE(this.node);
    this.MANY(() => {
      this.SUBRULE(this.link);
      this.SUBRULE2(this.node);
    });
  });
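The `vertexStatement` rule above is the standard left-recursion elimination: JISON's left-recursive `vertexStatement: vertexStatement link node | node` becomes the iterative `node (link node)*`, which an LL parser like Chevrotain can handle. A hypothetical mini-parser (not Chevrotain, names invented for illustration) over a pre-split token list shows the same loop shape:

```typescript
// Iterative parse of a chain "node (link node)*", mirroring the MANY loop
// in vertexStatement above. Tok is a stand-in for real lexer tokens.
type Tok = { kind: 'node' | 'link'; text: string };

function parseChain(toks: Tok[]): string[] {
  const nodes: string[] = [];
  let i = 0;
  if (toks[i]?.kind !== 'node') throw new Error('expected node');
  nodes.push(toks[i++].text);
  // MANY: zero or more (link node) pairs
  while (toks[i]?.kind === 'link') {
    i++; // consume the link
    if (toks[i]?.kind !== 'node') throw new Error('expected node after link');
    nodes.push(toks[i++].text);
  }
  return nodes;
}

console.log(
  parseChain([
    { kind: 'node', text: 'A' },
    { kind: 'link', text: '-->' },
    { kind: 'node', text: 'B' },
    { kind: 'link', text: '-->' },
    { kind: 'node', text: 'C' },
  ])
); // logs ['A', 'B', 'C'] for the chain A --> B --> C
```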
|
||||
|
||||
// Node - avoiding left recursion
|
||||
private node = this.RULE('node', () => {
|
||||
this.SUBRULE(this.styledVertex);
|
||||
this.MANY(() => {
|
||||
this.CONSUME(tokens.Ampersand);
|
||||
this.SUBRULE2(this.styledVertex);
|
||||
});
|
||||
});
|
||||
|
||||
// Styled vertex
|
||||
private styledVertex = this.RULE('styledVertex', () => {
|
||||
this.SUBRULE(this.vertex);
|
||||
// TODO: Add style separator support when implementing styling
|
||||
});
|
||||
|
||||
// Vertex - following JISON pattern
|
||||
private vertex = this.RULE('vertex', () => {
|
||||
this.OR([
|
||||
// idString SQS text SQE
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE(this.nodeId);
|
||||
this.CONSUME(tokens.SquareStart);
|
||||
this.SUBRULE(this.nodeText);
|
||||
this.CONSUME(tokens.SquareEnd);
|
||||
},
|
||||
},
|
||||
// idString DoubleCircleStart text DoubleCircleEnd
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE2(this.nodeId);
|
||||
this.CONSUME(tokens.DoubleCircleStart);
|
||||
this.SUBRULE2(this.nodeText);
|
||||
this.CONSUME(tokens.DoubleCircleEnd);
|
||||
},
|
||||
},
|
||||
// idString CircleStart text CircleEnd
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE3(this.nodeId);
|
||||
this.CONSUME(tokens.CircleStart);
|
||||
this.SUBRULE3(this.nodeText);
|
||||
this.CONSUME(tokens.CircleEnd);
|
||||
},
|
||||
},
|
||||
// idString PS text PE
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE4(this.nodeId);
|
||||
this.CONSUME(tokens.PS);
|
||||
this.SUBRULE4(this.nodeText);
|
||||
this.CONSUME(tokens.PE);
|
||||
},
|
||||
},
|
||||
// idString HexagonStart text HexagonEnd
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE5(this.nodeId);
|
||||
this.CONSUME(tokens.HexagonStart);
|
||||
this.SUBRULE5(this.nodeText);
|
||||
this.CONSUME(tokens.HexagonEnd);
|
||||
},
|
||||
},
|
||||
// idString DIAMOND_START text DIAMOND_STOP
|
||||
{
|
||||
ALT: () => {
|
||||
this.SUBRULE6(this.nodeId);
|
||||
this.CONSUME(tokens.DiamondStart);
|
||||
this.SUBRULE6(this.nodeText);
|
||||
this.CONSUME(tokens.DiamondEnd);
|
||||
},
|
||||
},
|
||||
// idString (plain node)
|
||||
{ ALT: () => this.SUBRULE7(this.nodeId) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Node definition (legacy)
|
||||
private nodeDefinition = this.RULE('nodeDefinition', () => {
|
||||
this.SUBRULE(this.nodeId);
|
||||
this.OPTION(() => {
|
||||
this.SUBRULE(this.nodeShape);
|
||||
});
|
||||
// TODO: Add style separator support when implementing styling
|
||||
});
|
||||
|
||||
// Node ID - handles both simple and compound node IDs
|
||||
private nodeId = this.RULE('nodeId', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.NumberToken) },
|
||||
|
||||
// Allow special characters as standalone node IDs (matching JISON parser behavior)
|
||||
{ ALT: () => this.CONSUME2(tokens.Ampersand) },
|
||||
{ ALT: () => this.CONSUME2(tokens.Minus) },
|
||||
{ ALT: () => this.CONSUME2(tokens.DirectionValue) },
|
||||
{ ALT: () => this.CONSUME(tokens.Colon) },
|
||||
{ ALT: () => this.CONSUME(tokens.Comma) },
|
||||
// Only allow 'default' as node ID when not followed by statement patterns
|
||||
{
|
||||
ALT: () => this.CONSUME(tokens.Default),
|
||||
GATE: () => this.LA(2).tokenType !== tokens.DirectionValue,
|
||||
},
|
||||
]);
|
||||
});
|
||||
|
||||
// Node shape
|
||||
private nodeShape = this.RULE('nodeShape', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.SUBRULE(this.squareShape) },
|
||||
{ ALT: () => this.SUBRULE(this.circleShape) },
|
||||
{ ALT: () => this.SUBRULE(this.diamondShape) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Shape definitions
|
||||
private squareShape = this.RULE('squareShape', () => {
|
||||
this.CONSUME(tokens.SquareStart);
|
||||
this.SUBRULE(this.nodeText);
|
||||
this.CONSUME(tokens.SquareEnd);
|
||||
});
|
||||
|
||||
private circleShape = this.RULE('circleShape', () => {
|
||||
this.CONSUME(tokens.PS);
|
||||
this.SUBRULE(this.nodeText);
|
||||
this.CONSUME(tokens.PE);
|
||||
});
|
||||
|
||||
private diamondShape = this.RULE('diamondShape', () => {
|
||||
this.CONSUME(tokens.DiamondStart);
|
||||
this.SUBRULE(this.nodeText);
|
||||
this.CONSUME(tokens.DiamondEnd);
|
||||
});
|
||||
|
||||
// Node text
|
||||
private nodeText = this.RULE('nodeText', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.TextContent) },
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
{ ALT: () => this.CONSUME(tokens.NumberToken) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Link chain
|
||||
private linkChain = this.RULE('linkChain', () => {
|
||||
this.AT_LEAST_ONE(() => {
|
||||
this.SUBRULE(this.link);
|
||||
this.SUBRULE(this.nodeDefinition);
|
||||
});
|
||||
});
|
||||
|
||||
// Link - following JISON structure
|
||||
private link = this.RULE('link', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.SUBRULE(this.linkWithEdgeText) },
|
||||
{ ALT: () => this.SUBRULE(this.linkWithArrowText) },
|
||||
{ ALT: () => this.SUBRULE(this.linkStatement) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Link with arrow text - LINK arrowText
|
||||
private linkWithArrowText = this.RULE('linkWithArrowText', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.THICK_LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.DOTTED_LINK) },
|
||||
]);
|
||||
this.SUBRULE(this.arrowText);
|
||||
});
|
||||
|
||||
// Link statement
|
||||
private linkStatement = this.RULE('linkStatement', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.THICK_LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.DOTTED_LINK) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Link with edge text - START_LINK/START_DOTTED_LINK/START_THICK_LINK edgeText EdgeTextEnd
|
||||
private linkWithEdgeText = this.RULE('linkWithEdgeText', () => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.START_LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.START_DOTTED_LINK) },
|
||||
{ ALT: () => this.CONSUME(tokens.START_THICK_LINK) },
|
||||
]);
|
||||
this.SUBRULE(this.edgeText);
|
||||
this.CONSUME(tokens.EdgeTextEnd);
|
||||
});
|
||||
|
||||
// Edge text
|
||||
private edgeText = this.RULE('edgeText', () => {
|
||||
this.MANY(() => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.EdgeTextContent) },
|
||||
{ ALT: () => this.CONSUME(tokens.EdgeTextPipe) },
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
]);
|
||||
});
|
||||
});
|
||||
|
||||
// Arrow text - PIPE text PIPE
|
||||
private arrowText = this.RULE('arrowText', () => {
|
||||
this.CONSUME(tokens.Pipe);
|
||||
this.SUBRULE(this.text);
|
||||
this.CONSUME2(tokens.Pipe);
|
||||
});
|
||||
|
||||
// Text rule - following JISON pattern
|
||||
private text = this.RULE('text', () => {
|
||||
this.AT_LEAST_ONE(() => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.TextContent) },
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.NumberToken) },
|
||||
{ ALT: () => this.CONSUME(tokens.WhiteSpace) },
|
||||
{ ALT: () => this.CONSUME(tokens.Colon) },
|
||||
{ ALT: () => this.CONSUME(tokens.Minus) },
|
||||
{ ALT: () => this.CONSUME(tokens.Ampersand) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
]);
|
||||
});
|
||||
});
|
||||
|
||||
// Link text
|
||||
private linkText = this.RULE('linkText', () => {
|
||||
this.AT_LEAST_ONE(() => {
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.TextContent) },
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
]);
|
||||
});
|
||||
});
|
||||
|
||||
// Style statement
|
||||
private styleStatement = this.RULE('styleStatement', () => {
|
||||
this.CONSUME(tokens.Style);
|
||||
this.SUBRULE(this.nodeId);
|
||||
this.SUBRULE(this.styleList);
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
|
||||
// Link style statement
|
||||
private linkStyleStatement = this.RULE('linkStyleStatement', () => {
|
||||
this.CONSUME(tokens.LinkStyle);
|
||||
this.SUBRULE(this.linkIndexList);
|
||||
this.SUBRULE(this.styleList);
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
|
||||
// Class definition statement
|
||||
private classDefStatement = this.RULE('classDefStatement', () => {
|
||||
this.CONSUME(tokens.ClassDef);
|
||||
this.SUBRULE(this.className);
|
||||
this.SUBRULE(this.styleList);
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
|
||||
// Class statement
|
||||
private classStatement = this.RULE('classStatement', () => {
|
||||
this.CONSUME(tokens.Class);
|
||||
this.SUBRULE(this.nodeIdList);
|
||||
this.SUBRULE(this.className);
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
|
||||
// Click statement
|
||||
private clickStatement = this.RULE('clickStatement', () => {
|
||||
this.CONSUME(tokens.Click);
|
||||
this.SUBRULE(this.nodeId);
|
||||
this.OR([
|
||||
{ ALT: () => this.SUBRULE(this.clickHref) },
|
||||
{ ALT: () => this.SUBRULE(this.clickCall) },
|
||||
]);
|
||||
this.OPTION(() => {
|
||||
this.OR2([
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
]);
|
||||
});
|
||||
this.OPTION2(() => {
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
});
|
||||
|
||||
// Click href
|
||||
private clickHref = this.RULE('clickHref', () => {
|
||||
this.CONSUME(tokens.Href);
|
||||
this.OR([
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
]);
|
||||
});
|
||||
|
||||
// Click call
|
||||
private clickCall = this.RULE('clickCall', () => {
|
||||
this.OR([
|
||||
{
|
||||
ALT: () => {
|
||||
this.CONSUME(tokens.Call);
|
||||
this.OR2([
|
||||
{ ALT: () => this.CONSUME(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME(tokens.QuotedString) },
|
||||
]);
|
||||
this.OPTION(() => {
|
||||
this.CONSUME(tokens.Pipe);
|
||||
// Parse arguments
|
||||
this.CONSUME2(tokens.Pipe);
|
||||
});
|
||||
},
|
||||
},
|
||||
{
|
||||
ALT: () => {
|
||||
this.CONSUME(tokens.Callback);
|
||||
this.OR3([
|
||||
{ ALT: () => this.CONSUME2(tokens.NODE_STRING) },
|
||||
{ ALT: () => this.CONSUME2(tokens.QuotedString) },
|
||||
{
|
||||
ALT: () => {
|
||||
this.CONSUME(tokens.StringStart);
|
||||
this.CONSUME(tokens.StringContent);
|
||||
this.CONSUME(tokens.StringEnd);
|
||||
},
|
||||
},
|
||||
]);
|
||||
},
|
||||
},
|
||||
]);
|
||||
});
|
||||
|
||||
// Subgraph statement
|
||||
private subgraphStatement = this.RULE('subgraphStatement', () => {
|
||||
this.CONSUME(tokens.Subgraph);
|
||||
this.OPTION(() => {
|
||||
this.SUBRULE(this.subgraphId);
|
||||
});
|
||||
this.OPTION2(() => {
|
||||
this.OR([
|
||||
{
|
||||
ALT: () => {
|
||||
this.CONSUME(tokens.SquareStart);
|
||||
this.SUBRULE(this.nodeText);
|
||||
this.CONSUME(tokens.SquareEnd);
|
||||
},
|
||||
},
|
||||
{
|
||||
ALT: () => {
|
||||
this.CONSUME(tokens.QuotedString);
|
||||
},
|
||||
},
|
||||
]);
|
||||
});
|
||||
this.OPTION3(() => {
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
this.MANY(() => {
|
||||
this.OR2([
|
||||
{ ALT: () => this.SUBRULE2(this.statement) },
|
||||
{ ALT: () => this.SUBRULE2(this.statementSeparator) },
|
||||
]);
|
||||
});
|
||||
this.CONSUME(tokens.End);
|
||||
this.OPTION4(() => {
|
||||
this.SUBRULE3(this.statementSeparator);
|
||||
});
|
||||
});
|
||||
|
||||
// Direction statement
|
||||
private directionStatement = this.RULE('directionStatement', () => {
|
||||
// TODO: Add direction keyword token
|
||||
this.CONSUME(tokens.DirectionValue);
|
||||
this.SUBRULE(this.statementSeparator);
|
||||
});
|
||||
|
||||
// Helper rules
|
||||
private className = this.RULE('className', () => {
|
||||
this.CONSUME(tokens.NODE_STRING);
|
||||
});
|
||||
|
||||
private subgraphId = this.RULE('subgraphId', () => {
|
||||
this.OR([
|
      { ALT: () => this.CONSUME(tokens.NODE_STRING) },
      { ALT: () => this.CONSUME(tokens.QuotedString) },
      {
        ALT: () => {
          this.CONSUME(tokens.StringStart);
          this.CONSUME(tokens.StringContent);
          this.CONSUME(tokens.StringEnd);
        },
      },
    ]);
  });

  private nodeIdList = this.RULE('nodeIdList', () => {
    this.SUBRULE(this.nodeId);
    this.MANY(() => {
      this.CONSUME(tokens.Comma);
      this.SUBRULE2(this.nodeId);
    });
  });

  private linkIndexList = this.RULE('linkIndexList', () => {
    this.OR([
      { ALT: () => this.CONSUME(tokens.NODE_STRING) }, // "default"
      { ALT: () => this.SUBRULE(this.numberList) },
    ]);
  });

  private numberList = this.RULE('numberList', () => {
    this.CONSUME(tokens.NumberToken);
    this.MANY(() => {
      this.CONSUME(tokens.Comma);
      this.CONSUME2(tokens.NumberToken);
    });
  });

  private styleList = this.RULE('styleList', () => {
    this.SUBRULE(this.style);
    this.MANY(() => {
      this.CONSUME(tokens.Comma);
      this.SUBRULE2(this.style);
    });
  });

  private style = this.RULE('style', () => {
    this.AT_LEAST_ONE(() => {
      this.OR([
        { ALT: () => this.CONSUME(tokens.NODE_STRING) },
        { ALT: () => this.CONSUME(tokens.NumberToken) },
        { ALT: () => this.CONSUME(tokens.Colon) },
        { ALT: () => this.CONSUME(tokens.Semicolon) },
        { ALT: () => this.CONSUME(tokens.Minus) },
      ]);
    });
  });

  // Standalone link statement
  private standaloneLinkStatement = this.RULE('standaloneLinkStatement', () => {
    this.SUBRULE(this.nodeId);
    this.SUBRULE(this.link);
    this.SUBRULE2(this.nodeId);
  });

  // Accessibility statement
  private accStatement = this.RULE('accStatement', () => {
    this.OR([
      {
        ALT: () => {
          this.CONSUME(tokens.AccTitle);
          this.CONSUME(tokens.AccTitleValue);
        },
      },
      {
        ALT: () => {
          this.CONSUME(tokens.AccDescr);
          this.CONSUME(tokens.AccDescrValue);
        },
      },
      {
        ALT: () => {
          this.CONSUME(tokens.AccDescrMultiline);
          this.CONSUME(tokens.AccDescrMultilineValue);
          this.CONSUME(tokens.AccDescrMultilineEnd);
        },
      },
    ]);
  });
}
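The list-shaped rules above (`nodeIdList`, `numberList`, `styleList`) all follow the same `X (Comma X)*` pattern that Chevrotain expresses with a leading `CONSUME`/`SUBRULE` followed by `MANY`. For intuition, here is the same shape hand-rolled over a plain token array — a sketch only; the `Tok` type and token names are assumptions for illustration, not the real lexer's token classes:

```typescript
// Hand-rolled illustration of the `Number (Comma Number)*` shape that
// `numberList` expresses with CONSUME/MANY above.
type Tok = { type: 'Number' | 'Comma'; image: string };

function parseNumberList(toks: Tok[]): number[] {
  let i = 0;
  const consume = (type: Tok['type']): Tok => {
    if (toks[i]?.type !== type) {
      throw new Error(`expected ${type} at position ${i}`);
    }
    return toks[i++];
  };

  // CONSUME(NumberToken): the list must start with a number.
  const out: number[] = [Number(consume('Number').image)];
  // MANY(...): keep consuming `Comma Number` pairs while a comma follows.
  while (toks[i]?.type === 'Comma') {
    consume('Comma');
    out.push(Number(consume('Number').image));
  }
  return out;
}
```

Note that in the real rules the second consumption of the same token inside one rule must use an indexed variant (`CONSUME2`, `SUBRULE2`), since Chevrotain identifies each grammar occurrence by its index.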
@@ -0,0 +1,363 @@
import { FlowchartLexer } from './flowLexer.js';
import { FlowchartParser } from './flowParser.js';
import { FlowchartAstVisitor } from './flowAst.js';

// Interface matching existing Mermaid flowDb expectations.
// `vertices` is a Map (the adapter calls `vertices.set`/`vertices.clear`),
// `addVertex` may receive a text object, a style array, and a trailing
// metadata argument, and `addClass` may receive a style array, so the
// signatures reflect that.
export interface FlowDb {
  vertices: Map<string, any>;
  edges: any[];
  classes: Record<string, string>;
  subGraphs: any[];
  direction: string;
  tooltips: Record<string, string>;
  clickEvents: any[];
  firstGraph: () => boolean;
  setDirection: (dir: string) => void;
  addVertex: (
    id: string,
    text?: any,
    type?: string,
    style?: any,
    classes?: string[],
    dir?: string,
    props?: any,
    metadata?: any
  ) => void;
  addLink: (start: string | string[], end: string | string[], linkData: any) => void;
  addClass: (id: string, style: string | string[]) => void;
  setClass: (ids: string | string[], className: string) => void;
  setClickEvent: (id: string, functionName: string, functionArgs?: string) => void;
  setLink: (id: string, link: string, target?: string) => void;
  setTooltip?: (id: string, tooltip: string) => void;
  addSubGraph: (id: string, list: any[], title: string) => string;
  getVertices: () => Map<string, any>;
  getEdges: () => any[];
  getClasses: () => Record<string, string>;
  clear: () => void;
  setAccTitle: (title: string) => void;
  setAccDescription: (description: string) => void;
}

class FlowchartParserAdapter {
  public lexer: any;
  public parser: FlowchartParser;
  public visitor: FlowchartAstVisitor;

  // Mermaid compatibility
  public yy: FlowDb;

  constructor() {
    this.lexer = FlowchartLexer;
    this.parser = new FlowchartParser();
    this.visitor = new FlowchartAstVisitor();

    // Initialize yy object for Mermaid compatibility
    this.yy = this.createYY();
  }

  public createYY(): FlowDb {
    const state = {
      vertices: new Map<string, any>(),
      edges: [] as any[],
      classes: {} as Record<string, string>,
      subGraphs: [] as any[],
      direction: 'TB',
      tooltips: {} as Record<string, string>,
      clickEvents: [] as any[],
      subCount: 0,
      accTitle: '',
      accDescription: '',
    };

    return {
      vertices: state.vertices,
      edges: state.edges,
      classes: state.classes,
      subGraphs: state.subGraphs,
      direction: state.direction,
      tooltips: state.tooltips,
      clickEvents: state.clickEvents,

      firstGraph: () => true,

      setDirection: (dir: string) => {
        state.direction = dir;
      },

      addVertex: (
        id: string,
        text?: any,
        type?: string,
        style?: any,
        classes?: string[],
        dir?: string,
        props?: any
      ) => {
        state.vertices.set(id, {
          id,
          text: text || id,
          type: type || 'default',
          style,
          classes,
          dir,
          props,
        });
      },

      addLink: (start: string | string[], end: string | string[], linkData: any) => {
        state.edges.push({
          start: Array.isArray(start) ? start[start.length - 1] : start,
          end: Array.isArray(end) ? end[end.length - 1] : end,
          type: linkData.type || 'arrow',
          stroke: linkData.stroke || 'normal',
          length: linkData.length,
          text: linkData.text,
        });
      },

      addClass: (id: string, style: string | string[]) => {
        state.classes[id] = Array.isArray(style) ? style.join(',') : style;
      },

      setClass: (ids: string | string[], className: string) => {
        const idArray = Array.isArray(ids) ? ids : [ids];
        idArray.forEach((id) => {
          const vertex = state.vertices.get(id);
          if (vertex) {
            vertex.classes = [className];
          }
        });
      },

      setClickEvent: (id: string, functionName: string, functionArgs?: string) => {
        state.clickEvents.push({
          id,
          functionName,
          functionArgs,
        });
      },

      setLink: (id: string, link: string, target?: string) => {
        state.clickEvents.push({
          id,
          link,
          target,
        });
      },

      addSubGraph: (id: string, list: any[], title: string) => {
        const sgId = id || `subGraph${state.subCount++}`;
        state.subGraphs.push({
          id: sgId,
          nodes: list,
          title: title || sgId,
        });
        return sgId;
      },

      getVertices: () => state.vertices,
      getEdges: () => state.edges,
      getClasses: () => state.classes,

      clear: () => {
        // Mutate the containers in place: the object returned above holds
        // references to them, so reassigning `state.classes` etc. would
        // leave those references pointing at stale data.
        state.vertices.clear();
        state.edges.length = 0;
        for (const key of Object.keys(state.classes)) {
          delete state.classes[key];
        }
        state.subGraphs.length = 0;
        state.direction = 'TB';
        for (const key of Object.keys(state.tooltips)) {
          delete state.tooltips[key];
        }
        state.clickEvents.length = 0;
        state.subCount = 0;
        state.accTitle = '';
        state.accDescription = '';
      },

      setAccTitle: (title: string) => {
        state.accTitle = title;
      },

      setAccDescription: (description: string) => {
        state.accDescription = description;
      },
    };
  }

  parse(text: string): any {
    // Clear previous state
    this.yy.clear();

    // Tokenize
    const lexResult = this.lexer.tokenize(text);

    if (lexResult.errors.length > 0) {
      const error = lexResult.errors[0];
      throw new Error(
        `Lexing error at line ${error.line}, column ${error.column}: ${error.message}`
      );
    }

    // Parse
    this.parser.input = lexResult.tokens;
    const cst = this.parser.flowchart();

    if (this.parser.errors.length > 0) {
      const error = this.parser.errors[0];
      throw new Error(`Parse error: ${error.message}`);
    }

    // Visit CST and build AST
    const ast = this.visitor.visit(cst);

    // Update yy state with parsed data
    // Convert plain object vertices to Map
    Object.entries(ast.vertices).forEach(([id, vertex]) => {
      this.yy.vertices.set(id, vertex);
    });
    this.yy.edges.push(...ast.edges);
    Object.assign(this.yy.classes, ast.classes);
    this.yy.subGraphs.push(...ast.subGraphs);
    this.yy.direction = ast.direction;
    Object.assign(this.yy.tooltips, ast.tooltips);
    this.yy.clickEvents.push(...ast.clickEvents);

    return ast;
  }

  // Compatibility method for Mermaid
  getYY(): FlowDb {
    return this.yy;
  }
}

// Export a singleton instance for compatibility
const parserInstance = new FlowchartParserAdapter();

// Create a flow object that can have its yy property reassigned
const flow = {
  parser: parserInstance,
  yy: parserInstance.yy,
  parse: (text: string) => {
    // Use the current yy object (which might have been reassigned by tests)
    const targetYY = flow.yy;

    // Clear previous state
    targetYY.clear();
    parserInstance.visitor.clear();

    // Tokenize
    const lexResult = parserInstance.lexer.tokenize(text);

    if (lexResult.errors.length > 0) {
      const error = lexResult.errors[0];
      throw new Error(
        `Lexing error at line ${error.line}, column ${error.column}: ${error.message}`
      );
    }

    // Parse
    parserInstance.parser.input = lexResult.tokens;
    const cst = parserInstance.parser.flowchart();

    if (parserInstance.parser.errors.length > 0) {
      const error = parserInstance.parser.errors[0];
      throw new Error(`Parse error: ${error.message}`);
    }

    // Visit CST and build AST
    const ast = parserInstance.visitor.visit(cst);

    // Update yy state with parsed data
    // Convert plain object vertices to Map
    Object.entries(ast.vertices).forEach(([id, vertex]: [string, any]) => {
      // Use addVertex method if available, otherwise set directly
      if (typeof targetYY.addVertex === 'function') {
        // Create textObj structure expected by FlowDB
        const textObj = vertex.text ? { text: vertex.text, type: 'text' } : undefined;
        targetYY.addVertex(
          id,
          textObj,
          vertex.type,
          vertex.style || [],
          vertex.classes || [],
          vertex.dir,
          vertex.props || {},
          undefined // metadata
        );
      } else {
        targetYY.vertices.set(id, vertex);
      }
    });

    // Add edges
    ast.edges.forEach((edge: any) => {
      if (typeof targetYY.addLink === 'function') {
        // Create the linkData structure expected by FlowDB
        const linkData = {
          type: edge.type,
          stroke: edge.stroke,
          length: edge.length,
          text: edge.text ? { text: edge.text, type: 'text' } : undefined,
        };
        targetYY.addLink([edge.start], [edge.end], linkData);
      } else {
        targetYY.edges.push(edge);
      }
    });

    // Add classes
    Object.entries(ast.classes).forEach(([id, className]: [string, any]) => {
      if (typeof targetYY.addClass === 'function') {
        // FlowDB.addClass expects an array of style strings, not a single string
        const styleArray = className.split(',').map((s: string) => s.trim());
        targetYY.addClass(id, styleArray);
      } else {
        targetYY.classes[id] = className;
      }
    });

    // Add subgraphs
    if (targetYY.subGraphs) {
      targetYY.subGraphs.push(...ast.subGraphs);
    }

    // Set direction
    if (typeof targetYY.setDirection === 'function') {
      targetYY.setDirection(ast.direction);
    } else {
      targetYY.direction = ast.direction;
    }

    // Add tooltips
    Object.entries(ast.tooltips).forEach(([id, tooltip]: [string, any]) => {
      if (typeof targetYY.setTooltip === 'function') {
        targetYY.setTooltip(id, tooltip);
      } else if (targetYY.tooltips) {
        targetYY.tooltips[id] = tooltip;
      }
    });

    // Add accessibility information
    if (ast.accTitle && typeof targetYY.setAccTitle === 'function') {
      targetYY.setAccTitle(ast.accTitle);
    }
    if (ast.accDescription && typeof targetYY.setAccDescription === 'function') {
      targetYY.setAccDescription(ast.accDescription);
    }

    // Add click events
    ast.clickEvents.forEach((clickEvent: any) => {
      if (typeof targetYY.setClickEvent === 'function') {
        targetYY.setClickEvent(clickEvent.id, clickEvent.functionName, clickEvent.functionArgs);
      } else if (targetYY.clickEvents) {
        targetYY.clickEvents.push(clickEvent);
      }
    });

    return ast;
  },
};

// Mermaid expects these exports
export const parser = parserInstance;
export const yy = parserInstance.yy;

// Default export for modern imports
export default flow;
@@ -1,5 +1,5 @@
 import { FlowDB } from '../flowDb.js';
-import flow from './flowParser.ts';
+import flow from './flowParserAdapter.js';
 import { setConfig } from '../../../config.js';
 
 setConfig({
@@ -8,13 +8,13 @@ setConfig({
 
 describe('when parsing subgraphs', function () {
   beforeEach(function () {
-    flow.parser.yy = new FlowDB();
-    flow.parser.yy.clear();
-    flow.parser.yy.setGen('gen-2');
+    flow.yy = new FlowDB();
+    flow.yy.clear();
+    flow.yy.setGen('gen-2');
   });
   it('should handle subgraph with tab indentation', function () {
-    const res = flow.parser.parse('graph TB\nsubgraph One\n\ta1-->a2\nend');
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const res = flow.parse('graph TB\nsubgraph One\n\ta1-->a2\nend');
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -25,8 +25,8 @@ describe('when parsing subgraphs', function () {
     expect(subgraph.id).toBe('One');
   });
   it('should handle subgraph with chaining nodes indentation', function () {
-    const res = flow.parser.parse('graph TB\nsubgraph One\n\ta1-->a2-->a3\nend');
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const res = flow.parse('graph TB\nsubgraph One\n\ta1-->a2-->a3\nend');
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(3);
@@ -38,8 +38,8 @@ describe('when parsing subgraphs', function () {
   });
 
   it('should handle subgraph with multiple words in title', function () {
-    const res = flow.parser.parse('graph TB\nsubgraph "Some Title"\n\ta1-->a2\nend');
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const res = flow.parse('graph TB\nsubgraph "Some Title"\n\ta1-->a2\nend');
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -50,8 +50,8 @@ describe('when parsing subgraphs', function () {
   });
 
   it('should handle subgraph with id and title notation', function () {
-    const res = flow.parser.parse('graph TB\nsubgraph some-id[Some Title]\n\ta1-->a2\nend');
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const res = flow.parse('graph TB\nsubgraph some-id[Some Title]\n\ta1-->a2\nend');
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -62,8 +62,8 @@ describe('when parsing subgraphs', function () {
   });
 
   it.skip('should handle subgraph without id and space in title', function () {
-    const res = flow.parser.parse('graph TB\nsubgraph Some Title\n\ta1-->a2\nend');
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const res = flow.parse('graph TB\nsubgraph Some Title\n\ta1-->a2\nend');
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(2);
@@ -74,13 +74,13 @@ describe('when parsing subgraphs', function () {
   });
 
   it('should handle subgraph id starting with a number', function () {
-    const res = flow.parser.parse(`graph TD
+    const res = flow.parse(`graph TD
     A[Christmas] -->|Get money| B(Go shopping)
     subgraph 1test
     A
     end`);
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
     expect(subgraph.nodes.length).toBe(1);
@@ -89,20 +89,20 @@ describe('when parsing subgraphs', function () {
   });
 
   it('should handle subgraphs1', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph myTitle;c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph myTitle;c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with title in quotes', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph "title in quotes";c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph "title in quotes";c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -111,12 +111,12 @@ describe('when parsing subgraphs', function () {
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs in old style that was broken', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph old style that is broken;c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph old style that is broken;c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -125,12 +125,12 @@ describe('when parsing subgraphs', function () {
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with dashes in the title', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph a-b-c;c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph a-b-c;c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -139,12 +139,12 @@ describe('when parsing subgraphs', function () {
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with id and title in brackets', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph uid1[text of doom];c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph uid1[text of doom];c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -154,12 +154,12 @@ describe('when parsing subgraphs', function () {
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with id and title in brackets and quotes', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph uid2["text of doom"];c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph uid2["text of doom"];c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -169,12 +169,12 @@ describe('when parsing subgraphs', function () {
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with id and title in brackets without spaces', function () {
-    const res = flow.parser.parse('graph TD;A-->B;subgraph uid2[textofdoom];c-->d;end;');
+    const res = flow.parse('graph TD;A-->B;subgraph uid2[textofdoom];c-->d;end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(1);
     const subgraph = subgraphs[0];
 
@@ -185,19 +185,19 @@ describe('when parsing subgraphs', function () {
   });
 
   it('should handle subgraphs2', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nsubgraph myTitle\n\n c-->d \nend\n');
+    const res = flow.parse('graph TD\nA-->B\nsubgraph myTitle\n\n c-->d \nend\n');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
 
   it('should handle subgraphs3', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nsubgraph myTitle \n\n c-->d \nend\n');
+    const res = flow.parse('graph TD\nA-->B\nsubgraph myTitle \n\n c-->d \nend\n');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
@@ -211,36 +211,36 @@ describe('when parsing subgraphs', function () {
       ' subgraph inner\n\n e-->f \n end \n\n' +
       ' subgraph inner\n\n h-->i \n end \n\n' +
       'end\n';
-    const res = flow.parser.parse(str);
+    const res = flow.parse(str);
   });
 
   it('should handle subgraphs4', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nsubgraph myTitle\nc-->d\nend;');
+    const res = flow.parse('graph TD\nA-->B\nsubgraph myTitle\nc-->d\nend;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
 
   it('should handle subgraphs5', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nsubgraph myTitle\nc-- text -->d\nd-->e\n end;');
+    const res = flow.parse('graph TD\nA-->B\nsubgraph myTitle\nc-- text -->d\nd-->e\n end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle subgraphs with multi node statements in it', function () {
-    const res = flow.parser.parse('graph TD\nA-->B\nsubgraph myTitle\na & b --> c & e\n end;');
+    const res = flow.parse('graph TD\nA-->B\nsubgraph myTitle\na & b --> c & e\n end;');
 
-    const vert = flow.parser.yy.getVertices();
-    const edges = flow.parser.yy.getEdges();
+    const vert = flow.yy.getVertices();
+    const edges = flow.yy.getEdges();
 
     expect(edges[0].type).toBe('arrow_point');
   });
   it('should handle nested subgraphs 1', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph A
     b-->B
     a
@@ -250,7 +250,7 @@ describe('when parsing subgraphs', function () {
     c
     end`);
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(2);
 
     const subgraphA = subgraphs.find((o) => o.id === 'A');
@@ -263,7 +263,7 @@ describe('when parsing subgraphs', function () {
     expect(subgraphA.nodes).not.toContain('c');
   });
   it('should handle nested subgraphs 2', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     b-->B
     a-->c
     subgraph B
@@ -275,7 +275,7 @@ describe('when parsing subgraphs', function () {
     B
     end`);
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(2);
 
     const subgraphA = subgraphs.find((o) => o.id === 'A');
@@ -288,7 +288,7 @@ describe('when parsing subgraphs', function () {
     expect(subgraphA.nodes).not.toContain('c');
   });
   it('should handle nested subgraphs 3', function () {
-    const res = flow.parser.parse(`flowchart TB
+    const res = flow.parse(`flowchart TB
     subgraph B
     c
     end
@@ -298,7 +298,7 @@ describe('when parsing subgraphs', function () {
     a
     end`);
 
-    const subgraphs = flow.parser.yy.getSubGraphs();
+    const subgraphs = flow.yy.getSubGraphs();
     expect(subgraphs.length).toBe(2);
 
     const subgraphA = subgraphs.find((o) => o.id === 'A');
 
@@ -0,0 +1,40 @@
import { FlowchartLexer } from './flowLexer.js';
import { FlowchartParser } from './flowParser.js';
import { FlowchartAstVisitor } from './flowAst.js';

// Simple test function
function testChevrotainParser() {
  // Test simple flowchart
  const input = `
graph TD
    A[Start] --> B{Decision}
    B -->|Yes| C[Process]
    B -->|No| D[End]
    C --> D
`;

  // Tokenize
  const lexResult = FlowchartLexer.tokenize(input);

  if (lexResult.errors.length > 0) {
    throw new Error(`Lexing errors: ${lexResult.errors.map((e) => e.message).join(', ')}`);
  }

  // Parse
  const parser = new FlowchartParser();
  parser.input = lexResult.tokens;
  const cst = parser.flowchart();

  if (parser.errors.length > 0) {
    throw new Error(`Parse errors: ${parser.errors.map((e) => e.message).join(', ')}`);
  }

  // Visit CST and build AST
  const visitor = new FlowchartAstVisitor();
  const ast = visitor.visit(cst);

  return ast;
}

// Export for testing
export { testChevrotainParser };
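The adapter in this commit mirrors Jison's contract: parse state lives in a mutable `yy` object that the caller can read (and tests can swap out). A minimal self-contained sketch of that pattern — state captured in a closure behind FlowDB-style method names (the helper `createMiniYY` and its reduced method set are illustrative assumptions, not the commit's actual API):

```typescript
// Minimal sketch of the Jison-compatible "yy" state-object pattern:
// mutable parse state in a closure, exposed through FlowDB-style methods.
interface MiniFlowDb {
  addVertex(id: string, text?: string): void;
  addLink(start: string, end: string, type?: string): void;
  setDirection(dir: string): void;
  getVertices(): Map<string, { id: string; text: string }>;
  getEdges(): { start: string; end: string; type: string }[];
  getDirection(): string;
  clear(): void;
}

function createMiniYY(): MiniFlowDb {
  const vertices = new Map<string, { id: string; text: string }>();
  const edges: { start: string; end: string; type: string }[] = [];
  let direction = 'TB';

  return {
    // Last write wins, matching how repeated node references update a vertex.
    addVertex: (id, text) => {
      vertices.set(id, { id, text: text ?? id });
    },
    addLink: (start, end, type = 'arrow_point') => {
      edges.push({ start, end, type });
    },
    setDirection: (dir) => {
      direction = dir;
    },
    getVertices: () => vertices,
    getEdges: () => edges,
    getDirection: () => direction,
    // Reset in place so references handed out earlier stay valid.
    clear: () => {
      vertices.clear();
      edges.length = 0;
      direction = 'TB';
    },
  };
}
```

Exposing a getter for `direction` (rather than copying the string into the returned object) avoids the stale-snapshot problem that plain value fields have in this pattern.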
156 pnpm-lock.yaml generated
@@ -229,6 +229,9 @@ importers:
       '@types/d3':
         specifier: ^7.4.3
         version: 7.4.3
+      chevrotain:
+        specifier: ^11.0.3
+        version: 11.0.3
       cytoscape:
         specifier: ^3.29.3
         version: 3.31.0
@@ -508,6 +511,67 @@ importers:
         specifier: ^7.3.0
         version: 7.3.0
 
+  packages/mermaid/src/vitepress:
+    dependencies:
+      '@mdi/font':
+        specifier: ^7.4.47
+        version: 7.4.47
+      '@vueuse/core':
+        specifier: ^12.7.0
+        version: 12.7.0(typescript@5.7.3)
+      font-awesome:
+        specifier: ^4.7.0
+        version: 4.7.0
+      jiti:
+        specifier: ^2.4.2
+        version: 2.4.2
+      mermaid:
+        specifier: workspace:^
+        version: link:../..
+      vue:
+        specifier: ^3.4.38
+        version: 3.5.13(typescript@5.7.3)
+    devDependencies:
+      '@iconify-json/carbon':
+        specifier: ^1.1.37
+        version: 1.2.1
+      '@unocss/reset':
+        specifier: ^66.0.0
+        version: 66.0.0
+      '@vite-pwa/vitepress':
+        specifier: ^0.5.3
+        version: 0.5.4(vite-plugin-pwa@0.21.2(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0))
+      '@vitejs/plugin-vue':
+        specifier: ^5.0.5
+        version: 5.2.1(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))
+      fast-glob:
+        specifier: ^3.3.3
+        version: 3.3.3
+      https-localhost:
+        specifier: ^4.7.1
+        version: 4.7.1
+      pathe:
+        specifier: ^2.0.3
+        version: 2.0.3
+      unocss:
+        specifier: ^66.0.0
+        version: 66.0.0(postcss@8.5.3)(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))
+      unplugin-vue-components:
+        specifier: ^28.4.0
+        version: 28.4.0(@babel/parser@7.27.2)(vue@3.5.13(typescript@5.7.3))
+      vite:
+        specifier: ^6.1.1
+        version: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
+      vite-plugin-pwa:
+        specifier: ^0.21.1
+        version: 0.21.2(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0)
+      vitepress:
+        specifier: 1.6.3
+        version: 1.6.3(@algolia/client-search@5.20.3)(@types/node@22.13.5)(axios@1.8.4)(postcss@8.5.3)(search-insights@2.17.2)(terser@5.39.0)(typescript@5.7.3)
+      workbox-window:
+        specifier: ^7.3.0
+        version: 7.3.0
+
   packages/parser:
     dependencies:
       langium:
@@ -3627,6 +3691,15 @@ packages:
     peerDependencies:
       vite: ^2.9.0 || ^3.0.0-0 || ^4.0.0 || ^5.0.0-0 || ^6.0.0-0
 
+  '@vite-pwa/vitepress@0.5.4':
+    resolution: {integrity: sha512-g57qwG983WTyQNLnOcDVPQEIeN+QDgK/HdqghmygiUFp3a/MzVvmLXC/EVnPAXxWa8W2g9pZ9lE3EiDGs2HjsA==}
+    peerDependencies:
+      '@vite-pwa/assets-generator': ^0.2.6
+      vite-plugin-pwa: '>=0.21.2 <1'
+    peerDependenciesMeta:
+      '@vite-pwa/assets-generator':
+        optional: true
+
   '@vite-pwa/vitepress@1.0.0':
     resolution: {integrity: sha512-i5RFah4urA6tZycYlGyBslVx8cVzbZBcARJLDg5rWMfAkRmyLtpRU6usGfVOwyN9kjJ2Bkm+gBHXF1hhr7HptQ==}
     peerDependencies:
@@ -9594,6 +9667,18 @@ packages:
     peerDependencies:
       vite: '>=4 <=6'
 
+  vite-plugin-pwa@0.21.2:
+    resolution: {integrity: sha512-vFhH6Waw8itNu37hWUJxL50q+CBbNcMVzsKaYHQVrfxTt3ihk3PeLO22SbiP1UNWzcEPaTQv+YVxe4G0KOjAkg==}
+    engines: {node: '>=16.0.0'}
+    peerDependencies:
+      '@vite-pwa/assets-generator': ^0.2.6
+      vite: ^3.1.0 || ^4.0.0 || ^5.0.0 || ^6.0.0
+      workbox-build: ^7.3.0
+      workbox-window: ^7.3.0
+    peerDependenciesMeta:
+      '@vite-pwa/assets-generator':
+        optional: true
+
   vite-plugin-pwa@1.0.0:
     resolution: {integrity: sha512-X77jo0AOd5OcxmWj3WnVti8n7Kw2tBgV1c8MCXFclrSlDV23ePzv2eTDIALXI2Qo6nJ5pZJeZAuX0AawvRfoeA==}
     engines: {node: '>=16.0.0'}
@@ -14191,6 +14276,16 @@ snapshots:
     transitivePeerDependencies:
       - vue
 
+  '@unocss/astro@66.0.0(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))':
+    dependencies:
+      '@unocss/core': 66.0.0
+      '@unocss/reset': 66.0.0
+      '@unocss/vite': 66.0.0(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))
+    optionalDependencies:
+      vite: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
+    transitivePeerDependencies:
+      - vue
+
   '@unocss/cli@66.0.0':
     dependencies:
       '@ampproject/remapping': 2.3.0
@@ -14326,6 +14421,24 @@ snapshots:
     transitivePeerDependencies:
       - vue
 
+  '@unocss/vite@66.0.0(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))':
+    dependencies:
+      '@ampproject/remapping': 2.3.0
+      '@unocss/config': 66.0.0
+      '@unocss/core': 66.0.0
+      '@unocss/inspector': 66.0.0(vue@3.5.13(typescript@5.7.3))
+      chokidar: 3.6.0
+      magic-string: 0.30.17
+      tinyglobby: 0.2.12
+      unplugin-utils: 0.2.4
+      vite: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
+    transitivePeerDependencies:
+      - vue
+
+  '@vite-pwa/vitepress@0.5.4(vite-plugin-pwa@0.21.2(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0))':
+    dependencies:
+      vite-plugin-pwa: 0.21.2(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0)
+
   '@vite-pwa/vitepress@1.0.0(vite-plugin-pwa@1.0.0(vite@6.1.1(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0))':
     dependencies:
       vite-plugin-pwa: 1.0.0(vite@6.1.1(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0)
@@ -14340,6 +14453,11 @@ snapshots:
       vite: 6.1.1(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
       vue: 3.5.13(typescript@5.7.3)
 
+  '@vitejs/plugin-vue@5.2.1(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))':
+    dependencies:
+      vite: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
+      vue: 3.5.13(typescript@5.7.3)
+
   '@vitest/coverage-v8@3.0.6(vitest@3.0.6)':
     dependencies:
       '@ampproject/remapping': 2.3.0
@@ -21455,6 +21573,33 @@ snapshots:
       - supports-color
       - vue
 
+  unocss@66.0.0(postcss@8.5.3)(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3)):
+    dependencies:
+      '@unocss/astro': 66.0.0(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))
|
||||
'@unocss/cli': 66.0.0
|
||||
'@unocss/core': 66.0.0
|
||||
'@unocss/postcss': 66.0.0(postcss@8.5.3)
|
||||
'@unocss/preset-attributify': 66.0.0
|
||||
'@unocss/preset-icons': 66.0.0
|
||||
'@unocss/preset-mini': 66.0.0
|
||||
'@unocss/preset-tagify': 66.0.0
|
||||
'@unocss/preset-typography': 66.0.0
|
||||
'@unocss/preset-uno': 66.0.0
|
||||
'@unocss/preset-web-fonts': 66.0.0
|
||||
'@unocss/preset-wind': 66.0.0
|
||||
'@unocss/preset-wind3': 66.0.0
|
||||
'@unocss/transformer-attributify-jsx': 66.0.0
|
||||
'@unocss/transformer-compile-class': 66.0.0
|
||||
'@unocss/transformer-directives': 66.0.0
|
||||
'@unocss/transformer-variant-group': 66.0.0
|
||||
'@unocss/vite': 66.0.0(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(vue@3.5.13(typescript@5.7.3))
|
||||
optionalDependencies:
|
||||
vite: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
|
||||
transitivePeerDependencies:
|
||||
- postcss
|
||||
- supports-color
|
||||
- vue
|
||||
|
||||
unpipe@1.0.0: {}
|
||||
|
||||
unplugin-utils@0.2.4:
|
||||
@@ -21570,6 +21715,17 @@ snapshots:
|
||||
transitivePeerDependencies:
|
||||
- supports-color
|
||||
|
||||
vite-plugin-pwa@0.21.2(vite@6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0):
|
||||
dependencies:
|
||||
debug: 4.4.0(supports-color@8.1.1)
|
||||
pretty-bytes: 6.1.1
|
||||
tinyglobby: 0.2.12
|
||||
vite: 6.1.6(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1)
|
||||
workbox-build: 7.1.1(@types/babel__core@7.20.5)
|
||||
workbox-window: 7.3.0
|
||||
transitivePeerDependencies:
|
||||
- supports-color
|
||||
|
||||
vite-plugin-pwa@1.0.0(vite@6.1.1(@types/node@22.13.5)(jiti@2.4.2)(terser@5.39.0)(tsx@4.19.3)(yaml@2.7.1))(workbox-build@7.1.1(@types/babel__core@7.20.5))(workbox-window@7.3.0):
|
||||
dependencies:
|
||||
debug: 4.4.0(supports-color@8.1.1)
|
||||
|
||||