Compare commits

1 Commit

Author: johnnyfish
SHA1: 4fdc2ccd91
Date: 2025-09-15 19:22:23 +03:00

feat: implement reusable click-outside hook for edit fields

- Create custom useClickOutside and useEditClickOutside hooks
- Replace useClickAway with custom hook for better event handling
- Add click-outside behavior to all edit fields:
  - Diagram name in navbar
  - Area names on canvas (with context menu)
  - Table names in side panel
  - Relationship names in side panel
  - Table edit mode panel
- Improve edit mode UX with auto-focus and text selection
- Add pencil icons for visual edit affordance
133 changed files with 1715 additions and 8136 deletions
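The click-outside behavior this commit introduces can be sketched framework-free. This is a minimal sketch, assuming dependency-injected document and element interfaces so it is testable outside the DOM; the actual `useClickOutside` hook presumably wraps logic like this in a React effect, with the returned function serving as the effect cleanup.

```typescript
type ClickListener = (ev: { target: unknown }) => void;

// Narrow stand-ins for `document` and an element, so the logic is
// testable without a DOM (these interfaces are illustrative, not
// part of the actual change).
interface DocumentLike {
    addEventListener(type: string, listener: ClickListener): void;
    removeEventListener(type: string, listener: ClickListener): void;
}

interface ElementLike {
    contains(target: unknown): boolean;
}

// Fire `handler` only when a pointer-down lands outside `el`.
// The returned function unregisters the listener, mirroring a
// React useEffect cleanup.
function onClickOutside(
    doc: DocumentLike,
    el: ElementLike,
    handler: () => void
): () => void {
    const listener: ClickListener = (ev) => {
        if (!el.contains(ev.target)) handler();
    };
    doc.addEventListener('mousedown', listener);
    return () => doc.removeEventListener('mousedown', listener);
}
```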

@@ -1,72 +1,5 @@
# Changelog
## [1.17.0](https://github.com/chartdb/chartdb/compare/v1.16.0...v1.17.0) (2025-10-27)
### Features
* create relationships on canvas modal ([#946](https://github.com/chartdb/chartdb/issues/946)) ([34475ad](https://github.com/chartdb/chartdb/commit/34475add32f11323589ef092ccf2a8e9152ff272))
### Bug Fixes
* add auto-increment field detection in smart-query import ([#935](https://github.com/chartdb/chartdb/issues/935)) ([57b3b87](https://github.com/chartdb/chartdb/commit/57b3b8777fd0a445abf0ba6603faab612d469d5c))
* add open table in editor from canvas edit ([#952](https://github.com/chartdb/chartdb/issues/952)) ([7d811de](https://github.com/chartdb/chartdb/commit/7d811de097eb11e51012772fa6bf586fd0b16c62))
* add rels export dbml ([#937](https://github.com/chartdb/chartdb/issues/937)) ([c3c646b](https://github.com/chartdb/chartdb/commit/c3c646bf7cbb1328f4b2eb85c9a7e929f0fcd3b9))
* add support for arrays ([#949](https://github.com/chartdb/chartdb/issues/949)) ([49328d8](https://github.com/chartdb/chartdb/commit/49328d8fbd7786f6c0c04cd5605d43a24cbf10ea))
* add support for parsing default values in DBML ([#948](https://github.com/chartdb/chartdb/issues/948)) ([459698b](https://github.com/chartdb/chartdb/commit/459698b5d0a1ff23a3719c2e55e4ab2e2384c4fe))
* add timestampz and int as datatypes to postgres ([#940](https://github.com/chartdb/chartdb/issues/940)) ([b15bc94](https://github.com/chartdb/chartdb/commit/b15bc945acb96d7cb3832b3b1b607dfcaef9e5ca))
* auto-enter edit mode when creating new tables from canvas ([#943](https://github.com/chartdb/chartdb/issues/943)) ([bcd8aa9](https://github.com/chartdb/chartdb/commit/bcd8aa9378aa563f40a2b6802cc503be4c882356))
* dbml diff fields types preview ([#934](https://github.com/chartdb/chartdb/issues/934)) ([bb03309](https://github.com/chartdb/chartdb/commit/bb033091b1f64b888822be1423a80f16f5314f6b))
* exit table edit on area click ([#945](https://github.com/chartdb/chartdb/issues/945)) ([38fedce](https://github.com/chartdb/chartdb/commit/38fedcec0c10ea2b3f0b7fc92ca1f5ac9e540389))
* import array fields ([#961](https://github.com/chartdb/chartdb/issues/961)) ([91e713c](https://github.com/chartdb/chartdb/commit/91e713c30a44f1ba7a767ca7816079610136fcb8))
* manipulate schema directly from the canvas ([#947](https://github.com/chartdb/chartdb/issues/947)) ([7ad0e77](https://github.com/chartdb/chartdb/commit/7ad0e7712de975a23b2a337dc0a4a7fb4b122bd1))
* preserve multi-word types in DBML export/import ([#956](https://github.com/chartdb/chartdb/issues/956)) ([9ed27cf](https://github.com/chartdb/chartdb/commit/9ed27cf30cca1312713e80e525138f0c27154936))
* prevent text input glitch when editing table field names ([#944](https://github.com/chartdb/chartdb/issues/944)) ([498655e](https://github.com/chartdb/chartdb/commit/498655e7b77e57eaf641ba86263ce1ef60b93e16))
* resolve canvas filter tree state issues ([#953](https://github.com/chartdb/chartdb/issues/953)) ([ccb29e0](https://github.com/chartdb/chartdb/commit/ccb29e0a574dfa4cfdf0ebf242a4c4aaa48cc37b))
* resolve dbml increment & nullable attributes issue ([#954](https://github.com/chartdb/chartdb/issues/954)) ([2c4b344](https://github.com/chartdb/chartdb/commit/2c4b344efb24041e7f607fc6124e109b69aaa457))
* show SQL Script option conditionally for databases without DDL support ([#960](https://github.com/chartdb/chartdb/issues/960)) ([acf6d4b](https://github.com/chartdb/chartdb/commit/acf6d4b3654d8868b8a8ebf717c608d9749b71da))
* use flag for custom types ([#951](https://github.com/chartdb/chartdb/issues/951)) ([62dec48](https://github.com/chartdb/chartdb/commit/62dec4857211b705a8039691da1772263ea986fe))
## [1.16.0](https://github.com/chartdb/chartdb/compare/v1.15.1...v1.16.0) (2025-09-24)
### Features
* add area context menu and UI improvements ([#918](https://github.com/chartdb/chartdb/issues/918)) ([d09379e](https://github.com/chartdb/chartdb/commit/d09379e8be0fa3c83ca77ff62ae815fe4db9869b))
* add quick table mode on canvas ([#915](https://github.com/chartdb/chartdb/issues/915)) ([8954d89](https://github.com/chartdb/chartdb/commit/8954d893bbfee45bb311380115fb14ebbf3a3133))
* add zoom navigation buttons to canvas filter for tables and areas ([#903](https://github.com/chartdb/chartdb/issues/903)) ([a0fb1ed](https://github.com/chartdb/chartdb/commit/a0fb1ed08ba18b66354fa3498d610097a83d4afc))
* **import-db:** add DBML syntax to import database dialog ([#768](https://github.com/chartdb/chartdb/issues/768)) ([af3638d](https://github.com/chartdb/chartdb/commit/af3638da7a9b70f281ceaddbc2f712a713d90cda))
### Bug Fixes
* add areas width and height + table width to diff check ([#931](https://github.com/chartdb/chartdb/issues/931)) ([98f6edd](https://github.com/chartdb/chartdb/commit/98f6edd5c8a8e9130e892b2d841744e0cf63a7bf))
* add diff x,y ([#928](https://github.com/chartdb/chartdb/issues/928)) ([e4c4a3b](https://github.com/chartdb/chartdb/commit/e4c4a3b35484d9ece955a5aec577603dde73d634))
* add support for ALTER TABLE ADD COLUMN in PostgreSQL importer ([#892](https://github.com/chartdb/chartdb/issues/892)) ([ec6e46f](https://github.com/chartdb/chartdb/commit/ec6e46fe81ea1806c179c50a4c5779d8596008aa))
* add tests for diff ([#930](https://github.com/chartdb/chartdb/issues/930)) ([47a7a73](https://github.com/chartdb/chartdb/commit/47a7a73a137b87dfa6e67aff5f939cf64ccf4601))
* dbml edit mode glitch ([#925](https://github.com/chartdb/chartdb/issues/925)) ([93d72a8](https://github.com/chartdb/chartdb/commit/93d72a896bab9aa79d8ea2f876126887e432214c))
* dbml export default time bug ([#922](https://github.com/chartdb/chartdb/issues/922)) ([bc82f9d](https://github.com/chartdb/chartdb/commit/bc82f9d6a8fe4de2f7e0fc465e0a20c5dbf8f41d))
* dbml export renaming fields bug ([#921](https://github.com/chartdb/chartdb/issues/921)) ([26dc299](https://github.com/chartdb/chartdb/commit/26dc299cd28e9890d191c13f84a15ac38ae48b11))
* **dbml:** export array fields without quotes ([#911](https://github.com/chartdb/chartdb/issues/911)) ([5e81c18](https://github.com/chartdb/chartdb/commit/5e81c1848aaa911990e1e881d62525f5254d6d34))
* diff logic ([#927](https://github.com/chartdb/chartdb/issues/927)) ([1b8d51b](https://github.com/chartdb/chartdb/commit/1b8d51b73c4ed4b7c5929adcb17a44927c7defca))
* export dbml issues after upgrade version ([#883](https://github.com/chartdb/chartdb/issues/883)) ([07937a2](https://github.com/chartdb/chartdb/commit/07937a2f51708b1c10b45c2bd1f9a9acf5c3f708))
* export sql + import metadata lib ([#902](https://github.com/chartdb/chartdb/issues/902)) ([ffddcdc](https://github.com/chartdb/chartdb/commit/ffddcdcc987bacb0e0d7e8dea27d08d3a8c5a8c8))
* handle bidirectional relationships in DBML export ([#924](https://github.com/chartdb/chartdb/issues/924)) ([9991077](https://github.com/chartdb/chartdb/commit/99910779789a9c6ef113d06bc3de31e35b9b04d1))
* import dbml set pk field unique ([#920](https://github.com/chartdb/chartdb/issues/920)) ([d6ba4a4](https://github.com/chartdb/chartdb/commit/d6ba4a40749d85d2703f120600df4345dab3c561))
* improve SQL default value parsing for PostgreSQL, MySQL, and SQL Server with proper type handling and casting support ([#900](https://github.com/chartdb/chartdb/issues/900)) ([fe9ef27](https://github.com/chartdb/chartdb/commit/fe9ef275b8619dcfd7e57541a62a6237a16d29a8))
* move area utils ([#932](https://github.com/chartdb/chartdb/issues/932)) ([2dc1a6f](https://github.com/chartdb/chartdb/commit/2dc1a6fc7519e0a455b0e1306601195deb156c96))
* move auto arrange to toolbar ([#904](https://github.com/chartdb/chartdb/issues/904)) ([b016a70](https://github.com/chartdb/chartdb/commit/b016a70691bc22af5720b4de683e8c9353994fcc))
* remove general db creation ([#901](https://github.com/chartdb/chartdb/issues/901)) ([df89f0b](https://github.com/chartdb/chartdb/commit/df89f0b6b9ba3fcc8b05bae4f60c0dc4ad1d2215))
* remove many to many rel option ([#933](https://github.com/chartdb/chartdb/issues/933)) ([c567c0a](https://github.com/chartdb/chartdb/commit/c567c0a5f39157b2c430e92192b6750304d7a834))
* reset increment and default when change field ([#896](https://github.com/chartdb/chartdb/issues/896)) ([e5e1d59](https://github.com/chartdb/chartdb/commit/e5e1d5932762422ea63acfd6cf9fe4f03aa822f7))
* **sql-import:** handle SQL Server DDL with multiple tables, inline foreign keys, and case-insensitive field matching ([#897](https://github.com/chartdb/chartdb/issues/897)) ([2a64dee](https://github.com/chartdb/chartdb/commit/2a64deebb87a11ee3892024c3273d682bb86f7ef))
* **sql-import:** support ALTER TABLE ALTER COLUMN TYPE in PostgreSQL importer ([#895](https://github.com/chartdb/chartdb/issues/895)) ([aa29061](https://github.com/chartdb/chartdb/commit/aa290615caf806d7d0374c848d50b4636fde7e96))
* **sqlite:** improve parser to handle tables without column types and fix column detection ([#914](https://github.com/chartdb/chartdb/issues/914)) ([d3dbf41](https://github.com/chartdb/chartdb/commit/d3dbf41894d74f0ffce9afe3bd810f065aa53017))
* trigger edit table on canvas from context menu ([#919](https://github.com/chartdb/chartdb/issues/919)) ([bdc41c0](https://github.com/chartdb/chartdb/commit/bdc41c0b74d9d9918e7b6cd2152fa07c0c58ce60))
* update deps vulns ([#909](https://github.com/chartdb/chartdb/issues/909)) ([2bd9ca2](https://github.com/chartdb/chartdb/commit/2bd9ca25b2c7b1f053ff4fdc8c5cfc1b0e65901d))
* upgrade dbml lib ([#880](https://github.com/chartdb/chartdb/issues/880)) ([d8e0bc7](https://github.com/chartdb/chartdb/commit/d8e0bc7db8881971ddaea7177bcebee13cc865f6))
## [1.15.1](https://github.com/chartdb/chartdb/compare/v1.15.0...v1.15.1) (2025-08-27)

package-lock.json (generated)
@@ -1,12 +1,12 @@
{
"name": "chartdb",
"version": "1.17.0",
"version": "1.15.1",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "chartdb",
"version": "1.17.0",
"version": "1.15.1",
"dependencies": {
"@ai-sdk/openai": "^0.0.51",
"@dbml/core": "^3.13.9",

@@ -1,7 +1,7 @@
{
"name": "chartdb",
"private": true,
"version": "1.17.0",
"version": "1.15.1",
"type": "module",
"scripts": {
"dev": "vite",

@@ -38,7 +38,7 @@ export interface CodeSnippetProps {
className?: string;
code: string;
codeToCopy?: string;
language?: 'sql' | 'shell' | 'dbml';
language?: 'sql' | 'shell';
loading?: boolean;
autoScroll?: boolean;
isComplete?: boolean;

@@ -9,14 +9,12 @@ export const setupDBMLLanguage = (monaco: Monaco) => {
base: 'vs-dark',
inherit: true,
rules: [
{ token: 'comment', foreground: '6A9955' }, // Comments
{ token: 'keyword', foreground: '569CD6' }, // Table, Ref keywords
{ token: 'string', foreground: 'CE9178' }, // Strings
{ token: 'annotation', foreground: '9CDCFE' }, // [annotations]
{ token: 'delimiter', foreground: 'D4D4D4' }, // Braces {}
{ token: 'operator', foreground: 'D4D4D4' }, // Operators
{ token: 'type', foreground: '4EC9B0' }, // Data types
{ token: 'identifier', foreground: '9CDCFE' }, // Field names
{ token: 'datatype', foreground: '4EC9B0' }, // Data types
],
colors: {},
});
@@ -25,14 +23,12 @@ export const setupDBMLLanguage = (monaco: Monaco) => {
base: 'vs',
inherit: true,
rules: [
{ token: 'comment', foreground: '008000' }, // Comments
{ token: 'keyword', foreground: '0000FF' }, // Table, Ref keywords
{ token: 'string', foreground: 'A31515' }, // Strings
{ token: 'annotation', foreground: '001080' }, // [annotations]
{ token: 'delimiter', foreground: '000000' }, // Braces {}
{ token: 'operator', foreground: '000000' }, // Operators
{ token: 'type', foreground: '267F99' }, // Data types
{ token: 'identifier', foreground: '001080' }, // Field names
],
colors: {},
});
@@ -41,59 +37,23 @@ export const setupDBMLLanguage = (monaco: Monaco) => {
const datatypePattern = dataTypesNames.join('|');
monaco.languages.setMonarchTokensProvider('dbml', {
keywords: ['Table', 'Ref', 'Indexes', 'Note', 'Enum', 'enum'],
keywords: ['Table', 'Ref', 'Indexes', 'Note', 'Enum'],
datatypes: dataTypesNames,
operators: ['>', '<', '-'],
tokenizer: {
root: [
// Comments
[/\/\/.*$/, 'comment'],
// Keywords - case insensitive
[
/\b([Tt][Aa][Bb][Ll][Ee]|[Ee][Nn][Uu][Mm]|[Rr][Ee][Ff]|[Ii][Nn][Dd][Ee][Xx][Ee][Ss]|[Nn][Oo][Tt][Ee])\b/,
'keyword',
],
// Annotations in brackets
[/\[.*?\]/, 'annotation'],
// Strings
[/'''/, 'string', '@tripleQuoteString'],
[/"([^"\\]|\\.)*$/, 'string.invalid'], // non-terminated string
[/'([^'\\]|\\.)*$/, 'string.invalid'], // non-terminated string
[/"/, 'string', '@string_double'],
[/'/, 'string', '@string_single'],
[/".*?"/, 'string'],
[/'.*?'/, 'string'],
[/`.*?`/, 'string'],
// Delimiters and operators
[/[{}()]/, 'delimiter'],
[/[<>-]/, 'operator'],
[/:/, 'delimiter'],
// Data types
[new RegExp(`\\b(${datatypePattern})\\b`, 'i'), 'type'],
// Numbers
[/\d+/, 'number'],
// Identifiers
[/[a-zA-Z_]\w*/, 'identifier'],
[/[{}]/, 'delimiter'],
[/[<>]/, 'operator'],
[new RegExp(`\\b(${datatypePattern})\\b`, 'i'), 'type'], // Added 'i' flag for case-insensitive matching
],
string_double: [
[/[^\\"]+/, 'string'],
[/\\./, 'string.escape'],
[/"/, 'string', '@pop'],
],
string_single: [
[/[^\\']+/, 'string'],
[/\\./, 'string.escape'],
[/'/, 'string', '@pop'],
],
tripleQuoteString: [
[/[^']+/, 'string'],
[/'''/, 'string', '@pop'],

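The tokenizer above handles case-insensitivity two ways: spelled-out character classes for keywords, and the `'i'` flag on the regex built from `dataTypesNames`. A quick check of both patterns (the keyword regex is copied from the diff; the datatype list here is an illustrative subset, since the real `dataTypesNames` is defined elsewhere):

```typescript
// Keyword pattern exactly as written in the tokenizer: explicit
// character classes instead of a flag.
const keywordPattern =
    /\b([Tt][Aa][Bb][Ll][Ee]|[Ee][Nn][Uu][Mm]|[Rr][Ee][Ff]|[Ii][Nn][Dd][Ee][Xx][Ee][Ss]|[Nn][Oo][Tt][Ee])\b/;

// Datatype pattern as built from dataTypesNames, with the 'i' flag
// (illustrative subset of type names).
const dataTypesNames = ['varchar', 'int', 'timestamp'];
const datatypePattern = new RegExp(`\\b(${dataTypesNames.join('|')})\\b`, 'i');

keywordPattern.test('TABLE'); // true
keywordPattern.test('table'); // true
datatypePattern.test('VARCHAR'); // true
```

The `\b` word boundaries keep partial matches like `tables` from being tokenized as the `Table` keyword.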
@@ -58,7 +58,6 @@ export interface SelectBoxProps {
footerButtons?: React.ReactNode;
commandOnMouseDown?: (e: React.MouseEvent) => void;
commandOnClick?: (e: React.MouseEvent) => void;
onSearchChange?: (search: string) => void;
}
export const SelectBox = React.forwardRef<HTMLInputElement, SelectBoxProps>(
@@ -88,7 +87,6 @@ export const SelectBox = React.forwardRef<HTMLInputElement, SelectBoxProps>(
footerButtons,
commandOnMouseDown,
commandOnClick,
onSearchChange,
},
ref
) => {
@@ -242,7 +240,6 @@ export const SelectBox = React.forwardRef<HTMLInputElement, SelectBoxProps>(
<CommandItem
className="flex items-center"
key={option.value}
value={option.label}
keywords={option.regex ? [option.regex] : undefined}
onSelect={() =>
handleSelect(
@@ -407,10 +404,7 @@ export const SelectBox = React.forwardRef<HTMLInputElement, SelectBoxProps>(
<div className="relative">
<CommandInput
value={searchTerm}
onValueChange={(e) => {
setSearchTerm(e);
onSearchChange?.(e);
}}
onValueChange={(e) => setSearchTerm(e)}
ref={ref}
placeholder={inputPlaceholder ?? 'Search...'}
className="h-9"

@@ -42,7 +42,6 @@ interface TreeViewProps<
renderHoverComponent?: (node: TreeNode<Type, Context>) => ReactNode;
renderActionsComponent?: (node: TreeNode<Type, Context>) => ReactNode;
loadingNodeIds?: string[];
disableCache?: boolean;
}
export function TreeView<
@@ -63,14 +62,12 @@ export function TreeView<
renderHoverComponent,
renderActionsComponent,
loadingNodeIds,
disableCache = false,
}: TreeViewProps<Type, Context>) {
const { expanded, loading, loadedChildren, hasMoreChildren, toggleNode } =
useTree({
fetchChildren,
expanded: expandedProp,
setExpanded: setExpandedProp,
disableCache,
});
const [selectedIdInternal, setSelectedIdInternal] = React.useState<
string | undefined
@@ -148,7 +145,6 @@ export function TreeView<
renderHoverComponent={renderHoverComponent}
renderActionsComponent={renderActionsComponent}
loadingNodeIds={loadingNodeIds}
disableCache={disableCache}
/>
))}
</div>
@@ -183,7 +179,6 @@ interface TreeNodeProps<
renderHoverComponent?: (node: TreeNode<Type, Context>) => ReactNode;
renderActionsComponent?: (node: TreeNode<Type, Context>) => ReactNode;
loadingNodeIds?: string[];
disableCache?: boolean;
}
function TreeNode<Type extends string, Context extends Record<Type, unknown>>({
@@ -206,16 +201,11 @@ function TreeNode<Type extends string, Context extends Record<Type, unknown>>({
renderHoverComponent,
renderActionsComponent,
loadingNodeIds,
disableCache = false,
}: TreeNodeProps<Type, Context>) {
const [isHovered, setIsHovered] = useState(false);
const isExpanded = expanded[node.id];
const isLoading = loading[node.id];
// If cache is disabled, always use fresh node.children
// Otherwise, use cached loadedChildren if available (for async fetched data)
const children = disableCache
? node.children
: node.children || loadedChildren[node.id];
const children = loadedChildren[node.id] || node.children;
const isSelected = selectedId === node.id;
const IconComponent =
@@ -433,7 +423,6 @@ function TreeNode<Type extends string, Context extends Record<Type, unknown>>({
renderHoverComponent={renderHoverComponent}
renderActionsComponent={renderActionsComponent}
loadingNodeIds={loadingNodeIds}
disableCache={disableCache}
/>
))}
{isLoading ? (

@@ -28,12 +28,10 @@ export function useTree<
fetchChildren,
expanded: expandedProp,
setExpanded: setExpandedProp,
disableCache = false,
}: {
fetchChildren?: FetchChildrenFunction<Type, Context>;
expanded?: ExpandedState;
setExpanded?: Dispatch<SetStateAction<ExpandedState>>;
disableCache?: boolean;
}) {
const [expandedInternal, setExpandedInternal] = useState<ExpandedState>({});
@@ -91,8 +89,8 @@ export function useTree<
// Get any previously fetched children
const previouslyFetchedChildren = loadedChildren[nodeId] || [];
// Only cache if caching is enabled
if (!disableCache && staticChildren?.length) {
// If we have static children, merge them with any previously fetched children
if (staticChildren?.length) {
const mergedChildren = mergeChildren(
staticChildren,
previouslyFetchedChildren
@@ -112,8 +110,8 @@ export function useTree<
// Set expanded state immediately to show static/previously fetched children
setExpanded((prev) => ({ ...prev, [nodeId]: true }));
// If we haven't loaded dynamic children yet and cache is enabled
if (!disableCache && !previouslyFetchedChildren.length) {
// If we haven't loaded dynamic children yet
if (!previouslyFetchedChildren.length) {
setLoading((prev) => ({ ...prev, [nodeId]: true }));
try {
const fetchedChildren = await fetchChildren?.(
@@ -142,14 +140,7 @@ export function useTree<
}
}
},
[
expanded,
loadedChildren,
fetchChildren,
mergeChildren,
setExpanded,
disableCache,
]
[expanded, loadedChildren, fetchChildren, mergeChildren, setExpanded]
);
return {

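The hook above merges static children with previously fetched ones before deciding whether to fetch. `mergeChildren` itself is not shown in this diff; a plausible shape, assuming nodes are deduplicated by `id` with fetched data overriding static entries, would be:

```typescript
// Minimal node shape for illustration; the real TreeNode type is
// generic over node type and context.
interface TreeNodeLite {
    id: string;
    name?: string;
}

// Hypothetical mergeChildren: keep static order, let fetched
// children replace same-id entries and append new ones.
function mergeChildren<T extends TreeNodeLite>(
    staticChildren: T[],
    fetchedChildren: T[]
): T[] {
    const byId = new Map<string, T>();
    for (const child of staticChildren) byId.set(child.id, child);
    for (const child of fetchedChildren) byId.set(child.id, child);
    return [...byId.values()];
}
```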
@@ -24,31 +24,6 @@ export interface CanvasContext {
fieldId?: string;
} | null>
>;
tempFloatingEdge: {
sourceNodeId: string;
targetNodeId?: string;
} | null;
setTempFloatingEdge: React.Dispatch<
React.SetStateAction<{
sourceNodeId: string;
targetNodeId?: string;
} | null>
>;
startFloatingEdgeCreation: ({
sourceNodeId,
}: {
sourceNodeId: string;
}) => void;
endFloatingEdgeCreation: () => void;
hoveringTableId: string | null;
setHoveringTableId: React.Dispatch<React.SetStateAction<string | null>>;
showCreateRelationshipNode: (params: {
sourceTableId: string;
targetTableId: string;
x: number;
y: number;
}) => void;
hideCreateRelationshipNode: () => void;
}
export const canvasContext = createContext<CanvasContext>({
@@ -60,12 +35,4 @@ export const canvasContext = createContext<CanvasContext>({
showFilter: false,
editTableModeTable: null,
setEditTableModeTable: emptyFn,
tempFloatingEdge: null,
setTempFloatingEdge: emptyFn,
startFloatingEdgeCreation: emptyFn,
endFloatingEdgeCreation: emptyFn,
hoveringTableId: null,
setHoveringTableId: emptyFn,
showCreateRelationshipNode: emptyFn,
hideCreateRelationshipNode: emptyFn,
});

@@ -5,7 +5,6 @@ import React, {
useEffect,
useRef,
} from 'react';
import type { CanvasContext } from './canvas-context';
import { canvasContext } from './canvas-context';
import { useChartDB } from '@/hooks/use-chartdb';
import { adjustTablePositions } from '@/lib/domain/db-table';
@@ -16,10 +15,6 @@ import { createGraph } from '@/lib/graph';
import { useDiagramFilter } from '../diagram-filter-context/use-diagram-filter';
import { filterTable } from '@/lib/domain/diagram-filter/filter';
import { defaultSchemas } from '@/lib/data/default-schemas';
import {
CREATE_RELATIONSHIP_NODE_ID,
type CreateRelationshipNodeType,
} from '@/pages/editor-page/canvas/create-relationship-node/create-relationship-node';
interface CanvasProviderProps {
children: ReactNode;
@@ -35,7 +30,7 @@ export const CanvasProvider = ({ children }: CanvasProviderProps) => {
diagramId,
} = useChartDB();
const { filter, loading: filterLoading } = useDiagramFilter();
const { fitView, screenToFlowPosition, setNodes } = useReactFlow();
const { fitView } = useReactFlow();
const [overlapGraph, setOverlapGraph] =
useState<Graph<string>>(createGraph());
const [editTableModeTable, setEditTableModeTable] = useState<{
@@ -44,12 +39,6 @@ export const CanvasProvider = ({ children }: CanvasProviderProps) => {
} | null>(null);
const [showFilter, setShowFilter] = useState(false);
const [tempFloatingEdge, setTempFloatingEdge] =
useState<CanvasContext['tempFloatingEdge']>(null);
const [hoveringTableId, setHoveringTableId] = useState<string | null>(null);
const diagramIdActiveFilterRef = useRef<string>();
useEffect(() => {
@@ -133,66 +122,6 @@ export const CanvasProvider = ({ children }: CanvasProviderProps) => {
]
);
const startFloatingEdgeCreation: CanvasContext['startFloatingEdgeCreation'] =
useCallback(({ sourceNodeId }) => {
setShowFilter(false);
setTempFloatingEdge({
sourceNodeId,
});
}, []);
const endFloatingEdgeCreation: CanvasContext['endFloatingEdgeCreation'] =
useCallback(() => {
setTempFloatingEdge(null);
}, []);
const hideCreateRelationshipNode: CanvasContext['hideCreateRelationshipNode'] =
useCallback(() => {
setNodes((nds) =>
nds.filter((n) => n.id !== CREATE_RELATIONSHIP_NODE_ID)
);
endFloatingEdgeCreation();
}, [setNodes, endFloatingEdgeCreation]);
const showCreateRelationshipNode: CanvasContext['showCreateRelationshipNode'] =
useCallback(
({ sourceTableId, targetTableId, x, y }) => {
setTempFloatingEdge((edge) =>
edge
? {
...edge,
targetNodeId: targetTableId,
}
: null
);
const cursorPos = screenToFlowPosition({
x,
y,
});
const newNode: CreateRelationshipNodeType = {
id: CREATE_RELATIONSHIP_NODE_ID,
type: 'create-relationship',
position: cursorPos,
data: {
sourceTableId,
targetTableId,
},
draggable: true,
selectable: false,
zIndex: 1000,
};
setNodes((nds) => {
const nodesWithoutOldCreateRelationshipNode = nds.filter(
(n) => n.id !== CREATE_RELATIONSHIP_NODE_ID
);
return [...nodesWithoutOldCreateRelationshipNode, newNode];
});
},
[screenToFlowPosition, setNodes]
);
return (
<canvasContext.Provider
value={{
@@ -204,14 +133,6 @@ export const CanvasProvider = ({ children }: CanvasProviderProps) => {
showFilter,
editTableModeTable,
setEditTableModeTable,
tempFloatingEdge: tempFloatingEdge,
setTempFloatingEdge: setTempFloatingEdge,
startFloatingEdgeCreation: startFloatingEdgeCreation,
endFloatingEdgeCreation: endFloatingEdgeCreation,
hoveringTableId,
setHoveringTableId,
showCreateRelationshipNode,
hideCreateRelationshipNode,
}}
>
{children}

@@ -74,10 +74,10 @@ export const ChartDBProvider: React.FC<
useState<string>();
const diffCalculatedHandler = useCallback((event: DiffCalculatedEvent) => {
const { tablesToAdd, fieldsToAdd, relationshipsToAdd } = event.data;
const { tablesAdded, fieldsAdded, relationshipsAdded } = event.data;
setTables((tables) =>
[...tables, ...(tablesToAdd ?? [])].map((table) => {
const fields = fieldsToAdd.get(table.id);
[...tables, ...(tablesAdded ?? [])].map((table) => {
const fields = fieldsAdded.get(table.id);
return fields
? { ...table, fields: [...table.fields, ...fields] }
: table;
@@ -85,7 +85,7 @@ export const ChartDBProvider: React.FC<
);
setRelationships((relationships) => [
...relationships,
...(relationshipsToAdd ?? []),
...(relationshipsAdded ?? []),
]);
}, []);
@@ -350,7 +350,6 @@ export const ChartDBProvider: React.FC<
isView: false,
order: tables.length,
...attributes,
schema: attributes?.schema ?? defaultSchemas[databaseType],
};
table.indexes = getTableIndexesWithPrimaryKey({

@@ -7,6 +7,7 @@ import type { ExportImageDialogProps } from '@/dialogs/export-image-dialog/expor
import type { ExportDiagramDialogProps } from '@/dialogs/export-diagram-dialog/export-diagram-dialog';
import type { ImportDiagramDialogProps } from '@/dialogs/import-diagram-dialog/import-diagram-dialog';
import type { CreateRelationshipDialogProps } from '@/dialogs/create-relationship-dialog/create-relationship-dialog';
import type { ImportDBMLDialogProps } from '@/dialogs/import-dbml-dialog/import-dbml-dialog';
import type { OpenDiagramDialogProps } from '@/dialogs/open-diagram-dialog/open-diagram-dialog';
import type { CreateDiagramDialogProps } from '@/dialogs/create-diagram-dialog/create-diagram-dialog';
@@ -66,6 +67,12 @@ export interface DialogContext {
params: Omit<ImportDiagramDialogProps, 'dialog'>
) => void;
closeImportDiagramDialog: () => void;
// Import DBML dialog
openImportDBMLDialog: (
params?: Omit<ImportDBMLDialogProps, 'dialog'>
) => void;
closeImportDBMLDialog: () => void;
}
export const dialogContext = createContext<DialogContext>({
@@ -89,4 +96,6 @@ export const dialogContext = createContext<DialogContext>({
closeExportDiagramDialog: emptyFn,
openImportDiagramDialog: emptyFn,
closeImportDiagramDialog: emptyFn,
openImportDBMLDialog: emptyFn,
closeImportDBMLDialog: emptyFn,
});

@@ -20,6 +20,8 @@ import type { ExportImageDialogProps } from '@/dialogs/export-image-dialog/expor
import { ExportImageDialog } from '@/dialogs/export-image-dialog/export-image-dialog';
import { ExportDiagramDialog } from '@/dialogs/export-diagram-dialog/export-diagram-dialog';
import { ImportDiagramDialog } from '@/dialogs/import-diagram-dialog/import-diagram-dialog';
import type { ImportDBMLDialogProps } from '@/dialogs/import-dbml-dialog/import-dbml-dialog';
import { ImportDBMLDialog } from '@/dialogs/import-dbml-dialog/import-dbml-dialog';
export const DialogProvider: React.FC<React.PropsWithChildren> = ({
children,
@@ -130,6 +132,11 @@ export const DialogProvider: React.FC<React.PropsWithChildren> = ({
const [openImportDiagramDialog, setOpenImportDiagramDialog] =
useState(false);
// Import DBML dialog
const [openImportDBMLDialog, setOpenImportDBMLDialog] = useState(false);
const [importDBMLDialogParams, setImportDBMLDialogParams] =
useState<Omit<ImportDBMLDialogProps, 'dialog'>>();
return (
<dialogContext.Provider
value={{
@@ -158,6 +165,11 @@ export const DialogProvider: React.FC<React.PropsWithChildren> = ({
openImportDiagramDialog: () => setOpenImportDiagramDialog(true),
closeImportDiagramDialog: () =>
setOpenImportDiagramDialog(false),
openImportDBMLDialog: (params) => {
setImportDBMLDialogParams(params);
setOpenImportDBMLDialog(true);
},
closeImportDBMLDialog: () => setOpenImportDBMLDialog(false),
}}
>
{children}
@@ -192,6 +204,10 @@ export const DialogProvider: React.FC<React.PropsWithChildren> = ({
/>
<ExportDiagramDialog dialog={{ open: openExportDiagramDialog }} />
<ImportDiagramDialog dialog={{ open: openImportDiagramDialog }} />
<ImportDBMLDialog
dialog={{ open: openImportDBMLDialog }}
{...importDBMLDialogParams}
/>
</dialogContext.Provider>
);
};

@@ -15,9 +15,9 @@ export type DiffEventBase<T extends DiffEventType, D> = {
};
export type DiffCalculatedData = {
tablesToAdd: DBTable[];
fieldsToAdd: Map<string, DBField[]>;
relationshipsToAdd: DBRelationship[];
tablesAdded: DBTable[];
fieldsAdded: Map<string, DBField[]>;
relationshipsAdded: DBRelationship[];
};
export type DiffCalculatedEvent = DiffEventBase<
@@ -44,21 +44,15 @@ export interface DiffContext {
options?: {
summaryOnly?: boolean;
};
}) => { foundDiff: boolean };
}) => void;
resetDiff: () => void;
// table diff
checkIfTableHasChange: ({ tableId }: { tableId: string }) => boolean;
checkIfNewTable: ({ tableId }: { tableId: string }) => boolean;
checkIfTableRemoved: ({ tableId }: { tableId: string }) => boolean;
getTableNewName: ({ tableId }: { tableId: string }) => {
old: string;
new: string;
} | null;
getTableNewColor: ({ tableId }: { tableId: string }) => {
old: string;
new: string;
} | null;
getTableNewName: ({ tableId }: { tableId: string }) => string | null;
getTableNewColor: ({ tableId }: { tableId: string }) => string | null;
// field diff
checkIfFieldHasChange: ({
@@ -70,46 +64,17 @@ export interface DiffContext {
}) => boolean;
checkIfFieldRemoved: ({ fieldId }: { fieldId: string }) => boolean;
checkIfNewField: ({ fieldId }: { fieldId: string }) => boolean;
getFieldNewName: ({
fieldId,
}: {
fieldId: string;
}) => { old: string; new: string } | null;
getFieldNewType: ({
fieldId,
}: {
fieldId: string;
}) => { old: DataType; new: DataType } | null;
getFieldNewPrimaryKey: ({
fieldId,
}: {
fieldId: string;
}) => { old: boolean; new: boolean } | null;
getFieldNewNullable: ({
fieldId,
}: {
fieldId: string;
}) => { old: boolean; new: boolean } | null;
getFieldNewName: ({ fieldId }: { fieldId: string }) => string | null;
getFieldNewType: ({ fieldId }: { fieldId: string }) => DataType | null;
getFieldNewPrimaryKey: ({ fieldId }: { fieldId: string }) => boolean | null;
getFieldNewNullable: ({ fieldId }: { fieldId: string }) => boolean | null;
getFieldNewCharacterMaximumLength: ({
fieldId,
}: {
fieldId: string;
}) => { old: string; new: string } | null;
getFieldNewScale: ({
fieldId,
}: {
fieldId: string;
}) => { old: number; new: number } | null;
getFieldNewPrecision: ({
fieldId,
}: {
fieldId: string;
}) => { old: number; new: number } | null;
getFieldNewIsArray: ({
fieldId,
}: {
fieldId: string;
}) => { old: boolean; new: boolean } | null;
}) => string | null;
getFieldNewScale: ({ fieldId }: { fieldId: string }) => number | null;
getFieldNewPrecision: ({ fieldId }: { fieldId: string }) => number | null;
// relationship diff
checkIfNewRelationship: ({

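The getters in this interface were simplified from returning `{ old, new }` pairs to returning only the new value. The pattern each simplified getter in DiffProvider now follows reduces to one generic lookup (a sketch; the real code builds each `key` per table/field via `getDiffMapKey`):

```typescript
type DiffType = 'changed' | 'added' | 'removed';

interface DiffEntry {
    type: DiffType;
    oldValue?: unknown;
    newValue?: unknown;
}

// Return only the new value for a 'changed' entry, else null —
// the shape every simplified getter now shares.
function getNewValue<T>(
    diffMap: Map<string, DiffEntry>,
    key: string
): T | null {
    const diff = diffMap.get(key);
    if (diff?.type === 'changed') {
        return diff.newValue as T;
    }
    return null;
}
```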
@@ -36,7 +36,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const events = useEventEmitter<DiffEvent>();
const generateFieldsToAddMap = useCallback(
const generateNewFieldsMap = useCallback(
({
diffMap,
newDiagram,
@@ -66,7 +66,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
[]
);
const findRelationshipsToAdd = useCallback(
const findNewRelationships = useCallback(
({
diffMap,
newDiagram,
@@ -101,7 +101,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
diffMap: DiffMap;
}): DiffCalculatedData => {
return {
tablesToAdd:
tablesAdded:
newDiagram?.tables?.filter((table) => {
const tableKey = getDiffMapKey({
diffObject: 'table',
@@ -114,17 +114,17 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
);
}) ?? [],
fieldsToAdd: generateFieldsToAddMap({
fieldsAdded: generateNewFieldsMap({
diffMap: diffMap,
newDiagram: newDiagram,
}),
relationshipsToAdd: findRelationshipsToAdd({
relationshipsAdded: findNewRelationships({
diffMap: diffMap,
newDiagram: newDiagram,
}),
};
},
[findRelationshipsToAdd, generateFieldsToAddMap]
[findNewRelationships, generateNewFieldsMap]
);
const calculateDiff: DiffContext['calculateDiff'] = useCallback(
@@ -149,8 +149,6 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
newDiagram: newDiagramArg,
}),
});
return { foundDiff: !!newDiffs.size };
},
[setDiffMap, events, generateDiffCalculatedData]
);
@@ -167,10 +165,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(tableNameKey);
if (diff?.type === 'changed') {
return {
new: diff.newValue as string,
old: diff.oldValue as string,
};
return diff.newValue as string;
}
}
@@ -191,10 +186,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(tableColorKey);
if (diff?.type === 'changed') {
return {
new: diff.newValue as string,
old: diff.oldValue as string,
};
return diff.newValue as string;
}
}
return null;
@@ -285,10 +277,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as string,
new: diff.newValue as string,
};
return diff.newValue as string;
}
}
@@ -309,10 +298,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as DataType,
new: diff.newValue as DataType,
};
return diff.newValue as DataType;
}
}
@@ -335,10 +321,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as boolean,
new: diff.newValue as boolean,
};
return diff.newValue as boolean;
}
}
@@ -359,10 +342,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as boolean,
new: diff.newValue as boolean,
};
return diff.newValue as boolean;
}
}
@@ -385,10 +365,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as string,
new: diff.newValue as string,
};
return diff.newValue as string;
}
}
@@ -409,10 +386,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as number,
new: diff.newValue as number,
};
return diff.newValue as number;
}
}
@@ -435,34 +409,7 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as number,
new: diff.newValue as number,
};
}
}
return null;
},
[diffMap]
);
const getFieldNewIsArray = useCallback<DiffContext['getFieldNewIsArray']>(
({ fieldId }) => {
const fieldKey = getDiffMapKey({
diffObject: 'field',
objectId: fieldId,
attribute: 'isArray',
});
if (diffMap.has(fieldKey)) {
const diff = diffMap.get(fieldKey);
if (diff?.type === 'changed') {
return {
old: diff.oldValue as boolean,
new: diff.newValue as boolean,
};
return diff.newValue as number;
}
}
@@ -544,7 +491,6 @@ export const DiffProvider: React.FC<React.PropsWithChildren> = ({
getFieldNewCharacterMaximumLength,
getFieldNewScale,
getFieldNewPrecision,
getFieldNewIsArray,
// relationship diff
checkIfNewRelationship,
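The hunks above repeat one lookup pattern: build a composite key per (object, id, attribute), then return `diff.newValue` when the entry is a `changed` diff. A DOM/React-free sketch of that pattern (the key format and types here are assumptions for illustration; the repo's actual `getDiffMapKey` may encode keys differently):

```typescript
// Minimal stand-in for the repo's diff-map entries (assumed shape).
type DiffEntry = { type: 'changed'; oldValue: unknown; newValue: unknown };

// Hypothetical key encoding: "object:id:attribute".
function getDiffMapKey(o: {
    diffObject: string;
    objectId: string;
    attribute: string;
}): string {
    return `${o.diffObject}:${o.objectId}:${o.attribute}`;
}

// Generic form of every getFieldNewX helper after this change:
// return only the new value, or null when nothing changed.
function getNewValue<T>(
    diffMap: Map<string, DiffEntry>,
    key: string
): T | null {
    const diff = diffMap.get(key);
    return diff?.type === 'changed' ? (diff.newValue as T) : null;
}
```

Collapsing the old `{ old, new }` pairs to a single `newValue` is what lets each hook in the hunks shrink to a one-branch lookup.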

View File

@@ -42,14 +42,6 @@ import {
type ValidationResult,
} from '@/lib/data/sql-import/sql-validator';
import { SQLValidationStatus } from './sql-validation-status';
import { setupDBMLLanguage } from '@/components/code-snippet/languages/dbml-language';
import type { ImportMethod } from '@/lib/import-method/import-method';
import { detectImportMethod } from '@/lib/import-method/detect-import-method';
import { verifyDBML } from '@/lib/dbml/dbml-import/verify-dbml';
import {
clearErrorHighlight,
highlightErrorLine,
} from '@/components/code-snippet/dbml/utils';
const calculateContentSizeMB = (content: string): number => {
return content.length / (1024 * 1024); // Convert to MB
@@ -63,6 +55,49 @@ const calculateIsLargeFile = (content: string): boolean => {
const errorScriptOutputMessage =
'Invalid JSON. Please correct it or contact us at support@chartdb.io for help.';
// Helper to detect if content is likely SQL DDL or JSON
const detectContentType = (content: string): 'query' | 'ddl' | null => {
if (!content || content.trim().length === 0) return null;
// Common SQL DDL keywords
const ddlKeywords = [
'CREATE TABLE',
'ALTER TABLE',
'DROP TABLE',
'CREATE INDEX',
'CREATE VIEW',
'CREATE PROCEDURE',
'CREATE FUNCTION',
'CREATE SCHEMA',
'CREATE DATABASE',
];
const upperContent = content.toUpperCase();
// Check for SQL DDL patterns
const hasDDLKeywords = ddlKeywords.some((keyword) =>
upperContent.includes(keyword)
);
if (hasDDLKeywords) return 'ddl';
// Check if it looks like JSON
try {
// Just check structure, don't need full parse for detection
if (
(content.trim().startsWith('{') && content.trim().endsWith('}')) ||
(content.trim().startsWith('[') && content.trim().endsWith(']'))
) {
return 'query';
}
} catch (error) {
// Not valid JSON, might be partial
console.error('Error detecting content type:', error);
}
// If we can't confidently detect, return null
return null;
};
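The detection order in `detectContentType` matters: DDL keywords win first, and only then does brace/bracket structure mark the content as Smart Query JSON. A standalone sketch of that logic (trimmed keyword list; outside React, for illustration only):

```typescript
// DDL keywords are checked before JSON shape, mirroring the order above.
const DDL_KEYWORDS = [
    'CREATE TABLE',
    'ALTER TABLE',
    'DROP TABLE',
    'CREATE INDEX',
    'CREATE VIEW',
];

function detectContentType(content: string): 'query' | 'ddl' | null {
    const trimmed = content.trim();
    if (trimmed.length === 0) return null;

    // Any DDL keyword anywhere in the text wins.
    const upper = content.toUpperCase();
    if (DDL_KEYWORDS.some((kw) => upper.includes(kw))) return 'ddl';

    // JSON-looking structure is treated as Smart Query output.
    if (
        (trimmed.startsWith('{') && trimmed.endsWith('}')) ||
        (trimmed.startsWith('[') && trimmed.endsWith(']'))
    ) {
        return 'query';
    }

    // Ambiguous content: caller keeps the current import mode.
    return null;
}
```

Returning `null` for ambiguous input is what keeps the paste handler from switching modes unless detection is confident.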
export interface ImportDatabaseProps {
goBack?: () => void;
onImport: () => void;
@@ -76,8 +111,8 @@ export interface ImportDatabaseProps {
>;
keepDialogAfterImport?: boolean;
title: string;
importMethod: ImportMethod;
setImportMethod: (method: ImportMethod) => void;
importMethod: 'query' | 'ddl';
setImportMethod: (method: 'query' | 'ddl') => void;
}
export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
@@ -97,7 +132,6 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
const { effectiveTheme } = useTheme();
const [errorMessage, setErrorMessage] = useState('');
const editorRef = useRef<editor.IStandaloneCodeEditor | null>(null);
const decorationsCollection = useRef<editor.IEditorDecorationsCollection>();
const pasteDisposableRef = useRef<IDisposable | null>(null);
const { t } = useTranslation();
@@ -112,20 +146,15 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
const [isAutoFixing, setIsAutoFixing] = useState(false);
const [showAutoFixButton, setShowAutoFixButton] = useState(false);
const clearDecorations = useCallback(() => {
clearErrorHighlight(decorationsCollection.current);
}, []);
useEffect(() => {
setScriptResult('');
setErrorMessage('');
setShowCheckJsonButton(false);
}, [importMethod, setScriptResult]);
// Check if the ddl or dbml is valid
// Check if the ddl is valid
useEffect(() => {
clearDecorations();
if (importMethod === 'query') {
if (importMethod !== 'ddl') {
setSqlValidation(null);
setShowAutoFixButton(false);
return;
@@ -134,54 +163,9 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
if (!scriptResult.trim()) {
setSqlValidation(null);
setShowAutoFixButton(false);
setErrorMessage('');
return;
}
if (importMethod === 'dbml') {
// Validate DBML by parsing it
const validateResponse = verifyDBML(scriptResult, { databaseType });
if (!validateResponse.hasError) {
setErrorMessage('');
setSqlValidation({
isValid: true,
errors: [],
warnings: [],
});
} else {
let errorMsg = 'Invalid DBML syntax';
let line: number = 1;
if (validateResponse.parsedError) {
errorMsg = validateResponse.parsedError.message;
line = validateResponse.parsedError.line;
highlightErrorLine({
error: validateResponse.parsedError,
model: editorRef.current?.getModel(),
editorDecorationsCollection:
decorationsCollection.current,
});
}
setSqlValidation({
isValid: false,
errors: [
{
message: errorMsg,
line: line,
type: 'syntax' as const,
},
],
warnings: [],
});
setErrorMessage(errorMsg);
}
setShowAutoFixButton(false);
return;
}
// SQL validation
// First run our validation based on database type
const validation = validateSQL(scriptResult, databaseType);
setSqlValidation(validation);
@@ -208,7 +192,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
setErrorMessage(result.error);
}
});
}, [importMethod, scriptResult, databaseType, clearDecorations]);
}, [importMethod, scriptResult, databaseType]);
// Check if the script result is a valid JSON
useEffect(() => {
@@ -336,8 +320,6 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
const handleEditorDidMount = useCallback(
(editor: editor.IStandaloneCodeEditor) => {
editorRef.current = editor;
decorationsCollection.current =
editor.createDecorationsCollection();
// Cleanup previous disposable if it exists
if (pasteDisposableRef.current) {
@@ -356,7 +338,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
const isLargeFile = calculateIsLargeFile(content);
// First, detect content type to determine if we should switch modes
const detectedType = detectImportMethod(content);
const detectedType = detectContentType(content);
if (detectedType && detectedType !== importMethod) {
// Switch to the detected mode immediately
setImportMethod(detectedType);
@@ -370,7 +352,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
?.run();
}, 100);
}
// For DDL and DBML modes, do NOT format as it can break the syntax
// For DDL mode, do NOT format as it can break the SQL
} else {
// Content type didn't change, apply formatting based on current mode
if (importMethod === 'query' && !isLargeFile) {
@@ -381,7 +363,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
?.run();
}, 100);
}
// For DDL and DBML modes or large files, do NOT format
// For DDL mode or large files, do NOT format
}
});
@@ -428,25 +410,16 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
<div className="w-full text-center text-xs text-muted-foreground">
{importMethod === 'query'
? 'Smart Query Output'
: importMethod === 'dbml'
? 'DBML Script'
: 'SQL Script'}
: 'SQL Script'}
</div>
<div className="flex-1 overflow-hidden">
<Suspense fallback={<Spinner />}>
<Editor
value={scriptResult}
onChange={debouncedHandleInputChange}
language={
importMethod === 'query'
? 'json'
: importMethod === 'dbml'
? 'dbml'
: 'sql'
}
language={importMethod === 'query' ? 'json' : 'sql'}
loading={<Spinner />}
onMount={handleEditorDidMount}
beforeMount={setupDBMLLanguage}
theme={
effectiveTheme === 'dark'
? 'dbml-dark'
@@ -457,6 +430,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
minimap: { enabled: false },
scrollBeyondLastLine: false,
automaticLayout: true,
glyphMargin: false,
lineNumbers: 'on',
guides: {
indentation: false,
@@ -481,9 +455,7 @@ export const ImportDatabase: React.FC<ImportDatabaseProps> = ({
</Suspense>
</div>
{errorMessage ||
((importMethod === 'ddl' || importMethod === 'dbml') &&
sqlValidation) ? (
{errorMessage || (importMethod === 'ddl' && sqlValidation) ? (
<SQLValidationStatus
validation={sqlValidation}
errorMessage={errorMessage}

View File

@@ -15,11 +15,9 @@ import {
AvatarImage,
} from '@/components/avatar/avatar';
import { useTranslation } from 'react-i18next';
import { Code, FileCode } from 'lucide-react';
import { Code } from 'lucide-react';
import { SmartQueryInstructions } from './instructions/smart-query-instructions';
import { DDLInstructions } from './instructions/ddl-instructions';
import { DBMLInstructions } from './instructions/dbml-instructions';
import type { ImportMethod } from '@/lib/import-method/import-method';
const DatabasesWithoutDDLInstructions: DatabaseType[] = [
DatabaseType.CLICKHOUSE,
@@ -32,8 +30,8 @@ export interface InstructionsSectionProps {
setDatabaseEdition: React.Dispatch<
React.SetStateAction<DatabaseEdition | undefined>
>;
importMethod: ImportMethod;
setImportMethod: (method: ImportMethod) => void;
importMethod: 'query' | 'ddl';
setImportMethod: (method: 'query' | 'ddl') => void;
showSSMSInfoDialog: boolean;
setShowSSMSInfoDialog: (show: boolean) => void;
}
@@ -117,60 +115,48 @@ export const InstructionsSection: React.FC<InstructionsSectionProps> = ({
</div>
) : null}
<div className="flex flex-col gap-1">
<p className="text-sm leading-6 text-primary">
How would you like to import?
</p>
<ToggleGroup
type="single"
className="ml-1 flex-wrap justify-start gap-2"
value={importMethod}
onValueChange={(value) => {
let selectedImportMethod: ImportMethod = 'query';
if (value) {
selectedImportMethod = value as ImportMethod;
}
{DatabasesWithoutDDLInstructions.includes(databaseType) ? null : (
<div className="flex flex-col gap-1">
<p className="text-sm leading-6 text-primary">
How would you like to import?
</p>
<ToggleGroup
type="single"
className="ml-1 flex-wrap justify-start gap-2"
value={importMethod}
onValueChange={(value) => {
let selectedImportMethod: 'query' | 'ddl' = 'query';
if (value) {
selectedImportMethod = value as 'query' | 'ddl';
}
setImportMethod(selectedImportMethod);
}}
>
<ToggleGroupItem
value="query"
variant="outline"
className="h-6 gap-1 p-0 px-2 shadow-none data-[state=on]:bg-slate-200 dark:data-[state=on]:bg-slate-700"
setImportMethod(selectedImportMethod);
}}
>
<Avatar className="h-3 w-4 rounded-none">
<AvatarImage src={logo} alt="query" />
<AvatarFallback>Query</AvatarFallback>
</Avatar>
Smart Query
</ToggleGroupItem>
{!DatabasesWithoutDDLInstructions.includes(
databaseType
) && (
<ToggleGroupItem
value="query"
variant="outline"
className="h-6 gap-1 p-0 px-2 shadow-none data-[state=on]:bg-slate-200 dark:data-[state=on]:bg-slate-700"
>
<Avatar className="h-3 w-4 rounded-none">
<AvatarImage src={logo} alt="query" />
<AvatarFallback>Query</AvatarFallback>
</Avatar>
Smart Query
</ToggleGroupItem>
<ToggleGroupItem
value="ddl"
variant="outline"
className="h-6 gap-1 p-0 px-2 shadow-none data-[state=on]:bg-slate-200 dark:data-[state=on]:bg-slate-700"
>
<Avatar className="size-4 rounded-none">
<FileCode size={16} />
<Code size={16} />
</Avatar>
SQL Script
</ToggleGroupItem>
)}
<ToggleGroupItem
value="dbml"
variant="outline"
className="h-6 gap-1 p-0 px-2 shadow-none data-[state=on]:bg-slate-200 dark:data-[state=on]:bg-slate-700"
>
<Avatar className="size-4 rounded-none">
<Code size={16} />
</Avatar>
DBML
</ToggleGroupItem>
</ToggleGroup>
</div>
</ToggleGroup>
</div>
)}
<div className="flex flex-col gap-2">
<div className="text-sm font-semibold">Instructions:</div>
@@ -181,13 +167,8 @@ export const InstructionsSection: React.FC<InstructionsSectionProps> = ({
showSSMSInfoDialog={showSSMSInfoDialog}
setShowSSMSInfoDialog={setShowSSMSInfoDialog}
/>
) : importMethod === 'ddl' ? (
<DDLInstructions
databaseType={databaseType}
databaseEdition={databaseEdition}
/>
) : (
<DBMLInstructions
<DDLInstructions
databaseType={databaseType}
databaseEdition={databaseEdition}
/>

View File

@@ -1,47 +0,0 @@
import React from 'react';
import type { DatabaseType } from '@/lib/domain/database-type';
import type { DatabaseEdition } from '@/lib/domain/database-edition';
import { CodeSnippet } from '@/components/code-snippet/code-snippet';
import { setupDBMLLanguage } from '@/components/code-snippet/languages/dbml-language';
export interface DBMLInstructionsProps {
databaseType: DatabaseType;
databaseEdition?: DatabaseEdition;
}
export const DBMLInstructions: React.FC<DBMLInstructionsProps> = () => {
return (
<>
<div className="flex flex-col gap-1 text-sm text-primary">
<div>
Paste your DBML (Database Markup Language) schema definition
here
</div>
</div>
<div className="flex h-64 flex-col gap-1 text-sm text-primary">
<h4 className="text-xs font-medium">Example:</h4>
<CodeSnippet
className="h-full"
allowCopy={false}
editorProps={{
beforeMount: setupDBMLLanguage,
}}
code={`Table users {
id int [pk]
username varchar
email varchar
}
Table posts {
id int [pk]
user_id int [ref: > users.id]
title varchar
content text
}`}
language={'dbml'}
/>
</div>
</>
);
};

View File

@@ -43,8 +43,8 @@ const DDLInstructionsMap: Record<DatabaseType, DDLInstruction[]> = {
},
{
text: 'Execute the following command in your terminal:',
code: `sqlite3 <database_file_path>\n".schema" > <output_file_path>`,
example: `sqlite3 my_db.db\n".schema" > schema_export.sql`,
code: `sqlite3 <database_file_path>\n.dump > <output_file_path>`,
example: `sqlite3 my_db.db\n.dump > schema_export.sql`,
},
{
text: 'Open the exported SQL file, copy its contents, and paste them here.',

View File

@@ -73,7 +73,7 @@ export const SQLValidationStatus: React.FC<SQLValidationStatusProps> = ({
{hasErrors ? (
<div className="rounded-md border border-red-200 bg-red-50 dark:border-red-800 dark:bg-red-950">
<ScrollArea className="h-fit max-h-24">
<ScrollArea className="h-24">
<div className="space-y-3 p-3 pt-2 text-red-700 dark:text-red-300">
{validation?.errors
.slice(0, 3)
@@ -137,7 +137,7 @@ export const SQLValidationStatus: React.FC<SQLValidationStatusProps> = ({
{hasWarnings && !hasErrors ? (
<div className="rounded-md border border-sky-200 bg-sky-50 dark:border-sky-800 dark:bg-sky-950">
<ScrollArea className="h-fit max-h-24">
<ScrollArea className="h-24">
<div className="space-y-3 p-3 pt-2 text-sky-700 dark:text-sky-300">
<div className="flex items-start gap-2">
<AlertTriangle className="mt-0.5 size-4 shrink-0 text-sky-700 dark:text-sky-300" />

View File

@@ -22,11 +22,6 @@ import { sqlImportToDiagram } from '@/lib/data/sql-import';
import type { SelectedTable } from '@/lib/data/import-metadata/filter-metadata';
import { filterMetadataByTables } from '@/lib/data/import-metadata/filter-metadata';
import { MAX_TABLES_WITHOUT_SHOWING_FILTER } from '../common/select-tables/constants';
import {
defaultDBMLDiagramName,
importDBMLToDiagram,
} from '@/lib/dbml/dbml-import/dbml-import';
import type { ImportMethod } from '@/lib/import-method/import-method';
export interface CreateDiagramDialogProps extends BaseDialogProps {}
@@ -35,11 +30,11 @@ export const CreateDiagramDialog: React.FC<CreateDiagramDialogProps> = ({
}) => {
const { diagramId } = useChartDB();
const { t } = useTranslation();
const [importMethod, setImportMethod] = useState<ImportMethod>('query');
const [importMethod, setImportMethod] = useState<'query' | 'ddl'>('query');
const [databaseType, setDatabaseType] = useState<DatabaseType>(
DatabaseType.GENERIC
);
const { closeCreateDiagramDialog } = useDialog();
const { closeCreateDiagramDialog, openImportDBMLDialog } = useDialog();
const { updateConfig } = useConfig();
const [scriptResult, setScriptResult] = useState('');
const [databaseEdition, setDatabaseEdition] = useState<
@@ -94,14 +89,6 @@ export const CreateDiagramDialog: React.FC<CreateDiagramDialogProps> = ({
sourceDatabaseType: databaseType,
targetDatabaseType: databaseType,
});
} else if (importMethod === 'dbml') {
diagram = await importDBMLToDiagram(scriptResult, {
databaseType,
});
// Update the diagram name if it's the default
if (diagram.name === defaultDBMLDiagramName) {
diagram.name = `Diagram ${diagramNumber}`;
}
} else {
let metadata: DatabaseMetadata | undefined = databaseMetadata;
@@ -165,6 +152,10 @@ export const CreateDiagramDialog: React.FC<CreateDiagramDialogProps> = ({
await updateConfig({ config: { defaultDiagramId: diagram.id } });
closeCreateDiagramDialog();
navigate(`/diagrams/${diagram.id}`);
setTimeout(
() => openImportDBMLDialog({ withCreateEmptyDiagram: true }),
700
);
}, [
databaseType,
addDiagram,
@@ -173,13 +164,14 @@ export const CreateDiagramDialog: React.FC<CreateDiagramDialogProps> = ({
navigate,
updateConfig,
diagramNumber,
openImportDBMLDialog,
]);
const importNewDiagramOrFilterTables = useCallback(async () => {
try {
setIsParsingMetadata(true);
if (importMethod === 'ddl' || importMethod === 'dbml') {
if (importMethod === 'ddl') {
await importNewDiagram();
} else {
// Parse metadata asynchronously to avoid blocking the UI

View File

@@ -15,8 +15,6 @@ import { useReactFlow } from '@xyflow/react';
import type { BaseDialogProps } from '../common/base-dialog-props';
import { useAlert } from '@/context/alert-context/alert-context';
import { sqlImportToDiagram } from '@/lib/data/sql-import';
import { importDBMLToDiagram } from '@/lib/dbml/dbml-import/dbml-import';
import type { ImportMethod } from '@/lib/import-method/import-method';
export interface ImportDatabaseDialogProps extends BaseDialogProps {
databaseType: DatabaseType;
@@ -26,7 +24,7 @@ export const ImportDatabaseDialog: React.FC<ImportDatabaseDialogProps> = ({
dialog,
databaseType,
}) => {
const [importMethod, setImportMethod] = useState<ImportMethod>('query');
const [importMethod, setImportMethod] = useState<'query' | 'ddl'>('query');
const { closeImportDatabaseDialog } = useDialog();
const { showAlert } = useAlert();
const {
@@ -67,10 +65,6 @@ export const ImportDatabaseDialog: React.FC<ImportDatabaseDialogProps> = ({
sourceDatabaseType: databaseType,
targetDatabaseType: databaseType,
});
} else if (importMethod === 'dbml') {
diagram = await importDBMLToDiagram(scriptResult, {
databaseType,
});
} else {
const databaseMetadata: DatabaseMetadata =
loadDatabaseMetadata(scriptResult);

View File

@@ -0,0 +1,359 @@
import React, {
useCallback,
useEffect,
useState,
Suspense,
useRef,
} from 'react';
import type * as monaco from 'monaco-editor';
import { useDialog } from '@/hooks/use-dialog';
import {
Dialog,
DialogClose,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogInternalContent,
DialogTitle,
} from '@/components/dialog/dialog';
import { Button } from '@/components/button/button';
import type { BaseDialogProps } from '../common/base-dialog-props';
import { useTranslation } from 'react-i18next';
import { Editor } from '@/components/code-snippet/code-snippet';
import { useTheme } from '@/hooks/use-theme';
import { AlertCircle } from 'lucide-react';
import {
importDBMLToDiagram,
sanitizeDBML,
preprocessDBML,
} from '@/lib/dbml/dbml-import/dbml-import';
import { useChartDB } from '@/hooks/use-chartdb';
import { Parser } from '@dbml/core';
import { useCanvas } from '@/hooks/use-canvas';
import { setupDBMLLanguage } from '@/components/code-snippet/languages/dbml-language';
import type { DBTable } from '@/lib/domain/db-table';
import { useToast } from '@/components/toast/use-toast';
import { Spinner } from '@/components/spinner/spinner';
import { debounce } from '@/lib/utils';
import { parseDBMLError } from '@/lib/dbml/dbml-import/dbml-import-error';
import {
clearErrorHighlight,
highlightErrorLine,
} from '@/components/code-snippet/dbml/utils';
export interface ImportDBMLDialogProps extends BaseDialogProps {
withCreateEmptyDiagram?: boolean;
}
export const ImportDBMLDialog: React.FC<ImportDBMLDialogProps> = ({
dialog,
withCreateEmptyDiagram,
}) => {
const { t } = useTranslation();
const initialDBML = `// Use DBML to define your database structure
// Simple Blog System with Comments Example
Table users {
id integer [primary key]
name varchar
email varchar
}
Table posts {
id integer [primary key]
title varchar
content text
user_id integer
created_at timestamp
}
Table comments {
id integer [primary key]
content text
post_id integer
user_id integer
created_at timestamp
}
// Relationships
Ref: posts.user_id > users.id // Each post belongs to one user
Ref: comments.post_id > posts.id // Each comment belongs to one post
Ref: comments.user_id > users.id // Each comment is written by one user`;
const [dbmlContent, setDBMLContent] = useState<string>(initialDBML);
const { closeImportDBMLDialog } = useDialog();
const [errorMessage, setErrorMessage] = useState<string | undefined>();
const { effectiveTheme } = useTheme();
const { toast } = useToast();
const {
addTables,
addRelationships,
tables,
relationships,
removeTables,
removeRelationships,
} = useChartDB();
const { reorderTables } = useCanvas();
const [reorder, setReorder] = useState(false);
const editorRef = useRef<monaco.editor.IStandaloneCodeEditor>();
const decorationsCollection =
useRef<monaco.editor.IEditorDecorationsCollection>();
const handleEditorDidMount = (
editor: monaco.editor.IStandaloneCodeEditor
) => {
editorRef.current = editor;
decorationsCollection.current = editor.createDecorationsCollection();
};
useEffect(() => {
if (reorder) {
reorderTables({
updateHistory: false,
});
setReorder(false);
}
}, [reorder, reorderTables]);
const clearDecorations = useCallback(() => {
clearErrorHighlight(decorationsCollection.current);
}, []);
const validateDBML = useCallback(
async (content: string) => {
// Clear previous errors
setErrorMessage(undefined);
clearDecorations();
if (!content.trim()) return;
try {
const preprocessedContent = preprocessDBML(content);
const sanitizedContent = sanitizeDBML(preprocessedContent);
const parser = new Parser();
parser.parse(sanitizedContent, 'dbmlv2');
} catch (e) {
const parsedError = parseDBMLError(e);
if (parsedError) {
setErrorMessage(
t('import_dbml_dialog.error.description') +
` (1 error found - in line ${parsedError.line})`
);
highlightErrorLine({
error: parsedError,
model: editorRef.current?.getModel(),
editorDecorationsCollection:
decorationsCollection.current,
});
} else {
setErrorMessage(
e instanceof Error ? e.message : JSON.stringify(e)
);
}
}
},
[clearDecorations, t]
);
const debouncedValidateRef = useRef<((value: string) => void) | null>(null);
// Set up debounced validation
useEffect(() => {
debouncedValidateRef.current = debounce((value: string) => {
validateDBML(value);
}, 500);
return () => {
debouncedValidateRef.current = null;
};
}, [validateDBML]);
// Trigger validation when content changes
useEffect(() => {
if (debouncedValidateRef.current) {
debouncedValidateRef.current(dbmlContent);
}
}, [dbmlContent]);
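The validation wiring above relies on a trailing-edge debounce: rapid keystrokes keep resetting a 500 ms timer, and only the last content is parsed. The project imports its own `debounce` from `@/lib/utils`, so the following is only an illustrative sketch of the behavior being assumed:

```typescript
// Trailing-edge debounce: only the last call within `ms` actually fires.
function debounce<T extends unknown[]>(
    fn: (...args: T) => void,
    ms: number
): (...args: T) => void {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (...args: T) => {
        // Each new call cancels the pending one and restarts the window.
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => fn(...args), ms);
    };
}
```

Keeping the debounced function in a ref (as the dialog does) avoids recreating the timer chain on every render while still picking up the latest `validateDBML`.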
useEffect(() => {
if (!dialog.open) {
setErrorMessage(undefined);
clearDecorations();
setDBMLContent(initialDBML);
}
}, [dialog.open, initialDBML, clearDecorations]);
const handleImport = useCallback(async () => {
if (!dbmlContent.trim() || errorMessage) return;
try {
const importedDiagram = await importDBMLToDiagram(dbmlContent);
const tableIdsToRemove = tables
.filter((table) =>
importedDiagram.tables?.some(
(t: DBTable) =>
t.name === table.name && t.schema === table.schema
)
)
.map((table) => table.id);
// Find relationships that need to be removed
const relationshipIdsToRemove = relationships
.filter((relationship) => {
const sourceTable = tables.find(
(table: DBTable) =>
table.id === relationship.sourceTableId
);
const targetTable = tables.find(
(table: DBTable) =>
table.id === relationship.targetTableId
);
if (!sourceTable || !targetTable) return true;
const replacementSourceTable = importedDiagram.tables?.find(
(table: DBTable) =>
table.name === sourceTable.name &&
table.schema === sourceTable.schema
);
const replacementTargetTable = importedDiagram.tables?.find(
(table: DBTable) =>
table.name === targetTable.name &&
table.schema === targetTable.schema
);
return replacementSourceTable || replacementTargetTable;
})
.map((relationship) => relationship.id);
// Remove existing items
await Promise.all([
removeTables(tableIdsToRemove, { updateHistory: false }),
removeRelationships(relationshipIdsToRemove, {
updateHistory: false,
}),
]);
// Add new items
await Promise.all([
addTables(importedDiagram.tables ?? [], {
updateHistory: false,
}),
addRelationships(importedDiagram.relationships ?? [], {
updateHistory: false,
}),
]);
setReorder(true);
closeImportDBMLDialog();
} catch (e) {
toast({
title: t('import_dbml_dialog.error.title'),
variant: 'destructive',
description: (
<>
<div>{t('import_dbml_dialog.error.description')}</div>
{e instanceof Error ? e.message : JSON.stringify(e)}
</>
),
});
}
}, [
dbmlContent,
closeImportDBMLDialog,
tables,
relationships,
removeTables,
removeRelationships,
addTables,
addRelationships,
errorMessage,
toast,
setReorder,
t,
]);
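The replace-on-import rule in `handleImport` is: an existing table is removed whenever the imported diagram contains a table with the same name and schema, and any relationship touching a replaced (or orphaned) table goes with it. The table half of that rule, extracted as a pure function (simplified types, for illustration):

```typescript
// Simplified table shape; the real DBTable carries many more fields.
interface TableRef {
    id: string;
    name: string;
    schema?: string;
}

// IDs of existing tables that the imported diagram will replace,
// matched by (name, schema) exactly as in handleImport above.
function tablesToRemove(
    existing: TableRef[],
    imported: TableRef[]
): string[] {
    return existing
        .filter((t) =>
            imported.some((i) => i.name === t.name && i.schema === t.schema)
        )
        .map((t) => t.id);
}
```

Doing the removals and additions with `updateHistory: false` means the whole import lands as a single non-undoable replacement rather than a trail of individual steps.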
return (
<Dialog
{...dialog}
onOpenChange={(open) => {
if (!open) {
closeImportDBMLDialog();
}
}}
>
<DialogContent
className="flex h-[80vh] max-h-screen w-full flex-col md:max-w-[900px]"
showClose
>
<DialogHeader>
<DialogTitle>
{withCreateEmptyDiagram
? t('import_dbml_dialog.example_title')
: t('import_dbml_dialog.title')}
</DialogTitle>
<DialogDescription>
{t('import_dbml_dialog.description')}
</DialogDescription>
</DialogHeader>
<DialogInternalContent>
<Suspense fallback={<Spinner />}>
<Editor
value={dbmlContent}
onChange={(value) => setDBMLContent(value || '')}
language="dbml"
onMount={handleEditorDidMount}
theme={
effectiveTheme === 'dark'
? 'dbml-dark'
: 'dbml-light'
}
beforeMount={setupDBMLLanguage}
options={{
minimap: { enabled: false },
scrollBeyondLastLine: false,
automaticLayout: true,
glyphMargin: true,
lineNumbers: 'on',
scrollbar: {
vertical: 'visible',
horizontal: 'visible',
},
}}
className="size-full"
/>
</Suspense>
</DialogInternalContent>
<DialogFooter>
<div className="flex w-full items-center justify-between">
<div className="flex items-center gap-4">
<DialogClose asChild>
<Button variant="secondary">
{withCreateEmptyDiagram
? t('import_dbml_dialog.skip_and_empty')
: t('import_dbml_dialog.cancel')}
</Button>
</DialogClose>
{errorMessage ? (
<div className="flex items-center gap-1">
<AlertCircle className="size-4 text-destructive" />
<span className="text-xs text-destructive">
{errorMessage ||
t(
'import_dbml_dialog.error.description'
)}
</span>
</div>
) : null}
</div>
<Button
onClick={handleImport}
disabled={!dbmlContent.trim() || !!errorMessage}
>
{withCreateEmptyDiagram
? t('import_dbml_dialog.show_example')
: t('import_dbml_dialog.import')}
</Button>
</div>
</DialogFooter>
</DialogContent>
</Dialog>
);
};

View File

@@ -0,0 +1,50 @@
import { useEffect, useCallback, type RefObject } from 'react';
/**
* Custom hook that handles click outside detection with capture phase
* to work properly with React Flow canvas and other event-stopping elements
*/
export function useClickOutside(
ref: RefObject<HTMLElement>,
handler: () => void,
isActive = true
) {
useEffect(() => {
if (!isActive) return;
const handleClickOutside = (event: MouseEvent) => {
if (ref.current && !ref.current.contains(event.target as Node)) {
handler();
}
};
// Use capture phase to catch events before React Flow or other libraries can stop them
document.addEventListener('mousedown', handleClickOutside, true);
return () => {
document.removeEventListener('mousedown', handleClickOutside, true);
};
}, [ref, handler, isActive]);
}
/**
* Specialized version of useClickOutside for edit mode inputs
* Adds a small delay to prevent race conditions with blur events
*/
export function useEditClickOutside(
inputRef: RefObject<HTMLElement>,
editMode: boolean,
onSave: () => void,
delay = 100
) {
const handleClickOutside = useCallback(() => {
if (editMode) {
// Small delay to ensure any pending state updates are processed
setTimeout(() => {
onSave();
}, delay);
}
}, [editMode, onSave, delay]);
useClickOutside(inputRef, handleClickOutside, editMode);
}
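At its core, `useClickOutside` fires the handler exactly when the event target is not contained in the referenced element; the capture-phase registration only ensures that check runs before React Flow can stop propagation. A DOM-free sketch of that containment test (the real hook uses `Node.contains` on actual elements; `FakeNode` here is a stand-in for illustration):

```typescript
// Tiny stand-in for a DOM node: just a parent pointer.
interface FakeNode {
    parent: FakeNode | null;
}

// Walk up from `node`; equivalent to root.contains(node) in the DOM.
function contains(root: FakeNode, node: FakeNode | null): boolean {
    while (node) {
        if (node === root) return true;
        node = node.parent;
    }
    return false;
}

// The hook's decision: fire the outside handler only for targets
// that are not inside the referenced element.
function shouldTriggerOutsideHandler(
    ref: FakeNode,
    target: FakeNode
): boolean {
    return !contains(ref, target);
}
```

The `useEditClickOutside` wrapper then defers the save by ~100 ms so a pending blur-driven state update settles before the outside click commits the edit.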

View File

@@ -1,7 +1,7 @@
import { useCallback, useMemo, useState, useEffect, useRef } from 'react';
import { useCallback, useMemo, useState, useEffect } from 'react';
import { useChartDB } from './use-chartdb';
import { useDebounce } from './use-debounce-v2';
import type { DatabaseType, DBField, DBTable } from '@/lib/domain';
import type { DBField, DBTable } from '@/lib/domain';
import type {
SelectBoxOption,
SelectBoxProps,
@@ -9,62 +9,49 @@ import type {
import {
dataTypeDataToDataType,
sortedDataTypeMap,
supportsArrayDataType,
autoIncrementAlwaysOn,
requiresNotNull,
} from '@/lib/data/data-types/data-types';
import { generateDBFieldSuffix } from '@/lib/domain/db-field';
import type { DataTypeData } from '@/lib/data/data-types/data-types';
const generateFieldRegexPatterns = (
dataType: DataTypeData,
databaseType: DatabaseType
dataType: DataTypeData
): {
regex?: string;
extractRegex?: RegExp;
} => {
const typeName = dataType.name;
const supportsArrays = supportsArrayDataType(dataType.id, databaseType);
const arrayPattern = supportsArrays ? '(\\[\\])?' : '';
if (!dataType.fieldAttributes) {
// For types without field attributes, support plain type + optional array notation
return {
regex: `^${typeName}${arrayPattern}$`,
extractRegex: new RegExp(`^${typeName}${arrayPattern}$`),
};
return { regex: undefined, extractRegex: undefined };
}
const typeName = dataType.name;
const fieldAttributes = dataType.fieldAttributes;
if (fieldAttributes.hasCharMaxLength) {
if (fieldAttributes.hasCharMaxLengthOption) {
return {
regex: `^${typeName}\\((\\d+|[mM][aA][xX])\\)${arrayPattern}$`,
extractRegex: supportsArrays
? /\((\d+|max)\)(\[\])?/i
: /\((\d+|max)\)/i,
regex: `^${typeName}\\((\\d+|[mM][aA][xX])\\)$`,
extractRegex: /\((\d+|max)\)/i,
};
}
return {
regex: `^${typeName}\\(\\d+\\)${arrayPattern}$`,
extractRegex: supportsArrays ? /\((\d+)\)(\[\])?/ : /\((\d+)\)/,
regex: `^${typeName}\\(\\d+\\)$`,
extractRegex: /\((\d+)\)/,
};
}
if (fieldAttributes.precision && fieldAttributes.scale) {
return {
regex: `^${typeName}\\s*\\(\\s*\\d+\\s*(?:,\\s*\\d+\\s*)?\\)${arrayPattern}$`,
regex: `^${typeName}\\s*\\(\\s*\\d+\\s*(?:,\\s*\\d+\\s*)?\\)$`,
extractRegex: new RegExp(
`${typeName}\\s*\\(\\s*(\\d+)\\s*(?:,\\s*(\\d+)\\s*)?\\)${arrayPattern}`
`${typeName}\\s*\\(\\s*(\\d+)\\s*(?:,\\s*(\\d+)\\s*)?\\)`
),
};
}
if (fieldAttributes.precision) {
return {
regex: `^${typeName}\\s*\\(\\s*\\d+\\s*\\)${arrayPattern}$`,
extractRegex: supportsArrays ? /\((\d+)\)(\[\])?/ : /\((\d+)\)/,
regex: `^${typeName}\\s*\\(\\s*\\d+\\s*\\)$`,
extractRegex: /\((\d+)\)/,
};
}
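For context, the branches above build anchored regexes per type name; after this change the optional `[]` suffix is dropped from every pattern. A minimal standalone sketch of the post-change behavior (the helper name here is illustrative, not from the codebase):

```typescript
// Minimal sketch of the hasCharMaxLength pattern after this change:
// the optional "[]" suffix is gone, so "varchar(255)" matches while
// "varchar(255)[]" no longer does.
const buildCharMaxLengthPattern = (typeName: string): RegExp =>
    new RegExp(`^${typeName}\\(\\d+\\)$`);

const varcharPattern = buildCharMaxLengthPattern('varchar');

console.log(varcharPattern.test('varchar(255)')); // true
console.log(varcharPattern.test('varchar(255)[]')); // false
```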
@@ -88,20 +75,12 @@ export const useUpdateTableField = (
const [localNullable, setLocalNullable] = useState(field.nullable);
const [localPrimaryKey, setLocalPrimaryKey] = useState(field.primaryKey);
const lastFieldNameRef = useRef<string>(field.name);
useEffect(() => {
if (localFieldName === lastFieldNameRef.current) {
lastFieldNameRef.current = field.name;
setLocalFieldName(field.name);
}
}, [field.name, localFieldName]);
// Update local state when field properties change externally
useEffect(() => {
setLocalFieldName(field.name);
setLocalNullable(field.nullable);
setLocalPrimaryKey(field.primaryKey);
}, [field.nullable, field.primaryKey]);
}, [field.name, field.nullable, field.primaryKey]);
// Use custom updateField if provided, otherwise use the chartDB one
const updateField = useMemo(
@@ -131,10 +110,7 @@ export const useUpdateTableField = (
const standardTypes: SelectBoxOption[] = sortedDataTypeMap[
databaseType
].map((type) => {
const regexPatterns = generateFieldRegexPatterns(
type,
databaseType
);
const regexPatterns = generateFieldRegexPatterns(type);
return {
label: type.name,
@@ -178,13 +154,8 @@ export const useUpdateTableField = (
let characterMaximumLength: string | undefined = undefined;
let precision: number | undefined = undefined;
let scale: number | undefined = undefined;
let isArray: boolean | undefined = undefined;
if (regexMatches?.length) {
// Check if the last captured group is the array indicator []
const lastMatch = regexMatches[regexMatches.length - 1];
const hasArrayIndicator = lastMatch === '[]';
if (dataType?.fieldAttributes?.hasCharMaxLength) {
characterMaximumLength = regexMatches[1]?.toLowerCase();
} else if (
@@ -198,17 +169,6 @@ export const useUpdateTableField = (
} else if (dataType?.fieldAttributes?.precision) {
precision = parseInt(regexMatches[1]);
}
// Set isArray if the array indicator was found and the type supports arrays
if (hasArrayIndicator) {
const typeId = value as string;
if (supportsArrayDataType(typeId, databaseType)) {
isArray = true;
}
} else {
// Explicitly set to false/undefined if no array indicator
isArray = undefined;
}
} else {
if (
dataType?.fieldAttributes?.hasCharMaxLength &&
@@ -226,17 +186,11 @@ export const useUpdateTableField = (
}
}
const newTypeName = dataType?.name ?? (value as string);
const typeRequiresNotNull = requiresNotNull(newTypeName);
const shouldForceIncrement = autoIncrementAlwaysOn(newTypeName);
updateField(table.id, field.id, {
characterMaximumLength,
precision,
scale,
isArray,
...(typeRequiresNotNull ? { nullable: false } : {}),
increment: shouldForceIncrement ? true : undefined,
increment: undefined,
default: undefined,
type: dataTypeDataToDataType(
dataType ?? {
@@ -274,16 +228,9 @@ export const useUpdateTableField = (
const debouncedNullableUpdate = useDebounce(
useCallback(
(value: boolean) => {
const updates: Partial<DBField> = { nullable: value };
// If setting to nullable, clear increment (auto-increment requires NOT NULL)
if (value && field.increment) {
updates.increment = undefined;
}
updateField(table.id, field.id, updates);
updateField(table.id, field.id, { nullable: value });
},
[updateField, table.id, field.id, field.increment]
[updateField, table.id, field.id]
),
100 // 100ms debounce for toggle
);
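The guard removed in this hunk enforced the invariant that auto-increment implies NOT NULL. A self-contained sketch of that rule (types and helper names are illustrative, not from the codebase):

```typescript
// Sketch of the removed invariant: auto-increment requires NOT NULL,
// so switching a field to nullable must also clear its increment flag.
interface FieldUpdates {
    nullable: boolean;
    increment?: undefined;
}

const buildNullableUpdate = (
    nullable: boolean,
    currentIncrement: boolean | undefined
): FieldUpdates => {
    const updates: FieldUpdates = { nullable };
    if (nullable && currentIncrement) {
        // Clearing increment keeps the field consistent with the DB rule.
        updates.increment = undefined;
    }
    return updates;
};

console.log('increment' in buildNullableUpdate(true, true)); // true
console.log('increment' in buildNullableUpdate(false, true)); // false
```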
@@ -344,17 +291,11 @@ export const useUpdateTableField = (
// Utility function to generate field suffix for display
const generateFieldSuffix = useCallback(
(typeId?: string) => {
return generateDBFieldSuffix(
{
...field,
isArray: field.isArray && typeId === field.type.id,
},
{
databaseType,
forceExtended: true,
typeId,
}
);
return generateDBFieldSuffix(field, {
databaseType,
forceExtended: true,
typeId,
});
},
[field, databaseType]
);



@@ -308,7 +308,7 @@ export const ar: LanguageTranslation = {
cancel: 'إلغاء',
import_from_file: 'استيراد من ملف',
back: 'رجوع',
empty_diagram: 'قاعدة بيانات فارغة',
empty_diagram: 'مخطط فارغ',
continue: 'متابعة',
import: 'استيراد',
},


@@ -310,7 +310,7 @@ export const bn: LanguageTranslation = {
cancel: 'বাতিল করুন',
back: 'ফিরে যান',
import_from_file: 'ফাইল থেকে আমদানি করুন',
empty_diagram: 'খালি ডাটাবেস',
empty_diagram: 'ফাঁকা চিত্র',
continue: 'চালিয়ে যান',
import: 'আমদানি করুন',
},


@@ -313,7 +313,7 @@ export const de: LanguageTranslation = {
back: 'Zurück',
// TODO: Translate
import_from_file: 'Import from File',
empty_diagram: 'Leere Datenbank',
empty_diagram: 'Leeres Diagramm',
continue: 'Weiter',
import: 'Importieren',
},


@@ -301,7 +301,7 @@ export const en = {
cancel: 'Cancel',
import_from_file: 'Import from File',
back: 'Back',
empty_diagram: 'Empty database',
empty_diagram: 'Empty diagram',
continue: 'Continue',
import: 'Import',
},


@@ -310,7 +310,7 @@ export const es: LanguageTranslation = {
back: 'Atrás',
// TODO: Translate
import_from_file: 'Import from File',
empty_diagram: 'Base de datos vacía',
empty_diagram: 'Diagrama vacío',
continue: 'Continuar',
import: 'Importar',
},


@@ -307,7 +307,7 @@ export const fr: LanguageTranslation = {
cancel: 'Annuler',
back: 'Retour',
import_from_file: "Importer à partir d'un fichier",
empty_diagram: 'Base de données vide',
empty_diagram: 'Diagramme vide',
continue: 'Continuer',
import: 'Importer',
},


@@ -310,7 +310,7 @@ export const gu: LanguageTranslation = {
cancel: 'રદ કરો',
back: 'પાછા',
import_from_file: 'ફાઇલમાંથી આયાત કરો',
empty_diagram: 'ખાલી ડેટાબેસ',
empty_diagram: 'ખાલી ડાયાગ્રામ',
continue: 'ચાલુ રાખો',
import: 'આયાત કરો',
},


@@ -312,7 +312,7 @@ export const hi: LanguageTranslation = {
back: 'वापस',
// TODO: Translate
import_from_file: 'Import from File',
empty_diagram: 'खाली डेटाबेस',
empty_diagram: 'खाली आरेख',
continue: 'जारी रखें',
import: 'आयात करें',
},


@@ -305,7 +305,7 @@ export const hr: LanguageTranslation = {
cancel: 'Odustani',
import_from_file: 'Uvezi iz datoteke',
back: 'Natrag',
empty_diagram: 'Prazna baza podataka',
empty_diagram: 'Prazan dijagram',
continue: 'Nastavi',
import: 'Uvezi',
},


@@ -309,7 +309,7 @@ export const id_ID: LanguageTranslation = {
cancel: 'Batal',
import_from_file: 'Impor dari file',
back: 'Kembali',
empty_diagram: 'Database Kosong',
empty_diagram: 'Diagram Kosong',
continue: 'Lanjutkan',
import: 'Impor',
},


@@ -314,7 +314,7 @@ export const ja: LanguageTranslation = {
back: '戻る',
// TODO: Translate
import_from_file: 'Import from File',
empty_diagram: '空のデータベース',
empty_diagram: '空のダイアグラム',
continue: '続行',
import: 'インポート',
},


@@ -309,7 +309,7 @@ export const ko_KR: LanguageTranslation = {
cancel: '취소',
back: '뒤로가기',
import_from_file: '파일에서 가져오기',
empty_diagram: '빈 데이터베이스',
empty_diagram: '빈 다이어그램으로 시작',
continue: '계속',
import: '가져오기',
},


@@ -315,7 +315,7 @@ export const mr: LanguageTranslation = {
// TODO: Add translations
import_from_file: 'Import from File',
back: 'मागे',
empty_diagram: 'रिक्त डेटाबेस',
empty_diagram: 'रिक्त आरेख',
continue: 'सुरू ठेवा',
import: 'आयात करा',
},


@@ -311,7 +311,7 @@ export const ne: LanguageTranslation = {
cancel: 'रद्द गर्नुहोस्',
import_from_file: 'फाइलबाट आयात गर्नुहोस्',
back: 'फर्क',
empty_diagram: 'खाली डाटाबेस',
empty_diagram: 'रिक्त डायाग्राम',
continue: 'जारी राख्नुहोस्',
import: 'आयात गर्नुहोस्',
},


@@ -311,7 +311,7 @@ export const pt_BR: LanguageTranslation = {
back: 'Voltar',
// TODO: Translate
import_from_file: 'Import from File',
empty_diagram: 'Banco de dados vazio',
empty_diagram: 'Diagrama vazio',
continue: 'Continuar',
import: 'Importar',
},


@@ -307,7 +307,7 @@ export const ru: LanguageTranslation = {
cancel: 'Отменить',
back: 'Назад',
import_from_file: 'Импортировать из файла',
empty_diagram: 'Пустая база данных',
empty_diagram: 'Пустая диаграмма',
continue: 'Продолжить',
import: 'Импорт',
},


@@ -312,7 +312,7 @@ export const te: LanguageTranslation = {
// TODO: Translate
import_from_file: 'Import from File',
back: 'తిరుగు',
empty_diagram: 'ఖాళీ డేటాబేస్',
empty_diagram: 'ఖాళీ చిత్రము',
continue: 'కొనసాగించు',
import: 'డిగుమతి',
},


@@ -308,7 +308,7 @@ export const tr: LanguageTranslation = {
import_from_file: 'Import from File',
cancel: 'İptal',
back: 'Geri',
empty_diagram: 'Boş veritabanı',
empty_diagram: 'Boş diyagram',
continue: 'Devam',
import: 'İçe Aktar',
},


@@ -308,7 +308,7 @@ export const uk: LanguageTranslation = {
cancel: 'Скасувати',
back: 'Назад',
import_from_file: 'Імпортувати з файлу',
empty_diagram: 'Порожня база даних',
empty_diagram: 'Порожня діаграма',
continue: 'Продовжити',
import: 'Імпорт',
},


@@ -309,7 +309,7 @@ export const vi: LanguageTranslation = {
cancel: 'Hủy',
import_from_file: 'Nhập từ tệp',
back: 'Trở lại',
empty_diagram: 'Cơ sở dữ liệu trống',
empty_diagram: 'Sơ đồ trống',
continue: 'Tiếp tục',
import: 'Nhập',
},


@@ -306,7 +306,7 @@ export const zh_CN: LanguageTranslation = {
cancel: '取消',
import_from_file: '从文件导入',
back: '上一步',
empty_diagram: '空数据库',
empty_diagram: '新建空关系图',
continue: '下一步',
import: '导入',
},


@@ -305,7 +305,7 @@ export const zh_TW: LanguageTranslation = {
cancel: '取消',
import_from_file: '從檔案匯入',
back: '返回',
empty_diagram: '空資料庫',
empty_diagram: '空白圖表',
continue: '繼續',
import: '匯入',
},


@@ -129,6 +129,9 @@ export const clickhouseDataTypes: readonly DataTypeData[] = [
{ name: 'enum', id: 'enum' },
{ name: 'lowcardinality', id: 'lowcardinality' },
// Array Type
{ name: 'array', id: 'array' },
// Tuple Type
{ name: 'tuple', id: 'tuple' },
{ name: 'map', id: 'map' },


@@ -1,6 +1,5 @@
import { z } from 'zod';
import { DatabaseType } from '../../domain/database-type';
import { databaseSupportsArrays } from '../../domain/database-capabilities';
import { clickhouseDataTypes } from './clickhouse-data-types';
import { genericDataTypes } from './generic-data-types';
import { mariadbDataTypes } from './mariadb-data-types';
@@ -166,34 +165,3 @@ export const supportsAutoIncrementDataType = (
'decimal',
].includes(dataTypeName.toLocaleLowerCase());
};
export const autoIncrementAlwaysOn = (dataTypeName: string): boolean => {
return ['serial', 'bigserial', 'smallserial'].includes(
dataTypeName.toLowerCase()
);
};
export const requiresNotNull = (dataTypeName: string): boolean => {
return ['serial', 'bigserial', 'smallserial'].includes(
dataTypeName.toLowerCase()
);
};
const ARRAY_INCOMPATIBLE_TYPES = [
'serial',
'bigserial',
'smallserial',
] as const;
export const supportsArrayDataType = (
dataTypeName: string,
databaseType: DatabaseType
): boolean => {
if (!databaseSupportsArrays(databaseType)) {
return false;
}
return !ARRAY_INCOMPATIBLE_TYPES.includes(
dataTypeName.toLowerCase() as (typeof ARRAY_INCOMPATIBLE_TYPES)[number]
);
};
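The removed `supportsArrayDataType` helper above can be exercised standalone. A self-contained sketch — the database list inside `databaseSupportsArrays` is an illustrative assumption, since the real capability table lives in the (also removed) `database-capabilities` import:

```typescript
// Self-contained sketch of the removed supportsArrayDataType helper.
// Serial types are sequence-backed pseudo-types, so they cannot be
// array element types.
const ARRAY_INCOMPATIBLE_TYPES = ['serial', 'bigserial', 'smallserial'];

// Illustrative stand-in for the databaseSupportsArrays capability lookup.
const databaseSupportsArrays = (databaseType: string): boolean =>
    ['postgresql', 'cockroachdb'].includes(databaseType);

const supportsArrayDataType = (
    dataTypeName: string,
    databaseType: string
): boolean =>
    databaseSupportsArrays(databaseType) &&
    !ARRAY_INCOMPATIBLE_TYPES.includes(dataTypeName.toLowerCase());

console.log(supportsArrayDataType('text', 'postgresql')); // true
console.log(supportsArrayDataType('serial', 'postgresql')); // false
console.log(supportsArrayDataType('text', 'mysql')); // false
```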


@@ -12,7 +12,6 @@ export const postgresDataTypes: readonly DataTypeData[] = [
{ name: 'text', id: 'text', usageLevel: 1 },
{ name: 'boolean', id: 'boolean', usageLevel: 1 },
{ name: 'timestamp', id: 'timestamp', usageLevel: 1 },
{ name: 'timestamptz', id: 'timestamptz', usageLevel: 1 },
{ name: 'date', id: 'date', usageLevel: 1 },
// Level 2 - Second most common types
@@ -43,7 +42,6 @@ export const postgresDataTypes: readonly DataTypeData[] = [
id: 'timestamp_with_time_zone',
usageLevel: 2,
},
{ name: 'int', id: 'int', usageLevel: 2 },
// Less common types
{
@@ -97,6 +95,7 @@ export const postgresDataTypes: readonly DataTypeData[] = [
{ name: 'tsvector', id: 'tsvector' },
{ name: 'tsquery', id: 'tsquery' },
{ name: 'xml', id: 'xml' },
{ name: 'array', id: 'array' },
{ name: 'int4range', id: 'int4range' },
{ name: 'int8range', id: 'int8range' },
{ name: 'numrange', id: 'numrange' },


@@ -57,10 +57,6 @@ export const createFieldsFromMetadata = ({
...(col.precision?.scale ? { scale: col.precision.scale } : {}),
...(col.default ? { default: col.default } : {}),
...(col.collation ? { collation: col.collation } : {}),
...(col.is_identity !== undefined
? { increment: col.is_identity }
: {}),
...(col.is_array !== undefined ? { isArray: col.is_array } : {}),
createdAt: Date.now(),
comments: col.comment ? col.comment : undefined,
})


@@ -64,7 +64,7 @@ export const loadFromDatabaseMetadata = async ({
const diagram: Diagram = {
id: generateDiagramId(),
name: databaseMetadata.database_name
? `${databaseMetadata.database_name}`
? `${databaseMetadata.database_name}-db`
: diagramNumber
? `Diagram ${diagramNumber}`
: 'New Diagram',


@@ -15,8 +15,6 @@ export interface ColumnInfo {
default?: string | null; // Default value for the column, nullable
collation?: string | null;
comment?: string | null;
is_identity?: boolean | null; // Indicates if the column is auto-increment/identity
is_array?: boolean | null; // Indicates if the column is an array type
}
export const ColumnInfoSchema: z.ZodType<ColumnInfo> = z.object({
@@ -37,6 +35,4 @@ export const ColumnInfoSchema: z.ZodType<ColumnInfo> = z.object({
default: z.string().nullable().optional(),
collation: z.string().nullable().optional(),
comment: z.string().nullable().optional(),
is_identity: z.boolean().nullable().optional(),
is_array: z.boolean().nullable().optional(),
});


@@ -127,13 +127,7 @@ cols AS (
',"default":"', null,
'","collation":"', COALESCE(cols.COLLATION_NAME::TEXT, ''),
'","comment":"', COALESCE(replace(replace(dsc.description::TEXT, '"', '\\"'), '\\x', '\\\\x'), ''),
'","is_identity":', CASE
WHEN cols.is_identity = 'YES' THEN 'true'
WHEN cols.column_default IS NOT NULL AND cols.column_default LIKE 'nextval(%' THEN 'true'
WHEN cols.column_default LIKE 'unique_rowid()%' THEN 'true'
ELSE 'false'
END,
'}')), ',') AS cols_metadata
'"}')), ',') AS cols_metadata
FROM information_schema.columns cols
LEFT JOIN pg_catalog.pg_class c
ON c.relname = cols.table_name
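The `is_identity` CASE removed from this metadata query treated `nextval(…)` and CockroachDB's `unique_rowid()` defaults as auto-increment. The same predicate, sketched in TypeScript (function name and signature are illustrative):

```typescript
// Sketch of the removed is_identity heuristic: a column is auto-increment
// when the catalog flags it, or when its default is a sequence call
// (nextval) or CockroachDB's unique_rowid().
const isIdentityColumn = (
    isIdentity: string | null,
    columnDefault: string | null
): boolean =>
    isIdentity === 'YES' ||
    (columnDefault !== null &&
        (columnDefault.startsWith('nextval(') ||
            columnDefault.startsWith('unique_rowid()')));

console.log(isIdentityColumn('NO', "nextval('users_id_seq'::regclass)")); // true
console.log(isIdentityColumn('YES', null)); // true
console.log(isIdentityColumn('NO', null)); // false
```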


@@ -69,9 +69,7 @@ SELECT CAST(CONCAT(
',"ordinal_position":', cols.ordinal_position,
',"nullable":', IF(cols.is_nullable = 'YES', 'true', 'false'),
',"default":"', ${withExtras ? withDefault : withoutDefault},
'","collation":"', IFNULL(cols.collation_name, ''),
'","is_identity":', IF(cols.extra LIKE '%auto_increment%', 'true', 'false'),
'"}')
'","collation":"', IFNULL(cols.collation_name, ''), '"}')
) FROM (
SELECT cols.table_schema,
cols.table_name,
@@ -83,8 +81,7 @@ SELECT CAST(CONCAT(
cols.ordinal_position,
cols.is_nullable,
cols.column_default,
cols.collation_name,
cols.extra
cols.collation_name
FROM information_schema.columns cols
WHERE cols.table_schema = DATABASE()
) AS cols), ''),


@@ -92,9 +92,7 @@ export const getMySQLQuery = (
',"ordinal_position":', cols.ordinal_position,
',"nullable":', IF(cols.is_nullable = 'YES', 'true', 'false'),
',"default":"', ${withExtras ? withDefault : withoutDefault},
'","collation":"', IFNULL(cols.collation_name, ''),
'","is_identity":', IF(cols.extra LIKE '%auto_increment%', 'true', 'false'),
'}'
'","collation":"', IFNULL(cols.collation_name, ''), '"}'
)))))
), indexes as (
(SELECT (@indexes:=NULL),


@@ -194,12 +194,7 @@ cols AS (
',"default":"', ${withExtras ? withDefault : withoutDefault},
'","collation":"', COALESCE(cols.COLLATION_NAME, ''),
'","comment":"', ${withExtras ? withComments : withoutComments},
'","is_identity":', CASE
WHEN cols.is_identity = 'YES' THEN 'true'
WHEN cols.column_default IS NOT NULL AND cols.column_default LIKE 'nextval(%' THEN 'true'
ELSE 'false'
END,
'}')), ',') AS cols_metadata
'"}')), ',') AS cols_metadata
FROM information_schema.columns cols
LEFT JOIN pg_catalog.pg_class c
ON c.relname = cols.table_name


@@ -119,13 +119,7 @@ WITH fk_info AS (
END
ELSE null
END,
'default', ${withExtras ? withDefault : withoutDefault},
'is_identity',
CASE
WHEN p.pk = 1 AND LOWER(p.type) LIKE '%int%' THEN json('true')
WHEN LOWER((SELECT sql FROM sqlite_master WHERE name = m.name)) LIKE '%' || p.name || '%autoincrement%' THEN json('true')
ELSE json('false')
END
'default', ${withExtras ? withDefault : withoutDefault}
)
) AS cols_metadata
FROM
@@ -298,13 +292,7 @@ WITH fk_info AS (
END
ELSE null
END,
'default', ${withExtras ? withDefault : withoutDefault},
'is_identity',
CASE
WHEN p.pk = 1 AND LOWER(p.type) LIKE '%int%' THEN json('true')
WHEN LOWER((SELECT sql FROM sqlite_master WHERE name = m.name)) LIKE '%' || p.name || '%autoincrement%' THEN json('true')
ELSE json('false')
END
'default', ${withExtras ? withDefault : withoutDefault}
)
) AS cols_metadata
FROM
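The SQLite `is_identity` CASE removed in these two hunks relied on two facts: an INTEGER primary key aliases the rowid (implicitly auto-incrementing), and explicit AUTOINCREMENT only appears in the table's CREATE statement. A loose TypeScript sketch of that heuristic (the substring check is a simplification of the SQL `LIKE` pattern):

```typescript
// Sketch of the removed SQLite identity heuristic: INTEGER primary keys
// are rowid aliases, and explicit AUTOINCREMENT is detected by scanning
// the table's CREATE statement for the column name plus the keyword.
const sqliteIsIdentity = (
    columnName: string,
    columnType: string,
    isPrimaryKey: boolean,
    createTableSql: string
): boolean => {
    if (isPrimaryKey && columnType.toLowerCase().includes('int')) {
        return true;
    }
    const sql = createTableSql.toLowerCase();
    return (
        sql.includes(columnName.toLowerCase()) && sql.includes('autoincrement')
    );
};

console.log(
    sqliteIsIdentity('id', 'INTEGER', true, 'CREATE TABLE t (id INTEGER PRIMARY KEY)')
); // true
console.log(
    sqliteIsIdentity('name', 'TEXT', false, 'CREATE TABLE t (name TEXT)')
); // false
```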


@@ -91,11 +91,6 @@ cols AS (
WHEN cols.COLLATION_NAME IS NULL THEN 'null'
ELSE '"' + STRING_ESCAPE(cols.COLLATION_NAME, 'json') + '"'
END +
', "is_identity": ' + CASE
WHEN COLUMNPROPERTY(OBJECT_ID(cols.TABLE_SCHEMA + '.' + cols.TABLE_NAME), cols.COLUMN_NAME, 'IsIdentity') = 1
THEN 'true'
ELSE 'false'
END +
N'}') COLLATE DATABASE_DEFAULT
), N','
) +


@@ -1,356 +0,0 @@
import { describe, it, expect } from 'vitest';
import { generateId } from '@/lib/utils';
import { exportBaseSQL } from '../export-sql-script';
import { DatabaseType } from '@/lib/domain/database-type';
import type { Diagram } from '@/lib/domain/diagram';
describe('SQL Export - Array Fields (Fantasy RPG Theme)', () => {
it('should export array fields for magical spell components', () => {
const diagram: Diagram = {
id: 'test-diagram',
name: 'Magical Spell System',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'spells',
schema: '',
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'uuid', name: 'uuid' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'name',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
characterMaximumLength: '200',
},
{
id: generateId(),
name: 'components',
type: { id: 'text', name: 'text' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
isArray: true,
comments: 'Magical components needed for the spell',
},
{
id: generateId(),
name: 'elemental_types',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
characterMaximumLength: '50',
isArray: true,
comments:
'Elements involved: fire, water, earth, air',
},
],
indexes: [],
x: 0,
y: 0,
color: '#3b82f6',
isView: false,
createdAt: Date.now(),
order: 0,
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const sql = exportBaseSQL({
diagram,
targetDatabaseType: DatabaseType.POSTGRESQL,
isDBMLFlow: true,
});
expect(sql).toContain('CREATE TABLE "spells"');
expect(sql).toContain('"components" text[]');
expect(sql).toContain('"elemental_types" varchar(50)[]');
});
it('should export array fields for hero inventory system', () => {
const diagram: Diagram = {
id: 'test-diagram',
name: 'RPG Inventory System',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'heroes',
schema: 'game',
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'bigint', name: 'bigint' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'name',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
characterMaximumLength: '100',
},
{
id: generateId(),
name: 'abilities',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
characterMaximumLength: '100',
isArray: true,
comments:
'Special abilities like Stealth, Fireball, etc',
},
{
id: generateId(),
name: 'inventory_slots',
type: { id: 'integer', name: 'integer' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
isArray: true,
comments: 'Item IDs in inventory',
},
{
id: generateId(),
name: 'skill_levels',
type: { id: 'decimal', name: 'decimal' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
precision: 5,
scale: 2,
isArray: true,
comments: 'Skill proficiency levels',
},
],
indexes: [],
x: 0,
y: 0,
color: '#ef4444',
isView: false,
createdAt: Date.now(),
order: 0,
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const sql = exportBaseSQL({
diagram,
targetDatabaseType: DatabaseType.POSTGRESQL,
isDBMLFlow: true,
});
expect(sql).toContain('CREATE TABLE "game"."heroes"');
expect(sql).toContain('"abilities" varchar(100)[]');
expect(sql).toContain('"inventory_slots" integer[]');
expect(sql).toContain('"skill_levels" decimal(5, 2)[]');
});
it('should export non-array fields normally when isArray is false or undefined', () => {
const diagram: Diagram = {
id: 'test-diagram',
name: 'Quest System',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'quests',
schema: '',
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'uuid', name: 'uuid' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'title',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
characterMaximumLength: '200',
isArray: false,
},
{
id: generateId(),
name: 'description',
type: { id: 'text', name: 'text' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
// isArray is undefined - should not be treated as array
},
],
indexes: [],
x: 0,
y: 0,
color: '#8b5cf6',
isView: false,
createdAt: Date.now(),
order: 0,
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const sql = exportBaseSQL({
diagram,
targetDatabaseType: DatabaseType.POSTGRESQL,
isDBMLFlow: true,
});
expect(sql).toContain('"title" varchar(200)');
expect(sql).not.toContain('"title" varchar(200)[]');
expect(sql).toContain('"description" text');
expect(sql).not.toContain('"description" text[]');
});
it('should handle mixed array and non-array fields in magical creatures table', () => {
const diagram: Diagram = {
id: 'test-diagram',
name: 'Bestiary System',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'magical_creatures',
schema: 'bestiary',
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'bigint', name: 'bigint' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'species_name',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
characterMaximumLength: '100',
},
{
id: generateId(),
name: 'habitats',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
characterMaximumLength: '80',
isArray: true,
comments:
'Preferred habitats: forest, mountain, swamp',
},
{
id: generateId(),
name: 'danger_level',
type: { id: 'integer', name: 'integer' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
default: '1',
},
{
id: generateId(),
name: 'resistances',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
characterMaximumLength: '50',
isArray: true,
comments: 'Damage resistances',
},
{
id: generateId(),
name: 'is_tameable',
type: { id: 'boolean', name: 'boolean' },
primaryKey: false,
unique: false,
nullable: false,
createdAt: Date.now(),
default: 'false',
},
],
indexes: [],
x: 0,
y: 0,
color: '#10b981',
isView: false,
createdAt: Date.now(),
order: 0,
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const sql = exportBaseSQL({
diagram,
targetDatabaseType: DatabaseType.POSTGRESQL,
isDBMLFlow: true,
});
expect(sql).toContain('CREATE TABLE "bestiary"."magical_creatures"');
expect(sql).toContain('"species_name" varchar(100)');
expect(sql).not.toContain('"species_name" varchar(100)[]');
expect(sql).toContain('"habitats" varchar(80)[]');
expect(sql).toContain('"danger_level" integer');
expect(sql).not.toContain('"danger_level" integer[]');
expect(sql).toContain('"resistances" varchar(50)[]');
expect(sql).toContain('"is_tameable" boolean');
expect(sql).not.toContain('"is_tameable" boolean[]');
});
});


@@ -1,67 +1,17 @@
import type { Diagram } from '../../domain/diagram';
import { OPENAI_API_KEY, OPENAI_API_ENDPOINT, LLM_MODEL_NAME } from '@/lib/env';
import { DatabaseType } from '@/lib/domain/database-type';
import {
DatabaseType,
databaseTypesWithCommentSupport,
} from '@/lib/domain/database-type';
import type { DBTable } from '@/lib/domain/db-table';
import { dataTypeMap, type DataType } from '../data-types/data-types';
import type { DataType } from '../data-types/data-types';
import { generateCacheKey, getFromCache, setInCache } from './export-sql-cache';
import { exportMSSQL } from './export-per-type/mssql';
import { exportPostgreSQL } from './export-per-type/postgresql';
import { exportSQLite } from './export-per-type/sqlite';
import { exportMySQL } from './export-per-type/mysql';
import { escapeSQLComment } from './export-per-type/common';
import {
databaseTypesWithCommentSupport,
supportsCustomTypes,
} from '@/lib/domain/database-capabilities';
// Function to format default values with proper quoting
const formatDefaultValue = (value: string): string => {
const trimmed = value.trim();
// SQL keywords and function-like keywords that don't need quotes
const keywords = [
'TRUE',
'FALSE',
'NULL',
'CURRENT_TIMESTAMP',
'CURRENT_DATE',
'CURRENT_TIME',
'NOW',
'GETDATE',
'NEWID',
'UUID',
];
if (keywords.includes(trimmed.toUpperCase())) {
return trimmed;
}
// Function calls (contain parentheses) don't need quotes
if (trimmed.includes('(') && trimmed.includes(')')) {
return trimmed;
}
// Numbers don't need quotes
if (/^-?\d+(\.\d+)?$/.test(trimmed)) {
return trimmed;
}
// Already quoted strings - keep as is
if (
(trimmed.startsWith("'") && trimmed.endsWith("'")) ||
(trimmed.startsWith('"') && trimmed.endsWith('"'))
) {
return trimmed;
}
// Check if it's a simple identifier (alphanumeric, no spaces) that might be a currency or enum
// These typically don't have spaces and are short (< 10 chars)
if (/^[A-Z][A-Z0-9_]*$/i.test(trimmed) && trimmed.length <= 10) {
return trimmed; // Treat as unquoted identifier (e.g., EUR, USD)
}
// Everything else needs to be quoted and escaped
return `'${trimmed.replace(/'/g, "''")}'`;
};
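To make the removed formatter's quoting rules concrete, here is a condensed, self-contained restatement (keyword list trimmed for brevity) with sample inputs:

```typescript
// Condensed restatement of the formatDefaultValue helper shown above,
// so its quoting rules can be exercised standalone.
const formatDefaultValue = (value: string): string => {
    const trimmed = value.trim();
    const keywords = ['TRUE', 'FALSE', 'NULL', 'CURRENT_TIMESTAMP', 'NOW'];
    if (keywords.includes(trimmed.toUpperCase())) return trimmed; // keywords
    if (trimmed.includes('(') && trimmed.includes(')')) return trimmed; // fn calls
    if (/^-?\d+(\.\d+)?$/.test(trimmed)) return trimmed; // numbers
    if (
        (trimmed.startsWith("'") && trimmed.endsWith("'")) ||
        (trimmed.startsWith('"') && trimmed.endsWith('"'))
    )
        return trimmed; // already quoted
    if (/^[A-Z][A-Z0-9_]*$/i.test(trimmed) && trimmed.length <= 10)
        return trimmed; // short identifiers such as EUR
    return `'${trimmed.replace(/'/g, "''")}'`; // quote and escape the rest
};

console.log(formatDefaultValue('now()')); // now()
console.log(formatDefaultValue('42')); // 42
console.log(formatDefaultValue('EUR')); // EUR
console.log(formatDefaultValue("it's fine")); // 'it''s fine'
```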
// Function to simplify verbose data type names
const simplifyDataType = (typeName: string): string => {
@@ -201,7 +151,10 @@ export const exportBaseSQL = ({
// or if we rely on the DBML generator to create Enums separately (as currently done)
// For now, let's assume PostgreSQL-style for demonstration if isDBMLFlow is false.
// If isDBMLFlow is true, we let TableDBML.tsx handle Enum syntax directly.
if (supportsCustomTypes(targetDatabaseType) && !isDBMLFlow) {
if (
targetDatabaseType === DatabaseType.POSTGRESQL &&
!isDBMLFlow
) {
const enumValues = customType.values
.map((v) => `'${v.replace(/'/g, "''")}'`)
.join(', ');
@@ -214,7 +167,10 @@ export const exportBaseSQL = ({
) {
// For PostgreSQL, generate CREATE TYPE ... AS (...)
// This is crucial for composite types to be recognized by the DBML importer
if (supportsCustomTypes(targetDatabaseType) || isDBMLFlow) {
if (
targetDatabaseType === DatabaseType.POSTGRESQL ||
isDBMLFlow
) {
// Assume other DBs might not support this or DBML flow needs it
const compositeFields = customType.fields
.map((f) => `${f.field} ${simplifyDataType(f.type)}`)
@@ -229,12 +185,13 @@ export const exportBaseSQL = ({
(ct.kind === 'enum' &&
ct.values &&
ct.values.length > 0 &&
supportsCustomTypes(targetDatabaseType) &&
targetDatabaseType === DatabaseType.POSTGRESQL &&
!isDBMLFlow) ||
(ct.kind === 'composite' &&
ct.fields &&
ct.fields.length > 0 &&
(supportsCustomTypes(targetDatabaseType) || isDBMLFlow))
(targetDatabaseType === DatabaseType.POSTGRESQL ||
isDBMLFlow))
)
) {
sqlScript += '\n';
@@ -294,7 +251,7 @@ export const exportBaseSQL = ({
if (
customEnumType &&
supportsCustomTypes(targetDatabaseType) &&
targetDatabaseType === DatabaseType.POSTGRESQL &&
!isDBMLFlow
) {
typeName = customEnumType.schema
@@ -337,14 +294,7 @@ export const exportBaseSQL = ({
}
const quotedFieldName = getQuotedFieldName(field.name, isDBMLFlow);
// Quote multi-word type names for DBML flow to prevent @dbml/core parser issues
const quotedTypeName =
isDBMLFlow && typeName.includes(' ')
? `"${typeName}"`
: typeName;
sqlScript += ` ${quotedFieldName} ${quotedTypeName}`;
sqlScript += ` ${quotedFieldName} ${typeName}`;
// Add size for character types
if (
@@ -364,31 +314,11 @@ export const exportBaseSQL = ({
sqlScript += `(1)`;
}
// Add precision and scale for numeric types only
const precisionAndScaleTypes = dataTypeMap[targetDatabaseType]
.filter(
(t) =>
t.fieldAttributes?.precision && t.fieldAttributes?.scale
)
.map((t) => t.name);
const isNumericType = precisionAndScaleTypes.some(
(t) =>
field.type.name.toLowerCase().includes(t) ||
typeName.toLowerCase().includes(t)
);
if (isNumericType) {
if (field.precision && field.scale) {
sqlScript += `(${field.precision}, ${field.scale})`;
} else if (field.precision) {
sqlScript += `(${field.precision})`;
}
}
// Add array suffix if field is an array (after type size and precision)
if (field.isArray) {
sqlScript += '[]';
// Add precision and scale for numeric types
if (field.precision && field.scale) {
sqlScript += `(${field.precision}, ${field.scale})`;
} else if (field.precision) {
sqlScript += `(${field.precision})`;
}
// Handle NOT NULL constraint
@@ -401,26 +331,9 @@ export const exportBaseSQL = ({
sqlScript += ` UNIQUE`;
}
// Handle AUTO INCREMENT
// Handle AUTO INCREMENT - add as a comment for AI to process
if (field.increment) {
if (isDBMLFlow) {
// For DBML flow, generate proper database-specific syntax
if (
targetDatabaseType === DatabaseType.MYSQL ||
targetDatabaseType === DatabaseType.MARIADB
) {
sqlScript += ` AUTO_INCREMENT`;
} else if (targetDatabaseType === DatabaseType.SQL_SERVER) {
sqlScript += ` IDENTITY(1,1)`;
} else if (targetDatabaseType === DatabaseType.SQLITE) {
// SQLite AUTOINCREMENT only works with INTEGER PRIMARY KEY
// Will be handled when PRIMARY KEY is added
}
// PostgreSQL/CockroachDB: increment attribute added by restoreIncrementAttribute in DBML export
} else {
// For non-DBML flow, add as a comment for AI to process
sqlScript += ` /* AUTO_INCREMENT */`;
}
sqlScript += ` /* AUTO_INCREMENT */`;
}
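The per-database branching removed in this hunk selected auto-increment syntax by target dialect. A sketch of that dispatch (string identifiers are illustrative stand-ins for the `DatabaseType` enum):

```typescript
// Sketch of the removed per-database auto-increment syntax selection.
const autoIncrementClause = (databaseType: string): string => {
    switch (databaseType) {
        case 'mysql':
        case 'mariadb':
            return ' AUTO_INCREMENT';
        case 'sql_server':
            return ' IDENTITY(1,1)';
        default:
            // SQLite appends AUTOINCREMENT with the PRIMARY KEY clause;
            // PostgreSQL carries increment via a DBML attribute instead.
            return '';
    }
};

console.log(JSON.stringify(autoIncrementClause('mysql'))); // " AUTO_INCREMENT"
console.log(JSON.stringify(autoIncrementClause('sqlite'))); // ""
```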
// Handle DEFAULT value
@@ -453,19 +366,7 @@ export const exportBaseSQL = ({
fieldDefault = `now()`;
}
// Fix CURRENT_DATE() for PostgreSQL in DBML flow - PostgreSQL uses CURRENT_DATE without parentheses
if (
isDBMLFlow &&
targetDatabaseType === DatabaseType.POSTGRESQL
) {
if (fieldDefault.toUpperCase() === 'CURRENT_DATE()') {
fieldDefault = 'CURRENT_DATE';
}
}
// Format default value with proper quoting
const formattedDefault = formatDefaultValue(fieldDefault);
sqlScript += ` DEFAULT ${formattedDefault}`;
sqlScript += ` DEFAULT ${fieldDefault}`;
}
}
@@ -473,17 +374,6 @@ export const exportBaseSQL = ({
const pkIndex = table.indexes.find((idx) => idx.isPrimaryKey);
if (field.primaryKey && !hasCompositePrimaryKey && !pkIndex?.name) {
sqlScript += ' PRIMARY KEY';
// For SQLite with DBML flow, add AUTOINCREMENT after PRIMARY KEY
if (
isDBMLFlow &&
field.increment &&
targetDatabaseType === DatabaseType.SQLITE &&
(typeName.toLowerCase() === 'integer' ||
typeName.toLowerCase() === 'int')
) {
sqlScript += ' AUTOINCREMENT';
}
}
// Add a comma after each field except the last one (or before PK constraint)
@@ -564,16 +454,10 @@ export const exportBaseSQL = ({
.join(', ');
if (fieldNames) {
const rawIndexName =
const indexName =
table.schema && !isDBMLFlow
? `${table.schema}_${index.name}`
: index.name;
// Quote index name if it contains special characters
// For DBML flow, also quote if contains special characters
const needsQuoting = /[^a-zA-Z0-9_]/.test(rawIndexName);
const indexName = needsQuoting
? `"${rawIndexName}"`
: rawIndexName;
sqlScript += `CREATE ${index.unique ? 'UNIQUE ' : ''}INDEX ${indexName} ON ${tableName} (${fieldNames});\n`;
}
});
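The index-name quoting removed in this hunk is a one-line rule; a standalone sketch (helper name is illustrative):

```typescript
// Sketch of the removed index-name quoting rule: any character outside
// [a-zA-Z0-9_] forces the name to be wrapped in double quotes.
const quoteIndexName = (rawIndexName: string): string =>
    /[^a-zA-Z0-9_]/.test(rawIndexName) ? `"${rawIndexName}"` : rawIndexName;

console.log(quoteIndexName('idx_users_email')); // idx_users_email
console.log(quoteIndexName('idx-users.email')); // "idx-users.email"
```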


@@ -10,7 +10,6 @@ import { defaultTableColor } from '@/lib/colors';
import { DatabaseType } from '@/lib/domain/database-type';
import type { DBCustomType } from '@/lib/domain/db-custom-type';
import { DBCustomTypeKind } from '@/lib/domain/db-custom-type';
import { supportsCustomTypes } from '@/lib/domain/database-capabilities';
// Common interfaces for SQL entities
export interface SQLColumn {
@@ -664,7 +663,7 @@ export function convertToChartDBDiagram(
// Ensure integer types are preserved
mappedType = { id: 'integer', name: 'integer' };
} else if (
supportsCustomTypes(sourceDatabaseType) &&
sourceDatabaseType === DatabaseType.POSTGRESQL &&
parserResult.enums &&
parserResult.enums.some(
(e) => e.name.toLowerCase() === column.type.toLowerCase()

@@ -0,0 +1,66 @@
import { describe, it } from 'vitest';
describe('node-sql-parser - CREATE TYPE handling', () => {
it('should show exact parser error for CREATE TYPE', async () => {
const { Parser } = await import('node-sql-parser');
const parser = new Parser();
const parserOpts = {
database: 'PostgreSQL',
};
console.log('\n=== Testing CREATE TYPE statement ===');
const createTypeSQL = `CREATE TYPE spell_element AS ENUM ('fire', 'water', 'earth', 'air');`;
try {
parser.astify(createTypeSQL, parserOpts);
console.log('CREATE TYPE parsed successfully');
} catch (error) {
console.log('CREATE TYPE parse error:', (error as Error).message);
}
console.log('\n=== Testing CREATE EXTENSION statement ===');
const createExtensionSQL = `CREATE EXTENSION IF NOT EXISTS "uuid-ossp";`;
try {
parser.astify(createExtensionSQL, parserOpts);
console.log('CREATE EXTENSION parsed successfully');
} catch (error) {
console.log(
'CREATE EXTENSION parse error:',
(error as Error).message
);
}
console.log('\n=== Testing CREATE TABLE with custom type ===');
const createTableWithTypeSQL = `CREATE TABLE wizards (
id UUID PRIMARY KEY,
element spell_element DEFAULT 'fire'
);`;
try {
parser.astify(createTableWithTypeSQL, parserOpts);
console.log('CREATE TABLE with custom type parsed successfully');
} catch (error) {
console.log(
'CREATE TABLE with custom type parse error:',
(error as Error).message
);
}
console.log('\n=== Testing CREATE TABLE with standard types only ===');
const createTableStandardSQL = `CREATE TABLE wizards (
id UUID PRIMARY KEY,
element VARCHAR(20) DEFAULT 'fire'
);`;
try {
parser.astify(createTableStandardSQL, parserOpts);
console.log('CREATE TABLE with standard types parsed successfully');
} catch (error) {
console.log(
'CREATE TABLE with standard types parse error:',
(error as Error).message
);
}
});
});

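The test above probes which PostgreSQL statements node-sql-parser rejects (`CREATE TYPE`, `CREATE EXTENSION`). One common workaround — an assumption here, not something this diff implements — is to strip the unsupported statements before calling `astify` so the remaining DDL still parses:

```typescript
// Hypothetical preprocessing step: drop statements node-sql-parser cannot
// handle before parsing. Uses a naive split on ';', which is fine for DDL
// that has no semicolons inside string literals.
function stripUnsupportedStatements(sql: string): string {
    return sql
        .split(';')
        .filter((stmt) => {
            const s = stmt.trim().toUpperCase();
            return (
                !s.startsWith('CREATE TYPE') &&
                !s.startsWith('CREATE EXTENSION')
            );
        })
        .join(';');
}
```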
@@ -1,178 +0,0 @@
import { describe, it, expect } from 'vitest';
import { fromSQLite } from '../sqlite';
describe('SQLite Import Tests', () => {
it('should parse SQLite script with sqlite_sequence table and all relationships', async () => {
const sql = `
CREATE TABLE users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
age INTEGER
);
CREATE TABLE sqlite_sequence(name,seq);
CREATE TABLE products (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
price REAL
);
CREATE TABLE user_products (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
product_id INTEGER NOT NULL,
purchased_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(id),
FOREIGN KEY (product_id) REFERENCES products(id)
);
`;
const result = await fromSQLite(sql);
// ============= CHECK TOTAL COUNTS =============
// Should have exactly 4 tables
expect(result.tables).toHaveLength(4);
// Should have exactly 2 foreign key relationships
expect(result.relationships).toHaveLength(2);
// ============= CHECK USERS TABLE =============
const usersTable = result.tables.find((t) => t.name === 'users');
expect(usersTable).toBeDefined();
expect(usersTable?.columns).toHaveLength(3); // id, name, age
// Check each column in users table
expect(usersTable?.columns[0]).toMatchObject({
name: 'id',
type: 'INTEGER',
primaryKey: true,
increment: true,
nullable: false,
});
expect(usersTable?.columns[1]).toMatchObject({
name: 'name',
type: 'TEXT',
primaryKey: false,
nullable: true,
});
expect(usersTable?.columns[2]).toMatchObject({
name: 'age',
type: 'INTEGER',
primaryKey: false,
nullable: true,
});
// ============= CHECK SQLITE_SEQUENCE TABLE =============
const sqliteSequenceTable = result.tables.find(
(t) => t.name === 'sqlite_sequence'
);
expect(sqliteSequenceTable).toBeDefined();
expect(sqliteSequenceTable?.columns).toHaveLength(2); // name, seq
// Check columns in sqlite_sequence table
expect(sqliteSequenceTable?.columns[0]).toMatchObject({
name: 'name',
type: 'TEXT', // Should default to TEXT when no type specified
primaryKey: false,
nullable: true,
});
expect(sqliteSequenceTable?.columns[1]).toMatchObject({
name: 'seq',
type: 'TEXT', // Should default to TEXT when no type specified
primaryKey: false,
nullable: true,
});
// ============= CHECK PRODUCTS TABLE =============
const productsTable = result.tables.find((t) => t.name === 'products');
expect(productsTable).toBeDefined();
expect(productsTable?.columns).toHaveLength(3); // id, name, price
// Check each column in products table
expect(productsTable?.columns[0]).toMatchObject({
name: 'id',
type: 'INTEGER',
primaryKey: true,
increment: true,
nullable: false,
});
expect(productsTable?.columns[1]).toMatchObject({
name: 'name',
type: 'TEXT',
primaryKey: false,
nullable: true,
});
expect(productsTable?.columns[2]).toMatchObject({
name: 'price',
type: 'REAL',
primaryKey: false,
nullable: true,
});
// ============= CHECK USER_PRODUCTS TABLE =============
const userProductsTable = result.tables.find(
(t) => t.name === 'user_products'
);
expect(userProductsTable).toBeDefined();
expect(userProductsTable?.columns).toHaveLength(4); // id, user_id, product_id, purchased_at
// Check each column in user_products table
expect(userProductsTable?.columns[0]).toMatchObject({
name: 'id',
type: 'INTEGER',
primaryKey: true,
increment: true,
nullable: false,
});
expect(userProductsTable?.columns[1]).toMatchObject({
name: 'user_id',
type: 'INTEGER',
primaryKey: false,
nullable: false, // NOT NULL constraint
});
expect(userProductsTable?.columns[2]).toMatchObject({
name: 'product_id',
type: 'INTEGER',
primaryKey: false,
nullable: false, // NOT NULL constraint
});
expect(userProductsTable?.columns[3]).toMatchObject({
name: 'purchased_at',
type: 'TIMESTAMP', // DATETIME should map to TIMESTAMP
primaryKey: false,
nullable: true,
default: 'CURRENT_TIMESTAMP',
});
// ============= CHECK FOREIGN KEY RELATIONSHIPS =============
// FK 1: user_products.user_id -> users.id
const userIdFK = result.relationships.find(
(r) =>
r.sourceTable === 'user_products' &&
r.sourceColumn === 'user_id' &&
r.targetTable === 'users' &&
r.targetColumn === 'id'
);
expect(userIdFK).toBeDefined();
expect(userIdFK).toMatchObject({
sourceTable: 'user_products',
sourceColumn: 'user_id',
targetTable: 'users',
targetColumn: 'id',
});
// FK 2: user_products.product_id -> products.id
const productIdFK = result.relationships.find(
(r) =>
r.sourceTable === 'user_products' &&
r.sourceColumn === 'product_id' &&
r.targetTable === 'products' &&
r.targetColumn === 'id'
);
expect(productIdFK).toBeDefined();
expect(productIdFK).toMatchObject({
sourceTable: 'user_products',
sourceColumn: 'product_id',
targetTable: 'products',
targetColumn: 'id',
});
});
});

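The expectations above (DATETIME stored as TIMESTAMP, unknown or missing types falling back to TEXT) follow the storage-class mapping the importer applies to raw SQLite column types. A simplified standalone version:

```typescript
// Map a raw SQLite column type to the storage class the importer records:
// INT-like -> INTEGER, numeric -> REAL, binary -> BLOB,
// date/time -> TIMESTAMP, everything else (including unknown) -> TEXT.
function mapSqliteType(rawType: string): string {
    const t = rawType.toUpperCase();
    if (t === 'INTEGER' || t === 'INT') return 'INTEGER';
    if (['REAL', 'FLOAT', 'DOUBLE', 'NUMERIC', 'DECIMAL'].includes(t)) return 'REAL';
    if (t === 'BLOB' || t === 'BINARY') return 'BLOB';
    if (['TIMESTAMP', 'DATETIME', 'DATE', 'TIME'].includes(t)) return 'TIMESTAMP';
    return 'TEXT';
}
```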
@@ -32,11 +32,11 @@ export async function fromSQLite(sqlContent: string): Promise<SQLParserResult> {
const tableMap: Record<string, string> = {}; // Maps table name to its ID
try {
// SPECIAL HANDLING: Direct regex-based parser for SQLite DDL
// This ensures we handle all SQLite-specific syntax including tables without types
// SPECIAL HANDLING: Direct line-by-line parser for SQLite DDL
// This ensures we preserve the exact data types from the original DDL
const directlyParsedTables = parseCreateTableStatements(sqlContent);
// Always try direct parsing first as it's more reliable for SQLite
// Check if we successfully parsed tables directly
if (directlyParsedTables.length > 0) {
// Map the direct parsing results to the expected SQLParserResult format
directlyParsedTables.forEach((table) => {
@@ -56,19 +56,8 @@ export async function fromSQLite(sqlContent: string): Promise<SQLParserResult> {
// Process foreign keys using the regex approach
findForeignKeysUsingRegex(sqlContent, tableMap, relationships);
// Create placeholder tables for any missing referenced tables
addPlaceholderTablesForFKReferences(
tables,
relationships,
tableMap
);
// Filter out any invalid relationships
const validRelationships = relationships.filter((rel) => {
return isValidForeignKeyRelationship(rel, tables);
});
return { tables, relationships: validRelationships };
// Return the result
return { tables, relationships };
}
// Preprocess SQL to handle SQLite quoted identifiers
@@ -141,181 +130,100 @@ function parseCreateTableStatements(sqlContent: string): {
columns: SQLColumn[];
}[] = [];
// Remove comments before processing
const cleanedSQL = sqlContent
.split('\n')
.map((line) => {
const commentIndex = line.indexOf('--');
if (commentIndex >= 0) {
return line.substring(0, commentIndex);
}
return line;
})
.join('\n');
// Split SQL content into lines
const lines = sqlContent.split('\n');
// Match all CREATE TABLE statements including those without column definitions
const createTableRegex =
/CREATE\s+TABLE\s+(?:IF\s+NOT\s+EXISTS\s+)?["'`]?(\w+)["'`]?\s*\(([^;]+?)\)\s*;/gis;
let match;
let currentTable: { name: string; columns: SQLColumn[] } | null = null;
let inCreateTable = false;
while ((match = createTableRegex.exec(cleanedSQL)) !== null) {
const tableName = match[1];
const tableBody = match[2].trim();
// Process each line
for (let i = 0; i < lines.length; i++) {
const line = lines[i].trim();
const table: { name: string; columns: SQLColumn[] } = {
name: tableName,
columns: [],
};
// Special case: sqlite_sequence or tables with columns but no types
if (tableName === 'sqlite_sequence' || !tableBody.includes(' ')) {
// Parse simple column list without types (e.g., "name,seq")
const simpleColumns = tableBody.split(',').map((col) => col.trim());
for (const colName of simpleColumns) {
if (
colName &&
!colName.toUpperCase().startsWith('FOREIGN KEY') &&
!colName.toUpperCase().startsWith('PRIMARY KEY') &&
!colName.toUpperCase().startsWith('UNIQUE') &&
!colName.toUpperCase().startsWith('CHECK') &&
!colName.toUpperCase().startsWith('CONSTRAINT')
) {
table.columns.push({
name: colName.replace(/["'`]/g, ''),
type: 'TEXT', // Default to TEXT for untyped columns
nullable: true,
primaryKey: false,
unique: false,
default: '',
increment: false,
});
}
}
} else {
// Parse normal table with typed columns
// Split by commas not inside parentheses
const columnDefs = [];
let current = '';
let parenDepth = 0;
for (let i = 0; i < tableBody.length; i++) {
const char = tableBody[i];
if (char === '(') parenDepth++;
else if (char === ')') parenDepth--;
else if (char === ',' && parenDepth === 0) {
columnDefs.push(current.trim());
current = '';
continue;
}
current += char;
}
if (current.trim()) {
columnDefs.push(current.trim());
}
for (const columnDef of columnDefs) {
const line = columnDef.trim();
// Skip constraints
if (
line.toUpperCase().startsWith('FOREIGN KEY') ||
line.toUpperCase().startsWith('PRIMARY KEY') ||
line.toUpperCase().startsWith('UNIQUE') ||
line.toUpperCase().startsWith('CHECK') ||
line.toUpperCase().startsWith('CONSTRAINT')
) {
continue;
}
// Parse column: handle both quoted and unquoted identifiers
// Pattern: [quotes]columnName[quotes] dataType [constraints]
const columnPattern = /^["'`]?([\w]+)["'`]?\s+(\w+)(.*)$/i;
const columnMatch = columnPattern.exec(line);
if (columnMatch) {
const columnName = columnMatch[1];
const rawType = columnMatch[2].toUpperCase();
const restOfLine = columnMatch[3] || '';
const upperRest = restOfLine.toUpperCase();
// Determine column properties
const isPrimaryKey = upperRest.includes('PRIMARY KEY');
const isAutoIncrement = upperRest.includes('AUTOINCREMENT');
const isNotNull =
upperRest.includes('NOT NULL') || isPrimaryKey;
const isUnique =
upperRest.includes('UNIQUE') || isPrimaryKey;
// Extract default value
let defaultValue = '';
const defaultMatch = /DEFAULT\s+([^,)]+)/i.exec(restOfLine);
if (defaultMatch) {
defaultValue = defaultMatch[1].trim();
// Remove quotes if present
if (
(defaultValue.startsWith("'") &&
defaultValue.endsWith("'")) ||
(defaultValue.startsWith('"') &&
defaultValue.endsWith('"'))
) {
defaultValue = defaultValue.slice(1, -1);
}
}
// Map to appropriate SQLite storage class
let columnType = rawType;
if (rawType === 'INTEGER' || rawType === 'INT') {
columnType = 'INTEGER';
} else if (
[
'REAL',
'FLOAT',
'DOUBLE',
'NUMERIC',
'DECIMAL',
].includes(rawType)
) {
columnType = 'REAL';
} else if (rawType === 'BLOB' || rawType === 'BINARY') {
columnType = 'BLOB';
} else if (
['TIMESTAMP', 'DATETIME', 'DATE', 'TIME'].includes(
rawType
)
) {
columnType = 'TIMESTAMP';
} else if (
['TEXT', 'VARCHAR', 'CHAR', 'CLOB', 'STRING'].includes(
rawType
) ||
rawType.startsWith('VARCHAR') ||
rawType.startsWith('CHAR')
) {
columnType = 'TEXT';
} else {
// Default to TEXT for unknown types
columnType = 'TEXT';
}
// Add column to the table
table.columns.push({
name: columnName,
type: columnType,
nullable: !isNotNull,
primaryKey: isPrimaryKey,
unique: isUnique,
default: defaultValue,
increment:
isPrimaryKey &&
isAutoIncrement &&
columnType === 'INTEGER',
});
}
}
// Skip empty lines and comments
if (!line || line.startsWith('--')) {
continue;
}
if (table.columns.length > 0 || tableName === 'sqlite_sequence') {
tables.push(table);
// Check for CREATE TABLE statement
if (line.toUpperCase().startsWith('CREATE TABLE')) {
// Extract table name
const tableNameMatch =
/CREATE\s+TABLE\s+(?:if\s+not\s+exists\s+)?["'`]?(\w+)["'`]?/i.exec(
line
);
if (tableNameMatch && tableNameMatch[1]) {
inCreateTable = true;
currentTable = {
name: tableNameMatch[1],
columns: [],
};
}
}
// Check for end of CREATE TABLE statement
else if (inCreateTable && line.includes(');')) {
if (currentTable) {
tables.push(currentTable);
}
inCreateTable = false;
currentTable = null;
}
// Process column definitions inside CREATE TABLE
else if (inCreateTable && currentTable && line.includes('"')) {
// Column line pattern optimized for user's DDL format
const columnPattern = /\s*["'`](\w+)["'`]\s+([A-Za-z0-9_]+)(.+)?/i;
const match = columnPattern.exec(line);
if (match) {
const columnName = match[1];
const rawType = match[2].toUpperCase();
const restOfLine = match[3] || '';
// Determine column properties
const isPrimaryKey = restOfLine
.toUpperCase()
.includes('PRIMARY KEY');
const isNotNull = restOfLine.toUpperCase().includes('NOT NULL');
const isUnique = restOfLine.toUpperCase().includes('UNIQUE');
// Extract default value
let defaultValue = '';
const defaultMatch = /DEFAULT\s+([^,\s)]+)/i.exec(restOfLine);
if (defaultMatch) {
defaultValue = defaultMatch[1];
}
// Map to appropriate SQLite storage class
let columnType = rawType;
if (rawType === 'INTEGER' || rawType === 'INT') {
columnType = 'INTEGER';
} else if (
['REAL', 'FLOAT', 'DOUBLE', 'NUMERIC', 'DECIMAL'].includes(
rawType
)
) {
columnType = 'REAL';
} else if (rawType === 'BLOB' || rawType === 'BINARY') {
columnType = 'BLOB';
} else if (
['TIMESTAMP', 'DATETIME', 'DATE'].includes(rawType)
) {
columnType = 'TIMESTAMP';
} else {
columnType = 'TEXT';
}
// Add column to the table
currentTable.columns.push({
name: columnName,
type: columnType,
nullable: !isNotNull,
primaryKey: isPrimaryKey,
unique: isUnique || isPrimaryKey,
default: defaultValue,
increment: isPrimaryKey && columnType === 'INTEGER',
});
}
}
}

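The rewritten parser above splits a CREATE TABLE body on commas only at parenthesis depth zero, so parameterized types such as `DECIMAL(10,2)` are not broken apart. Extracted as a standalone sketch:

```typescript
// Split a CREATE TABLE body into column definitions on top-level commas,
// tracking parenthesis depth so a comma inside DECIMAL(10,2) or a CHECK
// expression does not start a new definition.
function splitColumnDefs(tableBody: string): string[] {
    const defs: string[] = [];
    let current = '';
    let parenDepth = 0;
    for (const char of tableBody) {
        if (char === '(') parenDepth++;
        else if (char === ')') parenDepth--;
        else if (char === ',' && parenDepth === 0) {
            defs.push(current.trim());
            current = '';
            continue;
        }
        current += char;
    }
    if (current.trim()) {
        defs.push(current.trim());
    }
    return defs;
}
```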
@@ -1,6 +1,6 @@
Table "public"."guy_table" {
"id" integer [pk, not null]
"created_at" "timestamp without time zone" [not null]
"created_at" timestamp [not null]
"column3" text
"arrayfield" text[]
"field_5" "character varying"

@@ -1,7 +0,0 @@
Table "public"."orders" {
"order_id" integer [pk, not null, increment]
"customer_id" integer [not null]
"order_date" date [not null, default: `CURRENT_DATE`]
"total_amount" numeric [not null, default: 0]
"status" varchar(50) [not null, default: 'Pending']
}

@@ -1 +0,0 @@
{"id":"6b81a1787207","name":"SQL Import (postgresql)","createdAt":"2025-09-15T08:46:26.747Z","updatedAt":"2025-09-17T11:32:13.876Z","databaseType":"postgresql","tables":[{"id":"5ytf0yj9etpmm7mhmhvpu8kfj","name":"orders","schema":"public","order":1,"fields":[{"id":"w7l77cy9hylvlitdovt4ktdmk","name":"order_id","type":{"id":"integer","name":"integer"},"nullable":false,"primaryKey":true,"unique":false,"default":"","createdAt":1757925986747,"increment":true},{"id":"vz7747t5fxrb62v1eepmahv9v","name":"customer_id","type":{"id":"integer","name":"integer"},"nullable":false,"primaryKey":false,"unique":false,"default":"","createdAt":1757925986747,"increment":false},{"id":"geq9qy6sv4ozl2lg9fvcyzxpf","name":"order_date","type":{"name":"date","id":"date","usageLevel":1},"nullable":false,"primaryKey":false,"unique":false,"default":"CURRENT_DATE()","createdAt":1757925986747,"increment":false},{"id":"z928n7umvpec79t2eif7kmde9","name":"total_amount","type":{"name":"numeric","id":"numeric","fieldAttributes":{"precision":{"max":999,"min":1,"default":10},"scale":{"max":999,"min":0,"default":2}}},"nullable":false,"primaryKey":false,"unique":false,"default":"0","createdAt":1757925986747,"increment":false},{"id":"7bkrd0rp1s17bi1lnle6pesc7","name":"status","type":{"name":"varchar","id":"varchar","fieldAttributes":{"hasCharMaxLength":true},"usageLevel":1},"nullable":false,"primaryKey":false,"unique":false,"default":"'Pending'","createdAt":1757925986747,"increment":false,"characterMaximumLength":"50"}],"indexes":[],"x":113,"y":747,"color":"#8eb7ff","isView":false,"createdAt":1757925986747,"diagramId":"6b81a1787207","parentAreaId":null}],"relationships":[],"dependencies":[],"storageMode":"project","lastProjectSavedAt":"2025-09-17T11:32:13.876Z","areas":[],"creationMethod":"imported","customTypes":[]}

@@ -1,129 +0,0 @@
Enum "cbhpm_entradas_tipo" {
"grupo"
"subgrupo"
"procedimento"
}
Enum "cid_entradas_tipo" {
"capitulo"
"agrupamento"
"categoria"
"subcategoria"
}
Enum "digital_signature_provider" {
"soluti"
"valid"
}
Enum "impresso_posicao" {
"start"
"center"
"end"
}
Enum "otp_provider" {
"clinic"
"soluti_bird_id"
}
Enum "tipo_cobranca" {
"valor"
"porte"
}
Enum "tipo_contato_movel" {
"celular"
"telefone_residencial"
"telefone_comercial"
}
Enum "tipo_contrato" {
"trial"
"common"
}
Enum "tipo_endereco" {
"residencial"
"comercial"
"cobranca"
}
Enum "tipo_espectro_autista" {
"leve"
"moderado"
"severo"
}
Enum "tipo_estado_civil" {
"nao_infomado"
"solteiro"
"casado"
"divorciado"
"viuvo"
}
Enum "tipo_etnia" {
"nao_infomado"
"branca"
"preta"
"parda"
"amarela"
"indigena"
}
Enum "tipo_excecao" {
"bloqueio"
"compromisso"
}
Enum "tipo_metodo_reajuste" {
"percentual"
"valor"
}
Enum "tipo_pessoa" {
"fisica"
"juridica"
}
Enum "tipo_procedimento" {
"consulta"
"exame_laboratorial"
"exame_imagem"
"procedimento_clinico"
"procedimento_cirurgico"
"terapia"
"outros"
}
Enum "tipo_relacionamento" {
"pai"
"mae"
"conjuge"
"filho_a"
"tutor_legal"
"contato_emergencia"
"outro"
}
Enum "tipo_sexo" {
"nao_infomado"
"masculino"
"feminino"
"intersexo"
}
Enum "tipo_status_agendamento" {
"em espera"
"faltou"
"ok"
}
Table "public"."organizacao_cfg_impressos" {
"id_organizacao" integer [pk, not null, ref: < "public"."organizacao"."id"]
}
Table "public"."organizacao" {
"id" integer [pk, not null]
}

File diff suppressed because one or more lines are too long

@@ -1,14 +0,0 @@
Table "users" {
"id" integer [pk, not null, increment]
"username" varchar(100) [unique, not null]
"email" varchar(255) [not null]
}
Table "posts" {
"post_id" bigint [pk, not null, increment]
"user_id" integer [not null]
"title" varchar(200) [not null]
"order_num" integer [not null, increment]
}
Ref "fk_0_fk_posts_users":"users"."id" < "posts"."user_id"

@@ -1 +0,0 @@
{"id":"test_auto_increment","name":"Auto Increment Test (mysql)","createdAt":"2025-01-20T00:00:00.000Z","updatedAt":"2025-01-20T00:00:00.000Z","databaseType":"mysql","tables":[{"id":"table1","name":"users","order":1,"fields":[{"id":"field1","name":"id","type":{"id":"integer","name":"integer"},"nullable":false,"primaryKey":true,"unique":false,"default":"","increment":true,"createdAt":1705708800000},{"id":"field2","name":"username","type":{"id":"varchar","name":"varchar","fieldAttributes":{"hasCharMaxLength":true}},"nullable":false,"primaryKey":false,"unique":true,"default":"","increment":false,"characterMaximumLength":"100","createdAt":1705708800000},{"id":"field3","name":"email","type":{"id":"varchar","name":"varchar","fieldAttributes":{"hasCharMaxLength":true}},"nullable":false,"primaryKey":false,"unique":false,"default":"","increment":false,"characterMaximumLength":"255","createdAt":1705708800000}],"indexes":[],"x":100,"y":100,"color":"#8eb7ff","isView":false,"createdAt":1705708800000},{"id":"table2","name":"posts","order":2,"fields":[{"id":"field4","name":"post_id","type":{"id":"bigint","name":"bigint"},"nullable":false,"primaryKey":true,"unique":false,"default":"","increment":true,"createdAt":1705708800000},{"id":"field5","name":"user_id","type":{"id":"integer","name":"integer"},"nullable":false,"primaryKey":false,"unique":false,"default":"","increment":false,"createdAt":1705708800000},{"id":"field6","name":"title","type":{"id":"varchar","name":"varchar","fieldAttributes":{"hasCharMaxLength":true}},"nullable":false,"primaryKey":false,"unique":false,"default":"","increment":false,"characterMaximumLength":"200","createdAt":1705708800000},{"id":"field7","name":"order_num","type":{"id":"integer","name":"integer"},"nullable":false,"primaryKey":false,"unique":false,"default":"","increment":true,"createdAt":1705708800000}],"indexes":[],"x":300,"y":100,"color":"#8eb7ff","isView":false,"createdAt":1705708800000}],"relationships":[{"id":"rel1","name":"fk_posts_users","sourceTableId":"table2","targetTableId":"table1","sourceFieldId":"field5","targetFieldId":"field1","type":"one_to_many","sourceCardinality":"many","targetCardinality":"one","createdAt":1705708800000}],"dependencies":[],"storageMode":"project","areas":[],"creationMethod":"manual","customTypes":[]}

@@ -1,205 +0,0 @@
import { describe, it, expect } from 'vitest';
import { generateDBMLFromDiagram } from '../dbml-export';
import { DatabaseType } from '@/lib/domain/database-type';
import type { Diagram } from '@/lib/domain/diagram';
import { generateId, generateDiagramId } from '@/lib/utils';
describe('DBML Export - Empty Tables', () => {
it('should filter out tables with no fields', () => {
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'valid_table',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'empty_table',
schema: 'public',
x: 0,
y: 0,
fields: [], // Empty fields array
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'another_valid_table',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'name',
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const result = generateDBMLFromDiagram(diagram);
// Verify the DBML doesn't contain the empty table
expect(result.inlineDbml).not.toContain('empty_table');
expect(result.standardDbml).not.toContain('empty_table');
// Verify the valid tables are still present
expect(result.inlineDbml).toContain('valid_table');
expect(result.inlineDbml).toContain('another_valid_table');
});
it('should handle diagram with only empty tables', () => {
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'empty_table_1',
schema: 'public',
x: 0,
y: 0,
fields: [],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'empty_table_2',
schema: 'public',
x: 0,
y: 0,
fields: [],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const result = generateDBMLFromDiagram(diagram);
// Should not error and should return empty DBML (or just enums if any)
expect(result.inlineDbml).toBeTruthy();
expect(result.standardDbml).toBeTruthy();
expect(result.error).toBeUndefined();
});
it('should filter out table that becomes empty after removing invalid fields', () => {
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'table_with_only_empty_field_names',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: '', // Empty field name - will be filtered
type: { id: 'integer', name: 'integer' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
{
id: generateId(),
name: '', // Empty field name - will be filtered
type: { id: 'varchar', name: 'varchar' },
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'valid_table',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const result = generateDBMLFromDiagram(diagram);
// Table with only empty field names should be filtered out
expect(result.inlineDbml).not.toContain(
'table_with_only_empty_field_names'
);
// Valid table should remain
expect(result.inlineDbml).toContain('valid_table');
});
});

@@ -4,74 +4,64 @@ import { generateDBMLFromDiagram } from '../dbml-export';
import * as fs from 'fs';
import * as path from 'path';
const testCase = (caseNumber: string) => {
// Read the JSON file
const jsonPath = path.join(__dirname, 'cases', `${caseNumber}.json`);
const jsonContent = fs.readFileSync(jsonPath, 'utf-8');
// Parse the JSON and convert to diagram
const diagram = diagramFromJSONInput(jsonContent);
// Generate DBML from the diagram
const result = generateDBMLFromDiagram(diagram);
// Check for both regular and inline DBML files
const regularDbmlPath = path.join(__dirname, 'cases', `${caseNumber}.dbml`);
const inlineDbmlPath = path.join(
__dirname,
'cases',
`${caseNumber}.inline.dbml`
);
const hasRegularDbml = fs.existsSync(regularDbmlPath);
const hasInlineDbml = fs.existsSync(inlineDbmlPath);
// Test regular DBML if file exists
if (hasRegularDbml) {
const expectedRegularDBML = fs.readFileSync(regularDbmlPath, 'utf-8');
expect(result.standardDbml).toBe(expectedRegularDBML);
}
// Test inline DBML if file exists
if (hasInlineDbml) {
const expectedInlineDBML = fs.readFileSync(inlineDbmlPath, 'utf-8');
expect(result.inlineDbml).toBe(expectedInlineDBML);
}
// Ensure at least one DBML file exists
if (!hasRegularDbml && !hasInlineDbml) {
throw new Error(
`No DBML file found for test case ${caseNumber}. Expected either ${caseNumber}.dbml or ${caseNumber}.inline.dbml`
);
}
};
describe('DBML Export cases', () => {
describe('DBML Export - Diagram Case 1 Tests', () => {
it('should handle case 1 diagram', { timeout: 30000 }, async () => {
testCase('1');
// Read the JSON file
const jsonPath = path.join(__dirname, 'cases', '1.json');
const jsonContent = fs.readFileSync(jsonPath, 'utf-8');
// Parse the JSON and convert to diagram
const diagram = diagramFromJSONInput(jsonContent);
// Generate DBML from the diagram
const result = generateDBMLFromDiagram(diagram);
const generatedDBML = result.standardDbml;
// Read the expected DBML file
const dbmlPath = path.join(__dirname, 'cases', '1.dbml');
const expectedDBML = fs.readFileSync(dbmlPath, 'utf-8');
// Compare the generated DBML with the expected DBML
expect(generatedDBML).toBe(expectedDBML);
});
it('should handle case 2 diagram', { timeout: 30000 }, async () => {
testCase('2');
// Read the JSON file
const jsonPath = path.join(__dirname, 'cases', '2.json');
const jsonContent = fs.readFileSync(jsonPath, 'utf-8');
// Parse the JSON and convert to diagram
const diagram = diagramFromJSONInput(jsonContent);
// Generate DBML from the diagram
const result = generateDBMLFromDiagram(diagram);
const generatedDBML = result.standardDbml;
// Read the expected DBML file
const dbmlPath = path.join(__dirname, 'cases', '2.dbml');
const expectedDBML = fs.readFileSync(dbmlPath, 'utf-8');
// Compare the generated DBML with the expected DBML
expect(generatedDBML).toBe(expectedDBML);
});
it('should handle case 3 diagram', { timeout: 30000 }, async () => {
testCase('3');
});
// Read the JSON file
const jsonPath = path.join(__dirname, 'cases', '3.json');
const jsonContent = fs.readFileSync(jsonPath, 'utf-8');
it('should handle case 4 diagram', { timeout: 30000 }, async () => {
testCase('4');
});
// Parse the JSON and convert to diagram
const diagram = diagramFromJSONInput(jsonContent);
it('should handle case 5 diagram', { timeout: 30000 }, async () => {
testCase('5');
});
// Generate DBML from the diagram
const result = generateDBMLFromDiagram(diagram);
const generatedDBML = result.standardDbml;
it(
'should handle case 6 diagram - auto increment',
{ timeout: 30000 },
async () => {
testCase('6');
}
);
// Read the expected DBML file
const dbmlPath = path.join(__dirname, 'cases', '3.dbml');
const expectedDBML = fs.readFileSync(dbmlPath, 'utf-8');
// Compare the generated DBML with the expected DBML
expect(generatedDBML).toBe(expectedDBML);
});
});

@@ -1,248 +0,0 @@
import { describe, it, expect } from 'vitest';
import { generateDBMLFromDiagram } from '../dbml-export';
import { importDBMLToDiagram } from '../../dbml-import/dbml-import';
import { DatabaseType } from '@/lib/domain/database-type';
import type { Diagram } from '@/lib/domain/diagram';
import { generateId, generateDiagramId } from '@/lib/utils';
describe('DBML Export - Timestamp with Time Zone', () => {
it('should preserve "timestamp with time zone" type through export and reimport', async () => {
// Create a diagram with timestamp with time zone field
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'events',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'created_at',
type: {
id: 'timestamp_with_time_zone',
name: 'timestamp with time zone',
},
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'updated_at',
type: {
id: 'timestamp_without_time_zone',
name: 'timestamp without time zone',
},
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
// Export to DBML
const exportResult = generateDBMLFromDiagram(diagram);
// Verify the DBML contains quoted multi-word types
expect(exportResult.inlineDbml).toContain('"timestamp with time zone"');
expect(exportResult.inlineDbml).toContain(
'"timestamp without time zone"'
);
// Reimport the DBML
const reimportedDiagram = await importDBMLToDiagram(
exportResult.inlineDbml,
{
databaseType: DatabaseType.POSTGRESQL,
}
);
// Verify the types are preserved
const table = reimportedDiagram.tables?.find(
(t) => t.name === 'events'
);
expect(table).toBeDefined();
const createdAtField = table?.fields.find(
(f) => f.name === 'created_at'
);
const updatedAtField = table?.fields.find(
(f) => f.name === 'updated_at'
);
expect(createdAtField?.type.name).toBe('timestamp with time zone');
expect(updatedAtField?.type.name).toBe('timestamp without time zone');
});
it('should handle time with time zone types', async () => {
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'schedules',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'start_time',
type: {
id: 'time_with_time_zone',
name: 'time with time zone',
},
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'end_time',
type: {
id: 'time_without_time_zone',
name: 'time without time zone',
},
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const exportResult = generateDBMLFromDiagram(diagram);
expect(exportResult.inlineDbml).toContain('"time with time zone"');
expect(exportResult.inlineDbml).toContain('"time without time zone"');
const reimportedDiagram = await importDBMLToDiagram(
exportResult.inlineDbml,
{
databaseType: DatabaseType.POSTGRESQL,
}
);
const table = reimportedDiagram.tables?.find(
(t) => t.name === 'schedules'
);
const startTimeField = table?.fields.find(
(f) => f.name === 'start_time'
);
const endTimeField = table?.fields.find((f) => f.name === 'end_time');
expect(startTimeField?.type.name).toBe('time with time zone');
expect(endTimeField?.type.name).toBe('time without time zone');
});
it('should handle double precision type', async () => {
const diagram: Diagram = {
id: generateDiagramId(),
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [
{
id: generateId(),
name: 'measurements',
schema: 'public',
x: 0,
y: 0,
fields: [
{
id: generateId(),
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: true,
unique: true,
nullable: false,
createdAt: Date.now(),
},
{
id: generateId(),
name: 'value',
type: {
id: 'double_precision',
name: 'double precision',
},
primaryKey: false,
unique: false,
nullable: true,
createdAt: Date.now(),
},
],
indexes: [],
color: '#8eb7ff',
isView: false,
createdAt: Date.now(),
},
],
relationships: [],
createdAt: new Date(),
updatedAt: new Date(),
};
const exportResult = generateDBMLFromDiagram(diagram);
expect(exportResult.inlineDbml).toContain('"double precision"');
const reimportedDiagram = await importDBMLToDiagram(
exportResult.inlineDbml,
{
databaseType: DatabaseType.POSTGRESQL,
}
);
const table = reimportedDiagram.tables?.find(
(t) => t.name === 'measurements'
);
const valueField = table?.fields.find((f) => f.name === 'value');
expect(valueField?.type.name).toBe('double precision');
});
});
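For reference, the quoting behavior these tests pin down — wrapping multi-word PostgreSQL type names in double quotes — produces inline DBML roughly of this shape (an illustrative sketch, not captured tool output):

```
Table "public"."events" {
  "id" integer [pk, not null]
  "created_at" "timestamp with time zone"
  "updated_at" "timestamp without time zone"
}
```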

View File

@@ -3,6 +3,7 @@ import { exportBaseSQL } from '@/lib/data/sql-export/export-sql-script';
import type { Diagram } from '@/lib/domain/diagram';
import { DatabaseType } from '@/lib/domain/database-type';
import type { DBTable } from '@/lib/domain/db-table';
import { type DBField } from '@/lib/domain/db-field';
import type { DBCustomType } from '@/lib/domain/db-custom-type';
import { DBCustomTypeKind } from '@/lib/domain/db-custom-type';
@@ -501,35 +502,38 @@ const convertToInlineRefs = (dbml: string): string => {
return cleanedDbml;
};
// Function to check for DBML reserved keywords
const isDBMLKeyword = (name: string): boolean => {
const keywords = new Set([
'YES',
'NO',
'TRUE',
'FALSE',
'NULL', // DBML reserved literals (boolean values and NULL)
]);
return keywords.has(name.toUpperCase());
};
// Function to check for SQL keywords (add more if needed)
const isSQLKeyword = (name: string): boolean => {
const keywords = new Set(['CASE', 'ORDER', 'GROUP', 'FROM', 'TO', 'USER']); // Common SQL keywords
return keywords.has(name.toUpperCase());
};
// Function to remove duplicate relationships from the diagram
const deduplicateRelationships = (diagram: Diagram): Diagram => {
if (!diagram.relationships) return diagram;
const seenRelationships = new Set<string>();
const seenBidirectional = new Set<string>();
const uniqueRelationships = diagram.relationships.filter((rel) => {
// Create a unique key based on the relationship endpoints
const relationshipKey = `${rel.sourceTableId}-${rel.sourceFieldId}->${rel.targetTableId}-${rel.targetFieldId}`;
// Create a normalized key that's the same for both directions
const normalizedKey = [
`${rel.sourceTableId}-${rel.sourceFieldId}`,
`${rel.targetTableId}-${rel.targetFieldId}`,
]
.sort()
.join('<->');
if (seenRelationships.has(relationshipKey)) {
return false; // Skip exact duplicate
}
if (seenBidirectional.has(normalizedKey)) {
// This is a bidirectional relationship, skip the second one
return false;
return false; // Skip duplicate
}
seenRelationships.add(relationshipKey);
seenBidirectional.add(normalizedKey);
return true; // Keep unique relationship
});
@@ -539,6 +543,48 @@ const deduplicateRelationships = (diagram: Diagram): Diagram => {
};
};
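The direction-agnostic key used by `deduplicateRelationships` above can be sketched in isolation: sorting the two endpoint strings before joining makes A→B and B→A collapse to the same key. This is a minimal standalone sketch with a hypothetical `Rel` shape, not the project's actual helper:

```typescript
// Hypothetical minimal relationship shape for the sketch.
interface Rel {
    sourceTableId: string;
    sourceFieldId: string;
    targetTableId: string;
    targetFieldId: string;
}

// Same key regardless of direction: endpoints are sorted before joining.
function bidirectionalKey(rel: Rel): string {
    return [
        `${rel.sourceTableId}-${rel.sourceFieldId}`,
        `${rel.targetTableId}-${rel.targetFieldId}`,
    ]
        .sort()
        .join('<->');
}

// Keep only the first relationship seen for each bidirectional key.
function dedupe(rels: Rel[]): Rel[] {
    const seen = new Set<string>();
    return rels.filter((rel) => {
        const key = bidirectionalKey(rel);
        if (seen.has(key)) return false;
        seen.add(key);
        return true;
    });
}

const ab: Rel = { sourceTableId: 'a', sourceFieldId: 'f1', targetTableId: 'b', targetFieldId: 'f2' };
const ba: Rel = { sourceTableId: 'b', sourceFieldId: 'f2', targetTableId: 'a', targetFieldId: 'f1' };
console.log(dedupe([ab, ba]).length); // 1
```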
// Function to append comment statements for renamed tables and fields
const appendRenameComments = (
baseScript: string,
sqlRenamedTables: Map<string, string>,
fieldRenames: Array<{
table: string;
originalName: string;
newName: string;
}>,
finalDiagramForExport: Diagram
): string => {
let script = baseScript;
// Append COMMENTS for tables renamed due to SQL keywords
sqlRenamedTables.forEach((originalName, newName) => {
const escapedOriginal = originalName.replace(/'/g, "\\'");
// Find the table to get its schema
const table = finalDiagramForExport.tables?.find(
(t) => t.name === newName
);
const tableIdentifier = table?.schema
? `"${table.schema}"."${newName}"`
: `"${newName}"`;
script += `\nCOMMENT ON TABLE ${tableIdentifier} IS 'Original name was "${escapedOriginal}" (renamed due to SQL keyword conflict).';`;
});
// Append COMMENTS for fields renamed due to SQL keyword conflicts
fieldRenames.forEach(({ table, originalName, newName }) => {
const escapedOriginal = originalName.replace(/'/g, "\\'");
// Find the table to get its schema
const tableObj = finalDiagramForExport.tables?.find(
(t) => t.name === table
);
const tableIdentifier = tableObj?.schema
? `"${tableObj.schema}"."${table}"`
: `"${table}"`;
script += `\nCOMMENT ON COLUMN ${tableIdentifier}."${newName}" IS 'Original name was "${escapedOriginal}" (renamed due to SQL keyword conflict).';`;
});
return script;
};
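For a single renamed table, the statement `appendRenameComments` emits takes roughly this shape. The sketch below is a hypothetical standalone helper that reproduces only the string format, under the assumption of a schema-qualified PostgreSQL identifier:

```typescript
// Sketch: build the COMMENT ON TABLE statement for one renamed table.
function renameComment(
    schema: string | undefined,
    newName: string,
    originalName: string
): string {
    // Escape single quotes so the name survives inside the SQL string literal.
    const escapedOriginal = originalName.replace(/'/g, "\\'");
    const tableIdentifier = schema ? `"${schema}"."${newName}"` : `"${newName}"`;
    return `COMMENT ON TABLE ${tableIdentifier} IS 'Original name was "${escapedOriginal}" (renamed due to SQL keyword conflict).';`;
}

console.log(renameComment('public', 'user_table', 'user'));
```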
// Fix DBML formatting to ensure consistent display of char and varchar types
const normalizeCharTypeFormat = (dbml: string): string => {
// Replace "char (N)" with "char(N)" to match varchar's formatting
@@ -583,54 +629,6 @@ const fixMultilineTableNames = (dbml: string): string => {
);
};
// Restore increment attribute for auto-incrementing fields
const restoreIncrementAttribute = (dbml: string, tables: DBTable[]): string => {
if (!tables || tables.length === 0) return dbml;
let result = dbml;
tables.forEach((table) => {
// Find fields with increment=true
const incrementFields = table.fields.filter((f) => f.increment);
incrementFields.forEach((field) => {
// Build the table identifier pattern
const tableIdentifier = table.schema
? `"${table.schema.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}"\\."${table.name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}"`
: `"${table.name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}"`;
// Escape field name for regex
const escapedFieldName = field.name.replace(
/[.*+?^${}()|[\]\\]/g,
'\\$&'
);
// Pattern to match the field line with existing attributes in brackets
// Matches: "field_name" type [existing, attributes]
const fieldPattern = new RegExp(
`(Table ${tableIdentifier} \\{[^}]*?^\\s*"${escapedFieldName}"[^\\[\\n]+)(\\[[^\\]]*\\])`,
'gms'
);
result = result.replace(
fieldPattern,
(match, fieldPart, brackets) => {
// Check if increment already exists
if (brackets.includes('increment')) {
return match;
}
// Add increment to the attributes
const newBrackets = brackets.replace(']', ', increment]');
return fieldPart + newBrackets;
}
);
});
});
return result;
};
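The `replace(/[.*+?^${}()|[\]\\]/g, '\\$&')` calls above all apply the standard regex-escaping idiom, so a literal table or field name can be embedded safely in a dynamically built `RegExp`. Isolated as a sketch (hypothetical helper name):

```typescript
// Escape regex metacharacters so a literal identifier can be embedded in a RegExp.
function escapeRegExp(value: string): string {
    return value.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// Without escaping, the parentheses and "$" would be treated as regex syntax.
const pattern = new RegExp(`"${escapeRegExp('price($)')}"`);
console.log(pattern.test('"price($)" decimal [not null]')); // true
```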
// Restore composite primary key names in the DBML
const restoreCompositePKNames = (dbml: string, tables: DBTable[]): string => {
if (!tables || tables.length === 0) return dbml;
@@ -780,17 +778,9 @@ const restoreTableSchemas = (dbml: string, tables: DBTable[]): string => {
return result;
};
// Function to extract only Ref statements from DBML
const extractRelationshipsDbml = (dbml: string): string => {
const lines = dbml.split('\n');
const refLines = lines.filter((line) => line.trim().startsWith('Ref '));
return refLines.join('\n').trim();
};
export interface DBMLExportResult {
standardDbml: string;
inlineDbml: string;
relationshipsDbml: string;
error?: string;
}
@@ -807,37 +797,31 @@ export function generateDBMLFromDiagram(diagram: Diagram): DBMLExportResult {
};
}) ?? [];
// Filter out empty tables and duplicates in a single pass for performance
// Remove duplicate tables (consider both schema and table name)
const seenTableIdentifiers = new Set<string>();
const tablesWithFields = sanitizedTables.filter((table) => {
// Skip tables with no fields (empty tables cause DBML export to fail)
if (table.fields.length === 0) {
return false;
}
const uniqueTables = sanitizedTables.filter((table) => {
// Create a unique identifier combining schema and table name
const tableIdentifier = table.schema
? `${table.schema}.${table.name}`
: table.name;
// Skip duplicate tables
if (seenTableIdentifiers.has(tableIdentifier)) {
return false;
return false; // Skip duplicate
}
seenTableIdentifiers.add(tableIdentifier);
return true; // Keep unique, non-empty table
return true; // Keep unique table
});
// Create the base filtered diagram structure
const filteredDiagram: Diagram = {
...diagram,
tables: tablesWithFields,
tables: uniqueTables,
relationships:
diagram.relationships?.filter((rel) => {
const sourceTable = tablesWithFields.find(
const sourceTable = uniqueTables.find(
(t) => t.id === rel.sourceTableId
);
const targetTable = tablesWithFields.find(
const targetTable = uniqueTables.find(
(t) => t.id === rel.targetTableId
);
const sourceFieldExists = sourceTable?.fields.some(
@@ -859,33 +843,105 @@ export function generateDBMLFromDiagram(diagram: Diagram): DBMLExportResult {
// Sanitize field names ('from'/'to' in 'relation' table)
const cleanDiagram = fixProblematicFieldNames(filteredDiagram);
// Simplified processing - just handle duplicate field names
// --- Final sanitization and renaming pass ---
// Only rename keywords for PostgreSQL/SQLite
// For other databases, we'll wrap problematic names in quotes instead
const shouldRenameKeywords =
diagram.databaseType === DatabaseType.POSTGRESQL ||
diagram.databaseType === DatabaseType.SQLITE;
const sqlRenamedTables = new Map<string, string>();
const fieldRenames: Array<{
table: string;
originalName: string;
newName: string;
}> = [];
const processTable = (table: DBTable) => {
const originalName = table.name;
let safeTableName = originalName;
// If name contains spaces or special characters, wrap in quotes
if (/[^\w]/.test(originalName)) {
safeTableName = `"${originalName.replace(/"/g, '\\"')}"`;
}
// Rename table if it's a keyword (PostgreSQL/SQLite only)
if (
shouldRenameKeywords &&
(isDBMLKeyword(originalName) || isSQLKeyword(originalName))
) {
const newName = `${originalName}_table`;
sqlRenamedTables.set(newName, originalName);
safeTableName = /[^\w]/.test(newName)
? `"${newName.replace(/"/g, '\\"')}"`
: newName;
}
// For other databases, just quote DBML keywords
else if (!shouldRenameKeywords && isDBMLKeyword(originalName)) {
safeTableName = `"${originalName.replace(/"/g, '\\"')}"`;
}
const fieldNameCounts = new Map<string, number>();
const processedFields = table.fields.map((field) => {
let finalSafeName = field.name;
// If field name contains spaces or special characters, wrap in quotes
if (/[^\w]/.test(field.name)) {
finalSafeName = `"${field.name.replace(/"/g, '\\"')}"`;
}
// Handle duplicate field names
const count = fieldNameCounts.get(field.name) || 0;
if (count > 0) {
const newName = `${field.name}_${count + 1}`;
return {
...field,
name: newName,
};
finalSafeName = /[^\w]/.test(newName)
? `"${newName.replace(/"/g, '\\"')}"`
: newName;
}
fieldNameCounts.set(field.name, count + 1);
return field;
// Create sanitized field
const sanitizedField: DBField = {
...field,
name: finalSafeName,
};
// Rename field if it's a keyword (PostgreSQL/SQLite only)
if (
shouldRenameKeywords &&
(isDBMLKeyword(field.name) || isSQLKeyword(field.name))
) {
const newFieldName = `${field.name}_field`;
fieldRenames.push({
table: safeTableName,
originalName: field.name,
newName: newFieldName,
});
sanitizedField.name = /[^\w]/.test(newFieldName)
? `"${newFieldName.replace(/"/g, '\\"')}"`
: newFieldName;
}
// For other databases, just quote DBML keywords
else if (!shouldRenameKeywords && isDBMLKeyword(field.name)) {
sanitizedField.name = `"${field.name.replace(/"/g, '\\"')}"`;
}
return sanitizedField;
});
return {
...table,
name: safeTableName,
fields: processedFields,
indexes: (table.indexes || [])
.filter((index) => !index.isPrimaryKey) // Filter out PK indexes as they're handled separately
.map((index) => ({
...index,
name:
index.name ||
`idx_${Math.random().toString(36).substring(2, 8)}`,
name: index.name
? /[^\w]/.test(index.name)
? `"${index.name.replace(/"/g, '\\"')}"`
: index.name
: `idx_${Math.random().toString(36).substring(2, 8)}`,
})),
};
};
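The duplicate-field handling inside `processTable` above counts occurrences per original name and suffixes repeats starting at `_2`. The same idea, reduced to a standalone sketch over plain strings (hypothetical helper, ignoring the quoting logic):

```typescript
// Suffix repeated names: first occurrence keeps its name, later ones get _2, _3, ...
function dedupeFieldNames(names: string[]): string[] {
    const counts = new Map<string, number>();
    return names.map((name) => {
        const count = counts.get(name) ?? 0;
        counts.set(name, count + 1);
        return count > 0 ? `${name}_${count + 1}` : name;
    });
}

console.log(dedupeFieldNames(['id', 'name', 'name', 'name']).join(','));
// id,name,name_2,name_3
```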
@@ -923,6 +979,19 @@ export function generateDBMLFromDiagram(diagram: Diagram): DBMLExportResult {
baseScript = sanitizeSQLforDBML(baseScript);
// Append comments for renamed tables and fields (PostgreSQL/SQLite only)
if (
shouldRenameKeywords &&
(sqlRenamedTables.size > 0 || fieldRenames.length > 0)
) {
baseScript = appendRenameComments(
baseScript,
sqlRenamedTables,
fieldRenames,
finalDiagramForExport
);
}
standard = fixArrayTypes(
normalizeCharTypeFormat(
fixMultilineTableNames(
@@ -937,13 +1006,10 @@ export function generateDBMLFromDiagram(diagram: Diagram): DBMLExportResult {
);
// Restore schema information that may have been stripped by DBML importer
standard = restoreTableSchemas(standard, tablesWithFields);
standard = restoreTableSchemas(standard, uniqueTables);
// Restore composite primary key names
standard = restoreCompositePKNames(standard, tablesWithFields);
// Restore increment attribute for auto-incrementing fields
standard = restoreIncrementAttribute(standard, tablesWithFields);
standard = restoreCompositePKNames(standard, uniqueTables);
// Prepend Enum DBML to the standard output
if (enumsDBML) {
@@ -988,13 +1054,5 @@ export function generateDBMLFromDiagram(diagram: Diagram): DBMLExportResult {
}
}
// Extract relationships DBML from standard output
const relationshipsDbml = extractRelationshipsDbml(standard);
return {
standardDbml: standard,
inlineDbml: inline,
relationshipsDbml,
error: errorMsg,
};
return { standardDbml: standard, inlineDbml: inline, error: errorMsg };
}

View File

@@ -1,3 +0,0 @@
Table "public"."table_3"{
"id" bigint [pk]
}

View File

@@ -1 +0,0 @@
{"id":"mqqwkkodxt6p","name":"Diagram 3","createdAt":"2025-09-16T15:33:25.300Z","updatedAt":"2025-09-16T15:33:31.563Z","databaseType":"postgresql","tables":[{"id":"loyxg6mafzos5u971uirjs3zh","name":"table_3","schema":"","order":0,"fields":[{"id":"29e2p9bom0uxo1n0a9ze5auuy","name":"id","type":{"name":"bigint","id":"bigint","usageLevel":2},"nullable":true,"primaryKey":true,"unique":true,"createdAt":1758036805300}],"indexes":[{"id":"5gf0aeptch1uk1bxv0x89wxxe","name":"pk_table_3_id","fieldIds":["29e2p9bom0uxo1n0a9ze5auuy"],"unique":true,"isPrimaryKey":true,"createdAt":1758036811564}],"x":0,"y":0,"color":"#8eb7ff","isView":false,"createdAt":1758036805300,"diagramId":"mqqwkkodxt6p"}],"relationships":[],"dependencies":[],"areas":[],"customTypes":[]}

View File

@@ -1,7 +0,0 @@
Table "table_3" {
"id" bigint [pk]
}
Table "table_2" {
"id" bigint [pk, not null, ref: < "table_3"."id"]
}

View File

@@ -1 +0,0 @@
{"id":"mqqwkkod6r09","name":"Diagram 10","createdAt":"2025-09-16T15:47:40.655Z","updatedAt":"2025-09-16T15:47:50.179Z","databaseType":"postgresql","tables":[{"id":"6xbco4ihmuiyv2heuw9fggbgx","name":"table_3","schema":"","order":0,"fields":[{"id":"rxftaey7uxvq5qg6ix1hbak1c","name":"id","type":{"name":"bigint","id":"bigint","usageLevel":2},"nullable":true,"primaryKey":true,"unique":true,"createdAt":1758037660654}],"indexes":[{"id":"vsyjjaq2l58urkh9qm2g9hqhd","name":"pk_table_3_id","fieldIds":["rxftaey7uxvq5qg6ix1hbak1c"],"unique":true,"isPrimaryKey":true,"createdAt":1758037660654}],"x":0,"y":0,"color":"#8eb7ff","isView":false,"createdAt":1758037660654,"diagramId":"mqqwkkod6r09"},{"id":"klu6k5ntddcxfdsu0fsfcwbiw","name":"table_2","schema":"","order":1,"fields":[{"id":"qq2415tivmtvun8vd727d9mr2","name":"id","type":{"name":"bigint","id":"bigint","usageLevel":2},"nullable":false,"primaryKey":true,"unique":true,"createdAt":1758037660655}],"indexes":[{"id":"cvv7sgmq07i9y54lz9a97nah5","name":"pk_table_2_id","fieldIds":["qq2415tivmtvun8vd727d9mr2"],"unique":true,"isPrimaryKey":true,"createdAt":1758037660655}],"x":300,"y":0,"color":"#8eb7ff","isView":false,"createdAt":1758037660655,"diagramId":"mqqwkkod6r09"}],"relationships":[{"id":"yw2pbcumsabuncc6rjnp3n87t","name":"table_3_id_table_2_id","sourceSchema":"","targetSchema":"","sourceTableId":"6xbco4ihmuiyv2heuw9fggbgx","targetTableId":"klu6k5ntddcxfdsu0fsfcwbiw","sourceFieldId":"rxftaey7uxvq5qg6ix1hbak1c","targetFieldId":"qq2415tivmtvun8vd727d9mr2","sourceCardinality":"one","targetCardinality":"one","createdAt":1758037660655,"diagramId":"mqqwkkod6r09"}],"dependencies":[],"areas":[],"customTypes":[]}

View File

@@ -1,317 +0,0 @@
import { describe, it, expect } from 'vitest';
import { importDBMLToDiagram } from '../dbml-import';
import { generateDBMLFromDiagram } from '../../dbml-export/dbml-export';
import { DatabaseType } from '@/lib/domain/database-type';
describe('DBML Array Fields - Fantasy RPG Theme', () => {
describe('Import - Spell and Magic Arrays', () => {
it('should import spell components as array fields', async () => {
const dbml = `
Table "magic"."spells" {
"id" uuid [pk, not null]
"name" varchar(200) [not null]
"level" integer [not null]
"components" text[] [note: 'Magical components: bat wing, dragon scale, phoenix feather']
"elemental_types" varchar(50)[] [note: 'Elements: fire, water, earth, air']
"mana_cost" integer [not null]
"created_at" timestamp [not null]
Indexes {
(name, level) [unique, name: "unique_spell"]
}
}
`;
const result = await importDBMLToDiagram(dbml, {
databaseType: DatabaseType.POSTGRESQL,
});
expect(result.tables).toHaveLength(1);
const table = result.tables![0];
expect(table.name).toBe('spells');
expect(table.schema).toBe('magic');
// Find the array fields
const components = table.fields.find(
(f) => f.name === 'components'
);
const elementalTypes = table.fields.find(
(f) => f.name === 'elemental_types'
);
// Verify they are marked as arrays
expect(components).toBeDefined();
expect(components?.isArray).toBe(true);
expect(components?.type.name).toBe('text');
expect(elementalTypes).toBeDefined();
expect(elementalTypes?.isArray).toBe(true);
expect(elementalTypes?.type.name).toBe('varchar');
expect(elementalTypes?.characterMaximumLength).toBe('50');
// Verify non-array fields don't have isArray set
const idField = table.fields.find((f) => f.name === 'id');
expect(idField?.isArray).toBeUndefined();
});
it('should import hero inventory with various array types', async () => {
const dbml = `
Table "heroes" {
"id" bigint [pk]
"name" varchar(100) [not null]
"abilities" varchar(100)[]
"inventory_slots" integer[]
"skill_levels" decimal(5, 2)[]
"quest_log" text[]
}
`;
const result = await importDBMLToDiagram(dbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const table = result.tables![0];
const abilities = table.fields.find((f) => f.name === 'abilities');
expect(abilities?.isArray).toBe(true);
expect(abilities?.type.name).toBe('varchar');
expect(abilities?.characterMaximumLength).toBe('100');
const inventorySlots = table.fields.find(
(f) => f.name === 'inventory_slots'
);
expect(inventorySlots?.isArray).toBe(true);
expect(inventorySlots?.type.name).toBe('integer');
const skillLevels = table.fields.find(
(f) => f.name === 'skill_levels'
);
expect(skillLevels?.isArray).toBe(true);
expect(skillLevels?.type.name).toBe('decimal');
expect(skillLevels?.precision).toBe(5);
expect(skillLevels?.scale).toBe(2);
const questLog = table.fields.find((f) => f.name === 'quest_log');
expect(questLog?.isArray).toBe(true);
expect(questLog?.type.name).toBe('text');
});
it('should handle mixed array and non-array fields in creature table', async () => {
const dbml = `
Table "bestiary"."creatures" {
"id" uuid [pk]
"species_name" varchar(100) [not null]
"habitats" varchar(50)[]
"danger_level" integer [not null]
"resistances" varchar(50)[]
"is_tameable" boolean [not null]
}
`;
const result = await importDBMLToDiagram(dbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const table = result.tables![0];
// Non-array fields
const id = table.fields.find((f) => f.name === 'id');
expect(id?.isArray).toBeUndefined();
const speciesName = table.fields.find(
(f) => f.name === 'species_name'
);
expect(speciesName?.isArray).toBeUndefined();
const dangerLevel = table.fields.find(
(f) => f.name === 'danger_level'
);
expect(dangerLevel?.isArray).toBeUndefined();
// Array fields
const habitats = table.fields.find((f) => f.name === 'habitats');
expect(habitats?.isArray).toBe(true);
const resistances = table.fields.find(
(f) => f.name === 'resistances'
);
expect(resistances?.isArray).toBe(true);
});
});
describe('Round-trip - Quest and Adventure Arrays', () => {
it('should preserve quest rewards array through export and re-import', async () => {
const originalDbml = `
Table "adventures"."quests" {
"id" uuid [pk, not null]
"title" varchar(200) [not null]
"difficulty" varchar(20) [not null]
"reward_items" text[] [note: 'Legendary sword, enchanted armor, healing potion']
"required_skills" varchar(100)[]
"experience_points" integer [not null]
"gold_reward" decimal(10, 2) [not null]
"created_at" timestamp [not null]
Indexes {
(title, difficulty) [unique, name: "unique_quest"]
}
}
`;
// Import the DBML
const diagram = await importDBMLToDiagram(originalDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
// Verify array fields were imported correctly
const table = diagram.tables![0];
const rewardItems = table.fields.find(
(f) => f.name === 'reward_items'
);
const requiredSkills = table.fields.find(
(f) => f.name === 'required_skills'
);
expect(rewardItems?.isArray).toBe(true);
expect(requiredSkills?.isArray).toBe(true);
// Export back to DBML
const { standardDbml: exportedDbml } =
generateDBMLFromDiagram(diagram);
// Verify the exported DBML contains array syntax
expect(exportedDbml).toContain('text[]');
expect(exportedDbml).toContain('"reward_items" text[]');
expect(exportedDbml).toContain('"required_skills" varchar(100)[]');
// Re-import the exported DBML
const reimportedDiagram = await importDBMLToDiagram(exportedDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
// Verify array fields are still marked as arrays
const reimportedTable = reimportedDiagram.tables![0];
const reimportedRewards = reimportedTable.fields.find(
(f) => f.name === 'reward_items'
);
const reimportedSkills = reimportedTable.fields.find(
(f) => f.name === 'required_skills'
);
expect(reimportedRewards?.isArray).toBe(true);
expect(reimportedSkills?.isArray).toBe(true);
});
it('should handle guild members with different array types in round-trip', async () => {
const originalDbml = `
Table "guilds"."members" {
"id" uuid [pk]
"name" varchar(100) [not null]
"class_specializations" varchar(50)[]
"completed_quest_ids" integer[]
"skill_ratings" decimal(3, 1)[]
"titles_earned" text[]
}
`;
// Import
const diagram = await importDBMLToDiagram(originalDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
// Export
const { standardDbml: exportedDbml } =
generateDBMLFromDiagram(diagram);
// Verify exported DBML has correct array syntax with types
expect(exportedDbml).toContain('varchar(50)[]');
expect(exportedDbml).toContain('integer[]');
expect(exportedDbml).toContain('decimal(3,1)[]');
expect(exportedDbml).toContain('text[]');
// Re-import
const reimportedDiagram = await importDBMLToDiagram(exportedDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const table = reimportedDiagram.tables![0];
const classSpecs = table.fields.find(
(f) => f.name === 'class_specializations'
);
expect(classSpecs?.isArray).toBe(true);
expect(classSpecs?.characterMaximumLength).toBe('50');
const questIds = table.fields.find(
(f) => f.name === 'completed_quest_ids'
);
expect(questIds?.isArray).toBe(true);
const skillRatings = table.fields.find(
(f) => f.name === 'skill_ratings'
);
expect(skillRatings?.isArray).toBe(true);
expect(skillRatings?.precision).toBe(3);
expect(skillRatings?.scale).toBe(1);
const titles = table.fields.find((f) => f.name === 'titles_earned');
expect(titles?.isArray).toBe(true);
});
it('should preserve dungeon loot tables with mixed array and non-array fields', async () => {
const originalDbml = `
Table "dungeons"."loot_tables" {
"id" bigint [pk]
"dungeon_name" varchar(150) [not null]
"boss_name" varchar(100)
"common_drops" text[]
"rare_drops" text[]
"legendary_drops" text[]
"gold_range_min" integer [not null]
"gold_range_max" integer [not null]
"drop_rates" decimal(5, 2)[]
}
`;
// Import, export, and re-import
const diagram = await importDBMLToDiagram(originalDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const { standardDbml: exportedDbml } =
generateDBMLFromDiagram(diagram);
const reimportedDiagram = await importDBMLToDiagram(exportedDbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const table = reimportedDiagram.tables![0];
// Verify non-array fields
expect(
table.fields.find((f) => f.name === 'id')?.isArray
).toBeUndefined();
expect(
table.fields.find((f) => f.name === 'dungeon_name')?.isArray
).toBeUndefined();
expect(
table.fields.find((f) => f.name === 'gold_range_min')?.isArray
).toBeUndefined();
// Verify array fields
expect(
table.fields.find((f) => f.name === 'common_drops')?.isArray
).toBe(true);
expect(
table.fields.find((f) => f.name === 'rare_drops')?.isArray
).toBe(true);
expect(
table.fields.find((f) => f.name === 'legendary_drops')?.isArray
).toBe(true);
expect(
table.fields.find((f) => f.name === 'drop_rates')?.isArray
).toBe(true);
});
});
});

View File

@@ -1,426 +0,0 @@
import { describe, it, expect } from 'vitest';
import { importDBMLToDiagram } from '../dbml-import';
import * as fs from 'fs';
import * as path from 'path';
import { DatabaseType } from '@/lib/domain/database-type';
import type { DBTable } from '@/lib/domain/db-table';
import type { DBField } from '@/lib/domain/db-field';
import type { DBRelationship } from '@/lib/domain/db-relationship';
import { defaultSchemas } from '@/lib/data/default-schemas';
// Type for field map entries
interface FieldMapEntry {
tableName: string;
fieldName: string;
}
// Helper function to compare field properties (excluding IDs and timestamps)
function expectFieldsMatch(
actualFields: DBField[],
expectedFields: DBField[]
): void {
expect(actualFields).toHaveLength(expectedFields.length);
for (let i = 0; i < actualFields.length; i++) {
const actual = actualFields[i];
const expected = expectedFields[i];
// Compare field properties (excluding ID and createdAt)
expect(actual.name).toBe(expected.name);
// Handle type comparison (could be string or object with name property)
if (typeof expected.type === 'object' && expected.type?.name) {
expect(actual.type?.name).toBe(expected.type.name);
} else if (typeof expected.type === 'string') {
expect(actual.type?.name).toBe(expected.type);
}
// Boolean flags with defaults
expect(actual.primaryKey).toBe(expected.primaryKey || false);
expect(actual.unique).toBe(expected.unique || false);
expect(actual.nullable).toBe(expected.nullable ?? true);
// Optional boolean flag
if (expected.increment !== undefined) {
expect(actual.increment).toBe(expected.increment);
}
// Optional string/number properties
if (expected.characterMaximumLength !== undefined) {
expect(actual.characterMaximumLength).toBe(
expected.characterMaximumLength
);
}
if (expected.precision !== undefined) {
expect(actual.precision).toBe(expected.precision);
}
if (expected.scale !== undefined) {
expect(actual.scale).toBe(expected.scale);
}
if (expected.default !== undefined) {
expect(actual.default).toBe(expected.default);
}
if (expected.collation !== undefined) {
expect(actual.collation).toBe(expected.collation);
}
if (expected.comments !== undefined) {
expect(actual.comments).toBe(expected.comments);
}
}
}
// Helper function to compare table properties (excluding IDs)
function expectTablesMatch(
actualTables: DBTable[],
expectedTables: DBTable[],
databaseType: DatabaseType
): void {
expect(actualTables).toHaveLength(expectedTables.length);
// Sort tables by name for consistent comparison
const sortedActual = [...actualTables].sort((a, b) =>
a.name.localeCompare(b.name)
);
const sortedExpected = [...expectedTables].sort((a, b) =>
a.name.localeCompare(b.name)
);
for (let i = 0; i < sortedActual.length; i++) {
const actual = sortedActual[i];
const expected = sortedExpected[i];
// Compare table properties (excluding ID and position)
expect(actual.name).toBe(expected.name);
// Schema comparison - handle differences in how schemas are represented
if (expected.schema) {
const defaultSchema = defaultSchemas[databaseType];
if (defaultSchema && expected.schema === defaultSchema) {
// DBML parser might not include default schema or might handle it differently
expect(
actual.schema === expected.schema ||
actual.schema === '' ||
actual.schema === undefined
).toBeTruthy();
} else {
expect(actual.schema).toBe(expected.schema);
}
}
// Compare fields
expectFieldsMatch(actual.fields, expected.fields);
// Check indexes exist for tables with primary keys
const hasPrimaryKeyField = actual.fields.some((f) => f.primaryKey);
if (hasPrimaryKeyField) {
expect(actual.indexes).toBeDefined();
expect(actual.indexes.length).toBeGreaterThan(0);
const pkIndex = actual.indexes.find((idx) => idx.isPrimaryKey);
expect(pkIndex).toBeDefined();
expect(pkIndex?.unique).toBe(true);
}
// Check comments if present
if (expected.comments !== undefined) {
expect(actual.comments).toBe(expected.comments);
}
}
}
// Helper function to compare relationships (excluding IDs)
function expectRelationshipsMatch(
actualRelationships: DBRelationship[],
expectedRelationships: DBRelationship[],
actualTables: DBTable[],
expectedTables: DBTable[]
): void {
expect(actualRelationships).toHaveLength(expectedRelationships.length);
// Create lookup maps for table and field names by ID
const expectedTableMap = new Map(expectedTables.map((t) => [t.id, t.name]));
const actualTableMap = new Map(actualTables.map((t) => [t.id, t.name]));
const expectedFieldMap = new Map<string, FieldMapEntry>();
const actualFieldMap = new Map<string, FieldMapEntry>();
expectedTables.forEach((table) => {
table.fields.forEach((field) => {
expectedFieldMap.set(field.id, {
tableName: table.name,
fieldName: field.name,
});
});
});
actualTables.forEach((table) => {
table.fields.forEach((field) => {
actualFieldMap.set(field.id, {
tableName: table.name,
fieldName: field.name,
});
});
});
// Sort relationships for consistent comparison
const sortRelationships = (
rels: DBRelationship[],
tableMap: Map<string, string>,
fieldMap: Map<string, FieldMapEntry>
) => {
return [...rels].sort((a, b) => {
const aSourceTable = tableMap.get(a.sourceTableId) || '';
const bSourceTable = tableMap.get(b.sourceTableId) || '';
const aTargetTable = tableMap.get(a.targetTableId) || '';
const bTargetTable = tableMap.get(b.targetTableId) || '';
const tableCompare =
aSourceTable.localeCompare(bSourceTable) ||
aTargetTable.localeCompare(bTargetTable);
if (tableCompare !== 0) return tableCompare;
const aSourceField = fieldMap.get(a.sourceFieldId)?.fieldName || '';
const bSourceField = fieldMap.get(b.sourceFieldId)?.fieldName || '';
const aTargetField = fieldMap.get(a.targetFieldId)?.fieldName || '';
const bTargetField = fieldMap.get(b.targetFieldId)?.fieldName || '';
return (
aSourceField.localeCompare(bSourceField) ||
aTargetField.localeCompare(bTargetField)
);
});
};
const sortedActual = sortRelationships(
actualRelationships,
actualTableMap,
actualFieldMap
);
const sortedExpected = sortRelationships(
expectedRelationships,
expectedTableMap,
expectedFieldMap
);
for (let i = 0; i < sortedActual.length; i++) {
const actual = sortedActual[i];
const expected = sortedExpected[i];
// Get table and field names for comparison
const actualSourceTable = actualTableMap.get(actual.sourceTableId);
const actualTargetTable = actualTableMap.get(actual.targetTableId);
const expectedSourceTable = expectedTableMap.get(
expected.sourceTableId
);
const expectedTargetTable = expectedTableMap.get(
expected.targetTableId
);
const actualSourceField = actualFieldMap.get(actual.sourceFieldId);
const actualTargetField = actualFieldMap.get(actual.targetFieldId);
const expectedSourceField = expectedFieldMap.get(
expected.sourceFieldId
);
const expectedTargetField = expectedFieldMap.get(
expected.targetFieldId
);
// Compare relationship by table and field names
expect(actualSourceTable).toBe(expectedSourceTable);
expect(actualTargetTable).toBe(expectedTargetTable);
expect(actualSourceField?.fieldName).toBe(
expectedSourceField?.fieldName
);
expect(actualTargetField?.fieldName).toBe(
expectedTargetField?.fieldName
);
// Compare cardinality
expect(actual.sourceCardinality).toBe(expected.sourceCardinality);
expect(actual.targetCardinality).toBe(expected.targetCardinality);
// Compare relationship name if present
if (expected.name !== undefined) {
expect(actual.name).toBe(expected.name);
}
}
}
// Main test helper function
async function testDBMLImportCase(caseNumber: string): Promise<void> {
// Read the DBML file
const dbmlPath = path.join(__dirname, 'cases', `${caseNumber}.dbml`);
const dbmlContent = fs.readFileSync(dbmlPath, 'utf-8');
// Read the expected JSON file
const jsonPath = path.join(__dirname, 'cases', `${caseNumber}.json`);
const jsonContent = fs.readFileSync(jsonPath, 'utf-8');
const expectedData = JSON.parse(jsonContent);
// Import DBML to diagram
const result = await importDBMLToDiagram(dbmlContent, {
databaseType: expectedData.databaseType || DatabaseType.POSTGRESQL,
});
// Check basic diagram properties
expect(result.name).toBe('DBML Import'); // Name is always 'DBML Import'
expect(result.databaseType).toBe(expectedData.databaseType);
// Check tables and fields
expectTablesMatch(
result.tables || [],
expectedData.tables || [],
expectedData.databaseType || DatabaseType.POSTGRESQL
);
// Check relationships
expectRelationshipsMatch(
result.relationships || [],
expectedData.relationships || [],
result.tables || [],
expectedData.tables || []
);
}
describe('DBML Import cases', () => {
it('should handle case 1 - simple table with pk and unique', async () => {
await testDBMLImportCase('1');
});
it('should handle case 2 - tables with relationships', async () => {
await testDBMLImportCase('2');
});
it('should handle table with default values', async () => {
const dbmlContent = `Table "public"."products" {
"id" bigint [pk, not null]
"name" varchar(255) [not null]
"price" decimal(10,2) [not null, default: 0]
"is_active" boolean [not null, default: true]
"status" varchar(50) [not null, default: "deprecated"]
"description" varchar(100) [default: \`complex "value" with quotes\`]
"created_at" timestamp [not null, default: "now()"]
Indexes {
(name) [name: "idx_products_name"]
}
}`;
const result = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.POSTGRESQL,
});
expect(result.tables).toHaveLength(1);
const table = result.tables![0];
expect(table.name).toBe('products');
expect(table.fields).toHaveLength(7);
// Check numeric default (0)
const priceField = table.fields.find((f) => f.name === 'price');
expect(priceField?.default).toBe('0');
// Check boolean default (true)
const isActiveField = table.fields.find((f) => f.name === 'is_active');
expect(isActiveField?.default).toBe('true');
// Check string default with all quotes removed
const statusField = table.fields.find((f) => f.name === 'status');
expect(statusField?.default).toBe('deprecated');
// Check backtick string - all quotes removed
const descField = table.fields.find((f) => f.name === 'description');
expect(descField?.default).toBe('complex value with quotes');
// Check function default with all quotes removed
const createdAtField = table.fields.find(
(f) => f.name === 'created_at'
);
expect(createdAtField?.default).toBe('now()');
});
it('should handle auto-increment fields correctly', async () => {
const dbmlContent = `Table "public"."table_1" {
"id" integer [pk, not null, increment]
"field_2" bigint [increment]
"field_3" serial [increment]
"field_4" varchar(100) [not null]
}`;
const result = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.POSTGRESQL,
});
expect(result.tables).toHaveLength(1);
const table = result.tables![0];
expect(table.name).toBe('table_1');
expect(table.fields).toHaveLength(4);
// field with [pk, not null, increment] - should be not null and increment
const idField = table.fields.find((f) => f.name === 'id');
expect(idField?.increment).toBe(true);
expect(idField?.nullable).toBe(false);
expect(idField?.primaryKey).toBe(true);
// field with [increment] only - should be not null and increment
// (auto-increment requires NOT NULL even if not explicitly stated)
const field2 = table.fields.find((f) => f.name === 'field_2');
expect(field2?.increment).toBe(true);
expect(field2?.nullable).toBe(false); // CRITICAL: must be false!
// SERIAL type with [increment] - should be not null and increment
const field3 = table.fields.find((f) => f.name === 'field_3');
expect(field3?.increment).toBe(true);
expect(field3?.nullable).toBe(false);
expect(field3?.type?.name).toBe('serial');
// Regular field with [not null] - should be not null, no increment
const field4 = table.fields.find((f) => f.name === 'field_4');
expect(field4?.increment).toBeUndefined();
expect(field4?.nullable).toBe(false);
});
it('should handle SERIAL types without increment attribute', async () => {
const dbmlContent = `Table "public"."test_table" {
"id" serial [pk]
"counter" bigserial
"small_counter" smallserial
"regular" integer
}`;
const result = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.POSTGRESQL,
});
expect(result.tables).toHaveLength(1);
const table = result.tables![0];
expect(table.fields).toHaveLength(4);
// SERIAL type without [increment] - should STILL be not null (type requires it)
const idField = table.fields.find((f) => f.name === 'id');
expect(idField?.type?.name).toBe('serial');
expect(idField?.nullable).toBe(false); // CRITICAL: Type requires NOT NULL
expect(idField?.primaryKey).toBe(true);
// BIGSERIAL without [increment] - should be not null
const counterField = table.fields.find((f) => f.name === 'counter');
expect(counterField?.type?.name).toBe('bigserial');
expect(counterField?.nullable).toBe(false); // CRITICAL: Type requires NOT NULL
// SMALLSERIAL without [increment] - should be not null
const smallCounterField = table.fields.find(
(f) => f.name === 'small_counter'
);
expect(smallCounterField?.type?.name).toBe('smallserial');
expect(smallCounterField?.nullable).toBe(false); // CRITICAL: Type requires NOT NULL
// Regular INTEGER - should be nullable by default
const regularField = table.fields.find((f) => f.name === 'regular');
expect(regularField?.type?.name).toBe('integer');
expect(regularField?.nullable).toBe(true); // No NOT NULL constraint
});
});


@@ -1,7 +1,6 @@
import { describe, it, expect } from 'vitest';
import { importDBMLToDiagram } from '../dbml-import';
import { DBCustomTypeKind } from '@/lib/domain/db-custom-type';
import { DatabaseType } from '@/lib/domain/database-type';
describe('DBML Import - Fantasy Examples', () => {
describe('Magical Academy System', () => {
@@ -150,9 +149,7 @@ Table ranks {
max_spell_level integer [not null]
}`;
const diagram = await importDBMLToDiagram(magicalAcademyDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(magicalAcademyDBML);
// Verify tables
expect(diagram.tables).toHaveLength(8);
@@ -369,9 +366,7 @@ Note marketplace_note {
'This marketplace handles both standard purchases and barter trades'
}`;
const diagram = await importDBMLToDiagram(marketplaceDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(marketplaceDBML);
// Verify tables
expect(diagram.tables).toHaveLength(7);
@@ -572,9 +567,7 @@ Note quest_system_note {
'Quest difficulty and status use enums that will be converted to varchar'
}`;
const diagram = await importDBMLToDiagram(questSystemDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(questSystemDBML);
// Verify tables
expect(diagram.tables).toHaveLength(7);
@@ -664,9 +657,7 @@ Table projects {
priority enum // inline enum without values - will be converted to varchar
}`;
const diagram = await importDBMLToDiagram(dbmlWithEnums, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbmlWithEnums);
// Verify customTypes are created for enums
expect(diagram.customTypes).toBeDefined();
@@ -753,9 +744,7 @@ Table orders {
status order_status [not null]
}`;
const diagram = await importDBMLToDiagram(dbmlWithEnumNotes, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbmlWithEnumNotes);
// Verify enum is created
expect(diagram.customTypes).toHaveLength(1);
@@ -799,9 +788,7 @@ Table admin.users {
status admin.status
}`;
const diagram = await importDBMLToDiagram(dbmlWithSameEnumNames, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbmlWithSameEnumNames);
// Verify both enums are created
expect(diagram.customTypes).toHaveLength(2);
@@ -904,9 +891,7 @@ Note dragon_note {
'Dragons are very protective of their hoards!'
}`;
const diagram = await importDBMLToDiagram(edgeCaseDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(edgeCaseDBML);
// Verify preprocessing worked
expect(diagram.tables).toHaveLength(2);
@@ -971,9 +956,7 @@ Note dragon_note {
it('should handle empty DBML gracefully', async () => {
const emptyDBML = '';
const diagram = await importDBMLToDiagram(emptyDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(emptyDBML);
expect(diagram.tables).toHaveLength(0);
expect(diagram.relationships).toHaveLength(0);
@@ -986,9 +969,7 @@ Note dragon_note {
/* Multi-line
comment */
`;
const diagram = await importDBMLToDiagram(commentOnlyDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(commentOnlyDBML);
expect(diagram.tables).toHaveLength(0);
expect(diagram.relationships).toHaveLength(0);
@@ -999,9 +980,7 @@ Note dragon_note {
Table empty_table {
id int
}`;
const diagram = await importDBMLToDiagram(minimalDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(minimalDBML);
expect(diagram.tables).toHaveLength(1);
expect(diagram.tables?.[0]?.fields).toHaveLength(1);
@@ -1017,9 +996,7 @@ Table "aa"."users" {
Table "bb"."users" {
id integer [primary key]
}`;
const diagram = await importDBMLToDiagram(dbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbml);
expect(diagram.tables).toHaveLength(2);
@@ -1094,9 +1071,7 @@ Table "public_3"."comments" {
id [unique, name: "public_3_index_1"]
}
}`;
const diagram = await importDBMLToDiagram(dbml, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbml);
// Verify tables
expect(diagram.tables).toHaveLength(3);
@@ -1281,9 +1256,7 @@ Table products {
Note: 'This table stores product information'
}`;
const diagram = await importDBMLToDiagram(dbmlWithTableNote, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbmlWithTableNote);
expect(diagram.tables).toHaveLength(1);
const productsTable = diagram.tables?.[0];
@@ -1300,9 +1273,7 @@ Table orders {
total numeric(10,2) [note: 'Order total including tax']
}`;
const diagram = await importDBMLToDiagram(dbmlWithFieldNote, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(dbmlWithFieldNote);
expect(diagram.tables).toHaveLength(1);
const ordersTable = diagram.tables?.[0];


@@ -5,7 +5,6 @@ import {
importDBMLToDiagram,
} from '../dbml-import';
import { Parser } from '@dbml/core';
import { DatabaseType } from '@/lib/domain/database-type';
describe('DBML Import', () => {
describe('preprocessDBML', () => {
@@ -23,7 +22,7 @@ TableGroup "Test Group" [color: #CA4243] {
Table posts {
id int
}`;
const { content: result } = preprocessDBML(dbml);
const result = preprocessDBML(dbml);
expect(result).not.toContain('TableGroup');
expect(result).toContain('Table users');
expect(result).toContain('Table posts');
@@ -38,20 +37,20 @@ Table users {
Note note_test {
'This is a note'
}`;
const { content: result } = preprocessDBML(dbml);
const result = preprocessDBML(dbml);
expect(result).not.toContain('Note');
expect(result).toContain('Table users');
});
it('should remove array syntax while preserving base type', () => {
it('should convert array types to text', () => {
const dbml = `
Table users {
tags text[]
domains varchar[]
}`;
const { content: result } = preprocessDBML(dbml);
const result = preprocessDBML(dbml);
expect(result).toContain('tags text');
expect(result).toContain('domains varchar');
expect(result).toContain('domains text');
expect(result).not.toContain('[]');
});
@@ -61,7 +60,7 @@ Table users {
status enum
verification_type enum // comment here
}`;
const { content: result } = preprocessDBML(dbml);
const result = preprocessDBML(dbml);
expect(result).toContain('status varchar');
expect(result).toContain('verification_type varchar');
expect(result).not.toContain('enum');
@@ -72,7 +71,7 @@ Table users {
Table users [headercolor: #24BAB1] {
id int
}`;
const { content: result } = preprocessDBML(dbml);
const result = preprocessDBML(dbml);
expect(result).toContain('Table users {');
expect(result).not.toContain('headercolor');
});
@@ -106,9 +105,7 @@ Note note_test {
'This is a test note'
}`;
const diagram = await importDBMLToDiagram(complexDBML, {
databaseType: DatabaseType.POSTGRESQL,
});
const diagram = await importDBMLToDiagram(complexDBML);
expect(diagram.tables).toHaveLength(2);
expect(diagram.relationships).toHaveLength(1);
@@ -152,7 +149,7 @@ Note note_1750185617764 {
}`;
// Test that preprocessing handles all issues
const { content: preprocessed } = preprocessDBML(problematicDBML);
const preprocessed = preprocessDBML(problematicDBML);
const sanitized = sanitizeDBML(preprocessed);
// Should not throw


@@ -1,159 +0,0 @@
import { describe, it, expect } from 'vitest';
import { DatabaseType } from '@/lib/domain/database-type';
import { importDBMLToDiagram } from '@/lib/dbml/dbml-import/dbml-import';
// This test verifies the DBML integration without UI components
describe('DBML Integration Tests', () => {
it('should handle DBML import in create diagram flow', async () => {
const dbmlContent = `
Table users {
id uuid [pk, not null]
email varchar [unique, not null]
created_at timestamp
}
Table posts {
id uuid [pk]
title varchar
content text
user_id uuid [ref: > users.id]
created_at timestamp
}
Table comments {
id uuid [pk]
content text
post_id uuid [ref: > posts.id]
user_id uuid [ref: > users.id]
}
// This will be ignored
TableGroup "Content" {
posts
comments
}
// This will be ignored too
Note test_note {
'This is a test note'
}`;
const diagram = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.POSTGRESQL,
});
// Verify basic structure
expect(diagram).toBeDefined();
expect(diagram.tables).toHaveLength(3);
expect(diagram.relationships).toHaveLength(3);
// Verify tables
const tableNames = diagram.tables?.map((t) => t.name).sort();
expect(tableNames).toEqual(['comments', 'posts', 'users']);
// Verify users table
const usersTable = diagram.tables?.find((t) => t.name === 'users');
expect(usersTable).toBeDefined();
expect(usersTable?.fields).toHaveLength(3);
const emailField = usersTable?.fields.find((f) => f.name === 'email');
expect(emailField?.unique).toBe(true);
expect(emailField?.nullable).toBe(false);
// Verify relationships
// There should be 3 relationships total
expect(diagram.relationships).toHaveLength(3);
// Find the relationship from users to posts (DBML ref is: posts.user_id > users.id)
// This creates a relationship FROM users TO posts (one user has many posts)
const postsTable = diagram.tables?.find((t) => t.name === 'posts');
const usersTableId = usersTable?.id;
const userPostRelation = diagram.relationships?.find(
(r) =>
r.sourceTableId === usersTableId &&
r.targetTableId === postsTable?.id
);
expect(userPostRelation).toBeDefined();
expect(userPostRelation?.sourceCardinality).toBe('one');
expect(userPostRelation?.targetCardinality).toBe('many');
});
it('should handle DBML with special features', async () => {
const dbmlContent = `
// Enum will be converted to varchar
Table users {
id int [pk]
status enum
tags text[] // Array will be converted to text
favorite_product_id int
}
Table products [headercolor: #FF0000] {
id int [pk]
name varchar
price decimal(10,2)
}
Ref: products.id < users.favorite_product_id`;
const diagram = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.POSTGRESQL,
});
expect(diagram.tables).toHaveLength(2);
// Check enum conversion
const usersTable = diagram.tables?.find((t) => t.name === 'users');
const statusField = usersTable?.fields.find((f) => f.name === 'status');
expect(statusField?.type.id).toBe('varchar');
// Check array type conversion
const tagsField = usersTable?.fields.find((f) => f.name === 'tags');
expect(tagsField?.type.id).toBe('text');
// Check that header color was removed
const productsTable = diagram.tables?.find(
(t) => t.name === 'products'
);
expect(productsTable).toBeDefined();
expect(productsTable?.name).toBe('products');
});
it('should handle empty or invalid DBML gracefully', async () => {
// Empty DBML
const emptyDiagram = await importDBMLToDiagram('', {
databaseType: DatabaseType.POSTGRESQL,
});
expect(emptyDiagram.tables).toHaveLength(0);
expect(emptyDiagram.relationships).toHaveLength(0);
// Only comments
const commentDiagram = await importDBMLToDiagram('// Just a comment', {
databaseType: DatabaseType.POSTGRESQL,
});
expect(commentDiagram.tables).toHaveLength(0);
expect(commentDiagram.relationships).toHaveLength(0);
});
it('should preserve diagram metadata when importing DBML', async () => {
const dbmlContent = `Table test {
id int [pk]
}`;
const diagram = await importDBMLToDiagram(dbmlContent, {
databaseType: DatabaseType.GENERIC,
});
// Default values
expect(diagram.name).toBe('DBML Import');
expect(diagram.databaseType).toBe(DatabaseType.GENERIC);
// These can be overridden by the dialog
diagram.name = 'My Custom Diagram';
diagram.databaseType = DatabaseType.POSTGRESQL;
expect(diagram.name).toBe('My Custom Diagram');
expect(diagram.databaseType).toBe(DatabaseType.POSTGRESQL);
});
});


@@ -180,7 +180,7 @@ describe('DBML Schema Handling - Fantasy Realm Database', () => {
expect(artifactsTable?.schema).toBe(''); // No schema = empty string
});
it('should handle reserved keywords for PostgreSQL', async () => {
it('should rename reserved keywords for PostgreSQL', async () => {
const dbmlContent = `
Table "magic_items" {
"id" bigint [pk]
@@ -197,9 +197,10 @@ describe('DBML Schema Handling - Fantasy Realm Database', () => {
const exported = generateDBMLFromDiagram(diagram);
expect(exported.standardDbml).toContain('Order');
expect(exported.standardDbml).toContain('Yes');
expect(exported.standardDbml).toContain('No');
// For PostgreSQL, keywords should be renamed in export
expect(exported.standardDbml).toContain('Order_field');
expect(exported.standardDbml).toContain('Yes_field');
expect(exported.standardDbml).toContain('No_field');
});
});


@@ -1,6 +1,4 @@
import type { CompilerError } from '@dbml/core/types/parse/error';
import type { DatabaseType } from '@/lib/domain/database-type';
import { databaseSupportsArrays } from '@/lib/domain/database-capabilities';
export interface DBMLError {
message: string;
@@ -8,59 +6,8 @@ export interface DBMLError {
column: number;
}
export class DBMLValidationError extends Error {
public readonly dbmlError: DBMLError;
constructor(message: string, line: number, column: number = 1) {
super(message);
this.name = 'DBMLValidationError';
this.dbmlError = { message, line, column };
}
}
export const getPositionFromIndex = (
content: string,
matchIndex: number
): { line: number; column: number } => {
const lines = content.substring(0, matchIndex).split('\n');
return {
line: lines.length,
column: lines[lines.length - 1].length + 1,
};
};
export const validateArrayTypesForDatabase = (
content: string,
databaseType: DatabaseType
): void => {
// Only validate if database doesn't support arrays
if (databaseSupportsArrays(databaseType)) {
return;
}
const arrayFieldPattern = /"?(\w+)"?\s+(\w+(?:\(\d+(?:,\s*\d+)?\))?)\[\]/g;
const matches = [...content.matchAll(arrayFieldPattern)];
for (const match of matches) {
const fieldName = match[1];
const dataType = match[2];
const { line, column } = getPositionFromIndex(content, match.index!);
throw new DBMLValidationError(
`Array types are not supported for ${databaseType} database. Field "${fieldName}" has array type "${dataType}[]" which is not allowed.`,
line,
column
);
}
};
export function parseDBMLError(error: unknown): DBMLError | null {
try {
// Check for our custom DBMLValidationError
if (error instanceof DBMLValidationError) {
return error.dbmlError;
}
if (typeof error === 'string') {
const parsed = JSON.parse(error);
if (parsed.diags?.[0]) {


@@ -5,10 +5,7 @@ import type { DBTable } from '@/lib/domain/db-table';
import type { Cardinality, DBRelationship } from '@/lib/domain/db-relationship';
import type { DBField } from '@/lib/domain/db-field';
import type { DataTypeData } from '@/lib/data/data-types/data-types';
import {
findDataTypeDataById,
requiresNotNull,
} from '@/lib/data/data-types/data-types';
import { findDataTypeDataById } from '@/lib/data/data-types/data-types';
import { defaultTableColor } from '@/lib/colors';
import { DatabaseType } from '@/lib/domain/database-type';
import type Field from '@dbml/core/types/model_structure/field';
@@ -17,21 +14,11 @@ import {
DBCustomTypeKind,
type DBCustomType,
} from '@/lib/domain/db-custom-type';
import { validateArrayTypesForDatabase } from './dbml-import-error';
export const defaultDBMLDiagramName = 'DBML Import';
interface PreprocessDBMLResult {
content: string;
arrayFields: Map<string, Set<string>>;
}
export const preprocessDBML = (content: string): PreprocessDBMLResult => {
// Preprocess DBML to handle unsupported features
export const preprocessDBML = (content: string): string => {
let processed = content;
// Track array fields found during preprocessing
const arrayFields = new Map<string, Set<string>>();
// Remove TableGroup blocks (not supported by parser)
processed = processed.replace(/TableGroup\s+[^{]*\{[^}]*\}/gs, '');
@@ -41,37 +28,8 @@ export const preprocessDBML = (content: string): PreprocessDBMLResult => {
// Don't remove enum definitions - we'll parse them
// processed = processed.replace(/enum\s+\w+\s*\{[^}]*\}/gs, '');
// Handle array types by tracking them and converting syntax for DBML parser
// Note: DBML doesn't officially support array syntax, so we convert type[] to type
// but track which fields should be arrays
// First, find all array field declarations and track them
const tablePattern =
/Table\s+(?:"([^"]+)"\.)?(?:"([^"]+)"|(\w+))\s*(?:\[[^\]]*\])?\s*\{([^}]+)\}/gs;
let match;
while ((match = tablePattern.exec(content)) !== null) {
const schema = match[1] || '';
const tableName = match[2] || match[3];
const tableBody = match[4];
const fullTableName = schema ? `${schema}.${tableName}` : tableName;
// Find array field declarations within this table
const fieldPattern = /"?(\w+)"?\s+(\w+(?:\([^)]+\))?)\[\]/g;
let fieldMatch;
while ((fieldMatch = fieldPattern.exec(tableBody)) !== null) {
const fieldName = fieldMatch[1];
if (!arrayFields.has(fullTableName)) {
arrayFields.set(fullTableName, new Set());
}
arrayFields.get(fullTableName)!.add(fieldName);
}
}
// Now convert array syntax for DBML parser (keep the base type, remove [])
processed = processed.replace(/(\w+(?:\(\d+(?:,\s*\d+)?\))?)\[\]/g, '$1');
// Handle array types by converting them to text
processed = processed.replace(/(\w+)\[\]/g, 'text');
// Handle inline enum types without values by converting to varchar
processed = processed.replace(
@@ -86,7 +44,7 @@ export const preprocessDBML = (content: string): PreprocessDBMLResult => {
'Table $1 {'
);
return { content: processed, arrayFields };
return processed;
};
// Simple function to replace Spanish special characters
@@ -125,12 +83,10 @@ interface DBMLField {
pk?: boolean;
not_null?: boolean;
increment?: boolean;
isArray?: boolean;
characterMaximumLength?: string | null;
precision?: number | null;
scale?: number | null;
note?: string | { value: string } | null;
default?: string | null;
}
interface DBMLIndexColumn {
@@ -231,8 +187,8 @@ const determineCardinality = (
export const importDBMLToDiagram = async (
dbmlContent: string,
options: {
databaseType: DatabaseType;
options?: {
databaseType?: DatabaseType;
}
): Promise<Diagram> => {
try {
@@ -240,7 +196,7 @@ export const importDBMLToDiagram = async (
if (!dbmlContent.trim()) {
return {
id: generateDiagramId(),
name: defaultDBMLDiagramName,
name: 'DBML Import',
databaseType: options?.databaseType ?? DatabaseType.GENERIC,
tables: [],
relationships: [],
@@ -249,20 +205,16 @@ export const importDBMLToDiagram = async (
};
}
// Validate array types BEFORE preprocessing (preprocessing removes [])
validateArrayTypesForDatabase(dbmlContent, options.databaseType);
const parser = new Parser();
// Preprocess and sanitize DBML content
const { content: preprocessedContent, arrayFields } =
preprocessDBML(dbmlContent);
const preprocessedContent = preprocessDBML(dbmlContent);
const sanitizedContent = sanitizeDBML(preprocessedContent);
// Handle content that becomes empty after preprocessing
if (!sanitizedContent.trim()) {
return {
id: generateDiagramId(),
name: defaultDBMLDiagramName,
name: 'DBML Import',
databaseType: options?.databaseType ?? DatabaseType.GENERIC,
tables: [],
relationships: [],
@@ -277,7 +229,7 @@ export const importDBMLToDiagram = async (
if (!parsedData.schemas || parsedData.schemas.length === 0) {
return {
id: generateDiagramId(),
name: defaultDBMLDiagramName,
name: 'DBML Import',
databaseType: options?.databaseType ?? DatabaseType.GENERIC,
tables: [],
relationships: [],
@@ -380,33 +332,6 @@ export const importDBMLToDiagram = async (
schema: schemaName,
note: table.note,
fields: table.fields.map((field): DBMLField => {
// Extract default value and remove all quotes
let defaultValue: string | undefined;
if (
field.dbdefault !== undefined &&
field.dbdefault !== null
) {
const rawDefault = String(
field.dbdefault.value
);
defaultValue = rawDefault.replace(/['"`]/g, '');
}
// Check if this field should be an array
const fullTableName = schemaName
? `${schemaName}.${table.name}`
: table.name;
let isArray = arrayFields
.get(fullTableName)
?.has(field.name);
if (!isArray && schemaName) {
isArray = arrayFields
.get(table.name)
?.has(field.name);
}
return {
name: field.name,
type: field.type,
@@ -414,9 +339,7 @@ export const importDBMLToDiagram = async (
pk: field.pk,
not_null: field.not_null,
increment: field.increment,
isArray: isArray || undefined,
note: field.note,
default: defaultValue,
...getFieldExtraAttributes(field, allEnums),
} satisfies DBMLField;
}),
@@ -555,20 +478,14 @@ export const importDBMLToDiagram = async (
...options,
enums: extractedData.enums,
}),
nullable:
field.increment || requiresNotNull(field.type.type_name)
? false
: !field.not_null,
nullable: !field.not_null,
primaryKey: field.pk || false,
unique: field.unique || field.pk || false, // Primary keys are always unique
unique: field.unique || false,
createdAt: Date.now(),
characterMaximumLength: field.characterMaximumLength,
precision: field.precision,
scale: field.scale,
...(field.increment ? { increment: field.increment } : {}),
...(field.isArray ? { isArray: field.isArray } : {}),
...(fieldComment ? { comments: fieldComment } : {}),
...(field.default ? { default: field.default } : {}),
};
});
@@ -817,7 +734,7 @@ export const importDBMLToDiagram = async (
return {
id: generateDiagramId(),
name: defaultDBMLDiagramName,
name: 'DBML Import',
databaseType: options?.databaseType ?? DatabaseType.GENERIC,
tables,
relationships,


@@ -1,65 +0,0 @@
import { Parser } from '@dbml/core';
import { preprocessDBML, sanitizeDBML } from './dbml-import';
import type { DBMLError } from './dbml-import-error';
import {
parseDBMLError,
validateArrayTypesForDatabase,
} from './dbml-import-error';
import type { DatabaseType } from '@/lib/domain/database-type';
export const verifyDBML = (
content: string,
{
databaseType,
}: {
databaseType: DatabaseType;
}
):
| {
hasError: true;
error: unknown;
parsedError?: DBMLError;
errorText: string;
}
| {
hasError: false;
} => {
try {
// Validate array types BEFORE preprocessing (preprocessing removes [])
validateArrayTypesForDatabase(content, databaseType);
const { content: preprocessedContent } = preprocessDBML(content);
const sanitizedContent = sanitizeDBML(preprocessedContent);
const parser = new Parser();
parser.parse(sanitizedContent, 'dbmlv2');
} catch (e) {
const parsedError = parseDBMLError(e);
if (parsedError) {
return {
hasError: true,
parsedError: parsedError,
error: e,
errorText: parsedError.message,
};
} else {
if (e instanceof Error) {
return {
hasError: true,
error: e,
errorText: e.message,
};
}
return {
hasError: true,
error: e,
errorText: JSON.stringify(e),
};
}
}
return {
hasError: false,
};
};


@@ -1,57 +0,0 @@
import { DatabaseType } from './database-type';
export interface DatabaseCapabilities {
supportsArrays?: boolean;
supportsCustomTypes?: boolean;
supportsSchemas?: boolean;
supportsComments?: boolean;
}
export const DATABASE_CAPABILITIES: Record<DatabaseType, DatabaseCapabilities> =
{
[DatabaseType.POSTGRESQL]: {
supportsArrays: true,
supportsCustomTypes: true,
supportsSchemas: true,
supportsComments: true,
},
[DatabaseType.COCKROACHDB]: {
supportsArrays: true,
supportsSchemas: true,
supportsComments: true,
},
[DatabaseType.MYSQL]: {},
[DatabaseType.MARIADB]: {},
[DatabaseType.SQL_SERVER]: {
supportsSchemas: true,
},
[DatabaseType.SQLITE]: {},
[DatabaseType.CLICKHOUSE]: {
supportsSchemas: true,
},
[DatabaseType.ORACLE]: {
supportsSchemas: true,
supportsComments: true,
},
[DatabaseType.GENERIC]: {},
};
export const getDatabaseCapabilities = (
databaseType: DatabaseType
): DatabaseCapabilities => {
return DATABASE_CAPABILITIES[databaseType];
};
export const databaseSupportsArrays = (databaseType: DatabaseType): boolean => {
return getDatabaseCapabilities(databaseType).supportsArrays ?? false;
};
export const databaseTypesWithCommentSupport: DatabaseType[] = Object.keys(
DATABASE_CAPABILITIES
).filter(
(dbType) => DATABASE_CAPABILITIES[dbType as DatabaseType].supportsComments
) as DatabaseType[];
export const supportsCustomTypes = (databaseType: DatabaseType): boolean => {
return getDatabaseCapabilities(databaseType).supportsCustomTypes ?? false;
};


@@ -9,3 +9,9 @@ export enum DatabaseType {
     COCKROACHDB = 'cockroachdb',
     ORACLE = 'oracle',
 }
+
+export const databaseTypesWithCommentSupport: DatabaseType[] = [
+    DatabaseType.POSTGRESQL,
+    DatabaseType.COCKROACHDB,
+    DatabaseType.ORACLE,
+];


@@ -2,10 +2,9 @@ import { z } from 'zod';
 import {
     dataTypeSchema,
     findDataTypeDataById,
-    supportsArrayDataType,
     type DataType,
 } from '../data/data-types/data-types';
-import { DatabaseType } from './database-type';
+import type { DatabaseType } from './database-type';

 export interface DBField {
     id: string;
@@ -15,7 +14,6 @@ export interface DBField {
     unique: boolean;
     nullable: boolean;
     increment?: boolean | null;
-    isArray?: boolean | null;
     createdAt: number;
     characterMaximumLength?: string | null;
     precision?: number | null;
@@ -33,7 +31,6 @@ export const dbFieldSchema: z.ZodType<DBField> = z.object({
     unique: z.boolean(),
     nullable: z.boolean(),
     increment: z.boolean().or(z.null()).optional(),
-    isArray: z.boolean().or(z.null()).optional(),
     createdAt: z.number(),
     characterMaximumLength: z.string().or(z.null()).optional(),
     precision: z.number().or(z.null()).optional(),
@@ -55,26 +52,11 @@ export const generateDBFieldSuffix = (
         typeId?: string;
     } = {}
 ): string => {
-    let suffix = '';
     if (databaseType && forceExtended && typeId) {
-        suffix = generateExtendedSuffix(field, databaseType, typeId);
-    } else {
-        suffix = generateStandardSuffix(field);
+        return generateExtendedSuffix(field, databaseType, typeId);
     }
-    // Add array notation if field is an array
-    if (
-        field.isArray &&
-        supportsArrayDataType(
-            typeId ?? field.type.id,
-            databaseType ?? DatabaseType.GENERIC
-        )
-    ) {
-        suffix += '[]';
-    }
-    return suffix;
+    return generateStandardSuffix(field);
 };
const generateExtendedSuffix = (


@@ -1,5 +1,4 @@
-import { DATABASE_CAPABILITIES } from './database-capabilities';
-import type { DatabaseType } from './database-type';
+import { DatabaseType } from './database-type';

 export interface DBSchema {
     id: string;
@@ -19,8 +18,10 @@ export const schemaNameToDomainSchemaName = (
         ? undefined
         : schema?.trim();

-export const databasesWithSchemas: DatabaseType[] = Object.keys(
-    DATABASE_CAPABILITIES
-).filter(
-    (dbType) => DATABASE_CAPABILITIES[dbType as DatabaseType].supportsSchemas
-) as DatabaseType[];
+export const databasesWithSchemas: DatabaseType[] = [
+    DatabaseType.POSTGRESQL,
+    DatabaseType.SQL_SERVER,
+    DatabaseType.CLICKHOUSE,
+    DatabaseType.COCKROACHDB,
+    DatabaseType.ORACLE,
+];


@@ -1,77 +0,0 @@
import { z } from 'zod';
import type { Area } from '../area';

export type AreaDiffAttribute = keyof Pick<
    Area,
    'name' | 'color' | 'x' | 'y' | 'width' | 'height'
>;

const areaDiffAttributeSchema: z.ZodType<AreaDiffAttribute> = z.union([
    z.literal('name'),
    z.literal('color'),
    z.literal('x'),
    z.literal('y'),
    z.literal('width'),
    z.literal('height'),
]);

export interface AreaDiffChanged {
    object: 'area';
    type: 'changed';
    areaId: string;
    attribute: AreaDiffAttribute;
    oldValue?: string | number | null;
    newValue?: string | number | null;
}

export const AreaDiffChangedSchema: z.ZodType<AreaDiffChanged> = z.object({
    object: z.literal('area'),
    type: z.literal('changed'),
    areaId: z.string(),
    attribute: areaDiffAttributeSchema,
    oldValue: z.union([z.string(), z.number(), z.null()]).optional(),
    newValue: z.union([z.string(), z.number(), z.null()]).optional(),
});

export interface AreaDiffRemoved {
    object: 'area';
    type: 'removed';
    areaId: string;
}

export const AreaDiffRemovedSchema: z.ZodType<AreaDiffRemoved> = z.object({
    object: z.literal('area'),
    type: z.literal('removed'),
    areaId: z.string(),
});

export interface AreaDiffAdded<T = Area> {
    object: 'area';
    type: 'added';
    areaAdded: T;
}

export const createAreaDiffAddedSchema = <T = Area>(
    areaSchema: z.ZodType<T>
): z.ZodType<AreaDiffAdded<T>> => {
    return z.object({
        object: z.literal('area'),
        type: z.literal('added'),
        areaAdded: areaSchema,
    }) as z.ZodType<AreaDiffAdded<T>>;
};

export type AreaDiff<T = Area> =
    | AreaDiffChanged
    | AreaDiffRemoved
    | AreaDiffAdded<T>;

export const createAreaDiffSchema = <T = Area>(
    areaSchema: z.ZodType<T>
): z.ZodType<AreaDiff<T>> => {
    return z.union([
        AreaDiffChangedSchema,
        AreaDiffRemovedSchema,
        createAreaDiffAddedSchema(areaSchema),
    ]) as z.ZodType<AreaDiff<T>>;
};


@@ -1,883 +0,0 @@
import { describe, it, expect } from 'vitest';
import { generateDiff } from '../diff-check';
import type { Diagram } from '@/lib/domain/diagram';
import type { DBTable } from '@/lib/domain/db-table';
import type { DBField } from '@/lib/domain/db-field';
import type { DBIndex } from '@/lib/domain/db-index';
import type { DBRelationship } from '@/lib/domain/db-relationship';
import type { Area } from '@/lib/domain/area';
import { DatabaseType } from '@/lib/domain/database-type';
import type { TableDiffChanged } from '../../table-diff';
import type { FieldDiffChanged } from '../../field-diff';
import type { AreaDiffChanged } from '../../area-diff';
// Helper function to create a mock diagram
function createMockDiagram(overrides?: Partial<Diagram>): Diagram {
return {
id: 'diagram-1',
name: 'Test Diagram',
databaseType: DatabaseType.POSTGRESQL,
tables: [],
relationships: [],
areas: [],
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
// Helper function to create a mock table
function createMockTable(overrides?: Partial<DBTable>): DBTable {
return {
id: 'table-1',
name: 'users',
fields: [],
indexes: [],
x: 0,
y: 0,
...overrides,
} as DBTable;
}
// Helper function to create a mock field
function createMockField(overrides?: Partial<DBField>): DBField {
return {
id: 'field-1',
name: 'id',
type: { id: 'integer', name: 'integer' },
primaryKey: false,
nullable: true,
unique: false,
...overrides,
} as DBField;
}
// Helper function to create a mock relationship
function createMockRelationship(
overrides?: Partial<DBRelationship>
): DBRelationship {
return {
id: 'rel-1',
sourceTableId: 'table-1',
targetTableId: 'table-2',
sourceFieldId: 'field-1',
targetFieldId: 'field-2',
type: 'one-to-many',
...overrides,
} as DBRelationship;
}
// Helper function to create a mock area
function createMockArea(overrides?: Partial<Area>): Area {
return {
id: 'area-1',
name: 'Main Area',
x: 0,
y: 0,
width: 100,
height: 100,
color: 'blue',
...overrides,
} as Area;
}
describe('generateDiff', () => {
describe('Basic Table Diffing', () => {
it('should detect added tables', () => {
const oldDiagram = createMockDiagram({ tables: [] });
const newDiagram = createMockDiagram({
tables: [createMockTable()],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('table-table-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('added');
expect(result.changedTables.has('table-1')).toBe(true);
});
it('should detect removed tables', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable()],
});
const newDiagram = createMockDiagram({ tables: [] });
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('table-table-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('removed');
expect(result.changedTables.has('table-1')).toBe(true);
});
it('should detect table name changes', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ name: 'users' })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ name: 'customers' })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('table-name-table-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('changed');
expect((diff as TableDiffChanged)?.attribute).toBe('name');
});
it('should detect table position changes', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ x: 0, y: 0 })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ x: 100, y: 200 })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
attributes: {
tables: ['name', 'comments', 'color', 'x', 'y'],
},
},
});
expect(result.diffMap.size).toBe(2);
expect(result.diffMap.has('table-x-table-1')).toBe(true);
expect(result.diffMap.has('table-y-table-1')).toBe(true);
});
it('should detect table width changes', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ width: 150 })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ width: 250 })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
attributes: {
tables: ['width'],
},
},
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('table-width-table-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('changed');
expect((diff as TableDiffChanged)?.attribute).toBe('width');
expect((diff as TableDiffChanged)?.oldValue).toBe(150);
expect((diff as TableDiffChanged)?.newValue).toBe(250);
});
it('should detect multiple table dimension changes', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ x: 0, y: 0, width: 100 })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ x: 50, y: 75, width: 200 })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
attributes: {
tables: ['x', 'y', 'width'],
},
},
});
expect(result.diffMap.size).toBe(3);
expect(result.diffMap.has('table-x-table-1')).toBe(true);
expect(result.diffMap.has('table-y-table-1')).toBe(true);
expect(result.diffMap.has('table-width-table-1')).toBe(true);
const widthDiff = result.diffMap.get('table-width-table-1');
expect(widthDiff?.type).toBe('changed');
expect((widthDiff as TableDiffChanged)?.oldValue).toBe(100);
expect((widthDiff as TableDiffChanged)?.newValue).toBe(200);
});
});
describe('Field Diffing', () => {
it('should detect added fields', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ fields: [] })],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [createMockField()],
}),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('field-field-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('added');
expect(result.changedFields.has('field-1')).toBe(true);
});
it('should detect removed fields', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [createMockField()],
}),
],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ fields: [] })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('field-field-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('removed');
});
it('should detect field type changes', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [
createMockField({
type: { id: 'integer', name: 'integer' },
}),
],
}),
],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [
createMockField({
type: { id: 'varchar', name: 'varchar' },
}),
],
}),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('field-type-field-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('changed');
expect((diff as FieldDiffChanged)?.attribute).toBe('type');
});
});
describe('Relationship Diffing', () => {
it('should detect added relationships', () => {
const oldDiagram = createMockDiagram({ relationships: [] });
const newDiagram = createMockDiagram({
relationships: [createMockRelationship()],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('relationship-rel-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('added');
});
it('should detect removed relationships', () => {
const oldDiagram = createMockDiagram({
relationships: [createMockRelationship()],
});
const newDiagram = createMockDiagram({ relationships: [] });
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('relationship-rel-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('removed');
});
});
describe('Area Diffing', () => {
it('should detect added areas when includeAreas is true', () => {
const oldDiagram = createMockDiagram({ areas: [] });
const newDiagram = createMockDiagram({
areas: [createMockArea()],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: true,
},
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('area-area-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('added');
expect(result.changedAreas.has('area-1')).toBe(true);
});
it('should not detect area changes when includeAreas is false', () => {
const oldDiagram = createMockDiagram({ areas: [] });
const newDiagram = createMockDiagram({
areas: [createMockArea()],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: false,
},
});
expect(result.diffMap.size).toBe(0);
});
it('should detect area width changes', () => {
const oldDiagram = createMockDiagram({
areas: [createMockArea({ width: 100 })],
});
const newDiagram = createMockDiagram({
areas: [createMockArea({ width: 200 })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: true,
attributes: {
areas: ['width'],
},
},
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('area-width-area-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('changed');
expect((diff as AreaDiffChanged)?.attribute).toBe('width');
expect((diff as AreaDiffChanged)?.oldValue).toBe(100);
expect((diff as AreaDiffChanged)?.newValue).toBe(200);
});
it('should detect area height changes', () => {
const oldDiagram = createMockDiagram({
areas: [createMockArea({ height: 100 })],
});
const newDiagram = createMockDiagram({
areas: [createMockArea({ height: 300 })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: true,
attributes: {
areas: ['height'],
},
},
});
expect(result.diffMap.size).toBe(1);
const diff = result.diffMap.get('area-height-area-1');
expect(diff).toBeDefined();
expect(diff?.type).toBe('changed');
expect((diff as AreaDiffChanged)?.attribute).toBe('height');
expect((diff as AreaDiffChanged)?.oldValue).toBe(100);
expect((diff as AreaDiffChanged)?.newValue).toBe(300);
});
it('should detect multiple area dimension changes', () => {
const oldDiagram = createMockDiagram({
areas: [
createMockArea({ x: 0, y: 0, width: 100, height: 100 }),
],
});
const newDiagram = createMockDiagram({
areas: [
createMockArea({ x: 50, y: 50, width: 200, height: 300 }),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: true,
attributes: {
areas: ['x', 'y', 'width', 'height'],
},
},
});
expect(result.diffMap.size).toBe(4);
expect(result.diffMap.has('area-x-area-1')).toBe(true);
expect(result.diffMap.has('area-y-area-1')).toBe(true);
expect(result.diffMap.has('area-width-area-1')).toBe(true);
expect(result.diffMap.has('area-height-area-1')).toBe(true);
});
});
describe('Custom Matchers', () => {
it('should use custom table matcher to match by name', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', name: 'users' })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-2', name: 'users' })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
matchers: {
table: (table, tables) =>
tables.find((t) => t.name === table.name),
},
},
});
// Should not detect any changes since tables match by name
expect(result.diffMap.size).toBe(0);
});
it('should detect changes when custom matcher finds no match', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', name: 'users' })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-2', name: 'customers' })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
matchers: {
table: (table, tables) =>
tables.find((t) => t.name === table.name),
},
},
});
// Should detect both added and removed since names don't match
expect(result.diffMap.size).toBe(2);
expect(result.diffMap.has('table-table-1')).toBe(true); // removed
expect(result.diffMap.has('table-table-2')).toBe(true); // added
});
it('should use custom field matcher to match by name', () => {
const field1 = createMockField({
id: 'field-1',
name: 'email',
nullable: true,
});
const field2 = createMockField({
id: 'field-2',
name: 'email',
nullable: false,
});
const oldDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', fields: [field1] })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', fields: [field2] })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
matchers: {
field: (field, fields) =>
fields.find((f) => f.name === field.name),
},
},
});
// With name-based matching, field-1 should match field-2 by name
// and detect the nullable change
const nullableChange = result.diffMap.get('field-nullable-field-1');
expect(nullableChange).toBeDefined();
expect(nullableChange?.type).toBe('changed');
expect((nullableChange as FieldDiffChanged)?.attribute).toBe(
'nullable'
);
});
it('should use case-insensitive custom matcher', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', name: 'Users' })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-2', name: 'users' })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
matchers: {
table: (table, tables) =>
tables.find(
(t) =>
t.name.toLowerCase() ===
table.name.toLowerCase()
),
},
},
});
// With case-insensitive name matching, the tables are matched
// but the name case difference is still detected as a change
expect(result.diffMap.size).toBe(1);
const nameChange = result.diffMap.get('table-name-table-1');
expect(nameChange).toBeDefined();
expect(nameChange?.type).toBe('changed');
expect((nameChange as TableDiffChanged)?.attribute).toBe('name');
expect((nameChange as TableDiffChanged)?.oldValue).toBe('Users');
expect((nameChange as TableDiffChanged)?.newValue).toBe('users');
});
});
describe('Filtering Options', () => {
it('should only check specified change types', () => {
const oldDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-1', name: 'users' })],
});
const newDiagram = createMockDiagram({
tables: [createMockTable({ id: 'table-2', name: 'products' })],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
changeTypes: {
tables: ['added'], // Only check for added tables
},
},
});
// Should only detect added table (table-2)
const addedTables = Array.from(result.diffMap.values()).filter(
(diff) => diff.type === 'added' && diff.object === 'table'
);
expect(addedTables.length).toBe(1);
// Should not detect removed table (table-1)
const removedTables = Array.from(result.diffMap.values()).filter(
(diff) => diff.type === 'removed' && diff.object === 'table'
);
expect(removedTables.length).toBe(0);
});
it('should only check specified attributes', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
name: 'users',
color: 'blue',
comments: 'old comment',
}),
],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
name: 'customers',
color: 'red',
comments: 'new comment',
}),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
attributes: {
tables: ['name'], // Only check name changes
},
},
});
// Should only detect name change
const nameChanges = Array.from(result.diffMap.values()).filter(
(diff) =>
diff.type === 'changed' &&
diff.attribute === 'name' &&
diff.object === 'table'
);
expect(nameChanges.length).toBe(1);
// Should not detect color or comments changes
const otherChanges = Array.from(result.diffMap.values()).filter(
(diff) =>
diff.type === 'changed' &&
(diff.attribute === 'color' ||
diff.attribute === 'comments') &&
diff.object === 'table'
);
expect(otherChanges.length).toBe(0);
});
it('should respect include flags', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [createMockField()],
indexes: [{ id: 'idx-1', name: 'idx' } as DBIndex],
}),
],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
fields: [],
indexes: [],
}),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeFields: false,
includeIndexes: true,
},
});
// Should only detect index removal, not field removal
expect(result.diffMap.has('index-idx-1')).toBe(true);
expect(result.diffMap.has('field-field-1')).toBe(false);
});
});
describe('Complex Scenarios', () => {
it('should detect all dimensional changes for tables and areas', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
x: 0,
y: 0,
width: 100,
}),
],
areas: [
createMockArea({
id: 'area-1',
x: 0,
y: 0,
width: 200,
height: 150,
}),
],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
x: 10,
y: 20,
width: 120,
}),
],
areas: [
createMockArea({
id: 'area-1',
x: 25,
y: 35,
width: 250,
height: 175,
}),
],
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
options: {
includeAreas: true,
attributes: {
tables: ['x', 'y', 'width'],
areas: ['x', 'y', 'width', 'height'],
},
},
});
// Table dimensional changes
expect(result.diffMap.has('table-x-table-1')).toBe(true);
expect(result.diffMap.has('table-y-table-1')).toBe(true);
expect(result.diffMap.has('table-width-table-1')).toBe(true);
// Area dimensional changes
expect(result.diffMap.has('area-x-area-1')).toBe(true);
expect(result.diffMap.has('area-y-area-1')).toBe(true);
expect(result.diffMap.has('area-width-area-1')).toBe(true);
expect(result.diffMap.has('area-height-area-1')).toBe(true);
// Verify the correct values
const tableWidthDiff = result.diffMap.get('table-width-table-1');
expect((tableWidthDiff as TableDiffChanged)?.oldValue).toBe(100);
expect((tableWidthDiff as TableDiffChanged)?.newValue).toBe(120);
const areaWidthDiff = result.diffMap.get('area-width-area-1');
expect((areaWidthDiff as AreaDiffChanged)?.oldValue).toBe(200);
expect((areaWidthDiff as AreaDiffChanged)?.newValue).toBe(250);
const areaHeightDiff = result.diffMap.get('area-height-area-1');
expect((areaHeightDiff as AreaDiffChanged)?.oldValue).toBe(150);
expect((areaHeightDiff as AreaDiffChanged)?.newValue).toBe(175);
});
it('should handle multiple simultaneous changes', () => {
const oldDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
name: 'users',
fields: [
createMockField({ id: 'field-1', name: 'id' }),
createMockField({ id: 'field-2', name: 'email' }),
],
}),
createMockTable({
id: 'table-2',
name: 'products',
}),
],
relationships: [createMockRelationship()],
});
const newDiagram = createMockDiagram({
tables: [
createMockTable({
id: 'table-1',
name: 'customers', // Changed name
fields: [
createMockField({ id: 'field-1', name: 'id' }),
// Removed field-2
createMockField({ id: 'field-3', name: 'name' }), // Added field
],
}),
// Removed table-2
createMockTable({
id: 'table-3',
name: 'orders', // Added table
}),
],
relationships: [], // Removed relationship
});
const result = generateDiff({
diagram: oldDiagram,
newDiagram,
});
// Verify all changes are detected
expect(result.diffMap.has('table-name-table-1')).toBe(true); // Table name change
expect(result.diffMap.has('field-field-2')).toBe(true); // Removed field
expect(result.diffMap.has('field-field-3')).toBe(true); // Added field
expect(result.diffMap.has('table-table-2')).toBe(true); // Removed table
expect(result.diffMap.has('table-table-3')).toBe(true); // Added table
expect(result.diffMap.has('relationship-rel-1')).toBe(true); // Removed relationship
});
it('should handle empty diagrams', () => {
const emptyDiagram1 = createMockDiagram();
const emptyDiagram2 = createMockDiagram();
const result = generateDiff({
diagram: emptyDiagram1,
newDiagram: emptyDiagram2,
});
expect(result.diffMap.size).toBe(0);
expect(result.changedTables.size).toBe(0);
expect(result.changedFields.size).toBe(0);
expect(result.changedAreas.size).toBe(0);
});
it('should handle diagrams with undefined collections', () => {
const diagram1 = createMockDiagram({
tables: undefined,
relationships: undefined,
areas: undefined,
});
const diagram2 = createMockDiagram({
tables: [createMockTable({ id: 'table-1' })],
relationships: [createMockRelationship({ id: 'rel-1' })],
areas: [createMockArea({ id: 'area-1' })],
});
const result = generateDiff({
diagram: diagram1,
newDiagram: diagram2,
options: {
includeAreas: true,
},
});
// Should detect all as added
expect(result.diffMap.has('table-table-1')).toBe(true);
expect(result.diffMap.has('relationship-rel-1')).toBe(true);
expect(result.diffMap.has('area-area-1')).toBe(true);
});
});
});
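The removed tests above address diff entries through string keys such as `table-table-1` and `table-name-table-1`. The convention, inferred from the assertions alone (the actual diff-check implementation is not shown here), appears to be `<object>-<id>` for added/removed entries and `<object>-<attribute>-<id>` for changed entries:

```typescript
// Hypothetical helper mirroring the key convention the tests assert on.
const diffKey = (object: string, id: string, attribute?: string): string =>
    attribute ? `${object}-${attribute}-${id}` : `${object}-${id}`;

console.log(diffKey('table', 'table-1')); // 'table-table-1'
console.log(diffKey('table', 'table-1', 'name')); // 'table-name-table-1'
console.log(diffKey('area', 'area-1', 'width')); // 'area-width-area-1'
```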

File diff suppressed because it is too large


@@ -8,43 +8,36 @@ import type { RelationshipDiff } from './relationship-diff';
 import { createRelationshipDiffSchema } from './relationship-diff';
 import type { TableDiff } from './table-diff';
 import { createTableDiffSchema } from './table-diff';
-import type { AreaDiff } from './area-diff';
-import { createAreaDiffSchema } from './area-diff';
-import type { DBField, DBIndex, DBRelationship, DBTable, Area } from '..';
+import type { DBField, DBIndex, DBRelationship, DBTable } from '..';

 export type ChartDBDiff<
     TTable = DBTable,
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
 > =
     | TableDiff<TTable>
     | FieldDiff<TField>
     | IndexDiff<TIndex>
-    | RelationshipDiff<TRelationship>
-    | AreaDiff<TArea>;
+    | RelationshipDiff<TRelationship>;

 export const createChartDBDiffSchema = <
     TTable = DBTable,
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
 >(
     tableSchema: z.ZodType<TTable>,
     fieldSchema: z.ZodType<TField>,
     indexSchema: z.ZodType<TIndex>,
-    relationshipSchema: z.ZodType<TRelationship>,
-    areaSchema: z.ZodType<TArea>
-): z.ZodType<ChartDBDiff<TTable, TField, TIndex, TRelationship, TArea>> => {
+    relationshipSchema: z.ZodType<TRelationship>
+): z.ZodType<ChartDBDiff<TTable, TField, TIndex, TRelationship>> => {
     return z.union([
         createTableDiffSchema(tableSchema),
         createFieldDiffSchema(fieldSchema),
         createIndexDiffSchema(indexSchema),
         createRelationshipDiffSchema(relationshipSchema),
-        createAreaDiffSchema(areaSchema),
-    ]) as z.ZodType<ChartDBDiff<TTable, TField, TIndex, TRelationship, TArea>>;
+    ]) as z.ZodType<ChartDBDiff<TTable, TField, TIndex, TRelationship>>;
 };
export type DiffMap<
@@ -52,21 +45,18 @@ export type DiffMap<
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
-> = Map<string, ChartDBDiff<TTable, TField, TIndex, TRelationship, TArea>>;
+> = Map<string, ChartDBDiff<TTable, TField, TIndex, TRelationship>>;

 export type DiffObject<
     TTable = DBTable,
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
 > =
     | TableDiff<TTable>['object']
     | FieldDiff<TField>['object']
     | IndexDiff<TIndex>['object']
-    | RelationshipDiff<TRelationship>['object']
-    | AreaDiff<TArea>['object'];
+    | RelationshipDiff<TRelationship>['object'];
type ExtractDiffKind<T> = T extends { object: infer O; type: infer Type }
? T extends { attribute: infer A }
@@ -79,18 +69,16 @@ export type DiffKind<
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
-> = ExtractDiffKind<ChartDBDiff<TTable, TField, TIndex, TRelationship, TArea>>;
+> = ExtractDiffKind<ChartDBDiff<TTable, TField, TIndex, TRelationship>>;

 export const isDiffOfKind = <
     TTable = DBTable,
     TField = DBField,
     TIndex = DBIndex,
     TRelationship = DBRelationship,
-    TArea = Area,
 >(
-    diff: ChartDBDiff<TTable, TField, TIndex, TRelationship, TArea>,
-    kind: DiffKind<TTable, TField, TIndex, TRelationship, TArea>
+    diff: ChartDBDiff<TTable, TField, TIndex, TRelationship>,
+    kind: DiffKind<TTable, TField, TIndex, TRelationship>
 ): boolean => {
     if ('attribute' in kind) {
         return (


@@ -15,9 +15,7 @@ export type FieldDiffAttribute =
     | 'comments'
     | 'characterMaximumLength'
     | 'precision'
-    | 'scale'
-    | 'increment'
-    | 'isArray';
+    | 'scale';

 export const fieldDiffAttributeSchema: z.ZodType<FieldDiffAttribute> = z.union([
     z.literal('name'),


@@ -3,16 +3,13 @@ import type { DBTable } from '../db-table';
 export type TableDiffAttribute = keyof Pick<
     DBTable,
-    'name' | 'comments' | 'color' | 'x' | 'y' | 'width'
+    'name' | 'comments' | 'color'
 >;

 const tableDiffAttributeSchema: z.ZodType<TableDiffAttribute> = z.union([
     z.literal('name'),
     z.literal('comments'),
     z.literal('color'),
-    z.literal('x'),
-    z.literal('y'),
-    z.literal('width'),
 ]);

 export interface TableDiffChanged {
@@ -20,8 +17,8 @@ export interface TableDiffChanged {
     type: 'changed';
     tableId: string;
     attribute: TableDiffAttribute;
-    oldValue?: string | number | null;
-    newValue?: string | number | null;
+    oldValue?: string | null;
+    newValue?: string | null;
 }
export const TableDiffChangedSchema: z.ZodType<TableDiffChanged> = z.object({
@@ -29,8 +26,8 @@ export const TableDiffChangedSchema: z.ZodType<TableDiffChanged> = z.object({
     type: z.literal('changed'),
     tableId: z.string(),
     attribute: tableDiffAttributeSchema,
-    oldValue: z.union([z.string(), z.number(), z.null()]).optional(),
-    newValue: z.union([z.string(), z.number(), z.null()]).optional(),
+    oldValue: z.string().or(z.null()).optional(),
+    newValue: z.string().or(z.null()).optional(),
 });

 export interface TableDiffRemoved {


@@ -1,157 +0,0 @@
import { describe, it, expect } from 'vitest';
import { detectImportMethod } from '../detect-import-method';
describe('detectImportMethod', () => {
describe('DBML detection', () => {
it('should detect DBML with Table definition', () => {
const content = `Table users {
id int [pk]
name varchar
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with Ref definition', () => {
const content = `Table posts {
user_id int
}
Ref: posts.user_id > users.id`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with pk attribute', () => {
const content = `id integer [pk]`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with ref attribute', () => {
const content = `user_id int [ref: > users.id]`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with Enum definition', () => {
const content = `Enum status {
active
inactive
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with TableGroup', () => {
const content = `TableGroup commerce {
users
orders
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should detect DBML with Note', () => {
const content = `Note project_note {
'This is a note about the project'
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should prioritize DBML over SQL when both patterns exist', () => {
const content = `CREATE TABLE test (id int);
Table users {
id int [pk]
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
});
describe('SQL DDL detection', () => {
it('should detect CREATE TABLE statement', () => {
const content = `CREATE TABLE users (
id INT PRIMARY KEY,
name VARCHAR(255)
);`;
expect(detectImportMethod(content)).toBe('ddl');
});
it('should detect ALTER TABLE statement', () => {
const content = `ALTER TABLE users ADD COLUMN email VARCHAR(255);`;
expect(detectImportMethod(content)).toBe('ddl');
});
it('should detect DROP TABLE statement', () => {
const content = `DROP TABLE IF EXISTS users;`;
expect(detectImportMethod(content)).toBe('ddl');
});
it('should detect CREATE INDEX statement', () => {
const content = `CREATE INDEX idx_users_email ON users(email);`;
expect(detectImportMethod(content)).toBe('ddl');
});
it('should detect multiple DDL statements', () => {
const content = `CREATE TABLE users (id INT);
CREATE TABLE posts (id INT);
ALTER TABLE posts ADD CONSTRAINT fk_user FOREIGN KEY (user_id) REFERENCES users(id);`;
expect(detectImportMethod(content)).toBe('ddl');
});
it('should detect DDL case-insensitively', () => {
const content = `create table users (id int);`;
expect(detectImportMethod(content)).toBe('ddl');
});
});
describe('JSON detection', () => {
it('should detect JSON object', () => {
const content = `{
"tables": [],
"relationships": []
}`;
expect(detectImportMethod(content)).toBe('query');
});
it('should detect JSON array', () => {
const content = `[
{"name": "users"},
{"name": "posts"}
]`;
expect(detectImportMethod(content)).toBe('query');
});
it('should detect minified JSON', () => {
const content = `{"tables":[],"relationships":[]}`;
expect(detectImportMethod(content)).toBe('query');
});
it('should detect JSON with whitespace', () => {
const content = ` {
"data": true
} `;
expect(detectImportMethod(content)).toBe('query');
});
});
describe('edge cases', () => {
it('should return null for empty content', () => {
expect(detectImportMethod('')).toBeNull();
expect(detectImportMethod(' ')).toBeNull();
expect(detectImportMethod('\n\n')).toBeNull();
});
it('should return null for unrecognized content', () => {
const content = `This is just some random text
that doesn't match any pattern`;
expect(detectImportMethod(content)).toBeNull();
});
it('should handle content with special characters', () => {
const content = `Table users {
name varchar // Special chars: áéíóú
}`;
expect(detectImportMethod(content)).toBe('dbml');
});
it('should handle malformed JSON gracefully', () => {
const content = `{ "incomplete": `;
expect(detectImportMethod(content)).toBeNull();
});
});
});

Some files were not shown because too many files have changed in this diff.