
Introduce the db integration (prerelease) (#10201)

* Initial DB migration code

* chore: update package.json

* chore: update lockfile

* feat: add db as top-level config value

* Small package change

* Add a very basic test

* deps: remove unused

* chore: astro/db scripts, exports, deps

* chore: set tsconfig to monorepo defaults

* feat: MVP to use libsql in dev

* fix: test fixture paths

* fix: test file paths

* chore: remove CLI tests

* fix: add astro-scripts to db

* fix: package.json merge

* fix: load seed file separately for build

* feat: create db on filesystem for build

* fix: url test. It passes now!

* Squashed commit of the following:

commit acdddd728c56f25e42975db7f367ab8a998e8c41
Author: Princesseuh <3019731+Princesseuh@users.noreply.github.com>
Date:   Wed Jan 10 14:06:16 2024 -0500

    fix: proper type augment for the config

commit b41ca9aacf291d1e5f0a27b6d6339ce4fc608ec3
Author: Nate Moore <nate@astro.build>
Date:   Tue Jan 9 14:33:42 2024 -0600

    wip: type augmentation

* feat: data() fn with basic type safety

* wip: update from seed file to data()

* fix: bad collectionSchema data type

* refactor: move dev to use filesystem to run seed at right time

* chore: remove seed file logic

* docs: add basic README

* CLI sync command

* Runtime checking of writable

* docs: add join example

* Implement defineWritableCollection

* feat: use studio connection for studio builds

* fix: add defineWritableCollection export

* refactor: use getTableName() util

* feat(db): add support for pass-through `astro db` command

* chore: add error map

* fix: add drizzle import

* refactor: studio -> db cli

* feat: add ticketing example

* fix: bad types in astro config

* fix: remove invalid `data()` on writable collection

* fix: vite warning on load-astro-config

* wip: add seeding for readable collections (nonstable ids!)

* merge migration work into branch

* cleanup migration commands

* migrate seed data to new db push command

* add migrations table to db

* fix remote db bugs

* fix: warn writable collections when studio false

* chore: delete README contents (now on Notion)

* chore: remove blank README

* chore: add dev dependency on db

* Removed unused deps

* 0.1.0

* Add config-augment.d.ts to published artifacts

* Fixes merge issues with main

* fix: support promise response from data()

* feat: basic glob fixture

* Add a main field

* Give a help message when no db command is provided

* feat: `db push --token` for GitHub CI secrets

* fix getPackage for db package

* 0.1.2

* wip: build a table type

* chore: update lockfile

* chore: temporarily remove `Table` type

* feat: better Table object for type inference

* format

* add primaryKey support

* improve snapshot parsing support

* cleanup primary key support, db push

* add simple db shell

* cleanup old copy paste code

* feat: scrappy global data() prototype

* feat(test): recipes example

* fix: use Extract to narrow keyof to strings

* 0.1.3

* Create a runtime version of createRemoteDatabaseClient

* 0.1.4

* Grab the dbUrl from the environment

* 0.1.5

* Expose the database to the build output

* 0.1.6

* 0.1.7

* 0.1.15

* wip: data() -> set() concept

* fix: just infer insert keys for now

* refactor: rewrite to injected set() function

* deps: chokidar, drizzle

* feat: glob support with { db, table } signature

* chore: move basics to new data set

* refactor: set -> seed

* feat: expose Model.table

* refactor: clean up types

* feat: migrations now working!

* upgrade @libsql/client

* format

* expose relevant types

* 0.1.16

* feat: config design

* feat: add indexes from collectionToTable

* feat: add indexes to setupDbTables

* fix: remove unique constraint on recipeId

* Use an import statement to grab the database file

* 0.1.17

* Remove unused import

* Rename to ?fileurl

* 0.1.18

* feat: index migrations

* move migration logic to turso, add back sync support

* feat: add queries unit tests and fix related bugs

* refactor: move field queries to field-queries.test

* feat: index query tests

* refactor: reorganize the rat's nest of files

* Make the DB_URL be root relative

* Upgrade to latest libsql

* 0.1.19

* 0.1.20

* Make work in webcontainer

* 0.1.22

* Remove content database from the static build

* 0.1.23

* chore: remove `optional: true` from pk

* chore: disable console linting for CLI

* fix: remove `id` column from Table type

* chore: remove `AstroId` type

* fix(ex): add `id` col to ticketing

* 0.2.0

* 0.2.1

* add keywords

* 0.2.2

* feat: API shape

* feat: FINALLY collection and name attached

* refactor: move to arrow function signature

* fix: foreignKeys references signature

* chore: unused imports

* feat: add foreignkeys to create table

* chore: lint

* chore: enable foreign keys in local mode only

* refactor: objShallowEqual -> deep diff

* fix: correct `hasDefault` inference

* fix: correct type Config reference

* fix: respect primaryKey from hasDefault

* fix: mark data keys as optional until we have type inference

* improve conflict and data-loss handling

- moved prompts to db push
- moved prompt logic out of lower-level functions
- improved logic overall
- improved user-facing prompt messages

* improve error messaging around studio config missing

* make it more clear when remote vs. local db is in use

* fix bug in prompt logic

* feat: better field.x() types

* feat: better seed() types

* chore: remove `as any` on seed values

* feat: good enough return type on seed :)

* feat: defineData()

* fix: add back promptResponse injection

* fix: use schema.parse to resolve dates

* fix: correctly respect primary key on INSERT INTO

* add short-lived db tokens

* add help output

* add better token error logging

* fix studio tests

* add shortcut link command from studio web ui

* Add support for SQL defaults

You can now use sql`CURRENT_TIMESTAMP`, `NOW`, and a couple of other helpers to set defaults.
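
A minimal sketch of what this enables, assuming the prerelease field-based config API used elsewhere in this PR (`sql` and `NOW` are the helpers this commit names; the surrounding `defineCollection`/`field` shape is illustrative and may differ from the final API):

// db config sketch — helper names from this commit, table shape illustrative
import { defineCollection, field, sql, NOW } from '@astrojs/db';

const Comment = defineCollection({
  fields: {
    body: field.text(),
    // SQL-evaluated default: filled in by SQLite at INSERT time.
    published: field.date({ default: NOW }),
    // Raw SQL escape hatch for any other expression.
    updated: field.date({ default: sql`CURRENT_TIMESTAMP` }),
  },
});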

* chore: todo

* feat: ignore `optional` and `default` when pk is present

* refactor: type `false` instead of type `never`

* feat: prevent `optional` on text pk

* fix db URL import for windows

* fix: add back textField multiline

* fix: remove explicit AUTOINCREMENT on int pk

* feat(db-cli): clean up CLI logging, support --json flag for `astro db verify`, extract shared logic to a utility

* prepare to run seed on all db push commands

* chore: expose setMeta for unit testing

* feat(test): reference add and remove tests

* feat: add references checks to migrations

* feat: remove useForeignKey checks

* feat: add pragma when pushing migrations

* feat(test): foreignKeys

* fix: transform collection config to be JSON serializable

* refactor: _setMeta -> preprocess for `table`

* refactor: reference tests

* chore: remove console log

* fix: handle serialized SQL object correctly

* refactor: store raw sql instead

* seed on every push

* Move field schema onto a `schema` object

* Fix references test

* 0.3.0

* add default URLs to db package

* 0.3.1

* Fix input types

* fix: primaryKey type check

* 0.3.2

* fix: respect default in table types

* fix: avoid dropping tables on production seed

* fix: escape name on drop table

* feat: allow verify to mock migration file

* Handle unauthorized linking

* Fix verbiage of unauthorized link warning

* Add some color to the unauthorized message

* 0.3.3

* Improve the unauthorized error output

* 0.3.4

* fix: better error message

* Seed the Themes in build too

* Push skipped test

* chore: remove dead isJsonSerializable check

* fix: use `dateType` for dates (oops)

* refactor: clarify date coerce comment

* refactor: remove unused coerce

* chore: unskip date test

* feat: seed -> seedReturning

* refactor: throw when seeding writable in prod

* Add unsafeWritable option

* refactor: use FieldsConfig for Table generic

* chore: lint

* fix: use z.input for AstroConfigWithDB type

* fix: add defaults for boolean config options

* Support new CLI command structure

* Allow writable in the tests

* fix: handle defaults for safe type changes

* refactor: avoid selecting ['schema'] on input types

* 0.3.5

* Rename field->column, collection->table

* Rename collections->tables

* rename to defineReadableTable

* deps: upgrade ticketing-example

* fix: stray console.log

* deps: bump preact again

* chore: preact->react

* fix: parse params.event as number

* fix: correct event references

* Allow integrations to define schema

* fix: file-url plugin failure on repeated generateBundle() runs

* update url

* Cleanup

* Linting

* fix windows file permission issue

When running `astro dev`, the watcher would fail while trying to delete the `content.db` file due to a file permission error. This change makes the local DB client a disposable to allow cleanup after usage.
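
A minimal sketch of the disposable pattern described above, assuming a small wrapper around `@libsql/client` (the helper name is hypothetical; TypeScript 5.2's explicit resource management provides `Symbol.asyncDispose` and `await using`):

import { createClient, type Client } from '@libsql/client';

// Hypothetical wrapper: makes the local client async-disposable so the
// file handle on .astro/content.db is released before the dev watcher
// tries to delete the file (the Windows failure mode described above).
function createLocalDbClient(url: string): Client & AsyncDisposable {
  const client = createClient({ url });
  return Object.assign(client, {
    async [Symbol.asyncDispose]() {
      client.close();
    },
  });
}

// Usage sketch: the client closes when this scope exits, even on error.
// await using db = createLocalDbClient('file:.astro/content.db');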

* Formatting

* "fix" Symbol.dispose usage

---------

Co-authored-by: Nate Moore <nate@astro.build>
Co-authored-by: bholmesdev <hey@bholmes.dev>
Co-authored-by: Fred K. Schott <fkschott@gmail.com>
Co-authored-by: itsMapleLeaf <19603573+itsMapleLeaf@users.noreply.github.com>
Author: Matthew Phillips
Date: 2024-02-22 14:50:44 -05:00 (committed by GitHub)
Commit: 31a9f8469c (parent: 3411e05ee4)
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database)
71 changed files with 11439 additions and 81 deletions


@@ -122,6 +122,12 @@ module.exports = {
'no-console': 'off',
},
},
{
files: ['packages/db/**/cli/**/*.ts'],
rules: {
'no-console': 'off',
},
},
{
files: ['packages/astro/src/core/errors/errors-data.ts'],
rules: {


@@ -10,6 +10,9 @@ benchmark/results/
**/vendor
**/.vercel
# Short-term need to format
!packages/db/test/fixtures
# Directories
.github
.changeset


@@ -24,7 +24,6 @@ export async function db({ flags }: { flags: Arguments }) {
}
const { cli } = dbPackage;
const inlineConfig = flagsToAstroInlineConfig(flags);
const { astroConfig } = await resolveConfig(inlineConfig, 'build');


@@ -30,6 +30,7 @@ async function printAstroHelp() {
['add', 'Add an integration.'],
['build', 'Build your project and write it to disk.'],
['check', 'Check your project for errors.'],
['db', 'Manage your Astro database.'],
['dev', 'Start the development server.'],
['docs', 'Open documentation in your web browser.'],
['info', 'List info about your current Astro setup.'],
@@ -75,6 +76,10 @@ function resolveCommand(flags: yargs.Arguments): CLICommand {
'docs',
'db',
'info',
'login',
'logout',
'link',
'init',
]);
if (supportedCommands.has(cmd)) {
return cmd as CLICommand;
@@ -145,7 +150,11 @@ async function runCommand(cmd: string, flags: yargs.Arguments) {
await add(packages, { flags });
return;
}
case 'db': {
case 'db':
case 'login':
case 'logout':
case 'link':
case 'init': {
const { db } = await import('./db/index.js');
await db({ flags });
return;


@@ -6,6 +6,9 @@ import prompts from 'prompts';
import resolvePackage from 'resolve';
import whichPm from 'which-pm';
import { type Logger } from '../core/logger/core.js';
import { createRequire } from 'node:module';
import { sep } from 'node:path';
const require = createRequire(import.meta.url);
type GetPackageOptions = {
skipAsk?: boolean;
@@ -18,12 +21,20 @@ export async function getPackage<T>(
options: GetPackageOptions,
otherDeps: string[] = []
): Promise<T | undefined> {
let packageImport;
try {
// Custom resolution logic for @astrojs/db. Since it lives in our monorepo,
// the generic tryResolve() method doesn't work.
if (packageName === '@astrojs/db') {
const packageJsonLoc = require.resolve(packageName + '/package.json', {
paths: [options.cwd ?? process.cwd()],
});
const packageLoc = packageJsonLoc.replace(`package.json`, 'dist/index.js');
const packageImport = await import(packageLoc);
return packageImport as T;
}
await tryResolve(packageName, options.cwd ?? process.cwd());
// The `require.resolve` is required to avoid Node caching the failed `import`
packageImport = await import(packageName);
const packageImport = await import(packageName);
return packageImport as T;
} catch (e) {
logger.info(
null,
@@ -32,13 +43,12 @@
const result = await installPackage([packageName, ...otherDeps], options, logger);
if (result) {
packageImport = await import(packageName);
const packageImport = await import(packageName);
return packageImport;
} else {
return undefined;
}
}
return packageImport as T;
}
function tryResolve(packageName: string, cwd: string) {


@@ -13,7 +13,7 @@ function mergeConfigRecursively(
continue;
}
const existing = merged[key];
let existing = merged[key];
if (existing == null) {
merged[key] = value;
@@ -38,6 +38,14 @@
}
}
if (key === 'data' && rootPath === 'db') {
// db.data can be a function or an array of functions. When
// merging, make sure they become an array
if (!Array.isArray(existing) && !Array.isArray(value)) {
existing = [existing];
}
}
if (Array.isArray(existing) || Array.isArray(value)) {
merged[key] = [...arraify(existing ?? []), ...arraify(value ?? [])];
continue;
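
For context, a standalone sketch of the normalization this hunk performs, using the `arraify` helper visible in the surrounding code (the `DataFn` signature is illustrative):

// Sketch: two merged configs may each set `db.data` to a function or an
// array of functions. Wrapping a lone function in an array first lets the
// generic array-concat branch in the hunk combine them instead of
// attempting a broken object merge.
type DataFn = (...args: unknown[]) => unknown;

const arraify = <T>(value: T | T[]): T[] => (Array.isArray(value) ? value : [value]);

function mergeDbData(existing: DataFn | DataFn[], incoming: DataFn | DataFn[]): DataFn[] {
  return [...arraify(existing), ...arraify(incoming)];
}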

packages/db/CHANGELOG.md (new file)

packages/db/config-augment.d.ts (vendored, new file, +4 lines)

@@ -0,0 +1,4 @@
declare namespace Config {
type DBUserConfig = import('./dist/core/types.js').DBUserConfig;
export interface Database extends DBUserConfig {}
}
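
This augmentation is what lets a user's Astro config type-check a top-level `db` key. A hedged usage sketch (the `studio` and `unsafeWritable` flags appear elsewhere in this commit; the overall shape is illustrative):

// astro.config.mjs (sketch)
import { defineConfig } from 'astro/config';
import db from '@astrojs/db';

export default defineConfig({
  integrations: [db()],
  // Typed through the `Config.Database` interface declared above.
  db: {
    studio: true,
    unsafeWritable: false,
  },
});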

packages/db/index.d.ts (vendored, new file, +3 lines)

@@ -0,0 +1,3 @@
/// <reference types="./config-augment.d.ts" />
export * from './dist/index.js';
export { default } from './dist/index.js';

packages/db/package.json (new file, +81 lines)

@@ -0,0 +1,81 @@
{
"name": "@astrojs/db",
"version": "0.3.5",
"description": "",
"license": "MIT",
"type": "module",
"types": "./index.d.ts",
"author": "withastro",
"main": "./dist/index.js",
"exports": {
".": {
"types": "./index.d.ts",
"import": "./dist/index.js"
},
"./runtime": {
"types": "./dist/runtime/index.d.ts",
"import": "./dist/runtime/index.js"
},
"./runtime/drizzle": {
"types": "./dist/runtime/drizzle.d.ts",
"import": "./dist/runtime/drizzle.js"
},
"./package.json": "./package.json"
},
"typesVersions": {
"*": {
".": [
"./index.d.ts"
],
"runtime": [
"./dist/runtime/index.d.ts"
],
"runtime/drizzle": [
"./dist/runtime/drizzle.d.ts"
]
}
},
"files": [
"index.d.ts",
"config-augment.d.ts",
"dist"
],
"keywords": [
"withastro",
"astro-integration"
],
"scripts": {
"build": "astro-scripts build \"src/**/*.ts\" && tsc",
"build:ci": "astro-scripts build \"src/**/*.ts\"",
"dev": "astro-scripts dev \"src/**/*.ts\"",
"test": "mocha --exit --timeout 20000 \"test/*.js\" \"test/unit/**/*.js\"",
"test:match": "mocha --timeout 20000 \"test/*.js\" \"test/unit/*.js\" -g"
},
"dependencies": {
"@libsql/client": "^0.4.3",
"deep-diff": "^1.0.2",
"drizzle-orm": "^0.28.6",
"kleur": "^4.1.5",
"nanoid": "^5.0.1",
"open": "^10.0.3",
"ora": "^7.0.1",
"prompts": "^2.4.2",
"yargs-parser": "^21.1.1",
"zod": "^3.22.4"
},
"devDependencies": {
"@types/chai": "^4.3.6",
"@types/deep-diff": "^1.0.5",
"@types/diff": "^5.0.8",
"@types/mocha": "^10.0.2",
"@types/prompts": "^2.4.8",
"@types/yargs-parser": "^21.0.3",
"astro": "workspace:*",
"astro-scripts": "workspace:*",
"chai": "^4.3.10",
"cheerio": "1.0.0-rc.12",
"mocha": "^10.2.0",
"typescript": "^5.2.2",
"vite": "^4.4.11"
}
}


@@ -0,0 +1,43 @@
import type { AstroConfig } from 'astro';
import type { Arguments } from 'yargs-parser';
import { writeFile } from 'node:fs/promises';
import {
MIGRATIONS_CREATED,
MIGRATIONS_UP_TO_DATE,
getMigrationStatus,
initializeMigrationsDirectory,
} from '../../migrations.js';
import { getMigrationQueries } from '../../migration-queries.js';
import { bgRed, red, reset } from 'kleur/colors';
export async function cmd({ config }: { config: AstroConfig; flags: Arguments }) {
const migration = await getMigrationStatus(config);
if (migration.state === 'no-migrations-found') {
await initializeMigrationsDirectory(migration.currentSnapshot);
console.log(MIGRATIONS_CREATED);
return;
} else if (migration.state === 'up-to-date') {
console.log(MIGRATIONS_UP_TO_DATE);
return;
}
const { oldSnapshot, newSnapshot, newFilename, diff } = migration;
const { queries: migrationQueries, confirmations } = await getMigrationQueries({
oldSnapshot,
newSnapshot,
});
// Warn the user about any changes that lead to data-loss.
// When the user runs `db push`, they will be prompted to confirm these changes.
confirmations.map((message) => console.log(bgRed(' !!! ') + ' ' + red(message)));
const migrationFileContent = {
diff,
db: migrationQueries,
// TODO(fks): Encode the relevant data, instead of the raw message.
// This will give `db push` more control over the formatting of the message.
confirm: confirmations.map((c) => reset(c)),
};
const migrationFileName = `./migrations/${newFilename}`;
await writeFile(migrationFileName, JSON.stringify(migrationFileContent, undefined, 2));
console.log(migrationFileName + ' created!');
}


@@ -0,0 +1,79 @@
import type { AstroConfig } from 'astro';
import { mkdir, writeFile } from 'node:fs/promises';
import { bgRed, cyan } from 'kleur/colors';
import prompts from 'prompts';
import type { Arguments } from 'yargs-parser';
import { PROJECT_ID_FILE, getSessionIdFromFile } from '../../../tokens.js';
import { getAstroStudioUrl } from '../../../utils.js';
import { MISSING_SESSION_ID_ERROR } from '../../../errors.js';
export async function cmd({ flags }: { config: AstroConfig; flags: Arguments }) {
const linkUrl = new URL(getAstroStudioUrl() + '/auth/cli/link');
const sessionToken = await getSessionIdFromFile();
if (!sessionToken) {
console.error(MISSING_SESSION_ID_ERROR);
process.exit(1);
}
let body = { id: flags._[4] } as {
id?: string;
projectIdName?: string;
workspaceIdName?: string;
};
if (!body.id) {
const workspaceIdName = await promptWorkspaceName();
const projectIdName = await promptProjectName();
body = { projectIdName, workspaceIdName };
}
const response = await fetch(linkUrl, {
method: 'POST',
headers: {
Authorization: `Bearer ${await getSessionIdFromFile()}`,
'Content-Type': 'application/json',
},
body: JSON.stringify(body),
});
if (!response.ok) {
// Unauthorized
if (response.status === 401) {
console.error(
`${bgRed('Unauthorized')}\n\n Are you logged in?\n Run ${cyan(
'astro db login'
)} to authenticate and then try linking again.\n\n`
);
process.exit(1);
}
console.error(`Failed to link project: ${response.status} ${response.statusText}`);
process.exit(1);
}
const { data } = await response.json();
await mkdir(new URL('.', PROJECT_ID_FILE), { recursive: true });
await writeFile(PROJECT_ID_FILE, `${data.id}`);
console.info('Project linked.');
}
export async function promptProjectName(defaultName?: string): Promise<string> {
const { projectName } = await prompts({
type: 'text',
name: 'projectName',
message: 'Project ID',
initial: defaultName,
});
if (typeof projectName !== 'string') {
process.exit(0);
}
return projectName;
}
export async function promptWorkspaceName(defaultName?: string): Promise<string> {
const { workspaceName } = await prompts({
type: 'text',
name: 'workspaceName',
message: 'Workspace ID',
initial: defaultName,
});
if (typeof workspaceName !== 'string') {
process.exit(0);
}
return workspaceName;
}


@@ -0,0 +1,52 @@
import type { AstroConfig } from 'astro';
import { cyan } from 'kleur/colors';
import { mkdir, writeFile } from 'node:fs/promises';
import { createServer } from 'node:http';
import ora from 'ora';
import type { Arguments } from 'yargs-parser';
import { getAstroStudioUrl } from '../../../utils.js';
import open from 'open';
import { SESSION_LOGIN_FILE } from '../../../tokens.js';
function serveAndResolveSession(): Promise<string> {
let resolve: (value: string | PromiseLike<string>) => void,
reject: (value?: string | PromiseLike<string>) => void;
const sessionPromise = new Promise<string>((_resolve, _reject) => {
resolve = _resolve;
reject = _reject;
});
const server = createServer((req, res) => {
res.writeHead(200);
res.end();
const url = new URL(req.url ?? '/', `http://${req.headers.host}`);
const session = url.searchParams.get('session');
if (!session) {
reject();
} else {
resolve(session);
}
}).listen(5710, 'localhost');
return sessionPromise.finally(() => {
server.closeAllConnections();
server.close();
});
}
export async function cmd({ flags }: { config: AstroConfig; flags: Arguments }) {
let session = flags.session;
const loginUrl = getAstroStudioUrl() + '/auth/cli';
if (!session) {
console.log(`Opening ${cyan(loginUrl)} in your browser...`);
console.log(`If something goes wrong, copy-and-paste the URL into your browser.`);
open(loginUrl);
const spinner = ora('Waiting for confirmation...');
session = await serveAndResolveSession();
spinner.succeed('Successfully logged in!');
}
await mkdir(new URL('.', SESSION_LOGIN_FILE), { recursive: true });
await writeFile(SESSION_LOGIN_FILE, `${session}`);
}


@@ -0,0 +1,9 @@
import type { AstroConfig } from 'astro';
import { unlink } from 'node:fs/promises';
import type { Arguments } from 'yargs-parser';
import { SESSION_LOGIN_FILE } from '../../../tokens.js';
export async function cmd({}: { config: AstroConfig; flags: Arguments }) {
await unlink(SESSION_LOGIN_FILE);
console.log('Successfully logged out of Astro Studio.');
}


@@ -0,0 +1,264 @@
import { createClient, type InStatement } from '@libsql/client';
import type { AstroConfig } from 'astro';
import { drizzle as drizzleProxy } from 'drizzle-orm/sqlite-proxy';
import { drizzle as drizzleLibsql } from 'drizzle-orm/libsql';
import { SQLiteAsyncDialect } from 'drizzle-orm/sqlite-core';
import { red } from 'kleur/colors';
import prompts from 'prompts';
import type { Arguments } from 'yargs-parser';
import { recreateTables, seedData } from '../../../queries.js';
import { getManagedAppTokenOrExit } from '../../../tokens.js';
import { tablesSchema, type AstroConfigWithDB, type DBSnapshot } from '../../../types.js';
import { getRemoteDatabaseUrl } from '../../../utils.js';
import { getMigrationQueries } from '../../migration-queries.js';
import {
createEmptySnapshot,
getMigrations,
getMigrationStatus,
loadInitialSnapshot,
loadMigration,
MIGRATION_NEEDED,
MIGRATIONS_NOT_INITIALIZED,
MIGRATIONS_UP_TO_DATE,
} from '../../migrations.js';
import { MISSING_SESSION_ID_ERROR } from '../../../errors.js';
export async function cmd({ config, flags }: { config: AstroConfig; flags: Arguments }) {
const isDryRun = flags.dryRun;
const appToken = await getManagedAppTokenOrExit(flags.token);
const migration = await getMigrationStatus(config);
if (migration.state === 'no-migrations-found') {
console.log(MIGRATIONS_NOT_INITIALIZED);
process.exit(1);
} else if (migration.state === 'ahead') {
console.log(MIGRATION_NEEDED);
process.exit(1);
}
// get all migrations from the filesystem
const allLocalMigrations = await getMigrations();
let missingMigrations: string[] = [];
try {
const { data } = await prepareMigrateQuery({
migrations: allLocalMigrations,
appToken: appToken.token,
});
missingMigrations = data;
} catch (error) {
if (error instanceof Error) {
if (error.message.startsWith('{')) {
const { error: { code } = { code: '' } } = JSON.parse(error.message);
if (code === 'TOKEN_UNAUTHORIZED') {
console.error(MISSING_SESSION_ID_ERROR);
}
}
}
console.error(error);
process.exit(1);
}
// push the database schema
if (missingMigrations.length === 0) {
console.log(MIGRATIONS_UP_TO_DATE);
} else {
console.log(`Pushing ${missingMigrations.length} migrations...`);
await pushSchema({
migrations: missingMigrations,
appToken: appToken.token,
isDryRun,
currentSnapshot: migration.currentSnapshot,
});
}
// push the database seed data
console.info('Pushing data...');
await pushData({ config, appToken: appToken.token, isDryRun });
// cleanup and exit
await appToken.destroy();
console.info('Push complete!');
}
async function pushSchema({
migrations,
appToken,
isDryRun,
currentSnapshot,
}: {
migrations: string[];
appToken: string;
isDryRun: boolean;
currentSnapshot: DBSnapshot;
}) {
// load all missing migrations
const initialSnapshot = migrations.find((m) => m === '0000_snapshot.json');
const filteredMigrations = migrations.filter((m) => m !== '0000_snapshot.json');
const missingMigrationContents = await Promise.all(filteredMigrations.map(loadMigration));
// create a migration for the initial snapshot, if needed
const initialMigrationBatch = initialSnapshot
? (
await getMigrationQueries({
oldSnapshot: createEmptySnapshot(),
newSnapshot: await loadInitialSnapshot(),
})
).queries
: [];
// combine all missing migrations into a single batch
const confirmations = missingMigrationContents.reduce((acc, curr) => {
return [...acc, ...(curr.confirm || [])];
}, [] as string[]);
if (confirmations.length > 0) {
const response = await prompts([
...confirmations.map((message, index) => ({
type: 'confirm' as const,
name: String(index),
message: red('Warning: ') + message + '\nContinue?',
initial: true,
})),
]);
if (
Object.values(response).length === 0 ||
Object.values(response).some((value) => value === false)
) {
process.exit(1);
}
}
// combine all missing migrations into a single batch
const queries = missingMigrationContents.reduce((acc, curr) => {
return [...acc, ...curr.db];
}, initialMigrationBatch);
// apply the batch to the DB
await runMigrateQuery({ queries, migrations, snapshot: currentSnapshot, appToken, isDryRun });
}
const sqlite = new SQLiteAsyncDialect();
async function pushData({
config,
appToken,
isDryRun,
}: {
config: AstroConfigWithDB;
appToken: string;
isDryRun?: boolean;
}) {
const queries: InStatement[] = [];
if (config.db?.data) {
const libsqlClient = createClient({ url: ':memory:' });
// Stand up tables locally to mirror inserts.
// Needed to generate return values.
await recreateTables({
db: drizzleLibsql(libsqlClient),
tables: tablesSchema.parse(config.db.tables ?? {}),
});
for (const [collectionName, { writable }] of Object.entries(config.db.tables ?? {})) {
if (!writable) {
queries.push({
sql: `DELETE FROM ${sqlite.escapeName(collectionName)}`,
args: [],
});
}
}
// Use proxy to trace all queries to queue up in a batch.
const db = await drizzleProxy(async (sqlQuery, params, method) => {
const stmt: InStatement = { sql: sqlQuery, args: params };
queries.push(stmt);
// Use in-memory database to generate results for `returning()`.
const { rows } = await libsqlClient.execute(stmt);
const rowValues: unknown[][] = [];
for (const row of rows) {
if (row != null && typeof row === 'object') {
rowValues.push(Object.values(row));
}
}
if (method === 'get') {
return { rows: rowValues[0] };
}
return { rows: rowValues };
});
await seedData({
db,
mode: 'build',
data: config.db.data,
});
}
const url = new URL('/db/query', getRemoteDatabaseUrl());
if (isDryRun) {
console.info('[DRY RUN] Batch data seed:', JSON.stringify(queries, null, 2));
return new Response(null, { status: 200 });
}
return await fetch(url, {
method: 'POST',
headers: new Headers({
Authorization: `Bearer ${appToken}`,
}),
body: JSON.stringify(queries),
});
}
async function runMigrateQuery({
queries: baseQueries,
migrations,
snapshot,
appToken,
isDryRun,
}: {
queries: string[];
migrations: string[];
snapshot: DBSnapshot;
appToken: string;
isDryRun?: boolean;
}) {
const queries = ['pragma defer_foreign_keys=true;', ...baseQueries];
const requestBody = {
snapshot,
migrations,
sql: queries,
experimentalVersion: 1,
};
if (isDryRun) {
console.info('[DRY RUN] Batch query:', JSON.stringify(requestBody, null, 2));
return new Response(null, { status: 200 });
}
const url = new URL('/migrations/run', getRemoteDatabaseUrl());
return await fetch(url, {
method: 'POST',
headers: new Headers({
Authorization: `Bearer ${appToken}`,
}),
body: JSON.stringify(requestBody),
});
}
async function prepareMigrateQuery({
migrations,
appToken,
}: {
migrations: string[];
appToken: string;
}) {
const url = new URL('/migrations/prepare', getRemoteDatabaseUrl());
const requestBody = {
migrations,
experimentalVersion: 1,
};
const result = await fetch(url, {
method: 'POST',
headers: new Headers({
Authorization: `Bearer ${appToken}`,
}),
body: JSON.stringify(requestBody),
});
if (result.status >= 400) {
throw new Error(await result.text());
}
return await result.json();
}


@@ -0,0 +1,16 @@
import type { AstroConfig } from 'astro';
import { sql } from 'drizzle-orm';
import type { Arguments } from 'yargs-parser';
import { createRemoteDatabaseClient } from '../../../../runtime/db-client.js';
import { getManagedAppTokenOrExit } from '../../../tokens.js';
import { getRemoteDatabaseUrl } from '../../../utils.js';
export async function cmd({ flags }: { config: AstroConfig; flags: Arguments }) {
const query = flags.query;
const appToken = await getManagedAppTokenOrExit(flags.token);
const db = createRemoteDatabaseClient(appToken.token, getRemoteDatabaseUrl());
// Run the provided raw SQL query against the remote database.
const result = await db.run(sql.raw(query));
await appToken.destroy();
console.log(result);
}


@@ -0,0 +1,43 @@
import type { AstroConfig } from 'astro';
import type { Arguments } from 'yargs-parser';
import {
getMigrationStatus,
MIGRATION_NEEDED,
MIGRATIONS_NOT_INITIALIZED,
MIGRATIONS_UP_TO_DATE,
} from '../../migrations.js';
import { getMigrationQueries } from '../../migration-queries.js';
export async function cmd({ config, flags }: { config: AstroConfig; flags: Arguments }) {
const status = await getMigrationStatus(config);
const { state } = status;
if (flags.json) {
if (state === 'ahead') {
const { queries: migrationQueries } = await getMigrationQueries({
oldSnapshot: status.oldSnapshot,
newSnapshot: status.newSnapshot,
});
const newFileContent = {
diff: status.diff,
db: migrationQueries,
};
status.newFileContent = JSON.stringify(newFileContent, null, 2);
}
console.log(JSON.stringify(status));
process.exit(state === 'up-to-date' ? 0 : 1);
}
switch (state) {
case 'no-migrations-found': {
console.log(MIGRATIONS_NOT_INITIALIZED);
process.exit(1);
}
case 'ahead': {
console.log(MIGRATION_NEEDED);
process.exit(1);
}
case 'up-to-date': {
console.log(MIGRATIONS_UP_TO_DATE);
return;
}
}
}


@@ -0,0 +1,73 @@
import type { AstroConfig } from 'astro';
import type { Arguments } from 'yargs-parser';
import { STUDIO_CONFIG_MISSING_CLI_ERROR } from '../errors.js';
export async function cli({ flags, config }: { flags: Arguments; config: AstroConfig }) {
const args = flags._ as string[];
// Most commands are `astro db foo`, but for now login/logout
// are also handled by this package, so first check if this is a db command.
const command = args[2] === 'db' ? args[3] : args[2];
if (!config.db?.studio) {
console.log(STUDIO_CONFIG_MISSING_CLI_ERROR);
process.exit(1);
}
switch (command) {
case 'shell': {
const { cmd } = await import('./commands/shell/index.js');
return await cmd({ config, flags });
}
case 'gen':
case 'sync': {
const { cmd } = await import('./commands/gen/index.js');
return await cmd({ config, flags });
}
case 'push': {
const { cmd } = await import('./commands/push/index.js');
return await cmd({ config, flags });
}
case 'verify': {
const { cmd } = await import('./commands/verify/index.js');
return await cmd({ config, flags });
}
case 'login': {
const { cmd } = await import('./commands/login/index.js');
return await cmd({ config, flags });
}
case 'logout': {
const { cmd } = await import('./commands/logout/index.js');
return await cmd({ config, flags });
}
case 'link': {
const { cmd } = await import('./commands/link/index.js');
return await cmd({ config, flags });
}
default: {
if (command == null) {
console.error(`No command provided.
${showHelp()}`);
} else {
console.error(`Unknown command: ${command}
${showHelp()}`);
}
return;
}
}
function showHelp() {
return `astro db <command>
Usage:
astro login Authenticate your machine with Astro Studio
astro logout End your authenticated session with Astro Studio
astro link Link this directory to an Astro Studio project
astro db gen Creates a snapshot based on your schema
astro db push Pushes migrations to Astro Studio
astro db verify Verifies migrations have been pushed and errors if not`;
}
}


@@ -0,0 +1,554 @@
import * as color from 'kleur/colors';
import deepDiff from 'deep-diff';
import {
columnSchema,
type BooleanColumn,
type DBTable,
type DBTables,
type DBColumn,
type DBColumns,
type DBSnapshot,
type DateColumn,
type ColumnType,
type Indexes,
type JsonColumn,
type NumberColumn,
type TextColumn,
} from '../types.js';
import { SQLiteAsyncDialect } from 'drizzle-orm/sqlite-core';
import { customAlphabet } from 'nanoid';
import prompts from 'prompts';
import {
getCreateIndexQueries,
getCreateTableQuery,
getModifiers,
getReferencesConfig,
hasDefault,
schemaTypeToSqlType,
} from '../queries.js';
import { hasPrimaryKey } from '../../runtime/index.js';
import { isSerializedSQL } from '../../runtime/types.js';
const sqlite = new SQLiteAsyncDialect();
const genTempTableName = customAlphabet('abcdefghijklmnopqrstuvwxyz', 10);
/** Dependency injected for unit testing */
type AmbiguityResponses = {
collectionRenames: Record<string, string>;
columnRenames: {
[collectionName: string]: Record<string, string>;
};
};
export async function getMigrationQueries({
oldSnapshot,
newSnapshot,
ambiguityResponses,
}: {
oldSnapshot: DBSnapshot;
newSnapshot: DBSnapshot;
ambiguityResponses?: AmbiguityResponses;
}): Promise<{ queries: string[]; confirmations: string[] }> {
const queries: string[] = [];
const confirmations: string[] = [];
let added = getAddedCollections(oldSnapshot, newSnapshot);
let dropped = getDroppedCollections(oldSnapshot, newSnapshot);
if (!isEmpty(added) && !isEmpty(dropped)) {
const resolved = await resolveCollectionRenames(added, dropped, ambiguityResponses);
added = resolved.added;
dropped = resolved.dropped;
for (const { from, to } of resolved.renamed) {
const renameQuery = `ALTER TABLE ${sqlite.escapeName(from)} RENAME TO ${sqlite.escapeName(
to
)}`;
queries.push(renameQuery);
}
}
for (const [collectionName, collection] of Object.entries(added)) {
queries.push(getCreateTableQuery(collectionName, collection));
queries.push(...getCreateIndexQueries(collectionName, collection));
}
for (const [collectionName] of Object.entries(dropped)) {
const dropQuery = `DROP TABLE ${sqlite.escapeName(collectionName)}`;
queries.push(dropQuery);
}
for (const [collectionName, newCollection] of Object.entries(newSnapshot.schema)) {
const oldCollection = oldSnapshot.schema[collectionName];
if (!oldCollection) continue;
const result = await getCollectionChangeQueries({
collectionName,
oldCollection,
newCollection,
});
queries.push(...result.queries);
confirmations.push(...result.confirmations);
}
return { queries, confirmations };
}
export async function getCollectionChangeQueries({
collectionName,
oldCollection,
newCollection,
ambiguityResponses,
}: {
collectionName: string;
oldCollection: DBTable;
newCollection: DBTable;
ambiguityResponses?: AmbiguityResponses;
}): Promise<{ queries: string[]; confirmations: string[] }> {
const queries: string[] = [];
const confirmations: string[] = [];
const updated = getUpdatedColumns(oldCollection.columns, newCollection.columns);
let added = getAdded(oldCollection.columns, newCollection.columns);
let dropped = getDropped(oldCollection.columns, newCollection.columns);
/** Any foreign key changes require a full table recreate */
const hasForeignKeyChanges = Boolean(
deepDiff(oldCollection.foreignKeys, newCollection.foreignKeys)
);
if (!hasForeignKeyChanges && isEmpty(updated) && isEmpty(added) && isEmpty(dropped)) {
return {
queries: getChangeIndexQueries({
collectionName,
oldIndexes: oldCollection.indexes,
newIndexes: newCollection.indexes,
}),
confirmations,
};
}
if (!hasForeignKeyChanges && !isEmpty(added) && !isEmpty(dropped)) {
const resolved = await resolveColumnRenames(collectionName, added, dropped, ambiguityResponses);
added = resolved.added;
dropped = resolved.dropped;
queries.push(...getColumnRenameQueries(collectionName, resolved.renamed));
}
if (
!hasForeignKeyChanges &&
isEmpty(updated) &&
Object.values(dropped).every(canAlterTableDropColumn) &&
Object.values(added).every(canAlterTableAddColumn)
) {
queries.push(
...getAlterTableQueries(collectionName, added, dropped),
...getChangeIndexQueries({
collectionName,
oldIndexes: oldCollection.indexes,
newIndexes: newCollection.indexes,
})
);
return { queries, confirmations };
}
const dataLossCheck = canRecreateTableWithoutDataLoss(added, updated);
if (dataLossCheck.dataLoss) {
const { reason, columnName } = dataLossCheck;
const reasonMsgs: Record<DataLossReason, string> = {
'added-required': `New column ${color.bold(
collectionName + '.' + columnName
)} is required with no default value.\nThis requires deleting existing data in the ${color.bold(
collectionName
)} collection.`,
'added-unique': `New column ${color.bold(
collectionName + '.' + columnName
)} is marked as unique.\nThis requires deleting existing data in the ${color.bold(
collectionName
)} collection.`,
'updated-type': `Updated column ${color.bold(
collectionName + '.' + columnName
)} cannot convert data to new column data type.\nThis requires deleting existing data in the ${color.bold(
collectionName
)} collection.`,
};
confirmations.push(reasonMsgs[reason]);
}
const primaryKeyExists = Object.entries(newCollection.columns).find(([, column]) =>
hasPrimaryKey(column)
);
const droppedPrimaryKey = Object.entries(dropped).find(([, column]) => hasPrimaryKey(column));
const recreateTableQueries = getRecreateTableQueries({
collectionName,
newCollection,
added,
hasDataLoss: dataLossCheck.dataLoss,
migrateHiddenPrimaryKey: !primaryKeyExists && !droppedPrimaryKey,
});
queries.push(...recreateTableQueries, ...getCreateIndexQueries(collectionName, newCollection));
return { queries, confirmations };
}
function getChangeIndexQueries({
collectionName,
oldIndexes = {},
newIndexes = {},
}: {
collectionName: string;
oldIndexes?: Indexes;
newIndexes?: Indexes;
}) {
const added = getAdded(oldIndexes, newIndexes);
const dropped = getDropped(oldIndexes, newIndexes);
const updated = getUpdated(oldIndexes, newIndexes);
Object.assign(dropped, updated);
Object.assign(added, updated);
const queries: string[] = [];
for (const indexName of Object.keys(dropped)) {
const dropQuery = `DROP INDEX ${sqlite.escapeName(indexName)}`;
queries.push(dropQuery);
}
queries.push(...getCreateIndexQueries(collectionName, { indexes: added }));
return queries;
}
type Renamed = Array<{ from: string; to: string }>;
async function resolveColumnRenames(
collectionName: string,
mightAdd: DBColumns,
mightDrop: DBColumns,
ambiguityResponses?: AmbiguityResponses
): Promise<{ added: DBColumns; dropped: DBColumns; renamed: Renamed }> {
const added: DBColumns = {};
const dropped: DBColumns = {};
const renamed: Renamed = [];
for (const [columnName, column] of Object.entries(mightAdd)) {
let oldColumnName = ambiguityResponses
? ambiguityResponses.columnRenames[collectionName]?.[columnName] ?? '__NEW__'
: undefined;
if (!oldColumnName) {
const res = await prompts(
{
type: 'select',
name: 'columnName',
message:
'New column ' +
color.blue(color.bold(`${collectionName}.${columnName}`)) +
' detected. Was this renamed from an existing column?',
choices: [
{ title: 'New column (not renamed from existing)', value: '__NEW__' },
...Object.keys(mightDrop)
.filter((key) => !(key in renamed))
.map((key) => ({ title: key, value: key })),
],
},
{
onCancel: () => {
process.exit(1);
},
}
);
oldColumnName = res.columnName as string;
}
if (oldColumnName === '__NEW__') {
added[columnName] = column;
} else {
renamed.push({ from: oldColumnName, to: columnName });
}
}
for (const [droppedColumnName, droppedColumn] of Object.entries(mightDrop)) {
if (!renamed.find((r) => r.from === droppedColumnName)) {
dropped[droppedColumnName] = droppedColumn;
}
}
return { added, dropped, renamed };
}
async function resolveCollectionRenames(
mightAdd: DBTables,
mightDrop: DBTables,
ambiguityResponses?: AmbiguityResponses
): Promise<{ added: DBTables; dropped: DBTables; renamed: Renamed }> {
const added: DBTables = {};
const dropped: DBTables = {};
const renamed: Renamed = [];
for (const [collectionName, collection] of Object.entries(mightAdd)) {
let oldCollectionName = ambiguityResponses
? ambiguityResponses.collectionRenames[collectionName] ?? '__NEW__'
: undefined;
if (!oldCollectionName) {
const res = await prompts(
{
type: 'select',
name: 'collectionName',
message:
'New collection ' +
color.blue(color.bold(collectionName)) +
' detected. Was this renamed from an existing collection?',
choices: [
{ title: 'New collection (not renamed from existing)', value: '__NEW__' },
...Object.keys(mightDrop)
.filter((key) => !(key in renamed))
.map((key) => ({ title: key, value: key })),
],
},
{
onCancel: () => {
process.exit(1);
},
}
);
oldCollectionName = res.collectionName as string;
}
if (oldCollectionName === '__NEW__') {
added[collectionName] = collection;
} else {
renamed.push({ from: oldCollectionName, to: collectionName });
}
}
for (const [droppedCollectionName, droppedCollection] of Object.entries(mightDrop)) {
if (!renamed.find((r) => r.from === droppedCollectionName)) {
dropped[droppedCollectionName] = droppedCollection;
}
}
return { added, dropped, renamed };
}
function getAddedCollections(oldCollections: DBSnapshot, newCollections: DBSnapshot): DBTables {
const added: DBTables = {};
for (const [key, newCollection] of Object.entries(newCollections.schema)) {
if (!(key in oldCollections.schema)) added[key] = newCollection;
}
return added;
}
function getDroppedCollections(oldCollections: DBSnapshot, newCollections: DBSnapshot): DBTables {
const dropped: DBTables = {};
for (const [key, oldCollection] of Object.entries(oldCollections.schema)) {
if (!(key in newCollections.schema)) dropped[key] = oldCollection;
}
return dropped;
}
function getColumnRenameQueries(unescapedCollectionName: string, renamed: Renamed): string[] {
const queries: string[] = [];
const collectionName = sqlite.escapeName(unescapedCollectionName);
for (const { from, to } of renamed) {
const q = `ALTER TABLE ${collectionName} RENAME COLUMN ${sqlite.escapeName(
from
)} TO ${sqlite.escapeName(to)}`;
queries.push(q);
}
return queries;
}
/**
* Get ALTER TABLE queries to update the table schema. Assumes all added and dropped columns pass
 * `canAlterTableAddColumn` and `canAlterTableDropColumn` checks!
*/
function getAlterTableQueries(
unescapedCollectionName: string,
added: DBColumns,
dropped: DBColumns
): string[] {
const queries: string[] = [];
const collectionName = sqlite.escapeName(unescapedCollectionName);
for (const [unescColumnName, column] of Object.entries(added)) {
const columnName = sqlite.escapeName(unescColumnName);
const type = schemaTypeToSqlType(column.type);
const q = `ALTER TABLE ${collectionName} ADD COLUMN ${columnName} ${type}${getModifiers(
columnName,
column
)}`;
queries.push(q);
}
for (const unescColumnName of Object.keys(dropped)) {
const columnName = sqlite.escapeName(unescColumnName);
const q = `ALTER TABLE ${collectionName} DROP COLUMN ${columnName}`;
queries.push(q);
}
return queries;
}
function getRecreateTableQueries({
collectionName: unescCollectionName,
newCollection,
added,
hasDataLoss,
migrateHiddenPrimaryKey,
}: {
collectionName: string;
newCollection: DBTable;
added: Record<string, DBColumn>;
hasDataLoss: boolean;
migrateHiddenPrimaryKey: boolean;
}): string[] {
const unescTempName = `${unescCollectionName}_${genTempTableName()}`;
const tempName = sqlite.escapeName(unescTempName);
const collectionName = sqlite.escapeName(unescCollectionName);
if (hasDataLoss) {
return [
`DROP TABLE ${collectionName}`,
getCreateTableQuery(unescCollectionName, newCollection),
];
}
const newColumns = [...Object.keys(newCollection.columns)];
if (migrateHiddenPrimaryKey) {
newColumns.unshift('_id');
}
const escapedColumns = newColumns
.filter((i) => !(i in added))
.map((c) => sqlite.escapeName(c))
.join(', ');
return [
getCreateTableQuery(unescTempName, newCollection),
`INSERT INTO ${tempName} (${escapedColumns}) SELECT ${escapedColumns} FROM ${collectionName}`,
`DROP TABLE ${collectionName}`,
`ALTER TABLE ${tempName} RENAME TO ${collectionName}`,
];
}
function isEmpty(obj: Record<string, unknown>) {
return Object.keys(obj).length === 0;
}
/**
* ADD COLUMN is preferred for O(1) table updates, but is only supported for _some_ column
* definitions.
*
* @see https://www.sqlite.org/lang_altertable.html#alter_table_add_column
*/
function canAlterTableAddColumn(column: DBColumn) {
if (column.schema.unique) return false;
if (hasRuntimeDefault(column)) return false;
if (!column.schema.optional && !hasDefault(column)) return false;
if (hasPrimaryKey(column)) return false;
if (getReferencesConfig(column)) return false;
return true;
}
function canAlterTableDropColumn(column: DBColumn) {
if (column.schema.unique) return false;
if (hasPrimaryKey(column)) return false;
return true;
}
type DataLossReason = 'added-required' | 'added-unique' | 'updated-type';
type DataLossResponse =
| { dataLoss: false }
| { dataLoss: true; columnName: string; reason: DataLossReason };
function canRecreateTableWithoutDataLoss(
added: DBColumns,
updated: UpdatedColumns
): DataLossResponse {
for (const [columnName, a] of Object.entries(added)) {
if (hasPrimaryKey(a) && a.type !== 'number' && !hasDefault(a)) {
return { dataLoss: true, columnName, reason: 'added-required' };
}
if (!a.schema.optional && !hasDefault(a)) {
return { dataLoss: true, columnName, reason: 'added-required' };
}
if (!a.schema.optional && a.schema.unique) {
return { dataLoss: true, columnName, reason: 'added-unique' };
}
}
for (const [columnName, u] of Object.entries(updated)) {
if (u.old.type !== u.new.type && !canChangeTypeWithoutQuery(u.old, u.new)) {
return { dataLoss: true, columnName, reason: 'updated-type' };
}
}
return { dataLoss: false };
}
function getAdded<T>(oldObj: Record<string, T>, newObj: Record<string, T>) {
const added: Record<string, T> = {};
for (const [key, value] of Object.entries(newObj)) {
if (!(key in oldObj)) added[key] = value;
}
return added;
}
function getDropped<T>(oldObj: Record<string, T>, newObj: Record<string, T>) {
const dropped: Record<string, T> = {};
for (const [key, value] of Object.entries(oldObj)) {
if (!(key in newObj)) dropped[key] = value;
}
return dropped;
}
function getUpdated<T>(oldObj: Record<string, T>, newObj: Record<string, T>) {
const updated: Record<string, T> = {};
for (const [key, value] of Object.entries(newObj)) {
const oldValue = oldObj[key];
if (!oldValue) continue;
if (deepDiff(oldValue, value)) updated[key] = value;
}
return updated;
}
type UpdatedColumns = Record<string, { old: DBColumn; new: DBColumn }>;
function getUpdatedColumns(oldColumns: DBColumns, newColumns: DBColumns): UpdatedColumns {
const updated: UpdatedColumns = {};
for (const [key, newColumn] of Object.entries(newColumns)) {
let oldColumn = oldColumns[key];
if (!oldColumn) continue;
if (oldColumn.type !== newColumn.type && canChangeTypeWithoutQuery(oldColumn, newColumn)) {
// If we can safely change the type without a query,
// try parsing the old schema as the new schema.
// This lets us diff the columns as if they were the same type.
const asNewColumn = columnSchema.safeParse({
type: newColumn.type,
schema: oldColumn.schema,
});
if (asNewColumn.success) {
oldColumn = asNewColumn.data;
}
// If parsing fails, move on to the standard diff.
}
const diff = deepDiff(oldColumn, newColumn);
if (diff) {
updated[key] = { old: oldColumn, new: newColumn };
}
}
return updated;
}
const typeChangesWithoutQuery: Array<{ from: ColumnType; to: ColumnType }> = [
{ from: 'boolean', to: 'number' },
{ from: 'date', to: 'text' },
{ from: 'json', to: 'text' },
];
function canChangeTypeWithoutQuery(oldColumn: DBColumn, newColumn: DBColumn) {
return typeChangesWithoutQuery.some(
({ from, to }) => oldColumn.type === from && newColumn.type === to
);
}
// Using `DBColumn` will not narrow `default` based on the column `type`
// Handle each column separately
type WithDefaultDefined<T extends DBColumn> = T & Required<Pick<T['schema'], 'default'>>;
type DBColumnWithDefault =
| WithDefaultDefined<TextColumn>
| WithDefaultDefined<DateColumn>
| WithDefaultDefined<NumberColumn>
| WithDefaultDefined<BooleanColumn>
| WithDefaultDefined<JsonColumn>;
function hasRuntimeDefault(column: DBColumn): column is DBColumnWithDefault {
return !!(column.schema.default && isSerializedSQL(column.schema.default));
}


@@ -0,0 +1,140 @@
import deepDiff from 'deep-diff';
import { mkdir, readFile, readdir, writeFile } from 'fs/promises';
import { tablesSchema, type DBSnapshot } from '../types.js';
import type { AstroConfig } from 'astro';
import { cyan, green, yellow } from 'kleur/colors';
const { applyChange, diff: generateDiff } = deepDiff;
export type MigrationStatus =
| {
state: 'no-migrations-found';
currentSnapshot: DBSnapshot;
}
| {
state: 'ahead';
oldSnapshot: DBSnapshot;
newSnapshot: DBSnapshot;
diff: deepDiff.Diff<DBSnapshot, DBSnapshot>[];
newFilename: string;
summary: string;
newFileContent?: string;
}
| {
state: 'up-to-date';
currentSnapshot: DBSnapshot;
};
export async function getMigrationStatus(config: AstroConfig): Promise<MigrationStatus> {
const currentSnapshot = createCurrentSnapshot(config);
const allMigrationFiles = await getMigrations();
if (allMigrationFiles.length === 0) {
return {
state: 'no-migrations-found',
currentSnapshot,
};
}
const previousSnapshot = await initializeFromMigrations(allMigrationFiles);
const diff = generateDiff(previousSnapshot, currentSnapshot);
if (diff) {
const n = getNewMigrationNumber(allMigrationFiles);
const newFilename = `${String(n + 1).padStart(4, '0')}_migration.json`;
return {
state: 'ahead',
oldSnapshot: previousSnapshot,
newSnapshot: currentSnapshot,
diff,
newFilename,
summary: generateDiffSummary(diff),
};
}
return {
state: 'up-to-date',
currentSnapshot,
};
}
export const MIGRATIONS_CREATED = `${green(
'■ Migrations initialized!'
)}\n\n To execute your migrations, run\n ${cyan('astro db push')}`;
export const MIGRATIONS_UP_TO_DATE = `${green(
'■ No migrations needed!'
)}\n\n Your database is up to date.\n`;
export const MIGRATIONS_NOT_INITIALIZED = `${yellow(
'▶ No migrations found!'
)}\n\n To scaffold your migrations folder, run\n ${cyan('astro db sync')}\n`;
export const MIGRATION_NEEDED = `${yellow(
'▶ Changes detected!'
)}\n\n To create the necessary migration file, run\n ${cyan('astro db sync')}\n`;
function generateDiffSummary(diff: deepDiff.Diff<DBSnapshot, DBSnapshot>[]) {
// TODO: human readable summary
return JSON.stringify(diff, null, 2);
}
function getNewMigrationNumber(allMigrationFiles: string[]): number {
const len = allMigrationFiles.length - 1;
return allMigrationFiles.reduce((acc, curr) => {
const num = Number.parseInt(curr.split('_')[0] ?? len, 10);
return num > acc ? num : acc;
}, 0);
}
export async function getMigrations(): Promise<string[]> {
const migrationFiles = await readdir('./migrations').catch((err) => {
if (err.code === 'ENOENT') {
return [];
}
throw err;
});
return migrationFiles;
}
export async function loadMigration(
migration: string
): Promise<{ diff: any[]; db: string[]; confirm?: string[] }> {
return JSON.parse(await readFile(`./migrations/${migration}`, 'utf-8'));
}
export async function loadInitialSnapshot(): Promise<DBSnapshot> {
const snapshot = JSON.parse(await readFile('./migrations/0000_snapshot.json', 'utf-8'));
// `experimentalVersion: 1` -- added the version column
if (snapshot.experimentalVersion === 1) {
return snapshot;
}
// `experimentalVersion: 0` -- initial format
if (!snapshot.schema) {
return { experimentalVersion: 1, schema: snapshot };
}
throw new Error('Invalid snapshot format');
}
export async function initializeMigrationsDirectory(currentSnapshot: DBSnapshot) {
await mkdir('./migrations', { recursive: true });
await writeFile('./migrations/0000_snapshot.json', JSON.stringify(currentSnapshot, undefined, 2));
}
export async function initializeFromMigrations(allMigrationFiles: string[]): Promise<DBSnapshot> {
const prevSnapshot = await loadInitialSnapshot();
for (const migration of allMigrationFiles) {
if (migration === '0000_snapshot.json') continue;
const migrationContent = await loadMigration(migration);
migrationContent.diff.forEach((change: any) => {
applyChange(prevSnapshot, {}, change);
});
}
return prevSnapshot;
}
export function createCurrentSnapshot(config: AstroConfig): DBSnapshot {
// Parse to resolve non-serializable types like () => references
const tablesConfig = tablesSchema.parse(config.db?.tables ?? {});
const schema = JSON.parse(JSON.stringify(tablesConfig));
return { experimentalVersion: 1, schema };
}
export function createEmptySnapshot(): DBSnapshot {
return { experimentalVersion: 1, schema: {} };
}


@@ -0,0 +1,14 @@
import { readFileSync } from 'node:fs';
export const PACKAGE_NAME = JSON.parse(
readFileSync(new URL('../../package.json', import.meta.url), 'utf8')
).name;
export const RUNTIME_IMPORT = JSON.stringify(`${PACKAGE_NAME}/runtime`);
export const RUNTIME_DRIZZLE_IMPORT = JSON.stringify(`${PACKAGE_NAME}/runtime/drizzle`);
export const DB_TYPES_FILE = 'db-types.d.ts';
export const VIRTUAL_MODULE_ID = 'astro:db';
export const DB_PATH = '.astro/content.db';
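
These constants back the `astro:db` virtual module that pages import from. A hedged sketch of typical usage (the `db` client export matches the drizzle-based runtime in this PR; the `Recipe` table comes from the recipes test fixture and is illustrative):

// src/pages/index.astro frontmatter (sketch)
import { db, Recipe } from 'astro:db';

// drizzle-orm query builder against the local .astro/content.db in dev,
// or the remote Studio database in a studio build.
const recipes = await db.select().from(Recipe);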


@@ -0,0 +1,43 @@
import { cyan, bold, red, green, yellow } from 'kleur/colors';
export const MISSING_SESSION_ID_ERROR = `${red('▶ Login required!')}
To authenticate with Astro Studio, run
${cyan('astro db login')}\n`;
export const MISSING_PROJECT_ID_ERROR = `${red('▶ Directory not linked.')}
To link this directory to an Astro Studio project, run
${cyan('astro db link')}\n`;
export const STUDIO_CONFIG_MISSING_WRITABLE_COLLECTIONS_ERROR = (collectionName: string) => `${red(
`▶ Writable collection ${bold(collectionName)} requires Astro Studio or the ${yellow(
'unsafeWritable'
)} option.`
)}
Visit ${cyan('https://astro.build/studio')} to create your account
and set ${green('studio: true')} in your astro.config.mjs file to enable Studio.\n`;
export const UNSAFE_WRITABLE_WARNING = `${yellow(
'unsafeWritable'
)} option is enabled and you are using writable tables.
Redeploying your app may result in wiping away your database.
I hope you know what you are doing.\n`;
export const STUDIO_CONFIG_MISSING_CLI_ERROR = `${red('▶ This command requires Astro Studio.')}
Visit ${cyan('https://astro.build/studio')} to create your account
and set ${green('studio: true')} in your astro.config.mjs file to enable Studio.\n`;
export const MIGRATIONS_NOT_INITIALIZED = `${yellow(
'▶ No migrations found!'
)}\n\n To scaffold your migrations folder, run\n ${cyan('astro db sync')}\n`;
export const SEED_WRITABLE_IN_PROD_ERROR = (collectionName: string) => {
return `${red(
`Writable tables should not be seeded in production with data().`
)} You can seed ${bold(
collectionName
)} in development mode only using the "mode" flag. See the docs for more: https://www.notion.so/astroinc/astrojs-db-README-dcf6fa10de9a4f528be56cee96e8c054?pvs=4#278aed3fc37e4cec80240d1552ff6ac5`;
};


@@ -0,0 +1,104 @@
/**
* This is a modified version of Astro's error map. source:
* https://github.com/withastro/astro/blob/main/packages/astro/src/content/error-map.ts
*/
import type { z } from 'astro/zod';
interface TypeOrLiteralErrByPathEntry {
code: 'invalid_type' | 'invalid_literal';
received: unknown;
expected: unknown[];
}
export const errorMap: z.ZodErrorMap = (baseError, ctx) => {
const baseErrorPath = flattenErrorPath(baseError.path);
if (baseError.code === 'invalid_union') {
// Optimization: Combine type and literal errors for keys that are common across ALL union types
// Ex. a union between `{ key: z.literal('tutorial') }` and `{ key: z.literal('blog') }` will
// raise a single error when `key` does not match:
// > Did not match union.
// > key: Expected `'tutorial' | 'blog'`, received 'foo'
const typeOrLiteralErrByPath = new Map<string, TypeOrLiteralErrByPathEntry>();
for (const unionError of baseError.unionErrors.flatMap((e) => e.errors)) {
if (unionError.code === 'invalid_type' || unionError.code === 'invalid_literal') {
const flattenedErrorPath = flattenErrorPath(unionError.path);
const typeOrLiteralErr = typeOrLiteralErrByPath.get(flattenedErrorPath);
if (typeOrLiteralErr) {
typeOrLiteralErr.expected.push(unionError.expected);
} else {
typeOrLiteralErrByPath.set(flattenedErrorPath, {
code: unionError.code,
received: (unionError as any).received,
expected: [unionError.expected],
});
}
}
}
const messages: string[] = [
prefix(
baseErrorPath,
typeOrLiteralErrByPath.size ? 'Did not match union:' : 'Did not match union.'
),
];
return {
message: messages
.concat(
[...typeOrLiteralErrByPath.entries()]
// If type or literal error isn't common to ALL union types,
// filter it out. Can lead to confusing noise.
.filter(([, error]) => error.expected.length === baseError.unionErrors.length)
.map(([key, error]) =>
// Avoid printing the key again if it's a base error
key === baseErrorPath
? `> ${getTypeOrLiteralMsg(error)}`
: `> ${prefix(key, getTypeOrLiteralMsg(error))}`
)
)
.join('\n'),
};
}
if (baseError.code === 'invalid_literal' || baseError.code === 'invalid_type') {
return {
message: prefix(
baseErrorPath,
getTypeOrLiteralMsg({
code: baseError.code,
received: (baseError as any).received,
expected: [baseError.expected],
})
),
};
} else if (baseError.message) {
return { message: prefix(baseErrorPath, baseError.message) };
} else {
return { message: prefix(baseErrorPath, ctx.defaultError) };
}
};
const getTypeOrLiteralMsg = (error: TypeOrLiteralErrByPathEntry): string => {
if (error.received === 'undefined') return 'Required';
const expectedDeduped = new Set(error.expected);
switch (error.code) {
case 'invalid_type':
return `Expected type \`${unionExpectedVals(expectedDeduped)}\`, received ${JSON.stringify(
error.received
)}`;
case 'invalid_literal':
return `Expected \`${unionExpectedVals(expectedDeduped)}\`, received ${JSON.stringify(
error.received
)}`;
}
};
const prefix = (key: string, msg: string) => (key.length ? `**${key}**: ${msg}` : msg);
const unionExpectedVals = (expectedVals: Set<unknown>) =>
[...expectedVals]
.map((expectedVal, idx) => {
if (idx === 0) return JSON.stringify(expectedVal);
const sep = ' | ';
return `${sep}${JSON.stringify(expectedVal)}`;
})
.join('');
const flattenErrorPath = (errorPath: Array<string | number>) => errorPath.join('.');
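// Example (illustrative): parsing `{ key: 'foo' }` against a union of
// `{ key: z.literal('tutorial') }` and `{ key: z.literal('blog') }` with
// `schema.parse(input, { errorMap })` produces a message like:
//   Did not match union:
//   > **key**: Expected `"tutorial" | "blog"`, received "foo"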

@@ -0,0 +1,95 @@
import type { AstroConfig, AstroIntegration } from 'astro';
import type { VitePlugin } from '../utils.js';
import fs from 'node:fs';
import { pathToFileURL } from 'node:url';
import path from 'node:path';
async function copyFile(toDir: URL, fromUrl: URL, toUrl: URL) {
	// Despite the name, this moves the file: create the target directory, then rename.
	await fs.promises.mkdir(toDir, { recursive: true });
	await fs.promises.rename(fromUrl, toUrl);
}
export function fileURLIntegration(): AstroIntegration {
const fileNames: string[] = [];
function createVitePlugin(command: 'build' | 'preview' | 'dev'): VitePlugin {
let referenceIds: string[] = [];
return {
name: '@astrojs/db/file-url',
enforce: 'pre',
async load(id) {
if (id.endsWith('?fileurl')) {
const filePath = id.slice(0, id.indexOf('?'));
if (command === 'build') {
const data = await fs.promises.readFile(filePath);
const name = path.basename(filePath);
const referenceId = this.emitFile({
name,
source: data,
type: 'asset',
});
referenceIds.push(referenceId);
return `export default import.meta.ROLLUP_FILE_URL_${referenceId};`;
}
// dev mode
else {
return `export default new URL(${JSON.stringify(pathToFileURL(filePath).toString())})`;
}
}
},
generateBundle() {
// Save file names so we can copy them back over.
for (const referenceId of referenceIds) {
fileNames.push(this.getFileName(referenceId));
}
// Reset `referenceIds` for later generateBundle() runs.
// Prevents lookup for ids that have already been copied.
referenceIds = [];
},
};
}
let config: AstroConfig;
return {
name: '@astrojs/db/file-url',
hooks: {
'astro:config:setup'({ updateConfig, command }) {
updateConfig({
vite: {
plugins: [createVitePlugin(command)],
},
});
},
'astro:config:done': ({ config: _config }) => {
config = _config;
},
async 'astro:build:done'() {
if (config.output === 'static') {
// Delete the files since they are only used for the build process.
const unlinks: Promise<unknown>[] = [];
for (const fileName of fileNames) {
const url = new URL(fileName, config.outDir);
unlinks.push(fs.promises.unlink(url));
}
await Promise.all(unlinks);
const assetDir = new URL(config.build.assets, config.outDir);
const assetFiles = await fs.promises.readdir(assetDir);
if (!assetFiles.length) {
// Directory is empty, delete it.
await fs.promises.rmdir(assetDir);
}
} else {
// Move files back over to the dist output path
const moves: Promise<unknown>[] = [];
for (const fileName of fileNames) {
const fromUrl = new URL(fileName, config.build.client);
const toUrl = new URL(fileName, config.build.server);
const toDir = new URL('./', toUrl);
moves.push(copyFile(toDir, fromUrl, toUrl));
}
await Promise.all(moves);
}
},
},
};
}
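// Example (illustrative): importing a file with the `?fileurl` query. In dev this
// resolves to a `file://` URL for the on-disk path; in build, Rollup emits the file
// as an asset and the import resolves to the emitted asset's URL:
//   import dbUrl from './content.db?fileurl';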

@@ -0,0 +1,139 @@
import type { AstroIntegration } from 'astro';
import { vitePluginDb } from './vite-plugin-db.js';
import { vitePluginInjectEnvTs } from './vite-plugin-inject-env-ts.js';
import { typegen } from './typegen.js';
import { existsSync } from 'fs';
import { mkdir, rm, writeFile } from 'fs/promises';
import { DB_PATH } from '../consts.js';
import { createLocalDatabaseClient } from '../../runtime/db-client.js';
import { astroConfigWithDbSchema, type DBTables } from '../types.js';
import { type VitePlugin } from '../utils.js';
import {
STUDIO_CONFIG_MISSING_WRITABLE_COLLECTIONS_ERROR,
UNSAFE_WRITABLE_WARNING,
} from '../errors.js';
import { errorMap } from './error-map.js';
import { dirname } from 'path';
import { fileURLToPath } from 'url';
import { blue, yellow } from 'kleur/colors';
import { fileURLIntegration } from './file-url.js';
import { recreateTables, seedData } from '../queries.js';
import { getManagedAppTokenOrExit, type ManagedAppToken } from '../tokens.js';
function astroDBIntegration(): AstroIntegration {
let connectedToRemote = false;
let appToken: ManagedAppToken | undefined;
let schemas = {
tables(): DBTables {
throw new Error('tables not found');
},
};
let command: 'dev' | 'build' | 'preview';
return {
name: 'astro:db',
hooks: {
'astro:config:setup': async ({ updateConfig, config, command: _command, logger }) => {
command = _command;
if (_command === 'preview') return;
let dbPlugin: VitePlugin | undefined = undefined;
const studio = config.db?.studio ?? false;
if (studio && command === 'build' && process.env.ASTRO_DB_TEST_ENV !== '1') {
appToken = await getManagedAppTokenOrExit();
connectedToRemote = true;
dbPlugin = vitePluginDb({
connectToStudio: true,
appToken: appToken.token,
schemas,
root: config.root,
});
} else {
dbPlugin = vitePluginDb({
connectToStudio: false,
schemas,
root: config.root,
});
}
updateConfig({
vite: {
assetsInclude: [DB_PATH],
plugins: [dbPlugin, vitePluginInjectEnvTs(config, logger)],
},
});
},
'astro:config:done': async ({ config, logger }) => {
// TODO: refine where we load tables
// @matthewp: may want to load tables by path at runtime
const configWithDb = astroConfigWithDbSchema.parse(config, { errorMap });
const tables = configWithDb.db?.tables ?? {};
			// Redefine schemas.tables so the Vite plugin can access the parsed tables
schemas.tables = () => tables;
const studio = configWithDb.db?.studio ?? false;
const unsafeWritable = Boolean(configWithDb.db?.unsafeWritable);
const foundWritableCollection = Object.entries(tables).find(([, c]) => c.writable);
const writableAllowed = studio || unsafeWritable;
if (!writableAllowed && foundWritableCollection) {
logger.error(
STUDIO_CONFIG_MISSING_WRITABLE_COLLECTIONS_ERROR(foundWritableCollection[0])
);
process.exit(1);
}
			// The user opted into writable tables with the unsafe flag.
			// Warn them so they understand the risk.
else if (unsafeWritable && foundWritableCollection) {
logger.warn(UNSAFE_WRITABLE_WARNING);
}
if (!connectedToRemote) {
const dbUrl = new URL(DB_PATH, config.root);
if (existsSync(dbUrl)) {
await rm(dbUrl);
}
await mkdir(dirname(fileURLToPath(dbUrl)), { recursive: true });
await writeFile(dbUrl, '');
using db = await createLocalDatabaseClient({
tables,
dbUrl: dbUrl.toString(),
seeding: true,
});
await recreateTables({ db, tables });
if (configWithDb.db?.data) {
await seedData({
db,
data: configWithDb.db.data,
logger,
mode: command === 'dev' ? 'dev' : 'build',
});
}
logger.debug('Database setup complete.');
}
await typegen({ tables, root: config.root });
},
'astro:server:start': async ({ logger }) => {
// Wait for the server startup to log, so that this can come afterwards.
setTimeout(() => {
logger.info(
connectedToRemote ? 'Connected to remote database.' : 'New local database created.'
);
}, 100);
},
'astro:build:start': async ({ logger }) => {
logger.info(
					'database: ' + (connectedToRemote ? yellow('remote') : blue('local'))
);
},
'astro:build:done': async ({}) => {
await appToken?.destroy();
},
},
};
}
export function integration(): AstroIntegration[] {
return [astroDBIntegration(), fileURLIntegration()];
}

@@ -0,0 +1,92 @@
import fs from 'node:fs';
import path from 'node:path';
import { pathToFileURL } from 'node:url';
import { bold, red } from 'kleur/colors';
import { type ViteDevServer, createServer } from 'vite';
/**
* Pulled from the mothership, Astro core
*
* @see https://github.com/withastro/astro/blob/main/packages/astro/src/core/config/config.ts#L121
*/
export async function loadAstroConfig(root: string): Promise<Record<string, unknown>> {
const configPath = search(root);
if (!configPath) return {};
// Create a vite server to load the config
try {
return await loadConfigWithVite(configPath);
} catch (e) {
// Config errors should bypass log level as it breaks startup
// eslint-disable-next-line no-console
console.error(`${bold(red('[astro]'))} Unable to load Astro config.\n`);
throw e;
}
}
function search(root: string) {
const paths = [
'astro.config.mjs',
'astro.config.js',
'astro.config.ts',
'astro.config.mts',
'astro.config.cjs',
'astro.config.cts',
].map((p) => path.join(root, p));
for (const file of paths) {
if (fs.existsSync(file)) {
return file;
}
}
}
async function loadConfigWithVite(configPath: string): Promise<Record<string, unknown>> {
if (/\.[cm]?js$/.test(configPath)) {
try {
const config = await import(
/* @vite-ignore */ pathToFileURL(configPath).toString() + '?t=' + Date.now()
);
return config.default ?? {};
} catch (e) {
// We do not need to throw the error here as we have a Vite fallback below
}
}
// Try Loading with Vite
let server: ViteDevServer | undefined;
try {
server = await createViteServer();
const mod = await server.ssrLoadModule(configPath, { fixStacktrace: true });
return mod.default ?? {};
} finally {
if (server) {
await server.close();
}
}
}
async function createViteServer(): Promise<ViteDevServer> {
const viteServer = await createServer({
server: { middlewareMode: true, hmr: false, watch: { ignored: ['**'] } },
optimizeDeps: { disabled: true },
clearScreen: false,
appType: 'custom',
ssr: {
// NOTE: Vite doesn't externalize linked packages by default. During testing locally,
// these dependencies trip up Vite's dev SSR transform. Awaiting upstream feature:
// https://github.com/vitejs/vite/pull/10939
external: [
'@astrojs/tailwind',
'@astrojs/mdx',
'@astrojs/react',
'@astrojs/preact',
'@astrojs/sitemap',
'@astrojs/markdoc',
],
},
});
return viteServer;
}
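// Example (illustrative):
//   const config = await loadAstroConfig(process.cwd());
//   // -> the config file's default export, or {} when no astro.config.* is found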

@@ -0,0 +1,46 @@
import { existsSync } from 'node:fs';
import { mkdir, writeFile } from 'node:fs/promises';
import type { DBTable, DBTables } from '../types.js';
import { DB_TYPES_FILE, RUNTIME_DRIZZLE_IMPORT, RUNTIME_IMPORT } from '../consts.js';
export async function typegen({ tables, root }: { tables: DBTables; root: URL }) {
const content = `// This file is generated by \`astro db sync\`
declare module 'astro:db' {
export const db: import(${RUNTIME_IMPORT}).SqliteDB;
export const dbUrl: string;
export * from ${RUNTIME_DRIZZLE_IMPORT};
${Object.entries(tables)
.map(([name, collection]) => generateTableType(name, collection))
.join('\n')}
}
`;
const dotAstroDir = new URL('.astro/', root);
if (!existsSync(dotAstroDir)) {
await mkdir(dotAstroDir);
}
await writeFile(new URL(DB_TYPES_FILE, dotAstroDir), content);
}
function generateTableType(name: string, collection: DBTable): string {
let tableType = ` export const ${name}: import(${RUNTIME_IMPORT}).Table<
${JSON.stringify(name)},
${JSON.stringify(
Object.fromEntries(
Object.entries(collection.columns).map(([columnName, column]) => [
columnName,
{
// Only select columns Drizzle needs for inference
type: column.type,
optional: column.schema.optional,
default: column.schema.default,
},
])
)
)}
>;`;
return tableType;
}
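// Example output (illustrative), assuming RUNTIME_IMPORT points at the @astrojs/db
// runtime module and an `Author` table with a single text column:
//   declare module 'astro:db' {
//     export const db: import("@astrojs/db/runtime").SqliteDB;
//     export const Author: import("@astrojs/db/runtime").Table<"Author", {"name":{"type":"text"}}>;
//   }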

@@ -0,0 +1,98 @@
import { DB_PATH, RUNTIME_DRIZZLE_IMPORT, RUNTIME_IMPORT, VIRTUAL_MODULE_ID } from '../consts.js';
import type { DBTables } from '../types.js';
import type { VitePlugin } from '../utils.js';
const resolvedVirtualModuleId = '\0' + VIRTUAL_MODULE_ID;
type LateSchema = {
tables: () => DBTables;
};
type VitePluginDBParams =
| {
connectToStudio: false;
schemas: LateSchema;
root: URL;
}
| {
connectToStudio: true;
schemas: LateSchema;
appToken: string;
root: URL;
};
export function vitePluginDb(params: VitePluginDBParams): VitePlugin {
return {
name: 'astro:db',
enforce: 'pre',
resolveId(id) {
if (id === VIRTUAL_MODULE_ID) {
return resolvedVirtualModuleId;
}
},
load(id) {
if (id !== resolvedVirtualModuleId) return;
if (params.connectToStudio) {
return getStudioVirtualModContents({
appToken: params.appToken,
tables: params.schemas.tables(),
});
}
return getVirtualModContents({
root: params.root,
tables: params.schemas.tables(),
});
},
};
}
export function getVirtualModContents({ tables, root }: { tables: DBTables; root: URL }) {
const dbUrl = new URL(DB_PATH, root);
return `
import { collectionToTable, createLocalDatabaseClient } from ${RUNTIME_IMPORT};
import dbUrl from ${JSON.stringify(`${dbUrl}?fileurl`)};
const params = ${JSON.stringify({
tables,
seeding: false,
})};
params.dbUrl = dbUrl;
export const db = await createLocalDatabaseClient(params);
export * from ${RUNTIME_DRIZZLE_IMPORT};
${getStringifiedCollectionExports(tables)}
`;
}
export function getStudioVirtualModContents({
tables,
appToken,
}: {
tables: DBTables;
appToken: string;
}) {
return `
import {collectionToTable, createRemoteDatabaseClient} from ${RUNTIME_IMPORT};
export const db = await createRemoteDatabaseClient(${JSON.stringify(
appToken
)}, import.meta.env.ASTRO_STUDIO_REMOTE_DB_URL);
export * from ${RUNTIME_DRIZZLE_IMPORT};
${getStringifiedCollectionExports(tables)}
`;
}
function getStringifiedCollectionExports(tables: DBTables) {
return Object.entries(tables)
.map(
([name, collection]) =>
`export const ${name} = collectionToTable(${JSON.stringify(name)}, ${JSON.stringify(
collection
)}, false)`
)
.join('\n');
}
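// The virtual module is what user code sees behind `astro:db` (illustrative):
//   import { db, eq, Author } from 'astro:db';
//   const rows = await db.select().from(Author).where(eq(Author.name, 'Ben'));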

@@ -0,0 +1,65 @@
import { existsSync } from 'node:fs';
import { readFile, writeFile } from 'node:fs/promises';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import { bold, cyan } from 'kleur/colors';
import { normalizePath } from 'vite';
import { DB_TYPES_FILE } from '../consts.js';
import type { VitePlugin } from '../utils.js';
import type { AstroIntegrationLogger } from 'astro';
export function vitePluginInjectEnvTs(
{ srcDir, root }: { srcDir: URL; root: URL },
logger: AstroIntegrationLogger
): VitePlugin {
return {
name: 'db-inject-env-ts',
enforce: 'post',
async config() {
await setUpEnvTs({ srcDir, root, logger });
},
};
}
export async function setUpEnvTs({
srcDir,
root,
logger,
}: {
srcDir: URL;
root: URL;
logger: AstroIntegrationLogger;
}) {
const envTsPath = getEnvTsPath({ srcDir });
	const envTsPathRelativeToRoot = normalizePath(
		path.relative(fileURLToPath(root), fileURLToPath(envTsPath))
	);
if (existsSync(envTsPath)) {
let typesEnvContents = await readFile(envTsPath, 'utf-8');
const dotAstroDir = new URL('.astro/', root);
if (!existsSync(dotAstroDir)) return;
const dbTypeReference = getDBTypeReference({ srcDir, dotAstroDir });
if (!typesEnvContents.includes(dbTypeReference)) {
typesEnvContents = `${dbTypeReference}\n${typesEnvContents}`;
await writeFile(envTsPath, typesEnvContents, 'utf-8');
			logger.info(`${cyan(bold('[astro:db]'))} Added ${bold(envTsPathRelativeToRoot)} types`);
}
}
}
function getDBTypeReference({ srcDir, dotAstroDir }: { srcDir: URL; dotAstroDir: URL }) {
const dbTypesFile = new URL(DB_TYPES_FILE, dotAstroDir);
const contentTypesRelativeToSrcDir = normalizePath(
path.relative(fileURLToPath(srcDir), fileURLToPath(dbTypesFile))
);
return `/// <reference path=${JSON.stringify(contentTypesRelativeToSrcDir)} />`;
}
function getEnvTsPath({ srcDir }: { srcDir: URL }) {
return new URL('env.d.ts', srcDir);
}
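// Resulting addition to src/env.d.ts (illustrative, assuming the default src/ layout):
//   /// <reference path="../.astro/db-types.d.ts" />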

@@ -0,0 +1,271 @@
import type { SqliteRemoteDatabase } from 'drizzle-orm/sqlite-proxy';
import {
type BooleanColumn,
type DBTable,
type DBTables,
type DBColumn,
type DateColumn,
type ColumnType,
type JsonColumn,
type NumberColumn,
type TextColumn,
} from '../core/types.js';
import { bold } from 'kleur/colors';
import { type SQL, sql, getTableName } from 'drizzle-orm';
import { SQLiteAsyncDialect, type SQLiteInsert } from 'drizzle-orm/sqlite-core';
import type { AstroIntegrationLogger } from 'astro';
import type { DBUserConfig } from '../core/types.js';
import { hasPrimaryKey } from '../runtime/index.js';
import { isSerializedSQL } from '../runtime/types.js';
import { SEED_WRITABLE_IN_PROD_ERROR } from './errors.js';
const sqlite = new SQLiteAsyncDialect();
export async function recreateTables({
db,
tables,
}: {
db: SqliteRemoteDatabase;
tables: DBTables;
}) {
const setupQueries: SQL[] = [];
for (const [name, collection] of Object.entries(tables)) {
const dropQuery = sql.raw(`DROP TABLE IF EXISTS ${sqlite.escapeName(name)}`);
const createQuery = sql.raw(getCreateTableQuery(name, collection));
const indexQueries = getCreateIndexQueries(name, collection);
setupQueries.push(dropQuery, createQuery, ...indexQueries.map((s) => sql.raw(s)));
}
for (const q of setupQueries) {
await db.run(q);
}
}
export async function seedData({
db,
data,
logger,
mode,
}: {
db: SqliteRemoteDatabase;
data: DBUserConfig['data'];
logger?: AstroIntegrationLogger;
mode: 'dev' | 'build';
}) {
try {
const dataFns = Array.isArray(data) ? data : [data];
for (const dataFn of dataFns) {
await dataFn({
seed: async ({ table, writable }, values) => {
if (writable && mode === 'build' && process.env.ASTRO_DB_TEST_ENV !== '1') {
(logger ?? console).error(SEED_WRITABLE_IN_PROD_ERROR(getTableName(table)));
process.exit(1);
}
await db.insert(table).values(values as any);
},
seedReturning: async ({ table, writable }, values) => {
if (writable && mode === 'build' && process.env.ASTRO_DB_TEST_ENV !== '1') {
(logger ?? console).error(SEED_WRITABLE_IN_PROD_ERROR(getTableName(table)));
process.exit(1);
}
let result: SQLiteInsert<any, any, any, any> = db
.insert(table)
.values(values as any)
.returning();
if (!Array.isArray(values)) {
result = result.get();
}
return result;
},
db,
mode,
});
}
} catch (error) {
(logger ?? console).error(
`Failed to seed data. Did you update to match recent schema changes?`
);
(logger ?? console).error(error as string);
}
}
export function getCreateTableQuery(collectionName: string, collection: DBTable) {
let query = `CREATE TABLE ${sqlite.escapeName(collectionName)} (`;
const colQueries = [];
const colHasPrimaryKey = Object.entries(collection.columns).find(([, column]) =>
hasPrimaryKey(column)
);
if (!colHasPrimaryKey) {
colQueries.push('_id INTEGER PRIMARY KEY');
}
for (const [columnName, column] of Object.entries(collection.columns)) {
const colQuery = `${sqlite.escapeName(columnName)} ${schemaTypeToSqlType(
column.type
)}${getModifiers(columnName, column)}`;
colQueries.push(colQuery);
}
colQueries.push(...getCreateForeignKeyQueries(collectionName, collection));
query += colQueries.join(', ') + ')';
return query;
}
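// Example output (illustrative) for an `Author` table with one required text column
// and no user-defined primary key:
//   CREATE TABLE "Author" (_id INTEGER PRIMARY KEY, "name" text NOT NULL)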
export function getCreateIndexQueries(
collectionName: string,
collection: Pick<DBTable, 'indexes'>
) {
let queries: string[] = [];
for (const [indexName, indexProps] of Object.entries(collection.indexes ?? {})) {
const onColNames = asArray(indexProps.on);
const onCols = onColNames.map((colName) => sqlite.escapeName(colName));
const unique = indexProps.unique ? 'UNIQUE ' : '';
const indexQuery = `CREATE ${unique}INDEX ${sqlite.escapeName(
indexName
)} ON ${sqlite.escapeName(collectionName)} (${onCols.join(', ')})`;
queries.push(indexQuery);
}
return queries;
}
export function getCreateForeignKeyQueries(collectionName: string, collection: DBTable) {
let queries: string[] = [];
for (const foreignKey of collection.foreignKeys ?? []) {
const columns = asArray(foreignKey.columns);
const references = asArray(foreignKey.references);
if (columns.length !== references.length) {
throw new Error(
`Foreign key on ${collectionName} is misconfigured. \`columns\` and \`references\` must be the same length.`
);
}
const referencedCollection = references[0]?.schema.collection;
if (!referencedCollection) {
throw new Error(
`Foreign key on ${collectionName} is misconfigured. \`references\` cannot be empty.`
);
}
const query = `FOREIGN KEY (${columns
.map((f) => sqlite.escapeName(f))
.join(', ')}) REFERENCES ${sqlite.escapeName(referencedCollection)}(${references
.map((r) => sqlite.escapeName(r.schema.name!))
.join(', ')})`;
queries.push(query);
}
return queries;
}
function asArray<T>(value: T | T[]) {
return Array.isArray(value) ? value : [value];
}
export function schemaTypeToSqlType(type: ColumnType): 'text' | 'integer' {
switch (type) {
case 'date':
case 'text':
case 'json':
return 'text';
case 'number':
case 'boolean':
return 'integer';
}
}
export function getModifiers(columnName: string, column: DBColumn) {
let modifiers = '';
if (hasPrimaryKey(column)) {
return ' PRIMARY KEY';
}
if (!column.schema.optional) {
modifiers += ' NOT NULL';
}
if (column.schema.unique) {
modifiers += ' UNIQUE';
}
if (hasDefault(column)) {
modifiers += ` DEFAULT ${getDefaultValueSql(columnName, column)}`;
}
const references = getReferencesConfig(column);
if (references) {
const { collection, name } = references.schema;
		if (!collection || !name) {
			throw new Error(
				`Column ${columnName} references a collection that does not exist. Did you apply the referenced collection to the \`tables\` object in your Astro config?`
			);
		}
modifiers += ` REFERENCES ${sqlite.escapeName(collection)} (${sqlite.escapeName(name)})`;
}
return modifiers;
}
export function getReferencesConfig(column: DBColumn) {
const canHaveReferences = column.type === 'number' || column.type === 'text';
if (!canHaveReferences) return undefined;
return column.schema.references;
}
// Using `DBColumn` will not narrow `default` based on the column `type`
// Handle each column separately
type WithDefaultDefined<T extends DBColumn> = T & {
schema: Required<Pick<T['schema'], 'default'>>;
};
type DBColumnWithDefault =
| WithDefaultDefined<TextColumn>
| WithDefaultDefined<DateColumn>
| WithDefaultDefined<NumberColumn>
| WithDefaultDefined<BooleanColumn>
| WithDefaultDefined<JsonColumn>;
// Type narrowing the default fails on union types, so use a type guard
export function hasDefault(column: DBColumn): column is DBColumnWithDefault {
if (column.schema.default !== undefined) {
return true;
}
if (hasPrimaryKey(column) && column.type === 'number') {
return true;
}
return false;
}
function toDefault<T>(def: T | SQL<any>): string {
const type = typeof def;
if (type === 'string') {
return sqlite.escapeString(def as string);
} else if (type === 'boolean') {
return def ? 'TRUE' : 'FALSE';
} else {
return def + '';
}
}
function getDefaultValueSql(columnName: string, column: DBColumnWithDefault): string {
if (isSerializedSQL(column.schema.default)) {
return column.schema.default.sql;
}
switch (column.type) {
case 'boolean':
case 'number':
case 'text':
case 'date':
return toDefault(column.schema.default);
case 'json': {
let stringified = '';
try {
stringified = JSON.stringify(column.schema.default);
} catch (e) {
// eslint-disable-next-line no-console
				console.error(
`Invalid default value for column ${bold(
columnName
)}. Defaults must be valid JSON when using the \`json()\` type.`
);
				process.exit(1);
}
return sqlite.escapeString(stringified);
}
}
}

@@ -0,0 +1,143 @@
import { readFile } from 'node:fs/promises';
import { homedir } from 'node:os';
import { join } from 'node:path';
import { pathToFileURL } from 'node:url';
import { getAstroStudioEnv, getAstroStudioUrl } from './utils.js';
import { MISSING_PROJECT_ID_ERROR, MISSING_SESSION_ID_ERROR } from './errors.js';
export const SESSION_LOGIN_FILE = pathToFileURL(join(homedir(), '.astro', 'session-token'));
export const PROJECT_ID_FILE = pathToFileURL(join(process.cwd(), '.astro', 'link'));
export interface ManagedAppToken {
token: string;
renew(): Promise<void>;
destroy(): Promise<void>;
}
class ManagedLocalAppToken implements ManagedAppToken {
token: string;
constructor(token: string) {
this.token = token;
}
async renew() {}
async destroy() {}
}
class ManagedRemoteAppToken implements ManagedAppToken {
token: string;
session: string;
projectId: string;
ttl: number;
renewTimer: NodeJS.Timeout | undefined;
static async create(sessionToken: string, projectId: string) {
const response = await fetch(new URL(`${getAstroStudioUrl()}/auth/cli/token-create`), {
method: 'POST',
headers: new Headers({
Authorization: `Bearer ${sessionToken}`,
}),
body: JSON.stringify({ projectId }),
});
const { token: shortLivedAppToken, ttl } = await response.json();
return new ManagedRemoteAppToken({
token: shortLivedAppToken,
session: sessionToken,
projectId,
ttl,
});
}
constructor(options: { token: string; session: string; projectId: string; ttl: number }) {
this.token = options.token;
this.session = options.session;
this.projectId = options.projectId;
this.ttl = options.ttl;
		this.renewTimer = setTimeout(() => this.renew(), (1000 * 60 * this.ttl) / 2);
}
private async fetch(url: string, body: unknown) {
return fetch(`${getAstroStudioUrl()}${url}`, {
method: 'POST',
headers: {
Authorization: `Bearer ${this.session}`,
'Content-Type': 'application/json',
},
body: JSON.stringify(body),
});
}
async renew() {
clearTimeout(this.renewTimer);
delete this.renewTimer;
try {
const response = await this.fetch('/auth/cli/token-renew', {
token: this.token,
projectId: this.projectId,
});
if (response.status === 200) {
this.renewTimer = setTimeout(() => this.renew(), (1000 * 60 * this.ttl) / 2);
} else {
throw new Error(`Unexpected response: ${response.status} ${response.statusText}`);
}
} catch (error: any) {
const retryIn = (60 * this.ttl) / 10;
// eslint-disable-next-line no-console
console.error(`Failed to renew token. Retrying in ${retryIn} seconds.`, error?.message);
this.renewTimer = setTimeout(() => this.renew(), retryIn * 1000);
}
}
async destroy() {
try {
const response = await this.fetch('/auth/cli/token-delete', {
token: this.token,
projectId: this.projectId,
});
if (response.status !== 200) {
throw new Error(`Unexpected response: ${response.status} ${response.statusText}`);
}
} catch (error: any) {
// eslint-disable-next-line no-console
console.error('Failed to delete token.', error?.message);
}
}
}
export async function getProjectIdFromFile() {
try {
return await readFile(PROJECT_ID_FILE, 'utf-8');
} catch (error) {
return undefined;
}
}
export async function getSessionIdFromFile() {
try {
return await readFile(SESSION_LOGIN_FILE, 'utf-8');
} catch (error) {
return undefined;
}
}
export async function getManagedAppTokenOrExit(token?: string): Promise<ManagedAppToken> {
if (token) {
return new ManagedLocalAppToken(token);
}
const { ASTRO_STUDIO_APP_TOKEN } = getAstroStudioEnv();
if (ASTRO_STUDIO_APP_TOKEN) {
return new ManagedLocalAppToken(ASTRO_STUDIO_APP_TOKEN);
}
const sessionToken = await getSessionIdFromFile();
if (!sessionToken) {
// eslint-disable-next-line no-console
console.error(MISSING_SESSION_ID_ERROR);
process.exit(1);
}
const projectId = await getProjectIdFromFile();
	if (!projectId) {
// eslint-disable-next-line no-console
console.error(MISSING_PROJECT_ID_ERROR);
process.exit(1);
}
return ManagedRemoteAppToken.create(sessionToken, projectId);
}
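// Example (illustrative): resolve a token for a remote build. An explicit token wins,
// then ASTRO_STUDIO_APP_TOKEN, then the stored CLI session plus the linked project id:
//   const managed = await getManagedAppTokenOrExit();
//   try {
//     // use managed.token for Authorization headers
//   } finally {
//     await managed.destroy();
//   }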

@@ -0,0 +1,386 @@
import { SQLiteAsyncDialect, type SQLiteInsertValue } from 'drizzle-orm/sqlite-core';
import type { InferSelectModel } from 'drizzle-orm';
import { collectionToTable, type SqliteDB, type Table } from '../runtime/index.js';
import { z, type ZodTypeDef } from 'zod';
import { SQL } from 'drizzle-orm';
import { errorMap } from './integration/error-map.js';
import { SERIALIZED_SQL_KEY, type SerializedSQL } from '../runtime/types.js';
export type MaybePromise<T> = T | Promise<T>;
export type MaybeArray<T> = T | T[];
// Transform to serializable object for migration files
const sqlite = new SQLiteAsyncDialect();
const sqlSchema = z.instanceof(SQL<any>).transform(
(sqlObj): SerializedSQL => ({
[SERIALIZED_SQL_KEY]: true,
sql: sqlite.sqlToQuery(sqlObj).sql,
})
);
const baseColumnSchema = z.object({
label: z.string().optional(),
optional: z.boolean().optional().default(false),
unique: z.boolean().optional().default(false),
// Defined when `defineReadableTable()` is called
name: z.string().optional(),
collection: z.string().optional(),
});
const booleanColumnSchema = z.object({
type: z.literal('boolean'),
schema: baseColumnSchema.extend({
default: z.union([z.boolean(), sqlSchema]).optional(),
}),
});
const numberColumnBaseSchema = baseColumnSchema.omit({ optional: true }).and(
z.union([
z.object({
primaryKey: z.literal(false).optional().default(false),
optional: baseColumnSchema.shape.optional,
default: z.union([z.number(), sqlSchema]).optional(),
}),
z.object({
// `integer primary key` uses ROWID as the default value.
// `optional` and `default` do not have an effect,
// so disable these config options for primary keys.
primaryKey: z.literal(true),
optional: z.literal(false).optional(),
default: z.literal(undefined).optional(),
}),
])
);
const numberColumnOptsSchema: z.ZodType<
z.infer<typeof numberColumnBaseSchema> & {
// ReferenceableColumn creates a circular type. Define ZodType to resolve.
references?: NumberColumn;
},
ZodTypeDef,
z.input<typeof numberColumnBaseSchema> & {
references?: () => z.input<typeof numberColumnSchema>;
}
> = numberColumnBaseSchema.and(
z.object({
references: z
.function()
.returns(z.lazy(() => numberColumnSchema))
.optional()
.transform((fn) => fn?.()),
})
);
const numberColumnSchema = z.object({
type: z.literal('number'),
schema: numberColumnOptsSchema,
});
const textColumnBaseSchema = baseColumnSchema
.omit({ optional: true })
.extend({
default: z.union([z.string(), sqlSchema]).optional(),
multiline: z.boolean().optional(),
})
.and(
z.union([
z.object({
primaryKey: z.literal(false).optional().default(false),
optional: baseColumnSchema.shape.optional,
}),
z.object({
// text primary key allows NULL values.
// NULL values bypass unique checks, which could
// lead to duplicate URLs per record in Astro Studio.
// disable `optional` for primary keys.
primaryKey: z.literal(true),
optional: z.literal(false).optional(),
}),
])
);
const textColumnOptsSchema: z.ZodType<
z.infer<typeof textColumnBaseSchema> & {
// ReferenceableColumn creates a circular type. Define ZodType to resolve.
references?: TextColumn;
},
ZodTypeDef,
z.input<typeof textColumnBaseSchema> & {
references?: () => z.input<typeof textColumnSchema>;
}
> = textColumnBaseSchema.and(
z.object({
references: z
.function()
.returns(z.lazy(() => textColumnSchema))
.optional()
.transform((fn) => fn?.()),
})
);
const textColumnSchema = z.object({
type: z.literal('text'),
schema: textColumnOptsSchema,
});
const dateColumnSchema = z.object({
type: z.literal('date'),
schema: baseColumnSchema.extend({
default: z
.union([
sqlSchema,
// transform to ISO string for serialization
z.date().transform((d) => d.toISOString()),
])
.optional(),
}),
});
const jsonColumnSchema = z.object({
type: z.literal('json'),
schema: baseColumnSchema.extend({
default: z.unknown().optional(),
}),
});
export const columnSchema = z.union([
booleanColumnSchema,
numberColumnSchema,
textColumnSchema,
dateColumnSchema,
jsonColumnSchema,
]);
export const referenceableColumnSchema = z.union([textColumnSchema, numberColumnSchema]);
const columnsSchema = z.record(columnSchema);
export const indexSchema = z.object({
on: z.string().or(z.array(z.string())),
unique: z.boolean().optional(),
});
type ForeignKeysInput = {
columns: MaybeArray<string>;
references: () => MaybeArray<Omit<z.input<typeof referenceableColumnSchema>, 'references'>>;
};
type ForeignKeysOutput = Omit<ForeignKeysInput, 'references'> & {
// reference fn called in `transform`. Ensures output is JSON serializable.
references: MaybeArray<Omit<z.output<typeof referenceableColumnSchema>, 'references'>>;
};
const foreignKeysSchema: z.ZodType<ForeignKeysOutput, ZodTypeDef, ForeignKeysInput> = z.object({
columns: z.string().or(z.array(z.string())),
references: z
.function()
.returns(z.lazy(() => referenceableColumnSchema.or(z.array(referenceableColumnSchema))))
.transform((fn) => fn()),
});
export type Indexes = Record<string, z.infer<typeof indexSchema>>;
const baseCollectionSchema = z.object({
columns: columnsSchema,
indexes: z.record(indexSchema).optional(),
foreignKeys: z.array(foreignKeysSchema).optional(),
});
export const readableCollectionSchema = baseCollectionSchema.extend({
writable: z.literal(false),
});
export const writableCollectionSchema = baseCollectionSchema.extend({
writable: z.literal(true),
});
export const collectionSchema = z.union([readableCollectionSchema, writableCollectionSchema]);
export const tablesSchema = z.preprocess((rawCollections) => {
// Use `z.any()` to avoid breaking object references
const tables = z.record(z.any()).parse(rawCollections, { errorMap });
for (const [collectionName, collection] of Object.entries(tables)) {
// Append `table` object for data seeding.
// Must append at runtime so table name exists.
collection.table = collectionToTable(
collectionName,
collectionSchema.parse(collection, { errorMap })
);
// Append collection and column names to columns.
// Used to track collection info for references.
const { columns } = z.object({ columns: z.record(z.any()) }).parse(collection, { errorMap });
for (const [columnName, column] of Object.entries(columns)) {
column.schema.name = columnName;
column.schema.collection = collectionName;
}
}
return rawCollections;
}, z.record(collectionSchema));
export type BooleanColumn = z.infer<typeof booleanColumnSchema>;
export type BooleanColumnInput = z.input<typeof booleanColumnSchema>;
export type NumberColumn = z.infer<typeof numberColumnSchema>;
export type NumberColumnInput = z.input<typeof numberColumnSchema>;
export type TextColumn = z.infer<typeof textColumnSchema>;
export type TextColumnInput = z.input<typeof textColumnSchema>;
export type DateColumn = z.infer<typeof dateColumnSchema>;
export type DateColumnInput = z.input<typeof dateColumnSchema>;
export type JsonColumn = z.infer<typeof jsonColumnSchema>;
export type JsonColumnInput = z.input<typeof jsonColumnSchema>;
export type ColumnType =
| BooleanColumn['type']
| NumberColumn['type']
| TextColumn['type']
| DateColumn['type']
| JsonColumn['type'];
export type DBColumn = z.infer<typeof columnSchema>;
export type DBColumnInput =
| DateColumnInput
| BooleanColumnInput
| NumberColumnInput
| TextColumnInput
| JsonColumnInput;
export type DBColumns = z.infer<typeof columnsSchema>;
export type DBTable = z.infer<typeof readableCollectionSchema | typeof writableCollectionSchema>;
export type DBTables = Record<string, DBTable>;
export type DBSnapshot = {
schema: Record<string, DBTable>;
/**
* Snapshot version. Breaking changes to the snapshot format increment this number.
* @todo Rename to "version" once closer to release.
*/
experimentalVersion: number;
};
export type ReadableDBTable = z.infer<typeof readableCollectionSchema>;
export type WritableDBTable = z.infer<typeof writableCollectionSchema>;
export type DBDataContext = {
db: SqliteDB;
seed: <TColumns extends ColumnsConfig>(
collection: ResolvedCollectionConfig<TColumns>,
data: MaybeArray<SQLiteInsertValue<Table<string, TColumns>>>
) => Promise<void>;
seedReturning: <
TColumns extends ColumnsConfig,
TData extends MaybeArray<SQLiteInsertValue<Table<string, TColumns>>>,
>(
collection: ResolvedCollectionConfig<TColumns>,
data: TData
) => Promise<
TData extends Array<SQLiteInsertValue<Table<string, TColumns>>>
? InferSelectModel<Table<string, TColumns>>[]
: InferSelectModel<Table<string, TColumns>>
>;
mode: 'dev' | 'build';
};
export function defineData(fn: (ctx: DBDataContext) => MaybePromise<void>) {
return fn;
}
const dbDataFn = z.function().returns(z.union([z.void(), z.promise(z.void())]));
export const dbConfigSchema = z.object({
studio: z.boolean().optional(),
tables: tablesSchema.optional(),
data: z.union([dbDataFn, z.array(dbDataFn)]).optional(),
unsafeWritable: z.boolean().optional().default(false),
});
type DataFunction = (params: DBDataContext) => MaybePromise<void>;
export type DBUserConfig = Omit<z.input<typeof dbConfigSchema>, 'data'> & {
data: DataFunction | DataFunction[];
};
export const astroConfigWithDbSchema = z.object({
db: dbConfigSchema.optional(),
});
export type ColumnsConfig = z.input<typeof collectionSchema>['columns'];
interface CollectionConfig<TColumns extends ColumnsConfig = ColumnsConfig>
// use `extends` to ensure types line up with zod,
// only adding generics for type completions.
extends Pick<z.input<typeof collectionSchema>, 'columns' | 'indexes' | 'foreignKeys'> {
columns: TColumns;
foreignKeys?: Array<{
columns: MaybeArray<Extract<keyof TColumns, string>>;
// TODO: runtime error if parent collection doesn't match for all columns. Can't put a generic here...
references: () => MaybeArray<z.input<typeof referenceableColumnSchema>>;
}>;
indexes?: Record<string, IndexConfig<TColumns>>;
}
interface IndexConfig<TColumns extends ColumnsConfig> extends z.input<typeof indexSchema> {
on: MaybeArray<Extract<keyof TColumns, string>>;
}
export type ResolvedCollectionConfig<
TColumns extends ColumnsConfig = ColumnsConfig,
Writable extends boolean = boolean,
> = CollectionConfig<TColumns> & {
writable: Writable;
table: Table<string, TColumns>;
};
function baseDefineCollection<TColumns extends ColumnsConfig, TWritable extends boolean>(
userConfig: CollectionConfig<TColumns>,
writable: TWritable
): ResolvedCollectionConfig<TColumns, TWritable> {
return {
...userConfig,
writable,
// set at runtime to get the table name
table: null!,
};
}
export function defineReadableTable<TColumns extends ColumnsConfig>(
userConfig: CollectionConfig<TColumns>
): ResolvedCollectionConfig<TColumns, false> {
return baseDefineCollection(userConfig, false);
}
export function defineWritableTable<TColumns extends ColumnsConfig>(
userConfig: CollectionConfig<TColumns>
): ResolvedCollectionConfig<TColumns, true> {
return baseDefineCollection(userConfig, true);
}
export type AstroConfigWithDB = z.input<typeof astroConfigWithDbSchema>;
// We cannot use `Omit<NumberColumn | TextColumn, 'type'>`,
// since Omit collapses our union type on primary key.
type NumberColumnOpts = z.input<typeof numberColumnOptsSchema>;
type TextColumnOpts = z.input<typeof textColumnOptsSchema>;
function createColumn<S extends string, T extends Record<string, unknown>>(type: S, schema: T) {
return {
type,
/**
* @internal
*/
schema,
};
}
export const column = {
number: <T extends NumberColumnOpts>(opts: T = {} as T) => {
return createColumn('number', opts) satisfies { type: 'number' };
},
boolean: <T extends BooleanColumnInput['schema']>(opts: T = {} as T) => {
return createColumn('boolean', opts) satisfies { type: 'boolean' };
},
text: <T extends TextColumnOpts>(opts: T = {} as T) => {
return createColumn('text', opts) satisfies { type: 'text' };
},
date<T extends DateColumnInput['schema']>(opts: T = {} as T) {
return createColumn('date', opts) satisfies { type: 'date' };
},
json<T extends JsonColumnInput['schema']>(opts: T = {} as T) {
return createColumn('json', opts) satisfies { type: 'json' };
},
};
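// Example (illustrative): define a table with these helpers and register it in
// astro.config.mjs under `db.tables`:
//   const Author = defineReadableTable({
//     columns: { name: column.text(), joined: column.date({ optional: true }) },
//   });
//   export default defineConfig({ db: { tables: { Author } }, integrations: [db()] });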

@@ -0,0 +1,19 @@
import type { AstroConfig } from 'astro';
import { loadEnv } from 'vite';
export type VitePlugin = Required<AstroConfig['vite']>['plugins'][number];
export function getAstroStudioEnv(envMode = ''): Record<`ASTRO_STUDIO_${string}`, string> {
const env = loadEnv(envMode, process.cwd(), 'ASTRO_STUDIO_');
return env;
}
export function getRemoteDatabaseUrl(): string {
const env = getAstroStudioEnv();
return env.ASTRO_STUDIO_REMOTE_DB_URL || 'https://db.services.astro.build';
}
export function getAstroStudioUrl(): string {
const env = getAstroStudioEnv();
return env.ASTRO_STUDIO_URL || 'https://stardate.astro.build';
}

packages/db/src/index.ts

@@ -0,0 +1,5 @@
export { defineReadableTable, defineWritableTable, defineData, column } from './core/types.js';
export type { ResolvedCollectionConfig, DBDataContext } from './core/types.js';
export { cli } from './core/cli/index.js';
export { integration as default } from './core/integration/index.js';
export { sql, NOW, TRUE, FALSE } from './runtime/index.js';

@@ -0,0 +1,111 @@
import type { InStatement } from '@libsql/client';
import { createClient } from '@libsql/client';
import { type DBTables } from '../core/types.js';
import type { LibSQLDatabase } from 'drizzle-orm/libsql';
import { drizzle as drizzleLibsql } from 'drizzle-orm/libsql';
import { drizzle as drizzleProxy } from 'drizzle-orm/sqlite-proxy';
import { type SQLiteTable } from 'drizzle-orm/sqlite-core';
import { z } from 'zod';
import { getTableName } from 'drizzle-orm';
const isWebContainer = !!process.versions?.webcontainer;
interface LocalDatabaseClient extends LibSQLDatabase, Disposable {}
export async function createLocalDatabaseClient({
tables,
dbUrl,
seeding,
}: {
dbUrl: string;
tables: DBTables;
seeding: boolean;
}): Promise<LocalDatabaseClient> {
const url = isWebContainer ? 'file:content.db' : dbUrl;
const client = createClient({ url });
const db = Object.assign(drizzleLibsql(client), {
[Symbol.dispose || Symbol.for('Symbol.dispose')]() {
client.close();
},
});
if (seeding) return db;
const { insert: drizzleInsert, update: drizzleUpdate, delete: drizzleDelete } = db;
return Object.assign(db, {
insert(Table: SQLiteTable) {
checkIfModificationIsAllowed(tables, Table);
return drizzleInsert.call(this, Table);
},
update(Table: SQLiteTable) {
checkIfModificationIsAllowed(tables, Table);
return drizzleUpdate.call(this, Table);
},
delete(Table: SQLiteTable) {
checkIfModificationIsAllowed(tables, Table);
return drizzleDelete.call(this, Table);
},
});
}
function checkIfModificationIsAllowed(tables: DBTables, Table: SQLiteTable) {
const tableName = getTableName(Table);
const collection = tables[tableName];
	if (!collection || !collection.writable) {
throw new Error(`The [${tableName}] collection is read-only.`);
}
}
export function createRemoteDatabaseClient(appToken: string, remoteDbURL: string) {
const url = new URL('/db/query', remoteDbURL);
const db = drizzleProxy(async (sql, parameters, method) => {
const requestBody: InStatement = { sql, args: parameters };
// eslint-disable-next-line no-console
console.info(JSON.stringify(requestBody));
const res = await fetch(url, {
method: 'POST',
headers: {
Authorization: `Bearer ${appToken}`,
'Content-Type': 'application/json',
},
body: JSON.stringify(requestBody),
});
if (!res.ok) {
throw new Error(
				`Failed to execute query.\nQuery: ${sql}\nFull error: ${res.status} ${await res.text()}`
);
}
const queryResultSchema = z.object({
rows: z.array(z.unknown()),
});
let rows: unknown[];
try {
const json = await res.json();
rows = queryResultSchema.parse(json).rows;
} catch (e) {
throw new Error(
`Failed to execute query.\nQuery: ${sql}\nFull error: Unexpected JSON response. ${
e instanceof Error ? e.message : String(e)
}`
);
}
// Drizzle expects each row as an array of its values
const rowValues: unknown[][] = [];
for (const row of rows) {
if (row != null && typeof row === 'object') {
rowValues.push(Object.values(row));
}
}
if (method === 'get') {
return { rows: rowValues[0] };
}
return { rows: rowValues };
});
return db;
}
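// Example (illustrative): the remote client proxies each query over HTTP, assuming the
// default Studio endpoint when ASTRO_STUDIO_REMOTE_DB_URL is unset:
//   const db = createRemoteDatabaseClient(appToken, 'https://db.services.astro.build');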

@@ -0,0 +1,25 @@
// Drizzle utilities we expose directly from `astro:db`
export {
sql,
eq,
gt,
gte,
lt,
lte,
ne,
isNull,
isNotNull,
inArray,
notInArray,
exists,
notExists,
between,
notBetween,
like,
notIlike,
not,
asc,
desc,
and,
or,
} from 'drizzle-orm';

@@ -0,0 +1,139 @@
import type { SqliteRemoteDatabase } from 'drizzle-orm/sqlite-proxy';
import { type DBTable, type DBColumn } from '../core/types.js';
import { type ColumnBuilderBaseConfig, type ColumnDataType, sql } from 'drizzle-orm';
import {
customType,
integer,
sqliteTable,
text,
index,
type SQLiteColumnBuilderBase,
type IndexBuilder,
} from 'drizzle-orm/sqlite-core';
import { isSerializedSQL, type SerializedSQL } from './types.js';
export { sql };
export type SqliteDB = SqliteRemoteDatabase;
export type { Table } from './types.js';
export { createRemoteDatabaseClient, createLocalDatabaseClient } from './db-client.js';
export function hasPrimaryKey(column: DBColumn) {
return 'primaryKey' in column.schema && !!column.schema.primaryKey;
}
// Exports a few common expressions
export const NOW = sql`CURRENT_TIMESTAMP`;
export const TRUE = sql`TRUE`;
export const FALSE = sql`FALSE`;
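// e.g. (illustrative) column.date({ default: NOW }) or column.boolean({ default: TRUE })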
const dateType = customType<{ data: Date; driverData: string }>({
dataType() {
return 'text';
},
toDriver(value) {
return value.toISOString();
},
fromDriver(value) {
return new Date(value);
},
});
const jsonType = customType<{ data: unknown; driverData: string }>({
dataType() {
return 'text';
},
toDriver(value) {
return JSON.stringify(value);
},
fromDriver(value) {
return JSON.parse(value);
},
});
type D1ColumnBuilder = SQLiteColumnBuilderBase<
ColumnBuilderBaseConfig<ColumnDataType, string> & { data: unknown }
>;
export function collectionToTable(name: string, collection: DBTable) {
const columns: Record<string, D1ColumnBuilder> = {};
if (!Object.entries(collection.columns).some(([, column]) => hasPrimaryKey(column))) {
columns['_id'] = integer('_id').primaryKey();
}
for (const [columnName, column] of Object.entries(collection.columns)) {
columns[columnName] = columnMapper(columnName, column);
}
const table = sqliteTable(name, columns, (ormTable) => {
const indexes: Record<string, IndexBuilder> = {};
for (const [indexName, indexProps] of Object.entries(collection.indexes ?? {})) {
const onColNames = Array.isArray(indexProps.on) ? indexProps.on : [indexProps.on];
const onCols = onColNames.map((colName) => ormTable[colName]);
if (!atLeastOne(onCols)) continue;
indexes[indexName] = index(indexName).on(...onCols);
}
return indexes;
});
return table;
}
function atLeastOne<T>(arr: T[]): arr is [T, ...T[]] {
return arr.length > 0;
}
function columnMapper(columnName: string, column: DBColumn) {
let c: ReturnType<
| typeof text
| typeof integer
| typeof jsonType
| typeof dateType
| typeof integer<string, 'boolean'>
>;
switch (column.type) {
case 'text': {
c = text(columnName);
// Duplicate default logic across cases to preserve type inference.
// No clean generic for every column builder.
if (column.schema.default !== undefined)
c = c.default(handleSerializedSQL(column.schema.default));
if (column.schema.primaryKey === true) c = c.primaryKey();
break;
}
case 'number': {
c = integer(columnName);
if (column.schema.default !== undefined)
c = c.default(handleSerializedSQL(column.schema.default));
if (column.schema.primaryKey === true) c = c.primaryKey();
break;
}
case 'boolean': {
c = integer(columnName, { mode: 'boolean' });
if (column.schema.default !== undefined)
c = c.default(handleSerializedSQL(column.schema.default));
break;
}
case 'json':
c = jsonType(columnName);
if (column.schema.default !== undefined) c = c.default(column.schema.default);
break;
case 'date': {
c = dateType(columnName);
if (column.schema.default !== undefined) {
const def = handleSerializedSQL(column.schema.default);
c = c.default(typeof def === 'string' ? new Date(def) : def);
}
break;
}
}
if (!column.schema.optional) c = c.notNull();
if (column.schema.unique) c = c.unique();
return c;
}
function handleSerializedSQL<T>(def: T | SerializedSQL) {
if (isSerializedSQL(def)) {
return sql.raw(def.sql);
}
return def;
}

@@ -0,0 +1,109 @@
import type { ColumnDataType, ColumnBaseConfig } from 'drizzle-orm';
import type { SQLiteColumn, SQLiteTableWithColumns } from 'drizzle-orm/sqlite-core';
import type { DBColumn, ColumnsConfig } from '../core/types.js';
type GeneratedConfig<T extends ColumnDataType = ColumnDataType> = Pick<
ColumnBaseConfig<T, string>,
'name' | 'tableName' | 'notNull' | 'hasDefault'
>;
export type AstroText<T extends GeneratedConfig<'string'>> = SQLiteColumn<
T & {
data: string;
dataType: 'string';
columnType: 'SQLiteText';
driverParam: string;
enumValues: never;
baseColumn: never;
}
>;
export type AstroDate<T extends GeneratedConfig<'custom'>> = SQLiteColumn<
T & {
data: Date;
dataType: 'custom';
columnType: 'SQLiteCustomColumn';
driverParam: string;
enumValues: never;
baseColumn: never;
}
>;
export type AstroBoolean<T extends GeneratedConfig<'boolean'>> = SQLiteColumn<
T & {
data: boolean;
dataType: 'boolean';
columnType: 'SQLiteBoolean';
driverParam: number;
enumValues: never;
baseColumn: never;
}
>;
export type AstroNumber<T extends GeneratedConfig<'number'>> = SQLiteColumn<
T & {
data: number;
dataType: 'number';
columnType: 'SQLiteInteger';
driverParam: number;
enumValues: never;
baseColumn: never;
}
>;
export type AstroJson<T extends GeneratedConfig<'custom'>> = SQLiteColumn<
T & {
data: unknown;
dataType: 'custom';
columnType: 'SQLiteCustomColumn';
driverParam: string;
enumValues: never;
baseColumn: never;
}
>;
export type Column<T extends DBColumn['type'], S extends GeneratedConfig> = T extends 'boolean'
? AstroBoolean<S>
: T extends 'number'
? AstroNumber<S>
: T extends 'text'
? AstroText<S>
: T extends 'date'
? AstroDate<S>
: T extends 'json'
? AstroJson<S>
: never;
export type Table<
TTableName extends string,
TColumns extends ColumnsConfig,
> = SQLiteTableWithColumns<{
name: TTableName;
schema: undefined;
dialect: 'sqlite';
columns: {
[K in Extract<keyof TColumns, string>]: Column<
TColumns[K]['type'],
{
tableName: TTableName;
name: K;
hasDefault: TColumns[K]['schema'] extends { default: NonNullable<unknown> }
? true
: TColumns[K]['schema'] extends { primaryKey: true }
? true
: false;
notNull: TColumns[K]['schema']['optional'] extends true ? false : true;
}
>;
};
}>;
export const SERIALIZED_SQL_KEY = '__serializedSQL';
export type SerializedSQL = {
[SERIALIZED_SQL_KEY]: true;
sql: string;
};
export function isSerializedSQL(value: any): value is SerializedSQL {
return typeof value === 'object' && value !== null && SERIALIZED_SQL_KEY in value;
}

@@ -0,0 +1,104 @@
import { expect } from 'chai';
import { load as cheerioLoad } from 'cheerio';
import testAdapter from '../../astro/test/test-adapter.js';
import { loadFixture } from '../../astro/test/test-utils.js';
// TODO(fks): Rename this to something more generic/generally useful
// like `ASTRO_MONOREPO_TEST_ENV` if @astrojs/db is merged into astro.
process.env.ASTRO_DB_TEST_ENV = '1';
describe('astro:db', () => {
let fixture;
before(async () => {
fixture = await loadFixture({
root: new URL('./fixtures/basics/', import.meta.url),
output: 'server',
adapter: testAdapter(),
});
});
describe('production', () => {
before(async () => {
await fixture.build();
});
it('Prints the list of authors', async () => {
const app = await fixture.loadTestAdapterApp();
const request = new Request('http://example.com/');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
const ul = $('.authors-list');
expect(ul.children()).to.have.a.lengthOf(5);
expect(ul.children().eq(0).text()).to.equal('Ben');
});
it('Errors when inserting to a readonly collection', async () => {
const app = await fixture.loadTestAdapterApp();
const request = new Request('http://example.com/insert-into-readonly');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
expect($('#error').text()).to.equal('The [Author] collection is read-only.');
});
it('Does not error when inserting into writable collection', async () => {
const app = await fixture.loadTestAdapterApp();
const request = new Request('http://example.com/insert-into-writable');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
expect($('#error').text()).to.equal('');
});
describe('Expression defaults', () => {
let app;
before(async () => {
app = await fixture.loadTestAdapterApp();
});
it('Allows expression defaults for date columns', async () => {
const request = new Request('http://example.com/');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
const themeAdded = $($('.themes-list .theme-added')[0]).text();
expect(new Date(themeAdded).getTime()).to.not.be.NaN;
});
it('Defaults can be overridden for dates', async () => {
const request = new Request('http://example.com/');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
const themeAdded = $($('.themes-list .theme-added')[1]).text();
expect(new Date(themeAdded).getTime()).to.not.be.NaN;
});
it('Allows expression defaults for text columns', async () => {
const request = new Request('http://example.com/');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
const themeOwner = $($('.themes-list .theme-owner')[0]).text();
expect(themeOwner).to.equal('');
});
it('Allows expression defaults for boolean columns', async () => {
const request = new Request('http://example.com/');
const res = await app.render(request);
const html = await res.text();
const $ = cheerioLoad(html);
const themeDark = $($('.themes-list .theme-dark')[0]).text();
expect(themeDark).to.equal('dark mode');
});
});
});
});

@@ -0,0 +1,28 @@
import db, { defineReadableTable, column } from '@astrojs/db';
import { defineConfig } from 'astro/config';
import { themes } from './themes-integration';
const Author = defineReadableTable({
columns: {
name: column.text(),
},
});
// https://astro.build/config
export default defineConfig({
integrations: [db(), themes()],
db: {
studio: false,
unsafeWritable: true,
tables: { Author },
async data({ seed }) {
await seed(Author, [
{ name: 'Ben' },
{ name: 'Nate' },
{ name: 'Erika' },
{ name: 'Bjorn' },
{ name: 'Sarah' },
]);
},
},
});

@@ -0,0 +1,9 @@
{
"name": "@test/db-aliases",
"version": "0.0.0",
"private": true,
"dependencies": {
"@astrojs/db": "workspace:*",
"astro": "workspace:*"
}
}

@@ -0,0 +1,26 @@
---
import { Author, db, Themes } from 'astro:db';
const authors = await db.select().from(Author);
const themes = await db.select().from(Themes);
---
<h2>Authors</h2>
<ul class="authors-list">
{authors.map((author) => <li>{author.name}</li>)}
</ul>
<h2>Themes</h2>
<ul class="themes-list">
{
themes.map((theme) => (
<li>
<div class="theme-name">{theme.name}</div>
<div class="theme-added">{theme.added}</div>
<div class="theme-updated">{theme.updated}</div>
<div class="theme-dark">{theme.isDark ? 'dark' : 'light'} mode</div>
<div class="theme-owner">{theme.owner}</div>
</li>
))
}
</ul>

@@ -0,0 +1,14 @@
---
import { Author, db } from 'astro:db';
const authors = await db.select().from(Author);
let error: any = {};
try {
db.insert(Author).values({ name: 'Person A' });
} catch (err) {
error = err;
}
---
<div id="error">{error.message}</div>

@@ -0,0 +1,12 @@
---
import { Themes, db } from 'astro:db';
let error: any = {};
try {
db.insert(Themes).values({ name: 'Person A' });
} catch (err) {
error = err;
}
---
<div id="error">{error.message}</div>

@@ -0,0 +1,36 @@
import { NOW, column, defineWritableTable, sql } from '@astrojs/db';
import type { AstroIntegration } from 'astro';
const Themes = defineWritableTable({
columns: {
name: column.text(),
added: column.date({
default: sql`CURRENT_TIMESTAMP`,
}),
updated: column.date({
default: NOW,
}),
isDark: column.boolean({ default: sql`TRUE` }),
owner: column.text({ optional: true, default: sql`NULL` }),
},
});
export function themes(): AstroIntegration {
return {
name: 'themes-integration',
hooks: {
'astro:config:setup': ({ updateConfig }) => {
updateConfig({
db: {
tables: { Themes },
async data({ seed }) {
						// Writable tables should normally be seeded in dev mode only,
						// but this fixture seeds in both modes so the build tests have data.
await seed(Themes, [{ name: 'dracula' }, { name: 'monokai', added: new Date() }]);
},
},
});
},
},
};
}

@@ -0,0 +1,25 @@
import db, { defineReadableTable, column } from '@astrojs/db';
import { defineConfig } from 'astro/config';
import { asJson, createGlob } from './utils';
const Quote = defineReadableTable({
columns: {
author: column.text(),
body: column.text(),
file: column.text({ unique: true }),
},
});
export default defineConfig({
db: {
tables: { Quote },
data({ seed, ...ctx }) {
const glob = createGlob(ctx);
glob('quotes/*.json', {
into: Quote,
parse: asJson,
});
},
},
integrations: [db()],
});

@@ -0,0 +1,21 @@
{
"name": "glob",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"dev": "astro dev",
"build": "astro build",
"preview": "astro preview"
},
"dependencies": {
"@astrojs/db": "workspace:*",
"astro": "workspace:*",
"chokidar": "^3.5.3",
"drizzle-orm": "^0.28.6",
"fast-glob": "^3.3.2"
},
"keywords": [],
"author": "",
"license": "ISC"
}

@@ -0,0 +1,4 @@
{
"author": "Erika",
"body": "Put the quote in the database."
}

@@ -0,0 +1,4 @@
{
"author": "Tony Sull",
"body": "All content is data, but not all data is content."
}

@@ -0,0 +1,25 @@
---
/// <reference types="../../.astro/db-types.d.ts" />
import { Quote, db } from 'astro:db';
const quotes = await db.select().from(Quote);
---
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Document</title>
</head>
<body>
{
quotes.map((q) => (
<figure>
<blockquote>{q.body}</blockquote>
<figcaption>{q.author}</figcaption>
</figure>
))
}
</body>
</html>

packages/db/test/fixtures/glob/utils.ts
@@ -0,0 +1,60 @@
import { type DBDataContext, type ResolvedCollectionConfig } from '@astrojs/db';
import chokidar from 'chokidar';
import { eq } from 'drizzle-orm';
import fastGlob from 'fast-glob';
import { readFile } from 'fs/promises';
export function createGlob({ db, mode }: Pick<DBDataContext, 'db' | 'mode'>) {
return async function glob(
pattern: string,
opts: {
into: ResolvedCollectionConfig;
parse: (params: { file: string; content: string }) => Record<string, any>;
}
) {
// TODO: expose `table`
const { table } = opts.into as any;
const fileColumn = table.file;
if (!fileColumn) {
throw new Error('`file` column is required for glob tables.');
}
if (mode === 'dev') {
// In dev, watch the glob and keep the table in sync as files change on disk.
chokidar
.watch(pattern)
.on('add', async (file) => {
const content = await readFile(file, 'utf-8');
const parsed = opts.parse({ file, content });
await db.insert(table).values({ ...parsed, file });
})
.on('change', async (file) => {
const content = await readFile(file, 'utf-8');
const parsed = opts.parse({ file, content });
// Upsert: update the existing row, keyed on the unique `file` column.
await db
.insert(table)
.values({ ...parsed, file })
.onConflictDoUpdate({
target: fileColumn,
set: parsed,
});
})
.on('unlink', async (file) => {
await db.delete(table).where(eq(fileColumn, file));
});
} else {
// In a build, do a single pass: insert a row for every matched file.
const files = await fastGlob(pattern);
for (const file of files) {
const content = await readFile(file, 'utf-8');
const parsed = opts.parse({ file, content });
await db.insert(table).values({ ...parsed, file });
}
}
};
}
export function asJson(params: { file: string; content: string }) {
try {
return JSON.parse(params.content);
} catch (e: any) {
throw new Error(`Error parsing ${params.file}: ${e.message}`);
}
}
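Because the loader delegates row shaping to `parse`, other formats can slot in without touching `createGlob` itself. A hypothetical plain-text parser (not in this PR), in the same shape as `asJson`:

```ts
// Hypothetical: store each matched file's raw contents under a `body` column.
export function asText(params: { file: string; content: string }) {
	return { body: params.content };
}
```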

@@ -0,0 +1,82 @@
import astroDb, { defineReadableTable, column } from '@astrojs/db';
import { defineConfig } from 'astro/config';
const Recipe = defineReadableTable({
columns: {
id: column.number({ primaryKey: true }),
title: column.text(),
description: column.text(),
},
});
const Ingredient = defineReadableTable({
columns: {
id: column.number({ primaryKey: true }),
name: column.text(),
quantity: column.number(),
recipeId: column.number(),
},
indexes: {
recipeIdx: { on: 'recipeId' },
},
foreignKeys: [{ columns: ['recipeId'], references: () => [Recipe.columns.id] }],
});
export default defineConfig({
integrations: [astroDb()],
db: {
tables: { Recipe, Ingredient },
async data({ seed, seedReturning }) {
const pancakes = await seedReturning(Recipe, {
title: 'Pancakes',
description: 'A delicious breakfast',
});
await seed(Ingredient, [
{
name: 'Flour',
quantity: 1,
recipeId: pancakes.id,
},
{
name: 'Eggs',
quantity: 2,
recipeId: pancakes.id,
},
{
name: 'Milk',
quantity: 1,
recipeId: pancakes.id,
},
]);
const pizza = await seedReturning(Recipe, {
title: 'Pizza',
description: 'A delicious dinner',
});
await seed(Ingredient, [
{
name: 'Flour',
quantity: 1,
recipeId: pizza.id,
},
{
name: 'Eggs',
quantity: 2,
recipeId: pizza.id,
},
{
name: 'Milk',
quantity: 1,
recipeId: pizza.id,
},
{
name: 'Tomato Sauce',
quantity: 1,
recipeId: pizza.id,
},
]);
},
},
});

@@ -0,0 +1,16 @@
{
"name": "@test/recipes",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"@astrojs/db": "workspace:*",
"astro": "workspace:*"
}
}

@@ -0,0 +1,25 @@
---
/// <reference path="../../.astro/db-types.d.ts" />
import { Recipe, Ingredient, db, eq } from 'astro:db';
const ingredientsByRecipe = await db
.select({
name: Ingredient.name,
recipeName: Recipe.title,
})
.from(Ingredient)
.innerJoin(Recipe, eq(Ingredient.recipeId, Recipe.id));
console.log(ingredientsByRecipe);
---
<h2>Shopping list</h2>
<ul>
{
ingredientsByRecipe.map(({ name, recipeName }) => (
<li>
{name} ({recipeName})
</li>
))
}
</ul>

@@ -0,0 +1,24 @@
# build output
dist/
# generated types
.astro/
# dependencies
node_modules/
# logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
# environment variables
.env
.env.production
# macOS-specific files
.DS_Store
# Cloudflare
.wrangler/

@@ -0,0 +1,54 @@
# Astro Starter Kit: Basics
```sh
npm create astro@latest -- --template basics
```
[![Open in StackBlitz](https://developer.stackblitz.com/img/open_in_stackblitz.svg)](https://stackblitz.com/github/withastro/astro/tree/latest/examples/basics)
[![Open with CodeSandbox](https://assets.codesandbox.io/github/button-edit-lime.svg)](https://codesandbox.io/p/sandbox/github/withastro/astro/tree/latest/examples/basics)
[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/withastro/astro?devcontainer_path=.devcontainer/basics/devcontainer.json)
> 🧑‍🚀 **Seasoned astronaut?** Delete this file. Have fun!
![just-the-basics](https://github.com/withastro/astro/assets/2244813/a0a5533c-a856-4198-8470-2d67b1d7c554)
## 🚀 Project Structure
Inside of your Astro project, you'll see the following folders and files:
```text
/
├── public/
│ └── favicon.svg
├── src/
│ ├── components/
│ │ └── Card.astro
│ ├── layouts/
│ │ └── Layout.astro
│ └── pages/
│ └── index.astro
└── package.json
```
Astro looks for `.astro` or `.md` files in the `src/pages/` directory. Each page is exposed as a route based on its file name.
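For instance, with file-based routing (illustrative paths):
```text
src/pages/index.astro      -> /
src/pages/about.astro      -> /about
src/pages/posts/hello.md   -> /posts/hello
```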
There's nothing special about `src/components/`, but that's where we like to put any Astro/React/Vue/Svelte/Preact components.
Any static assets, like images, can be placed in the `public/` directory.
## 🧞 Commands
All commands are run from the root of the project, from a terminal:
| Command | Action |
| :------------------------ | :----------------------------------------------- |
| `npm install` | Installs dependencies |
| `npm run dev` | Starts local dev server at `localhost:4321` |
| `npm run build` | Build your production site to `./dist/` |
| `npm run preview` | Preview your build locally, before deploying |
| `npm run astro ...` | Run CLI commands like `astro add`, `astro check` |
| `npm run astro -- --help` | Get help using the Astro CLI |
## 👀 Want to learn more?
Feel free to check [our documentation](https://docs.astro.build) or jump into our [Discord server](https://astro.build/chat).

@@ -0,0 +1,56 @@
import db, { defineReadableTable, defineWritableTable, column } from '@astrojs/db';
import node from '@astrojs/node';
import react from '@astrojs/react';
import { defineConfig } from 'astro/config';
import simpleStackForm from 'simple-stack-form';
const Event = defineReadableTable({
columns: {
id: column.number({
primaryKey: true,
}),
name: column.text(),
description: column.text(),
ticketPrice: column.number(),
date: column.date(),
location: column.text(),
},
});
const Ticket = defineWritableTable({
columns: {
eventId: column.number({ references: () => Event.columns.id }),
email: column.text(),
quantity: column.number(),
newsletter: column.boolean({
default: false,
}),
},
});
// https://astro.build/config
export default defineConfig({
integrations: [simpleStackForm(), db(), react()],
output: 'server',
adapter: node({
mode: 'standalone',
}),
db: {
studio: true,
tables: {
Event,
Ticket,
},
async data({ seed }) {
await seed(Event, [
{
name: 'Sampha LIVE in Brooklyn',
description:
'Sampha is on tour with his new, flawless album Lahai. Come see the live performance outdoors in Prospect Park. Yes, there will be a grand piano 🎹',
date: new Date('2024-01-01'),
ticketPrice: 10000,
location: 'Brooklyn, NY',
},
]);
},
},
});
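Note that `Ticket` expresses its relationship with a column-level `references`, while the recipes fixture earlier used a table-level `foreignKeys` array. Assuming writable tables accept the same option (only readable tables use it elsewhere in this diff), the composite form would look like this sketch:

```ts
// Hypothetical: the Ticket -> Event relationship as a table-level foreign key.
const TicketWithFK = defineWritableTable({
	columns: {
		eventId: column.number(),
		email: column.text(),
		quantity: column.number(),
		newsletter: column.boolean({ default: false }),
	},
	foreignKeys: [{ columns: ['eventId'], references: () => [Event.columns.id] }],
});
```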

@@ -0,0 +1,27 @@
{
"name": "eventbrite-from-scratch",
"type": "module",
"version": "0.0.1",
"scripts": {
"dev": "pnpm astro dev",
"start": "astro dev",
"build": "astro check && astro build",
"preview": "astro preview",
"astro": "astro"
},
"dependencies": {
"@astrojs/check": "^0.5.5",
"@astrojs/db": "workspace:*",
"@astrojs/node": "workspace:*",
"@astrojs/react": "^3.0.10",
"@types/react": "^18.2.57",
"@types/react-dom": "^18.2.19",
"astro": "workspace:*",
"open-props": "^1.6.17",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"simple-stack-form": "^0.1.10",
"typescript": "^5.3.2",
"zod": "^3.22.4"
}
}

File diff suppressed because it is too large.

@@ -0,0 +1,9 @@
<svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 128 128">
<path d="M50.4 78.5a75.1 75.1 0 0 0-28.5 6.9l24.2-65.7c.7-2 1.9-3.2 3.4-3.2h29c1.5 0 2.7 1.2 3.4 3.2l24.2 65.7s-11.6-7-28.5-7L67 45.5c-.4-1.7-1.6-2.8-2.9-2.8-1.3 0-2.5 1.1-2.9 2.7L50.4 78.5Zm-1.1 28.2Zm-4.2-20.2c-2 6.6-.6 15.8 4.2 20.2a17.5 17.5 0 0 1 .2-.7 5.5 5.5 0 0 1 5.7-4.5c2.8.1 4.3 1.5 4.7 4.7.2 1.1.2 2.3.2 3.5v.4c0 2.7.7 5.2 2.2 7.4a13 13 0 0 0 5.7 4.9v-.3l-.2-.3c-1.8-5.6-.5-9.5 4.4-12.8l1.5-1a73 73 0 0 0 3.2-2.2 16 16 0 0 0 6.8-11.4c.3-2 .1-4-.6-6l-.8.6-1.6 1a37 37 0 0 1-22.4 2.7c-5-.7-9.7-2-13.2-6.2Z" />
<style>
path { fill: #000; }
@media (prefers-color-scheme: dark) {
path { fill: #FFF; }
}
</style>
</svg>

@@ -0,0 +1,119 @@
// Generated by simple:form
import { type ComponentProps, createContext, useContext, useState } from 'react';
import { navigate } from 'astro:transitions/client';
import {
type FieldErrors,
type FormState,
type FormValidator,
formNameInputProps,
getInitialFormState,
toSetValidationErrors,
toTrackAstroSubmitStatus,
toValidateField,
validateForm,
} from 'simple:form';
export function useCreateFormContext(validator: FormValidator, fieldErrors?: FieldErrors) {
const initial = getInitialFormState({ validator, fieldErrors });
const [formState, setFormState] = useState<FormState>(initial);
return {
value: formState,
set: setFormState,
setValidationErrors: toSetValidationErrors(setFormState),
validateField: toValidateField(setFormState),
trackAstroSubmitStatus: toTrackAstroSubmitStatus(setFormState),
};
}
export function useFormContext() {
const formContext = useContext(FormContext);
if (!formContext) {
throw new Error(
'Form context not found. `useFormContext()` should only be called from children of a <Form> component.'
);
}
return formContext;
}
type FormContextType = ReturnType<typeof useCreateFormContext>;
const FormContext = createContext<FormContextType | undefined>(undefined);
export function Form({
children,
validator,
context,
fieldErrors,
name,
...formProps
}: {
validator: FormValidator;
context?: FormContextType;
fieldErrors?: FieldErrors;
} & Omit<ComponentProps<'form'>, 'method' | 'onSubmit'>) {
const formContext = context ?? useCreateFormContext(validator, fieldErrors);
return (
<FormContext.Provider value={formContext}>
<form
{...formProps}
method="POST"
onSubmit={async (e) => {
e.preventDefault();
e.stopPropagation();
const formData = new FormData(e.currentTarget);
formContext.set((formState) => ({
...formState,
isSubmitPending: true,
submitStatus: 'validating',
}));
const parsed = await validateForm({ formData, validator });
if (parsed.data) {
navigate(formProps.action ?? '', { formData });
return formContext.trackAstroSubmitStatus();
}
formContext.setValidationErrors(parsed.fieldErrors);
}}
>
{name ? <input {...formNameInputProps} value={name} /> : null}
{children}
</form>
</FormContext.Provider>
);
}
export function Input(inputProps: ComponentProps<'input'> & { name: string }) {
const formContext = useFormContext();
const fieldState = formContext.value.fields[inputProps.name];
if (!fieldState) {
throw new Error(
`Input "${inputProps.name}" not found in form. Did you use the <Form> component?`
);
}
const { hasErroredOnce, validationErrors, validator } = fieldState;
return (
<>
<input
onBlur={async (e) => {
const value = e.target.value;
if (value === '') return;
formContext.validateField(inputProps.name, value, validator);
}}
onChange={async (e) => {
if (!hasErroredOnce) return;
const value = e.target.value;
formContext.validateField(inputProps.name, value, validator);
}}
{...inputProps}
/>
{validationErrors?.map((e) => (
<p className="error" key={e}>
{e}
</p>
))}
</>
);
}

@@ -0,0 +1,80 @@
---
import { ViewTransitions } from 'astro:transitions';
import 'open-props/normalize';
import 'open-props/style';
interface Props {
title: string;
}
const { title } = Astro.props;
---
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="description" content="Astro description" />
<meta name="viewport" content="width=device-width" />
<link rel="icon" type="image/svg+xml" href="/favicon.svg" />
<meta name="generator" content={Astro.generator} />
<title>{title}</title>
<ViewTransitions handleForms />
</head>
<body>
<slot />
<style is:global>
main {
max-width: 600px;
margin: 0 auto;
padding: var(--size-4);
display: flex;
flex-direction: column;
gap: var(--size-4);
}
form {
display: flex;
flex-direction: column;
gap: var(--size-2);
margin-bottom: var(--size-4);
background: var(--surface-2);
padding-inline: var(--size-4);
padding-block: var(--size-6);
border-radius: var(--radius-2);
}
.error {
color: var(--red-6);
margin-bottom: var(--size-2);
grid-column: 1 / -1;
}
form button {
grid-column: 1 / -1;
background: var(--orange-8);
border-radius: var(--radius-2);
padding-block: var(--size-2);
}
.youre-going {
background: var(--surface-2);
padding: var(--size-2);
border-radius: var(--radius-2);
display: flex;
flex-direction: column;
}
h2 {
font-size: var(--font-size-4);
margin-bottom: var(--size-2);
}
.newsletter {
display: flex;
align-items: center;
gap: var(--size-2);
}
</style>
</body>
</html>

@@ -0,0 +1,40 @@
import { useState } from 'react';
import { z } from 'zod';
import { Form, Input } from '../../components/Form';
import { createForm } from 'simple:form';
export const ticketForm = createForm({
email: z.string().email(),
quantity: z.number().max(10),
newsletter: z.boolean(),
});
export function TicketForm({ price }: { price: number }) {
const [quantity, setQuantity] = useState(1);
return (
<>
<Form validator={ticketForm.validator}>
<h3>${(quantity * price) / 100}</h3>
<label htmlFor="quantity">Quantity</label>
<Input
id="quantity"
{...ticketForm.inputProps.quantity}
onInput={(e) => {
const value = Number(e.currentTarget.value);
setQuantity(value);
}}
/>
<label htmlFor="email">Email</label>
<Input id="email" {...ticketForm.inputProps.email} />
<div className="newsletter">
<Input id="newsletter" {...ticketForm.inputProps.newsletter} />
<label htmlFor="newsletter">Hear about the next event in your area</label>
</div>
<button>Buy tickets</button>
</Form>
</>
);
}

@@ -0,0 +1,50 @@
---
import { Event, Ticket, db, eq } from 'astro:db';
import Layout from '../../layouts/Layout.astro';
import { TicketForm, ticketForm } from './_Ticket';
const eventId = Number(Astro.params.event);
if (isNaN(eventId)) return Astro.redirect('/');
const event = await db.select().from(Event).where(eq(Event.id, eventId)).get();
if (!event) return Astro.redirect('/');
const res = await Astro.locals.form.getData(ticketForm);
if (res?.data) {
await db.insert(Ticket).values({
eventId,
email: res.data.email,
quantity: res.data.quantity,
newsletter: res.data.newsletter,
});
}
const ticket = await db.select().from(Ticket).where(eq(Ticket.eventId, eventId)).get();
---
<Layout title="Welcome to Astro.">
<main>
<h1>{event.name}</h1>
<p>
{event.description}
</p>
<TicketForm price={event.ticketPrice} client:load />
{
ticket && (
<section class="youre-going">
<h2>You're going 🙌</h2>
<p>
You have purchased {ticket.quantity} tickets for {event.name}!
</p>
<p>
Check <strong>{ticket.email}</strong> for your tickets.
</p>
</section>
)
}
</main>
</Layout>

@@ -0,0 +1,17 @@
---
import { Event, db } from 'astro:db';
const firstEvent = await db.select().from(Event).get();
---
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Eventbrite</title>
</head>
<body>
<meta http-equiv="refresh" content={`0; url=${firstEvent!.id}`} />
</body>
</html>

@@ -0,0 +1,7 @@
{
"extends": "astro/tsconfigs/strict",
"compilerOptions": {
"jsx": "react-jsx",
"jsxImportSource": "react"
}
}

@@ -0,0 +1,449 @@
import { expect } from 'chai';
import { describe, it } from 'mocha';
import {
getCollectionChangeQueries,
getMigrationQueries,
} from '../../dist/core/cli/migration-queries.js';
import { getCreateTableQuery } from '../../dist/core/queries.js';
import { collectionSchema, column, defineReadableTable } from '../../dist/core/types.js';
import { NOW } from '../../dist/runtime/index.js';
const COLLECTION_NAME = 'Users';
// `parse` to resolve schema transformations
// ex. convert column.date() to ISO strings
const userInitial = collectionSchema.parse(
defineReadableTable({
columns: {
name: column.text(),
age: column.number(),
email: column.text({ unique: true }),
mi: column.text({ optional: true }),
},
})
);
const defaultAmbiguityResponses = {
collectionRenames: {},
columnRenames: {},
};
function userChangeQueries(
oldCollection,
newCollection,
ambiguityResponses = defaultAmbiguityResponses
) {
return getCollectionChangeQueries({
collectionName: COLLECTION_NAME,
oldCollection,
newCollection,
ambiguityResponses,
});
}
function configChangeQueries(
oldCollections,
newCollections,
ambiguityResponses = defaultAmbiguityResponses
) {
return getMigrationQueries({
oldSnapshot: { schema: oldCollections, experimentalVersion: 1 },
newSnapshot: { schema: newCollections, experimentalVersion: 1 },
ambiguityResponses,
});
}
describe('column queries', () => {
describe('getMigrationQueries', () => {
it('should be empty when tables are the same', async () => {
const oldCollections = { [COLLECTION_NAME]: userInitial };
const newCollections = { [COLLECTION_NAME]: userInitial };
const { queries } = await configChangeQueries(oldCollections, newCollections);
expect(queries).to.deep.equal([]);
});
it('should create table for new tables', async () => {
const oldCollections = {};
const newCollections = { [COLLECTION_NAME]: userInitial };
const { queries } = await configChangeQueries(oldCollections, newCollections);
expect(queries).to.deep.equal([getCreateTableQuery(COLLECTION_NAME, userInitial)]);
});
it('should drop table for removed tables', async () => {
const oldCollections = { [COLLECTION_NAME]: userInitial };
const newCollections = {};
const { queries } = await configChangeQueries(oldCollections, newCollections);
expect(queries).to.deep.equal([`DROP TABLE "${COLLECTION_NAME}"`]);
});
it('should rename table for renamed tables', async () => {
const rename = 'Peeps';
const oldCollections = { [COLLECTION_NAME]: userInitial };
const newCollections = { [rename]: userInitial };
const { queries } = await configChangeQueries(oldCollections, newCollections, {
...defaultAmbiguityResponses,
collectionRenames: { [rename]: COLLECTION_NAME },
});
expect(queries).to.deep.equal([`ALTER TABLE "${COLLECTION_NAME}" RENAME TO "${rename}"`]);
});
});
describe('getCollectionChangeQueries', () => {
it('should be empty when tables are the same', async () => {
const { queries } = await userChangeQueries(userInitial, userInitial);
expect(queries).to.deep.equal([]);
});
it('should be empty when type updated to same underlying SQL type', async () => {
const blogInitial = collectionSchema.parse({
...userInitial,
columns: {
title: column.text(),
draft: column.boolean(),
},
});
const blogFinal = collectionSchema.parse({
...userInitial,
columns: {
...blogInitial.columns,
draft: column.number(),
},
});
const { queries } = await userChangeQueries(blogInitial, blogFinal);
expect(queries).to.deep.equal([]);
});
it('should respect user primary key without adding a hidden id', async () => {
const user = collectionSchema.parse({
...userInitial,
columns: {
...userInitial.columns,
id: column.number({ primaryKey: true }),
},
});
const userFinal = collectionSchema.parse({
...user,
columns: {
...user.columns,
name: column.text({ unique: true, optional: true }),
},
});
const { queries } = await userChangeQueries(user, userFinal);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (\"name\" text UNIQUE, \"age\" integer NOT NULL, \"email\" text NOT NULL UNIQUE, \"mi\" text, \"id\" integer PRIMARY KEY)`,
`INSERT INTO \"${tempTableName}\" (\"name\", \"age\", \"email\", \"mi\", \"id\") SELECT \"name\", \"age\", \"email\", \"mi\", \"id\" FROM \"Users\"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
describe('ALTER RENAME COLUMN', () => {
it('when renaming a column', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
},
};
userFinal.columns.middleInitial = userFinal.columns.mi;
delete userFinal.columns.mi;
const { queries } = await userChangeQueries(userInitial, userFinal, {
collectionRenames: {},
columnRenames: { [COLLECTION_NAME]: { middleInitial: 'mi' } },
});
expect(queries).to.deep.equal([
`ALTER TABLE "${COLLECTION_NAME}" RENAME COLUMN "mi" TO "middleInitial"`,
]);
});
});
describe('Lossy table recreate', () => {
it('when changing a column type', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
age: column.text(),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.deep.equal([
'DROP TABLE "Users"',
`CREATE TABLE "Users" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" text NOT NULL, "email" text NOT NULL UNIQUE, "mi" text)`,
]);
});
it('when adding a required column without a default', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
phoneNumber: column.text(),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.deep.equal([
'DROP TABLE "Users"',
`CREATE TABLE "Users" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL, "email" text NOT NULL UNIQUE, "mi" text, "phoneNumber" text NOT NULL)`,
]);
});
});
describe('Lossless table recreate', () => {
it('when adding a primary key', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
id: column.number({ primaryKey: true }),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (\"name\" text NOT NULL, \"age\" integer NOT NULL, \"email\" text NOT NULL UNIQUE, \"mi\" text, \"id\" integer PRIMARY KEY)`,
`INSERT INTO \"${tempTableName}\" (\"name\", \"age\", \"email\", \"mi\") SELECT \"name\", \"age\", \"email\", \"mi\" FROM \"Users\"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when dropping a primary key', async () => {
const user = {
...userInitial,
columns: {
...userInitial.columns,
id: column.number({ primaryKey: true }),
},
};
const { queries } = await userChangeQueries(user, userInitial);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (_id INTEGER PRIMARY KEY, \"name\" text NOT NULL, \"age\" integer NOT NULL, \"email\" text NOT NULL UNIQUE, \"mi\" text)`,
`INSERT INTO \"${tempTableName}\" (\"name\", \"age\", \"email\", \"mi\") SELECT \"name\", \"age\", \"email\", \"mi\" FROM \"Users\"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when adding an optional unique column', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
phoneNumber: column.text({ unique: true, optional: true }),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL, "email" text NOT NULL UNIQUE, "mi" text, "phoneNumber" text UNIQUE)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "email", "mi") SELECT "_id", "name", "age", "email", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when dropping unique column', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
},
};
delete userFinal.columns.email;
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL, "mi" text)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "mi") SELECT "_id", "name", "age", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when updating to a runtime default', async () => {
const initial = collectionSchema.parse({
...userInitial,
columns: {
...userInitial.columns,
age: column.date(),
},
});
const userFinal = collectionSchema.parse({
...initial,
columns: {
...initial.columns,
age: column.date({ default: NOW }),
},
});
const { queries } = await userChangeQueries(initial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" text NOT NULL DEFAULT CURRENT_TIMESTAMP, "email" text NOT NULL UNIQUE, "mi" text)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "email", "mi") SELECT "_id", "name", "age", "email", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when adding a column with a runtime default', async () => {
const userFinal = collectionSchema.parse({
...userInitial,
columns: {
...userInitial.columns,
birthday: column.date({ default: NOW }),
},
});
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL, "email" text NOT NULL UNIQUE, "mi" text, "birthday" text NOT NULL DEFAULT CURRENT_TIMESTAMP)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "email", "mi") SELECT "_id", "name", "age", "email", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
/**
* REASON: to follow the "expand" and "contract" migration model,
* you'll need to update the schema from NOT NULL to NULL.
* It's up to the user to ensure all data follows the new schema!
*
* @see https://planetscale.com/blog/safely-making-database-schema-changes#backwards-compatible-changes
*/
it('when changing a column to required', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
mi: column.text(),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL, "email" text NOT NULL UNIQUE, "mi" text NOT NULL)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "email", "mi") SELECT "_id", "name", "age", "email", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
it('when changing a column to unique', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
age: column.number({ unique: true }),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.have.lengthOf(4);
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.be.a('string');
expect(queries).to.deep.equal([
`CREATE TABLE "${tempTableName}" (_id INTEGER PRIMARY KEY, "name" text NOT NULL, "age" integer NOT NULL UNIQUE, "email" text NOT NULL UNIQUE, "mi" text)`,
`INSERT INTO "${tempTableName}" ("_id", "name", "age", "email", "mi") SELECT "_id", "name", "age", "email", "mi" FROM "Users"`,
'DROP TABLE "Users"',
`ALTER TABLE "${tempTableName}" RENAME TO "Users"`,
]);
});
});
describe('ALTER ADD COLUMN', () => {
it('when adding an optional column', async () => {
const userFinal = {
...userInitial,
columns: {
...userInitial.columns,
birthday: column.date({ optional: true }),
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.deep.equal(['ALTER TABLE "Users" ADD COLUMN "birthday" text']);
});
it('when adding a required column with default', async () => {
const defaultDate = new Date('2023-01-01');
const userFinal = collectionSchema.parse({
...userInitial,
columns: {
...userInitial.columns,
birthday: column.date({ default: defaultDate }),
},
});
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.deep.equal([
`ALTER TABLE "Users" ADD COLUMN "birthday" text NOT NULL DEFAULT '${defaultDate.toISOString()}'`,
]);
});
});
describe('ALTER DROP COLUMN', () => {
it('when removing optional or required columns', async () => {
const userFinal = {
...userInitial,
columns: {
name: userInitial.columns.name,
email: userInitial.columns.email,
},
};
const { queries } = await userChangeQueries(userInitial, userFinal);
expect(queries).to.deep.equal([
'ALTER TABLE "Users" DROP COLUMN "age"',
'ALTER TABLE "Users" DROP COLUMN "mi"',
]);
});
});
});
});
/** @param {string} query */
function getTempTableName(query) {
// eslint-disable-next-line regexp/no-unused-capturing-group
return query.match(/Users_([a-z\d]+)/)?.[0];
}

@@ -0,0 +1,97 @@
import { expect } from 'chai';
import { describe, it } from 'mocha';
import { getCollectionChangeQueries } from '../../dist/core/cli/migration-queries.js';
import { collectionSchema, column } from '../../dist/core/types.js';
const userInitial = collectionSchema.parse({
columns: {
name: column.text(),
age: column.number(),
email: column.text({ unique: true }),
mi: column.text({ optional: true }),
},
indexes: {},
writable: false,
});
describe('index queries', () => {
it('adds indexes', async () => {
/** @type {import('../../dist/types.js').DBTable} */
const userFinal = {
...userInitial,
indexes: {
nameIdx: { on: ['name'], unique: false },
emailIdx: { on: ['email'], unique: true },
},
};
const { queries } = await getCollectionChangeQueries({
collectionName: 'user',
oldCollection: userInitial,
newCollection: userFinal,
});
expect(queries).to.deep.equal([
'CREATE INDEX "nameIdx" ON "user" ("name")',
'CREATE UNIQUE INDEX "emailIdx" ON "user" ("email")',
]);
});
it('drops indexes', async () => {
/** @type {import('../../dist/types.js').DBTable} */
const initial = {
...userInitial,
indexes: {
nameIdx: { on: ['name'], unique: false },
emailIdx: { on: ['email'], unique: true },
},
};
/** @type {import('../../dist/types.js').DBTable} */
const final = {
...userInitial,
indexes: {},
};
const { queries } = await getCollectionChangeQueries({
collectionName: 'user',
oldCollection: initial,
newCollection: final,
});
expect(queries).to.deep.equal(['DROP INDEX "nameIdx"', 'DROP INDEX "emailIdx"']);
});
it('drops and recreates modified indexes', async () => {
/** @type {import('../../dist/types.js').DBTable} */
const initial = {
...userInitial,
indexes: {
nameIdx: { on: ['name'], unique: false },
emailIdx: { on: ['email'], unique: true },
},
};
/** @type {import('../../dist/types.js').DBTable} */
const final = {
...userInitial,
indexes: {
nameIdx: { on: ['name'], unique: true },
emailIdx: { on: ['email'] },
},
};
const { queries } = await getCollectionChangeQueries({
collectionName: 'user',
oldCollection: initial,
newCollection: final,
});
expect(queries).to.deep.equal([
'DROP INDEX "nameIdx"',
'DROP INDEX "emailIdx"',
'CREATE UNIQUE INDEX "nameIdx" ON "user" ("name")',
'CREATE INDEX "emailIdx" ON "user" ("email")',
]);
});
});

@@ -0,0 +1,178 @@
import { expect } from 'chai';
import { describe, it } from 'mocha';
import { getCollectionChangeQueries } from '../../dist/core/cli/migration-queries.js';
import { column, defineReadableTable, tablesSchema } from '../../dist/core/types.js';
const BaseUser = defineReadableTable({
columns: {
id: column.number({ primaryKey: true }),
name: column.text(),
age: column.number(),
email: column.text({ unique: true }),
mi: column.text({ optional: true }),
},
});
const BaseSentBox = defineReadableTable({
columns: {
to: column.number(),
toName: column.text(),
subject: column.text(),
body: column.text(),
},
});
const defaultAmbiguityResponses = {
collectionRenames: {},
columnRenames: {},
};
/**
* @typedef {import('../../dist/core/types.js').DBTable} DBTable
* @param {{ User: DBTable, SentBox: DBTable }} params
* @returns
*/
function resolveReferences(
{ User = BaseUser, SentBox = BaseSentBox } = {
User: BaseUser,
SentBox: BaseSentBox,
}
) {
return tablesSchema.parse({ User, SentBox });
}
function userChangeQueries(
oldCollection,
newCollection,
ambiguityResponses = defaultAmbiguityResponses
) {
return getCollectionChangeQueries({
collectionName: 'User',
oldCollection,
newCollection,
ambiguityResponses,
});
}
describe('reference queries', () => {
it('adds references with lossless table recreate', async () => {
const { SentBox: Initial } = resolveReferences();
const { SentBox: Final } = resolveReferences({
SentBox: defineReadableTable({
columns: {
...BaseSentBox.columns,
to: column.number({ references: () => BaseUser.columns.id }),
},
}),
});
const { queries } = await userChangeQueries(Initial, Final);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.not.be.undefined;
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (_id INTEGER PRIMARY KEY, \"to\" integer NOT NULL REFERENCES \"User\" (\"id\"), \"toName\" text NOT NULL, \"subject\" text NOT NULL, \"body\" text NOT NULL)`,
`INSERT INTO \"${tempTableName}\" (\"_id\", \"to\", \"toName\", \"subject\", \"body\") SELECT \"_id\", \"to\", \"toName\", \"subject\", \"body\" FROM \"User\"`,
'DROP TABLE "User"',
`ALTER TABLE \"${tempTableName}\" RENAME TO \"User\"`,
]);
});
it('removes references with lossless table recreate', async () => {
const { SentBox: Initial } = resolveReferences({
SentBox: defineReadableTable({
columns: {
...BaseSentBox.columns,
to: column.number({ references: () => BaseUser.columns.id }),
},
}),
});
const { SentBox: Final } = resolveReferences();
const { queries } = await userChangeQueries(Initial, Final);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(tempTableName).to.not.be.undefined;
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (_id INTEGER PRIMARY KEY, \"to\" integer NOT NULL, \"toName\" text NOT NULL, \"subject\" text NOT NULL, \"body\" text NOT NULL)`,
`INSERT INTO \"${tempTableName}\" (\"_id\", \"to\", \"toName\", \"subject\", \"body\") SELECT \"_id\", \"to\", \"toName\", \"subject\", \"body\" FROM \"User\"`,
'DROP TABLE "User"',
`ALTER TABLE \"${tempTableName}\" RENAME TO \"User\"`,
]);
});
it('does not use ADD COLUMN when adding optional column with reference', async () => {
const { SentBox: Initial } = resolveReferences();
const { SentBox: Final } = resolveReferences({
SentBox: defineReadableTable({
columns: {
...BaseSentBox.columns,
from: column.number({ references: () => BaseUser.columns.id, optional: true }),
},
}),
});
const { queries } = await userChangeQueries(Initial, Final);
expect(queries[0]).to.not.be.undefined;
const tempTableName = getTempTableName(queries[0]);
expect(queries).to.deep.equal([
`CREATE TABLE \"${tempTableName}\" (_id INTEGER PRIMARY KEY, \"to\" integer NOT NULL, \"toName\" text NOT NULL, \"subject\" text NOT NULL, \"body\" text NOT NULL, \"from\" integer REFERENCES \"User\" (\"id\"))`,
`INSERT INTO \"${tempTableName}\" (\"_id\", \"to\", \"toName\", \"subject\", \"body\") SELECT \"_id\", \"to\", \"toName\", \"subject\", \"body\" FROM \"User\"`,
'DROP TABLE "User"',
`ALTER TABLE \"${tempTableName}\" RENAME TO \"User\"`,
]);
});
it('adds and updates foreign key with lossless table recreate', async () => {
const { SentBox: InitialWithoutFK } = resolveReferences();
const { SentBox: InitialWithDifferentFK } = resolveReferences({
SentBox: defineReadableTable({
...BaseSentBox,
foreignKeys: [{ columns: ['to'], references: () => [BaseUser.columns.id] }],
}),
});
const { SentBox: Final } = resolveReferences({
SentBox: defineReadableTable({
...BaseSentBox,
foreignKeys: [
{
columns: ['to', 'toName'],
references: () => [BaseUser.columns.id, BaseUser.columns.name],
},
],
}),
});
const expected = (tempTableName) => [
`CREATE TABLE \"${tempTableName}\" (_id INTEGER PRIMARY KEY, \"to\" integer NOT NULL, \"toName\" text NOT NULL, \"subject\" text NOT NULL, \"body\" text NOT NULL, FOREIGN KEY (\"to\", \"toName\") REFERENCES \"User\"(\"id\", \"name\"))`,
`INSERT INTO \"${tempTableName}\" (\"_id\", \"to\", \"toName\", \"subject\", \"body\") SELECT \"_id\", \"to\", \"toName\", \"subject\", \"body\" FROM \"User\"`,
'DROP TABLE "User"',
`ALTER TABLE \"${tempTableName}\" RENAME TO \"User\"`,
];
const addedForeignKey = await userChangeQueries(InitialWithoutFK, Final);
const updatedForeignKey = await userChangeQueries(InitialWithDifferentFK, Final);
expect(addedForeignKey.queries[0]).to.not.be.undefined;
expect(updatedForeignKey.queries[0]).to.not.be.undefined;
expect(addedForeignKey.queries).to.deep.equal(
expected(getTempTableName(addedForeignKey.queries[0]))
);
expect(updatedForeignKey.queries).to.deep.equal(
expected(getTempTableName(updatedForeignKey.queries[0]))
);
});
});
/** @param {string | undefined} query */
function getTempTableName(query) {
// eslint-disable-next-line regexp/no-unused-capturing-group
return query.match(/User_([a-z\d]+)/)?.[0];
}

@@ -0,0 +1,7 @@
{
"extends": "../../tsconfig.base.json",
"include": ["src"],
"compilerOptions": {
"outDir": "./dist"
}
}

File diff suppressed because it is too large.