Utilities Reference
Baasix exports various utility functions and modules that can be used in your extensions, hooks, and custom code. All utilities are exported from the main @tspvivek/baasix package.
Table of Contents
- Environment (env)
- Logger
- Database (db)
- Cache
- Schema Manager
- Query Utilities
- Spatial Utilities
- Field Utilities
- Import Utilities
- Schema Validator
- Seed Utilities
- Workflow Utilities
- Sort Utilities
- Error Handling
- Type Mapper
- Relation Utilities
- System Schemas
- Custom Types
- Soft Delete Plugin
- Auth Utilities
Importing Utilities
import {
// Environment
env,
// Logger
getLogger,
initializeLogger,
getOriginalConsole,
// Database
initializeDatabase,
getDatabase,
getSqlClient,
testConnection,
closeDatabase,
// Cache
getCacheService,
initializeCacheService,
closeCacheService,
invalidateCollection,
invalidateEntireCache,
// Schema
schemaManager,
systemSchemas,
// Query building
drizzleWhere,
drizzleOrder,
combineFilters,
applyPagination,
applyFullTextSearch,
// Utilities
spatialUtils,
fieldUtils,
importUtils,
schemaValidator,
seedUtility,
// Sorting
sortItems,
reorderItems,
getNextSortValue,
// Error handling
APIError,
errorHandler,
// Type mapping
mapJsonTypeToDrizzle,
typeMapper,
// Relations
relationBuilder,
RelationBuilder,
// Workflow
checkWorkflowRoleAccess,
validateWorkflowAccess,
// Custom types
point,
arrayText,
rangeInteger,
// Soft delete
withSoftDelete,
SoftDeleteHelper,
// Auth
createAuth,
google,
facebook,
apple,
github,
} from '@tspvivek/baasix';
Environment (env)
Access environment variables with type conversion and caching.
Methods
get
Get an environment variable.
env.get(key: string, defaultValue?: string): string | undefined
Example:
const port = env.get('PORT', '3000');
const dbUrl = env.get('DATABASE_URL');
require
Get a required environment variable (throws if not set).
env.require(key: string): string
Example:
const secretKey = env.require('SECRET_KEY'); // Throws if not set
getBoolean
Get a boolean environment variable.
env.getBoolean(key: string, defaultValue?: boolean): boolean
Example:
const isDebug = env.getBoolean('DEBUG', false);
const isMultiTenant = env.getBoolean('MULTI_TENANT');
getNumber
Get a numeric environment variable.
env.getNumber(key: string, defaultValue?: number): number | undefined
Example:
const maxFileSize = env.getNumber('MAX_FILE_SIZE', 52428800);
const poolSize = env.getNumber('DB_POOL_SIZE', 10);
set
Set an environment variable (useful for testing).
env.set(key: string, value: string): void
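Example (a minimal sketch; the overridden variables are illustrative, e.g. in a test setup):
import { env } from '@tspvivek/baasix';
// Override configuration before the server starts
env.set('CACHE_ENABLED', 'false');
env.set('LOG_LEVEL', 'silent');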
Logger
Baasix uses Pino for high-performance structured logging. The logger writes to stdout by default and can be configured with custom transports for services like Datadog, Grafana Loki, Elasticsearch, and more.
Server Configuration
Configure the logger when starting the server:
import { startServer } from "@tspvivek/baasix";
// Basic usage - pretty printing in dev, JSON in production
startServer();
// With custom logger options
startServer({
port: 8056,
logger: {
level: "info", // Log level
pretty: true, // Human-readable output
}
});
Logger Options
| Option | Type | Description |
|---|---|---|
| level | string | Log level: 'fatal', 'error', 'warn', 'info', 'debug', 'trace', 'silent' |
| pretty | boolean | Enable pretty printing (default: true in development) |
| transport | object | Custom Pino transport configuration |
| destination | stream | Custom destination stream |
| options | object | Additional Pino options |
Functions
getLogger
Get the initialized Pino logger instance.
import { getLogger } from "@tspvivek/baasix";
const logger = getLogger();
logger.info("Server started");
logger.error({ err: error }, "Failed to process request");
logger.debug({ userId, action }, "User action logged");
initializeLogger
Initialize the logger with custom options (usually done automatically by startServer).
import { initializeLogger } from "@tspvivek/baasix";
const logger = initializeLogger({
level: "debug",
pretty: true
});
Log Levels
| Level | Priority | Use Case |
|---|---|---|
| fatal | 60 | Application crash |
| error | 50 | Error conditions |
| warn | 40 | Warning conditions |
| info | 30 | Informational messages (default) |
| debug | 20 | Debug information |
| trace | 10 | Detailed trace information |
| silent | - | Disable all logging |
Console Override
Baasix automatically overrides console.* methods to use the Pino logger:
- console.info() → logger.info()
- console.warn() → logger.warn()
- console.error() → logger.error()
- console.log() → logger.debug() (only when LOG_LEVEL=debug or trace)
- console.debug() → logger.debug() (only when LOG_LEVEL=debug or trace)
Transport Examples
File Logging
startServer({
logger: {
transport: {
target: "pino/file",
options: { destination: "./logs/app.log" }
}
}
});
Multiple Transports
startServer({
logger: {
transport: {
targets: [
{ target: "pino-pretty", options: { colorize: true }, level: "info" },
{ target: "pino/file", options: { destination: "./logs/error.log" }, level: "error" }
]
}
}
});
Datadog Integration
npm install pino-datadog-transport
startServer({
logger: {
transport: {
target: "pino-datadog-transport",
options: {
apiKey: process.env.DD_API_KEY,
service: "baasix-api",
env: process.env.NODE_ENV,
ddsource: "nodejs"
}
}
}
});
Grafana Loki Integration
npm install pino-loki
startServer({
logger: {
transport: {
target: "pino-loki",
options: {
host: "http://localhost:3100",
labels: { application: "baasix-api", environment: "production" }
}
}
}
});
Elasticsearch Integration
npm install pino-elasticsearch
startServer({
logger: {
transport: {
target: "pino-elasticsearch",
options: {
node: "http://localhost:9200",
index: "baasix-logs",
esVersion: 8
}
}
}
});
Environment Variables
| Variable | Description | Default |
|---|---|---|
| LOG_LEVEL | Set the log level (fatal, error, warn, info, debug, trace, silent) | info |
| NODE_ENV | When "development", enables pretty printing | - |
Popular Pino Transports
| Transport | Package | Description |
|---|---|---|
| Pretty | pino-pretty | Human-readable console output |
| File | pino/file | Write logs to file |
| Datadog | pino-datadog-transport | Datadog integration |
| Loki | pino-loki | Grafana Loki integration |
| Elasticsearch | pino-elasticsearch | Elasticsearch integration |
| Sentry | pino-sentry-transport | Sentry error tracking |
| CloudWatch | pino-cloudwatch | AWS CloudWatch Logs |
| Logtail | @logtail/pino | Logtail/Better Stack |
Database (db)
Database connection and transaction management using Drizzle ORM with PostgreSQL.
Functions
initializeDatabase
Initialize the database connection.
function initializeDatabase(): Database
Example:
import { initializeDatabase } from '@tspvivek/baasix';
// Called automatically by Baasix, but can be used manually
const db = initializeDatabase();
getDatabase
Get the database instance (initializes if needed).
function getDatabase(): Database
Example:
import { getDatabase } from '@tspvivek/baasix';
import { eq } from 'drizzle-orm';
const db = getDatabase();
// Use Drizzle ORM queries
const users = await db.select().from(usersTable).where(eq(usersTable.status, 'active'));
getSqlClient
Get the raw PostgreSQL client for advanced queries.
function getSqlClient(): PostgresClient
Example:
import { getSqlClient } from '@tspvivek/baasix';
const sql = getSqlClient();
// Raw SQL query
const result = await sql`SELECT COUNT(*) FROM users WHERE status = 'active'`;
testConnection
Test the database connection.
async function testConnection(): Promise<boolean>
closeDatabase
Close the database connection.
async function closeDatabase(): Promise<void>
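Example (a sketch of a startup check and graceful shutdown; the surrounding script logic is illustrative):
import { testConnection, closeDatabase, getLogger } from '@tspvivek/baasix';
const logger = getLogger();
// Verify connectivity before doing any work
const ok = await testConnection();
if (!ok) {
  logger.error('Database connection test failed');
}
// On shutdown, release the connection pool
process.on('SIGTERM', async () => {
  await closeDatabase();
  process.exit(0);
});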
Transactions
import { getDatabase } from '@tspvivek/baasix';
const db = getDatabase();
await db.transaction(async (tx) => {
// All operations in this callback are in a transaction
await tx.insert(ordersTable).values({ ... });
await tx.insert(orderItemsTable).values({ ... });
// Transaction automatically commits on success
// or rolls back on error
});
Cache
High-performance caching with automatic invalidation.
Functions
getCacheService
Get the cache service instance.
function getCacheService(): CacheService | null
initializeCacheService
Initialize the cache service (called automatically).
async function initializeCacheService(): Promise<CacheService>
closeCacheService
Close the cache service connection.
async function closeCacheService(): Promise<void>
invalidateCollection
Invalidate all cache entries for a collection.
async function invalidateCollection(collection: string): Promise<void>
invalidateEntireCache
Clear all cache entries.
async function invalidateEntireCache(): Promise<void>
Cache Service Methods
const cache = getCacheService();
// Get a cached value
const value = await cache.get('my-key');
// Set a cached value (TTL in seconds)
await cache.set('my-key', { data: 'value' }, 3600);
// Delete a cached value
await cache.delete('my-key');
// Invalidate by pattern
await cache.invalidateByPattern('user:*');
// Invalidate by tables (used internally)
await cache.invalidateByTables(['posts', 'comments']);
// Get cache stats
const stats = await cache.getStats();
console.log(`Keys: ${stats.keys}, Size: ${stats.size}`);
Cache Configuration
# Enable caching
CACHE_ENABLED=true
# Cache adapter: memory, redis, upstash
CACHE_ADAPTER=redis
# Redis configuration
CACHE_REDIS_URL=redis://localhost:6379
# Upstash configuration
CACHE_UPSTASH_URL=https://your-url.upstash.io
CACHE_UPSTASH_TOKEN=your-token
# Default TTL in seconds
CACHE_TTL=3600
# In-memory cache size limit (GB)
CACHE_MAX_SIZE_GB=1
Schema Manager
Dynamic schema management for collections and relations.
Methods
getTable
Get a Drizzle table instance for a collection.
schemaManager.getTable(collection: string): PgTable
Example:
import { schemaManager, getDatabase } from '@tspvivek/baasix';
import { eq } from 'drizzle-orm';
const db = getDatabase();
const postsTable = schemaManager.getTable('posts');
const posts = await db.select().from(postsTable).where(eq(postsTable.status, 'published'));
getPrimaryKey
Get the primary key field name for a collection.
schemaManager.getPrimaryKey(collection: string): string
getSchema
Get the full schema definition for a collection.
schemaManager.getSchema(collection: string): SchemaDefinition
getRelation
Get relation definition between collections.
schemaManager.getRelation(collection: string, relationName: string): RelationDefinition
Example:
const relation = schemaManager.getRelation('posts', 'author');
// Returns: { type: 'BelongsTo', relatedCollection: 'users', foreignKey: 'author_Id' }
getRelationNames
Get all relation names for a collection.
schemaManager.getRelationNames(collection: string): string[]
isParanoid
Check if a collection has soft delete enabled.
schemaManager.isParanoid(collection: string): boolean
hasField
Check if a collection has a specific field.
schemaManager.hasField(collection: string, fieldName: string): boolean
getFieldDefinition
Get field definition for a collection field.
schemaManager.getFieldDefinition(collection: string, fieldName: string): FieldDefinition
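Example (a combined sketch of the lookup helpers; the 'posts' collection, its fields, and the commented return values are illustrative):
import { schemaManager } from '@tspvivek/baasix';
const pk = schemaManager.getPrimaryKey('posts');           // e.g. 'id'
const softDeletes = schemaManager.isParanoid('posts');     // true if soft delete is enabled
const relations = schemaManager.getRelationNames('posts'); // e.g. ['author', 'comments']
if (schemaManager.hasField('posts', 'status')) {
  const statusField = schemaManager.getFieldDefinition('posts', 'status');
  console.log(statusField.type);
}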
Query Utilities
Build Drizzle ORM queries from filter objects.
drizzleWhere
Convert a filter object to a Drizzle WHERE clause.
function drizzleWhere(
filter: FilterObject,
options: {
table: PgTable;
tableName: string;
schema?: any;
joins?: any[];
}
): SQL | undefined
Filter Operators:
| Operator | Description | Example |
|---|---|---|
| eq | Equal | { status: { eq: 'active' } } |
| ne | Not equal | { status: { ne: 'deleted' } } |
| gt | Greater than | { price: { gt: 100 } } |
| gte | Greater than or equal | { price: { gte: 100 } } |
| lt | Less than | { price: { lt: 100 } } |
| lte | Less than or equal | { price: { lte: 100 } } |
| in | In array | { status: { in: ['active', 'pending'] } } |
| notIn | Not in array | { status: { notIn: ['deleted'] } } |
| contains | Contains substring | { title: { contains: 'search' } } |
| startsWith | Starts with | { name: { startsWith: 'John' } } |
| endsWith | Ends with | { email: { endsWith: '@example.com' } } |
| isNull | Is null | { deletedAt: { isNull: true } } |
| between | Between range | { price: { between: [10, 100] } } |
| AND | Logical AND | { AND: [{ status: 'active' }, { role: 'admin' }] } |
| OR | Logical OR | { OR: [{ status: 'active' }, { status: 'pending' }] } |
Example:
import { drizzleWhere, schemaManager, getDatabase } from '@tspvivek/baasix';
const db = getDatabase();
const table = schemaManager.getTable('posts');
const whereClause = drizzleWhere({
status: { eq: 'published' },
createdAt: { gte: '2024-01-01' },
OR: [
{ category: 'tech' },
{ category: 'science' }
]
}, {
table,
tableName: 'posts'
});
const posts = await db.select().from(table).where(whereClause);
drizzleOrder
Convert sort specification to Drizzle ORDER BY clause.
function drizzleOrder(
sort: string | string[] | Record<string, 'asc' | 'desc'>,
options: { table: PgTable; tableName: string }
): SQL[]
Example:
import { drizzleOrder, schemaManager } from '@tspvivek/baasix';
const table = schemaManager.getTable('posts');
// Array format
const orderBy1 = drizzleOrder(['-createdAt', 'title'], { table, tableName: 'posts' });
// Object format
const orderBy2 = drizzleOrder({ createdAt: 'desc', title: 'asc' }, { table, tableName: 'posts' });
combineFilters
Combine multiple filter objects with AND logic.
function combineFilters(...filters: FilterObject[]): FilterObject
Example:
const baseFilter = { status: 'active' };
const searchFilter = { title: { contains: 'search' } };
const tenantFilter = { tenant_Id: 'tenant-123' };
const combined = combineFilters(baseFilter, searchFilter, tenantFilter);
applyPagination
Calculate limit and offset from pagination options.
function applyPagination(options: {
limit?: number;
page?: number;
offset?: number;
}): { limit?: number; offset?: number }
Example:
const { limit, offset } = applyPagination({ page: 2, limit: 10 });
// Returns: { limit: 10, offset: 10 }
applyFullTextSearch
Build full-text search conditions.
function applyFullTextSearch(
collection: string,
table: PgTable,
searchQuery: string,
searchFields?: string[],
sortByRelevance?: boolean
): { searchCondition: SQL; orderClause?: SQL }
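Example (a sketch assuming a 'posts' collection with title and content fields; the exact search behavior depends on your schema configuration):
import { applyFullTextSearch, schemaManager, getDatabase } from '@tspvivek/baasix';
const db = getDatabase();
const table = schemaManager.getTable('posts');
// Build a search condition across the given fields, optionally sorted by relevance
const { searchCondition, orderClause } = applyFullTextSearch(
  'posts',
  table,
  'drizzle performance',
  ['title', 'content'],
  true
);
const results = orderClause
  ? await db.select().from(table).where(searchCondition).orderBy(orderClause)
  : await db.select().from(table).where(searchCondition);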
Spatial Utilities
PostGIS spatial/geospatial query utilities.
Methods
createPoint
Create a PostGIS point from coordinates.
spatialUtils.createPoint(lat: number, lng: number): SQL
createPolygon
Create a PostGIS polygon from coordinates array.
spatialUtils.createPolygon(coordinates: [number, number][]): SQL
withinRadius
Filter by distance from a point.
spatialUtils.withinRadius(
column: string,
lat: number,
lng: number,
radiusMeters: number
): SQL
Example:
import { spatialUtils, schemaManager, getDatabase } from '@tspvivek/baasix';
const db = getDatabase();
const storesTable = schemaManager.getTable('stores');
// Find stores within 5km of a location
const nearbyStores = await db
.select()
.from(storesTable)
.where(spatialUtils.withinRadius('location', 40.7128, -74.0060, 5000));
calculateDistance
Calculate distance between two points.
spatialUtils.calculateDistance(
column: string,
lat: number,
lng: number
): SQL
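Example (a sketch reusing the stores table from the withinRadius example above; the selected columns and the use of the distance expression for ordering are illustrative, and the distance unit depends on the column's spatial type):
// Return each store with its computed distance from the given point, nearest first
const storesWithDistance = await db
  .select({
    id: storesTable.id,
    distance: spatialUtils.calculateDistance('location', 40.7128, -74.0060)
  })
  .from(storesTable)
  .orderBy(spatialUtils.calculateDistance('location', 40.7128, -74.0060));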
intersects
Check if geometries intersect.
spatialUtils.intersects(column: string, geometry: SQL): SQL
Field Utilities
Field validation and manipulation utilities.
Methods
validateFieldValue
Validate a field value against its type.
fieldUtils.validateFieldValue(
value: any,
fieldDef: FieldDefinition
): { valid: boolean; error?: string }
coerceFieldValue
Coerce a value to the correct type for a field.
fieldUtils.coerceFieldValue(
value: any,
fieldType: string
): any
getDefaultValue
Get the default value for a field.
fieldUtils.getDefaultValue(fieldDef: FieldDefinition): any
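Example (a sketch; the 'products' collection and its fields are illustrative, and the field definitions come from your own schema):
import { fieldUtils, schemaManager } from '@tspvivek/baasix';
const priceDef = schemaManager.getFieldDefinition('products', 'price');
// Validate raw input against the field definition
const { valid, error } = fieldUtils.validateFieldValue('19.99', priceDef);
if (!valid) {
  console.warn(`Invalid price: ${error}`);
}
// Coerce a string to the field's type
const price = fieldUtils.coerceFieldValue('19.99', 'Decimal');
// Fall back to the schema default when no value is provided
const status = fieldUtils.getDefaultValue(schemaManager.getFieldDefinition('products', 'status'));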
Import Utilities
Bulk import from CSV/JSON files.
Methods
importFromCSV
Import records from a CSV file.
async importUtils.importFromCSV(
collection: string,
filePath: string,
options?: ImportOptions
): Promise<ImportResult>
ImportOptions:
| Option | Type | Description |
|---|---|---|
| delimiter | string | CSV delimiter (default: ',') |
| skipHeader | boolean | Skip first row |
| mapping | object | Field name mapping |
| onError | 'skip' \| 'throw' | Error handling |
Example:
import { importUtils } from '@tspvivek/baasix';
const result = await importUtils.importFromCSV('products', './import/products.csv', {
mapping: {
'Product Name': 'name',
'Price USD': 'price',
'Category': 'category_Id'
},
onError: 'skip'
});
console.log(`Imported: ${result.success}, Failed: ${result.failed}`);
importFromJSON
Import records from a JSON file.
async importUtils.importFromJSON(
collection: string,
filePath: string,
options?: ImportOptions
): Promise<ImportResult>
exportToCSV
Export records to CSV.
async importUtils.exportToCSV(
collection: string,
filePath: string,
query?: QueryOptions
): Promise<number>
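Example (a sketch; the file paths are illustrative and the query object passed to exportToCSV assumes the standard filter syntax):
import { importUtils } from '@tspvivek/baasix';
// Import records from a JSON file, skipping rows that fail
const jsonResult = await importUtils.importFromJSON('products', './import/products.json', {
  onError: 'skip'
});
console.log(`Imported: ${jsonResult.success}, Failed: ${jsonResult.failed}`);
// Export matching records to CSV; resolves with the number of rows written
const exported = await importUtils.exportToCSV('products', './export/products.csv', {
  filter: { status: { eq: 'active' } }
});
console.log(`Exported ${exported} rows`);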
Schema Validator
Validate schema definitions.
Methods
validateSchema
Validate a complete schema definition.
schemaValidator.validateSchema(schema: SchemaDefinition): ValidationResult
validateField
Validate a field definition.
schemaValidator.validateField(field: FieldDefinition): ValidationResult
validateRelation
Validate a relation definition.
schemaValidator.validateRelation(relation: RelationDefinition): ValidationResult
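Example (a sketch; the exact shape of ValidationResult is an assumption here — adjust to the result fields your version returns):
import { schemaValidator } from '@tspvivek/baasix';
const result = schemaValidator.validateField({
  type: 'String',
  values: { stringLength: 100 },
  allowNull: false
});
if (!result.valid) {
  // Hypothetical error field; inspect the actual ValidationResult in your version
  console.error('Invalid field definition:', result.errors);
}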
Seed Utilities
Database seeding for development and testing.
Methods
seedCollection
Seed a collection with data.
async seedUtility.seedCollection(
collection: string,
data: any[],
options?: SeedOptions
): Promise<SeedResult>
SeedOptions:
| Option | Type | Description |
|---|---|---|
| truncate | boolean | Clear existing data first |
| skipDuplicates | boolean | Skip duplicate key errors |
| accountability | object | User context for seeding |
seedMultiple
Seed multiple collections with dependency order.
async seedUtility.seedMultiple(
seeds: SeedConfig[]
): Promise<SeedResult[]>
generateTemplate
Generate a seed template for a collection.
seedUtility.generateTemplate(collection: string): object
Example:
import { seedUtility } from '@tspvivek/baasix';
// Seed with sample data
await seedUtility.seedCollection('categories', [
{ name: 'Electronics', slug: 'electronics' },
{ name: 'Clothing', slug: 'clothing' },
{ name: 'Books', slug: 'books' }
], { truncate: true });
// Seed multiple collections in order
await seedUtility.seedMultiple([
{
collection: 'categories',
data: categoriesData,
truncate: true
},
{
collection: 'products',
data: productsData,
truncate: true
}
]);
Workflow Utilities
Workflow access control utilities.
Methods
checkWorkflowRoleAccess
Check if a role has access to execute a workflow.
async checkWorkflowRoleAccess(
workflowId: string | number,
roleId: string | number
): Promise<boolean>
validateWorkflowAccess
Validate complete workflow access for a user.
async validateWorkflowAccess(
workflowId: string | number,
accountability: any
): Promise<{ hasAccess: boolean; reason?: string }>
fetchAndValidateWorkflow
Fetch workflow and validate access in one call.
async fetchAndValidateWorkflow(
workflowId: string | number,
accountability: any
): Promise<{ workflow: Workflow; hasAccess: boolean }>
Example:
import { validateWorkflowAccess } from '@tspvivek/baasix';
// In a custom endpoint
app.post('/run-workflow/:id', async (req, res) => {
const { hasAccess, reason } = await validateWorkflowAccess(
req.params.id,
req.accountability
);
if (!hasAccess) {
return res.status(403).json({ error: reason });
}
// Execute workflow...
});
Sort Utilities
Utilities for sorting and reordering items within collections.
Methods
sortItems
Move an item before another item in a sorted collection.
async sortItems(options: SortOptions): Promise<SortResult>
SortOptions:
| Option | Type | Description |
|---|---|---|
| collection | string | Collection name |
| item | string \| number | ID of the item to move |
| to | string \| number | ID of the target item to move before |
| accountability | object | User accountability for permission checks |
| bypassPermissions | boolean | Skip permission checks (default: false) |
| transaction | any | Database transaction to use |
Example:
import { sortItems } from '@tspvivek/baasix';
// Move task 'abc' before task 'xyz'
const result = await sortItems({
collection: 'tasks',
item: 'abc',
to: 'xyz',
accountability: req.accountability
});
console.log(`Moved item to sort position: ${result.newSort}`);
reorderItems
Reorder multiple items at once.
async reorderItems(options: {
collection: string;
items: (string | number)[];
accountability?: any;
bypassPermissions?: boolean;
}): Promise<void>
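Example (a sketch; the collection and item IDs are illustrative):
import { reorderItems } from '@tspvivek/baasix';
// Persist a full drag-and-drop ordering in a single call
await reorderItems({
  collection: 'tasks',
  items: ['task-3', 'task-1', 'task-2'],
  accountability: req.accountability
});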
getNextSortValue
Get the next available sort value for a collection.
async getNextSortValue(
collection: string
): Promise<number>
Example:
import { getNextSortValue } from '@tspvivek/baasix';
const nextSort = await getNextSortValue('tasks');
// Use when creating a new item to place it at the end
Error Handling
Standard error handling utilities for consistent error responses.
APIError
Custom error class for API errors with status codes.
class APIError extends Error {
statusCode: number;
isOperational: boolean;
details: any;
constructor(
message: string,
statusCode?: number,
details?: any
)
}
Example:
import { APIError } from '@tspvivek/baasix';
// Throw a 404 error
throw new APIError('Resource not found', 404);
// Throw a 400 error with details
throw new APIError('Validation failed', 400, {
field: 'email',
message: 'Invalid email format'
});
// Throw a 403 error
throw new APIError('Permission denied', 403);
errorHandler
Express middleware for handling errors consistently.
function errorHandler(
error: Error | APIError,
req: Request,
res: Response,
next: NextFunction
): void
Features:
- Handles APIError instances with custom status codes
- Converts PostgreSQL errors to appropriate HTTP responses
- Handles unique constraint violations (409 Conflict)
- Handles foreign key violations (409 Conflict)
- Handles not-null violations (400 Bad Request)
Example:
import express from 'express';
import { errorHandler, APIError } from '@tspvivek/baasix';
const app = express();
// Your routes
app.get('/api/resource/:id', async (req, res, next) => {
try {
const resource = await findResource(req.params.id);
if (!resource) {
throw new APIError('Resource not found', 404);
}
res.json(resource);
} catch (error) {
next(error);
}
});
// Error handler middleware (should be last)
app.use(errorHandler);
Type Mapper
Convert JSON schema field definitions to Drizzle column types.
Methods
mapJsonTypeToDrizzle
Map a JSON schema field to a Drizzle column definition.
function mapJsonTypeToDrizzle(
fieldName: string,
fieldSchema: FieldSchema
): any
Supported Types:
| JSON Type | Drizzle Type |
|---|---|
| String | varchar (255 default) |
| Text | text |
| Integer | integer |
| BigInt | bigint |
| Decimal | decimal (with precision/scale) |
| Real | real |
| Double | doublePrecision |
| Boolean | boolean |
| DateTime | timestamp (with timezone) |
| Date | date |
| Time | time |
| JSON | json |
| JSONB | jsonb |
| UUID | uuid |
| ENUM | varchar |
| VIRTUAL | text (computed) |
Example:
import { mapJsonTypeToDrizzle } from '@tspvivek/baasix';
const column = mapJsonTypeToDrizzle('email', {
type: 'String',
values: { stringLength: 100 },
allowNull: false
});
Relation Utilities
Utilities for managing and building table relationships.
RelationBuilder
Manages association definitions for dynamic schemas.
import { relationBuilder, RelationBuilder } from '@tspvivek/baasix';
storeAssociations
Store association definitions for a table.
relationBuilder.storeAssociations(
tableName: string,
associations: Record<string, AssociationDefinition>
): void
getAssociations
Get associations for a table.
relationBuilder.getAssociations(
tableName: string
): Record<string, AssociationDefinition> | undefined
getForeignKey
Get the foreign key column name for a relation.
relationBuilder.getForeignKey(
assoc: AssociationDefinition,
defaultKey?: string
): string
AssociationDefinition:
| Field | Type | Description |
|---|---|---|
| type | string | 'HasMany', 'BelongsTo', 'HasOne', 'BelongsToMany', 'M2A' |
| model | string | Target model/table name |
| foreignKey | string | Foreign key column name |
| as | string | Alias for the relation |
| through | string | Junction table (for M2M) |
Example:
import { relationBuilder } from '@tspvivek/baasix';
// Get relations for a table
const postAssociations = relationBuilder.getAssociations('posts');
// Check if it's a one-to-many relation
if (relationBuilder.isOneToMany(postAssociations.comments)) {
console.log('Posts have many comments');
}
// Get foreign key
const fk = relationBuilder.getForeignKey(
postAssociations.author,
'authorId'
);
Helper Functions
createForeignKeySQL
Generate SQL for creating a foreign key constraint.
function createForeignKeySQL(
tableName: string,
columnName: string,
referencedTable: string,
referencedColumn?: string,
onDelete?: string,
onUpdate?: string
): string
isPolymorphicRelation
Check if an association is polymorphic (M2A).
function isPolymorphicRelation(
assoc: AssociationDefinition
): boolean
getPolymorphicFields
Get the type and ID field names for a polymorphic relation.
function getPolymorphicFields(
relationName: string
): { typeField: string; idField: string }
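Example (a sketch; the import path for these helpers, the 'comments' collection, its 'commentable' relation, and the generated field names are assumptions — check the actual exports and naming conventions in your version):
import { createForeignKeySQL, isPolymorphicRelation, getPolymorphicFields, relationBuilder } from '@tspvivek/baasix';
// Build the DDL for a foreign key constraint
const fkSql = createForeignKeySQL('posts', 'author_Id', 'users', 'id', 'CASCADE');
// Inspect a polymorphic (M2A) relation
const associations = relationBuilder.getAssociations('comments');
const commentable = associations?.commentable;
if (commentable && isPolymorphicRelation(commentable)) {
  const { typeField, idField } = getPolymorphicFields('commentable');
  console.log(typeField, idField);
}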
System Schemas
Built-in system schemas for Baasix core tables.
import { systemSchemas } from '@tspvivek/baasix';
Available System Schemas
| Collection | Description |
|---|---|
| baasix_SchemaDefinition | Schema definitions storage |
| baasix_Role | User roles |
| baasix_Permission | Role permissions |
| baasix_User | User accounts |
| baasix_UserRole | User-role assignments |
| baasix_Tenant | Multi-tenant organizations |
| baasix_TenantSettings | Tenant-specific settings |
| baasix_File | File metadata |
| baasix_Notification | User notifications |
| baasix_Task | Scheduled tasks |
| baasix_Workflow | Visual workflows |
| baasix_WorkflowRoles | Workflow access control |
| baasix_Hook | Dynamic hooks |
| baasix_Endpoint | Custom endpoints |
Example:
import { systemSchemas } from '@tspvivek/baasix';
// Get all system schema definitions
const schemas = systemSchemas.schemas;
// Find a specific schema
const userSchema = schemas.find(
s => s.collectionName === 'baasix_User'
);
console.log(userSchema.schema.fields);
Custom Types
Baasix exports custom Drizzle column types for PostgreSQL-specific features.
PostGIS Geometry Types
import {
point,
lineString,
polygon,
multiPoint,
multiLineString,
multiPolygon,
geometryCollection,
geography,
geometry
} from '@tspvivek/baasix';
Array Types
import {
arrayInteger,
arrayBigInt,
arrayText,
arrayVarchar,
arrayUuid,
arrayBoolean,
arrayDecimal,
arrayDouble,
arrayDate,
arrayDateTime,
arrayDateTimeTz,
arrayDateOnly,
arrayTime,
arrayTimeTz,
arrayOf
} from '@tspvivek/baasix';
Range Types
import {
rangeInteger,
rangeBigInt,
rangeDecimal,
rangeDate,
rangeDateTime,
rangeDateTimeTz,
rangeDouble,
rangeTime,
rangeTimeTz,
type Range
} from '@tspvivek/baasix';
Example:
import { point, arrayText, rangeInteger } from '@tspvivek/baasix';
import { pgTable, uuid, varchar } from 'drizzle-orm/pg-core';
// Using custom types in a schema
const locations = pgTable('locations', {
id: uuid('id').primaryKey(),
name: varchar('name', { length: 255 }),
coordinates: point('coordinates'), // PostGIS point
tags: arrayText('tags'), // TEXT[]
priceRange: rangeInteger('price_range') // int4range
});
Soft Delete Plugin
Plugin for implementing soft delete (paranoid mode) on tables.
withSoftDelete
Add soft delete capability to a table schema.
import { withSoftDelete } from '@tspvivek/baasix';
function withSoftDelete<T>(
tableName: string,
columns: T,
options?: SoftDeleteOptions
): T & SoftDeleteMixin
Example:
import { withSoftDelete } from '@tspvivek/baasix';
import { pgTable, uuid, varchar, timestamp } from 'drizzle-orm/pg-core';
const posts = pgTable('posts', withSoftDelete('posts', {
id: uuid('id').primaryKey(),
title: varchar('title', { length: 255 }),
createdAt: timestamp('createdAt').defaultNow()
}));
// The table now has a deletedAt column
SoftDeleteHelper
Helper class for working with soft-deleted records.
import { SoftDeleteHelper } from '@tspvivek/baasix';
const softDelete = new SoftDeleteHelper('deletedAt');
excludeDeleted
Get filter to exclude soft-deleted records.
softDelete.excludeDeleted(table): SQL
onlyDeleted
Get filter to include only soft-deleted records.
softDelete.onlyDeleted(table): SQL
markAsDeleted
Get current timestamp for soft delete.
softDelete.markAsDeleted(): Date
markAsRestored
Get null value for restore.
softDelete.markAsRestored(): null
Example:
import { SoftDeleteHelper } from '@tspvivek/baasix';
import { and, eq } from 'drizzle-orm';
const softDelete = new SoftDeleteHelper();
// Query excluding soft-deleted records
const activePosts = await db
.select()
.from(posts)
.where(and(
eq(posts.status, 'published'),
softDelete.excludeDeleted(posts)
));
// Query only soft-deleted records
const deletedPosts = await db
.select()
.from(posts)
.where(softDelete.onlyDeleted(posts));
// Soft delete a record
await db
.update(posts)
.set({ deletedAt: softDelete.markAsDeleted() })
.where(eq(posts.id, postId));
// Restore a record
await db
.update(posts)
.set({ deletedAt: softDelete.markAsRestored() })
.where(eq(posts.id, postId));
Auth Utilities
Authentication utilities exported from the auth module.
createAuth
Create an authentication instance.
import { createAuth } from '@tspvivek/baasix';
const auth = createAuth({
secret: process.env.SECRET_KEY,
emailAndPassword: { enabled: true },
socialProviders: {
google: {
clientId: process.env.GOOGLE_CLIENT_ID,
clientSecret: process.env.GOOGLE_CLIENT_SECRET
}
}
});
OAuth Providers
import {
google,
facebook,
apple,
github,
credential
} from '@tspvivek/baasix';
Auth Services
import {
createSessionService,
createTokenService,
createVerificationService,
validateSessionLimits
} from '@tspvivek/baasix';
OAuth2 Utilities
import {
generateState,
generateCodeVerifier,
generateCodeChallenge,
createAuthorizationURL,
validateAuthorizationCode
} from '@tspvivek/baasix';
See the SSO Authentication Guide for complete authentication documentation.
Related Documentation
- Services Reference - Baasix services documentation
- Advanced Query Guide - Query syntax in depth
- Database Schema Guide - Schema definitions
- Extensions Guide - Creating extensions
- SSO Authentication Guide - OAuth and SSO setup