Integrating Analytic SQL¶
Last Updated: January 18, 2026
This guide shows how to integrate Analytic SQL into your applications and workflows, whether you call it directly as a Python library or embed it in tools such as dbt, BI platforms, or databases.
Using as a Python Library¶
Basic Usage¶
The simplest way to use Analytic SQL is as a Python library:
import asql
# Basic query
asql_query = """
from users
where status = 'active'
group by country ( # as total_users )
order by -total_users
limit 10
"""
# transpile() returns a list of SQL strings (one per statement)
sql_list = asql.transpile(asql_query, write="postgres")
sql = sql_list[0]
print(sql)
API Reference¶
asql.transpile(sql, read="asql", write="duckdb", pretty=False, schema=None, **kwargs)¶
Transpiles ASQL to SQL for a target dialect.
Parameters:
- sql (str): ASQL query string. Can contain multiple queries separated by semicolons.
- read (str, optional): Source dialect. Default: "asql". Can also be "visual_asql" for JSON-based visual queries.
- write (str, optional): Target SQL dialect. Default: "duckdb". Supported dialects include:
- duckdb (default)
- postgres / postgresql
- mysql
- bigquery
- snowflake
- redshift
- sqlite
- oracle
- mssql / sqlserver
- And more (see SQLGlot dialects)
- pretty (bool): Whether to format SQL output with indentation. Default: False
- schema (dict or Schema, optional): Schema information for column operators, join inference, etc.
- **kwargs: Additional settings passed to the ASQL dialect (e.g., week_start, infer_join_keys)
Returns:
- List[str]: List of generated SQL strings (one per statement)
Raises:
- ASQLSyntaxError: If ASQL syntax is invalid
- ASQLCompilationError: If compilation fails
Example:
import asql
# Single query - transpile() returns a list, get first element
sql = asql.transpile("""
from users
where status == 'active'
""", write="postgres")[0]
# Query with CTE using stash as
multi_query = """
from users where status == 'active' stash as active_users
from active_users group by country ( # as total )
"""
sql = asql.transpile(multi_query, write="bigquery", pretty=True)[0]
SQL to ASQL Conversion¶
Convert SQL back to ASQL by swapping the read and write dialects in asql.transpile():
Example:
import asql
sql = "SELECT country, COUNT(*) AS total FROM users WHERE status = 'active' GROUP BY country"
asql_query = asql.transpile(sql, read='postgres', write='asql')[0]
# Returns:
# from users where status = 'active' group by country select country, COUNT(*) AS total
You can also specify the source dialect for better parsing:
# From BigQuery SQL
asql_query = asql.transpile(sql, read='bigquery', write='asql')[0]
# From PostgreSQL
asql_query = asql.transpile(sql, read='postgres', write='asql')[0]
Error Handling¶
import asql
from asql.errors import ASQLSyntaxError, ASQLCompilationError
try:
sql = asql.transpile("""
from users
where invalid syntax
""", write="postgres")[0]
except ASQLSyntaxError as e:
print(f"Syntax error: {e}")
except ASQLCompilationError as e:
print(f"Compilation error: {e}")
Advanced Usage¶
Pretty Printing¶
sql = asql.transpile(asql_query, write="postgres", pretty=True)[0]
# Returns formatted SQL with indentation
Multiple Dialects¶
asql_query = """
from users
group by country ( # as total )
"""
dialects = ["postgres", "bigquery", "snowflake"]
for dialect in dialects:
sql = asql.transpile(asql_query, write=dialect)[0]
print(f"{dialect}: {sql}")
Batch Compilation¶
import asql
from asql.errors import ASQLSyntaxError

queries = [
"""
from users
where status == 'active'
""",
"""
from orders
where amount > 100
""",
"""
from products
where category == 'electronics'
"""
]
compiled = []
for query in queries:
try:
sql = asql.transpile(query, write="postgres")[0]
compiled.append(sql)
except ASQLSyntaxError as e:
print(f"Error in query: {e}")
Integration Patterns¶
Integration with dbt¶
We're building dbt-asql — a native dbt integration that lets you write .asql model files without any wrappers.
dbt-asql (Coming Soon)¶
⚠️ In Development — Not yet published to PyPI
With dbt-asql, you can write dbt models directly in ASQL:
-- models/marts/revenue.asql
SET materialized = incremental;
SET unique_key = id;
from stg_orders
where status = 'completed'
where created_at > {{ start_date || @2024-01-01 }}
{% if is_incremental() %}
where created_at > (select max(created_at) from {{ this }})
{% endif %}
group by region, month(created_at) (
sum(amount) as revenue,
# as order_count
)
order by -revenue
Features:
- No wrappers — Use .asql file extension, no {% asql %} tags
- Auto ref() detection — Just use table names, dbt-asql finds model dependencies
- Simplified variables — {{ start_date }} instead of {{ var('start_date') }}
- Native config — SET materialized = table; instead of {{ config(...) }}
See the dbt-asql README for more details.
Manual Integration Options¶
Until dbt-asql is released, you can integrate ASQL manually:
Option 1: Python Pre-processing
# scripts/compile_asql.py
from pathlib import Path
import asql
def compile_asql_files():
"""Compile .asql files to .sql files for dbt."""
for asql_file in Path("models").rglob("*.asql"):
asql_query = asql_file.read_text()
sql = asql.transpile(asql_query, write="postgres")[0]
sql_file = asql_file.with_suffix(".sql")
sql_file.write_text(sql)
if __name__ == "__main__":
compile_asql_files()
Run before dbt:
python scripts/compile_asql.py && dbt run
Option 2: dbt Python Model
# models/active_users.py
def model(dbt, session):
import asql
asql_query = """
from users
where status == 'active'
group by country ( # as total_users )
"""
sql = asql.transpile(asql_query, write=dbt.config.get('target_type', 'postgres'))[0]
return session.sql(sql)
Integration with BI Tools¶
Tableau Custom SQL¶
Create a wrapper that converts ASQL to SQL for Tableau:
# tableau_asql_wrapper.py
import asql
def tableau_query(asql_query, dialect="postgres"):
"""Convert ASQL to SQL for Tableau Custom SQL."""
sql = asql.transpile(asql_query, write=dialect)[0]
return sql
# Usage in Tableau Custom SQL:
# Use Python script to generate SQL, then paste into Tableau
Looker (LookML) Integration¶
Create a LookML macro that uses ASQL:
# looker_asql_plugin.py
import asql
def compile_asql_for_looker(asql_query, dialect):
"""Compile ASQL for Looker SQL Runner."""
return asql.transpile(asql_query, write=dialect)[0]
Then use in LookML:
sql: |
{% raw %}
{{ compile_asql("""
from users
where status == 'active'
""") }}
{% endraw %}
Power BI Integration¶
Use Python script in Power BI:
# powerbi_asql.py
import pandas as pd
import asql
import pyodbc
def execute_asql(asql_query, connection_string, dialect="mssql"):
"""Execute ASQL query in Power BI."""
sql = asql.transpile(asql_query, write=dialect)[0]
conn = pyodbc.connect(connection_string)
df = pd.read_sql(sql, conn)
conn.close()
return df
Integration with Databases¶
PostgreSQL Function¶
Create a PostgreSQL function that compiles ASQL:
# postgres_asql_extension.py
import psycopg2

# Installs a PL/Python function so ASQL can be compiled inside PostgreSQL.
# Requires the plpython3u extension and the asql package on the server's Python.
CREATE_FUNCTION_SQL = """
CREATE OR REPLACE FUNCTION asql_compile(query text, dialect text DEFAULT 'postgres')
RETURNS text AS $$
    import asql
    return asql.transpile(query, write=dialect)[0]
$$ LANGUAGE plpython3u;
"""

def create_asql_function(conn):
    """Create a PostgreSQL function for ASQL compilation via PL/Python."""
    with conn.cursor() as cursor:
        cursor.execute("CREATE EXTENSION IF NOT EXISTS plpython3u")
        cursor.execute(CREATE_FUNCTION_SQL)
    conn.commit()
Database Driver Wrapper¶
Create a database driver that accepts ASQL:
# asql_driver.py
import asql
import psycopg2
class ASQLConnection:
"""Database connection that accepts ASQL queries."""
def __init__(self, connection_string, dialect="postgres"):
self.conn = psycopg2.connect(connection_string)
self.dialect = dialect
def execute(self, asql_query):
"""Execute ASQL query."""
sql = asql.transpile(asql_query, write=self.dialect)[0]
cursor = self.conn.cursor()
cursor.execute(sql)
return cursor.fetchall()
def close(self):
self.conn.close()
# Usage
conn = ASQLConnection("postgresql://user:pass@localhost/db")
results = conn.execute("from users where status == 'active'")
conn.close()
Building Custom Integrations¶
REST API Wrapper¶
Create a REST API that compiles ASQL:
# api.py
from flask import Flask, request, jsonify
import asql
from asql.errors import ASQLSyntaxError, ASQLCompilationError
app = Flask(__name__)
@app.route('/api/compile', methods=['POST'])
def compile_asql():
"""Compile ASQL to SQL via REST API."""
data = request.json
asql_query = data.get('asql')
dialect = data.get('dialect', 'postgres')
pretty = data.get('pretty', False)
try:
sql = asql.transpile(asql_query, write=dialect, pretty=pretty)[0]
return jsonify({'sql': sql})
except ASQLSyntaxError as e:
return jsonify({'error': str(e)}), 400
except ASQLCompilationError as e:
return jsonify({'error': str(e)}), 500
if __name__ == '__main__':
app.run(debug=True)
Command-Line Tool¶
Create a CLI tool:
# asql_cli.py
import argparse
import sys
import asql
from asql.errors import ASQLSyntaxError
def main():
parser = argparse.ArgumentParser(description='Compile ASQL to SQL')
    parser.add_argument('query', nargs='?', help='ASQL query string')
parser.add_argument('--dialect', default='postgres', help='SQL dialect')
parser.add_argument('--pretty', action='store_true', help='Pretty print SQL')
parser.add_argument('--file', help='Read ASQL from file')
args = parser.parse_args()
    if args.file:
        with open(args.file) as f:
            asql_query = f.read()
    elif args.query:
        asql_query = args.query
    else:
        parser.error('provide a query string or --file')
try:
sql = asql.transpile(asql_query, write=args.dialect, pretty=args.pretty)[0]
print(sql)
except ASQLSyntaxError as e:
print(f"Error: {e}", file=sys.stderr)
sys.exit(1)
if __name__ == '__main__':
main()
Jupyter Notebook Integration¶
Use in Jupyter notebooks:
# jupyter_asql.py
import asql
from IPython.display import display, Markdown
def asql_query(query, dialect="postgres", show_sql=True):
"""Execute ASQL query in Jupyter."""
sql = asql.transpile(query, write=dialect, pretty=True)[0]
if show_sql:
display(Markdown(f"**Generated SQL:**\n```sql\n{sql}\n```"))
return sql
# Usage in notebook
sql = asql_query("""
from users
where status == 'active'
group by country ( # as total )
""")
VS Code Extension¶
Create a VS Code extension that compiles ASQL:
// extension.ts
import * as vscode from 'vscode';
import { exec } from 'child_process';
export function activate(context: vscode.ExtensionContext) {
let compileCommand = vscode.commands.registerCommand('asql.compile', () => {
const editor = vscode.window.activeTextEditor;
if (!editor) return;
const asqlQuery = editor.document.getText();
const dialect = vscode.workspace.getConfiguration('asql').get('dialect', 'postgres');
exec(`python -c "import asql; print(asql.transpile('''${asqlQuery}''', write='${dialect}')[0])"`,
(error, stdout, stderr) => {
if (error) {
vscode.window.showErrorMessage(`Compilation error: ${stderr}`);
return;
}
// Open new document with SQL
vscode.workspace.openTextDocument({
content: stdout,
language: 'sql'
}).then(doc => {
vscode.window.showTextDocument(doc);
});
});
});
context.subscriptions.push(compileCommand);
}
Best Practices¶
Error Handling¶
Always wrap compilation in try-except blocks:
import asql
import logging
from asql.errors import ASQLSyntaxError, ASQLCompilationError

logger = logging.getLogger(__name__)
def safe_transpile(asql_query, dialect="postgres"):
"""Safely transpile ASQL with error handling."""
try:
return asql.transpile(asql_query, write=dialect)[0]
except ASQLSyntaxError as e:
# Log syntax errors
logger.error(f"ASQL syntax error: {e}")
raise
except ASQLCompilationError as e:
# Log compilation errors
logger.error(f"ASQL compilation error: {e}")
raise
Caching Compiled Queries¶
Cache compiled SQL for frequently used queries:
from functools import lru_cache
import asql
@lru_cache(maxsize=100)
def cached_transpile(asql_query, dialect="postgres"):
"""Cache compiled ASQL queries."""
return asql.transpile(asql_query, write=dialect)[0]
Validation¶
Validate ASQL before compilation:
import asql
from asql.errors import ASQLSyntaxError
def validate_asql(asql_query):
"""Validate ASQL query syntax."""
try:
asql.transpile(asql_query, write="postgres")
return True
except ASQLSyntaxError:
return False
Performance Considerations¶
- Compile queries once and reuse SQL strings
- Use connection pooling for database integrations
- Consider caching for frequently used queries
- Batch compile multiple queries when possible
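The first point can be shown with a self-contained sketch using the stdlib sqlite3 module; the SQL string below stands in for the output of a single asql.transpile call, which would normally be computed once at startup (the exact generated text is an assumption):

```python
import sqlite3

# Stand-in for SQL produced once by asql.transpile(..., write="sqlite")[0]
sql = "SELECT country, COUNT(*) AS total_users FROM users GROUP BY country"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (country TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [("US", "active"), ("US", "active"), ("DE", "active")],
)

# Reuse the compiled string on every call instead of re-transpiling each time
for _ in range(3):
    rows = conn.execute(sql).fetchall()

print(sorted(rows))  # [('DE', 1), ('US', 2)]
```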
Examples¶
Complete Integration Example¶
Here's a complete example of integrating ASQL into a data pipeline:
# data_pipeline.py
import asql
import psycopg2
from typing import List, Dict
class ASQLPipeline:
"""Data pipeline using ASQL."""
def __init__(self, connection_string, dialect="postgres"):
self.conn = psycopg2.connect(connection_string)
self.dialect = dialect
def execute_query(self, asql_query: str) -> List[Dict]:
"""Execute ASQL query and return results."""
sql = asql.transpile(asql_query, write=self.dialect)[0]
cursor = self.conn.cursor()
cursor.execute(sql)
columns = [desc[0] for desc in cursor.description]
results = [dict(zip(columns, row)) for row in cursor.fetchall()]
cursor.close()
return results
def close(self):
"""Close database connection."""
self.conn.close()
# Usage
pipeline = ASQLPipeline("postgresql://user:pass@localhost/db")
results = pipeline.execute_query("""
from users
where status == 'active'
group by country ( # as total_users )
order by -total_users
limit 10
""")
for row in results:
print(f"{row['country']}: {row['total_users']} users")
pipeline.close()
Next Steps¶
- Check out the Language Specification for complete syntax reference
- See Examples for more query patterns
- Review Architecture for implementation details