SQLAlchemy 1.3 Documentation
Customizing DDL¶
In the preceding sections we’ve discussed a variety of schema constructs including Table, ForeignKeyConstraint, CheckConstraint, and Sequence. Throughout, we’ve relied upon the create() and create_all() methods of Table and MetaData in order to issue data definition language (DDL) for all constructs. When issued, a pre-determined order of operations is invoked, and DDL to create each table is created unconditionally including all constraints and other objects associated with it. For more complex scenarios where database-specific DDL is required, SQLAlchemy offers two techniques which can be used to add any DDL based on any condition, either accompanying the standard generation of tables or by itself.
Custom DDL¶
Custom DDL phrases are most easily achieved using the DDL construct. This construct works like all the other DDL elements except it accepts a string which is the text to be emitted:
event.listen(
    metadata,
    "after_create",
    DDL("ALTER TABLE users ADD CONSTRAINT "
        "cst_user_name_length "
        " CHECK (length(user_name) >= 8)")
)
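For context, the following is a minimal, self-contained sketch of the same idea; the users table, its columns, and the database URL are illustrative assumptions rather than part of the example above:

    from sqlalchemy import (
        MetaData, Table, Column, Integer, String, event, DDL, create_engine
    )

    metadata = MetaData()

    # hypothetical table used only for this sketch
    users = Table(
        "users", metadata,
        Column("user_id", Integer, primary_key=True),
        Column("user_name", String(40), nullable=False),
    )

    # the literal ALTER TABLE phrase is emitted right after the tables are created
    event.listen(
        metadata,
        "after_create",
        DDL("ALTER TABLE users ADD CONSTRAINT cst_user_name_length "
            "CHECK (length(user_name) >= 8)")
    )

    # assumed connection URL; any PostgreSQL database would do
    engine = create_engine("postgresql://scott:tiger@localhost/test")
    metadata.create_all(engine)   # CREATE TABLE users ..., then the ALTER TABLE fires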
A more comprehensive method of creating libraries of DDL constructs is to use custom compilation - see Custom SQL Constructs and Compilation Extension for details.
Controlling DDL Sequences¶
The DDL construct introduced previously also has the ability to be invoked conditionally based on inspection of the database. This feature is available using the DDLElement.execute_if() method. For example, if we wanted to create a trigger but only on the PostgreSQL backend, we could invoke this as:
mytable = Table(
    'mytable', metadata,
    Column('id', Integer, primary_key=True),
    Column('data', String(50))
)

func = DDL(
    "CREATE FUNCTION my_func() "
    "RETURNS TRIGGER AS $$ "
    "BEGIN "
    "NEW.data := 'ins'; "
    "RETURN NEW; "
    "END; $$ LANGUAGE PLPGSQL"
)

trigger = DDL(
    "CREATE TRIGGER dt_ins BEFORE INSERT ON mytable "
    "FOR EACH ROW EXECUTE PROCEDURE my_func();"
)

event.listen(
    mytable,
    'after_create',
    func.execute_if(dialect='postgresql')
)

event.listen(
    mytable,
    'after_create',
    trigger.execute_if(dialect='postgresql')
)
The DDLElement.execute_if.dialect keyword also accepts a tuple of string dialect names:
event.listen(
    mytable,
    "after_create",
    trigger.execute_if(dialect=('postgresql', 'mysql'))
)
event.listen(
    mytable,
    "before_drop",
    trigger.execute_if(dialect=('postgresql', 'mysql'))
)
The DDLElement.execute_if() method can also work against a callable function that will receive the database connection in use. In the example below, we use this to conditionally create a CHECK constraint, first looking within the PostgreSQL catalogs to see if it exists:
def should_create(ddl, target, connection, **kw):
    row = connection.execute(
        "select conname from pg_constraint where conname='%s'" %
        ddl.element.name).scalar()
    return not bool(row)

def should_drop(ddl, target, connection, **kw):
    return not should_create(ddl, target, connection, **kw)

event.listen(
    users,
    "after_create",
    DDL(
        "ALTER TABLE users ADD CONSTRAINT "
        "cst_user_name_length CHECK (length(user_name) >= 8)"
    ).execute_if(callable_=should_create)
)
event.listen(
    users,
    "before_drop",
    DDL(
        "ALTER TABLE users DROP CONSTRAINT cst_user_name_length"
    ).execute_if(callable_=should_drop)
)
users.create(engine)
CREATE TABLE users (
    user_id SERIAL NOT NULL,
    user_name VARCHAR(40) NOT NULL,
    PRIMARY KEY (user_id)
)

select conname from pg_constraint where conname='cst_user_name_length'
ALTER TABLE users ADD CONSTRAINT cst_user_name_length CHECK (length(user_name) >= 8)

users.drop(engine)
select conname from pg_constraint where conname='cst_user_name_length'
ALTER TABLE users DROP CONSTRAINT cst_user_name_length
DROP TABLE users
Using the built-in DDLElement Classes¶
The sqlalchemy.schema package contains SQL expression constructs that provide DDL expressions. For example, to produce a CREATE TABLE statement:
from sqlalchemy.schema import CreateTable

with engine.connect() as conn:
    conn.execute(CreateTable(mytable))
CREATE TABLE mytable (
    col1 INTEGER,
    col2 INTEGER,
    col3 INTEGER,
    col4 INTEGER,
    col5 INTEGER,
    col6 INTEGER
)
Above, the CreateTable construct works like any other expression construct (such as select(), table.insert(), etc.).
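Because these constructs compile like other expressions, one convenient way to inspect the DDL string without executing it is to compile against a dialect; a brief sketch, where the table definition is an assumption for illustration:

    from sqlalchemy import MetaData, Table, Column, Integer, String
    from sqlalchemy.schema import CreateTable
    from sqlalchemy.dialects import postgresql

    metadata = MetaData()
    mytable = Table(
        "mytable", metadata,
        Column("id", Integer, primary_key=True),
        Column("data", String(50)),
    )

    # compile the DDL construct against a specific dialect and print the SQL
    print(CreateTable(mytable).compile(dialect=postgresql.dialect()))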
All of SQLAlchemy’s DDL oriented constructs are subclasses of the DDLElement base class; this is the base of all the objects corresponding to CREATE and DROP as well as ALTER, not only in SQLAlchemy but in Alembic Migrations as well. A full reference of available constructs is in DDL Expression Constructs API.
User-defined DDL constructs may also be created as subclasses of DDLElement itself. The documentation in Custom SQL Constructs and Compilation Extension has several examples of this.
The event-driven DDL system described in the previous section Controlling DDL Sequences is available with other DDLElement objects as well. However, when dealing with the built-in constructs such as CreateIndex, CreateSequence, etc., the event system is of limited use, as methods like Table.create() and MetaData.create_all() will invoke these constructs unconditionally. In a future SQLAlchemy release, the DDL event system including conditional execution will be taken into account for built-in constructs that currently invoke in all cases.
We can illustrate an event-driven example with the AddConstraint and DropConstraint constructs, as the event-driven system will work for CHECK and UNIQUE constraints, using these as we did in our previous example of DDLElement.execute_if():
def should_create(ddl, target, connection, **kw):
    row = connection.execute(
        "select conname from pg_constraint where conname='%s'" %
        ddl.element.name).scalar()
    return not bool(row)

def should_drop(ddl, target, connection, **kw):
    return not should_create(ddl, target, connection, **kw)

event.listen(
    users,
    "after_create",
    AddConstraint(constraint).execute_if(callable_=should_create)
)
event.listen(
    users,
    "before_drop",
    DropConstraint(constraint).execute_if(callable_=should_drop)
)
users.create(engine)
CREATE TABLE users (
    user_id SERIAL NOT NULL,
    user_name VARCHAR(40) NOT NULL,
    PRIMARY KEY (user_id)
)

select conname from pg_constraint where conname='cst_user_name_length'
ALTER TABLE users ADD CONSTRAINT cst_user_name_length CHECK (length(user_name) >= 8)

users.drop(engine)
select conname from pg_constraint where conname='cst_user_name_length'
ALTER TABLE users DROP CONSTRAINT cst_user_name_length
DROP TABLE users
While the above example is against the built-in AddConstraint and DropConstraint objects, the main usefulness of DDL events for now remains focused on the use of the DDL construct itself, as well as with user-defined subclasses of DDLElement that aren’t already part of the MetaData.create_all(), Table.create(), and corresponding “drop” processes.
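These constructs can also be executed directly, outside of the event system; a short sketch, assuming the users table and engine from the earlier examples:

    from sqlalchemy import CheckConstraint
    from sqlalchemy.schema import AddConstraint, DropConstraint

    # attach a named CHECK constraint object to the (assumed) users table
    constraint = CheckConstraint(
        "length(user_name) >= 8", name="cst_user_name_length"
    )
    users.append_constraint(constraint)

    with engine.connect() as conn:
        conn.execute(AddConstraint(constraint))   # ALTER TABLE users ADD CONSTRAINT ...
        conn.execute(DropConstraint(constraint))  # ALTER TABLE users DROP CONSTRAINT ...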
DDL Expression Constructs API¶
Object Name | Description
---|---
_CreateDropBase | Base class for DDL constructs that represent CREATE and DROP or equivalents.
AddConstraint | Represent an ALTER TABLE ADD CONSTRAINT statement.
CreateColumn | Represent a Column as rendered in a CREATE TABLE statement, via the CreateTable construct.
CreateIndex | Represent a CREATE INDEX statement.
CreateSchema | Represent a CREATE SCHEMA statement.
CreateSequence | Represent a CREATE SEQUENCE statement.
CreateTable | Represent a CREATE TABLE statement.
DDL | A literal DDL statement.
DDLElement | Base class for DDL expression constructs.
DropConstraint | Represent an ALTER TABLE DROP CONSTRAINT statement.
DropIndex | Represent a DROP INDEX statement.
DropSchema | Represent a DROP SCHEMA statement.
DropSequence | Represent a DROP SEQUENCE statement.
DropTable | Represent a DROP TABLE statement.
sort_tables(tables[, skip_fn, extra_dependencies]) | Sort a collection of Table objects based on dependency.
sort_tables_and_constraints(tables[, filter_fn, extra_dependencies, _warn_for_cycles]) | Sort a collection of Table / ForeignKeyConstraint objects.
- function sqlalchemy.schema.sort_tables(tables, skip_fn=None, extra_dependencies=None)¶
Sort a collection of Table objects based on dependency.
This is a dependency-ordered sort which will emit Table objects such that they will follow their dependent Table objects. Tables are dependent on one another based on the presence of ForeignKeyConstraint objects as well as explicit dependencies added by Table.add_is_dependent_on().
Warning
The sort_tables() function cannot by itself accommodate automatic resolution of dependency cycles between tables, which are usually caused by mutually dependent foreign key constraints. When these cycles are detected, the foreign keys of these tables are omitted from consideration in the sort. A warning is emitted when this condition occurs, which will raise an exception in a future release. Tables which are not part of the cycle will still be returned in dependency order.
To resolve these cycles, the ForeignKeyConstraint.use_alter parameter may be applied to those constraints which create a cycle. Alternatively, the sort_tables_and_constraints() function will automatically return foreign key constraints in a separate collection when cycles are detected so that they may be applied to a schema separately.
Changed in version 1.3.17: a warning is emitted when sort_tables() cannot perform a proper sort due to cyclical dependencies. This will be an exception in a future release. Additionally, the sort will continue to return other tables not involved in the cycle in dependency order, which was not the case previously.
- Parameters:
  - skip_fn¶ – optional callable which will be passed a ForeignKey object; if it returns True, this constraint will not be considered as a dependency. Note this is different from the same parameter in sort_tables_and_constraints(), which is instead passed the owning ForeignKeyConstraint object.
  - extra_dependencies¶ – a sequence of 2-tuples of tables which will also be considered as dependent on each other.
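As a brief usage sketch, assuming a populated MetaData object named metadata, the function can be fed the tables collection directly:

    from sqlalchemy.schema import sort_tables

    # referenced (parent) tables are emitted before the tables that depend
    # on them, i.e. a safe order for CREATE
    for table in sort_tables(metadata.tables.values()):
        print(table.name)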
- function sqlalchemy.schema.sort_tables_and_constraints(tables, filter_fn=None, extra_dependencies=None, _warn_for_cycles=False)¶
Sort a collection of Table / ForeignKeyConstraint objects.
This is a dependency-ordered sort which will emit tuples of (Table, [ForeignKeyConstraint, ...]) such that each Table follows its dependent Table objects. Remaining ForeignKeyConstraint objects that are separate due to dependency rules not satisfied by the sort are emitted afterwards as (None, [ForeignKeyConstraint ...]).
Tables are dependent on one another based on the presence of ForeignKeyConstraint objects, explicit dependencies added by Table.add_is_dependent_on(), as well as dependencies stated here using the sort_tables_and_constraints.skip_fn and/or sort_tables_and_constraints.extra_dependencies parameters.
- Parameters:
  - filter_fn¶ – optional callable which will be passed a ForeignKeyConstraint object, and returns a value based on whether this constraint should definitely be included or excluded as an inline constraint, or neither. If it returns False, the constraint will definitely be included as a dependency that cannot be subject to ALTER; if True, it will only be included as an ALTER result at the end. Returning None means the constraint is included in the table-based result unless it is detected as part of a dependency cycle.
  - extra_dependencies¶ – a sequence of 2-tuples of tables which will also be considered as dependent on each other.
New in version 1.0.0.
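A short sketch of how the returned pairs are typically consumed, again assuming a populated MetaData object named metadata:

    from sqlalchemy.schema import sort_tables_and_constraints

    for table, fkcs in sort_tables_and_constraints(metadata.tables.values()):
        if table is not None:
            # these constraints can be rendered inline with this CREATE TABLE
            print("create", table.name, "with", len(fkcs), "inline FK constraints")
        else:
            # leftover constraints that must be added afterwards via ALTER
            print("add separately:", [fk.name for fk in fkcs])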
- class sqlalchemy.schema.DDLElement¶
Base class for DDL expression constructs.
This class is the base for the general purpose DDL class, as well as the various create/drop clause constructs such as CreateTable, DropTable, AddConstraint, etc.
DDLElement integrates closely with SQLAlchemy events, introduced in Events. An instance of one is itself an event receiving callable:

    event.listen(
        users,
        'after_create',
        AddConstraint(constraint).execute_if(dialect='postgresql')
    )
Members
__call__(), against(), bind, callable_, dialect, execute(), execute_at(), execute_if(), on, target
Class signature
class sqlalchemy.schema.DDLElement (sqlalchemy.sql.expression.Executable, sqlalchemy.schema._DDLCompiles)

- method sqlalchemy.schema.DDLElement.__call__(target, bind, **kw)¶
Execute the DDL as a ddl_listener.
- method sqlalchemy.schema.DDLElement.against(target)¶
Return a copy of this DDL against a specific schema item.
- attribute sqlalchemy.schema.DDLElement.bind¶
- attribute sqlalchemy.schema.DDLElement.callable_ = None¶
- attribute sqlalchemy.schema.DDLElement.dialect = None¶
- method sqlalchemy.schema.DDLElement.execute(bind=None, target=None)¶
Execute this DDL immediately.
Executes the DDL statement in isolation using the supplied Connectable or Connectable assigned to the .bind property, if not supplied. If the DDL has a conditional on criteria, it will be invoked with None as the event.
- Parameters:
  - bind¶ – Optional, an Engine or Connection. If not supplied, a valid Connectable must be present in the .bind property.
  - target¶ – Optional, defaults to None. The target SchemaItem for the execute call. Will be passed to the on callable if any, and may also provide string expansion data for the statement. See execute_at for more information.
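As a small sketch of immediate execution, where the engine and the ALTER statement are assumptions for illustration:

    from sqlalchemy import DDL

    # run a one-off DDL phrase right away against an Engine
    DDL("ALTER TABLE users ADD COLUMN nickname VARCHAR(30)").execute(bind=engine)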
- method sqlalchemy.schema.DDLElement.execute_at(event_name, target)¶
Link execution of this DDL to the DDL lifecycle of a SchemaItem.
Deprecated since version 0.7: The DDLElement.execute_at() method is deprecated and will be removed in a future release. Please use the DDLEvents listener interface in conjunction with the DDLElement.execute_if() method.
Links this DDLElement to a Table or MetaData instance, executing it when that schema item is created or dropped. The DDL statement will be executed using the same Connection and transactional context as the Table create/drop itself. The .bind property of this statement is ignored.
A DDLElement instance can be linked to any number of schema items. execute_at builds on the append_ddl_listener interface of MetaData and Table objects.
Caveat: Creating or dropping a Table in isolation will also trigger any DDL set to execute_at that Table’s MetaData. This may change in a future release.
- method sqlalchemy.schema.DDLElement.execute_if(dialect=None, callable_=None, state=None)¶
Return a callable that will execute this DDLElement conditionally.
Used to provide a wrapper for event listening:

    event.listen(
        metadata,
        'before_create',
        DDL("my_ddl").execute_if(dialect='postgresql')
    )
- Parameters:
  - dialect¶ – May be a string or tuple of strings. If a string, it will be compared to the name of the executing database dialect:

        DDL('something').execute_if(dialect='postgresql')

    If a tuple, specifies multiple dialect names:

        DDL('something').execute_if(dialect=('postgresql', 'mysql'))

  - callable_¶ – A callable, which will be invoked with three positional arguments as well as optional keyword arguments:
    - ddl: This DDL element.
    - target: The Table or MetaData object which is the target of this event. May be None if the DDL is executed explicitly.
    - bind: The Connection being used for DDL execution.
    - tables: Optional keyword argument - a list of Table objects which are to be created/dropped within a MetaData.create_all() or drop_all() method call.
    - state: Optional keyword argument - will be the state argument passed to this function.
    - checkfirst: Keyword argument, will be True if the ‘checkfirst’ flag was set during the call to create(), create_all(), drop(), drop_all().
    If the callable returns a True value, the DDL statement will be executed.
  - state¶ – any value which will be passed to the callable_ as the state keyword argument.
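A sketch of a callable that makes use of the state keyword; the metadata target and the PostgreSQL-specific CREATE EXTENSION statement are assumptions for illustration:

    from sqlalchemy import DDL, event

    def only_when_enabled(ddl, target, bind, state=None, **kw):
        # run the DDL only when a truthy ``state`` was supplied to execute_if()
        return bool(state)

    event.listen(
        metadata,
        "after_create",
        DDL("CREATE EXTENSION IF NOT EXISTS hstore").execute_if(
            callable_=only_when_enabled, state=True
        )
    )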
- attribute sqlalchemy.schema.DDLElement.on = None¶
- attribute sqlalchemy.schema.DDLElement.target = None¶
- class sqlalchemy.schema.DDL(statement, on=None, context=None, bind=None)¶
A literal DDL statement.
Specifies literal SQL DDL to be executed by the database. DDL objects function as DDL event listeners, and can be subscribed to those events listed in DDLEvents, using either Table or MetaData objects as targets. Basic templating support allows a single DDL instance to handle repetitive tasks for multiple tables.
Examples:

    from sqlalchemy import event, DDL

    tbl = Table('users', metadata, Column('uid', Integer))
    event.listen(tbl, 'before_create', DDL('DROP TRIGGER users_trigger'))

    spow = DDL('ALTER TABLE %(table)s SET secretpowers TRUE')
    event.listen(tbl, 'after_create', spow.execute_if(dialect='somedb'))

    drop_spow = DDL('ALTER TABLE users SET secretpowers FALSE')
    connection.execute(drop_spow)
When operating on Table events, the following statement string substitutions are available:

    %(table)s  - the Table name, with any required quoting applied
    %(schema)s - the schema name, with any required quoting applied
    %(fullname)s - the Table name including schema, quoted if needed
The DDL’s “context”, if any, will be combined with the standard substitutions noted above. Keys present in the context will override the standard substitutions.
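A small sketch of the templating in practice; the two tables and the index-naming convention are assumptions for illustration:

    from sqlalchemy import event, DDL

    # one DDL instance serves several tables via %(table)s substitution
    create_data_index = DDL("CREATE INDEX idx_%(table)s_data ON %(table)s (data)")

    event.listen(table_one, "after_create", create_data_index)
    event.listen(table_two, "after_create", create_data_index)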
Members
__init__()
Class signature
class sqlalchemy.schema.DDL (sqlalchemy.schema.DDLElement)

- method sqlalchemy.schema.DDL.__init__(statement, on=None, context=None, bind=None)¶
Create a DDL statement.
- Parameters:
  - statement¶ – A string or unicode string to be executed. Statements will be processed with Python’s string formatting operator. See the context argument and the execute_at method.
    A literal ‘%’ in a statement must be escaped as ‘%%’.
    SQL bind parameters are not available in DDL statements.
  - on¶ –
    Deprecated since version 0.7: The DDL.on parameter is deprecated and will be removed in a future release. Please refer to DDLElement.execute_if().
    Optional filtering criteria. May be a string, tuple or a callable predicate. If a string, it will be compared to the name of the executing database dialect:

        DDL('something', on='postgresql')

    If a tuple, specifies multiple dialect names:

        DDL('something', on=('postgresql', 'mysql'))

    If a callable, it will be invoked with four positional arguments as well as optional keyword arguments:
    - ddl: This DDL element.
    - event: The name of the event that has triggered this DDL, such as ‘after-create’. Will be None if the DDL is executed explicitly.
    - target: The Table or MetaData object which is the target of this event. May be None if the DDL is executed explicitly.
    - connection: The Connection being used for DDL execution.
    - tables: Optional keyword argument - a list of Table objects which are to be created/dropped within a MetaData.create_all() or drop_all() method call.
    If the callable returns a true value, the DDL statement will be executed.
  - context¶ – Optional dictionary, defaults to None. These values will be available for use in string substitutions on the DDL statement.
  - bind¶ – Optional. A Connectable, used by default when execute() is invoked without a bind argument.
- class sqlalchemy.schema._CreateDropBase(element, on=None, bind=None)¶
Base class for DDL constructs that represent CREATE and DROP or equivalents.
The common theme of _CreateDropBase is a single element attribute which refers to the element to be created or dropped.
Class signature
class sqlalchemy.schema._CreateDropBase (sqlalchemy.schema.DDLElement)
- class sqlalchemy.schema.CreateTable(element, on=None, bind=None, include_foreign_key_constraints=None)¶
Represent a CREATE TABLE statement.
Members
__init__()
Class signature
class sqlalchemy.schema.CreateTable (sqlalchemy.schema._CreateDropBase)

- method sqlalchemy.schema.CreateTable.__init__(element, on=None, bind=None, include_foreign_key_constraints=None)¶
Create a CreateTable construct.
- Parameters:
  - include_foreign_key_constraints¶ – optional sequence of ForeignKeyConstraint objects that will be included inline within the CREATE construct; if omitted, all foreign key constraints that do not specify use_alter=True are included.
    New in version 1.0.0.
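As a sketch of this parameter, passing an empty sequence renders the table with no inline foreign keys so that they can be added later via ALTER; the table object here is an assumption for illustration:

    from sqlalchemy.schema import CreateTable
    from sqlalchemy.dialects import postgresql

    # render CREATE TABLE with no inline FOREIGN KEY clauses
    ddl = CreateTable(mytable, include_foreign_key_constraints=[])
    print(ddl.compile(dialect=postgresql.dialect()))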
- class sqlalchemy.schema.DropTable(element, on=None, bind=None)¶
Represent a DROP TABLE statement.
Class signature
class sqlalchemy.schema.DropTable (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.CreateColumn(element)¶
Represent a Column as rendered in a CREATE TABLE statement, via the CreateTable construct.
This is provided to support custom column DDL within the generation of CREATE TABLE statements, by using the compiler extension documented in Custom SQL Constructs and Compilation Extension to extend CreateColumn.
Typical integration is to examine the incoming Column object, and to redirect compilation if a particular flag or condition is found:

    from sqlalchemy import schema
    from sqlalchemy.ext.compiler import compiles

    @compiles(schema.CreateColumn)
    def compile(element, compiler, **kw):
        column = element.element

        if "special" not in column.info:
            return compiler.visit_create_column(element, **kw)

        text = "%s SPECIAL DIRECTIVE %s" % (
            column.name,
            compiler.type_compiler.process(column.type)
        )
        default = compiler.get_column_default_string(column)
        if default is not None:
            text += " DEFAULT " + default

        if not column.nullable:
            text += " NOT NULL"

        if column.constraints:
            text += " ".join(
                compiler.process(const)
                for const in column.constraints)
        return text
The above construct can be applied to a Table as follows:

    from sqlalchemy import Table, MetaData, Column, Integer, String
    from sqlalchemy import schema

    metadata = MetaData()

    table = Table('mytable', metadata,
            Column('x', Integer, info={"special": True}, primary_key=True),
            Column('y', String(50)),
            Column('z', String(20), info={"special": True})
        )

    metadata.create_all(conn)
Above, the directives we’ve added to the Column.info collection will be detected by our custom compilation scheme:

    CREATE TABLE mytable (
        x SPECIAL DIRECTIVE INTEGER NOT NULL,
        y VARCHAR(50),
        z SPECIAL DIRECTIVE VARCHAR(20),
        PRIMARY KEY (x)
    )
The CreateColumn construct can also be used to skip certain columns when producing a CREATE TABLE. This is accomplished by creating a compilation rule that conditionally returns None. This is essentially how to produce the same effect as using the system=True argument on Column, which marks a column as an implicitly-present “system” column.
For example, suppose we wish to produce a Table which skips rendering of the PostgreSQL xmin column against the PostgreSQL backend, but on other backends does render it, in anticipation of a triggered rule. A conditional compilation rule could skip this name only on PostgreSQL:

    from sqlalchemy.schema import CreateColumn

    @compiles(CreateColumn, "postgresql")
    def skip_xmin(element, compiler, **kw):
        if element.element.name == 'xmin':
            return None
        else:
            return compiler.visit_create_column(element, **kw)

    my_table = Table('mytable', metadata,
                Column('id', Integer, primary_key=True),
                Column('xmin', Integer)
            )
Above, a CreateTable construct will generate a CREATE TABLE which only includes the id column in the string; the xmin column will be omitted, but only against the PostgreSQL backend.
Class signature
class sqlalchemy.schema.CreateColumn (sqlalchemy.schema._DDLCompiles)
- class sqlalchemy.schema.CreateSequence(element, on=None, bind=None)¶
Represent a CREATE SEQUENCE statement.
Class signature
class sqlalchemy.schema.CreateSequence (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.DropSequence(element, on=None, bind=None)¶
Represent a DROP SEQUENCE statement.
Class signature
class sqlalchemy.schema.DropSequence (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.CreateIndex(element, on=None, bind=None)¶
Represent a CREATE INDEX statement.
Class signature
class sqlalchemy.schema.CreateIndex (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.DropIndex(element, on=None, bind=None)¶
Represent a DROP INDEX statement.
Class signature
class sqlalchemy.schema.DropIndex (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.AddConstraint(element, *args, **kw)¶
Represent an ALTER TABLE ADD CONSTRAINT statement.
Class signature
class sqlalchemy.schema.AddConstraint (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.DropConstraint(element, cascade=False, **kw)¶
Represent an ALTER TABLE DROP CONSTRAINT statement.
Class signature
class sqlalchemy.schema.DropConstraint (sqlalchemy.schema._CreateDropBase)
- class sqlalchemy.schema.CreateSchema(name, quote=None, **kw)¶
Represent a CREATE SCHEMA statement.
The argument here is the string name of the schema.
Members
__init__()
Class signature
class sqlalchemy.schema.CreateSchema (sqlalchemy.schema._CreateDropBase)

- method sqlalchemy.schema.CreateSchema.__init__(name, quote=None, **kw)¶
Create a new CreateSchema construct.
- class sqlalchemy.schema.DropSchema(name, quote=None, cascade=False, **kw)¶
Represent a DROP SCHEMA statement.
The argument here is the string name of the schema.
Members
__init__()
Class signature
class sqlalchemy.schema.DropSchema (sqlalchemy.schema._CreateDropBase)

- method sqlalchemy.schema.DropSchema.__init__(name, quote=None, cascade=False, **kw)¶
Create a new DropSchema construct.
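A brief sketch of emitting these constructs directly; the engine and the schema name are assumptions for illustration:

    from sqlalchemy.schema import CreateSchema, DropSchema

    with engine.connect() as conn:
        conn.execute(CreateSchema("analytics"))               # CREATE SCHEMA analytics
        conn.execute(DropSchema("analytics", cascade=True))   # DROP SCHEMA analytics CASCADE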