Describing Databases with MetaData

    See also

    Working with Database Metadata - tutorial introduction to SQLAlchemy’s database metadata concept in the SQLAlchemy Unified Tutorial.

    A collection of metadata entities is stored in an object aptly named MetaData:
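    A minimal sketch of this construction (the variable name metadata_obj is simply the convention used in the examples that follow):

    from sqlalchemy import MetaData

    metadata_obj = MetaData()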

    MetaData is a container object that keeps together many different features of a database (or multiple databases) being described.

    To represent a table, use the Table class. Its two primary arguments are the table name, then the MetaData object with which it will be associated. The remaining positional arguments are mostly Column objects describing each column:

    from sqlalchemy import Table, Column, Integer, String

    user = Table(
        "user",
        metadata_obj,
        Column("user_id", Integer, primary_key=True),
        Column("user_name", String(16), nullable=False),
        Column("email_address", String(60)),
        Column("nickname", String(50), nullable=False),
    )

    Above, a table called user is described, which contains four columns. The primary key of the table consists of the user_id column. Multiple columns may be assigned the primary_key=True flag which denotes a multi-column primary key, known as a composite primary key.
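
    For illustration, a hedged sketch of a composite primary key (the order_item table and its columns are hypothetical, not part of the example model above):

    order_item = Table(
        "order_item",
        metadata_obj,
        # both columns carry primary_key=True, forming a composite
        # (multi-column) primary key
        Column("order_id", Integer, primary_key=True),
        Column("item_id", Integer, primary_key=True),
        Column("quantity", Integer, nullable=False),
    )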

    Note also that each column describes its datatype using objects corresponding to genericized types, such as Integer and String. SQLAlchemy features dozens of types of varying levels of specificity as well as the ability to create custom types. Documentation on the type system can be found at SQL Datatype Objects.

    The MetaData object contains all of the schema constructs we’ve associated with it. It supports a few methods of accessing these table objects, such as the sorted_tables accessor which returns a list of each Table object in order of foreign key dependency (that is, each table is preceded by all tables which it references):

    >>> for t in metadata_obj.sorted_tables:
    ...     print(t.name)
    user
    user_preference
    invoice
    invoice_item

    In most cases, individual Table objects have been explicitly declared, and these objects are typically accessed directly as module-level variables in an application. Once a Table has been defined, it has a full set of accessors which allow inspection of its properties. Given the following Table definition:

    employees = Table(
        "employees",
        metadata_obj,
        Column("employee_id", Integer, primary_key=True),
        Column("employee_name", String(60), nullable=False),
        Column("employee_dept", Integer, ForeignKey("departments.department_id")),
    )

    Note the ForeignKey object used in this table - this construct defines a reference to a remote table, and is fully described in Defining Foreign Keys. Methods of accessing information about this table include:

    # access the column "employee_id":
    employees.columns.employee_id

    # or just
    employees.c.employee_id

    # via string
    employees.c["employee_id"]

    # a tuple of columns may be returned using multiple strings
    # (new in 2.0)
    emp_id, emp_name, emp_dept = employees.c["employee_id", "employee_name", "employee_dept"]

    # iterate through all columns
    for c in employees.c:
        print(c)

    # get the table's primary key columns
    for primary_key in employees.primary_key:
        print(primary_key)

    # get the table's foreign key objects:
    for fkey in employees.foreign_keys:
        print(fkey)

    # access the table's MetaData:
    employees.metadata

    # access a column's name, type, nullable, primary key, foreign key
    employees.c.employee_id.name
    employees.c.employee_id.type
    employees.c.employee_id.nullable
    employees.c.employee_id.primary_key
    employees.c.employee_dept.foreign_keys

    # get the "key" of a column, which defaults to its name, but can
    # be any user-defined string:
    employees.c.employee_name.key

    # access a column's table:
    employees.c.employee_id.table is employees

    # get the table related by a foreign key
    list(employees.c.employee_dept.foreign_keys)[0].column.table

    Tip

    The FromClause.c collection, synonymous with the FromClause.columns collection, is an instance of ColumnCollection, which provides a dictionary-like interface to the collection of columns. Names are ordinarily accessed like attribute names, e.g. employees.c.employee_name. However for special names with spaces or those that match the names of dictionary methods such as ColumnCollection.keys() or ColumnCollection.values(), indexed access must be used, such as employees.c['values'] or employees.c["some column"]. See ColumnCollection for further information.

    Creating and Dropping Database Tables

    Once you’ve defined some Table objects, and assuming you’re working with a brand new database, one thing you might want to do is issue CREATE statements for those tables and their related constructs (as an aside, it’s also quite possible that you don’t want to do this, if you already have some preferred methodology such as tools included with your database or an existing scripting system - if that’s the case, feel free to skip this section - SQLAlchemy has no requirement that it be used to create your tables).

    The usual way to issue CREATE is to use MetaData.create_all() on the MetaData object. This method will issue queries that first check for the existence of each individual table, and if not found will issue the CREATE statements:

    engine = create_engine("sqlite:///:memory:")

    metadata_obj = MetaData()

    user = Table(
        "user",
        metadata_obj,
        Column("user_id", Integer, primary_key=True),
        Column("user_name", String(16), nullable=False),
        Column("email_address", String(60), key="email"),
        Column("nickname", String(50), nullable=False),
    )

    user_prefs = Table(
        "user_prefs",
        metadata_obj,
        Column("pref_id", Integer, primary_key=True),
        Column("user_id", Integer, ForeignKey("user.user_id"), nullable=False),
        Column("pref_name", String(40), nullable=False),
        Column("pref_value", String(100)),
    )

    metadata_obj.create_all(engine)
    PRAGMA table_info(user){}
    CREATE TABLE user(
        user_id INTEGER NOT NULL PRIMARY KEY,
        user_name VARCHAR(16) NOT NULL,
        email_address VARCHAR(60),
        nickname VARCHAR(50) NOT NULL
    )
    PRAGMA table_info(user_prefs){}
    CREATE TABLE user_prefs(
        pref_id INTEGER NOT NULL PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES user(user_id),
        pref_name VARCHAR(40) NOT NULL,
        pref_value VARCHAR(100)
    )

    create_all() creates foreign key constraints between tables usually inline with the table definition itself, and for this reason it also generates the tables in order of their dependency. There are options to change this behavior such that ALTER TABLE is used instead.
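
    As a hedged sketch of the ALTER-based approach (the table and constraint names below are hypothetical), a ForeignKeyConstraint can be given use_alter=True so that create_all() emits that constraint as a separate ALTER TABLE statement after both tables exist:

    from sqlalchemy import ForeignKeyConstraint

    node = Table(
        "node",
        metadata_obj,
        Column("node_id", Integer, primary_key=True),
        Column("primary_element", Integer),
        # emitted via ALTER TABLE after both CREATE TABLE statements
        ForeignKeyConstraint(
            ["primary_element"],
            ["element.element_id"],
            use_alter=True,
            name="fk_node_element",
        ),
    )

    element = Table(
        "element",
        metadata_obj,
        Column("element_id", Integer, primary_key=True),
        Column("parent_node_id", Integer, ForeignKey("node.node_id")),
    )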

    Dropping all tables is similarly achieved using the MetaData.drop_all() method. This method does the exact opposite of create_all() - the presence of each table is checked first, and tables are dropped in reverse order of dependency.
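
    A minimal sketch, reusing the engine and metadata_obj from the example above:

    # drops tables in reverse dependency order (e.g. user_prefs before user)
    metadata_obj.drop_all(engine)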

    Creating and dropping individual tables can be done via the create() and drop() methods of Table. These methods by default issue the CREATE or DROP regardless of the table being present:

    engine = create_engine("sqlite:///:memory:")

    metadata_obj = MetaData()

    employees = Table(
        "employees",
        metadata_obj,
        Column("employee_id", Integer, primary_key=True),
        Column("employee_name", String(60), nullable=False, key="name"),
        Column("employee_dept", Integer, ForeignKey("departments.department_id")),
    )
    employees.create(engine)
    CREATE TABLE employees(
        employee_id SERIAL NOT NULL PRIMARY KEY,
        employee_name VARCHAR(60) NOT NULL,
        employee_dept INTEGER REFERENCES departments(department_id)
    )
    {}

    The drop() method issues the DROP statement:

    employees.drop(engine)
    DROP TABLE employees
    {}

    To enable the “check first for the table existing” logic, add the checkfirst=True argument to create() or drop():

    employees.create(engine, checkfirst=True)
    employees.drop(engine, checkfirst=False)

    While SQLAlchemy directly supports emitting CREATE and DROP statements for schema constructs, the ability to alter those constructs, usually via the ALTER statement as well as other database-specific constructs, is outside of the scope of SQLAlchemy itself. While it’s easy enough to emit ALTER statements and similar by hand, such as by passing a text() construct to Connection.execute() or by using the DDL construct, it’s a common practice to automate the maintenance of database schemas in relation to application code using schema migration tools.

    The SQLAlchemy project offers the Alembic migration tool for this purpose. Alembic features a highly customizable environment and a minimalistic usage pattern, supporting such features as transactional DDL, automatic generation of “candidate” migrations, an “offline” mode which generates SQL scripts, and support for branch resolution.

    Alembic supersedes the SQLAlchemy-Migrate project, which is the original migration tool for SQLAlchemy and is now considered legacy.

    Specifying the Schema Name

    Most databases support the concept of multiple “schemas” - namespaces that refer to alternate sets of tables and other constructs. The server-side geometry of a “schema” takes many forms, including names of “schemas” under the scope of a particular database (e.g. PostgreSQL schemas), named sibling databases (e.g. MySQL / MariaDB access to other databases on the same server), as well as other concepts like tables owned by other usernames (Oracle, SQL Server) or even names that refer to alternate database files (SQLite ATTACH) or remote servers (Oracle DBLINK with synonyms).

    What all of the above approaches have (mostly) in common is that there’s a way of referring to this alternate set of tables using a string name. SQLAlchemy refers to this name as the schema name. Within SQLAlchemy, this is nothing more than a string name which is associated with a Table object, and is then rendered into SQL statements in a manner appropriate to the target database such that the table is referred towards in its remote “schema”, whatever mechanism that is on the target database.

    The “schema” name may be associated directly with a Table using the Table.schema argument; when using the ORM with declarative table configuration, the parameter is passed using the __table_args__ parameter dictionary.

    The “schema” name may also be associated with the MetaData object where it will take effect automatically for all Table objects associated with that MetaData that don’t otherwise specify their own name. Finally, SQLAlchemy also supports a “dynamic” schema name system that is often used for multi-tenant applications such that a single set of Table metadata may refer to a dynamically configured set of schema names on a per-connection or per-statement basis.

    What’s “schema”?

    SQLAlchemy’s support for database “schema” was designed with first party support for PostgreSQL-style schemas. In this style, there is first a “database” that typically has a single “owner”. Within this database there can be any number of “schemas” which then contain the actual table objects.

    A table within a specific schema is referred towards explicitly using the syntax “<schemaname>.<tablename>”. Contrast this to an architecture such as that of MySQL, where there are only “databases”, however SQL statements can refer to multiple databases at once, using the same syntax except it is “<database>.<tablename>”. On Oracle, this syntax refers to yet another concept, the “owner” of a table. Regardless of which kind of database is in use, SQLAlchemy uses the phrase “schema” to refer to the qualifying identifier within the general syntax of “<qualifier>.<tablename>”.

    See also

    Explicit Schema Name with Declarative Table - schema name specification when using the ORM declarative table configuration

    The most basic example is that of the Table.schema argument using a Core Table object as follows:

    metadata_obj = MetaData()

    financial_info = Table(
        "financial_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("value", String(100), nullable=False),
        schema="remote_banks",
    )

    SQL that is rendered using this Table, such as the SELECT statement below, will explicitly qualify the table name financial_info with the remote_banks schema name:

    >>> print(select(financial_info))
    SELECT remote_banks.financial_info.id, remote_banks.financial_info.value
    FROM remote_banks.financial_info

    When a Table object is declared with an explicit schema name, it is stored in the internal namespace using the combination of the schema and table name. We can view this in the MetaData.tables collection by searching for the key 'remote_banks.financial_info':

    >>> metadata_obj.tables["remote_banks.financial_info"]
    Table('financial_info', MetaData(),
        Column('id', Integer(), table=<financial_info>, primary_key=True, nullable=False),
        Column('value', String(length=100), table=<financial_info>, nullable=False),
        schema='remote_banks')

    This dotted name is also what must be used when referring to the table for use with the ForeignKey or ForeignKeyConstraint objects, even if the referring table is also in that same schema:

    customer = Table(
        "customer",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("financial_info_id", ForeignKey("remote_banks.financial_info.id")),
        schema="remote_banks",
    )

    The Table.schema argument may also be used with certain dialects to indicate a multiple-token (e.g. dotted) path to a particular table. This is particularly important on a database such as Microsoft SQL Server where there are often dotted “database/owner” tokens. The tokens may be placed directly in the name at once, such as:

    schema = "dbo.scott"

    See also

    Multipart Schema Names - describes use of dotted schema names with the SQL Server dialect.
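
    As a hedged sketch (the table name below is hypothetical), such a dotted value is passed to Table.schema in the same way as a single-token schema name:

    banking_info = Table(
        "banking_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        # two-token schema name, interpreted per the SQL Server dialect's rules
        schema="dbo.scott",
    )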

    The MetaData object may also set up an explicit default option for all Table.schema parameters by passing the MetaData.schema argument to the top level MetaData construct:

    metadata_obj = MetaData(schema="remote_banks")

    financial_info = Table(
        "financial_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("value", String(100), nullable=False),
    )

    Above, any Table object (or Sequence object directly associated with the MetaData) which leaves the Table.schema parameter at its default of None will instead act as though the parameter were set to the value "remote_banks". This includes that the Table is cataloged in the MetaData using the schema-qualified name, that is:

    metadata_obj.tables["remote_banks.financial_info"]

    When using the ForeignKey or ForeignKeyConstraint objects to refer to this table, either the schema-qualified name or the non-schema-qualified name may be used to refer to the remote_banks.financial_info table:

    # either will work:
    refers_to_financial_info = Table(
        "refers_to_financial_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("fiid", ForeignKey("financial_info.id")),
    )

    # or
    refers_to_financial_info = Table(
        "refers_to_financial_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("fiid", ForeignKey("remote_banks.financial_info.id")),
    )

    When using a MetaData object that sets MetaData.schema, a Table that wishes to specify that it should not be schema qualified may use the special symbol BLANK_SCHEMA:

    from sqlalchemy import BLANK_SCHEMA

    metadata_obj = MetaData(schema="remote_banks")

    financial_info = Table(
        "financial_info",
        metadata_obj,
        Column("id", Integer, primary_key=True),
        Column("value", String(100), nullable=False),
        schema=BLANK_SCHEMA,  # will not use "remote_banks"
    )

    See also

    MetaData.schema

    The names used by the Table.schema parameter may also be applied against a lookup that is dynamic on a per-connection or per-execution basis, so that for example in multi-tenant situations, each transaction or statement may be targeted at a specific set of schema names that change. The section Translation of Schema Names describes how this feature is used.

    See also

    Translation of Schema Names
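
    A minimal sketch of that feature, assuming a schema named tenant_one exists on the target database (the schema name here is hypothetical):

    # Table objects defined with schema=None render as "tenant_one.<name>"
    # on this connection
    with engine.connect().execution_options(
        schema_translate_map={None: "tenant_one"}
    ) as conn:
        conn.execute(user.select())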

    The above approaches all refer to methods of including an explicit schema name within SQL statements. Database connections in fact feature the concept of a “default” schema, which is the name of the “schema” (or database, owner, etc.) that is used if a table name is not explicitly schema-qualified. These names are usually configured at the login level; for example, when connecting to a PostgreSQL database, the default “schema” is called “public”.

    There are often cases where the default “schema” cannot be set via the login itself and instead would usefully be configured each time a connection is made, using a statement such as “SET SEARCH_PATH” on PostgreSQL or “ALTER SESSION” on Oracle. These approaches may be achieved by using the PoolEvents.connect() event, which allows access to the DBAPI connection when it is first created. For example, to set the Oracle CURRENT_SCHEMA variable to an alternate name:

    from sqlalchemy import event
    from sqlalchemy import create_engine

    # hypothetical placeholder for the alternate schema to apply per connection
    schema_name = "some_alternate_schema"

    engine = create_engine("oracle+cx_oracle://scott:tiger@tsn_name")


    @event.listens_for(engine, "connect", insert=True)
    def set_current_schema(dbapi_connection, connection_record):
        cursor_obj = dbapi_connection.cursor()
        cursor_obj.execute("ALTER SESSION SET CURRENT_SCHEMA=%s" % schema_name)
        cursor_obj.close()

    Above, the set_current_schema() event handler will take place immediately when the above Engine first connects; as the event is “inserted” into the beginning of the handler list, it will also take place before the dialect’s own event handlers are run, in particular including the one that will determine the “default schema” for the connection.

    For other databases, consult the database and/or dialect documentation for specific information regarding how default schemas are configured.

    Changed in version 1.4.0b2: The above recipe now works without the need to establish additional event handlers.

    See also

    Setting Alternate Search Paths on Connect - in the PostgreSQL dialect documentation.

    The schema feature of SQLAlchemy interacts with the table reflection feature introduced at Reflecting Database Objects. See the section Reflecting Tables from Other Schemas for additional details on how this works.
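
    As a hedged sketch of that interaction (assuming the remote_banks.financial_info table already exists on the connected database), Table.schema combines with reflection as follows:

    metadata_obj = MetaData()

    financial_info = Table(
        "financial_info",
        metadata_obj,
        schema="remote_banks",
        autoload_with=engine,  # reflect columns from the database
    )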

    Backend-Specific Options

    Table supports database-specific options. For example, MySQL has different table backend types, including “MyISAM” and “InnoDB”. This can be expressed with Table using mysql_engine:

    addresses = Table(
        "engine_email_addresses",
        metadata_obj,
        Column("address_id", Integer, primary_key=True),
        Column("remote_user_id", Integer, ForeignKey(users.c.user_id)),
        Column("email_address", String(20)),
        mysql_engine="InnoDB",
    )

    Other backends may support table-level options as well - these would be described in the individual documentation sections for each dialect.

    Column, Table, MetaData API

    attribute sqlalchemy.schema.BLANK_SCHEMA

    Refers to SchemaConst.BLANK_SCHEMA.

    attribute sqlalchemy.schema.RETAIN_SCHEMA

    Refers to SchemaConst.RETAIN_SCHEMA.

    class sqlalchemy.schema.Column

    Represents a column in a database table.

    Members

    __eq__(), __init__(), __le__(), __lt__(), __ne__(), all_(), anon_key_label, anon_label, any_(), argument_for(), asc(), between(), bool_op(), cast(), collate(), compare(), compile(), concat(), contains(), copy(), desc(), dialect_kwargs, dialect_options, distinct(), endswith(), expression, foreign_keys, get_children(), icontains(), iendswith(), ilike(), in_(), index, info, inherit_cache, is_(), is_distinct_from(), is_not(), is_not_distinct_from(), isnot(), isnot_distinct_from(), istartswith(), key, kwargs, label(), like(), match(), not_ilike(), not_in(), not_like(), notilike(), notin_(), notlike(), nulls_first(), nulls_last(), nullsfirst(), nullslast(), op(), operate(), params(), proxy_set, references(), regexp_match(), regexp_replace(), reverse_operate(), self_group(), shares_lineage(), startswith(), timetuple, type, unique_params()

    Class signature

    class sqlalchemy.schema.Column (sqlalchemy.sql.base.DialectKWArgs, sqlalchemy.schema.SchemaItem, sqlalchemy.sql.expression.ColumnClause)

    • method __eq__(other: Any) → ColumnOperators

      inherited from the sqlalchemy.sql.expression.ColumnOperators.__eq__ method of ColumnOperators

      Implement the == operator.

      In a column context, produces the clause a = b. If the target is None, produces a IS NULL.
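
      For a quick, hedged illustration using the standalone column() construct (not the Column class documented here):

          >>> from sqlalchemy import column
          >>> print(column("x") == 5)
          x = :x_1
          >>> print(column("x") == None)
          x IS NULL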

    • method sqlalchemy.schema.Column.__init__(_Column__name_pos: Optional[Union[str, _TypeEngineArgument[_T], SchemaEventTarget]] = None, _Column__type_pos: Optional[Union[_TypeEngineArgument[_T], SchemaEventTarget]] = None, *args: SchemaEventTarget, name: Optional[str] = None, type_: Optional[_TypeEngineArgument[_T]] = None, autoincrement: Union[bool, Literal['auto', 'ignore_fk']] = 'auto', default: Optional[Any] = None, doc: Optional[str] = None, key: Optional[str] = None, index: Optional[bool] = None, unique: Optional[bool] = None, info: Optional[_InfoType] = None, nullable: Optional[Union[bool, Literal[SchemaConst.NULL_UNSPECIFIED]]] = SchemaConst.NULL_UNSPECIFIED, onupdate: Optional[Any] = None, primary_key: bool = False, server_default: Optional[_ServerDefaultType] = None, server_onupdate: Optional[FetchedValue] = None, quote: Optional[bool] = None, system: bool = False, comment: Optional[str] = None, _proxies: Optional[Any] = None, **dialect_kwargs: Any)

      Construct a new Column object.

      • Parameters:

        • name

          The name of this column as represented in the database. This argument may be the first positional argument, or specified via keyword.

          Names which contain no upper case characters will be treated as case insensitive names, and will not be quoted unless they are a reserved word. Names with any number of upper case characters will be quoted and sent exactly. Note that this behavior applies even for databases which standardize upper case names as case insensitive such as Oracle.

          The name field may be omitted at construction time and applied later, at any time before the Column is associated with a Table. This is to support convenient usage within the declarative extension.

        • type_

          The column’s type, indicated using an instance which subclasses TypeEngine. If no arguments are required for the type, the class of the type can be sent as well, e.g.:

          # use a type with arguments
          Column('data', String(50))

          # use no arguments
          Column('level', Integer)

          The type argument may be the second positional argument or specified by keyword.

          If the type is None or is omitted, it will first default to the special type NullType. If and when this Column is made to refer to another column using ForeignKey and/or ForeignKeyConstraint, the type of the remote-referenced column will be copied to this column as well, at the moment that the foreign key is resolved against that remote Column object.

          Changed in version 0.9.0: Support for propagation of type to a Column from its ForeignKey object has been improved and should be more reliable and timely.

        • *args – Additional positional arguments include various derived constructs which will be applied as options to the column. These include instances of Constraint, ForeignKey, ColumnDefault, Sequence, Computed and Identity. In some cases an equivalent keyword argument is available such as server_default, default and unique.

        • autoincrement

          Set up “auto increment” semantics for an integer primary key column with no foreign key dependencies (see later in this docstring for a more specific definition). This may influence the DDL that will be emitted for this column during a table create, as well as how the column will be considered when INSERT statements are compiled and executed.

          The default value is the string "auto", which indicates that a single-column (i.e. non-composite) primary key that is of an INTEGER type with no other client-side or server-side default constructs indicated should receive auto increment semantics automatically. Other values include True (force this column to have auto-increment semantics for a composite primary key as well), False (this column should never have auto-increment semantics), and the string "ignore_fk" (special-case for foreign key columns, see below).

          The term “auto increment semantics” refers both to the kind of DDL that will be emitted for the column within a CREATE TABLE statement, when methods such as MetaData.create_all() and Table.create() are invoked, as well as how the column will be considered when an INSERT statement is compiled and emitted to the database:

          • DDL rendering (i.e. MetaData.create_all(), Table.create()): When used on a Column that has no other default-generating construct associated with it (such as a Sequence or Identity construct), the parameter will imply that database-specific keywords such as PostgreSQL SERIAL, MySQL AUTO_INCREMENT, or IDENTITY on SQL Server should also be rendered. Not every database backend has an “implied” default generator available; for example the Oracle backend always needs an explicit construct such as Identity to be included with a Column in order for the DDL rendered to include auto-generating constructs to also be produced in the database.

          • INSERT semantics (i.e. when an Insert construct is compiled into a SQL string and is then executed on a database using Connection.execute() or equivalent): A single-row INSERT statement will be known to produce a new integer primary key value automatically for this column, which will be accessible after the statement is invoked via the CursorResult.inserted_primary_key attribute upon the Result object. This also applies towards use of the ORM when ORM-mapped objects are persisted to the database, indicating that a new integer primary key will be available to become part of the identity key for that object. This behavior takes place regardless of what DDL constructs are associated with the Column and is independent of the “DDL Rendering” behavior discussed in the previous note above.

          The parameter may be set to True to indicate that a column which is part of a composite (i.e. multi-column) primary key should have autoincrement semantics, though note that only one column within a primary key may have this setting. It can also be set to True to indicate autoincrement semantics on a column that has a client-side or server-side default configured, however note that not all dialects can accommodate all styles of default as an “autoincrement”. It can also be set to False on a single-column primary key that has a datatype of INTEGER in order to disable auto increment semantics for that column.

          Changed in version 1.1: The autoincrement flag now defaults to "auto" which indicates autoincrement semantics by default for single-column integer primary keys only; for composite (multi-column) primary keys, autoincrement is never implicitly enabled; as always, autoincrement=True will allow for at most one of those columns to be an “autoincrement” column. autoincrement=True may also be set on a Column that has an explicit client-side or server-side default, subject to limitations of the backend database and dialect.

          The setting only has an effect for columns which are:

          • Integer derived (i.e. INT, SMALLINT, BIGINT).

          • Part of the primary key

          • Not referring to another column via ForeignKey, unless the value is specified as 'ignore_fk':

            # turn on autoincrement for this column despite
            # the ForeignKey()
            Column('id', ForeignKey('other.id'),
                   primary_key=True, autoincrement='ignore_fk')

          It is typically not desirable to have “autoincrement” enabled on a column that refers to another via foreign key, as such a column is required to refer to a value that originates from elsewhere.

          The setting has these effects on columns that meet the above criteria:

          • DDL issued for the column, if the column does not already include a default generating construct supported by the backend such as Identity, will include database-specific keywords intended to signify this column as an “autoincrement” column for specific backends. Behavior for primary SQLAlchemy dialects includes:

            • AUTO_INCREMENT on MySQL and MariaDB

            • SERIAL on PostgreSQL

            • IDENTITY on MS-SQL - this occurs even without the Identity construct as the parameter pre-dates this construct.

            • SQLite - SQLite integer primary key columns are implicitly “auto incrementing” and no additional keywords are rendered; the special SQLite keyword AUTOINCREMENT is not rendered, as this is unnecessary and not recommended by the database vendor. See the section SQLite Auto Incrementing Behavior for more background.

            • Oracle - The Oracle dialect has no default “autoincrement” feature available at this time, instead the Identity construct is recommended to achieve this (the Sequence construct may also be used).

            • Third-party dialects - consult those dialects’ documentation for details on their specific behaviors.

          • When a single-row Insert construct is compiled and executed, which does not set the Insert.inline() modifier, newly generated primary key values for this column will be automatically retrieved upon statement execution using a method specific to the database driver in use:

            • MySQL, SQLite - calling upon cursor.lastrowid (see the Python DBAPI specification for lastrowid)

            • PostgreSQL, SQL Server, Oracle - use RETURNING or an equivalent construct when rendering an INSERT statement, and then retrieving the newly generated primary key values after execution

            • PostgreSQL, Oracle for Table objects that set Table.implicit_returning to False - for a Sequence only, the Sequence is invoked explicitly before the INSERT statement takes place so that the newly generated primary key value is available to the client

            • SQL Server for Table objects that set Table.implicit_returning to False - the SELECT scope_identity() construct is used after the INSERT statement is invoked to retrieve the newly generated primary key value.

            • Third-party dialects - consult those dialects’ documentation for details on their specific behaviors.

          • For multiple-row insert() constructs invoked with a list of parameters (i.e. “executemany” semantics), primary-key retrieving behaviors are generally disabled, however there may be special APIs that may be used to retrieve lists of new primary key values for an “executemany”, such as the psycopg2 “fast insertmany” feature. Such features are very new and may not yet be well covered in documentation.

        • default

          A scalar, Python callable, or expression representing the default value for this column, which will be invoked upon insert if this column is otherwise not specified in the VALUES clause of the insert. This is a shortcut to using ColumnDefault as a positional argument; see that class for full detail on the structure of the argument.

          Contrast this argument to Column.server_default which creates a default generator on the database side.

          See also

          Column INSERT/UPDATE Defaults
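
          As a hedged sketch (the column names here are hypothetical), both scalar and callable client-side defaults take this form:

          from datetime import datetime

          from sqlalchemy import Column, DateTime, String

          # scalar default applied when the column is absent from the INSERT
          Column("status", String(20), default="pending")

          # Python callable invoked at INSERT time to produce the value
          Column("created_at", DateTime, default=datetime.utcnow)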

        • doc – optional String that can be used by the ORM or similar to document attributes on the Python side. This attribute does not render SQL comments; use the Column.comment parameter for this purpose.

        • key – An optional string identifier which will identify this Column object on the Table. When a key is provided, this is the only identifier referencing the Column within the application, including ORM attribute mapping; the name field is used only when rendering SQL.

        • index

          When True, indicates that an Index construct will be automatically generated for this Column, which will result in a “CREATE INDEX” statement being emitted for the Table when the DDL create operation is invoked.

          Using this flag is equivalent to making use of the Index construct explicitly at the level of the Table construct itself:

          Table(
              "some_table",
              metadata,
              Column("x", Integer),
              Index("ix_some_table_x", "x")
          )

          To add the Index.unique flag to the Index, set both the Column.unique and Column.index flags to True simultaneously, which will have the effect of rendering the “CREATE UNIQUE INDEX” DDL instruction instead of “CREATE INDEX”.

          The name of the index is generated using the default naming convention which for the Index construct is of the form ix_<tablename>_<columnname>.

          As this flag is intended only as a convenience for the common case of adding a single-column, default configured index to a table definition, explicit use of the Index construct should be preferred for most use cases, including composite indexes that encompass more than one column, indexes with SQL expressions or ordering, backend-specific index configuration options, and indexes that use a specific name.

          Note

          the Column.index attribute on Column does not indicate if this column is indexed or not, only if this flag was explicitly set here. To view indexes on a column, view the Table.indexes collection or use Inspector.get_indexes().

          See also

          Configuring Constraint Naming Conventions

        • info – Optional data dictionary which will be populated into the SchemaItem.info attribute of this object.

        • nullable

          When set to False, will cause the “NOT NULL” phrase to be added when generating DDL for the column. When True, will normally generate nothing (in SQL this defaults to “NULL”), except in some very specific backend-specific edge cases where “NULL” may render explicitly. Defaults to True unless Column.primary_key is also True or the column specifies an Identity, in which case it defaults to False. This parameter is only used when issuing CREATE TABLE statements.

          Note

          When the column specifies an Identity this parameter is in general ignored by the DDL compiler. The PostgreSQL database allows nullable identity columns by setting this parameter to True explicitly.

        • onupdate

          A scalar, Python callable, or ClauseElement representing a default value to be applied to the column within UPDATE statements, which will be invoked upon update if this column is not present in the SET clause of the update. This is a shortcut to using ColumnDefault as a positional argument with for_update=True.

          See also

          Column INSERT/UPDATE Defaults - complete discussion of onupdate

        • primary_key – If True, marks this column as a primary key column. Multiple columns can have this flag set to specify composite primary keys. As an alternative, the primary key of a Table can be specified via an explicit PrimaryKeyConstraint object.

        • server_default

          A FetchedValue instance, str, Unicode or text() construct representing the DDL DEFAULT value for the column.

          String types will be emitted as-is, surrounded by single quotes:

          Column('x', Text, server_default="val")

          x TEXT DEFAULT 'val'

          A text() expression will be rendered as-is, without quotes:

          Column('y', DateTime, server_default=text('NOW()'))

          y DATETIME DEFAULT NOW()

          Strings and text() will be converted into a DefaultClause object upon initialization.

          This parameter can also accept complex combinations of contextually valid SQLAlchemy expressions or constructs:

          from sqlalchemy import create_engine
          from sqlalchemy import Table, Column, MetaData, ARRAY, Text
          from sqlalchemy.dialects.postgresql import array

          engine = create_engine(
              'postgresql+psycopg2://scott:tiger@localhost/mydatabase'
          )
          metadata_obj = MetaData()
          tbl = Table(
              "foo",
              metadata_obj,
              Column("bar", ARRAY(Text), server_default=array(["biz", "bang", "bash"])),
          )
          metadata_obj.create_all(engine)

          The above results in a table created with the following SQL:

          CREATE TABLE foo (
              bar TEXT[] DEFAULT ARRAY['biz', 'bang', 'bash']
          )

          Use FetchedValue to indicate that an already-existing column will generate a default value on the database side which will be available to SQLAlchemy for post-fetch after inserts. This construct does not specify any DDL and the implementation is left to the database, such as via a trigger.

          See also

          Server-invoked DDL-Explicit Default Expressions - complete discussion of server side defaults

        • server_onupdate

          A FetchedValue instance representing a database-side default generation function, such as a trigger. This indicates to SQLAlchemy that a newly generated value will be available after updates. This construct does not actually implement any kind of generation function within the database, which instead must be specified separately.

          Warning

          This directive does not currently produce MySQL’s “ON UPDATE CURRENT_TIMESTAMP()” clause. See Rendering ON UPDATE CURRENT TIMESTAMP for MySQL / MariaDB’s explicit_defaults_for_timestamp for background on how to produce this clause.

          See also

        • quote – Force quoting of this column’s name on or off, corresponding to True or False. When left at its default of None, the column identifier will be quoted according to whether the name is case sensitive (identifiers with at least one upper case character are treated as case sensitive), or if it’s a reserved word. This flag is only needed to force quoting of a reserved word which is not known by the SQLAlchemy dialect.

        • unique

          When True, and the Column.index parameter is left at its default value of False, indicates that a UniqueConstraint construct will be automatically generated for this Column, which will result in a “UNIQUE CONSTRAINT” clause referring to this column being included in the CREATE TABLE statement emitted, when the DDL create operation for the Table object is invoked.

          When this flag is True while the Column.index parameter is simultaneously set to True, the effect instead is that an Index construct which includes the Index.unique parameter set to True is generated. See the documentation for Column.index for additional detail.

          Using this flag is equivalent to making use of the UniqueConstraint construct explicitly at the level of the Table construct itself:

          Table(
              "some_table",
              metadata,
              Column("x", Integer),
              UniqueConstraint("x")
          )

          The UniqueConstraint.name parameter of the unique constraint object is left at its default value of None; in the absence of a naming convention for the enclosing MetaData, the UNIQUE CONSTRAINT construct will be emitted as unnamed, which typically invokes a database-specific naming convention to take place.

          As this flag is intended only as a convenience for the common case of adding a single-column, default configured unique constraint to a table definition, explicit use of the UniqueConstraint construct should be preferred for most use cases, including composite constraints that encompass more than one column, backend-specific index configuration options, and constraints that use a specific name.

          Note

          the Column.unique attribute on Column does not indicate if this column has a unique constraint or not, only if this flag was explicitly set here. To view indexes and unique constraints that may involve this column, view the Table.indexes and/or Table.constraints collections or use Inspector.get_indexes() and/or Inspector.get_unique_constraints().

          See also

          UNIQUE Constraint

          Column.index

        • system

          When True, indicates this is a “system” column, that is a column which is automatically made available by the database, and should not be included in the columns list for a CREATE TABLE statement.

          For more elaborate scenarios where columns should be conditionally rendered differently on different backends, consider custom compilation rules for CreateColumn.

        • comment

          Optional string that will render an SQL comment on table creation.

          New in version 1.2: Added the Column.comment parameter to Column.

    • method sqlalchemy.schema.Column.__le__(other: Any) → ColumnOperators

      inherited from the sqlalchemy.sql.expression.ColumnOperators.__le__ method of ColumnOperators

      Implement the <= operator.

      In a column context, produces the clause a <= b.

    • method __lt__(other: Any) → ColumnOperators

      inherited from the sqlalchemy.sql.expression.ColumnOperators.__lt__ method of ColumnOperators

      Implement the < operator.

      In a column context, produces the clause a < b.

    • method sqlalchemy.schema.Column.__ne__(other: Any) → ColumnOperators

      inherited from the sqlalchemy.sql.expression.ColumnOperators.__ne__ method of ColumnOperators

      Implement the != operator.

      In a column context, produces the clause a != b. If the target is None, produces a IS NOT NULL.

    • method all_() → ColumnOperators

      inherited from the ColumnOperators.all_() method of ColumnOperators

      Produce an all_() clause against the parent object.

      See the documentation for all_() for examples.

      Note

      be sure to not confuse the newer ColumnOperators.all_() method with its older ARRAY-specific counterpart, the Comparator.all() method, which uses a different calling syntax and usage pattern.

      New in version 1.1.

    • attribute sqlalchemy.schema.Column.anon_key_label

      inherited from the ColumnElement.anon_key_label attribute of ColumnElement

      Deprecated since version 1.4: The ColumnElement.anon_key_label attribute is now private, and the public accessor is deprecated.

    • attribute sqlalchemy.schema.Column.anon_label

      inherited from the ColumnElement.anon_label attribute of ColumnElement

      Deprecated since version 1.4: The ColumnElement.anon_label attribute is now private, and the public accessor is deprecated.

    • method sqlalchemy.schema.Column.any_() → ColumnOperators

      inherited from the ColumnOperators.any_() method of ColumnOperators

      Produce an any_() clause against the parent object.

      See the documentation for any_() for examples.

      Note

      be sure to not confuse the newer ColumnOperators.any_() method with its older ARRAY-specific counterpart, the Comparator.any() method, which uses a different calling syntax and usage pattern.

      New in version 1.1.

    • classmethod argument_for(dialect_name, argument_name, default)

      inherited from the DialectKWArgs.argument_for() method of DialectKWArgs

      Add a new kind of dialect-specific keyword argument for this class.

      E.g.:

      Index.argument_for("mydialect", "length", None)

      some_index = Index('a', 'b', mydialect_length=5)

      The DialectKWArgs.argument_for() method is a per-argument way of adding extra arguments to the DefaultDialect.construct_arguments dictionary. This dictionary provides a list of argument names accepted by various schema-level constructs on behalf of a dialect.

      New dialects should typically specify this dictionary all at once as a data member of the dialect class. The use case for ad-hoc addition of argument names is typically for end-user code that is also using a custom compilation scheme which consumes the additional arguments.

      • Parameters:

        • dialect_name – name of a dialect. The dialect must be locatable, else a NoSuchModuleError is raised. The dialect must also include an existing DefaultDialect.construct_arguments collection, indicating that it participates in the keyword-argument validation and default system, else ArgumentError is raised. If the dialect does not include this collection, then any keyword argument can be specified on behalf of this dialect already. All dialects packaged within SQLAlchemy include this collection, however for third party dialects, support may vary.

        • argument_name – name of the parameter.

        • default – default value of the parameter.

      New in version 0.9.4.
    • method asc() → ColumnOperators

      inherited from the ColumnOperators.asc() method of ColumnOperators

      Produce an asc() clause against the parent object.

    • method sqlalchemy.schema.Column.between(cleft: Any, cright: Any, symmetric: bool = False) → ColumnOperators

      inherited from the ColumnOperators.between() method of ColumnOperators

      Produce a between() clause against the parent object, given the lower and upper range.
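
      A brief hedged illustration using the standalone column() construct:

          >>> from sqlalchemy import column
          >>> print(column("a").between(5, 10))
          a BETWEEN :a_1 AND :a_2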

    • method bool_op(opstring: str, precedence: int = 0, python_impl: Optional[Callable[[…], Any]] = None) → Callable[[Any], Operators]

      inherited from the Operators.bool_op() method of Operators

      Return a custom boolean operator.

      This method is shorthand for calling Operators.op() and passing the Operators.op.is_comparison flag with True. A key advantage to using Operators.bool_op() is that when using column constructs, the “boolean” nature of the returned expression will be present for PEP 484 purposes.

      See also

      Operators.op()

    • method sqlalchemy.schema.Column.cast(type_: _TypeEngineArgument[_T]) → Cast[_T]

      inherited from the ColumnElement.cast() method of ColumnElement

      Produce a type cast, i.e. CAST(<expression> AS <type>).

      This is a shortcut to the cast() function.

      See also

      cast()

      New in version 1.0.7.
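
      A brief hedged illustration using the standalone column() construct:

          >>> from sqlalchemy import Numeric, column
          >>> print(column("unit_price").cast(Numeric(10, 2)))
          CAST(unit_price AS NUMERIC(10, 2))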

    • method sqlalchemy.schema.Column.collate(collation: str) → ColumnOperators

      inherited from the ColumnOperators.collate() method of ColumnOperators

      Produce a collate() clause against the parent object, given the collation string.

      See also

      collate()

    • method sqlalchemy.schema.Column.compare(other: ClauseElement, **kw: Any) → bool

      inherited from the ClauseElement.compare() method of ClauseElement

      Compare this ClauseElement to the given ClauseElement.

      Subclasses should override the default behavior, which is a straight identity comparison.

      **kw are arguments consumed by subclass compare() methods and may be used to modify the criteria for comparison (see ColumnElement).

    • method compile(bind: Optional[Union[Engine, Connection]] = None, dialect: Optional[Dialect] = None, **kw: Any) → Compiled

      inherited from the CompilerElement.compile() method of CompilerElement

      Compile this SQL expression.

      The return value is a Compiled object. Calling str() or unicode() on the returned value will yield a string representation of the result. The object also can return a dictionary of bind parameter names and values using the params accessor.

      • Parameters:

        • bind – A Connection or Engine which can provide a Dialect in order to generate a Compiled object. If the bind and dialect parameters are both omitted, a default SQL compiler is used.

        • column_keys – Used for INSERT and UPDATE statements, a list of column names which should be present in the VALUES clause of the compiled statement. If None, all columns from the target table object are rendered.

        • dialect – A Dialect instance which can generate a Compiled object. This argument takes precedence over the bind argument.

        • compile_kwargs

          optional dictionary of additional parameters that will be passed through to the compiler within all “visit” methods. This allows any custom flag to be passed through to a custom compilation construct, for example. It is also used for the case of passing the literal_binds flag through:

          from sqlalchemy.sql import table, column, select

          t = table('t', column('x'))
          s = select(t).where(t.c.x == 5)
          print(s.compile(compile_kwargs={"literal_binds": True}))

          New in version 0.9.0.

      See also

      How do I render SQL expressions as strings, possibly with bound parameters inlined?
    • method sqlalchemy.schema.Column.concat(other: Any) → ColumnOperators

      inherited from the ColumnOperators.concat() method of ColumnOperators

      Implement the ‘concat’ operator.

      In a column context, produces the clause a || b, or uses the concat() operator on MySQL.

    • method sqlalchemy.schema.Column.contains(other: Any, **kw: Any) → ColumnOperators

      inherited from the ColumnOperators.contains() method of ColumnOperators

      Implement the ‘contains’ operator.

      Produces a LIKE expression that tests against a match for the middle of a string value:

      column LIKE '%' || <other> || '%'

      E.g.:

      stmt = select(sometable).\
          where(sometable.c.column.contains("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.contains.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.contains.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.contains.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.contains("foo%bar", autoescape=True)

          Will render as:

          somecolumn LIKE '%' || :param || '%' ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.contains("foo/%bar", escape="^")

          Will render as:

          somecolumn LIKE '%' || :param || '%' ESCAPE '^'

          The parameter may also be combined with ColumnOperators.contains.autoescape:

          1. somecolumn.contains("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

      See also

      ColumnOperators.startswith()

      ColumnOperators.endswith()

      ColumnOperators.like()
    • method sqlalchemy.schema.Column.copy(**kw: Any) → Column[Any]

      Deprecated since version 1.4: The Column.copy() method is deprecated and will be removed in a future release.

    • method desc() → ColumnOperators

      inherited from the ColumnOperators.desc() method of ColumnOperators

      Produce a desc() clause against the parent object.

    • attribute sqlalchemy.schema.Column.dialect_kwargs

      inherited from the DialectKWArgs.dialect_kwargs attribute of DialectKWArgs

      A collection of keyword arguments specified as dialect-specific options to this construct.

      The arguments are present here in their original <dialect>_<kwarg> format. Only arguments that were actually passed are included; unlike the DialectKWArgs.dialect_options collection, which contains all options known by this dialect including defaults.

      The collection is also writable; keys are accepted of the form <dialect>_<kwarg> where the value will be assembled into the list of options.

      New in version 0.9.2.

      Changed in version 0.9.4: The DialectKWArgs.dialect_kwargs collection is now writable.

      See also

      DialectKWArgs.dialect_options - nested dictionary form

    • attribute sqlalchemy.schema.Column.dialect_options

      inherited from the DialectKWArgs.dialect_options attribute of DialectKWArgs

      A collection of keyword arguments specified as dialect-specific options to this construct.

      This is a two-level nested registry, keyed to <dialect_name> and <argument_name>. For example, the postgresql_where argument would be locatable as:

      arg = my_object.dialect_options['postgresql']['where']

      New in version 0.9.2.

      See also

      DialectKWArgs.dialect_kwargs - flat dictionary form

    • method sqlalchemy.schema.Column.distinct() → ColumnOperators

      inherited from the ColumnOperators.distinct() method of ColumnOperators

      Produce a distinct() clause against the parent object.

    • method endswith(other: Any, escape: Optional[str] = None, autoescape: bool = False) → ColumnOperators

      inherited from the ColumnOperators.endswith() method of ColumnOperators

      Implement the ‘endswith’ operator.

      Produces a LIKE expression that tests against a match for the end of a string value:

      column LIKE '%' || <other>

      E.g.:

      stmt = select(sometable).\
          where(sometable.c.column.endswith("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.endswith.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.endswith.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.endswith.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.endswith("foo%bar", autoescape=True)

          Will render as:

          somecolumn LIKE '%' || :param ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.endswith("foo/%bar", escape="^")

          Will render as:

          somecolumn LIKE '%' || :param ESCAPE '^'

          The parameter may also be combined with ColumnOperators.endswith.autoescape:

          somecolumn.endswith("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

      See also

      ColumnOperators.startswith()

      ColumnOperators.contains()

      ColumnOperators.like()
    • attribute expression

      inherited from the ColumnElement.expression attribute of ColumnElement

      Return a column expression.

      Part of the inspection interface; returns self.

    • attribute sqlalchemy.schema.Column.foreign_keys: Set[ForeignKey] = frozenset({})

      inherited from the ColumnElement.foreign_keys attribute of ColumnElement

      A collection of all ForeignKey marker objects associated with this Column.

      Each object is a member of a Table-wide ForeignKeyConstraint.

      See also

      Table.foreign_keys

    • method get_children(*, column_tables=False, **kw)

      inherited from the ColumnClause.get_children() method of ColumnClause

      Return immediate child HasTraverseInternals elements of this HasTraverseInternals.

      This is used for visit traversal.

      **kw may contain flags that change the collection that is returned, for example to return a subset of items in order to cut down on larger traversals, or to return child items from a different context (such as schema-level collections instead of clause-level).

    • method sqlalchemy.schema.Column.icontains(other: Any, **kw: Any) →

      inherited from the ColumnOperators.icontains() method of ColumnOperators

      Implement the icontains operator, e.g. case insensitive version of ColumnOperators.contains().

      Produces a LIKE expression that tests against an insensitive match for the middle of a string value:

      lower(column) LIKE '%' || lower(<other>) || '%'

      E.g.:

      stmt = select(sometable).\
          where(sometable.c.column.icontains("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.icontains.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.icontains.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.icontains.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.icontains("foo%bar", autoescape=True)

          Will render as:

          1. lower(somecolumn) LIKE '%' || lower(:param) || '%' ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.icontains("foo/%bar", escape="^")

          Will render as:

          1. lower(somecolumn) LIKE '%' || lower(:param) || '%' ESCAPE '^'

          The parameter may also be combined with ColumnOperators.icontains.autoescape:

          1. somecolumn.icontains("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

    1. See also
    2. [ColumnOperators.contains()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.contains "sqlalchemy.sql.expression.ColumnOperators.contains")
    • method iendswith(other: Any, escape: Optional[str] = None, autoescape: bool = False) → ColumnOperators

      inherited from the ColumnOperators.iendswith() method of ColumnOperators

      Implement the iendswith operator, e.g. case insensitive version of ColumnOperators.endswith().

      Produces a LIKE expression that tests against an insensitive match for the end of a string value:

      1. lower(column) LIKE '%' || lower(<other>)

      E.g.:

      1. stmt = select(sometable).\
      2. where(sometable.c.column.iendswith("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.iendswith.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.iendswith.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.iendswith.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.iendswith("foo%bar", autoescape=True)

          Will render as:

          1. lower(somecolumn) LIKE '%' || lower(:param) ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.iendswith("foo/%bar", escape="^")

          Will render as:

          1. lower(somecolumn) LIKE '%' || lower(:param) ESCAPE '^'

          The parameter may also be combined with ColumnOperators.iendswith.autoescape:

          1. somecolumn.iendswith("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

    1. See also
    2. [ColumnOperators.endswith()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.endswith "sqlalchemy.sql.expression.ColumnOperators.endswith")
    • method sqlalchemy.schema.Column.ilike(other: Any, escape: Optional[str] = None) → ColumnOperators

      Implement the ilike operator, e.g. case insensitive LIKE.

      In a column context, produces an expression either of the form:

      1. lower(a) LIKE lower(other)

      Or on backends that support the ILIKE operator:

      1. a ILIKE other

      E.g.:

      1. stmt = select(sometable).\
      2. where(sometable.c.column.ilike("%foobar%"))
      • Parameters:

        • other – expression to be compared

        • escape

          optional escape character, renders the ESCAPE keyword, e.g.:

          1. somecolumn.ilike("foo/%bar", escape="/")
    1. See also
    2. [ColumnOperators.like()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.like "sqlalchemy.sql.expression.ColumnOperators.like")
    • method sqlalchemy.schema.Column.in_(other: Any) → ColumnOperators

      inherited from the ColumnOperators.in_() method of ColumnOperators

      Implement the in operator.

      In a column context, produces the clause column IN <other>.

      The given parameter other may be:

      • A list of literal values, e.g.:

        1. stmt.where(column.in_([1, 2, 3]))

        In this calling form, the list of items is converted to a set of bound parameters the same length as the list given:

        1. WHERE COL IN (?, ?, ?)
      • A list of tuples may be provided if the comparison is against a tuple_() containing multiple expressions:

        1. from sqlalchemy import tuple_
        2. stmt.where(tuple_(col1, col2).in_([(1, 10), (2, 20), (3, 30)]))
      • An empty list, e.g.:

        1. stmt.where(column.in_([]))

        In this calling form, the expression renders an “empty set” expression. These expressions are tailored to individual backends and are generally trying to get an empty SELECT statement as a subquery. Such as on SQLite, the expression is:

        1. WHERE col IN (SELECT 1 FROM (SELECT 1) WHERE 1!=1)

        Changed in version 1.4: empty IN expressions now use an execution-time generated SELECT subquery in all cases.

      • A bound parameter, e.g. bindparam(), may be used if it includes the bindparam.expanding flag:

        1. stmt.where(column.in_(bindparam('value', expanding=True)))

        In this calling form, the expression renders a special non-SQL placeholder expression that looks like:

        1. WHERE COL IN ([EXPANDING_value])

        This placeholder expression is intercepted at statement execution time to be converted into the variable number of bound parameter form illustrated earlier. If the statement were executed as:

        1. connection.execute(stmt, {"value": [1, 2, 3]})

        The database would be passed a bound parameter for each value:

        1. WHERE COL IN (?, ?, ?)

        New in version 1.2: added “expanding” bound parameters

        If an empty list is passed, a special “empty list” expression, which is specific to the database in use, is rendered. On SQLite this would be:

        1. WHERE COL IN (SELECT 1 FROM (SELECT 1) WHERE 1!=1)

        New in version 1.3: “expanding” bound parameters now support empty lists

      • a select() construct, which is usually a correlated scalar select:

        1. stmt.where(
        2. column.in_(
        3. select(othertable.c.y).
        4. where(table.c.x == othertable.c.x)
        5. )
        6. )

        In this calling form, ColumnOperators.in_() renders as given:

        1. WHERE COL IN (SELECT othertable.y
        2. FROM othertable WHERE othertable.x = table.x)
      • Parameters:

        other – a list of literals, a select() construct, or a bindparam() construct that includes the bindparam.expanding flag set to True.

    • attribute sqlalchemy.schema.Column.index: Optional[bool]

      The value of the Column.index parameter.

      Does not indicate if this Column is actually indexed or not; use Table.indexes.

      See also

      Table.indexes

    • attribute info

      inherited from the SchemaItem.info attribute of SchemaItem

      Info dictionary associated with the object, allowing user-defined data to be associated with this SchemaItem.

      The dictionary is automatically generated when first accessed. It can also be specified in the constructor of some objects, such as Table and Column.

    • attribute inherit_cache: Optional[bool] = True

      Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

      The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it's appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

      This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

      See also

      - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

    • method is_(other: Any) → ColumnOperators

      inherited from the ColumnOperators.is_() method of ColumnOperators

      Implement the IS operator.

      Normally, IS is generated automatically when comparing to a value of None, which resolves to NULL. However, explicit usage of IS may be desirable if comparing to boolean values on certain platforms.
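
      As a brief illustrative sketch (assuming a table sometable with a nullable column value; these names are illustrative, not from the original text):

      1. # renders e.g. "SELECT ... WHERE sometable.value IS NULL"
      2. stmt = select(sometable).where(sometable.c.value.is_(None))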

      See also

      ColumnOperators.is_not()

    • method sqlalchemy.schema.Column.is_distinct_from(other: Any) → ColumnOperators

      inherited from the ColumnOperators.is_distinct_from() method of ColumnOperators

      Implement the IS DISTINCT FROM operator.

      Renders “a IS DISTINCT FROM b” on most platforms; on some such as SQLite may render “a IS NOT b”.

      New in version 1.1.

    • method sqlalchemy.schema.Column.is_not(other: Any) → ColumnOperators

      inherited from the ColumnOperators.is_not() method of ColumnOperators

      Implement the IS NOT operator.

      Normally, IS NOT is generated automatically when comparing to a value of None, which resolves to NULL. However, explicit usage of IS NOT may be desirable if comparing to boolean values on certain platforms.

      Changed in version 1.4: The is_not() operator is renamed from isnot() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.is_()

    • method is_not_distinct_from(other: Any) → ColumnOperators

      inherited from the ColumnOperators.is_not_distinct_from() method of ColumnOperators

      Implement the IS NOT DISTINCT FROM operator.

      Renders “a IS NOT DISTINCT FROM b” on most platforms; on some such as SQLite may render “a IS b”.

      Changed in version 1.4: The is_not_distinct_from() operator is renamed from isnot_distinct_from() in previous releases. The previous name remains available for backwards compatibility.

      New in version 1.1.

    • method isnot(other: Any) → ColumnOperators

      inherited from the ColumnOperators.isnot() method of ColumnOperators

      Implement the IS NOT operator.

      Normally, IS NOT is generated automatically when comparing to a value of None, which resolves to NULL. However, explicit usage of IS NOT may be desirable if comparing to boolean values on certain platforms.

      Changed in version 1.4: The is_not() operator is renamed from isnot() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.is_()

    • method sqlalchemy.schema.Column.isnot_distinct_from(other: Any) → ColumnOperators

      inherited from the ColumnOperators.isnot_distinct_from() method of ColumnOperators

      Implement the IS NOT DISTINCT FROM operator.

      Renders “a IS NOT DISTINCT FROM b” on most platforms; on some such as SQLite may render “a IS b”.

      Changed in version 1.4: The is_not_distinct_from() operator is renamed from isnot_distinct_from() in previous releases. The previous name remains available for backwards compatibility.

      New in version 1.1.

    • method sqlalchemy.schema.Column.istartswith(other: Any, escape: Optional[str] = None, autoescape: bool = False) → ColumnOperators

      inherited from the ColumnOperators.istartswith() method of ColumnOperators

      Implement the istartswith operator, e.g. case insensitive version of ColumnOperators.startswith().

      Produces a LIKE expression that tests against an insensitive match for the start of a string value:

      1. lower(column) LIKE lower(<other>) || '%'

      E.g.:

      1. stmt = select(sometable).\
      2. where(sometable.c.column.istartswith("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.istartswith.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.istartswith.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.istartswith.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.istartswith("foo%bar", autoescape=True)

          Will render as:

          1. lower(somecolumn) LIKE lower(:param) || '%' ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.istartswith("foo/%bar", escape="^")

          Will render as:

          1. lower(somecolumn) LIKE lower(:param) || '%' ESCAPE '^'

          The parameter may also be combined with ColumnOperators.istartswith.autoescape:

          1. somecolumn.istartswith("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

    1. See also
    2. [ColumnOperators.startswith()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.startswith "sqlalchemy.sql.expression.ColumnOperators.startswith")
    • attribute key: str = None

      inherited from the ColumnElement.key attribute of ColumnElement

      The ‘key’ that in some circumstances refers to this object in a Python namespace.

      This typically refers to the “key” of the column as present in the .c collection of a selectable, e.g. sometable.c["somekey"] would return a Column with a .key of “somekey”.

    • attribute kwargs

      inherited from the DialectKWArgs.kwargs attribute of DialectKWArgs

      A synonym for DialectKWArgs.dialect_kwargs.

    • method label(name: Optional[str]) → Label[_T]

      inherited from the ColumnElement.label() method of ColumnElement

      Produce a column label, i.e. <columnname> AS <name>.

      This is a shortcut to the label() function.

      If ‘name’ is None, an anonymous label name will be generated.
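
      For example, a short sketch (using the employees table shown earlier in this document):

      1. # SELECT employees.employee_name AS name FROM employees
      2. stmt = select(employees.c.employee_name.label("name"))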

    • method sqlalchemy.schema.Column.like(other: Any, escape: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.like() method of ColumnOperators

      Implement the like operator.

      In a column context, produces the expression:

      1. a LIKE other

      E.g.:

      1. stmt = select(sometable).\
      2. where(sometable.c.column.like("%foobar%"))
      • Parameters:

        • other – expression to be compared

        • escape

          optional escape character, renders the ESCAPE keyword, e.g.:

          1. somecolumn.like("foo/%bar", escape="/")
    1. See also
    2. [ColumnOperators.ilike()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.ilike "sqlalchemy.sql.expression.ColumnOperators.ilike")
    • method sqlalchemy.schema.Column.match(other: Any, **kwargs: Any) → ColumnOperators

      inherited from the ColumnOperators.match() method of ColumnOperators

      Implements a database-specific ‘match’ operator.

      ColumnOperators.match() attempts to resolve to a MATCH-like function or operator provided by the backend. Examples include:

      • PostgreSQL - renders x @@ plainto_tsquery(y)

      • MySQL - renders MATCH (x) AGAINST (y IN BOOLEAN MODE)

        See also

        match() - MySQL specific construct with additional features.

      • Oracle - renders CONTAINS(x, y)

      • other backends may provide special implementations.

      • Backends without any special implementation will emit the operator as “MATCH”. This is compatible with SQLite, for example.
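
      As a rough sketch (assuming a table sometable with a text column description; these names are illustrative, and the rendered SQL depends on the backend as described above):

      1. # e.g. on MySQL: MATCH (sometable.description) AGAINST (:param IN BOOLEAN MODE)
      2. stmt = select(sometable).where(sometable.c.description.match("some keywords"))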

    • method sqlalchemy.schema.Column.not_ilike(other: Any, escape: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.not_ilike() method of ColumnOperators

      implement the NOT ILIKE operator.

      This is equivalent to using negation with ColumnOperators.ilike(), i.e. ~x.ilike(y).

      Changed in version 1.4: The not_ilike() operator is renamed from notilike() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.ilike()

    • method sqlalchemy.schema.Column.not_in(other: Any) → ColumnOperators

      inherited from the ColumnOperators.not_in() method of ColumnOperators

      implement the NOT IN operator.

      This is equivalent to using negation with ColumnOperators.in_(), i.e. ~x.in_(y).

      In the case that other is an empty sequence, the compiler produces an “empty not in” expression. This defaults to the expression “1 = 1” to produce true in all cases. The create_engine.empty_in_strategy parameter may be used to alter this behavior.

      Changed in version 1.4: The not_in() operator is renamed from notin_() in previous releases. The previous name remains available for backwards compatibility.

      Changed in version 1.2: The ColumnOperators.in_() and ColumnOperators.not_in() operators now produce a “static” expression for an empty IN sequence by default.

      See also

      ColumnOperators.in_()

    • method not_like(other: Any, escape: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.not_like() method of ColumnOperators

      implement the NOT LIKE operator.

      This is equivalent to using negation with ColumnOperators.like(), i.e. ~x.like(y).

      Changed in version 1.4: The not_like() operator is renamed from notlike() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.like()

    • method notilike(other: Any, escape: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.notilike() method of ColumnOperators

      implement the NOT ILIKE operator.

      This is equivalent to using negation with ColumnOperators.ilike(), i.e. ~x.ilike(y).

      Changed in version 1.4: The not_ilike() operator is renamed from notilike() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.ilike()

    • method notin_(other: Any) → ColumnOperators

      inherited from the ColumnOperators.notin_() method of ColumnOperators

      implement the NOT IN operator.

      This is equivalent to using negation with ColumnOperators.in_(), i.e. ~x.in_(y).

      In the case that other is an empty sequence, the compiler produces an “empty not in” expression. This defaults to the expression “1 = 1” to produce true in all cases. The create_engine.empty_in_strategy may be used to alter this behavior.

      Changed in version 1.4: The not_in() operator is renamed from notin_() in previous releases. The previous name remains available for backwards compatibility.

      Changed in version 1.2: The ColumnOperators.in_() and ColumnOperators.not_in() operators now produce a “static” expression for an empty IN sequence by default.

      See also

      ColumnOperators.in_()

    • method sqlalchemy.schema.Column.notlike(other: Any, escape: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.notlike() method of ColumnOperators

      implement the NOT LIKE operator.

      This is equivalent to using negation with ColumnOperators.like(), i.e. ~x.like(y).

      Changed in version 1.4: The not_like() operator is renamed from notlike() in previous releases. The previous name remains available for backwards compatibility.

      See also

      ColumnOperators.like()

    • method sqlalchemy.schema.Column.nulls_first() → ColumnOperators

      inherited from the ColumnOperators.nulls_first() method of ColumnOperators

      Produce a nulls_first() clause against the parent object.

      Changed in version 1.4: The nulls_first() operator is renamed from nullsfirst() in previous releases. The previous name remains available for backwards compatibility.

    • method nulls_last() → ColumnOperators

      inherited from the ColumnOperators.nulls_last() method of ColumnOperators

      Produce a nulls_last() clause against the parent object.

      Changed in version 1.4: The nulls_last() operator is renamed from nullslast() in previous releases. The previous name remains available for backwards compatibility.

    • method sqlalchemy.schema.Column.nullsfirst() → ColumnOperators

      inherited from the ColumnOperators.nullsfirst() method of ColumnOperators

      Produce a nulls_first() clause against the parent object.

      Changed in version 1.4: The nulls_first() operator is renamed from nullsfirst() in previous releases. The previous name remains available for backwards compatibility.

    • method nullslast() → ColumnOperators

      inherited from the ColumnOperators.nullslast() method of ColumnOperators

      Produce a nulls_last() clause against the parent object.

      Changed in version 1.4: The nulls_last() operator is renamed from nullslast() in previous releases. The previous name remains available for backwards compatibility.

    • method sqlalchemy.schema.Column.op(opstring: str, precedence: int = 0, is_comparison: bool = False, return_type: Optional[Union[Type[TypeEngine[Any]], TypeEngine[Any]]] = None, python_impl: Optional[Callable[…, Any]] = None) → Callable[[Any], Operators]

      inherited from the Operators.op() method of Operators

      Produce a generic operator function.

      e.g.:

      1. somecolumn.op("*")(5)

      produces:

      1. somecolumn * 5

      This function can also be used to make bitwise operators explicit. For example:

      1. somecolumn.op('&')(0xff)

      is a bitwise AND of the value in somecolumn.

      • Parameters:

        • opstring – a string which will be output as the infix operator between this element and the expression passed to the generated function.

        • precedence

          precedence which the database is expected to apply to the operator in SQL expressions. This integer value acts as a hint for the SQL compiler to know when explicit parenthesis should be rendered around a particular operation. A lower number will cause the expression to be parenthesized when applied against another operator with higher precedence. The default value of 0 is lower than all operators except for the comma (,) and AS operators. A value of 100 will be higher or equal to all operators, and -100 will be lower than or equal to all operators.

          See also

          I’m using op() to generate a custom operator and my parenthesis are not coming out correctly - detailed description of how the SQLAlchemy SQL compiler renders parenthesis

        • is_comparison

          legacy; if True, the operator will be considered as a “comparison” operator, that is which evaluates to a boolean true/false value, like ==, >, etc. This flag is provided so that ORM relationships can establish that the operator is a comparison operator when used in a custom join condition.

          Using the is_comparison parameter is superseded by using the Operators.bool_op() method instead; this more succinct operator sets this parameter automatically, but also provides correct PEP 484 typing support as the returned object will express a “boolean” datatype, i.e. BinaryExpression[bool].

        • return_type – a TypeEngine class or object that will force the return type of an expression produced by this operator to be of that type. By default, operators that specify Operators.op.is_comparison will resolve to Boolean, and those that do not will be of the same type as the left-hand operand.

        • python_impl

          an optional Python function that can evaluate two Python values in the same way as this operator works when run on the database server. Useful for in-Python SQL expression evaluation functions, such as for ORM hybrid attributes, and the ORM “evaluator” used to match objects in a session after a multi-row update or delete.

          e.g.:

          1. >>> expr = column('x').op('+', python_impl=lambda a, b: a + b)('y')

          The operator for the above expression will also work for non-SQL left and right objects:

          1. >>> expr.operator(5, 10)
          2. 15

          New in version 2.0.

    1. See also
    2. [Operators.bool\_op()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.Operators.bool_op "sqlalchemy.sql.expression.Operators.bool_op")
    3. [Redefining and Creating New Operators]($e8ad009010586d59.md#types-operators)
    4. [Using custom operators in join conditions]($b68ea79e4b407a37.md#relationship-custom-operator)
    • method sqlalchemy.schema.Column.operate(op: OperatorType, *other: Any, **kwargs: Any) → ColumnElement[Any]

      inherited from the ColumnElement.operate() method of ColumnElement

      Operate on an argument.

      This is the lowest level of operation, raises NotImplementedError by default.

      Overriding this on a subclass can allow common behavior to be applied to all operations. For example, overriding ColumnOperators to apply func.lower() to the left and right side:

      1. class MyComparator(ColumnOperators):
      2. def operate(self, op, other, **kwargs):
      3. return op(func.lower(self), func.lower(other), **kwargs)
      • Parameters:

        • op – Operator callable.

        • *other – the ‘other’ side of the operation. Will be a single scalar for most operations.

        • **kwargs – modifiers. These may be passed by special operators such as ColumnOperators.contains().

    • method params(*optionaldict, **kwargs)

      inherited from the Immutable.params() method of Immutable

      Return a copy with bindparam() elements replaced.

      Returns a copy of this ClauseElement with bindparam() elements replaced with values taken from the given dictionary:

      1. >>> clause = column('x') + bindparam('foo')
      2. >>> print(clause.compile().params)
      3. {'foo':None}
      4. >>> print(clause.params({'foo':7}).compile().params)
      5. {'foo':7}
    • attribute sqlalchemy.schema.Column.proxy_set: util.generic_fn_descriptor[FrozenSet[Any]]

      inherited from the ColumnElement.proxy_set attribute of ColumnElement

      Set of all columns this ColumnElement is proxying.

      As of 2.0 this is explicitly de-annotated columns; previously it was effectively de-annotated columns but this was not enforced. Annotated columns should basically not go into sets if at all possible, because their hashing behavior is very non-performant.

    • method references(column: Column[Any]) → bool

      Return True if this Column references the given column via foreign key.
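
      For example, a brief sketch (assuming a departments table matching the ForeignKey("departments.department_id") reference used with the employees table earlier in this document):

      1. # True, since employees.employee_dept declares a ForeignKey to departments.department_id
      2. employees.c.employee_dept.references(departments.c.department_id)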

    • method regexp_match(pattern: Any, flags: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.regexp_match() method of ColumnOperators

      Implements a database-specific ‘regexp match’ operator.

      E.g.:

      1. stmt = select(table.c.some_column).where(
      2. table.c.some_column.regexp_match('^(b|c)')
      3. )

      ColumnOperators.regexp_match() attempts to resolve to a REGEXP-like function or operator provided by the backend, however the specific regular expression syntax and flags available are not backend agnostic.

      Examples include:

      • PostgreSQL - renders x ~ y or x !~ y when negated.

      • Oracle - renders REGEXP_LIKE(x, y)

      • SQLite - uses SQLite’s REGEXP placeholder operator and calls into the Python re.match() builtin.

      • other backends may provide special implementations.

      • Backends without any special implementation will emit the operator as “REGEXP” or “NOT REGEXP”. This is compatible with SQLite and MySQL, for example.

      Regular expression support is currently implemented for Oracle, PostgreSQL, MySQL and MariaDB. Partial support is available for SQLite. Support among third-party dialects may vary.

      • Parameters:

        • pattern – The regular expression pattern string or column clause.

        • flags – Any regular expression string flags to apply. Flags tend to be backend specific. It can be a string or a column clause. Some backends, like PostgreSQL and MariaDB, may alternatively specify the flags as part of the pattern. When using the ignore case flag ‘i’ in PostgreSQL, the ignore case regexp match operator ~* or !~* will be used.

    1. New in version 1.4.
    2. See also
    3. [ColumnOperators.regexp\_replace()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.regexp_replace "sqlalchemy.sql.expression.ColumnOperators.regexp_replace")
    • method sqlalchemy.schema.Column.regexp_replace(pattern: Any, replacement: Any, flags: Optional[str] = None) → ColumnOperators

      inherited from the ColumnOperators.regexp_replace() method of ColumnOperators

      Implements a database-specific ‘regexp replace’ operator.

      E.g.:

      1. stmt = select(
      2. table.c.some_column.regexp_replace(
      3. 'b(..)',
      4. 'XY',
      5. flags='g'
      6. )
      7. )

      ColumnOperators.regexp_replace() attempts to resolve to a REGEXP_REPLACE-like function provided by the backend, that usually emit the function REGEXP_REPLACE(). However, the specific regular expression syntax and flags available are not backend agnostic.

      Regular expression replacement support is currently implemented for Oracle, PostgreSQL, MySQL 8 or greater and MariaDB. Support among third-party dialects may vary.

      • Parameters:

        • pattern – The regular expression pattern string or column clause.

        • replacement – The replacement string or column clause.

        • flags – Any regular expression string flags to apply. Flags tend to be backend specific. It can be a string or a column clause. Some backends, like PostgreSQL and MariaDB, may alternatively specify the flags as part of the pattern.

    1. New in version 1.4.
    2. See also
    3. [ColumnOperators.regexp\_match()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.regexp_match "sqlalchemy.sql.expression.ColumnOperators.regexp_match")
    • method reverse_operate(op: OperatorType, other: Any, **kwargs: Any) → ColumnElement[Any]

      inherited from the ColumnElement.reverse_operate() method of ColumnElement

      Reverse operate on an argument.

      Usage is the same as operate().

    • method sqlalchemy.schema.Column.self_group(against: Optional[OperatorType] = None) → ColumnElement[Any]

      inherited from the ColumnElement.self_group() method of ColumnElement

      Apply a ‘grouping’ to this ClauseElement.

      This method is overridden by subclasses to return a “grouping” construct, i.e. parenthesis. In particular it’s used by “binary” expressions to provide a grouping around themselves when placed into a larger expression, as well as by select() constructs when placed into the FROM clause of another select(). (Note that subqueries should be normally created using the Select.alias() method, as many platforms require nested SELECT statements to be named).

      As expressions are composed together, the application of self_group() is automatic - end-user code should never need to use this method directly. Note that SQLAlchemy’s clause constructs take operator precedence into account - so parenthesis might not be needed, for example, in an expression like x OR (y AND z) - AND takes precedence over OR.

      The base method of ClauseElement just returns self.

    • method shares_lineage(othercolumn: ColumnElement[Any]) → bool

      inherited from the ColumnElement.shares_lineage() method of ColumnElement

      Return True if the given ColumnElement has a common ancestor to this ColumnElement.

    • method startswith(other: Any, escape: Optional[str] = None, autoescape: bool = False) → ColumnOperators

      inherited from the ColumnOperators.startswith() method of ColumnOperators

      Implement the startswith operator.

      Produces a LIKE expression that tests against a match for the start of a string value:

      1. column LIKE <other> || '%'

      E.g.:

      1. stmt = select(sometable).\
      2. where(sometable.c.column.startswith("foobar"))

      Since the operator uses LIKE, wildcard characters "%" and "_" that are present inside the <other> expression will behave like wildcards as well. For literal string values, the ColumnOperators.startswith.autoescape flag may be set to True to apply escaping to occurrences of these characters within the string value so that they match as themselves and not as wildcard characters. Alternatively, the ColumnOperators.startswith.escape parameter will establish a given character as an escape character which can be of use when the target expression is not a literal string.

      • Parameters:

        • other – expression to be compared. This is usually a plain string value, but can also be an arbitrary SQL expression. LIKE wildcard characters % and _ are not escaped by default unless the ColumnOperators.startswith.autoescape flag is set to True.

        • autoescape

          boolean; when True, establishes an escape character within the LIKE expression, then applies it to all occurrences of "%", "_" and the escape character itself within the comparison value, which is assumed to be a literal string and not a SQL expression.

          An expression such as:

          1. somecolumn.startswith("foo%bar", autoescape=True)

          Will render as:

          1. somecolumn LIKE :param || '%' ESCAPE '/'

          With the value of :param as "foo/%bar".

        • escape

          a character which when given will render with the ESCAPE keyword to establish that character as the escape character. This character can then be placed preceding occurrences of % and _ to allow them to act as themselves and not wildcard characters.

          An expression such as:

          1. somecolumn.startswith("foo/%bar", escape="^")

          Will render as:

          1. somecolumn LIKE :param || '%' ESCAPE '^'

          The parameter may also be combined with ColumnOperators.startswith.autoescape:

          1. somecolumn.startswith("foo%bar^bat", escape="^", autoescape=True)

          Where above, the given literal parameter will be converted to "foo^%bar^^bat" before being passed to the database.

    1. See also
    2. [ColumnOperators.endswith()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.endswith "sqlalchemy.sql.expression.ColumnOperators.endswith")
    3. [ColumnOperators.contains()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.contains "sqlalchemy.sql.expression.ColumnOperators.contains")
    4. [ColumnOperators.like()]($aafca12b71ff5dd3.md#sqlalchemy.sql.expression.ColumnOperators.like "sqlalchemy.sql.expression.ColumnOperators.like")
    • attribute timetuple: Literal[None] = None

      inherited from the ColumnOperators.timetuple attribute of ColumnOperators

      Hack, allows datetime objects to be compared on the LHS.

    • attribute sqlalchemy.schema.Column.unique: Optional[bool]

      The value of the Column.unique parameter.

      Does not indicate if this Column is actually subject to a unique constraint or not; use Table.indexes and Table.constraints.

      See also

      Table.constraints.

    • method unique_params(*optionaldict, **kwargs)

      inherited from the Immutable.unique_params() method of Immutable

      Return a copy with bindparam() elements replaced.

      Same functionality as params(), except adds unique=True to affected bind parameters so that multiple statements can be used.

    class sqlalchemy.schema.MetaData

    A collection of Table objects and their associated schema constructs.

    Holds a collection of Table objects as well as an optional binding to an Engine or Connection. If bound, the Table objects in the collection and their columns may participate in implicit SQL execution.

    The Table objects themselves are stored in the MetaData.tables dictionary.

    MetaData is a thread-safe object for read operations. Construction of new tables within a single MetaData object, either explicitly or via reflection, may not be completely thread-safe.

    See also

    Describing Databases with MetaData - Introduction to database metadata

    Members

    __init__(), clear(), create_all(), drop_all(), reflect(), remove(), sorted_tables, tables

    Class signature

    class sqlalchemy.schema.MetaData (sqlalchemy.schema.HasSchemaAttr)

    • method __init__(schema: Optional[str] = None, quote_schema: Optional[bool] = None, naming_convention: Optional[Dict[str, str]] = None, info: Optional[_InfoType] = None) → None

      Create a new MetaData object.

      • Parameters:

        • schema

          The default schema to use for the Table, Sequence, and potentially other objects associated with this MetaData. Defaults to None.

          See also

          Specifying a Default Schema Name with MetaData - details on how the MetaData.schema parameter is used.

          Sequence.schema

        • quote_schema – Sets the quote_schema flag for those Table, Sequence, and other objects which make usage of the local schema name.

        • info

          Optional data dictionary which will be populated into the MetaData.info attribute of this object.

          New in version 1.0.0.

        • naming_convention

          a dictionary referring to values which will establish default naming conventions for Constraint and Index objects, for those objects which are not given a name explicitly.

          The keys of this dictionary may be:

          • a Constraint or Index class, e.g. the UniqueConstraint class or the Index class

          • a string mnemonic for one of the known constraint classes; "fk", "pk", "ix", "ck", "uq" for foreign key, primary key, index, check, and unique constraint, respectively.

          • the string name of a user-defined “token” that can be used to define new naming tokens.

          The values associated with each “constraint class” or “constraint mnemonic” key are string naming templates, such as "uq_%(table_name)s_%(column_0_name)s", which describe how the name should be composed. The values associated with user-defined “token” keys should be callables of the form fn(constraint, table), which accepts the constraint/index object and Table as arguments, returning a string result.

          The built-in names are as follows, some of which may only be available for certain types of constraint:

          New in version 1.3.0: - added new %(column_0N_name)s, %(column_0_N_name)s, and related tokens that produce concatenations of names, keys, or labels for all columns referred to by a given constraint.

          See also

          Configuring Constraint Naming Conventions - for detailed usage examples.
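
          As an illustrative sketch (one possible convention dictionary using the mnemonics listed above; the templates shown are examples rather than defaults):

          1. metadata_obj = MetaData(naming_convention={
          2. "ix": "ix_%(column_0_label)s",
          3. "uq": "uq_%(table_name)s_%(column_0_name)s",
          4. "ck": "ck_%(table_name)s_%(constraint_name)s",
          5. "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
          6. "pk": "pk_%(table_name)s"
          7. })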

    • method clear() → None

      Clear all Table objects from this MetaData.

    • method sqlalchemy.schema.MetaData.create_all(bind: _CreateDropBind, tables: Optional[_typing_Sequence[Table]] = None, checkfirst: bool = True) → None

      Create all tables stored in this metadata.

      Conditional by default, will not attempt to recreate tables already present in the target database.

      • Parameters:

        • bind – A Connection or Engine used to access the database.

        • tables – Optional list of Table objects, which is a subset of the total tables in the MetaData (others are ignored).

        • checkfirst – Defaults to True, don’t issue CREATEs for tables already present in the target database.
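
      For example, a minimal sketch (assuming an Engine named engine, as in other examples in this document):

      1. # emit CREATE TABLE statements for any tables not already present
      2. metadata_obj.create_all(engine)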

    • method sqlalchemy.schema.MetaData.drop_all(bind: _CreateDropBind, tables: Optional[_typing_Sequence[Table]] = None, checkfirst: bool = True) → None

      Drop all tables stored in this metadata.

      Conditional by default, will not attempt to drop tables not present in the target database.

      • Parameters:

        • bind – A Connection or Engine used to access the database.

        • tables – Optional list of Table objects, which is a subset of the total tables in the MetaData (others are ignored).

        • checkfirst – Defaults to True, only issue DROPs for tables confirmed to be present in the target database.

    • method sqlalchemy.schema.MetaData.reflect(bind: Union[Engine, Connection], schema: Optional[str] = None, views: bool = False, only: Optional[_typing_Sequence[str]] = None, extend_existing: bool = False, autoload_replace: bool = True, resolve_fks: bool = True, **dialect_kwargs: Any) → None

      Load all available table definitions from the database.

      Automatically creates Table entries in this MetaData for any table available in the database but not yet present in the MetaData. May be called multiple times to pick up tables recently added to the database, however no special action is taken if a table in this MetaData no longer exists in the database.

      • Parameters:

        • bind – A Connection or Engine used to access the database.

        • schema – Optional, query and reflect tables from an alternate schema. If None, the schema associated with this MetaData is used, if any.

        • views – If True, also reflect views (materialized and plain).

        • only

          Optional. Load only a sub-set of available named tables. May be specified as a sequence of names or a callable.

          If a sequence of names is provided, only those tables will be reflected. An error is raised if a table is requested but not available. Named tables already present in this MetaData are ignored.

          If a callable is provided, it will be used as a boolean predicate to filter the list of potential table names. The callable is called with a table name and this MetaData instance as positional arguments and should return a true value for any table to reflect.

        • extend_existing

          Passed along to each Table as Table.extend_existing.

          New in version 0.9.1.

        • autoload_replace

          Passed along to each Table as Table.autoload_replace.

          New in version 0.9.1.

        • resolve_fks

          if True, reflect Table objects linked to ForeignKey objects located in each Table. For MetaData.reflect, this has the effect of reflecting related tables that might otherwise not be in the list of tables being reflected, for example if the referenced table is in a different schema or is omitted via the MetaData.reflect.only parameter. When False, ForeignKey objects are not followed to the Table in which they link, however if the related table is also part of the list of tables that would be reflected in any case, the ForeignKey object will still resolve to its related Table after the reflection operation is complete. Defaults to True.

          New in version 1.3.0.

          See also

          Table.resolve_fks

        • **dialect_kwargs – Additional keyword arguments are dialect specific, and passed in the form <dialectname>_<argname>.
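
      As a brief sketch (assuming an existing database, reachable through an Engine named engine, that contains a table named user):

      1. metadata_obj = MetaData()
      2. # build Table entries for every table found in the database
      3. metadata_obj.reflect(bind=engine)
      4. user_table = metadata_obj.tables["user"]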

    • method remove(table: Table) → None

      Remove the given Table object from this MetaData.

    • attribute sorted_tables

      Returns a list of Table objects sorted in order of foreign key dependency.

      The sorting will place objects that have dependencies first, before the dependencies themselves, representing the order in which they can be created. To get the order in which the tables would be dropped, use the reversed() Python built-in.
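
      As a brief sketch (assuming a populated MetaData named metadata_obj):

      1. # order suitable for emitting CREATE statements (dependencies first)
      2. create_order = metadata_obj.sorted_tables
      3. # order suitable for emitting DROP statements
      4. drop_order = list(reversed(metadata_obj.sorted_tables))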

      Warning

      The MetaData.sorted_tables attribute cannot by itself accommodate automatic resolution of dependency cycles between tables, which are usually caused by mutually dependent foreign key constraints. When these cycles are detected, the foreign keys of these tables are omitted from consideration in the sort. A warning is emitted when this condition occurs, which will be an exception raise in a future release. Tables which are not part of the cycle will still be returned in dependency order.

      To resolve these cycles, the ForeignKeyConstraint.use_alter parameter may be applied to those constraints which create a cycle. Alternatively, the sort_tables_and_constraints() function will automatically return foreign key constraints in a separate collection when cycles are detected so that they may be applied to a schema separately.

      Changed in version 1.3.17: - a warning is emitted when MetaData.sorted_tables cannot perform a proper sort due to cyclical dependencies. This will be an exception in a future release. Additionally, the sort will continue to return other tables not involved in the cycle in dependency order which was not the case previously.

      See also

      sort_tables()

      MetaData.tables

      Inspector.get_sorted_table_and_fkc_names()

    • attribute tables: util.FacadeDict[str, Table]

      A dictionary of Table objects keyed to their name or “table key”.

      The exact key is that determined by the Table.key attribute; for a table with no Table.schema attribute, this is the same as Table.name. For a table with a schema, it is typically of the form schemaname.tablename.

      See also

      MetaData.sorted_tables

    class sqlalchemy.schema.SchemaConst

    An enumeration.

    Members

    BLANK_SCHEMA, NULL_UNSPECIFIED, RETAIN_SCHEMA

    Class signature

    class sqlalchemy.schema.SchemaConst (enum.Enum)

    • attribute BLANK_SCHEMA = 2

      Symbol indicating that a Table or Sequence should have ‘None’ for its schema, even if the parent MetaData has specified a schema.

      See also

      Table.schema

      New in version 1.0.14.

    • attribute sqlalchemy.schema.SchemaConst.NULL_UNSPECIFIED = 3

      Symbol indicating the “nullable” keyword was not passed to a Column.

      This is used to distinguish between the use case of passing nullable=None to a Column, which has special meaning on some backends such as SQL Server.

    • attribute sqlalchemy.schema.SchemaConst.RETAIN_SCHEMA = 1

      Symbol indicating that a Table, Sequence or in some cases a ForeignKey object, in situations where the object is being copied for a Table.to_metadata() operation, should retain the schema name that it already has.

    class sqlalchemy.schema.SchemaItem

    Base class for items that define a database schema.

    Members

    info

    Class signature

    class sqlalchemy.schema.SchemaItem (sqlalchemy.sql.expression.SchemaEventTarget, sqlalchemy.sql.visitors.Visitable)

    • attribute sqlalchemy.schema.SchemaItem.info

      Info dictionary associated with the object, allowing user-defined data to be associated with this .

      The dictionary is automatically generated when first accessed. It can also be specified in the constructor of some objects, such as Table and Column.

    class sqlalchemy.schema.Table

    Represent a table in a database.

    e.g.:

    1. mytable = Table(
    2. "mytable", metadata,
    3. Column('mytable_id', Integer, primary_key=True),
    4. Column('value', String(50))
    5. )

    The Table object constructs a unique instance of itself based on its name and optional schema name within the given MetaData object. Calling the Table constructor with the same name and same MetaData argument a second time will return the same Table object - in this way the Table constructor acts as a registry function.
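
    As a brief sketch of this registry behavior, continuing from the mytable example above:

    1. # calling the constructor again with the same name and MetaData
    2. # returns the already-defined Table rather than a new one
    3. same_table = Table("mytable", metadata)
    4. assert same_table is mytable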

    See also

    Describing Databases with MetaData - Introduction to database metadata

    Members

    __init__(), add_is_dependent_on(), alias(), append_column(), append_constraint(), argument_for(), columns, compile(), corresponding_column(), delete(), dialect_kwargs, drop(), exported_columns, foreign_keys, implicit_returning, info, insert(), join(), kwargs, outerjoin(), primary_key, schema, self_group(), tablesample(), tometadata(), update()

    Class signature

    class sqlalchemy.schema.Table (sqlalchemy.sql.base.DialectKWArgs, sqlalchemy.schema.HasSchemaAttr, sqlalchemy.sql.expression.TableClause, sqlalchemy.inspection.Inspectable)

    • method sqlalchemy.schema.Table.__init__(name: str, metadata: MetaData, *args: SchemaItem, schema: Optional[Union[str, Literal[SchemaConst.BLANK_SCHEMA]]] = None, quote: Optional[bool] = None, quote_schema: Optional[bool] = None, autoload_with: Optional[Union[Engine, Connection]] = None, autoload_replace: bool = True, keep_existing: bool = False, extend_existing: bool = False, resolve_fks: bool = True, include_columns: Optional[Collection[str]] = None, implicit_returning: bool = True, comment: Optional[str] = None, info: Optional[Dict[Any, Any]] = None, listeners: Optional[_typing_Sequence[Tuple[str, Callable[…, Any]]]] = None, prefixes: Optional[_typing_Sequence[str]] = None, _extend_on: Optional[Set[Table]] = None, _no_init: bool = True, **kw: Any) → None

      Constructor for Table.

      • Parameters:

        • name

          The name of this table as represented in the database.

          The table name, along with the value of the schema parameter, forms a key which uniquely identifies this Table within the owning MetaData collection. Additional calls to Table with the same name, metadata, and schema name will return the same Table object.

          Names which contain no upper case characters will be treated as case insensitive names, and will not be quoted unless they are a reserved word or contain special characters. A name with any number of upper case characters is considered to be case sensitive, and will be sent as quoted.

          To enable unconditional quoting for the table name, specify the flag quote=True to the constructor, or use the quoted_name construct to specify the name.

        • metadata – a MetaData object which will contain this table. The metadata is used as a point of association of this table with other tables which are referenced via foreign key. It also may be used to associate this table with a particular Connection or Engine.

        • *args – Additional positional arguments are used primarily to add the list of Column objects contained within this table. Similar to the style of a CREATE TABLE statement, other constructs may be added here, including PrimaryKeyConstraint and ForeignKeyConstraint.

        • autoload_replace

          Defaults to True; when using Table.autoload_with in conjunction with Table.extend_existing, indicates that Column objects present in the already-existing Table object should be replaced with columns of the same name retrieved from the autoload process. When False, columns already present under existing names will be omitted from the reflection process.

          Note that this setting does not impact Column objects specified programmatically within the call to Table that also is autoloading; those Column objects will always replace existing columns of the same name when Table.extend_existing is True.

          See also

          Table.autoload_with

        • autoload_with – An Engine or Connection object, or an Inspector object as returned by inspect() against one, with which this Table object will be reflected. When set to a non-None value, the autoload process will take place for this table against the given engine or connection.

        • extend_existing

          When True, indicates that if this Table is already present in the given MetaData, apply further arguments within the constructor to the existing Table.

          If Table.extend_existing or Table.keep_existing are not set, and the given name of the new Table refers to a Table that is already present in the target MetaData collection, and this Table specifies additional columns or other constructs or flags that modify the table’s state, an error is raised. The purpose of these two mutually-exclusive flags is to specify what action should be taken when a Table is specified that matches an existing Table, yet specifies additional constructs.

          Table.extend_existing will also work in conjunction with Table.autoload_with to run a new reflection operation against the database, even if a Table of the same name is already present in the target MetaData; newly reflected Column objects and other options will be added into the state of the Table, potentially overwriting existing columns and options of the same name.

          As is always the case with Table.autoload_with, Column objects can be specified in the same Table constructor, which will take precedence. Below, the existing table mytable will be augmented with Column objects both reflected from the database, as well as the given Column named “y”:

          1. Table("mytable", metadata,
          2. Column('y', Integer),
          3. extend_existing=True,
          4. autoload_with=engine
          5. )

          See also

          Table.autoload_replace

        • implicit_returning

          True by default - indicates that RETURNING can be used, typically by the ORM, in order to fetch server-generated values such as primary key values and server side defaults, on those backends which support RETURNING.

          In modern SQLAlchemy there is generally no reason to alter this setting, except for some backend specific cases (see Triggers in the SQL Server dialect documentation for one such example).

        • include_columns – A list of strings indicating a subset of columns to be loaded via the autoload operation; table columns that aren’t present in this list will not be represented on the resulting Table object. Defaults to None which indicates all columns should be reflected.

        • resolve_fks

          Whether or not to reflect Table objects related to this one via ForeignKey objects, when Table.autoload_with is specified. Defaults to True. Set to False to disable reflection of related tables as ForeignKey objects are encountered; may be used either to save on SQL calls or to avoid issues with related tables that can’t be accessed. Note that if a related table is already present in the MetaData collection, or becomes present later, a ForeignKey object associated with this Table will resolve to that table normally.

          New in version 1.3.

          See also

          MetaData.reflect.resolve_fks

        • info – Optional data dictionary which will be populated into the SchemaItem.info attribute of this object.

        • keep_existing

          When True, indicates that if this Table is already present in the given MetaData, ignore further arguments within the constructor to the existing Table, and return the Table object as originally created. This is to allow a function that wishes to define a new Table on first call, but on subsequent calls will return the same Table, without any of the declarations (particularly constraints) being applied a second time.

          If Table.extend_existing or Table.keep_existing are not set, and the given name of the new Table refers to a Table that is already present in the target MetaData collection, and this Table specifies additional columns or other constructs or flags that modify the table’s state, an error is raised. The purpose of these two mutually-exclusive flags is to specify what action should be taken when a Table is specified that matches an existing Table, yet specifies additional constructs.

          See also

          Table.extend_existing

        • listeners

          A list of tuples of the form (<eventname>, <fn>) which will be passed to listen() upon construction. This alternate hook to listen() allows the establishment of a listener function specific to this Table before the “autoload” process begins. Historically this has been intended for use with the DDLEvents.column_reflect() event, however note that this event hook may now be associated with the MetaData object directly:

          1. def listen_for_reflect(table, column_info):
          2. "handle the column reflection event"
          3. # ...
          4. t = Table(
          5. 'sometable',
          6. autoload_with=engine,
          7. listeners=[
          8. ('column_reflect', listen_for_reflect)
          9. ])

          See also

          DDLEvents.column_reflect()

        • must_exist – When True, indicates that this Table must already be present in the given MetaData collection, else an exception is raised.

        • prefixes – A list of strings to insert after CREATE in the CREATE TABLE statement. They will be separated by spaces.

        • quote

          Force quoting of this table’s name on or off, corresponding to True or False. When left at its default of None, the column identifier will be quoted according to whether the name is case sensitive (identifiers with at least one upper case character are treated as case sensitive), or if it’s a reserved word. This flag is only needed to force quoting of a reserved word which is not known by the SQLAlchemy dialect.

          Note

          setting this flag to False will not provide case-insensitive behavior for table reflection; table reflection will always search for a mixed-case name in a case sensitive fashion. Case insensitive names are specified in SQLAlchemy only by stating the name with all lower case characters.

        • quote_schema – same as ‘quote’ but applies to the schema identifier.

        • schema

          The schema name for this table, which is required if the table resides in a schema other than the default selected schema for the engine’s database connection. Defaults to None.

          If the owning MetaData of this Table specifies its own MetaData.schema parameter, then that schema name will be applied to this Table if the schema parameter here is set to None. To set a blank schema name on a Table that would otherwise use the schema set on the owning MetaData, specify the special symbol BLANK_SCHEMA.

          New in version 1.0.14: Added the BLANK_SCHEMA symbol to allow a Table to have a blank schema name even when the parent MetaData specifies MetaData.schema.

          The quoting rules for the schema name are the same as those for the name parameter, in that quoting is applied for reserved words or case-sensitive names; to enable unconditional quoting for the schema name, specify the flag quote_schema=True to the constructor, or use the quoted_name construct to specify the name.

        • comment

          Optional string that will render an SQL comment on table creation.
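          For example, a brief sketch (the comment text is illustrative):

          user_preference = Table(
              "user_preference",
              metadata_obj,
              Column("pref_id", Integer, primary_key=True),
              comment="per-user preference settings",
          )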

          New in version 1.2: Added the Table.comment parameter to Table.

        • **kw – Additional keyword arguments not mentioned above are dialect specific, and passed in the form <dialectname>_<argname>. See the documentation regarding an individual dialect at Dialects for detail on documented arguments.
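          For example, a sketch passing a MySQL-specific table option (available options vary per backend):

          data = Table(
              "data",
              metadata_obj,
              Column("id", Integer, primary_key=True),
              mysql_engine="InnoDB",  # consumed by the MySQL dialect as ENGINE=InnoDB
          )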

    • method sqlalchemy.schema.Table.add_is_dependent_on(table: Table) → None

      Add a ‘dependency’ for this Table.

      This is another Table object which must be created first before this one can, or dropped after this one.

      Usually, dependencies between tables are determined via ForeignKey objects. However, for other situations that create dependencies outside of foreign keys (rules, inheriting), this method can manually establish such a link.
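      For example, a sketch expressing a dependency that is not represented by a ForeignKey (the trigger scenario and table names are hypothetical; the employees table is the one defined earlier):

      audit_log = Table(
          "audit_log",
          metadata_obj,
          Column("id", Integer, primary_key=True),
      )

      # suppose a database trigger on "audit_log" references "employees";
      # ensure "employees" is created before, and dropped after, "audit_log"
      audit_log.add_is_dependent_on(employees)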

    • method sqlalchemy.schema.Table.alias(name: Optional[str] = None, flat: bool = False) → NamedFromClause

      inherited from the FromClause.alias() method of FromClause

      Return an alias of this FromClause.

      E.g.:

      1. a2 = some_table.alias('a2')

      The above code creates an Alias object which can be used as a FROM clause in any SELECT statement.

      See also

      alias()

    • method append_column(column: ColumnClause[Any], replace_existing: bool = False) → None

      Append a Column to this Table.

      The “key” of the newly added Column, i.e. the value of its Column.key attribute, will then be available in the .c collection of this Table, and the column definition will be included in any CREATE TABLE, SELECT, UPDATE, etc. statements generated from this construct.

      Note that this does not change the definition of the table as it exists within any underlying database, assuming that table has already been created in the database. Relational databases support the addition of columns to existing tables using the SQL ALTER command, which would need to be emitted for an already-existing table that doesn’t contain the newly added column.
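      For example, a short sketch using the employees table defined earlier (imports as in the earlier examples):

      employees.append_column(Column("employee_email", String(120)))

      # the new column is now part of the .c collection
      assert "employee_email" in employees.c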

      • Parameters:

        replace_existing

        When True, allows replacing existing columns. When False, the default, a warning is raised if a column with the same .key already exists; a future version of SQLAlchemy is expected to raise an error in this case instead.

        New in version 1.4.0.

    • method sqlalchemy.schema.Table.append_constraint(constraint: Union[Index, Constraint]) → None

      Append a Constraint to this Table.

      This has the effect of the constraint being included in any future CREATE TABLE statement, assuming specific DDL creation events have not been associated with the given Constraint object.

      Note that this does not produce the constraint within the relational database automatically, for a table that already exists in the database. To add a constraint to an existing relational database table, the SQL ALTER command must be used. SQLAlchemy also provides the AddConstraint construct which can produce this SQL when invoked as an executable clause.
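      For example, a sketch adding a unique constraint to the employees table defined earlier (the constraint name is illustrative):

      from sqlalchemy import UniqueConstraint

      employees.append_constraint(
          UniqueConstraint("employee_name", name="uq_employee_name")
      )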

    • classmethod argument_for(dialect_name, argument_name, default)

      inherited from the DialectKWArgs.argument_for() method of DialectKWArgs

      Add a new kind of dialect-specific keyword argument for this class.

      E.g.:

      1. Index.argument_for("mydialect", "length", None)
      2. some_index = Index('a', 'b', mydialect_length=5)

      The DialectKWArgs.argument_for() method is a per-argument way of adding extra arguments to the DefaultDialect.construct_arguments dictionary. This dictionary provides a list of argument names accepted by various schema-level constructs on behalf of a dialect.

      New dialects should typically specify this dictionary all at once as a data member of the dialect class. The use case for ad-hoc addition of argument names is typically for end-user code that is also using a custom compilation scheme which consumes the additional arguments.

      • Parameters:

        • dialect_name – name of a dialect. The dialect must be locatable, else a NoSuchModuleError is raised. The dialect must also include an existing DefaultDialect.construct_arguments collection, indicating that it participates in the keyword-argument validation and default system, else ArgumentError is raised. If the dialect does not include this collection, then any keyword argument can be specified on behalf of this dialect already. All dialects packaged within SQLAlchemy include this collection, however for third party dialects, support may vary.

        • argument_name – name of the parameter.

        • default – default value of the parameter.

      New in version 0.9.4.
    • attribute c

      inherited from the FromClause.c attribute of FromClause

      A synonym for FromClause.columns

      • Returns:

        a ColumnCollection

    • attribute sqlalchemy.schema.Table.columns

      inherited from the FromClause.columns attribute of FromClause

      A name-based collection of ColumnElement objects maintained by this FromClause.

      The FromClause.columns, or c collection, is the gateway to the construction of SQL expressions using table-bound or other selectable-bound columns:

      1. select(mytable).where(mytable.c.somecolumn == 5)
      • Returns:

        a ColumnCollection object.

    • method sqlalchemy.schema.Table.compare(other: ClauseElement, **kw: Any) → bool

      inherited from the ClauseElement.compare() method of ClauseElement

      Compare this ClauseElement to the given ClauseElement.

      Subclasses should override the default behavior, which is a straight identity comparison.

      **kw are arguments consumed by subclass compare() methods and may be used to modify the criteria for comparison (see ColumnElement).

    • method compile(bind: Optional[Union[Engine, Connection]] = None, dialect: Optional[Dialect] = None, **kw: Any) → Compiled

      inherited from the CompilerElement.compile() method of CompilerElement

      Compile this SQL expression.

      The return value is a Compiled object. Calling str() on the returned value will yield a string representation of the result. The Compiled object also can return a dictionary of bind parameter names and values using the params accessor.

      • Parameters:

        • bind – A Connection or Engine which can provide a Dialect in order to generate a Compiled object. If the bind and dialect parameters are both omitted, a default SQL compiler is used.

        • column_keys – Used for INSERT and UPDATE statements, a list of column names which should be present in the VALUES clause of the compiled statement. If None, all columns from the target table object are rendered.

        • dialect – A Dialect instance which can generate a Compiled object. This argument takes precedence over the bind argument.

        • compile_kwargs

          optional dictionary of additional parameters that will be passed through to the compiler within all “visit” methods. This allows any custom flag to be passed through to a custom compilation construct, for example. It is also used for the case of passing the literal_binds flag through:

          1. from sqlalchemy.sql import table, column, select
          2. t = table('t', column('x'))
          3. s = select(t).where(t.c.x == 5)
          4. print(s.compile(compile_kwargs={"literal_binds": True}))

          New in version 0.9.0.

      See also

      How do I render SQL expressions as strings, possibly with bound parameters inlined? - in the FAQ
      See also

      Selectable.exported_columns - the ColumnCollection that is used for the operation.

      ColumnCollection.corresponding_column() - implementation method.
    • method create(bind: _CreateDropBind, checkfirst: bool = False) → None

      Issue a CREATE statement for this Table, using the given Connection or Engine for connectivity.
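      For example, a rough sketch (the engine URL is illustrative):

      from sqlalchemy import create_engine

      engine = create_engine("sqlite://")

      # referenced tables (e.g. departments) would normally be created first,
      # or all tables created together via MetaData.create_all()
      employees.create(engine, checkfirst=True)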

      See also

      MetaData.create_all().

    • method sqlalchemy.schema.Table.delete() → Delete

      inherited from the TableClause.delete() method of TableClause

      Generate a delete() construct against this TableClause.

      E.g.:

      1. table.delete().where(table.c.id==7)

      See delete() for argument and usage information.

    • attribute description

      inherited from the TableClause.description attribute of TableClause

    • attribute sqlalchemy.schema.Table.dialect_kwargs

      inherited from the DialectKWArgs.dialect_kwargs attribute of DialectKWArgs

      A collection of keyword arguments specified as dialect-specific options to this construct.

      The arguments are present here in their original <dialect>_<kwarg> format. Only arguments that were actually passed are included; unlike the DialectKWArgs.dialect_options collection, which contains all options known by this dialect including defaults.

      The collection is also writable; keys are accepted of the form <dialect>_<kwarg> where the value will be assembled into the list of options.
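      For example, a short sketch (the MySQL options shown are illustrative of the <dialect>_<kwarg> form):

      t = Table("t", MetaData(), Column("x", Integer), mysql_engine="InnoDB")

      t.dialect_kwargs["mysql_engine"]              # 'InnoDB'
      t.dialect_kwargs["mysql_charset"] = "utf8mb4" # writable form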

      New in version 0.9.2.

      Changed in version 0.9.4: The DialectKWArgs.dialect_kwargs collection is now writable.

      See also

      DialectKWArgs.dialect_options - nested dictionary form

    • attribute sqlalchemy.schema.Table.dialect_options

      inherited from the DialectKWArgs.dialect_options attribute of DialectKWArgs

      A collection of keyword arguments specified as dialect-specific options to this construct.

      This is a two-level nested registry, keyed to <dialect_name> and <argument_name>. For example, the postgresql_where argument would be locatable as:

      1. arg = my_object.dialect_options['postgresql']['where']

      New in version 0.9.2.

      See also

      DialectKWArgs.dialect_kwargs - flat dictionary form

    • method sqlalchemy.schema.Table.drop(bind: _CreateDropBind, checkfirst: bool = False) → None

      Issue a DROP statement for this Table, using the given Connection or Engine for connectivity.

      See also

      MetaData.drop_all().

    • attribute entity_namespace

      inherited from the FromClause.entity_namespace attribute of FromClause

      Return a namespace used for name-based access in SQL expressions.

      This is the namespace that is used to resolve “filter_by()” type expressions, such as:

      1. stmt.filter_by(address='some address')

      It defaults to the .c collection, however internally it can be overridden using the “entity_namespace” annotation to deliver alternative results.

    • attribute sqlalchemy.schema.Table.exported_columns

      inherited from the FromClause.exported_columns attribute of FromClause

      A ColumnCollection that represents the “exported” columns of this Selectable.

      The “exported” columns for a FromClause object are synonymous with the FromClause.columns collection.

      New in version 1.4.

      See also

      SelectBase.exported_columns

    • attribute foreign_key_constraints

      ForeignKeyConstraint objects referred to by this Table.

      This list is produced from the collection of ForeignKey objects currently associated.

      See also

      Table.foreign_keys

    • attribute sqlalchemy.schema.Table.foreign_keys

      inherited from the FromClause.foreign_keys attribute of FromClause

      Return the collection of ForeignKey marker objects which this FromClause references.

      Each ForeignKey is a member of a Table-wide ForeignKeyConstraint.

      See also

      Table.foreign_key_constraints

    • method sqlalchemy.schema.Table.get_children(*, omit_attrs: Tuple[str, …] = (), **kw: Any) → Iterable[HasTraverseInternals]

      inherited from the HasTraverseInternals.get_children() method of HasTraverseInternals

      Return immediate child HasTraverseInternals elements of this HasTraverseInternals.

      This is used for visit traversal.

      **kw may contain flags that change the collection that is returned, for example to return a subset of items in order to cut down on larger traversals, or to return child items from a different context (such as schema-level collections instead of clause-level).

    • attribute implicit_returning = False

      inherited from the TableClause.implicit_returning attribute of TableClause

      TableClause doesn’t support having a primary key or column-level defaults, so implicit returning doesn’t apply.

    • attribute indexes: Set[Index]

      A collection of all Index objects associated with this Table.

      See also

    • attribute sqlalchemy.schema.Table.info

      inherited from the SchemaItem.info attribute of SchemaItem

      Info dictionary associated with the object, allowing user-defined data to be associated with this SchemaItem.

      The dictionary is automatically generated when first accessed. It can also be specified in the constructor of some objects, such as Table and Column.

    • attribute sqlalchemy.schema.Table.inherit_cache: Optional[bool] = None

      inherited from the HasCacheKey.inherit_cache attribute of HasCacheKey

      Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

      The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it’s appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

      This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

      See also

      Enabling Caching Support for Custom Constructs - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

    • method insert() → Insert

      inherited from the TableClause.insert() method of TableClause

      Generate an Insert construct against this TableClause.

      E.g.:

      1. table.insert().values(name='foo')

      See insert() for argument and usage information.

    • method sqlalchemy.schema.Table.is_derived_from(fromclause: Optional[FromClause]) → bool

      inherited from the FromClause.is_derived_from() method of FromClause

      Return True if this FromClause is ‘derived’ from the given FromClause.

      An example would be an Alias of a Table is derived from that Table.

    • method join(right: _FromClauseArgument, onclause: Optional[_ColumnExpressionArgument[bool]] = None, isouter: bool = False, full: bool = False) → Join

      inherited from the FromClause.join() method of FromClause

      Return a Join from this FromClause to another FromClause.

      E.g.:

      from sqlalchemy import join

      j = user_table.join(address_table,
                          user_table.c.id == address_table.c.user_id)
      stmt = select(user_table).select_from(j)

      would emit SQL along the lines of:

      SELECT user.id, user.name FROM user
      JOIN address ON user.id = address.user_id
      • Parameters:

        • right – the right side of the join; this is any FromClause object such as a Table object, and may also be a selectable-compatible object such as an ORM-mapped class.

        • onclause – a SQL expression representing the ON clause of the join. If left at None, FromClause.join() will attempt to join the two tables based on a foreign key relationship.

        • isouter – if True, render a LEFT OUTER JOIN, instead of JOIN.

        • full

          if True, render a FULL OUTER JOIN, instead of LEFT OUTER JOIN. Implies FromClause.join.isouter.

          New in version 1.1.

      See also

      join() - standalone function

      Join - the type of object produced
    • attribute key

      Return the ‘key’ for this Table.

      This value is used as the dictionary key within the MetaData.tables collection. It is typically the same as that of Table.name for a table with no Table.schema set; otherwise it is typically of the form schemaname.tablename.
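      For example, a brief sketch (names are illustrative):

      m = MetaData()
      accounts = Table("accounts", m, Column("id", Integer, primary_key=True))
      ledger = Table("ledger", m, Column("id", Integer, primary_key=True), schema="finance")

      accounts.key   # "accounts"
      ledger.key     # "finance.ledger"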

    • attribute kwargs

      inherited from the DialectKWArgs.kwargs attribute of DialectKWArgs

      A synonym for DialectKWArgs.dialect_kwargs.

    • method lateral(name: Optional[str] = None) → LateralFromClause

      inherited from the Selectable.lateral() method of Selectable

      Return a LATERAL alias of this Selectable.

      The return value is the Lateral construct also provided by the top-level lateral() function.

      New in version 1.1.

      See also

      LATERAL correlation - overview of usage.

    • method sqlalchemy.schema.Table.outerjoin(right: _FromClauseArgument, onclause: Optional[_ColumnExpressionArgument[bool]] = None, full: bool = False) → Join

      inherited from the FromClause.outerjoin() method of FromClause

      Return a Join from this FromClause to another FromClause, with the “isouter” flag set to True.

      E.g.:

      from sqlalchemy import outerjoin

      j = user_table.outerjoin(address_table,
                               user_table.c.id == address_table.c.user_id)

      The above is equivalent to:

      j = user_table.join(
          address_table,
          user_table.c.id == address_table.c.user_id,
          isouter=True)
      • Parameters:

        • right – the right side of the join; this is any FromClause object such as a Table object, and may also be a selectable-compatible object such as an ORM-mapped class.

        • onclause – a SQL expression representing the ON clause of the join. If left at None, FromClause.join() will attempt to join the two tables based on a foreign key relationship.

        • full

          if True, render a FULL OUTER JOIN, instead of LEFT OUTER JOIN.

          New in version 1.1.

      See also

      FromClause.join()

      Join
    • method params(*optionaldict, **kwargs)

      inherited from the Immutable.params() method of Immutable

      Return a copy with bindparam() elements replaced.

      Returns a copy of this ClauseElement with bindparam() elements replaced with values taken from the given dictionary:

      1. >>> clause = column('x') + bindparam('foo')
      2. >>> print(clause.compile().params)
      3. {'foo':None}
      4. >>> print(clause.params({'foo':7}).compile().params)
      5. {'foo':7}
    • attribute sqlalchemy.schema.Table.primary_key

      inherited from the FromClause.primary_key attribute of FromClause

      Return the iterable collection of Column objects which comprise the primary key of this FromClause.

      For a Table object, this collection is represented by the PrimaryKeyConstraint which itself is an iterable collection of Column objects.

    • method replace_selectable(old: FromClause, alias: Alias) → SelfSelectable

      inherited from the Selectable.replace_selectable() method of Selectable

      Replace all occurrences of FromClause ‘old’ with the given Alias object, returning a copy of this FromClause.

      Deprecated since version 1.4: The Selectable.replace_selectable() method is deprecated, and will be removed in a future release. Similar functionality is available via the sqlalchemy.sql.visitors module.

    • attribute sqlalchemy.schema.Table.schema: Optional[str] = None

      inherited from the FromClause.schema attribute of FromClause

      Define the ‘schema’ attribute for this FromClause.

      This is typically None for most objects except that of Table, where it is taken as the value of the Table.schema argument.

    • method sqlalchemy.schema.Table.select() → Select

      inherited from the FromClause.select() method of FromClause

      Return a SELECT of this FromClause.

      e.g.:

      1. stmt = some_table.select().where(some_table.c.id == 5)

      See also

      select() - general purpose method which allows for arbitrary column lists.

    • method sqlalchemy.schema.Table.self_group(against: Optional[OperatorType] = None) → ClauseElement

      inherited from the ClauseElement.self_group() method of ClauseElement

      Apply a ‘grouping’ to this ClauseElement.

      This method is overridden by subclasses to return a “grouping” construct, i.e. parenthesis. In particular it’s used by “binary” expressions to provide a grouping around themselves when placed into a larger expression, as well as by select() constructs when placed into the FROM clause of another select(). (Note that subqueries should be normally created using the Select.alias() method, as many platforms require nested SELECT statements to be named).

      As expressions are composed together, the application of self_group() is automatic - end-user code should never need to use this method directly. Note that SQLAlchemy’s clause constructs take operator precedence into account - so parenthesis might not be needed, for example, in an expression like x OR (y AND z) - AND takes precedence over OR.

      The base method of ClauseElement just returns self.

    • method table_valued() → TableValuedColumn[Any]

      inherited from the NamedFromClause.table_valued() method of NamedFromClause

      Return a TableValuedColumn object for this FromClause.

      A TableValuedColumn is a ColumnElement that represents a complete row in a table. Support for this construct is backend dependent, and is supported in various forms by backends such as PostgreSQL, Oracle and SQL Server.

      E.g.:

      1. >>> from sqlalchemy import select, column, func, table
      2. >>> a = table("a", column("id"), column("x"), column("y"))
      3. >>> stmt = select(func.row_to_json(a.table_valued()))
      4. >>> print(stmt)
      5. SELECT row_to_json(a) AS row_to_json_1
      6. FROM a

      New in version 1.4.0b2.

      See also

      Working with SQL Functions - in the SQLAlchemy Unified Tutorial

    • method sqlalchemy.schema.Table.tablesample(sampling: Union[float, Function[Any]], name: Optional[str] = None, seed: Optional[roles.ExpressionElementRole[Any]] = None) → TableSample

      inherited from the FromClause.tablesample() method of FromClause

      Return a TABLESAMPLE alias of this FromClause.

      The return value is the TableSample construct also provided by the top-level tablesample() function.
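      As a rough sketch, using the employees table defined earlier (assuming the backend supports TABLESAMPLE and that the sampling value is interpreted as a percentage; select is imported from sqlalchemy as in earlier examples):

      # sample roughly ten percent of rows from the employees table
      emp_sample = employees.tablesample(10.0, name="emp_sample")
      stmt = select(emp_sample.c.employee_id, emp_sample.c.employee_name)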

      New in version 1.1.

      See also

      tablesample() - usage guidelines and parameters

    • method to_metadata(metadata: ~sqlalchemy.sql.schema.MetaData, schema: ~typing.Union[str, ~typing.Literal[<SchemaConst.RETAIN_SCHEMA: 1>]] = SchemaConst.RETAIN_SCHEMA, referred_schema_fn: ~typing.Optional[~typing.Callable[[~sqlalchemy.sql.schema.Table, ~typing.Optional[str], ~sqlalchemy.sql.schema.ForeignKeyConstraint, ~typing.Optional[str]], ~typing.Optional[str]]] = None, name: ~typing.Optional[str] = None) → Table

      Return a copy of this Table associated with a different MetaData.

      E.g.:

      1. m1 = MetaData()
      2. user = Table('user', m1, Column('id', Integer, primary_key=True))
      3. m2 = MetaData()
      4. user_copy = user.to_metadata(m2)

      Changed in version 1.4: The function was renamed from Table.tometadata().

      • Parameters:

        • metadata – Target MetaData object, into which the new Table object will be created.

        • schema

          optional string name indicating the target schema. Defaults to the special symbol RETAIN_SCHEMA which indicates that no change to the schema name should be made in the new Table. If set to a string name, the new Table will have this new name as the .schema. If set to None, the schema will be set to that of the schema set on the target MetaData, which is typically None as well, unless set explicitly:

          1. m2 = MetaData(schema='newschema')
          2. # user_copy_one will have "newschema" as the schema name
          3. user_copy_one = user.to_metadata(m2, schema=None)
          4. m3 = MetaData() # schema defaults to None
          5. # user_copy_two will have None as the schema name
          6. user_copy_two = user.to_metadata(m3, schema=None)
        • referred_schema_fn

          optional callable which can be supplied in order to provide for the schema name that should be assigned to the referenced table of a ForeignKeyConstraint. The callable accepts this parent Table, the target schema that we are changing to, the ForeignKeyConstraint object, and the existing “target schema” of that constraint. The function should return the string schema name that should be applied. To reset the schema to “none”, return the symbol BLANK_SCHEMA. To effect no change, return None or RETAIN_SCHEMA.

          Changed in version 1.4.33: The referred_schema_fn function may return the BLANK_SCHEMA or RETAIN_SCHEMA symbols.

          E.g.:

          def referred_schema_fn(table, to_schema,
                                 constraint, referred_schema):
              if referred_schema == 'base_tables':
                  return referred_schema
              else:
                  return to_schema

          new_table = table.to_metadata(m2, schema="alt_schema",
                                        referred_schema_fn=referred_schema_fn)

          New in version 0.9.2.

        • name

          optional string name indicating the target table name. If not specified or None, the table name is retained. This allows a Table to be copied to the same target with a new name.

          New in version 1.0.0.

    • method sqlalchemy.schema.Table.tometadata(metadata: ~sqlalchemy.sql.schema.MetaData, schema: ~typing.Union[str, ~typing.Literal[<SchemaConst.RETAIN_SCHEMA: 1>]] = SchemaConst.RETAIN_SCHEMA, referred_schema_fn: ~typing.Optional[~typing.Callable[[~sqlalchemy.sql.schema.Table, ~typing.Optional[str], ~sqlalchemy.sql.schema.ForeignKeyConstraint, ~typing.Optional[str]], ~typing.Optional[str]]] = None, name: ~typing.Optional[str] = None) → Table

      Return a copy of this Table associated with a different MetaData.

      Deprecated since version 1.4: Table.tometadata() is renamed to Table.to_metadata()

      See Table.to_metadata() for a full description.

    • method unique_params(*optionaldict, **kwargs)

      inherited from the Immutable.unique_params() method of Immutable

      Return a copy with bindparam() elements replaced.

      Same functionality as ClauseElement.params(), except adds unique=True to affected bind parameters so that multiple statements can be used.
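      As a short sketch mirroring the params() example above (the parameter values are illustrative):

      from sqlalchemy import column, bindparam

      clause = column('x') + bindparam('foo')

      # each copy gets unique=True applied to its bound parameters, so the
      # copies can be combined in one statement without name collisions
      c1 = clause.unique_params({'foo': 5})
      c2 = clause.unique_params({'foo': 10})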