Custom Types

    A frequent need is to force the “string” version of a type, that is the one rendered in a CREATE TABLE statement or other SQL function like CAST, to be changed. For example, an application may want to force the rendering of BINARY for all platforms except for one, in which it wants BLOB to be rendered. Usage of an existing generic type, in this case LargeBinary, is preferred for most use cases. But to control types more accurately, a compilation directive that is per-dialect can be associated with any type.
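
    A sketch of such a directive, using the @compiles decorator from the sqlalchemy.ext.compiler extension:

        from sqlalchemy.ext.compiler import compiles
        from sqlalchemy.types import BINARY

        @compiles(BINARY, "sqlite")
        def compile_binary_sqlite(type_, compiler, **kw):
            # render "BLOB" on the SQLite backend only; all other
            # dialects continue to render "BINARY"
            return "BLOB"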

    The above code allows the usage of BINARY, which will produce the string BINARY against all backends except SQLite, in which case it will produce BLOB.

    See the section Changing Compilation of Types, a subsection of Custom SQL Constructs and Compilation Extension, for additional examples.

    Augmenting Existing Types

    The TypeDecorator allows the creation of custom types which add bind-parameter and result-processing behavior to an existing type object. It is used when additional in-Python marshalling of data to and/or from the database is required.

    Note

    The bind- and result-processing of TypeDecorator is in addition to the processing already performed by the hosted type, which is customized by SQLAlchemy on a per-DBAPI basis to perform processing specific to that DBAPI. While it is possible to replace this handling for a given type through direct subclassing, it is never needed in practice and SQLAlchemy no longer supports this as a public use case.

    ORM Tip

    The TypeDecorator can be used to provide a consistent means of converting some type of value as it is passed into and out of the database. When using the ORM, a similar technique exists for converting user data from arbitrary formats, which is to use the validates() decorator. This technique may be more appropriate when data coming into an ORM model needs to be normalized in some way that is specific to the business case and isn’t as generic as a datatype.
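
    A minimal sketch of the validates() approach; the MyModel class and its "email" attribute are hypothetical illustrations:

        from sqlalchemy.orm import validates

        class MyModel(Base):
            # ... mapped columns, including "email" ...

            @validates("email")
            def validate_email(self, key, value):
                # business-specific normalization, applied as the attribute
                # is set rather than at the datatype level
                return value.strip().lower()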

    class sqlalchemy.types.TypeDecorator

    Allows the creation of types which add additional functionality to an existing type.

    This method is preferred to direct subclassing of SQLAlchemy’s built-in types as it ensures that all required functionality of the underlying type is kept in place.

    Typical usage:

        import sqlalchemy.types as types

        class MyType(types.TypeDecorator):
            '''Prefixes Unicode values with "PREFIX:" on the way in and
            strips it off on the way out.
            '''

            impl = types.Unicode

            cache_ok = True

            def process_bind_param(self, value, dialect):
                return "PREFIX:" + value

            def process_result_value(self, value, dialect):
                return value[7:]

            def copy(self, **kw):
                return MyType(self.impl.length)

    The class-level impl attribute is required, and can reference any class. Alternatively, the load_dialect_impl() method can be used to provide different type classes based on the dialect given; in this case, the impl variable can reference TypeEngine as a placeholder.

    The class-level cache_ok flag indicates if this custom TypeDecorator is safe to be used as part of a cache key. This flag defaults to None which will initially generate a warning when the SQL compiler attempts to generate a cache key for a statement that uses this type. If the TypeDecorator is not guaranteed to produce the same bind/result behavior and SQL generation every time, this flag should be set to False; otherwise if the class produces the same behavior each time, it may be set to True. See TypeDecorator.cache_ok for further notes on how this works.

    Types that receive a Python type that isn’t similar to the ultimate type used may want to define the TypeDecorator.coerce_compared_value() method. This is used to give the expression system a hint when coercing Python objects into bind parameters within expressions. Consider this expression:

        mytable.c.somecol + datetime.date(2009, 5, 15)

    Above, if “somecol” is an Integer variant, it makes sense that we’re doing date arithmetic, where above is usually interpreted by databases as adding a number of days to the given date. The expression system does the right thing by not attempting to coerce the “date()” value into an integer-oriented bind parameter.

    However, in the case of TypeDecorator, we are usually changing an incoming Python type to something new - TypeDecorator by default will “coerce” the non-typed side to be the same type as itself. For example, below we define an “epoch” type that stores a date value as an integer:

        import datetime

        class MyEpochType(types.TypeDecorator):
            impl = types.Integer

            epoch = datetime.date(1970, 1, 1)

            def process_bind_param(self, value, dialect):
                return (value - self.epoch).days

            def process_result_value(self, value, dialect):
                return self.epoch + datetime.timedelta(days=value)

    Our expression of somecol + date with the above type will coerce the “date” on the right side to also be treated as MyEpochType.

    This behavior can be overridden via the TypeDecorator.coerce_compared_value() method, which returns a type that should be used for the value of the expression. Below we set it such that an integer value will be treated as an Integer, and any other value is assumed to be a date and will be treated as a MyEpochType:

        def coerce_compared_value(self, op, value):
            if isinstance(value, int):
                return Integer()
            else:
                return self

    Warning

    Note that the behavior of coerce_compared_value is not inherited by default from that of the base type. If the TypeDecorator is augmenting a type that requires special logic for certain types of operators, this method must be overridden. A key example is when decorating the JSON and ARRAY types; the default rules of TypeEngine.coerce_compared_value() should be used in order to deal with operators like index operations:

        from sqlalchemy import JSON
        from sqlalchemy import TypeDecorator

        class MyJsonType(TypeDecorator):
            impl = JSON

            cache_ok = True

            def coerce_compared_value(self, op, value):
                return self.impl.coerce_compared_value(op, value)

    Without the above step, index operations such as mycol['foo'] will cause the index value 'foo' to be JSON encoded.

    Similarly, when working with the ARRAY datatype, the type coercion for index operations (e.g. mycol[5]) is also handled by TypeDecorator.coerce_compared_value(), where again a simple override is sufficient unless special rules are needed for particular operators:

        from sqlalchemy import ARRAY
        from sqlalchemy import TypeDecorator

        class MyArrayType(TypeDecorator):
            impl = ARRAY

            cache_ok = True

            def coerce_compared_value(self, op, value):
                return self.impl.coerce_compared_value(op, value)

    Members

    cache_ok, operate(), reverse_operate(), __init__(), bind_expression(), bind_processor(), coerce_compared_value(), coerce_to_is_types, column_expression(), comparator_factory, compare_values(), copy(), get_dbapi_type(), literal_processor(), load_dialect_impl(), process_bind_param(), process_literal_param(), process_result_value(), result_processor(), sort_key_function, type_engine()

    Class signature

    class sqlalchemy.types.TypeDecorator (sqlalchemy.sql.expression.SchemaEventTarget, sqlalchemy.types.ExternalType, sqlalchemy.types.TypeEngine)

    • attribute cache_ok: Optional[bool] = None

      inherited from the ExternalType.cache_ok attribute of ExternalType

      Indicate if statements using this ExternalType are “safe to cache”.

      The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable statements using this type from being cached at all without a warning. When set to True, the object’s class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:

          class MyType(TypeDecorator):
              impl = String

              cache_ok = True

              def __init__(self, choices):
                  self.choices = tuple(choices)
                  self.internal_only = True

      The cache key for the above type would be equivalent to:

          >>> MyType(["a", "b", "c"])._static_cache_key
          (<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))

      The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the “choices” attribute becomes part of the cache key but “internal_only” does not, because there is no parameter named “internal_only”.

      The requirements for cacheable elements are that they are hashable and that they indicate the same SQL rendered for expressions using this type every time for a given cache value.

      To accommodate for datatypes that refer to unhashable structures such as dictionaries, sets and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:

          class LookupType(UserDefinedType):
              '''a custom type that accepts a dictionary as a parameter.

              this is the non-cacheable version, as "self.lookup" is not
              hashable.
              '''

              def __init__(self, lookup):
                  self.lookup = lookup

              def get_col_spec(self, **kw):
                  return "VARCHAR(255)"

              def bind_processor(self, dialect):
                  # ... works with "self.lookup" ...

      Where “lookup” is a dictionary. The type will not be able to generate a cache key:

          >>> type_ = LookupType({"a": 10, "b": 20})
          >>> type_._static_cache_key
          <stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
          produce a cache key because the ``cache_ok`` flag is not set to True.
          Set this flag to True if this type object's state is safe to use
          in a cache key, or False to disable this warning.
          symbol('no_cache')

      If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:

          >>> # set cache_ok = True
          >>> type_.cache_ok = True

          >>> # this is the cache key it would generate
          >>> key = type_._static_cache_key
          >>> key
          (<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))

          >>> # however this key is not hashable, will fail when used with
          >>> # SQLAlchemy statement cache
          >>> some_cache = {key: "some sql value"}
          Traceback (most recent call last):
            File "<stdin>", line 1, in <module>
          TypeError: unhashable type: 'dict'

      The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:

          class LookupType(UserDefinedType):
              '''a custom type that accepts a dictionary as a parameter.

              The dictionary is stored both as itself in a private variable,
              and published in a public variable as a sorted tuple of tuples,
              which is hashable and will also return the same value for any
              two equivalent dictionaries.  Note it assumes the keys and
              values of the dictionary are themselves hashable.
              '''

              cache_ok = True

              def __init__(self, lookup):
                  self._lookup = lookup

                  # assume keys/values of "lookup" are hashable; otherwise
                  # they would also need to be converted in some way here
                  self.lookup = tuple((key, lookup[key]) for key in sorted(lookup))

              def get_col_spec(self, **kw):
                  return "VARCHAR(255)"

              def bind_processor(self, dialect):
                  # ... works with "self._lookup" ...

      Where above, the cache key for LookupType({"a": 10, "b": 20}) will be:

          >>> LookupType({"a": 10, "b": 20})._static_cache_key
          (<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))

      New in version 1.4.14: - added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.

      New in version 1.4.28: - added the ExternalType mixin which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.

      See also

      SQL Compilation Caching

    • class Comparator

      A Comparator that is specific to TypeDecorator.

      User-defined classes should not typically need to modify this.

      Class signature

      class sqlalchemy.types.TypeDecorator.Comparator (sqlalchemy.types.Comparator)

      • method operate(op: OperatorType, *other: Any, **kwargs: Any) → ColumnElement[_CT]

        Operate on an argument.

        This is the lowest level of operation, raises NotImplementedError by default.

        Overriding this on a subclass can allow common behavior to be applied to all operations. For example, overriding to apply func.lower() to the left and right side:

            class MyComparator(ColumnOperators):
                def operate(self, op, other, **kwargs):
                    return op(func.lower(self), func.lower(other), **kwargs)

        Parameters:

        • op – Operator callable.

        • *other – the ‘other’ side of the operation. Will be a single scalar for most operations.

        • **kwargs – modifiers. These may be passed by special operators such as ColumnOperators.contains().

      • method sqlalchemy.types.TypeDecorator.Comparator.reverse_operate(op: OperatorType, other: Any, **kwargs: Any) → ColumnElement[_CT]

        Reverse operate on an argument.

        Usage is the same as operate().

    • method __init__(*args: Any, **kwargs: Any)

      Construct a TypeDecorator.

      Arguments sent here are passed to the constructor of the class assigned to the impl class level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

      If the class level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

      Subclasses can override this to customize the generation of self.impl entirely.
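
      For example, a hypothetical sketch: because the impl below is String, arguments given to MyString() pass through to the String constructor, and the resulting instance lands on self.impl:

          from sqlalchemy.types import String, TypeDecorator

          class MyString(TypeDecorator):
              impl = String

              cache_ok = True

          # String(50) is constructed and assigned to the instance's .impl
          print(MyString(50).impl.length)  # 50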

    • method bind_expression(bindparam: BindParameter[_T]) → Optional[ColumnElement[_T]]

      Given a bind value (i.e. a BindParameter instance), return a SQL expression which will typically wrap the given parameter.

      Note

      This method is called during the SQL compilation phase of a statement, when rendering a SQL string. It is not necessarily called against specific values, and should not be confused with the TypeDecorator.process_bind_param() method, which is the more typical method that processes the actual value passed to a particular parameter at statement execution time.

      Subclasses of TypeDecorator can override this method to provide custom bind expression behavior for the type. This implementation will replace that of the underlying implementation type.

    • method bind_processor(dialect: Dialect) → Optional[_BindProcessorType[_T]]

      Provide a bound value processing function for the given Dialect.

      This is the method that fulfills the TypeEngine contract for bound value conversion which normally occurs via the TypeEngine.bind_processor() method.

      Note

      User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_bind_param() so that the “inner” processing provided by the implementing type is maintained.

      • Parameters:

        dialect – Dialect instance in use.

    • method sqlalchemy.types.TypeDecorator.coerce_compared_value(op: Optional[OperatorType], value: Any) → Any

      By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

      1. expr = table.c.somecolumn + 35

      Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.

    • attribute coerce_to_is_types: Sequence[Type[Any]] = (<class ‘NoneType’>,)

      Specify those Python types which should be coerced at the expression level to “IS <constant>” when compared using == (and same for IS NOT in conjunction with !=).

      For most SQLAlchemy types, this includes NoneType, as well as bool.

      TypeDecorator modifies this list to only include NoneType, as typedecorator implementations that deal with boolean types are common.

      Custom TypeDecorator classes can override this attribute to return an empty tuple, in which case no values will be coerced to constants.
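
      For instance, with the default setting, comparing a column to None compiles to the IS operator rather than a bound parameter; a minimal sketch:

          from sqlalchemy import Integer, column

          # NoneType is in coerce_to_is_types by default, so "== None"
          # renders as "x IS NULL" instead of "x = :x_1"
          print(column("x", Integer) == None)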

    • method column_expression(column: ColumnElement[_T]) → Optional[ColumnElement[_T]]

      Given a SELECT column expression, return a wrapping SQL expression.

      Note

      This method is called during the SQL compilation phase of a statement, when rendering a SQL string. It is not called against specific values, and should not be confused with the TypeDecorator.process_result_value() method, which is the more typical method that processes the actual value returned in a result row subsequent to statement execution time.

      Subclasses of TypeDecorator can override this method to provide custom column expression behavior for the type. This implementation will replace that of the underlying implementation type.

      See the description of TypeEngine.column_expression() for a complete description of the method’s use.

    • attribute comparator_factory: _ComparatorFactory[Any]

      A Comparator class which will apply to operations performed by owning ColumnElement objects.

      The comparator_factory attribute is a hook consulted by the core expression system when column and SQL expression operations are performed. When a Comparator class is associated with this attribute, it allows custom re-definition of all existing operators, as well as definition of new operators. Existing operators include those provided by Python operator overloading such as ColumnOperators.__add__() and ColumnOperators.__eq__(), as well as those provided as standard attributes of ColumnOperators such as ColumnOperators.like() and ColumnOperators.in_().

      Rudimentary usage of this hook is allowed through simple subclassing of existing types, or alternatively by using TypeDecorator. See the documentation section Redefining and Creating New Operators for examples.

    • method compare_values(x: Any, y: Any) → bool

      Given two values, compare them for equality.

      By default this calls upon TypeEngine.compare_values() of the underlying “impl”, which in turn usually uses the Python equals operator ==.

      This function is used by the ORM to compare an original-loaded value with an intercepted “changed” value, to determine if a net change has occurred.

    • method copy(**kw: Any) → SelfTypeDecorator

      Produce a copy of this TypeDecorator instance.

      This is a shallow copy and is provided to fulfill part of the TypeEngine contract. It usually does not need to be overridden unless the user-defined TypeDecorator has local state that should be deep-copied.

    • method get_dbapi_type(dbapi: module) → Optional[Any]

      Return the DBAPI type object represented by this TypeDecorator.

      By default this calls upon TypeEngine.get_dbapi_type() of the underlying “impl”.

    • method sqlalchemy.types.TypeDecorator.literal_processor(dialect: Dialect) → Optional[_LiteralProcessorType[_T]]

      Provide a literal processing function for the given Dialect.

      This is the method that fulfills the TypeEngine contract for literal value conversion which normally occurs via the TypeEngine.literal_processor() method.

      Note

      User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_literal_param() so that the “inner” processing provided by the implementing type is maintained.

    • method load_dialect_impl(dialect: Dialect) → TypeEngine[Any]

      Return a TypeEngine object corresponding to a dialect.

      This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

      By default returns self.impl.

    • method sqlalchemy.types.TypeDecorator.process_bind_param(value: Optional[_T], dialect: Dialect) → Any

      Receive a bound parameter value to be converted.

      Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

      The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

      • Parameters:

        • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.

        • dialect – the Dialect in use.

      See also

      Augmenting Existing Types

      TypeDecorator.process_result_value()
    • method sqlalchemy.types.TypeDecorator.process_literal_param(value: Optional[_T], dialect: Dialect) → str

      Receive a literal parameter value to be rendered inline within a statement.

      Note

      This method is called during the SQL compilation phase of a statement, when rendering a SQL string. Unlike other SQL compilation methods, it is passed a specific Python value to be rendered as a string. However it should not be confused with the TypeDecorator.process_bind_param() method, which is the more typical method that processes the actual value passed to a particular parameter at statement execution time.

      Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values that are in the special case of being rendered as literals.

      The returned string will be rendered into the output string.
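
      A hypothetical sketch, quoting string values inline for statements compiled with literal binds; the QuotedLiteral name is illustrative:

          from sqlalchemy.types import String, TypeDecorator

          class QuotedLiteral(TypeDecorator):
              impl = String

              cache_ok = True

              def process_literal_param(self, value, dialect):
                  # the returned string is rendered directly into the SQL text
                  return "'%s'" % value.replace("'", "''")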

    • method sqlalchemy.types.TypeDecorator.process_result_value(value: Optional[Any], dialect: Dialect) → Optional[_T]

      Receive a result-row column value to be converted.

      Custom subclasses of TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

      The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.

      • Parameters:

        • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.

        • dialect – the Dialect in use.

      See also

      Augmenting Existing Types

      TypeDecorator.process_bind_param()
    • method result_processor(dialect: Dialect, coltype: Any) → Optional[_ResultProcessorType[_T]]

      Provide a result value processing function for the given Dialect.

      This is the method that fulfills the TypeEngine contract for result value conversion which normally occurs via the TypeEngine.result_processor() method.

      Note

      User-defined subclasses of TypeDecorator should not implement this method, and should instead implement TypeDecorator.process_result_value() so that the “inner” processing provided by the implementing type is maintained.

      • Parameters:

        • dialect – Dialect instance in use.

        • coltype – A SQLAlchemy data type

    • attribute sqlalchemy.types.TypeDecorator.sort_key_function: Optional[Callable[[Any], Any]]

      A sorting function that can be passed as the key to sorted.

      The default value of None indicates that the values stored by this type are self-sorting.

      New in version 1.3.8.

    • method type_engine(dialect: Dialect) → TypeEngine[Any]

      Return a dialect-specific TypeEngine instance for this TypeDecorator.

      In most cases this returns a dialect-adapted form of the TypeEngine type represented by self.impl. Makes usage of dialect_impl(). Behavior can be customized here by overriding TypeDecorator.load_dialect_impl().

    TypeDecorator Recipes

    A few key TypeDecorator recipes follow.

    Coercing Encoded Strings to Unicode

    A common source of confusion regarding the Unicode type is that it is intended to deal only with Python unicode objects on the Python side, meaning values passed to it as bind parameters must be of the form u'some string' when using Python 2 and not 3. The encoding/decoding functions it performs are only to suit what the DBAPI in use requires, and are primarily a private implementation detail.

    The use case of a type that can safely receive Python bytestrings, that is strings that contain non-ASCII characters and are not u'' objects in Python 2, can be achieved using a TypeDecorator which coerces as needed:

        from sqlalchemy.types import TypeDecorator, Unicode

        class CoerceUTF8(TypeDecorator):
            """Safely coerce Python bytestrings to Unicode
            before passing off to the database."""

            impl = Unicode

            def process_bind_param(self, value, dialect):
                # under Python 2, a plain "str" is a bytestring
                if isinstance(value, str):
                    value = value.decode("utf-8")
                return value

    Rounding Numerics

    Some database connectors like those of SQL Server choke if a Decimal is passed with too many decimal places. Here’s a recipe that rounds them down:

        from decimal import Decimal

        from sqlalchemy.types import TypeDecorator, Numeric

        class SafeNumeric(TypeDecorator):
            """Adds quantization to Numeric."""

            impl = Numeric

            def __init__(self, *arg, **kw):
                TypeDecorator.__init__(self, *arg, **kw)
                self.quantize_int = -self.impl.scale
                self.quantize = Decimal(10) ** self.quantize_int

            def process_bind_param(self, value, dialect):
                if isinstance(value, Decimal) and value.as_tuple()[2] < self.quantize_int:
                    value = value.quantize(self.quantize)
                return value

    Store Timezone Aware Timestamps as Timezone Naive UTC

    Timestamps in databases should always be stored in a timezone-agnostic way. For most databases, this means ensuring a timestamp is first in the UTC timezone before it is stored, then storing it as timezone-naive (that is, without any timezone associated with it; UTC is assumed to be the “implicit” timezone). Alternatively, database-specific types like PostgreSQL’s “TIMESTAMP WITH TIMEZONE” are often preferred for their richer functionality; however, storing as plain UTC will work on all databases and drivers. When a timezone-intelligent database type is not an option or is not preferred, the TypeDecorator can be used to create a datatype that converts timezone aware timestamps into timezone naive and back again. Below, Python’s built-in datetime.timezone.utc timezone is used to normalize and denormalize:
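
    A minimal sketch, assuming a DateTime-backed TypeDecorator; the TZDateTime name is illustrative:

        import datetime

        from sqlalchemy.types import DateTime, TypeDecorator

        class TZDateTime(TypeDecorator):
            impl = DateTime

            cache_ok = True

            def process_bind_param(self, value, dialect):
                if value is not None:
                    if not value.tzinfo:
                        raise TypeError("tzinfo is required")
                    # normalize to UTC, then discard the timezone
                    value = value.astimezone(datetime.timezone.utc).replace(tzinfo=None)
                return value

            def process_result_value(self, value, dialect):
                if value is not None:
                    # re-apply UTC on the way back out
                    value = value.replace(tzinfo=datetime.timezone.utc)
                return value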

    Backend-agnostic GUID Type

    Receives and returns Python uuid() objects. Uses the PG UUID type when using PostgreSQL, CHAR(32) on other backends, storing them in stringified hex format. Can be modified to store binary in CHAR(16) if desired:

        import uuid

        from sqlalchemy.types import TypeDecorator, CHAR
        from sqlalchemy.dialects.postgresql import UUID

        class GUID(TypeDecorator):
            """Platform-independent GUID type.

            Uses PostgreSQL's UUID type, otherwise uses
            CHAR(32), storing as stringified hex values.
            """

            impl = CHAR

            cache_ok = True

            def load_dialect_impl(self, dialect):
                if dialect.name == "postgresql":
                    return dialect.type_descriptor(UUID())
                else:
                    return dialect.type_descriptor(CHAR(32))

            def process_bind_param(self, value, dialect):
                if value is None:
                    return value
                elif dialect.name == "postgresql":
                    return str(value)
                else:
                    if not isinstance(value, uuid.UUID):
                        return "%.32x" % uuid.UUID(value).int
                    else:
                        # hexstring
                        return "%.32x" % value.int

            def process_result_value(self, value, dialect):
                if value is None:
                    return value
                else:
                    if not isinstance(value, uuid.UUID):
                        value = uuid.UUID(value)
                    return value

    Linking Python uuid.UUID to the Custom Type for ORM mappings

    When declaring ORM mappings using Annotated Declarative Table mappings, the custom GUID type defined above may be associated with the Python uuid.UUID datatype by adding it to the type_annotation_map, which is typically defined on the DeclarativeBase class:

        import uuid

        from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

        class Base(DeclarativeBase):
            type_annotation_map = {
                uuid.UUID: GUID,
            }

    With the above configuration, ORM mapped classes which extend from Base may refer to Python uuid.UUID in annotations which will make use of GUID automatically:

        class MyModel(Base):
            __tablename__ = "my_table"

            id: Mapped[uuid.UUID] = mapped_column(primary_key=True)

    Marshal JSON Strings

    This type uses simplejson to marshal Python data structures to/from JSON. Can be modified to use Python’s builtin json encoder:

        import json

        from sqlalchemy.types import TypeDecorator, VARCHAR

        class JSONEncodedDict(TypeDecorator):
            """Represents an immutable structure as a json-encoded string.

            Usage:

                JSONEncodedDict(255)

            """

            impl = VARCHAR

            cache_ok = True

            def process_bind_param(self, value, dialect):
                if value is not None:
                    value = json.dumps(value)
                return value

            def process_result_value(self, value, dialect):
                if value is not None:
                    value = json.loads(value)
                return value

    Adding Mutability

    The ORM by default will not detect “mutability” on such a type as above - meaning, in-place changes to values will not be detected and will not be flushed. Without further steps, you instead would need to replace the existing value with a new one on each parent object to detect changes:

        obj.json_value["key"] = "value"  # will *not* be detected by the ORM

        obj.json_value = {"key": "value"}  # *will* be detected by the ORM

    The above limitation may be fine, as many applications may not require that the values are ever mutated once created. For those which do have this requirement, support for mutability is best applied using the sqlalchemy.ext.mutable extension. For a dictionary-oriented JSON structure, we can apply this as:

        from sqlalchemy.ext.mutable import MutableDict

        json_type = MutableDict.as_mutable(JSONEncodedDict)

        class MyClass(Base):
            # ...

            json_data = Column(json_type)

    See also

    Mutation Tracking

    Dealing with Comparison Operations

    The default behavior of TypeDecorator is to coerce the “right hand side” of any expression into the same type. For a type like JSON, this means that any operator used must make sense in terms of JSON. For some cases, users may wish for the type to behave like JSON in some circumstances, and as plain text in others. One example is if one wanted to handle the LIKE operator for the JSON type. LIKE makes no sense against a JSON structure, but it does make sense against the underlying textual representation. To get at this with a type like JSONEncodedDict, we need to coerce the column to a textual form using cast() or type_coerce() before attempting to use this operator:

        from sqlalchemy import type_coerce, String

        stmt = select(my_table).where(type_coerce(my_table.c.json_data, String).like("%foo%"))

    TypeDecorator provides a built-in system for working up type translations like these based on operators. If we wanted to frequently use the LIKE operator with our JSON object interpreted as a string, we can build it into the type by overriding the TypeDecorator.coerce_compared_value() method:

        import json

        from sqlalchemy import String, VARCHAR
        from sqlalchemy.sql import operators
        from sqlalchemy.types import TypeDecorator

        class JSONEncodedDict(TypeDecorator):
            impl = VARCHAR

            cache_ok = True

            def coerce_compared_value(self, op, value):
                if op in (operators.like_op, operators.not_like_op):
                    return String()
                else:
                    return self

            def process_bind_param(self, value, dialect):
                if value is not None:
                    value = json.dumps(value)
                return value

            def process_result_value(self, value, dialect):
                if value is not None:
                    value = json.loads(value)
                return value

    Above is just one approach to handling an operator like “LIKE”. Other applications may wish to raise NotImplementedError for operators that have no meaning with a JSON object such as “LIKE”, rather than automatically coercing to text.
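
    A sketch of that stricter alternative; the StrictJSONEncodedDict name is hypothetical:

        from sqlalchemy.sql import operators

        class StrictJSONEncodedDict(JSONEncodedDict):
            def coerce_compared_value(self, op, value):
                # refuse string-oriented operators outright rather than
                # silently comparing against the JSON-encoded text
                if op in (operators.like_op, operators.not_like_op):
                    raise NotImplementedError(
                        "LIKE is not supported against JSON values"
                    )
                return self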

    Applying SQL-level Bind/Result Processing

    As seen in the section Augmenting Existing Types, SQLAlchemy allows Python functions to be invoked both when parameters are sent to a statement, as well as when result rows are loaded from the database, to apply transformations to the values as they are sent to or from the database. It is also possible to define SQL-level transformations as well. The rationale here is when only the relational database contains a particular series of functions that are necessary to coerce incoming and outgoing data between an application and persistence format. Examples include using database-defined encryption/decryption functions, as well as stored procedures that handle geographic data.

    Any TypeEngine, UserDefinedType or TypeDecorator subclass can include implementations of TypeEngine.bind_expression() and/or TypeEngine.column_expression(), which when defined to return a non-None value should return a ColumnElement expression to be injected into the SQL statement, either surrounding bound parameters or a column expression. For example, to build a Geometry type which will apply the PostGIS function ST_GeomFromText to all outgoing values and the function ST_AsText to all incoming data, we can create our own subclass of UserDefinedType which provides these methods in conjunction with func:

        from sqlalchemy import func
        from sqlalchemy.types import UserDefinedType

        class Geometry(UserDefinedType):
            def get_col_spec(self):
                return "GEOMETRY"

            def bind_expression(self, bindvalue):
                return func.ST_GeomFromText(bindvalue, type_=self)

            def column_expression(self, col):
                return func.ST_AsText(col, type_=self)

    We can apply the Geometry type into metadata and use it in a select() construct:

        geometry = Table(
            "geometry",
            metadata,
            Column("geom_id", Integer, primary_key=True),
            Column("geom_data", Geometry),
        )

        print(
            select(geometry).where(
                geometry.c.geom_data == "LINESTRING(189412 252431,189631 259122)"
            )
        )

    The resulting SQL embeds both functions as appropriate. ST_AsText is applied to the columns clause so that the return value is run through the function before passing into a result set, and ST_GeomFromText is run on the bound parameter so that the passed-in value is converted:

        SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
        FROM geometry
        WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)

    The TypeEngine.column_expression() method interacts with the mechanics of the compiler such that the SQL expression does not interfere with the labeling of the wrapped expression. Such as, if we rendered a select() against a label() of our expression, the string label is moved to the outside of the wrapped expression:

        print(select(geometry.c.geom_data.label("my_data")))

    Output:

        SELECT ST_AsText(geometry.geom_data) AS my_data
        FROM geometry

    Another example is where we decorate BYTEA to provide a PGPString, which will make use of the PostgreSQL pgcrypto extension to encrypt/decrypt values transparently:

        from sqlalchemy import (
            create_engine,
            String,
            select,
            func,
            MetaData,
            Table,
            Column,
            type_coerce,
            TypeDecorator,
        )
        from sqlalchemy.dialects.postgresql import BYTEA

        class PGPString(TypeDecorator):
            impl = BYTEA

            cache_ok = True

            def __init__(self, passphrase):
                super(PGPString, self).__init__()

                self.passphrase = passphrase

            def bind_expression(self, bindvalue):
                # convert the bind's type from PGPString to
                # String, so that it's passed to psycopg2 as is without
                # a dbapi.Binary wrapper
                bindvalue = type_coerce(bindvalue, String)
                return func.pgp_sym_encrypt(bindvalue, self.passphrase)

            def column_expression(self, col):
                return func.pgp_sym_decrypt(col, self.passphrase)

        metadata_obj = MetaData()
        message = Table(
            "message",
            metadata_obj,
            Column("username", String(50)),
            Column("message", PGPString("this is my passphrase")),
        )

        engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
        with engine.begin() as conn:
            metadata_obj.create_all(conn)

            conn.execute(
                message.insert(),
                {"username": "some user", "message": "this is my message"},
            )

            print(
                conn.scalar(select(message.c.message).where(message.c.username == "some user"))
            )

    The pgp_sym_encrypt and pgp_sym_decrypt functions are applied to the INSERT and SELECT statements:

        INSERT INTO message (username, message)
          VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
        -- {'pgp_sym_encrypt_1': 'this is my passphrase'}

        SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
        FROM message
        WHERE message.username = %(username_1)s
        -- {'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}

    Redefining and Creating New Operators

    SQLAlchemy Core defines a fixed set of expression operators available to all column expressions. Some of these operations have the effect of overloading Python’s built-in operators; examples of such operators include ColumnOperators.__eq__() (table.c.somecolumn == 'foo'), ColumnOperators.__invert__() (~table.c.flag), and ColumnOperators.__add__() (table.c.x + table.c.y). Other operators are exposed as explicit methods on column expressions, such as ColumnOperators.in_() (table.c.value.in_(['x', 'y'])) and ColumnOperators.like() (table.c.value.like('%ed%')).

    When the need arises for a SQL operator that isn’t directly supported by the already supplied methods above, the most expedient way to produce this operator is to use the Operators.op() method on any SQL expression object; this method is given a string representing the SQL operator to render, and the return value is a Python callable that accepts any arbitrary right-hand side expression:

        >>> from sqlalchemy import column
        >>> expr = column("x").op(">>")(column("y"))
        >>> print(expr)
        x >> y

    When making use of custom SQL types, there is also a means of implementing custom operators as above that are automatically present upon any column expression that makes use of that column type, without the need to directly call Operators.op() each time the operator is to be used.

    To achieve this, a SQL expression construct consults the TypeEngine object associated with the construct in order to determine the behavior of the built-in operators as well as to look for new methods that may have been invoked. TypeEngine defines a “comparison” object implemented by the Comparator class to provide the base behavior for SQL operators, and many specific types provide their own sub-implementations of this class. User-defined Comparator implementations can be built directly into a simple subclass of a particular type in order to override or define new operations. Below, we create an Integer subclass which overrides the ColumnOperators.__add__() operator, which in turn uses Operators.op() to produce the custom SQL itself:

        from sqlalchemy import Integer

        class MyInt(Integer):
            class comparator_factory(Integer.Comparator):
                def __add__(self, other):
                    return self.op("goofy")(other)

    The above configuration creates a new class MyInt, which establishes the TypeEngine.comparator_factory attribute as referring to a new class, subclassing the Comparator class associated with the Integer type.

    Usage:

        >>> sometable = Table("sometable", metadata, Column("data", MyInt))
        >>> print(sometable.c.data + 5)
        sometable.data goofy :data_1

    The Comparator implementation is consulted by an owning SQL expression, by instantiating the Comparator with itself as the expr attribute. This attribute may be used when the implementation needs to refer to the originating object directly:

        from sqlalchemy import Integer, func

        class MyInt(Integer):
            class comparator_factory(Integer.Comparator):
                def __add__(self, other):
                    return func.special_addition(self.expr, other)

    New methods added to a Comparator are exposed on an owning SQL expression object using a dynamic lookup scheme, which exposes methods added to the Comparator onto the owning ColumnElement expression construct. For example, to add a log() function to integers:
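
    A minimal sketch, assuming the backend provides a SQL log() function:

        from sqlalchemy import Integer, func

        class MyInt(Integer):
            class comparator_factory(Integer.Comparator):
                def log(self, other):
                    # renders as log(<owning expression>, <other>)
                    return func.log(self.expr, other)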

    Using the above type:

        >>> print(sometable.c.data.log(5))
        log(sometable.data, :log_1)

    When using Operators.op() for comparison operations that return a boolean result, the Operators.op.is_comparison flag should be set to True:

        class MyInt(Integer):
            class comparator_factory(Integer.Comparator):
                def is_frobnozzled(self, other):
                    return self.op("--is_frobnozzled->", is_comparison=True)(other)

    Unary operations are also possible. For example, to add an implementation of the PostgreSQL factorial operator, we combine the UnaryExpression construct along with a custom_op to produce the factorial expression:

        from sqlalchemy import Integer
        from sqlalchemy.sql import operators
        from sqlalchemy.sql.expression import UnaryExpression

        class MyInteger(Integer):
            class comparator_factory(Integer.Comparator):
                def factorial(self):
                    return UnaryExpression(
                        self.expr, modifier=operators.custom_op("!"), type_=MyInteger
                    )

    Using the above type:

        >>> from sqlalchemy.sql import column
        >>> print(column("x", MyInteger).factorial())
        x !

    See also

    TypeEngine.comparator_factory

    Creating New Types

    The UserDefinedType class is provided as a simple base class for defining entirely new database types. Use this to represent native database types not known by SQLAlchemy. If only Python translation behavior is needed, use TypeDecorator instead.

    Object Name / Description

    UserDefinedType

    Base for user defined types.

    class sqlalchemy.types.UserDefinedType

    Base for user defined types.

    This should be the base of new types. Note that for most cases, TypeDecorator is probably more appropriate:

        import sqlalchemy.types as types

        class MyType(types.UserDefinedType):
            cache_ok = True

            def __init__(self, precision=8):
                self.precision = precision

            def get_col_spec(self, **kw):
                return "MYTYPE(%s)" % self.precision

            def bind_processor(self, dialect):
                def process(value):
                    return value

                return process

            def result_processor(self, dialect, coltype):
                def process(value):
                    return value

                return process

    Once the type is made, it’s immediately usable:

        table = Table(
            "foo",
            metadata_obj,
            Column("id", Integer, primary_key=True),
            Column("data", MyType(16)),
        )

    The get_col_spec() method will in most cases receive a keyword argument type_expression which refers to the owning expression of the type as being compiled, such as a Column or cast() construct. This keyword is only sent if the method accepts keyword arguments (e.g. **kw) in its argument signature; introspection is used to check for this in order to support legacy forms of this function.

    New in version 1.0.0: the owning expression is passed to the get_col_spec() method via the keyword argument type_expression, if it receives **kw in its signature.
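
    A hypothetical sketch of a type that inspects its owning expression; the MyOwnedType name is illustrative:

        from sqlalchemy.types import UserDefinedType

        class MyOwnedType(UserDefinedType):
            cache_ok = True

            def get_col_spec(self, **kw):
                # "type_expression" is passed because **kw is accepted; it
                # refers to the owning Column (or cast()) being compiled
                type_expression = kw.get("type_expression", None)
                return "MYTYPE"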

    The UserDefinedType.cache_ok class-level flag indicates if this custom UserDefinedType is safe to be used as part of a cache key. This flag defaults to None which will initially generate a warning when the SQL compiler attempts to generate a cache key for a statement that uses this type. If the UserDefinedType is not guaranteed to produce the same bind/result behavior and SQL generation every time, this flag should be set to False; otherwise if the class produces the same behavior each time, it may be set to True. See UserDefinedType.cache_ok for further notes on how this works.

    New in version 1.4.28: Generalized the ExternalType.cache_ok flag so that it is available for both TypeDecorator as well as UserDefinedType.

    Members

    cache_ok, coerce_compared_value(), ensure_kwarg

    Class signature

    class sqlalchemy.types.UserDefinedType (sqlalchemy.types.ExternalType, sqlalchemy.types.TypeEngineMixin, sqlalchemy.types.TypeEngine, sqlalchemy.util.langhelpers.EnsureKWArg)

    • attribute cache_ok: Optional[bool] = None

      inherited from the ExternalType.cache_ok attribute of ExternalType

      Indicate if statements using this ExternalType are “safe to cache”.

      The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable statements using this type from being cached at all without a warning. When set to True, the object’s class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:

          class MyType(TypeDecorator):
              impl = String

              cache_ok = True

              def __init__(self, choices):
                  self.choices = tuple(choices)
                  self.internal_only = True

      The cache key for the above type would be equivalent to:

          >>> MyType(["a", "b", "c"])._static_cache_key
          (<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))

      The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the “choices” attribute becomes part of the cache key but “internal_only” does not, because there is no parameter named “internal_only”.

      The requirements for cacheable elements are that they are hashable and that they indicate the same SQL rendered for expressions using this type every time for a given cache value.

      To accommodate for datatypes that refer to unhashable structures such as dictionaries, sets and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:

          class LookupType(UserDefinedType):
              '''a custom type that accepts a dictionary as a parameter.

              this is the non-cacheable version, as "self.lookup" is not
              hashable.
              '''

              def __init__(self, lookup):
                  self.lookup = lookup

              def get_col_spec(self, **kw):
                  return "VARCHAR(255)"

              def bind_processor(self, dialect):
                  # ... works with "self.lookup" ...

      Where “lookup” is a dictionary. The type will not be able to generate a cache key:

          >>> type_ = LookupType({"a": 10, "b": 20})
          >>> type_._static_cache_key
          <stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
          produce a cache key because the ``cache_ok`` flag is not set to True.
          Set this flag to True if this type object's state is safe to use
          in a cache key, or False to disable this warning.
          symbol('no_cache')

      If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:

          >>> # set cache_ok = True
          >>> type_.cache_ok = True

          >>> # this is the cache key it would generate
          >>> key = type_._static_cache_key
          >>> key
          (<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))

          >>> # however this key is not hashable, will fail when used with
          >>> # SQLAlchemy statement cache
          >>> some_cache = {key: "some sql value"}
          Traceback (most recent call last):
            File "<stdin>", line 1, in <module>
          TypeError: unhashable type: 'dict'

      The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:

          class LookupType(UserDefinedType):
              '''a custom type that accepts a dictionary as a parameter.

              The dictionary is stored both as itself in a private variable,
              and published in a public variable as a sorted tuple of tuples,
              which is hashable and will also return the same value for any
              two equivalent dictionaries.  Note it assumes the keys and
              values of the dictionary are themselves hashable.
              '''

              cache_ok = True

              def __init__(self, lookup):
                  self._lookup = lookup

                  # assume keys/values of "lookup" are hashable; otherwise
                  # they would also need to be converted in some way here
                  self.lookup = tuple((key, lookup[key]) for key in sorted(lookup))

              def get_col_spec(self, **kw):
                  return "VARCHAR(255)"

              def bind_processor(self, dialect):
                  # ... works with "self._lookup" ...

      Where above, the cache key for LookupType({"a": 10, "b": 20}) will be:

          >>> LookupType({"a": 10, "b": 20})._static_cache_key
          (<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))

      New in version 1.4.14: - added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.

      New in version 1.4.28: - added the ExternalType mixin which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.

      See also

      SQL Compilation Caching

    • method coerce_compared_value(op: Optional[OperatorType], value: Any) → TypeEngine[Any]

      Suggest a type for a ‘coerced’ Python value in an expression.

      Default behavior for UserDefinedType is the same as that of TypeDecorator; by default it returns self, assuming the compared value should be coerced into the same type as this one. See TypeDecorator.coerce_compared_value() for more detail.

    • attribute sqlalchemy.types.UserDefinedType.ensure_kwarg: str = ‘get_col_spec’

      A regular expression that indicates method names for which the method should accept **kw arguments.

      The class will scan for methods matching the name template and decorate them if necessary to ensure **kw parameters are accepted.

    Working with Custom Types and Reflection

    It is important to note that database types which are modified to have additional in-Python behaviors, including types based on TypeDecorator as well as other user-defined subclasses of datatypes, do not have any representation within a database schema. When using the database introspection features described at Reflecting Database Objects, SQLAlchemy makes use of a fixed mapping which links the datatype information reported by a database server to a SQLAlchemy datatype object. For example, if we look inside of a PostgreSQL schema at the definition for a particular database column, we might receive back the string "VARCHAR". SQLAlchemy’s PostgreSQL dialect has a hardcoded mapping which links the string name "VARCHAR" to the SQLAlchemy VARCHAR class, and that’s how when we emit a statement like Table('my_table', m, autoload_with=engine), the Column object within it would have an instance of VARCHAR present inside of it.

    The implication of this is that if a Table object makes use of type objects that don’t correspond directly to the database-native type name, and we create a new Table object against a new MetaData collection for this database table elsewhere using reflection, it will not have this datatype. For example:

        >>> from sqlalchemy import (
        ...     Table,
        ...     Column,
        ...     MetaData,
        ...     create_engine,
        ...     PickleType,
        ...     Integer,
        ... )
        >>> metadata = MetaData()
        >>> my_table = Table(
        ...     "my_table", metadata, Column("id", Integer), Column("data", PickleType)
        ... )
        >>> engine = create_engine("sqlite://", echo="debug")
        >>> my_table.create(engine)
        INFO sqlalchemy.engine.base.Engine
        CREATE TABLE my_table (
            id INTEGER,
            data BLOB
        )

    Above, we made use of PickleType, which is a TypeDecorator that works on top of the LargeBinary datatype, which on SQLite corresponds to the database type BLOB. In the CREATE TABLE, we see that the BLOB datatype is used. The SQLite database knows nothing about the PickleType we’ve used.

    If we look at the datatype of my_table.c.data.type, as this is a Python object that was created by us directly, it is PickleType:

        >>> my_table.c.data.type
        PickleType()

    However, if we create another instance of Table using reflection, the use of PickleType is not represented in the SQLite database we’ve created; we instead get back BLOB:

        >>> metadata_two = MetaData()
        >>> my_reflected_table = Table("my_table", metadata_two, autoload_with=engine)
        INFO sqlalchemy.engine.base.Engine PRAGMA main.table_info("my_table")
        INFO sqlalchemy.engine.base.Engine ()
        DEBUG sqlalchemy.engine.base.Engine Col ('cid', 'name', 'type', 'notnull', 'dflt_value', 'pk')
        DEBUG sqlalchemy.engine.base.Engine Row (0, 'id', 'INTEGER', 0, None, 0)
        DEBUG sqlalchemy.engine.base.Engine Row (1, 'data', 'BLOB', 0, None, 0)
        >>> my_reflected_table.c.data.type
        BLOB()

    Typically, when an application defines explicit Table metadata with custom types, there is no need to use table reflection because the necessary metadata is already present. However, for the case where an application, or a combination of applications, needs to make use of both explicit Table metadata which includes custom, Python-level datatypes, as well as Table objects which set up their Column objects as reflected from the database, and which nevertheless still need to exhibit the additional Python behaviors of the custom datatypes, additional steps must be taken to allow this.

    The most straightforward is to override specific columns as described at Overriding Reflected Columns. In this technique, we simply use reflection in combination with explicit Column objects for those columns for which we want to use a custom or decorated datatype:

        >>> metadata_three = MetaData()
        >>> my_reflected_table = Table(
        ...     "my_table",
        ...     metadata_three,
        ...     Column("data", PickleType),
        ...     autoload_with=engine,
        ... )

    The my_reflected_table object above is reflected, and will load the definition of the “id” column from the SQLite database. But for the “data” column, we’ve overridden the reflected Column object with an explicit definition that includes our desired in-Python datatype, the PickleType. The reflection process will leave this Column object intact:

        >>> my_reflected_table.c.data.type
        PickleType()

    A more elaborate way to convert from database-native type objects to custom datatypes is to use the DDLEvents.column_reflect() event handler. If for example we knew that we wanted all BLOB datatypes to in fact be PickleType, we could set up a rule across the board:

        from sqlalchemy import BLOB
        from sqlalchemy import event
        from sqlalchemy import PickleType
        from sqlalchemy import Table

        @event.listens_for(Table, "column_reflect")
        def _setup_pickletype(inspector, table, column_info):
            # swap the reflected BLOB for the in-Python PickleType
            if isinstance(column_info["type"], BLOB):
                column_info["type"] = PickleType()

    When the above code is invoked before any table reflection occurs (note also it should be invoked only once in the application, as it is a global rule), upon reflecting any Table that includes a column with a BLOB datatype, the resulting datatype will be stored in the Column object as PickleType.