TicDatFactory

TicDatFactory(self, **init_fields)

Primary class for the ticdat library. This class is constructed with a schema. It can be used to generate TicDat objects, to write TicDat objects to different file types, or to perform bulk query operations to diagnose common data integrity failures.

Analytical code that uses TicDat objects can be used, without change, on different data sources, thus facilitating the "separate model from data" design goal.

:param init_fields: a mapping of tables to primary key fields and data fields. Each field listing consists of two sub-lists: first the primary key fields, then the data fields.

ex: TicDatFactory (categories = [["name"],["minNutrition", "maxNutrition"]], foods = [["name"],["cost"]], nutritionQuantities = [["food", "category"],["qty"]])

Use '*' instead of a pair of lists for generic tables, which will render as pandas.DataFrame objects.

ex: TicDatFactory (typical_table = [["primary key field"],["data field"]], generic_table = '*')

add_implied_foreign_keys

TicDatFactory.add_implied_foreign_keys()

Cascades foreign keys downward (i.e. computes implied foreign key relationships). Calling this routine won't change the boolean result of subsequent calls to find_foreign_key_failures, but it might change the size of the dictionary that find_foreign_key_failures returns.

:return:

add_parameter

TicDatFactory.add_parameter(name,
                            default_value,
                            number_allowed=True,
                            inclusive_min=True,
                            inclusive_max=False,
                            min=0,
                            max=inf,
                            must_be_int=False,
                            strings_allowed=(),
                            nullable=False,
                            datetime=False,
                            enforce_type_rules=True)

Add (or reset) a parameters option. Requires that a parameters table with one primary key field and one data field already be present. The legal parameters options will be enforced as part of find_data_row_failures. Note that if you are using this function, then you would typically read from the parameters table indirectly, by using the dictionary returned by create_full_parameters_dict.

:param name: name of the parameter to add or reset

:param default_value: default value for the parameter (used for create_full_parameters_dict)

:param number_allowed: boolean. Does this parameter allow numbers?

:param inclusive_min: if number allowed, is the min inclusive?

:param inclusive_max: if number allowed, is the max inclusive?

:param min: if number allowed, the minimum value

:param max: if number allowed, the maximum value

:param must_be_int: boolean : if number allowed, must the number be integral?

:param strings_allowed: if a collection - then a list of the strings allowed. The empty collection prohibits strings. If a "*", then any string is accepted.

:param nullable: boolean : can this parameter be set to null (aka None)

:param datetime: If truthy, then number_allowed through strings_allowed are ignored, and the data must be either a datetime.datetime object or a string that can be parsed into a datetime.datetime object.

:param enforce_type_rules: boolean. If False, ignore all of number_allowed through nullable, and only enforce the parameter names and default values.

:return:

add_data_row_predicate

TicDatFactory.add_data_row_predicate(
  table,
  predicate,
  predicate_name=None,
  predicate_kwargs_maker=None,
  predicate_failure_response='Boolean')

The purpose of calling add_data_row_predicate is to prepare for a future call to find_data_row_failures. See https://bit.ly/3e9pdCP for more details on these two functions.

Adds a data row predicate for a table. Row predicates can be used to check for sophisticated data integrity problems of the sort that can't be easily handled with a data type rule. For example, a min_supply column can be verified to be no larger than a max_supply column.

:param table: table in the schema

:param predicate: A one argument function that accepts a table row as an argument and returns Truthy if the row is valid and Falsey otherwise. (See below, there are other arguments that can refine how predicate works). The row argument passed to predicate will be a dict that maps field name to data value for all fields (both primary key and data field) in the table. Note - if None is passed as a predicate, then any previously added predicate matching (table, predicate_name) will be removed.

:param predicate_name: The name of the predicate. If omitted, the smallest non-colliding number will be used.

:param predicate_kwargs_maker: A function used to support predicate if predicate accepts more than just the row argument. This function accepts a single dat argument and is called exactly once per find_data_row_failures call. If predicate_kwargs_maker returns a dict, then this dict is unpacked for each call to predicate. An error (or a bulk row failure) results if predicate_kwargs_maker fails to return a dict.

:param predicate_failure_response: Either "Boolean" or "Error Message". If the latter then predicate indicates a clean row by returning True (the one and only literal True in Python) and a dirty row by returning a non-empty string (which is an error message).

See find_data_row_failures for details on handling exceptions thrown by predicate or predicate_kwargs_maker.

:return:

add_foreign_key

TicDatFactory.add_foreign_key(native_table, foreign_table, mappings)

Adds a foreign key relationship to the schema. Adding a foreign key doesn't block the entry of child records that fail to find a parent match. It does make it easy to recognize such records (with find_foreign_key_failures()) and to remove such records (with remove_foreign_key_failures()).

:param native_table: (aka child table). The table with fields that must match some other table.

:param foreign_table: (aka parent table). The table providing the matching entries.

:param mappings: For simple foreign keys, a [native_field, foreign_field] pair. For compound foreign keys an iterable of [native_field, foreign_field] pairs.

:return:

as_dict

TicDatFactory.as_dict(ticdat)

Returns the ticdat object as a dictionary. Note that, as a nested class, TicDat objects cannot be pickled directly. Instead, the dictionary returned by this function can be pickled. For unpickling, first unpickle the pickled dictionary, and then pass it, unpacked, to the TicDat constructor.

(Note that if you want to pickle a TicDatFactory, you can use a similar approach with schema)

:param ticdat: a TicDat object whose data is to be returned as a dict

:return: A dictionary that can either be pickled, or unpacked to a TicDat constructor

clear_data_type

TicDatFactory.clear_data_type(table, field)

clears the data type for a field. By default, fields don't have types. Adding a data type doesn't block data of the wrong type from being entered. Data types are useful for recognizing errant data entries. If no data type is specified (the default) then no errant data will be recognized.

:param table: table in the schema

:param field:

:return:

clear_foreign_keys

TicDatFactory.clear_foreign_keys(native_table=None)

Clears foreign key relationships from the schema.

:param native_table: optional. The table whose foreign keys should be cleared. If omitted, all foreign keys are cleared.

clone

TicDatFactory.clone(table_restrictions=None, clone_factory=None)

clones the TicDatFactory

:param table_restrictions: if None, then the argument is ignored. Otherwise, a container listing the tables to keep in the clone. Tables outside table_restrictions are removed from the clone.

:param clone_factory: optional. Defaults to TicDatFactory. Can also be PanDatFactory, or a function, in which case it should behave similarly to create_from_full_schema. If clone_factory=PanDatFactory, the row predicates that use predicate_kwargs_maker won't be copied over.

:return: a clone of the TicDatFactory. The returned object will be based on clone_factory, if provided.

Note - If you want to remove tables via a clone, then call it like this:

tdf_new = tdf.clone(table_restrictions=set(tdf.all_tables).difference(tables_to_remove))

Other schema editing operations are available with clone_add_a_table, clone_add_a_column, clone_remove_a_column and clone_rename_a_column.

clone_add_a_column

TicDatFactory.clone_add_a_column(table,
                                 field,
                                 field_type,
                                 field_position='append')

add a column to the TicDatFactory

:param table: table in the schema

:param field: name of the new field to be added

:param field_type: either "primary key" or "data"

:param field_position: integer between 0 and the length of self.primary_key_fields[table] (if "primary key") or self.data_fields[table] (if "data"), inclusive. Alternately, can be "append", which will just insert the column at the end of the appropriate list.

:return: a clone of the TicDatFactory, with field inserted into location field_position for field_type

clone_add_a_table

TicDatFactory.clone_add_a_table(table, pk_fields, df_fields)

add a table to the TicDatFactory

:param table: table not in the schema

:param pk_fields: container of the primary key fields

:param df_fields: container of the data fields

:return: a clone of the TicDatFactory, with the new table added

clone_remove_a_column

TicDatFactory.clone_remove_a_column(table, field)

remove a column from the TicDatFactory

:param table: table in the schema

:param field: name of the field to be removed

:return: a clone of the TicDatFactory, with field removed

clone_rename_a_column

TicDatFactory.clone_rename_a_column(table, field, new_field)

rename a column in the TicDatFactory

:param table: table in the schema

:param field: name of the field to be removed

:param new_field: new name for the field

:return: a clone of the TicDatFactory, with field renamed to new_field. Data types, default values, foreign keys and tooltips will reflect the new field name, but row predicates will be copied over as-is (and thus you will need to re-create them as needed).

copy_from_ampl_variables

TicDatFactory.copy_from_ampl_variables(ampl_variables)

copies the solution results from ampl_variables into a new ticdat object

:param ampl_variables: a dict mapping from (table_name, field_name) -> amplpy.variable.Variable (amplpy.variable.Variable is the type of object returned by AMPL.getVariable). table_name should refer to a table in the schema that has primary key fields. field_name can refer to a data field for table_name, or it can be falsey. If the latter, then AMPL variables that pass the filter (see below) will simply populate the primary key of table_name. Note that by default, only non-zero data is copied over. If you want to override this filter, then instead of mapping to amplpy.variable.Variable you should map to a (amplpy.variable.Variable, filter) pair, where filter accepts a data value and returns a boolean.

:return: a deep copy of the ampl_variables into a ticdat object

copy_tic_dat

TicDatFactory.copy_tic_dat(tic_dat, freeze_it=False)

copies the tic_dat object into a new tic_dat object. Performs a deep copy.

:param tic_dat: a ticdat object

:param freeze_it: boolean. should the returned object be frozen?

:return: a deep copy of the tic_dat argument

copy_to_ampl

TicDatFactory.copy_to_ampl(tic_dat,
                           field_renamings=None,
                           excluded_tables=None)

copies the tic_dat object into a new tic_dat object populated with amplpy.DataFrame objects. Performs a deep copy.

:param tic_dat: a ticdat object

:param field_renamings: dict or None. If fields are to be renamed in the copy, then a mapping from (table_name, field_name) -> new_field_name. If a data field is to be omitted, then new_field_name can be falsey. table_name cannot refer to an excluded table (see below).

:param excluded_tables: If truthy, a list of tables to be excluded from the copy. Tables without primary key fields are always excluded.

:return: a deep copy of the tic_dat argument into amplpy.DataFrames

copy_to_pandas

TicDatFactory.copy_to_pandas(tic_dat,
                             table_restrictions=None,
                             drop_pk_columns=None,
                             reset_index=False)

copies the tic_dat object into a new object populated with pandas.DataFrame objects. Performs a deep copy.

:param tic_dat: a ticdat object

:param table_restrictions: If truthy, a list of tables to turn into data frames. Defaults to all tables.

:param drop_pk_columns: boolean or None. Should the primary key columns be dropped from the data frames after they have been incorporated into the index? If None, then pk fields will be dropped only for tables with data fields.

:param reset_index: boolean. If True, then drop_pk_columns is ignored and the returned DataFrames have a simple integer index with both primary key and data fields as columns.

:return: a deep copy of the tic_dat argument into DataFrames

To get a valid PanDat object, either set drop_pk_columns to False or set reset_index to True. I.e.

    copy_1 = tdf.copy_to_pandas(dat, drop_pk_columns=False)
    copy_2 = tdf.copy_to_pandas(dat, reset_index=True)
    assert all(PanDatFactory(**tdf.schema()).good_pan_dat_object(_) for _ in [copy_1, copy_2])

    Note that None will be converted to nan in the returned object (as is the norm for pandas.DataFrame)

create_from_full_schema

TicDatFactory.create_from_full_schema(full_schema)

create a TicDatFactory complete with default values, data types, and foreign keys

:param full_schema: a dictionary consistent with the data returned by a call to schema() with include_ancillary_info = True

:return: a TicDatFactory reflecting the tables, fields, default values, data types, and foreign keys consistent with the full_schema argument

create_full_parameters_dict

TicDatFactory.create_full_parameters_dict(dat)

create a fully populated dictionary of all the parameters

:param dat: a TicDat object that has a parameters table

:return: a dictionary that maps each parameter option to the actual dat.parameters value. If the specific option isn't part of dat.parameters, then the default value is used. Note that for datetime parameters, the default will be coerced into a datetime object, if possible.

enable_foreign_key_links

TicDatFactory.enable_foreign_key_links()

call to enable foreign key links. For example, a TicDat object made from a factory with foreign key links enabled will pass the following assert:

    assert (dat.foods["chicken"].nutritionQuantities["protein"] is
            dat.categories["protein"].nutritionQuantities["chicken"] is
            dat.nutritionQuantities["chicken", "protein"])

Note that by default, TicDatFactories don't create foreign key links, since doing so can slow down TicDat creation.

:return:

find_data_row_failures

TicDatFactory.find_data_row_failures(tic_dat,
                                     exception_handling='__debug__',
                                     max_failures=inf)

Finds the data row failures for a ticdat object

:param tic_dat: ticdat object

:param exception_handling: One of "Handled as Failure", "Unhandled" or "__debug__".

--> "Handled as Failure": Any exception generated by calling a row predicate function will indicate a data failure for that row. (Similarly, predicate_kwargs_maker exceptions create an entry in the returned failure dictionary.)

--> "Unhandled": Exceptions resulting from calling a row predicate (or a predicate_kwargs_maker) will not be handled by find_data_row_failures.

--> "__debug__": Since "Handled as Failure" makes more sense for production runs and "Unhandled" makes more sense for debugging, this option will use the latter if __debug__ is True and the former otherwise. See -O and __debug__ in the Python documentation for more details.

:param max_failures: number. An upper limit on the number of failures to find. Will short circuit and return ASAP with a partial failure enumeration when this number is reached.

:return: A dictionary constructed as follows:

The keys are namedtuples with members "table", "predicate_name".

The values of the returned dictionary are tuples indicating which rows failed the predicate test. For tables with a primary key this tuple will contain the primary key value of each failed row. Otherwise, this tuple will list the positions of the failed rows.

If the predicate_failure_response for the predicate is "Error Message" (instead of "Boolean") then the values of the returned dict will themselves be namedtuples with members "primary_key" and "error_message".

If a predicate_kwargs_maker is provided and it fails (either by failing to return a dictionary or by throwing a handled exception) then a similar namedtuple is entered as the value, with primary_key='*' and error_message as a string.

find_data_type_failures

TicDatFactory.find_data_type_failures(tic_dat, max_failures=inf)

Finds the data type failures for a ticdat object

:param tic_dat: ticdat object

:param max_failures: number. An upper limit on the number of failures to find. Will short circuit and return ASAP with a partial failure enumeration when this number is reached.

:return: A dictionary constructed as follows:

The keys are namedtuples with members "table", "field". Each (table,field) pair has data values that are inconsistent with its data type. (table, field) pairs with no data type at all are never part of the returned dictionary.

The values of the returned dictionary are namedtuples with the following attributes.

--> bad_values - the distinct values for the (table, field) pair that are inconsistent with the data type for (table, field).

--> pks - the distinct primary key entries of the table containing the bad_values data. (will be row index for tables with no primary key)

That is to say, bad_values tells you which values in field are failing the data type check, and pks tells you which table rows will have their field entry changed if you call replace_data_type_failures().

Note that for primary key fields (but not data fields) with no explicit data type, a temporary filter that excludes only Null will be applied. If you want primary key fields to allow Null, you must explicitly opt-in by calling set_data_type appropriately. See issue https://github.com/ticdat/ticdat/issues/46 for more info.

find_foreign_key_failures

TicDatFactory.find_foreign_key_failures(tic_dat,
                                        verbosity='High',
                                        max_failures=inf)

Finds the foreign key failures for a ticdat object

:param tic_dat: ticdat object

:param max_failures: number. An upper limit on the number of failures to find. Will short circuit and return ASAP with a partial failure enumeration when this number is reached.

:param verbosity: either "High" or "Low"

:return: A dictionary constructed as follows (for verbosity = 'High'):

The keys are namedtuples with members "native_table", "foreign_table", "mapping", "cardinality".

The key data matches the arguments to add_foreign_key that constructed the foreign key (with "cardinality" being deduced from the overall schema).

The values are namedtuples with the following members.

--> native_values - the values of the native fields that failed to match

--> native_pks - the primary key entries of the native table rows corresponding to the native_values.

That is to say, native_values tells you which values in the native table can't find a foreign key match, and thus generate a foreign key failure. native_pks tells you which native table rows will be removed if you call remove_foreign_key_failures().

For verbosity = 'Low' a simpler return object is created that doesn't use namedtuples and omits the foreign key cardinality.

freeze_me

TicDatFactory.freeze_me(tic_dat)

Freezes a ticdat object

:param tic_dat: ticdat object

:return: tic_dat, after it has been frozen

get_row_predicates

TicDatFactory.get_row_predicates(table)

return all the row predicates for a given table

:param table: a table in the schema

:return: a dictionary mapping predicate_name to RowPredicateInfo named tuple (the entries of which are based on the prior call to add_data_row_predicate).

good_tic_dat_object

TicDatFactory.good_tic_dat_object(
  data_obj,
  bad_message_handler=<lambda>,
  row_checking='strict')

determines if an object can be converted to a TicDat data object.

:param data_obj: the object to verify

:param bad_message_handler: a call back function to receive description of any failure message

:param row_checking: either "generous" or "strict". If the latter, then we expect all the rows to be dicts with the correct columns (except for things like generic tables). Defaults to "strict" since this is the protector for the solve functions.

:return: True if the data_obj can be converted to a TicDat data object. False otherwise.

good_tic_dat_table

TicDatFactory.good_tic_dat_table(
  data_table,
  table_name,
  bad_message_handler=<lambda>,
  row_checking='generous')

determines if an object can be converted to a TicDat data table.

:param data_table: the object to verify

:param table_name: the name of the table

:param bad_message_handler: a call back function to receive description of any failure message

:param row_checking: either "generous" or "strict". If the latter, then we expect all the rows to be dicts with the correct columns (except for things like generic tables). Defaults to "generous" since this gets used a lot internally.

:return: True if the data_table can be converted to a TicDat data table. False otherwise.

obfusimplify

TicDatFactory.obfusimplify(tic_dat,
                           table_prepends={},
                           skip_tables=(),
                           freeze_it=False)

copies the tic_dat object into a new, obfuscated, simplified tic_dat object

:param tic_dat: a ticdat object

:param table_prepends: a dictionary mapping each table to the prepend it should apply when its entries are renamed. A valid table prepend must be all caps and must not end with I. Should be restricted to entity tables (tables with a single primary key field that is not a foreign key child).

:param skip_tables: a listing of entity tables whose single field primary key shouldn't be renamed

:param freeze_it: boolean. should the returned copy be frozen?

:return: A named tuple with the following components.

copy : a deep copy of the tic_dat argument, with the single field primary key values renamed to simple "short capital letters followed by numbers" strings.

renamings : a dictionary matching the new entries to their original (table, primary key value). This entry can be used to cross-reference any diagnostic information gleaned from the obfusimplified copy to the original names. For example, "P5 has no production" can easily be recognized as "Product KX12212 has no production".

remove_foreign_key_failures

TicDatFactory.remove_foreign_key_failures(tic_dat, propagate=True)

Removes foreign key failures (i.e. child records with no parent table record)

:param tic_dat: ticdat object

:param propagate: boolean. Remove cascading failures? (If removing the child record results in new failures, should those be removed as well?)

:return: tic_dat, with the foreign key failures removed

remove_parameter

TicDatFactory.remove_parameter(name)

Undo a previous call to add_parameter.

:param name: name of the parameter to remove

:return:

replace_data_type_failures

TicDatFactory.replace_data_type_failures(tic_dat, replacement_values={})

Replace the data cells with data type failures with the default value for the appropriate field.

:param tic_dat: a TicDat object appropriate for this schema

:param replacement_values: a dictionary mapping (table, field) to replacement value. The default value will be used for (table, field) pairs not in replacement_values.

:return: the tic_dat object with replacements made. The tic_dat object itself will be edited in place.

Replaces any of the data failures found in find_data_type_failures() with the appropriate replacement_value.

Note - won't perform primary key replacements.

schema

TicDatFactory.schema(include_ancillary_info=False)

:param include_ancillary_info: if True, include all the foreign key, default, and data type information as well. Otherwise, just return table-fields dictionary

:return: a dictionary with table name mapping to a list of lists defining primary key fields and data fields. If include_ancillary_info is True, this table-fields dictionary is just one entry in a more comprehensive dictionary.

set_ampl_data

TicDatFactory.set_ampl_data(tic_dat, ampl, table_to_set_name=None)

performs bulk setData on the AMPL first argument.

:param tic_dat: an AmplTicDat object created by calling copy_to_ampl

:param ampl: an amplpy.AMPL object

:param table_to_set_name: a mapping of table_name to AMPL set name

:return:

set_data_type

TicDatFactory.set_data_type(table,
                            field,
                            number_allowed=True,
                            inclusive_min=True,
                            inclusive_max=False,
                            min=0,
                            max=inf,
                            must_be_int=False,
                            strings_allowed=(),
                            nullable=False,
                            datetime=False)

sets the data type for a field. By default, fields don't have types. Adding a data type doesn't block data of the wrong type from being entered. Data types are useful for recognizing errant data entries with find_data_type_failures(). Errant data entries can be replaced with replace_data_type_failures().

:param table: a table in the schema

:param field: a field for this table

:param number_allowed: boolean. Does this field allow numbers?

:param inclusive_min: boolean : if number allowed, is the min inclusive?

:param inclusive_max: boolean : if number allowed, is the max inclusive?

:param min: if number allowed, the minimum value

:param max: if number allowed, the maximum value

:param must_be_int: boolean : if number allowed, must the number be integral?

:param strings_allowed: if a collection - then a list of the strings allowed. The empty collection prohibits strings. If a "*", then any string is accepted.

:param nullable: boolean : can this value contain null (aka None)?

:param datetime: If truthy, then number_allowed through strings_allowed are ignored, and the data must be either a datetime.datetime object or a string that can be parsed into a datetime.datetime object. Note that the various readers will try to coerce strings into datetime.datetime objects on read for fields with datetime data types. pandas.Timestamp is itself a datetime.datetime, and the bias will be to create such an object.

:return:

set_default_value

TicDatFactory.set_default_value(table, field, default_value)

sets the default value for a specific field

:param table: a table in the schema

:param field: a field in the table

:param default_value: the default value to apply

:return:

set_default_values

TicDatFactory.set_default_values(**tableDefaults)

sets the default values for the fields

:param tableDefaults: A dictionary of named arguments. Each argument name (i.e. each key) should be a table name. Each value should itself be a dictionary mapping data field names to default values.

Ex: tdf.set_default_values(categories = {"minNutrition":0, "maxNutrition":float("inf")}, foods = {"cost":0}, nutritionQuantities = {"qty":0})

:return:

set_duplicates_ticdat_init

TicDatFactory.set_duplicates_ticdat_init(value)

Set the duplicates_ticdat_init for the TicDatFactory. Choices are:

--> 'assert' : an assert is raised if duplicate rows are passed to TicDat.__init__

--> 'warn' : emit a warning if duplicate rows are passed to TicDat.__init__

--> 'ignore' : don't do anything if duplicate rows are passed to TicDat.__init__

:param value: either 'assert', 'warn' or 'ignore'

:return:

set_generator_tables

TicDatFactory.set_generator_tables(g)

sets which tables are to be generator tables. Generator tables are represented as generators pulled from the actual data store. This prevents them from being fully loaded into memory. Generator tables are only appropriate for truly massive data tables with no primary key.

:param g: An iterable of table names.

:return:

set_infinity_io_flag

TicDatFactory.set_infinity_io_flag(value)

Set the infinity_io_flag for the TicDatFactory. 'N/A' (the default) is recognized as a flag to disable infinity I/O buffering.

If numeric, when writing data to the file system (or a database), float("inf") will be replaced by the infinity_io_flag and float("-inf") will be replaced by -infinity_io_flag, prior to writing. Similarly, on read, any number >= the infinity_io_flag will be replaced with float("inf") and any number <= -infinity_io_flag will be replaced with float("-inf").

If None, then +/- infinity will be replaced by None prior to writing. Similarly, subsequent to reading, None will be replaced either by float("inf") or float("-inf"), depending on field data types. Note that None flagging will only perform replacements on fields whose data types allow infinity and not None.

For all cases, these replacements will be done on a temporary copy of the data that is created prior to writing.

Also note that none of these replacements will be done on the parameters table. The assumption is that the parameters table will be serialized to a string/string database table. Infinity can thus be represented by "inf"/"-inf" in such serializations. File readers will attempt to cast strings to floats on a row-by-row basis, as determined by add_parameter settings. File writers will cast parameters table entries to strings (assuming the add_parameter functionality is being used).

:param value: a valid infinity_io_flag

:return:

set_tooltip

TicDatFactory.set_tooltip(table, field, tooltip)

Set the tooltip for a table, or for a (table, field) pair.

:param table: a table in the schema

:param field: an empty string (if you want to set the tooltip for a table) or a field for this table

:param tooltip: an empty string (if you want to delete a previously set tooltip) or the tooltip you want to set

:return:

After calling this function, the tooltips property for this TicDatFactory will be appropriately adjusted.

set_xlsx_trailing_empty_rows

TicDatFactory.set_xlsx_trailing_empty_rows(value)

Set the xlsx_trailing_empty_rows for the TicDatFactory. Choices are:

--> 'prune' : (the default) when reading an xlsx/xlsm file, look for trailing all-None rows in each table, and prune them

--> 'ignore' : retain such rows

With the move to openpyxl for xlsx/xlsm file reading, it's more likely that Excel users will accidentally create trailing all-None rows.

:param value: either 'prune' or 'ignore'

:return:

freeze_me

freeze_me(x)

Freezes a ticdat object

:param x: ticdat object

:return: x, after it has been frozen