Is Type
Definition
Asserts that the data is of a specific type.
Field Scope
Single: The rule evaluates a single specified field.
Accepted Types
Type | Supported |
---|---|
String | ✓ |
General Properties
Name | Supported |
---|---|
Filter: Allows the targeting of specific data based on conditions | ✓ |
Coverage Customization: Allows adjusting the percentage of records that must meet the rule's conditions | ✓ |
The filter allows you to define a subset of data upon which the rule will operate.
It requires a valid Spark SQL expression that specifies which rows in the DataFrame the rule should evaluate. Because the expression is applied directly to the Spark DataFrame, traditional SQL constructs such as `WHERE` clauses are not supported.
Examples
Direct Conditions
Simply specify the condition you want to be met.
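For instance, a direct condition is just a bare boolean expression over the container's columns. The sketch below is illustrative, using the `LINEITEM` columns from the example later on this page:

```sql
-- Evaluate only rows whose order key is greater than zero
L_ORDERKEY > 0
```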
Combining Conditions
Combine multiple conditions using logical operators such as `AND` and `OR`.
Correct usage
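A hypothetical sketch of a combined filter, again using the sample `LINEITEM` columns:

```sql
-- Both conditions must hold for a row to be evaluated by the rule
L_ORDERKEY > 0 AND L_QUANTITY IS NOT NULL
```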
Incorrect usage
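Because the expression is applied directly to the DataFrame, prefixing it with a `WHERE` clause, as in this sketch, is not accepted:

```sql
-- Invalid: WHERE is a SQL clause, not part of a bare filter expression
WHERE L_ORDERKEY > 0 AND L_QUANTITY IS NOT NULL
```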
Utilizing Functions
Leverage Spark SQL functions to refine and enhance your conditions.
Correct usage
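For example, standard Spark SQL functions such as `TRIM` and `LENGTH` can be applied inside the expression; the column choice below is illustrative:

```sql
-- Evaluate only rows whose quantity field contains a non-blank string
LENGTH(TRIM(L_QUANTITY)) > 0
```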
Incorrect usage
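The same condition fails when written as a full SQL clause, for instance:

```sql
-- Invalid: WHERE is not supported in filter expressions
WHERE LENGTH(TRIM(L_QUANTITY)) > 0
```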
Using scan-time variables
To refer to the current DataFrame being analyzed, use the reserved dynamic variable `{{ _qualytics_self }}`.
Correct usage
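A hedged sketch of a subquery filter that references the current DataFrame through the reserved variable:

```sql
-- Restrict evaluation to order keys that also appear with a non-null quantity
L_ORDERKEY IN (SELECT L_ORDERKEY FROM {{ _qualytics_self }} WHERE L_QUANTITY IS NOT NULL)
```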
Incorrect usage
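Referencing another container or table directly in the subquery, as in the hypothetical `ORDERS` reference below, is not supported and results in an error:

```sql
-- Invalid: the subquery points at a different container instead of {{ _qualytics_self }}
L_ORDERKEY IN (SELECT O_ORDERKEY FROM ORDERS)
```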
While subqueries can be useful, their application within filters in our context has limitations. For example, directly referencing other containers or the broader target container in such subqueries is not supported. Attempting to do so will result in an error.
Important Note on `{{ _qualytics_self }}`
The `{{ _qualytics_self }}` keyword refers to the DataFrame currently under examination. During a full scan, this variable represents the entire target container. During an incremental scan, however, it reflects only a subset of the target container, capturing just the incremental data. In such scenarios, a filter that uses `{{ _qualytics_self }}` may therefore not encompass all entries from the target container.
Specific Properties
Specify the expected type for the data in the field.
Name | Description |
---|---|
Field Type | The type that values in the selected field should conform to. |
Anomaly Types
Type | Supported |
---|---|
Record: Flag inconsistencies at the row level | ✓ |
Shape: Flag inconsistencies in the overall patterns and distributions of a field | ✓ |
Example
Objective: Ensure that all L_QUANTITY entries in the LINEITEM table are of Integral type.
Sample Data
L_ORDERKEY | L_QUANTITY |
---|---|
1 | "10" |
2 | "15.5" |
3 | "Ten" |
{
"description": "Ensure that all L_QUANTITY entries in the LINEITEM table are of Integral type",
"coverage": 1,
"properties": {
"field_type":"Integral"
},
"tags": [],
"fields": ["L_QUANTITY"],
"additional_metadata": {"key 1": "value 1", "key 2": "value 2"},
"rule": "isType",
"container_id": {container_id},
"template_id": {template_id},
"filter": "1=1"
}
Anomaly Explanation
In the sample data above, the entries with `L_ORDERKEY` 2 and 3 do not satisfy the rule because their `L_QUANTITY` values are not of Integral type.
graph TD
A[Start] --> B[Retrieve L_QUANTITY]
B --> C{Is L_QUANTITY of Integral type?}
C -->|Yes| D[Move to Next Record/End]
C -->|No| E[Mark as Anomalous]
E --> D
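As a rough illustration only (not the product's internal type inference), a Spark SQL expression along these lines would flag `L_QUANTITY` strings that are not valid integers:

```sql
-- Flags "15.5" and "Ten"; passes "10"
NOT (L_QUANTITY RLIKE '^-?[0-9]+$')
```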
Potential Violation Messages
Record Anomaly
The `L_QUANTITY` value of `Ten` is not a valid Integral.
Shape Anomaly
In `L_QUANTITY`, 66.667% of 3 filtered records (2) are not a valid Integral.