The following sections describe all known or existing issues related to the Amorphic bulk load ingestion feature.

Oracle as source issues

1. Target datatype is created as Numeric/Double when source is Number:

Issue description: When the source datatype is Number without any precision or scale, the datatype created in the target Redshift table is Numeric/Double.

Explanation: As per the AWS documentation, when the precision and scale at the source are 0 (i.e. unspecified), a Real-equivalent datatype has to be used in the target. For more details, refer to the data types section of the AWS documentation –>
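If the ingestion is backed by AWS DMS table mappings, one possible workaround is to add an explicit change-data-type transformation rule that forces an unbounded Number column to a numeric type with a fixed precision and scale. The sketch below uses the standard DMS rule syntax; the schema, table, and column names (MYSCHEMA, MYTABLE, MYCOLUMN) and the chosen precision/scale are placeholders, not values from Amorphic:

```json
{
  "rule-type": "transformation",
  "rule-id": "1",
  "rule-name": "force-numeric-type",
  "rule-action": "change-data-type",
  "rule-target": "column",
  "object-locator": {
    "schema-name": "MYSCHEMA",
    "table-name": "MYTABLE",
    "column-name": "MYCOLUMN"
  },
  "data-type": {
    "type": "numeric",
    "precision": 38,
    "scale": 10
  }
}
```

Note that fixing the precision and scale this way will truncate any source values that do not fit the chosen scale, so the values should be picked based on the actual data.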

Generic issues

  • When a task is created with the target location as S3 and has a transformation rule to remove a column, the data in the CSV files created in S3 is distorted. This is caused by a bug on the AWS side and has nothing to do with Amorphic. AWS does not have an ETA for when this issue will be fixed; once AWS fixes it, the fix will automatically be reflected in Amorphic.
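For reference, the kind of rule that triggers this issue is a remove-column transformation in the AWS DMS table mappings. The sketch below shows the standard DMS syntax for such a rule; the schema, table, and column names are placeholders, not values from Amorphic:

```json
{
  "rule-type": "transformation",
  "rule-id": "2",
  "rule-name": "drop-unwanted-column",
  "rule-action": "remove-column",
  "rule-target": "column",
  "object-locator": {
    "schema-name": "MYSCHEMA",
    "table-name": "MYTABLE",
    "column-name": "UNWANTED_COLUMN"
  }
}
```

Until AWS resolves the bug, avoiding remove-column rules on tasks with an S3 target (and instead dropping the column after the data lands in S3) is a possible way to work around the distortion.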