Describe the bug
During the initial Cassandra read, Spark implicitly converts the varint column to decimal(38,0). If a value exceeds 38 digits, it is silently converted to null before the replicator's own conversion runs.
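For context, a minimal sketch of the read path that triggers this, assuming the Spark Cassandra Connector and hypothetical keyspace/table/column names (`ks.events` with a varint column `id`):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical session setup; the connection host is an assumption.
val spark = SparkSession.builder()
  .appName("varint-read-check")
  .config("spark.cassandra.connection.host", "127.0.0.1")
  .getOrCreate()

val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "ks", "table" -> "events"))
  .load()

// The varint column surfaces as DecimalType(38, 0), so any value wider
// than 38 digits overflows the decimal and comes back as null.
df.printSchema()
```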
To Reproduce
Steps to reproduce the behavior:
Attempt to load a table containing a long varint value, e.g. 1595542979496957281074989646537940572646 (40 digits). The value is loaded as null.
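A repro sketch under the same assumptions (local cluster, hypothetical `ks.events` table; the 40-digit value is the one from this report):

```scala
import com.datastax.spark.connector.cql.CassandraConnector

val connector = CassandraConnector(spark.sparkContext.getConf)
connector.withSessionDo { session =>
  session.execute(
    "CREATE KEYSPACE IF NOT EXISTS ks WITH replication = " +
      "{'class': 'SimpleStrategy', 'replication_factor': 1}")
  session.execute(
    "CREATE TABLE IF NOT EXISTS ks.events (pk int PRIMARY KEY, id varint)")
  // 40-digit varint: exceeds decimal(38,0) precision.
  session.execute(
    "INSERT INTO ks.events (pk, id) VALUES (1, 1595542979496957281074989646537940572646)")
}

spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "ks", "table" -> "events"))
  .load()
  .show(false) // id displays as null instead of the 40-digit value
```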
Expected behavior
The varint should be read in full and transferred to the target with the source value's length and content preserved.
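To illustrate where the precision is lost: the Java driver itself returns a varint as an arbitrary-precision BigInteger, so the truncation appears to happen in Spark's decimal(38,0) mapping, not at the Cassandra layer. A sketch, assuming driver 4.x via the connector and the same hypothetical table as above:

```scala
// Reading the same row directly through the driver preserves the full value.
connector.withSessionDo { session =>
  val row = session.execute("SELECT id FROM ks.events WHERE pk = 1").one()
  val full: java.math.BigInteger = row.getBigInteger("id")
  println(full) // prints all 40 digits; no overflow to null
}
```

Carrying the value as a BigInteger (or string) end to end would be one way to preserve the source length and content.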