https://github.com/apache/spark
Revision 19991047d5b5316412d8b1763807c5945a705bff authored by Gengliang Wang on 28 July 2022, 18:26:34 UTC, committed by Gengliang Wang on 28 July 2022, 18:26:34 UTC
### What changes were proposed in this pull request?

In Spark 3.3, the error message of ANSI CAST was improved. However, table insertion uses the same CAST expression:
```
> create table tiny(i tinyint);
> insert into tiny values (1000);

org.apache.spark.SparkArithmeticException[CAST_OVERFLOW]: The value 1000 of the type "INT" cannot be cast to "TINYINT" due to an overflow. Use `try_cast` to tolerate overflow and return NULL instead. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
```
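
The hint `If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error` at the end doesn't help at all: table insertion casts are governed by `spark.sql.storeAssignmentPolicy` (ANSI by default), not by `spark.sql.ansi.enabled`, so disabling ANSI mode does not bypass the error. A minimal sketch, assuming the default store assignment policy:
```
> set spark.sql.ansi.enabled=false;
> insert into tiny values (1000);
-- still fails with the same CAST_OVERFLOW error
```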

This PR fixes the error message. After the changes, the error message for this example becomes:
```
org.apache.spark.SparkArithmeticException: [CAST_OVERFLOW_IN_TABLE_INSERT] Fail to insert a value of "INT" type into the "TINYINT" type column `i` due to an overflow. Use `try_cast` on the input value to tolerate overflow and return NULL instead.
```
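
As the new message suggests, applying `try_cast` to the input value tolerates the overflow and stores NULL instead; a minimal sketch (session output assumed):
```
> insert into tiny select try_cast(1000 as tinyint);
> select * from tiny;
NULL
```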
### Why are the changes needed?

Show proper error messages for overflow errors during table insertion. The current message is confusing and suggests a workaround that does not apply.

### Does this PR introduce _any_ user-facing change?

Yes. After the changes, Spark shows proper error messages for overflow errors during table insertion.

### How was this patch tested?

Unit tests

Closes #37311 from gengliangwang/PR_TOOL_PICK_PR_37283_BRANCH-3.3.

Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
[SPARK-39865][SQL][3.3] Show proper error messages on the overflow errors of table insert