
ClickHouseDialect, wrong definition of timestamp type #463

Open
EbiousVi opened this issue Jul 23, 2024 · 0 comments

Environment

  • OS version: macOS 14
  • JDK version: 17
  • ClickHouse Server version: 23.10.6.60
  • Spark version: 3.5.0
  • Project dependencies
    • com.clickhouse:clickhouse-jdbc:jar:0.6.3:compile
    • com.github.housepower:clickhouse-integration-spark_2.12:jar:2.7.1:compile
    • com.github.housepower:clickhouse-spark-runtime-3.4_2.12:jar:0.7.3:compile

Steps to reproduce

  1. Create a table with columns of type DateTime('Europe/Moscow'):

     CREATE TABLE IF NOT EXISTS default.foo
     (
         application_id         String,
         event_datetime         DateTime('Europe/Moscow'),
         event_receive_datetime DateTime('Europe/Moscow')
     )
     ENGINE = Log

  2. Try to read the table with the Spark DataSource V2 API (a minimal sketch of the read path follows).
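
A minimal Scala sketch of the read path, assuming the connector is registered as a Spark catalog named "clickhouse". The catalog class and connection options below follow the clickhouse-spark-runtime 0.7.x documentation; the host, port, and credentials are placeholders, and the actual reproduction code is in the attached Java sample.

    import org.apache.spark.sql.SparkSession

    object ReadFooSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("clickhouse-datetime-repro")
          // Catalog implementation class as documented for clickhouse-spark-runtime 0.7.x
          .config("spark.sql.catalog.clickhouse", "xenon.clickhouse.ClickHouseCatalog")
          // Connection settings are placeholders for a local ClickHouse server
          .config("spark.sql.catalog.clickhouse.host", "127.0.0.1")
          .config("spark.sql.catalog.clickhouse.protocol", "http")
          .config("spark.sql.catalog.clickhouse.http_port", "8123")
          .config("spark.sql.catalog.clickhouse.user", "default")
          .config("spark.sql.catalog.clickhouse.password", "")
          .config("spark.sql.catalog.clickhouse.database", "default")
          .getOrCreate()

        // Reading default.foo through the V2 catalog; this is the call that fails
        // with TABLE OR VIEW NOT FOUND instead of returning rows.
        spark.table("clickhouse.default.foo").show()
      }
    }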

When the read is executed, Spark throws a TABLE OR VIEW NOT FOUND exception. While debugging, I realized that the case condition is not satisfied (link to the source code where it happens).

Yet the type does match the regular expression. I'm not an expert in Scala, so help me understand why this happens. Sample code is attached.
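
For context, a self-contained Scala sketch of one common reason a case pattern built from a regex can fall through even though the same regex finds the type name: in pattern position the regex must match the entire input string, while findFirstIn only needs a matching substring. The pattern below is a hypothetical stand-in, not the connector's actual regex.

    import scala.util.matching.Regex

    object DateTimeRegexDemo {
      // Hypothetical stand-in for the dialect's DateTime pattern, for illustration only
      val dateTimePattern: Regex = """DateTime(64)?(\((.*)\))?""".r

      def classify(typeName: String): String = typeName match {
        // In pattern position the regex must match the ENTIRE string
        case dateTimePattern(_, _, args) => s"case matched, args = $args"
        case other                       => s"case fell through: $other"
      }

      def main(args: Array[String]): Unit = {
        val plain   = "DateTime('Europe/Moscow')"
        val wrapped = "Nullable(DateTime('Europe/Moscow'))"

        // findFirstIn succeeds for both, because a substring match is enough
        println(dateTimePattern.findFirstIn(plain).isDefined)   // true
        println(dateTimePattern.findFirstIn(wrapped).isDefined) // true

        // Only the bare type name satisfies the case pattern; anything extra around it
        // (wrappers, whitespace, an unexpected argument format) makes it fall through
        println(classify(plain))   // case matched, args = 'Europe/Moscow'
        println(classify(wrapped)) // case fell through: Nullable(DateTime('Europe/Moscow'))
      }
    }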

java code.txt
pom.txt
