
Custom Data Types for DataFrame columns when using Spark JDBC

user335 | Published on April 25, 2018, 5:08 am

I know I can use a custom dialect to get a correct type mapping between my database and Spark, but how can I create a custom table schema with specific field data types and lengths when I use Spark's JDBC write options? I would like granular control over my table schemas when I write a table from Spark.
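One way to get this kind of control is Spark's `createTableColumnTypes` write option, which lets you supply database column types (as a DDL fragment) to use instead of the defaults when Spark creates the target table. The sketch below builds such a DDL string from a plain mapping and shows, in comments, how it would be passed to a JDBC write; the column names, types, and connection details are hypothetical placeholders, not values from the question.

```python
# Hypothetical mapping from column name to the desired database column type.
column_types = {
    "name": "VARCHAR(128)",
    "comments": "CLOB",
    "price": "DECIMAL(10,2)",
}

# createTableColumnTypes expects a comma-separated DDL fragment,
# e.g. "name VARCHAR(128), comments CLOB, price DECIMAL(10,2)".
ddl = ", ".join(f"{col} {typ}" for col, typ in column_types.items())
print(ddl)

# With a SparkSession and a DataFrame `df` in scope, the write would look
# roughly like this (url/table/properties are placeholders):
#
# df.write \
#   .option("createTableColumnTypes", ddl) \
#   .jdbc("jdbc:postgresql://host:5432/mydb", "my_table",
#         properties={"user": "...", "password": "..."})
```

Note that `createTableColumnTypes` only affects the `CREATE TABLE` statement Spark issues, so it applies when the table does not already exist (or is being overwritten); columns not listed fall back to the dialect's default mapping.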

