

Connector Improvement: CLOB Support

Answered


Comments

3 comments

  • Official comment

    Hi Melinda,

    Thanks for your post on our feature request portal!

    Replicating large objects (such as BLOBs and CLOBs) is a fairly common feature request, and we are hearing a variety of use cases from our customers on why they need this data in their warehouses. This is something we are investigating so that we can properly understand those use cases and build an optimal solution.

    I have a few questions about your request:

    1. What types of analytics are you hoping to perform with CLOBs? Will you be querying these binary objects directly, or performing post-load transformations to extract the parts of the data your queries need?
    2. What are the typical sizes of the CLOBs in your database? One important note: Oracle allows very large objects (several GB), whereas Snowflake typically limits objects to several MB. What behavior would you expect if an object in the source exceeds the maximum size allowed in the destination? (A rough size-survey sketch follows below.)
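
    For question 2, a rough way to gauge this on the source side is sketched below. It assumes the python-oracledb driver and a hypothetical documents table with a CLOB column named body; the 16 MB figure is only a stand-in for the destination's object size limit.

        # Rough sketch: survey CLOB lengths in the Oracle source and flag rows
        # that would exceed an assumed destination object size limit.
        # "documents", "body", and the connection details are placeholders.
        import oracledb

        DEST_MAX_CHARS = 16 * 1024 * 1024  # assumed destination limit, in characters

        conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb")
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, DBMS_LOB.GETLENGTH(body) AS clob_chars "
                "FROM documents ORDER BY clob_chars DESC"
            )
            for row_id, clob_chars in cur.fetchall():
                status = "exceeds limit" if clob_chars > DEST_MAX_CHARS else "ok"
                print(f"id={row_id} length={clob_chars} {status}")
        conn.close()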

    As we study use cases and hear from other customers on these two questions, we will build out a plan for CLOB support. We don't have a committed timeframe yet, but once we understand these use cases we expect to share a plan for this feature.

    Hi Kevin,

    To answer your questions: we have CLOB data in both Oracle and Salesforce. These are very long character values, not binary, and we will be parsing those CLOB values into data vault architectures of dimensions and facts. The data we have in CLOB fields is typically less than 250 MB.
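
    For context, if an individual CLOB ever exceeds the destination's limit, the kind of chunked extraction we would expect to fall back on looks roughly like the sketch below (python-oracledb assumed, with the same hypothetical documents table and body column; the chunk size is only illustrative).

        # Rough sketch: read one CLOB in fixed-size character chunks so each piece
        # fits under an assumed destination limit; names here are placeholders.
        import oracledb

        CHUNK_CHARS = 16 * 1024 * 1024  # illustrative chunk size, in characters

        conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb")
        with conn.cursor() as cur:
            cur.execute("SELECT id, body FROM documents")
            for doc_id, clob in cur:
                total = clob.size()   # CLOB length in characters
                offset = 1            # LOB offsets are 1-based
                part = 0
                while offset <= total:
                    chunk = clob.read(offset, CHUNK_CHARS)
                    # ...stage (doc_id, part, chunk) for loading and later parsing...
                    offset += CHUNK_CHARS
                    part += 1
        conn.close()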

    Does that help?

    Has Fivetran added CLOB support for Oracle databases? We have CLOBs we need to move to Snowflake.