Problem Summary
How do I use Local Data Processing to move data from tables to Azure Blob file storage as flat files?
Need clarification on using Local Data Processing:
- Does it need a separate license for this?
- Is it a must to set up a Hadoop client?
- Do the output files on Azure Blob FS come in all kinds of flat-file formats, or only CSV, JSON, and Avro?
Answer
The requirements to write to Azure Blob storage are here: https://www.hvr-software.com/docs/location-class-requirements/requirements-for-azure-blob-fs
A Hadoop client is required because Local Data Processing uses WebHDFS REST calls to deliver data into Blob Storage.
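For context, the sketch below shows roughly what a WebHDFS-style file write looks like at the REST level. This is only a generic illustration of the WebHDFS protocol, not Local Data Processing's internal code; the hostname, port, and paths are placeholders.

```python
import requests

# Placeholders: replace host, port, and path with your own HDFS-compatible endpoint.
NAMENODE = "http://namenode.example.com:9870"
HDFS_PATH = "/landing/orders.csv"

# Step 1: the namenode answers with a 307 redirect pointing at the node
# that will actually receive the file contents.
resp = requests.put(
    f"{NAMENODE}/webhdfs/v1{HDFS_PATH}",
    params={"op": "CREATE", "overwrite": "true"},
    allow_redirects=False,
)
upload_url = resp.headers["Location"]

# Step 2: send the file contents to the redirected location.
with open("orders.csv", "rb") as f:
    requests.put(upload_url, data=f, headers={"Content-Type": "application/octet-stream"})
```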
Whether you need a separate license depends on the kind of license you have. Please contact your Local Data Processing account manager for this.
Local Data Processing supports the file formats Parquet, JSON, Avro, CSV, and XML via the FileFormat action (https://www.hvr-software.com/docs/actions/fileformat).
If you need a different flat-file format, deliver in one of these formats and then transform the data using an AgentPlugin (https://www.hvr-software.com/docs/actions/agentplugin), as sketched below.
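As a rough sketch of that approach, the script below rewrites delivered CSV files as pipe-delimited flat files. The environment variables HVR_FILE_NAMES and HVR_LOC_DIRECTORY and the calling convention are assumptions here; check the AgentPlugin documentation for the exact interface your version exposes.

```python
import csv
import os

# Assumed interface (verify against the AgentPlugin docs): the plugin is invoked
# after files are delivered, with the file names in HVR_FILE_NAMES (colon-separated)
# and the delivery directory in HVR_LOC_DIRECTORY.
file_names = [n for n in os.environ.get("HVR_FILE_NAMES", "").split(":") if n]
target_dir = os.environ.get("HVR_LOC_DIRECTORY", ".")

for name in file_names:
    if not name.endswith(".csv"):
        continue
    src = os.path.join(target_dir, name)
    dst = src[:-4] + ".psv"
    # Re-emit each delivered CSV file as a pipe-delimited flat file.
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        csv.writer(fout, delimiter="|").writerows(csv.reader(fin))
```

For a Blob Storage target the plugin would typically need to read and write the files through the storage endpoint rather than a local directory, so treat this purely as an outline of the transform step.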