
Copy AWS S3 to Gen2 to Snowflake: In the previous post we migrated data from S3 to ADLS Gen2, so the feed files from S3 are now available in Azure Storage. To continue the use case, we will load that data from Gen2 into Snowflake. To start with:

  • We will reuse the existing Gen2 linked service, LinkedServiceGen2 (created in the last post), to read data from Azure.
  • Create a new linked service pointing to Snowflake (a JSON sketch follows this list).
Snowflake Linked Service
  • Create the pipeline with source and target as follows:
    • Source pointing to LinkedServiceGen2
Source LS
    • Target pointing to Snowflake
Target Snowflake
Pipeline Execution
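
For reference, a Snowflake linked service created through the ADF UI corresponds roughly to JSON like the sketch below. The name LinkedServiceSnowflake and all the <placeholder> values are illustrative assumptions, not values from this setup:

    {
        "name": "LinkedServiceSnowflake",
        "properties": {
            "type": "Snowflake",
            "typeProperties": {
                "connectionString": "jdbc:snowflake://<account>.snowflakecomputing.com/?user=<user>&db=<database>&warehouse=<warehouse>&schema=<schema>",
                "password": {
                    "type": "SecureString",
                    "value": "<password>"
                }
            }
        }
    }

In practice, the password is better kept in Azure Key Vault and referenced from the linked service rather than embedded inline.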

Reason:

While copying data from Azure Data Lake Gen2 to Snowflake, Azure Data Factory uses a Blob storage account as a staging area. If staging is not configured, Data Factory raises an error even when the source is a plain CSV file in Azure Data Lake. To resolve the issue we create a staging area pointing to Blob storage: data is first loaded into Blob storage as a temporary copy and then ingested into Snowflake.

  • Create a linked service pointing to Blob storage using a SAS URI (a JSON sketch follows).
    • Generate and copy the SAS token
    • Copy the SAS URL as well
Blob SAS URI
Blob LS
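
Authenticated with a SAS URI, the Blob linked service would look roughly like this in JSON; LinkedServiceBlobStaging and the <placeholder> values are assumed names for illustration:

    {
        "name": "LinkedServiceBlobStaging",
        "properties": {
            "type": "AzureBlobStorage",
            "typeProperties": {
                "sasUri": {
                    "type": "SecureString",
                    "value": "https://<account>.blob.core.windows.net/<container>?<sas-token>"
                }
            }
        }
    }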
  • Now modify the pipeline as shown below, enabling staging on the copy activity (sketched in JSON after this step).
Add Staging
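
Enabling staging adds an enableStaging flag and a stagingSettings block to the copy activity, referencing the Blob linked service. A trimmed sketch, with Gen2SourceDataset and SnowflakeSinkDataset as assumed dataset names:

    {
        "name": "CopyGen2ToSnowflake",
        "type": "Copy",
        "inputs": [ { "referenceName": "Gen2SourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SnowflakeSinkDataset", "type": "DatasetReference" } ],
        "typeProperties": {
            "source": { "type": "DelimitedTextSource" },
            "sink": { "type": "SnowflakeSink" },
            "enableStaging": true,
            "stagingSettings": {
                "linkedServiceName": {
                    "referenceName": "LinkedServiceBlobStaging",
                    "type": "LinkedServiceReference"
                },
                "path": "staging"
            }
        }
    }

With this in place, Data Factory writes the source files to the staging container first and then loads them from there into Snowflake.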
  • Verify the table in Snowflake
Snowflake Table
  • Run the complete Pipeline
Gen2 to Snowflake Pipeline
  • Verify the output and log
S3 to Gen2 Log
Gen2 to SF
  • Verify data in Snowflake
Snowflake Table

With this, we have completed the end-to-end pipeline, i.e. Copy AWS S3 to Gen2 to Snowflake.
