Easy Copy with AZ Copy

avidgator 940 views 16 slides Mar 03, 2017

About This Presentation

The Azure Storage service provides a massively scalable solution for applications that require durable and highly available storage for their data. What are your options if you need to get a bunch of data into, out of, or between your Azure Storage accounts? This talk will offer a...


Slide Content

Easy Copy with AzCopy: Using AzCopy to work with Azure Storage
John Garland, Principal Consultant
[email protected] | @dotnetgator

Azure Storage
- Cloud-native storage solution for Microsoft Azure
- Designed for durability and availability
- Pay-as-you-go: charged on amount stored & egress (ingress is free)
- Grouped into "accounts"
  - Up to 500 TB of data stored per account
  - Up to 100 named accounts per subscription (can be increased via support)
- Storage services for working with Blob, Table, Queue, and File data

Working With Data in Azure Storage
- REST API
- PowerShell, Azure CLI
- Azure Storage SDK
- 1st- & 3rd-party GUI tools
  - MSFT Storage Explorer
  - List (updated 7/2015): http://j.mp/AzureStorageExplorers
- Direct-ship drives
- Azure services
And what if I want to move data between Storage Accounts?

Introducing AzCopy
- Command-line utility
- Copy data to/from Blob, File, and Table storage
- Operates with the local file system or between storage accounts
- Wrapper around the .NET Storage API, with parallel execution, journaling, and progress reporting
- Download: http://aka.ms/downloadazcopy
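To make the above concrete, here is a minimal sketch of the classic (Windows, 5.x-era) AzCopy command line. The account names, container names, and keys are placeholders, not values from the talk:

```shell
REM Upload the contents of a local folder to a blob container.
REM /S = recursive, /Y = suppress confirmation prompts.
AzCopy /Source:C:\local\data ^
       /Dest:https://myaccount.blob.core.windows.net/mycontainer ^
       /DestKey:<storage-account-key> ^
       /S /Y

REM Copy blobs between two storage accounts (the scenario the previous
REM slide asks about):
AzCopy /Source:https://sourceacct.blob.core.windows.net/src ^
       /Dest:https://targetacct.blob.core.windows.net/dst ^
       /SourceKey:<source-key> /DestKey:<dest-key> ^
       /S
```

Because both source and destination can be storage URLs, the same /Source and /Dest pattern covers local-to-cloud, cloud-to-local, and account-to-account transfers.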

AzCopy Fundamentals

Resuming Incomplete Operations
- AzCopy uses a "journal file"
  - Default location: %SystemDrive%\Users\%username%\AppData\Local\Microsoft\Azure\AzCopy
  - Can be repointed with /Z
- If the journal file exists:
  - If the command matches, the journal contents are used to resume
  - If the command does not match, you are prompted to overwrite the journal
- The journal file is deleted upon successful completion of an operation
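A short sketch of the resume workflow described above, with placeholder paths and account details:

```shell
REM Run a transfer with the journal repointed at an explicit folder via /Z.
AzCopy /Source:C:\local\data ^
       /Dest:https://myaccount.blob.core.windows.net/mycontainer ^
       /DestKey:<storage-account-key> ^
       /S /Z:C:\journals\upload1

REM If the transfer is interrupted, re-run the *identical* command: AzCopy
REM finds the matching journal and resumes where it left off. Running a
REM different command against the same journal prompts you to overwrite it.
REM On successful completion the journal file is deleted automatically.
```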

Resuming

Working with Azure Storage Tables - Export
- JSON or CSV, specified with the /PayloadFormat value
  - CSV download includes a .schema.csv file for each data file
- Data file naming: <account name>_<table name>_<timestamp>.<volume index>_<crc>.json
  - Volume index = m_n, where m = partition key range index, n = split index (zeros if omitted)
- Can split based on partition key range with the /PKRS parameter -- allows parallel export
- Can split based on size with the /SplitSize parameter (size in MB, min = 32)
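The export options above can be combined in one command. A hedged sketch (account, table, key ranges, and paths are placeholders):

```shell
REM Export a table to local JSON files, splitting the output both by
REM partition key range (/PKRS) and by file size (/SplitSize, in MB).
AzCopy /Source:https://myaccount.table.core.windows.net/mytable ^
       /Dest:C:\export ^
       /SourceKey:<storage-account-key> ^
       /PayloadFormat:JSON ^
       /PKRS:"aa#bb" ^
       /SplitSize:64
```

/PKRS:"aa#bb" divides the table into the key ranges below "aa", "aa" to "bb", and above "bb"; each range exports in parallel, which is where the m in the m_n volume index comes from.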

Working with Azure Storage Tables - Import
- JSON only
- Uses a manifest file to locate data files and perform validation
  - Manifest file is created as part of export: <account name>_<table name>_<timestamp>.manifest
- Specify how to handle PK/RK collisions with /EntityOperation: InsertOrSkip, InsertOrMerge, InsertOrReplace
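A matching import sketch; the manifest file name below is a made-up example of the pattern the slide describes, and the accounts/keys are placeholders:

```shell
REM Import previously exported JSON data into a table, using the manifest
REM created during export. On partition-key/row-key collisions, replace
REM the existing entity.
AzCopy /Source:C:\export ^
       /Dest:https://targetacct.table.core.windows.net/mytable ^
       /DestKey:<storage-account-key> ^
       /Manifest:"myaccount_mytable_20170303T120000.manifest" ^
       /EntityOperation:InsertOrReplace
```

Choosing InsertOrSkip instead would preserve existing entities, and InsertOrMerge would combine properties from the imported and existing entities.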

Storage Tables

Introducing the Data Movement Library
- Based on the core data-movement framework that powers AzCopy
- Works with Blobs & Files
- NuGet & source (with samples)
  - NuGet: https://www.nuget.org/packages/Microsoft.Azure.Storage.DataMovement
  - Git: https://github.com/Azure/azure-storage-net-data-movement
- One VERY interesting sample: "S3ToAzureSample"

Working With the Data Movement Library
- TransferManager: Copy, CopyDirectory, Download, DownloadDirectory, Upload, UploadDirectory
- Set Azure Storage coordinates using "standard" Azure Storage API calls
- Options (Recursive, SearchPattern, etc.)
- TransferContext (ProgressHandler, OverwriteCallback, etc.)
- There are some "best practices" you should follow to set up HTTP communications the way DML needs them

The Data Movement API


April 16, 2016, Microsoft Offices, Alpharetta, https://GAB-ATL.eventbrite.com

Microsoft Ignite, September 26-30, 2016, Atlanta, GA, ignite.microsoft.com