My AWS Snowcone experience.

Or how the new AWS Snowcone service solved my offsite data backup requirements in the time of COVID-19.

Disclaimer: for full transparency, I have not been endorsed by AWS or by Synology for this article.

TL;DR

For $60/€50 I got a secure, offline data transfer of approximately 4 TB to my nearest AWS region in about 10 days (including shipping time) with the AWS Snowcone.

Background

I am the lucky owner of a Synology DS218+ NAS equipped with 2 × 4 TB disks configured as a RAID 1 volume. On my NAS I store photos, music, videos and personal documents. The total volume of data amounts to almost 4 TB so far. My NAS is located at home and can be accessed remotely using the quickconnect.to utility with DynDNS. It has never failed me (touch wood) and I have never been disappointed by the product so far. However, I had been facing a dilemma for many months.

A Synology DS218+ NAS

Before the COVID pandemic, my data backup strategy consisted of taking a monthly backup copy of my NAS (with a reminder set in my calendar) to an external hard drive using the MS robocopy utility. The utility works fine and allows me to work around some of the limitations of the Windows filesystem related to maximum path depth and other filename and directory character limits. At the speed of a directly attached USB 3.0 external hard drive (5 Gbit/s max), the NAS copy would take several hours overnight. I would then take the external hard drive with my backed-up data and leave it at my office in a locked pedestal until the next month, when I would repeat the process.
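As an illustration, here is a minimal sketch of that monthly routine (the NAS share name, drive letter and log path are hypothetical placeholders, not my actual setup):

:: Mirror the NAS share to the external USB drive (/MIR also propagates deletions).
:: /FFT tolerates coarse filesystem timestamps; /R and /W limit retries on errors.
robocopy \\DS218PLUS\volume1 E:\nas-backup /MIR /FFT /R:3 /W:10 /LOG:E:\robocopy-backup.log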
I knew I could potentially lose data with this strategy, but the loss would be limited to one month at most (my Recovery Point Objective, or RPO). If my flat had burned down, I could have retrieved my backup from my office within a few hours (my Recovery Time Objective, or RTO). It was an acknowledged and accepted risk for the cost of the solution. This worked fine for many years.

Until now.

With the COVID crisis I no longer had the “chance” to set foot in an office, as we (finally) all started working remotely. As I write these lines I have not been to an office for nearly a year.

And I never thought this would be an issue…
My issue was that I no longer had anywhere to leave an off-site backup of my data. I had my NAS data and its backup in the same place, which for me was an unacceptable risk. I could have left a copy of my data at a nearby friend's place, but in times of COVID, meeting each other was not always an option.

Of course I thought of backing up to the Cloud over the Internet. I attempted to use the features of Synology DSM to back up my NAS to the object storage service of AWS or another cloud provider, for example with the Glacier Backup add-on package for Synology DSM.

The Glacier Backup add-on package for Synology DSM.

The only thing is that I have an “inconvenient” amount of data (several TB): not too big, but not too small either. It is, however, too much to transfer over my limited ISP connection. Moving 4 TB over a, say, 5 Mbps upload connection (on a good day…) would have taken me nearly 74 days (over 10 weeks, almost 2.5 months), assuming an error-free transfer.
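The back-of-the-envelope arithmetic behind that figure (assuming decimal terabytes and a sustained 5 Mbps uplink):

# 4 TB = 4 × 10^12 bytes = 3.2 × 10^13 bits, pushed at 5 × 10^6 bits per second:
echo "4 * 10^12 * 8 / (5 * 10^6) / 86400" | bc -l   # ≈ 74.07 days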

GCP — How long does data transfer take — Google graph.

This was the dilemma I was facing: a roughly 74-day upload to back up my data to the cloud. Stuck at home due to the restrictions, and with my moody broadband/ISP, I looked into alternatives:

  • Transport my NAS (with its inherent risks) to another place with a better Internet connection. Even then, the upload would have taken days, during which I would have had to monitor it and pray for an error-free transfer, with my NAS unavailable the whole time.
  • Increase my bandwidth: fiber is available in my neighbourhood, but as I am renting, I would have had to request it from my landlord. Over the months this could have become a pricey solution too.
  • Ship my data using an offline data transfer job.

If you are familiar with Cloud migration strategies, you may be aware that an offline copy of your data is an option when your ISP link to the Cloud is deemed too slow. Enterprises typically have TBs of data (or more) to transfer to the Cloud over a few 100 Mbps or perhaps 1 Gbps connections, and they face the same challenge of spending days or more migrating their data to a Cloud provider. This is known as the FedEx bandwidth paradigm.

Public Cloud providers came up with an answer to these lengthy transfer times: an offline migration process to the Cloud with much shorter migration times. AWS's answer is the Snow family in its Migration & Transfer hub, Google's GCP has the Transfer Appliance, Microsoft Azure has the Azure Data Box, and Alibaba has the Data Transport offering.

But all of these solutions addressed the needs of larger organisations and enterprises moving their TBs or PBs of data to the cloud in a reasonable amount of time. As an individual, I could not order any of them; they would have been an expensive option for my needs.

This was the case until approximately six months ago, in December 2020, when AWS announced their new Snowcone product to the world at re:Invent 2020.

From the Snowcone page on the AWS website : AWS Snowcone is the smallest member of the AWS Snow Family of edge computing, edge storage, and data transfer devices, weighing in at 4.5 pounds (2.1 kg) with 8 terabytes of usable storage. Snowcone is ruggedized, secure, and purpose-built for use outside of a traditional data center. Its small form factor makes it a perfect fit for tight spaces or where portability is a necessity and network connectivity is unreliable. You can use Snowcone in backpacks on first responders, or for IoT, vehicular, and drone use cases. You can execute compute applications at the edge, and you can ship the device with data to AWS for offline data transfer, or you can transfer data online with AWS DataSync from edge locations.

When I found out about Snowcone I immediately understood the potential for individuals and/or Small or Medium Enterprises (SMEs) requiring offline data transfer to the Cloud at a fraction of previous costs.

To confirm Snowcone was the cheaper option, I first used the AWS Pricing Calculator to price the alternative: one Snowball Edge Storage Optimized (Data Transfer Only) device with a 10-day loan (free) carries a $300.00 job fee per device (On-Demand), plus the per-GB transfer fee for 4 TB (4,096 GB × $0.03 per GB = $122.88) for an export job to the Frankfurt eu-central-1 AWS region. The calculator indicated a total of $422.88 (object storage cost excluded).

10-day loan (free) of 1 device for a 4 TB export job to the Frankfurt eu-central-1 AWS region.

In comparison, transferring to an AWS region using an AWS Snowcone device costs a fraction of that price:

Pricing example found at https://aws.amazon.com/snowcone/pricing/

So for a transfer job of 5 TB to an S3 bucket in the Frankfurt eu-central-1 AWS region, this costs (only) $60/€50 for the first 5 days, plus 5 more days × $6 for a 10-day job, $90 in total (object storage cost excluded). That is roughly 5 times cheaper than with an AWS Snowball Edge (even less if you return the appliance within 5 days), with the following cost breakdown:

AWS Snowcone cost breakdown

With this cost in mind I headed to the AWS Console's Migration & Transfer section, then AWS Snow Family, to start an offline data transfer job with the AWS Snowcone. There are several use cases for the Snow Family devices (import/export to S3 and local edge computing); I selected the Import into Amazon S3 option.

AWS Snowcone shipping preferences
Various AWS Snowcone job options (storage and compute)

You need to provide your own power supply for the AWS Snowcone, which, given the widespread adoption of USB-C these days, should not be much of an issue.

The next screen in the job setup asks you to confirm the encryption settings for the device. Encryption keys can easily be generated with the AWS KMS service.
You have the choice of using the default AWS KMS key or creating your own, which is indeed the preferable option.

Following the AWS-recommended method, I chose to generate a symmetric KMS Customer Master Key (CMK) in the Key Management Service (KMS), found in the Security, Identity, & Compliance section of the console.

Specify the Amazon Resource Name (ARN) of the key you generated with KMS.
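If you prefer the command line, the equivalent key creation might look like this (a minimal sketch; the description is just an example label):

# Create a symmetric KMS key and print its ARN for the Snow job form.
aws kms create-key --description "Snowcone import job key" --query 'KeyMetadata.Arn' --output text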

AWS generates a service-linked role — a unique type of IAM role that is linked directly to an AWS service — associated with your transfer job.

Apply your SNS notification preferences

Snowcone job review screen :

In the console you can find the job summary

AWS Snowcone data transfer job summary

The transfer job is created, and you can then check its status in the console: Job created / Preparing device / Preparing shipment / In transit to you / Delivered to you / In transit to AWS / At sorting facility / At AWS / Importing / Completed.
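For the command-line inclined, Snow jobs can also be created and tracked with the AWS CLI (a rough sketch, assuming the S3 bucket, KMS key, IAM role and shipping address already exist; every identifier below is a placeholder):

# Create an import job for a Snowcone (SNC1_HDD) with standard shipping.
aws snowball create-job \
    --region eu-central-1 \
    --job-type IMPORT \
    --snowball-type SNC1_HDD \
    --resources '{"S3Resources":[{"BucketArn":"arn:aws:s3:::mynasbackuptos3"}]}' \
    --address-id ADID00000000-0000-0000-0000-000000000000 \
    --kms-key-arn arn:aws:kms:eu-central-1:111122223333:key/example-key-id \
    --role-arn arn:aws:iam::111122223333:role/example-snow-import-role \
    --shipping-option STANDARD

# Poll the job status; the states mirror the console list above.
aws snowball describe-job --job-id JID00000000-0000-0000-0000-000000000000 --query 'JobMetadata.JobState'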

Sample SNS notification for Snowcone transfer.
AWS preparing Snowcone

Preparing shipment

AWS has finished preparing the device for this job, and we’re getting ready to ship it to you. Your UPS tracking number will be available soon.

In transit to you

Your device for this job is being shipped to you by UPS.

It has arrived! The appliance was delivered on the morning of the 3rd day after I created the AWS Snowcone transfer job (I chose standard shipping). I was expecting the Snowcone to be shipped from the eu-central-1 Frankfurt AWS region, and that was indeed the case.

Here are the photos of the device. The device does not come with any packaging. The electronic ink display, similar to that of an eReader, is easily readable.

AWS Snowcone

Once the transfer job is created, AWS instructs you to install the AWS OpsHub for Snow Family utility, available for free for multiple operating systems (Windows, macOS and Linux). This is how you operate and interact with the AWS Snowcone device.

AWS OpsHub takes all the existing operations available in the Snowball API and presents them as a simple graphical user interface. This interface helps you quickly and easily migrate data to the AWS Cloud and deploy edge computing applications on Snow Family Devices.

Using your Snowcone

  • Download AWS OpsHub (or the Snow command-line client), the tool you will use to manage the device.
  • Connect the appliance to your network, and then get its IP address from the digital display.
  • Get your credentials (the manifest file and unlock code) from the console to authenticate to the device.
  • Then transfer your on-premises data onto the device. The first 10 days you have the device on-site are free.
AWS OpsHub for Snow Family utility — Select your device.
AWS OpsHub for Snow Family utility — Set up your device.

Obtain the credentials from the AWS Console: the unlock code and the manifest file (a binary file, in my case JIDd1438262-1cbb-48cb-9b4a-574fqwertzdfzg_manifest.bin).
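If you would rather script the unlock, the Snowball Edge client (which also works with Snowcone) offers an equivalent operation (a sketch; the endpoint IP, manifest path and unlock code are placeholders for the values from your own console):

# Unlock the device with the manifest file and the 29-character unlock code.
snowballEdge unlock-device \
    --endpoint https://192.168.2.104 \
    --manifest-file ./JIDexample_manifest.bin \
    --unlock-code 12345-abcde-12345-abcde-12345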

AWS OpsHub for Snow Family utility — unlock your Snowcone.
AWS Snowcone OpsHub utility dashboard
AWS Snowcone OpsHub dashboard — devices
AWS Snowcone OpsHub dashboard — services

Click on Enable and start the NFS file storage service.

Restricting allowed hosts to my home network.
NFS service enabled and mountpoint available.
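On a Linux host the share can then be mounted and filled like any other NFS export (a sketch; the IP and export path are those shown in OpsHub, and the source path is a hypothetical Synology volume):

# Mount the Snowcone NFS export and copy the data across.
sudo mkdir -p /mnt/snowcone
sudo mount -t nfs 192.168.2.104:/buckets/mynasbackuptos3 /mnt/snowcone
rsync -avh --progress /volume1/photos/ /mnt/snowcone/photos/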

After copying my data to the Snowcone appliance, I shipped it back to AWS 5 days later from my nearest UPS access point.

Et voilà !

The import completed and a job completion report was generated:

AWS Snowcone job completion report
All my data has been imported to S3, Standard storage class.

Some aspects to consider for your own usage of the AWS Snowcone:

  • RTFM! Before getting yourself an AWS Snowcone (or another Snow Family product), do read the AWS Snowcone user manual to get familiar with the requirements.
  • Even though you are copying from your local LAN to the AWS Snowcone, the transfer may still take some time depending on the amount of data. Here are some of the published AWS Snowcone performance figures:
Snowcone performance : https://docs.aws.amazon.com/snowball/latest/snowcone-guide/snowcone-performance.html
  • I was not too sure when the free 5 days of usage start and end. Upon receipt of the appliance? When you ship it back?
  • I intended to directly attach the AWS Snowcone to my NAS over USB 3.0. The AWS Snowcone does have a USB-C port; however, I found out it is disabled, as per the AWS Snowcone user manual: “The first USB-C connection is not active”. This was disappointing, as I wanted to use the Synology DSM GUI to create my copy job to the AWS Snowcone over USB 3.0.
  • The only file service option is NFS, not CIFS/SMB. If you have a Windows-based client, you need a way to mount the NFS share on your Windows OS, for example via WSL (Linux syntax below) or the native Windows Client for NFS:
mount -t nfs4 192.168.2.104:/buckets/mynasbackuptos3 /mnt/snowcone
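:: Alternatively, natively on Windows with the "Client for NFS" feature enabled
:: (Z: is a hypothetical drive letter):
mount -o anon \\192.168.2.104\buckets\mynasbackuptos3 Z: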
  • Transfer was unexpectedly slow. My home network runs at 1 Gbps, yet the copy speed would peak at around 50 Mbps. I had hoped for a faster connection on a local LAN. I do not know whether the speed limitation was caused by the AWS Snowcone, my Synology DSM, or something else…
Snowcone resource usage during copy to NFS service.
Synology transfer status.
  • After a while the AWS Snowcone fan started to get noisier. Not an issue in itself, but you would notice it when walking into a room where the device was running.
  • During initial setup the Snowcone would not join my local network, with either DHCP or manual configuration, as the eth1 interface would not come up (no blinking LED). I suspected the antiquated router in my flat (auto-negotiating at 100 Mbps) was the cause, and I was right: as confirmed in the AWS Snowcone user manual, Snowcone does not support link speeds lower than 1 Gbps. I used a 1 Gbps switch and the RJ45 interface came up instantly.
  • You need to ensure your USB-C power supply can power the Snowcone device. It requires at least 45 W, and your USB-C phone charger may not suffice. Do check the AWS Snowcone user manual.

For $60/€50 I got a secure, offline data transfer of approximately 4 TB to AWS for an off-site backup, in about 10 days (including shipping time), with the AWS Snowcone.

By releasing the Snowcone, AWS has clearly identified a niche for its customers. Much less bulky than its Snowball parent (and absolutely tiny compared to a Snowmobile), and more cost-effective, the Snowcone answers a real need for private users, media professionals out in the field, and Small & Medium Enterprises (SMEs). It fills a gap that no other public cloud provider addresses (yet).

Written by Philippe Le Gal — PLuG IT Consulting.

#AWS #Snowcone #Synology #NAS #S3 #offsite #backup #COVID19


Philippe Le Gal - PLuG IT Consulting

Philippe Le Gal is an Infrastructure / Solutions Architect with 20 years of IT experience in on-premises technical architectures and Cloud technologies.