CloudBerry Explorer offers a PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3), Amazon Glacier, and the local file system. Windows PowerShell is a command-line shell that helps IT professionals control systems and accelerate automation. It includes several system administration utilities and improved navigation of common management data such as the registry, the certificate store, and WMI.
What is good about PowerShell and CloudBerry Explorer Snap-in?
The PowerShell Snap-in exposes the majority of Amazon S3 functionality, and you can combine CloudBerry Explorer commands with regular PowerShell commands. Because PowerShell operates on .NET objects, you are not limited to command syntax: you can write complex scripts with loops and conditions, and schedule periodic tasks such as data backup or cleanup.
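As a sketch of how the snap-in commands combine with ordinary PowerShell control flow (the bucket, folder names, and the "2024*" condition below are placeholders), you could upload several local folders in a loop:

```powershell
# Assumes the CloudBerryLab snap-in is registered; names are placeholders.
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/reports"
# Upload every local folder whose name matches a condition
foreach ($dir in Get-ChildItem "C:\reports" -Directory) {
    if ($dir.Name -like "2024*") {
        $src = Get-CloudFilesystemConnection | Select-CloudFolder -Path $dir.FullName
        $src | Copy-CloudItem $destination -Filter "*"
    }
}
```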
PowerShell Snap-in is provided as-is. We don't offer support for this feature.
Supported Storage Providers: Amazon S3, Amazon Glacier, S3 compatible storage providers.
This is an example of copying files from a local disk to an S3 bucket:
Example:
The file results.xls will be copied to the S3 bucket.
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls"
This script can be scheduled to run every weekend to copy files into S3 storage (for backup purposes, for example).
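One way to schedule such a script is the standard Windows Task Scheduler command line; the script path below is hypothetical, so substitute your own:

```powershell
# Register a weekly task (Sunday, 01:00) that runs the backup script.
# C:\scripts\weekly-backup.ps1 is a placeholder path.
schtasks /Create /SC WEEKLY /D SUN /ST 01:00 /TN "S3WeeklyBackup" `
  /TR "powershell.exe -File C:\scripts\weekly-backup.ps1"
```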
Example:
This will copy all files and folders from c:\workdata\ to the S3 bucket "myBucket". A new folder named after the current date (for example, 2008_11_01) will be created.
$new_folder_format = Get-Date -uformat "%Y_%m_%d"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket" | Add-CloudFolder $new_folder_format
$src = Get-CloudFilesystemConnection | Select-CloudFolder -path "c:\workdata\"
$src | Copy-CloudItem $destination -filter "*"
Commands
USING SSE-C IN AMAZON S3
You can use Server-Side Encryption with a Customer-provided key (SSE-C) when uploading files to Amazon S3 and manage S3 files that are already encrypted with SSE-C.
There are new parameters:
-DestinationSseCustomerKey (alias: -DstSSEKey) – defines the encryption key for a copy, move, or rename operation. This key is required if you want to encrypt files with SSE-C.
-SourceSseCustomerKey (alias: -SrcSSEKey) – defines the encryption key for downloading from Amazon S3 or editing the settings of files encrypted with SSE-C.
Note: for "local to S3" and "S3 to local" operations you need to specify only one key: -DstSSEKey for upload, -SrcSSEKey for download. For "S3 to S3" operations or a rename on S3 you can use two keys, and they can differ – this allows you to change the SSE-C key for already encrypted files.
These parameters were added to the following commands:
Copy-CloudItem
Move-CloudItem
Rename-CloudItem
Set-CloudItemStorageClass (backward compatibility: Set-CloudStorageClass)
Add-CloudItemHeaders
Get-CloudItemHeaders
There is a new command:
Set-CloudItemServerSideEncryption – sets or changes the SSE settings for an existing S3 file (e.g. set/reset SSE-C encryption, reset any SSE encryption, switch SSE to SSE-C, or vice versa)
Example: Upload to Amazon S3 with SSE-C
1. Generate a 256-bit encryption key (for AES-256) – this example derives the key with the password-based key derivation function PBKDF2.
$iterations = 100000
$salt = [byte[]] (1,2,3,4,5,6,7,8)
$password = 'My$Super9Password'  # single quotes prevent PowerShell from expanding $Super9Password as a variable
$binaryKey = (New-Object System.Security.Cryptography.Rfc2898DeriveBytes([System.Text.Encoding]::UTF8.GetBytes($password), $salt, $iterations)).GetBytes(32)
IMPORTANT NOTE: $password is just an example value. Make sure to use your own secret value.
2. Copy data from local to Amazon S3 with SSE-C using generated key:
$source | Copy-CloudItem $dest -DstSSEkey $binaryKey -filter *
where $source is a local folder and $dest is an Amazon S3 bucket (folder). For example:
$source = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Company\DailyReports"
$s3 = Get-CloudS3Connection -k yourAccessKey -s yourSecretKey
$dest = $s3 | Select-CloudFolder -path "mycompany/reports"
Example: Download SSE-C encrypted file from Amazon S3
$dest | Copy-CloudItem $source -SrcSSEKey $binaryKey -filter "monthlyReport-Jul2014.docx"
To move files, just replace Copy-CloudItem with Move-CloudItem.
Example: Rename an existing SSE-C encrypted file, keeping it encrypted with the same key
$dest | Rename-CloudItem -name "monthlyReport-Jul2014.docx" -newname "monthlyReport-Aug2014.docx" -SrcSSEKey $binaryKey -DstSSEKey $binaryKey
Example: Copy an existing SSE-C encrypted file within S3, keeping it encrypted with the same key
$dest | Copy-CloudItem $dest2 -filter "monthlyReport-Jul2014.docx" -SrcSSEkey $binaryKey -DstSSEkey $binaryKey
Example: Set or change SSE-C encryption for existing S3 file
Encrypt a non-encrypted S3 file with SSE-C:
$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -DstSSEkey $binaryKey
Encrypt a non-encrypted S3 file with SSE:
$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE
Decrypt an SSE-C encrypted S3 file (i.e. reset SSE-C):
$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -SrcSSEKey $binaryKey
Reset SSE encryption for an S3 file:
$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE:$false
Example: Change Storage Class for SSE-C encrypted file
$dest | Set-CloudItemStorageClass -filter "monthlyReport-May2014.docx" -SrcSSEKey $binaryKey
UPLOAD TO AMAZON GLACIER
You can connect to your Amazon Glacier account, set connection options, upload files to Amazon Glacier with filters, and restore data from Amazon Glacier using PowerShell commands. Check out the examples below:
Example: Uploading to Amazon Glacier
# Add snap-in
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
# Enable logging and specify the path
Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info
# Create connection
$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]
# Set options
Set-CloudOption -GlacierRetrievalRateLimitType Specified
Set-CloudOption -GlacierChunkSizeMB 4
Set-CloudOption -GlacierParallelUpload 1
Set-CloudOption -GlacierPeakRetrievalRateLimit 23.5
# Select vault
$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"
# Let's copy to vault
$destination = $vault
# Select source folder
$src = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp" # [YOUR SOURCE FOLDER PATH]
# Upload files to Glacier by the filter
#$src | Copy-CloudItem $destination -filter "sample.txt"
# Upload all files to Glacier
$src | Copy-CloudItem $destination -filter "*"
# Delete vault
$conn | Remove-CloudBucket $vault
Example: Retrieving data from Amazon Glacier
# Add snap-in
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
# Enable logging and specify the path
Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info
# Create connection
$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]
# Get existing vault
$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"
# Get vault inventory.
# Note: this command may take up to 5 hours to execute if the inventory has not been prepared yet.
$invJob = $vault | Get-Inventory
# Now read vault archives
$archives = $vault | get-clouditem
# Select destination local folder.
$dst = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp" # [YOUR DESTINATION FOLDER PATH]
# Copy files from vault. Only files located in the C:\Tmp folder are copied.
# Note: this command may take many hours to execute when files have not been prepared for copying yet.
$vault | Copy-CloudItem $dst -filter "C:\Tmp\*.*"
ENABLING SERVER SIDE ENCRYPTION
SSE is enabled with the "-sse" switch. It applies to the Copy-CloudItem and Copy-CloudSyncFolders commands when uploading to Amazon S3.
Example: Enabling SSE for Copy-CloudItem:
$source | Copy-CloudItem $dest -Filter *.mov -sse
Example: Enabling SSE for Copy-CloudSyncFolders:
$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -sse
Example: Enable SSL for connection:
# Create a connection with SSL
$s3 = Get-CloudS3Connection -UseSSL -Key $key -Secret $secret
Options supported for Copy-CloudSyncFolders:
-StorageClass defines the storage class for files (rrs, standard, or standard_ia)
-IncludeFiles specifies which files to sync, using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)
-ExcludeFiles excludes certain files from sync, using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)
-ExcludeFolders skips certain folders (for example: bin; *temp*; My*)
Example: Sync only JPG files and set the RRS storage class while syncing the files to S3 storage
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -StorageClass rrs
Example: Sync entire folder excluding \temp folder and .tmp files
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -ExcludeFiles "*.tmp" -ExcludeFolders "temp"
SETTING A STORAGE CLASS
You can set a storage class for a certain file or several files:
Set-CloudStorageClass
Storage Class: rrs, standard, standard_ia
Example: Setting an RRS storage class to a specified item:
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$item = $bucket | Get-CloudItem $itemname
$item | Set-CloudStorageClass -StorageClass rrs
Example: Setting RRS storage class to all text files in a specified folder:
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$folder = $bucket | Get-CloudItem $foldername
$folder | Set-CloudStorageClass -Filter *.txt -StorageClass rrs
Or you can set the storage class while copying files to S3 using the -StorageClass parameter of Copy-CloudItem.
Example: Setting RRS storage class to a file while uploading it to the S3 storage:
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls" -StorageClass rrs
ADVANCED PARAMETERS FOR "Copy-CloudSyncFolders"
Copy-CloudSyncFolders supports advanced parameters:
-DeleteOnTarget delete files from the target if they no longer exist on the source
-IncludeSubfolders include subfolders into synchronization
-CompareByContent use MD5 hash to compare the content of files (PRO only)
-MissingOnly copy only missing files, ignore files that exist both on source and target
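Combining the parameters above, a mirror-style sync might look like the following sketch (the bucket and local paths are placeholders):

```powershell
# Mirror a local folder to S3: recurse into subfolders, delete files that
# no longer exist locally, and compare file content by MD5 (PRO only).
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/mirror"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\data\"
$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -DeleteOnTarget -CompareByContent
```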
GENERATING WEB URLs
Using Get-CloudUrl you can generate HTTP, HTTPS or RTMP URLs and also HTML code for streaming video files.
Example: Generating short URLs for JPG files and saving the output to a file
$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -ChilpIt >> C:\urls.txt
Example: Generating signed URL
$dest | Get-CloudUrl -Filter *.jpg -Type HTTPS -Expire 01/01/2011 >> C:\urls.txt
Example: Generating CloudFront signed URL (where $domain is a CloudFront distribution domain name)
$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt
Example: Generate signed URL for the private content item (where $domain is Streaming distribution domain name)
$policy = New-CloudPolicy -PrivateKey $privatekey -KeyPairId $keypairid -IsCanned
$dest | Get-CloudUrl -Filter *.flv -Type RTMP -Policy $policy -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt
SETTING CUSTOM CONTENT TYPES AND HTTP HEADERS
Example: Adding a new content type for .flv
Add-CloudContentType -Extension .flv -Type video/x-flv
Get-CloudContentTypes - displays a list of predefined and custom content types
Any file with the .flv extension uploaded to S3 will then have the proper content type: video/x-flv.
Example: Getting HTTP headers for an item ($s3 is an S3 connection)
$s3 | Select-CloudFolder myvideos | Get-CloudItem cats.flv | Get-CloudItemHeaders
Example: Setting HTTP headers to items
$headers = New-CloudHeaders Expires "Thu, 1 Apr 12:00:00 GMT"
$s3 | Select-CloudFolder myvideos | Add-CloudItemHeaders -Filter *.flv -Headers $headers
Example: Setting HTTP headers when copy/move
$headers = New-CloudHeaders Cache-Control private
$source | Copy-CloudItem $dest -Filter *.mov -Headers $headers
RENAMING ITEMS
Example: Renaming the folder "favorites" located in the bucket "myvideos" to "thrillers"
$s3 | Select-CloudFolder myvideos | Rename-CloudItem -Name favorites -NewName thrillers
APPLY ACL FOR ALL SUBFOLDERS AND FILES
Example: Make all files inside "myvideos/thrillers" and its subfolders as public read
$s3 | Select-CloudFolder myvideos/thrillers | Add-CloudItemPermission -UserName "All Users" -Read -Descendants
SET LOGGING FOR POWERSHELL
Set-Logging -LogPath <path> -LogLevel <value>
Values: nolog, fatal, error, warning, info, debug
ADVANCED OPTIONS (PRO ONLY)
Set-CloudOption -ThreadCount <number>
Defines the number of threads for multithreaded uploads/downloads.
Set-CloudOption -UseCompression <value>
Defines whether to use compression. Values: 1 or 0
Set-CloudOption -UseChunks <value> -ChunkSizeKB <sizeInKB>
Enables chunking (values: 1 or 0) and defines the chunk size in KB; files larger than one chunk are split into chunks.
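For instance, to split large files into 10 MB chunks during transfer (the size here is illustrative):

```powershell
# Split files larger than 10240 KB (10 MB) into chunks
Set-CloudOption -UseChunks 1 -ChunkSizeKB 10240
```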
To download a file that was stored on S3 in chunks as a single file, enable "chunk transparency" mode before downloading:
Set-CloudOption -ChunkTransparency 1
When you copy or move files to S3 these files can inherit ACL from parent object: bucket or folder.
Set-CloudOption -PermissionsInheritance <value>
Values: "donotinherit", "onlyforcloudfront", "inheritall"
Example:
Set-CloudOption -PermissionsInheritance "inheritall"
$s3 = Get-CloudS3Connection <key> <secret>
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls"
The file "results.xls" will automatically have the same ACL as "myBucket/weeklyreport".
Set-CloudOption -KeepExistingHeaders
Keep existing HTTP headers when replacing files on S3.
Set-CloudOption -DoNotChangePermissionsForExisting <value>
Keep ACL for files when replacing them on S3.
Values: 1 or 0
Set-CloudOption -KeepExistingPemissionsOnCloudCopy <value>
Keep source permissions when copying within S3.
Values: 1 or 0
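Putting the three options above together, a conservative replace configuration might look like this sketch (parameter names as documented here):

```powershell
# Preserve HTTP headers and ACLs when files are replaced on S3,
# and keep source permissions for S3-to-S3 copies
Set-CloudOption -KeepExistingHeaders
Set-CloudOption -DoNotChangePermissionsForExisting 1
Set-CloudOption -KeepExistingPemissionsOnCloudCopy 1
```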
Copy-CloudSyncFolders
Copy-CloudSyncFolders synchronizes local folders with the Amazon S3 bucket. You should specify the source folder (local or S3) in the pipeline.
-Source <CloudFolder> Amazon S3 bucket or folder or local folder
-Target <CloudFolder> Amazon S3 bucket or folder or local folder
Example:
$s3 = Get-CloudS3Connection <key> <secret>
$source = $s3 | Select-CloudFolder -Path books/sync
$local = Get-CloudFileSystemConnection
$target = $local | Select-CloudFolder C:\temp\sync
$source | Copy-CloudSyncFolders $target
Or synchronize content in both directions:
$source | Copy-CloudSyncFolders $target -Bidirectional
New-CloudBucket
New-CloudBucket creates a new bucket. You should specify the S3 connection in the pipeline.
-Connection <CloudS3Connection> - S3 connection
-Name <String> - Bucket name
-Location <String> - Bucket location: US (USA) or EU (Europe). The US location is used by default.
Example:
$s3 = Get-CloudS3Connection <key> <secret>
$s3 | New-CloudBucket mytestbucket EU
Remove-CloudBucket
Removes a bucket. All of its contents must be removed before the bucket itself is deleted; this can take a long time, and progress is displayed.
-Connection <CloudS3Connection> S3 Connection
-Name <String> Bucket name
-Force Suppress warning messages
-Bucket <CloudFolder> Bucket object
Example:
$s3 | Remove-CloudBucket mytestbucket
Get-CloudItemACL - Returns all access control entries for the specified item, which can be an S3 bucket, folder, or file. You can get the item using the Select-CloudFolder or Get-CloudItem commands.
-Item <CloudItem> Cloud item, it can be a bucket, s3 folder or s3 file.
Example:
$fld = $s3 | Select-CloudFolder mytestbucket/documents
$fld | Get-CloudItemACL
Add-CloudItemPermission - Grants a permission to a user or group. If the user is not in the ACL, a user entry will be added.
-Item <CloudItem> Cloud item, it can be a bucket, s3 folder or s3 file.
-UserName <String> Username or group
-Write Grant write permission
-WriteACP Grant write ACP permission
-Read Grant read permission
-ReadACP Grant read ACP permission
-FullControl Grant full control. This means that all other permissions will be granted.
-CloudACE <CloudACE> Access control entry
Example:
$fld | Add-CloudItemPermission "All Users" -Read
Remove-CloudItemPermission - Revokes a permission from a user or group. If the RemoveUser parameter is specified, the user entry will be removed from the access control list.
-Item <CloudItem> Cloud item, it can be a bucket, s3 folder or s3 file.
-UserName <String> Username or group
-Write Revoke write permission
-WriteACP Revoke write ACP permission
-Read Revoke read permission
-ReadACP Revoke read ACP permission
-FullControl Revoke full control. This means that all other permissions will be revoked.
-CloudACE <CloudACE> Access control entry
Example:
$fld | Remove-CloudItemPermission "All Users" -Read
Set-CloudOption - Set options for snap-in
-PathStyle <String> - Use path-style addressing if this flag is specified; virtual-host style otherwise.
-ProxyAddress <String> - Proxy address
-ProxyPort <Int32> - Proxy port
-ProxyUser <String> - Proxy user name
-ProxyPassword <String> - Proxy user password
-CheckFileConsistency - Check file consistency. The MD5 hash is used for checking.
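For example, to route snap-in traffic through an authenticated corporate proxy (host, port, and credentials below are placeholders):

```powershell
# Configure an HTTP proxy and enable MD5 consistency checks
Set-CloudOption -ProxyAddress "proxy.example.com" -ProxyPort 8080 -ProxyUser "jdoe" -ProxyPassword "secret"
Set-CloudOption -CheckFileConsistency
```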
Other Commands
Add-CloudFolder - Create new folder
-Folder <CloudFolder>
- Current folder
-Name <String>
- New folder name
Copy-CloudItem - Copy cloud item (file or folder) to the Destination
-Destination <CloudFolder>
- Destination folder
-Filter <String>
- Item filter, * and ? are permitted
-Folder <CloudFolder>
- Current folder
Get-CloudFilesystemConnection - Get connection to local file system
Get-CloudItem - List files and folders in the current folder
-Filter <String>
- Item filter, * and ? are permitted
-Folder <CloudFolder>
- Current folder
Get-CloudRootFolder - Get root folders
-Connection <BaseCloudConnection>
- Connection object
Get-CloudS3Connection - Creates an Amazon S3 connection
-Key <String>
- Access Key for S3 connection
-Secret <String>
- Secret Key for S3 connection
-Endpoint <String>
- Endpoint for S3 compatible storage
-UseSSL
- Enables SSL for connection
-SignatureVersion
- Defines the signature (authentication) version. Default: 4.
Note: to connect to S3-compatible storage you need to specify signature version 2:
$s3 = Get-CloudS3Connection -Key $key -Secret $secret -SignatureVersion 2
Get-CloudGlacierConnection - Creates an Amazon Glacier connection
-Key <String>
- Access Key for Amazon Glacier connection
-Secret <String>
- Secret Key for Amazon Glacier connection
-UseSSL
- Enables SSL for connection
Move-CloudItem - Move cloud item (file or folder) to the Destination
-Destination <CloudFolder>
- Destination folder
-Filter <String>
- Item filter, * and ? are permitted
-Folder <CloudFolder>
- Current folder
Remove-CloudItem - Remove cloud items (file or folder)
-Filter <String>
- Item filter, * and ? are permitted
-Folder <CloudFolder>
- Current folder
Select-CloudFolder - Get a cloud folder. It must be used for getting a folder for other commands as the current folder.
-Connection <BaseCloudConnection>
- Connection object
-Path <String>
- Path
-Folder <CloudFolder>
- Folder object
Installation
The PowerShell Snap-In must be registered and added to the console.
System Requirements
.NET Framework 4.0 (full version)
Windows Management Framework 3.0
Registering Snap-In
If PowerShell was installed before CloudBerry Explorer, you do not need to register the Snap-in manually. Otherwise, run the following command in the CloudBerry Explorer installation folder (C:\Program Files\CloudBerryLab\CloudBerry Explorer for Amazon S3):
C:\Windows\Microsoft.NET\Framework\v4.0.30319\InstallUtil.exe CloudBerryLab.Explorer.PSSnapIn.dll
Note: For x64 the command is: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\InstallUtil.exe "C:\Program Files (x86)\CloudBerryLab\CloudBerry Explorer for Amazon S3\CloudBerryLab.Explorer.PSSnapIn.dll"
Note: For PRO version the default installation folder is "C:\Program Files\CloudBerryLab\CloudBerry S3 Explorer PRO"; on x64 - "C:\Program Files (x86)\CloudBerryLab\CloudBerry S3 Explorer PRO"
Note: You can do this from the command line or PowerShell.
You can verify that the CloudBerry Explorer Snap-in is registered. Run the following command:
Get-PSSnapin -Registered
PowerShell displays the registered Snap-ins. Check that CloudBerryLab.Explorer.PSSnapIn is on the list.
Adding Snap-In to console
To add the Snap-in to the console, run the following command:
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
The new commands will now be available.
Exporting console configuration
You must run the Add-PSSnapin command each time you start PowerShell, or you can save the configuration as follows.
- Run PowerShell.
- Add Snap-In to console.
- Run the command: Export-Console CloudBerryExplorerConfig
CloudBerryExplorerConfig is the name of the console file used to save the configuration. To start PowerShell from a saved configuration, run:
powershell.exe -PSConsoleFile CloudBerryExplorerConfig.psc1
CloudBerry Explorer commands will be available.