AWS’s answer to Google Colab is Amazon SageMaker Studio Lab. In this tutorial, we will show you how to connect SageMaker Studio Lab with S3.
Install the awscli Library
We can work directly from a Jupyter notebook. First, we will install the awscli library.
%pip install awscli
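Once the installation finishes, we can verify that the CLI is available by printing its version:
!aws --version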
Create a Directory to Store the Credentials and the Config Files
The next step is to make a directory called .aws where we will store the credentials and the config files.
We make the .aws directory:
!mkdir ~/.aws
Then we create the credentials file storing the access and secret keys:
%%writefile ~/.aws/credentials
[default]
aws_access_key_id = < paste your access key here, run this cell, then delete the cell >
aws_secret_access_key = < paste your secret key here, run this cell, then delete the cell >
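Since this file contains long-lived secrets, it is good practice (optional, but recommended) to restrict its permissions so that only your Studio Lab user can read it:
!chmod 600 ~/.aws/credentials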
Finally, we create the config file.
%%writefile ~/.aws/config
[default]
region=us-east-1
output=json
Note that the locations of the credentials and the config files are:
/home/studio-lab-user/.aws/credentials
/home/studio-lab-user/.aws/config
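If you want to confirm that the AWS CLI picks up the default profile we just wrote, you can list the resolved configuration and, assuming the keys you pasted are valid, check the caller identity:
!aws configure list
!aws sts get-caller-identity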
Let’s run ls -lta to see the files and the directories under the home directory.
!ls -lta
As we can see, the .aws directory is there! Let’s get the content of the config file.
!cat .aws/config
Copy from S3 to Amazon SageMaker Studio Lab
Now we are set. Let’s see how we can copy a bucket from S3 to Studio Lab. Note that the working directory is /home/studio-lab-user. Let’s copy the S3 bucket called gpipisbucket under the /home/studio-lab-user/MyS3Buckets/gpipisbucket path.
!aws s3 cp s3://gpipisbucket /home/studio-lab-user/MyS3Buckets/gpipisbucket --recursive
And voilà! We have copied the S3 bucket to Studio Lab.
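If you only want to inspect the bucket before downloading everything, or keep the local copy up to date on later runs, the aws s3 ls and aws s3 sync commands cover those cases (shown here with the same gpipisbucket bucket and target path as above):
!aws s3 ls s3://gpipisbucket --recursive
!aws s3 sync s3://gpipisbucket /home/studio-lab-user/MyS3Buckets/gpipisbucket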
Related Articles
You may find useful the following articles:
- How to Interact with AWS using AWS Data Wrangler
- A Basic Introduction to Boto3
- How to choose an AWS profile in Boto3
- How to Delete an S3 Object with Boto3
- How to Filter Files from S3 Buckets using S3 Select and Boto3
- How to Interact with S3 using AWS CLI
- AWS S3 CLI ls in a Human Readable Format
- AWS S3 Sync Example