AsyncS3Bucket

class lsst.ts.salobj.AsyncS3Bucket(name: str, *, create: bool = False, profile: str | None = None, domock: bool = False)

Bases: object

Asynchronous interface to an Amazon Web Services S3 bucket.

Parameters:
name : str

Name of bucket. If using Amazon Web Services see <https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingBucket.html> for details. In particular note that bucket names must be globally unique across all AWS accounts.

create : bool, optional

If true and the bucket does not exist, create it. If false then assume the bucket exists. You will typically want true if using a mock server (domock true).

profile : str, optional

Profile name; use the default profile if None.

domock : bool, optional

If true then start a mock S3 server. This is recommended for running in simulation mode.

Notes

Reads the following environment variables:

  • S3_ENDPOINT_URL: The endpoint URL, e.g. http://foo.bar:9000.

The format for bucket names, file keys, and largeFileEvent event URLs is described in CAP 452.

Attributes:
service_resource : boto3.resources.factory.s3.ServiceResource

The resource used to access the S3 service. Primarily provided for unit tests.

name : str

The bucket name.

profile : str or None

The profile name, or None if not specified.

bucket : boto3.resources.s3.Bucket

The S3 bucket.

Methods Summary

download(key[, callback])

Download a file-like object from the bucket.

exists(key)

Check if a specified file exists in the bucket.

make_bucket_name(s3instance[, s3category])

Make an S3 bucket name.

make_key(salname, salindexname, generator, date)

Make a key for an item of data.

size(key)

Get the size in bytes of a given file in the bucket.

stop_mock()

Stop the mock S3 service, if running.

upload(fileobj, key[, callback])

Upload a file-like object to the bucket.

Methods Documentation

async download(key: str, callback: collections.abc.Callable[[int], None] | None = None) → BytesIO

Download a file-like object from the bucket.

Parameters:
key : str

Name of the file in the bucket.

callback : callable, optional

Function to call with updates while writing. The function receives one argument: the number of bytes transferred since the previous call. If the transfer is successful then it will always be called at least once, and the sum of the bytes reported over all calls will equal the size of the file.

Returns:
fileobj : io.BytesIO

The downloaded data as a file-like object.

Notes

To convert a file-like object fileobj to an astropy.io.fits.HDUList named hdulist:

hdulist = astropy.io.fits.open(fileobj)

The callback function is called by S3.Bucket.download_fileobj.
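As a sketch, a progress callback of the kind download accepts could accumulate the per-chunk byte counts; the callback name and chunk sizes below are illustrative, and the loop stands in for the calls boto3 would make:

```python
progress: list[int] = []

def on_progress(nbytes: int) -> None:
    # Called synchronously by boto3 with the number of bytes
    # transferred in each chunk; the values sum to the file size.
    progress.append(nbytes)

# Simulate the calls boto3 would make for a 300-byte file
# downloaded in three 100-byte chunks (illustrative sizes):
for chunk in (100, 100, 100):
    on_progress(chunk)
```

The callback would be passed as download(key, callback=on_progress).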

async exists(key: str) → bool

Check if a specified file exists in the bucket.

Parameters:
key : str

Name of the potential file in the bucket.

static make_bucket_name(s3instance: str, s3category: str = 'LFA') → str

Make an S3 bucket name.

Parameters:
s3instance : str

S3 server instance. Typically “Summit”, “Tucson” or “NCSA”.

s3category : str, optional

Category of S3 server. The default is “LFA”, for the Large File Annex.

Returns:
bucket_name : str

The S3 bucket name, in the format described below.

Raises:
ValueError

If one or more arguments do not meet the rules below, or the resulting bucket name is longer than 63 characters.

Notes

The rules for all arguments are as follows:

  • Each argument must start and end with a letter or digit.

  • Each argument may only contain letters, digits, and “.”.

The returned bucket name is cast to lowercase (because S3 bucket names may not contain uppercase letters) and has format:

rubinobs-{s3category}-{s3instance}
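The naming rules above can be sketched in plain Python. This is an illustrative re-implementation of the documented format and validation, not the library code:

```python
import re

def sketch_bucket_name(s3instance: str, s3category: str = "LFA") -> str:
    """Illustrative sketch of the documented bucket-name rules."""
    # Each argument must start and end with a letter or digit and
    # may only contain letters, digits, and ".".
    pattern = re.compile(r"[A-Za-z0-9](?:[A-Za-z0-9.]*[A-Za-z0-9])?$")
    for arg in (s3instance, s3category):
        if not pattern.match(arg):
            raise ValueError(f"invalid argument {arg!r}")
    # Cast to lowercase: S3 bucket names may not contain uppercase letters.
    name = f"rubinobs-{s3category}-{s3instance}".lower()
    if len(name) > 63:
        raise ValueError(f"bucket name {name!r} longer than 63 characters")
    return name
```

For example, sketch_bucket_name("Summit") yields rubinobs-lfa-summit.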

static make_key(salname: str, salindexname: str | int | None, generator: str, date: Time, other: str | None = None, suffix: str = '.dat') → str

Make a key for an item of data.

Parameters:
salname : str

SAL component name, e.g. ATPtg.

salindexname : str, int, or None

For an indexed SAL component: a name associated with the SAL index, or just the index itself if there is no name. Specify None for a non-indexed SAL component. For example: “MT” for Main Telescope, “AT” for auxiliary telescope, or “ATBlue” for the AT Blue Fiber Spectrograph.

generator : str

Dataset type.

date : astropy.time.Time

A date, typically the date the data was taken.

other : str or None, optional

Additional text to identify the data and make the key unique. If None use date.tai.isot: date as TAI, in ISO-8601 format, with a “T” between the date and time and a precision of milliseconds.

suffix : str, optional

Key suffix, e.g. “.fits”.

Returns:
key : str

The key, as described below.

Notes

The returned key has format:

{fullsalname}/{generator}/{yyyy}/{mm}/{dd}/
    {fullsalname}-{generator}-{other}{suffix}

where:

  • fullsalname = {salname}:{salindexname} if salindexname, else salname.

  • yyyy, mm, dd are the “observing day”: the year, month and day at TAI date - 12 hours, with 4, 2, 2 digits, respectively. The “observing day” does not change during nighttime observing at the summit. Year, month and day are determined after rounding the date to milliseconds, so the reported observing day is consistent with the default value for other.

Note that the url field of the largeFileObjectAvailable event should have the format f"s3://{bucket}/{key}".

async size(key: str) → int

Get the size in bytes of a given file in the bucket.

Parameters:
key : str

Name of the file in the bucket.

stop_mock() → None

Stop the mock S3 service, if running; a no-op if not running.

async upload(fileobj: BinaryIO, key: str, callback: collections.abc.Callable[[int], None] | None = None) → None

Upload a file-like object to the bucket.

Parameters:
fileobj : file-like object

File-like object that can be read as binary data.

key : str

Name to use for the file in the bucket.

callback : callable, optional

Synchronous function to call with updates while writing. The function receives one argument: the number of bytes written. If the transfer is successful then it will always be called at least once, and the sum of the number of bytes for all calls will equal the size of the file. The function is called by the boto3 library, which is why it must be synchronous.

Notes

To create a file-like object fileobj from an astropy.io.fits.HDUList named hdulist:

fileobj = io.BytesIO()
hdulist.writeto(fileobj)
fileobj.seek(0)
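The same write-then-rewind pattern works for any binary data, not just FITS; the payload below is illustrative:

```python
import io

# Wrap arbitrary binary data as a file-like object suitable for upload.
payload = b"example binary data"
fileobj = io.BytesIO()
fileobj.write(payload)
fileobj.seek(0)  # rewind so upload reads from the start, as with hdulist
```

The rewind matters: without seek(0) the upload would read from the end of the buffer and transfer nothing.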